# Synchronization mechanism

## Technical background

Mixer is an extension of an internal Ubisoft Animation Studio tool named VRtist. As such, it currently contains code related to VRtist, but this code is meant to be moved out of Mixer.

This section describes the high-level operation principles of the Mixer addon that are common to VRtist and datablock-level synchronization.

Each Mixer addon connects to a server named the broadcaster and joins a room. Mixer keeps an in-memory copy of the Blender data that is relevant for synchronization. When a depsgraph update is detected, the in-memory copy is compared to the current Blender data. Detected changes are encoded into update messages and applied to the in-memory copy. The update messages are sent to the server, which broadcasts them to the other room users. On reception, the other Mixer addons update both their Blender data and their in-memory copy.
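
Below is a minimal, self-contained sketch of this cycle, using plain dictionaries in place of Blender data; all names are illustrative and do not belong to Mixer's code.

```python
# Minimal sketch of the update cycle, with plain dictionaries standing in for
# Blender data and the in-memory copy; names are illustrative only.

def diff(local_copy: dict, current: dict) -> list:
    """Describe how `current` differs from `local_copy` as update messages."""
    updates = []
    for name, value in current.items():
        if local_copy.get(name) != value:
            updates.append(("update", name, value))
    for name in set(local_copy) - set(current):
        updates.append(("remove", name))
    return updates


def apply_update(local_copy: dict, update: tuple) -> None:
    """Apply an update message to the in-memory copy."""
    if update[0] == "update":
        _, name, value = update
        local_copy[name] = value
    else:
        del local_copy[update[1]]


# On a depsgraph update: diff, update the local copy, send to the broadcaster.
local_copy = {"Cube": {"location": (0.0, 0.0, 0.0)}}
current = {"Cube": {"location": (1.0, 0.0, 0.0)}, "Light": {"energy": 10.0}}
for update in diff(local_copy, current):
    apply_update(local_copy, update)
    # broadcaster.send(encode(update))  # hypothetical network call
```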

The network communication uses TCP sockets to exchange messages generated by a custom encoder. All these mechanisms were developed with a strong focus on delivering functionality as soon as possible, so there is a lot of room for improvement.
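
As a hedged illustration, a length-prefixed exchange over a TCP socket could look like the sketch below; the 8-byte header (payload length plus a command identifier) is an assumption made for this example, not Mixer's actual wire format.

```python
# Sketch of a length-prefixed message exchange over TCP. The header layout is
# an assumption for illustration, not Mixer's actual wire format.
import socket
import struct


def _recv_exactly(sock: socket.socket, size: int) -> bytes:
    data = b""
    while len(data) < size:
        chunk = sock.recv(size - len(data))
        if not chunk:
            raise ConnectionError("socket closed")
        data += chunk
    return data


def send_command(sock: socket.socket, command_id: int, payload: bytes) -> None:
    # Prefix the payload with its length and a command identifier so the
    # receiver knows how many bytes to read and how to interpret them.
    sock.sendall(struct.pack("<II", len(payload), command_id) + payload)


def recv_command(sock: socket.socket) -> tuple:
    length, command_id = struct.unpack("<II", _recv_exactly(sock, 8))
    return command_id, _recv_exactly(sock, length)
```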

The synchronization mechanism was originally developed for our VRtist software, which includes a Unity VR client. As a consequence, the data synchronized for VRtist is selected for its relevance to VRtist, and the synchronization "cooks" Blender data into VRtist-specific messages.

A probable evolution is to move the VRtist protocol into a plugin, which would make it possible to connect other DCCs by implementing additional plugins.

## Blender datablock synchronization

The datablock-level synchronization code lives in mixer/blender_data, mainly in proxy.py. This file implements Proxy classes that can be loaded from Blender properties, serialized, then deserialized and saved back into Blender properties on the receiving side.
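
A simplified sketch of the Proxy idea is shown below, handling only flat properties; the real classes in proxy.py also handle nesting, collections and ID references, and the names used here are not Mixer's actual API.

```python
# Simplified Proxy sketch: load a few properties from a Blender struct into
# plain Python values, and save them back on the receiving side.
class StructProxy:
    def __init__(self):
        self._data = {}

    def load(self, bl_struct, property_names):
        # Copy the listed properties into serializable Python values.
        for name in property_names:
            self._data[name] = getattr(bl_struct, name)
        return self

    def save(self, bl_struct):
        # Write the stored values back into the Blender struct.
        for name, value in self._data.items():
            setattr(bl_struct, name, value)


# e.g. StructProxy().load(bpy.data.objects["Cube"], ["location", "hide_render"])
```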

These Proxy classes are meant to mirror Blender data storage in bpy.data. This includes simple properties, structures, IDs, references to IDs in bpy.data, and structures and collections of all of these.

At the top level, a single BpyDataProxy instance recursively mirrors most of bpy.data.

### Controlling items to synchronize

What is actually synchronized is "statically" controlled by filters defined in filter.py (an illustrative sketch follows the list below):

- default_exclusions controls the excluded attributes for each Blender structure. It can be used to exclude properties that are for internal use only or that are just views into other properties.
- safe_depsgraph_updates and safe_blenddata_collections control the datablock collections that are synchronized. More types and bpy.data collections are added to these items as they are tested. When development is complete, both should be consistent and list most bpy.data collections and types.
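
The sketch below shows the kind of declarations filter.py contains; the attribute and collection names are examples, and the actual data structures in filter.py may differ.

```python
# Illustrative sketch of filter declarations; the attribute and collection
# names are examples and the actual structures in filter.py may differ.
import bpy.types as T

# Attributes excluded per Blender structure (internal or derived properties).
default_exclusions = {
    T.Object: ["bound_box"],   # read-only view computed from the object data
}

# Datablock types and bpy.data collections considered safe to synchronize.
safe_depsgraph_updates = [T.Mesh, T.Light, T.Camera]
safe_blenddata_collections = ["meshes", "lights", "cameras"]
```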

There is currently no way for the user to control what can be synchronized.

Subclasses of Proxy handle bpy.data synchronization in a very generic way, but type-specific adjustments are needed, for instance because the factory methods for collection elements differ according to the element type. All this type-specific code is in specifics.py.
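
For instance, datablock factory functions do not share a signature, which is one reason type-specific code is needed. The dispatch below is a hedged illustration of that situation, not the actual code in specifics.py.

```python
# Sketch of why type-specific code is needed: datablock factory functions do
# not share a signature. This dispatch is illustrative, not specifics.py code.
import bpy


def create_datablock(collection_name: str, name: str):
    if collection_name == "objects":
        # Objects require an object data argument (None gives an empty).
        return bpy.data.objects.new(name, None)
    if collection_name == "lights":
        # Lights require a light type.
        return bpy.data.lights.new(name, "POINT")
    if collection_name == "images":
        # Images require their dimensions.
        return bpy.data.images.new(name, 1024, 1024)
    # Most other collections only take a name.
    return getattr(bpy.data, collection_name).new(name)
```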

### How to synchronize a new datablock collection?

Add the new bpy.data collection and datablock type to the synchronization in filter.py, then test the synchronization during datablock creation and modification with logging set to info or debug level.
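
As a hedged illustration, assuming metaballs were not yet synchronized and that the filter lists have the shape sketched earlier, the additions could look like this:

```python
# Illustrative additions for a new collection, reusing the filter sketch above;
# the exact structure of filter.py may differ.
safe_blenddata_collections.append("metaballs")   # bpy.data.metaballs
safe_depsgraph_updates.append(T.MetaBall)        # bpy.types.MetaBall
```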

There are several possible causes of failure (illustrative fixes for some of them follow the list):

- a property cannot be written, for instance because it is a computed property. Filter it out in filter.py.
- an exception occurs because a specifics.py function does not handle the type being synchronized. In this case, add type-specific code to the faulty function.
- a stack overflow is caused by circular references, like when Parent.child and Child.parent reference each other. In this case, identify the authoritative field and filter out the other one in filter.py.
- the new type might require a generic scheme that has not yet been needed by the currently supported types, for instance a collection with float keys, or worse...
- the data structure is more complex and requires type-specific proxies, as in node_proxy.py.
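
Reusing the default_exclusions sketch from above, fixes for the first and third cases could look like this; the attribute names are examples only.

```python
# Illustrative exclusions for a read-only property and a circular reference,
# reusing the earlier default_exclusions sketch; names are examples only.
default_exclusions[T.Object].extend(
    [
        "mode",       # read-only in the API, writing it raises an exception
        "children",   # derived from Object.parent: syncing both creates a cycle
    ]
)
```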

## The synchronization process

Changes are detected by a depsgraph update handler and processed in handler_generic.py by send_scene_data_to_server().
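
A minimal sketch of how such a handler is registered with Blender is shown below; the body only logs the updated datablocks and stands in for the real send_scene_data_to_server().

```python
# Minimal depsgraph handler registration; the body is a stand-in that only
# logs which datablocks changed.
import bpy
from bpy.app.handlers import persistent


@persistent
def on_depsgraph_update(scene, *args):
    depsgraph = bpy.context.evaluated_depsgraph_get()
    for update in depsgraph.updates:
        # update.id is the datablock reported as changed by the depsgraph.
        print("updated:", update.id.name, update.is_updated_transform)


bpy.app.handlers.depsgraph_update_post.append(on_depsgraph_update)
```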

We start by computing a difference between the cached BpyDataProxy and the current Blender state. This difference is restricted to added, removed and renamed datablocks.
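
The sketch below shows the added/removed part of this difference for a single bpy.data collection; rename detection additionally requires a stable identifier, which is not shown here.

```python
# Sketch of the added/removed part of the diff for one bpy.data collection.
# Rename detection needs a stable identifier (not shown), since a renamed
# datablock would otherwise look like one removal plus one addition.
def diff_collection(proxy_names: set, bl_collection):
    current_names = {datablock.name for datablock in bl_collection}
    added = current_names - proxy_names
    removed = proxy_names - current_names
    return added, removed


# e.g. added, removed = diff_collection(known_object_names, bpy.data.objects)
```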

Updated datablocks are taken from the depsgraph update. The proxy is requested to update itself and compute a list of Delta updates. Each Delta is a "differential" update that contains the updated members of the datablock. The updates are then serialized and sent.
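
A hedged sketch of what such a Delta could carry follows; DeltaUpdate is illustrative and is not the class defined in proxy.py.

```python
# Illustrative shape of a "differential" update: only the changed members of a
# datablock are recorded. Not the actual class defined in proxy.py.
from dataclasses import dataclass, field


@dataclass
class DeltaUpdate:
    collection_name: str                                   # e.g. "objects"
    datablock_name: str                                    # e.g. "Cube"
    updated_members: dict = field(default_factory=dict)    # e.g. {"location": (1.0, 0.0, 0.0)}


def diff_members(proxy_data: dict, current_data: dict) -> dict:
    """Keep only the members whose value differs from the proxy."""
    return {k: v for k, v in current_data.items() if proxy_data.get(k) != v}
```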

The serialization currently uses JSON (json_codec.py); this is just a choice made to deliver features quickly. At this point, each datablock is sent as a whole, and an additional mechanism should be implemented to compute a property-level difference in order to send a minimal amount of data.
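
The sketch below shows what a JSON-encoded update message might look like; the actual schema produced by json_codec.py is not reproduced here.

```python
# Sketch of JSON encoding/decoding for an update message; the actual schema
# produced by json_codec.py may differ.
import json


def encode_update(collection_name: str, datablock_name: str, data: dict) -> bytes:
    message = {
        "collection": collection_name,
        "name": datablock_name,
        "data": data,   # currently the whole serialized datablock
    }
    return json.dumps(message).encode("utf-8")


def decode_update(payload: bytes) -> dict:
    return json.loads(payload.decode("utf-8"))
```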

The messages are received by the server, which broadcasts them to the users that have joined the room. On reception, the build_data_xxx() functions in mixer/blender_client/data.py deserialize the command, then call the appropriate BpyDataProxy.xxx_datablock() so that the global proxy updates itself and the corresponding bpy.data item, recursively updating all the sub-properties.
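
A sketch of the receiving side, under the same assumptions as the JSON sketch above, is shown below; the method name on the global proxy is hypothetical, the real entry points being the build_data_xxx() functions.

```python
# Receiving-side sketch: deserialize the command and let the global proxy
# update itself and the matching bpy.data item. `update_datablock` is a
# hypothetical method name used only for this illustration.
def on_data_update_received(global_proxy, payload: bytes):
    message = decode_update(payload)   # from the JSON sketch above
    global_proxy.update_datablock(
        message["collection"], message["name"], message["data"]
    )
```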

## Known restrictions

- Undo/redo is not supported and may break the synchronization.