Made by Jacob Slone

Design and implement a set of best practices for creating shared world multiplayer experiences across multiple mixed and augmented reality devices.

Created: November 20th, 2015




Mixed reality devices are opening the door to content that can interact with your environment, but little visible work has been done on exploring how we share these experiences with others in natural and engaging ways. Traditionally, motion tracking on these devices has not been accurate enough to maintain any meaningful synchronization between two devices, but with newer technologies that leverage accelerometers and visual tracking to maintain much more precise spatial information, this may finally be possible.

While spatial tracking in these mixed reality devices has improved greatly, there are still inaccuracies which will compound as more devices are introduced. Additionally, while each device is able to maintain its own position and orientation with a certain degree of accuracy, it has no mechanism for determining the pose of other devices relative to its own, since they are all operating within their own self-defined coordinate systems. I hope to determine the best method for synchronizing all of these coordinate systems into a common world space, as well as how often devices need to be resynchronized once a shared session has begun. Answering these two questions will be a huge step in establishing the tools and practices to share content in the same physical space with mixed and augmented reality devices.


Google Tango

For this project we used the tablet version of the Google Tango: essentially a tablet equipped with a camera, a depth sensor, and sophisticated area learning software capable of mapping a room volumetrically, either to create a 3D representation of the room or to accurately detect the device's location within it.

For our purposes there are two major advantages to the technology: significantly more accurate motion tracking and the ability to learn environments and find its position within these environments. This becomes especially useful given that these environments can be shared across devices.



We hope to create an open source Unity package which can be distributed to support convenient use of our golden path synchronization and extended to support multiple devices. Additionally, we will create a detailed report outlining our findings on the user research of each synchronization method and the technical information on drift, resynchronization, and network requirements as well as guidelines on how the requirements of a game should influence the fidelity vs network load balance.

This will include the production of two sample 'games' for testing purposes: laser tag and a much simpler block demo. These two test scenarios provide excellent coverage, as they represent opposite ends of the spectrum in our balance between speed and accuracy: laser tag has much higher speed and distance requirements, while the block demo covers a much smaller space with less movement but must be more accurate.



Our development has become a parallel approach: one team is designing and implementing the games as our testing tools and tech demo, and the other is developing the underlying synchronization schemes necessary to drive the technology.

Currently the flow is as follows:

Create 2-3 synchronization schemes -> Create common API -> Document & Package
                                  ↘                      ↗
           Create block testing -> Create laser tag game -> Test synchronization methods

Laser Tag

In order to demonstrate the synchronization technology, we built a laser tag game demo that lets users directly experience playing in the same mixed reality world.

We designed and created a laser gun prefab package that includes:

- Laser gun model, including the laser shooting script and ammo count labels.

- Animation and scripts that let the player touch the screen to shoot lasers or energy balls (energy ball shader included).

- A player label with the player's name, which floats above the real-world player's head.

- A projector with materials that can project textures onto generated meshes.


Not only does this provide an example for other developers to reference when working on their own experiences, but it covers an extremely important area for us: a game with a low latency requirement but a looser fidelity requirement.

For laser tag it is extremely important that we recognize movement quickly, but it is much less important that our positioning be pixel perfect. Overall we decided this lent itself best to ADF synchronization due to the large amounts of movement.




Synchronization Methods

Static - Static synchronization follows a simple rule: assume that both devices start the game in a predetermined pose, then use the difference between this pose and the pose each device believes it started in as a constant offset. If the users place the devices correctly they will begin in the correct positions, but this method provides no means to correct drift.
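The constant-offset idea can be sketched in a few lines. This is an illustrative reduction to 2D position plus yaw, not code from the project's Unity package; function names are our own:

```python
import math

def make_offset(assumed_pose, reported_pose):
    """Compute the constant correction mapping the device's self-reported
    start pose onto the predetermined start pose.
    Poses are (x, y, yaw_radians); yaw-only rotation for simplicity."""
    ax, ay, ayaw = assumed_pose
    rx, ry, ryaw = reported_pose
    dyaw = ayaw - ryaw
    # Rotate the reported position into the assumed frame, then translate.
    dx = ax - (rx * math.cos(dyaw) - ry * math.sin(dyaw))
    dy = ay - (rx * math.sin(dyaw) + ry * math.cos(dyaw))
    return (dx, dy, dyaw)

def to_shared_frame(pose, offset):
    """Apply the constant offset to any later pose the device reports."""
    x, y, yaw = pose
    dx, dy, dyaw = offset
    sx = x * math.cos(dyaw) - y * math.sin(dyaw) + dx
    sy = x * math.sin(dyaw) + y * math.cos(dyaw) + dy
    return (sx, sy, yaw + dyaw)

# The device believed it started at the origin, but was actually placed
# one meter along x in the shared frame:
offset = make_offset((1.0, 0.0, 0.0), (0.0, 0.0, 0.0))
print(to_shared_frame((0.5, 0.0, 0.0), offset))  # -> (1.5, 0.0, 0.0)
```

Because the offset is computed once and never updated, any tracking drift after the start of the session goes uncorrected, which is exactly the limitation described above.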


Area Descriptor File - ADF synchronization is a bit more advanced. Essentially, the Tango platform takes the area learning data and compresses it down into a few key points. We will call these points anchors. The area descriptor file contains the data necessary for a Tango device to visually recognize an anchor as well as where the anchor is relative to the original scan. This in turn creates a consistent coordinate frame that Tango devices can position themselves within when they detect an anchor. Consequently, if this file is shared among multiple devices, they will actually report their position in a common coordinate frame!
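Conceptually, each device maintains two pose streams: its pose relative to where its own tracking started (updated every frame), and that start frame's pose relative to the ADF (updated whenever it re-localizes against an anchor). Composing the two yields a pose in the shared ADF frame. A minimal 2D sketch with made-up values, not the Tango SDK's actual API:

```python
import math

def compose(t_ab, t_bc):
    """Compose two 2D rigid transforms (x, y, yaw): A<-B followed by B<-C."""
    x1, y1, r1 = t_ab
    x2, y2, r2 = t_bc
    return (x1 + x2 * math.cos(r1) - y2 * math.sin(r1),
            y1 + x2 * math.sin(r1) + y2 * math.cos(r1),
            r1 + r2)

# Hypothetical values: this device's start-of-tracking frame sits 2 m
# along x in the ADF frame, and the device has since walked 1 m forward.
adf_from_start = (2.0, 0.0, 0.0)     # refreshed on each re-localization
start_from_device = (1.0, 0.0, 0.0)  # refreshed every frame by motion tracking

device_in_adf = compose(adf_from_start, start_from_device)
print(device_in_adf)  # -> (3.0, 0.0, 0.0)
```

Since every device localizes against the same anchors, each one's `adf_from_start` refers to the same physical frame, and poses become directly comparable across devices with no pairwise calibration step.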


Proper setup for more robust systems can be quite time consuming. From this we have made a couple of heuristic observations:

Short play sessions with little movement tend to favor static positioning synchronization (placing devices in a predetermined configuration then connecting) because it: 1) saves battery by using very few sensors, 2) has very low setup time, 3) is generally accurate enough if you're not moving very far and if resynchronization can happen often.

Longer play sessions, or sessions that require movement over a larger area, tend to suit ADF synchronization. This requires pre-scanning the environment, which has a longer setup cost, but it can correct the discrepancies that often accumulate over longer play periods.
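The two heuristics above collapse into a simple rule of thumb. The thresholds below are illustrative placeholders, not measured values from our testing:

```python
def choose_sync_method(session_minutes, play_area_m2):
    """Pick a synchronization scheme from rough session characteristics.
    Thresholds are illustrative, not measured cutoffs."""
    if session_minutes <= 10 and play_area_m2 <= 9:
        # Low setup cost; accurate enough for short, mostly stationary play.
        return "static"
    # The pre-scan cost pays off over longer sessions and larger areas.
    return "adf"

print(choose_sync_method(5, 4))    # -> static
print(choose_sync_method(30, 50))  # -> adf
```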


Device Capability

Plain old motion tracking on the devices is quite stable, with only minor drift over large amounts of movement. However, capturing a sufficiently good ADF can be quite a chore, and area learning occasionally makes mistakes which can result in wildly varying interpretations of a space, though these are usually short lived. The issue is compounded if many objects move through the space during play, or if the area capture is from a different time of day with different lighting.

For these reasons we recommend always scanning an area before starting a game, even if all devices already know the space, unless the same area was used immediately prior; play should generally happen in areas with minimal through traffic.


Issues Encountered

Common Unsolved Issues:

- Area learning offsets sometimes differ between devices, usually manifesting as a player appearing very tall or very short.

- Area learning is occasionally lost, causing the device to enter a sort of 'falling' state where the scanned world slowly drifts away, usually upward.

- Blank walls very frequently cause area learning to stop, which completely disables tracking until something else is looked at; the movement made while looking away is lost, causing severe rotational drift.

- Battery life is abysmal. The devices need to be on the charger almost constantly, and even if left idle with nothing running the battery will die within an hour or two.

Issues with workarounds:

- Mesh generation causes an internal SIGBUS error if an ADF is loaded at startup. The solution is to generate the mesh and the ADF before the game starts, without loading an ADF.

- Many Tango API pose controllers are set up to initialize the Tango service. This causes problems: an exception is thrown on duplicate calls to initialize, and specifying a UUID for the ADF is subject to race conditions. The solution is to copy the desired pose scripts, remove the TangoApplication initialization portion, and control initialization from a separate script or enable auto-initialization on the TangoApplication script.

- Several Tango features (AR, mesh generation) were broken on Unity 5.1/5.2 with the Tango SDK Zeno release. This appears to be due to Android bugs in Unity 5.1 and changes in Android plugin compatibility in 5.2. As of the Tango SDK Ancha release these mostly work with Unity 5.2 and 5.3 (excluding the mesh generation issue above).

- Unity's network manager player prefab doesn't allow different prefabs for owners vs. viewers, which matters because certain scripts (e.g. pose scripts) shouldn't run on every machine. Our workaround is a script which selectively enables specified scripts when running on the network owner.


API Status

Tango API - overall the API is neither fully stable nor especially volatile. There don't seem to be any major changes happening beyond bug fixes and simplification of the communication between application code and the Tango SDK.

swushd API - very volatile. Unfortunately there hasn't been enough time to develop a nice, stable API for outside developer use yet. Right now everything is still very much in devland, and this should be resolved before leaving an 'alpha'-like state.



Definitely the biggest issue we've had is the dissemination of the knowledge required to make everyone productive. Since this technology is so new and lacks significant developer support networks at this point, most of the knowledge we have acquired so far isn't readily available elsewhere, and it becomes hard to keep track of who knows what. More documentation would definitely have helped, but the sweet spot in the tradeoff between making progress and making documentation always seemed unclear.


