
Context

Mixed reality devices are opening the door to content that can interact with your environment, but little visible work has been done on how we share these experiences with others in natural and engaging ways. Traditionally, motion tracking on these devices has not been accurate enough to maintain any meaningful synchronization between two devices, but newer technologies that leverage accelerometers and visual tracking to maintain much more precise spatial information may finally make this possible.

While spatial tracking in these mixed reality devices has improved greatly, inaccuracies remain and will compound as more devices are introduced. Additionally, although each device can maintain its own position and orientation with a certain degree of accuracy, it has no mechanism for determining the pose of other devices relative to its own, since each operates within its own self-defined coordinate system. I hope to determine the best method for synchronizing all of these coordinate systems into a common world space, as well as how often devices need to be resynchronized once a shared session has begun. Answering these two questions would be a significant step toward establishing the tools and practices for sharing content in the same physical space on mixed and augmented reality devices.
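
To illustrate what one synchronization step might look like, the sketch below (Python with NumPy; the function names and anchor-based approach are my own assumptions, not an existing device API) computes the rigid transform that maps one device's self-defined frame into another's, assuming both devices can estimate the pose of the same shared physical anchor in their own coordinate systems.

    # Hypothetical sketch: aligning two devices' self-defined coordinate
    # systems via a shared physical anchor that both devices can observe.
    # Function names (make_pose, align_b_to_a) are illustrative only.
    import numpy as np

    def make_pose(rotation: np.ndarray, translation: np.ndarray) -> np.ndarray:
        """Build a 4x4 homogeneous pose from a 3x3 rotation and a 3-vector."""
        pose = np.eye(4)
        pose[:3, :3] = rotation
        pose[:3, 3] = translation
        return pose

    def align_b_to_a(anchor_in_a: np.ndarray, anchor_in_b: np.ndarray) -> np.ndarray:
        """Transform mapping points from device B's frame into device A's frame.

        If both devices observe the same anchor, then
        T_(A<-B) = T_A_anchor @ inverse(T_B_anchor).
        """
        return anchor_in_a @ np.linalg.inv(anchor_in_b)

    # Example: device B sees the anchor 1 m in front of it; device A sees the
    # same anchor 2 m to its right, rotated 90 degrees about the vertical axis.
    rot_90_y = np.array([[0, 0, 1], [0, 1, 0], [-1, 0, 0]], dtype=float)
    anchor_in_a = make_pose(rot_90_y, np.array([2.0, 0.0, 0.0]))
    anchor_in_b = make_pose(np.eye(3), np.array([0.0, 0.0, 1.0]))

    T_a_from_b = align_b_to_a(anchor_in_a, anchor_in_b)
    point_in_b = np.array([0.0, 0.0, 1.0, 1.0])  # the anchor itself, in B's frame
    print(T_a_from_b @ point_in_b)               # ~[2, 0, 0, 1]: matches A's view

In this sketch, resynchronization would simply mean re-observing the anchor and recomputing the transform; how often that needs to happen as tracking drift accumulates is exactly the second question above.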

