A merged reality experience blending real world objects with virtual world interactions
Technology: Unity3D, MQTT, Arduino
To showcase Second Story’s Merged Reality (MR) capability, I built an MR demo in Unity3D in which physical objects in the real world had virtual-world counterparts positioned in the same locations. Interacting with a physical object caused its virtual counterpart to react, and vice versa.
To make it easy to imbue a physical object with MR capabilities, I leveraged the MQTT messaging protocol and wrote a Unity plugin I called "Mqttify", which worked alongside an Arduino sketch running on a Feather to create relationships between physical and digital objects.
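The Mqttify source isn't shown here, but the pairing idea is worth sketching: a physical object publishes sensor state to an MQTT topic, and its digital twin subscribes to that topic and mirrors the state. The sketch below simulates the broker in memory so it is self-contained; the topic scheme, class names, and payloads are illustrative assumptions, not the plugin's actual API (real code would use an MQTT client such as paho-mqtt, or PubSubClient on the Feather).

```python
# Minimal sketch of the physical<->digital pairing idea behind an MQTT-based
# MR plugin. The broker is simulated in memory; topic names and payloads are
# hypothetical, not taken from the actual Mqttify plugin.

from collections import defaultdict

class FakeBroker:
    """In-memory stand-in for an MQTT broker: maps topics to callbacks."""
    def __init__(self):
        self._subs = defaultdict(list)

    def subscribe(self, topic, callback):
        self._subs[topic].append(callback)

    def publish(self, topic, payload):
        # Deliver the payload to every subscriber of this exact topic.
        for cb in self._subs[topic]:
            cb(topic, payload)

class VirtualJar:
    """Digital twin: mirrors the physical jar's lid state from MQTT messages."""
    def __init__(self, broker, object_id="jar1"):
        self.lid_open = False
        broker.subscribe(f"mr/{object_id}/lid", self._on_lid)

    def _on_lid(self, topic, payload):
        self.lid_open = (payload == "open")

broker = FakeBroker()
jar = VirtualJar(broker)
# The Feather would publish this when its lid sensor fires:
broker.publish("mr/jar1/lid", "open")
print(jar.lid_open)  # True
```

The same topic works in both directions: the Unity side can publish (for example, to drive haptics in the jar) while the Feather subscribes, which is what makes the physical and digital objects feel like one object.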
Working alongside a cross-geo team of developers and designers, I used this plugin to build an engaging merged-reality experience that combined a Vive tracker, a Leap Motion, and haptics, letting users capture virtual fireflies in a physical jar within an ethereal VR world built in Unity. My role was to build the Unity environment and set up the corresponding physical environment, program the firefly and 3D jar behavior in Unity, integrate the physical jar into the MR world using my Mqttify plugin, install and support the Atlanta build, and own the master build for all studios.