Experimenting with generative augmentations

Our team recently demonstrated how our mixed reality platform can fuse data from external sensors, such as a Kinect, with HoloLens. Because the platform provides a unified coordinate system, it is a natural starting point for getting more creative with this extra data. Could the data be used to manipulate reality, creating generative augmentations on objects or people in real time? This experiment explores different ways to animate and render auxiliary point clouds within the HoloLens environment.

The simulation is driven by 20 tweakable parameters, controlled from a tablet and patched into a shader: swirling geometry effects morph between repetitive oscillating paths, fly off into space and come back, and cycle through psychedelic color transitions. The experience also works for participants without a HoloLens, but for a HoloLens user there is a special, magical moment when you walk into a sensor's range and see the simulation augmented onto yourself.
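To give a flavor of how such tweakable parameters can drive a point-cloud animation, here is a minimal, hypothetical sketch in Python. The function name, parameters (`amplitude`, `frequency`, `blend`), and per-point phase trick are illustrative assumptions, not the actual shader code; in the real system the equivalent math would run on the GPU.

```python
import math

def oscillate_points(points, t, amplitude=0.1, frequency=2.0, blend=1.0):
    """Displace each point of a cloud along a sinusoidal path.

    points: list of (x, y, z) tuples; t: time in seconds.
    amplitude, frequency, blend stand in for the kind of tweakable
    parameters a tablet UI might expose. blend in [0, 1] morphs
    between the original cloud (0) and the fully displaced one (1).
    """
    out = []
    for i, (x, y, z) in enumerate(points):
        phase = i * 0.5  # per-point phase offset creates the swirl
        dy = amplitude * math.sin(frequency * t + phase)
        out.append((x, y + blend * dy, z))
    return out
```

Animating `t` each frame and easing `blend` between 0 and 1 would produce the morph-out-and-return effect described above.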

Dmitri Svistula is a Principal Software Engineer for the Razorfish Emerging Experiences team, based out of our San Francisco office.