
The MR Explorer is an experimental concept designed for children to experience immersion in a parallel virtual space while continuing to interact with physical objects.

MR Explorer conceptualizes how children can fully engage with the tactile affordances of the physical world while also tapping into the myriad immersive possibilities of virtual reality.

UX VISION

Think 

They are in control of the elements they interact with. 

Feel 

Be in awe of the infinite possibilities present in the virtual world.

Do 

Have fun making and imagining things.

THE CONCEPT

The concept is to move blocks across a physical space into specific zones that trigger related virtual immersions.

 

This showcases a simple, creative, and novel way for children to engage with mixed reality.

PROCESS DEVELOPMENT

STEP 1: Understanding VR Systems

Our initial investigation explored the Oculus and Vive ecosystems. We mapped controllers to physical objects such as chairs and experienced an immersive space where we could physically touch a virtual asset.

STEP 2: Research

We studied how children engage with immersion as well as with objects in the physical world. Through our research, we came to understand concepts like the uncanny valley and immersive gamification, and their mental-health impact on children. This research shaped the ethics of our immersion aesthetic.

STEP 3: Ideation and Exploration

Based on preliminary tests in which we mapped different virtual assets to our controllers, we ideated the possible triggers and zones we could include within the immersion. We used a simple white table as our surface for exploration and implementation.

STEP 4: Conceptualizing the MR experience

The MR experience comprised distinct physical spaces called "Trigger Zones" that could be activated by moving a physical block into them. Interacting with a zone would spawn a specific immersion there.
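The trigger-zone behavior can be sketched as a simple spatial lookup. This is an illustrative sketch only: the zone names, coordinates, and the `active_zone` function are hypothetical, not from the project.

```python
# Hypothetical sketch of trigger-zone detection on the table surface.
# Each zone is an axis-aligned rectangle (x_min, y_min, x_max, y_max)
# in table coordinates; values below are made-up examples.
TRIGGER_ZONES = {
    "forest": (0.0, 0.0, 0.3, 0.3),
    "ocean":  (0.4, 0.0, 0.7, 0.3),
    "space":  (0.0, 0.4, 0.3, 0.7),
}

def active_zone(block_x, block_y):
    """Return the name of the zone the block currently sits in, or None."""
    for name, (x0, y0, x1, y1) in TRIGGER_ZONES.items():
        if x0 <= block_x <= x1 and y0 <= block_y <= y1:
            return name
    return None

# Moving the block into a zone would spawn that zone's immersion.
print(active_zone(0.5, 0.1))  # -> ocean
print(active_zone(0.9, 0.9))  # -> None
```

In the actual build this check would run each frame against the tracked block position, with the returned zone name keying the immersion to spawn.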

STEP 5: Gridding and Mapping

To make the MR experience precise, we used the dimensions of the Oculus controller holders as the unit of movement and created a grid using blue tape. We then adjusted the scale of the desk in the virtual world to match the real world. This gridding and scaling ensured the virtual world mirrored the coordinates of the physical world.
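The grid-and-scale idea amounts to a uniform coordinate transform from grid cells to virtual-world positions. The cell size and desk dimensions below are made-up example values, not measurements from the project.

```python
# Illustrative sketch of the gridding and scaling described above.
CELL = 0.12          # grid unit in metres (stand-in for the holder width)
REAL_DESK = 1.2      # real desk width in metres (example value)
VIRTUAL_DESK = 1.2   # virtual desk width after matching the real world

# Once the virtual desk is scaled to match, this factor is 1.0 and
# grid coordinates line up one-to-one across both worlds.
SCALE = VIRTUAL_DESK / REAL_DESK

def grid_to_virtual(col, row):
    """Map a grid cell (col, row) to virtual-world coordinates (x, y)."""
    return (col * CELL * SCALE, row * CELL * SCALE)

print(grid_to_virtual(3, 2))
```

The key point is that any mismatch in `SCALE` shows up as a cumulative drift across the grid, which is why matching the desk scale first mattered.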

STEP 6: The Oculus to Vive Jump

We experienced occasional coordinate-mapping errors with the Oculus headset, which disrupted the alignment of our virtual assets with the real world. So we shifted to the Vive ecosystem, as it pins and maps coordinates to a physical space using its spatial sensors and cameras.

 

This allowed us to build more precise interactions because the coordinates of the real and virtual worlds were in sync.


STEP 7: Initial Prototypes

We made our physical blocks by attaching HTC Vive sensor pucks to recycled foam core blocks. We created all the virtual assets and environments using these blocks. The vantage points were also mapped to these sensors to enable the immersive experience.


STEP 8: Refining block prototypes

Taking inspiration from Jenga blocks, we created our final blocks from wood and screwed our sensors to them. Each block was individually polished, and the main white board was painted black to refine the aesthetic for the concept demonstration.

STEP 9: Final space mapping

These new high-fidelity block prototypes were again mapped using the Vive cameras via photogrammetry. This completed the registration of the physical world with the virtual world, enabling a seamless mixed reality experience.

FINAL DESIGNS

Imagine if LEGO implemented this for every one of their blocks!

 

Mixed reality seems to be the natural transition step before we are all in the metaverse.

Interaction Diagram

The side table is where the view sensor can be placed to trigger a bottom-up view of the immersion. The person can then take the block and move it into the trigger zones to experience the related immersions. When the view sensor is brought closer to the table, it triggers a vantage point that stays on top of the sensor, enabling a micro view of the immersion environment.
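The proximity rule above can be sketched as a distance check between the view sensor and the table. The threshold value and the `vantage_mode` function are hypothetical stand-ins for the project's actual tuning.

```python
import math

# Made-up threshold: how close the view sensor must be to the table
# before the micro-view vantage point attaches to it.
NEAR_THRESHOLD = 0.5  # metres

def vantage_mode(sensor_pos, table_pos):
    """Return 'micro' when the sensor is near the table, else 'bottom'
    for the bottom-up view from the side table."""
    distance = math.dist(sensor_pos, table_pos)
    return "micro" if distance < NEAR_THRESHOLD else "bottom"

print(vantage_mode((0.2, 0.0, 0.1), (0.0, 0.0, 0.0)))  # near -> micro
print(vantage_mode((1.5, 0.0, 0.0), (0.0, 0.0, 0.0)))  # far  -> bottom
```

In practice the engine would re-evaluate this each frame from the tracked sensor position and smoothly swap the active camera rig.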


Presentation Setup and the worlds

There are three worlds: the Purple Alien World, the Red Meta Train World, and the Grey Mountain Peace World. When a block is placed in a zone, it triggers the respective environment.
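The zone-to-world relationship is essentially a lookup table. The zone identifiers and the `on_block_placed` handler below are hypothetical; only the three world names come from the project.

```python
# Sketch of the zone-to-world mapping; zone keys are illustrative.
WORLDS = {
    "zone_1": "Purple Alien World",
    "zone_2": "Red Meta Train World",
    "zone_3": "Grey Mountain Peace World",
}

def on_block_placed(zone):
    """Return the environment to load when a block lands in a zone."""
    world = WORLDS.get(zone)
    if world is not None:
        return f"Loading {world}"
    return "No environment for this zone"

print(on_block_placed("zone_2"))  # -> Loading Red Meta Train World
```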


DEMO AND FINAL PRESENTATION

Video 1: Immersion experience while wearing the headset.

Video 2: Immersion experience with headsets, shown alongside the Unity engine view.

Video 3: The demonstration of the project. 
