
Mixed Reality Game

I’ve been experimenting with the idea of creating a mixed reality game—something I first got excited about during a group project last term. That earlier experience got me thinking about how physical and virtual systems can be connected, but now I want to take it further and use this project to explore what mixed reality can mean as a creative and interactive space.

Mixed reality is often understood as bridging the physical and virtual, where virtual objects are rendered on top of the real world, touching on the idea of augmentation. What interests me more, though, is the reverse: experiences and interactions happening within the virtual world, inside the headset, can result in physical outcomes. I’m really fascinated by that transition between the two worlds, especially how user actions in VR can influence or change something in the real world.

I’ve started with some reading to ground this concept. One book that’s really influenced my thinking so far is Reality+ by David Chalmers. He argues that virtual realities aren’t fake or “less real” than the physical world. Virtual objects and experiences—if they’re consistent, immersive, and meaningful to the user—can be just as “real” as anything in the physical world. That idea stuck with me, especially when thinking about digital objects as things that have structure, effects, and presence, even if they’re built from code instead of atoms. What I’m interested in now is creating a system where virtual actions have real-world consequences. So instead of just being immersed in a virtual world, the player’s success or failure can actually change something in the physical world. That’s where I started thinking of repurposing my Venus flytrap installation—something physical and mechanical, driven by what happens in the VR game.

The game idea is still forming, but here’s the general concept: the player is in a virtual world where they need to defeat a group of Venus flytrap-like creatures. If they succeed, a real-life Venus flytrap sculpture (which I’m building using cardboard, 3D-printed cogs, and a DC motor) will physically open up in response.
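To make that loop concrete, here’s a first, minimal sketch of the sculpture side. It assumes the game sends a single character over USB serial (‘O’ to open, ‘C’ to close) and that the DC motor is driven through an H-bridge on pins 5 and 6—all placeholder choices I’ll revisit once the build is further along.

```cpp
// Minimal Arduino sketch for the flytrap sculpture (a sketch of the idea,
// not the final build). Assumes the VR game sends 'O' (open) or 'C' (close)
// over USB serial, and the DC motor is driven through an H-bridge
// (e.g. an L298N) wired to pins 5 and 6.

const int MOTOR_FWD = 5;             // H-bridge input: open the trap
const int MOTOR_REV = 6;             // H-bridge input: close the trap
const unsigned long MOVE_MS = 1500;  // how long the motor runs per move

void setup() {
  pinMode(MOTOR_FWD, OUTPUT);
  pinMode(MOTOR_REV, OUTPUT);
  Serial.begin(9600);
}

void runMotor(int pin) {
  digitalWrite(pin, HIGH);
  delay(MOVE_MS);
  digitalWrite(pin, LOW);
}

void loop() {
  if (Serial.available() > 0) {
    char cmd = Serial.read();
    if (cmd == 'O') runMotor(MOTOR_FWD);  // player won: open the trap
    if (cmd == 'C') runMotor(MOTOR_REV);  // reset: close the trap
  }
}
```

On the engine side, triggering the sculpture then reduces to writing a single byte to the serial port when the player defeats the last creature.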




Inspirational Work

Duchamiana by Lillian Hess

One of the pieces that really inspired my thinking for this project is Duchamiana by Lillian Hess. I first saw it at the Digital Body Festival in London in November 2024. After that, I started following the artist’s work more closely on Instagram and noticed how the piece has evolved. It’s been taken to the next level — not just visually, but in terms of interactivity and curation.

Originally, I experienced it as a pre-animated camera sequence that moved through a virtual environment with characters appearing along the way. But in a more recent iteration, it’s been reimagined and enriched through the integration of a treadmill. The user now has to physically walk in order for the camera to move forward. I really love this translation, where physical movement in the real world directly affects how you navigate through the digital space.

Instead of relying on hand controllers or pre-scripted animations, the system tracks the user’s physical steps. That real-world data drives the experience. What stood out to me most was not just the interaction, but the way physical sound — like the noise of someone walking on the treadmill — becomes part of the audio-visual experience. It creates this amazing fusion: a layered, mixed-reality moment where real-world action affects both the sound and visuals of the virtual world.
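I don’t know how Hess actually implemented the step tracking, but as a rough sketch of the idea, a vibration sensor under the treadmill deck could report each footfall over serial, letting the engine advance the camera one step at a time (the pin and debounce values here are my own guesses):

```cpp
// A guess at how treadmill steps could become camera input (not the
// artist's actual implementation): a digital vibration/knock sensor under
// the treadmill deck, with each detected footfall reported over serial.
const int SENSOR_PIN = 2;      // assumed digital vibration sensor pin
unsigned long lastStep = 0;

void setup() {
  pinMode(SENSOR_PIN, INPUT);
  Serial.begin(9600);
}

void loop() {
  // Debounced footfall detection: each step nudges the virtual camera forward
  if (digitalRead(SENSOR_PIN) == HIGH && millis() - lastStep > 250) {
    lastStep = millis();
    Serial.println("STEP");
  }
}
```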

It’s made me think a lot about how we might go beyond the VR headset — how we can design for transitions between physical and virtual spaces in more embodied, sensory ways. It’s a fascinating direction, and this work really opened up new possibilities in how I’m thinking about my own project.





Mother Bear, Mother Hen by Dr. Victoria Bradbury



Mother Bear Mother Hen Trailer on Vimeo

One project that really caught my attention while researching mixed reality and physical computing in VR is Mother Bear, Mother Hen by Dr. Victoria Bradbury, an assistant professor of New Media at UNC Asheville. What I found especially compelling is how the piece bridges physical and virtual spaces through custom-built wearable interfaces — essentially, two bespoke jackets: one themed as a bear and the other as a chicken.

These wearables act as physical computing interfaces, relaying the user’s real-world movements to the game world via a Circuit Playground microcontroller. The gameplay itself is hosted in VR using an HTC Vive headset. In this setup, the user’s stomping motion — tracked through the microcontroller — controls the movement mechanics within the game, while the HTC Vive controllers are used for actions like picking up objects or making selections.

Both the bear and chicken jackets communicate via their microcontrollers: a stomping motion controls movement, and the costumes respond with sound and light.
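I haven’t seen Bradbury’s code, but a stomp detector on a Circuit Playground might look something like the sketch below — the threshold and debounce values are my own assumptions and would need tuning on the actual costume.

```cpp
// A rough sketch of accelerometer-based stomp detection on a Circuit
// Playground (my guess at the approach; not the artist's actual code).
#include <Adafruit_CircuitPlayground.h>

const float STOMP_THRESHOLD = 20.0;     // m/s^2 spike that counts as a stomp
const unsigned long DEBOUNCE_MS = 300;  // ignore ringing after a hit
unsigned long lastStomp = 0;

void setup() {
  CircuitPlayground.begin();
  Serial.begin(9600);
}

void loop() {
  // Total acceleration magnitude from the onboard accelerometer
  float x = CircuitPlayground.motionX();
  float y = CircuitPlayground.motionY();
  float z = CircuitPlayground.motionZ();
  float magnitude = sqrt(x * x + y * y + z * z);

  if (magnitude > STOMP_THRESHOLD && millis() - lastStomp > DEBOUNCE_MS) {
    lastStomp = millis();
    Serial.println("STOMP");  // the game reads this and steps the player forward
  }
}
```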

What makes this piece even more immersive is how the wearables themselves provide both audio and visual feedback. They incorporate built-in lighting and responsive sound outputs that reflect what’s happening in the game — a brilliant example of reactive media design.

The project was awarded an Epic Games grant, which shows the increasing recognition and support for experimental VR and new media works that fuse physical computing with virtual environments. I found it incredibly inspiring, especially in the context of exploring embodied interaction, wearable technology, and the creative possibilities of mixed reality. It’s a powerful reminder of how far VR can go when it intersects with tactile, real-world interfaces and artistic intention.


Unreal Plant by leia @leiamake

Unreal Plant

Another project that combines Unreal Engine with physical computing is Unreal Plant by leia @leiamake. It explores the idea of a virtual twin, where a real plant is mirrored in both the physical and digital realms. The setup includes light sensors that monitor the plant’s exposure to light in the real world. That data is then sent to the virtual world, where the digital twin of the plant responds in the same way—lighting up when the physical plant is exposed to light.
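The sensing half of a setup like this can be very simple. As a rough sketch (assuming an LDR voltage divider on pin A0, with Unreal reading the serial stream on the PC side, for example through a serial plugin):

```cpp
// A minimal sketch of the physical half of a plant "virtual twin".
// Assumptions: an LDR voltage divider on analog pin A0, and a PC-side
// process (e.g. Unreal) parsing one brightness reading per serial line.
const int LDR_PIN = A0;

void setup() {
  Serial.begin(9600);
}

void loop() {
  int light = analogRead(LDR_PIN);  // 0-1023, brighter = higher
  Serial.println(light);            // one reading per line for easy parsing
  delay(100);                       // ~10 samples per second is plenty
}
```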

This project is a great example of bridging the two worlds and creating an accurate mapping between them. It really embodies the concept of the virtual twin by translating real-world inputs into virtual actions. There’s also some additional interaction, like pressing a button to water the plant, though that part is more conventional and relies on standard inputs like a mouse or keyboard. Still, the core idea—mirroring real-world changes in the virtual space—is what stood out to me.

Virtual Reality and Robotics

A similar concept combining physical computing and robotics can be found in real-world applications, such as remote robotic control systems. In these systems, the operator works from a VR control room equipped with multiple sensors and, using Oculus controllers, interacts with virtual controls in a shared space, sometimes even in a simulated space environment.

Because Oculus devices support features like hand tracking and pose recognition, users can perform tasks such as gripping, picking up objects, and moving items around. What’s impressive about this setup is that the designers have mapped the human physical space into virtual space, and then further mapped that virtual space into the robotic environment. This creates a strong sense of co-location between the human operator and the robot’s actions.
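To make the idea of chained mappings concrete, here is a toy illustration (all numbers invented): a tracked hand position is first expressed in the shared virtual space, then re-scaled into the robot’s much smaller workspace.

```cpp
// A toy illustration of the two-stage mapping described above: a tracked
// hand position is mapped into the virtual room's coordinates, then
// re-mapped into the robot's workspace. The scale/offset values are made up.
#include <cstdio>

struct Vec3 { float x, y, z; };

// Stage 1: headset tracking space -> shared virtual space
Vec3 toVirtual(Vec3 p) {
  return { p.x + 1.0f, p.y, p.z + 2.0f };  // offset to the virtual workstation
}

// Stage 2: virtual space -> robot workspace (an arm reaching ~0.8 m)
Vec3 toRobot(Vec3 v) {
  const float scale = 0.4f;  // shrink human reach down to the arm's reach
  return { v.x * scale, v.y * scale, v.z * scale };
}

int main() {
  Vec3 hand = { 0.3f, 1.2f, -0.5f };  // controller position in metres
  Vec3 target = toRobot(toVirtual(hand));
  std::printf("robot target: %.2f %.2f %.2f\n", target.x, target.y, target.z);
}
```

It is this kind of consistent, chained transform that produces the sense of co-location between operator and robot.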

This setup is a great example of a virtual twin, where there’s a precise and responsive mapping between virtual and physical environments. It also points towards the emerging concept of the industrial metaverse, particularly in training robots and AI systems.

A similar example is the early development of Tesla’s humanoid robots. While these robots are designed to become autonomous, they can also be teleoperated by humans through virtual reality systems, often with the aim of supporting tasks like healthcare delivery or improving social connection.

Another real-world example uses VR and robotics to restock shop shelves remotely.

This ties into the idea of the virtual twin—replicating what happens in the virtual world and mapping it onto physical robots. It’s essentially about translating human actions in a digital environment into responses in the physical world, which is a key aspect of bridging virtual and real spaces.

Industrial metaverse.
