
Project Title: Venus Flytrap – Mixed Reality Game Experience
Platform: Meta Quest 3
Software: Unity (with Meta XR SDK), Arduino (for physical computing integration)
Overview & Context
This project was developed in response to a design brief that asked us to create an engaging experience using animation and storytelling, with a focus on user interaction and immersive feedback. The brief encouraged experimentation with multiverse concepts — blending physical and virtual worlds — to craft a compelling, thought-provoking experience.
I chose to approach this by designing a mixed reality (MR) game experience for Meta Quest 3 using Unity and physical computing via Arduino. The main focus was on user agency, visual feedback, and the integration of real-world components that react dynamically to virtual actions.
Concept & Gameplay Summary
The game centres around a Venus Flytrap theme and is designed as a seated, single-player experience. The player is surrounded by “flytrap monsters” and must shoot them to rescue a trapped fly, physically represented in the real world via a mechanical flytrap installation.
Core Mechanics:
- Shoot flytrap monsters (each requiring 3 hits to be defeated)
- Defeat all enemies to trigger a real-world reward: opening the physical flytrap and “releasing” the fly
- Fail condition: If monsters reach the player, health depletes, and the game ends
This concept questions the idea that “what happens in VR stays in VR” by making real-world elements react to in-game choices.
Animation & Visual Feedback
A central focus of the project was the role of animation in creating strong audiovisual feedback and reinforcing player agency:
- State-based animations: Each monster reacts visually to bullet hits via Unity’s Animator Controller, with three animation states reflecting its remaining health (see the sketch after this list).
- Colour feedback: Monsters shift from vibrant green to desaturated red as they take damage.
- Stop-motion-inspired effects: When a monster dies, animated flies burst from its abdomen as a stylised visual reward.
- Audio cues: Sound effects complement the visual states and further reinforce player actions.
This dynamic feedback enhances immersion, turning simple interactions into meaningful, sensory-rich experiences.
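To make the state-driven feedback concrete, here is a minimal sketch of how a per-monster script could tie the hit count to the Animator, material colour, and audio. This is an illustrative reconstruction, not the project’s actual code: all field names, the `HitsRemaining` Animator parameter, and the colour values are assumptions.

```csharp
using UnityEngine;

// Hedged sketch of the per-monster hit feedback described above.
// All identifiers here are illustrative assumptions.
public class MonsterHealth : MonoBehaviour
{
    [SerializeField] private int hitsToKill = 3;
    [SerializeField] private Animator animator;       // drives the three damage states
    [SerializeField] private Renderer bodyRenderer;   // tinted green -> red
    [SerializeField] private AudioSource hitAudio;    // per-hit audio cue
    [SerializeField] private ParticleSystem flyBurst; // flies released on death

    private int hitsTaken;

    public void RegisterHit()
    {
        hitsTaken++;

        // The Animator Controller transitions between states based on this integer.
        animator.SetInteger("HitsRemaining", hitsToKill - hitsTaken);

        // Lerp the body colour from vibrant green toward desaturated red.
        float t = (float)hitsTaken / hitsToKill;
        bodyRenderer.material.color =
            Color.Lerp(Color.green, new Color(0.7f, 0.2f, 0.2f), t);

        hitAudio.Play();

        if (hitsTaken >= hitsToKill)
            Die();
    }

    private void Die()
    {
        flyBurst.Play();           // flies burst from the abdomen
        Destroy(gameObject, 1.5f); // leave time for the burst to finish
    }
}
```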
Technical Implementation
- Unity + Meta XR SDK: Developed the core game logic and animation system within Unity using the Meta SDK for mixed reality features like passthrough, hand/controller input, and spatial anchoring.
- Arduino Integration: Connected the physical flytrap installation to Unity over serial communication, so that in-game actions (e.g. defeating all monsters) trigger real-world servo movements (serial bridge sketched below).
- Pathfinding with NavMesh Agents: Monsters move toward the player using Unity’s NavMesh system, avoiding terrain obstacles; reaching the player’s collider depletes health (movement sketched below).
- Projectile Logic: Shooting was implemented via C# scripts that instantiate projectile prefabs with force and collision detection. Hits are registered only on monsters’ abdominal areas (shooting and hit detection sketched below).
- Game State Management: Score tracking, monster kill count, and health depletion are all managed via a custom script, providing a clear player progression system (sketched below).
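On the Unity side, the serial bridge can be sketched with .NET’s System.IO.Ports (which requires the project’s Api Compatibility Level to be set to .NET Framework). The port name, baud rate, and one-byte protocol below are assumptions for illustration, and the sketch presumes the headset is tethered to a machine that owns the serial port, in line with the hardware note under Design Decisions.

```csharp
using System.IO.Ports;
using UnityEngine;

// Hypothetical Unity-side serial bridge. The port name, baud rate, and the
// single-byte protocol ('O' = open the flytrap) are illustrative assumptions.
public class FlytrapSerialBridge : MonoBehaviour
{
    [SerializeField] private string portName = "COM3"; // e.g. "/dev/ttyUSB0" on Linux/macOS
    [SerializeField] private int baudRate = 9600;

    private SerialPort port;

    private void Awake()
    {
        port = new SerialPort(portName, baudRate);
        port.Open();
    }

    // Called when the last monster is defeated.
    public void OpenFlytrap()
    {
        if (port != null && port.IsOpen)
            port.Write("O"); // the Arduino listens for this byte and drives the servo
    }

    private void OnDestroy()
    {
        if (port != null && port.IsOpen)
            port.Close();
    }
}
```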
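Enemy movement reduces to a small NavMeshAgent behaviour, assuming a baked NavMesh and a trigger collider for detecting the player; the “Player” tag and the damage message below are hypothetical.

```csharp
using UnityEngine;
using UnityEngine.AI;

// Minimal chase behaviour, assuming a baked NavMesh and a NavMeshAgent on
// each monster. Tag and message names are illustrative assumptions.
[RequireComponent(typeof(NavMeshAgent))]
public class MonsterChase : MonoBehaviour
{
    private NavMeshAgent agent;
    private Transform player;

    private void Start()
    {
        agent = GetComponent<NavMeshAgent>();
        player = GameObject.FindGameObjectWithTag("Player").transform;
    }

    private void Update()
    {
        // Continuously re-path toward the seated player's position.
        agent.SetDestination(player.position);
    }

    private void OnTriggerEnter(Collider other)
    {
        // Reaching the player depletes health (the fail condition).
        if (other.CompareTag("Player"))
            other.SendMessage("TakeDamage", 1, SendMessageOptions.DontRequireReceiver);
    }
}
```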
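Shooting and the abdomen-only hit rule could look like the following sketch. OVRInput is the Meta XR SDK’s controller input API; the bullet prefab, muzzle transform, “Abdomen” tag, and force value are assumptions, and the Bullet class reuses the hypothetical MonsterHealth sketch from the animation section above.

```csharp
using UnityEngine;

// Illustrative shooting logic: spawns a bullet prefab at the controller's
// muzzle and fires it forward on the right index trigger.
public class BulletShooter : MonoBehaviour
{
    [SerializeField] private Rigidbody bulletPrefab;
    [SerializeField] private Transform muzzle;  // tip of the controller model
    [SerializeField] private float force = 12f; // placeholder value

    private void Update()
    {
        if (OVRInput.GetDown(OVRInput.Button.SecondaryIndexTrigger))
        {
            Rigidbody bullet = Instantiate(bulletPrefab, muzzle.position, muzzle.rotation);
            bullet.AddForce(muzzle.forward * force, ForceMode.Impulse);
        }
    }
}

// Bullet-side collision check: only a collider tagged "Abdomen" counts as a hit.
public class Bullet : MonoBehaviour
{
    private void OnCollisionEnter(Collision collision)
    {
        if (collision.collider.CompareTag("Abdomen"))
        {
            var health = collision.collider.GetComponentInParent<MonsterHealth>();
            if (health != null)
                health.RegisterHit();
        }

        Destroy(gameObject);
    }
}
```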
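Finally, a hedged sketch of the central game-state script, wiring the kill count to the win condition that opens the physical flytrap. The enemy count, scoring values, and method names are placeholder assumptions.

```csharp
using UnityEngine;

// Sketch of a central game-state script: kill count, score, player health,
// and the win trigger that releases the real-world fly.
public class GameManager : MonoBehaviour
{
    [SerializeField] private int totalMonsters = 6;       // assumed enemy count
    [SerializeField] private int playerHealth = 3;
    [SerializeField] private FlytrapSerialBridge flytrap; // from the serial sketch above

    private int kills;
    private int score;

    public void OnMonsterKilled()
    {
        kills++;
        score += 100; // illustrative scoring

        // Win condition: all monsters defeated -> open the physical flytrap.
        if (kills >= totalMonsters && flytrap != null)
            flytrap.OpenFlytrap();
    }

    public void OnPlayerReached()
    {
        // Fail condition: a monster reached the player.
        playerHealth--;
        if (playerHealth <= 0)
            Debug.Log("Game over"); // end-of-game handling would go here
    }
}
```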
Design Decisions & Constraints
- Asset Use: Monsters were rigged with Mixamo animations, using a Creative Commons-licensed model sourced from Unreal Engine assets, which allowed me to focus on interaction design and technical implementation rather than custom 3D modelling.
- Animator Learning Curve: A major portion of the project involved learning Unity’s animation pipeline, particularly in creating clean transitions and state-based logic driven by gameplay.
- Hardware Limitations: Due to Meta Quest 3 restrictions, the final APK shared for submission runs without Arduino hardware. The version with physical feedback requires an external microcontroller setup, which is documented in the accompanying video.
Physical & Virtual Worlds Integration
This project explores the interplay between digital interaction and real-world feedback, where actions in the virtual world have tangible consequences. It challenges typical game environments by asking: What if virtual success or failure could reshape physical reality — even subtly?
By positioning the player at the centre of a game-board-like world and creating a reactive environment both inside and outside the headset, I aimed to explore storytelling through presence, consequence, and multisensory feedback.
Key Learnings
- Deepened understanding of Unity’s animation system (Animator Controller, blend trees, transitions)
- Gained practical experience with physical computing and VR device integration
- Improved technical problem-solving through scripting gameplay logic and Arduino communication
- Recognised the power of animation fidelity in conveying emotion, feedback, and agency
Submission Info
- Final APK (without Arduino requirement) included
- Showcase video with documentation of the Arduino setup + physical installation
- Additional documentation includes technical breakdown, flowcharts, and design process