
When Worlds Collide is an interactive installation that spans two interconnected realms:
the physical world, accessed through a custom circuit and an Arduino UNO microcontroller mounted beneath a table, and the virtual world, built in Unreal Engine and running in real time. The transition between these worlds is enabled by sending data from the physical interface into the virtual environment via a serial communication plugin.
The piece offers a gamified experience in which the audience interacts with capacitive touch sensors, proximity sensors, a stretch sensor, and a grip sensor—bringing back the sense of physicality often missing in immersive media accessed through HMDs (Head-Mounted Displays). As Slater highlights in his research on Plausibility and Place Illusion, interaction in VR is often not considered “valid” because users touch virtual objects but feel no tactile feedback. This installation reintroduces physical engagement to address that gap.
In the virtual world, characters inhabit a theatre-like scene. They begin in an idle, subtle breathing state and remain so until user input is detected from the sensors on the table. Once activated, the characters move according to the game logic: Unreal Engine retrieves data from Arduino, checks predefined conditions, and calls the corresponding animation blueprints—specifically, the relevant states within each character’s animation state machine.
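The actual logic lives in Blueprint graphs (shown in the making-of video); as a rough illustration of the condition-checking step, it reduces to something like the following, where the character and sensor names are assumptions for the sketch:

```cpp
// Illustrative only: per-tick state selection from the latest sensor snapshot.
// The installation implements this as Blueprint nodes, not C++.
enum class ECharacterState { Idle, Walking, Jumping };

struct FSensorSnapshot {
    bool bToggleCushionOn = false;  // silver cushion (toggle)
    bool bOvalCushionTap  = false;  // oval cushion (quick touch)
};

ECharacterState SelectState(const FSensorSnapshot& Sensors)
{
    if (Sensors.bOvalCushionTap)  return ECharacterState::Jumping;  // offset jump
    if (Sensors.bToggleCushionOn) return ECharacterState::Walking;  // leave idle
    return ECharacterState::Idle;                                   // subtle breathing loop
}
```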
The final work is not a rendered video but a packaged Unreal Engine application for Windows. It requires a live Arduino connection using the exact COM port and baud rate specified in the Unreal plugin blueprint. These settings must match the Arduino code uploaded to the microcontroller and the wiring of the custom circuit, both of which are explained in the making-of video.
This is a data-driven, real-time application, where character movement responds directly to user interaction with physical sensors. While the animation states are predefined, Unreal’s blend spaces allow for real-time interpolation based on incoming data. One example—shown on the poster—is a toggle sensor that makes two central characters begin walking. When the toggle remains on, a neighbouring proximity sensor controls the interpolation between walking in place, walking forward, running, and finally speed-running. As the user’s hand gradually approaches the cushion, the character accelerates until full contact triggers a “Naruto-like” sprint.
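As a sketch of that interpolation, assuming the proximity sensor reports a raw 0–1023 value and the blend space runs along a single speed axis (both values are placeholders, not the project's calibrated numbers), the mapping amounts to a clamped normalisation:

```cpp
// Maps a raw proximity reading to a blend space axis value. The raw range
// and the 0-600 speed axis are placeholder assumptions for illustration.
float MapProximityToSpeedAxis(int RawReading)
{
    float T = static_cast<float>(RawReading) / 1023.0f;  // normalise to 0..1
    if (T < 0.0f) T = 0.0f;
    if (T > 1.0f) T = 1.0f;

    const float MinSpeed = 0.0f;    // walking in place
    const float MaxSpeed = 600.0f;  // "Naruto-like" speed-run
    return MinSpeed + T * (MaxSpeed - MinSpeed);
}
```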
To achieve this, each sensor had to be calibrated. I measured their output ranges, mapped those ranges in Arduino, cleaned the data, sent it to Unreal, stored it globally, and then retrieved it locally within the blueprints.
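A minimal Arduino-side sketch of that calibration step might look like the following; the pin, the measured range, the smoothing window, and the baud rate are all placeholders standing in for the values found during calibration:

```cpp
// Minimal calibration sketch for one analog sensor: smooth, map, then send.
const int SENSOR_PIN = A0;
const int RAW_MIN = 120;   // measured resting value (placeholder)
const int RAW_MAX = 860;   // measured maximum value (placeholder)
const int WINDOW  = 8;     // simple moving-average window

int samples[WINDOW];
int idx = 0;

void setup() {
  Serial.begin(9600);      // must match the baud rate set in the Unreal plugin
  for (int i = 0; i < WINDOW; i++) samples[i] = analogRead(SENSOR_PIN);
}

void loop() {
  samples[idx] = analogRead(SENSOR_PIN);
  idx = (idx + 1) % WINDOW;

  long sum = 0;
  for (int i = 0; i < WINDOW; i++) sum += samples[i];
  int smoothed = sum / WINDOW;

  // Map the measured range to a clean 0-100 value before sending it to Unreal.
  int mapped = constrain(map(smoothed, RAW_MIN, RAW_MAX, 0, 100), 0, 100);
  Serial.println(mapped);
  delay(20);
}
```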
In this sense, the installation functions as data visualisation through digital body motion.
The project explores the role of animation in user experience, focusing on the visual feedback loop generated by user actions. Each action triggers a corresponding animated reaction, reinforcing the user’s sense of agency.
The interaction design is grounded in the motility of the user’s physical body, which is fundamental in VR and essential for enhancing avatar fidelity.
Ultimately, this work proposes a novel interaction model for the metaverse era—one that moves beyond keyboards and mice and instead relies on the user’s body, while still maintaining physical contact with an interface, either directly (touch) or indirectly (proximity).
FMP Artefact
Making Of/ Showreel
Exhibition

High-fidelity concept design for the exhibition.

Wireframe for the setup.

The end-goal delivery.

The work offers a gamified experience in which the audience engages with a bespoke set of tactile sensors arranged on a table. Each sensor is mapped to a specific character or behaviour within the virtual scene. The interface includes:
- Silver cushion — Constructed from conductive fabric and foam. Programmed as a toggle (on/off). When activated, characters transition from idle to walking; when deactivated, they return to idle.
- Copper cushion — Conductive fabric with foam filling. Functions as a proximity sensor. When the silver cushion is on, this cushion modulates movement by interpolating between animation states based on hand distance. Full contact triggers the extreme value, causing a speed-run effect.
- Oval cushion — Conductive fabric and foam. A quick-touch sensor that triggers an Animation Offset in Unreal, forcing the character to perform a jump over the base animation.
- Apple under a glass dome — A real apple connected to the circuit. Serves as a proximity-based trigger. When the glass dome is touched, the apple’s signal causes a character to flip after bouncing off the screen edge.
- Touch Pad 1 — Made from conductive fabric and thread on a non-conductive base. A touch-and-hold sensor that makes the character dance for the duration of contact.
- Touch Pad 2 — Conductive thread stitched onto a non-conductive surface. Similar to Touch Pad 1, but interpolates from a static pose to an animation through an Unreal blend space blueprint, and also drives a light effect: the character becomes illuminated or fades out as movement intensifies and slows.
- Stretch sensor — A resistive rubber sensor that detects changes in resistance. Each stretch event triggers a jump-and-hang animation.
- Grip sensor — Built with an LDR inside a grip-exercise tool. When fully closed and held for several seconds, it triggers a jump animation for both characters. Unlike the oval cushion’s offset jump, this version becomes a looping base animation (see the sketch after this list).
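As a sketch of the grip sensor’s hold condition, assuming the LDR sits on an analog pin and using placeholder threshold and timing values, the “closed and held” check could look like this:

```cpp
// Grip sensor hold check: the jump only fires after the grip has stayed
// fully closed for a couple of seconds. Pin, threshold and hold time are
// assumptions, not the installation's calibrated values.
const int LDR_PIN = A1;
const int CLOSED_THRESHOLD = 150;      // darker reading = grip fully closed
const unsigned long HOLD_MS = 2000;

unsigned long closedSince = 0;
bool jumpSent = false;

void setup() {
  Serial.begin(9600);
}

void loop() {
  int reading = analogRead(LDR_PIN);
  bool closed = reading < CLOSED_THRESHOLD;

  if (closed) {
    if (closedSince == 0) closedSince = millis();
    if (!jumpSent && millis() - closedSince >= HOLD_MS) {
      Serial.println("GRIP_JUMP");     // Unreal switches to the looping jump
      jumpSent = true;
    }
  } else {
    closedSince = 0;
    jumpSent = false;
  }
  delay(20);
}
```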
Project management and delivery of the final piece


For the project management of the final setup—including soldering, wiring, consultations, getting the project approved for health and safety, laser cutting, fabrication tutorials, ongoing thesis work, exhibition team deadlines, printing, and more—I used a Miro board to organize and navigate the workload.

Setting up the UI, wiring the Arduino.
Reflection
My motivation for this project began during the course, when I was introduced to the concept of virtual twin technology. I had also worked with this idea during my summer internship, and I became fascinated by the transition between the physical and virtual worlds.
The project started as an exploration of that connection, creating a virtual twin of a simple LED light as a proof of concept. I had previously connected Arduino with Unity in my third term, but this time I wanted to use Unreal Engine, both for its cinematic capabilities and to build on the skills gained during the course. This LED experiment became the starting point that pushed the whole project forward.
Although similar projects existed, most focused on audio or visual effects rather than real-time animation from Arduino input. This led me to develop an application that enables real-time exploration of a virtual twin.
My project was also shaped by my thesis research, which focused on immersive media and the sense of presence. I initially planned for the piece to work in VR, but technical limitations—mainly not having a strong enough GPU—made that impossible at the time.
Two key research sources influenced my direction:
- Matthew Ball’s work on the metaverse, which argues that traditional interfaces like the mouse and keyboard are outdated, and that future interactions will rely more on full-body engagement. This encouraged me to bring physical movement into the experience.
- Mel Slater’s research on Plausibility and Place Illusion, which explains how immersion breaks when virtual actions don’t produce physical sensations. That insight pushed me to design a more physical interface, leading me to experiment with capacitive touch and proximity sensors to reintroduce real, tactile interaction into the digital space.
A major part of the project was designing a full technical pipeline:
- capturing sensory information from the physical space
- processing and cleaning that data
- sending it to the virtual world (see the framing sketch after this list)
- triggering meaningful visual feedback based on user actions
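The exact message format used by the serial plugin isn’t documented here, but one plausible framing of the “send” step is to pack all the cleaned readings into a single comma-separated line per frame, which the Unreal side can then split back into named values (the function and parameter names below are assumptions):

```cpp
// Assumed framing: one CSV line per frame, terminated by a newline, so the
// receiving side can split on commas. Call from loop() after Serial.begin().
void sendFrame(int toggleState, int proximity, int stretch, int grip) {
  Serial.print(toggleState); Serial.print(',');
  Serial.print(proximity);   Serial.print(',');
  Serial.print(stretch);     Serial.print(',');
  Serial.println(grip);      // println ends the frame with a newline
}
```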
Animation became a central part of the user experience—it acts as a feedback system, showing users what action they’ve triggered and reinforcing their sense of agency. I didn’t want this feedback to be too literal or obvious, so I used digital body movement as an abstract form of data visualisation. To make this work, I essentially had to “gamify” the experience inside Unreal Engine, creating rules and conditions that connect physical input to animated responses.
By gamifying, I mean implementing rules and conditions—so if a certain sensor is activated, a specific animation or visual effect happens. I also created dependencies, for example: one sensor must be activated before another one becomes interactive. This adds a learning curve and makes the experience more exploratory for the audience.
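A stripped-down sketch of such a dependency, with names assumed for illustration: the proximity cushion contributes nothing until the toggle cushion has been switched on.

```cpp
// Illustrative gating rule: sensor B (copper cushion) is ignored until
// sensor A (silver cushion toggle) has been activated.
struct FInteractionState {
    bool  bToggleOn = false;   // silver cushion
    float Proximity = 0.0f;    // copper cushion, already normalised to 0..1
};

float EffectiveSpeedAxis(const FInteractionState& State)
{
    if (!State.bToggleOn)
        return 0.0f;           // characters stay in their current base state
    return State.Proximity;    // otherwise the proximity value drives the blend
}
```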
Parallel to the conceptual work, a huge amount of technical development was happening.
This included:
- building and rebuilding the circuit multiple times
- testing different ways of creating capacitive sensors
- eventually switching to the Adafruit MPR121 for better precision and lower latency (a minimal reading sketch follows this list)
- constantly recalibrating sensors due to environmental noise
- designing and sewing custom conductive sensors
- soldering and securing all wiring safely
- prototyping the exhibition table, drilling holes, measuring cable lengths, and making sure everything was hidden for both aesthetic and experiential reasons
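For the MPR121 step mentioned above, a minimal reading sketch using Adafruit’s library looks roughly like this; the I2C address is the board’s default, and printing touch/release events over serial is an assumption about how the data is forwarded:

```cpp
#include <Wire.h>
#include <Adafruit_MPR121.h>

Adafruit_MPR121 cap = Adafruit_MPR121();
uint16_t lastTouched = 0;

void setup() {
  Serial.begin(9600);
  if (!cap.begin(0x5A)) {                  // 0x5A is the default I2C address
    Serial.println("MPR121 not found");
    while (1);
  }
}

void loop() {
  uint16_t currTouched = cap.touched();    // 12-bit mask, one bit per electrode
  for (uint8_t i = 0; i < 12; i++) {
    bool isTouched  = currTouched & (1 << i);
    bool wasTouched = lastTouched & (1 << i);
    if (isTouched && !wasTouched) {
      Serial.print("T"); Serial.println(i);  // touch began on electrode i
    }
    if (!isTouched && wasTouched) {
      Serial.print("R"); Serial.println(i);  // touch released
    }
  }
  lastTouched = currTouched;
  delay(20);
}
```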
The physical layout required careful planning because proximity sensors are extremely sensitive. They needed to be spaced correctly so they wouldn’t interfere with each other, and the wiring also had to be placed to avoid accidental noise or cross-triggering.
On the Unreal side, I worked on creating the entire virtual world populated with characters: some based on Unreal’s Manny, fully optimised, and others generated with MeshyAI from my own concept art sketches, auto-rigged with Mixamo, and given repainted skin weights in Maya. I made extensive use of the campus motion capture lab, using both the Vicon and Rokoko systems, then retargeted and cleaned the data in Maya and Cascadeur respectively. This gave me a deep understanding of how different mocap systems behave, what types of noise they produce, and how much manual cleaning is needed.
Inside Unreal Engine, much of my time went into building animation blueprints, state machines, and blend spaces. I learned how to design the logic behind transitions—how the system moves from one animation to another based on the incoming data from the physical world. This was a huge technical learning curve but also essential for creating smooth and immediate feedback, which is crucial for immersion.
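The animation logic itself lives in Blueprint state machines, so the code below is only a rough C++ equivalent of the idea: an anim instance exposes variables filled from the latest sensor data, and the state machine’s transition rules and blend space read them each tick (the class and property names are assumptions):

```cpp
// Rough C++ equivalent of the Blueprint anim logic (illustrative only).
#include "CoreMinimal.h"
#include "Animation/AnimInstance.h"
#include "SensorAnimInstance.generated.h"

UCLASS()
class USensorAnimInstance : public UAnimInstance
{
    GENERATED_BODY()

public:
    // Read by the state machine's transition rules (idle <-> walking, etc.).
    UPROPERTY(BlueprintReadOnly, Category = "Sensors")
    bool bWalkToggleOn = false;

    // Read by the locomotion blend space axis (walk in place -> speed-run).
    UPROPERTY(BlueprintReadOnly, Category = "Sensors")
    float ProximityAxis = 0.0f;

    virtual void NativeUpdateAnimation(float DeltaSeconds) override
    {
        Super::NativeUpdateAnimation(DeltaSeconds);
        // Here the values would be pulled from wherever the serial plugin
        // stores the latest Arduino readings (e.g. a globally stored struct).
    }
};
```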
I rewrote the Unreal code at least five times to reduce latency and ensure the system responded almost instantly. Without that, the illusion and the whole concept of physical-virtual connection would fall apart.
So overall, the project operated on multiple layers at once:
- exploring interfaces for the metaverse era
- integrating body movement and physical interaction
- building a robust hardware system
- designing a virtual world
- creating animations that serve as abstract feedback
- and constructing a new, playful interaction experience that the audience can explore