Categories: Final Major Project

Arduino & Unreal 5


To connect the microcontroller with the game engine, a serial communication object is created with a baud rate that matches the value set in the plugin's Blueprint. On the Arduino, data is read from the analogue input with analogRead() and sent to Unreal Engine via Serial.println(). In Unreal, once the connection to the serial port succeeds, the incoming data is read with the Serial Read Line function.


To prototype and test the technical implementation, I connected an LED controlled by a potentiometer to an Arduino, which sends the mapped potentiometer data to Unreal Engine via serial communication.


Arduino LED code:

The LED is connected to pin 3.
The potentiometer input is on A0; an analogue read of the A0 pin is stored in the sensorValue variable.
The mapped value is sent from the Arduino over serial (see the sketch below).
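A minimal version of that sketch might look like this; it is a sketch of the idea rather than the exact code running on my board, with the pin numbers and the 0–255 output range taken from the description above:

// Minimal sketch: potentiometer -> LED brightness -> serial output.
// Assumes the LED is on PWM pin 3 and the potentiometer wiper is on A0.
const int ledPin = 3;
const int potPin = A0;

void setup() {
  pinMode(ledPin, OUTPUT);
  Serial.begin(9600);                                   // must match the baud rate set in the plugin Blueprint
}

void loop() {
  int sensorValue = analogRead(potPin);                 // raw reading, 0-1023
  int mappedValue = map(sensorValue, 0, 1023, 0, 255);  // remap to 0-255
  analogWrite(ledPin, mappedValue);                     // drive the LED brightness
  Serial.println(mappedValue);                          // one value per line for Unreal
  delay(50);                                            // throttle the serial stream
}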

Serial.println(), from the Arduino reference:

“Prints data to the serial port as human-readable ASCII text followed by a carriage return character (ASCII 13, or ‘\r’) and a newline character (ASCII 10, or ‘\n’). This command takes the same forms as Serial.print().”





Reading data from multiple sensors

Screenshot from: Arduino C# Serial Communication Connected to Multiple Sensors
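When several sensors share one serial connection, a common approach is to print all the readings on a single line, separated by commas, and split the string on the receiving side. A rough sketch of the sending side, assuming two hypothetical analogue inputs on A0 and A1:

// Sends two readings per line as "value0,value1" so the receiver
// can split the string on the comma.
void setup() {
  Serial.begin(9600);
}

void loop() {
  int sensor0 = analogRead(A0);
  int sensor1 = analogRead(A1);
  Serial.print(sensor0);
  Serial.print(",");
  Serial.println(sensor1);
  delay(50);
}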

Categories: Final Major Project

Digital Twin of the LED Light. Referencing in UE5

LED on breadboard IRL

DIGITAL TWIN OF LED in UE5




Referencing allows communication between Blueprints: the sender and the receiver.


To enable communication between two Blueprints in Unreal Engine — for example, a sender (like a serial communication Blueprint) and a receiver (such as a Spotlight Blueprint) — I followed the steps below:


  1. I saved both Blueprints in the same directory to ensure that one could easily reference the other.
  2. I placed both Blueprints in the World View by dragging the sender and receiver Blueprints into the level so they existed as instances in the world.
  3. Next, I created a reference variable in the Sender Blueprint (the one responsible for sending data).
    In the Variables panel, I clicked the + icon to create a new variable and changed its type to the class of my Receiver Blueprint (for example, BP_Spotlight).
    I named this variable ReceiverRef to keep it meaningful.
  4. After that, I made the variable editable.
    With the variable selected, I went to the Details panel and checked Instance Editable (the small eye icon appeared next to the variable name).
    This allowed me to assign a reference to the variable directly from the Level Editor.
  5. In the level, I selected the Sender Blueprint instance and, in the Details panel, found my editable variable under the Default section.
    I then used the dropdown menu to assign the Receiver Blueprint instance (for example, the Spotlight Blueprint) that I had placed in the world.
  6. Finally, inside the Sender Blueprint, I dragged the ReceiverRef variable into the Event Graph and used it to call functions, set variables, and trigger events in the Receiver Blueprint (a rough C++ equivalent of this reference setup is sketched below).
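For reference, the same Instance Editable variable would look roughly like this in Unreal C++; the class and property names here are illustrative, not the actual Blueprints in my project:

// Rough C++ equivalent of the Sender Blueprint's reference variable.
#pragma once

#include "CoreMinimal.h"
#include "GameFramework/Actor.h"
#include "SenderActor.generated.h"

UCLASS()
class ASenderActor : public AActor
{
    GENERATED_BODY()

public:
    // Instance Editable: exposed on each placed instance so the receiving
    // actor (e.g. the spotlight) can be assigned in the Details panel.
    UPROPERTY(EditInstanceOnly, Category = "References")
    AActor* ReceiverRef = nullptr;
};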

Resources:

Potentiometer hookup: How to Use a Potentiometer with Arduino analogRead (Lesson #7)
Serial communication Arduino: Arduino to Unreal Engine Serial Communication


Summary:

So basically, I have created two Blueprints. The first one handles serial port communication; it is based on the serial communication plugin I sourced from GitHub, which I then modified. I created a custom event that reads the data coming from the Arduino as strings, converts those strings into floats, and then maps the range of values (0–255 on the Arduino side) to a new range of 0–100,000 in Unreal Engine. This mapped value represents the light intensity inside Unreal.

Using that data, I update the light’s intensity in real time, making the digital twin of a red LED in Unreal Engine. I managed to get the two blueprints communicating with each other by referencing the red light blueprint inside the serial communication blueprint.
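Expressed as code rather than Blueprint nodes, that custom event does roughly the following. This is only a sketch of the logic: the function name is illustrative, and the spotlight component stands in for whatever light the receiver Blueprint drives.

// Rough C++ equivalent of the Blueprint logic: parse the serial line,
// remap 0-255 to 0-100,000 and apply it as the light intensity.
#include "CoreMinimal.h"
#include "Components/SpotLightComponent.h"

void UpdateLightFromSerial(const FString& SerialLine, USpotLightComponent* SpotLight)
{
    if (!SpotLight)
    {
        return;
    }

    // String -> float; a value between 0 and 255 is expected from the Arduino.
    const float RawValue = FCString::Atof(*SerialLine);

    // Same idea as the Map Range Clamped node: 0-255 -> 0-100,000.
    const float Intensity = FMath::Clamp(RawValue, 0.f, 255.f) / 255.f * 100000.f;

    SpotLight->SetIntensity(Intensity);
}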

Next steps:

Now, I’m going to be working on a new blueprint for the character animation. I will be working with Unreal’s animation state machine and a transition rule driven by Arduino data. I want to use the Arduino data again to control this transition — for example, switching from idle to walking when the input value crosses a threshold.

For example, if the Arduino’s light-intensity value is greater than 100, the character transitions from idle to walking. In other words, the character’s animation changes dynamically based on the real-time Arduino input, reflecting user interaction and sensor data in the scene. This will be a real-time interaction built on predefined animation states.





Categories: Final Major Project

Serial Communication between Arduino UNO and Unreal Engine 5

Finding the Right Plugin

To bridge Arduino and Unreal Engine, I used a plugin originally made for Unreal Engine 4 and later adapted for Unreal Engine 5. The plugin is called Unreal Engine SerialCOM Plugin, developed by Ramiro Montes De Oca (Ramiro’s Lab).
I sourced it from GitHub:
https://github.com/videofeedback/Unreal_Engine_SerialCOM_Plugin

I’m running Unreal Engine 5.4.4, which meant I had to rebuild the plugin (the 5.0.3 version available in the resource folder) to make it compatible with my engine version. It worked smoothly after a few configuration steps.


Setting It Up in Unreal

Once I downloaded the plugin ZIP from GitHub, I followed these steps:

  1. Create a Plugins Folder
    Inside my Unreal project directory, I created a folder named Plugins: MyProject/Plugins
  2. Place the Plugin Inside
    I extracted the downloaded plugin folder directly into this Plugins directory — not inside the /Content folder.
  3. Restart Unreal
    After placing the plugin, I closed the project and reopened it. Unreal automatically detected the new plugin and asked to rebuild it for version 5.4.4. I let it rebuild, and everything compiled correctly.
  4. Enable the Plugin
    In Unreal, I went to Edit → Plugins.

Then I searched for “SerialCOM,” ticked the checkbox, and made sure it was active. Once enabled, I restarted Unreal one more time to apply the change.


Setting Up the Connection

The plugin comes with a Blueprint example inside the ZIP package, which made the setup process much easier.

Here’s what I did:

  1. Open the Example Blueprint
    I added the provided Blueprint to my level and opened it.
  2. Find the Right COM Port
    I checked which port my Arduino was connected to by opening the Device Manager on Windows and looking under “Ports (COM & LPT).”
    For me, it was something like COM3.
  3. Update the Blueprint Settings
    In the Blueprint, I changed the serial port to match my Arduino’s COM port and adjusted the Baud Rate to the same value I had used in my Arduino sketch.
  4. Test the Connection
    Once everything was set, I ran the project in Unreal. The connection status appeared in the window, confirming that Unreal was successfully talking to the Arduino (a minimal Arduino-side test sketch is shown below).
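On the Arduino side, all this first test needs is a sketch that opens the serial port at the same rate and prints something readable. A minimal sketch, assuming 9600 baud (whatever rate is used, it has to match the value set in the Blueprint):

// Minimal connection test: print an incrementing counter once per second.
int counter = 0;

void setup() {
  Serial.begin(9600);   // must match the Baud Rate in the plugin Blueprint
}

void loop() {
  Serial.println(counter);
  counter++;
  delay(1000);
}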

What Worked and What Didn’t

It wasn’t all smooth sailing — there were a few hiccups along the way:

  • Plugin detection issues: Unreal wouldn’t recognize the plugin at first because I’d placed it inside the /Content folder by mistake. Moving it to /Plugins fixed that instantly.
  • Rebuild prompts: Unreal had to rebuild the plugin every time I changed versions or made small edits, but it eventually stabilised.
  • COM port mismatch: If the wrong port or baud rate was set, the communication failed silently — double-checking these two settings was key.

Port initialised in the debug view, confirming the Arduino’s successful connection with Unreal.


Final Thoughts

Getting Arduino and Unreal Engine to communicate felt like unlocking a new level of interactivity. The SerialCOM Plugin by Ramiro Montes De Oca works really well once it’s set up properly, and it opens up so many possibilities for combining physical hardware with virtual experiences — from motion sensors and haptic devices to robotics and interactive art installations.

This process taught me a lot about how Unreal handles plugins and serial communication, and now that the setup is working, I can start experimenting with sending real sensor data directly into my Unreal projects.


Resources:

Categories: Final Major Project

FMP Proposal: Transition through worlds

My work bridges the physical and digital worlds, exploring the transition between real-life (IRL) and virtual reality (VR) events. These two domains are often seen as opposites — the virtual world as an alternative to the real one — a juxtaposition of the real and the non-real, the real and the fake, the real and the surreal, the physical and the digital, the real and the imaginary, colliding in complex ways.

The motivation for this project was to create an experience through which the audience can discover that actions and experiences in VR can have real consequences, as David Chalmers argues in Reality+ (2022). The physical and the virtual are thus connected, rather than distant.

Human cognition is shaped by collective experience; we approach new situations with expectations formed by past encounters.


From Virtual to Physical

Chalmers argues that experiences in VR can affect our emotions, relationships, and choices. I make this more tangible by materializing it through a physical computing sculpture that reacts to actions taken in the VR experience — creating immediate, real-world results from virtual actions.


From Physical to Virtual

We enter VR through head-mounted displays (HMDs) and interact with the world as avatars. We experience, explore, and undergo cognitive changes. In essence, we operate within the capacity of technology, entering a hybrid relationship between the human and the non-human. Our agency emerges from this bidirectional relationship — we shape technology, and technology shapes us in return (as described by Actor-Network Theory, ANT).

To capture this bidirectionality, I designed the interaction to be open-ended: the audience’s engagement with certain physical flowers directly influences the VR experience.

Categories: Uncategorised

Animation in Cascadeur

Why I’m Learning Cascadeur for My Project

Because of the project management challenges and the scope of what I’ll be delivering, I’ve decided to learn and incorporate a new animation software: Cascadeur. It’s an AI-powered tool that enables auto-posing during blocking and comes with physics assistance to help polish movement.

That said, just because it uses AI doesn’t mean it’s a shortcut to free animation. It still requires solid fundamentals: working with arcs, planning ahead, studying movement, posing carefully, and delivering dynamic body motion. None of that comes automatically—you still need to bring your own skills and knowledge to the table.

My decision is largely based on time limitations and the ambitious direction of the project. Animation is a huge part of my work, but the project isn’t strictly about on-screen content. I’m also thinking about installation, physical computing, and virtual reality. Because of this, it feels justified to use a tool like Cascadeur. It fits both the project management side of things and my career trajectory, since I don’t aspire to become a traditional animator.

Of course, learning new software comes with a curve. Some skills I already have are transferable, but getting familiar with a new workflow always takes effort. Still, I think it’s worth it, given my position, project aspirations, and the need to optimise production. In practice, Cascadeur should let me deliver results much more quickly than if I had to build everything in Maya.

For this project, I’ll also be documenting my workflow—through images, screenshots, and screencasts—so I can share the process of creating and refining the animations I’ll be using.


Overlap of animating in Cascadeur.

Categories: Project 2

Mixed Reality Project Reflection

Project Title: Venus Flytrap – Mixed Reality Game Experience
Platform: Meta Quest 3
Software: Unity (with Meta XR SDK), Arduino (for physical computing integration)



Overview & Context

This project was developed in response to a design brief that asked us to create an engaging experience using animation and storytelling, with a focus on user interaction and immersive feedback. The brief encouraged experimentation with multiverse concepts — blending physical and virtual worlds — to craft a compelling, thought-provoking experience.

I chose to approach this by designing a mixed reality (MR) game experience for Meta Quest 3 using Unity and physical computing via Arduino. The main focus was on user agency, visual feedback, and the integration of real-world components that react dynamically to virtual actions.


Concept & Gameplay Summary

The game centres around a Venus Flytrap theme and is designed as a seated, single-player experience. The player is surrounded by “flytrap monsters” and must shoot them to rescue a trapped fly, physically represented in the real world via a mechanical flytrap installation.

Core Mechanics:

  • Shoot flytrap monsters (each requiring 3 hits to be defeated)
  • Defeat all enemies to trigger a real-world reward: opening the physical flytrap and “releasing” the fly
  • Fail condition: If monsters reach the player, health depletes, and the game ends

This concept questions the idea that “what happens in VR stays in VR” by making real-world elements react to in-game choices.


Animation & Visual Feedback

A central focus of the project was the role of animation in creating strong audiovisual feedback and reinforcing player agency:

  • State-based animations: Each monster visually reacts to bullet hits via Unity’s Animator Controller and transitions (e.g., three different states reflect remaining health).
  • Color feedback: Monsters transition from vibrant green to desaturated red as they take damage.
  • Stop-motion-inspired effects: When monsters die, animated flies burst from their abdomen — a stylized visual reward.
  • Audio cues: Complement visual states and further reinforce player actions.

This dynamic feedback enhances immersion, turning simple interactions into meaningful, sensory-rich experiences.


Technical Implementation

  • Unity + Meta XR SDK: Developed the core game logic and animation system within Unity using the Meta SDK for mixed reality features like passthrough, hand/controller input, and spatial anchoring.
  • Arduino Integration: Connected the physical flytrap installation to Unity using serial communication. Actions in VR (e.g. defeating monsters) trigger real-world servo movements.
  • Pathfinding with NavMesh Agents: Monsters move toward the player using Unity’s NavMesh system, avoiding terrain obstacles and interacting with the player collider.
  • Procedural Bullet Logic: Shooting was implemented via C# scripts that instantiate projectile prefabs with force and collision detection. Hits are registered only on monsters’ abdominal areas.
  • Game State Management: Score tracking, monster kill count, and health depletion are all managed via a custom script, providing a clear player progression system.

Design Decisions & Constraints

  • Asset Use: Monsters were rigged using Mixamo animations and a model sourced from Unreal Engine assets (Creative Commons licensed), allowing focus on interaction design and technical implementation over custom 3D modeling.
  • Animator Learning Curve: A major portion of the project involved learning Unity’s animation pipeline, particularly in creating clean transitions and state-based logic driven by gameplay.
  • Hardware Limitations: Due to Meta Quest 3 restrictions, the final APK shared for submission runs without Arduino hardware. The version with physical feedback requires an external microcontroller setup, which is documented in the accompanying video.

Physical & Virtual Worlds Integration

This project explores the interplay between digital interaction and real-world feedback, where actions in the virtual world have tangible consequences. It challenges typical game environments by asking: What if virtual success or failure could reshape physical reality — even subtly?

By positioning the player in the center of a game board-like world and creating a reactive environment both inside and outside the headset, I aimed to explore storytelling through presence, consequence, and multisensory feedback.


Key Learnings

  • Deepened understanding of Unity’s animation system (Animator Controller, blend trees, transitions)
  • Gained practical experience with physical computing and VR device integration
  • Improved technical problem-solving through scripting gameplay logic and Arduino communication
  • Recognised the power of animation fidelity in conveying emotion, feedback, and agency

Submission Info

  • Final APK (without Arduino requirement) included
  • Showcase video with documentation of the Arduino setup + physical installation
  • Additional documentation includes technical breakdown, flowcharts, and design process
Categories: Project 2

World Defining

The world-building element is really just there to make the scene more visually engaging. The main purpose, though, is functional. It’s meant to enhance the gamified experience by adding some challenge, and also by introducing objects into the environment that interact with the gameplay.

These new digital “plants” serve two main purposes:
First, they help constrain the user’s field of vision, with long, plant-like forms appearing on the horizon. Second, they embrace the Venus flytrap aesthetic, reinforcing the overall theme. The idea is to partially obscure distant vision, creating a more immersive and slightly tense experience.

I’m keeping this project in mixed reality because it’s important that the user can still see the physical Venus flytrap installation and observe how it responds at the end of the game. That physical interaction is core to the experience.

By introducing these digital plants into the scene, I also managed to constrain the movement of the monsters. Now, not all monsters move directly toward the user in a straight line. Because the mesh surface for movement has been reduced by the placement of these digital obstacles, the monsters must now navigate around them to reach the user. This makes the gameplay more dynamic and adds natural variation to how and when monsters arrive.

Another layer of diversity in movement was achieved by tweaking the NavMesh Agent settings on each monster prefab. By adjusting parameters like speed, acceleration, and angular speed for each monster individually, their behavior became more varied and unpredictable.

I also added more detailed textures to the floor and plane to better fit the aesthetic, updated the UI to match the theme, and fine-tuned the color palette. All of these design choices help reinforce the atmosphere of a plant-monster, green Venus flytrap world.

So overall, these updates have made the project feel more cohesive, engaging, and visually in line with the original concept.

Categories: Project 1

Acting + LipSync: refining of blocking

When animating the eyes, make sure not to cut into the pupil; the iris can be slightly cropped, but only in extreme cases—such as when the character is intoxicated—should the pupil be cut.

The whites of the eyes should not be visible at the top or bottom unless the character is expressing intense emotions like fear or excitement. For example, when my character is looking downward, perhaps at someone with disgust—they are neither scared nor excited. Therefore, the eyelids should not be wide open in that pose.

The arc of the organic movement is crucial. The SyncSketch above includes notes on tracking the mark on the nose across frames to define the trajectory of the head movement. Since the nose moves very little, it should be used as a reference point for tracking the head’s motion and its pathway.

In the next steps, I’m going to adjust the head’s positioning—specifically its rotation—to ensure the movement feels more organic.

Right now, there’s too much happening in the neck area, making it feel like the neck is the pivot point. Rework the organic movement of the head first, and that should help correct the neck’s motion.

Remember that this shot is a 2D representation of movement, which will visually flatten things. Make sure to show both the upper and lower teeth. For example, if only the top teeth are visible, and too much gum is shown, the result may look like dentures.

The eyebrows need more animation to support the transition between expressions of surprise and disgust. Avoid holding the default pose for too long.

The mouth movement is currently exaggerated, which is fine—it just needs to be slightly toned down as the animation progresses.

As the head dynamically moves into the expression of disgust, there should be a clear preparation for that beat. The mouth will take on the shape of an inverted triangle, which forms the base for articulating that emotion.

Before: visualising the nose’s trajectory. As per the feedback, the arcs have to be delivered and polished.

After applying the changes according to the feedback, the arcs of movement are delivered – the red line traces the trajectory of the tip of the nose.

The built-in motion trail feature is causing a lot of lag, to the point of being unusable. A better way of achieving the same result is animBot’s motion trail function, which is much more efficient.

Categories: Project 2

Venus Fly Trap Physical Computing Installation

Blending Worlds: Building My Venus Flytrap VR Game

Over the past few weeks, I’ve been figuring out how to bridge what I’m doing creatively with the technical setup — finding a way to connect my ideas in Unity with a working space that includes a VR headset, microcontroller, and physical components. After sorting out how to get Unity communicating with the microcontroller (based on my exploration described in the previous blog post), I began working on my more sophisticated idea.

One small but meaningful breakthrough was setting up a button that, when pressed, would light up an LED on a breadboard. It was a simple interaction, but it confirmed that the virtual and physical systems could communicate. It might seem basic, but this helped me break through a technical wall I’d been stuck on. Sometimes the simplest prototypes are the most important steps forward.
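The exact messages I sent aren’t shown here, but the Arduino side of a test like this can be as small as the sketch below; it assumes the LED sits on pin 13 and that Unity sends a single '1' or '0' character over serial when the virtual button is pressed or released.

// Listens for a single character from Unity:
// '1' turns the LED on, '0' turns it off.
const int ledPin = 13;

void setup() {
  pinMode(ledPin, OUTPUT);
  Serial.begin(9600);
}

void loop() {
  if (Serial.available() > 0) {
    char command = Serial.read();
    if (command == '1') {
      digitalWrite(ledPin, HIGH);
    } else if (command == '0') {
      digitalWrite(ledPin, LOW);
    }
  }
}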

From there, I started thinking about how to bring coherence between the physical and virtual elements of the project. The game I’m building revolves around a Venus flytrap, and I wanted the whole aesthetic and gameplay experience to revolve around that concept. In the game, the Venus flytrap acts as both protector and trap. It hides a real-world object inside its petals (in this case, the user) and stays closed. The player’s goal in VR is to defeat all the “fly monsters” surrounding it. Once they’re defeated, the Venus flytrap opens, revealing the trapped player and marking the win.


Repurposing Physical Models

For this, I built a physical model of a Venus flytrap. The petals are made of painted cardboard and 3D-printed components, designed around a gear-based movement system that controls how the trap opens and closes. A DC motor mounted at the back drives the movement, using a cog system with four gears that allow the left and right petals to move in opposite directions. The mechanics are relatively straightforward, but designing the gear system took a fair amount of design thinking and trial-and-error.


Bridging Code and Motion

The movement logic is coded in Arduino and uploaded to the microcontroller. It communicates with Unity over the serial connection, which tracks what’s happening in the game. Specifically, the system monitors how many fly monsters have been “killed” in the virtual world. Once all the fly monsters are defeated, a signal is sent to the motor to open the Venus flytrap in real life — a moment of physical transformation that responds to virtual action.
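A simplified version of that Arduino logic might look like the sketch below. It assumes Unity sends a single 'O' character over serial once the last fly monster is defeated, and that the DC motor is driven through a motor driver with a hypothetical enable pin and direction pin; the real pin numbers and timing on my build are different.

// Waits for an 'O' from Unity, then runs the DC motor (via a motor
// driver) long enough for the gear system to open the petals.
const int motorEnablePin = 9;           // hypothetical driver enable/PWM pin
const int motorDirPin = 8;              // hypothetical direction pin
const unsigned long openTimeMs = 1500;  // how long the motor runs to open the trap

bool trapOpened = false;

void setup() {
  pinMode(motorEnablePin, OUTPUT);
  pinMode(motorDirPin, OUTPUT);
  Serial.begin(9600);
}

void loop() {
  if (!trapOpened && Serial.available() > 0) {
    char command = Serial.read();
    if (command == 'O') {               // all fly monsters defeated
      digitalWrite(motorDirPin, HIGH);  // direction: open
      analogWrite(motorEnablePin, 200); // run the motor
      delay(openTimeMs);
      analogWrite(motorEnablePin, 0);   // stop
      trapOpened = true;
    }
  }
}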

This project sits somewhere between VR and physical computing. It’s not just about creating a game, but about exploring what happens when virtual reality meets metaphysical, embodied experiences. I’m fascinated by this transition between worlds—the way something in the virtual space can have a tangible, physical consequence, and how that loop can create a more meaningful sense of interaction and presence.

Setting up this system—both technically and conceptually—helped me shape the final scenario. It’s about merging playful game mechanics with thoughtful digital-physical storytelling, and I’m excited to keep exploring how far this kind of hybrid setup can go.

Categories: Project 1

Acting Intro

Planning, planning, planning.

Importance of planning: “In order for the audience to be entertained by the characters, the shots need to be appealing in composition, posing, rhythm and phrasing, and contain acting choices that feel fresh.”

Emotional choices, acting.

What happens before the shot, during and after, in order to keep continuity across the shots.


12 rules:

1. A character does not always need to be in motion; in fact, juxtaposition with still poses (moments of stillness) works best.

2. Dialogue doesn’t drive action, but thought does. Establish the emotional state of the character.

What are the main emotional poses? Decide on the timing of the changes between the emotional states.

3. The melody
base = main poses
middle = overlapping actions leading into and out of the main poses, and weight shifts to support small gestures
high = non-verbal body movement (head, shoulders, eyes)
moments of stillness = punctuation

“character’s performance like a song—layering rhythms on top of each other to create interesting texture and timing within your animation. “

4. HANG TIME: just like a bouncing ball losing its energy as it bounces back and appears to hang in the air.

HANG TIME = the thought-processing moment; time must be allowed for the shift between emotions.

5. Create a neutral pose for your character (do not animate from the rig’s default T-pose).

What are the character’s main traits?

6. What’s the style of the movement? The style of movement gives away the character’s personality.

e.g.:
Buzz Lightyear in Toy Story moving like a Spanish flamenco dancer
redefining a non-vanilla walk cycle (sassy, drunk, confident, etc.)

7. The line of action to create expressions.

face
shoulder lines
hip lines


Contrapposto is an Italian term used in the visual arts to describe a human figure standing with most of its weight on one foot.

Contraposition: one side of the face is stretched while the other side is squashed.


8. MOVE THAT BODY: sincerity of feeling should be expressed through the entire body’s movement, not only the face.

exaggerate the TY (Translate Y) value in the hips to connect the lower body more to the performance, which enhances the emotional delivery


9. EXAGGERATION + extreme poses + smears

juxtaposition of opposites (extreme balanced out with subtle) that gives the animation MORE energy and life


10. Thoughts, not words

When a scene does have dialogue, a great trick to figuring out how to animate to a character’s thoughts is to write out the actual dialogue on paper, leaving spaces between the lines. Then, in a different color, write what the character is *thinking* right below what they’re saying.