Building and Refining a Capacitive Sensor System for Unreal Engine
I began this project by experimenting with DIY capacitive sensors built from conductive materials connected to an Arduino. My goal was to capture both touch and proximity data and use that information to control character animation inside Unreal Engine. The early setup used simple resistors: 1 MΩ for touch sensors (small surface area) and 100 MΩ for proximity sensors (larger surface area). After some tuning, the system worked well enough to read sensor values on the Arduino and pass them to Unreal through a serial connection.
Mapping Sensor Data to Animation
Once Unreal was receiving a comma-separated stream of values, I used the “Parse into Array” function to split the data and feed it into animation logic. One of my first tests was to connect proximity values to a blendspace controlling a character’s transition from walking to running (see blend samples). Higher proximity values meant the user’s hand was closer to the sensor, which increased the character’s running speed.
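The splitting itself happens in a Blueprint node, but the logic is easy to sketch off-engine. The following C++ fragment mirrors what "Parse into Array" does with a "," delimiter; the function name and the sample line are illustrative, not taken from the project:

```cpp
#include <sstream>
#include <string>
#include <vector>

// Split a comma-separated serial line (e.g. "512,87,0") into integers,
// mirroring Unreal's "Parse into Array" node with "," as the delimiter.
std::vector<int> parseSensorLine(const std::string& line) {
    std::vector<int> values;
    std::stringstream ss(line);
    std::string token;
    while (std::getline(ss, token, ',')) {
        if (!token.empty()) {           // skip empty tokens from stray commas
            values.push_back(std::stoi(token));
        }
    }
    return values;
}
```

Each resulting element can then be routed to a separate animation parameter, e.g. element 0 to the blendspace input.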

Before building the blendspace logic, I had to measure the actual sensor range. For example, if the maximum proximity reading was around 700, that value became the threshold for the “full running” state in the blendspace (in the screenshot above, the Arduino reading at that moment was 100, and I opted for 75 as the blendspace threshold). Values between the minimum and maximum produced interpolation between walking and running, visually resulting in a sort of jog. This was an important early step: confirming the sensor range and mapping it cleanly into Unreal’s animation system.
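The mapping boils down to normalising a raw reading into a 0–1 blend alpha. A minimal sketch, assuming a measured range of roughly 0–700 on my setup (the exact numbers depend on the sensor and material):

```cpp
#include <algorithm>

// Normalise a raw proximity reading into a 0..1 blendspace alpha.
// minReading/maxReading come from measuring the actual sensor range first.
float proximityToBlendAlpha(float reading, float minReading, float maxReading) {
    float alpha = (reading - minReading) / (maxReading - minReading);
    return std::clamp(alpha, 0.0f, 1.0f);  // 0 = walking, 1 = full running
}
```

Anything between the extremes interpolates between the walk and run samples, which is what produces the jog in between.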
A Layered Workflow
The overall workflow quickly became layered:
- Hardware: Building DIY sensors, wiring circuits, and tuning resistor values.
- Arduino programming: Reading input, filtering noise, and sending reliable data over serial.
- Unreal Engine integration: Parsing the data, storing it in game components, and distributing it to Blueprints.
- Animation: Designing blendspaces, setting up transitions, and creating smooth visual feedback.
Each layer depended on the previous one. Small issues in hardware or serial timing often showed up later as animation glitches.
Prototyping Process So Far
To begin testing, I wired a potentiometer and LED on a breadboard. I verified that changing the resistance adjusted the LED brightness. Then I mirrored this behavior in Unreal by sending potentiometer values over serial and mapping them to light intensity. This created a simple “digital twin” of the LED inside Unreal. This was my first complete hardware-to-software loop, and it helped confirm that serial communication and input parsing were working correctly.
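The scaling from potentiometer to brightness is the same on both sides of the loop. A hardware-free re-implementation of Arduino's `map()` shows the idea: an `analogRead` range of 0–1023 is rescaled into a PWM range of 0–255, and the identical scaling can drive the light intensity of the digital-twin lamp in Unreal (the function name here is my own, for testing off-device):

```cpp
// Re-implementation of Arduino's map(): rescale a potentiometer reading
// (0..1023 from analogRead) into a PWM brightness (0..255 for analogWrite).
long rescale(long x, long inMin, long inMax, long outMin, long outMax) {
    return (x - inMin) * (outMax - outMin) / (inMax - inMin) + outMin;
}
```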

Tinkercad wiring: LED with a 220 kΩ resistor
Choosing Capacitive Sensors
My goal was to move beyond conventional interactions such as buttons or sliders. I wanted users to interact through bodily movement and proximity rather than standard UI controls. Capacitive sensors fit this vision because they support touch, toggle, and proximity detection, which creates richer possibilities for movement-based input. Although I initially considered distance sensors, the flexibility of capacitive sensing seemed more appropriate.
Problems with the DIY Sensors
The DIY sensors worked, but they produced unstable behavior when used at high frequency. Unreal was receiving rapid spikes or drops in values, which created harsh jumps in the blendspace. The animations responded too literally to every jitter, so the character would abruptly shift between states, which was visually disruptive.
From an animation perspective, I wanted ease-in and ease-out behavior instead of instant transitions. The jitter not only looked bad, but it also broke the intended user experience, since the system emphasizes animation as feedback for user actions.
Another major issue was latency. At times, there was a noticeable delay between the user touching a sensor and the animation updating in Unreal. This lag broke the illusion of responsiveness and the immersion and made interaction feel unreliable.

Setup with DIY capacitive controller using 1 MΩ resistors and a single capacitive sensor (conductive copper material + DIY alligator clip)
Capacitive Sensing
#include <CapacitiveSensor.h>
Imports the library that lets Arduino measure touch input by timing how long it takes to charge/discharge a pin.
CapacitiveSensor Sensor = CapacitiveSensor(4, 6);
Creates a CapacitiveSensor object.
- Pin 4 → “send” pin (outputs charge pulses).
- Pin 6 → “receive” pin (connected to your conductive fabric through a resistor).
So between pins 4 and 6, you have a resistor (usually 1–10 MΩ).
long val; → stores the sensor reading.
int pos; → stores the LED’s state (0 = off, 1 = on).
#define led 13 → defines the LED pin (the built-in LED on most Arduinos).
Wire a 1–10 MΩ resistor between SEND_PIN and RECV_PIN. Connect your conductive fabric to RECV_PIN.
The sensor requires two pins, a send pin and a receive pin, passed in that order as arguments and connected through a resistor of at least 1 MΩ. The receive pin is the one connected to the conductive material.
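Putting the fragments above together, the core loop behaviour is a threshold-and-toggle: when the reading rises past a threshold on a new touch, the LED state (`pos`) flips. A hardware-free sketch of that logic follows; the struct, its names, and the threshold of 1000 are my own illustrations, since the real threshold depends on the material and resistor:

```cpp
// Toggle an LED state on each new touch, given successive capacitive readings.
// 'pos' holds the LED state (0 = off, 1 = on), as in the fragments above;
// 'wasTouched' ensures the state flips once per touch, not on every loop pass.
struct TouchToggle {
    int pos = 0;
    bool wasTouched = false;
    long threshold;

    explicit TouchToggle(long t) : threshold(t) {}

    int update(long val) {
        bool touched = val > threshold;
        if (touched && !wasTouched) {
            pos = 1 - pos;  // flip only on the rising edge of a touch
        }
        wasTouched = touched;
        return pos;
    }
};
```

On the Arduino itself, `update()` would be fed from `Sensor.capacitiveSensor(30)` inside `loop()`, and `pos` written to the LED pin with `digitalWrite`.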

Tinkercad wiring: 1 MΩ resistor for the capacitive sensor

Switching to the MPR121
During a consultation with Joanne at CTH, I was advised to switch to the Adafruit MPR121 capacitive touch sensor controller. Unlike my DIY setup, the MPR121 includes built-in resistors, calibrated touch detection, and support for up to 12 electrodes. It is safer, more stable, and better optimized for consistent proximity readings.
Switching to the MPR121 meant starting over. I had to learn how the sensor worked, rewrite my Arduino code, and adjust my Unreal parsing logic. Because my original Unreal code ignored zeros to avoid noise from the DIY sensors, the new sensor output behaved incorrectly until I removed that filtering. Once updated, the readings became consistent across all electrodes.
One issue I encountered early on was a mismatch between the Arduino’s pace of sending data and Unreal’s pace of reading data. If these two rates do not align, Unreal sometimes reads the serial buffer at a moment when no fresh data has arrived. This produces an empty string, which Unreal interprets as 0. Those unexpected zeros caused visible spikes in the animation, because a single frame at “0” looked like an abrupt drop in proximity.
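This failure mode is easy to reproduce off-engine: converting an empty string yields 0 in most parsers, exactly the spurious value Unreal produced. A defensive sketch (my own helper, not the project's Blueprint code) returns "no value" instead, so an empty frame can simply keep the last good reading:

```cpp
#include <optional>
#include <string>

// Parse one serial token defensively: an empty read (the engine polling
// before fresh data has arrived) yields no value rather than a spurious 0,
// so a single empty frame cannot masquerade as "proximity dropped to zero".
std::optional<int> parseReading(const std::string& token) {
    if (token.empty()) {
        return std::nullopt;  // nothing arrived this frame: keep the last value
    }
    return std::stoi(token);
}
```

The caller holds onto the previous reading whenever `parseReading` returns empty, which removes the one-frame drops without discarding legitimate low values.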
Delays can occur at any stage of the pipeline, but in my setup they were mainly caused by the Unreal Blueprint logic. I had followed a YouTube tutorial that used a recursive function to continuously read the serial port. Inside that recursive function was a Delay node. Over time, this delay created latency between the actual sensor updates and Unreal’s polling rate. Because of that latency, Unreal would occasionally read the buffer too early, producing empty strings and therefore zeros.
To avoid the sudden drops in values, I filtered out zeros entirely and only processed values above zero. This prevented jitter and kept the animation stable.
To optimize the Blueprint, I stopped using the recursive function approach and instead called a separate function with a Delay node only once at Begin Play. This removed the accumulated latency and made the system more consistent.
However, once I switched to the new MPR121 capacitive sensor, the old filtering no longer worked. The new sensor could legitimately output low values, so ignoring zeros caused incorrect behavior. I had to revise the Unreal code to properly read and handle the MPR121’s data without relying on the previous workaround.
Touch vs Proximity on the MPR121
The MPR121 handles touch and proximity differently:
- Touch readings are highly stable and binary-like. The sensor reliably reports when an electrode is touched or released.
- Proximity requires interpreting the raw filtered data rather than relying on simple touch events. Proximity values fluctuate more gradually, which is useful for driving continuous animation parameters such as blendspaces.
This difference helped create much smoother transitions in Unreal and eliminated the jitter issues I had with the DIY approach.
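One common way to turn gradually fluctuating raw values into smooth animation input is an exponential moving average; I am sketching it here as a representative smoothing technique rather than the exact Blueprint logic I used, and the smoothing factor is an assumption to tune:

```cpp
// Exponential moving average over raw proximity data: each new sample nudges
// the smoothed value toward it, giving ease-in/ease-out instead of jumps.
// alpha in (0,1]: smaller = smoother but laggier response.
struct ProximitySmoother {
    float smoothed;
    float alpha;

    ProximitySmoother(float initial, float a) : smoothed(initial), alpha(a) {}

    float update(float raw) {
        smoothed += alpha * (raw - smoothed);
        return smoothed;
    }
};
```

Feeding the smoothed value (rather than the raw one) into the blendspace is what gives the character the eased transitions described earlier.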

MPR121

Setup with MPR121: single capacitive sensor (conductive copper material + alligator clip), no breadboard required; electrode 0 connected via a female-to-male connector

When working with the capacitive sensors on different fruits, the readings still needed fine-tuning. I had to find a “sweet spot” for the pacing of the sensor data, because the delay inside each loop of the Arduino code directly affects how smoothly the visuals animate in Unreal.
After troubleshooting, I discovered that a 200-millisecond delay in the Arduino loop produced the best results. Anything lower or higher caused the animation to drift out of sync with the sensor cycle, which created visible artefacts or stuttering in the visual output.
With the 200-millisecond timing, the animation in the Unreal visual prototype appeared smooth, stable, and visually pleasing.
Reflection on the Learning Curve
This process involved much more than I expected at the beginning. I had to:
- Troubleshoot noisy DIY sensors
- Understand how serial timing affects Unreal’s data processing
- Rework Blueprints to handle real-time input cleanly
- Learn a new sensor system (MPR121) from scratch
- Refactor Unreal parsing logic after switching hardware
- Rebuild the animation logic to respond smoothly rather than abruptly
The experience highlighted how interconnected physical computing and digital animation are. Each layer—hardware, code, engine integration, and animation—affects the next. Small assumptions in one layer can create major issues downstream.
Despite the challenges, the result is a much more stable and responsive system. The MPR121 significantly improves reliability, and the animation feedback now matches the user’s physical actions more closely, which is important for my project’s focus on embodied interaction.