Gaitr
Can I run like Eliud Kipchoge? Probably not. But I built a system to find out exactly how far off I am.
The Story
The Wildest Question I've Ever Asked Myself
What if I could analyze Eliud Kipchoge's running gait from video, model him in Blender to scale, export that motion data, and then compare my own gait to his in real-time using wearable sensors? And what if I could get audio feedback through headphones telling me to adjust my form as I ran? Yeah... I don't know. I don't get it either.
Why did I think this was a reasonable first hardware project? I'd never touched a microcontroller. Never soldered anything. Never written a line of code that talked to a physical sensor. And yet, my brain said: "Yeah, let's build a full-body motion capture system and compare it to an Olympic athlete."
I had no idea if this was even possible — at least not for someone like me. But I wanted to learn. Microcontrollers. Accelerometers. Wiring. Soldering. This project touched all of it. Turns out, it's a lot harder than it sounds. Which is funny because it sounds hard.
The plan was ambitious: strap 7 ESP32 devices with accelerometers to different body parts (two on the feet, two on the shins, two on the hips, one on the pelvis), stream all that motion data to a phone app via Bluetooth, and compare it frame-by-frame to Kipchoge's reference gait. The app would calculate a "Kipchoge Similarity Index" (KSI) and give you real-time feedback.
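The KSI itself never got past the drawing board, but the core idea is simple enough to sketch: compare a frame of your joint angles to the matching reference frame, and map the average error onto a 0–100 score. Everything here — the joint names, the 45° scale, the scoring curve — is an illustrative guess, not the finished algorithm.

```typescript
// Hypothetical sketch of a "Kipchoge Similarity Index": compare one frame of
// joint angles (in degrees) against the reference frame, and map the mean
// absolute difference onto a 0-100 score. All names and constants are
// illustrative, not the project's final algorithm.
type JointAngles = Record<string, number>;

function ksi(mine: JointAngles, reference: JointAngles, maxDiffDeg = 45): number {
  const joints = Object.keys(reference);
  // Mean absolute angle error across all tracked joints
  const meanErr =
    joints.reduce((sum, j) => sum + Math.abs((mine[j] ?? 0) - reference[j]), 0) /
    joints.length;
  // Clamp to [0, maxDiffDeg] and invert: identical form scores 100
  const clamped = Math.min(meanErr, maxDiffDeg);
  return Math.round(100 * (1 - clamped / maxDiffDeg));
}

// Identical angles → perfect score
console.log(ksi({ knee: 145, hip: 20 }, { knee: 145, hip: 20 })); // 100
```

A real version would weight joints differently (ankle angle matters more than hip sway) and score over a window of frames, not a single snapshot.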
Did I finish it? No. Did I learn a ton about hardware, Bluetooth protocols, and motion analysis? Absolutely. Did I at least get sensor data streaming to the app? Yes! The foundation works — it just needs someone with more patience (or obsession) to take it further.
The Numbers
Project Scope
The Pipeline
How It Was Supposed to Work
Capture Kipchoge
Find high-quality video of Kipchoge running. Analyze his gait frame by frame.
Model in Blender
Recreate his running motion as a 3D skeleton. Export joint positions for each frame.
Build the Hardware
Wire up ESP32 boards with MPU6050 accelerometers. Learn to solder. Curse a lot.
Stream via Bluetooth
Connect all 7 devices to the phone app. Synchronize the data streams.
Compare in Real-time
Match your current motion to Kipchoge's reference. Calculate similarity scores.
You (with sensors)
Kipchoge (from Blender)
KSI: Kipchoge Similarity Index
How close is your form to the GOAT?
Under the Hood
The Tech Stack
Hardware
- ESP32 — WiFi + Bluetooth microcontroller
- MPU6050 — 6-axis accelerometer/gyroscope
- BLE — Bluetooth Low Energy data streaming
- Custom wiring — Lots of soldering practice
Software
- React Native — Cross-platform mobile app
- react-native-ble-plx — Bluetooth communication
- Blender — 3D modeling and motion export
- Custom gait processor — Angle comparison algorithm
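The glue between the hardware and software columns is packet decoding: the ESP32 reads raw 16-bit values from the MPU6050 and the app has to turn them back into physical units. Here's a sketch assuming (my packing choice, not a standard) six big-endian int16 values per notification — accel x/y/z then gyro x/y/z — at the MPU6050's default ranges (±2 g → 16384 LSB/g, ±250 °/s → 131 LSB/°/s).

```typescript
// Decode one IMU packet. Assumes the ESP32 packs six big-endian int16
// values: accel x/y/z then gyro x/y/z, at MPU6050 default sensitivities
// (±2 g → 16384 LSB/g, ±250 °/s → 131 LSB/°/s). The packet layout is an
// illustrative assumption, not a documented protocol.
interface ImuSample { ax: number; ay: number; az: number; gx: number; gy: number; gz: number; }

function decodePacket(bytes: Uint8Array): ImuSample {
  const view = new DataView(bytes.buffer, bytes.byteOffset, bytes.byteLength);
  const raw = (i: number) => view.getInt16(i * 2, false); // false = big-endian
  return {
    ax: raw(0) / 16384, ay: raw(1) / 16384, az: raw(2) / 16384, // g
    gx: raw(3) / 131,   gy: raw(4) / 131,   gz: raw(5) / 131,   // °/s
  };
}
```

On the app side, react-native-ble-plx delivers characteristic values as base64 strings, so in practice there's one extra decode step before this function sees raw bytes.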
The Reality
Where It Got Complicated
I bought velcro straps and a hot glue gun. That was my sensor mounting solution. Classy, I know. Each ESP32 got wrapped in velcro and strapped to a body part — two on my feet, two on my shins, two on my hips, one on my pelvis. Seven little computers all trying to talk to my phone at once.
Getting data on the screen was incredibly satisfying. Watching numbers stream in from sensors strapped to my body — that felt like magic. Real motion data, in real time, from hardware I'd wired together myself. For a moment, I felt like a genius.
Then I tried to actually use the data, and reality hit. Calibration was a nightmare. Every time I put the sensors on, they'd be in slightly different positions. The accelerometers had no idea where they were in 3D space relative to each other. I needed some way to establish a reference frame — to tell the system "okay, this is what standing still looks like" — and then maintain that calibration as I moved.
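The naive version of that "this is what standing still looks like" step is easy to sketch: average a second of samples while the sensor is motionless to get its gravity vector, then treat later readings relative to that baseline. This was the starting point, not a solution — it ignores orientation drift entirely, which is exactly where the nightmare lived.

```typescript
// Naive stand-still calibration: average samples taken while motionless to
// estimate each sensor's gravity vector in its own frame, then express later
// readings relative to that baseline. A real system needs full orientation
// tracking (e.g. sensor fusion); this is only the first step.
type Vec3 = [number, number, number];

function calibrate(stillSamples: Vec3[]): Vec3 {
  const n = stillSamples.length;
  return stillSamples
    .reduce<Vec3>((acc, [x, y, z]) => [acc[0] + x, acc[1] + y, acc[2] + z], [0, 0, 0])
    .map((v) => v / n) as Vec3;
}

// Dynamic (motion-only) acceleration: reading minus the gravity baseline
function dynamicAccel(sample: Vec3, baseline: Vec3): Vec3 {
  return [sample[0] - baseline[0], sample[1] - baseline[1], sample[2] - baseline[2]];
}
```

The catch: subtracting a fixed baseline is only valid while the sensor stays in its calibration orientation. The moment a shin sensor rotates mid-stride, gravity rotates with it, and you need gyro integration or a complementary/Kalman filter to keep up.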
The other problem? Gait phase detection. Even if I could perfectly capture my movement, how do I know which frame of Kipchoge's gait I should be comparing to? Am I at foot strike? Mid-stance? Toe-off? The reference data is a loop, but my running is continuous. Syncing them up in real-time turned out to be a much harder problem than I anticipated.
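The approach I was circling can be sketched in two pieces: detect foot strikes as spikes in vertical acceleration (with a refractory period so one impact doesn't register twice), then express "where am I in my stride" as a 0..1 phase that indexes into the looped reference. The threshold and timings below are guesses for illustration.

```typescript
// Sketch of gait phase syncing: (1) detect foot strikes as vertical-accel
// spikes above a threshold, with a refractory window so one impact isn't
// counted twice; (2) map time-since-last-strike to a frame of the looped
// reference. Threshold (2.5 g) and refractory period (250 ms) are guesses.
function detectFootStrikes(az: number[], dtMs: number, threshold = 2.5, refractoryMs = 250): number[] {
  const strikes: number[] = [];
  let lastStrike = -Infinity;
  az.forEach((a, i) => {
    const t = i * dtMs;
    if (a > threshold && t - lastStrike > refractoryMs) {
      strikes.push(t);
      lastStrike = t;
    }
  });
  return strikes;
}

// Map elapsed time since the last foot strike to a frame of the reference loop
function referenceFrame(tMs: number, lastStrikeMs: number, stridePeriodMs: number, loopFrames: number): number {
  const phase = ((tMs - lastStrikeMs) % stridePeriodMs) / stridePeriodMs; // 0..1
  return Math.floor(phase * loopFrames);
}
```

Even this sketch assumes a stable stride period, which real running doesn't give you — cadence drifts, surfaces change, and the phase estimate degrades between strikes. That's the signal-processing rabbit hole the project stalled in front of.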
This is where the project stalled. Not because I gave up, but because I realized I'd need to go way deeper into biomechanics and signal processing than I had time for. The foundation worked. The vision was sound. I just ran out of runway.
Features
What I Built (Before Running Out of Runway)
7 BLE Sensors
ESP32 devices with accelerometers strapped to key body points for full-body motion capture.
Real-time Analysis
Live gait feedback as you run. The app compares your movement to the reference in real-time.
KSI Score
Kipchoge Similarity Index — a score showing how close your running form is to the world's best.
Stride Metrics
Track stride length, cadence, and ground contact time. All the numbers runners obsess over.
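Of those metrics, cadence is the one that falls straight out of foot-strike timestamps. A minimal sketch (assuming strikes from both feet are merged into one timeline):

```typescript
// Cadence from foot-strike timestamps (both feet merged into one timeline).
// Cadence = steps per minute over the recorded span. Stride length and
// ground contact time need more signal processing than this sketch shows.
function cadenceSpm(strikeTimesMs: number[]): number {
  if (strikeTimesMs.length < 2) return 0;
  const spanMs = strikeTimesMs[strikeTimesMs.length - 1] - strikeTimesMs[0];
  const steps = strikeTimesMs.length - 1;
  return (steps / spanMs) * 60_000; // steps per minute
}
```

Ground contact time is harder: it needs both the strike and the toe-off event per foot, which means a second detector on the same noisy signal.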
Session Recording
Start/stop sessions, track duration, and save performance reports for later analysis.
Audio Feedback
Real-time audio cues through headphones to correct your form without looking at a screen.
The Point
Why I Don't Regret Not Finishing
This project was never about building a product. It was about answering the question: "Can I do hardware?" Turns out, yes — kind of. I learned to solder. I learned about Bluetooth protocols and their many, many edge cases. I learned that synchronizing 7 wireless sensors is harder than it sounds.
I got data streaming from ESP32 devices to a React Native app. I built a gait comparison algorithm. I exported Kipchoge's motion data from Blender. The pieces exist — they just never came together into something polished.
Maybe someday I'll come back to this. Or maybe it'll stay as a reminder that the journey matters more than the destination. Either way, I'm a better engineer because I tried. And I have a drawer full of ESP32 boards to prove it. 🏃‍♂️
Brian Stever