AIUNITES is a middleware company. We build the integration layer between systems that don't normally talk to each other. Our first protocol is for human movement: one string that works in a gym log, a game engine, a clinical record, and a cable rig controller. No conversion. No proprietary software. No paying anyone to read your own data.
How can AI deliver gains in this world without unity in the data underneath it?
Every field that touches human movement built its own silo. Whether by design or by accident, the result is the same: your data doesn't travel with you.
A biomechanics lab uses files only their software can read. A gym manufacturer stores joint angles in firmware only their machines can interpret. A clinic documents exercises in prose that no other system can parse. A game studio uses motion capture files that encode skeleton data but nothing about which muscles fired or why. These fields evolved independently, in different decades, solving different problems. Nobody planned the fragmentation, but nobody solved it either.
The practical effect is the same regardless of intent: when your movement data lives inside a vendor's format, taking it with you is expensive and painful. Exporting, converting, and re-importing between systems requires specialized software, technical expertise, or both. For most people, the data effectively stays behind when they switch providers, facilities, or platforms.
MNN is a notation for human movement, not just exercise. The same string works across all three domains: fitness and clinical work, virtual worlds, and hardware control.
Gym logging, physical therapy, clinical documentation, personal training. Track which angle clears the acromion, log nerve flare-ups alongside sets, document compensation patterns over time.
Virtual worlds, VR training, Second Life / OpenSim, digital twins, animation. Pose an avatar precisely using joint angles, animate contraction sequences, build training simulations.
Cable rigs, exoskeletons, robotic rehabilitation, isokinetic machines, teleoperation. Drive a pulley to the exact height and angle, set joint limits, reproduce a prescribed position.
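The MNN grammar itself isn't shown on this page, so the following is a purely hypothetical sketch: the string format, the field names, and the `parse_mnn` helper are all invented for illustration. It only demonstrates the idea that one parsed string can feed a gym log, a game engine, and a rig controller.

```python
from dataclasses import dataclass

# Hypothetical MNN-like string; the real grammar is defined by the MNN
# spec, not by this sketch.
EXAMPLE = "shoulder:flex=120;muscle=deltoid.ant;nerve=axillary;vec=0,1,0"

@dataclass
class Movement:
    joint: str
    angle_deg: float
    muscle: str
    nerve: str
    vector: tuple

def parse_mnn(s: str) -> Movement:
    """Toy parser for the invented example format above."""
    head, *fields = s.split(";")
    joint, angle = head.split(":")
    kv = dict(f.split("=") for f in fields)
    return Movement(
        joint=joint,
        angle_deg=float(angle.split("=")[1]),
        muscle=kv["muscle"],
        nerve=kv["nerve"],
        vector=tuple(float(x) for x in kv["vec"].split(",")),
    )

m = parse_mnn(EXAMPLE)
# One parsed record, three consumers:
gym_log_line = f"{m.joint} {m.angle_deg:.0f} deg ({m.muscle})"  # fitness log
avatar_pose = {m.joint: m.angle_deg}                            # game engine
rig_target = {"axis": m.vector, "limit_deg": m.angle_deg}       # cable rig
```

The point is not the toy syntax but the shape of the pipeline: parse once, then every domain reads the fields it cares about from the same record.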
The silos weren't necessarily planned, but they're real.
Gym machines track joint angles in proprietary firmware. That data lives on their hardware and doesn't follow you to the next facility.
Motion capture, biomechanics, and EMG each have their own file formats. Reading the data often requires the same software that generated it.
Physical therapy and clinical notes are written in prose. Rich in detail, but no other system can parse or reuse them.
Biomechanics, sports medicine, dance, and animation each developed their own terminology for the same body doing the same things.
Game engines use skeleton data with no concept of which muscles fired or why. The movement looks right but carries no physiological meaning.
The string you write in the gym is the same string a game engine executes. Your data. Your format. Portable.
Existing standards each cover one layer. MNN covers all of them in a single string.
| Domain | Standard | What It Captures | What It Lacks |
|---|---|---|---|
| Joint angles | ISB JCS | Per-joint Euler rotations | No muscle/nerve, no text format |
| Motion data | C3D / BVH / OpenSim | Full-body time-series | No semantic layer, not human-authored |
| Muscle activity | EMG + SENIAM | Voltage waveforms | No symbol table, no nerve mapping |
| Exercise dose | ACSM FITT / NSCA | Sets × reps × load | No joint position, no targeting |
| Choreography | Labanotation | Spatial path, effort quality | No nerve, no muscle, no vector |
| All of the above | MNN | Muscle + nerve + joint + vector + compensation | (none) |
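To make the last row of the table concrete, here is a hedged sketch of a single record holding all five layers the other standards cover separately. The field names and values are invented for illustration; nothing here comes from the MNN spec itself.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class MnnRecord:
    """Hypothetical composite record mirroring the table's rows."""
    joint_angles: dict        # ISB-style per-joint rotations (degrees)
    muscle: str               # prime mover (EMG/SENIAM layer)
    nerve: str                # innervating nerve
    vector: tuple             # movement direction
    dose: tuple               # (sets, reps, load_kg), FITT-style
    compensation: Optional[str] = None  # clinical observation, if any

rec = MnnRecord(
    joint_angles={"shoulder_flexion": 120.0},
    muscle="deltoid.ant",
    nerve="axillary",
    vector=(0.0, 1.0, 0.0),
    dose=(3, 10, 20.0),
    compensation="scapular hike on final rep",
)
```

Each existing standard in the table corresponds to one or two fields of this record; the claim being illustrated is simply that one structure (or one string encoding it) can carry all of them at once.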
The MNN builder is live. Build a notation string, log a workout, pose an avatar.