Researcher Jonathan Matthis has launched The FreeMoCap Project: an ambitious attempt to develop a low-cost, research-quality markerless optical motion-capture system.

"Introducing the system! A free, open-source framework for easy-to-use, low-cost markerless motion capture! The current iteration relies on #anipose, #openpose and Animation via #OpenScience /kukQ7EtjFU"

## An open-source markerless mocap system that runs on consumer webcams

The promising open-source framework can generate full-body skeletal motion from footage of an actor captured on two USB webcams, and comes with a Blender integration plugin.

The system processes video footage of an actor to estimate a pose for each frame, then translates that into skeletal motion data that can be retargeted to a 3D character.

Matthis says that his ultimate aim is to enable "a 14-year-old with no technical training and no outside assistance to recreate a research-grade motion capture system for less than 100 US dollars".

## Warning: it's still very early in development

First up, a caveat: in its initial release, The FreeMoCap Project is more "one to watch" than "one to use".

There isn't any documentation, so you'll need a bit of tech savvy to get it to work, and the GitHub repo comes with the disclaimer that "this still isn't really in a state … for outside users yet".

But with that out of the way, let's take a look at what the FreeMoCap Project aims to do.
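Pipelines of this kind typically work in two stages: a 2D pose estimator (such as OpenPose) finds joint keypoints in each camera's frames, and those keypoints are then triangulated into 3D positions using the calibrated camera geometry. As a rough illustration of the triangulation step only (not FreeMoCap's actual code), here is a minimal direct-linear-transform sketch in NumPy; the projection matrices and the test point are made up for the example.

```python
import numpy as np

def triangulate(P1, P2, x1, x2):
    """Recover a 3D point from two views via the direct linear
    transform (DLT). P1, P2 are 3x4 camera projection matrices;
    x1, x2 are the (u, v) image coordinates of the same joint."""
    # Each view contributes two linear constraints on the
    # homogeneous 3D point X, stacked into A X = 0.
    A = np.array([
        x1[0] * P1[2] - P1[0],
        x1[1] * P1[2] - P1[1],
        x2[0] * P2[2] - P2[0],
        x2[1] * P2[2] - P2[1],
    ])
    # The least-squares solution is the right singular vector
    # with the smallest singular value.
    _, _, Vt = np.linalg.svd(A)
    X = Vt[-1]
    return X[:3] / X[3]  # de-homogenize

def project(P, X):
    """Project a 3D point through a 3x4 camera matrix."""
    x = P @ np.append(X, 1.0)
    return x[:2] / x[2]

# Toy setup: one camera at the origin, a second shifted along x.
P1 = np.hstack([np.eye(3), np.zeros((3, 1))])
P2 = np.hstack([np.eye(3), np.array([[-1.0], [0.0], [0.0]])])
X_true = np.array([0.5, 0.2, 4.0])          # made-up "joint" position
x1, x2 = project(P1, X_true), project(P2, X_true)
X_est = triangulate(P1, P2, x1, x2)
print(np.allclose(X_est, X_true, atol=1e-6))  # True
```

With noise-free projections the DLT recovers the point exactly; with real keypoint detections, tools in this space typically triangulate in a least-squares sense across all cameras and then filter the result over time.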