We've seen motion capture technology used on pro athletes and actors to insert their likenesses into video games, or to drive CGI special effects (it's the reason many of our blockbuster movies are, at heart, high-quality cartoons).
But researchers at the University of Bath didn't think that data went far enough (or perhaps wasn't inclusive enough), so they set their sights on dogs. Researchers from the University's Centre for the Analysis of Motion, Entertainment Research & Applications (CAMERA) are taking dogs from a local rescue shelter and outfitting them with coats covered in reflective markers.
When infrared light hits the markers, it is reflected back to cameras placed around the studio, which record each marker's position in 3D. That data is then used to reconstruct the dog's movement on a computer screen. The hope is that this richer data will make animated dogs look more realistic.
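The article doesn't describe CAMERA's actual software, but the core idea behind marker-based capture is simple triangulation: the same marker seen by two (or more) calibrated cameras pins down a 3D point. A minimal sketch, assuming idealized pinhole cameras and the standard linear (DLT) method; all camera parameters here are made up for illustration:

```python
import numpy as np

def projection_matrix(f, R, t):
    """Build a 3x4 pinhole projection matrix from focal length f,
    rotation R, and translation t (illustrative values only)."""
    K = np.array([[f, 0, 0], [0, f, 0], [0, 0, 1.0]])
    return K @ np.hstack([R, t.reshape(3, 1)])

def triangulate(P1, P2, uv1, uv2):
    """Linear triangulation: find the 3D point that best explains
    the two 2D marker observations (least squares via SVD)."""
    A = np.vstack([
        uv1[0] * P1[2] - P1[0],
        uv1[1] * P1[2] - P1[1],
        uv2[0] * P2[2] - P2[0],
        uv2[1] * P2[2] - P2[1],
    ])
    _, _, Vt = np.linalg.svd(A)
    X = Vt[-1]
    return X[:3] / X[3]  # de-homogenize

# Two cameras: one at the origin, one shifted 1 m along the x-axis.
I = np.eye(3)
P1 = projection_matrix(1000.0, I, np.zeros(3))
P2 = projection_matrix(1000.0, I, np.array([-1.0, 0.0, 0.0]))

marker = np.array([0.3, -0.2, 4.0])   # true marker position (metres)
h = np.append(marker, 1.0)
uv1 = (P1 @ h)[:2] / (P1 @ h)[2]      # pixel coordinates in camera 1
uv2 = (P2 @ h)[:2] / (P2 @ h)[2]      # pixel coordinates in camera 2

recovered = triangulate(P1, P2, uv1, uv2)
print(np.round(recovered, 3))         # recovers the marker's 3D position
```

A real studio rig uses many cameras and solves the same problem with more views per marker, which makes the estimate robust to occlusion, for example when a dog's leg briefly hides a marker from one camera.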
Typically, when an actor portrays a dog, they get down on all fours and move around, and software maps their performance onto the animal. The researchers hope this data will make those human actions easier to translate into natural animal movements.
Martin Parsons, the head of the studio at CAMERA, compares it to the way a puppeteer brings a puppet to life. With this software, a human can act out the basic movements and be transformed into a photorealistic representation of a specific dog breed.
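The article doesn't say how CAMERA implements this mapping, but the general technique is motion retargeting: captured joint motion from one skeleton is carried over to another through a joint correspondence, with corrections for the differing rest poses. A toy sketch under those assumptions; the joint names and offset values are invented for illustration:

```python
# Hypothetical human-to-dog joint correspondence (not CAMERA's actual rig).
HUMAN_TO_DOG = {
    "left_hand": "left_front_paw",
    "right_hand": "right_front_paw",
    "left_foot": "left_hind_paw",
    "right_foot": "right_hind_paw",
    "spine": "spine",
}

# Assumed per-joint corrections (degrees) for the dog's different rest pose.
REST_OFFSET_DEG = {
    "left_front_paw": -35.0,
    "right_front_paw": -35.0,
    "left_hind_paw": 20.0,
    "right_hind_paw": 20.0,
    "spine": 0.0,
}

def retarget(human_pose):
    """Translate one frame of {joint: angle_deg} human motion
    into the corresponding dog pose."""
    dog_pose = {}
    for human_joint, angle in human_pose.items():
        dog_joint = HUMAN_TO_DOG.get(human_joint)
        if dog_joint is not None:
            dog_pose[dog_joint] = angle + REST_OFFSET_DEG[dog_joint]
    return dog_pose

frame = {"left_hand": 10.0, "right_foot": -5.0, "spine": 3.0}
print(retarget(frame))
# {'left_front_paw': -25.0, 'right_hind_paw': 15.0, 'spine': 3.0}
```

Production systems retarget full 3D rotations and run the result through a trained model or constraint solver so the dog moves like a dog rather than like a crawling human, which is exactly where the shelter-dog capture data comes in.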
The data will also be used as part of collaborative R&D projects that are driving the next generation of design tools used in the visual effects and gaming industries.