Why Researchers Are Strapping Dogs with Coats Covered in Reflective Markers

We've seen motion capture technology used on pro athletes and actors to insert their likenesses into video games or to create CGI special effects (it's the reason most of our blockbuster movies aren't much more than high-quality cartoons).

But researchers at the University of Bath didn't think the data went far enough (or perhaps wasn't inclusive enough), so they set their sights on dogs. Researchers from the University's Centre for the Analysis of Motion, Entertainment Research & Applications (CAMERA) are taking dogs from a local rescue shelter and outfitting them with coats covered in reflective markers.

When infrared light hits the markers, it is reflected back to cameras placed around the studio, which record the markers' positions in 3D. That data is then used to reconstruct the dog's movement on a computer screen. The hope is that better data will make animated dogs look more realistic.
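
For readers curious how several 2D camera views become a single 3D marker position, here is a minimal sketch of the linear triangulation (direct linear transform) step that optical motion-capture pipelines commonly rely on. The function name, camera matrices, and marker coordinates below are illustrative placeholders, not details of CAMERA's actual system.

import numpy as np

def triangulate_marker(proj_matrices, pixel_points):
    """Estimate a marker's 3D position from two or more camera views.

    proj_matrices: list of 3x4 camera projection matrices P = K [R | t]
    pixel_points:  list of (u, v) pixel coordinates of the marker in each view
    Returns the estimated (x, y, z) position in world coordinates.
    """
    rows = []
    for P, (u, v) in zip(proj_matrices, pixel_points):
        # Each view adds two linear constraints on the homogeneous point X:
        # u * (P[2] @ X) = P[0] @ X  and  v * (P[2] @ X) = P[1] @ X
        rows.append(u * P[2] - P[0])
        rows.append(v * P[2] - P[1])
    A = np.stack(rows)
    # The least-squares solution is the right singular vector with the smallest
    # singular value; dividing out the homogeneous scale gives (x, y, z).
    _, _, vt = np.linalg.svd(A)
    X = vt[-1]
    return X[:3] / X[3]

# Toy example: two cameras with identity intrinsics, the second shifted 1 m
# along x, both observing a marker at (0.5, 0.2, 3.0).
P1 = np.hstack([np.eye(3), np.zeros((3, 1))])
P2 = np.hstack([np.eye(3), np.array([[-1.0], [0.0], [0.0]])])
marker = np.array([0.5, 0.2, 3.0, 1.0])
observations = []
for P in (P1, P2):
    x = P @ marker
    observations.append((x[0] / x[2], x[1] / x[2]))
print(triangulate_marker([P1, P2], observations))  # ~ [0.5, 0.2, 3.0]

In a studio setup, dozens of calibrated cameras contribute observations of each reflective marker many times per second, and the resulting 3D trajectories are fitted to a skeleton of the animal.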

Typically, when an actor portrays a dog, they actually get down on all fours and move around, and software transforms them into an animal. The researchers hope to use this data to make those performances easier to translate into natural animal movements.

Martin Parsons, the head of the studio at CAMERA, compares it to the way a puppeteer brings a puppet to life. With this software, a human can act out the basic movements and be transformed into a photorealistic representation of a specific dog breed.

The data will also be used as part of collaborative R&D projects that are driving the next generation of design tools used in the visual effects and gaming industries.
