Biomechanical turk

At a research meeting today, one of my colleagues was bemoaning the fact that there isn’t enough publicly available open source human movement data. That is, data recorded on a motion capture stage of people walking around, waving, sitting, standing, and doing all sorts of natural movements.

Medical researchers use such data to understand human movement and to study balance disorders. Animation studios and game designers use it to create animated characters. Computer scientists use it to train machine learning algorithms that mimic human movement.

But there just isn’t enough of it. There is an open source human movement library at CMU, but it’s pretty much the only one, and it isn’t nearly extensive enough. Most of its recordings come from a single young white male grad student, which is nowhere near representative of a general population.

So in the meeting, I suggested that we take a tip from Amazon’s Mechanical Turk, a system that lets people earn money by solving on-line tasks that computers aren’t yet good at. The name was inspired by Wolfgang von Kempelen’s famous 18th century hoax: a chess-playing “automaton” with a human player hidden inside.

We could pay people to act out movements. Participants would see somebody moving or gesturing on their computer screen, then imitate it, and the result would be recorded by their webcam. In this way we could capture all of the many variations in movement style that you get from a diverse population.

In general, figuring out human movement from video is hard, but it gets a lot easier if you know what movement to look for. So if you know you are looking at people imitating a particular movement or gesture, it’s fairly straightforward.
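One way to make “does this imitation match the known movement?” concrete is to compare the two clips as time series and allow for differences in speed. Here is a minimal sketch using dynamic time warping, assuming each clip has already been reduced by a pose estimator (not shown) to a sequence of joint-angle frames; the function name and the toy data are made up for illustration.

```python
# Sketch: scoring an imitation against a known reference movement.
# Assumes each clip is a list of frames, each frame a tuple of joint
# angles produced by some upstream pose estimator (hypothetical here).

def dtw_distance(ref, imit):
    """Dynamic time warping distance between two joint-angle sequences,
    so an imitation performed faster or slower than the reference can
    still score as a close match."""
    n, m = len(ref), len(imit)
    INF = float("inf")
    cost = [[INF] * (m + 1) for _ in range(n + 1)]
    cost[0][0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            # Euclidean distance between two frames of joint angles.
            d = sum((a - b) ** 2 for a, b in zip(ref[i - 1], imit[j - 1])) ** 0.5
            # Extend the cheapest alignment: skip a reference frame,
            # skip an imitation frame, or match the two frames.
            cost[i][j] = d + min(cost[i - 1][j], cost[i][j - 1], cost[i - 1][j - 1])
    return cost[n][m]

# Toy example: the "imitation" is the reference played at half speed.
reference = [(0.0, 10.0), (5.0, 20.0), (10.0, 30.0)]
imitation = [(0.0, 10.0), (0.0, 10.0), (5.0, 20.0), (5.0, 20.0), (10.0, 30.0)]
print(dtw_distance(reference, imitation))  # 0.0 — a perfect, slower match
```

Because we know in advance which reference movement the participant was imitating, a single distance score against that one clip is enough; there is no open-ended recognition problem to solve.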

We can also add a layer of verification: We pay other people to look at those movements, and we filter out the ones where the imitation clearly doesn’t match the original.
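That verification step could be as simple as a vote: show each recorded imitation to several paid reviewers and keep it only if enough of them agree it matches the original. A minimal sketch, with the function name and the 70% threshold chosen arbitrarily for illustration:

```python
# Sketch: filtering imitations by reviewer agreement.
# Each imitation receives yes/no judgments from several paid reviewers.

def keep_imitation(votes, threshold=0.7):
    """Keep a clip only if the fraction of reviewers who judged it a
    good match meets the (hypothetical) agreement threshold."""
    return sum(votes) / len(votes) >= threshold

print(keep_imitation([True, True, True, False]))  # True  (75% agreement)
print(keep_imitation([True, False, False]))       # False (33% agreement)
```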

The best thing about this project is that I already have a cool name for it: Biomechanical Turk.

One thought on “Biomechanical turk”

  1. At the pre-dawn of cinema, Eadweard Muybridge photographed hundreds of short scenes in a series called “The Human Figure in Motion”. He used a wide range of models taking part in many different activities. Many were photographed from multiple camera angles.

    Of course, the frame registration and photo quality of equipment 150 years ago lags a bit behind your iPhone, but some straightforward image processing could clean that up. And hey, it’s all in the public domain by now.
