In three to five years, we might all have highly accurate 3-D digital avatars of our own bodies, with potential uses ranging from plopping our digital selves into a videogame to shopping online for bespoke suits to visualizing what we’ll look like in 30 years. At least, that’s according to Bill O’Farrell, CEO of Body Labs, which bills itself as “the world’s most advanced 3-D body modeling platform.” Body Labs announced today that it has raised $2.2 million in a seed round of funding led by FirstMark Capital, with participation from New York Angels.

The Body Labs platform, which is built on data culled from many thousands of scans of the human body, has a grisly origin story. In 2002, Michael Black, the co-founder of Body Labs, was researching how to create a reliable statistical model of the human body and gearing up to teach a course on computer vision (the machine analogue of human sight, in which computers analyze visual information from video) at Brown University. Before the course began, the Virginia state police contacted him, hoping to enlist his expertise: there had been a robbery and a murder at a 7-Eleven, and law enforcement officials wanted to use computer vision to help identify the perpetrator in the surveillance footage. “Michael basically said, ‘I’m changing the course syllabus,’” O’Farrell says. “‘We’re going to break into teams and try to figure out this problem of how to accurately identify a human being through computer vision techniques.’” Ultimately, the class’s computer vision work corroborated some of the evidence in the case; it helped confirm the perpetrator’s height, for example.

A Body Labs digital avatar comparing a woman at 11 weeks and 4 days pregnant with the same woman at 17 weeks and 4 days.

Black’s group created the statistical model of the human body that became the basis of Body Labs. They drew on existing databases of many thousands of scans of people of varying shapes, each annotated with exact measurements, ethnicity, gender, age, weight, and height. “From this data, they created a statistical model of all the possible shapes a human could have and all the possible poses a human could be in,” O’Farrell says. The platform can then take incoming data, either body measurements or scanner output, and create a highly realistic, anatomically accurate digital avatar of a specific person. “It could be you, or me, or a hypothetical prototype,” O’Farrell says, “and we can make that avatar run through any motion available, whether it’s running, jumping, kicking, or swimming, with full fidelity to the way a human really looks and moves.”
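Statistical shape models of this kind are commonly built by running principal component analysis over scans whose vertices have been put into point-to-point correspondence. The sketch below is a minimal illustration of that general technique, not Body Labs' actual pipeline; the function names and the toy data are invented for the example.

```python
import numpy as np

def build_shape_model(scans, n_components=10):
    """Build a PCA shape space from point-aligned body scans.

    scans: (n_subjects, n_vertices * 3) array; every row is one body
    whose vertices are in point-to-point correspondence with the others.
    """
    mean_body = scans.mean(axis=0)
    centered = scans - mean_body
    # SVD of the centered data yields the principal directions
    # of body-shape variation across the population
    _, _, components = np.linalg.svd(centered, full_matrices=False)
    return mean_body, components[:n_components]

def synthesize_body(mean_body, components, coeffs):
    """Generate a new body as the mean plus weighted shape directions."""
    return mean_body + coeffs @ components

# Toy example: 20 "subjects", each a 5-vertex mesh (15 coordinates)
rng = np.random.default_rng(0)
scans = rng.normal(size=(20, 15))
mean_body, basis = build_shape_model(scans, n_components=3)
new_body = synthesize_body(mean_body, basis, np.array([0.5, -0.2, 0.1]))
print(new_body.shape)  # (15,)
```

Once the low-dimensional shape space exists, fitting a new person is a matter of finding the coefficients that best explain their measurements or scan.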

Until a few years ago, getting your body scanned required a quarter-million-dollar machine. “We had one on loan from the Army,” O’Farrell says. “It wasn’t even that accurate; it missed spots, like your underarms.” But scanning technology is becoming widespread: the Microsoft Kinect, for example, is a $200 piece of hardware that attaches to a PC. As it stands, consumers can scan their bodies with any existing body scanner (“We’re scanner agnostic,” O’Farrell says) and upload the file to BodyHub, Body Labs’ digital platform. Body Labs then sends back a 3-D .obj file of their digital avatar.
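The .obj file Body Labs returns is the standard Wavefront mesh format: plain text, with `v` lines listing vertex coordinates and `f` lines listing faces by 1-based vertex index. As a hedged sketch (the writer function and toy triangle are illustrative, not part of Body Labs' tooling), a minimal .obj can be produced like this:

```python
def write_obj(path, vertices, faces):
    """Write a minimal Wavefront .obj mesh.

    vertices: list of (x, y, z) tuples
    faces: list of vertex-index tuples (0-based; .obj faces are 1-based)
    """
    with open(path, "w") as f:
        for x, y, z in vertices:
            f.write(f"v {x} {y} {z}\n")
        for tri in faces:
            f.write("f " + " ".join(str(i + 1) for i in tri) + "\n")

# Toy mesh: a single triangle
write_obj("avatar.obj", [(0, 0, 0), (1, 0, 0), (0, 1, 0)], [(0, 1, 2)])
```

Because the format is plain text and widely supported, the avatar file can be opened in most 3-D tools, game engines, and 3-D printing software.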

The same woman at 11 weeks and 4 days pregnant compared with 35 weeks and 5 days. Note the deepening S-curve in the small of her back, indicative of classic late-pregnancy back pain.

In the apparel design industry, Body Labs could make it easier for companies to design clothing for specific body types and offer mass customization, shortening the design cycle. Say a sports apparel company is designing a tracksuit specifically for sprinters: “Body Labs could say, okay, give us scans for 12 actual sprinters. We make those into body models, then align these bodies at a point-to-point level, then average them,” O’Farrell explains. “When you average 10 unique sprinter bodies, you can say, ‘Here’s your prototypical sprinter. Here’s size medium, small, large.'” The apparel company can then use measurement data to inform the design of its tracksuit, making for a better fit. Clothing manufacturers could also use the platform to target consumers in specific geographical regions: “Say a client is trying to break into the Chinese market,” O’Farrell says. “We could use our data to find the breadth of shapes of 18- to 25-year-old women in China, and the company could then design their clothing sizes around a statistically calculated geometry.”
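The point-to-point alignment O'Farrell describes is what makes averaging meaningful: once every scan has the same vertices in the same order, the "prototypical" body is simply a per-vertex mean, and graded sizes can be blended between shapes. The sketch below illustrates that idea with invented names and toy data; it is not Body Labs' actual sizing method.

```python
import numpy as np

def average_body(aligned_scans):
    """Per-vertex mean of scans in point-to-point correspondence.

    aligned_scans: (n_bodies, n_vertices, 3) array
    """
    return aligned_scans.mean(axis=0)

def grade_size(base, target, t):
    """Linearly blend from a base body toward a target shape (0 <= t <= 1)."""
    return (1 - t) * base + t * target

# Toy example: 12 "sprinter" scans, each a 5-vertex mesh
rng = np.random.default_rng(1)
sprinters = rng.normal(size=(12, 5, 3))
prototype = average_body(sprinters)            # prototypical sprinter
larger = grade_size(prototype, prototype * 1.1, 1.0)  # scaled-up grade
print(prototype.shape)  # (5, 3)
```

Without the alignment step, averaging raw scans would blur together unrelated points (an elbow with a rib, say), which is why the correspondence problem is central to this kind of modeling.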

In a matter of years, “Everyone will have their own digital body model,” O’Farrell predicts. “You’ll be able to upload the avatar to sites to, say, shop for your body shape on Amazon, or send the file to a ski company to order custom-made ski boots. You could use it to compare body shapes with matches on Match.com, to make sure that guy is as athletic and fit as he says he is.” And it will contribute to the growing quantified-self movement: “People want to see how their bodies change over time,” O’Farrell says. Gym rats could scan their bodies weekly to track bulk-up progress, or expectant mothers could use the technology to track how their bodies change before and after pregnancy. You could print yourself, as a big mannequin or a mini-me.