3D Body Measurement on a Smartphone

From Personal Science Wiki
Project Infobox
Self researcher(s): Eleanor Watson
Related tools: camera, phone
Related topics: Diet and weight loss, Body measurements
Has inspired: Projects (0)

Show and Tell Talk Infobox
Date: 2013/05/12
Event name: 2013 QS Europe Conference
This content was automatically imported. See here how to improve it if any information is missing or outdated.

3D Body Measurement on a Smartphone is a Show & Tell talk by Eleanor Watson that has been imported from the Quantified Self Show & Tell library. The talk was given on 2013/05/12 and is about Diet and weight loss, and Body measurements.

Description

A description of this project as introduced by Quantified Self follows:

Nell Watson runs a company called Poikos. Poikos enables one to measure their body in 3-D on a smartphone, tablet, or PC in seconds with no extra hardware. They are taking this data and applying it to new technologies.

Video and transcript

A transcript of this talk is below:

Hi, I’m Nell Watson. I run a company called Poikos, which I founded. We enable you to measure your body in 3-D on a smartphone or tablet in a couple of seconds. Now, I set off on a journey to enable mass customization. So that’s being able to customize goods, make stuff that’s perfectly suited to you, cheaply and affordably, and I found it was pretty difficult because of all the trouble of getting measurements in. How can you configure something if you don’t have the specifications?

So I set out to change that, and we created a technology for measuring the body using just a smartphone, a tablet, or a PC, in seconds, with no extra hardware. So this is the web-based version. This is running in Flash. We ask the user to put in their height. The user stands about three meters back from the device. They’re guided into position just using the camera, watching the screen. Three, two, one, one shot is taken from the front. And then turn to the side; three, two, one, one shot is taken from the side. Front, side, that’s all it is. Two snapshots, very simple. Then the clever bit: we recombine those two images in the cloud in a couple of seconds. We use image segmentation to cut out the edge of the person, which is easy for people and hard for machines. You’ll notice she’s wearing a dress and we’re able to see the body underneath. Then we combine those two images together into a 3-D model of the person. It takes about 10 seconds for the pictures and about five seconds of processing, and we have accuracy around the body of about eight millimeters. So we enable easy body measurement in seconds for anyone, and we have an API, we have a software development kit, and we offer a white label if people just want to take it and do interesting stuff with it.

So we took it into mass customization and we took it into retail, and that was cool, happily ever after. But then I realized something. I realized that when you take measurements you’re actually taking a snapshot of history. So I started using our technology to measure my own body. And I measured how it changed over time, as my body grew, as my body shrank, and I found it very, very powerfully motivating to see an ideal version of me: the same structure with less body weight. And I lost 15 kilograms in five months using our system. There’s the proof, the before on the left and the after on the right. So this technology for mass customization, for retail, I thought, wow, I wonder what other people would be interested in this. And then I discovered, just six or seven weeks ago, that QS, Quantified Self, that lots of people around the world are doing this stuff, and I thought it was just me. Wow!

And so now I’m coming up with this sort of idea of a solution called QSU, and an early adopter is a wonderful gentleman called Doctor Lex Houdijk of Alkmaar, who wants to use our technology, in concert with others, for patient risk classification, so you could say whether people are at risk of certain conditions or not. Something else which I think is really interesting is that we don’t just take measurements, but we take volumes of the body, and that’s something that’s pretty difficult to do. But if you have data like we can generate and you cross-reference that with MRI data, then you can look inside the body and get a virtual MRI on a smartphone. Now it’s low-fidelity, and we’ve had to create all kinds of technology in order to boost it. Even though the pictures may be crap, we can take pretty good measurements from them. And now we are taking that data and we are applying it to new technologies; this is Med-Sensation, I met them at Future Med in Palo Alto; they have technology for looking at breast cancer. Well, how do you get the body? Well, that’s from our system. So I’m looking for people that want to plug our data into what you guys are doing, or maybe vice versa; that’s why I’m here.

But I’m also interested in talking to people who would like to work with our technology just to play with it, just to use it day by day, because what we’re creating is a global anthropometric database: a huge census of people all around the world, of different ages, genders, and ethnicities, and how their bodies look, so you can see how you compare to your peer group. I’m Nell Watson. Our company is Poikos, and you can check it out right now at QS.ME if you have a laptop.

Thank you very much.
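
The capture pipeline described in the talk (a user-entered height, one front and one side snapshot, cloud-side image segmentation) can be illustrated with a small sketch. The code below is not the Poikos algorithm; it is a toy approximation, assuming the two silhouettes have already been segmented, that shows how a known height converts silhouette pixels into metres and how an elliptical cross-section yields a rough girth from the two views. All function names and the elliptical model are illustrative assumptions.

```python
import math

import numpy as np


def silhouette_height_px(mask):
    """Number of pixel rows spanned by the segmented body."""
    rows = np.where(mask.any(axis=1))[0]
    return int(rows[-1] - rows[0] + 1)


def width_px_at(mask, fraction):
    """Body width in pixels at a given fraction of the silhouette height
    (0.0 = top of the body, 1.0 = feet)."""
    rows = np.where(mask.any(axis=1))[0]
    row = rows[int(fraction * (len(rows) - 1))]
    cols = np.where(mask[row])[0]
    return int(cols[-1] - cols[0] + 1)


def estimate_girth_m(front_mask, side_mask, height_m, fraction):
    """Rough circumference at one body level, assuming an elliptical
    cross-section: the front view supplies the width, the side view the
    depth; the user-entered height converts pixels to metres."""
    m_per_px_front = height_m / silhouette_height_px(front_mask)
    m_per_px_side = height_m / silhouette_height_px(side_mask)
    a = width_px_at(front_mask, fraction) * m_per_px_front / 2.0  # semi-axis: half width
    b = width_px_at(side_mask, fraction) * m_per_px_side / 2.0    # semi-axis: half depth
    # Ramanujan's approximation for the perimeter of an ellipse
    h = (a - b) ** 2 / (a + b) ** 2
    return math.pi * (a + b) * (1 + 3 * h / (10 + math.sqrt(4 - 3 * h)))


# Toy usage with synthetic rectangular silhouettes (True = body pixel):
front = np.zeros((180, 60), dtype=bool)
front[10:170, 15:45] = True   # 160 px tall, 30 px wide
side = np.zeros((180, 60), dtype=bool)
side[10:170, 22:38] = True    # 160 px tall, 16 px deep
print(round(estimate_girth_m(front, side, height_m=1.70, fraction=0.5), 3))
```

A real system would fit a statistical body model to the two silhouettes rather than assume elliptical cross-sections, which is how accuracies like the eight millimeters quoted in the talk become plausible; this sketch only conveys the geometry of the two-view, known-height idea.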

About the presenter

Eleanor Watson gave this talk.