Have you ever tried to touch and move something on a screen that isn’t touch-sensitive? As my computer screen will attest, my kids try it ALL THE TIME – flipping through photos, tapping ‘OK’, spinning 3D models – none of which can be done on my monitor yet. We’re in that transition, and if the research a PhD student at Purdue University is working on makes it to the mainstream, one day you (or your kids) may be at another transition, futilely waving a hand at a screen that can’t yet turn the gesture into 3D geometry.

Handy Potter

Vinayak is a graduate of the Indian Institute of Science and a PhD student at Purdue’s School of Mechanical Engineering. He works in the Computational Design and Innovation Lab, where he has been developing a system that lets people create, modify, and manipulate 3D geometry through natural interaction. It’s called Handy Potter: it gets rid of the mouse and uses your skeleton as the main input.

We track the skeleton of users using the depth data provided by a low-cost depth-sensing camera (Kinect™). Our modeling tool is configurable to provide a variety of implicit constraints for shape symmetry and resolution based on the position, orientation, and speed of the arms.
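To make the pottery metaphor concrete, here’s a minimal sketch (my own illustration, not Purdue’s code) of one way a tracked hand path could drive geometry: treat the hand’s distance from the body axis and its height as a 2D profile, then revolve that profile around a vertical axis, the way a potter’s hands shape a pot on a wheel. The `hand_path` data and function names are hypothetical.

```python
import math

def lathe_profile(hand_path, segments=16):
    """Revolve a 2D profile of (radius, height) pairs around the
    vertical (Y) axis, returning a list of 3D vertices.

    hand_path: hypothetical tracked hand samples, each a (radius,
    height) pair, e.g. from a depth camera's skeleton data.
    """
    vertices = []
    for radius, height in hand_path:
        # Sweep this profile point around a full circle.
        for i in range(segments):
            theta = 2 * math.pi * i / segments
            vertices.append((radius * math.cos(theta),
                             height,
                             radius * math.sin(theta)))
    return vertices

# A crude "pot" profile, as if swept by a hand: wide base, narrow neck.
profile = [(0.5, 0.0), (0.7, 0.3), (0.4, 0.6), (0.2, 0.9)]
mesh = lathe_profile(profile, segments=8)
print(len(mesh))  # 4 profile points x 8 segments = 32 vertices
```

The real system goes well beyond this, of course: per the quote, arm position, orientation, and speed feed implicit constraints on symmetry and resolution, rather than just a single profile curve.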

That’s it. I propose a model/dance-off. This reminds me of solidThinking Inspire, which lets you create a general shape, run “morphogenesis,” and explore the different designs, though that’s done entirely with a mouse. Honestly, given how long the Kinect has been out and the scanning mods that have been shown, I thought we would have seen this, and something beyond a digital potting wheel, by now. Your thoughts?

Source: Purdue

Via: Futurity