An official Microsoft blog post highlights the company's plans for the future of how we'll interact with computers. Apparently, it's touchscreens and "natural user interfaces" all the way. Has MS finally realized Windows isn't touchscreen friendly?

In its blog post "Microsoft is imagining a NUI future," Steve Ballmer's company describes pretty accurately where personal technology is headed right now: "You don't have to look very far to realize that technology is becoming more natural and intuitive," it begins. It then notes, "In a typical day, many people use touch or speech to interact with technology–on their phones, at the ATM, at the grocery store and in their cars." And while Microsoft technology often sits behind your ATM (as an embedded OS you never get to see), it's really not winning market share in the smartphone or tablet PC business.

Microsoft championed the tablet PC a decade ago, driven by the fervor of Bill Gates. But the devices Gates was touting were very different from the tablets of today, led by Apple's iPad. Partly out of technological necessity, partly due to slightly askew design thinking, those early MS tablets were essentially transformable laptops with touch-sensitive screens, relying on the user manipulating a stylus in a slightly modified, pen-friendly version of Windows. They didn't catch on.

Things, of course, have changed. And as MS notes, "The learning curve for working with computers is becoming less and less of a barrier thanks to more natural ways to interact." It seems Microsoft learned its lesson from Surface, which has won many a fan around the world despite its huge cost. It's also learning from Kinect, which is becoming wildly popular as an input tool for PCs–albeit thanks to clever hackers, since MS didn't think to release PC or Mac drivers for it. Natural user interfaces are the future, MS thinks.

And they’ll be cleverer than touchscreen tablets or gesture-sensing games. MS foresees NUI technology rapidly advancing from its current sensor-centric state to include “knowledge of what you’re trying to do (contextual awareness)” and “where you are and what is around you (environmental awareness).”

By combining clever processing with Kinect-style sensors and touchscreens, MS imagines that its systems will become much more intuitive. The hope is that they become “almost invisible,” in fact, and not a barrier between you and what you want your PC to do. Add in haptic feedback, where your devices communicate back to you in non-visual ways to add to the quality of interactivity, and you’ve got some very powerful thinking here.

MS has done research to back this thinking, looking at the level of understanding of NUI around the world as well as its future potential–check out MS's infographic by clicking through for a large-screen version.