Project Glass: Developers' verdicts on Google's headset By Laura Locke

Technology reporter, I/O conference, San Francisco. Published 28 June 2012

Image captions: Google's co-founder interrupted another presentation to discuss Project Glass at the firm's I/O developers conference; the headset's engineers said they thought it would be used mainly to record videos and photos, and to see information about the user's location; other Google employees were later seen sporting the headsets outside the San Francisco event; Google has practised recording videos with the devices while skydiving; a concept video released earlier this year suggested the headsets could be used to flash text messages from friends; Google signalled that it would like to trigger location-specific information for users wearing the eyewear; the search giant plans to link the headsets to its Google Maps service to help guide users to their desired destinations; Google also suggests the device could ultimately be used to hold video conference calls in which the user broadcasts their point of view; Google took a photo using the device in San Francisco; for now the explorer edition of the headset is only being made available to developers for pre-order at Google's I/O conference.

Google's augmented reality headsets remain prototypes, but the firm appears determined to bring them to market.

It showed off the devices during one of the flashiest tech presentations to date at its I/O developers conference on Wednesday.

The presentation involved live videos streamed from headsets worn by skydiving employees and daredevil cyclists, as well as the announcement that attendees could pre-order test versions of the product.

Project Glass can record and stream video and display information through a small transparent screen above the user's right eye.

The product is controlled by voice or via a small touchpad on the right arm of the device; can be connected to the internet via a tethered phone; and lasts for about six hours thanks to an attached battery - although the All Things D site reports that Google intends to extend that to a day.

Web searches, email, and photos can be seen in-frame, and the company intends to add more functions over time.

Google aims to release the eyewear to consumers before the end of 2014, but developers are being offered the chance to buy an "explorer" edition to start work on related software.

The catch is that they are being charged $1,500 (£965) for the privilege, must be US-based and have attended the conference, and will have to wait for delivery until "early next year".

The BBC spoke to four developers at I/O to see what they made of the announcement.

$1,500 is really a drop in the bucket when you think about what they've actually been able to do.

I used to do research and development at Motorola [part of which is now owned by Google] and for 12 years we were developing all sorts of technology to make it very easy for people in public safety - first responders, police, fire and rescue - to carry out their mission-critical tasks.

Time and concentration are the most important aspects of their job, and when you are in a very stressful situation like trying to save someone's life or fight a fire, focusing on your phone or another device is very, very difficult.

So having glasses that allow you to see in your field of view information that's relevant to a mission that you're trying to carry out in a very stressful situation is incredibly valuable.

$1,500 is nothing - the prototypes that we used to work with were $15,000.

Mr Kostresevic was the 427th developer to pre-order Project Glass.

It's a far-out idea - it's still in its infancy I think.

It's an interesting concept but I don't see myself wearing a device like that because I wear prescription glasses.

Will there be something I can put on my prescription glasses? I don't know.

I don't see everybody wearing the device all day.

When I go to work, should I take pictures of my desktop every five minutes?

It kind of spoils the view if everybody is wearing these devices on their head. I like to see people's faces as they are.

It's also maybe a bit intrusive if people start blinking and - oops - you've taken a picture.

Just a week before Google I/O, I was walking down the street with my girlfriend and looking for a place to eat. We had four restaurants in a four-block radius, and we checked them out on foot.

I had an app on my phone and never bothered to look it up, because it's this cumbersome experience of reaching into your pocket, grabbing the phone, launching the app, waiting for it to figure out where you are, telling it what restaurants [to research] and then clicking it.

So I remembered Glass, and I told my girlfriend: "If I had those glasses, I could have just looked at the restaurant and seen the average Zagat ratings and the average price."

So there definitely are a lot of use cases.

As awesome as the Android phone was five years ago when I first started using it, it has started feeling too cumbersome.

A much better experience would be to look at what I'm interested in and have the information that I want displayed automatically.

Mr Kostresevic was the 574th software developer to order Project Glass.

In the long arc of wearable computing, I think what we are going to see in retrospect is that this technology - Glass - is a new interface to the internet, and that is all it is.

When we look back from 10 years or so in the future, it's going to be so cheap to integrate computing and communication into everything that it's going to stop being about gadgets.

The real impacts of wearable computing are most likely to come in things that don't look like computers at all.

When you add smarts and functionality to things that are around us all the time - like a purse, wallet or anything that you carry with you - you don't have to invent a new device to get the best use out of wearable computing.

Computing is infusing every physical object that we interact with, so I think that's going to end up being the more important angle.