Through a looking glass (Image: Google/AP)

TEXT displays are only as good as your ability to read what’s written on them. Wearable displays like Google Glass or the Epson Moverio will free you from looking down at your smartphone to read tweets. But that only works if the text stands out from the constantly changing background – one minute a dimly lit room, the next a bright blue sky – as you walk along.

Jason Orlovsky and colleagues at Osaka University in Japan have developed a text display algorithm that places the current message – a tweet, your location or your walking speed, say – on the darkest region in view at any given moment and in a readable colour.

This is done using the headset’s camera, which plots a constantly changing heat map of viable on-screen reading locations. The algorithm can also split a message across two small dark regions on either side of your field of view. “Twitter feeds or text messages could be placed throughout the environment in a logical manner, much like signs are placed on either side of a street,” the developers say.
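The core idea can be sketched in a few lines of code. The function below is a minimal illustration, not the team’s actual implementation: it scans a grayscale camera frame for the darkest window of a given size using an integral image (so each candidate spot is scored in constant time, cheap enough to redo every frame), then picks light or dark text accordingly. The window size, brightness threshold and frame values are all assumptions for the example.

```python
import numpy as np

def darkest_region(frame, box=(40, 120)):
    """Return the top-left corner of the darkest box-sized region in a
    grayscale frame, plus a readable text colour for that region.

    A summed-area table (integral image) lets every candidate window be
    scored in O(1), so the whole frame can be re-scanned each video frame.
    """
    h, w = box
    # Integral image with a zero border for easy window sums.
    ii = np.zeros((frame.shape[0] + 1, frame.shape[1] + 1), dtype=np.int64)
    ii[1:, 1:] = np.cumsum(np.cumsum(frame, axis=0), axis=1)
    # Brightness sum for every possible top-left corner: in effect a
    # "heat map" of candidate on-screen reading locations.
    sums = ii[h:, w:] - ii[:-h, w:] - ii[h:, :-w] + ii[:-h, :-w]
    r, c = np.unravel_index(np.argmin(sums), sums.shape)
    mean = sums[r, c] / (h * w)
    # Light text on a dark patch, dark text on a bright one.
    colour = "white" if mean < 128 else "black"
    return (int(r), int(c)), colour

# Synthetic frame: a bright scene with a dark patch in one corner.
frame = np.full((480, 640), 200, dtype=np.uint8)
frame[300:420, 500:640] = 20
pos, colour = darkest_region(frame)
```

Run on the synthetic frame above, the function locates the dark corner and opts for white text against it; in a real headset the same scan would simply be repeated on each incoming camera frame.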

The team presented the work last month at the Intelligent User Interfaces conference in Santa Monica, California. With an early version of Glass due to launch within the next month, such software is likely to be in demand.