A new program called GoogleFaces combines Google Maps with facial-recognition software to isolate geographic structures that look like human faces. It's far from perfect, but it has already uncovered some remarkably face-like surface features.


Humans are good at picking out faces in everyday objects and structures — a tendency known as pareidolia. We see faces nearly everywhere, whether in the foam of our lattes or on the surface of Mars.


Looking to take advantage of this psychological phenomenon, designers at Berlin's Onformative developed an algorithm that scans the surface of the Earth via Google Maps, picking out geographical structures likely to be construed as face-like.

Here's how the software works, in Onformative's own words:

One of the key aspects of this project is the autonomy of the face-searching agent and the sheer amount of data we are investigating. The source of our image data is, in a sense, voluntarily provided by Google Maps. Our agent flips through one satellite image after another, feeding landscape samples to the face-detection algorithm. The iteration algorithm steps sequentially along the latitude and longitude of the globe; once the agent has circumnavigated the world, it switches to the next zoom level and starts all over again.

To run face detection on the satellite images and store their geographical coordinates, we needed precise communication between our standalone application and a virtual browser surfing Google Maps. We therefore decided to use ofxBerkelium, an OpenFrameworks wrapper for Berkelium. This library makes it possible to capture browser images within a standalone application and to communicate with the page via JavaScript.
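The scanning loop described above — step sequentially along latitude and longitude, run face detection on each satellite tile, then advance to the next zoom level and repeat — can be sketched roughly as follows. This is a hypothetical illustration, not Onformative's code: `fetch_tile` and `detect_faces` are stand-in callbacks for the real Google Maps capture (handled in their project via ofxBerkelium) and the face-detection library.

```python
def scan_globe(fetch_tile, detect_faces, zoom_levels=(5, 6), step=30.0):
    """Yield (lat, lon, zoom) for every tile in which a face is detected.

    fetch_tile(lat, lon, zoom) -> image data for that satellite tile
    detect_faces(tile)         -> truthy if the tile contains a face-like shape
    step                       -> grid spacing in degrees (illustrative only)
    """
    for zoom in zoom_levels:
        # Walk the globe row by row along latitude...
        lat = -90.0
        while lat < 90.0:
            # ...and column by column along longitude.
            lon = -180.0
            while lon < 180.0:
                tile = fetch_tile(lat, lon, zoom)
                if detect_faces(tile):
                    # Record the geographic coordinates of the hit.
                    yield (lat, lon, zoom)
                lon += step
            lat += step
        # After circumnavigating the world, move to the next zoom level.
```

In the real system the grid spacing would depend on the zoom level (tiles shrink as you zoom in), and the detector would be a proper face-recognition algorithm rather than a simple callback.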




This video shows GoogleFaces at work:


All images: Onformative/Google Maps.

[Via Geek]