Face Detection with Android APIs

Android provides two main APIs for detecting the faces of people in a bitmap image, with each detected face carrying basic location information. This tutorial shows how to use these APIs to perform face detection, a task that can be extended into many other interesting applications. As we work through the APIs, we will build a simple working project; the complete source package is available for download as a reference.

One thing to note: face detection is a computer technology that determines the locations and sizes of human faces in arbitrary images. Do not confuse it with face recognition. A facial recognition system is a computer application for automatically identifying or verifying a person from a digital image, for example by comparing selected facial features from the image against a facial database. Simply put, face detection finds people's faces in images, while face recognition tries to determine who they are.

How To Install Android Face Detection APIs

As mentioned before, there are two main APIs introduced in this tutorial:

android.media.FaceDetector : Identifies the faces of people in a Bitmap graphic object.

android.media.FaceDetector.Face: Contains all the information identifying the location of a face in a bitmap.

No installation is necessary, since both classes come with the base Android framework rather than an optional package.

Constructing An Android Activity For Face Detection

You can construct a generic Android activity. We extend the base class ImageView to MyImageView, which serves as our main view for displaying both the image and the face feature markers. Note that the bitmap containing the faces must be in the 565 format (Bitmap.Config.RGB_565) for the APIs to work correctly, and a detected face needs a confidence measure above the threshold defined in android.media.FaceDetector.Face.CONFIDENCE_THRESHOLD.
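The confidence check can be sketched in plain Java. This is a minimal sketch, not part of the Android API: the ConfidenceFilter class and filterByConfidence helper are hypothetical names, and the 0.4f constant mirrors the documented value of FaceDetector.Face.CONFIDENCE_THRESHOLD.

```java
import java.util.ArrayList;
import java.util.List;

public class ConfidenceFilter {
    // Mirrors the documented value of
    // android.media.FaceDetector.Face.CONFIDENCE_THRESHOLD.
    static final float CONFIDENCE_THRESHOLD = 0.4f;

    // Keep only the detections whose confidence clears the threshold.
    static List<Float> filterByConfidence(float[] confidences) {
        List<Float> kept = new ArrayList<>();
        for (float c : confidences) {
            if (c > CONFIDENCE_THRESHOLD) {
                kept.add(c);
            }
        }
        return kept;
    }

    public static void main(String[] args) {
        float[] confidences = {0.9f, 0.35f, 0.51f};
        // Only 0.9 and 0.51 survive the threshold.
        System.out.println(filterByConfidence(confidences).size()); // prints 2
    }
}
```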

The most important work happens in setFace(). It instantiates the FaceDetector object and calls findFaces(); the results are stored in faces, and the face midpoints are then passed to MyImageView for display.

public class TutorialOnFaceDetect1 extends Activity {
    private MyImageView mIV;
    private Bitmap mFaceBitmap;
    private int mFaceWidth = 200;
    private int mFaceHeight = 200;
    private static final int MAX_FACES = 1;
    private static String TAG = "TutorialOnFaceDetect";

    @Override
    public void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        mIV = new MyImageView(this);
        setContentView(mIV, new LayoutParams(LayoutParams.WRAP_CONTENT, LayoutParams.WRAP_CONTENT));

        // load the photo
        Bitmap b = BitmapFactory.decodeResource(getResources(), R.drawable.face3);
        mFaceBitmap = b.copy(Bitmap.Config.RGB_565, true);
        b.recycle();

        mFaceWidth = mFaceBitmap.getWidth();
        mFaceHeight = mFaceBitmap.getHeight();
        mIV.setImageBitmap(mFaceBitmap);

        // perform face detection and set the feature points
        setFace();
        mIV.invalidate();
    }

    public void setFace() {
        FaceDetector fd;
        FaceDetector.Face[] faces = new FaceDetector.Face[MAX_FACES];
        PointF midpoint = new PointF();
        int[] fpx = null;
        int[] fpy = null;
        int count = 0;

        try {
            fd = new FaceDetector(mFaceWidth, mFaceHeight, MAX_FACES);
            count = fd.findFaces(mFaceBitmap, faces);
        } catch (Exception e) {
            Log.e(TAG, "setFace(): " + e.toString());
            return;
        }

        // check if we detect any faces
        if (count > 0) {
            fpx = new int[count];
            fpy = new int[count];
            for (int i = 0; i < count; i++) {
                try {
                    faces[i].getMidPoint(midpoint);
                    fpx[i] = (int) midpoint.x;
                    fpy[i] = (int) midpoint.y;
                } catch (Exception e) {
                    Log.e(TAG, "setFace(): face " + i + ": " + e.toString());
                }
            }
        }
        mIV.setDisplayPoints(fpx, fpy, count, 0);
    }
}

In the following code, we add setDisplayPoints() to MyImageView to render markers at the detected face features. Figure 1 shows a marker centered on the midpoint of the detected face.

// set up detected face features for display
public void setDisplayPoints(int[] xx, int[] yy, int total, int style) {
    mDisplayStyle = style;
    mPX = null;
    mPY = null;
    if (xx != null && yy != null && total > 0) {
        mPX = new int[total];
        mPY = new int[total];
        for (int i = 0; i < total; i++) {
            mPX[i] = xx[i];
            mPY[i] = yy[i];
        }
    }
}

Figure 1: Single Face Detected in Android

Android Face Detection: Detecting Multiple Faces

You can specify the maximum number of faces to detect when constructing the FaceDetector, for example by modifying the following variable. The API documentation does not state whether an upper limit exists, so you can experiment with detecting as many faces as possible.

private static final int MAX_FACES = 10;

Then use the count returned by findFaces() to read all the valid results from the array. Figure 2 shows one example with multiple markers, each centered on the midpoint of a detected face.
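The array-plus-count convention can be sketched in plain Java. This is only a model of the calling pattern: the Detection class here is a hypothetical stand-in for FaceDetector.Face, whose instances are only valid in the first count slots of the array.

```java
import java.util.ArrayList;
import java.util.List;

public class MultiFaceSketch {
    static final int MAX_FACES = 10;

    // Hypothetical stand-in for FaceDetector.Face: just a midpoint here.
    static class Detection {
        final float midX, midY;
        Detection(float midX, float midY) {
            this.midX = midX;
            this.midY = midY;
        }
    }

    // Collect only the first `count` entries, mirroring how findFaces()
    // fills its results array and returns the number of valid slots.
    static List<Detection> validResults(Detection[] faces, int count) {
        List<Detection> out = new ArrayList<>();
        for (int i = 0; i < count; i++) {
            out.add(faces[i]);
        }
        return out;
    }

    public static void main(String[] args) {
        Detection[] faces = new Detection[MAX_FACES];
        faces[0] = new Detection(120f, 80f);
        faces[1] = new Detection(300f, 95f);
        int count = 2; // as findFaces() would return
        System.out.println(validResults(faces, count).size()); // prints 2
    }
}
```

Reading past count would hit null (or stale) entries, which is why the loops in the tutorial code always bound themselves by count rather than by the array length.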

Figure 2: Multiple Faces Detected in Android

Android Face Detection: Approximating Eye Center Locations

The Android face detector also returns additional information that lets us fine-tune the results a little, including eyesDistance, pose, and confidence. We can use eyesDistance to estimate the eye center locations.
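The estimate used below is simple arithmetic: offset the face midpoint horizontally by half of eyesDistance in each direction. A standalone sketch of just that calculation (the EyeEstimator and estimateEyesX names are ours, not part of the API):

```java
public class EyeEstimator {
    // Approximate the two eye x-coordinates from the face midpoint and
    // the eye-to-eye distance reported by eyesDistance(). The y-coordinate
    // of both eyes is taken to be the midpoint's y, as in the tutorial code.
    static int[] estimateEyesX(float midX, float eyesDistance) {
        int leftX  = (int) (midX - eyesDistance / 2);
        int rightX = (int) (midX + eyesDistance / 2);
        return new int[] { leftX, rightX };
    }

    public static void main(String[] args) {
        int[] xs = estimateEyesX(100f, 40f);
        System.out.println(xs[0] + ", " + xs[1]); // prints 80, 120
    }
}
```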

This time we also run setFace() on a background thread, inside doLengthyCalc(), because face detection can take long enough to trigger an "Application Not Responding" (ANR) error when handling large images or images with many faces to detect.

Figure 3 is one example showing multiple markers centered on the respective eyes of the detected faces.

public class TutorialOnFaceDetect extends Activity {
    private MyImageView mIV;
    private Bitmap mFaceBitmap;
    private int mFaceWidth = 200;
    private int mFaceHeight = 200;
    private static final int MAX_FACES = 10;
    private static String TAG = "TutorialOnFaceDetect";
    private static boolean DEBUG = false;

    protected static final int GUIUPDATE_SETFACE = 999;
    protected Handler mHandler = new Handler() {
        // @Override
        public void handleMessage(Message msg) {
            mIV.invalidate();
            super.handleMessage(msg);
        }
    };

    @Override
    public void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        mIV = new MyImageView(this);
        setContentView(mIV, new LayoutParams(LayoutParams.WRAP_CONTENT, LayoutParams.WRAP_CONTENT));

        // load the photo
        Bitmap b = BitmapFactory.decodeResource(getResources(), R.drawable.face3);
        mFaceBitmap = b.copy(Bitmap.Config.RGB_565, true);
        b.recycle();

        mFaceWidth = mFaceBitmap.getWidth();
        mFaceHeight = mFaceBitmap.getHeight();
        mIV.setImageBitmap(mFaceBitmap);
        mIV.invalidate();

        // perform face detection in setFace() in a background thread
        doLengthyCalc();
    }

    public void setFace() {
        FaceDetector fd;
        FaceDetector.Face[] faces = new FaceDetector.Face[MAX_FACES];
        PointF eyescenter = new PointF();
        float eyesdist = 0.0f;
        int[] fpx = null;
        int[] fpy = null;
        int count = 0;

        try {
            fd = new FaceDetector(mFaceWidth, mFaceHeight, MAX_FACES);
            count = fd.findFaces(mFaceBitmap, faces);
        } catch (Exception e) {
            Log.e(TAG, "setFace(): " + e.toString());
            return;
        }

        // check if we detect any faces
        if (count > 0) {
            fpx = new int[count * 2];
            fpy = new int[count * 2];
            for (int i = 0; i < count; i++) {
                try {
                    faces[i].getMidPoint(eyescenter);
                    eyesdist = faces[i].eyesDistance();

                    // set up left eye location
                    fpx[2 * i] = (int) (eyescenter.x - eyesdist / 2);
                    fpy[2 * i] = (int) eyescenter.y;

                    // set up right eye location
                    fpx[2 * i + 1] = (int) (eyescenter.x + eyesdist / 2);
                    fpy[2 * i + 1] = (int) eyescenter.y;

                    if (DEBUG) {
                        Log.e(TAG, "setFace(): face " + i + ": confidence = " + faces[i].confidence()
                                + ", eyes distance = " + faces[i].eyesDistance()
                                + ", pose = (" + faces[i].pose(FaceDetector.Face.EULER_X) + ","
                                + faces[i].pose(FaceDetector.Face.EULER_Y) + ","
                                + faces[i].pose(FaceDetector.Face.EULER_Z) + ")"
                                + ", eyes midpoint = (" + eyescenter.x + "," + eyescenter.y + ")");
                    }
                } catch (Exception e) {
                    Log.e(TAG, "setFace(): face " + i + ": " + e.toString());
                }
            }
        }
        mIV.setDisplayPoints(fpx, fpy, count * 2, 1);
    }

    private void doLengthyCalc() {
        Thread t = new Thread() {
            Message m = new Message();

            public void run() {
                try {
                    setFace();
                    m.what = TutorialOnFaceDetect.GUIUPDATE_SETFACE;
                    TutorialOnFaceDetect.this.mHandler.sendMessage(m);
                } catch (Exception e) {
                    Log.e(TAG, "doLengthyCalc(): " + e.toString());
                }
            }
        };
        t.start();
    }
}

Figure 3: Eyes Detected in Android
