Privacy Rights Groups Ask Eric Holder To Ensure The FBI's Biometric Database Doesn't Become Just Another Domestic Surveillance Tool

from the give-us-your-address,-your-shoe-size,-your-years dept

The FBI is continuing to push ahead with its development of a biometric database (Next Generation Identification, or NGI), which will combine old-school fingerprints and background records with facial recognition technology and other biometric data.

The technology continues to improve, but the FBI originally greenlit the database back when it still allowed a 20% error rate on its facial recognition. That was back in 2010, and of course, the only reason we know the FBI was perfectly fine with a 1-in-5 screw-up rate is that EPIC liberated this information with a FOIA request.

This is also being rolled out without the FBI providing an updated Privacy Impact Assessment, a document the DOJ requires. The FBI told Congress in 2012 that it was working on producing one. It's still telling the same story in 2014, as detailed in a letter to Eric Holder signed by the ACLU, the EFF, EPIC and several other civil liberties/privacy rights groups.

The FBI recognizes this transformation and, at a July 2012 Senate hearing, committed to updating its privacy assessment of the agency's use of facial recognition. Jerome Pender, Deputy Assistant Director of the FBI's Criminal Justice Information Service Division, stated in his statement for the record that "[a]n updated PIA is planned and will address all evolutionary changes since the preparation of the 2008 IPS PIA." Furthermore, Assistant Director Pender said the updated privacy assessment would have "an emphasis on Facial Recognition." Nearly two years later an updated privacy assessment has not been completed.

According to an FBI study, the quality of images in the database is inconsistent and often of low resolution. Partly for this reason, the FBI doesn’t promise accuracy in its search results. Instead, it ensures only that “the candidate will be returned in the top 50 candidates” 85% of the time “when the true candidate exists in the gallery.” In fact, the overwhelming number of matches will be false. This false-positive risk could result in even greater racial profiling by disproportionately shifting the burden of identification onto certain ethnicities. The false-positive risk can also alter the traditional presumption of innocence in criminal cases by placing more of a burden on the suspect to show he is not who the system identifies him to be. And this is true even if a face recognition system such as NGI offers several results for a search instead of one, because each of the people identified could be brought in for questioning, even if he or she has no relationship to the crime.
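The letter's claim that "the overwhelming number of matches will be false" follows from simple arithmetic: every search returns up to 50 candidates, and at most one of them can be the right person. Here's a minimal back-of-the-envelope sketch in Python; the 50-candidate list size and the 85% hit rate come from the quote above, but the workload numbers (how many searches, how often the suspect is actually in the gallery) are assumptions for illustration only.

```python
# Back-of-the-envelope illustration of why most NGI candidate "matches"
# are false positives. Only the 50-candidate list and the 85% figure
# come from the letter; the workload below is made up.

CANDIDATES_RETURNED = 50   # NGI returns up to 50 candidates per search
TRUE_MATCH_RATE = 0.85     # true candidate appears in that list 85% of
                           # the time, *when* they exist in the gallery

def expected_candidates(searches, in_gallery_fraction):
    """Expected true vs. false candidates over a batch of searches.

    `in_gallery_fraction` (an assumption, not from the source) is the
    share of searches where the person really is in the database.
    """
    true_hits = searches * in_gallery_fraction * TRUE_MATCH_RATE
    total = searches * CANDIDATES_RETURNED
    return true_hits, total - true_hits

# Assume 1,000 searches, suspect actually in the gallery half the time:
hits, false_positives = expected_candidates(1000, 0.5)
print(hits, false_positives)  # 425.0 true hits vs. 49575.0 false candidates
```

Even under these generous assumptions, fewer than 1% of the returned candidates are the right person; everyone else on those lists is a potential target for questioning despite having no connection to the crime.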


This lack of the required privacy assessment has also had little impact on the speed of the FBI's NGI rollout. The agency has stated that it hopes to have the program fully operational in "fiscal year 2014." In slight defense of the FBI, there's plenty of "privacy impact" to "assess." The NGI database not only gathers criminal records from multiple state and federal databases but also pulls in non-criminal data gathered from federal employee and employer background checks. This database, containing photographs, iris recognition data, palm prints and vast amounts of information collected from existing databases, will be accessible to local law enforcement agencies. The possibilities for abuse are nearly endless, and the program itself is far from flawless when it comes to correctly identifying suspects.

To head off abuse, the letter asks the Attorney General to ensure that the database collects data only on "individuals who are part of the criminal justice system." It also asks Holder to prevent the NGI program from becoming just another way for the FBI (and its partners in law enforcement) to surveil innocent Americans.

Those signing this letter likely know that neither Holder nor the FBI is particularly sympathetic to the privacy interests of Americans, but the letter does create another opportunity to bring the issue to the attention of the public. Enough public pressure can push agencies in the right direction, especially if the public also gets its representatives involved. There's been surprisingly little oversight of the FBI's activities, especially with the NSA claiming most of the oversight spotlight in recent months, but the ACLU and others are always there to remind citizens that there's more than one agency playing fast and loose with Americans' privacy.

Filed Under: biometrics, doj, eric holder, privacy, surveillance