"Automated identification of terrorists through data mining (or any other known methodology) is neither feasible as an objective nor desirable as a goal of technology development efforts," the report found. "Even in well-managed programs, such tools are likely to return significant rates of false positives, especially if the tools are highly automated."

"Terrorists can damage our country and way of life in two ways: through physical, psychological damage and through our own inappropriate response to that threat," Vest said in opening remarks (.mp3).

Yesterday, Wired Magazine's Blog Network covered a report by a privacy and terrorism committee funded by the Department of Homeland Security, which found that technology designed to determine from afar whether a person harbors terrorist intent would not work. The committee, created by the National Research Council in 2005, says that false positives could quickly lead to privacy invasions.

Committee co-chair Charles Vest made it clear at the unveiling of the report in Washington yesterday that the committee was not dismissing the threat terrorism poses to individuals and to the nation.

The committee emphasized that the government should have useful tools to fight terrorism, but that those tools must respect Americans' privacy.

See the article in the Wired Blog Network.