More than 1,700 people on a University of Colorado campus were unknowingly photographed as part of a facial recognition project funded by the U.S. government.

The study, originally carried out between 2012 and 2013, has been highlighted by several recent reports and raises ethical questions about the researchers' methods, particularly the lack of consent from the project's subjects.

Using a long-range surveillance camera to peer out of an office window at a campus in Colorado Springs, the study's author, Dr. Terrance Boult, captured more than 16,000 images of students, professors, and others during a period in 2012 and 2013.

To help develop facial recognition software, the U.S. military funded research that recorded 1,700 subjects on a Colorado campus without their knowledge. Stock image

The images, says Boult, formed the basis for a data set called 'Unconstrained College Students,' which would be used to test the ability of a facial recognition algorithm to identify people in murky conditions. Many of the subjects swept up in the images were in less-than-ideal lighting and were sometimes looking away or even down at their phones.

Feasibly, this type of data would be useful to the U.S. military and intelligence agencies in designing facial recognition systems for reconnaissance or even more targeted domestic surveillance.

The database was made available to the public in 2016 but was eventually taken down this April, according to the Denver Post.

None of the people captured by the camera's recordings were named, and entities using the database were required to sign a legal agreement saying that they would not release any photos, according to the Colorado Springs Independent. Even so, the research has given rise to ethical questions.

'It’s yet another area where we’re seeing privacy intrusions that disturb us,' said Bernard Chao, a privacy expert at the University of Denver who was interviewed by the Denver Post.

The study by Boult is one of many recent examples in which researchers and corporations have used images of people to train facial recognition software without their consent.

Among the chief concerns from skeptics of facial recognition software is that it may violate people's privacy and give way to a culture of mass surveillance. Stock image

In a report from NBC News, the outlet revealed that cloud-based photo company Ever used millions of users' photos to train a facial recognition algorithm licensed for use by one of its corporate arms, Ever AI.

Likewise, IBM used millions of photos sourced from photo-sharing website Flickr to train its own facial recognition software.

Ethical questions over the methods used to train advanced facial recognition software seem to mirror an increasing skepticism about the systems as a whole.

Amazon recently killed a shareholder initiative that would have stopped the company from selling its facial recognition software, Rekognition, despite concerns that the technology may be misused or sold to dubious governments. Meanwhile, San Francisco became the first city in the U.S. to ban its use by law enforcement and other public agencies, citing concerns over privacy and First Amendment rights.