Images of seven people were passed on by local police for use in a facial recognition system at King’s Cross in London in an agreement that was struck in secret, the details of which have been made public for the first time.

A police report, published on Friday by the deputy London mayor, Sophie Linden, showed that the scheme ran for two years from 2016 without any apparent central oversight from either the Metropolitan police or the office of the mayor, Sadiq Khan.

Writing to London assembly members, Linden said she “wanted to pass on the [Metropolitan police service’s] apology” for failing to previously disclose that the scheme existed and announced that similar local image-sharing agreements were now banned.

There had been “no other examples of images having been shared with private companies for facial recognition purposes” by the Met, Linden said, according to “the best of its knowledge and record-keeping”.

Daragh Murray, a senior lecturer at the University of Essex, said he was surprised the Met had not admitted it was supplying images of individuals to King’s Cross at the time the scheme was launched in 2016. “The scheme seems to have been run without appropriate oversight, safeguards, and procedures,” the academic said.

The surveillance scheme – controversial because it involved tracking individuals without their consent – was originally agreed between borough police in Camden and the owner of the 27-hectare King’s Cross site in 2016.

King’s Cross first admitted it had deployed facial identification technology in CCTV cameras in August, prompting an outcry about the ethics and legality of the move. In September, it announced it had abandoned plans to use the technology in the future. It said facial recognition had been used in two cameras on pedestrian boulevards at the heart of the development, with the intention “to help ensure public safety”.

Facial recognition technology maps faces in crowds and compares them to images of people on a watchlist, which can include suspects, missing people and persons of interest to the police. The cameras can scan faces in large crowds in public places such as streets, shopping centres and football matches.

Last month, the high court ruled that the use of facial recognition software by South Wales police was lawful, in a case brought by an office worker who argued the technology harvested too many images of innocent people.

But there have been concerns about the regulatory framework governing facial recognition and about its effectiveness, with studies suggesting it is less accurate at identifying black people. Khan has called for new legislation to regulate the technology.

The Met report said Camden police agreed in 2016 to supply images of individuals who had been “arrested and charged, cautioned or reprimanded or given a formal warning” to King’s Cross between May 2016 and March 2018.

The idea was to help the property company “to discharge its responsibilities to prevent and detect crime”, although the report did not explain how this would be done or whether the software was ever used.

That prompted Murray to ask whether the police could be confident that the image sharing was legitimate. “The grounds on which information was shared – to prevent and detect crime – are overly broad, and really run the risk of arbitrary interferences with individuals’ rights,” the lecturer said.

No records were kept of whether the facial recognition software successfully recognised any of the seven people whose images were passed on, the Met admitted in the four-page report, or whether any police action followed a match.

Officers in Camden nevertheless came to a new data-sharing agreement with King’s Cross in early 2019, which would have governed any future use of the surveillance technology.

King’s Cross is owned by a consortium comprising the property developer Argent, Hermes Investment Management on behalf of BT Pensioners, and the Australian pension scheme AustralianSuper.