Police are to use controversial facial recognition software to scan crowds attending the Remembrance Sunday ceremony at the Cenotaph, the Observer can reveal.

The Metropolitan Police will deploy real-time biometric tracking at the event, which will be attended by about 10,000 former and current service personnel as well as dignitaries and members of the public. Prince Charles will lay the head of state’s wreath at the commemoration, which marks the 99th anniversary of the end of the first world war. Met sources said the use of the technology at the showpiece central London event is a trial, and not related to terrorism or serious crime.

Officers have compiled a dataset of about 50 individuals known for obsessive behaviour towards particular public figures. Automated facial recognition cameras will be used to identify any individual on the list who attends the Whitehall event. None of those on the list is believed to be wanted for arrest, prompting the civil liberties group Liberty to denounce the use of the technology as discriminatory.

Critics say the use of these cameras in public spaces – which will scan the face of anyone who passes, including thousands of former and current service personnel – has not been subject to parliamentary scrutiny and has no clear legal justification.

Martha Spurrier, director of Liberty, said: “There is no legal basis and no public consent for deploying this intrusive and intimidating biometric surveillance in public spaces. Not only are the Met using it on our streets again, they are targeting people who have done nothing wrong, are not wanted for arrest and may have serious mental health issues. These people have just as much right as anyone else to pay their respects on Remembrance Sunday.

“There’s a dark irony to the Met resorting to this on the day we remember those who died to keep us free. The creeping rollout of this authoritarian technology, and the potential exclusion of vulnerable, innocent people from public spaces, undermines everything they fought for – it has no place in a rights-respecting democracy.”

She added that if the individuals had issued threats or had been guilty of stalking, they could be legitimately arrested.

This is believed to be the third time in two years that the Met has trialled automated facial recognition (AFR) at a public event, prompting claims that it is being gradually introduced by stealth.

Automated facial recognition was last used at Notting Hill carnival, matching faces in the crowd against databases of people previously arrested or under bail conditions. The Met said that once the results of the trials had been analysed, there would be a public consultation.

But Liberty said that during its deployment at the carnival, the system’s efficacy was called into question after it made “multiple incorrect identifications” – on one occasion mistaking a person’s gender – and it was not successful as part of a crime prevention operation.

Opponents of the technology believe AFR may be the next civil liberties battleground. In September an official watchdog issued a warning over police use of more than 20 million facial images on searchable databases more than five years after the courts ruled that the inclusion of images of innocent people was unlawful.