The public has a right to know how data about them is being used, write Matthew A Jay and Prof Ruth Gilbert

We thank West Midlands police’s ethics committee for giving serious attention to the potential for harm arising out of data-driven offending prediction models (Alert over risk of bias in tool to predict who will reoffend, 20 April).

A wider public debate needs to be informed by research into how effectively the use of people’s data predicts and reduces offending, who else is subjected to targeting and privacy intrusion because of prediction errors, and whether better use of data could reduce such collateral harm.

The public has a right to know how data about them is being used. Generic privacy notices, such as that on the West Midlands police’s website, are not enough.

Public authorities using data should publish exactly what they are doing, and what they find, and they should engage with academics to promote research evaluation of the biases and potential harms of such algorithms.

Matthew A Jay

Prof Ruth Gilbert

Legal epidemiology group, University College London Great Ormond Street Institute of Child Health
