The surveillance society


Are we finally waking up to the privacy implications of real-time facial recognition surveillance?

Last week at IFSEC, in London’s Docklands, the technology was on display and being promoted as a tool for the prevention of crime, the apprehension of offenders and the protection of the public.

Beyond policing, the technology is also used by the likes of Facebook, whose AI-powered facial recognition software identifies people on its social networking platform.

The technology is being used in airports and by a growing number of police forces to record faces. The images are processed to create a biometric map of each face, which is then checked against a "watchlist" of known offenders.
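To make that matching step concrete, here is a minimal, purely illustrative Python sketch of how a "biometric map" (a numeric embedding of a face) might be compared against a watchlist using cosine similarity. The embedding size, the threshold and the placeholder names are assumptions for illustration, not details of any deployed system; real systems derive the vectors from a face-detection and embedding model.

```python
import numpy as np


def cosine_similarity(a, b):
    """Cosine similarity between two face embeddings."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))


def match_against_watchlist(probe, watchlist, threshold=0.6):
    """Return the name of the closest watchlist entry if it clears the threshold."""
    best_name, best_score = None, threshold
    for name, template in watchlist.items():
        score = cosine_similarity(probe, template)
        if score > best_score:
            best_name, best_score = name, score
    return best_name


# Illustrative only: these vectors stand in for embeddings that a real
# system would compute from camera images.
rng = np.random.default_rng(0)
watchlist = {
    "person_a": rng.normal(size=128),
    "person_b": rng.normal(size=128),
}
probe = watchlist["person_a"] + rng.normal(scale=0.1, size=128)

print(match_against_watchlist(probe, watchlist))  # expected: "person_a"
```

Everything hinges on the threshold: set it too low and innocent passers-by are flagged as matches; set it too high and the system misses the people it is looking for.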

This is a technology that goes far beyond CCTV and traditional police methods, and it is now coming under increased criticism from those worried that it could be used for "mass surveillance".

Martha Spurrier, the director of the campaign group Liberty, has called for the use of automated facial recognition to be banned on UK streets.

She has warned that "accurate or inaccurate law enforcement marginalises [and] demonises particular communities. This technology, whether [or not it] is itself biased, will be used by people where bias is already entrenched."

The police argue that the technology prevents crime, protects the public and does not breach the privacy of innocent people whose images are captured. Indeed, in a legal case brought by a man in Cardiff against the use of the technology, it was compared to taking DNA as part of an investigation.

In San Francisco, however, the police and other agencies have been stopped from using automated facial recognition.

While the technology is developing quickly, it is certainly not yet accurate, as London's Metropolitan Police can testify following extensive trials.

The police must adhere to data protection rules and have a code of practice for the management of information. Even so, the technology is known to change the way people behave in public places and could be used as a tool for social control.

Before we embrace it, shouldn't we be looking to regulate it? I'm not proposing a ban, but we are at a point where the technology could become entrenched, making it much harder to do anything about it should we want to.