‘Conditioning an entire society’: the rise of biometric data technology

The use of our bodies to unlock access to services raises concerns about the trade-off between convenience and privacy.

In Moscow, users of the city’s famous Metro system can now pay using their face, a system that, for now at least, is voluntary. Photograph: Natalia Kolesnikova/AFP/Getty

In a school canteen in Gateshead, cameras scan the faces of children, taking payment automatically after identifying them with facial recognition. More than 200 miles away in North London, staff at a care home recently took part in a trial that used facial data to verify their Covid-19 vaccine status. And in convenience stores around the country, staff are alerted to potential shoplifters by a smart CCTV system that taps into a database of individuals deemed suspect.

CRB Cunninghams, the US-owned company whose facial recognition tech is being deployed in lunch halls, has said its systems speed up payment and could reduce the risk of Covid-19 spread via contact with surfaces. The system was first tested at Kingsmeadow school in Gateshead last year and dozens of schools have signed up to follow suit.

Enthusiasm for the system may be on the wane now, though, after North Ayrshire council suspended use of the technology at nine schools following a backlash. The decision to back out came after parents and data ethics experts expressed concerns that the trade-off between convenience and privacy may not have been fully considered.

There are salutary examples of how such technology can be put to troublingly authoritarian use, and China offers some of the more extreme precedents. After a spate of toilet paper thefts from public conveniences in a park in Beijing, users were asked to submit to a face scan before any paper would be released, and in Shenzhen, pedestrians who crossed the road at a red light had their faces beamed on to a billboard.

That technology is not in use anywhere at the moment, iProov said, but the company is one of several firms whose systems are embedded in the NHS app, deployed when users want to use their face to access services such as their Covid status or GP appointment bookings.

“That’s convenient for 99% of people, but if someone shows up to an anti-government protest, suddenly they have the ability to track down who went in and out, unlike an Oyster-style card that might not be registered,” said Banerjee.

Some technology that is already in common use in the UK has sparked anxiety about civil liberties. London-based FaceWatch sells security systems that alert shop staff to the presence of a “subject of interest” – typically someone who has behaved antisocially or been caught shoplifting before. It started out as a system for spotting pickpockets at Gordon’s wine bar in central London, of which FaceWatch founder Simon Gordon is the proprietor.

However, Wachter has concerns about the prospect of such technology becoming more widespread. “Research has shown that facial recognition software is less accurate with people of colour and women.” She also points to the potential for existing human bias to be hardwired into supposedly neutral technology. “How can you trust that they ended up on the watch list accurately? There is bias in selective policing and in the judicial system.”

Nor is it clear in many cases to whom such systems are accountable and how individuals can contest the judgments they make. “What if I’ve been wrongfully accused, or the algorithm incorrectly matched me with somebody else?” Banerjee asks. “It’s private justice where you have zero recourse on being able to correct that.”

FaceWatch said it does not share the facial data it holds with the police, although they can access it if an offence is reported. The company said it minimises the risk of misidentification by ensuring cameras are positioned in good light to enhance accuracy, with any borderline cases referred to a manual checker. People on the watchlist can, it says, challenge the decision.

Source: The Guardian
