South Wales Police recently confirmed that they will use facial recognition technology to monitor fans at this Sunday’s tie between Cardiff City and Swansea City.
According to one of the officers responsible for the decision, the deployment of the controversial system is intended to “prevent disorder” from arising. While South Wales Police have been criticised by several stakeholders for their overzealous use of the technology, they are by no means alone. The Metropolitan Police (London) have utilised the technology, ostensibly as part of a pilot study, and other British forces are likely to follow in the coming months and years. The situation is similar across the continent, with state and municipal authorities forging ahead at different speeds depending on local sensibilities. In Italy, for instance, AS Roma and SS Lazio fans have their faces scanned before entering the stadium as part of wider ticketing and ID checks.
In addition, visitors to Baku during the 2019 UEFA Europa League Final were faced with invasive facial recognition cameras at the entrance of the city’s fan zone. Farther afield, the 2022 FIFA World Cup in Qatar is expected to involve the use of facial recognition technology on a frightening scale. There are analogous concerns regarding next year’s Club World Cup in China, and those concerns extend especially to other authoritarian regimes, where the technology is not delimited by public debate or independent oversight.
Football clubs have also been quick to embrace facial recognition software. Visitors to Danish club Brondby IF, to pick just one example, are scanned at the turnstiles. Manchester City, meanwhile, have floated the idea of replacing tickets with facial recognition scans. And in Central Europe, a growing number of mid-sized clubs have been approached by companies offering to install the technology for free, presumably with a view to expanding their database for future financial gain.
Whether this represents a new chapter in the history of mass surveillance, and whether such a development should be welcomed, is perhaps contingent on too many factors for a supporters’ organisation such as FSE to answer.
However, given the targeted nature of facial recognition, not to mention its potential wider consequences, we are compelled to restate our vehement opposition to the use of football fans as test subjects for unproven and unregulated technologies.
Indeed, as numerous well-respected civil liberties groups such as Digitalcourage (Germany) and Liberty (UK) have pointed out, facial recognition technology is fraught with technical shortcomings and sociological risks. But before we survey them, it makes sense to explain the technology itself in greater detail.
In short, facial recognition works by matching faces of people walking past special cameras to images of people on a watchlist. The technology does this by “scanning the distinct points of our faces and creating uniquely identifiable biometric maps—more like fingerprints than photographs.” The watchlists can contain photographs of anyone, including those who are not suspected of any wrongdoing. These images can be drawn from a variety of sources, including social media accounts.
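The matching step described above can be sketched in outline. Below is a minimal, illustrative Python example in which `embed` is a hypothetical stand-in for a real face-embedding model and all feature vectors are invented:

```python
import math

def embed(face_pixels):
    # Stand-in for a real face-embedding model: maps a face image to a
    # fixed-length vector (a "biometric map"). Here we simply normalise
    # a toy feature vector to unit length.
    norm = math.sqrt(sum(x * x for x in face_pixels))
    return [x / norm for x in face_pixels]

def cosine_similarity(a, b):
    # For unit-length vectors, the dot product is the cosine similarity.
    return sum(x * y for x, y in zip(a, b))

def match_against_watchlist(live_face, watchlist, threshold=0.9):
    # Flag every watchlist entry whose stored biometric map is close
    # enough to the face captured by the camera.
    live = embed(live_face)
    return [name for name, stored in watchlist.items()
            if cosine_similarity(live, embed(stored)) >= threshold]

# Toy watchlist of invented feature vectors.
watchlist = {"person_a": [1.0, 0.1, 0.0], "person_b": [0.0, 1.0, 0.2]}
print(match_against_watchlist([0.98, 0.12, 0.01], watchlist))  # ['person_a']
```

In a real system the embedding would come from a trained neural network, and the choice of threshold governs the trade-off between false positives and missed matches.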
So, what are the main problems?
First, facial recognition technology has the capacity to undermine three fundamental rights that many would consider to be cornerstones of any democratic society:
- The right to privacy, as laid out in Article 12 of the United Nations Universal Declaration of Human Rights and Article 8 of the European Convention on Human Rights;
- The right to association, as laid out in Article 20 of the United Nations Universal Declaration of Human Rights and Article 11 of the European Convention on Human Rights;
- The right to expression, as laid out in Article 19 of the United Nations Universal Declaration of Human Rights and Article 10 of the European Convention on Human Rights.
Each of these rights is also fundamental to football supporters and the organisations that they might choose to join or with whom they fraternise.
A second, related, problem is that of discrimination, which is also covered by the United Nations Universal Declaration of Human Rights (Article 2) and the European Convention on Human Rights (Article 14). Indeed, studies have shown that facial recognition technology displays inherent bias, disproportionately misidentifying women and ethnic minorities, meaning that people from these groups are more likely to be stopped and questioned by police and to have their images retained as the result of a false match.
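The disparity at issue is usually expressed as a difference in false match rates between demographic groups. A minimal sketch, with audit counts invented purely for illustration of the general shape such disparities take:

```python
def false_match_rate(false_matches, comparisons):
    # Share of non-matching face pairs wrongly reported as matches.
    return false_matches / comparisons

# Counts invented purely for illustration.
groups = {
    "group_a": {"false_matches": 10, "comparisons": 10_000},
    "group_b": {"false_matches": 80, "comparisons": 10_000},
}

for name, counts in groups.items():
    print(f"{name}: {false_match_rate(**counts):.2%} false match rate")
```

A system with this profile would wrongly flag members of the second group eight times as often, which is precisely why the misidentifications fall so unevenly.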
This brings us to problem number three: effectiveness. Even Arfon Jones, a former police officer with over 30 years of experience and the current police and crime commissioner for North Wales, has publicly expressed scepticism. Speaking to The Guardian ahead of this week’s fixture at the Cardiff City Stadium, Jones noted that “[w]hen facial recognition technology was first used in the Champions League final at the Millennium Stadium there were several hundred, if not thousands, of false positives, so there have to be concerns about its accuracy.”
Then there is the matter of what public bodies and private enterprises intend to do with data gathered from facial recognition scans. Much of the code used in commercial software is not transparent and thus cannot be constrained or submitted to independent review. It has also been suggested that sensitive data is vulnerable to hackers, state-backed or otherwise. Abigail McAlpine, a cyber security researcher at the Secure Societies Institute, agrees, pointing out that facial recognition information from the Metropolitan Police, defence contractors, and banks was recently discovered on a publicly accessible database. This is particularly worrying in light of moves by certain clubs to operate their own schemes through third parties, either to replace physical tickets or enforce banning orders.
GDPR will be an issue, too. Ms. McAlpine told us that “by utilising a facial recognition system, clubs must commit to investing in appropriate data management systems that can answer data subject access requests, which could, if stated, include all stored recordings of the individual in question.”
“And”, she added, “they must clearly communicate this to their customers, or in this case, fans.”
What makes all this slightly perplexing is the disconnect between means and ends. Or, to put it another way, there is little, if any, evidence to suggest that the use of facial recognition technology at football matches is a proportionate response to the attendant hazards.
On the security front, Paul Corkrey (FSE Board member/FSA Cymru) summed up the incongruity of the police’s justification back in October 2019, when Cardiff and Swansea fans were first singled out: “20,270 football supporters were subjected to facial recognition in order to monitor fewer than 50 people on a police force watchlist”, most of whom only had club-issued stadium bans.
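The proportionality gap is easy to quantify. A quick sketch using the figures Corkrey cites, combined with error rates assumed purely for illustration:

```python
# Figures cited for the October 2019 deployment.
fans_scanned = 20_270
watchlist_size = 50          # "fewer than 50 people"

# Error rates assumed purely for illustration.
false_positive_rate = 0.01   # 1% of innocent fans wrongly flagged
true_positive_rate = 0.90    # 90% of watchlisted people correctly flagged

innocent_fans = fans_scanned - watchlist_size
false_alarms = innocent_fans * false_positive_rate
genuine_hits = watchlist_size * true_positive_rate
precision = genuine_hits / (genuine_hits + false_alarms)

print(round(false_alarms))   # 202 innocent fans flagged
print(round(genuine_hits))   # 45 genuine matches
print(f"{precision:.0%} of alerts point at a watchlisted person")  # 18%
```

Even with an optimistic 1% false positive rate, innocent fans would outnumber genuine matches among those flagged by more than four to one, simply because almost everyone scanned is not on the watchlist.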
The same goes for the service, or access, argument. In response to Manchester City’s mooted ticketless entry idea, Amanda Jacks from the Football Supporters’ Association (FSA) observed that “fans will still have to be searched before entering any stadium, and the technology is apparently only fractionally faster than electronic card readers, [so] it’s difficult to see how this is genuinely an improvement on the current system.”
What, then, is to be done?
We must first acknowledge that we are all, in some shape or form, dealing with a novel, unpredictable phenomenon, and there is no good reason to rush its application. We agree with other stakeholders that more evidence is required, as is a robust regulatory framework. Until such steps are taken, FSE supports calls for a Europe-wide moratorium on the use of facial recognition technology.
Given this context, FSE further demands an immediate end to the deployment of facial recognition technology at football matches in the UEFA region and a consultation process with fans and fans’ organisations.