Biometrics is a fast-growing field: these technologies are increasingly being used to identify people across a range of settings and scenarios, most notably in criminal justice and travel. The COVID-19 crisis has accelerated this trend. Speaking during a recent Westminster e-Forum policy conference on the topic, Isabelle Moeller, chief executive of the Biometrics Institute, explained, “During the pandemic, it’s become clear that contactless interaction is a priority. We’ve also seen a surge in remote onboarding, not only for the government but also banking and businesses pretty much everywhere.”
However, as their use increases, so too do ethical and legal concerns. These include data protection, given the highly personal nature of the information collected and stored, and the potential for technologies like facial recognition to be biased against minority groups.
Implementing Biometrics
During the e-Forum, Moeller outlined the significant benefits biometrics offer in “national security, both in preventing and investigating terrorism, and managing borders, combatting fraud and facilitating digital identity.”
She believes that organizations can mitigate ethical concerns around biometrics by following “good practice” policies when implementing these technologies. These should revolve around the “three laws of biometrics” advocated by the Biometrics Institute, which are that “policy has to come first, followed by process and then the technology.” The policy stage should analyze issues like proportionate use and human rights, while the process stage puts safeguards in place to ensure these are respected when the technology is used. “Only then, you really need to think about the technology,” she added.
According to Moeller, organizations often rush to implement biometric solutions without asking questions about them. “Then biometrics gets blamed when in reality it was probably a lack of policy or process,” she stated.
She also noted that many biometric technologies do not exhibit bias across demographic differences such as gender and race, and are “normally very effective and accurate in the one-to-one verification mode.”
The key to ensuring problems such as bias and inaccuracy do not occur is to “understand the technology,” notably the algorithm used. It is also important for organizations to ask whether a biometric technology is proportionate and whether there is a viable alternative before going down this path.
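To make the distinction Moeller draws concrete, the sketch below contrasts one-to-one verification (comparing a sample against a single enrolled template) with one-to-many identification (searching a whole gallery). It is an illustrative toy, not anything presented at the conference: it assumes biometric templates are represented as numeric feature vectors and compared with a fixed cosine-similarity threshold, both of which are simplifying assumptions.

```python
import numpy as np

def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    """Similarity between two biometric templates (feature vectors)."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def verify(probe: np.ndarray, enrolled: np.ndarray, threshold: float = 0.8) -> bool:
    """One-to-one verification: does the probe match this single enrolled template?"""
    return cosine_similarity(probe, enrolled) >= threshold

def identify(probe: np.ndarray, gallery: dict, threshold: float = 0.8):
    """One-to-many identification: search a whole gallery for the best match.
    Errors can compound across the gallery, which is one reason identification
    is generally harder to get right than verification."""
    best_id, best_score = None, threshold
    for subject_id, template in gallery.items():
        score = cosine_similarity(probe, template)
        if score >= best_score:
            best_id, best_score = subject_id, score
    return best_id  # None if no template clears the threshold
```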
Her underlying message was that organizations considering introducing biometrics must have a thorough strategy and process for doing so, particularly given the current lack of regulations and standards in this area.
Updating Legislation and Standards
Jessica Figueras, vice chair of the UK Cyber Security Council, then gave a presentation on biometric regulation in the UK and how it needs to evolve. As things stand, there is specific regulatory protection only for the use of fingerprints and DNA in the UK, meaning the regulations are in urgent need of updating. In her words, this is “a classic case of technology outstripping society’s ability to keep pace.”
"A classic case of technology outstripping society's ability to keep pace"
She outlined legislation introduced by the devolved Scottish government last year that establishes a Scottish biometrics commissioner, which “shows the way forward.” This is primarily because the commissioner’s remit covers a far more comprehensive range of biometric technologies than existing UK-wide legislation, including gait analysis, voice analysis and facial expressions. However, Figueras noted that the field of biometrics is moving so quickly that even this legislation does not cover very recent developments in areas such as typing pattern recognition.
Another aspect of this legislation is that “it very explicitly considers the impact on equalities and vulnerable citizens and also requires the commissioners to work with a wide range of public bodies and to have stakeholder/user citizen voices built into the process,” said Figueras.
Therefore, she hopes the UK government will follow the path taken by the Scottish administration, particularly as the use of biometrics in the private sector “raises urgent questions.” Private actors often deploy these technologies in public spaces, such as shopping malls and venues, as well as for user identification in online services.
Introducing such regulations is only one side of the equation, according to Figueras. The practical challenge of ensuring these rules are implemented is substantial and requires “very specialist skills” that “are in incredibly short supply and are very expensive.” These include areas such as algorithmic inspection.
This lack of skilled oversight may, she suggested, have the effect of slowing “the pace of development in biometric technology, and will give society a chance to keep pace.”
Privacy Design and Algorithmic Bias
In the final section of the session, Philip James, a partner in the global privacy and cybersecurity group at law firm Eversheds Sutherland, discussed the legal issues surrounding biometric algorithms and data. He said organizations need to be very aware of these issues as there is “growing public awareness of the use of biometrics and algorithms — the benefits and the risks.”
Under the EU’s General Data Protection Regulation (GDPR), which the UK has retained in domestic law since Brexit, biometric data is generally considered “special category data.” This is essentially information “that is more stringently controlled because it has greater sensitivity.” Organizations wishing to avoid that stricter treatment will need to make the case that the data should not be categorized in this way. “Purpose and context are always key,” commented James.
He stressed that organizations should carefully follow the conditions set out in Article 9 of the GDPR (processing special categories of personal data) to ensure they remain compliant when collecting biometric data. He cited an opinion published by the Information Commissioner’s Office (ICO) on a case in which the police used facial recognition technology to identify potential criminals in a crowd; the actions were held to be unlawful because the police did not follow the correct processes around this kind of data. Unless an exemption applies, such as detecting and preventing an unlawful act, the processing of biometric data in most situations requires “explicit consent” from the individuals involved.
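As a purely illustrative aid, and not legal advice or anything presented in the session, the toy sketch below encodes the point James makes: processing special category biometric data needs either explicit consent or an applicable exemption, such as detecting or preventing an unlawful act. The field names are assumptions made for the example and cover only a small subset of the actual Article 9 conditions.

```python
from dataclasses import dataclass

@dataclass
class BiometricProcessing:
    """Minimal description of a proposed processing activity (illustrative only)."""
    purpose: str
    has_explicit_consent: bool = False
    prevents_or_detects_unlawful_act: bool = False  # example exemption cited in the article

def has_lawful_basis(p: BiometricProcessing) -> bool:
    """Toy check mirroring the article's point: explicit consent or an exemption
    is needed before special category biometric data may be processed.
    A real Article 9 analysis involves many more conditions and legal judgment."""
    return p.has_explicit_consent or p.prevents_or_detects_unlawful_act

# Example: onboarding with consent passes the toy check; silent crowd scanning does not.
onboarding = BiometricProcessing(purpose="remote onboarding", has_explicit_consent=True)
crowd_scan = BiometricProcessing(purpose="crowd scanning")
print(has_lawful_basis(onboarding))  # True
print(has_lawful_basis(crowd_scan))  # False
```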
James also advised that organizations utilizing biometric technologies appoint a data protection officer to ensure this category of data is managed correctly, and involve commercial and legal teams early on “because that’s going to make sure it has the best outcome to hopefully mitigate risk and make sure the investment is well focused.”
Biometrics offers plenty of benefits to society, and the use of these technologies will only increase over the coming years. However, organizations should implement them in a careful and controlled manner, ensuring they do not run into ethical and legal problems.