Facing the Future: New Developments in Digital Processing of Personal Data
Digital innovation and public trust go hand in hand: the success of digital innovation relies on consumers trusting organisations with their personal data. The law must keep pace in order to preserve public trust whilst also facilitating digital evolution. The Data Protection Act 2018 (DPA 2018), alongside the EU General Data Protection Regulation (GDPR), laid out the framework for data protection law in England and Wales.
On 4 September 2019, the UK High Court confirmed the legality of the use of automated facial recognition (AFR) technology. Technology is rapidly evolving, and the judgment handed down in R (Bridges) v Chief Constable of South Wales Police shows that data protection law is adaptable to modern digital innovation. The court confirmed that the current legal regime in the UK accommodates the use of AFR technology and can ensure its appropriate and non-arbitrary use.
The judgment ruled in favour of South Wales Police (SWP), who had been trialling AFR technology in public spaces around Cardiff. The claimant, a Cardiff resident and civil liberties campaigner, challenged the SWP’s use of the technology. He was caught on camera on two occasions during the ongoing trial, once on Queen Street and once near the Motorpoint Arena. He contended that there was no legal basis for the use of the technology and that the SWP therefore had no power to deploy it.
The court faced a tension between the public interest in using the technology to better detect and prevent crime, and the safeguarding of the private rights of individuals. It held that the current legal regime provides adequate guidance on the appropriate use of AFR technology, and that its use is consistent with the requirements of the Human Rights Act 1998.
Interestingly, the court suggested that new technologies may fall within the scope of existing legal frameworks. However, AFR is a highly intrusive technology and demonstrates the risk to personal data protection that so often accompanies digital innovation.
The continuing development of new technology heightens public concern about the protection of personal data. Despite the existence of these data protection laws, major data breaches remain common. The British Airways breach, for example, exposed the personal information of around 500,000 customers, and the ICO announced its intention to fine British Airways £183m under the GDPR for the breach. Incidents like this will cause significant public concern when it comes to sharing personal data in the future.
The Law Society has recently published a blog emphasising the importance of adopting the measures suggested by the GDPR. These measures facilitate the maintenance of public trust whilst also promoting digital innovation within this framework of trust. The measures include the use of Data Protection Impact Assessments, the incorporation of data protection by design, and the principle of ‘adding noise’ to databases.
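The principle of ‘adding noise’ to databases is, in essence, the idea behind differentially private statistics: query results are perturbed just enough that no individual record can be inferred, while aggregates stay useful. A minimal sketch of the technique, assuming a simple counting query (the function name and the `epsilon` privacy parameter are illustrative, not drawn from the Law Society blog):

```python
import math
import random

def noisy_count(true_count, epsilon=0.5):
    """Return a counting-query result perturbed with Laplace noise.

    A counting query has sensitivity 1 (adding or removing one person
    changes the count by at most 1), so the standard Laplace mechanism
    uses noise of scale 1/epsilon. Smaller epsilon = more noise = more
    privacy, at the cost of accuracy.
    """
    scale = 1.0 / epsilon
    # Draw a Laplace(0, scale) sample via the inverse-CDF method,
    # using a uniform draw on (-0.5, 0.5).
    u = random.random() - 0.5
    sign = 1.0 if u >= 0 else -1.0
    noise = -scale * sign * math.log(1.0 - 2.0 * abs(u))
    return true_count + noise

# e.g. publishing "how many people were flagged" without revealing
# whether any particular individual is in the database:
published = noisy_count(1342, epsilon=0.5)
```

Each published figure is close to the truth on average, but the randomness means the presence or absence of any single individual cannot be deduced from it.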
The High Court’s ruling provides clarity and support regarding the use of AFR technology for personal data processing. It also paves the way for new technologies which may likewise be accommodated without the need for bespoke guidelines. In the meantime, digital data processing remains a field open for innovation.