European Data Protection Authorities Should Reexamine their Approach


Dr Asress Adimi Gikay is a Lecturer in AI, Disruptive Innovation, and Law at Brunel Law School, Brunel University London, where his research focuses on Artificial Intelligence, Law, and Policy. He also teaches Artificial Intelligence and the Law, and Law, Policy and Governance of Artificial Intelligence (https://www.brunel.ac.uk/people/asress-gikay).
Twitter: https://twitter.com/DrAsressGikay

 

Facial Recognition Technology in Schools

In today’s world, our privacy and personal data are controlled by Big Tech companies such as Facebook, Google, Twitter, Apple, Instagram and many others. They know almost everything about us: our location, addresses, phone numbers, private email conversations and messages, food preferences, financial conditions and many other intimate details that we would otherwise not divulge even to our close friends. Children are not immune from this overarching surveillance power of Big Tech companies. In the UK, children as young as 13 years old can consent to the processing of their personal data, including through some form of Facial Recognition Technology (FRT), by the Big Tech companies that provide online services. Many of these companies likely know our children’s preferences for movies, music and food better than we do. But society has no meaningful way to influence these companies, whose God-like presence in our lives epitomizes the dystopia of a technology-driven world. Surveillance is the rule rather than the exception, and we have few tools to protect ourselves from pervasive privacy intrusion.

But the advent of FRT in schools in Europe has alarmed citizens, advocacy groups and Data Protection Authorities (DPAs) far more than the pervasive presence of Big Tech companies in our lives. It has prompted a strong response from DPAs, who have consistently blocked the deployment of the technology in schools on the grounds of privacy intrusion and breach of the GDPR.

Facial recognition is a process by which a person is identified or recognized by Artificial Intelligence (AI) software using their facial image or video. The software compares the individual’s digital image or video, captured by a camera, to an existing biometric image, estimating the degree of similarity between the two facial templates to identify a match. There have been multiple instances of the use of this technology in schools: attendance monitoring in Sweden, access control in France and taking payment in canteens in the UK. According to the Swedish Municipal School Board, monitoring attendance using FRT would save 17,280 hours per year at the school concerned. The UK schools wanted to reduce canteen queues by making payments faster and safer. But DPAs, and in the case of France the Administrative Court of Marseille, stepped in to block the technology, primarily due to privacy-related concerns, regardless of the appreciable benefits.
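The template-matching step described above can be illustrated with a minimal sketch. This is not any vendor's actual algorithm: the four-dimensional vectors, the cosine-similarity measure and the threshold value are all illustrative assumptions (real systems compare high-dimensional embeddings produced by a neural network, with thresholds tuned by the vendor).

```python
import math

# Toy facial templates (real systems use embeddings of hundreds of
# dimensions produced by a neural network; these 4-D vectors are illustrative).
enrolled_template = [0.12, 0.87, 0.33, 0.41]   # stored biometric template
captured_template = [0.10, 0.85, 0.35, 0.40]   # template from the camera frame

def cosine_similarity(a, b):
    """Estimate the degree of similarity between two facial templates.

    Returns a value near 1.0 when the templates point in the same direction."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

# Illustrative threshold: vendors tune this to trade off false matches
# against false non-matches.
MATCH_THRESHOLD = 0.95

score = cosine_similarity(enrolled_template, captured_template)
is_match = score >= MATCH_THRESHOLD
print(f"similarity={score:.4f}, match={is_match}")
```

The threshold choice is where the privacy and accuracy trade-offs discussed by regulators surface in practice: a lower threshold admits more false matches, a higher one rejects more genuine students.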

While the school authorities relied on the explicit consent of students and/or their legal representatives to use the technology, DPAs rejected explicit consent as a valid ground for processing personal data using FRT, due to the imbalance of power between the school authorities on the one hand and students and their guardians on the other. This raises the question of whether public institutions, including schools, could ever use FRT with the explicit consent of the data subject, and if not, whether that is an outcome society should aim for.

The Concerns about FRT in Schools and the Fix

Scholars and advocacy groups point out that FRT poses certain risks, especially in the context of processing children’s data. These range from the misuse of biometric data, whether by the companies providing or using the technology or by bad actors such as hackers, to the normalization of a surveillance culture stemming directly from individuals giving up their right to privacy. More generally, it is argued that FRT is “an unnecessary and disproportionate interference with the students’ right to privacy.” As such, DPAs call for the deployment of less privacy-intrusive technological alternatives and take a strict approach to whether there is a valid legal basis for using the technology, including freely obtained consent.

In its 2019 decision to fine the secondary school board of Skellefteå Municipality, the Swedish DPA argued that although FRT was employed to monitor student attendance based on explicit consent, consent could not be a valid legal basis given the clear imbalance of power between the data subject and the controller. In France, the Administrative Court of Marseille, agreeing with the French DPA (CNIL), concluded that the school councils had not provided sufficient guarantees to obtain the free and informed consent of students to use FRT to control access, despite the fact that specific written consent had been obtained. In October 2021, as nine schools in North Ayrshire (UK) were preparing to replace their method of taking payment in canteens from fingerprint to facial recognition, the Information Commissioner’s Office (ICO) wrote a letter urging the schools to use a “less intrusive” tool. The school councils were forced to pause rolling out the technology. The content of the ICO’s letter is not public, and the ICO has not responded to the author’s Freedom of Information (FOI) request to access the letter.

But these decisions evidently suggest that the mere presence of a power relationship between the data controller and the data subject renders explicit consent invalid as a basis for processing biometric data. The UK schools’ suspension of FRT implementation upon merely receiving a letter from the ICO signals that the presumed power imbalance alone would defeat explicit consent; at the very least, schools are not willing to engage in the process of obtaining consent, as that would likely be regarded as insufficient and entail sanctions for breach of the GDPR.

While documents obtained from North Ayrshire Council under an FOI request do suggest there were flaws in obtaining consent (for instance, attempting to obtain consent directly from a 12-year-old child), the overall effort of the Council seemed reasonable in terms of complying with data protection law. If the Council wishes to obtain valid consent, it should not be effectively prohibited ex ante. But the ICO’s letter clearly had that effect. Subsequently, on November 4, 2021, the House of Lords held a debate sponsored by Lord Clement-Jones, who expressed his opposition to the use of FRT in schools, stating that “we should not use children as guinea pigs.” There is overwhelming evidence of the pressure to categorically ban the use of FRT in schools, and indeed it is now effectively banned in Europe, although there is no legislation to that effect.

Imbalance of Power under the GDPR

Although the GDPR prohibits, as a rule, the processing of so-called special categories of personal data, including biometric data such as facial images, it provides exceptions under which such data can be processed. Under one of these exceptions, biometric data may be processed to uniquely identify a natural person if the data subject has given explicit consent to the processing of such personal data for one or more specified purposes. Consent should be given by a clear affirmative act establishing a freely given, specific, informed, and unambiguous indication of the data subject’s agreement to the processing of personal data relating to her.

Where there is a power relationship, it is challenging to prove that consent has been obtained freely, a requirement which DPAs concluded was not met in Sweden and France. In this regard, the GDPR makes it clear that “consent should not provide a valid legal ground for the processing of personal data in a specific case where there is a clear imbalance between the data subject and the controller, in particular where the controller is a public authority and it is therefore unlikely that consent was freely given in all the circumstances of that specific situation.”

The GDPR allows DPAs and courts to consider an imbalance of power in assessing whether consent has been obtained freely. But this is not a blanket prohibition on public authorities using explicit consent to process personal data. This is consistent with the European Data Protection Board’s Guideline, which states that “Without prejudice to these general considerations, the use of consent as a lawful basis for data processing by public authorities is not totally excluded under the legal framework of the GDPR.” Regulators and courts can, ex post facto, scrutinize whether the explicit consent was obtained freely and in an informed manner, but they have no power to invalidate validly given consent based on the mere existence of an imbalance of power that had no actual effect. If a Member State of the European Union wishes to exclude consent as a basis for processing special categories of personal data, the GDPR allows that Member State to legislate that the prohibition on processing special categories of personal data may not be lifted on the basis of explicit consent under any circumstance. Absent such legislation, the validity of consent in the context of a power relationship can only be examined on a case-by-case basis rather than in categorical terms. Thus, schools should be able to demonstrate that the presumed imbalance of power has not played a role in obtaining consent.

The Current Approach Should Be Reexamined

The concerns raised by scholars and privacy advocates about the intrusive nature of FRT should be seen in the light of the data protection and privacy safeguards provided by the GDPR, which has a series of provisions guaranteeing that personal data is not used for a purpose different from that originally intended, and that personal data is kept confidential and secure. Furthermore, data controllers and processors have no right to share personal data with third parties unless the data subject has consented. In the presence of these and a number of other safeguards, what appears to be a blanket prohibition on the use of FRT in schools, based on unreasonable privacy anxiety and an irrebuttable presumption that a power imbalance per se vitiates consent, is not sensible. There are several reasons this approach needs to be reexamined.

First and foremost, it puts small companies and public institutions at a disadvantage with regard to the use of FRT. Big Tech companies can do almost as they please with our or our children’s personal data. Facebook’s opaque data-sharing practices have frequently been exposed, but there is still no meaningful way to control what Facebook does. The same is true of other Big Tech companies in the business of monetizing our personal data. Schools and companies providing FRT should be the least of our concerns. It is not difficult to make them abide by the GDPR, whereas Big Tech companies can hide behind complex legal and technical black boxes to get away with grossly illegal use of our personal data. The blanket prohibition on small institutions using FRT creates a system that unfairly disadvantages small data controllers.

Furthermore, data innovation would be deterred by over-zealous DPAs and courts who see a superficial power imbalance without examining how it plays out in reality, while the real power imbalance society suffers vis-à-vis Big Tech companies remains inadequately challenged. The future depends on innovating with data, and the use of FRT would be an essential component of it. We cannot, to satisfy our excessive anxiety about privacy intrusion, prevent small companies and institutions from benefiting from AI technologies and data-driven innovation while we let Big Tech companies take control of our lives. Data innovation should be by all, for all, not just for Big Tech.

Art credit: Peder Severin Krøyer, via Wikimedia
Commons


