School Gets An F For Using Facial Recognition On Kids In Canteen

The UK’s data protection watchdog has reprimanded a school in Essex for using facial recognition for canteen payments, nearly three years after other schools were warned about doing the same.

A statement from the Information Commissioner’s Office (ICO) said Chelmer Valley High School in Chelmsford broke the law when it introduced facial recognition technology (FRT) to take cashless canteen payments from students in March 2023.

By processing biometric data to uniquely identify people, FRT can result in high data protection risks, the regulator said.

It reprimanded the school, which caters for around 1,200 pupils aged 11 to 18, for failing to carry out a Data Protection Impact Assessment (DPIA) before starting to use the FRT, as required by UK data protection law (formally the UK GDPR, which sits alongside the Data Protection Act 2018).

The regulator also said the school had not properly obtained clear permission to process students' biometric information, and that students were not given the opportunity to decide whether they wanted their data used in this way.

Lynne Currie, ICO Head of Privacy Innovation, said: “Handling people’s information correctly in a school canteen environment is as important as the handling of the food itself. We expect all organisations to carry out the necessary assessments when deploying a new technology to mitigate any data protection risks and ensure their compliance with data protection laws.

“We’ve taken action against this school to show introducing measures such as FRT should not be taken lightly, particularly when it involves children. We don’t want this to deter other schools from embracing new technologies. But this must be done correctly with data protection at the forefront, championing trust, protecting children’s privacy and safeguarding their rights.”

The school did not get the views of its data protection officer, nor did it consult parents and students before rolling out FRT, the ICO said.

The Register has contacted the school for a response.

According to the ICO, the school did send a letter to parents in March 2023, containing a slip for them to return if they did not want their child to participate in the FRT. By relying on an opt-out slip, the school was working from "assumed consent" rather than seeking "opt-in" consent. Assumed consent is not a valid form of consent: the law requires explicit permission.

The ICO said it had provided Chelmer Valley High School with recommendations for the future.

The case follows incidents from 2021 that attracted attention as FRT was being employed in more UK schools to allow pupils to pay for their meals.

At the time, nine schools under North Ayrshire Council, a Scottish authority whose area includes the Isle of Arran, were preparing to process payments for school lunches using facial scanning technology.

Shortly afterwards, the ICO asked for the plans to be put on hold until it had the chance to find out more.

Jen Persson, director at campaign group defenddigitalme, said schools have again demonstrated they have not had the training or capacity to procure technology that respects the law or children's rights.

“Facial recognition should have no place in schools,” she told us today. “Children in the UK are exposed to high-risk AI, biometric, and emerging tools in ways that are unthinkable elsewhere and the education sector must get urgent resources and support to get a grip on how this kind of practice can compromise pupils’ identity for life.”

She said the ICO’s enforcement was vital but encouraged the regulator to go further and identify the manufacturer and country of origin behind each case it intervenes in. ®
