
DfE is taking the lead on AI, but who’s really holding it?

In the summer holidays, at a conference in South Korea, the DfE announced a project to create a datastore for “AI companies to train their tools”. Scant on detail, it raises pressing practical, legal and ethical questions.

The project will be funded to the tune of £3 million, with a share of a further £1 million available to encourage its use by those who bring forward ideas that reduce teacher workload.



But does the government understand sector priorities or the root causes of excessive teacher workload? And will displacing tasks into as-yet imaginary, untested tools make any substantive difference?

The ethical implications for staff are significant. The push for personalisation is contested and raises concerns about inequity, quality, learner profiling and the use of ‘if-then’ logic that treats pupils like the others they statistically resemble.

Schools and their legally accountable bodies need to know the sources of the “anonymised pupil assessments” that will be used to “stimulate the edtech market”. In its framework on AI products, promised later this year, the DfE must commit companies to proving the lawful provenance of their AI training data – where the foundations of an AI product come from – before any pupil data is added to it.

If you understand the scale of digital risks (including identity theft, fraud and sextortion), then the lifetime protection of children’s records is clearly a safeguarding duty, not an IT issue.

The supporting Gen AI use cases report says, “In the information provided to parents and schools, it is important to be clear that the removal of all PII is not guaranteed”.

Note the choice of the US term PII (personally identifiable information) instead of ‘personal data’, the term used in UK law. What does that caveat mean for children whose parents signed away their written work to create the proof-of-concept model?


This is unclear both in the IP agreement drawn up by Faculty AI and the DfE for schools to disseminate to parents, and in the DfE’s AI privacy notice.

In the accompanying public attitudes research, most parents said they expect to be asked for permission. Of those Survation polled on behalf of Defend Digital Me in 2018, 69 per cent didn’t know a National Pupil Database existed, let alone about today’s commercial reuses.

Many products, including large language models like OpenAI’s ChatGPT, even require child users to be 13 or over and to have parental consent. In educational settings, such consent cannot be freely given or easily withdrawn, and is therefore problematic (if not impossible) to obtain.

Meanwhile, generative AI models are losing their over-hyped appeal to investors. Edtech companies are increasingly platform-based, demanding user lock-in. Pearson, for example, aims to be the “Netflix of Education” built on a single Oracle platform.

The influence of corporate donors in shaping public sector procurement, such as the Tony Blair Institute’s backing by Oracle owner Larry Ellison, therefore demands scrutiny.

There’s a global race to control computing power and cloud infrastructure. The countries leading on AI may avoid the most apparent environmental and social harms (climate damage, cheap labour cleaning nasty content from training datasets, and child labour in mining minerals for hardware) but what of our obligations to the future?

The government must strengthen the application of basic principles of data protection law for schools and implement the 2020 Council of Europe Guidelines for Educational Settings.  

In a 2019 Nesta survey, 75 per cent of parents said they would be happy for AI to be used for timetabling, but support dropped to 55 per cent for AI adjusting the pace of a student’s learning.


A safe digital environment for the education sector requires teacher training on data protection and child rights. It hinges on information management systems that enable schools to meet legal obligations, such as those on optional data items. And it must include mechanisms for parental opt-in to re-uses.

But instead of the government investing in staff and adequate school infrastructure (which some schools achieve today only by flouting laws prohibiting charges for the provision of education and locking parents into expensive, school-controlled 1:1 iPad schemes), the focus is on incentivising tech firms keen to reach new markets.

How will the department make sure it’s not been sold a pup? It’s fair for schools to ask if the AI tail is wagging the education dog, and who’s holding the lead.
