Hello hello again.
-
Mariano delli Santi replied to Mariano delli Santi, last edited by
Today, Clement-Jones argues, families are finding it difficult to access data about their children and to know how this data is being used. A code is needed to reduce power imbalances and clarify legal obligations for parents, children and edtech providers.
-
There is also the need, argues Clement-Jones, to empower children to participate and to be able to exercise their own choices, in line with their maturity and personal development.
-
Clement-Jones also draws attention to the edtech sector's reliance on “public interest” legal bases, and explains that clarity is needed as to what data processing can be justified on this basis. Defend Digital Me is also mentioned as one of the organisations supporting the Lords with their briefings.
-
Baroness Kidron also intervenes, stating that the Department for Education and the Information Commissioner's Office have continually failed to confront ongoing breaches of data protection law in the education sector.
-
A key aspect of the Code, Baroness Kidron explains, should be to transfer compliance responsibilities from educational providers to edtech providers. [I guess Baroness Kidron must be talking about tech companies falsely classifying their customers as controllers to avoid legal liability, an unfortunately common trick in the tech sector]
-
Baroness Kidron also complains about the growing use of biometric monitoring (such as live facial recognition) in schools: a Code could clarify the circumstances in which this can be used, and how consent can be validly collected and given in such contexts.
-
Baroness Kidron also touches upon the issue of reuse of pupils' data for research purposes, arguing that a Code could help draw useful lines separating legitimate research activities from commercial and predatory behaviour by edtech companies.
-
Another Lord intervenes, ominously declares various interests in an AI and edtech company, and starts listing the benefits of edtech tools, which would make teaching and assessment cheaper and easier and free up teachers' time.
-
However, he does admit there are dystopian scenarios, such as the digital divide or the possibility of a two-tier system where poor children are taught by AI and richer children by humans. He warns against the risk of covert privatisation of the education sector, driven by the adoption of AI companies' products.
-
These systems need data, and a variety of data, in order to work, he says. However, he recognises that valid concerns about data uses exist, and so he supports a code of practice for the edtech sector to address these issues.
-
The Lord also invites the Lords to involve the Department for Education on these issues. Finally, he raises the issue of children's data being shared with the DWP to check entitlement to benefits.
-
Another Lord takes the floor and laments that schools are already “drowning in guidance”. He says, though, that there is a difference between guidance and a clear code, which would establish guardrails in this context.
-
A code, he says, should draw a line in the sand, establishing clear expectations about what is permissible and what is not.
-
Lord Kirkhope intervenes, lamenting again that, in designing the GDPR, lawmakers failed to take AI into account. But I need to make an intervention here...
-
The GDPR was meant to be technologically neutral, setting rules that future technologies would have to comply with (for the next 20 years, as the European Commission used to say). While it does not cover every aspect relevant to protecting rights in the field of AI, all GDPR provisions apply to data used by AI systems.
-
As it stands now, the GDPR is the single most important legal framework protecting individuals from AI systems that use their data without consent, that use this data unfairly, or that take or inform decisions affecting people's lives or their rights.
-
I would argue that what we should be concerned about are the several provisions in the DUA Bill that would carve out exemptions from data protection obligations for AI companies, for the sake of allowing flawed products and business models to survive rather than “evolve or face extinction”. It is not clear why companies that built upon illegal foundations should now be salvaged instead of facing the consequences of their failures.
-
Anyway, other Lords intervened to support the need for a Code of Practice for edtech. Now Baroness Jones (LAB) intervenes, saying it would be premature to put these requirements into law, but she commits to continued engagement with the ICO and to continuing work on this issue.
-
Clement-Jones (LIBDEM) responds: there are real issues here in the edtech sector. “It's premature” is a red flag, he argues, in these kinds of debates.
-
I must support Clement-Jones's statement above: digital technologies have normative power, and edtech providers can choose what data is collected and for what reason, how an individual can interact with them, and what choices, preferences and behaviours are allowed, rewarded, prohibited or punished. The question is never whether regulation is needed, but who should be answering these questions: our democratic institutions, or a private edtech provider?