Hello hello again.
A key aspect of the Code, Baroness Kidron explains, should be to transfer compliance responsibilities from educational providers to edtech providers. [I guess Baroness Kidron must be talking about tech companies falsely classifying their customers as controllers to avoid legal liability, an unfortunately all-too-common trick in the tech sector]
Baroness Kidron also complains about the growing use of biometric monitoring (such as live facial recognition) in schools: a Code could bring clarity about the circumstances in which this can be used, and about how consent can be validly collected and given in such contexts.
Baroness Kidron also touches upon the issue of reuse of pupils' data for research purposes, arguing that a Code could help draw useful lines separating legitimate research activities from commercial and predatory behaviours by edtech companies.
Another Lord intervenes, ominously declares various interests in an AI and edtech company, and starts listing the benefits of edtech tools, which would make teaching and assessment cheaper and easier and free up teachers' time.
However, he does admit there are dystopian cases, such as the digital divide or the possibility of a two-tier system where poor children are taught by AI and richer children by humans. He warns against the risk of covert privatisation of the educational sector, driven by the adoption of AI companies' products.
These systems need data, and a variety of data, in order to work, he says. However, he recognises that valid concerns about data uses exist, and so he supports a code of practice for the edtech sector to address these issues.
The Lord also invites the Lords to involve the Department for Education on these issues. Finally, he raises the issue of children's data being shared with the DWP to check entitlement to benefits.
Another Lord takes the floor and laments that schools are already “drowning in guidance”. He says, though, that there is a difference between guidance and a clear code that would establish guardrails in this context.
A code, he says, should draw a line in the sand and establish clear expectations about what is permissible and what is not.
Lord Kirkhope intervenes, lamenting again that, in designing the GDPR, lawmakers failed to take AI into account. But I need to make an intervention here...
The GDPR was meant to be technologically neutral, setting rules that future technologies would have to comply with (for the next 20 years, as the European Commission used to say). While it does not cover every aspect of protecting rights in the field of AI, all GDPR provisions apply to data used by AI systems.
As it stands now, the GDPR is the single most important legal framework protecting individuals from AI systems that use their data without consent, that use this data unfairly, or that take or inform decisions affecting people's lives or their rights.
I would argue that what we should be concerned about is the several provisions in the DUA Bill that would carve out exemptions from data protection obligations for AI companies, for the sake of allowing non-compliant products and business models to survive rather than “evolve or face extinction”. It is not clear why companies that built upon illegal foundations should now be salvaged instead of facing consequences for their failures.
Anyway, other Lords intervened to support the need for a Code of Practice for edtech. Now Baroness Jones (LAB) intervenes, saying it would be premature to put these requirements into law, but committing to continue engagement with the ICO and to continue work on this issue.
Clement-Jones (LIBDEM) responds: there are real issues here in the edtech sector. “It's premature” is a red flag, he argues, in these kinds of debates.
I must support the statement above from Clement-Jones: digital technologies have normative power, and edtech providers can choose what data is collected and for what reason, how an individual can interact with it, and what choices, preferences and behaviours are allowed, rewarded, prohibited or punished. The question is never whether regulation is needed, but who should be answering these questions: our democratic institutions, or a private edtech provider?
Of course, the more you delay regulation, the more edtech providers will enforce their own norms and rules, and then present these as the status quo. Delaying regulation favours large technology companies and bad actors, not innovation.
Lord Holmes now presents an amendment that would change the Computer Misuse Act to protect cybersecurity researchers from unjust prosecution for their work.
It's on me that I had completely missed these amendments. They relate to an important issue I engaged with quite some time ago, although in a rather different consultation: https://www.openrightsgroup.org/publications/computer-misuse-act-1990-open-rights-group-submission-to-the-home-office/
Long story short: the Computer Misuse Act criminalises violating the security of an IT system regardless of whether this is done with fraudulent or malicious intent, or with the aim of testing the security of the system and identifying vulnerabilities. In turn, this exposes cybersecurity researchers to the risk of prosecution for carrying out an all-too-important public interest job.