This article was originally published on The Conversation, an independent and nonprofit source of news, analysis and commentary from academic experts. Disclosure information is available on the original site.
___
Author: Teresa Scassa, Canada Research Chair in Information Law and Policy, L’Université d’Ottawa/University of Ottawa
Ontario school board lawsuits against social media giants including Meta, Snapchat and TikTok are seeking damages (money paid as a remedy) for the disruption of the educational system.
A growing volume of evidence indicates that young people have become addicted to social media. It suggests social media platforms are designed to foster such addiction, that online activities contribute to behaviour such as bullying and harassment, and that excessive use of social networks can harm students’ mental health, even influencing suicide.
Ontario school boards, speaking as a coalition called Schools for Social Media Change, argue “social media products, designed for compulsive use, have rewired the way children think, behave and learn” and that “schools are unfairly bearing the brunt of the learning and mental health epidemic caused by the alleged negligent conduct of social media companies.” The lawsuits come as 95 per cent of Ontario schools report needing more resources to support student mental health.
At the core of the litigation are concerns about the impact on young people of social media companies’ practices. But neither lawsuit victories nor existing or proposed Ontario provincial or federal privacy or AI legislation will prevent problems related to rampant collection and processing of human-derived data.
Boards in U.S. and Canada
Four Ontario school boards announced that they were suing social media giants including Meta, Snapchat and TikTok in March 2024. Five other school boards and two private schools also filed suit shortly afterwards.
These actions follow a flood of lawsuits launched in the U.S. by over 200 school districts against social media companies.
The U.S. lawsuits link social media engagement with a decline in students’ mental health. One U.S. statement of claim describes the situation as “perhaps the most serious mental health crisis [the nation’s children, adolescents and teenagers] have ever faced.”
The Canadian lawsuits make similar claims. For example, one alleges that the defendant social media companies “employ exploitative business practices and have negligently designed unsafe and/or addictive products” that they market and promote to students.
Regulating digital information
The litigation on both sides of the border is novel. In Canada it has also been somewhat controversial. When asked about the Ontario lawsuits, Premier Doug Ford called them “nonsense,” suggesting that the school boards should focus on educating students.
Shortly after the launch of these lawsuits, the Ontario government introduced Bill 194. This bill proposes, among other things, new regulation of digital information of children and youth in schools and in children鈥檚 aid societies.
Nonetheless, what is proposed in the bill won’t address what these lawsuits attempt to tackle: the impact on education from how social media companies engage with children and youth, including in time spent out of school. Ontario’s Information and Privacy Commissioner, in her submission on Bill 194, recommends largely replacing what the government proposes with improvements to existing privacy law.
Similarly, the province鈥檚 school cell phone ban tackles only one dimension of a much bigger problem.
Impact of company practices on youth
The Canadian lawsuits against social media giants are not framed as privacy claims. Indeed, litigation led by school boards could not raise such claims, since any privacy rights belong to the children and youth who engage with social media, not to the school boards themselves.
The damage alleged by the school boards is the disruption of the operation of schools, but at the core of the litigation are concerns about the impact on young people of social media companies鈥 practices.
While privacy claims are not part of the school board litigation, they are not far from the surface. Social media user data fuels these companies鈥 business models, incentivizing them to engage in practices that draw users in, and that drive continued engagement and social dependence. Although all users are affected by these practices, evidence suggests that children and youth are particularly susceptible to becoming addicted.
Data gathered through engagement on these platforms also fuels targeted advertising, which can foster insecurities around body image and other concerns that affect young people’s self-confidence.
Privacy laws out of step?
The roots of the harm alleged by the boards are therefore in personal data collection and processing. However, the consequences far transcend the individual privacy harms recognized in privacy laws or privacy torts. This suggests that our privacy laws are out of step with contemporary data practices.
It would be tempting to take comfort from the fact that Bill C-27, currently before Parliament’s Standing Committee on Industry and Technology, proposes long-awaited reforms to Canada’s private sector privacy law in the form of a new Consumer Privacy Protection Act.
It also contains a new law that would regulate the development and use of artificial intelligence (AI) technologies. Unfortunately, even if the bill is passed into law before the coming election (which seems increasingly unlikely), these reforms will do little to address the broader systemic harms impacting our society that come from the exploitation of personal data.
Legislation falling short
The proposed Consumer Privacy Protection Act takes only small steps to recognize the sensitivity of children’s information. It falls far short of the United Kingdom’s age-appropriate design code of practice for online services.
Further, although the proposed Artificial Intelligence and Data Act would set parameters for the design, development and deployment of AI systems, it defines harms in individual terms and doesn’t acknowledge group and community harms from algorithm-driven practices, such as the disruption of the educational system.
The European Union’s AI Act is not so limited. In its first recital, it describes its broad goals to ensure “a high level of protection of health, safety, fundamental rights … including democracy, the rule of law and environmental protection.”
What the school boards are advancing in their litigation are novel claims for redressing what they and a growing body of experts say are harms rooted in the collection and processing of human-derived data. These harms go beyond the individuals whose data is harvested and impact society more broadly.
As this litigation unfolds, we should be asking: When new bills to regulate AI or privacy are introduced, how will they equip us to address the group and social harms of personal data exploitation?
___
Teresa Scassa does not work for, consult, own shares in or receive funding from any company or organisation that would benefit from this article, and has disclosed no relevant affiliations beyond their academic appointment.
___
This article is republished from The Conversation under a Creative Commons license. Disclosure information is available on the original site. Read the original article: https://theconversation.com/youth-social-media-why-proposed-ontario-and-federal-legislation-wont-fix-harms-related-to-data-exploitation-242187
Teresa Scassa, Canada Research Chair in Information Law and Policy, L’Université d’Ottawa/University of Ottawa, The Conversation