In what could be a significant step towards protecting children from potential harms online, the California legislature is currently debating an amended bill that would enable the state Attorney General and other public prosecutors to sue social platforms over algorithms and systems that addict children to their apps.
As reported by The Wall Street Journal:
“Social-media companies such as Facebook parent Meta Platforms could be sued by government attorneys in California for features that allegedly harm children through addiction under a first-in-the-nation bill that faces an important vote in the state Senate here Tuesday. The measure would permit the state attorney general, local district attorneys and the city attorneys of California’s four largest cities to sue social-media companies including Meta – which also owns Instagram – as well as TikTok, and Snapchat, under the state’s law governing unfair business practices.”
If passed, the bill could add a range of new complications for social media platforms operating within the state, and could restrict the way that algorithmic amplification is applied to users under a certain age.
The ‘Social Media Platform Duty to Children Act’ was initially proposed early last month, but has since been amended to improve its chances of securing passage through the legislative process. The bill includes a range of ‘safe harbor’ clauses that would exempt a social media company from liability if it removes addictive features from its platform within a specified time frame.
What, exactly, those ‘addictive’ features are isn’t specified, but the bill essentially takes aim at social platform algorithms, which are designed to keep users active in each app for as long as possible by responding to each person’s individual usage behaviors and hooking them in with more of whatever they react to in their ever-refreshing content feeds.
Which, of course, can have negative impacts. As we’ve repeatedly seen play out across social media, the problem with algorithmic amplification is that it’s driven by raw engagement signals (clicks, comments, shares), and makes no judgment about the actual content of the material it amplifies. The system simply responds to what gets people to click and comment – and what gets people to click and comment more than anything else? Emotionally charged content and posts that take a divisive, partisan viewpoint, with updates that spark anger or laughter being among the most likely to trigger the strongest response.
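To make that concrete, here’s a deliberately simplified sketch of how an engagement-optimized ranker behaves. The posts, predicted probabilities and weights below are hypothetical, invented purely for illustration, and real systems use far more signals – but the core behavior is the one described above: the score reflects predicted interaction, with no regard for substance.

```python
# Illustrative only: a toy engagement-based ranker. The posts, predicted
# probabilities and weights are hypothetical, but the core behavior matches
# the dynamic described above - the score only reflects predicted
# interaction, never the substance of the post itself.

posts = [
    {"id": "measured-explainer", "p_click": 0.04, "p_comment": 0.01},
    {"id": "outrage-take",       "p_click": 0.12, "p_comment": 0.09},
    {"id": "partisan-rant",      "p_click": 0.10, "p_comment": 0.07},
]

def engagement_score(post):
    # Weighted sum of predicted interactions; comments are weighted more
    # heavily here because they signal stronger engagement.
    return 1.0 * post["p_click"] + 3.0 * post["p_comment"]

for post in sorted(posts, key=engagement_score, reverse=True):
    print(post["id"], round(engagement_score(post), 2))
# The emotionally charged posts rank first, simply because they reliably
# draw more clicks and comments.
```

Run it and the divisive posts come out on top – not because anything judged them ‘better’, but because interaction probabilities are the only inputs the system sees.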
That’s part of the reason for increased societal division overall: systems built to maximize engagement essentially incentivize more divisive takes and stances, because those are what generate shares and reach.
Which is one major concern with algorithmic amplification. Another, as noted in this bill, is that social platforms are getting increasingly good at understanding what will keep you scrolling, with TikTok’s ‘For You’ feed, in particular, almost perfecting the art of drawing users in and keeping them in the app for hours at a time.
Indeed, TikTok’s own data shows that users spend around 90 minutes per day in the app, on average, with younger users particularly drawn in by its never-ending stream of short clips. That’s great for TikTok, and underlines its nous in building systems that align with user interests. But the question essentially being posed by this bill is: is this actually good for youngsters online?
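As a rough illustration of why that pull is so strong, consider the feedback loop at the heart of any ‘For You’-style feed. The topics, starting weights and update rule below are hypothetical and heavily simplified, but the pattern holds: every clip watched updates the model of what you like, which immediately sharpens the next recommendation.

```python
# Illustrative only: a toy 'For You'-style feedback loop. The topics,
# learning rate and update rule are invented for this sketch - the point
# is the loop itself, in which watch time feeds back into the interest
# model and narrows the next round of recommendations.

interests = {"dance": 0.3, "news": 0.3, "gaming": 0.4}  # hypothetical profile
LEARNING_RATE = 0.2

def recommend():
    # Serve the topic the model currently believes the user likes most.
    return max(interests, key=interests.get)

def record_watch(topic, seconds_watched, clip_length):
    # Completion rate stands in for the many signals a real system tracks;
    # near-complete views nudge that topic's weight upward.
    completion = seconds_watched / clip_length
    interests[topic] += LEARNING_RATE * (completion - interests[topic])

# Simulate a short session where the user watches each clip almost to the end.
for _ in range(5):
    topic = recommend()
    record_watch(topic, seconds_watched=14, clip_length=15)
    print(topic, {k: round(v, 2) for k, v in interests.items()})
# Within a few clips the profile converges on a single topic - the
# 'hooking in' dynamic the bill takes aim at.
```

Within a handful of iterations the profile collapses towards one topic, and each pass through the loop makes the next recommendation harder to scroll past.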
Already, some nations have sought to implement curbs on young people’s internet usage, with China restricting gaming and live-streaming, including a recent ban on those under the age of 16 watching live-streams after 10pm.
The Italian Parliament has passed laws to better protect minors from cyberbullying, while evolving EU privacy regulations have introduced a range of new protections for young people and the use of their data online, which have changed the way that digital platforms operate.
Even in the US, a bill proposed in Minnesota earlier this year would have entirely banned the use of recommendation algorithms for anyone under the age of 18.
And given the range of investigations showing how social platform usage can be harmful for young users, it makes sense for more legislators to seek further regulatory action – though the technical complexities may prove difficult to litigate, in terms of establishing a definitive connection between algorithmic amplification and addiction.
But it’s an important step, one that would undoubtedly make the platforms reconsider their systems in this regard, and could lead to better outcomes for all users.