Online Safety Amendment (Social Media Minimum Age) Bill - 26 November 2024
The Online Safety Amendment (Social Media Minimum Age) Bill 2024 requires social media platforms to take reasonable steps to prevent people under 16 from creating accounts. While I have concerns about the process, the risk to privacy, the unintended consequences and the bluntness of this tool, I recognise that there is significant community concern about kids and social media. I've consulted widely with my electorate, and the majority of people who engaged wanted a social media ban for children, so I will be supporting this bill.
This bill introduces a new definition of an age-restricted social media platform: a service whose sole or significant purpose is to enable users to post material online and interact socially with other users. The bill will attempt to force owners of these newly defined age-restricted social media platforms to take reasonable steps to prevent people under 16 from having a user account, so kids under 16 can't establish an account on any of these platforms but can still view them while logged out. The bill includes an exclusion framework that exempts messaging apps such as WhatsApp, online gaming platforms and services whose primary purpose is supporting the health and education of end users. The onus will be on the platforms to restrict under-16s, and they could face fines of nearly $50 million if they don't take reasonable steps.
I have some serious concerns about a number of aspects of the bill. Firstly, I have concerns about the process the bill has gone through, or the lack thereof. In what's becoming a bad habit for this government, we're seeing a pretty significant change being rushed through. The government tells us that they've consulted widely in drafting this bill, but here we are again, rushing through legislation that has barely been scrutinised.
We, on the crossbench, first saw this legislation last week. There was a three-hour committee inquiry yesterday. Stakeholders were given three days to provide submissions and, even so, 15,000 people managed to get submissions in. The committee report just dropped, a day after the only hearing. Quite frankly, that's ridiculous. When asked, 'Why the rush?' the government effectively said that they'd agreed with the opposition to do this before Christmas. We have no idea how or when the government and the opposition stitched up this deal or why, to be honest. The process is awful. This legislation is a world first. There's no well-trodden path here, but we are rushing it through.
The Joint Select Committee on Social Media and Australian Society sat for six months to produce a report this month on how social media is used in Australia. One of the specific terms of reference was for the committee to inquire into the use of age verification to protect Australian children from social media. In the report, there were contrasting views on whether making social media safer for children means preventing them from accessing it until they reach a certain age. The committee did not recommend a ban on access; it instead recommended a single, overarching statutory duty of care on digital platforms for the wellbeing of their Australian users. Despite this recommendation, the government is proceeding with a ban, and proceeding now—today—despite there being 12 months before the legislation comes into effect, despite there being limited public scrutiny and despite there being limited clarity on how the ban will be enforced. This is my biggest concern. This is no way to make laws.
When it comes to the ban itself, one of the big risks—and it's one of the reasons we need a proper inquiry—is that we might throw the baby out with the bathwater. Social media is arguably now a crucial form of modern communication and socialisation. We can't turn the clock back to the wholesome childhood I had in the 1980s, when I was climbing trees, riding my bike with friends and watching The Goodies. The world has changed. In October, more than 140 experts wrote an open letter to the Prime Minister, Anthony Albanese, in which they said 'a "ban" is too blunt an instrument to address risks effectively'. The Australian Human Rights Commission has now added its voice to the opposition to the ban. I hear mixed reports about the mental health impact of social media. I've heard it said that, for young people in crisis, social media can often be both the cause of trauma and the channel they use to seek help.
There's no doubt young people can connect with communities beyond their immediate geographic world using social media, and this can help them feel less alone. It's very challenging to balance this benefit against the harm we hear so much about: the bullying, the comparisons against an ideal, the harmful content, the porn, the eating disorder material, the extremist views and the predators. But we must acknowledge that it's not all bad and that, without access to social media, young people will have fewer opportunities to connect than they have now.
Another big concern is enforcement. It's not clear how this ban will be enforced, and this raises a number of risks. The first is that it's simply not enforced and it doesn't work. The second is that, at best, it's a tool for parents to use—saying it's against the law—but kids can get around it. If this happens, it has some value for parents, but we have to be realistic about how much of a difference it will actually make. And if it is enforced through some sort of age verification process, these platforms will have all our data. They already have a lot of data, but adding to that is problematic. There's no point in going into the detail of the privacy issues here. Apparently, we need to wait and see what the age verification trial uncovers, so what will constitute reasonable steps to prevent young people from creating an account remains to be seen. This is a pretty big gap in the legislation, and there's a good argument to postpone the legislation until we know what, if anything, actually works.
Another significant concern is that a ban for under-16s lets the platforms off the hook when it comes to creating safe spaces. The committee report I mentioned refers to widespread agreement that a ban alone is not sufficient to curb harms on social media. A ban for under-16s may actually give the platforms licence to avoid other regulation. The government has said that, while legislating an age limit might not be the perfect solution and should certainly not be the only solution, it would provide important breathing space for the implementation of long-term, sustainable digital reforms. So it's important that this isn't the end of the story when it comes to regulating social media platforms. It's the wild west out there, and we could definitely be doing this in a more nuanced way if we took the time to examine it more closely. Many of my crossbench colleagues are proposing amendments to broaden the scope or focus of regulation, and I'll be backing those amendments.
The EU requires social media platforms to do a risk assessment on any new feature they add to an app. For example, TikTok added a feature in France and Spain that would encourage addictive behaviour. TikTok failed to submit a risk assessment, so the European Commission banned the feature. Other options that I think should be considered in parallel with this blunt tool are regulating algorithms, legislating a duty of care and risk assessments, and making social media companies provide greater transparency and data so we can actually understand how algorithms work and decide whether they're in the public interest. But, and it's a pretty big 'but', there is a strong desire from communities to see immediate protection for children.
I'm a parent of teenagers, and I know that I rarely have a conversation with other parents of teenagers that doesn't involve us talking about the impacts of screens and social media on our kids. We are conducting a huge experiment on a whole generation, and it's raising serious concerns. We don't know how much is too much, whether we're turning our kids into zombies or whether we're depriving them of normal social interaction. As parents, we see growing mental health issues in teens, and you can't help thinking it's connected with the biggest change in the last generation—the ubiquity of smartphones and technology in every aspect of life. Even if it is a blunt tool, this ban should protect children and young people from cyberbullying, harmful content and online predators, as well as go some way to safeguarding their psychological and emotional wellbeing. Hopefully it will give kids time to be kids offline. It will also protect their data from being harvested. No matter how it is enforced, and whether or not it is well enforced, it at least gives parents the ability to hold the line and tell their kids, 'You can't have social media, because it's against the law.'
On balance, it seems that people in Curtin, or at least those who have engaged on the issue, largely support a ban. I conducted an online survey in Curtin some months ago, with nearly 600 respondents, promoted through social media and in the local Post newspaper. This is obviously not the whole community, but these are the people who have expressed an interest and a view. The key themes that came out of the survey were that protection of children is paramount, that there was broad support for a minimum age for social media use, that there was broad support for legislation that covers emerging harms and technologies, and that people felt platforms should have a statutory duty of care. Specifically on the minimum age for social media use, 86 per cent of respondents wanted a minimum age, and, of those, 45 per cent wanted the age to be 16 years, by far the most popular minimum age in the survey. But only 57 per cent said that they were willing to share their personal information to verify their age.
I also asked my youth advisory group, a group of year 11s from across Curtin, about it. Even within that group there was a surprising level of support for a ban for under-16s. One youth advisory group member, Christian, pointed out that 16 is pretty old—the same as the age for sexual consent—and that maybe a younger age would be better, or that it should be up to parents to regulate. All the young people I've spoken to thought that a duty of care and greater obligations on platforms to create safe spaces were a better option than a total ban. But Alice pointed out that social media and doomscrolling are recent things and that people long before us survived without them. She added that social media changes our whole way of living and communicating with each other, and that it's sending us in the wrong direction.
When I asked how a ban would have affected them when they were under 16, none of them were devastated at the idea. Oban said that, if none of his friends were using these platforms, it would have been fine. Isabella said she wouldn't have felt left out if her friends were also not allowed social media. Miles said he and his friends would have migrated to more direct communication platforms instead. Even my year 11 work experience student, Alasdair Watson, while arguing vigorously against the ban, admitted that he would probably have been better off without social media before he was 16. In the new year, I'll be working with my youth advisory group, using an innovative technology platform, to do broader community research into how young people think social media platforms should be regulated. I had planned that work before this bill came up this week, and I will continue with it next February, even in light of this legislation, because there is so much more to do.
In conclusion, we should continue to examine and evolve our regulatory approach to this rapidly changing but ubiquitous part of modern life. But I reluctantly concede that this ban for under-16s may be a useful, if blunt, first tool. So, despite my concerns about the ridiculously rushed process, the risk to privacy if the ban is enforced, its pointlessness if it's not, and the need to do a lot more to create safe online spaces, I will be supporting this bill.