Communications Legislation Amendment (Combatting Misinformation and Disinformation) Bill 2024 - 6 November 2024

This bill addresses misinformation and disinformation that causes serious harm to public health, groups of people or elections. It requires digital platforms to disclose what they're doing about mis- and disinformation and to provide information to prove it, and, if they're not doing enough, it gives ACMA the power to require them to do more. A range of limits have been put in place to reduce the potential for infringing freedom of expression. Thanks to some amendments the government has agreed to, I think this is a step in the right direction in meeting my community's desire to see digital platforms held to account for the harm they can enable.

Seriously harmful online misinformation and disinformation pose a threat to the safety and wellbeing of Australians, as well as undermining trust in democracy. A number of people in my Curtin community have expressed real concern about online safety and harm generally. This was reinforced for me by the strong response to my survey on the review of the Online Safety Act. My community welcomed better regulation of current and emerging harms and technologies, and supported greater responsibility being placed on digital platforms through a statutory duty of care. This is consistent with research by the independent Reset Tech Australia, which commissioned a YouGov poll in April 2024 that found 93 per cent of people agreed social media companies should have a duty to take reasonable care of their users. While it doesn't create a duty of care, I understand that this bill is one of a range of reforms the government intends to propose in the broad online safety and big tech space.

Voluntary self-regulation has not worked. The fact is, we need to hold to account the companies that allow misinformation and disinformation to be distributed. We need strong processes to ensure our communities are protected. The concept of better regulating digital platforms to protect against seriously harmful misinformation and disinformation is not novel and has broad in-principle support. The government is attempting to do it here, and the coalition's 2022 election platform included a commitment to increase the information-gathering and enforcement powers of the media regulator to combat harmful online misinformation and disinformation. I too agree it needs to be done, with transparency and accountability at the forefront. This is consistent with my fair and transparent elections private member's bill, in which one of my priorities was to ban lies in political advertising.

The need to regulate harmful, false communications is particularly pertinent today, of all days, as we watch the US election in a febrile atmosphere of a divided populace and online echo chambers. But how we regulate misinformation is a complex challenge. How do we make sure the communications we receive are truthful, while also making sure people can continue to communicate easily? How do we strike the right balance between combating misinformation and disinformation on the one hand and sufficiently protecting freedom of expression on the other?

The broad aim of this bill is to incentivise digital communications platform providers to have robust systems and measures in place to combat misinformation and disinformation. The bill does three things. Firstly, it imposes core transparency obligations on digital platforms, which will be required to publish a risk assessment of their platform, a current media literacy plan and their policies for tackling misinformation and disinformation. Secondly, it provides ACMA with information-gathering and record-keeping powers to hold digital platforms to account. ACMA can require platforms to keep records; it can obtain information on an as-needed basis when investigating; and it can publish information that's not commercially sensitive. Thirdly, it enables ACMA to approve codes and make standards if voluntary platform efforts are inadequate to prevent and respond to misinformation and disinformation on their services. These codes are industry led and can be approved and registered by ACMA, whilst standards can be set by ACMA as a last resort. A code or standard could include obligations on platforms to have robust systems and processes such as reporting tools, links to authoritative information, support for fact-checkers, and demonetisation of disinformation. Approved codes and standards will be legislative instruments, subject to parliamentary scrutiny and disallowable.

I have some concerns about ACMA's ability to fulfil these new roles, which go beyond its current remit. But I understand that additional funding has been allocated and that the evolving role can be addressed in an updated statement of expectations from the minister, which is a public document. The bill doesn't apply to all misinformation and disinformation; it only applies if the information causes serious harm. 'Serious harm' can mean harm to public health, vilification of a group or harm to the operation or integrity of an election. Given that I have proposed bills to improve the integrity of our electoral system, particularly relating to truth in political advertising, I'm pleased to see this on the list. In Australian federal elections, the AEC is operating in an information environment increasingly affected by misinformation and disinformation, which has serious potential to impact elections and to erode voters' trust in the legitimacy of results. As parliamentarians, we must do everything possible to restore voters' trust at a time of record disengagement from and distrust in our democracy. The Australian public has a high level of trust in the AEC, and the AEC sees the value in considering legislation to increase the transparency and accountability of major digital platforms and their responses to misinformation and disinformation. So that's a good start.

But the Law Council of Australia and other civil society groups are concerned that this bill will amount to a restriction on the freedom of expression, and I agree that any restriction on freedom of expression should not be made lightly. We're back to the balancing act I mentioned before.

I acknowledge that, in drafting, the government has attempted to limit restrictions on freedom of expression. Firstly, ACMA's powers are directed to digital communications platform providers and not to individual end users. These providers will be incentivised to have appropriate systems and measures in place to combat misinformation and disinformation. The bill does not empower ACMA to require digital platforms to take down or remove online content, except for disinformation involving inauthentic behaviour such as bots or troll farms. It does not give ACMA any direct take-down power or the ability to fine or regulate individual end users. Also, various content is excluded from the bill—things like parody or satire content or content for academic, scientific or religious purposes. Professional news content is also excluded, on the justification that it's subject to existing industry oversight.

Changes have also been made since the 2023 exposure draft bill, including narrowing the definition of misinformation and disinformation and the scope of serious harms, both done to reinforce safeguards for freedom of expression. These changes have been designed to better align the definitions with our international human rights obligations.

I'm encouraged that the government has agreed to make some amendments after listening to concerns raised by the crossbench and other stakeholders. Firstly, the government will now put up an amendment to ensure the three-yearly review is undertaken by an independent person. This was an amendment proposed by the member for North Sydney, and I'm glad the government is accepting it. Given how quickly this area is changing, it will be important that we have a credible and clear-eyed review of whether it's working.

The second amendment that the government is now proposing relates to access to data for researchers so policy can be informed by emerging insights in this rapidly changing space. There's no better disinfectant than sunlight, and giving researchers access to data for appropriate research purposes, along with legal protection, means digital platforms will no longer be a black box. I recognise that the Data Access Framework proposed is based on the EU model, which is still a work in progress, so I'm glad to see that data access for independent researchers will be reviewed in 12 months.

I would like to see more public reporting. As the New South Wales Council for Civil Liberties says in its submission to the legislation committee:

Regulatory transparency ensures that the government can hold platforms accountable, but public transparency allows for a multi-stakeholder approach, where civil society, journalists, advocacy groups, and individuals can play an active role in monitoring misinformation efforts. Without public access to critical data and reports the regulatory process risks becoming opaque, which could diminish public trust in both the government and digital platforms.

But I'm yet to work through the detail of the amendments being proposed to determine how much public reporting is now included, directly or indirectly, through the access being provided to researchers.

I note that there have been concerns raised about exempting traditional media outlets from the bill on the basis that other regulatory frameworks are in place. I accept this, but it means that issues with misinformation and disinformation must then be addressed appropriately under those existing frameworks. Given that this is a relatively new area of regulation and a rapidly changing space, I'm glad this legislation will be reviewed every three years to ensure it keeps up with the structural evolution in how people obtain information.

It's worth noting that, while the crossbench has been engaging constructively with experts and the government to improve the legislation, those in opposition are yet again playing a pointscoring game and refusing to even consider how we might address this issue, which they have recognised is a problem. Despite previously announcing their intention to introduce legislation to combat harmful misinformation and disinformation online, members of the opposition are refusing to engage constructively in this debate, and then they have the gall to criticise the crossbench for voting with the government when the government introduces the very amendments the crossbench has negotiated. Pointscoring remains more important to the opposition than substance.

With these limitations on power and the agreed amendments, I think there's an acceptable balance between the right to freedom of expression and other rights, although I would be open to supporting additional amendments that put further safeguards in place for freedom of expression, and I look forward to seeing how the bill operates at its first review. In the interests of combating online misinformation and disinformation that causes serious harm, and consistent with the stated desire of my community in our online safety survey, I intend to support the bill in its amended form. I commend this bill to the House.
