Australia trailing big tech as disinformation spreads like wildfire through our democracy

The Human Rights Law Centre has welcomed the draft Communications Legislation Amendment (Combatting Misinformation and Disinformation) Bill 2023 (Cth) as a positive step toward regulating social media platforms, which profit from the spread of dangerous disinformation.

In a submission to the Federal Government, the Human Rights Law Centre recommended that, rather than positioning the Australian Communications and Media Authority (ACMA) as a “backstop” regulator, ACMA should be given sufficient powers to regulate social media platforms immediately.

The Centre argued that co-regulation models are highly unlikely to work, given the consistent failures of social media networks such as TikTok, Facebook, Instagram and X (formerly Twitter) to meet public expectations in addressing harmful speech online. It also recommended changes to strengthen the Bill, including:

  • ensuring that experts and civil society organisations can access digital platforms’ data on how they amplify the spread of disinformation, enabling greater accountability; and

  • requiring misinformation standards created by the industry and ACMA to be consistent with human rights.  

The Human Rights Law Centre also warned that, too often, powerful people weaponised a distorted interpretation of the right to free speech to avoid accountability for the harm caused by their speech. It urged the Australian Government not to back down from regulating big tech under pressure from such voices.

Millions of Australians rely on social media networks like TikTok, Facebook, Instagram and X (formerly Twitter) to stay up to date with news, friends, family and the wider world. But these powerful platforms are driving the spread of disinformation and hate speech.

While Australia has been an early mover on reform for online safety and digital media, it lags on key aspects of regulating digital platforms.  

Alice Drury, Acting Legal Director, Human Rights Law Centre:  

“From disinformation campaigns undermining the right to health during a pandemic, to intentionally misleading campaigns to distort free and fair elections, to hate speech that stokes violence and threatens lives, the proliferation of disinformation, misinformation and harmful material online has a profound impact on human rights and democratic processes in Australia and beyond.”

“Social media platforms are allowing disinformation to spread like wildfire in our democracy. These platforms have an enormous level of influence over public discourse, with the power to amplify the information – and disinformation – that forms the basis for people’s decisions and beliefs. This turbo-charges discrimination, polarises society and distorts public debate on matters of critical importance.

“Instead of the lax, voluntary and ineffective self-regulation measures currently in place, we need laws to make digital platforms more transparent and accountable. Technology should serve communities, not put people at risk.

“Self-regulation and co-regulation will never be able to compete with the commercial interests of digital platforms. They have failed everywhere else. We urge the Australian Government to consider amending the Bill to give ACMA the powers it needs to regulate the tech industry.”

Read the Human Rights Law Centre’s submission on the Communications Legislation Amendment (Combatting Misinformation and Disinformation) Bill 2023

Media enquiries:

Michelle Bennett: 0419 100 519, michelle.bennett@hrlc.org.au