Speech - International Institute of Communications (IIC), Telecommunications and Media Forum, Sydney


I acknowledge Uncle Allen, and the Gadigal people of the Eora Nation, the traditional owners of the lands on which we meet.

I pay my respect to Elders past and present and to First Nations people here today.


This year, we have a once-in-a-generation chance to vote Yes for a better future for Aboriginal and Torres Strait Islander people and all Australians.


In decades to come, let us look back at 2023 as the year we said “yes” to recognise the First Peoples of Australia by establishing an Aboriginal and Torres Strait Islander Voice.


To this end, I also note the work of the First Nations Digital Inclusion Advisory Group to bridge the digital divide, and Close the Gap on Target 17, particularly for remote Australian communities.


Thank you to IIC President, Chris Chapman for your introduction - as always, your opening remarks are on-point - and to the International Institute of Communications, Director General Lynn Robinson and Australian Chapter President, Angela Flannery, for inviting me to give a few opening remarks. 


With chapters around the world, the IIC has, for over fifty years, shaped the global communications agenda.


The speakers, participants and topics at this auspicious forum are testament to this, and it is a privilege to be here - I know your contributors and discussions will be stimulating. 

Digital platform governance

Around the globe, governments, industry and civil society are grappling with the question of effective digital platform regulation.


And the words of the IIC President – that government policy and regulation need to be turned on their side, from the vertical to the horizontal, and looked at through a digital lens – become more real each day.


Media and communications regulators are, along with their competition counterparts, at the forefront of these efforts.


International collaboration is of increasing importance and trusted global and regional multi-stakeholder events, like this one, provide a forum for much needed policy debate.


From the early days of cross-border communications via satellite, to digitisation, Web 2.0 and social media, and nowadays generative AI, each iteration of technological evolution has demanded a response.

Increasingly, that response requires greater flexibility and agility, even as the need for best-practice policy making – based on principles, evidence, expertise and consultation – remains.


Australia has long recognised the internet as a governed space – and has regulated it.


Early legislation in Australia included 1999 amendments to the Broadcasting Services Act 1992, which established a regulatory regime for internet service providers and online content, followed by the Cybercrime Act 2001 and the Spam Act 2003.


More recently, Australia has applied established regulatory frameworks to digital platforms – adapting some tried and true mechanisms, very familiar to the broadcast, telco and competition lawyers in the room, to new contexts.


The Online Safety Act 2021 contains a co-regulatory framework that incentivises and leverages industry efforts, while providing a strong backstop for the government regulator.


The News Media Bargaining Code utilises a designate / negotiate / arbitrate model as part of incentive regulation to address imbalances in bargaining power between digital platforms and news businesses.


As with longstanding regimes for telco and broadcasting – these are graduated and proportionate and can be calibrated for different situations as required.


While the context may change, what remains is the need for targeted regulation that doesn’t stifle diversity, choice and innovation, and that promotes citizen and consumer interests.


In the dynamic communications environment, assessing the fitness for purpose of regulation is a business-as-usual activity.


To that end, the Minister for Industry and Science, Ed Husic, recently released the Safe and Responsible AI in Australia Discussion Paper to ensure the growth of artificial intelligence technologies in Australia is indeed safe and responsible – recognising that while Australia already has some safeguards in place in relation to AI, it’s appropriate that Australia consider whether these mechanisms are fit for purpose.


Importantly, this consultation provides a whole of government approach that is complementary to work underway in other portfolios in relation to AI developments, specific to various areas of responsibility.


The issues posed by artificial intelligence and copyright are also being considered by the Attorney-General as part of a Ministerial copyright roundtable process, which brings together a wide range of sectors, including publishing, broadcasting, screen, education, research, music, gaming, technology and cultural collections.


And in June, I convened a discussion on the challenges and opportunities posed by generative AI, with a focus on the potential for AI to cause or amplify harms in relation to online scams, misinformation and disinformation, and online safety.


This brought together digital platforms, AI companies, academics, regulators and government and explored the possible role of government, industry, education and standards.


I will continue to work with my department, as well as regulators – the Australian Communications and Media Authority and the eSafety Commissioner – to explore the implications of generative AI in the context of issues within my portfolio.


Australia will not stand idle.


We are acting to address the need for updated regulatory frameworks, including digital platform regulation, on a range of fronts.


The ACCC’s work out of the Digital Platforms Inquiry has been extensive, and the Government is working through a backlog of recommendations and reports left unfinished by our predecessors.


As part of that work, the Government is considering Treasury’s review of the News Media Bargaining Code.


Today, however, I will cover four particular areas of our government’s actions to craft regulatory frameworks that address challenges in the media and communications environment:

  • How we are acting to reduce the spread of seriously harmful mis and disinformation online;
  • How we are protecting our citizens from scams and supporting them to be smarter, safer digital citizens;
  • How we are fostering safer, more inclusive online environments;
  • And how we are working to ensure ongoing access by Australian audiences to our local media services on connected TV devices.

Combating mis and disinformation online  


Electoral processes worldwide and the war in Ukraine show how malicious actors use digital disinformation to infiltrate and influence public discourse.


During the global pandemic, we saw the threat mis- and disinformation poses to public safety, including ridiculous suggestions that drinking or injecting bleach can safely treat a viral infection.


Spread online at speed and scale, the impacts of these harmful campaigns are felt offline.  


Both misinformation and disinformation can cause serious harm – they sow division within the community, undermine trust, and can threaten public health and safety.


Left unchecked, it is our democracy, society and economy that are at risk.


Many digital platforms – often overseas-based companies – take down a range of content, at scale, including misinformation and disinformation, every day of the week.

Yet currently, our regulator has no power to compel digital platforms to provide information on what they are doing in this space.


We are determined to place Australia at the forefront of tackling this growing global challenge by holding digital platforms to account.


This demands action between governments, industry, and the community.

That’s why the Government has released exposure draft legislation for consultation that would lift the hood on industry efforts to target seriously harmful mis- and disinformation on digital platforms.

We want to boost the powers of the ACMA to hold digital platforms to account by strengthening and building upon the industry’s voluntary Australian Code of Practice on Disinformation and Misinformation.


The new information powers will create greater transparency on the effectiveness of the voluntary code and give insights on how industry can continuously improve its own approaches to address mis and disinformation.


The ACMA would have the option to use a graduated set of reserve powers to ask industry to make new, registrable codes or, if necessary, to impose standards.


This would place obligations on platforms to have systems and processes in place – such as greater support for fact checkers, links to authoritative sources of information and better complaints handling processes.


Importantly, digital platforms will remain responsible for the content they host – as is already the case.


The ACMA will not have power to request specific posts be removed, nor will it have a role in determining what is considered truthful.


Professional news, satire, and private messages, among other things, are not within the scope of the proposed powers.


The exposure draft builds upon the work of the previous government and key recommendations made to it by the ACCC in its 2019 Digital Platforms Inquiry, and the ACMA in its 2021 and 2023 reports.


But not only that, it responds to calls from digital platforms themselves for greater regulation.


On 30 March 2019, the founder and chief executive of Facebook, Mark Zuckerberg wrote an opinion piece in The Washington Post, entitled “The Internet needs new rules. Let’s start in these four areas”.


He stated that “Every day, we make decisions about what speech is harmful” and “I believe we need a more active role for governments and regulators”.

And I take this opportunity to affirm that, when it comes to keeping Australians safe online, Australians expect and deserve a mature response from Government and the Parliament.


The overwhelming majority of Australians are concerned about the impact of mis and disinformation.


This concern was once shared by the Opposition and is indeed something the Liberal National Coalition committed to act on when last in Government.


In March 2022, it was my predecessor, the then Minister for Communications, Paul Fletcher MP who first announced his government’s intention to introduce legislation to combat harmful misinformation and disinformation online.


I endorsed his media release at the time – which very well could have gone out in my name.

The Liberal Party actually still has this policy listed in their “Plan for Protecting Australians Online” on their website. It states:


“a re-elected Liberal Coalition Government will introduce stronger laws to combat harmful disinformation and misinformation online by giving the media regulator stronger information-gathering and enforcement powers.”


The opportunity for a constructive, bipartisan approach to keeping Australians safe online remains.


The model proposed in the exposure draft is a well-established co-regulatory framework, under the graduated and strategic risk-based approach of the ACMA.


The definitions and thresholds proposed are high – at the level of serious harm.


As the people in this room well understand, the threshold of “serious harm” is also used in other contexts, including the adult cyber abuse scheme in the Online Safety Act, for example.

And let’s not forget, the introduction of the Online Safety Act was not without controversy.


When I was the Shadow Minister, I worked constructively with then-Minister Fletcher on a number of sensible amendments to improve the Bill.


That’s how responsible Parliaments develop effective regulatory frameworks.


The work across the Parliament on the misinformation and disinformation exposure draft is ongoing, but I note there are in-built checks and balances pertaining to the implied freedom of political communication as well as avenues for appeal.

Consultation on the exposure draft is still open.


We are consulting to ensure the balance is right.


But we will not resile from keeping Australians safe online and holding big tech to account.


Because – as former Chair of the ACCC Rod Sims said in a recent opinion piece in the Australian Financial Review:


“Governments face two choices on these vital but difficult issues… First do nothing and leave it to the platforms themselves to decide whether to do anything at all… or… seek to intervene in some way”.


The former Chair was very clear that to do nothing “is an abrogation of responsibility by government”.

Keeping Australians safe from scams  


Another issue we are tackling head-on is the growth in online scams which are causing unacceptable financial, emotional and societal harm.


Last year, Australians lost more than $3 billion to scams, up 80 per cent from 2021.  Enough is enough.


We are making Australia a hard target through a comprehensive package of anti-scam initiatives, including a new $58 million National Anti-Scam Centre.


Overseen by the Australian Competition and Consumer Commission (ACCC), the Centre will improve cooperation and data sharing across governments, law enforcement and the private sector.


This centralised approach will enable banks to freeze more suspect accounts, telcos to block more scam calls and digital platforms to take online scams down.


In 2022, one third of reports to the ACCC were scams sent via text.


Sender-ID scams trick people into thinking they are receiving legitimate texts from businesses or Government services such as MyGov or AusPost.


This sort of SMS phishing appears legitimate and lures people into disclosing personal and financial information.


Our new $10.5 million SMS Sender ID Registry will help combat this and complements the industry-code SMS scam rules that came into effect last July.

Since then, Australian telcos have blocked a staggering 257 million scam texts – and one billion telco scams in total.

We need to work together to build a nation resilient to this criminal activity and keep Australians safe from scams online.

Fostering safe online spaces

In today’s hyperconnected world, we all deserve to engage in online environments that are safe and inclusive.

Australia’s world-leading regulator, the eSafety Commissioner, is working hard to promote this.

And that is why the Albanese Government has quadrupled ongoing baseline funding to the eSafety Commissioner, to support their crucial work.

When Australians are exposed to harmful content online or experience cyber bullying, for example, they can report it to eSafety to have it removed.

Under wide-ranging powers in the Online Safety Act, the eSafety Commissioner can legally request information from platforms about their efforts to tackle online abuse.

In June, the eSafety Commissioner issued a notice to Twitter about how it is tackling online hate, after receiving extensive complaints.

Legal notices are a tool to promote transparency and accountability and are an important element in the kit when it comes to addressing user concerns.

Enforcing industry codes and standards is another tool of eSafety.

Under the Online Safety Act, eight industry sections must develop codes to regulate illegal and restricted content to protect users. If these codes aren’t up to scratch, the Commissioner can then move to make standards.

The codes are being developed through two phases of work. Under phase one, industry was asked to submit draft codes that address seriously harmful ‘Class 1’ content – such as child exploitation and terrorist material – and submitted eight codes.

The Commissioner decided to register five codes but did not register all of them, after finding that the codes for certain industry sections did not provide appropriate community safeguards for users in Australia.

eSafety will move to industry standards for those industry sections, and a further draft code has also been revised and resubmitted for consideration in order to capture content generated through AI.

Phase two of the codes process will address ‘Class 2’ content which is legal, but not appropriate for children, such as pornography.

Decisions on code registration are regulatory decisions for the eSafety Commissioner, and I encourage online industry associations to continue working with the Commissioner to keep their users safe and to address seriously harmful online content.

The Albanese Government is working to ensure that the Online Safety Act is successfully implemented.

A framework for the digital age

The final issue I want to touch upon, is how a new digital-age legislative framework will ensure local content continues to reach Australian audiences.


The Australian Government is committed to legislating a new prominence framework for Australian TV services on connected TV devices.


A legislated prominence framework would shape the way TV applications and/or content are presented to Australian audiences.


It would seek to ensure that local TV services can be easily found on TV devices, so that they can continue to contribute to Australia’s public and cultural life.


Nothing less than the achievement of our media policy objectives is at stake.


It’s getting harder to find live TV on connected devices. When Australians search their TVs for free content, they get pushed to paid content first.


And it’s increasingly difficult, in a crowded market, for our local TV services to be accessed by audiences.


As Free TV noted in its submission to the consultation on the framework:


“The UNESCO Convention on the protection and promotion of the diversity of cultural expressions requires Australia to protect and promote the diversity of cultural expressions within its territory. FTA television services provide an important, and unique, contribution to Australia’s shared culture”.


As a sovereign nation, it is imperative that we protect and assert our unique cultural values and voices in the global marketplace.


We are working through all the issues raised during consultation carefully and methodically, to get the best outcome for Australians.


The implementation of the framework is a priority for the Government in 2023 and I expect to be in a position to say more on this matter later this year.


Technology and innovation continue to create new trends and market disruptions in communications.


It is our responsibility to ensure the policy and regulatory environment keeps pace.


We must work together with global partners, industry and community stakeholders to achieve this in a way that benefits all.


Today, I have outlined some of the measures the Australian Government is taking to modernise our regulatory settings, achieve our policy objectives and keep Australians safe from serious harms.


These are global conversations, and I look forward to sharing the outcomes of these processes with the IIC in future.


International engagement is critical to the task at hand – and the IIC, as always, remains an important platform for thought-leadership in this ever-evolving environment.


Thank you.