Address to the National Press Club

I begin by acknowledging the traditional owners of the land on which we meet - the Ngunnawal and Ngambri people - and pay my respects to Elders past and present.



I particularly extend that respect to any First Nations people joining us, including Ms Dot West, OAM, and Dr Lyndon Ormond-Parker, the Chair and Deputy Chair of the First Nations Digital Inclusion Advisory Group.



This is my first address to the National Press Club and it is the greatest privilege to serve as Minister for Communications in the Albanese Government: shareholder Minister for the two great Government Business Enterprises, NBNCo and Australia Post; custodian of the national broadcasters, the ABC and SBS; the telecommunications sector; the broadcasting sector; the media industry; classification; and, of course, online safety.



In choosing to share with you today my reform program and initiatives to improve online safety, I also want to acknowledge some facts:



While this issue is of great significance to all Australians, I note the recent research commissioned by my department which shows that, at 58%, Aboriginal and Torres Strait Islander people are far more likely to experience online harms in general than the non-Indigenous population, at 32%.



58% compared to 32% for non-Aboriginal or Torres Strait Islander Australians.



Both of these figures are unacceptable, as is the gap between them. I am dedicated to addressing these harms and improving digital inclusion for First Nations Australians more generally.



Introduction



Growing up in Lalor Park, near Blacktown, in the 1970s, I consumed all aspects of communications that existed at the time: I watched the one public and three commercial channels on the TV; I used a rotary dial telephone at home and always carried 20 cent coins in case I needed to use a payphone; mail arrived in our letterbox; and I made mix tapes from my favourite radio stations.



But the moment when communications would really impact on my daily life was around 6.00pm every weeknight. That’s when my dad, Circulation Manager at John Fairfax & Sons, would walk through our front door with a bag of newspapers from around Australia and elsewhere. It was a window into a new world of opportunities and ideas.



I would pore over these newspapers for hours, like many of us scroll on our various devices today.

Australians have had access to the internet for over 30 years now.



By the 1990s, telecommunications was our fastest-growing industry and Australia had the second-highest internet take-up rate, just behind the United States. And since then, Australians have consistently been early adopters of new technology.



We know that communications and connectivity are drivers of productivity and can be a game-changer for families and small businesses, particularly in regional areas. 



For outer metropolitan areas, in which I naturally take a keen interest as the representative of some of the fastest-growing suburbs in Australia, the impact on liveability, access to opportunity and household productivity is life-changing.



This is why I am passionate about communications, and why I am so fortunate to have worked in the sector for 23 years as a legal practitioner, legislator and now, as Minister.



Much has changed in that time, but when it comes to how I think about the fundamentals of policy and law, some principles hold true: regulation should be proportionate, and not stifle innovation.

But most importantly, it needs to be human-centred.



Over the years, network and software engineers have taken the time to explain to me how things work in a technical sense – everything from packet-switching, to Wi-Fi, to ChatGPT – and the simple foundational fact: within a complex network of networks, there is rarely a neat solution to a challenge.



While the technology itself is complex, that pales in comparison to the interactions and collaborations enabled by it, and the deeply human impacts this has – both positive and negative.



While technology can offer a young person connection to community, and opportunities to build their confidence and sense of identity, it can also present difficult and potentially harmful experiences.



And just as the technical problem-solvers taught me, with policy making in the online space, there is rarely a single solution.



We can’t put up the great firewall of Australia, or switch off every service that facilitates a bad experience.

But it is not beyond our wit to make things safer.



The most critical task I have is to support innovative communications service provision that is in the public interest: that helps lift people up and make the country better, while minimising harms and respecting community standards.



So today I want to do three things.

  • First, I will share what we are learning about new and emerging harms.
  • Second, I will provide an overview of how our current frameworks are performing.
  • And finally, I will outline the initiatives we are undertaking in this part of the portfolio to make our online spaces safer, more inclusive and cohesive.

The internet and evolving harms



In 1993, a Senate report found there were ‘complex regulatory problems’ involved with the availability of pornographic images on computer bulletin boards accessible by a home computer and modem.



In announcing a taskforce investigation in response, the Labor Government of the day noted:



“We are in an era where children operate computers as easily as their parents rode bicycles.



We cannot allow advances in technology to overtake the legal and law enforcement measures designed to protect them, in particular, from undesirable material”.



The same statement could be made today.


Three decades later, through social media posts, direct messages, stories, snaps and more, the vectors for harm have not only expanded, but their ability to scale has increased.



At the same time, companies with net worths exceeding the GDP of some nations have also been making decisions that impact billions of users.



By sheer size, market dominance and influence, these platforms are also the site of a huge information asymmetry and power imbalance.



Many platforms have taken on some responsibility, establishing terms of service and content policies to address online harms. But it is clearly not enough.



Perpetrators continue to use these platforms to do harm.



As David Kaye, former UN Special Rapporteur, said of the digital platforms:



“They have become institutions of governance, complete with generalized rules and bureaucratic features of enforcement. They have struggled to figure out how to police content at the scale to which they’ve grown.”



Platforms are a regulated space



This is not to say the online world is an ungoverned space. It has been regulated for some time. Australia has actually been at the forefront of efforts to regulate the internet – to assert our sovereign values and expectations in the online environment.



However, our regulatory framework has been incomplete.



Privacy is a prime example.



When the offer of free email addresses appeared in the mid-1990s, first with Hotmail and Yahoo and eventually everyone else, there was hardly any resistance. We were excited by the novelty, despite the fact we were handing over a significant amount of personal information.



Today, we’ve lost count of how many times we’ve clicked “agree” to lengthy privacy notices as we download the next new app.



A global game of catch up is now occurring on a range of fronts – including competition, privacy, and online safety.



I note my colleague, the Attorney-General, is taking forward a critical piece of work to update Australia’s privacy framework for the digital age.



Anchors to approach



When it comes to online safety, my approach has three anchors:



First, addressing the power and information imbalance by increasing the transparency and accountability of platforms and services.



And empowering our regulators to hold them to account. For example:

  • The Basic Online Safety Expectations Determination under the Online Safety Act empowers the eSafety Commissioner to require online service providers to report on how they are addressing user safety.
  • Another example is the Government’s proposed Combatting Misinformation and Disinformation Bill. It would empower the Australian Communications and Media Authority to “look under the hood” of the digital platforms and improve transparency about the systems and processes they have in place to protect Australians, including how they comply with the voluntary code they signed up for.
  • I acknowledge two regulators in my portfolio who are here today – the eSafety Commissioner, Julie Inman Grant, and Authority Member Adam Suckling, representing the Chair of the ACMA, Ms Nerida O’Loughlin PSM.

To ensure reasonable protections are built into the platforms, we need transparency around industry’s processes and actions.



I’ll return to this theme later in the speech.



Second, acknowledging that the way harms manifest in the online environment is different to how they manifest offline - and that informs policy settings.

  • There is a difference between a young person having a bad experience in the schoolyard, and being subjected to a deluge of round-the-clock abuse and harassment over social media.
  • There is a difference between visiting a local pub and placing a bet, and being home alone, betting online.
  • There is a difference between how information spreads when it is shared between neighbours, and when it goes viral on social media.

This is not to say one is more benign than the other. Just that it is critical we understand these differences so we can be intentional about the appropriate regulatory settings that should apply.



Third, together with industry we need to lean on researchers and experts as they better understand associated impacts on human physiology and psychology. And the evidence is concerning.

  • The ASIO Director-General’s Annual Threat Assessment released last year highlighted that the number of minors being radicalised is getting higher, while the age of the minors being radicalised is getting lower, with children as young as 13 now embracing extremism – and critically, that most of the radicalisation occurs online.
  • A recent Flinders University study of students in Years 7 to 9 found that the time those young people spent using image-based platforms such as Instagram and Snapchat was associated with significantly higher levels of disordered eating behaviours.
  • And in relation to online scams, older Australians continue to lose more money than other age groups. According to Scamwatch, in 2022 people aged 65 and over made the most reports and lost the most money of any age group, with $120.7 million in reported losses.

Understanding these very real impacts of online harms will help industry build in the right user empowerment controls, education and supports to keep people safe.



While today’s address focuses on the industry and regulatory responses to online safety, it is also critical we invest to empower citizens with the digital and media literacy skills they need to navigate the online environment.



In our Budget, we provided $6 million over three years for online learning tools, through a partnership with the Alannah and Madeline Foundation.



By 2025, every primary and secondary school in Australia will have the opportunity to engage with the Foundation’s digital literacy products – for free.



Industry has strong incentives to create safe products and services for users.



Virtually all the evidence demonstrates a virtuous cycle here – the safer your product, the better the experience for consumers, which improves reputation and brings all the benefits that ensue.



And because industry knows their own systems, they are best placed to build safety in from the outset.

I acknowledge the great work done by the eSafety Commissioner with industry to support safety by design.



Meanwhile, Governments have a strong incentive to balance industry innovation with the promotion of safety.



The Online Safety Act and eSafety



The existing framework under the Online Safety Act provides a backstop when industry has failed to remedy harm.



It also seeks to prevent harm through education and awareness raising and by putting requirements on industry to take proactive measures to promote safety.



The Act, which came into effect in January 2022, gives eSafety the power to take down harmful content, such as cyberbullying material targeting children, intimate images shared without consent, cyber abuse targeting adults, and illegal content.



It also requires industry to develop mandatory codes that address illegal and seriously harmful content, and protect children from age-inappropriate material.



And eSafety can develop mandatory industry standards if the Commissioner determines the draft codes don’t provide appropriate community safeguards.



Just this week, eSafety opened consultation on two draft standards covering apps, websites and storage services, as well as messaging services, online dating services and gaming.



In the last financial year, eSafety facilitated the removal of image-based abuse material from more than 6,500 locations. They held 225 workshops supporting Australian women online, and had over 300,000 visitors to their learning portal to support older Australians.



That’s why, in the Budget, the Albanese Government quadrupled ongoing baseline funding for eSafety – including by rolling terminating measures into base funding.



The Regulator now has the stability and certainty it needs to carry out its critically important work.



International cooperation



Governments around the globe are dealing with the question of effective digital platform regulation and how to grapple with new risks and harms.



International cooperation remains an important feature of internet governance.



We saw this most recently at the AI Safety Summit in the UK, where Australia, along with the EU and 27 other countries, became a signatory to the Bletchley Declaration, affirming that AI should be designed and developed in a manner that is safe, human-centric, trustworthy and responsible.



And I am pleased to share today that Australia is in the advanced stages of establishing a new online safety and security memorandum of understanding with the United Kingdom – a first between our two countries.



The MOU will step up our bilateral engagement across a range of issues – from child safety and age assurance, to technology-facilitated abuse, user privacy and misinformation, and responding to emerging technologies like generative AI.



This form of engagement means we can share and learn from our close allies to ensure our regulatory interventions are measured, targeted and evidence-based.



Dating apps



Turning back to our own shores, the Government has been focused on addressing some immediate gaps in our online safety framework.



Earlier this year, I convened the National Roundtable on Online Dating Safety in response to concerning research about abuse or harassment occurring over online dating platforms.



The Roundtable brought together advocacy groups, state governments, victim-survivors, law enforcement, and dating sites – major global platforms including Bumble and Match Group.



This was the first time some of these dating apps had engaged with the Government.



Even before the event took place, platforms started to make announcements about new safety features.

However, following an information request I issued to industry in March, it was evident there were still gaps in safety policies and processes.



That’s why, in September, I formally requested that dating platforms operating in Australia develop a new voluntary industry code to better protect Australian users.



Dating industry representatives met earlier this week to progress this work – with support from my department – and I look forward to an update on next steps.



The truth is, the dating apps are offering a hugely popular service – it’s now the most common way to meet someone in Australia. But Australians need to be able to do so safely.



I have been clear that the code is to be in place by the middle of next year, and industry knows it must improve the safety of Australians – or we will regulate them.



Basic Online Safety Expectations

Setting clear expectations and driving transparency around industry’s actions is a fundamental component of Australia’s online safety approach.



The Basic Online Safety Expectations Determination sets out the Government’s core expectations for the online industry and provides a clear message about our priorities, and the steps industry can take to address them.

The eSafety Commissioner can require online service providers to report on how they are meeting any or all of the Expectations.



Since last August, eSafety has issued three rounds of reporting notices to 12 companies, covering 27 different services – shining new light on what they are doing to counter child exploitation and abuse, and hateful conduct.



Since issuing these notices, we have seen some of our most popular platforms introduce new safety policies to address this horrific material.



This demonstrates the efficacy of the framework in putting pressure on industry to do more of their own accord.



The information received through these reports also gives Government a critical evidence-base about the need for further reforms.



But it’s clear there is more industry can do on harms not explicitly set out in the current Expectations.

That’s why, today, I am announcing that the Albanese Government will commence consultation to strengthen Australia’s Basic Online Safety Expectations – to address gaps and emerging harms, and to further clarify the Government’s expectations of industry.



Under the proposed changes, services using generative AI would explicitly be expected to proactively minimise the extent to which AI can be used to produce unlawful and harmful material.



We are also proposing a new expectation that industry consider the best interests of children in the design and operation of their services. This could include implementing appropriate age assurance mechanisms and technologies to prevent children from accessing age-inappropriate material.



The Government received the eSafety Commissioner’s Age Verification Roadmap earlier this year. The Roadmap noted that the age assurance industry and associated technologies are new and still evolving and come with privacy, security and implementation risks.



However, it is also clear that some age assurance technologies and complementary measures are being deployed effectively across commercial platforms to prevent a range of harms to children.



We want to see industry continue to develop and invest further in these technologies.



Consultation on the proposed amendments commences today and I invite submissions before 16 February 2024.



Misinformation and disinformation



I also want to touch on another proposed reform in my portfolio that addresses a gap in our existing regulatory framework – the exposure draft bill to tackle seriously harmful misinformation and disinformation on digital platforms.



The overwhelming majority of Australians are concerned about misinformation – nearly 70 per cent, according to the Digital News Report by the University of Canberra. 



The Government’s draft bill, which was released for public consultation, would empower the ACMA to hold digital platforms to account with new information gathering, record keeping, code registration and standard making powers.



This would improve transparency about the decisions platforms already make about harmful content on their services.



Digital platforms are not passive. Social media companies manage content on their services – every single day.



Australians deserve to know more about what content is flagged, demoted or taken down. 



This is about better understanding the systems and processes industry have in place.



Spread online at speed and scale, both misinformation and disinformation can cause serious harm – sowing divisions within the community, undermining trust, and threatening public health and safety.



Indeed, the potential for the spread of AI-generated mis- and disinformation makes taking action more pressing.



The risks here were well understood by my predecessor Paul Fletcher, who announced that his Government would legislate to give the regulator new information gathering and enforcement powers in this area.



We supported him at the time and remain committed to the policy. And for good reason.



We are talking about harmful misinformation spread during a global pandemic that claimed millions of lives, and about cancer misinformation that prevents a patient from seeing their doctor during the critical time when recovery is still possible.



And disinformation spread by bad actors who want to undermine our democracy.



It was recently reported that the Chief of the Defence Force said disinformation operations have the potential to “fracture and fragment entire societies”.



As former ACCC Chair Rod Sims wrote recently, doing nothing in this area would represent an abrogation of responsibility by Government.



I thank stakeholders for the feedback provided during consultation. We are working through those submissions in earnest – in particular the feedback on definitions and exemptions – and expect to make some changes to the draft bill before it is introduced to the Parliament next year.



Online gambling harms



A solid evidence base is important to inform Government’s approach to addressing online harms, particularly for those most vulnerable.



In the last twelve months I have introduced several gambling harm minimisation measures, including:

  • launching BetStop, the National Self-Exclusion Register;
  • agreeing new mandatory minimum classifications for games with gambling-like features; and
  • introducing legislation to ban the use of credit cards for online wagering.

On BetStop, since its launch less than three months ago, nearly 13,000 Australians have already self-excluded from online wagering.



Over half of these registrants are between the ages of 18 and 30, and 40 per cent have elected to self-exclude for a lifetime. This is a critical protection when it comes to helping people stay safe from gambling-related harm.



This is in addition to the Government’s work to implement consistent gambling messaging; wagering staff training; and mandatory customer pre‑verification.



But more needs to be done.



That’s why, in September last year, we referred an inquiry into online gambling harm and its impacts to a House Committee, which has now made 31 recommendations that are informing the Government’s consideration of further gambling reforms.



In relation to gambling advertising, I am, like many Australians, concerned about the extent and impact of gambling ads.



I have been working with my department to understand how we can practically implement reforms and engaging with a wide range of stakeholders – media, digital platforms, sporting codes, and harm reduction advocates. 



We all know that the online gambling industry has grown and changed in recent years and that some sporting codes now rely heavily on online gambling revenue.



But my priority is clear, and that is harm reduction.



We haven’t made final decisions yet, but I appreciate the constructive input from all stakeholders as we consider next steps.



I, along with the Minister for Social Services, look forward to outlining our agenda to further address gambling harms.



Updating the regulatory framework



As we near the end of my address, I want to reflect on a point I made earlier.



New harms have emerged and some known harms have intensified.



We need to ensure our legislative framework does more than just play catch-up, and that it incentivises platforms to have protections built in.



While the Online Safety Act provides protections for individuals who have been targeted by seriously harmful online abuse, there is no mechanism to address harmful abuse directed at communities on the basis of their religion or ethnicity.



There is deep concern across the community about the way hateful language spreads online – including recent reporting about the rise in anti-Semitic and Islamophobic rhetoric.



Over the past two years, it has also become harder to distinguish between AI-generated images and genuine ones.



And while this technology has incredible potential as a positive tool, it has also been used to create images designed to humiliate, embarrass or even abuse others.



Australia needs our legislative framework to be strong, but also flexible enough to respond to an ever-evolving space. Earlier this year, I said I would bring forward the statutory review of the Online Safety Act by a year.



Today, I am pleased to announce that I have appointed Ms Delia Rickard PSM to lead a comprehensive review of the Online Safety Act, with public consultation to commence next year.



Ms Rickard has extensive experience in regulating consumer harms and is highly regarded for her efforts to promote a safer online environment through regulating scams.



As the Deputy Chair of the ACCC for more than 10 years, she witnessed first-hand the changing digital landscape and how the vectors for harm have become increasingly sophisticated. 



The review will be broad-ranging, and include consideration of the overarching policy objectives of the Act.



It will look at the effectiveness of the complaint schemes, the operation of the regulatory tools, and identify gaps in the legislation.



The Review will consider whether regulatory approaches emerging internationally – including the Duty of Care approach progressing in the UK – should be adopted in the Australian context.



I welcome the involvement of all interested parties as Ms Rickard undertakes this work.



Conclusion



Back in the 1970s, when I was immersing myself in news and content – the analogue way – my parents didn’t worry so much because they knew and understood what I was looking at.



Today our children get online, and parents and legislators are rightly concerned.



Because we know that the content kids can access and the connections they can make through games, video sites, and social networks are vast and too often unchecked.



The way Governments and regulators go about keeping citizens safe from online harms cannot be set and forget.



The Albanese Government is committed to realising improvements to communications to make Australians’ lives better and expand opportunity, supporting people to connect, work flexibly, be entertained, transact, do business, learn and play – safely.



It must be an ongoing, careful exercise, delivered together with industry, in step with changes in technology, and in partnership with health practitioners, experts, parents and communities.

In the era of the digital giants, it is more important than ever that Australia asserts its values and the community standards we expect online.



That we reaffirm our shared responsibility to protect citizens and consumers.



That way, we can fully realise the benefits of the digital world.