Interview with Chris O'Keefe, 2GB Sydney

CHRIS O'KEEFE, HOST: Michelle Rowland is our Federal Communications Minister, and she's on the line. Minister, thank you for your time.



MICHELLE ROWLAND, MINISTER FOR COMMUNICATIONS: Pleasure.



O'KEEFE: Is it really wise to hand a group of public servants the power to decide what's true and what's not?



ROWLAND: Well, to be clear, Chris, what this is, is the ability to hold the Big Tech platforms to account for their own policies. Big Tech currently has a Voluntary Code of Practice dealing with mis and disinformation, but that has been found wanting in a couple of areas: the transparency of their systems and processes, and the accountability of the digital platforms.

 

What is sought through this exposure draft is a scheme where we can ask the platforms what they are doing to comply with their own rules, and make them accountable for their own actions. It doesn't actually go to individual pieces of content. I should be clear that, as is currently the case, it will be the platforms that continue to moderate the content on their own platforms - it is not for the Government or the Regulator to look at individual pieces of material or posts; that will continue to be within the purview of those platforms. What we're seeking to do here is to hold the platforms to account for what they say they are going to do.



O'KEEFE: Okay. Let's take Facebook, for example, and their Code of Conduct. What would you like to see Facebook or Meta doing better?



ROWLAND: Currently they don't have very effective complaints mechanisms, and that's something that has been found by the ACMA in its own enquiries. I regularly have people asking me why, for example, they've been deplatformed, and there doesn't seem to be any transparency around that. A better understanding of the kinds of policies, systems and processes that they apply is important.



Also, I should point out that we have been concerned for some time with X, formerly known as Twitter, for example, and its inability to respond to requests from regulators for information, including in relation to eSafety. That is a problem, and it's a problem because they appear not to have any Australian staff at the moment. But again, all of this goes to their systems and processes, not to the actual content posted on them.



O'KEEFE: But isn't ‑‑



ROWLAND: ‑‑ what we are concerned about is that transparency.



O'KEEFE: Understood. But when you've got a misinformation and disinformation bill, and in that bill, the Exposure Draft Bill that you've released, there's a definition of what misinformation or disinformation is: false, misleading or deceptive content, or misinformation that is disseminated with deceptive intent. That is a pretty broad brush for these social media platforms to police.



ROWLAND: That's currently how it is defined in the industry's own Code, and the definition of harm is very similar to the harm threshold that we have under the cyber abuse and cyberbullying regime in eSafety, for example. But I should stress that this exposure draft process is ongoing. I'm very grateful to a lot of organisations who've taken the time to make submissions on this.



It is a situation, Chris, and I was only reading this in the Nine papers in the last couple of days, where questions are being raised about why governments aren't doing more, why governments aren't working harder to balance the needs of keeping people safe, protecting freedom of speech, and ensuring that these enormous tech platforms are held to account.



This is what we're seeking to do here. I met this morning, for example, in my own electorate in Western Sydney with the Australian Christian Lobby (ACL), and it was a really constructive engagement. Some of the issues they raised with me were about how we can ensure that matters of religion and free speech are preserved, and how we balance any unintended consequences. I thought it was very worthwhile to listen to these particular views, and a number of religious groups have made the same point.



But again, we are open in relation to getting these definitions right. And I want to make clear that there are some submitters who say this bill is right ‑‑



O'KEEFE: Sure.



ROWLAND: ‑‑ some who say it goes too far, and some who say it doesn't go far enough.



O'KEEFE: What is the ultimate goal here though, from the Albanese Government, and from you as the Minister? In plain language, what do you want to see?

 

ROWLAND: To keep people safe.

 

O'KEEFE: Safe from what?

 

ROWLAND: Unfortunately, from information that could cause harm to either themselves or others. To give an example, Chris, something that is disseminated online, spreading at speed and scale, that says, "You should drink bleach to treat a viral condition", is something that we need to protect people from. If we have people ‑‑

 

O'KEEFE: Isn't that the role of education though?

 

ROWLAND: There's no silver bullet here, Chris, and it's an issue that regulators around the world are really grappling with at the moment.

 

O'KEEFE: 'Cause it goes ‑ it's a bit book‑burnie, you know what I mean? Like if you're sort of starting to censor information streams rather than educate people to censor it themselves, you're starting to get into the weeds here.

 

ROWLAND: Thanks Chris. Just to be clear, we are doing both. Under this Government we have invested heavily in digital literacy, delivered for free through our schools by reputable organisations like the Alannah & Madeline Foundation. We have quadrupled funding to the eSafety Commissioner, which has a range of resources and also the powers to make sure people understand where they can make complaints. And we are continuously seeking to ensure our laws are fit for purpose as new harms emerge.

 

When we talk about mis and disinformation, we tend to think about the written word, but in the era of generative AI we have technology that can mimic voices, images and videos. That becomes particularly harmful. So, we need to always be forward-looking here. But again, we are at the exposure draft stage. This was a policy that Scott Morrison had before the election as well. We picked it up because we think that the safety of Australians is paramount. We've always worked constructively in this area, and it's traditionally been an area of bipartisanship.

 

O'KEEFE: Not in this case though.

 

ROWLAND: Well, that is a choice for the Opposition. I would point out ‑‑

 

O'KEEFE: You can understand why though, right?

 

ROWLAND: Well, I can't, because they still have it on their own website that they will legislate to deal with mis and disinformation.

 

O'KEEFE: Well, they're not. They're not in this instance. Just before I let you go, Minister, I want to ask: from what we saw during the pandemic, it's pretty clear that there is now a fair distrust of media, of government and of institutions. When government starts getting its hands, its mitts, on information, what do you think that makes the public think?

 

ROWLAND: Well, it means that governments should always strive to get the balance right in these situations.

 

O'KEEFE: Because the government's not always factually correct, and we saw that through COVID time and time and time again.

 

ROWLAND: At the same time, Chris, we really need to balance these risks that we have, and they include risks that have been brought to our attention by our National Security Agencies. We have, unfortunately, bad actors ‑‑

 

O'KEEFE: Sure.

 

ROWLAND: ‑‑ and people who seek to harm Australians. Doing nothing is not an option. Getting it right is what we should be doing.

 

O'KEEFE: We also can get it wrong, because ‑ weapons of mass destruction in Iraq, as we saw ‑ you get my drift. I understand what you're trying to do. I think it's noble, I just ‑ I worry about the consequences. Minister, thank you so much for your time anyway.

 

ROWLAND: Pleasure.