Transcript - 2GB Ben Fordham Live
BEN FORDHAM, HOST: Hey kids, the countdown is on. The Federal Government is on track to introduce a ban on social media for children under the age of 16. The devil is always in the detail. We understand social media apps like Instagram, Facebook, TikTok and Twitter will be included in the ban, we're still not sure about YouTube.
When addressing the looming ban last week, the Communications Minister, Michelle Rowland, stated YouTube would likely fall within that definition as well. But is YouTube a social media platform? And if you ban it, it's going to impact classrooms. So we wanted to get some clarity on all of this. Michelle Rowland, the Minister for Communications, is on the line right now. Minister, good morning to you.
MICHELLE ROWLAND, MINISTER FOR COMMUNICATIONS: Good morning.
FORDHAM: So, will YouTube be included?
ROWLAND: We'll have the definition of what constitutes age-restricted services. YouTube would likely fall within the definition because we already have some definition of what those social media services are under the current Online Safety Act. But what we want to do, Ben, is we're going to have some criteria that will be set out. And this will also be administered by the eSafety Commissioner.
Part of what we want to do, as well, is encourage platforms to develop low-risk services. And they're ones, for example - YouTube Kids - that could be one that could be a candidate for being within those exemptions.
But we'll have the legislation presented to the Parliament next week. There will be some instruments that will set out criteria, because as your listeners will understand, the technology develops very rapidly. But we will also have exemptions in there for education and health purposes. And when your listeners are able to see what this definition encompasses, I think they'll understand we really want to do two things: firstly, to make sure that we capture those specific social media platforms; but also, because we understand some of these apps do have very useful benefits for parents and for children for good purposes, to ensure that we don't exclude those from being able to be accessed.
FORDHAM: Okay, you say messaging and gaming services will not be in scope of this definition. So where does Snapchat fall?
ROWLAND: Well, Snapchat, under the Online Safety Act, depending on how it's defined, could fall within that definition. But we will go through in a methodical way through having these criteria, and the eSafety Commissioner applying what will be a very transparent process.
Then it will also be subject when we have these instruments, this will be subject to disallowance by the Parliament because we understand there's a need for transparency. But all of this will be in the legislation that we present to the Parliament next week. I know that it will be an area where, over the next 12 months, as this is implemented, we will need to take really good consultation on because we want to ensure it's capable of being implemented. But this is one where I can assure your listeners, the public and also very important advocacy groups, including young people, will be able to have a look at what we are proposing and why this is important to get done.
FORDHAM: If you're banning Instagram, Facebook and TikTok and Twitter, you've got to ban Snapchat, right?
ROWLAND: Well, that is an argument that has been put. We are very prepared to go through having a process of criteria and seeing how this fits against it. Also, as your listeners I think will understand, some of these platforms do present themselves in different ways. They will argue, for example, that they are messaging services and not social media services. But we need to assess that objectively against a transparent set of criteria.
FORDHAM: Okay. You say the Federal Government will impose strong penalties on platforms that breach their duty of care. So what are the strong penalties?
ROWLAND: Well, currently under the legislation, your listeners may be stunned to learn that those penalties currently are less than a million dollars. And that really is out of step with what we have in consumer protection legislation, for example. So we have taken advice through the review that we've just had done - an independent review of the Online Safety Act - that had in its terms of reference penalties as well. So we will have a penalty regime that is effective. And again, this will be in the legislation when it's presented to the Parliament.
FORDHAM: Okay. How will children prove their age?
ROWLAND: This is why we're conducting the Age Assurance Trial at the moment. Age assurance is an area where technologies are developing rapidly, and there are a number of methods. But part of our trial has been looking at the effectiveness. So the willingness of the platforms to take them up, the willingness of people to be able to use them, and how they fit within really solid privacy criteria.
Now, some of your listeners, I think, will be familiar, when you do online shopping, for example, you'll often get to a site where your bank will send you a one-off code. That can be used, for example, as a sort of a model for what we may call blind tokens or third-party verification. So, we want to ensure that we balance those privacy concerns. Age assurance also has- in the technology that's being developed, there are some biometric examples of that. But we don't want to mandate a specific technology. We want to set a specific set of criteria that needs to be satisfied to show that the platforms have taken reasonable steps.
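The "blind token" or third-party verification model the Minister describes can be sketched in code. This is an illustrative sketch only, not any platform's or verifier's actual implementation: a trusted verifier (such as a bank) checks the user's age and issues a one-off signed token asserting only "over 16", and the platform checks the token without learning the user's identity. All names here are hypothetical.

```python
# Sketch of a "blind token" age-assurance flow: the verifier attests
# "over 16" without disclosing identity; the platform checks the token.
# A real scheme would use public-key signatures so the platform never
# holds the verifier's secret; HMAC with a shared key stands in here.
import hashlib
import hmac
import secrets

VERIFIER_KEY = secrets.token_bytes(32)  # held by the trusted verifier


def issue_token(is_over_16: bool):
    """Verifier side: issue a one-off token only if the age check passes."""
    if not is_over_16:
        return None
    # A fresh nonce per token prevents reuse and cross-site linking.
    nonce = secrets.token_bytes(16)
    tag = hmac.new(VERIFIER_KEY, b"over-16:" + nonce, hashlib.sha256).digest()
    return nonce, tag


def platform_accepts(nonce: bytes, tag: bytes) -> bool:
    """Platform side: verify the assertion; learns nothing else about the user."""
    expected = hmac.new(VERIFIER_KEY, b"over-16:" + nonce, hashlib.sha256).digest()
    return hmac.compare_digest(expected, tag)
```

The privacy property the Minister refers to is that the token carries only the yes/no assertion, never a name or date of birth, so the platform can show it has taken reasonable steps without collecting identity documents.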
And just on the last point of age assurance, I had explained, previously, that when we were coming to this decision around 16 as being the minimum age for access, we were looking at an age range of somewhere between 14 and 16, because there is some age assurance technology that can estimate within a couple of years - sometimes accurately, but sometimes not, depending on someone's ethnicity or gender, for example.
So, this technology is developing rapidly. We don't want to mandate a specific one, but we do have 12 months in that implementation timeframe to make sure that we have the most robust systems.
FORDHAM: Okay. A lot of detail yet to be sorted out. We're talking to Michelle Rowland, the Communications Minister. We'd like to discuss the misinformation laws. I know that Australians, including myself, are concerned about the misinformation bill that Labor has introduced to Parliament. Is the point of this bill to minimise the amount of content that Australians can see online?
ROWLAND: Not at all, Ben. We know that 80 per cent of Australians are concerned about the amount of mis and disinformation that proliferates online, and when you consider the speed and scale of this, this is a problem. It's a problem that's been identified by our top spy- by our top Defence Force personnel, and disinformation really has the capacity to cause harms to our democracy. It can often be spread by malicious actors or rogue states …
FORDHAM: Okay. So some are saying that this acts as a recalibration for a whole range of human rights playing out online, including freedom of speech. Do you agree?
ROWLAND: I don't, and the reason for that is because we have undertaken very thorough consultation, including a very thorough human rights assessment, and an explicit statement within the bill that goes precisely to that issue of human rights.
Ben, I think it's important to emphasise that the existing content moderation policies that the platforms do – and these are the people, these are the institutions who are not passive purveyors of content, they are active pushers of this content – they have moderation policies that are opaque.
Your listeners, as users of these platforms, don't know how they're curated. It's important for your listeners to have that level of transparency. And what we are talking about here is the systems and the processes of the platforms, not individual pieces of content. It's not a takedown regime, except in cases where there is clear disinformation spread by inauthentic behaviour like bots and troll farms.
FORDHAM: Sure. Minister, you've got a bit of an issue, because I've just put two statements to you that you have disagreed with. Both statements come from your eSafety Commissioner, Julie Inman Grant.
ROWLAND: Well, I should be very clear that this is building on a voluntary industry code that is in place right now. What we're talking about here are the systems and processes. We are not talking about individual pieces of content …
FORDHAM: Sure. But can you see the point that I'm making? I just want to play this for you for a moment. This is your eSafety Commissioner Julie Inman Grant, saying that this is about minimising the amount of content Australians can see.
[Excerpt]
FORDHAM: And the comment about a recalibration of a whole range of human rights online, including freedom of speech.
[Excerpt]
FORDHAM: So, you're at odds with your eSafety Commissioner?
ROWLAND: Well, firstly, I would say that eSafety – and we've been working very closely with eSafety on a range of matters that go to online harms – as to the context of those statements, Ben, I will say, I do not know precisely what was said before or after them.
But what I can tell you is that this is a scheme that builds on a voluntary industry code that is in place right now, that has been criticised by successive governments because it lacks teeth and because the reporting processes under it have been inadequate. And I would …
FORDHAM: Okay. Let me jump in with some quick questions, only because we've got five minutes. So who decides- who judges what is and isn't misinformation? If we could have short answers.
ROWLAND: The platforms.
FORDHAM: The platforms decide?
ROWLAND: This is what they decide right now. They have their own systems and processes for curating what content is shown, but also how they respond to mis and disinformation.
FORDHAM: Okay, so they'll be making the decisions under your bill?
ROWLAND: They currently already make these decisions.
FORDHAM: Yeah, but under your legislation it'll be left up to them?
ROWLAND: They will continue to be doing that. The Government does not have control over what these platforms decide to serve up. But what is important here is their own systems and processes for identifying and dealing with mis and disinformation. Sometimes they will tag certain content. Sometimes they will put them further down in a list. Sometimes they will …
FORDHAM: Sure.
ROWLAND: As I said, label this content. They'll still be doing that. What this bill is doing, and I'll [indistinct] you on this point, the bill will bring transparency to what the platforms are doing in their content moderation right now.
FORDHAM: Okay. We've heard about potential jail sentences. Under what circumstances would someone go to jail?
ROWLAND: Well, this isn't about individuals going to jail, Ben. That is just false.
FORDHAM: So, who would go to jail?
ROWLAND: Well, your users aren't going to jail under this. This purely goes to the conduct of the platforms and their systems and processes.
FORDHAM: Okay. I understand my listeners won't go to jail. Who might go to jail?
ROWLAND: Well, none of your listeners are going to go to jail here, Ben. But there is a graduated enforcement scheme that's under the act that has penalties in it. But again, this goes to the systems and processes of the platforms themselves, not individual users. This does not go to individual pieces of content. And again, I should stress this is a bill that was brought in, building on the previous Morrison government's assessment that something needed to be done in this area – that we needed to codify the industry code of practice as it is right now. It's exactly the policy that Scott Morrison took to the last election. People want action on this now, and it's important that we get this done.
FORDHAM: I'm not sure that they do want it. I mean, you have a look at the people who've come out against it – James McComish from the Victorian Bar; the media lawyer Justin Quill; the constitutional law expert Anne Twomey; the former New South Wales Supreme Court judge and anti-corruption advocate Anthony Whealy; the Sydney barrister Jeffrey Phillips SC; the Victorian barrister Peter Clarke.
And look, I'd say on behalf of my listeners, because this is the feedback I'm getting, Minister, just to close things off, I think we'd rather be offended, or we would rather read something that may not be true, than have someone decide what's misinformation or disinformation.
ROWLAND: Well, I'll end on this point, if this is your last question. The point here is that we are talking about a high threshold of serious harms. Where misinformation, for example, causes people to tear down mobile phone towers in regional areas – that can mean people can't call Triple Zero – that is a serious harm, and that is due to the scale, speed and spread of misinformation. That is what we are attempting to deal with here, to protect Australians from harms.
FORDHAM: And just lastly and briefly, politicians will be subject to these laws too?
ROWLAND: Yes, they will.
FORDHAM: We appreciate your time.
ROWLAND: Pleasure.