SARAH FERGUSON: Anika Wells is the Minister for Communications. She joined me from Parliament House. Anika Wells, welcome.

MINISTER FOR COMMUNICATIONS AND SPORT ANIKA WELLS: Good to be with you.

FERGUSON: Now going first of all to the reversal on YouTube, your predecessor Michelle Rowland said in November that while YouTube functions like social media, it was exempted from the ban because it enables young people to get the education and health support they need. What changed?

WELLS: Fresh evidence from the eSafety Commissioner about research that she did with Australian children from December last year, so after those laws were legislated, that said 37 per cent of Australian kids, or almost four in 10, had suffered their most recent or most impactful online harm on YouTube. So, under those laws that Minister Rowland drafted, I had to ask the eSafety Commissioner for her advice about the draft rules, and I had to give that advice regard. So, her recommendation was that YouTube form part of the social media ban because that online harm, on balance, was more impactful than the benefits of health and education. 

And I’d also point out, Sarah, that kids will still be able to use YouTube Kids, and parents and teachers will still be able to use YouTube links in a logged-out state to access health and educational videos on YouTube. This is about kids having their own logins.

FERGUSON: Let's talk about the logins because, as you just said, it doesn't stop young people from accessing YouTube without an account. So, what sort of content will a young person still be able to view on YouTube without an account?

WELLS: Well, YouTube says that you can't view age-restricted content in a logged-out state. So, per YouTube's own policies, which you can see on their website now, if you are in a logged-out state you should not be able to see anything that is inappropriate for children. So, we take that, we accept that. But if that is not the case, then it is YouTube's social responsibility as a social media platform to fix that.

FERGUSON: We're talking here about trying to restrict content for young people, but isn't the much bigger question, why are the social media companies putting this vile material on their platforms in the first place?

WELLS: It's the question at stake, I agree with you, Sarah, but it is a question for the social media platforms. I mean, 500 hours of content are uploaded to YouTube every 60 seconds, and if you could put all of that social media content in a library, it would be the world's worst library. You would not allow your child to walk into it. But the internet is here. The internet is ubiquitous. And as I've said, it's like trying to - you know, I've got little kids - we are all trying our best. Parents are trying their best, but it is like trying to teach your kids to swim in an open ocean with rips and sharks rather than at the local council pool. We can't control the ocean, but we can police the sharks, and we're going to have a crack at it.

FERGUSON: Let's just talk about what YouTube says. They argue now that this ban on young people holding accounts will mean that those young people are no longer protected by the guardrails that they have put in place for young people, which includes trying to stop young people getting access to inappropriate material. Essentially, their argument is you're making the situation more dangerous for young people by taking them out of that restricted area.

WELLS: Well, I have a couple of things to say to that. Firstly, YouTube Kids will still be available to children, so why can't these health or educational videos be viewed on YouTube Kids? And why can't YouTube look at building out YouTube Kids into a YouTube Teens, for example, to use those safeguards that they've already developed - and the eSafety Commissioner acknowledges they have developed, in her advice to me, which we published. But if YouTube is prepared to admit that they are allowing dangerous or age-restricted content in an open manner against their own policies, then that is a question YouTube should answer.

FERGUSON: Last year, the Government was pursuing a digital duty of care. That was a much bigger project that put the legal responsibility onto the tech companies to prevent harmful content on their platforms. Have you given up on that? 

WELLS: No, we remain absolutely committed to that plan. You know, it was a recommendation of that online safety review that we publicly committed to as a Government in the last term, and I'm looking forward to doing more work on it. I'm still a new minister. I'm getting my feet under the desk.

I think the digital duty of care is a really important way of us embedding in Australian law that social media platforms have a social responsibility. Obviously, in that act, industry codes are led by industry and, like you say, the onus is on them to do it. But I hope this opens up a conversation amongst Australians, and particularly Australian parents, about what that level is. What is the level of government intervention that is appropriate? What is the level for industry codes to regulate themselves? I hope we can all have that discussion, and I hope we can all do it, particularly on the road to 10 December.

FERGUSON: And just briefly, the language of the ban is that social media platforms will be required to take reasonable steps to prevent under-16s from holding accounts. What does reasonable steps mean?

WELLS: What constitutes reasonable steps is suggested and worked through with the eSafety Commissioner. These platforms work with her every day on what this looks like. And, I mean, they've had eight months already. They have another four months. We made this law last year to give everybody 12 months to work through these elements. And obviously, this is going to look different for every platform. They've all got proprietary technology. They're all rivals with each other. That's why we can't do this in a universal way. It's going to look different for each platform. 

But, I mean, reasonable steps is a pretty common test under the law, and I think it comes down to common sense. And in this case, in particular, it comes down to whether the eSafety Commissioner is satisfied.

FERGUSON: And when will we know how age verification is going to work?

WELLS: So, we're awaiting the final results of the age assurance trial. And when I get that, I will publish it as soon as possible so that everybody can have a look at it. And I think it's an important step along the way. Plenty of platforms already have age assurance mechanisms that they use on their platforms. And then we've also stipulated as part of these reforms that platforms are going to have to offer people an alternative to uploading their own personal identification documents to the platform if they're not happy. 

FERGUSON: So, just to conclude on age verification, you said there's some time to go, but your expectation is that the social media companies will come up with mechanisms that will work, short of things like uploading personal documents?

WELLS: We've said they have to provide an alternative. Maybe that is an option, but you have to provide an alternative for Australians who aren't comfortable doing that. But I would say that social media platforms can target all of us with deadly accuracy. They know who we are, they know what we do, they know who we're friends with, they know when we hang out with them, they know what we click on and why and when, and they have a lot of data on us. They probably know our age already. They need to come up with ways to test that if they don't. But, you know, if you've had a social media account, if you've been on Facebook since 2009, Facebook knows that you're over 16.

FERGUSON: Online gaming platforms are exempt from the ban, but there's plenty of research that shows that predators target very young children on gaming platforms, making sexual content available via the avatars and via the voice functions in those games. Why are they not banned?

WELLS: Because they are subject to the National Classification Scheme and other laws in Australia, they're nominated as exempt. But as I've said earlier today, these laws aren't set and forget, they're set and support. And if it is the case that we see, or the eSafety Commissioner observes, the practice on those platforms after these laws come into place as needing attention, then we will absolutely have a look at it. 

FERGUSON: Anika Wells, thank you for joining us and for working through the bell system there in Parliament.

WELLS: Thank you, Sarah. Have a good evening.