Interview on The Guardian - Australian Politics Podcast
KAREN MIDDLETON, HOST: Minister, thanks so much for joining the Australian Politics Podcast.
MICHELLE ROWLAND, MINISTER FOR COMMUNICATIONS: Pleasure.
MIDDLETON: One or two things have been going on in your portfolio, so there's a lot to talk about. I wanted to start with the announcements we saw this week on responses to the terrible scourge of violence against women. Now, they obviously cut across a range of portfolios, but there are a number that land in your area, or cross into your area. Can we start, maybe, with the counter-misogyny idea, because that is looking particularly at platforms. Can you explain how you envisage a counter-misogyny campaign seeking to influence or change people's views, how that's going to work, and where you might target that sort of campaign?
ROWLAND: You're absolutely right about this being quite a seminal moment, I think, in public policy. And I think it comes at an opportune time, when people are thinking about the role of social media, they're thinking about the role of government and regulators. But I also get a good sense, Karen, that people are thinking about the collective as well. And when you think about the collective – you being part of it, you being 'us' – people start to think about what their own habits and obligations might be as well. Whilst the subject matter we are dealing with is, of course, a terrible issue, I think it really is an opportune time for governments and regulators around the world who are grappling with this, and for Australia, in particular, to be a leader in this area.
There are probably three things in your question. The first is around influencers – influencers in terms of what's been proposed, primarily in Minister Rishworth's Social Services portfolio, about engaging what I see to be people who will have a positive impact. So, people who will be charged with getting online and reinforcing positive stereotypes. Because I think that at a high level we know about the amount of negativity that is online, and we know that a lot of what is being pushed is, unfortunately, negative. And the way in which we counter that is by really flooding the market with something better.
I sort of find it analogous to, for example, foreign propaganda coming into a particular country, or any situation where you want to counterbalance something. I use SBS as a good case in point. When there is a lot of discussion about potential foreign interference, particularly with Chinese or Arabic-speaking communities, SBS has done a lot of research to show that the way you address that is not necessarily by limiting, or seeking to limit, the amount of information – because that is obviously difficult in and of itself – but by flooding the market with content that is created predominantly in Australia and by Australians, and that puts forward a positive viewpoint.
I realise I'm not answering that precise question, partly because it is in the Social Services portfolio, but I think the principle is there. We understand that there's a concentrated amount of harm, and the way in which you dilute that is by installing some positive frameworks around it.
MIDDLETON: And to go more precisely to your area, I guess we're talking about a range of different platforms where people – young people – gather. We've had governments run a Stop it at the Start campaign that really looked at attitudes from childhood, and at parents and how they can shape those attitudes. And now, as I understand it, we're talking about perhaps an older demographic – maybe teenagers or people in their early 20s, people who are on gaming platforms or the like. So, going to your point, I guess you have to choose people who speak the language of that platform. Are we talking about video-sharing platforms like YouTube and TikTok? Or are we talking about chat rooms? How do you envisage it working?
ROWLAND: I see it as being quite a combination. And, being someone who uses a small segment of those platforms, and being in an age category where some of them aren't popular with my demographic, I think there are people who specialise in these areas, and they also exist within government departments. If you look, for example, at Minister Butler and his anti-vaping campaign, I'm aware they're using influencers on various platforms, because that is the cohort they need to reach with an anti-vaping message. So this is an area that's not new. But it does need to be informed – and I'm sure Minister Rishworth will make sure it is – by solid evidence about who our target is here. It is a lot of males, in some cases young males, but there is a broad cohort that needs to be spoken to at the same time.
I think part of your question, Karen, also goes to some comments I've made in the last couple of days about 'recommender systems' and the rubbish that has been pushed to young boys. That's a topic we've got under the Online Safety Act review right now – we kicked that off when I announced it in November last year. The discussion paper was just released this week, and again, I'd encourage the public to have their say on that. Delia Rickard, a well-respected former Deputy Chair of the ACCC, is leading the review, and she has spent a few months discussing with various departments and government agencies what the scope should be.
What your listeners might not realise is that, yes, we are unique in now having an Online Safety Act bedded down, but it only came into force at the beginning of 2022. And one of the reasons I decided to bring this review forward by a year was that generative AI really wasn't known as a concept when the Act came into force; deepfakes weren't really a concept. The number and breadth of new and emerging harms means the law needs to keep up, so this is something that will be looked at in the context of this review.
But I also want to assure your listeners: we are constantly looking at ways in which these issues can be addressed. There is a power imbalance between the platforms on one hand, and the media and the rest of society more generally on the other. But doing nothing is not an option for governments and regulators. And yes, we need to pick our spots, but those interventions need to be deliberate, they need to be based on good evidence, and we need to be very mindful of the outcomes we want to achieve.
MIDDLETON: So many things to pick up on there. But I want to ask you – you mentioned parents before, and obviously all of these kinds of measures fall, potentially, under the purview of the Online Safety Act. Right across the board, we're having a look at a range of things. But talking about recommender systems, that sort of algorithm problem, and the problem that parents face: one of the things I know you've been trying to do is help parents monitor their kids, but those algorithms are pushing different stuff to the kids than they are to the parents. And try as you might as a parent, you can't always see exactly what the kids are seeing, because you're not being fed the same material. How do you overcome that without actually getting access to the algorithms, which these companies persistently will not give you?
ROWLAND: Well, these algorithms are entirely opaque, and there have been various attempts, some more successful than others, to examine them – for example, for competition impacts. But when you're talking about the impacts on young people, I'm well aware of this. I've got 12-year-old and 7-year-old girls, and even from conversations, particularly with the older one, I can tell she has been seeing things, hearing things – things that I think might be a bit dangerous. I guess the way I approach it is to just be supportive. And I'm not speaking to you now, Karen, as a psychologist with particular skills, but I've just taken the time to go to eSafety.gov.au. There is a range of resources there, like short videos – and we know that's the way a lot of young people consume information now, through short-form video. They're designed for parents and for young people of various ages. And I just think there is a weight that a lot of parents and caregivers are feeling at the moment, especially when you read the media, you see how your kids are behaving, and you're constantly aware of these big forces in the digital platforms – which do have the potential to operate as a force for good, but here we're talking about a lot of the harms.
Those resources have been developed by skilled people, they're based on evidence, and they are readily available. So, I just try to do the public service announcement wherever I can. I was pleased the Prime Minister did it recently at the announcement as well, because we need people to know that these tools are already available to you, and they don't take long either.
If I can end by saying: I did an Online Safety Forum with Sally Sitou, the Member for Reid, a few weeks ago, and they had a representative from eSafety there who was a former educator. You go to these things, sometimes also as a parent at schools, and you wonder whether you're going to learn something new. The new thing I learnt – the technique I actually have been using – is that you have to reinforce that, no matter what, your kids need to know that you love them and they can tell you anything. My heart has been broken; I don't know what it is, but the last couple of weeks I seem to be reading so many stories about young men who've been victims of sextortion, in quite a few cases by Nigerian criminals, and who've taken their own lives. And to read the accounts from their parents saying, 'I wish I could have told him it didn't matter, we would still love you, we'll help you' – these young men felt they had no option but to take their lives. Now, these are very extreme cases, but extreme is happening right now.
MIDDLETON: I know a lot of parents are terrified because they have trouble seeing what their kids are seeing, or they don't have time to monitor what their kids are seeing. We used to have kids sitting in front of the television; now they've got devices, and it's really hard to get them off them. So, I understand the terror of parents. I wanted to ask you a sharper question, I suppose, before we leave the misogyny thing – and there's lots more to ask you about there. Would you consider compelling social media companies to actually block young people from seeing particular individuals' posts that you think are harmful? Or kinds of posts? Is there a compulsion power that you might go to beyond just the influencing stage?
ROWLAND: Well, under the Online Safety Act – and this is one of the things being explored in the review – it's really a complaints-based takedown system. And in many cases, that's because people don't want the material to be seen anymore, or an elected government has said these are the types of content we don't want people seeing: refused classification material. So, for example, the footage of the Wakeley church attack, real violence, child sexual abuse material, pro-terror content. It really is a balancing act. We've got the Classification System, on one hand, that says what is simply not acceptable to be seen, and then there are judgments about what kind of content shouldn't be seen by certain cohorts. Or, more importantly, what sort of compulsion is being applied for that content to be served up.
And it was one of the points I made with the Prime Minister the other day: the platforms are not necessarily the content creators; they're the platform on which other content is being served, and they are pushing content. So, I think that as we examine this – and I know other regulators around the world are examining it – it's that act of pushing, the algorithm, the recommender, that is being examined. That piece of the service, rather than the content itself. Because, of course, the whole idea of the internet and the social media platforms was that they would create a democratising environment, one in which people could be content creators and content could be shared. And of course, you've got to balance that against free speech. We know free speech is not absolute; we have censorship rules, defamation, and other methods of classification. But there's a real balance to be struck there. We're looking at it more in terms of the behaviour of the platforms rather than the content itself.
MIDDLETON: But doesn't that go to that point? If you're going to compel a platform not to make content available to particular people, or not to make particular kinds of content available – and you've talked about content that might involve physical violence, and you gave the example of that video – I guess I'm asking about other kinds of content, and whether you see the definition of what qualifies shifting to include misogynistic content, attitudinal stuff, as well as just nasty videos?
ROWLAND: And that's exactly what we're looking at at the moment with the Classification Review. This is a system that's around 30 years old, designed at a time when you went to a cinema and watched a film, you bought a book, or you went to a games shop and picked a game up off the shelf. The governance of the Classification Scheme, and its alignment with community expectations, are long overdue for attention.
This is a federated system, so it goes through the Standing Council of Attorneys-General. I think part of the Prime Minister's announcement the other day was also that we need the States to cooperate in this review to get this done – to have proper governance and enforcement mechanisms. That's been one of the key issues: how do you classify every piece of content in the digital age? A lot of it is self-classified, but a lot of it just isn't.
MIDDLETON: Do you think we've let it get away from us? I heard the former eSafety Commissioner Alastair MacGibbon talking on the radio; he was trying not to be defeatist about the idea that the horse has bolted, essentially, globally, on all this stuff. That we've allowed these massive platforms to gain so much power and so much influence, particularly over a generation of people and their dependence on this stuff, on their devices, and that it's very, very hard to haul it back in. Are you maintaining optimism that you actually can do it?
ROWLAND: I would respectfully say that I think the role of government is to keep people safe, and that includes in an online environment. And to do anything other than go about this in a very methodical way – looking at the laws, looking at the gaps that are there, looking to international regulatory cooperation and the examples that we see overseas – I think would be an abrogation of responsibility. It would be an abrogation for government to do nothing in this area –
MIDDLETON: To be fair, he was also saying, ‘but you should have a crack, that’s not a reason to not have a crack’.
ROWLAND: Yes. Most Australians, I think, would appreciate that with the near-infinite amount of content that is now available, there is always going to be some element of risk; there is always going to be some content that simply should not be viewed, or should not be viewed by certain cohorts. But I think Australians also expect that their online environment will be made safer, that governments will do what they can to make it safer. It may never be a 100% safe place that you can just walk into. Everyone needs to play their role, but the role of government is to demonstrate that there has been some improvement. And that's the process we've been going through since we came to Government.
If you look at an area like dating apps: I convened a Dating App Roundtable, unfortunately as a result of something that seemed to be, and still is, occurring far too frequently – people, in most cases women, being victims of tech-facilitated abuse. Dating apps, as a largely unregulated space, had never really sat down with government to be asked: what mechanisms need to be put in place to make your app safer? How can we incentivise you? Or how can we regulate you if you're not willing to lift your game? Within days of that meeting being convened, we saw some of the major dating app providers introduce new safety features on their apps. I even see today that one of the major ones has announced a new function where you can let people know where you're going on a date, as an added precaution. So, I think in some ways that has incentivised better behaviour.
I do expect that within the next few months we will have an industry code that Minister Rishworth and I tasked industry to come up with. My Department has been overseeing that, and we'll examine it to see whether it's fit for purpose. And, as I said at the time, there are powers for regulatory intervention here if it's not satisfactory.
MIDDLETON: What will the focus of that Code be – regulating behaviours in what domain?
ROWLAND: This applies to dating apps – what safety, security and privacy features are in place. When we first sat down at the roundtable, it was quite shocking to learn what security features there actually were. So there have been improvements in this area. But I still think there's a way to go, and in the context of gender-based violence, it's an issue that is foremost in people's minds.
MIDDLETON: In that context, going to this week's announcements again, there's a question about the technological practicality of some of the things you're trying to do – for example, the age verification issue. I know that came up last year, and I think the government's initial response was that the technology actually wasn't there to enable it. How confident are you that in the space of six months the technology now is there, that you can do it? Or are we really having to go right back to the start and explore whether the technology is up to the task?
ROWLAND: Well, I think there are two parts here. The first is that we already have provisions in the Online Safety Act for the eSafety Commissioner to oversee the making of industry codes in this area. And this phase of codes, which deals with access to pornography, is one that eSafety has been working on. Now, our focus as a government has been on getting the Online Safety Act implemented. The first phase of codes – dealing with access to child sexual abuse and sexual exploitation material – is more or less complete. This phase of codes is looking at pornography and other types of content, particularly, as we know, as it relates to access by minors. So, we've got that aspect of the codes, and eSafety has been focused on that.
And eSafety has advised that there have been developments in the technology in this area. We have spent some time from the outset scoping this across departments, because it does have privacy and security implications, and it goes across a number of different portfolios. We took the time to scope it, and we were able to bring a proposal in the Budget context, which was approved. I'm pleased the Prime Minister decided we would announce it a couple of weeks before the Budget itself. As eSafety will tell you, the pilot will help inform not only the codes-making process but also enforcement. Is it the role of eSafety to specify a particular technology? Or is it the role of eSafety to say it's got to meet a minimum set of standards? This pilot, and the way it's designed, will help inform that work. It actually fits in well with the code-making process, and we think it's important to continue working with eSafety in this area.
MIDDLETON: So you sound like you think the technology has advanced enough to make it viable? How will it work? Are we talking about people having to upload ID verification documents to a website? Do they get their facial features scanned? What exactly would they be doing?
ROWLAND: They're some of the technologies being developed at the moment. But we'll have more to say soon about how we move ahead with the pilot, including the types of technologies. I think it's useful to note at this stage that this will be a pilot. So it has to be well designed, it has to be capable of implementation, and it has to satisfy all those efficacy points around safety and security – but, above all else, it has to actually operate effectively to inhibit the ability of minors to see content that is not age-appropriate. That includes pornography, but there are other use cases, like potentially harmful games that are inappropriate for young people as well.
MIDDLETON: What's the timeline? How long do you envisage a pilot would have to run before you think you'd have enough data to be able to say definitively, yes, we think this will work? And then how long after that before we could get a system up and running?
ROWLAND: It's a good question, and one being grappled with by regulators around the world. We're going to give it the time that it needs, and we'll have more to say in the near future about some of the milestones. I do note that eSafety has been progressing the codes process itself, and sees the pilot as a potential input to the code-making process, or even to its enforcement. So, we want these to be in place.