MINISTER FOR COMMUNICATIONS ANIKA WELLS: Good morning, everybody. Today, the eSafety Commissioner and I are providing clarity for parents and for children. On 10 December, our world-leading laws will give kids a three-year reprieve from predatory algorithms, toxic popularity metrics and harmful content manipulating Australian children. While we are now more than a month away from the law being enforced, we want families to have informed conversations about the changes ahead. So today, we are announcing the platforms that on 10 December will be subject to the social media minimum age laws. eSafety has assessed Facebook, Instagram, Snapchat, TikTok, YouTube, X, Threads, Reddit and Kick as age-restricted platforms. This means from 10 December, these services must take reasonable steps to prevent under-16s from holding accounts, and failure to do so could warrant fines of up to $49.5 million. eSafety’s platform assessments will be ongoing and respond to technological change, but we understand that families need certainty now regarding the major platforms captured under the social media minimum age laws.
We have met with several of the social media platforms in the past month so that they understand there is no excuse for failure to implement this law. Online platforms use technology to target children with chilling control. We are merely asking that they use that same technology to keep children safe online. We aren’t chasing perfection. We are chasing a meaningful difference because it is too important here not to try.
The government has recently launched a national education campaign and I encourage all Australians to engage with a range of resources now available on esafety.gov.au to help them understand the law. I want parents to have the ability to tell their children, it is against the law for me to let you use this platform. The statistics speak for themselves. The need to act is clear. More than one in three children reported that their most recent experience of cyberbullying came from a social media platform. And over seven in ten children have seen harmful content online. We want children to have a childhood and we want parents to have peace of mind. And in naming these platforms today, we will provide that clarity for conversations ahead of 10 December.
So, thank you, and I’ll ask Commissioner Julie Inman Grant to speak.
eSAFETY COMMISSIONER JULIE INMAN GRANT: Thank you so much, Minister Wells. Today, again, we are announcing the platforms we’ve assessed that, in our view, meet the criteria of age-restricted social media platforms from 10 December. This provides transparency for parents about how these changes will impact them and their children. Now, these assessments are purely based on the criteria set out in the legislation, including whether the sole or significant purpose of the platform is to enable online social interaction.
The other key consideration for us is whether the platform meets the criteria of a class of services where there is an exception, such as for messaging, online gaming, or educational content. A few important points I want to make about this list. Technology is fast-changing and ever-evolving, which means this will never be a static list. Our work will continue. The assessments we have announced today are not a requirement of the legislation. eSafety, off our own bat, developed the self-assessment tools for industry to provide a template, ensuring fair and consistent application of the legislative criteria so there was no question.
And I can talk to you about the technology playbook: denying that there’s a problem; minimising the scope of the problem – we don’t have any underage users on our platforms; claiming that the technology is impossible, that it’s not able to be achieved. But then we start to see a change when the legal and regulatory pressure ramps up, and all of a sudden, we can discover a solution. Oh, 16 isn’t such a novel age. Oh, we can target using the age-inference technologies we’ve already been using for a decade. And in the end, when we are confident that the companies have indeed complied, we’ll probably see some great PR about all the innovations they were able to achieve to this end.
Two other things I want to reinforce. This is not a safety, a harms, or a risk-based assessment. These were criteria for assessment contained in the legislative rules. I will talk to you a little bit more about the platforms that met some of the exceptions, but this was not a compare-and-contrast exercise. In order to be consistent and fair, we assessed each platform or service on its merits against the criteria in the legislative rules.
And I want to say that this is a really potent solution, but not the only one, in terms of how we’re trying to keep the online world safer for children. We have promoted safety by design in cooperation with the companies since 2018, and we’ve ensured that safety by design – assessing the risks and harms up front and embedding safety protections, rather than retrofitting them at the end – is part of our mandatory codes and standards. We, of course, have extensive information in terms of our resources in education, and we encourage parents to avail themselves of these.
And this is a really important reminder that while 10 December will be a seminal day, parents can be empowered to have their conversations with kids now. They can start the chat and delete the apps today. What we help them work through is how they taper off their children’s use, how they might be able to download the archives of information, their photos, their memories, and how they can find influencers that they approve of, online and in other places.
So, remember, this isn’t a comprehensive prohibition or ban. Young people will still be able to interact on messaging platforms. They’ll be able to search the internet for information. They’ll be able to use online gaming platforms. So, they aren’t going to be totally disconnected from communication or have their digital lifelines cut. And the Minister and I recently met with a range of mental health and educational NGOs who helped ensure that, when we were developing this guidance, we used language that was very empathetic but will also take young people to places they can go to for help.
And of course, this is an opportunity for all of us to take this into our own hands. And we’ll see what happens on 10 December. As the Minister said, we expect it will be significant. But the changes that impact our young people will be felt over a much longer period of time, which is why we are also working with a range of academics to evaluate the impacts. Are kids sleeping more? Are they out in the footy fields? Are they interacting more? And we’ll also look for unintended consequences, and we’ll be gathering an evidence base as other companies around the world look at what we’re doing so that we can provide them the learnings from what we’ve already achieved and what we plan to achieve further.
WELLS: We’d be very happy to take your questions.
JOURNALIST: Minister, my seven and nine-year-olds are already telling me that they know how to get around this social media ban, which they’re well aware of. They’ll just go to the browser rather than the app and be able to access it. How do you stop kids circumventing it in such a simple fashion?
WELLS: Well, I think I’ll let Julie speak to the ins and outs of the tech platforms in particular, but I’ve got lots to say about this. Firstly, kids will be kids. There will be kids today who manage to procure themselves alcohol despite the fact that it’s against the law to buy alcohol if you’re under 18 in Australia. That doesn’t mean that we shouldn’t have a law preventing under-18s from buying alcohol in Australia. This is about a cultural change: moving people away from having every single interaction they have as a 13-year-old, 12-year-old, 8-year-old being online, and towards being in real life – trying to give them 36 months back to develop those relationships and develop that resilience themselves before giving them the opportunity of having a social media platform.
I have been enjoying some of the social media content by recalcitrants in the run-up to 10 December, talking online about how they are under 16, they don’t like the laws, this is how they’re going to get around it, therefore identifying themselves as someone who is under 16 and their accounts are going to need to be deactivated. I would also point you specifically to when TikTok went offline for 24 hours in the States, people tried all kinds of methods to get around that, like VPNs, et cetera, and were in the vast majority unsuccessful.
GRANT: I would just say a feature rather than a bug of the legislation was for young people to be able to engage in that content in a logged-out state. In some of the conversations that we’ve had with the platforms who still use recommender engines and algorithms to determine what the logged-out content looks like, we’ve been clear: they shouldn’t be surfacing content that’s appropriate for a 35-year-old man. They should be ensuring that it is age-appropriate.
So, in any case, I’m not sure if you took the time to read our regulatory guidance, but we have pretty extensive requirements that we placed back on the platforms themselves to prevent circumvention. So, location-based circumvention in terms of the use of VPNs – but there are so many signals that they can pick up, and are picking up now: device IDs, who the ISP is, where the app was downloaded, from which app store. And in most cases, if you bought a phone from the United States and you downloaded an app from an app store, they will look at a six-month period to see whether or not you’re actually a resident in Australia.
We’ve had deep conversations with companies about this. I was in Silicon Valley talking to all of the major platforms. They’re already doing this. They know they can do this. They know that this is their responsibility. And it’s not just location-based circumvention. It’s the use of accounts that may be pre-age-verified. We could see a black market for that. But we also know that generative AI, masks or graphics can be used to spoof age-verification systems. And we’ve given them very specific specifications as to how we think they should tackle that. And they’ve given us comfort that this is something that they can do. What they will do in these cases, when they think someone is using a VPN, is simply re-verify their age. So, it’s not as big an issue.
The last thing I would add is we’re talking to a range of other platforms and we’re asking them to look for migratory patterns. So, if there is a huge spike in Bluesky, for instance, before or after, we expect them to come and talk to us about that. We can even look at the iTunes store or Google Play and we can see which are the most popular apps being downloaded. Right now, Yubo is one of them. Again, we’re in conversations with them.
JOURNALIST: Minister, last week the AFP commissioner talked about this really disturbing global phenomenon where Australian girls are being targeted by sadistic men. A lot of it is happening on platforms like Roblox. So why wasn’t Roblox included? I understand it’s a gaming platform, but given what the AFP commissioner laid out last week, did that give you pause to rethink whether that platform would be included?
GRANT: We also deal with sadistic extortion and financial sexual extortion on a daily basis. So, it’s obviously of great concern for us. Like I said, we’ve got codes and standards, and we’ve used this in our negotiations with Roblox. So, based on that negotiation, by the end of this year, Roblox will be rolling out age-assurance technologies here. Their primary user base are, you know, 5 to 13-year-olds, but it’s a co-mingled platform. We know there are adults. So, we ask them to take other specific steps, including not allowing adults to contact children without specific parental consent and putting on privacy at the highest default.
So, we’re using other tools in our arsenal to keep these other platforms safer. And that’s a really important question: just because a platform is exempted through the legislative rules doesn’t mean it’s safe. We’re trying to do other things through safety by design and our codes and standards to make sure they’re safer. I just want to also mention that we used our codes and standards powers – the app store codes – with Apple and Google to de-platform a chat roulette site that was based in Portugal, which was non-responsive. A 14-year-old Queensland girl was groomed by a 40-year-old Australian paedophile on this platform, and he engaged in sadistic extortion, requiring her to perform even more explicit sexual acts on video for two of his friends. So, this can happen on any platform. So, a rising tide lifts all boats. But that’s why we’re using this as one tool. This is an age restriction for social media, but we have other tools that we’re using for the rest of the industry. And our codes and standards cover eight sectors of the technology industry.
JOURNALIST: Can you just clarify, is this the final list of platforms that will be covered on 10 December? And are you sure that you’ve given enough time for Kick and Reddit in this case to comply with those standards?
GRANT: I’m happy to take that. As I said before, this is a dynamic list and it will always change. And we’ve told companies – so, for instance, Roblox, as was just mentioned – some of these companies, when we did the assessment, were very much what I would call on the line. So, we had to put our minds to what is the sole or significant purpose. Online gaming, right? But there’s chat functionality. In the US they’ve launched a program called Moments, which is very much like Stories, which is online social interaction. They market themselves as the first metaverse company, as being immersive. And so, when we did the assessment, we also had to think about the kids who are using Roblox today: they use the chat functionality and messaging so that they can engage in online gameplay. If the online gameplay is the significant or sole purpose – if that were taken away – would the kids still use that messaging functionality to chat? Probably not. So that’s just to give you some insight in terms of how much rigour was applied to each of the assessments.
So, we’ve said to the companies, you need to continue to self-assess. The burden is on them, but we will be watching as well. And if they start rolling out features that look more like they’re becoming a social media company than an online gaming company, then we will seek to capture them, which is exactly what I did with OpenAI. I met with them a week or two before they announced the release of Sora. It was never mentioned to me. It was then framed as an AI-generated social media app. So, I’ve sent them the self-assessment tool. And this is where we’re trying to remediate the harms today. Obviously, with social media, we’re trying to remediate harms that were created more than 20 years ago. We also have to keep a watchful gaze, or an eye, on the future and look at how all of these technologies are converging. Because I happen to believe that if the proper guardrails aren’t put around AI, it could bring much greater harms, and this was echoed by DG Mike Burgess last evening. He often talks about social media as an incubator but AI as an accelerator, and I happen to agree. So, we’ll have to keep a watchful, mindful eye on how all of these services evolve.
JOURNALIST: Sorry, just to clarify though, parents need to be mindful that the platforms that are covered by this could change between now and when the 10 December deadline comes into effect. Is that right?
WELLS: I don’t really think that’s the thrust of the message. The onus is on platforms to deactivate accounts belonging to under-16s.
JOURNALIST: But say I’m a child, though, and I ask my parents, can I have an account on some social media platform that isn’t on the list yet but could be by 10 December, how am I as a parent then able to say, no, actually you can’t have that, if it’s constantly shifting, I guess is the concern.
GRANT: Well, technology is constantly shifting, and these companies are constantly adding features and functionalities. We’re not trying to prevent innovation. Again, the burden goes back onto the platforms themselves to self-assess. And if they think they are an age-restricted social media platform, or we come to the view that they are, we will let them know that we think they’re in scope and what reasonable steps we think they need to take to ensure that young people do not have or hold an account on their platform.
WELLS: And we want parents to know that one source of truth is esafety.gov.au, every minute until 10 December.
JOURNALIST: So obviously parents know about Instagram, Facebook and all that. But they might not know about Kick messenger. Would you be able to explain to parents what that is? Because I know in the US they’re facing a lawsuit. It’s also been described as a safe haven for child predators. Can you maybe, for parents watching, why is this platform, I guess, sort of dangerous for kids and teenagers?
GRANT: Well, Kick is an Australian-based platform. We’re doing some of our own investigations and we’ve been working with them for some time. I think you may be referring to the streaming incident with the French national, and so we’ve been working with our French fellow regulator on that as well.
Well, again, I think online gaming is something a range of platforms offer, whether or not they’re social media. This is a streaming platform. We also looked at the Steam app and Steam Chat. We will be considering Twitch. And Discord, which is a messaging platform but also offers generative AI and gaming – so again, these are all hybrids. This is why it takes quite a lot of mental gymnastics and thinking: applying the criteria, but also asking ourselves, if this particular feature or functionality was taken away, would people still be using it in the same way? So, exemption, as I said, does not mean a platform is absolutely safe. Sexual extortion and grooming happen on these mainstream social media platforms all the time. We’ve got the data. We take the reports in on a daily basis.
JOURNALIST: To follow up the earlier question, obviously we are a month away from this now coming in. You’ve added two large platforms to this cohort of platforms that are now involved in the ban. Why has this happened so late in the piece? And so what sort of time, duration of notice will companies get in future if they are to be added to that?
GRANT: We’ve had these conversations with these platforms for 12 months.
JOURNALIST: But you’re announcing it right now, today. That’s the point of this press conference – you’re announcing it today. Why so late in the piece, and what sort of threshold, what duration of notice will companies get in future that they’re being added to the list?
WELLS: The purpose of the press conference is that we’re trying to give clarity to parents, because we want these discussions had on the commute to school and at dinner tables in the run-up to 10 December, because this is a big change for kids and we want families to be prepared. Everyone says to us that this is such a war in our household these days, such a modern point of friction for families, so we want to empower parents and give them as much information as we can, as quickly as we can. I think it’s a completely different premise for platforms. They have had 12 months’ notice since this was legislated in a bipartisan fashion through the parliament. It is these platforms’ duty, every single day, to consider whether they are going to be caught up in the law and whether they themselves, as a social media platform, will be required to comply. If they have not given thought to this up until today, that is nobody’s business but theirs. They’ve had 12 months’ notice.
GRANT: And we’ve had continual conversations. So, I would say, just so you know, Kick agreed that they are an age-restricted social media platform, and they are willing to do what is needed. They know they need to up their safety game. Reddit has been deploying age assurance technologies, because we’re not the only government in the world that is asking for age assurance. The UK Online Safety Act came into force at the end of July, so they’re using a third-party solution called Persona. So, stepping this up is not difficult for them. They may not agree that they’re an age-restricted social media platform, but they have said they will comply.
JOURNALIST: Minister, your office last night issued a release, and it said that there would now be eight platforms with these two new ones included, but your opening remarks also mentioned Threads. Has something changed overnight for Threads to be included?
WELLS: My answer to that is nothing’s changed.
GRANT: It is nine. You can’t access Threads without an Instagram account, but it is nine.
JOURNALIST: Julie, you mentioned an excuse playbook that these platforms have been using in your conversations, and you said that some platforms are saying that they don’t have any underage users. So which platforms are giving you that excuse?
GRANT: So, we used our Section 20 transparency powers and put out a study called Beyond the Screen. We did a lot of that work in September of last year and at the social media summit hosted by the New South Wales Government. We juxtaposed the responses to the transparency notices we had given to the platforms – who said they didn’t have any underage users, so the numbers they were giving us on underage 13 to 15-year-olds were much smaller than we know the broader population of eight to 15-year-olds to be. So, what we found through this youth research, juxtaposed with those transparency notices, is that 84 per cent of eight to 12-year-old Australians already had at least one social media account in 2024. We asked them questions like, are your parents aware? In 80 per cent of cases, they said yes. And then we asked them, if your parents were aware, did they help you set the accounts up? And in 90 per cent of cases, they said yes. And this is where the normative change becomes so important.
I’m a parent of three teenagers, and no one wants their kids to be excluded from the Snap Maps or the Instagram planning, and that’s what essentially happens. It has been happening before this normative change. We’re letting our kids have them. There was maybe a handful who said that they had ever been banned, as eight to 12-year-olds, from any of these platforms for being underage. So essentially, companies have had age-inference technologies in place for years. They’ve been using insufficient age-gating, or self-declaration, knowing that kids are going to lie, of course, to get on the platform. This is changing the game, not just for parents. This is actually saying to platforms: you have to take best efforts and reasonable steps to verify age. And if you read the regulatory guidance, we’re asking for what we call a waterfall approach, so there isn’t a single point of failure.
JOURNALIST: On the inclusion of Kick, could you explain why services like Twitch and Steam aren’t included? Because from the outside, they look very similar, and the principle there seems a little confusing. And also, you mentioned that if services like Bluesky saw a spike, you’d expect them to come and report to the government or to the commission. But what incentive would they have to do that?
GRANT: They reached out to us proactively, and we’re having conversations. We do that all the time. So, for instance, when I was in Silicon Valley in September, we met with Character AI. I raised my concerns about them. I told them about the codes they were going to be subject to in March 2026. They voluntarily told us that they were planning on preventing or banning under-18s from using their service. Then they asked us, what do you know about what other companies are using in terms of age assurance technologies? We sent over the age assurance technical trial. So, a lot of these conversations have to be cooperative. I want these companies to lift their safety standards, and we want to help them get there. So, a lot of these conversations are collegial. They only become combative when there is pretty flagrant disregard for the law or flagrant non-compliance.
With respect to your question, one of the reasons I said we’re not comparing and contrasting here is that we have to look at each service, when we do these assessments, based on its features, functionalities and merits, and then the submissions that they give us. Kick agreed, and we agree with them, on their two current purposes: to enable users to livestream or broadcast material, and to enable users to create, share, view and engage with material with other users on the service. Both of those fit the criteria of online social interaction. We assessed Steam and Steam Chat separately. Again, we assessed Steam as having four current purposes, and all four purposes we identified involve online social interaction. But they did fall more into the online gaming category, and Steam fell within the class of services specified in the rules as exempt, because its primary purpose is enabling users to play online games with other users, even though online social interaction is still part of its features.
JOURNALIST: And Twitch?
GRANT: Twitch is still being evaluated.
JOURNALIST: And so, I mean, if they’re sitting in the same category- I mean, Steam is slightly to the side, but Kick and Twitch are very similar platforms. I mean, if one’s being listed and the other isn’t, isn’t everyone just going to go to the other platform on 11 December?
GRANT: That is possible, but we are assessing each on its merits. We’re not doing compare and contrast. Every single platform has different features and functionalities. It was built for a different purpose. It’s used in different ways. The other thing that’s important to ask – if you think about it backwards – is would a user be using this platform if this particular feature or functionality did not exist for this purpose? Would they use the messaging, for instance, on Steam or on Discord if it didn’t have the other features and functionalities? In some cases it’s yes, because that messaging experience is compelling, and in some it’s no. So again, comparing and contrasting is not helpful, because we’re not regulating based on categories. We’re putting the responsibility back on the platforms. It’s about whether or not children under 16 can have and hold an account on those platforms.
JOURNALIST: Julie, given that eSafety has had issues, and you flagged there with the degree of information provided when you issued transparency notices to this platform, beyond 10 December, what powers do you have and what confidence do you have that if you seek information about how successful these platforms have been in actually keeping kids off their platform, you’ll get accurate numbers? There’s obviously going to be cases of anecdotal evidence of kids on there. What confidence do you have that we will be able to get serious data about the efficacy of these laws?
GRANT: I think it’s a great question. We’re deeply engaging with the companies we’ve identified as age-restricted social media platforms now, and have been for some time – since meeting with their engineers and their decision-makers back at their headquarters – on compliance plans. And right now, until 10 December eventuates, this is all voluntary. But we’re trying to seek baseline data, so we’re working from the same page. We need to, and would like to, know when they plan to follow the reasonable steps and communicate to their younger users, in a compassionate way, that their accounts, if they’re under 16, may be deactivated or removed, and give them a choice. What happens on 10 December is that we will have formal information powers to be able to ask a specific set of questions. Of course, we’re preparing for that now, but we will be using a range of other methods to assess compliance and enforcement. We also, as the Minister said, don’t expect perfection. Some companies are going to be more effective than others. Some may use blunt-force tools and over-block, and for that reason, in the reasonable steps, we’ve required that they provide access and redress mechanisms. We think some will under-bake their age assurance and their deactivation and removal. So, we’ve asked them – and we’ve seen some of the reporting flows that some of the companies are planning – to give parents and educators a place to report when under-16 accounts have not been removed, so that they can adjudicate that.
JOURNALIST: Thanks. Minister, a question for you. When Anthony Albanese sold this law in the first place, he said part of it was keeping kids off devices and getting them back onto the footy fields and into ballet classes. But the way the law’s worked out, kids can still watch YouTube in a logged-out state. They can still scroll TikTok in a logged-out state. Steam – and Twitch, we don’t know yet – potentially is still available when the other platforms aren’t. What do you say to parents who expect that come 10 December kids will not have access to many of these things, but realise that, the way the law’s actually played out, they will still be able to access many of these services?
WELLS: I would say that this is about a cultural change. At the moment, very much the expectation is that you reach a certain age, you get a smartphone, you’re on these things. We want to change that so the expectation is that you are not on these particular platforms, that you don’t have an account on these platforms. I absolutely accept that people will still find ways to be there, but I want that to be the exception – a couple of kids in a class – rather than everybody being on there, with only the one or two kids who have strict parents, or whose parents can’t afford to give them a smartphone at that kind of age, being the ones who are excluded. And like you’ve heard me say before, we can’t control the ocean, but we can police the sharks. This is not a law that solves online safety. This is not a law that cures the internet. What this is, is a law that creates cultural change and gives 13 to 16-year-olds 36 more months to build resilience, to forge real-life connections in their activities of choice, whatever that is – boccia or netball, for example – and gives parents some peace of mind back that they’ve got a bit more time to learn how to navigate this. And hopefully at 16, that culture will have changed, so that the expectation is that you aren’t excluded for being the only one not there; the exception is that you are someone who is still choosing to be there.
JOURNALIST: Sorry, you’re not policing the sharks if Discord, Steam, WhatsApp, Messenger, they’re not covered by this ban. And Ms Inman Grant, what is the dominant purpose or significant purpose of WhatsApp if it’s not social media interaction? The same with Messenger – like, what could it possibly be?
GRANT: It’s a messaging app that does allow some broadcasting.
JOURNALIST: Isn’t that social media interaction?
GRANT: No, there’s specific legislative language to assess what online social interaction is. I mean, if you look at other messaging apps that have much more online safety functionality – let’s take Snap. Snap said publicly they see themselves as a camera app and a messaging app, and they noted that some of their competitors have copied or have similar functionality. But we had to look at the totality of the online social interaction. So, they have Snap Streaks, which are a gamification feature – a way to keep kids in endless scroll. They have lenses. They have stories. They have spotlights. They have ephemeral media. They have Snap Maps. So, all of these things together: if you stripped out all that functionality, would kids use Snap in the same way? I’m not sure. I mean, those are all really engaging features and functionalities. And it is materially different from Messenger and WhatsApp. But again, yes, we have to look at each of these separately.
JOURNALIST: It sounds like you’re letting half the sharks swim free.
WELLS: I’d love to answer that question. This is like everything I just said, but those platforms that are currently exempt from the social media minimum age laws are still subject to the Online Safety Act and the Privacy Act, which are acts of Australian law that work to keep people safer online. And all of these platforms, in or out, will become subject to the digital duty of care, which is where the Albanese Labor Government is going next in the work of online safety. We’re opening consultation on that soon. And the digital duty of care is about what these platforms owe their users, by way of social and moral obligation, to try and keep them safe online.
JOURNALIST: Minister, Google previously said it would launch legal action. Do you have any indication that it may still proceed? Was it an empty threat in your view?
WELLS: We met with Google and YouTube together last week because we want to keep this conversation going. We want to give platforms every opportunity to ask us privately about questions or concerns they have and make sure that they have had every opportunity to put in place the mechanisms they need to comply with the law come 10 December.
JOURNALIST: Do you have any indication that they may still proceed?
WELLS: Nothing formally.
GRANT: And they are entitled to the standard appeal rights, and they can challenge the law. Or, if they decide not to make a constitutional challenge but decide not to comply, and I decide to take action, they can challenge that action with the same appeal rights that they have today.
JOURNALIST: Minister, very quickly, do you have a rate of how effective you expect the ban to be? Like, we know it’s not going to be 100 per cent, but do you have a goal of how effective?
GRANT: I mean, how long is a piece of string?
JOURNALIST: But across the spectrum, you must have something in mind?
GRANT: Across the spectrum, yes, and that will be made available when we send out our information notices in terms of what we’re looking for. This is what we’re discussing actively right now with the platforms in terms of we’ve got to establish the baseline numbers of how many. We know that there are 2.5 million eight to 15-year-olds in Australia, and we have a very good sense, based on both ABS data and our own research, on how many should be on each of those platforms. The numbers they have given us previously have been much lower. So, we need to come to a baseline number, and then we will discuss with them what we think good or achievable looks like. Thank you.