Dr ALY: As those who have spoken before me have stated, the Online Safety Bill 2021 seeks to create a new online safety framework for Australians. It is 'a modern, fit-for-purpose regulatory framework that builds on the strengths of the existing legislative scheme for online safety'. Many of the aspects of this bill should be supported. They tackle some very serious—I would venture to say, criminal—activities online and some very serious and damaging behaviours online.
For the most part, the bill consolidates various online safety laws into one bill and tidies up those laws, but there are a couple of novel elements to the bill. One element is the articulation of a core set of basic online safety expectations to promote and improve online safety for Australians—certainly something that is very welcome and supported by those on this side. The other element is the creation of a new complaints-based removal notice scheme for cyberabuse perpetrated against an Australian adult—an adult cyberabuse scheme. I want to talk, in the time I have left, a little about that particular aspect and some concerns that have been raised about it, as well as some concerns that have been raised about the handling of the bill.
While we remain largely supportive of the intent of the bill and of the need for it, criticisms of the bill have focused on the process and on some substantive issues—things like the functions, powers and oversight of the eSafety Commissioner. I concur with the member for Higgins that the current eSafety Commissioner is a woman of extraordinary talents who brings extraordinary experience to the role. However, that does not preclude an eSafety Commissioner from being subject to the kinds of checks and balances that other people with those kinds of powers should be and are subject to. There are also concerns about the clarity and breadth of the basic online safety expectations, the services in scope of the online content regulation schemes, the clarity and proportionality of the blocking scheme, the appropriate basis for the online content scheme, reduced response times and, importantly, the rushed public consultation for this bill, following an exposure draft released in December 2020. Only 376 submissions could be received, because of the short consultation period.
But coming back to the adult cyberabuse scheme, which, as I said before, is one of the novel parts of this bill—that scheme would enable the eSafety Commissioner to make a determination about what is offensive and have that material removed within a short time frame. Concerns have already been raised about the subjective nature of this determination and the lack, as I mentioned earlier, of checks and balances on determinations, and they are valid concerns about how these determinations would be made and how this particular aspect of the bill would be used.
I remember that, in my first term here, we had a very heated debate in this House about section 18C of the Racial Discrimination Act. There was fervent debate on the other side, with free-speech warriors claiming that 18C needed to be removed. The member for Higgins, who spoke earlier—I know that she wasn't a member at the time—might like to go back and have a look at some of the Hansard on that debate, because it was members on her side who stated in their debate that offence was not given; offence was taken. It was members on her side who argued quite loudly for the right of people to offend, under the guise of the freedom of speech. I would urge those people—those free-speech warriors—on the other side to look very carefully at the detail of this bill and to look very carefully at the provisions of this bill and to consider how it might impact on free speech and how some of those provisions in the bill might be used.
The other concern I have is that this provision in the bill is about adult cyberbullying. I have spent a lot of my career working in the spaces of antiracism, and, for some time, I worked in anti-bullying and anti-harassment policy as well. I can tell you that one of the greatest grievances of people who work in that field is that racism often gets subsumed into bullying and is often dealt with as bullying. That has a quite serious consequence, because it means that racism never gets called out as racism; instead, it's called bullying. But when bullying and harassment are of a racial nature, they are racism—end of story, full stop. It is racism, and it needs to be called out as racism.
Imagine a scenario where somebody is trolling with racist remarks and gets called out for it—gets called a racist—and the troll takes offence to that and reports it, and, instead of the racist remark being removed, the remarks calling out the racism get removed instead. That is a very likely scenario as the bill currently stands: somebody who is trolling another individual with racist commentary and gets called out for that racist commentary can claim that they are being bullied and harassed and can take action against the person who has called them out.
I will remind those on the other side what it looks like to be the subject of racist trolling online, although I'm sure I don't have to remind some of them. I know that there are people on the other side who have been trolled online for their political views, for their race or for a number of other things. I've had people comment that I should be hung from a tree, that people should put a bullet in the back of my head and that they should take me out to the town square and shoot me or behead me. Imagine if, in a scenario like that, I were to call them out as racist, and I then became subject to the adult cyberabuse scheme and was shut down. That is a very real possibility.
That said, it is absolutely true that we need a scheme like this that would allow and require platforms to block the kind of terrible content that we saw with the live streaming of the Christchurch terrorist attack. I thank those who have spoken in this place and in the Federation Chamber on the second anniversary of the Christchurch terrorist attack. It is quite timely that this bill comes in line with that anniversary.
The Parliamentary Joint Committee on Intelligence and Security is currently undertaking an inquiry into violent extremist groups in Australia, and the terms of reference include online content. But I want to make this point: much of the recruiting and influence of violent extremist groups happens in the dark spaces of the internet, on the dark web. Very little of it happens on the surface web. It's just the tip of the iceberg that we see on the surface web. I want to make sure that this scheme captures the kind of online racism that we see on the surface web. People can be racist trolls and not be members of or affiliated with violent extremist far-Right organisations. We cannot let that behaviour slip through the net if we're going to have a comprehensive framework for dealing with these behaviours online.
I just remind the House that, before I entered parliament, much of my research was on online behaviours. I also did some research on the most appropriate way of dealing with offensive material online, looking at whether or not the whack-a-mole approach of removing material as it appeared was the best approach, and at the kinds of online behaviours that precede radicalisation and violent acts—such as the behaviours that were observed after the act of the Christchurch terrorist but perhaps should have been observed before it. Many of those behaviours happen in the dark spaces. So I really would like to see further work done on this bill, and particularly on the adult cyberabuse scheme, to ensure that it also recognises racism, racial harassment and hate speech, clearly defines them, and has clear and defined processes for the reporting and removal of this kind of material online.
That said, I join my Labor colleagues—particularly the member for Gellibrand and the Shadow Minister, who is sitting here today—in supporting the intent of the bill and the good things that it will do in removing really harmful content. But, in adding to the debate today, I also add my concerns about the bill: about the way the bill was handled, about the lack of consultation in its handling and about the further steps it needs to take to ensure that it actually does provide safety. All of us are entitled to feel safe, whether it's in our homes, in our workplaces, in our streets or online.