
As the Senate investigates what it can do about extremism in Australia, the role and influence of tech platforms — both the mainstream and the fringe — is under the microscope.

Appearing on Thursday before the parliamentary joint committee on intelligence and security's inquiry into extremist movements and radicalism in Australia, representatives of the Australian Security Intelligence Organisation (ASIO), the Australian Federal Police (AFP), the Department of Home Affairs and the Department of Foreign Affairs and Trade (DFAT) described how the internet has facilitated radicalisation and extremism.

The inquiry’s terms of reference call on the committee to consider “the role of social media, encrypted communications platforms and the dark web in allowing extremists to communicate and organise”.

AFP deputy commissioner Ian McCartney described the internet as a “force multiplier of hate”. He said young Australians were increasingly being targeted by extreme groups, echoing statements by ASIO boss Mike Burgess.

Home Affairs' first assistant secretary for social cohesion, Richard Johnson, said that while Home Affairs had an "online team" that identifies extremist content and refers it to platforms for removal, some enforcement of anti-extremism policies is undertaken by the tech companies themselves. But only some platforms do this, he said.

Tech companies Facebook, Google and Twitter made submissions to the inquiry about their efforts to combat extremism and will appear before the committee on Friday.

Notably absent are any of the so-called alt-tech companies such as Gab, Parler or Telegram, whose near non-existent moderation has attracted online extremists in Australia and abroad.

Crikey understands that one of the presenting tech companies asked the committee to consider inviting these platforms, but the request was denied. The committee secretariat told Crikey invitations are private and confidential and declined to comment.

Jordan Mcswiney is a University of Sydney researcher investigating how Australian online groups use the internet to recruit, mobilise and communicate. He says it's important that the inquiry consider these alt-tech platforms, particularly given their demonstrated links to offline violence such as the 2021 US Capitol invasion and the 2018 Tree of Life Synagogue shooting.

“These platforms are a key means of communication and community-building among extremists,” he said via email. “With their extremely limited content moderation, these platforms allow for the proliferation of hateful and extremist content, providing greater opportunities for recruitment and radicalisation.”

The power of big tech’s algorithms

Digital advocacy group Reset Australia’s executive director Chris Cooper questions whether there would be any purpose inviting alt-tech platforms. 

“It’s unclear what inviting Parler would have achieved,” he said. “The creators would have only sought to further normalise the extreme behaviour and content that their platform is specifically designed to host.”

Instead, he said, the focus should remain on major platforms such as Facebook and YouTube, whose business models radicalise users who eventually move on to extreme platforms such as Parler.

“The platforms’ response to extremism is rarely much more than PR spin that distracts from the actual problem — their whole business model relies on algorithms that funnel people down rabbit holes towards extremism,” he said.

“The motto almost appears to be ‘the more enraged, the more engaged’, ensuring we all spend more time online and being served advertising.”

In their submissions, Facebook, Google and Twitter touted how much they’ve done to battle extremism. Each discusses (largely inscrutable) statistics on the amount of extremist content they’ve removed, the research they’ve funded, and their many technological initiatives.

Their submissions also try to frame social media platforms as just one part of a system, and reforms targeting them as just one part of a solution. (Conveniently, this view largely downplays their responsibility for the spread of extremist content, or absolves them of it.)

Cooper says that despite these claims, the platforms are still dragging their feet.

“Australia needs to decide if it wants platforms to be compelled to act in a way that meets our standards,” he said. “Regulating international organisations to meet our own safety and community standards is difficult, but Australia has the global social capital to pave the way on this.”