Governments not interested in taking down ISIS messaging apps

A Senate inquiry has revealed that no government has shown interest in taking down messaging apps where as many as 80 per cent of users are affiliated with the Islamic State.

Tech Against Terrorism’s Director Adam Hadley said that extremists have moved towards smaller messaging apps and social media services after being forced off social media giants such as Facebook and Twitter.

“It is like small platforms do not exist,” he said at a committee hearing reviewing Australian laws that force publishers to remove abhorrent violent material as soon as they become aware of it.

“They are the threat … (but) smaller platforms … do not appear to be considered.”

He further explained that while it was right and proper to apply pressure to larger platforms, it was not acceptable to leave terrorists and extremists room to move their communications elsewhere.

The issue came to light through figures on automated detection at the large platforms: internal artificial intelligence systems now spot and proactively remove violent material in 99.7 per cent of cases on Facebook and 96 per cent on Twitter.

Smaller platforms, however, lack the resources to remove such content from their systems, which has drawn extremists in.

While only a small amount of terrorist and extremist material still makes its way onto Facebook and Twitter, some smaller messaging apps have been taken over almost entirely by terrorist or extremist groups.

Director Hadley said he knew of small social platforms where 70 to 90 per cent of users were terrorist affiliates.

The Senate inquiry also found that the abhorrent violent material publishers are forced to take down lacks a clear definition, which risks suppressing posts from whistleblowers.

While representatives of Meta and Twitter said they supported regulations forcing the takedown of violent and extremist material, they argued that such actions needed more clarity.

While the regulations protect users from violent content, they can also silence people condemning violent material and actions, or journalists, politicians and whistleblowers exposing atrocities.

In one such case, Facebook took down a violent video filmed at a mosque in Southeast Asia that turned out to be exposing violence against Muslims.

Meta’s Vice President of Public Policy Simon Milner said the company erred on the side of caution, initially believing the footage showed the Christchurch mosque attack. He then questioned whether the law gave platforms enough time to make the correct decision.

“Does (the law) allow us to have time to think about it? Once we are aware of it we have to remove it immediately and that is a challenge for us if we want to deliberate,” he told the Senate committee.

Meanwhile, Twitter Australia’s Senior Director Kathleen Reen said 73 per cent of the content posted on the platform after the Christchurch attack was from verified accounts, most of them operated by news organisations and journalists reporting on the event.

“One of the really big takeaways is (the need for) a reasonable process for consultation, for review and for appeal,” she said.

“There could be contingencies … around mass takedowns of content which could effectively block Australians having access to newsworthy content or from understanding what is going on.”

To date, neither Twitter nor Facebook has received any official takedown notices from Australian law enforcement agencies.

Dee Antenor
Dee Antenor is an experienced writer who specialises in the not-for-profit sector and its affiliations. She is the content producer for Third Sector News, an online knowledge-based platform for and about the Australian NFP sector.