“My concern is that I see very strong community standard policies, or hateful content policies or ‘insert name of keep the community safe’ policies from various platforms. I almost can’t fault them but I find a very big gap with the application of them,” she told the Meta representatives who appeared before the committee for questioning.
Wicks made the comments in light of 15 female Australian politicians, including herself, being targeted by abusive online comments that remained online for weeks and were only taken down after law enforcement intervened.
Meta Australia APAC policy head Mia Garlick, who appeared before the committee yesterday, acknowledged that abhorrent bullying material remains on her company’s platforms.
She then said Meta’s machine learning models for identifying and removing bullying and harassment content are less capable than its models for detecting other types of content.
“In terms of the ability of artificial intelligence to identify all of this type of content, it’s still learning and we’re getting better at doing it,” Garlick told the committee.
“[Bullying and harassment] has been one of the categories in our Community Standards Enforcement report that has had sort of the lowest proactive detection rate and is slowly building up as the machines learn from themselves.”
Meta’s Australia policy head Josh Machin, who was also before the committee, added that his company has been making efforts to change the perception that it doesn’t effectively uphold its community standards. He said Meta’s community standards for bullying and harassment uphold Australian laws, explaining that Meta has processes to geoblock or restrict access to certain content that would be deemed inappropriate in Australia.
Wicks said she was not convinced Meta had taken enough action, however, telling the Meta representatives that digital platforms should be doing more to keep people safe online.
“There is a very big, distinct difference between saying something that’s against the law, which is about freedom of speech, and you’re talking about this grey area here, which is basically you saying, ‘Oh no, we have to make it illegal in order to be able to police it,’ but misinformation and disinformation are not made illegal in order for you to be able to remove it.”
“That seems inconsistent to me.”
Earlier in the day, Wicks also questioned Snapchat about the suicide of Matilda Rosewarne, a 15-year-old girl who experienced various instances of cyberbullying prior to her death, including having fake nude images of herself circulate on Snapchat and a Belgian porn website.
Snapchat’s head of public policy in the Asia Pacific, Henry Turnbull, apologised for the death and, like Meta’s representatives, accepted that online abuse exists on his company’s platforms.
“I just wanted to say how sorry I am for what [the Rosewarne family] are going through right now,” he said.
When asked how Snapchat was working to stop cyberbullying, Turnbull said Snapchat images are designed to disappear after viewing, but phones have the ability to take screenshots, something he said is out of the company’s control. He also said Snapchat does not open into a newsfeed of other people’s content, a design he said prevents users from passively scrolling through potentially dangerous content.
He also said regulation from the Australian government could help stop bullying content through the creation of a singular regulatory framework, which could entail the Online Safety Act being broadened to have the proposed anti-trolling laws fall under its remit.
“Complexity is very challenging for smaller companies. If you have one clear regulatory framework, it makes it simpler for companies to understand their obligations. It makes it simpler for the government and regulators to hold people to account. It makes it simpler for consumers to understand their rights,” he said.
Submissions to the parliamentary committee on social media and online safety close next week.
The committee was previously set to provide its findings last month, but the inquiry was extended to March 15. The committee has so far heard from large digital platforms, online abuse victims, and Facebook whistleblower Frances Haugen.
When Haugen appeared before the committee last month, she testified that Meta takes down the “bare minimum” when it comes to harmful content, especially content in languages that are not widely spoken in developed countries, as the company faces minimal criticism from these underrepresented users.
The social media and online safety inquiry and the recently established inquiry into the anti-trolling Bill are both looking to finalise their reports ahead of Australia’s federal election. Liberal Senator and Attorney-General Michaelia Cash previously said the social media reforms are among her party’s primary items for this year.