18 Apr
Facebook Pages need to be held accountable for defamation and hate speech.
In February this year, an interesting case went before an Australian court. Dylan Voller, now 21, attempted to sue three publishers over readers’ Facebook comments. Voller took on media giants News Corp, Fairfax Media (now Nine Publishing) and the Australian News Channel, claiming that some of their readers posted defamatory comments which the publications either chose to leave public or failed to monitor at all.
More recently, AFLW Carlton footballer Tayla Harris called for authorities to take action after an image of her posted to 7AFL’s page was bombarded with derogatory, sexually explicit and misogynistic hate speech. Rather than deal with the hate speech or put the correct measures in place, 7AFL simply removed the image from its page.
Last week we saw Majak Daw become the latest victim of hurtful and racist taunts on social media. Daw is the third footballer to suffer racial slurs online before the second round of the 2019 season had even begun.
The political and social climate is a sad reality in Australia and, indeed, many other parts of the world. After the March Christchurch shooting, the social temperature escalated, partly triggered when Senator Fraser Anning, leader of Fraser Anning’s Conservative National Party in the Australian Parliament, made anti-Islamic and xenophobic remarks following the mosque massacre. His speech and sentiment provoked the largest online petition in Australian history, calling for him to be ejected from Parliament. Despite this, his Facebook following grew by 22% to 37,319 followers after his “final solution” speech in August 2018 and has since grown to 117,000 followers as of March 2019.
Social media consultants and community managers need to be aware of the political climate, social movements and conversations that their posts can create. In the cases of Voller and the AFL, these pages could have, at the flick of a switch, turned on Facebook’s own profanity filter, or implemented one of their own by uploading a blocklist covering common misspellings of offensive words.
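To illustrate the idea, here is a minimal sketch of how a page team might pre-screen comments against a blocklist that includes deliberate misspellings. Neither Facebook’s internal filter nor any publisher’s tooling is public, so the word list, function names and moderation flow below are all hypothetical.

```python
import re

# Hypothetical blocklist: each banned word alongside common
# misspellings and symbol substitutions a real list would cover.
BLOCKLIST = {"darn", "d4rn", "daarn", "heck", "h3ck"}

def should_hide(comment: str) -> bool:
    """Return True if the comment contains any blocklisted word."""
    # Lowercase and split into alphanumeric tokens so punctuation
    # tricks like "d4rn!!!" are still caught.
    tokens = re.findall(r"[a-z0-9]+", comment.lower())
    return any(token in BLOCKLIST for token in tokens)

# A moderation pass over a batch of incoming comments.
comments = ["What a great photo!", "This is d4rn awful!!!"]
visible = [c for c in comments if not should_hide(c)]
```

A word-list filter like this is blunt — it misses novel spellings and catches innocent substrings if built carelessly — which is why it complements, rather than replaces, active human moderation.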
How much longer can companies and publications alike bury their collective heads in the sand, claiming they aren’t responsible for allowing hate speech to ignite under their posts, or pleading ignorance of defamation within their “community”?
Under United Arab Emirates law, publishing defamatory comments on social media is no different to publishing defamatory comments in newspapers, books and magazines. Arguably, the risk of damage via social media is greater than traditional print, given its accessibility and ability to ‘go viral’ in minutes, so the penalties for social media defamation are more severe than for print.
In the space of a week, the US Congress held hearings on the rise of white nationalism and the role of social media in spreading it; the British government announced an “online harms” initiative, to curb — with government supervision — the proliferation of hate speech online; and Australia passed a law that requires social media platforms to swiftly remove “abhorrent violent material.” Facebook also announced that it was banning the praise, support and representation of white nationalism and white separatism on Facebook and Instagram. Yet I can personally attest that reporting distasteful comments on social media is still met with the same tolerance: more often than not, the content “does not breach their community standards”.
For years, under the ACCC, online advertisers and marketers have been bound by law not to allow misleading or false information on their pages – even in their comments area. This follows a 2011 case, which concluded that a company accepted responsibility for misleading fan posts and testimonials on its social media pages when it knew about them and decided not to remove them.
There is a social movement to stamp out hate speech and unkindness online. Social media giants are slowly making changes, but companies and social media pages need to be more diligent and accountable under their national governing law.
Just like a business removes graffiti from its doors and walls, pages need to remove the ugly virtual graffiti on their threads.