Venessa is Australia’s leading expert in online communities. She is Director of Australian Community Managers, the peak body for online community practitioners, and teaches community management at the University of Sydney. She is the co-founder of Swarm, the national conference for community managers, and All Things in Moderation, the only professional conference dedicated to online moderation, launching May 2023.

In her book, The Art of Gathering, Priya Parker writes about the responsibilities of bringing people together. She describes the fallacy of the “chill host”, who, fearing they will make it all about them if they step into a guiding role, plays it ‘chill’ and leaves guests to mingle. This may feel like a smart community building move, but it’s actually a killer. Members feel unmoored, dominant voices fill the void, and psychological safety - an essential community building block - becomes precarious.

As more and more organisations realise the importance of gathering their people online, they need to consider their legal and moral duty to create a safe space where those people can thrive around their shared purpose. Don’t fall into the trap of ‘chill hosting’; rather, be an engaged host that co-creates a rewarding culture and critical guardrails.

Central to proactive hosting is moderation. Often misunderstood as merely ‘removing the bad stuff’, moderation is holistic and strategic work that informs both the experience and effectiveness of our communities. And, in the wake of growing Australian regulatory moves to better protect people online, it’s also a matter of compliance.

There are two main types of moderation your online community manager should be thinking about: regulatory, and cultural. Regulatory moderation means the actions that help you stay compliant with laws, regulations and official guidance from government and other institutions (such as rules around hate speech, defamation and cyberbullying). Cultural moderation covers content and behaviour not necessarily regulated by law, but deeply relevant to your unique community or group and its social norms.

The most common risks dealt with in online communities in Australia include defamation, hate speech, bullying, threats of harm (to self and others) and, increasingly, misinformation. Though these occur more often on social media platforms than in owned or bespoke communities, they do still happen, so don’t get caught off guard. Let’s look at two important regulatory considerations: defamation and bullying.

Since 2019 (the Voller versus Nationwide decision), anyone gathering or hosting people in a shared online space is potentially liable for defamatory content posted by their users or participants within that space. This applies to public social media platforms, your association online communities and internal workplace communities equally - in each case you’re considered ‘the publisher’ and owner of what goes on there. This means you need protocols, processes and people to help you guard against the risk of defamation, and allow prompt action if you believe that risk is present. Court cases against online community administrators (and their organisations) are increasing, and those without these safeguards in place are not faring well. Protection doesn’t require 24/7 hyper-vigilance - just a clear plan, and a trained moderator to execute it within context.

Online harassment is already covered under Australian criminal codes, but our newly minted Online Safety Act (introduced in 2021 in conjunction with the eSafety Commissioner) ushered in world-first anti-cyberbullying provisions that give adult targets of online abuse more options for redress and accountability. There are now significant fines and possible imprisonment for those engaging in a pattern of harm-inducing behaviour. Since the adoption of the Act, online community owners have increasingly been sharing eSafety reporting links and resources with users. While your existing moderation practices should already be identifying and acting on this behaviour, it’s valuable for your members to know their options - especially if they’re being targeted outside your community and any ripple effects find their way into your space.

Importantly, the Act also created a requirement that all online platforms operating in Australia allow users to report content or individuals to a platform administrator or moderator. If the tools you’re using don’t have this basic functionality, consider a change - they’re risking harsh penalties, and there’s an array of better options to help keep your community safe.

Here’s a starter moderation and governance checklist to help you stay compliant with current Australian laws and build a safe, thriving community: