Auto-moderation can’t handle every threat and issue. People evade filters, attempt raids, grief voice chats, and post screamers and NSFW content. There are many more possible threats, which is why we have a 12-member staff team.
Our community has a fair number of younger users, and with that comes risk: from roleplay gone bad with attempted blackmail to predators looking for someone to exploit. Discord is largely a fun and safe place, but in large communities like ours, staff need to be a ray of hope for people.
Users occasionally come to us with reports about people, whether regulars they see every day in the server or members who just joined. These range from concerning changes in attitude to malicious actions like harassment and defamation. Where we cannot act directly, we do what we can to provide the resources needed to help; where we can act, we issue action as needed.
Every community is an auditorium of strangers; not everyone mingles well, which is why you need standards and enforcement.
Every staff team needs the tools to do its job adequately.
Tools our staff use:
Using those tools via bot, we record every action taken and build up records for users who violate our rules. The tools also make our jobs easier, from purging a few dozen spam messages to banning a wave of raiders who join just to spam. Fast, efficient cleanup is a must during peak hours.
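As a rough illustration of the record-keeping described above, a per-user moderation log might look like the sketch below. This is a minimal, hypothetical example — the class names and fields are illustrative assumptions, not our actual bot's API.

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone

# Hypothetical sketch of the per-user record a moderation bot could
# build up; names and fields are illustrative, not any real bot's API.
@dataclass
class Infraction:
    action: str       # e.g. "warn", "mute", "ban"
    reason: str
    moderator: str
    timestamp: datetime = field(
        default_factory=lambda: datetime.now(timezone.utc))

class ModLog:
    def __init__(self):
        # Map user ID -> list of infractions, oldest first.
        self._records: dict[int, list[Infraction]] = {}

    def log(self, user_id: int, action: str, reason: str, moderator: str):
        """Record an action so staff can later review a user's history."""
        self._records.setdefault(user_id, []).append(
            Infraction(action, reason, moderator))

    def history(self, user_id: int) -> list[Infraction]:
        """Return all recorded infractions for a user (empty if none)."""
        return self._records.get(user_id, [])

log = ModLog()
log.log(1234, "warn", "spam in general chat", "StaffMember1")
log.log(1234, "mute", "repeat spam", "StaffMember2")
print([i.action for i in log.history(1234)])  # ['warn', 'mute']
```

Having a history like this is what lets staff escalate consistently: a first offense might earn a warning, while a user with prior infractions gets a mute or ban.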