Facts: Getting started, an introduction for organizations

Cyber hate flourishes where there is a lack of consequences. Strong moderation, clear and enforced rules, and a firm stance against all forms of unfair treatment will have a positive effect and create a more inclusive internet.

Organizations need to work against cyber hate

Audrey Lebioda from the association #jagärhär.

What is a cyber room?

A cyber room can be anything from a Facebook group to the comment section on a blog post. It can be privately owned, owned by a business, or run by a non-profit organization.

This is how you can organize yourselves!

Create guidelines that fit both your organization and the cyber room. Draw on your core values, so that the guidelines feel natural and are more likely to actually be followed.

Think about what is important and what you should focus on.

  1. What incidents have there been in the past, and how should they have been handled?
  2. What measures were taken when the incident occurred?
  3. What was effective?
  4. What was less effective?
  5. What do you want to accomplish with the online space that you are in charge of?


Advice: Here's how to create accessible information for everyone on the internet

Download The Swedish Agency for Participation’s checklist on how to make information accessible for everyone, regardless of disability. On page three, there is a checklist for websites and online services.

Checklist for accessible information

Make sure that the policies for each channel and meeting room are consistent with each other. If one online space allows all content while another strictly forbids certain content, it causes confusion and undermines the policies.

The importance of active accountability

Assign clear, active responsibility for your online spaces, for example by creating a moderation group. There are no opening hours online, so it is crucial that somebody checks for inappropriate comments and posts on a regular basis, for example once a day.

Do a risk assessment

Of course you should still post even if you see a risk that the post will generate cyber hate, but it is good to do a risk assessment so you can be prepared in advance. If you believe a post may generate negative feedback or online hate, notify the moderation group so that they can check your channels more frequently. In some situations it may be advisable to close the comment field if you are unable to moderate it actively. It can also be a good idea not to post "high-risk content" on a Friday afternoon just before people leave for the weekend; instead, choose a day when you have the resources to take responsibility for your channels.

Keep in mind that nobody can be online 24 hours a day. Make sure you have a large enough group of people who can help each other. A schedule is often helpful.

Routines for extreme situations

Have routines for how to act in extreme and unusual cases. In severe situations it can be necessary to make a report to the police. We encourage you to have policies in place for these kinds of situations, because the law applies online as well.

Crisis plan

Answer these questions to complete your routines:

  • How do you assess if the law has been broken in a situation?
  • Who makes the report to the police?
  • Who contacts the person who was affected?

  • Sexual assault
  • Harassment
  • Discrimination

Internal online spaces need to be cared for too!

Sometimes it is the people within the team or organization that are the problem. Therefore, clear policies and routines for those situations are also necessary.

It is often even more delicate to deal with one's own employees or activists crossing the line. If a situation occurs between two people in the same organization, you need to know how to act, so having routines already in place is important, especially if your organization is small. Being able to rely on existing documents and policies makes it easier to stand up to people who act wrongly, even when they are your own friends or colleagues.

Always collect digital evidence!

Save evidence and documentation. Remember to take screenshots that show both time and date, before posts and messages are deleted. Screenshots can be used as legal evidence.

It is also recommended that more than one person looks at the posts and messages, so that the others can be called as witnesses in a possible trial.


Recommendations to combat cyberhate on your platforms:

  • Create guidelines for what can be posted on your online space. Make sure that the guidelines are in line with other internal policies.
  • If your guidelines prohibit sexism, you need clear strategies and rules against it. For example, if somebody posts an inappropriate picture, the first step might be to refer to your guidelines and give the person a warning (and have them remove the picture).
  • Be clear and consistent, both within your moderation group and externally, about what counts as a violation of the rules. A step-based warning system is sometimes useful: a person who violates the rules is warned one to three times before being removed from the online space. For more serious violations, the response might be harsher than a warning.
  • Make your guidelines accessible. They should be easy to find and easy to understand. Be concrete and clear so that both your users and you yourselves understand them; if they are too vague, they are not useful!
  • Guidelines are the rules whose compliance the moderation group monitors. In addition, it can be useful to create guidelines for the moderation itself, with recommendations and policies for how moderators should act and make assessments. These can also be communicated externally so that your members and users understand how you think and act when you moderate your space.
  • Another useful tip, used by many groups and pages on platforms such as Facebook, is to let people who apply for membership answer a few questions connected to the values of the group or page, and to have applicants pledge to follow the guidelines and rules. That way you make sure that everyone in your group knows your values and guidelines.
  • Make sure the rules are applied consistently and fairly. For everyone to accept the moderation and take it seriously, there must be no doubt that everyone who violates the rules is treated equally, regardless of their relationship to the moderators.
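The step-based warning system mentioned above can be sketched as a small piece of logic. This is only an illustration of the idea, not the feature of any real platform; the function name, the three-warning threshold, and the "severe" shortcut are assumptions for the example.

```python
# Illustrative sketch of a step-based warning system.
# MAX_WARNINGS and the "severe" shortcut are assumed values for this
# example; a real moderation group would set its own thresholds.

MAX_WARNINGS = 3  # assumed: a user may be warned up to three times


def moderate(counts: dict, user: str, severe: bool = False) -> str:
    """Return the action to take when `user` violates the rules.

    `counts` tracks how many warnings each user has received so far.
    """
    if severe:
        # Serious violations skip the warning ladder entirely.
        return "remove"
    counts[user] = counts.get(user, 0) + 1
    if counts[user] >= MAX_WARNINGS:
        return "remove"
    return f"warning {counts[user]} of {MAX_WARNINGS}"


counts: dict = {}
print(moderate(counts, "alice"))             # first violation: a warning
print(moderate(counts, "alice"))             # second violation: another warning
print(moderate(counts, "alice"))             # third violation: removal
print(moderate(counts, "bob", severe=True))  # severe violation: immediate removal
```

The point of writing the policy down this explicitly, even on paper rather than in code, is that moderators apply the same steps to everyone, which supports the consistency the last recommendation calls for.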
