I was interested to read of Facebook’s plans to introduce new reporting functionality specifically targeting cyber bullying (‘Facebook fuses emotion to its anti-bullying efforts’, CNN, 12th July).
My experience of working with clients such as ChildLine, CEOP and the Home Office on campaigns and communities focussing on bullying has shown me that cyber bullying ranks high on young people’s list of potential pitfalls when using social media.
Further detail on how the changes will look is rather scarce at the moment, although this update from 10th March 2011, when the changes were first announced, gives some screenshots of the new reporting process.
CNN report that Facebook plan to rename the ‘Report as abuse’ button to ‘This post is a problem’ for 13 and 14 year olds. This makes sense: in my experience, many young people have a very specific view of what constitutes ‘abuse’. They sometimes feel that bullying does not qualify and, as a result, might not use the reporting functionality at all. The new wording should help young people feel more confident in reporting problematic content.
Once the button is clicked, users encounter a series of questions to help them explain what the problem is, using age-appropriate and compassionate wording created in collaboration with teams at Yale, Columbia and Berkeley Universities. The new language is designed to let users express their emotional reaction to a piece of content and allows Facebook to provide a more tailored set of options for dealing with each situation.
There are also two new reporting options about which I am cautiously optimistic.
Firstly, young people are offered the option to send a pre-written message to the person who submitted the content that has upset them. I really like this. In my experience, young people are still experimenting with humour and are often shocked to discover that what they have said might be upsetting. Allowing the injured party to alert them to this may prompt a swift and effective resolution without things escalating. Thankfully, the message is pre-written, which removes the urge to retaliate, something that can also be a problem when a young person is offended.
Secondly, young people who feel afraid or threatened are offered the option to get help with the content from a trusted friend or adult. The hope is that this will prompt a community-focussed approach to resolving the problem, with the trusted person able to offer support and advice, and potentially assist in reporting the young person’s concern to Facebook staff if required.
There will always be genuine trolls and bullies, but these changes will help if they complement existing reporting and investigation processes rather than replace them. I hope they will free up Facebook team members to concentrate on more serious incidents of harassment and give users more social options for addressing the day-to-day arguments and misunderstandings that can occur online.
Ultimately, the changes will live or die on the ability of the new reporting questionnaire to point users towards the appropriate channel, so I’ll be watching out for more information with interest. The changes have been trialled with a small group of users in the States, and CNN report that they are due to be rolled out to US users this week. It’ll be interesting to see whether they have the intended impact for young people there.