
Keeping children safe online: treading the line between safety and free expression

Posted 14 June 2012 10:03am by Tamara Littleton

The investigation into Habbo Hotel has thrown up some difficult questions about what online environments do, and don't do, to keep children safe.

There's a line that children's brands tread between protecting their young users from harm and allowing them to express themselves in an environment that they enjoy.

It should go without saying that child safety is paramount for any company that creates a place for children to play online (although sadly this isn't always the case).

On any site, however well managed, there will be an element of risk, just as there is in the real world. But a combination of safety checks, proper moderation and education can minimise that risk and create an environment where children and teenagers can play safely.

Part of the problem, I think, is that some sites try to be all things to all people. Agree which age group you're targeting, and create the appropriate environment for that group. If you leave children in a room together with nothing much to do, they'll create their own games.

If you put hidden, private areas in that room, you're going to attract secretive, illicit and, in the most extreme cases, dangerous behaviour. The same applies to an online world, but there the effect is exaggerated by the cloak of anonymity.

Young children are still learning that actions have consequences, how to take responsibility and how to manage risk. If you let them create a new persona through an avatar, they're one step removed, and their inhibitions go. They take more risks. So you need to keep a closer eye on what they're doing.

The need for moderation

Any site that children use should have solid moderation that combines technology (to filter out the most obviously inappropriate content) with human moderators. You need real people who are native speakers to understand the nuances of language.

Children create new words and phrases all the time, particularly when they're talking about sex. They'll try all sorts of variants on phrases to get round filters.
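As a rough illustration, here's a minimal sketch (in Python, with a hypothetical blocklist and invented helper names) of the kind of first-pass normalisation a filter might apply, so that common spelling variants map back to a blocked word before the message ever reaches a human:

    import re

    # Hypothetical blocklist: a real one is far larger and constantly
    # updated by moderators as children invent new slang and spellings.
    BLOCKED_TERMS = {"badword", "anotherbadword"}

    # Common character substitutions used to slip words past a filter.
    SUBSTITUTIONS = str.maketrans({"0": "o", "1": "i", "3": "e", "4": "a",
                                   "5": "s", "7": "t", "@": "a", "$": "s"})

    def normalise(text):
        """Lower-case, undo the substitutions and strip punctuation so
        spelling variants map back to the underlying word."""
        text = text.lower().translate(SUBSTITUTIONS)
        text = re.sub(r"[^a-z\s]", "", text)      # drop remaining symbols
        return re.sub(r"\s+", " ", text).strip()  # collapse whitespace

    def needs_human_review(message):
        """Cheap first pass: flag for a human moderator rather than
        auto-blocking, because filters miss nuance and new slang."""
        return bool(set(normalise(message).split()) & BLOCKED_TERMS)

    print(needs_human_review("that's such a b4dw0rd"))  # True

Even then, a filter like this only catches what it has already been told about, which is exactly why the human layer matters.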

Again, you have to strike a balance between allowing children to talk freely (nothing kills a site off quicker than censored chat) and staying alert for danger signs.

There are some obvious things that should trigger an intervention by a moderator. A request for a webcam chat, for example, or an attempt to take a conversation off the site and onto Skype or MSN, should ring a warning bell.

Children will often share personal details, and that should be prevented. Those details might be as straightforward as a phone number or address, or something less obvious, like information about a school's football team that could identify which school the child attends.
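For illustration only, here's a sketch of how a few of those triggers might be expressed in code. The patterns and the risk_flags helper are hypothetical, and real moderation tools weigh many more signals, plus context, before flagging anything:

    import re

    # Illustrative patterns: off-site contact requests and obvious
    # personal details, each flagged for human review rather than blocked.
    OFF_SITE = re.compile(r"\b(webcam|skype|msn)\b", re.IGNORECASE)
    PHONE = re.compile(r"\b(?:\d[\s\-]?){7,11}\b")
    ADDRESS = re.compile(
        r"\b\d+\s+\w+\s+(street|st|road|rd|avenue|ave|lane|ln)\b",
        re.IGNORECASE)

    def risk_flags(message):
        """Return the warning signs found in a message, for moderator review."""
        flags = []
        if OFF_SITE.search(message):
            flags.append("off-site contact request")
        if PHONE.search(message):
            flags.append("possible phone number")
        if ADDRESS.search(message):
            flags.append("possible street address")
        return flags

    print(risk_flags("add me on skype, my number is 07700 900123"))
    # ['off-site contact request', 'possible phone number']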

There are also warning signs that experienced moderators and best-of-breed tools can spot to identify someone who's pretending to be younger than they are. Cultural references, language, context and past patterns of behaviour can all help you identify grooming.

Of course, you're unlikely to be able to stop two people of the same age flirting with each other online, but you can stop overtly sexualised chat on a site that attracts young children.

If a child is at any sort of risk (of self-harm, suicide or sexual abuse, for example), there must be a clearly defined escalation process to alert the relevant authorities (such as CEOP and the IWF in the UK, or the CyberTipline in the US) so they can intervene quickly. If you're not watching the site, you won't be able to do this.
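The shape of such a process might look something like the sketch below. The incident types, routes and notify helper are invented for illustration; the real point is that the path to each authority is agreed in advance, not improvised during an incident:

    # Hypothetical escalation routes, defined before anything goes wrong.
    ESCALATION_ROUTES = {
        "self_harm": ["shift supervisor", "safeguarding lead"],
        "suicide_risk": ["shift supervisor", "safeguarding lead",
                         "emergency services"],
        "sexual_abuse": ["shift supervisor", "safeguarding lead",
                         "law enforcement / CEOP report"],
    }

    def notify(contact, report):
        # Placeholder: in production this would page a person or file a
        # report with the relevant authority's hotline.
        print(f"ALERT -> {contact}: {report}")

    def escalate(incident_type, report):
        """Alert everyone on the predefined route for this incident type."""
        route = ESCALATION_ROUTES.get(incident_type)
        if route is None:
            raise ValueError(f"no route defined for {incident_type!r}")
        for contact in route:
            notify(contact, report)

    escalate("sexual_abuse", "adult user pressing a child to meet offline")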

What age group are you targeting?

A site that lets all ages mix together is trying to be all things to all people. You could have areas for different ages; or, if you're not prepared to apply the resources to moderate content for things like sexualised chat, then the site should only be open to adults.

Education is important, both of the children on the site, and of their parents. Children should understand risk, and be taught to avoid it; and parents of young children should monitor – and understand – what their children are doing online. CEOP does some great work in this area.

Above all, the culture of a site will inform how children behave on it. If a site gets a reputation for being a free-for-all, with unchecked behaviour, it will attract the kind of behaviour that could put children at risk.

But if that site has a culture of supporting its users, doing everything it can to keep them safe and educating them about the risks so they can protect themselves, it will attract a different kind of play.

That takes time and resources, so a site has to make a choice: safety over profits, or profits over safety?

Tamara Littleton is CEO at eModeration and a guest blogger on Econsultancy. 
