Child sex abuse online: the people who watch it to remove it
Behind the locked doors of a side office at the Internet Watch Foundation (IWF), half a dozen analysts are finishing lunch at their desks. Emma Thomas scrapes from a bowl the last of her tomato and pasta soup, sitting in front of the computer where she has spent the morning watching 30 videos of child sexual abuse.
What she has seen so far has been disturbing, she says, but adds that "every day is disturbing". At the desk behind her, a colleague prepares to spend the afternoon trawling through websites known to attract paedophiles, searching for, and removing, links to illegal content. As they police the internet, analysts listen to BBC Radio 6 Music. When they need a break from the relentless flow of images they are required to view, they pause and play Super Mario or ten-pin bowling on the Nintendo Wii.
Staff at the IWF have a uniquely difficult job, charged with watching, classifying and removing images of child sexual abuse from the web, working from a secure office just outside Cambridge, where police have given special dispensation for employees to search for this material.
In the past 12 months there has been a 40% rise in reports made to the IWF of potentially illegal content, and staff link the increase to a surge of public unease about the presence of this material online. Detectives investigating the sexually motivated murders of two young girls, Tia Sharp and April Jones, revealed that both murderers had accessed online images of child sex abuse before they killed the children. The police agency responsible for internet safety, the Child Exploitation and Online Protection Centre (CEOP), estimates that there are now millions of sexual abuse images in circulation on the internet.
Within the UK, responsibility for removing them has been outsourced by the big internet providers to the tiny, little-known IWF charity, which has a staff of under 20 and is based in a nondescript business park. Employees are divided into two groups: those with administrative roles, and a team of eight people who have received specialist training to equip them to watch the illegal content. The two groups sit in separate rooms, and regular staff are not allowed to enter the hotline room, where online images are assessed, until analysts have removed illegal content from their screens.
Analysts work their way through between 100 and 200 new notifications of online child sexual abuse imagery every day, sent in by the police, or the public, who can make anonymised reports about illegal content via the charity's website. Employees will not use the word pornography in this context, because they believe it does not "accurately reflect the gravity of images" they deal with. Web analysts have to enter the website address and study the content to assess whether the images or film contain inappropriate shots of children; then they have to grade the severity of the material, according to UK sentencing guidelines, on a scale of one to five (from one, erotic posing with no sexual activity, to five, sadism).
If the image is classified as illegal and is hosted in the UK, the analyst will alert the police at CEOP and contact the hosting provider to request that it is removed. Usually this can be done within an hour. In 1996, the UK hosted about 18% of child sexual abuse images globally; since 2003, that figure has been below 1%, largely thanks to the work of the IWF. If the image is hosted on a server in another country, staff contact partner hotlines in that country and pass on the request for the page to be removed. Twice a day, the charity sends companies such as Google and BT updated lists of sites that show illegal content, and they block them.
Working for the IWF is understandably stressful, and staff are given mandatory monthly counselling sessions, and quarterly group counselling, to help minimise the scarring effects of spending all day looking at images of children being abused. Before they are employed, they go through a rigorous selection process, which uses specialist interviewing techniques (Warner interviews) to ensure that unsuitable applicants are not employed. Potential recruits are shown images of child abuse, of gradually increasing severity, to establish how they respond, and are given time to decide whether to accept the post. Some decide, after discussion with close family members, that it is not something they can pursue. Recently an ex-army officer, who had seen extensive service in Iraq, decided not to accept a job as a monitor.
Peter Burness spent four years as a computer games tester ("It got very monotonous," he says) before joining the charity earlier this year. "They tell you what it will entail. I am quite a calm person," he says, but was nevertheless shocked by the images he saw during his interview process. "One of the main reasons was how happy the children looked. It was very strange."
The charity, which acts as a self-regulatory body, is mainly funded by internet service providers. Recently, after criticism over the relatively small donation made by Google, the company announced a £1 million donation, to be spread over the next four years, allowing the IWF to employ another five analysts.
Although IT expertise helps, no particular qualifications are needed. The charity currently employs a former chef, a former IT trainer, a former complementary therapist and a former mobile phone salesperson.
"It's about personal qualities," explains Chris Barker, the hotline manager, in charge of eight members of staff who come in for two seven-and-a-half-hour shifts, and is responsible for making sure they are managing to handle the strain.
"There are different techniques for coping. For myself, I concentrate on assessing the images from a legal perspective, and on the fact that I am working on getting it removed, I view that as the motivation for what I'm doing," he says. Most of his team take a daily 20-minute walk around by the lake just beyond the business park at lunchtime, to switch off from their work.
They try to limit their own exposure to the material to the bare minimum. "We try not to watch the video in its entirety; where possible, we just scroll through frame by frame. Unless there's a particular reason to listen, we don't put the headphones on. You would only listen if you were searching for clues – maybe a regional accent that could help CEOP locate the abuser."
Staff also do victim identification work. Many of the images that are reported are historical, and are pictures that they have seen repeatedly, rehosted on different sites, but when they come across previously unseen images, they search for clues about where they may have been filmed – looking at the language of the books on the shelves, the style of the plug socket, the shop fronts that may be visible through the window, the logos on the school uniform. Occasionally they are able to give CEOP enough information for them to identify the victim, ensure that the child is protected and make an arrest.
Barker has been working here for over a year, but still occasionally finds the material he encounters profoundly shocking.
"It might be that you see something that is particularly violent or where there is a particularly young victim – a baby or a newborn. The next time you see an image like that you are better able to assimilate it. The first time you see an image you are thinking about the victim, what's going on in their lives, in their families. It is true to say that you build up resilience – if you were shocked every time you see an image that would be hard. But when you become an analyst, you are making an assessment – is this a level one image? How many victims are there? What gender and race are they?" he says.
He compares himself to a member of a fire crew or an ambulance man. "You show up and you see something quite horrific, but you are there to do a job, you are not thinking 'that poor person, lying on the road', you are thinking, 'I need a tourniquet, blood'."
People often fail to understand the very specific remit of the IWF – which is responsible only for identifying and removing images of child sexual abuse (including non-photographic, computer-generated images of abuse) and a much smaller quantity of criminally obscene adult content – and send in very violent footage instead, hoping that the charity can help to remove it.
"I've seen things that have distressed me in the past year. It might be a beheading or something graphically violent. People report a broad range of material; your mind is perhaps not prepared for it. That can throw you a little bit," Barker says. "I do occasionally still see an image that upsets me." Last week someone reported an image of a man, who put his camera on a tripod, picked up a shovel, and in view of the camera, beat a dog to death. "I thought about that dog when I got home. Your mind will force you to deal with that," he says. The content of the video, however, was not something that the IWF is set up to deal with.
He was recently reduced to tears by a video of a woman beating a baby (another video beyond the charity's remit, since there was no sexual abuse). "That really upset me. Now I've seen it lots of times. It doesn't register any more," he says.
"It certainly changes your perspective on life. My background was in IT training, I had no prepartion for this. I genuinely cannot believe, having seen the things I've seen that humans can do that to other people, particularly to children. Most people don't have that in their consciousness. Hearing about it on the news is different from seeing it. Hurting a newborn – it is beyond comprehension. These thoughts were not on my radar before," he says. "The rewards are that we are getting this kind of material removed from the web."
Staff who work here are not modern Mary Whitehouse figures with an agenda to censor adult content; on the contrary, the interview process has to ensure that new employees have no strong feelings about pornography on the web in general.
"I wouldn't recruit anyone who had a Mary Whitehouse attitude," Barker says. "If you are so opposed to pornography on the web, you wouldn't put yourself in a position where you'd be looking at porn all day long. We need people to make balanced judgments. That would be hard if you were of the mindset that pornography should not be allowed on the web."
The charity works to target known consumers of pornography, because these are the people most likely to "stumble" across illegal content and to be in a position to report it. Stumble is the charity's preferred word, because it is neutral. "No one can ever really know what an individual was looking for when they report content. Stumble is a helpful word because it is not accusatory," Barker says. They are anxious to reach young men, because this is the group most likely to be exploring areas of the internet, and of legal adult pornography, that could lead them to child sexual abuse.
"They are the ones we want to get the message out to .They look at pornography. They are vulnerable to something that perhaps they never expected to find. They need to do the right thing and report it," says Emma Lowther, the charity's director of communications. "We don't want to make that group of people feel that they are doing something wrong. We have no opinion whatsoever on legal pornography." Around 80% of people make anonymous reports, but a few leave their details and want to be informed that positive action has been taken to remove the images.
Only about a quarter of the content they remove is classified as commercial content – posted online in order to generate money for the people who uploaded it, and hidden behind paywalls that demand credit-card details before access is granted. The rest is placed there by individuals who want to share their collection of photographs, in much the same way, Lowther says, that a Ferrari enthusiast might want to share a collection of car images. Websites used by paedophiles circulate codewords made up of jumbled numbers and letters to help people access the sites; the IWF also sends out a daily list of these passwords to internet service providers to help them alter their search results. The charity does not investigate the people who post the images, because its remit is simply to remove the pictures.
About 70% of the content that the IWF processes involves children under the age of 10, but staff are encountering a growing quantity of self-generated footage, made by teenagers using the cameras on their mobiles and uploaded to the web – a process known as sexting. Research last year found 12,224 self-generated images and videos, 88% of which had been removed from the original site to which they had been uploaded and put in a collection of similar content – folders of topless 15-year-old girls, for example.
Despite the proliferation of this kind of material, it is much harder for staff to take action, because they can only request that the page be removed if they can be sure that the image is of a child under 18 (indecent images of children under the age of 18 are a criminal offence), and often that is hard to judge definitively.
Sarah Smith, a technical researcher who works inside the hotline, analysing the material and researching trends in child abuse content, says she could not imagine a job she would rather have. "I love my job. It is the most rewarding work I have ever done. Most people here say that," she says.
She finds the quantity of abuse by parents disturbing. "I thought I would be upset when I first saw the images. I was shocked, but not upset. The physical abuse is bad enough but the incredible abuse of trust is what I found most shocking. Most of the content is inter-familial. The conversation on the video will let you know; the child will say 'Should I do that, dad?'" she says. "It helps when we know that a child has been rescued."
When she finds the nature of her job draining, she goes home and plays the violin. "I don't play well, but I find it relaxing; you have to clear everything out of your head," she says.
[Some names have been changed to protect the identity of staff]