Flaws in YouTube gangster video vetting exposed
From The Times, September 18, 2008
A Times investigation has exposed failures in the video-sharing website’s monitoring system and prompted action to tighten security
Marcus Leroux, Kaya Burgess and Fran Yeoman
YouTube, the world’s largest video-sharing website, this week removed more than two dozen videos glorifying gangs and gang violence, some of which had been on the site for more than 18 months.
Following a Times investigation into harmful and inappropriate material on YouTube, the website took down 30 film clips, most of them grainy footage of hooded youths brandishing illegal weapons such as machetes, handguns and even sub-machine guns. Google admitted the clips were clearly in breach of its own user guidelines, which had recently been revised to deal with gang videos.
Google’s Head of Communications in Britain, the former Newsnight editor Peter Barron, said that, as a result of concern about the use of the website by gangs, the company had introduced new guidelines prohibiting users from showing weapons in their videos in order to intimidate people, but that these had only “gone live” on Friday.
He blamed “teething problems” with the new policy for the fact that the site’s own monitors had failed to remove the material after a Times reporter, posing as an ordinary user, flagged it as inappropriate three days after the new policy had been introduced.
“The new guidelines have just been established, clearly it will take a little while for them to feed through the system,” said Mr Barron.
In recent years YouTube and other “networking” sites such as Bebo have become a battleground for warring gangs, who post videos of themselves brandishing weapons to intimidate their rivals. In many cases other gangs have posted responses on the comments page, warning the makers of the video what would happen should they stray on to rival territory.
YouTube gives users the option to “embed” their videos in other websites, which means that the same clips appear across sites such as Bebo and MySpace, or are posted there directly.
YouTube says it is not possible to vet material before it is uploaded because of the sheer volume: an estimated 13 hours of video are uploaded every minute, and hundreds of thousands of new films are posted on the site every day.
Instead it relies on a policy of self-regulation whereby users can “flag” material they consider inappropriate. Some critics have likened the system to asking drunks to decide on licensing laws.
Flagged content is then reviewed by staff who decide whether to remove it from the site. According to YouTube, the “vast majority” of flagged material is reviewed, and if necessary removed, within half an hour.
Yet of 30 videos The Times flagged between 1pm on Monday and 11.15am on Tuesday, only three had been removed by Google before it was contacted by our reporter at 4pm on Tuesday. Google said that it had no record of another three being flagged, but agreed that they breached its guidelines and took them down. Ten of the videos had been flagged by The Times a month earlier but had not been removed.
One video, titled “Welcome to Liverpool”, in which youths are shown riding motorbikes and brandishing weapons, had been on the site since June 14, 2007, and had been viewed more than 145,000 times before it was removed after the intervention of The Times.
Mr Barron said the videos would not have been removed before Friday because there was nothing in YouTube’s rules barring the posting of gang videos.
“There used to be a set of guidelines for material that was unacceptable: that was hate speech, pornography and violence or threats against a particular person. What we realised was that the type of videos that cause so much concern in Britain wasn’t caught under those guidelines so, as a result of listening to people’s concerns, we’ve decided to include brandishing of weapons and non-specific threatening behaviour. We’re in a new era now where that kind of video is classed as unacceptable and will be removed in future.”
Home Secretary Jacqui Smith tonight welcomed YouTube’s change of policy.
“I am extremely pleased that YouTube have today taken action to ban videos glamorising weapons. This is a real step forward. I would like to see other internet service providers follow suit to reinforce our message that violence will not be tolerated either on the internet or in the real world.”
According to research by Nielsen Online, YouTube is the fourth most popular website among British children, visited by 590,000 two to 11-year-olds every month and a further one million children aged 12 to 17. Earlier this year it was heavily criticised when it emerged that a video of a woman allegedly being gang-raped was viewed 600 times before YouTube removed it. It was also revealed the video had been flagged once already.
Concerns about the way sites such as YouTube deal with such material were raised by the House of Commons Culture, Media and Sport Select Committee, which published a report on harmful material on the internet and in video games in July.
According to the report, one in six children between eight and 15 has viewed “nasty, worrying and frightening” content on the internet. It expressed concern that material on sites such as YouTube does not carry any age classification, “nor is there a watershed before which it cannot be viewed.”
In some cases, rather than remove a video, YouTube classifies it as “inappropriate for some users”, in which case anyone wanting to view it is required to verify that they are 18 or over by signing in or registering on the site. Anyone whose user profile indicates they are under 18 is then blocked from viewing it. However, there is nothing to stop users lying about their age or to prevent under-18s from creating a new user profile with a different date of birth.
The select committee’s chairman, John Whittingdale MP, said YouTube’s system for dealing with harmful and inappropriate material was generally effective, but he was not convinced that the site could not be doing more. He said Google had told his committee it was going to “tighten procedures for removing inappropriate material… Clearly, from the evidence you have gathered, they still have a long way to go.”
Communications regulator Ofcom has also criticised YouTube’s review process as “opaque” adding that “because it is impossible to determine what proportion of content is potentially harmful, there is no means to assess the overall effectiveness of the system”. It called on the industry to draw up a code of practice requiring sites such as YouTube to increase the transparency of their review processes “for example by reporting on times for dealing with ‘flags’.”
When The Times asked Google’s Peter Barron how many people YouTube has monitoring inappropriate material and how much material it removes every day, he said it was the company’s policy not to give such figures, but that such material represented a “tiny, tiny proportion” of the total content. He denied that the company’s refusal to provide figures meant there was no way of judging how effective YouTube’s system of self-regulation was.
“We can be judged on the kind of exercise you have undertaken,” he said. “You got us on the cusp of a change in policy; you’ll have to look again in a few weeks’ time [to judge if it’s working].”
Related articles:
The Times, September 18, 2008: Internet needs a strong system of self-regulation
The Times, September 18, 2008: Temptation to close sites poses even greater danger
The Times, September 18, 2008: Police scour social websites to tackle brutality and boasts of young criminals
Times Online, March 26, 2008: Boys urged to punch and headbutt each other in YouTube video