Google technology catches out man accused of uploading over 3,000 child porn images and he is arrested by the FBI
Daily Mail / UK
November 24, 2013
Google’s efforts to block child pornography snared a suspect earlier in November when a California man was arrested, accused of uploading more than 3,000 pornographic images online.
Raul Gonzales, 40, identified in a criminal complaint on November 6, was recently arrested by the FBI in Woodland, as reported by CBS.
The investigation against him began in March, however, when Google’s ‘hashing’ technology detected images that Gonzales had added to its photo-sharing site Picasa.
The web giant then alerted the National Center for Missing and Exploited Children, which discovered more images uploaded by Gonzales to Tumblr.
The FBI then took over the investigation. Disturbingly, CBS also reported that the agency found pictures of a 9-year-old child who is close to the family.
The station also said that Gonzales had admitted to sexually assaulting this child.
Google’s servers are able to scan images uploaded online, and its algorithms can flag possible examples of child pornography.
Once such an image is flagged, it is examined by a human employee to confirm that the photo depicts abuse and not something more innocent, like a child at bathtime.
Every offending picture can then be tagged with a unique digital fingerprint, which shows up if the image is uploaded again elsewhere online.
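The review-then-fingerprint workflow described above can be sketched in a few lines of Python. This is an illustrative assumption, not Google’s actual system: the function names are invented, and it uses a plain SHA-256 digest, which only catches byte-for-byte copies, whereas the real technology (such as Microsoft’s PhotoDNA, credited later in this article) uses perceptual hashes that survive resizing and re-encoding.

```python
import hashlib

# Hypothetical blocklist of fingerprints confirmed by human reviewers.
known_fingerprints: set[str] = set()

def fingerprint(image_bytes: bytes) -> str:
    # Exact-match fingerprint: a SHA-256 digest of the raw bytes.
    # Real systems use perceptual hashing so altered copies still match;
    # this sketch only matches identical files.
    return hashlib.sha256(image_bytes).hexdigest()

def review_and_tag(image_bytes: bytes, reviewer_confirms_abuse: bool) -> None:
    # Only images a human reviewer has confirmed are fingerprinted,
    # mirroring the human-review step described above.
    if reviewer_confirms_abuse:
        known_fingerprints.add(fingerprint(image_bytes))

def is_known(image_bytes: bytes) -> bool:
    # Any later upload whose fingerprint is on the blocklist is flagged.
    return fingerprint(image_bytes) in known_fingerprints
```

Once an image is tagged, every future upload of the same file can be flagged automatically, with no need for a second human review.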
Following UK Prime Minister David Cameron’s recent efforts to tackle child pornography, Google Chief Executive Eric Schmidt wrote an op-ed in the Daily Mail on the issue.
In that article, Schmidt explained the ways in which his company was using technology to take on the disturbing problem:
Cleaning up search:
We’ve fine-tuned Google Search to prevent links to child sexual abuse material from appearing in our results.
While no algorithm is perfect – and Google cannot prevent paedophiles adding new images to the web – these changes have cleaned up the results for over 100,000 queries that might be related to the sexual abuse of kids.
As important, we will soon roll out these changes in more than 150 languages, so the impact will be truly global.
We’re now showing warnings – from both Google and charities – at the top of our search results for more than 13,000 queries.
These alerts make clear that child sexual abuse is illegal and offer advice on where to get help.
Detection and removal:
There’s no quick technical fix when it comes to detecting child sexual abuse imagery.
This is because computers can’t reliably distinguish between innocent pictures of kids at bathtime and genuine abuse. So we always need to have a person review the images.
Once that is done – and we know the pictures are illegal – each image is given a unique digital fingerprint.
This enables our computers to identify those pictures whenever they appear on our systems. And Microsoft deserves a lot of credit for developing and sharing its picture detection technology.
But paedophiles are increasingly filming their crimes. So our engineers at YouTube have created a new technology to identify these videos.
We’re already testing it at Google, and in the new year we hope to make it available to other internet companies and child safety organisations.
There are many organisations working to fight the sexual exploitation of kids online – and we want to ensure they have the best technical support.
So Google plans to second computer engineers to both the Internet Watch Foundation (IWF) here in Britain and the US National Center for Missing and Exploited Children (NCMEC). We also plan to fund internships for other engineers at these organisations.