Saturday, June 22, 2013

eradicating child porn images

The headline says, "Google builds new system to eradicate child porn images from the web," but what does it really mean?
The new database, which is expected to be operational within a year, will allow child porn images which have already been “flagged” by child protection organisations such as the Internet Watch Foundation (IWF) to be wiped from the web in one fell swoop.
Further down, the article says something different:
Scott Rubin, Google’s spokesman, said: “We are creating an industry-wide global database of ‘hashed’ images to help all technology companies find these images, wherever they might be. 
“They will then be blocked and reported.”
And at the end,
“Recently, we have started working to incorporate these fingerprints [codes that identify illegal images] into a cross-industry database. This will enable companies, law enforcement, and charities to better collaborate on detecting and removing child abuse images.” 
They won't be eradicating child porn images in "one fell swoop," they will be making it easier for Google and other companies to block access to illegal images. They will also be taking on the task of reporting illegal images to law enforcement.
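In practice, "hashed images" means each flagged file is reduced to a short fingerprint that can be compared against a shared database without redistributing the image itself. Here is a minimal sketch of the idea in Python; the blocklist value and function names are hypothetical, and real systems use robust perceptual hashes (such as Microsoft's PhotoDNA) that can match resized or re-encoded copies, whereas a plain cryptographic hash only matches byte-identical files:

```python
import hashlib

# Hypothetical shared database of fingerprints of known flagged images.
# The example value is the one quoted in the comments below.
BLOCKLIST = {
    "2fcab58712467eab4004583eb8fb7f89",
}

def fingerprint(data: bytes) -> str:
    """Return an MD5 hex digest, matching the 32-character example format.

    Note: MD5 is used here only to mirror the example in the comments;
    it matches exact byte copies, not visually similar images.
    """
    return hashlib.md5(data).hexdigest()

def is_flagged(data: bytes) -> bool:
    """True if the image's fingerprint appears in the shared database."""
    return fingerprint(data) in BLOCKLIST
```

A participating service can then refuse any upload whose fingerprint matches, and file a report, without ever storing or viewing the image.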


Eradicating child porn is an interesting idea--knees jerk everywhere in favor of it--but don't forget the law of unintended consequences. What else could happen when child porn is driven further underground?

Seeing the images is the best way to identify the people responsible for the abuse recorded in them. If we can magically make these images disappear, are we eliminating the chance that someone would recognize the people in the images? 

And don't kid yourself--the images wouldn't go away. They may not be easily available anymore, but they will be out there. Google doesn't index every website--many websites are coded so that Google cannot "see" their content. 

Making child porn images harder to find--will that increase the value of new images? There will be new images; that's about the only certainty in this plan.

Destroying pictures of a fire does nothing to stop arson. 

UPDATE: Response to gleakk's comment here.

4 comments:

sd said...

Some key points I think people should know:
* This is just a publicity stunt by a company that makes billions.
* "Hashed images" is another way of saying that an image has a unique alphanumeric value (e.g. 2fcab58712467eab4004583eb8fb7f89).
* Blocking images and reporting them to law enforcement is already the current way of "eradicating" these images.
* People will take images from the "dark side" of the Internet (hidden websites/services) and bring them into the "clearnet" (readily accessible websites on the Internet).
* It will be a never-ending cycle of delete this, upload that.
* Money is being wasted on hunting down people who have pictures, regardless of when those pictures were taken; the images will continue to resurface whether or not the suspect is put in prison.

gleakk said...

I'm not sure I understand--are you saying this is a BAD thing? I disagree with a lot of your opinions, but I can usually see the logic behind them; not in this case, though. Firstly, the arson analogy doesn't work: looking at pictures of a fire isn't a crime, while looking at pictures of child porn is. Also, your comment about increasing the value of images seems to fly in the face of your strong assertions that the principles of supply and demand don't apply to child porn. This measure isn't a cure-all by any means, but I can't see how it could possibly hurt the situation.

Unknown said...

@sd--

It is extremely uncommon to find a website that is hosting child pornography. It all comes from P2P networks like LimeWire (which used to be big for sharing MP3s--oh, how the times have changed). If a website is hosting child porn, it will get shut down immediately, as there are international laws prohibiting the publication/posting of child porn. So there will be no "darkside" and "clearnet" areas of the internet.

The majority of law enforcement action on child pornography comes from monitoring these P2P networks and arresting those who are downloading or uploading the images (based on username, IP address, etc.). It is very rare that LE is 'alerted' to child porn images on the internet; the 'alerts' usually come from a family member or someone else who notices the images on a computer (think Geek Squad or any other computer tech support company).

You are right; the images may resurface, but won't attempting to eradicate these images from the internet be better than no attempt at all?

Anonymous said...

Tom, if you just read the news that law enforcement puts out, then your knowledge on the subject is limited. There is in fact another place where child pornography is hosted which cannot be shut down or traced back to any users. These websites--the "darkside" of the World Wide Web that sd was referring to--are known as hidden services or hidden websites, and are available only through Tor (The Onion Router). People trawling there will save pictures and post them on "clearnet" websites, meaning websites that are not hidden (e.g. YouTube, Facebook, 4chan). The hidden websites/services found on Tor have domain names along the lines of kdniiwoqoeitlasliwiru.onion . For more information, do some research on the "Hidden Wiki". I trust that your knowledge will be better suited to these discussions next time!

The same goes for P2P: people download pictures from FrostWire, Kazaa, Ares, GigaTribe, etc. and post them on public websites (e.g. 4chan.org).