Thursday, September 16, 2021

Apple wants to protect the children...but is that what would happen?

A couple of months ago, Apple announced Expanded Protections for Children*:

Apple is introducing new child safety features in three areas, developed in collaboration with child safety experts. First, new communication tools will enable parents to play a more informed role in helping their children navigate communication online. The Messages app will use on-device machine learning to warn about sensitive content, while keeping private communications unreadable by Apple.

Next, iOS and iPadOS will use new applications of cryptography to help limit the spread of CSAM [Child Sexual Abuse Material] online, while designing for user privacy. CSAM detection will help Apple provide valuable information to law enforcement on collections of CSAM in iCloud Photos.

Finally, updates to Siri and Search provide parents and children expanded information and help if they encounter unsafe situations. Siri and Search will also intervene when users try to search for CSAM-related topics.

These features are coming later this year in updates to iOS 15, iPadOS 15, watchOS 8, and macOS Monterey. [Emphasis added.]

A footnote indicates that the new features will be available in the U.S.

The protective feature that interests me most is the one that intends to limit the spread of CSAM. You can read many different interpretations of the technical ins and outs, but the question I want to think about is not technical: Will it protect children?

People eager to punish those who look at illegal images will be happy. Those with collections above an unspecified "threshold" amount should worry that they will be discovered.

Collections? Yes. Apple cryptography will be looking at images on Apple devices, deciding if they include CSAM, and then deciding if the collection of images includes enough to report to the National Center for Missing and Exploited Children (NCMEC). How many is enough? Your guess is as good as mine because Apple does not say.

To determine whether an image is CSAM, Apple will check whether any of the images on the device match images already in the NCMEC database.

Let me restate that. Apple will identify images that have already been identified by NCMEC as CSAM. Those images might be of children who were abused at the time the image was created. This offers no protection for kids who are currently being abused. If an abuser is recording the abuse and uploading it to the internet for other viewers, those images will not be identified as CSAM because new images are not in the NCMEC database yet.
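
To make that limitation concrete, here is a minimal sketch in Swift of this kind of matching. Everything in it is an assumption for illustration: the known-hash values and the threshold number are placeholders, and an ordinary SHA-256 stands in for the perceptual NeuralHash and cryptographic matching Apple actually describes. The only point is that matching can count what is already in the database and nothing else.

// Sketch only: plain SHA-256 over image bytes stands in for Apple's
// perceptual NeuralHash, the hash values are dummies, and the threshold
// is a placeholder because Apple has not published the real number.

import Foundation
import CryptoKit

// Hypothetical fingerprints of images already catalogued in the NCMEC database.
let knownHashes: Set<String> = [
    "3a7bd3e2360a3d29eea436fcfb7e44c735d117c42d1c1835420b6b9942dd4f1b",
    "9f86d081884c7d659a2feaa0c55ad015a3bf4f1b2b0b822cd15d6c15b0f00a08",
]

// Placeholder: Apple does not say how many matches trigger a report.
let reportingThreshold = 30

// Stand-in for a perceptual hash. The real system is designed to match
// visually similar images, not just byte-identical files.
func imageHash(_ imageData: Data) -> String {
    SHA256.hash(data: imageData).map { String(format: "%02x", $0) }.joined()
}

// Count how many photos in a library match the known database. A brand-new
// image of ongoing abuse has no entry in knownHashes, so it is never counted,
// no matter how many copies are made or shared.
func matchCount(in library: [Data]) -> Int {
    library.map { imageHash($0) }.filter { knownHashes.contains($0) }.count
}

// Reporting is considered only when the count of previously known images
// crosses the threshold.
func exceedsThreshold(_ library: [Data]) -> Bool {
    matchCount(in: library) >= reportingThreshold
}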

If I can figure out that new images are not going to be discovered and reported (yet), so can people who want to distribute child porn. Where will those new images come from? Is Apple inadvertently encouraging the production of new images?

The NCMEC database makes possible arrests of people who look at those images, not of the people who are abusing children and recording the abuse. The distinction is important if you care about protecting children who are being abused. 

Unfortunately for those children, the focus is on arresting the viewers and not the abusers. Arresting, convicting, and punishing people who look at existing images does not protect children who want the abuse to stop.

If you want to be picky about it, the broad label of "CSAM" includes anything that is considered child pornography, and many, if not most, of those images are not of children being abused. A revealing image uploaded by a minor can be distributed to viewers beyond the intended audience. Once that image is noticed by NCMEC, it will forever be tagged as CSAM, even though there was no sexual abuse involved.

Back to the question: How does this protect children?

Since sexting is done with cell phones and millions of kids use them, how many sexting images of underage kids will be sent to NCMEC by mandatory reporters? How many arrests of teens will result? Are those kids protected? Their actions may have been foolish, but should they be criminal?

If we can agree that dumping kids into the criminal justice system for sexting is a bad idea, why is it a good idea to arrest adults for looking at those images? How does that protect children who are being abused?

When someone is arrested for possessing, receiving, or distributing child porn, the images remain available on the internet just as they were before the arrest. How does that protect children who are in the images of actual sexual abuse? Sending the arrested person to prison for looking at illegal images gives law enforcement something to boast about and something for people to feel good about (bad guy goes to prison!) but it protects no one. The arrest and incarceration of viewers have no effect on the child in the image.

The biggest thing to remember is that new images, perhaps of current, ongoing abuse, will not be found via the Apple cryptography exercise. Kids who are being abused get nothing from the theater of child porn arrests. Do not let this news from Apple fool you into thinking that children are protected by its plans to scan devices for CSAM.

People are horrified by the suggestion, but what would happen if it were legal to view child porn? More people would see those images. Many would condemn the idea for that reason alone. The idea that someone could look at the images is so abhorrent that people stop thinking at that point. They insist no one should be able to see these images--but for a child who has been recorded during sexual abuse, a larger audience could be the key to exposing the abuser. As it is now, it would be nearly impossible for a person to come forward to identify children or abusers in the images, because that report would include an admission of committing the crime of looking at child porn. What is seen in child porn stays in child porn.

Parents who discover that pornographic images of their children have been uploaded to the internet have no way to track down those images, no way to ask for the images to be removed from web servers. Simply searching for those images could result in arrest and incarceration. 

Apple may have good intentions of stopping people from looking at child porn, but we must recognize that arresting a viewer is not the same as protecting children. Children are not protected when it is illegal to see images that prove abuse.

Protecting children from sexual abuse is imperative. Arresting people who look at child porn is not protection from abusers. It is sound and fury, signifying nothing in the battle against child sexual abuse.

When we hear that an individual, a company, or an organization wants to protect the children, we must stop to examine their actions and the effect of those actions on children who need protection. Making images illegal when those very images could prove abuse does not protect the children. Driving images of child abuse further underground makes new images more valuable, and that does not protect the children.

Apple might mean well, but they have bought into the idea that punishing people for looking at a certain category of images will protect children. Like so many others, they are promoting the idea that looking at images of abuse is worse than the abuse itself.

Let's keep our priorities straight.


* A note has been added to this article: 

Update as of September 3, 2021: Previously we announced plans for features intended to help protect children from predators who use communication tools to recruit and exploit them and to help limit the spread of Child Sexual Abuse Material. Based on feedback from customers, advocacy groups, researchers, and others, we have decided to take additional time over the coming months to collect input and make improvements before releasing these critically important child safety features.