Outside the Box: Stop with the misleading outrage over Apple’s move to identify child sex photos

On Aug. 5, Apple announced new technology to help protect children online. The software includes an automatic process to detect known images of child sexual abuse material (CSAM) on iPhones in the U.S. It also adds protections that warn children who attempt to send or receive sexually explicit photos.

Apple’s (AAPL) announcement was an important breakthrough in the urgent need to protect the most vulnerable members of society: our children. As a shareholder and a parent, I applaud this announcement. I encourage the rest of the tech industry to follow suit in adopting this or similar technology. It might have kept my own underage daughter from being groomed online through Facebook (FB) and later abducted and forced into sex slavery.

Improvements in technology, the growing use of mobile devices by kids, and an increase in online usage during the COVID pandemic have exponentially increased the risk to children from sexual predators. In 2020 there were 21.4 million reports of CSAM, containing 65.4 million images and videos. Many of these involve children 10 years old or younger, and many more incidents go unreported. According to Thorn, 75% of children trafficked or sold for sex are advertised online.

You would think Apple’s announcement would be unarguably good and welcomed by everyone, regardless of their political views. Instead, it unleashed a furious backlash in the tech community. Edward Snowden and others, for example, have suggested that Apple is creating a “backdoor” that will erode privacy and enable bad actors to access personal messages.

Read: Critics warn Apple plan to scan iPhones for images of child sexual abuse could be misused

How the technology works

This backlash is misleading and completely misses the point. The new technology is not a backdoor, and it does not give Apple access to users’ private conversations. Rather, it checks whether an image’s numerical fingerprint (its “hash”) matches any of the hashes of known CSAM held by the National Center for Missing and Exploited Children (NCMEC), a national not-for-profit and the U.S. clearinghouse for CSAM.

Only if there are multiple “hits,” or matches with known CSAM, can Apple see and verify the images and forward them to child-safety organizations. This methodology threads the needle between protecting privacy and protecting children.
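To make the mechanism concrete, here is a minimal sketch in Python of threshold-based hash matching. Everything in it is an illustrative assumption rather than Apple’s actual design: SHA-256 stands in for Apple’s NeuralHash (a perceptual hash), the known-hash list and the threshold value are made up, and the real system layers cryptographic protections such as private set intersection on top.

```python
import hashlib

# Hypothetical stand-in for NCMEC's database of known-CSAM hashes.
KNOWN_HASHES = {
    hashlib.sha256(b"known illegal image bytes").hexdigest(),
}

# Assumed threshold: human review happens only after multiple matches,
# so a single false positive reveals nothing on its own.
MATCH_THRESHOLD = 30

def image_hash(image_bytes: bytes) -> str:
    """Return a hex fingerprint of the image (stand-in for NeuralHash)."""
    return hashlib.sha256(image_bytes).hexdigest()

def count_matches(images: list[bytes]) -> int:
    """Count how many images match the known-hash list."""
    return sum(1 for img in images if image_hash(img) in KNOWN_HASHES)

def should_escalate(images: list[bytes]) -> bool:
    """Escalate to human review only past the match threshold."""
    return count_matches(images) >= MATCH_THRESHOLD

photos = [b"vacation photo", b"birthday photo"]
print(count_matches(photos))    # 0: neither photo is in the known list
print(should_escalate(photos))  # False: far below the threshold
```

The design point the sketch illustrates is the threshold itself: no single match triggers human review, which is how the system limits the fallout from any one false positive.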

Tech companies already provide automatic features that detect suspicious activity, protecting devices and encrypted messages from viruses and filtering out spam email. We don’t complain when tech companies warn us of malware on our computers, or when banks detect misuse of our credit cards to prevent theft. By taking steps to identify and detect horrendous child sexual abuse, the tech industry can demonstrate how it is keeping our children safe and potentially avert more extreme legal and regulatory action from concerned governments around the world.

In October 2017, Pope Francis, Catholic leaders and global experts from six continents held a Child Dignity World Congress aimed at fighting online child sexual abuse. UNICEF presented the results of a joint study with the London School of Economics that found that between 32% and 68% of children had viewed sexual images online. Pope Francis urged investors to join the fight against child exploitation online.

In 2018, I took up the call along with other investors, including Christian Brothers Investment Services, Proxy Impact, the Maryknoll Sisters, the Benedictine Sisters of Virginia, and the Sisters of St. Dominic of Caldwell, N.J.

We filed a shareholder resolution at Verizon (VZ) to combat online child sexual exploitation. Institutional Shareholder Services, the largest proxy-voting advisory firm in the world, recommended supporting our proposal. We won 34% of the vote, representing more than 940 million shares worth over $53 billion.

Similar resolutions have been filed at Facebook and Sprint/T-Mobile (TMUS), and shareholders have engaged in dialogues with Google parent Alphabet (GOOGL, GOOG), Apple and AT&T.

What Facebook is doing

Facebook, the world’s largest social-media company, moved in the opposite direction. In 2019, Mark Zuckerberg published “A Privacy-Focused Vision for Social Networking” on Facebook, which discussed shifting Facebook and Instagram messaging to end-to-end encryption. This unleashed a wave of opposition from governments and an open letter from 129 child-protection NGOs.

Facebook was the source of more than 20.3 million reports of CSAM, accounting for 94% of the cases reported in the U.S. in 2020 alone. NCMEC estimated that Facebook’s plan to apply end-to-end encryption to its platforms could effectively hide 70% of the CSAM cases currently being detected and reported, and could enable predators to identify and connect with children they otherwise would never have met.

No one wants more regulation, but that will be the result if things don’t change. In December 2019, the Senate Judiciary Committee held a hearing on encryption and public safety that included representatives from both Facebook and Apple. Child sexual abuse was repeatedly cited as an example of the harms stemming from end-to-end encrypted communication, and committee members from both parties threatened legislative action.

On March 5, 2020, Senators Lindsey Graham, Richard Blumenthal, Josh Hawley and Dianne Feinstein introduced the Eliminating Abusive and Rampant Neglect of Interactive Technologies (EARN IT) Act, which would create incentives for companies to “earn” liability protection for violations of laws related to online CSAM.

In June 2020, the Technology Coalition, a not-for-profit backed by leading tech companies to help combat online child sexual abuse, launched Project Protect to drive real, lasting change for children’s digital safety, including addressing “self-generated indecent imagery featuring youth.” This is just one of the issues Apple’s announced changes address.

We should all applaud Apple, the Technology Coalition and Project Protect. But this initiative didn’t happen by itself. All of us with a stake in protecting our children, whether not-for-profits, businesses, parents or investors, must continue to raise our concerns and press the tech industry to lead the way in solving the problems its platforms now enable. On July 27, Facebook announced partnerships to address the sensitive topic of age verification for children.

Further collaboration on technical solutions, plus a prohibition on end-to-end encryption of communications for children under 13, could put us on a path to a more balanced, self-regulated internet. Heinous predators will be checked, and you and I will be able to carry on our private business undisturbed by regulators.

Let’s keep on the right path by developing more solutions that shut down predators and protect our kids while maintaining privacy for adults. I lay out a challenge to the brilliant minds working across the tech industry: help us root out a global scourge and keep our kids safe. We can’t do it alone.

Lisette Cooper is vice chair of Fiduciary Trust International, a wealth management firm that is part of the Franklin Templeton family of companies.

Now read: OnlyFans to ban ‘sexually explicit’ content after pressure from financial partners

Plus: Facebook and others doing ‘not nearly enough’ to stop COVID misinformation, surgeon general says