Our Online Child Abuse Reporting System Is Overwhelmed, Because The Incentives Are Screwed Up & No One Seems To Be Able To Fix Them

from the mismatched-incentives-are-the-root-of-all-problems dept

The system meant to stop online child exploitation is failing — and misaligned incentives are to blame. Unfortunately, today’s political solutions, like KOSA and STOP CSAM, don’t even begin to grapple with any of this. Instead, they prefer to put in place solutions that could make the incentives even worse.

The Stanford Internet Observatory has spent the last few months doing a very deep dive on how the CyberTipline works (and where it struggles). It has released a big and important report detailing its findings. In writing up this post about it, I kept adding more and more, to the point that I finally decided it made sense to split it up into two separate posts to keep things manageable.

This first post covers the higher-level issues: what the system is, why it works the way it does, how the incentive structure of the system is completely messed up (even if it was set up with good intentions), and how that has contributed to the problem. A follow-up post will cover the more specific challenges facing NCMEC itself, law enforcement, and the internet platforms themselves (which often take the blame for CSAM, even though that blame seems extremely misguided).

There is a lot of misinformation out there about the best way to fight and stop the creation and spread of child sexual abuse material (CSAM). It’s unfortunate because it’s a very real and very serious problem. Yet the discussion about it is often so disconnected from reality as to be not just unhelpful, but potentially harmful.

In the US, the system that was set up is the CyberTipline, which is run by NCMEC, the National Center for Missing & Exploited Children. It’s a private nonprofit, but it has a close connection with the US government, which helped create it. At times, there has been some confusion about whether or not NCMEC is a government agent. The entire setup was designed to keep it non-governmental, to avoid any 4th Amendment issues with the information it collects, but courts haven’t always seen it that way, which makes things tricky (even as the 4th Amendment protections at issue are important).

And while the system was designed for the US, it has become a de facto global system, since so many of the companies are US-based, and NCMEC will, when it can, send relevant details to foreign law enforcement as well (though, as the report details, that doesn’t always work well).

The main role CyberTipline has taken on is coordination. It takes in reports of CSAM (mostly, but not entirely, from internet platforms) and then, when relevant, hands off the necessary details to the (hopefully) correct law enforcement agency to handle things.

Companies that host user-generated content have certain legal requirements to report CSAM to the CyberTipline. As we discussed in a recent podcast, this role as a “mandatory reporter” is important in providing useful information to allow law enforcement to step in and actually stop abusive behavior. Because of the “government agent” issue, it would be unconstitutional to require social media platforms to proactively search for and identify CSAM (though many do use tools to do this). However, if they do find some, they must report it.

Unfortunately, the mandatory reporting has also allowed the media and politicians to use the number of reports sent in by social media companies in a misleading manner, suggesting that the mere fact that these companies find CSAM and report it to NCMEC means they’re not doing enough to stop it on their platforms.

This is problematic because it creates a dangerous incentive: it suggests that internet services would be better off not reporting the CSAM they find, since politicians and the media will falsely portray a large number of reports as a sign of a failure by the platforms to take this seriously. The reality is that the failure to take things seriously comes from the small number of platforms (Hi Telegram!) that don’t report CSAM at all.

Some of us on the outside have thought that the real issue was on the receiving end: that NCMEC and law enforcement haven’t been able to take those reports and do enough that is productive with them. It seemed convenient for the media and politicians to just blame social media companies for doing what they’re supposed to do (reporting CSAM), while ignoring that what happens on the back end of the system might be the real problem. That’s why things like Senator Ron Wyden’s Invest in Child Safety Act seemed like a better approach than things like KOSA or the STOP CSAM Act.

That’s because the approach of KOSA/STOP CSAM and some other bills is basically to add liability to social media companies. (These companies already do a ton to prevent CSAM from appearing on their platforms and alert law enforcement via the CyberTipline when they do find stuff.) But that’s useless if those receiving the reports aren’t able to do much with them.

What becomes clear from this report is that while there are absolutely failures on the law enforcement side, some of that is effectively baked into the incentive structure of the system.

In short, the report shows that the CyberTipline is very helpful in engaging law enforcement to stop some child sexual abuse, but it’s not as helpful as it might otherwise be:

Estimates of how many CyberTipline reports lead to arrests in the U.S. range from 5% to 7.6%

This number may sound low, but I’ve been told it’s not as bad as it sounds. First of all, when a large number of the reports are for content that is overseas and not in the US, it’s more difficult for law enforcement here to do much about it (though, again, the report details some suggestions on how to improve this). Second, some of the content may be very old, where the victim was identified years (or even decades) ago, and where there’s less that law enforcement can do today. Third, there is a question of prioritization, with it being a higher priority to target those directly abusing children. But, still, as the report notes, almost everyone thinks that the arrest number could go higher if there were more resources in place:

Empirically, it is unknown what percent of reports, if fully investigated, would lead to the discovery of a person conducting hands-on abuse of a child. On the one hand, as an employee of a U.S. federal department said, “Not all tips need to lead to prosecution […] it’s like a 911 system.” On the other hand, there is a sense from our respondents—who hold a wide array of beliefs about law enforcement—that this number should be higher. There is a perception that more than 5% of reports, if fully investigated, would lead to the discovery of hands-on abuse.

The report definitely suggests that if NCMEC had more resources dedicated to the CyberTipline, it could be more effective:

NCMEC has faced challenges in rapidly implementing technological improvements that would aid law enforcement in triage. NCMEC faces resource constraints that impact salaries, leading to difficulties in retaining personnel who are often poached by industry trust and safety teams.

There appear to be opportunities to enrich CyberTipline reports with external data that could help law enforcement more accurately triage tips, but NCMEC lacks sufficient technical staff to implement these infrastructure improvements in a timely manner. Data privacy concerns also affect the speed of this work.

But, before we get into the specific areas where things can be improved in the follow-up post, I thought it was important to highlight how the incentives of this system contribute to the problem, where there isn’t necessarily an easy solution.

While companies (Meta, mainly, since it accounts for, by a very wide margin, the largest number of reports to the CyberTipline) keep getting blamed for failing to stop CSAM because of their large number of reports, most companies have very strong incentives to report anything they find. This is because the cost of not reporting something they should have reported is massive (criminal penalties), whereas the cost of over-reporting is nothing to the companies. That means there’s an issue with over-reporting.

Of course, there is a real cost here. CyberTipline employees get overwhelmed, and that can mean that reports that should get prioritized and passed on to law enforcement don’t. So you can argue that while the cost of over-reporting is “nothing” to the companies, the cost to victims and society at large can be quite high.

That’s an important mismatch.

But the broken incentives go further as well. When NCMEC hands off reports to law enforcement, they often go through a local ICAC (Internet Crimes Against Children) task force, which helps triage them and find the right state or local law enforcement agency to handle each report. Law enforcement agencies that are “affiliated” with an ICAC receive special training on how to handle reports from the CyberTipline. But, apparently, at least some of them feel that it’s just too much work, or (in some cases) too burdensome to investigate. As a result, some law enforcement agencies are choosing not to affiliate with their local ICACs to avoid the added work, and, even worse, some that were affiliated have since “unaffiliated” themselves from the local ICAC because they just don’t want to deal with it.

In some cases, there are even reports of law enforcement unaffiliating with an ICAC out of a fear of facing liability for not investigating an abused child quickly enough.

A former Task Force officer described the barriers to training more local Task Force affiliates. In some cases local law enforcement perceive that becoming a Task Force affiliate is expensive, but in fact the training is free. In other cases local law enforcement are hesitant to become a Task Force affiliate because they will be sent CyberTipline reports to investigate, and they may already feel like they have enough on their plate. Still other Task Force affiliates may choose to unaffiliate, perceiving that the CyberTipline reports they were previously investigating will still get investigated at the Task Force, which further burdens the Task Force. Unaffiliating may also reduce fear of liability for failing to promptly investigate a report that would have led to the discovery of a child actively being abused, but the alternative is that the report may never be investigated at all.

[….]

This liability fear stems from a case where six months lapsed between the regional Task Force receiving NCMEC’s report and the city’s police department arresting a suspect (the abused children’s foster parent). In the interim, neither of the law enforcement agencies notified child protective services about the abuse as required by state law. The resulting lawsuit against the two police departments and the state was settled for $10.5 million. Rather than face expensive liability for failing to prioritize CyberTipline reports ahead of all other open cases, even homicide or missing children, the agency might instead opt to unaffiliate from the ICAC Task Force.

This is… infuriating. Cops choosing not to affiliate (i.e., get the necessary training to help), or removing themselves from an ICAC task force, because they’re afraid they might get sued if they fail to save kids from abuse quickly enough, is ridiculous. It’s yet another example of cops running away rather than doing the job they’re supposed to be doing, a job they claim they have no obligation to do.

That’s just one problem of many in the report, which we’ll get into in the second post. But, on the whole, it seems pretty clear that with the incentives this far out of whack, bills like KOSA or STOP CSAM aren’t going to be of much help. What’s actually necessary is tackling the underlying issues: the funding, the technology, and (most of all) the incentive structures.

Companies: ncmec


Comments on “Our Online Child Abuse Reporting System Is Overwhelmed, Because The Incentives Are Screwed Up & No One Seems To Be Able To Fix Them”

Anonymous Coward says:

One thing at a time. They’re banning TikTok this year, and they’ve got a lot of useless things planned for this decade, but it’s on the list.
And the kids will have grown up by then, so it may not even be necessary to do anything about it for now.

(Joke aside, I’m sure that even AI won’t be of any help, because of the false positives it will produce, and I can’t blame people who want to quit the task force because they feel useless after seeing so much of it.)

ECA (profile) says:

As was said long ago

So you create another department.
Who gets to enforce things?
Can’t afford to place agents in ALL 50 states, then let the Cops do something?
Cops are busy, chasing speeders and the homeless.
We took most of the minor drug crimes away from them, so what are they doing?

As with many state and fed agencies, WE cut them back and back. They look bored and act lazy, we cut more.

terribly tired (profile) says:

I’m way out of my depth here so this might be a daft question, but if the NCMEC has more or less accidentally become the global default, has anyone floated any plans that would help deliberately facilitate that role globally?

This feels to me like something that should be a US-helmed international non-profit. Losing a few of my eurocents every month to it wouldn’t hurt me in the slightest, but in aggregate it might do a world of good, and for a lot cheaper than with every nation trying to go it alone. Unless I’m missing twenty obvious non-starters, of course. Wouldn’t be the first time.

Anonymous Coward says:

The other problem with these reports that wasn’t mentioned is that most of the reports are false positives. I don’t have links any longer, but there was a Canadian hearing on Pornhub a few years back that found that most of the CSAM that was reported wasn’t in fact CSAM. Similarly, the IWF here in the UK was having issues because 90% of public reports weren’t for CSAM, which was using up their limited resources.

That One Guy (profile) says:

'If we violate the law we get sued, oh the humanity!'

This liability fear stems from a case where six months lapsed between the regional Task Force receiving NCMEC’s report and the city’s police department arresting a suspect (the abused children’s foster parent). In the interim, neither of the law enforcement agencies notified child protective services about the abuse as required by state law.

If cases like that are the cause of police departments refusing to be involved, then that says a lot more about them than they might want to admit. Neither agency reported the problem to CPS for half a year, despite that being required by law, meaning they absolutely deserved to get sued for it. So other departments pointing to that case to justify their fears are all but admitting out loud that they, too, would be indifferent should they find out that one or more kids are in danger.

Anonymous Coward says:

In some cases local law enforcement perceive that becoming a Task Force affiliate is expensive, but in fact the training is free.

The training is, in fact, the least of the issues. The trainee then becomes indispensable and possibly unavailable for other tasks. In short, by training someone, the department then has to follow through and dedicate yet more resources. The money for that doesn’t come out of thin air.

On the other hand, budgets are a matter of allocation. If the department doesn’t state that they need to set aside that amount, it doesn’t get budgeted for. A self-fulfilling cycle.

Anonymous Coward says:

It’s a damn shame that enough people are bad enough at math that shady politicians can somehow get away with blatantly misrepresenting the truth with useless BS “statistics”. Assuming a spherical cow in a vacuum, a company somehow perfectly catching 100% of the attempts via the filter would still be blamed for “not doing enough” simply because they reported that someone tried to post bad stuff.

Without the associated false-positive rate (which is known to be quite high), the number of reports alone is entirely meaningless outside of “how many eyeballs do we need to process them”. A “sudden 10x increase” in reports could just mean someone rolled out a badly tuned filter that catches a lot of innocent stuff.
