OpenAI Reports 80-Fold Rise in Child Exploitation Cases This Year

OpenAI reported 80 times as many incidents of child exploitation to the National Center for Missing & Exploited Children (NCMEC) in the first half of 2025 as in the same period of 2024, according to a recent company update. The NCMEC’s CyberTipline serves as a Congressionally authorized hub for reporting child sexual abuse material (CSAM) and other forms of child exploitation.
Companies are legally required to report signs of child exploitation to the CyberTipline. Upon receiving a report, NCMEC reviews it and forwards it to the relevant law enforcement agency for investigation.
NCMEC report statistics can be difficult to interpret. A rise in reports may reflect changes in a platform’s automated moderation or in its criteria for deciding when a report is warranted, rather than an actual increase in harmful activity.
Moreover, the same content can lead to multiple reports, and a single report may cover numerous pieces of content. Some platforms, including OpenAI, provide data on both the number of reports and the total pieces of content involved to give a clearer picture.
OpenAI representative Gaby Raila said the company made investments in late 2024 “to enhance [its] ability to review and act on reports to keep pace with current and future user growth.” Raila noted that this period coincides with “the introduction of additional product interfaces that permitted image uploads and the increasing popularity of our services, leading to more reports.” In August, Nick Turley, vice president and head of ChatGPT, said the app had four times as many weekly active users as it did a year earlier.
In the first half of 2025, the number of CyberTipline reports OpenAI sent was roughly equal to the number of pieces of content reported: 75,027 reports covering 74,559 pieces. In the first half of 2024, OpenAI sent 947 reports concerning 3,252 pieces of content, meaning both figures rose sharply between the two periods.
In this context, content can take several forms. OpenAI has stated that it reports all instances of CSAM, including uploads and requests, to NCMEC. Beyond its ChatGPT app, which allows users to upload files, including images, and generates text and images, OpenAI also provides API access to its models. The latest reporting figures do not include any incidents related to the video-generation app Sora, since its September release falls outside the covered time frame.
The increase in reports mirrors a broader trend NCMEC has seen at the CyberTipline alongside the rise of generative AI. An analysis of CyberTipline data found that reports involving generative AI surged 1,325 percent from 2023 to 2024. NCMEC has yet to release data for 2025, and while other major AI companies, such as Google, publish statistics on their NCMEC reports, they do not specify what share of those reports involve AI.
