Child sexual abuse content growing online with AI-made images, report says
Some children and families were extorted for financial gain by predators using AI-made CSAM, according to the NCMEC.
The center received 4,700 reports of images or videos of the sexual exploitation of children made by generative AI, a category it only started tracking in 2023, a spokesperson said.
“The NCMEC is deeply concerned about this quickly growing trend, as bad actors can use artificial intelligence to create deepfaked sexually explicit images or videos based on any photograph of a real child or generate CSAM depicting computer-generated children engaged in graphic sexual acts,” the NCMEC report states.
“For the children seen in deepfakes and their families, it is devastating.”
AI-generated child abuse content also impedes the identification of real child victims, according to the organization.
Creating such material is illegal in the United States, as making any visual depictions of minors engaging in sexually explicit conduct is a federal crime, according to a Massachusetts-based prosecutor from the Department of Justice, who spoke on the condition of anonymity.
In total, the CyberTipline received more than 35.9m reports in 2023 referring to incidents of suspected CSAM, more than 90% of it uploaded outside the US. Roughly 1.1m reports were referred to police in the US, and 63,892 reports were urgent or involved a child in imminent danger, according to Tuesday’s report.
There were 186,000 reports of online enticement, up 300% from 2022. Enticement is a form of exploitation in which an individual communicates online with someone believed to be a child, intending to commit a sexual offense or abduction.