". . . and having done all . . . stand firm." Eph. 6:13

Commentary

How Pedophiles Are Using Artificial Intelligence to Exploit Kids

February 28, 2023

Artificial intelligence (more commonly known as “AI”) has gained attention and popularity in recent months, particularly since the launch of OpenAI’s ChatGPT chatbot, which captivated the internet with both its impressive abilities and its surprising limitations. Most of the millions of AI users in the U.S. are simply eager to cheat on their homework or avoid writing work emails; some bad actors, however, have discovered how to turn AI technology to far more nefarious ends.

Britain’s National Crime Agency is reviewing how AI technology can contribute to sexual exploitation after the recent arrest of a pedophile computer programmer in Spain shocked the continent. The man was found to have used an AI image generator to create new child sexual abuse material (CSAM) based on abusive images of children he already possessed. The Daily Mail noted that the “depravity of the pictures he created appalled even the most experienced detectives …”

Many AI programs are trained on large volumes of data or content, learning to recognize patterns and sequences and to recreate them in new ways. When pedophiles or sexual abusers get their hands on AI, they can further exploit the victims featured in real images to create new — and even more graphic — content. Though the AI-generated images are not “real” in the sense of being photographs of events that actually transpired, they are nevertheless inherently exploitative of the victims used to train the AI, remnants of whose images may still appear in the new CSAM.

Another application of artificial intelligence that has gained recent notoriety is the “deepfake.” In these unsettling images, audio clips, or videos, AI creates shockingly realistic manipulations of an individual’s likeness or voice in any scenario the creator desires. While deepfakes can be used in a variety of harmful contexts, like depicting a political candidate in a situation that would damage his reputation, sexual predators who weaponize the technology have proven to be particularly vicious.

Last week, discussion of deepfake technology reached a fever pitch after a community of female online content creators discovered that their images had been uploaded online in the form of pornographic deepfakes. The women who had been victimized reported feeling extremely violated and deeply unsettled by the knowledge that this pornographic content had been created and distributed without their consent — and that people who knew them personally had been watching the deepfakes to satisfy their own perversions. Deepfake technology knows few bounds; pedophiles with access to images of children could similarly employ this form of AI to create CSAM.

The normalization of AI-created pornography or child sexual abuse material serves no beneficial purpose in society — and, in fact, can influence cultural mores in profoundly harmful ways. Already, the technological capability to manufacture AI-generated CSAM has emboldened pedophile sympathizers to advocate for pedophilia’s inclusion under the liberal umbrella of sexual orientations.

The Young Democrats, the youth division of the Democratic Party in the Netherlands, recently released a statement claiming not only that pedophilia is “a sexual orientation that one is born with,” but also that the “stigma” surrounding pedophilia causes pedophiles to suffer higher rates of depression and suicidal thoughts. The Dutch Young Democrats advocate against criminalizing hand-drawn or AI-generated child sexual abuse material on the grounds that it “does not increase the risk of child abuse” and could potentially “help pedophiles get to know their feelings without harming other people.”

Pornography of any kind is inherently exploitative — the pornography industry thrives off dubious consent and, often, knowing exploitation of trafficking victims and minors. Using AI technology to create images or videos that constitute pornography or child sexual abuse material perpetuates a chain of abuse even if the new content created is different from abuse that physically occurred.

AI-generated pornography or CSAM cannot escape the grave violations of human dignity inherent in creating exploitative sexual content. Modern nations require laws that appropriately address modern concerns; while the progression of AI technology can, in some ways, certainly benefit society, its capacity to produce exploitative material and to allow the rot of pedophilia to continue festering must be addressed.


