
An Alaskan man reported someone for AI CSAM and was then arrested for the same thing


If you're going to call the police to report someone for their interest in child sexual abuse material (CSAM), it's probably not the best idea to have the same material on your own devices, then hand those devices over so investigators can gather more information. But that's allegedly what a man in Alaska did, and it landed him in police custody.

404 Media reports that earlier this week a man named Anthony O’Connor was arrested after a police search of his devices allegedly revealed AI-generated child sexual abuse material (CSAM).

From 404 Media:

According to newly filed charging documents, Anthony O’Connor contacted law enforcement in August to alert them about an unidentified pilot who had shared child sexual abuse material (CSAM) with O’Connor. While investigating the crime, and with O’Connor’s consent, federal authorities searched his phone for additional information. A review of the electronics revealed that O’Connor allegedly offered to make virtual reality CSAM for the pilot, according to the criminal complaint.

According to police, the unidentified pilot shared an image of a child he had taken at a grocery store with O’Connor, and the two discussed how they could put the minor into an explicit virtual reality world.

Law enforcement says it found at least six explicit, AI-generated CSAM images on O’Connor’s devices, which he said he had intentionally downloaded, along with several “real” ones that had been unknowingly mixed in. A search of O’Connor’s home also turned up a computer and multiple hard drives hidden in a vent of the house. A review of the computer allegedly uncovered a 41-second video of the rape of a child.

In an interview with authorities, O’Connor said he regularly reported CSAM to internet service providers “but still enjoyed the images and videos sexually.” It’s unclear why he decided to report the pilot to law enforcement. Maybe he had a conscience, or maybe he genuinely believed his AI CSAM didn’t break the law.

AI image generators are typically trained on real photos; in other words, AI-generated pictures of children are fundamentally based on images of real children, and there is no way to fully separate the two.

The first known arrest for possession of AI-generated CSAM came back in May, when the FBI arrested a man for using Stable Diffusion to create “thousands of realistic images of prepubescent minors.”

Proponents of AI will say it has always been possible to create explicit images of minors with Photoshop, but AI tools make it exponentially easier for anyone to do so. A recent report found that one in six congresswomen have been targeted by AI-generated deepfake porn. Many products include guardrails to prevent the worst uses, in the same way that printers refuse to duplicate currency. Putting up barriers prevents at least some of this behavior.



