"This is AI generating entirely fictional images of entirely fictional children who don’t even exist, so no actual harm is being done to any actual child. AI image-generation tools can only produce images based on the data fed into them. Ergo, unless these images were generated by feeding the system actual child pornography, it seems both legally and ethically questionable to claim that the AI has produced actual criminal child pornography. There is no victim, there are no kids in illegal situations, and there is no way to verify the age of the nonexistent “kids” in the images… so how, exactly, can this legally be considered child pornography?

Take any real child-porn image in existence, and law enforcement can often point to it and say “that kid’s name is X and he/she is X years old.” When they can’t, it’s only because they don’t know the identities of the kids in the images, who are still very much real kids. Here, since there are no actual kids, there is no possible way to identify any.

To abuse AI the way this man did is certainly reprehensible, to say the least (though not remotely unexpected), but I still don’t like the precedent this will set or how easily it could be abused by law enforcement in other kinds of prosecutions. As previously stated, this man indubitably tried to groom a kid and belongs in prison for that alone; I just don’t feel comfortable prosecuting him for the images, given the circumstances."