
  • To me, it’s more like the Netherlands giving out free syringes and needles so that drug users at least wouldn’t contract something from used ones.

    To be clear: granting any and all pedophiles access to therapy would be of tremendous help. I think it must be done. But there are two issues remaining:

    1. Barely any government will scrape together enough money to fund such programs now that therapy is astronomically expensive
    2. Even then, plenty of pedophiles will keep consuming CSAM, legally or not. There must be some incentives for them to choose the AI-generated option that is at least less harmful than the alternative.


  • Why though? If it does reduce consumption of real CSAM and/or real life child abuse (which is an “if”, as the stigma around the topic greatly hinders research), it’s a net win.

    Or is it simply a matter of spite?

    Pedophiles don’t choose to be attracted to children, and many struggle to keep those urges at bay. Traditionally, those looking for the least harmful release went for real CSAM, which is obviously extremely harmful in its own right - just a bit less so than going out and raping someone. Now that AI-generated materials are appearing, they may offer the safest of the highly graphic outlets we know of, with the least harm done to children. Without them, many pedophiles will revert to traditional CSAM, increasing the number of victims to cover the demand.

    As with many other things, the best we can hope for here is harm reduction. Hardline policies do not seem to be effective enough, as people continuously find ways to distribute CSAM, and pedophiles continuously find ways to access it without leaving a trace. So we need to think of ways to give them something that will make them choose AI over real materials. This means making AI better, more realistic, and at the same time more diverse. Not for their enjoyment, but to make them switch to something better and safer than what they currently use.

    I know it’s a very uncomfortable kind of discussion, but we don’t have a magic pill to eliminate it all, and so we must act reasonably to prevent what we can prevent.



  • That would be true if children were abused specifically to obtain the training data. But what I’m talking about is using data that already exists, taken from police investigations and other sources. Of course, it also requires the victims’ consent (once they are old enough), as not everyone will agree to have materials of their abuse proliferate in any way.

    Police have already used CSAM, with victims’ consent, to better impersonate CSAM platform admins in investigative operations, leading to arrests of more child abusers and of those sharing the materials. While controversial, this was a net benefit, as it reduced the number of avenues for sharing CSAM and the number of people able to do so.

    The case with AI is milder, as it requires minimal human interaction, so no one would need to re-watch the materials as long as the victims are already identified. It would be enough for the police to contact the victims, obtain their consent, and feed the data into the AI without releasing the source. With enough data, AI could improve image and video generation, driving more viewers away from real CSAM and reducing rates of abuse.

    That is, if it works this way. There is a glaring research gap in this area, and I believe it is paramount to figure out whether it actually helps. Then we could decide whether to include already-produced CSAM in the training data, or whether adult data is sufficient to make the output good enough for the intended audience to switch.



  • I feel like our relationship to it is also quite messed up.

    AI doesn’t actually undress people; it just draws a naked body. It’s an artistic representation, not an X-ray. You’re not getting actual nudes in this process, and the AI has no clue what the person looks like naked.

    Now, such images can be used to blackmail people because, again, our culture hasn’t quite caught up with the fact that any nude image can be an AI-generated fake. When it does, however, I fully expect the creators of such things to be seen as odd creeps spreading their fantasies around, and any nude imagery to be treated as fake by default.