• Randomgal@lemmy.ca · 9 points · 5 days ago

      Yeah, NGL, it's kind of weird that all the noise is around the machine, which can't be held accountable because it is a machine, and not around the people actually making the images.

      This is like blaming the camera for recording something illegal.

      Maybe they should look at the hand holding the tool if they want accountability.

    • Cruel@programming.dev · 3 points · 5 days ago

      Most AI platforms allow sexualized content to varying degrees. Google, Instagram, TikTok, etc. all host CSAM, and always have. The understanding has been that they're not liable as long as they remove it when it's reported. Their detection technology is good enough to handle most of it automatically, but it's never perfect. They keep records of where content originates and comply with subpoenas, which has gotten tons of people convicted.
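      For context on how that automated detection typically works, here's a minimal, hypothetical sketch of hash-list matching. Real systems use perceptual hashes (e.g. Microsoft's PhotoDNA) rather than plain SHA-256, and the hash lists come from clearinghouses like NCMEC under strict access controls, so this only illustrates the shape of the pipeline, not an actual implementation:

      ```python
      import hashlib

      # Placeholder hash list (hypothetical). Real lists hold perceptual
      # hashes distributed by clearinghouses, not SHA-256 digests.
      KNOWN_BAD_HASHES = {
          "0" * 64,  # dummy entry for illustration only
      }

      def handle_upload(file_bytes: bytes, uploader_id: str) -> str:
          # Exact-hash matching only catches byte-identical copies;
          # perceptual hashing tolerates resizing/re-encoding, which is
          # why detection is "good but never perfect."
          digest = hashlib.sha256(file_bytes).hexdigest()
          if digest in KNOWN_BAD_HASHES:
              # Platforms block the upload and log provenance; those logs
              # are what later subpoena requests pull.
              return f"blocked; uploader {uploader_id} logged for report"
          return "allowed"
      ```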

      Grok's image generation was put behind a paywall, which some people claim makes things worse. But most paying users lose their anonymity, so they can be appropriately dealt with when they request illicit content, even if Grok refuses the request, as it usually does.

      I think the Grok issue is sensationalized and divorced from the realities of what happens online and in law enforcement.