A very NSFW website called Pornpen.ai is churning out an endless stream of graphic, AI-generated porn. We have mixed feelings.

  • Ryantific_theory@lemmy.world · 1 year ago

    Aren’t AI-generated images fairly easy to detect with noise analysis (see the sketch at the end of this comment)? I know there’s no effective detection for AI-generated text, and there will no doubt be projects to train AI to generate perfectly realistic images, but it’ll be a while before models get fingers right, let alone eliminate invisible pixel-level artifacts.

    As a counterpoint, wouldn’t the prevalence of AI-generated CSAM collapse the organized abuse groups, since they rely on funding from pedos? If genuine abuse material is swamped by AI-generated imagery, that would effectively collapse an entire dark-web market. Not that it would end abuse, but it would at least undercut the financial motive, which is progress.

    That’s pretty good for 2023.
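
    The noise analysis mentioned above is usually done in the frequency domain: the upsampling layers in many image generators leave periodic, grid-like peaks in the high-frequency part of the Fourier spectrum that natural photos lack. A minimal sketch of that kind of check, assuming numpy and Pillow (the filename is a placeholder):

    ```python
    import numpy as np
    from PIL import Image

    # Load an image as grayscale and take its 2-D Fourier transform.
    img = np.asarray(Image.open("sample.png").convert("L"), dtype=np.float64)
    spectrum = np.fft.fftshift(np.fft.fft2(img))
    log_mag = np.log1p(np.abs(spectrum))

    # Compare energy in the outer (high-frequency) band against the rest;
    # generated images often show unusually regular peaks out there.
    h, w = log_mag.shape
    yy, xx = np.ogrid[:h, :w]
    radius = np.hypot(yy - h // 2, xx - w // 2)
    high = log_mag[radius > min(h, w) * 0.35].mean()
    low = log_mag[radius <= min(h, w) * 0.35].mean()
    print(f"high- vs. low-frequency log-magnitude: {high:.3f} / {low:.3f}")
    ```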

    • JackbyDev@programming.dev · 1 year ago

      With StableDiffusion you can intentionally embed an “invisible watermark” that machines can easily detect but humans cannot see. The idea is that in the future you don’t accidentally train on images that are already AI generated. I’d hope most sites are doing that, but it can be turned off easily enough. Apart from that I’m not sure.
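
      As a sketch of how that watermark works in practice: the reference Stable Diffusion scripts use the open-source invisible-watermark package (imported as imwatermark) to embed a short byte string in the pixel data via a DWT/DCT transform. The file paths and the “SDV2” payload below are placeholders, but the API calls are the library’s real ones:

      ```python
      import cv2
      from imwatermark import WatermarkEncoder, WatermarkDecoder

      WATERMARK = "SDV2"  # placeholder payload: 4 bytes = 32 bits

      # Embed: imperceptible to the eye, but machine-detectable.
      bgr = cv2.imread("generated.png")
      encoder = WatermarkEncoder()
      encoder.set_watermark("bytes", WATERMARK.encode("utf-8"))
      cv2.imwrite("generated_wm.png", encoder.encode(bgr, "dwtDct"))

      # Detect: a dataset crawler can check for the mark before training on an image.
      decoder = WatermarkDecoder("bytes", 32)
      recovered = decoder.decode(cv2.imread("generated_wm.png"), "dwtDct")
      print(recovered.decode("utf-8", errors="replace"))
      ```

      Because the mark lives in the frequency domain rather than in metadata, it survives re-encoding and mild edits, but as the comment notes, nothing stops a site from simply not embedding it.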

      • Ryantific_theory@lemmy.world · 1 year ago

        I could have sworn I saw an article about AI images having fairly obvious noise artifacts, but now I can’t turn anything up. The watermark should help, but beyond that it looks like the main effort is a training dataset of purely generative AI images (GenImage) for training another AI to detect generated images. I guess we’ll see what happens with that.
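
        For what it’s worth, the GenImage idea boils down to ordinary supervised training: label real photos and generated images, then fine-tune a classifier to tell them apart. A minimal sketch with torch/torchvision, assuming an ImageFolder layout with real/ and ai/ subdirectories (the path and hyperparameters are illustrative):

        ```python
        import torch
        from torch import nn
        from torchvision import datasets, models, transforms

        # Standard preprocessing for an ImageNet-pretrained backbone.
        tfm = transforms.Compose([
            transforms.Resize(256),
            transforms.CenterCrop(224),
            transforms.ToTensor(),
        ])
        data = datasets.ImageFolder("genimage_subset/", transform=tfm)
        loader = torch.utils.data.DataLoader(data, batch_size=32, shuffle=True)

        # Fine-tune a small pretrained CNN with a two-way head: real vs. generated.
        model = models.resnet18(weights=models.ResNet18_Weights.DEFAULT)
        model.fc = nn.Linear(model.fc.in_features, 2)

        opt = torch.optim.Adam(model.parameters(), lr=1e-4)
        loss_fn = nn.CrossEntropyLoss()

        model.train()
        for images, labels in loader:  # one epoch, for illustration
            opt.zero_grad()
            loss = loss_fn(model(images), labels)
            loss.backward()
            opt.step()
        ```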