• rainwall@piefed.social · 3 months ago

    It used real images of Shrek and the moon to do that. It didn’t “invent” or “imagine” either.

    The child porn it’s generating is based on literal child porn, if it isn’t just actual child porn itself.

  • mindbleach@sh.itjust.works · 3 months ago

      You think these billion-dollar companies keep hyper-illegal images around, just to train their hideously expensive models to do the things they do not want those models to do?

      Like combining unrelated concepts isn’t the whole fucking point?

    • mcv@lemmy.zip · 3 months ago

        No, I think these billion-dollar companies are incredibly sloppy about curating the content they steal to train their systems on.

      • mindbleach@sh.itjust.works · 3 months ago

          True enough, but fortunately there are approximately zero such images readily available on public websites, for obvious reasons. There’s certainly no well-labeled training set on par with all the images of Shrek.

    • stray@pawb.social · 3 months ago

        It literally can’t combine unrelated concepts, though. Not too long ago there was the issue where one model (DALL-E?) couldn’t make a picture of a full glass of wine, because every glass of wine it had been trained on was half full; that’s generally how we prefer to photograph wine. It has no concept of “full” the way actual intelligences do, so it couldn’t connect the dots. It had to be trained on actual full glasses of wine to gain the ability to produce them itself.

        • stray@pawb.social · 3 months ago

            I’m saying it can’t combine clothed children and naked adults to make naked children. It doesn’t know what “naked” means. It can’t imagine what something might look like. It can only make naked children if it has been trained on them directly.