• WatDabney@sopuli.xyz · ↑134 · 3 months ago

    If you can be effectively censored by the banning of a site flooded with CSAM, that’s very much your problem and nobody else’s.

    • mindbleach@sh.itjust.works · ↑6 ↓52 · 3 months ago

      Nothing made-up is CSAM. That is the entire point of the term “CSAM.”

      It’s like calling a horror movie murder.

      • ryper@lemmy.ca · ↑34 · 3 months ago

        It’s too hard to tell real CSAM from AI-generated CSAM. Safest to treat it all as CSAM.

        • greenskye@lemmy.zip · ↑6 ↓4 · 3 months ago

          I get this and I don’t disagree, but I also hate that AI fully brought back thought crimes as a thing.

          I don’t have a better approach or idea, but I really don’t like that simply drawing a certain arrangement of lines and colors is now a crime. I’ve also seen a lot of positive sentiment at applying this to other forms of porn as well, ones less universally hated.

          I’m not supporting this use case at all, and on balance I think this is the best option we have, but I do think thought crimes as a concept are just as concerning, especially given the current political climate.

        • mindbleach@sh.itjust.works · ↑3 ↓15 · 3 months ago

          You can insist every frame of Bart Simpson’s dick in The Simpsons Movie should be as illegal as photographic evidence of child rape, but that does not make them the same thing. The entire point of the term CSAM is that it’s the actual real evidence of child rape. It is nonsensical to use the term for any other purpose.

          • deranger@sh.itjust.works · ↑12 ↓2 · 3 months ago

            > The *entire point* of the term CSAM is that it’s the actual real evidence of child rape.

            You are completely wrong.

            https://rainn.org/get-the-facts-about-csam-child-sexual-abuse-material/what-is-csam/

            “CSAM (“see-sam”) refers to any visual content—photos, videos, livestreams, or AI-generated images—that shows a child being sexually abused or exploited.”

            “Any content that sexualizes or exploits a child for the viewer’s benefit” <- AI goes here.

            • mindbleach@sh.itjust.works · ↑4 ↓13 · 3 months ago

              RAINN has completely lost the plot by conflating the explicit term for Literal Photographic Evidence Of An Event Where A Child Was Raped with made-up bullshit.

              We will inevitably develop some other term like LPEOAEWACWR, and confused idiots will inevitably misuse that to refer to drawings, and it will be the exact same shit I’m complaining about right now.

              • deranger@sh.itjust.works · ↑5 ↓3 · 3 months ago

                Dude, you’re the only one who uses that strict definition. Go nuts with your crusade of prescriptivism, but I’m pretty sure it’s a lost cause.

          • VeganBtw@piefed.social · ↑8 ↓2 · 3 months ago

            > Child pornography (CP), also known as child sexual abuse material (CSAM) and by more informal terms such as kiddie porn, is erotic material that involves or depicts persons under the designated age of majority.
            > […]
            > Laws regarding child pornography generally include sexual images involving prepubescents, pubescent, or post-pubescent minors *and computer-generated images that appear to involve them*.

            (Emphasis mine)

            https://en.wikipedia.org/wiki/Child_pornography

            • mindbleach@sh.itjust.works · ↑2 ↓9 · 3 months ago

              ‘These several things are illegal, including the real thing and several made-up things.’

              Please stop misusing the term that explicitly refers to the real thing.

              ‘No.’

      • ruuster13@lemmy.zip · ↑15 ↓2 · 3 months ago

        The Rape, Abuse, & Incest National Network (RAINN) defines child sexual abuse material (CSAM) as “evidence of child sexual abuse” that “includes both real and synthetic content.”

        Were you too busy fapping to read the article?

          • rainwall@piefed.social · ↑6 ↓1 · 3 months ago

            It used real images of Shrek and the moon to do that. It didn’t “invent” or “imagine” either.

            The child porn it’s generating is based on literal child porn, if not itself just actual child porn.

            • mindbleach@sh.itjust.works · ↑1 ↓5 · 3 months ago

              You think these billion-dollar companies keep hyper-illegal images around, just to train their hideously expensive models to do the things they do not want those models to do?

              Like combining unrelated concepts isn’t the whole fucking point?

              • mcv@lemmy.zip · ↑10 · 3 months ago

                No, I think these billion-dollar companies are incredibly sloppy about curating the content they steal to train their systems on.

                • mindbleach@sh.itjust.works · ↑1 ↓1 · 3 months ago

                  True enough, but fortunately there are approximately zero such images readily available on public websites, for obvious reasons. There certainly is not some well-labeled training set on par with all the images of Shrek.

              • stray@pawb.social · ↑4 ↓1 · 3 months ago

                It literally can’t combine unrelated concepts, though. Not too long ago there was the issue where one model (DALL-E?) couldn’t make a picture of a full glass of wine, because every glass of wine it had been trained on was half full; that’s generally how we prefer to photograph wine. It has no concept of “full” the way actual intelligences do, so it couldn’t connect the dots. It had to be trained on actual full glasses of wine to gain the ability to produce them itself.

                  • stray@pawb.social · ↑2 ↓1 · 3 months ago

                    I’m saying it can’t combine clothed children and naked adults to make naked children. It doesn’t know what “naked” means. It can’t imagine what something might look like. It can only make naked children if it has been trained on them directly.