• WatDabney@sopuli.xyz · ↑134 · 27 days ago

    If you can be effectively censored by the banning of a site flooded with CSAM, that’s very much your problem and nobody else’s.

    • mindbleach@sh.itjust.works · ↑6 ↓52 · 27 days ago

      Nothing made-up is CSAM. That is the entire point of the term “CSAM.”

      It’s like calling a horror movie murder.

      • ryper@lemmy.ca · ↑34 · 27 days ago

        It’s too hard to tell real CSAM from AI-generated CSAM. Safest to treat it all as CSAM.

        • greenskye@lemmy.zip · ↑6 ↓4 · 27 days ago

          I get this and I don’t disagree, but I also hate that AI fully brought back thought crimes as a thing.

          I don’t have a better approach or idea, but I really don’t like that simply drawing a certain arrangement of lines and colors is now a crime. I’ve also seen a lot of positive sentiment at applying this to other forms of porn as well, ones less universally hated.

          Not supporting this use case at all and on balance I think this is the best option we have, but I do think thought crimes as a concept are just as concerning, especially given the current political climate.

        • mindbleach@sh.itjust.works · ↑3 ↓15 · 27 days ago

          You can insist every frame of Bart Simpson’s dick in The Simpsons Movie should be as illegal as photographic evidence of child rape, but that does not make them the same thing. The entire point of the term CSAM is that it’s the actual real evidence of child rape. It is nonsensical to use the term for any other purpose.

          • deranger@sh.itjust.works · ↑12 ↓2 · 27 days ago

            The *entire point* of the term CSAM is that it’s the actual real evidence of child rape.

            You are completely wrong.

            https://rainn.org/get-the-facts-about-csam-child-sexual-abuse-material/what-is-csam/

            “CSAM (“see-sam”) refers to any visual content—photos, videos, livestreams, or AI-generated images—that shows a child being sexually abused or exploited.”

            “Any content that sexualizes or exploits a child for the viewer’s benefit” <- AI goes here.

            • mindbleach@sh.itjust.works · ↑4 ↓13 · 27 days ago

              RAINN has completely lost the plot by conflating the explicit term for Literal Photographic Evidence Of An Event Where A Child Was Raped with made-up bullshit.

              We will inevitably develop some other term like LPEOAEWACWR, and confused idiots will inevitably misuse that to refer to drawings, and it will be the exact same shit I’m complaining about right now.

              • deranger@sh.itjust.works · ↑5 ↓3 · 27 days ago

                Dude, you’re the only one who uses that strict definition. Go nuts with your crusade of prescriptivism, but I’m pretty sure it’s a lost cause.

          • VeganBtw@piefed.social · ↑8 ↓2 · 27 days ago

            Child pornography (CP), also known as child sexual abuse material (CSAM) and by more informal terms such as kiddie porn, is erotic material that involves or depicts persons under the designated age of majority.
            […]
            Laws regarding child pornography generally include sexual images involving prepubescents, pubescent, or post-pubescent minors and *computer-generated images that appear to involve them*.
            (Emphasis mine)

            https://en.wikipedia.org/wiki/Child_pornography

            • mindbleach@sh.itjust.works · ↑2 ↓9 · 27 days ago

              ‘These several things are illegal, including the real thing and several made-up things.’

              Please stop misusing the term that explicitly refers to the real thing.

              ‘No.’

      • ruuster13@lemmy.zip · ↑15 ↓2 · 27 days ago

        The Rape, Abuse, & Incest National Network (RAINN) defines child sexual abuse material (CSAM) as “evidence of child sexual abuse” that “includes both real and synthetic content”.

        Were you too busy fapping to read the article?

          • rainwall@piefed.social · ↑6 ↓1 · 26 days ago

            It used real images of Shrek and the moon to do that. It didn’t “invent” or “imagine” either.

            The child porn it’s generating is based on literal child porn, if not itself just actual child porn.

            • mindbleach@sh.itjust.works · ↑1 ↓5 · 26 days ago

              You think these billion-dollar companies keep hyper-illegal images around, just to train their hideously expensive models to do the things they do not want those models to do?

              Like combining unrelated concepts isn’t the whole fucking point?

              • mcv@lemmy.zip · ↑10 · 26 days ago

                No, I think these billion dollar companies are incredibly sloppy about curating the content they steal to train their systems on.

                • mindbleach@sh.itjust.works · ↑1 ↓1 · 26 days ago

                  True enough - but fortunately, there are approximately zero such images readily available on public websites, for obvious reasons. There is certainly not some well-labeled training set on par with all the images of Shrek.

              • stray@pawb.social · ↑4 ↓1 · 26 days ago

                It literally can’t combine unrelated concepts though. Not too long ago there was the issue where one model (Dall-E?) couldn’t make a picture of a full glass of wine, because every glass of wine it had been trained on was half full; that’s generally how we prefer to photograph wine. It has no concept of “full” the way actual intelligences do, so it couldn’t connect the dots. It had to be trained on actual full glasses of wine to gain the ability to produce them itself.

    • fonix232@fedia.io · ↑11 · 27 days ago

      It is when one side of the political spectrum is “against” it but keeps supporting people who think CSAM is a-okay, while the other side finds it abhorrent regardless of who’s pushing it.

    • andybytes@programming.dev · ↑1 · 27 days ago

      I mean, the capitalists are the ones calling the shots, since the imperial core is no democracy. This is their battle; we are their dildos.

  • bearboiblake@pawb.social · ↑33 · 27 days ago

    inb4 “In a stunning 5-4 decision, the Supreme Court has ruled that AI-generated CSAM is constitutionally protected speech”

    • mindbleach@sh.itjust.works · ↑2 ↓24 · 27 days ago

      There is no such thing as generated CSAM, because the term exists specifically to distinguish anything made-up from photographic evidence of child rape. This term was already developed to stop people from lumping together Simpsons rule 34 with the kind of images you report to the FBI. Please do not make us choose yet another label, which you would also dilute.

      • deranger@sh.itjust.works · ↑14 ↓1 · 27 days ago

        Generating images of a minor can certainly fulfill the definition of CSAM. It’s a child, it’s sexual, it’s abusive, it’s material. It’s CSAM, dude.

        These are the images you report to the FBI. Your narrow definition is not the definition. We don’t need to make a separate term because it still impacts the minor even if it’s fake. I say this as a somewhat annoying prescriptivist pedant.

        • mindbleach@sh.itjust.works · ↑1 ↓16 · 27 days ago

          There cannot be material from the sexual abuse of a child if that sexual abuse did not fucking happen. The term does not mean ‘shit what looks like it could be from the abuse of some child I guess.’ It means state’s evidence of actual crimes.

          • deranger@sh.itjust.works · ↑10 ↓1 · 27 days ago

            CSAM is abusive material of a sexual nature involving a child. Generated or real, both fit this definition.

              • deranger@sh.itjust.works · ↑9 · 27 days ago

                You’re the only one using that definition. There is no stipulation that it’s from something that happened.

                Where is your definition coming from?

                • mindbleach@sh.itjust.works · ↑1 ↓9 · 27 days ago

                  My definition is from what words mean.

                  We need a term to specifically refer to actual photographs of actual child abuse. What the fuck are we supposed to call that, such that schmucks won’t use the same label to refer to drawings?

              • queermunist she/her@lemmy.ml · ↑7 · 27 days ago

                How do you think a child would feel after having a pornographic image generated of them and then published on the internet?

                Looks like sexual abuse to me.

          • Sas@piefed.blahaj.zone · ↑9 · 27 days ago

            It is sexual abuse even by your definition if photos of real children get sexualised by AI and land on xitter. And afaik that is what’s happened. These kids did not consent to have their likeness sexualised.

            • mindbleach@sh.itjust.works · ↑2 ↓14 · 27 days ago

              Nothing done to your likeness is a thing that happened to you.

              Do you people not understand reality is different from fiction?

              • athatet@lemmy.zip · ↑4 ↓1 · 26 days ago

                Please send me pictures of your mom so that I may draw her naked and post it on the internet.

                • mindbleach@sh.itjust.works · ↑1 ↓3 · 26 days ago

                  Threats are a crime, but they’re a different crime than the act itself.

                  Everyone piling on understands that it’s kinda fuckin’ important to distinguish this crime, specifically, because it’s the worst thing imaginable. They just also want to use the same word for shit that did not happen. Both things can be super fucking illegal - but they will never be the same thing.

        • mindbleach@sh.itjust.works · ↑5 ↓20 · 27 days ago

          ‘If you care about child abuse please stop conflating it with cartoons.’

          ‘Pedo.’

          Fuck off.

          • Leraje@piefed.blahaj.zone · ↑9 ↓2 · 26 days ago

            Someone needs to check your hard drive, mate. You’re way, way too invested in splitting this particular hair.

      • ruuster13@lemmy.zip · ↑9 ↓1 · 27 days ago

        The Rape, Abuse, & Incest National Network (RAINN) defines child sexual abuse material (CSAM) as “evidence of child sexual abuse” that “includes both real and synthetic content”.

  • KelvarCherry [They/Them]@lemmy.blahaj.zone · ↑32 · edited · 25 days ago

    Did Covid-19 make everyone lose their minds? This isn’t about corporate folks being cruel or egotistical. This is just a stupid thing to say. Has the world lost the concept of PR??? Genuinely defending 𝕏 in the year 2026… for deepfake porn, including of minors??? From the Fortnite company guy???

    • pulsewidth@lemmy.world · ↑13 · 26 days ago

      Unironically this behaviour is just “pivoting to a run for office as a Republican” vibes nowadays.

      It’s no longer even ‘weird behaviour’ for a US CEO.

    • Echo Dot@feddit.uk · ↑4 · 26 days ago

      For some reason Epic Games just lets Tim Sweeney say the most insane things. If I was a shareholder, I’d want someone to take his phone off him.

    • yata@sh.itjust.works · ↑2 · 21 days ago

      Trump has shown these oligarchs that they don’t have to pretend to not be arrogant oligarchs anymore. They can speak their minds without suffering any kind of repercussion or censure for their insane narcissistic greed.

  • brucethemoose@lemmy.world · ↑28 · 27 days ago

    Imagine where Epic would be if they had just censored Tim Sweeney’s Twitter account.

    It’s like he’s hell-bent on driving people away from Epic. I’m not sure I could be more abrasive if I tried, without losing the plausible deniability of not trying to troll.

  • Fedizen@lemmy.world · ↑25 · 26 days ago

    Absolutely insane take. The reason Grok can generate CP is that it was trained on it. Musk should be arrested just for owning that shit.

    • ShaggySnacks@lemmy.myserv.one · ↑7 · 26 days ago

      We all live in a two tier justice system.

      One tier is for the capital class. Generally, as long as they don’t commit crimes against the government or others in the capital class, these offenders get the slap-on-the-wrist justice system. The government had enough evidence, between witnesses and documentary evidence from the Epstein files, to at least open investigations and charge some of the people involved. The only people arrested and charged were Epstein and Maxwell, and it took a long time before either of them faced any serious consequences for their actions.

      Everyone else gets the go fuck yourself justice system.

  • Ledivin@lemmy.world · ↑26 ↓2 · 27 days ago

    Tim Sweeney vocally supports child porn and deepfake porn? He certainly looks like the type of creeper, so I guess I’m not that surprised.

    I wonder how many times he’s been to Trump and Epstein’s Pedophile Island 🤔