A chart titled “What Kind of Data Do AI Chatbots Collect?” lists and compares seven AI chatbots—Gemini, Claude, CoPilot, Deepseek, ChatGPT, Perplexity, and Grok—by the types and number of data points they collect as of February 2025. The ten data categories are: Contact Info, Location, Contacts, User Content, History, Identifiers, Diagnostics, Usage Data, Purchases, and Other Data.

  • Gemini: Collects all 10 data types; highest total at 22 data points
  • Claude: Collects 7 types; 13 data points
  • CoPilot: Collects 7 types; 12 data points
  • Deepseek: Collects 6 types; 11 data points
  • ChatGPT: Collects 6 types; 10 data points
  • Perplexity: Collects 6 types; 10 data points
  • Grok: Collects 4 types; 7 data points
    • exothermic@lemmy.world · 5 months ago

      Are there tutorials on how to do this? Should it be set up on a server on my local network??? How hard is it to set up? I have so many questions.

      • Kiuyn@lemmy.ml · 5 months ago (edited)

        I recommend GPT4All if you want to run models locally on your PC. It is super easy.

        If you want to run them on a separate server, Ollama plus some kind of web UI is the best option.

        Ollama can also be run locally, but IMO it takes more learning than a GUI app like GPT4All.
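
        For reference, the Ollama CLI route is only a couple of commands; here is a rough sketch for a Linux machine, with llama3.2 used purely as an example model tag:

        ```bash
        # Install Ollama with its official install script (Linux/macOS)
        curl -fsSL https://ollama.com/install.sh | sh

        # Download an example model, then chat with it in the terminal
        ollama pull llama3.2
        ollama run llama3.2
        ```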

      • Pennomi@lemmy.world · 5 months ago

        Check out Ollama, it’s probably the easiest way to get started these days. It provides tooling and an API that different chat frontends can connect to.
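
        That API is plain HTTP on port 11434 by default, so any frontend (or a quick curl) can talk to it; a minimal sketch, with the model name again just an example:

        ```bash
        # Query a locally served model through Ollama's HTTP API
        curl http://localhost:11434/api/generate -d '{
          "model": "llama3.2",
          "prompt": "Why is the sky blue?",
          "stream": false
        }'
        ```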

      • TangledHyphae@lemmy.world · 5 months ago

        https://ollama.ai/ is what I’ve been using for over a year now. New models come out regularly, and you just run “ollama pull <model ID>” to make them available locally. You can then use Docker to run https://www.openwebui.com/, which gives you a ChatGPT-style interface (but better and more configurable, and you can run prompts against any number of models you select at once).

        All free and available to everyone.
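
        For the Open WebUI step, the project’s documented Docker one-liner is roughly the following (image tag and flags may have changed since; the --add-host flag is what lets the container reach an Ollama instance running on the host):

        ```bash
        # Run Open WebUI in Docker and point it at the host's Ollama
        docker run -d -p 3000:8080 \
          --add-host=host.docker.internal:host-gateway \
          -v open-webui:/app/backend/data \
          --name open-webui \
          ghcr.io/open-webui/open-webui:main

        # Then open http://localhost:3000 in a browser
        ```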

      • skarn@discuss.tchncs.de · 5 months ago (edited)

        If you want to start playing around immediately, try Alpaca on Linux or LM Studio on Windows. See if it works for you, then move on from there.

        Alpaca actually runs its own Ollama instance.

        • SeekPie@lemm.ee · 5 months ago (edited)

          And if you want to be 100% sure that Alpaca doesn’t send any info anywhere, you can restrict its network access in Flatseal, since it’s a Flatpak.
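
          The same lockdown can also be done from the command line instead of Flatseal; a sketch, assuming Alpaca’s Flatpak ID is com.jeffser.Alpaca:

          ```bash
          # Deny the Alpaca flatpak all network access (per-user override)
          flatpak override --user --unshare=network com.jeffser.Alpaca

          # Revert the override later if needed
          flatpak override --user --reset com.jeffser.Alpaca
          ```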

        • Smee@poeng.link · 5 months ago

          Ollama recently became a Flatpak extension for Alpaca, and it’s a one-click install from Alpaca’s software management entry. All storage locations stay the same, so there’s no need to re-download open models or remake tweaked models from the previous setup.

      • skarn@discuss.tchncs.de · 5 months ago

        I can actually run some smaller models locally on my 2017 laptop (though I have increased the RAM to 16 GB).

        You’d be surprised how much can be done with how little.

      • Smee@poeng.link · 5 months ago

        It’s possible to run local AI on a Raspberry Pi; it’s all just a matter of speed and complexity. I run Ollama just fine on the two P-cores of my older i3 laptop. Granted, running it on the CUDA accelerator (graphics card) in my main rig is far faster.
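
        On hardware like that, the main lever is model size; a sketch, where the small-model tag below is only an example of the kind of thing that stays usable on CPU:

        ```bash
        # Smaller quantized models are far more usable on CPU-only or low-RAM machines
        ollama pull llama3.2:1b
        ollama run llama3.2:1b "Why do smaller models run faster on CPUs?"
        ```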

  • Cris16228@lemmy.today · 5 months ago

    Me when Gemini (aka Google) collects more data than anyone else:

    Not really shocked; we all know that Google sucks.

    • will_a113@lemmy.ml (OP) · 5 months ago

      And I can’t possibly imagine that Grok actually collects less than ChatGPT.

        • Ziglin (it/they)@lemmy.world · 5 months ago

          Yeah, I feel like there’s a “supposedly” missing somewhere. We don’t know what happens on their servers, so at the very least “user content” is based on trust.

      • HiddenLayer555@lemmy.ml · 5 months ago (edited)

        Skill issue probably. They want to collect more but Musk’s shitty hires can’t figure it out. /s

  • abdominable@lemm.ee · 5 months ago

    I have a bridge to sell you if you think Grok is collecting the least amount of info.

  • morrowind@lemmy.ml · 5 months ago

    Note this is only if you use their apps, not the API and not through another app.

    • will_a113@lemmy.ml (OP) · 5 months ago

      Not that we have any real info about who collects or uses what when you use the API.

      • morrowind@lemmy.ml · 5 months ago

        Yeah, we do: they list it in their privacy policies. Many of these data points they can’t really collect even if they wanted to.

  • Sonalder@lemmy.ml · 5 months ago (edited)

    Does anyone have this data for Mistral, HuggingChat and Meta AI? It would be nice to add them too.

    Edit: Leo from Brave would be great to compare too.

    • Gadg8eer@lemm.ee · 5 months ago (edited)

      The Broligarchy: “Everything.”

      Me: Squints Pours glowing demon tanning lotion on ground

      Trump: “You dare dispute my rule?! And you would have these… mongrels… come here to die?”

      Open Source Metaverse online. Launching Anti-StarLink missiles…

      Warning. FOSS Metaverse alternative launch detected.

      The Broligarchy: “This was not how it was supposed to be…”

      Me: “Times change. But war, war never changes.”

      “We will never be slaves. But we WILL be online. For the Open Source Metaverse we deserve!”

      Anyway, hopefully that’s the real future in some sense. The metaverse is, technologically, in a state resembling 1995’s World Wide Web. We can stop the changes that made social media happen the first time, but that comes at a grave cost of it’s own… Zero tolerance for interference with the FOSS paradigm. This means no censorship even for the most vile of content, and no government authority over online activity ever again. It also means we have less than 150 years to become immortal because having children inherently puts kids at risk of sexual exploitation, so everyone - literally everyone - must be made infertile permanently to make that impossible.

      Life extension is actually plausible, and omnispermicide would make denying it a war crime. That is the only fix I can see, but all of you would never pay it. That is why I stopped writing; every goddamn story and society at large championed “anti-escapism” in 2017 and onwards, and I will NEVER forgive you all for that. Fuck reality. I Have No Truth and I Must Dream. I want to die because I hate you all.

    • serenissi@lemmy.world · 5 months ago

      Nope, these services almost always require a user login, eventually tied to a cell number (i.e. non-disposable), and they associate user content and other data points with the account. Nonetheless, user prompts are always collected. How they’re used is a good question.

        • serenissi@lemmy.world · 5 months ago

          Yes, it is possible to create disposable-ish API keys for different uses. The monetary cost is the price of privacy and of not having hardware to run things locally.

          If you have suggestions for reliable, privacy-friendly API vendors, do share. While I don’t need such services now, it could be a good future reference.

          • jagged_circle@feddit.nl · 5 months ago

            I think I only used ChatGPT once to play around, and it was one of those. I don’t remember the name, sorry.

  • krnl386@lemmy.ca · 5 months ago

    Wow, it’s a whole new level of f*cked up when Zuck collects more data than Winnie the Pooh (DeepSeek). 😳

    • Octagon9561@lemmy.ml · 5 months ago

      The idea that US apps are somehow better than Chinese apps when it comes to collecting and selling user data is complete and utter propaganda.

      • Duamerthrax@lemmy.world · 5 months ago (edited)

        Don’t use either. Until Trump, I still considered CCP spyware more dangerous because they would be collecting info that could be used to blackmail US politicians and businesses. Now, it’s a coin flip. In either case, use EU or FOSS apps whenever possible.