• RedditWanderer@lemmy.world · 2 days ago

    Watch them suddenly try to ban this Chinese code under the same reasoning they didn’t want to apply to TikTok.

    • ✺roguetrick✺@lemmy.world · 2 days ago

      At this point regulatory capture is expected in the States. But we don’t have a corrupt government. Not at all.

      • Sabata@ani.social · 2 days ago

        Oh no, don’t make me torrent my illegal and unregulated AI like a cool cyberpunk hacker.

    • brucethemoose@lemmy.world · 2 days ago

      DeepSeek R1 runs with open-source code from an American company, specifically Hugging Face.

      They have their own secret-sauce inference code, sure, but they also documented it at a high level in the paper, so a US company can recreate it if they want.

      There’s nothing they can do, short of a Hitler-esque “all open models are banned; you must use these select American APIs by law.” That would be like telling the US “everyone must use Bing and the Bing API for all search queries; anything else is illegal.”

    • Grimy@lemmy.world · 2 days ago

      They are already talking about it.

      U.S. officials are looking at the national security implications of the Chinese artificial intelligence app DeepSeek, White House press secretary Karoline Leavitt said on Tuesday, while President Donald Trump’s crypto czar said it was possible that intellectual property theft could have been at play.

      https://archive.ph/t37xU

      • Naia@lemmy.blahaj.zone · 2 days ago

        They might try, but if their goal was to destabilize Western dominance in LLMs, making it completely open source was the best way to do it.

        This isn’t like TikTok. They have a server that hosts it, but anyone can take their model and run it, and a lot of US companies besides the big AI ones are going to be looking at it. Even the big AI ones will likely use it to improve the stuff they’ve spent so long brute-forcing.

        The thing is, it’s less about the actual model and more about the method. Training models like DeepSeek takes nowhere near as many resources as what companies in the US have been doing. It means there is no longer going to be just a small group hoarding the tech and charging absurd amounts for it.

        Running the model can be no more taxing than playing a modern video game, except the load is not constant.

        The cat is out of the bag. They could theoretically ban the models released directly by the research team, but retrained variants are going to be hard to differentiate from from-scratch models. And the original model is already all over the place, with people hacking away at it.

        Blocking access to their hosted service right now would just be petty, but I do expect that from the current administration…

        • brucethemoose@lemmy.world · 2 days ago

          > Running the model can be no more taxing than playing a modern video game, except the load is not constant.

          This is not true; DeepSeek R1 is huge. There’s a lot of confusion between the smaller distillations based on Qwen 2.5 (some of which can run on consumer GPUs) and the “full” DeepSeek R1 based on DeepSeek-V3.

          Your point mostly stands, but the “full” model is hundreds of gigabytes, and the paper mentioned something like a bank of 370 GPUs being optimal for hosting. It’s very efficient because it’s only something like 30B active parameters, which is bonkers, but still.
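
          The weight-footprint point above can be sketched with back-of-the-envelope math. This is a rough illustration only: the parameter counts and quantization widths below are assumptions for the sake of example, not official DeepSeek figures.

          ```python
          # Rough memory math for a large mixture-of-experts (MoE) model.
          # All numbers here are illustrative assumptions, not official figures.

          def model_memory_gb(total_params_b: float, bytes_per_param: float) -> float:
              """Approximate weight storage in gigabytes for a given parameter
              count (in billions) and bytes per parameter."""
              return total_params_b * 1e9 * bytes_per_param / 1e9

          # Assume an R1-class MoE with ~670B total parameters, of which only
          # ~35B are active per token (the "only ~30B active" point above).
          total_b = 670.0   # assumed total parameters, billions
          active_b = 35.0   # assumed active parameters per token, billions

          fp16_weights = model_memory_gb(total_b, 2.0)  # 16-bit weights
          q4_weights = model_memory_gb(total_b, 0.5)    # 4-bit quantized weights

          print(f"FP16 weights: ~{fp16_weights:.0f} GB")   # ~1340 GB
          print(f"4-bit weights: ~{q4_weights:.0f} GB")    # ~335 GB

          # Compute per token scales with the *active* parameters, which is why
          # MoE inference is cheap relative to the huge weight footprint:
          print(f"Active fraction: {active_b / total_b:.1%}")
          ```

          Either way the weights alone are hundreds of gigabytes, which is why hosting lands on multi-GPU clusters rather than a gaming PC, even though the per-token compute is modest.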