• Brkdncr@lemmy.world
    17 days ago

    Someone is going to make bank by catering to consumers. Will the market accept Nvidia back with open arms if/when the AI investments fall through?

    • neclimdul@lemmy.world
      16 days ago

      As a Linux gamer, nvidia was already on thin ice.

      Also, I had passed them up on recent-ish purchases since they only really controlled the highest end of the market, which I don’t have the budget for. So honestly I have no intention of welcoming them back unless there is literally no other option. You made your bed.

      • youmaynotknow@lemmy.zip
        16 days ago

        This is still a pain point for me. I have been looking for a laptop with an AMD GPU for years to use with Linux, but System76, Star Labs, Framework, etc. insist on only offering Nvidia as a discrete option. Or is it that AMD does not make laptop GPUs? Could be.

        • wonderingwanderer@sopuli.xyz
          15 days ago

          This is not an advertisement, but have you checked laptopwithlinux (dot) com?

          They’re based in Europe and I’m pretty sure they offer laptops with AMD GPUs, if integrated ones count. Not sure if it’s the highest-end stuff, and they might not have dedicated VRAM, but there are definitely AMD laptop GPUs.

          • youmaynotknow@lemmy.zip
            15 days ago

            Thanks. I’ll check them out. But I was actually referring to discrete GPUs. I think I’ve never seen an AMD laptop GPU before.

            • wonderingwanderer@sopuli.xyz
              15 days ago

              Oh, yeah I don’t know if they carry those. It’s harder to fit one in a laptop case, so I only see them in specialized gaming laptops, and unfortunately most gaming laptops on the market seem to use nvidia.

              Maybe them ceasing to produce consumer products will open a niche that others might fill. Time will tell.

          • youmaynotknow@lemmy.zip
            16 days ago

            Because I travel a lot for work. My PC is way less powerful than my current laptop precisely because I spend more time on the road.

            • dubyakay@lemmy.ca
              15 days ago

              Hmm. Even when I was doing graveyard shifts with basically six hours of just me and my laptop during the dead of the night, my desktop was still more powerful than my gaming laptop.

              • youmaynotknow@lemmy.zip
                15 days ago

                I have a somewhat old Gazelle 16 with an 11th Gen i7 and a 3050 Ti. My PC is a Minisforum mini PC, pretty good for what I need, but nowhere near as powerful as my laptop.

    • mycodesucks@lemmy.world
      17 days ago

      That would be nice. But video cards are a VERY niche piece of engineering. The knowledge of HOW to make them is locked in a handful of people, and the ability to make them is locked behind a very niche set of equipment that will ALSO be exploding in cost.

      One does not simply start a graphics card company.

      • Brkdncr@lemmy.world
        17 days ago

        I don’t think a newcomer could do it, but a company like Intel is poised to be in a good position. They don’t have much market share but they have a good product.

        • mycodesucks@lemmy.world
          17 days ago

          The problem with that is Intel is subject to the same bullshit economic assessments as AMD and Nvidia… They’ll just as soon retool for AI as well.

        • CosmoNova@lemmy.world
          16 days ago

          Intel is arguably worse. They’re in a bad spot right now so they can’t do crazy things like Nvidia, but they totally would, and will go down the same path. I don’t think US-designed hardware will ever truly come back to end-consumer products.

          • nforminvasion@lemmy.world
            16 days ago

            Nothing will. They’re moving us onto techno-feudalism. We won’t earn anything and we’ll be wage slaves if they’re merciful to us; otherwise most people will be in camps and dead, unless we stand up real damn soon.

            They’re actively moving away from the bottom 90% of consumers; it’s just not worth it anymore. Maybe once it was worth advertising to us, but no longer. The top 10% owns 93% of stocks and controls at least 55% of market revenue as of early 2025, probably closer to 60-65% now, after tariffs, layoffs, and the nonexistent recession we’re all imagining and that definitely isn’t real. /s

      • ThePantser@sh.itjust.works
        17 days ago

        Intel is partly owned by the US government now. You think they want tech going to the people when they themselves want it for Skynet?

      • wonderingwanderer@sopuli.xyz
        15 days ago

        I get the disdain for GenAI, but are AI chips really the problem? Maybe they’re more expensive and price people out, but it’s not like they’re built on plagiarism like most generative AI models.

        As far as I’m aware, they’re just capable of running highly complex multivariable calculations in parallel, making them more efficient for AI applications, but wouldn’t the same features make them better for more realistic physics and other game mechanics like procedural generation, NPC pathfinding and behaviors, etc.?

        I guess it would suck for anyone who doesn’t have the hardware to play a game, but there could always be options to configure in the settings to make it playable, like “don’t use tensor calculus in game physics” or whatever

        • MonkderVierte@lemmy.zip
          15 days ago

          As far as I know, GPUs are more specialized for vector calculations. Some upscaling/frame-generation techniques use AI hardware, but that’s it.

          • wonderingwanderer@sopuli.xyz
            15 days ago

            Vectors, tensors, and matrices. Not all AI chips are GPUs though, there are currently NPUs in development and the next generation of consumer chips might have them integrated in the CPU.

            They’re not good for deterministic equations, like gravity, collisions, or pathfinding, but they could advance other aspects of games like procedural generation, fluid dynamics, and NPC dynamic personalities and emergent behaviors.

            Some things are still better left to the CPU or GPU, but offloading some tasks to the NPU might allow for more complexity, like simulating full weather systems with parametric partial differential equations.

            I’m speculating, of course. But playing a game inside a fully-simulated physics engine seems like it could be cool (despite being resource-intensive on current hardware).
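            To make the speculation above concrete: the kind of work GPUs and NPUs accelerate is batched, data-parallel array math. A hypothetical NumPy sketch (illustration only, not any real NPU API) of updating many simulated particles in one vectorized step instead of a per-item loop:

```python
import numpy as np

# Hypothetical sketch: the data-parallel style of math that GPU/NPU
# hardware accelerates. 100,000 "particles" advance one 60 FPS frame
# in a single batched array operation, not a per-particle Python loop.
rng = np.random.default_rng(0)
pos = rng.random((100_000, 3))           # particle positions
vel = rng.standard_normal((100_000, 3))  # particle velocities
dt = 1.0 / 60.0                          # one frame's timestep

pos += vel * dt  # every particle updated in one vectorized step

print(pos.shape)
```

            On a CPU this maps to SIMD-friendly array code; the same shape of computation is what could, in principle, be offloaded to GPU or NPU tensor units for things like fluid or weather simulation.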

    • FirmDistribution@lemmy.worldOP
      16 days ago

      I wish there were more laptops using AMD GPUs here in Brazil. You basically can’t find any laptop with an AMD GPU if you search for “gamer laptop” in Brazilian stores.

      • MasterNerd@lemmy.zip
        15 days ago

        Gaming laptops are not really worth it, imo. They’re underpowered, overheat easily, and tend to break quickly. That doesn’t even touch on their battery life, even when not under load. I’d recommend getting a Steam Deck if you really need the portability, but it doesn’t look like they’re available in Brazil :/

        • Nalivai@lemmy.world
          15 days ago

          Steam Deck and a gaming laptop don’t fill the same niche. A laptop is great when you don’t have a permanent spot to set up a gaming computer, or travel a lot, for example, but still want the full experience. The Deck is more for playing “on the go,” so to speak.
          Buying a gaming laptop was the best decision for me.

          • msage@programming.dev
            15 days ago

            You can buy a USB dock and plug every peripheral into the Deck.

            You can use desktop mode, too.

            There is nothing the Deck can’t do. It has weaker hardware, but you can game on it just fine.

            I have mine plugged into a projector, for watching movies or playing games.

            • dreamkeeper@literature.cafe
              15 days ago

              I can’t imagine using a steam deck to do anything productive. If he’s traveling a lot it makes no sense. No one wants to drag around a monitor and KBM when they could just buy a laptop.

              • msage@programming.dev
                15 days ago

                I mean, it has enough RAM to handle some workloads.

                But yes, laptop (ideally without a dedicated graphics card) would be better, if you want to do more than gaming.

                Though for gaming it’s more than enough, unless you’re into RTS.

                • CheeseNoodle@lemmy.world
                  13 days ago

                  Especially right now; at least where I am, a Steam Deck is half the price of a regular laptop with the same specs.

            • Nalivai@lemmy.world
              15 days ago

              > It has weaker hardware

              And that’s kind of the point. Don’t get me wrong, Deck is an amazing thing and I am happy it exists, and if I ever get some spare money, I’m buying myself one.
              But a laptop will always be stronger, and that’s a significant difference.

        • FirmDistribution@lemmy.worldOP
          15 days ago

          > I’d recommend getting a steam deck

          Nah. I want the 15.6-inch screen and the keyboard and touchpad that come with it. The Steam Deck is too small and I think it’s a little expensive here (Valve is not officially selling in Brazil as far as I know).

          EDIT: didn’t see you already mentioned them not being sold here. I don’t think the Deck is officially sold in any third-world country.

    • Psythik@lemmy.world
      15 days ago

      I will when someone makes a GPU that can surpass a 4090. Not even Nvidia themselves can pull that off, so I’m not getting my hopes up.

      I’m going to be stuck with this GPU for the next decade the way things are going… not that I’m complaining. It’s a beast of a card, especially for someone like me who could only ever afford bargain-bin parts until one day I came into a windfall. (That was a fun 4 years.) I don’t have to worry about games being unoptimized because I can simply brute-force them with pure GPU processing power. I was getting 90 FPS in The Last of Us on launch. Even Cities Skylines 2 runs smoothly.

      • doingthestuff@lemy.lol
        15 days ago

        My 4070 Ti isn’t as beefy, but there’s still no non-Nvidia upgrade. And I am able to play most of my games at 4K/120. I’d like to upgrade mine so I can give my card to my daughter and give her 3060 Ti to her brother, who is currently running a 1060.

    • Eagle0110@lemmy.world
      15 days ago

      The time to stop buying Nvidia and prevent it from gaining a practically total monopoly on the entire market was 10 years ago, not now.

      Now, I’ll consider buying a GPU from you instead if you can make one that satisfies technical needs like Nvidia can, but you cannot.

      • scala@lemmy.ml
        14 days ago

        AMD’s GPUs over the last 10 years have been great. No overheating like they did 20 years ago.

    • Pup Biru@aussie.zone
      15 days ago

      i think the latest is that china has managed to create a GPU that’s ~7 years behind. i’m not sure if that means “a GPU from 7 years ago” or “it will take them 7 years to catch up, though knowing there’s a viable path it will take less time”

      AFAIK they’ll have to figure out EUV or some other method of lithography at that scale, which they’re trying really hard at but it’s one heck of a difficult thing to do which is why only TSMC currently actually has it working

      • dreamkeeper@literature.cafe
        15 days ago

        Their current GPU is roughly equal to a 4060 which isn’t that bad when you consider how far behind they are in terms of time.

        • CheeseNoodle@lemmy.world
          13 days ago

          iirc that was the claim, but it did significantly worse in actual tests. I wish ’em luck though, as we can always use more competition on the market.

      • percent@infosec.pub
        14 days ago

        Oh I wasn’t wishing for anything, just pointing out the possibility. There are some Chinese companies gearing up to fill the gap in the memory market. GPUs would be much harder, but maybe very profitable.

  • RejZoR@lemmy.ml
    16 days ago

    While AMD is no angel, I’m glad I went for Radeon RX 9070 XT this time. Really good GPU and fuck NVIDIA. I hope unified RDNA5 will work out for AMD.

    • muusemuuse@sh.itjust.works
      16 days ago

      I went with Intel Arc since I don’t actually need GPU processing power so much as a decent media engine and VRAM for future projects, and Intel has that ready to go under Linux. On the CPU side, AMD is the only option that makes sense, and for gaming AMD’s GPUs have already been the practical option for years, but their media engines are trash.

      But we don’t need NVIDIA and we don’t even need high end GPUs as much as we think we do.

    • sbbq@lemmy.zip
      16 days ago

      If consumers can’t get new GPUs, devs aren’t going to bother spec’ing for them. This’ll probably just result in a stalling of the tech you’ll see at home for a few years. Honestly, that seems to be happening already: the leaps we saw in previous generations seem to be slowing anyway. Maybe this is just a plateau of tech for a while. Good for consumers, once they accept that they don’t have to always be on the bleeding edge.

      • Blackmist@feddit.uk
        16 days ago

        I want devs to write games for £400 Steam Decks. I don’t want them to write games for £3000 GPUs.

        There are realistically no games that won’t run on PS5-level hardware. Every effect that can be done with raytracing can be done a little worse without it.

      • dovahking@lemmy.world
        15 days ago

        There are already games that lie on the fringe of photorealism, like Bodycam. As the other guy said, we need more games with better stories rather than better graphics. It’ll be good for the industry if not every AAA game requires an RTX 69000.

  • A_Random_Idiot@lemmy.world
    16 days ago

    i really hope nvidia collapses when the AI bubble pops. They’ve been more harm than good for consumers for too long.

    • hamsterkill@lemmy.sdf.org
      16 days ago

      It won’t collapse. It’ll lose a huge chunk of its stock price, but it both has other business to fall back on and its chips will still likely be used in whatever the next tech trend is - probably neural network AI or something.

      • jj4211@lemmy.world
        15 days ago

        I am not sure. They have other businesses, but I’m not sure those businesses can sustain the obligations Nvidia has committed to in this round. They are juggling more money than their pre-AI-boom market cap by a wide margin, so if the bubble pops, it’s unclear how big a bag Nvidia will be left holding and whether the rest of their business can survive it. Guess they might go bankrupt and come out of it eventually to continue business as usual after having their financial obligations wiped away…

        Also, they have somewhat tarnished their reputation by going all-in on datacenter equipment to the point of, seemingly, abandoning the consumer market to free up capacity for the datacenters. So if AMD ever had an opportunity to cash in, here it is… except they also dream of being a big datacenter player, and weaker demand may leave them with leftover capacity…

          • A_Random_Idiot@lemmy.world
            15 days ago

            never underestimate AMD’s ability to shoot itself in the foot when it’s not under immediate threat of collapse/bankruptcy.

        • hamsterkill@lemmy.sdf.org
          15 days ago

          > juggling more money than their pre-AI boom market cap by a wide margin

          I’m not sure what you mean by this. Nvidia carries a vanishingly small amount of debt for its size. It has way more liquidity than debt.

          • jj4211@lemmy.world
            15 days ago

            Like how Nvidia buys equity in a customer and, in part, promises expensive real product as part of the deal. They may have billions of dollars of equity in a customer and might be able to leverage that to fund production if needed, but if that equity evaporates, they’re still on the hook for the expensive product commitment.

            So maybe not straightforward debt yet, but a whole lot of expensive balls in the air that could manifest as a committed expense when there’s no actual money to execute…

            Just seems like a lot of financial moves that are far from straightforward, of a magnitude that could wipe a company out.

  • Mk23simp@lemmy.blahaj.zone
    17 days ago

    Hey, I’ve seen this one before.

    Last time it was crypto instead of AI, but other than that it’s just the same shit again.

  • UltraBlack@lemmy.world
    16 days ago

    We’re running straight into a future where consumers’ only option for computing is a cloud solution like MS 365.

      • M0oP0o@mander.xyz
        15 days ago

        That “economy” is already falling apart. Subscriptions are down, services on “the cloud” are becoming less reliable, piracy is way up again, and major nations and companies are moving to alternatives.

        Hell, DDR3 is making a comeback. All it takes is one manufacturer to start making 15-year-old tech again and bam, the house of cards falls.

  • Paranoidfactoid@lemmy.world
    15 days ago

    If you want to do work with the GPU you’re still buying NVIDIA. Particularly 3D animation, video/film editing, and creative tools. Even FOSS tools like GIMP and Krita prefer NVIDIA for GPU accelerated functions.

  • boaratio@lemmy.world
    16 days ago

    I know Radeons don’t really have the performance crown, but as a lifelong Nvidia GPU and Linux user, I can say the PITA drivers are not a problem when you use an AMD Radeon card.