Well I am shocked, SHOCKED I say! Well, not that shocked.

    • overload@sopuli.xyz (+19) · 1 month ago

      Absolutely. Truly creative games are made by smaller dev teams that aren't forcing ray tracing and lifelike graphics. The new Indiana Jones game isn't a GPU seller, and it's the only game I've personally had poor performance on with my 3070 Ti at 1440p.

      • Robust Mirror@aussie.zone (+5) · 1 month ago

        Anyone who preorders a digital game is a dummy. Preorders were created to ensure you got some of the limited physical stock.

    • vividspecter@lemm.ee (+46) · 1 month ago

      It doesn’t help that the gains have been smaller, and the prices higher.

      I’ve got an RX 6800 I bought in 2020, and nothing but the 5090 is a significant upgrade, and I’m sure as fuck not paying that kind of money for a video card.

      • ByteJunk@lemmy.world (+8) · 1 month ago

        I’m in the same boat.

        In general, there’s just no way I could ever justify buying an Nvidia card in terms of bang for the buck; it’s absolutely ridiculous.

        I’ll fork over four digits for a graphics card when salaries go up by a digit as well.

      • GrindingGears@lemmy.ca (+6) · 1 month ago

        Not to mention the cards have gotten huge and you just about need a nuclear reactor to power them. Melting cables and all.

      • arudesalad@sh.itjust.works (+3) · 1 month ago

        I have a 6700 XT and a 5700X, and my PC can handle VR and Star Citizen, the most demanding things I do on it. Why should I spend almost £1000 on a 5070 or 9070 plus an AM5 board and processor?

      • harxanimous@lemmy.today (+1) · 1 month ago

        Well, that depends on your definition of significant. Don’t get me wrong, the state of the GPU market is not consumer friendly, but even an RX 9070 provides over a 50% performance uplift over the RX 6800.

    • missingno@fedia.io (+12/-1) · 1 month ago

      I don’t think they’re actually expecting anyone to upgrade annually. But there’s always someone due for an upgrade, however long it’s been for them. You can compare what percentage of users upgraded this year to previous years.
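
      One way to make that comparison concrete: a minimal Python sketch of the year-over-year upgrade rate you’d want to look at, with completely made-up survey numbers for illustration.

          # Hypothetical counts (illustrative only): survey respondents per
          # year, and how many of them reported upgrading their GPU that year.
          surveyed = {2022: 12000, 2023: 11500, 2024: 12800}
          upgraded = {2022: 2100, 2023: 1600, 2024: 1300}

          for year in sorted(surveyed):
              rate = upgraded[year] / surveyed[year] * 100
              print(f"{year}: {rate:.1f}% of users upgraded")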

    • 474D@lemmy.world (+7) · 1 month ago

      “When did it just become expected that everybody would upgrade GPUs every year and that’s supposed to be normal?” - that’s a really good question, because I don’t think normal PC gamers have ever been like that, and they still aren’t. It’s basically part of the culture to stretch your GPU for as long as possible, so idk who you’re complaining about. Yeah, GPU prices are bullshit rn, but let’s not make stuff up.

      • zurohki@aussie.zone (+6/-1) · 1 month ago

        Nah, there was a time when you’d get a new card every two years and it’d be twice as fast for the same price.

        Nowadays the new cards are 10% faster for 15% more money.

        I bought a new card last year after running a Vega 64 for ages and I honestly think it might last me ten years because things are only getting worse.
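
        To put numbers on that, using the percentages above: performance per dollar is relative speed divided by relative price, so the old cadence doubled your value while the claimed current one actually shrinks it.

            # Performance per dollar = relative speed / relative price.
            old_cadence = 2.00 / 1.00  # 2x speed, same price   -> 2.0x the value
            new_cadence = 1.10 / 1.15  # 10% faster, 15% dearer -> ~0.96x the value
            print(old_cadence, round(new_cadence, 2))  # a ~4% regression in value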

    • Sixty@sh.itjust.works (+5) · 1 month ago

      Sticking with 1440p on desktop has gone very well for me. 2160p isn’t worth the costs in money or perf.

    • Jesus_666@lemmy.world (+4/-1) · 1 month ago

      When did it just become expected that everybody would upgrade GPUs every year and that’s supposed to be normal?

      Somewhere around 1996 when the 3dfx Voodoo came out. Once a year was a relatively conservative upgrade schedule in the late 90s.

        • Jesus_666@lemmy.world (+8) · 1 month ago

          That’s still not cheap when you account for inflation. Of course there’s a world of difference between “not cheap” and what they charge these days.

    • Cethin@lemmy.zip (+3) · 1 month ago

      It’s never been normal to upgrade every year, and it still isn’t. Every three years is probably still more frequent than normal. The issue is that there haven’t been reasonable prices for cards for like 8 years, and it’s gotten worse recently. People who are “due” for an upgrade aren’t upgrading because it’s unaffordable.

      • Robust Mirror@aussie.zone (+5) · edited · 1 month ago

        If consoles can last 6-8 years per gen so can my PC.

        Your PC can run 796 of the top 1000 most popular games listed on PCGameBenchmark - at a recommended system level.

        That’s more than good enough for me.

        I don’t remember exactly when I built this PC but I want to say right before covid, and I haven’t felt any need for an upgrade yet.

    • qweertz (they/she)@programming.dev (+3) · edited · 27 days ago

      Still rocking a GTX 1070, and I plan on using my GrapheneOS Pixel 8 Pro till 2030 (only bought it (used, ofc) bc my Huawei Mate 20 Pro died on me in October last year 😔)

  • bluesheep@lemm.ee (+48) · 1 month ago

    Paying Bills Takes Priority Over Chasing NVIDIA’s RTX 5090

    Yeah no shit, what a weird fucking take

  • LostXOR@fedia.io (+47) · 1 month ago

    For the price of one 5090 you could build 2-3 midrange gaming PCs lol. It’s crazy that anyone would even consider buying it unless they’re rich or actually need it for something important.

          • CybranM@feddit.nu (+1) · 1 month ago

            Have you tried 4K? The difference is definitely noticeable unless you play on like a 20" screen.

              • Robust Mirror@aussie.zone (+1) · 1 month ago

                I play in 1080p so I can’t comment on 4K, but I can confirm FPS doesn’t seem to affect me above 30. I don’t perceive a noticeable difference between 30, 60, and 120 FPS, and I haven’t played higher than that. I suspect 4K would probably look better to me than a higher framerate, though. But I’m happy with 30-60 FPS at 1080p so…

              • CybranM@feddit.nu (+1) · 1 month ago

                Not arguing FPS here lol. I’m arguing 4K, which you can run at 144 Hz in a lot of games even without a 5090. You didn’t mention whether you’ve tried 4K, which I assume you haven’t, based on the switch to FPS instead of resolution.

        • Damage@feddit.it (+5/-3) · 1 month ago

          Somehow 4k resolution got a bad rep in the computing world, with people opposing it for both play and productivity.

          “You can’t see the difference at 50cm away!” or something like that. Must be bad eyesight I guess.

          • GrindingGears@lemmy.ca (+11/-3) · 1 month ago

            It’s just kind of unnecessary. Gaming in 1440p on something the size of your average computer monitor, hell, even just good ol’ 1080p HD, is more than sufficient. I mean, from 1080p to 4K, sure, there’s a difference, but from 1440p it’s a lot harder to tell. Nobody cares about your mud-puddle reflections cranking along at 120 FPS. At least not the normies.

            Putting on my dinosaur hat for a second: I spent the first decade of my life gaming in 8/16-bit and 4-color CGA, and I’ve probably spent the last thirty years and god only knows how much money trying to replicate those experiences.

            • Damage@feddit.it (+5) · 1 month ago

              I mean, I play at 1440p and I think it’s fine… Well, it’s 3440x1440; the problem is I can still see the pixels, and my desk is quite deep. Do I NEED 4K? No. Would I prefer it? Hell yes, but not enough to spend the huge amounts of money that are damaging an already unrealistic market.

          • BCsven@lemmy.ca (+1) · 1 month ago

            Does it really help gameplay on the average monitor? If it’s a fast-paced game, I’m not even paying attention to pixels.

    • Lord Wiggle@lemmy.world (+9) · 1 month ago

      unless they’re rich or actually need it for something important

      Fucking youtubers and crypto miners.

    • Grimtuck@lemmy.world (+3) · 1 month ago

      I bought a secondhand 3090 when the 40 series came out for £750. I really don’t need to upgrade. I can even run the bigger AI models locally as I have a huge amount of VRAM.

      Games run great and look great. Why would I upgrade?

      I’m waiting to see if Intel or AMD come out with something awesome over the next few years. I’m in no rush.

    • Murvel@lemm.ee (+2) · 1 month ago

      But then, the Nvidia xx90 series has never been for the average consumer, and I don’t know what gave you that idea.

  • simple@piefed.social (+44/-1) · 1 month ago

    Unfortunately, gamers aren’t the real target audience for new GPUs; it’s AI bros. Even if nobody buys a 4090/5090 for gaming, they’re always out of stock, as LLM enthusiasts and small companies use them for AI.

    • brucethemoose@lemmy.world (+3) · edited · 1 month ago

      The 5090 is actually kinda terrible for AI. It’s too expensive, it only just got support in PyTorch, and if you look at ‘normie’ AI bros trying to use them online, shit doesn’t work.

      The 4090 is… mediocre, because it’s expensive for 24GB. The 3090 is basically the best AI card Nvidia ever made, and tinkerers just opt for banks of them.

      Businesses tend to buy RTX Pro cards, rent cloud A100s/H100s or just use APIs.

      The server cards DO eat up TSMC capacity, but the insane 4090/5090 prices are mostly Nvidia’s (and AMD’s) fault for literally being anticompetitive.
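
      For a rough sense of why 24GB cards like the 3090 hit a sweet spot for local LLMs, here’s a back-of-the-envelope VRAM estimate in Python. The 20% overhead factor for KV cache and activations is an assumption, not a measured number.

          def vram_gb(params_billion, bytes_per_weight, overhead=1.2):
              # Weights * quantization width, plus ~20% headroom for the
              # KV cache and activations (a rough, workload-dependent guess).
              return params_billion * bytes_per_weight * overhead

          # A 13B-parameter model at different quantization levels:
          for bits, label in [(16, "fp16"), (8, "int8"), (4, "int4")]:
              print(f"13B @ {label}: ~{vram_gb(13, bits / 8):.0f} GB")
          # fp16 (~31 GB) won't fit in 24 GB of VRAM; int8 and int4 will.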

  • chunes@lemmy.world (+28) · 1 month ago

    I stopped maintaining an AAA-capable rig in 2016. I’ve been playing indies since and haven’t felt left out whatsoever.

    • MotoAsh@lemmy.world (+7/-1) · 1 month ago

      Don’t worry, you haven’t missed anything. Sure, the games are prettier, but most of them are designed and written more poorly than 99% of indie titles…

      • Honytawk@feddit.nl (+4) · 1 month ago

        The majority, sure, but there are some gems though.

        Baldur’s Gate 3, Clair Obscur: Expedition 33, Doom Eternal, Elden Ring, God of War, … for example.

        You can always wait for a couple of years before playing them, but saying they didn’t miss anything is a gross understatement.

      • JustEnoughDucks@feddit.nl (+2) · edited · 1 month ago

        It’s funny, because often they aren’t prettier. Well-optimized, well-made games from 5 or even 10 years ago often look on par with or better than the majority of AAA slop pushed out now (obviously with exceptions for some really good-looking games like Space Marine and a few others), and yet the disk size is still 10x what it was. They’re just unrefined and unoptimized, and they try to use computationally expensive filters, lighting, sharpening, and antialiasing to make up for the mediocre quality.

        • MotoAsh@lemmy.world (+1) · 1 month ago

          The irony is that it is optimized in several notable cases, like Cyberpunk 2077 and most major UE5-based games. It’s just that all the mipmap levels, from distant to 4K up close, really add up when the game actually has a decent amount of content.

          I wonder how many people really run games at settings that require the highest detail. I bet a lot of people would appreciate halving the download size or more just to leave those assets out and disable ‘ultra’ settings.
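
          For a rough sense of how mip levels add up: each level is a quarter the size of the one above it, so a full chain converges to about a third extra on top of the base texture. A minimal Python sketch, assuming uncompressed RGBA (real games use compressed formats, but the ratio is the same):

              def mip_chain_bytes(width, height, bytes_per_texel=4):
                  # Sum every mip level down to 1x1, halving each dimension.
                  total, w, h = 0, width, height
                  while True:
                      total += w * h * bytes_per_texel
                      if w == 1 and h == 1:
                          return total
                      w, h = max(w // 2, 1), max(h // 2, 1)

              base = 4096 * 4096 * 4  # one 4K RGBA texture, top level only
              print(base / 2**20)                         # 64.0 MiB
              print(mip_chain_bytes(4096, 4096) / 2**20)  # ~85.3 MiB (~1/3 more)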

    • tea@lemmy.today (+5) · 1 month ago

      Indies are great. I can play AAA titles but don’t really ever… It seems like that is where the folks with the most creativity are focusing their energy anyways.

  • candyman337@lemmy.world (+26) · edited · 1 month ago

    It’s just because I’m not impressed; the raster performance bump at 1440p was just not worth the price jump at all. On top of that, they have manufacturing issues and issues with their stupid 12-pin connector? And all the shit on the business side, like not providing drivers to reviewers, etc. Fuuucccckk all that, man. I’m waiting until AMD gets a little better with ray tracing and then switching to team red.

      • Bakkoda@sh.itjust.works (+5) · 1 month ago

        I had lost all interest in games for a while; the desktop just ended up being for tinkering in the homelab. The Steam Deck has been so great for falling in love with gaming again.

  • JackbyDev@programming.dev (+16) · 1 month ago

    Uhhh, I went from a Radeon 1090 (or whatever they’re called, it’s an older numbering scheme from ~2010) to an Nvidia 780 to an Nvidia 3070 Ti. Skipping upgrades is normal. Console gamers effectively do that as well. It’s normal to not buy a GPU every year.

    • 46_and_2@lemmy.world (+1) · edited · 1 month ago

      As long as you make an upgrade that’s equivalent to or better than the current console generation, you’re basically good to go until the next generation of consoles comes along.

      • JackbyDev@programming.dev (+1) · 1 month ago

        I don’t really care if my current graphics are better or worse than the current console generation, it was just an illustration comparing PC gaming to console gaming.

  • gravitas_deficiency@sh.itjust.works (+14) · 1 month ago

    Nvidia doesn’t really care about the high-end gamer demographic nearly as much as they used to, because it’s no longer their bread and butter. Nvidia’s cash cow at this point is supplying hardware for ML data centers. It’s an order of magnitude more lucrative than serving the consumer and enthusiast markets.

    So my next card is probably gonna be an RX 9070XT.

    • ameancow@lemmy.world (+4) · edited · 1 month ago

      Even the RX 9070 is running around $900 USD; I cannot fathom affording even state-of-the-art gaming from years ago at this point. I am still using a GTX 1660, playing games from years ago that I never got around to, and having a grand time. Most adults I know are in the same boat: either not even considering upgrading their PC, or playing their kid’s console games.

      Every year we say “gonna look into upgrading,” but every year prices go up and wages stay the same (or disappear entirely as private equity ravages the business world, digesting every company that isn’t also a private-equity predator), and the prices of just living and eating are insane. So at this rate, a lot of us might start reading again.

      • jacksilver@lemmy.world (+1) · 1 month ago

        It makes me wonder if this will bring more people back to consoles. The library may be more limited, but when a console costs less than a GPU alone, it’ll be more tempting.