• RamRabbit@lemmy.world · 12 days ago

    Yep. Intel sat on their asses for a decade pushing quad cores you had to pay extra to even overclock.

    Then AMD implements chiplets, comes out with affordable 6, 8, 12, and 16 core desktop processors with unlocked multipliers, hyperthreading built into almost every model, and strong performance. All of this while also not sucking down power like Intel’s chips still do.

    Intel cashed in their lead by not investing in themselves, instead pushing the same tired crap onto consumers year after year.

      • nokama@lemmy.world · 12 days ago

        And all of the failures that plagued the 13th and 14th gens. That was the main reason I switched to AMD. My 13th-gen CPU was borked and had to be kept underclocked.

          • nokama@lemmy.world · 10 days ago

            It would cause system instability (programs/games crashing) when running normally. I had to underclock it through Intel’s XTU to make things stable again.

            This was after all the BIOS updates from ASUS and with all BIOS settings set to the safe options.

            When I originally got it, I did notice it was getting insanely high scores in benchmarks. Then the story broke about how Intel and motherboard manufacturers were letting the CPUs clock as high as possible until they hit the thermal limit. Mine started to fail about a year after I got it, I think.
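
            For anyone trying to tell a flaky game apart from a genuinely degrading chip, one rough check on Windows is whether the System event log has been collecting hardware errors (WHEA-Logger events). A minimal sketch, assuming a stock Windows install with wevtutil on the PATH:

            ```python
            # Rough check for hardware error events (WHEA-Logger) on Windows.
            # Assumes a stock Windows install where wevtutil.exe is on the PATH.
            import subprocess

            def recent_whea_events(count: int = 10) -> str:
                """Return the newest WHEA-Logger entries from the System event log as text."""
                query = "*[System[Provider[@Name='Microsoft-Windows-WHEA-Logger']]]"
                result = subprocess.run(
                    ["wevtutil", "qe", "System", f"/q:{query}",
                     f"/c:{count}", "/rd:true", "/f:text"],
                    capture_output=True, text=True, check=True,
                )
                return result.stdout

            if __name__ == "__main__":
                events = recent_whea_events()
                print(events if events.strip() else "No WHEA-Logger events logged.")
            ```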

        • bufalo1973@piefed.social · 10 days ago

          In the 486 era (the ’90s) there was an unofficial story about how Intel rated its CPUs: instead of starting slow and speeding up until failure, they started as fast as possible and slowed down until it stopped failing.

      • UnspecificGravity@piefed.social · 12 days ago

        As a person who generally buys either mid-tier stuff or the flagship products from a couple of years ago, it got pretty fucking ridiculous having to figure out which socket made sense for any given Intel chip. The apparently arbitrary naming convention didn’t help.

        • real_squids@sopuli.xyz · 11 days ago

          It wasn’t arbitrary; they named the sockets after the number of pins. Which is fine, but kinda confusing for your average consumer.

          • UnspecificGravity@piefed.social · 11 days ago

            Which is a pretty arbitrary naming convention, since the number of pins in a socket doesn’t really tell you anything, especially when that convention does NOT get applied to the processors that plug into them.

      • billwashere@lemmy.world · 11 days ago

        Or the 1200 different versions of CPUs. We just got some new Dell machines for our DR site last year and the number of CPU options was overwhelming. Is it really necessary to have that many different CPUs?

        • real_squids@sopuli.xyz · 11 days ago

          Tbf, AMD is also guilty of that, specifically in the laptop/mobile segment. And the whole AI naming thing is just dumb, although there aren’t that many of those.

    • wccrawford@discuss.online · 12 days ago

      All of the exploits against Intel processors didn’t help either. Not only were they a bad look, but the fixes reduced the speed of those processors, making them a noticeably worse deal for the money after all.

      • MotoAsh@piefed.social · 12 days ago

        Meltdown and Spectre? Those applied to AMD CPUs as well, just to a lesser degree (or rather, AMD had its own flavor of similar vulnerabilities). I think they even recently found a similar one in ARM chips…
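
        If you’re curious which of these speculative-execution holes your own machine is still mitigating, the Linux kernel exposes its view under /sys/devices/system/cpu/vulnerabilities/. A minimal sketch (Linux only, kernel 4.15+), and it reads the same way on Intel and AMD boxes:

        ```python
        # List the kernel's view of CPU speculative-execution vulnerabilities
        # and their mitigations (Linux only, kernel 4.15+).
        from pathlib import Path

        VULN_DIR = Path("/sys/devices/system/cpu/vulnerabilities")

        def mitigation_status() -> dict:
            """Map vulnerability name (e.g. 'meltdown', 'spectre_v2') to the kernel's status string."""
            if not VULN_DIR.is_dir():
                raise RuntimeError("no vulnerabilities dir in sysfs (not Linux, or kernel too old)")
            return {f.name: f.read_text().strip() for f in sorted(VULN_DIR.iterdir())}

        if __name__ == "__main__":
            for name, status in mitigation_status().items():
                print(f"{name:20s} {status}")
        ```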

    • brucethemoose@lemmy.world · 12 days ago

      Even the 6-core Phenom IIs from 2010 were great value.

      But to be fair, Sandy Bridge ended up aging a lot better than those Phenom IIs or Bulldozer/Piledriver.

  • Voytrekk@sopuli.xyz · 12 days ago

    Worse product and worse consumer practices (changing sockets every 2 generations) made it an easy choice to go with AMD.

    • Prove_your_argument@piefed.social · 12 days ago

      DDR4 compatibility held on for a while, though, even after AM5 went all-in on DDR5.

      The only real issue Intel had that led to the current dire straits was the gradual 13th/14th-gen failures from power/heat, which they initially tried to claim didn’t exist. If that hadn’t happened, AMD would still have next to no market share.

      You still find people swearing up and down that Intel is the only way to go, despite the genuine stagnation of progress on the processor side for a long, long time. A couple of cherry-picked benchmarks where Intel leads by a minuscule amount is all they care about, scheduling/core-parking issues be damned.

      • msage@programming.dev · 12 days ago

        Oh hell naw, the issues with Intel came up much sooner.

        Ever since Ryzen came out, Intel just stagnated.

        • Prove_your_argument@piefed.social · 12 days ago

          I don’t disagree that Intel has been shit for a long time, but they were still the go-to recommendation all the way through the 14th gen. It wasn’t until the 5800X3D came along that people started really looking at AMD for gaming… and if you’re not buying a prebuilt, odds are you wanted the fastest processor, not the most efficient one.

          I had a 5800X because I didn’t want yet another Intel rig after a 4790K. Then I went on to the 5800X3D, and now a 9800X3D. The 5800X was behind Intel, and for me it was just a stopgap anyway, because a 5950X wasn’t purchasable when I was building. It was just good enough.

          As someone who lived through the fTPM firmware issue on AM4, I can confidently state that the TPM freezes were a dealbreaker. If you didn’t use fTPM and had the module disabled, or you updated your firmware once the fix finally came out, you were fine - but the fTPM bug went unsolved for many, MANY years and persisted across multiple generations. You could randomly freeze for a few seconds in any game (or any software) at any time… sometimes only once every few hours, sometimes multiple times in the span of a few minutes. That’s not usable by any stretch for gaming or anything important.
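
          Those whole-system stalls are at least easy to catch in a log: run a tiny loop that sleeps briefly and record whenever the wakeup comes back way late. A rough sketch, with an arbitrary 500 ms threshold:

          ```python
          # Log whole-system stalls by measuring how late short sleeps wake up.
          # A multi-second freeze (like the AM4 fTPM stutter) shows up as a huge overshoot.
          import time

          INTERVAL = 0.05    # seconds between samples
          THRESHOLD = 0.5    # flag anything that wakes up more than 500 ms late (arbitrary)

          def watch_for_stalls() -> None:
              last = time.perf_counter()
              while True:
                  time.sleep(INTERVAL)
                  now = time.perf_counter()
                  overshoot = (now - last) - INTERVAL
                  if overshoot > THRESHOLD:
                      stamp = time.strftime("%H:%M:%S")
                      print(f"[{stamp}] stall: expected ~{INTERVAL * 1000:.0f} ms, "
                            f"woke up {overshoot * 1000:.0f} ms late")
                  last = now

          if __name__ == "__main__":
              watch_for_stalls()
          ```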

          • msage@programming.dev · 12 days ago

            I’ve had AMDs since forever; my first build of my own was a Phenom II.

            They were always good, but Ryzen was just the best.

            Never used TPM, so I can’t comment on that. And most people never used it.

            But yes, there are so many hardcore Intel diehards that it would almost be funny if it weren’t sad. Like Intel’s legacy of adding wattage and getting nothing in return.

          • Mavytan@feddit.nl · 11 days ago

            This might be true for top-of-the-line builds, but for anything from budget up to just below that, Ryzen has been a good and commonly recommended choice for a long time.

  • grue@lemmy.world · 12 days ago

    I’ve been buying AMD since the K6-2, because AMD almost always had the better price/performance ratio (as opposed to outright top performance) and, almost as importantly, because I liked supporting the underdog.

    That means it was folks like me who helped keep AMD in business long enough to catch up with and then pass Intel. You’re welcome.

    It also means I recently bought my first Intel product in decades, an Arc GPU. Weird that it’s the underdog now, LOL.

    • brucethemoose@lemmy.world · 12 days ago

      “AMD almost always had the better price/performance”

      Except anything Bulldozer-derived, heh. Those were more expensive and less performant than the Phenom II CPUs and Llano APUs.

      • grue@lemmy.world · 9 days ago

        To be fair, I upgraded my main desktop directly from a Phenom II X4 840(?) to a Ryzen 1700x without owning any Bulldozer stuff in between.

        (I did later buy a couple of used Opteron 6272s, but that’s different for multiple reasons.)

      • Octagon9561@lemmy.ml · 11 days ago

        I’ve got an FX 8350. Sure, AMD fell behind during that time, but it was by no means a bad CPU imo. My main PC’s got a 7800X3D now, but my FX system is still working just fine to this day, especially since upgrading to an SSD and 16GB of RAM some years ago. It can technically even run Cyberpunk 2077 at console-like frame rates on high settings.

        • brucethemoose@lemmy.world · 10 days ago

          I mean… It functioned as a CPU.

          But a Phenom II X6 sometimes outperformed it, single-threaded and multithreaded. That’s crazy given Piledriver’s two-generation jump and huge process/transistor-count advantage. Power consumption was awful in any form factor.

          Look. I am an AMD simp. I will praise my 7800X3D all day. But there were a whole bunch of internet apologists for Bulldozer back then, so I don’t want to mince words:

          It was bad.

          Objectively bad, a few software niches aside. Between cheaper Phenoms and the reasonably priced 2500K/4670K, it made zero financial sense 99% of the time.

    • LastYearsIrritant@sopuli.xyz · 12 days ago

      I decide which one to go with at every upgrade, and try not to stay dedicated to one brand.

      Basically - Buy Intel cause it’s the best last I checked… Oh, that was two years ago, now AMD should have been the right one.

      Next upgrade, won’t make that mistake - buy AMD. Shit… AMD is garbage this gen, shoulda gotten Intel. Ok, I’ll know better next upgrade.

      Repeat forever.

      • Omgpwnies@lemmy.world · 12 days ago

        TBF, AMD has been pretty rock-solid for CPUs for the last 5-6 years. Intel… not so much.

        My last two computers have been AMD; the last time I built an Intel system was ~2016.

      • boonhet@sopuli.xyz · 11 days ago

        I mean the i7s had SMT. You had to pay extra for SMT, whereas AMD started giving it to you on every SKU except a few low-end ones.
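
        A quick way to check whether SMT/Hyper-Threading is actually enabled on a given box is to compare logical and physical core counts. A minimal sketch using the third-party psutil package (so it needs an extra install):

        ```python
        # Check whether SMT / Hyper-Threading is active by comparing
        # logical vs physical core counts. Needs the third-party psutil package.
        import psutil

        def smt_enabled() -> bool:
            logical = psutil.cpu_count(logical=True)     # threads the OS sees
            physical = psutil.cpu_count(logical=False)   # actual cores
            print(f"logical cores:  {logical}")
            print(f"physical cores: {physical}")
            return bool(logical and physical and logical > physical)

        if __name__ == "__main__":
            print("SMT appears enabled" if smt_enabled() else "SMT disabled or undetectable")
        ```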

        • jnod4@lemmy.ca · 10 days ago

          Is it true that all of them had SMT, but Intel just locked it away on lower-tier processors, and some people managed to activate it despite Intel’s efforts?

  • mesa@piefed.social · 12 days ago

    I remember it was a huge issue for programs. Developers just weren’t supporting other chipsets because Intel was faster than the competition and usually cheaper. Then Intel got more expensive, pulled some shitty business with MINIX, and stayed the same speed-wise.

    So now we see what actual competition does.

    • DaddleDew@lemmy.world · 12 days ago

      I do want them to stay alive and sort themselves out, though. Otherwise, in a few years it will be AMD putting out overpriced crap, and this time there will be no alternative on the market.

      They’re already not interested in seriously putting competitive pressure on NVidia’s historically high GPU prices.

      • mesa@piefed.social · 12 days ago

        I’m personally hoping more third parties start making affordable RISC-V chips. But yeah, I agree, having Intel stick around would be good for people, as you said.

      • blitzen@lemmy.ca · 12 days ago

        Not only that, but (as an American) I do want the US to have some fab capability. A strong Intel is in our national security interest.

        • RamRabbit@lemmy.world · 12 days ago

          Yeah, if Taiwan is ever invaded, having US-based fabs will be crucial for world supply. Absolutely want to see Intel survive and TSMC continue to build factories here.

          Nothing would say ‘get fucked’ like Intel going belly up and Taiwan exploding. The supply of any new computer parts would be a dumpster fire for years.

  • BootLoop@sh.itjust.works · 12 days ago

    Pretty wild to see. Glad it’s happening, though. I hope the same thing happens with GPUs against Nvidia as well.

  • Phoenixz@lemmy.ca · 11 days ago

    So the editor asked AI to come up with an image for the title “Gamers desert Intel in droves” and so we get a half-baked pic of a CPU in the desert.

    Am I close?

  • eli@lemmy.world · 12 days ago

    I know we shouldn’t have brand loyalty, but after the near-decade of quad-core-only CPUs from Intel, I can’t help but feel absolute hate towards them as a company.

    I had a 3770K until AMD released their Ryzen 1000 series and I immediately jumped over, and within the next generation Intel suddenly started releasing 8-core desktop CPUs with zero issues.

    I haven’t bought anything Intel since my 3770K and I don’t think I ever will going forward.

    • InFerNo@lemmy.ml · 11 days ago

      The 3770K was legendary; I used it for so long. I upgraded to a 7600K almost a decade ago and have now just ordered my first AMD chip (a Ryzen 9700X). The Intel chips were solid and lasted me a long time; I hope this AMD system will last as long.

      • CptOblivius@lemmy.world · 11 days ago

        Yep, I kept the 3770K until I bought a 7800X3D. It lasted that long, and when I gave my son the 3770K system it was still overkill for the games he wanted to play: Rocket League, Minecraft, Fortnite, etc.

      • ripcord@lemmy.world · 11 days ago

        7700K here that I’ll upgrade from (likely to AMD) one day. But there’s still almost zero reason to.

      • eli@lemmy.world · 11 days ago

        I still have my 3770k but it’s in storage.

        I bought a 1700X and was using that until upgrading to a 3700X, which I’m still using today in my main gaming desktop.

        I think you’ll be fine!

  • salacious_coaster@feddit.online · 12 days ago

    You can pry my Intel CPU from my cold dead hands…Because I’m never buying a new computer again. I have enough computers already to last until Armageddon.

  • somethingold@lemmy.zip · 11 days ago

    Just upgraded from an i7-6600K to a Ryzen 7 7800X3D. Obviously a big upgrade no matter whether I went AMD or Intel, but I’m loving this new CPU. I had an AMD Athlon XP in the early 2000s that was excellent, so I’ve always had a positive feeling towards AMD.

    • NιƙƙιDιɱҽʂ@lemmy.world · 11 days ago

      AMD has had a history of some pretty stellar chips, imo. The FX series just absolutely sucked and tarnished their reputation for a long time. My Phenom II X6, though? Whew, that thing kicked ass.

    • YiddishMcSquidish@lemmy.today · 11 days ago

      I played through Mass Effect 3 when it was new on a discount AMD laptop with an iGPU. Granted, it was definitely not on max settings, but it wasn’t with everything turned all the way down either.

  • Lfrith@lemmy.ca · 10 days ago

    So happy I chose to go with an AM4 board years ago. I was able to go from a Zen+ CPU to an X3D CPU.

    I remember people saying back then that most people never upgrade their CPU, so it wasn’t much of a selling point. But people didn’t upgrade because they couldn’t, thanks to the constant socket changes on the Intel side.

    My fps numbers were very happy after the CPU upgrade, and I didn’t have to get a new board and a new set of RAM.

  • randombullet@programming.dev · 12 days ago

    I wish AMD had better PCIe passthrough for the iGPU than Intel. My Jellyfin box is on Intel because I get hardware encoding support.
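
    On Linux, both Intel and AMD iGPUs can do hardware transcodes through VAAPI; whether it works for a given setup mostly comes down to drivers and how the render node is exposed to the container. A rough sketch of a VAAPI transcode via ffmpeg (the device path, filenames, and bitrate are placeholders):

    ```python
    # Run a VAAPI hardware transcode with ffmpeg.
    # The render node, filenames, and bitrate are placeholders - adjust for your box.
    # Needs an ffmpeg build with VAAPI support and access to /dev/dri.
    import subprocess

    def vaapi_transcode(src: str, dst: str, device: str = "/dev/dri/renderD128") -> None:
        cmd = [
            "ffmpeg",
            "-hwaccel", "vaapi",
            "-hwaccel_device", device,
            "-hwaccel_output_format", "vaapi",
            "-i", src,
            "-c:v", "h264_vaapi",   # hardware H.264 encoder
            "-b:v", "4M",
            "-c:a", "copy",         # leave audio untouched
            dst,
        ]
        subprocess.run(cmd, check=True)

    if __name__ == "__main__":
        vaapi_transcode("input.mkv", "output.mkv")
    ```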

  • commander@lemmy.world · 11 days ago

    I bought into AM5 in its first year with Zen 4, and I’m pretty confident Zen 7 will still be on AM5; there’s little chance DDR6 will be priced well by the end of the decade. I’m confident I’ll be on AM5 for 10+ years, but unlike the Intel desktop I had for 10 years, I’ll actually have a great upgrade path on my motherboard. AM4 is still relevant, and that’s getting to almost 10 years now; it’ll still be a great platform for years to come. Really, if you bought early in the life of the first-gen chips on AM4/AM5, you’re looking at a 15-year platform. Amazing.