• Suavevillain@lemmy.world
    +97 / -2 · 6 days ago

    AI has taken more things since its big push to be adopted in the public sector.

    Clean Air

    Water

    Fair electricity bills

    RAM

    GPUs

    SSDs

    Jobs

    Other people’s art and writing.

    There is no benefit to any of this stuff. It is just grifting.

  • Randelung@lemmy.world
    +104 / -1 · 6 days ago

    This bubble is going to become the entire market, isn’t it? Until it becomes too big to fail because 80% of the workforce is tied up in it. Then it is allowed to pop, costing the western world everything, all going into the pockets of the super rich, and we get to start over.

    • Ensign_Crab@lemmy.world
      +20 · 6 days ago

      Then it is allowed to pop, costing the western world everything, all going into the pockets of the super rich, and we get to start over.

      After the bailouts at the expense of the poor, of course.

    • humanspiral@lemmy.ca
      +6 · 6 days ago

      it becomes too big to fail because 80% of the workforce is tied up in it

      In 2008, the banking sector and auto industry needed bailouts for the investor/financial class. There was certainly no need to lay off core banking employees if government support was the last resort to keep the doors open AND to gain a controlling stake in future banking profitability on a hopefully sustainable (low-risk, in addition to low climate/global destruction) basis. The auto bailout did carry harsher terms than the banking bailout, and recessions definitely harm the sector, but the bailouts were squarely focused on the executives/shareholders with access to political friendships that yield gifts, rather than on truly needed lifelines or a wider redistribution of the benefits of sustainable business.

      The point is that the workforce is a “talking point” with no actual relevance to bailouts/too-big-to-fail. That entire stock market wealth is concentrated in the sector, and that we all have to give them the rest of our money (and our militarist-backed surveillance freedom) or “China will win” the only sector we pretend to have a competitive chance in, is why our establishment needs another “too big to fail” moment. We’ve started QE ahead of the crash this time.

      The workforce in the AI sector is relatively small: big construction, but relatively low operations employment. It displaces other hiring too.

  • lechekaflan@lemmy.world
    +62 / -1 · edited · 6 days ago

    Yet another chapter in the fucking AI craze started up by them fucking techbros.

    Also, someone forgot that in some places in the world, people have to use older PCs with SATA drives. And until these discontinuation announcements, Crucial and Samsung SATA drives were several tiers better than, say, those cheapo Ramsta drives.

    • Psythik@lemmy.world
      +7 / -34 · 6 days ago

      Discontinuing outdated tech has nothing to do with AI. SATA SSDs need to be retired; NVMe is superior and widely available.

      • The_Decryptor@aussie.zone
        +6 · 6 days ago

        Especially since you can get M.2-to-SATA adapters, so people stuck with SATA-only motherboards can still upgrade their storage.

        Literally the same deal as when companies stopped making IDE drives; people just used SATA-to-IDE adapters instead.

        • FrederikNJS@lemmy.zip
          +5 · 6 days ago

          Do you know of any M.2-to-SATA adapters that support NVMe? Or are they only for SATA M.2 drives?

          • The_Decryptor@aussie.zone
            +7 · 6 days ago

            Man, it sure would be helpful for my argument if I could.

            I went back and checked the ones I was looking at; very helpful fine print stating “not for NVEM ssds”, so they all only work with M.2 SATA SSDs. Hell of a letdown.

            • Amju Wolf@pawb.social
              +3 · 6 days ago

              Despite how similar the interface is, the protocol is completely different. NVMe is basically just PCIe, so adapting it to run “under” SATA would be difficult, if not nearly impossible. And most definitely not worth the extra price.

              • The_Decryptor@aussie.zone
                +1 · 5 days ago

                Well, no, you can translate; it just seems that nobody has actually made a product to do so.

                e.g. those M.2-SSD-to-USB adapters aren’t speaking NVMe to the host device. They either talk the traditional USB “bulk transfer” protocol, or potentially SCSI, translating that to NVMe for the SSD itself.

  • mlg@lemmy.world
    +18 / -1 · 6 days ago

    AFAIK this has already been a problem: you can find Samsung M.2 SSDs cheaper than Samsung SATA SSDs at the same capacity, because their cloud customers have all moved past classic SATA/SAS to NVMe U.2 and U.3, which are much closer to M.2 thanks to NVMe.

    I was planning to add a big SSD array to my server, which has a bunch of external 2.5" SAS slots, but it ended up being cheaper and faster to buy a 4-slot M.2 PCIe card and 4 M.2 drives instead.

    Putting it in an x16 PCIe slot gives me 4 lanes per drive with bifurcation, which gets me the advertised maximum possible speed on PCIe 4.
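    For anyone checking the math, the lane split works out like this (the per-lane figure is the usual PCIe 4.0 number after encoding overhead, not something from the post):

```python
# Rough math behind the setup above: an x16 slot bifurcated 4x4 gives
# each drive 4 lanes. PCIe 4.0 delivers roughly 1.97 GB/s per lane
# after 128b/130b encoding, so x4 lands right at typical Gen4 SSD ratings.
PCIE4_GBPS_PER_LANE = 1.969  # approx usable throughput per lane, GB/s

def lanes_per_drive(slot_lanes: int, drives: int) -> int:
    """Split the slot's lanes evenly across bifurcated drives."""
    return slot_lanes // drives

def drive_bandwidth_gbps(lanes: int) -> float:
    """Approximate per-drive throughput for a given lane count."""
    return lanes * PCIE4_GBPS_PER_LANE

lanes = lanes_per_drive(16, 4)
print(lanes, round(drive_bandwidth_gbps(lanes), 1))  # 4 lanes, ~7.9 GB/s
```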

    Whether or not the RAM surge will affect chip production capacity is the real issue. It seems all 3 OEMs could effectively reduce capacity for all other components after sinking billions of dollars into HBM RAM. It wouldn’t just be SSDs; anything that relies on the same supply chain could be heavily affected.

    • iglou@programming.dev
      +4 · 5 days ago

      Exactly this. Micron ended their consumer RAM. Samsung here is just stopping production of something that is arguably outdated and has a perfectly fine, already more available, and usually cheaper or equivalent modern replacement.

  • Kyden Fumofly@lemmy.world
    +22 · 6 days ago

    The leak comes after another report detailed that Samsung has raised DDR5 memory prices by up to 60%.

    MF… And why are they winding down SSD production this time? Last time was 2 years ago, because SSD prices were low and they wanted to raise them (which happened).

  • dependencyinjection@discuss.tchncs.de
    +38 · 7 days ago

    I built a PC a couple of years ago when I really didn’t need one, then over-specced it just because. I’m very happy right now, as the prices are insane; it feels like I could sell the PC for more than it cost me, which is mental.

    • Randelung@lemmy.world
      +10 · 6 days ago

      Don’t worry, you can use AI on anything that can access the internet! No need to ever have personal (let alone private) thoughts - I’m sorry, data - again.

      MS has been trying to get you to give up your personal computer for years. Do everything in the cloud, please! Even gaming, with Stadia! And now they’re getting their wish. All it took was ruining the entire global economy.

    • floofloof@lemmy.ca
      +19 · 6 days ago

      Not just the tech industry. A huge proportion of the US economy is made up of bets on AI. Like the crash of 2008 (but worse, some predict), it will hurt everyone but the richest, who will become even richer.

  • Zorsith@lemmy.blahaj.zone
    +34 / -1 · 7 days ago

    Tbh it’s not a bad call. I used to work somewhere that bought hundreds of 500 GB SATA SSDs for laptop upgrades that just… sat on a shelf, because none of the new laptops ordered could even take a SATA drive. Hell, they’re Crucial-branded, so they’re probably collectable if Micron keeps Crucial dead for long enough.

    • RamRabbit@lemmy.world
      +16 · 7 days ago

      That sucks. They probably could have given them out to employees as a little bonus, to build a bit of goodwill, rather than have them sit on a shelf.

      • Zorsith@lemmy.blahaj.zone
        +17 · edited · 7 days ago

        Government. Ain’t nobody want to get caught “stealing” from the government (they’re probably going to be destroyed ten years after they’re completely obsolete). Waste of damn near a hundred terabytes of storage.

        • bthest@lemmy.world
          +4 · 7 days ago

          Government. Ain’t nobody want to get caught “stealing” from the government

          It’s fine as long as you’re rich. Trump and friends are clearing the place out.

        • RamRabbit@lemmy.world
          +10 · 7 days ago

          It would have to be a voluntary thing, not just handed to everyone. “Put your name on this sheet if you want one.”

          • shalafi@lemmy.world
            +4 · 7 days ago

            My wife would sign up and she has no idea what a drive is, of any type. If you try to explain it to her she’ll turn her ears off. But she’ll only hear “FREE STUFF!”.

  • calamityjanitor@lemmy.world
    +29 · 7 days ago

    I have 4x 6TB HDDs in my NAS. Around 5 years ago I decided to simply replace any dead drives with 6TB ones instead of my previous strategy of slowly upgrading their size. I figured I could swap to 8TB 2.5" SATA SSDs that had just started to exist and would surely only get cheaper in the future…

      • calamityjanitor@lemmy.world
        +5 · 6 days ago

        In my head I thought one could make relatively cheap, high-capacity drives in the 2.5" SATA form factor by using more NAND chips of lower capacity. You give up speed and PCB space, but that’s fine, since bandwidth and IOPS are limited by SATA anyway and there’s plenty of space compared to M.2.

        Turns out it doesn’t shake out that way: controller ICs that support SATA aren’t coming out any more, and NAND ICs are stacked internally to use up channels while not taking up PCB space.

        There are some enterprise options, but they’re mad expensive.
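        The SATA ceiling mentioned above is easy to put numbers on (these are the standard interface figures, nothing from the thread):

```python
# SATA III signals at 6 Gb/s with 8b/10b encoding, so ~600 MB/s usable
# no matter how many NAND channels the drive has. A PCIe 4.0 x4 NVMe
# link (16 GT/s per lane, 128b/130b encoding) is over 10x that.
def sata3_limit_mb_s() -> float:
    """Usable SATA III throughput: 6 Gb/s line rate minus 8b/10b overhead."""
    return 6e9 * (8 / 10) / 8 / 1e6

def pcie4_x4_mb_s() -> float:
    """Usable PCIe 4.0 x4 throughput: 16 GT/s/lane minus 128b/130b overhead."""
    return 16e9 * (128 / 130) / 8 * 4 / 1e6

print(round(sata3_limit_mb_s()))  # 600 MB/s
print(round(pcie4_x4_mb_s()))     # ~7877 MB/s
```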

        • SapphironZA@sh.itjust.works
          +6 · 6 days ago

          I’ve cracked open a few faulty SATA SSDs. Quite a few of the recent models are just 2242 or 2230 M.2 SSDs with a converter. Even the bigger 2 TB ones.

  • brucethemoose@lemmy.world
    +12 / -1 · 6 days ago

    Aside: WTF are they using SSDs for?

    LLM inference in the cloud is done basically entirely in VRAM. Rarely, stale K/V cache is spilled to RAM, but new attention architectures should minimize that. Large-scale training, contrary to popular belief, is a pretty rare event that most data centers and businesses are incapable of.

    …So what do they do with so much flash storage!? Is it literally just FOMO server buying?
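    For a sense of why the K/V cache is the main thing worth spilling out of VRAM, here is a rough sizing sketch; the model shape is a hypothetical Llama-70B-like config with grouped-query attention, not anything from the article:

```python
# Rough K/V-cache sizing for a transformer decoder: two tensors (K and V)
# per layer, each [kv_heads * head_dim] per token, at 2 bytes for fp16.
def kv_cache_bytes(layers: int, kv_heads: int, head_dim: int,
                   seq_len: int, bytes_per: int = 2) -> int:
    """Per-sequence K/V cache size in bytes."""
    return 2 * layers * kv_heads * head_dim * seq_len * bytes_per

# Hypothetical 70B-ish shape: 80 layers, 8 KV heads of dim 128,
# and a 32k-token context.
gib = kv_cache_bytes(80, 8, 128, 32_768) / 2**30
print(round(gib, 1))  # 10.0 GiB per sequence
```

    Tens of GiB per batch of long contexts fits the claim: it lives in VRAM or, at worst, RAM; flash is never on that path.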

    • T156@lemmy.world
      +11 / -1 · 6 days ago

      Storage. There aren’t enough hard drives, so datacentres are also buying up SSDs, since they’re needed to store training data.

      • brucethemoose@lemmy.world
        +3 / -6 · 6 days ago

        since it’s needed to store training data.

        Again, I don’t buy this. The training data isn’t actually that big, nor is training done on such a huge scale so frequently.

        • finitebanjo@lemmy.world
          +10 / -1 · edited · 6 days ago

          As we approach the theoretical error-rate limit for LLMs, as argued in the 2020 OpenAI scaling-laws paper and corrected by DeepMind’s 2022 paper, the required training and power costs rise toward infinity.

          In addition to that, the companies might have many different nearly identical datasets to try to achieve different outcomes.

          Things like books and Wikipedia pages aren’t that bad; Wikipedia itself compressed is only about 25 GB, and a few hundred petabytes could store most of that material. But images and videos are also valid training data, and those are much larger, and then there’s readable code. On top of that, all user inputs have to be stored so the chatbot can reference them again later, if it offers that service.
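          The scale gap between text and video corpora can be put in rough numbers; the corpus sizes below are ballpark assumptions for illustration, not figures from the thread:

```python
# Compressed English Wikipedia is ~25 GB of text; even a modest video
# corpus runs to petabytes. Sizes below are illustrative assumptions.
GB, TB, PB = 10**9, 10**12, 10**15

wikipedia = 25 * GB
# Hypothetical video corpus: 100 million clips at ~50 MB each.
video_corpus = 100_000_000 * 50 * 10**6

print(video_corpus / PB)         # 5.0 PB
print(video_corpus / wikipedia)  # 200000.0x the size of Wikipedia
```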

    • Urga@lemmynsfw.com
      +2 · 6 days ago

      The lines used to produce VRAM also do SSD NAND flash, so they make fewer SSDs to make more VRAM.