• MagicShel@lemmy.zip · 14 hours ago

    I read, I think just last week but certainly within the last month, that someone has created an AI card that cuts power usage by 90%. (I know that's vague and leaves a lot of questions.) It seems likely that AI-specific hardware and graphics hardware will diverge, and I hope they do.

    • wonderingwanderer@sopuli.xyz · 10 hours ago

      I think it’s called an inferencing chip. I read about it a few months ago.

      Basically, as it was explained, the most energy-intensive part of AI is training the models; once training is complete, it takes far less energy to make inferences from them.

      So the idea with these inferencing chips is that the AI models are already trained; all they need to do now is make inferences. So the chips are designed more specifically to do that, and they’re supposed to be way more efficient.
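      To make the asymmetry concrete, here's a toy NumPy sketch (nothing to do with any specific chip, just an illustration): training repeats forward and backward passes over many steps, while serving a trained model is a single forward pass, so the rough FLOP counts differ by orders of magnitude.

```python
import numpy as np

# Toy linear model: training loops over forward + backward passes,
# inference is a single forward pass. FLOP counts are rough estimates.
rng = np.random.default_rng(0)
X = rng.normal(size=(256, 32))        # 256 samples, 32 features
y = rng.normal(size=(256, 1))
w = np.zeros((32, 1))

flops_per_forward = 2 * X.shape[0] * X.shape[1]  # one matmul, roughly
train_steps = 1000

# Training: forward pass + backward pass (roughly ~3x a forward) per step.
for _ in range(train_steps):
    pred = X @ w
    grad = X.T @ (pred - y) / len(X)  # gradient needs another matmul
    w -= 0.01 * grad
train_flops = train_steps * 3 * flops_per_forward

# Inference: one forward pass on a single new input.
x_new = rng.normal(size=(1, 32))
pred_new = x_new @ w
inference_flops = 2 * x_new.shape[1]

print(f"training ~{train_flops:,} FLOPs vs inference ~{inference_flops:,} FLOPs")
```

      Dedicated inference silicon exploits exactly this: it only needs to do the forward-pass math, often at lower precision, so it can drop the hardware that general-purpose training GPUs carry.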

      I kept waiting to see it in devices on the consumer market, but then it seemed to disappear, and I wasn't even able to find any articles about it for months. It was like the whole thing vanished. Maybe Nvidia wanted to suppress it because they were worried it would reduce demand for their GPUs.

      At one point I had seen a smaller company listing laptops with their own inferencing chips for sale, but the webpage seems to have disappeared, or at least the page where they were selling them.