• doodledup@lemmy.world · 2 months ago

    Alexa and LLMs aren’t fundamentally that different from each other. The architecture differs slightly, and, most importantly, an LLM is a much larger network.

    The problem with LLMs is that they require immense compute power.

    I don’t see how LLMs will get into households any time soon. It’s not economical.

    • just another dev@lemmy.my-box.dev · 2 months ago

      > The problem with LLMs is that they require immense compute power.

      To train. But you can run a relatively simple one like phi-3 on quite modest hardware.
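A rough back-of-the-envelope calculation shows why training and inference are worlds apart in compute. This sketch uses the common approximations of ~6·N·D FLOPs for training and ~2·N FLOPs per generated token for inference; the parameter count and token count below are assumed figures for a phi-3-mini-class model, not exact values.

```python
# Rough FLOP estimates: training a small LLM vs. generating one token with it.
# Approximations: training ~ 6*N*D FLOPs, inference ~ 2*N FLOPs per token.
# N and D are assumed, phi-3-mini-scale figures (not official numbers).
N = 3.8e9   # model parameters
D = 3.3e12  # training tokens

train_flops = 6 * N * D            # one-time cost, needs a datacenter
infer_flops_per_token = 2 * N      # per-token cost, fits on consumer hardware

print(f"training:  ~{train_flops:.1e} FLOPs")
print(f"inference: ~{infer_flops_per_token:.1e} FLOPs per token")
print(f"ratio:     ~{train_flops / infer_flops_per_token:.1e}")
```

Under these assumptions the one-time training cost exceeds the per-token inference cost by roughly twelve orders of magnitude, which is why a pre-trained model can run on a laptop even though training it could not.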

    • Halcyon@discuss.tchncs.de · 2 months ago

      The immense computing power for AI is needed for training LLMs; it’s far less for running a pre-trained model on a local machine.

    • helenslunch@feddit.nl · 2 months ago

      > The problem with LLMs is that they require immense compute power. I don’t see how LLMs will get into the households any time soon. It’s not economical.

      You realize the current systems run in the cloud?

      • doodledup@lemmy.world · 2 months ago

        Well yeah. You could slap Gemini onto Google Home today; you probably wouldn’t even need a new device for that. The reason they don’t do it is economic.

        My point is that LLMs aren’t replacing those devices. They are essentially the same thing; one is just a trimmed-down version of the other for economic reasons.