• Kid_Thunder@kbin.social
    7 months ago

    It’s already here. I run AI models on my GPU, with training data from various sources, for both search/GPT-style chat and images. You can basically point and click your way through this with GPT4All, which bundles a chat client and lets you pick from popular models without needing to know the CLI or much of anything else. It gives you a ChatGPT-like experience offline, using your GPU if it has enough VRAM for the model you’ve chosen, or falling back to your CPU if it doesn’t. I don’t think it does images, but there are other projects out there that make that just as simple on your own hardware.
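    The “GPU if it has enough VRAM, CPU if it doesn’t” behavior boils down to a simple size check. A toy sketch (the function name and the 1.2× working-memory overhead are hypothetical, just for illustration):

    ```python
    def pick_backend(model_size_gb: float, gpu_vram_gb: float,
                     overhead: float = 1.2) -> str:
        """Toy illustration of the VRAM check a local-inference tool makes:
        load the model on the GPU only if its weights (plus some working
        overhead) fit in VRAM, otherwise fall back to CPU inference."""
        return "gpu" if model_size_gb * overhead <= gpu_vram_gb else "cpu"

    # A 7B-parameter model quantized to 4 bits is roughly 4 GB of weights:
    print(pick_backend(4.0, 8.0))  # fits comfortably on an 8 GB card -> gpu
    print(pick_backend(4.0, 4.0))  # needs ~4.8 GB, only 4 GB of VRAM -> cpu
    ```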

    • cybersandwich@lemmy.world
      7 months ago

      The M-series Macs with unified memory and ML cores are insanely powerful, and much more flexible, because your 32 GB of system memory doubles as GPU VRAM.

    • funkajunk@lemm.ee
      7 months ago

      I meant mobile tech: running your own personal AI on your phone.

      • Kid_Thunder@kbin.social
        7 months ago

        Right now the closest we have to that is running Ampere clusters. I say that because it’s going to be some years before any phone GPU/CPU can effectively run a decent AI model. I don’t doubt there will be some sort of marketing about ‘boosting’ AI via your phone’s CPU/GPU, but it won’t amount to much more than a marketing ploy.

        It’s far more likely that the work will continue to be offloaded to the cloud. There’s much more market motivation to keep your data in the cloud than out of it.