Phoenix3875@lemmy.world to Programmer Humor@programming.dev · 2 days ago
Work of pure human soul (and pure human sweat, and pure human tears)
bi_tux@lemmy.world · 2 days ago: You don't even need a supported GPU; I run Ollama on my RX 6700 XT.
passepartout@feddit.org · 18 hours ago: I have the same GPU, my friend. I was trying to say that you won't be able to run ROCm on some Radeon HD xy from 2008 :D
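For the record, the RX 6700 XT (gfx1031) is not on ROCm's officially supported list either, but a commonly cited workaround is to have ROCm treat it as the supported gfx1030. A minimal sketch, assuming an Ollama build with ROCm support; the environment variable below comes from that workaround, not from anything stated in this thread:

```shell
# Sketch: spoof the GFX version so the ROCm runtime treats the
# RX 6700 XT (gfx1031) as the officially supported gfx1030.
export HSA_OVERRIDE_GFX_VERSION=10.3.0
# Start the Ollama server with the override in its environment.
ollama serve
```

If Ollama runs as a systemd service, the same variable can be set via a service override instead of an interactive shell.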
BaroqueInMind@lemmy.one · 1 day ago: You don't even need a GPU; I can run Ollama with Open WebUI on my CPU with an 8B model, fast af.
bi_tux@lemmy.world · 21 hours ago: I tried it with my CPU (with Llama 3 8B), but unfortunately it ran really slowly (I've got a Ryzen 5700X).
tomjuggler@lemmy.world · 5 hours ago: I ran it on my dual-core Celeron and… just kidding, try the mini 1B Llama. I'm in the same boat with a Ryzen 5000-something CPU.
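The suggestion above is the practical one for CPU-only boxes: a 1B-parameter model is far lighter than an 8B one. A minimal sketch, assuming the `llama3.2:1b` tag is still available in the Ollama model library (model names change over time):

```shell
# Pull a small 1B-parameter model; CPU-only inference is usually
# tolerable at this size even on older desktop CPUs.
ollama pull llama3.2:1b
# Run a one-off prompt against it from the command line.
ollama run llama3.2:1b "Explain ROCm in one sentence."
```

If even 1B is slow, a more aggressively quantized tag of the same model (when offered) trades some quality for further speed.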