daggermoon@lemmy.world to No Stupid Questions@lemmy.world · 5 days ago
What can I actually do with 64 GB of RAM?
zkfcfbzr@lemmy.world · 5 days ago
I have 16 GB of RAM and recently tried running local LLMs. Turns out my RAM is a bigger limiting factor than my GPU. And, yeah, Docker's always taking up 3-4 GB.
Mubelotix@jlai.lu · 5 days ago
Either you use your CPU and RAM, or your GPU and VRAM.
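In practice, tools like llama.cpp expose this choice as a number of layers to offload to the GPU. A minimal sketch using the llama-cpp-python bindings (the model path is a placeholder for any local GGUF file):

```python
# Minimal sketch with llama-cpp-python (pip install llama-cpp-python).
# "model.gguf" is a placeholder path, not a specific model.
from llama_cpp import Llama

# CPU inference: all weights live in system RAM.
llm_cpu = Llama(model_path="model.gguf", n_gpu_layers=0)

# GPU inference: -1 offloads every layer into VRAM
# (requires a build with CUDA/Metal/etc. support).
llm_gpu = Llama(model_path="model.gguf", n_gpu_layers=-1)

out = llm_cpu("Q: What is RAM used for? A:", max_tokens=32)
print(out["choices"][0]["text"])
```

It's not a strict either/or, though: values of n_gpu_layers between 0 and the model's layer count split the weights across VRAM and system RAM, trading speed for fit.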
zkfcfbzr@lemmy.world · 5 days ago
Fair, I didn't realize that. My GPU is a 1060 6 GB, so I won't be running any significant LLMs on it. This PC is pretty old at this point.
Mubelotix@jlai.lu · 5 days ago
You can run a very decent LLM with that tbh.
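For scale: the weights of a 4-bit quantized 7B model come in under 4 GB, which is why a 6 GB card like the 1060 can still host one. A back-of-envelope estimate (the bits-per-weight figure is an approximation for Q4_K_M-style quantization, not a measured value):

```python
# Back-of-envelope VRAM estimate for a quantized model's weights.
# Ignores the KV cache and runtime overhead, which grow with context length.
params = 7e9            # 7B-parameter model
bits_per_weight = 4.5   # rough average for Q4_K_M-style 4-bit quantization
weights_gb = params * bits_per_weight / 8 / 1e9
print(f"~{weights_gb:.1f} GB of weights")  # ~3.9 GB -> fits a 6 GB card
```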