I got 32 additional GB of ram at a low, low cost from someone. What can I actually do with it?

  • zkfcfbzr@lemmy.world · 5 days ago

    I have 16 GB of RAM and recently tried running local LLMs. Turns out my RAM is a bigger limiting factor than my GPU.

    And, yeah, Docker’s always taking up 3–4 GB.
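    For a rough sense of why RAM runs out first: a model's weights alone take roughly (parameter count × bytes per weight), before counting the KV cache and runtime overhead. A minimal back-of-the-envelope sketch (the 7B size and quantization levels are illustrative assumptions, not anything specific to my setup):

    ```python
    # Rough estimate of memory needed just for an LLM's weights.
    # Ignores KV cache and runtime overhead, which add a few more GB.
    def model_weight_gb(params_billions: float, bits_per_weight: int) -> float:
        bytes_per_weight = bits_per_weight / 8
        return params_billions * bytes_per_weight  # billions of params × bytes each = GB

    # A hypothetical 7B model at common quantization levels:
    for bits in (16, 8, 4):
        print(f"7B @ {bits}-bit ≈ {model_weight_gb(7, bits):.1f} GB")
    # 16-bit ≈ 14 GB, 8-bit ≈ 7 GB, 4-bit ≈ 3.5 GB
    ```

    So on 16 GB of system RAM, a 7B model at full 16-bit precision plus Docker's 3–4 GB already leaves almost nothing for everything else, which is why quantized models are the usual choice.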

      • zkfcfbzr@lemmy.world · 5 days ago

        Fair, I didn’t realize that. My GPU is a GTX 1060 6 GB, so I won’t be running any significant LLMs on it. This PC is pretty old at this point.