For most of college, I’ve kept it simple: I’d create a directory in my home folder for each project, then eventually move older or inactive ones into ~/programming/. When I change devices or hit file size limits, I’ll compress and send things to my NAS.
This setup has worked pretty well so far. But now that I’m graduating and my projects keep stacking up, I’m starting to wonder if there’s a more efficient system out there.
Curious—how do you all organize and store your projects? Any tips or methodologies that have made your lives easier over time?
The only person I’ve talked to about this is my mentor, who’s been programming since the 60s (started on the IBM 1620 and Bendix G15). He mostly just keeps projects in directories in his home directory and uses his godly regular-expression skills to find things that way. Makes me wonder if I’m overthinking it…
I have a folder for my projects at the root of my drive, and within it my GitHub repos are each contained in their own directory, named the same as the project.
If I’m learning something, I have a folder for the topic and a Logseq file with all of my notes. Then I have a folder for book references, one for video or audio references, and one for my practice projects.
I just have all my active projects under $HOME/projects/ and projects from other people in $HOME/bin/
I have a dedicated directory with subdirectories for each project and that’s it
Whatever Cargo generates for me. If I use workspaces, then I put the subprojects at the root of the workspace directory.
I think this is asking about where to keep projects, not how to organize them internally.
Ah, OK. I keep them in the Documents directory.
xd
On my personal computer it’s ~/Projects/<name>, and you need to remember that real life is not like college: you won’t be working on a new project every week. If you have more stuff than you can manage like this, you’ve bitten off more than you can chew.

On my work computer it’s a bit more complex, because I have to work with other people’s projects as well, so I have a ~/Work folder and in it several folders by type of stuff, e.g. ops for operational stuff such as scripts to deploy things or grant permissions, code for server (and client) code, etc. Also, if I’m working on something specific that requires multiple repos, I create a folder for that project with the repos inside. Everything is in git.

I tend to use IntelliJ as an IDE, so my projects are all in ~/IdeaProjects/[PROJECTNAME]
When I change devices or hit file size limits, I’ll compress and send things to my NAS.
Whaaatt!?!!? That sounds like you don’t use git? You should use git. It’s a requirement for basically any job, and there’s no reason not to use it on every project. Then you can keep your projects on a server somewhere: on your NAS if you want, or else on something like GitHub/GitLab/Bitbucket. That way your local copies don’t really matter, only what’s on the remote, and with decent backups of that you don’t need to constantly archive things from your local machine.
Yeah, I think a local Git server would be good; I’ll try out Forgejo since people seem to like it. I’ve been using git for a lot of projects, but not so much for large files and HW stuff, since GitHub has size limitations. It does seem like it would be freeing to be able to delete whatever I want from my workstation without worrying about losing stuff.
Size limitations? In git?
What is the average size of your source code files?
Normally you’d never run out of space in git unless you’re committing large binary files.
I put stuff in places, then immediately forget where I put them.
“… and God said, let us make man in our own image.”
I push every project I work on right away to my gitea instance. If I expect not to work on something for some time I just delete the local copy.
When I change devices or hit file size limits, I’ll compress and send things to my NAS.
Well, that sounds inconvenient.
Yeah, I really should start using Git for everything, but I’ve been working with a lot of large datasets recently (mostly EEG data). A big part of improving accuracy comes from cleaning the data, which is huge and takes a while to process. I could set up a local Git server to keep track of everything or just save the base data files and regenerate as needed, but on my current setup, that process can take anywhere from 2-6 hours depending on the task. So for now, I’ve just been managing everything locally to save time.
git LFS might be for you. If the data takes that long to reprocess, I think it’s fine to check it in (possibly using LFS).
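For reference, with LFS you mark which paths get stored as LFS objects via .gitattributes. A minimal sketch (the *.fif pattern and data/cleaned/ path are just placeholders for EEG recordings; adjust to your actual data layout):

```
# Store raw EEG recordings and cleaned datasets as LFS pointers
*.fif filter=lfs diff=lfs merge=lfs -text
data/cleaned/** filter=lfs diff=lfs merge=lfs -text
```

You’d run git lfs install once per machine and commit the .gitattributes file alongside the data, so everyone cloning the repo gets the same tracking rules.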
I keep a root folder. On Windows it’s in c:\something on Linux it’s in /something
Under there I’ve got projects organized by language. This helps me organize nix shells and venvs.
Syncthing keeps the code bases synced between multiple computers.
I don’t separate work from home because they don’t live in the same realm.
Only home stuff in the syncthing.
${HOME}/Projects/(Personal|Work)/<project name>
If either folder gets too busy, I start creating meta folders for projects, which normally correspond with a GitLab group.
I just blow out the folders with a good ol’ rm -rf ./, and git pull if I want to mess with it again.
A similar question was recently asked here.
Generally what I’ve seen work well in my career and is consistent across thousands of devs I’ve worked with:
~/[whateverFolderNameYouWillRemember]/[organization]/[project]
When it comes to finding things, I recommend just using a fuzzy finder, such as fzf.
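For example, a tiny shell function along these lines (the pj name and the ~/projects path are just placeholders, and it assumes fzf is on your PATH):

```shell
# pj: pick a project directory with fzf and cd into it
pj() {
  local dir
  dir="$(find ~/projects -mindepth 1 -maxdepth 3 -type d 2>/dev/null | fzf)" || return
  cd "$dir"
}
```

Drop it in your shell rc file; fzf narrows the candidate list interactively as you type.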
Building on this, I recommend zoxide instead of only fzfing or regexping.
For people who like to keep everything they ever create, like college students, you can use z 18.04/1 to get to a directory like ~/hw/random-school/fresh-1/analysis-18.04/pset1. Lets you nest without fear.
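For anyone setting this up: zoxide hooks into the shell at startup via a one-line rc fragment (a config sketch, assuming zoxide is already installed; swap bash for zsh/fish as appropriate):

```shell
# In ~/.bashrc: let zoxide track every directory you cd into
# and provide the `z` command
eval "$(zoxide init bash)"
```

After that, z <fragment> jumps to the best-matching directory you’ve visited, and zi opens an interactive picker (via fzf, if installed).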
(Also, about your question: I’ve personally used ~/git/<projname>/ and ~/git/<org>/<projname> at the same time – e.g. ~/git/aur/fuzzel-git)
I used to put everything in ~/Programming at the top level. I later started grouping projects by type (JVM, web, etc.) in subfolders because it was getting hard to find things. This was synced with Nextcloud. However, at some point I passed 2 million files (200 GB) in that folder and decided to search for a better solution.
I ended up using a selfhosted Forgejo instance. It allows for easy code searching across all projects, tagging projects by topic and language, LFS, and has useful project management tools built-in.
I’ve seen a lot of talk about large file sizes. How can you realistically reach 200 GB in text? That’s around 2×10^11 characters. Or do you store something else as well, like SQL dumps or pictures/textures/models?
It consisted of tensor weights, datasets (which can reach several gigabytes each), images, 3D models, and roughly 250+ programming projects with binaries, Git history without LFS, and a lot of build files.
Nextcloud was able to sync it all, but syncing was getting so slow that I had to keep my new laptop running for almost an entire day to get everything synced to it. It also wasn’t that great at excluding certain folders (like build caches or NPM package files); you had to set up exclusions on each device separately. Another problem with Nextcloud sync was that it would sometimes duplicate projects after they had been moved into a subfolder.
As an addition to your post: I’m also in the process of learning C/C++, and I’m curious how others arrange their actual project files and include directories. For example, if there’s a bunch of classes having to do with UI elements, do you group them each in their own file inside a dedicated directory? I’ve also seen projects where everything was just thrown into the top-level directory, headers and implementation files together in a giant pile of source files.
projects/[rust|cpp|python|..]/proj-name
I used to do this, but IMHO the language is hardly a useful index. When does it happen that you want to see everything written in Python? For me, that’s never.
Also where do you put multi-language projects? Like, go backend with typescript frontend or whatever.