I’m beautiful and tough like a diamond…or beef jerky in a ball gown.

  • 30 Posts
  • 72 Comments
Joined 8 months ago
Cake day: July 15th, 2025

  • As a current web developer, I’m with you except on the tables. I hate tables with a burning, fiery passion.

    I do like responsive design to account for mobile, tablet, and desktop, so I try to blend that with classic “just works / just here for the data” layouts and generally minimize the flash in favor of the content. In other words, my projects take a lot of work to look that simple lol.
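To make that blend concrete, responsive layouts mostly come down to a few breakpoint thresholds. A minimal sketch of the decision logic (in real projects this lives in CSS media queries; the 768/1024 px cutoffs are common conventions, not anything from the comment above):

```python
# Illustrative breakpoint logic; the thresholds are hypothetical defaults,
# roughly matching widely used mobile/tablet/desktop cutoffs.
def pick_layout(viewport_width_px: int) -> str:
    if viewport_width_px < 768:
        return "mobile"
    if viewport_width_px < 1024:
        return "tablet"
    return "desktop"
```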

  • Disclaimer: All of my LLM experience is with local models in Ollama on extremely modest hardware (an old laptop with NVIDIA graphics), so I can’t speak to the technical reasons the context window isn’t infinite, or at least larger, on the big players’ models. My understanding is that the context window is basically the model’s short-term memory. In humans, short-term memory is also fairly limited in capacity, but unlike humans, the LLM can’t really see (or hold) the big picture in its mind.

    But yeah, everything you said is correct. Expanding on that: if you try to get it to generate something long-form, such as a novel, it’s basically just generating chapters indefinitely, using the previous chapter (or as much of the history as fits into its context window) as the reference for the next. At minimum, that means the result will be full of plot holes and will never reach a conclusion unless explicitly directed to wrap things up. And, again, given the limited context window, the ending will also be full of plot holes, based essentially on only the previous chapter or two.

    It’s funny because I recently found an old backup drive from high school with some half-written Jurassic Park fan fiction on it, so I tasked an LLM with fleshing it out, mostly for shits and giggles. The result is pure slop that seems like it’s building to something and ultimately goes nowhere. The other funny thing is that it reads almost exactly like a season of Camp Cretaceous / Chaos Theory (the animated kids’ JP series), and I now fully believe those are also LLM-generated.
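The sliding-window behavior described above can be sketched as a loop that only ever conditions on the last couple of chapters. The 2-chapter window, the function names, and the toy generator are all illustrative, not how any particular model actually works:

```python
from collections import deque

CONTEXT_CHAPTERS = 2  # assumed window: the "model" only sees this many prior chapters

def generate_novel(generate_chapter, total_chapters):
    """Naive long-form loop: each chapter is conditioned only on the few
    previous chapters that still fit in the context window; everything
    older has effectively been forgotten."""
    context = deque(maxlen=CONTEXT_CHAPTERS)
    book = []
    for i in range(total_chapters):
        chapter = generate_chapter(i, list(context))
        book.append(chapter)
        context.append(chapter)  # oldest chapter silently falls out
    return book

# Toy stand-in for the model: each "chapter" just reports how much
# history it could actually see when it was generated.
book = generate_novel(lambda i, ctx: f"ch{i} (saw {len(ctx)} prior)", 5)
```

By chapter 3 the loop has already lost sight of chapter 0, which is exactly why the early setup never pays off in the ending.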

  • I don’t even bother opening up local ports anymore. It’s just too much hassle when I switch providers, email services all seem to universally sinkhole anything originating from a residential IP even when I can convince them to unblock 25/TCP, and I refuse to pay extra for a static IP or get upsold to business class at a massive price increase.

    My ISP, while otherwise fine, still hasn’t rolled out IPv6, the DHCPv4 lease duration is short, and renewals will randomly assign a different IP rather than extending the lease on the existing one. I don’t like relying on dynamic DNS, or on running a daemon to update my public DNS records whenever my public IP changes. Been there, done that, bought a crappy t-shirt at the gift shop.

    I’ve had a VPS for close to 10 years now that serves as my main frontend and, through some VPN and routing trickery, lets me keep my email server on-prem while the VPS handles all inbound and outbound communication. A side benefit of this setup is that I can run my email server from literally anywhere, on anything with an internet connection. I keep a copy of my email stack on a Pi Zero clone that stays in sync with my main one. During long power outages, I can boot that up and run it from a hotspot on a power bank for almost 2 days (or indefinitely when I’m also charging the power bank from a solar panel lol).
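For anyone who does go the dynamic DNS route anyway, the update daemon boils down to a check-and-compare loop like this sketch. `update_record` is a placeholder for whatever your DNS provider’s API call is, and the IP-lookup URL is just one of several such services:

```python
import urllib.request

def get_public_ip() -> str:
    # ifconfig.me returns your current public IP as plain text
    with urllib.request.urlopen("https://ifconfig.me/ip", timeout=10) as resp:
        return resp.read().decode().strip()

def sync_dns(current_record: str, public_ip: str, update_record) -> bool:
    """Push an update only when the public IP actually changed.
    `update_record` stands in for a hypothetical provider API call."""
    if public_ip == current_record:
        return False  # lease renewed with the same IP: nothing to do
    update_record(public_ip)
    return True
```

In practice you would run this from cron (or a loop with a sleep) at an interval shorter than the DHCP lease, which is exactly the moving part the VPS setup above avoids.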