How much of a discount are you expecting to start gaming on a 30k card with no video output?
Not for gaming, for running AI open source models and other AI shenanigans. My 4080 Super has been filling my gaming needs and will for years to come, but it’s not enough for my AI interests lol
The most I can get out of this 4080 is running a ~7B param model, but I want to run cooler shit like that new open source DeepSeek v3 that dropped the other day.
So you’re waiting for the AI bubble to burst because you can’t wait to run all the cool new AI models?
Yea, the underlying tech is what interests me and I have a few potential use cases. Use cases that I would never entrust a random company with. For example, the concept of MS Recall is cool, I’d never trust Microshit’s implementation though. But an open source local version where I’m in control of all the security implementation? Hell yea lol
That’s the problem. If the use case is super cool, and 99% of people have no knowledge (or motivation) to set it up for themselves, the online services will keep existing, and the bubble won’t really burst.
Even if some individual companies fail (and they will, there are some truly horrendous ideas getting funding), the big players will buy the GPUs wholesale before they hit ebay.
Lol no, I mean it would be a bubble if it didn’t provide anything useful, or transformative, but that’s far from the truth.
Like it or not, even LLMs have been found to help in health treatments, mental support, workplace efficiency, and so on.
AI is here to stay, it’s basically the next industrial revolution
The problem is that the enterprise level cards can’t really perform at the consumer market level nor are they designed for it. Many don’t even have video outputs.
I believe it is likely that there will be a burst at some point, just as with the dot-com bust.
But I think many people wrongly think that it will be the end of or a major setback for AI.
I see no reason why, in twenty years, AI won’t be as prevalent as dot-coms are now.
I agree, history always repeats itself. But perhaps the timing is different; it could be 20 years, 10 years, or 50 years, who knows.
Nuclear energy could also be a weird side effect of the bubble.
Aw, I was hoping the whole thing would rhyme after the first line.
You don’t want a used GPU that’s been running overclocked for years on end bro.
Multiple outlets including LTT and Gamers Nexus have debunked this.
The only thing you may have to do, if you notice unusual performance, is reapply thermal paste to the GPU, and that’s only because most thermal paste dries out after years of sitting around or being used.
The price of GPUs would go down as there would be less demand.
GPU prices are gonna get cheaper, annnnyyyy day now folks, any day now
That’s what I’ve been saying since 2020; people don’t have patience any more.
I dumped my old af GPU for more than I paid for it because people have no chill.
I will sell my Polaris for a ridiculous amount of money. I will sell my Polaris for a ridiculous amount of money. I will sell my Polaris for a ridiculous amount of money.
Manifesting 🙏🙏🙏🙏
Just deserts.
Just deserts.
We ain’t found shit
What did OP mean by that?
Onlyflans
Just desserts
Same goes for the death of windows 10. I want me some cheap Linux boxes.
There may be a dip in prices for a bit, but since covid, more companies have realized they can get away with manufacturing fewer units and selling them for a higher price.
deleted by creator
Oh but I do, ironically, for the same use cases LMAO. I like to tinker with AI and I like Microshit’s concept of Recall and similar ideas, like having an AI to be able to search through all my documents with nothing but a sentence or idea of what I’m looking for.
But ain’t no fucking way I’m going to give a closed source AI that I’m not running myself that level of access
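The core of that local document search idea can be sketched without any cloud service at all. This is a toy bag-of-words ranker, not a real setup — in practice you’d swap `embed()` for a local sentence-embedding model (llama.cpp, Ollama, or similar); all names here are placeholders:

```python
import math
from collections import Counter

def embed(text):
    # Toy bag-of-words "embedding"; a real local setup would call a
    # sentence-embedding model here instead.
    return Counter(text.lower().split())

def cosine(a, b):
    # Cosine similarity between two word-count vectors.
    dot = sum(a[w] * b[w] for w in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

def search(docs, query):
    # Rank every local document against the query sentence.
    q = embed(query)
    return sorted(docs, key=lambda d: cosine(embed(d), q), reverse=True)

docs = [
    "quarterly tax summary and invoice totals",
    "vacation photos from the coast",
    "notes on home server cooling and power draw",
]
print(search(docs, "where are my tax documents")[0])
# → "quarterly tax summary and invoice totals"
```

Everything stays on your own box, which is the whole point — no closed-source AI ever sees the documents.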
to be able to search through all my documents with nothing but a sentence or idea of what I’m looking for
Like this?
Personally, I want to try to have a llama or something rewrite voice to text prompts into Home Assistant commands. Should be very cool.
But yeah, I won’t pay the current prices to run it either, nor use the cloud.
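The voice-to-Home-Assistant idea is mostly glue code. A rough sketch, assuming a local model behind `ask_llm()` (stubbed here with a canned reply for illustration — in a real version it would hit llama.cpp/Ollama, and the resulting dict would be POSTed to Home Assistant’s service-call API):

```python
import json

# Hypothetical glue: transcribed speech in, Home Assistant service call out.
PROMPT = (
    "Rewrite the user's request as JSON with keys "
    "'domain', 'service', and 'entity_id'. Request: {text}"
)

def ask_llm(prompt):
    # Stand-in for a local model call; canned reply for illustration only.
    return '{"domain": "light", "service": "turn_on", "entity_id": "light.kitchen"}'

def to_ha_command(transcribed_text):
    reply = ask_llm(PROMPT.format(text=transcribed_text))
    call = json.loads(reply)  # a real version should validate this output
    return call

print(to_ha_command("hey, turn on the kitchen lights"))
```

The validation step matters: LLM output isn’t guaranteed to be well-formed JSON, so you’d want to reject anything that doesn’t parse into a known domain/service pair before firing it at your house.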
When the price of those drops, the price of the ones that weren’t used for that purpose will also drop.
Why not, though? Does the silicon really age?
Most crypto mining outfits undervolt their cards for lower power usage. They aren’t cranking them as you say they are. A dead GPU doesn’t produce anything for you; cranking it ups the chance that it will fail. You’re better off running it an extra 4 years at a lower voltage than you are cranking it for 1.
I thought the efficiency curve for GPUs peaked before 100%. If electricity is your primary cost, driving the GPUs at lower loads saves money.
So you might end up with GPUs that spent their entire life at a steady 80% load or something.
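The arithmetic behind that is simple. With made-up numbers (performance typically scales sublinearly with power near the top of the curve — the exact figures vary per card), perf-per-watt peaks well below the full power limit:

```python
# Hypothetical power-limit fraction -> relative performance.
# Illustrative only; real curves depend on the specific GPU.
points = {
    0.6: 0.80,
    0.8: 0.93,
    1.0: 1.00,
}

# Efficiency = performance per unit of power drawn.
best = max(points, key=lambda p: points[p] / p)
for p, perf in sorted(points.items()):
    print(f"{p:.0%} power -> {perf / p:.2f} perf per unit power")
print("best efficiency at", f"{best:.0%}", "power limit")
# → best efficiency at 60% power limit
```

So if electricity dominates your costs, capping the card around the knee of the curve earns more per dollar than running it flat out, which is exactly why mining cards tend to live at a steady partial load.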
This was my understanding as well - that miners often underclock their GPUs rather than overclock them.
Yes.
They don’t exactly age, but top-of-the-line chips push very large currents through very small conductors. Do that with DC long enough and the conductors deform over time (electromigration), up to the point that they stop working correctly.
That said, you probably can get plenty of casual use out of them.
Supposedly they do but I’ve had surprisingly good luck with used GPUs from ebay. I’m good with warning others against buying used GPUs on ebay though because then costs will stay lower for me.
Sure ya do. Have you seen the price of new GPUs? Maybe they only last a few years. That’s alright.
Unfortunately, this time around the majority of the AI build-up is GPUs that are likely difficult to accommodate in a random build.
If you want a GPU for graphics, well, many of them don’t even have video ports.
If your use case doesn’t need those, well, you might not be able to reasonably power and cool the sorts of chips that are being bought up.
The latest wrinkle is that a lot of that overbuying is likely to go toward Grace Blackwell, which is a standalone unit. Ironically, despite being a product built around a GPU and needing a video port, its video port is driven by a non-NVIDIA chip.
My use case is for my own AI plans as well as some other stuff like mass transcoding 200TB+ of…“Linux ISOs” lol. I already have power/cooling taken care of for the other servers I’m running
I’ve already got my gaming needs satisfied for years to come (probably)
Gaming can work on one GPU and display on another; Thunderbolt eGPUs do it all the time.
Crypto miners watching the AI bubble.
Silverlight, what are you doing here? Go on, get outta here!
I just want the AI hate bubble to burst. The hype bubble probably needs to go as well, but honestly I care less about that.
I want the hype bubble to burst because I want them to go back to making AI for useful stuff like cancer screening, and to stop trying to cram it into my fridge or figure out how to avoid paying their workers. The hate bubble isn’t going to stop until that does.
“I made an AI powered banana that can experience fear of being eaten and cuss at you while you do.”
Nah the hype will just make the POP louder.