Alt Text: an image of Agent Smith from The Matrix with the following text superimposed, “1999 was described as being the peak of human civilization in ‘The Matrix’ and I laughed because that obviously wouldn’t age well and then the next 25 years happened and I realized that yeah maybe the machines had a point.”
I genuinely do not understand these very obviously biased comments. By the very definition of AI, we have had it for decades, and suddenly people say we don’t have it? I don’t get it. Do you hate LLMs so much that you want to change the entire definition of AI (and move it under AGI or something)? This feels unhinged and disconnected from reality, with biases so strong they look like delusions.
What is delusional is calling a token generator intelligent. These programs don’t know what the input is, nor do they understand what they put out. All they “know” is which token is likely to follow a given sequence of tokens, based on previously supplied data (see the toy sketch below).
They understand nothing. They generate nothing new. They don’t think. They are not intelligent.
They are very cool, very impressive and quite useful. But intelligent? Pffffffh
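To be concrete about what “token generator” means, here is a deliberately tiny sketch of the idea. It’s a toy bigram sampler over a made-up corpus, not how an actual LLM is built (those use neural networks trained on enormous datasets), but the input/output contract is the same: given the tokens so far, emit a statistically likely next token.

```python
import random
from collections import Counter, defaultdict

# Toy stand-in for "previously supplied data" (made-up corpus, purely illustrative).
corpus = "the cat sat on the mat and the cat slept on the mat".split()

# Count how often each token follows each other token (a bigram table).
following = defaultdict(Counter)
for current, nxt in zip(corpus, corpus[1:]):
    following[current][nxt] += 1

def next_token(token: str) -> str:
    """Pick a likely next token given the current one, weighted by observed frequency."""
    candidates = following[token]
    tokens, counts = zip(*candidates.items())
    return random.choices(tokens, weights=counts, k=1)[0]

# Generate a short continuation: no understanding involved, just weighted lookups.
out = ["the"]
for _ in range(8):
    out.append(next_token(out[-1]))
print(" ".join(out))
```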
Why is it so hard for you to understand the word “artificial”? It seems like you even avoid it. Just like everything else artificial, especially weed and flavours, it’s not the real thing and was never meant to be the real thing, and yet you’re essentially an old man yelling at a cloud because something artificial does not act like real human intelligence.
Artificial means man-made, not “literally not the thing at all.”
Like “artificial stone” means “a man-made stone-equivalent material”, not a pink fluffy unicorn.
I don’t understand what point you are trying to make. Yes, AI, like everything else artificial, is man-made; I never said it was not. Is it anywhere near as good as human intelligence? No, and I was clear about that too, so what are you arguing right now? The original argument was whether an LLM counts as AI (and whether AI exists at all), and by every definition, it does.
We should steal the term from Mass Effect: what we have is early VI, virtual intelligence, not AI.
Or we could call it what it is: a token generator. Or “input imitator” would fit just as well.
This argument predates the modern LLM by several decades. When the average person thinks of AI, they think of Star Wars or any of a myriad of other works of science fiction. Most people have never heard the term in any other context, so they are offended by the implied comparison (in their understanding of the word) of LLMs to something like Data from Star Trek.