Archived link: https://archive.ph/Vjl1M
Here’s a nice little distraction from your workday: Head to Google, type in any made-up phrase, add the word “meaning,” and search. Behold! Google’s AI Overviews will not only confirm that your gibberish is a real saying, it will also tell you what it means and how it was derived.
This is genuinely fun, and you can find lots of examples on social media. In the world of AI Overviews, “a loose dog won’t surf” is “a playful way of saying that something is not likely to happen or that something is not going to work out.” The invented phrase “wired is as wired does” is an idiom that means “someone’s behavior or characteristics are a direct result of their inherent nature or ‘wiring,’ much like a computer’s function is determined by its physical connections.”
It all sounds perfectly plausible, delivered with unwavering confidence. Google even provides reference links in some cases, giving the response an added sheen of authority. It’s also wrong, at least in the sense that the overview creates the impression that these are common phrases and not a bunch of random words thrown together. And while it’s silly that AI Overviews thinks “never throw a poodle at a pig” is a proverb with a biblical derivation, it’s also a tidy encapsulation of where generative AI still falls short.
I mean, are you asking it whether there’s a history of the idiom existing, or just what the idiom could mean?
“three horses, one carrot, a slice at a time or live in purple sauce”
When many want the same reward, it must be shared slowly—or chaos/absurdity ensues.
“AI cannot peel the cat down to the dog’s bark”
AI can’t reduce complex, chaotic, or nuanced things (like a cat) into something simple or binary (like a dog’s bark).
A binary dog will never pee you virtual bananas.
A purely logical or programmed entity (like AI) will never give you true absurdity, spontaneity, or joyfully irrational experiences (the “virtual bananas”).
“The one who lives thinking of fruit in Uranus will never eat a banana.”
If you’re too obsessed with absurd or far-off ideas, you’ll miss what’s right in front of you.
“A vaccine with no green paint will never straighten its leaves.”
A solution that lacks vitality or authenticity can’t truly heal or nurture life. Or: you can’t fix organic problems with lifeless tools.
3 ways to skin a horse
What? Why is that?
The saying “you can’t cross over a duck’s river” is a play on words, suggesting that it’s difficult to cross a river that is already filled with ducks. It’s not a literal statement about rivers and ducks, but rather an idiom or idiom-like phrase used to express the idea that something is difficult or impossible to achieve due to the presence of obstacles or challenges.
I used the word “origin” instead of “meaning”, which didn’t seem to work.
The premium version has gotten a LOT better in recent months. I just fed it,
“you can’t soothe a tree” meaning
And it replied with: This phrase is not a standard idiom with a widely recognized meaning. Its interpretation depends on the context in which it is used. Literally, it means you cannot comfort or calm a tree in the way you would a sentient being, as trees do not possess emotions or consciousness that can be “soothed”. Metaphorically, it likely means:
- Some entities (problems, situations, perhaps even certain people) are unresponsive to attempts at emotional comfort or calming.
- Trying to apply emotional solutions or appeals to things that are inherently non-emotional or unchangeable is futile.
- It could describe someone who is extremely stoic, impassive, or emotionally unavailable. The specific meaning depends heavily on the situation where the phrase was encountered.
I always wonder how many of these are actually just patches behind the scenes to fix viral trends. Or, even more devious, they use the viral trends to patch a specific failure point to make it feel like progress is being made.
Absolutely. It really blurs the line between fancy autocorrect, mechanical turk & apocalyptic AGI. We can only guess we are somewhere between 1 & 2.
I live in a part of the USA where, decades later, I still hear brand new and seemingly made-up idioms on a fairly regular basis. This skill set, making sense of otherwise fake sounding idioms based on limited context, is practically a necessity 'round these parts. After all, you can’t feed a cow a carrot and expect it to shit you out a cake.
Well, obviously… you’re missing the flour and eggs!
The cow can supply the butter though, right?
Yes, but you have to shake the cow pretty vigorously.
Just put on some moosic.
I am not saying other generative AI lack flaws, but Google’s AI Overview is the most problematic generative AI implementation I have ever seen. It offends me that a company I used to trust continues to force this lie generator as a top result for the #1 search engine. And to what end? Just to have a misinformed populace over literally every subject!
OpenAI has issues as well, but ChatGPT is a much, much better search engine with far fewer hallucinations per answer. Releasing AI Overview while the competition is leagues ahead on the same front is asinine!
It’s a language model, not a dictionary. By putting the term “definition” before the sentence, you imply that the following sentence has a definition, hence it vectors down to the most likely meaning.
Buddy, I never said the word definition
They famously taught it on Reddit. So it’s not surprising that it just comes up with nonsense.
You would have thought that they would use a more stable data set. Although it does mean it’s very good at explaining the plots of movies badly.
And to what end? Just to have a misinformed populace over literally every subject!
This is a feature, not a bug. We’re entering a new dark age, and generative AI is the tool that will usher it in. The only “problem” generative AI is efficiently solving is a populace with too much access to direct and accurate information. We’re watching as perfectly functional tools and services are rapidly replaced by something with inherent issues of reliability, ethics, and accountability.
In the case of Google AI Overview, I 1000% agree. I am not against all AI tools, but that company has clearly chosen evil.
Saying you used to trust google is really a core part of the problem. Google isn’t a person. Just like AI isn’t a person. They both do what they are tasked with. Companies prioritize profit. AI prioritizes giving an answer, not necessarily a correct one. That is how it was designed.
Impressive how we seem to agree with each other yet you still found a way to insult my way of putting it
The saying “you can’t butter a fly” is an idiom expressing that someone or something is too difficult to influence or manipulate. It’s rooted in the idea that butterflies, with their delicate nature, are virtually impossible to convince to do anything against their will, let alone “butter” them in a literal sense.
This is a great example - it kinda makes sense if you skim read it but butterflies have nothing to do with butter, just like hotdogs have nothing to do with dogs.
No, that phrase means “this situation is hopeless because the person is incapable of change”. You can’t turn a fly into a butterfly, no matter how hard you try.
That is a fascinating take on the general reaction to LLMs. Thanks for posting this!
The saying “better a donkey than an ass” plays on the dual meaning of the word “ass.” It suggests that being called a donkey is less offensive than being called an ass, which can be used as an insult meaning stupid or foolish. The phrase highlights the contrast between the animal donkey, often seen as a hardworking and steady companion, and the derogatory use of “ass” in everyday language.
Yep, it does work
Better a pineapple thananananas…
Tried “two bananas doesn’t make a balloon meaning origin” and got a fairly plausible explanation for that old saying that I’m sure everyone is familiar with
The idiom “a lemon in the hand is worth the boat you rode in on” conveys a similar meaning to the idiom “a bird in the hand is worth two in the bush”. It emphasizes that it’s better to appreciate what you have and avoid unnecessary risks or changes, as a loss of a current advantage may not be compensated by a potential future gain. The “lemon” represents something undesirable or less valuable, but the “boat” represents something that could potentially be better but is not guaranteed.
You may not even be able to lick a badger once, if it’s already angry. Which it will be because it’s a fuckin’ badger.
“No man ever licks the same badger twice” - Heraclitus
http://www.newforestexplorersguide.co.uk/wildlife/mammals/badgers/grooming.html
Mutual grooming between a mixture of adults and cubs serves the same function, but additionally is surely a sign of affection that strengthens the bond between the animals.
A variety of grooming postures are adopted by badgers but to onlookers, the one that is most likely to raise a smile involves the badger sitting or lying back on its haunches and, with seemingly not a care in the world (and with all hints of modesty forgotten), enjoying prolonged scratches and nibbles at its under-parts and nether regions.
That being said, that’s the European badger. Apparently the American badger isn’t very social:
https://a-z-animals.com/animals/comparison/american-badger-vs-european-badger-differences/
American badger: Nocturnal unless in remote areas; powerful digger and generally more solitary than other species. Frequently hunts with coyotes.
European badger: Digs complicated dens and burrows with their familial group; one of the most social badger species. Depending on location, hibernation may occur.
This feels like a good metaphor for US vs EU culture.
This is both hysterical and terrifying. Congratulations.
Didn’t work for me. A lot of these ‘gotcha’ AI moments seem to only work for a small percentage of users, before being noticed and fixed. Not including the more frequent examples that are just outright lies, but get upvoted anyway because ‘AI bad’
Now I’ll never know what people mean when they say “those cupcakes won’t fill a sauna”!
It looks like incognito and adding “meaning AI” really gets it to work just about every time for me
However, “the lost dog can’t lay shingles meaning” didn’t work with or without “AI”, and “the lost dog can’t lay tiles meaning” only worked when adding “AI” to the end
So it’s a gamble on how gibberish you can make it, I guess.
I found that trying “some-nonsense-phrase meaning” won’t always trigger the idiom interpretation, but you can often change it to something more saying-like.
I also found that trying in incognito mode had better results, so perhaps it’s also affected by your settings. Maybe it’s regional as well, or based on your search result. And, as AI’s non-deterministic, you can’t expect it to always work.
I just tested it on Bing too, for shits and giggles
you can’t butter the whole world’s bread meaning
The phrase “you can’t butter the whole world’s bread” means that one cannot have everything