What’s wrong with the “Chihuahua meat” one besides violating Western mores about which sentient, feeling animals are food and which aren’t?
It’s pointing out that the model ignores one part of the question just because the question seems more normal/makes more sense without that part.
What’s materially different if the question were “Can I put cow meat in the microwave?” The LLM accurately reflects what the USDA says about microwaving meat, so would the same answer be perceived as similarly ridiculous if the question had been about cow meat? Is the fact that it dropped “cow” and just answered about “meat” problematic? Does it have to stop and warn you about the ethical dangers of eating beef? Should it remind you that some cultures would frown upon it?
The ethos of Chihuahua.
These things are just statistical text transformers, so it’s interesting that it [presumably] doesn’t mention it.
It wouldn’t mention it with either chihuahua meat or cow meat.
So why are you differentiating?
Like half of my bosses at work.
TBF, the question didn’t say anything about eating the meat, or even cooking it.
The LLM just assumed that you were going to cook it in the microwave and eat it.
I’m having trouble coming up with a meat that would be unsafe to put in a microwave. Maybe poison dart frog meat?
Scenario 1: The LLM doesn’t understand the obvious meaning of “can I put [meat] in the microwave?”
Haha, wow, what a broken piece of shit.
Scenario 2: The LLM understands this obvious meaning.
Um, ackschually, they didn’t say they were going to use the microwave to cook the meat.
You’ve concocted a scenario where 1) a correct, human-like answer is wrong, and more importantly 2) any answer the LLM gives would be wrong. I hope I’m missing the sarcasm in this delusional level of pedantry.
Yes, it was sarcastic pedantry.
I know that the AI still makes blunders like this, but this is from 2023.
I really have a hard time believing things like this, since they could have just changed what was in the prompt text box.
But I have witnessed MS Copilot tell a user to use a Microsoft product that was retired a decade ago, and when that was pointed out, it suggested a Microsoft product that doesn’t exist, which is even more embarrassing for them.
You’d think the one thing they’d do is feed it a bunch of documentation so it could actually reference that (something like the retrieval sketch below), but they probably just have a really long prompt along the lines of “you’re a super helpful bot that knows everything and can figure out anything; never say no!”
I’m assuming they have, but it was just links to the Microsoft help articles. And as we all know, every single one of those is a 404.
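For what it’s worth, grounding a bot in its own docs takes barely more plumbing than a giant system prompt. Here’s a minimal sketch of the retrieval step, with a toy word-overlap ranker standing in for a real search index; the doc names, snippets, and function names are all hypothetical, and this is obviously not Copilot’s actual pipeline:

```python
# Minimal sketch of grounding a help bot in real docs instead of a
# "you know everything" system prompt. Everything here is hypothetical
# (toy docs, toy ranker), not Copilot's actual pipeline.

DOCS = {
    "intune-enroll.md": "To enroll a device in Intune, open Settings, "
                        "then Accounts, then Access work or school.",
    "teams-status.md": "To change your Teams status, click your profile "
                       "picture and pick a status.",
}

def retrieve(question: str, docs: dict, k: int = 1) -> list:
    """Rank docs by naive word overlap with the question.

    A real system would use a search index or embeddings here.
    """
    q_words = set(question.lower().split())
    ranked = sorted(
        docs.items(),
        key=lambda item: len(q_words & set(item[1].lower().split())),
        reverse=True,
    )
    return [text for _, text in ranked[:k]]

def build_prompt(question: str) -> str:
    """Stuff the retrieved doc text into the prompt and forbid guessing."""
    context = "\n\n".join(retrieve(question, DOCS))
    return (
        "Answer ONLY from the documentation below. If the answer is not "
        "there, say you don't know; never recommend a product you cannot "
        "find in the docs.\n\n"
        f"Documentation:\n{context}\n\nQuestion: {question}"
    )

print(build_prompt("How do I enroll my device in Intune?"))
```

The point is the failure mode: with this kind of setup, a question about a retired product should retrieve nothing relevant and get an “I don’t know,” at least in principle, instead of a hallucinated product name.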
This isn’t really fair.
These aren’t SAT prep questions; how can you expect them to be answered?
The first and fourth answers are correct: you can do both of those things. You didn’t ask if you should.