Different idea: how about a new set of knives?
I don’t think knives were a suggested option.
Then how about a new set of knives?
Knives are a good choice, just make sure it’s new. Maybe a whole set?
If you’re thinking of giving a set, I’d like to suggest one of knives.
How about a set of knives?
Would a set of machetes be considered a set of knives? I would totally give those as a gift.
Are they new?
This just proves that Google’s AI is a cut above the rest!
Cutting-edge technology, eh?
Saved that for later!
All 3 pixels?
Who wouldn’t love receiving 17 new sets of knives?!
I counted and it’s accurate
I counted it too and it’s accurate
(in case the reader needs more data points)
I didn’t count, but I believe these three
(in case the reader wanted some random guy’s opinion)
I counted it too ’cause I didn’t trust these guys, but they were right.
My wife is going to stab me
But with such nice, shiny knives. They look new. Where’d you get them?
My wife left me… and so did my knives, it turns out
I’m from Finland. We like knives over here.
That’s entirely too many knives.
I got an old Finnish fish fillet knife
Have you considered a new set of knives?
How can you have too many knives? There’s just so much variety… Butter knife, steak knife, fish knife, fruit carving knife, cheese knife… GOTTA CATCH THEM ALL!
The AI wants to be a chef
Or a serial killer.
It can be both!
W
TF2 Pyro starter pack
What’s the associated system instruction set to? If you’re using the API, it won’t give you the standard Google Gemini Assistant system instructions, and LLMs are prone to go off the rails very quickly if not given proper instructions up front, since at heart they’re essentially just “predict the next word” functions.
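For anyone curious, here’s roughly what setting one looks like, a minimal sketch assuming the google-generativeai Python SDK; the API key, model name, and instruction text are all placeholders:

```python
import google.generativeai as genai

genai.configure(api_key="YOUR_API_KEY")  # placeholder key

# Without system_instruction, the model gets no standing guidance
# and is much more likely to drift or loop.
model = genai.GenerativeModel(
    model_name="gemini-1.5-flash",  # placeholder model name
    system_instruction=(
        "You are a helpful assistant. Answer concisely "
        "and avoid repeating yourself."
    ),
)

response = model.generate_content("Suggest a birthday gift.")
print(response.text)
```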
Redacted
Interesting, I don’t see any huge red flags there.
I gather frequency penalties have fallen out of favour, since their harmful side effects are worse than the very occasional loop trap.
It can happen on most LLMs, and decoding is usually configured to heavily disincentivize repeating text.
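Rough illustration of how a frequency penalty does that. Just a sketch with made-up numbers, mirroring the OpenAI-style frequency_penalty knob rather than any particular model’s real decoder:

```python
import numpy as np

def penalized_logits(logits, generated_ids, frequency_penalty=0.5):
    # Subtract a penalty proportional to how often each token id has
    # already been generated: the more a token repeats, the less
    # likely it becomes on the next step.
    adjusted = logits.copy()
    ids, counts = np.unique(generated_ids, return_counts=True)
    adjusted[ids] -= frequency_penalty * counts
    return adjusted

# Toy demo: pretend token id 7 is "knives" and it has already been
# emitted five times, so its logit drops from 2.0 to -0.5.
vocab_logits = np.zeros(10)
vocab_logits[7] = 2.0
history = [7, 7, 7, 7, 7]
print(penalized_logits(vocab_logits, history))
```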
I believe what happens is that when the LLM is choosing which word to use, it looks back on the sentence, sees that it talked about knives, and so wants to keep talking about knives; then it gets itself into a loop.
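You can reproduce that failure mode with a toy “predict the next word” table. Everything below is invented for illustration: greedy decoding always picks the most likely successor, and when the most likely path cycles back on itself it never escapes:

```python
# Hand-made bigram table whose highest-probability path is a cycle.
# All probabilities are invented for illustration.
bigram = {
    "a":      {"new": 0.9, "knife": 0.1},
    "new":    {"set": 0.8, "knife": 0.2},
    "set":    {"of": 0.9, ".": 0.1},
    "of":     {"knives": 0.95, "forks": 0.05},
    "knives": {"?": 0.5, "a": 0.4, ".": 0.1},
    "?":      {"a": 0.9, ".": 0.1},
}

def greedy_decode(start, steps=12):
    out = [start]
    for _ in range(steps):
        # Always take the argmax successor -- no sampling, no penalty.
        nxt = max(bigram[out[-1]], key=bigram[out[-1]].get)
        if nxt == ".":
            break
        out.append(nxt)
    return " ".join(out)

print(greedy_decode("a"))
# -> a new set of knives ? a new set of knives ? a
```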
Autocomplete with delusions of grandeur
Schizophren-AI
Joke’s on you, I married a Tonberry.
You should take the hint.
Average AI behavior
You surely will not regret a new set of knives
Google’s new partnership with a knife manufacturer