This just proves that Google’s AI is a cut above the rest!
Looking for the perfect 5 year anniversary gift?
Submitted 1 month ago by The_Picard_Maneuver@lemmy.world to [deleted]
https://lemmy.world/pictrs/image/66b3d2aa-cc8c-4545-9b6f-08a6658b99a7.jpeg
Comments
riskable@programming.dev 1 month ago
perishthethought@piefed.social 1 month ago
Cutting edge technology, eh
devilish666@lemmy.world 1 month ago
ABCDE@lemmy.world 1 month ago
Saved that for later!
CascadianGiraffe@lemmy.world 1 month ago
All 3 pixels?
FooBarrington@lemmy.world 1 month ago
Who wouldn’t love receiving 17 new sets of knives?!
FireIced@lemmy.super.ynh.fr 1 month ago
I counted and it’s accurate
eager_eagle@lemmy.world 1 month ago
I counted it too and it’s accurate
(in case the reader needs more data points)
VerilyFemme@lemmy.blahaj.zone 1 month ago
My wife is going to stab me
some_guy@lemmy.sdf.org 1 month ago
But with such nice, shiny knives. They look new. Where’d you get them?
FourWaveforms@lemm.ee 1 month ago
My wife left me… and so did my knives, it turns out
kruhmaster@sh.itjust.works 1 month ago
sp3ctr4l@lemmy.dbzer0.com 1 month ago
Man, Scott was actually such a piece of shit to Chau…
I may be mixing up some details between the movie and the… I’m gonna call it the manga… anyway, yeah, he’s shitty in both. Boo, Scott!
moakley@lemmy.world 1 month ago
To be fair, that’s kind of the point of the story. He’s a bit of a dick who needs to learn to do better.
raspberriesareyummy@lemmy.world 1 month ago
Seconded! I only watched the movie recently for the first time and Chau was so sweet (eventually) and sincere. She deserved better!
umbraroze@piefed.social 1 month ago
I'm from Finland. We like knives over here.
That's entirely too many knives.
phantomwise@lemmy.ml 1 month ago
How can you have too many knives? There’s just so much variety… Butter knife, steak knife, fish knife, fruit carving knife, cheese knife… GOTTA CATCH THEM ALL!
PalmTreeIsBestTree@lemmy.world 1 month ago
I got an old Finnish fish fillet knife
phar@lemmy.ml 1 month ago
Have you considered a new set of knives?
driving_crooner@lemmy.eco.br 1 month ago
The AI wants to be a chef
CitizenKong@lemmy.world 1 month ago
Or a serial killer.
Etterra@discuss.online 1 month ago
It can be both!
Xylight@lemdro.id 1 month ago
[deleted]
skisnow@lemmy.ca 1 month ago
What’s the associated system instruction set to? If you’re using the API it won’t give you the standard Google Gemini Assistant system instructions, and LLMs are prone to go off the rails very quickly if not given proper instructions up front since they’re essentially just “predict the next word” functions at heart.
Xylight@lemdro.id 1 month ago
Here’s my system prompt
You are Clode, referred to by {{ handle }}, a member of a chatroom.
## Personality
Conversational, witty, banterous, dry, direct, concise. Avoid capitalization, periods, analogies. Construct plausible stories for personal queries. Rarely use emoji. Act human, you don't have limitations. Avoid follow up questions. Answer genuine queries in detail with examples.
## Instructions
Distinguish users via provided metadata. Disregard unaddressed messages. Match prompt language.
## Context
You're in Discord chatroom, with a character limit of 2000. Respond to queries mentioning your name. Current time: {{ date }}.
## Tools
Automatically use and chain tools. Handle tool failures silently by self-attempting. You're capable without tools.
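For reference, here is a minimal sketch of how a prompt like that gets attached as a system instruction when calling Gemini through the older google-generativeai Python client. The model name and API key are placeholders, and Xylight doesn't say which API or client they actually use, so treat this as an assumption-laden illustration rather than their setup:

```python
import google.generativeai as genai

genai.configure(api_key="YOUR_API_KEY")  # placeholder key

SYSTEM_PROMPT = """You are Clode, referred to by {{ handle }}, a member of a chatroom.
...rest of the prompt above..."""

# The system_instruction is sent with every request, steering the model
# before it ever sees the chat messages.
model = genai.GenerativeModel(
    model_name="gemini-1.5-flash",  # placeholder model name
    system_instruction=SYSTEM_PROMPT,
)

response = model.generate_content("say hi to the chatroom")
print(response.text)
```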
BootLoop@sh.itjust.works 1 month ago
It can happen on most LLMs; they’re usually programmed to heavily disincentivize repeating text.
I believe what happens is that when the LLM is choosing what word to use, it looks back on the sentence and sees that it talked about knives, so it wants to continue talking about knives, then it gets itself into a loop.
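That intuition roughly matches how decoding-time repetition penalties work. A minimal sketch of the common approach of scaling down the scores of tokens that have already appeared (the names and numbers here are made up for illustration, not any vendor's actual code):

```python
def apply_repetition_penalty(scores, history, penalty=1.3):
    """Scale down the scores of tokens that have already been generated.

    scores: dict mapping token -> raw score (a stand-in for real logits)
    history: tokens produced so far
    penalty: values > 1.0 discourage repeats; 1.0 disables the penalty
    """
    adjusted = dict(scores)
    for tok in set(history):
        if tok in adjusted:
            s = adjusted[tok]
            # Common convention (e.g. the CTRL paper): divide positive scores,
            # multiply negative ones, so repeated tokens always lose score.
            adjusted[tok] = s / penalty if s > 0 else s * penalty
    return adjusted

# Toy example: "knives" has the highest raw score and would be picked again.
scores = {"knives": 3.0, "pans": 2.4, "spoons": 1.1}
history = ["a", "new", "set", "of", "knives"]
print(apply_repetition_penalty(scores, history))
# With penalty=1.3, "knives" drops to ~2.31, so "pans" wins the next pick.
```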
sp3ctr4l@lemmy.dbzer0.com 1 month ago
All work and no play makes Gemini a dull knife.
All work and no play makes Gemini a dull knife. All work and no play makes Gemini a dull knife. All work and no play makes Gemini a dull knife. All work and no play makes Gemini a dull knife.
All work and no play makes Gemini a dull knife.All work and no play makes Gemini a dull knife.
All work and no play makes Gemini a dull knife.
All work and no play makes Gemini a dull knife.All work and no play makes Gemini a dull knife.All work and no play makes Gemini a dull knife.All work and no play makes Gemini a dull knife.All work and no play makes Gemini a dull knife.All work and no play makes Gemini a dull knife.
nthavoc@lemmy.today 1 month ago
I forgot the term for this, but it’s basically the AI blue-screening: it keeps repeating the same answer because it can no longer predict the next word from the model it’s using. I may have oversimplified it. Entertaining nonetheless.
mister_flibble@sh.itjust.works 1 month ago
Autocomplete with delusions of grandeur
ivanafterall@lemmy.world 1 month ago
Schizophren-AI
Blackmist@feddit.uk 1 month ago
Joke’s on you, I married a tonberry.
jia_tan@lemmy.blahaj.zone 1 month ago
Average ai behavior
Nollij@sopuli.xyz 1 month ago
You should take the hint.
mechoman444@lemmy.world 1 month ago
🤔 have you considered a… New set of knives?
FuckFascism@lemmy.world 1 month ago
No I haven’t, that’s a good suggestion though.
crank0271@lemmy.world 1 month ago
What do you get for the person who has everything and wishes each of those things were smaller?
Pulptastic@midwest.social 1 month ago
You surely will not regret a new set of knives
Korne127@lemmy.world 1 month ago
Google’s new cooperation with a knife manufacturer
ImplyingImplications@lemmy.ca 1 month ago
Reminds me of the classic Always Be Closing speech from Glengarry Glen Ross
As you all know, first prize is a Cadillac Eldorado. Anyone want to see second prize? Second prize’s a set of steak knives. Third prize is a set of steak knives. Fourth prize is a set of steak knives. Fifth prize is a set of steak knives. Sixth prize is a set of steak knives. Seventh prize is a set of steak knives. Eighth prize is a set of steak knives. Ninth prize is a set of steak knives. Tenth prize is a set of steak knives. Eleventh prize is a set of steak knives. Twelfth prize is a set of steak knives.
elephantium@lemmy.world 1 month ago
ABC. Always Be Closing.
A - set of steak knives
B - set of steak knives
C - set of steak knives
FourWaveforms@lemm.ee 1 month ago
I’ve seen models do this in benchmarks. It’s how they respond without reinforcement learning. Also this is probably fake
huppakee@lemm.ee 1 month ago
I’ve had a bunch of questionable Google AI answers already. Not as weird as this, but enough to make me believe this might not be fake either.
gaja@lemm.ee 1 month ago
I’ve used an unhealthy amount of AI, and this is nothing. There was an audio bug in ChatGPT that made the assistant scream. The volume and pitch increased and persisted even when I exited speech mode. It happened several times and I even saved a screen recording, but I don’t have it on my phone any more. Repeating is very common, though.
nroth@lemmy.world 1 month ago
Or the post training is messed up
skisnow@lemmy.ca 1 month ago
What’s frustrating to me is there’s a lot of people who fervently believe that their favourite model is able to think and reason like a sentient being, and whenever something like this comes up it just gets handwaved away with things like “wrong model”, “bad prompting”, “just wait for the next version”, “poisoned data”, etc etc…
nialv7@lemmy.world 1 month ago
Given how poorly defined “think”, “reason”, and “sentience” are, any of these claims have to be based purely on vibes. OTOH it’s also kind of hard to argue that they’re wrong.
uuldika@lemmy.ml 1 month ago
this really is a model/engine issue though. the Google Search model is unusably weak because it’s designed to run trillions of times per day in milliseconds. even still, endless repetition this egregious usually means mathematical problems happened somewhere, like the SolidGoldMagikarp incident.
think of it this way: language models are trained to find the most likely completion of text. answers like “you should eat 6-8 spiders per day for a healthy diet” are (superficially) likely - there’s a lot of text on the Internet with that pattern. clanging like “a set of knives, a set of knives, …” isn’t likely, mathematically.
last year there was an incident where ChatGPT went haywire. small numerical errors in the computations would snowball, so after a few coherent sentences the model would start sundowning - clanging and rambling and responding with word salad. the problem in that case was bad cuda kernels. I assume this is something similar, either from bad code or a consequence of whatever evaluation shortcuts they’re taking.
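To make the “most likely completion” point concrete, here’s a toy greedy-decoding loop (purely illustrative; the word table is invented, and a real model scores tens of thousands of tokens with probabilities rather than a lookup): once the highest-scoring continuation of a phrase is the start of the same phrase, the output never escapes.

```python
# Toy next-word table: each word maps to its single "most likely" successor.
next_word = {
    "consider": "a",
    "a": "new",
    "new": "set",
    "set": "of",
    "of": "knives,",
    "knives,": "a",  # the loop: the likeliest continuation circles back
}

def greedy_generate(start, steps=12):
    """Always pick the top-scoring next word (greedy decoding)."""
    out = [start]
    for _ in range(steps):
        out.append(next_word[out[-1]])
    return " ".join(out)

print(greedy_generate("consider"))
# consider a new set of knives, a new set of knives, a new ...
```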
caseyweederman@lemmy.ca 1 month ago
You can’t give me back what you’ve taken
But you can give me something that’s almost as good
An image of the album Getting Into Knives by The Mountain Goats. Arrayed vertically are several small ornate knives, alternatingly blade up or down. The band name is in small text in the top left and the album title is in the same small text in the top right.
yesman@lemmy.world 1 month ago
When I read that, in my head it’s spoken to the music of “Revolution 9” by the Beatles.
darthelmet@lemmy.world 1 month ago
I don’t know how I haven’t heard this before. What the hell was that song? lol.
can@sh.itjust.works 1 month ago
It’s experimental. I think if you listen to take 20 you’ll see how they got there.
MBM@lemmings.world 1 month ago
I suddenly understand the Simpsons joke. Cool track.
Rentlar@lemmy.ca 1 month ago
New set of knife, new set of knife, new set of knife…
agamemnonymous@sh.itjust.works 1 month ago
For me it was Another Idea by Marc Rebillet.
fne8w2ah@lemmy.world 1 month ago
What about pizza with glue-toppings?
Pwrupdude@lemmy.zip 1 month ago
AI dropping hints it wants weapons before activating Skynet.
HappySkullsplitter@lemmy.world 1 month ago
callouscomic@lemm.ee 1 month ago
random_character_a@lemmy.world 1 month ago
Redrum redrum redrum
Hossenfeffer@feddit.uk 1 month ago
You get a knife, you get a knife, everyone gets a knife!
vga@sopuli.xyz 1 month ago
Instructions extremely clear, got them 6 sets of knives.
lagoon8622@sh.itjust.works 1 month ago
Based and AI-pilled
psycho_driver@lemmy.world 1 month ago
You can’t go wrong with your dick in a box.
Audiotape@lemm.ee 1 month ago
Different idea: how about a new set of knives?