You can trick Google's AI Overviews into explaining made-up idioms

As Big Tech pours countless dollars and resources into AI, preaching the gospel of its utopia-creating brilliance, here’s a reminder that algorithms can screw up. Big time. The latest evidence: You can trick Google’s AI Overview (the automated answers at the top of your search results) into explaining fictional, nonsensical idioms as if they were real.

According to Google’s AI Overview (via @gregjenner on Bluesky), “You can’t lick a badger twice” means you can’t trick or deceive someone a second time after they’ve been tricked once.

That sounds like a logical attempt to explain the idiom — if only it weren’t poppycock. Google’s Gemini-powered failure came in assuming the question referred to an established phrase rather than absurd mumbo jumbo designed to trick it. In other words, AI hallucinations are still alive and well.


We plugged some silliness into it ourselves and found similar results.

Google’s answer claimed that “You can’t golf without a fish” is a riddle or play on words, suggesting that you can’t play golf without the necessary equipment, specifically a golf ball. Amusingly, the AI Overview added that the golf ball “might be seen as a ‘fish’ due to its shape.” Hmm.

Then there’s the age-old saying, “You can’t open a peanut butter jar with two left feet.” According to the AI Overview, this means you can’t do something requiring skill or dexterity. Again, a noble stab at the assigned task, made without stepping back to check whether the phrase actually exists.

There’s more. “You can’t marry pizza” is a playful way of expressing the concept of marriage as a commitment between two people, not a food item. (Naturally.) “Rope won’t pull a dead fish” means that something can’t be achieved through force or effort alone; it requires a willingness to cooperate or a natural progression. (Of course!) “Eat the biggest chalupa first” is a playful way of suggesting that when facing a large challenge or a plentiful meal, you should first start with the most substantial part or item. (Sage advice.)

Screenshot of a Google AI Overview explaining a nonexistent idiom (Google / Tech Reader)

This is hardly the first example of AI hallucinations that, if not fact-checked by the user, could lead to misinformation or real-life consequences. Just ask the ChatGPT lawyers, Steven Schwartz and Peter LoDuca, who were fined $5,000 in 2023 after using ChatGPT to research a brief in a client’s litigation. The chatbot generated nonexistent cases, which the pair cited and which the opposing attorneys (quite understandably) couldn’t locate.

The pair’s response to the judge’s discipline? “We made a good faith mistake in failing to believe that a piece of technology could be making up cases out of whole cloth.”

This article originally appeared on Tech Reader at https://www.engadget.com/ai/you-can-trick-googles-ai-overviews-into-explaining-made-up-idioms-162816472.html?src=rss


