Google's AI Overviews suggest glue for a pizza recipe

Google spokesperson Meghann Farnsworth says errors came from "generally very uncommon queries"
An undated image of Google Chrome. — Unsplash

Picture yourself longing for a homemade pizza. You put together your pie, pop it in the oven, and eagerly await the feast. But when you finally go to take a bite of your greasy masterpiece, there's a hitch — the cheese slips right off. Annoyed, you turn to Google for help.

"Add some glue," Google suggests. "Mix about 1/8 cup of Elmer’s glue with the sauce. Non-toxic glue will do."

Well, that's a no-go. But as of now, that's the advice Google's new AI Overviews feature might offer. This feature, though not triggered for every search, scans the web and conjures up an AI-generated response. 

The suggestion for pizza glue seems to be based on a comment from a user named "fucksmith" in a more than decade-old Reddit thread, and it's clearly a joke.

This is just one of many blunders popping up in the new feature that Google rolled out widely this month. It also claims that former US President James Madison graduated from the University of Wisconsin not once but 21 times, that a dog has played in the NBA, NFL, and NHL, and that Batman is a cop.

Google spokesperson Meghann Farnsworth said the errors came from "generally very uncommon queries, and aren’t representative of most people’s experiences." The company has taken action against violations of its policies, she said, and is using these "isolated examples" to continue refining the product.

It's not just Google; OpenAI, Meta, and Perplexity have all grappled with AI hallucinations and errors. But Google is the first to deploy this technology at such a scale, and the examples of blunders just keep coming.

Companies developing artificial intelligence often dodge accountability for their systems, acting like the parent of an unruly child: boys will be boys! They claim they can't predict what the AI will spit out, so, the argument goes, it's out of their control.