
‘Banana slurp’ and ‘Cyclops vibing’: We fed Google Search’s AI nonsense phrases and it pretended to know them

Get on this cyclops vibing.


Lindsey Weedston


Google Search’s generative AI Overview appears to be creating definitions for phrases that don’t exist, revealing a potential flaw in how the system handles nonsensical queries. Rather than admitting ignorance, the AI confidently provides explanations for imaginary terms like “banana slurp” and “cyclops vibing,” treating them like legitimate slang while using qualifiers to hedge its answers. This behavior is also likely driving up Google’s operational costs: OpenAI CEO Sam Altman recently acknowledged that users telling ChatGPT “please” and “thank you” adds up to “tens of millions of dollars” in computing expenses.


Much like when Google’s AI Overview earnestly answered “how many rocks shall I eat?,” this may be another example of people-pleasing behavior from the large language model (LLM). AI still hasn’t grown up enough to understand when people are messing with it.

Are you cyclops vibing? Have you got your banana slurp on?

After noticing that Google’s AI Overview would generate a summary for most searches, we started experimenting by prompting it with nonsense phrases followed by the word “meaning.” Though it sometimes refused to answer, it often attempted to interpret what we were trying to say, sometimes with quite a bit of confidence.

Screenshot: Google’s AI Overview answer for “banana slurp meaning.” (Google)

“‘Banana slurp’ is a slang term that is not a common saying,” the AI claimed. “It could potentially be a misinterpretation of ‘that’s bananas’ or ‘she/he went bananas,’ which both mean something is crazy, wild, or extremely agitated. It’s also possible the term is used in a specific context or subculture and isn’t widely understood.”

As far as we know, banana slurp is not a slang term in any context or subculture. While the AI hedges its bets with language like “it could potentially be” and “it’s possible,” the fact that it asserted the phrase is actual slang at all is notable and potentially (definitely) kind of funny.

Can we make “cyclops vibing” a real thing?

Screenshot: Google’s AI Overview answer for “cyclops vibing meaning.” (Google)

“‘Cyclops vibing’ is a humorous internet slang phrase that combines the image of a mythical cyclops, a giant with one eye, with the meaning of ‘vibing,’ which means to be in a positive, relaxed state, or to be in a great mood,” says Google. “It essentially suggests that a person is enjoying themselves and in a good place, even if they are depicted with a somewhat intimidating or unusual image like a cyclops.”

Meanwhile, it defined “metal soup” as “soup prepared in a metal pot, or the taste of food that has acquired a metallic flavor due to the use of certain metals or the absence of salt. In some cases, it can also refer to a specific type of soup served in a metal can.”

Screenshot: Google’s AI Overview answer for “metal soup meaning.” (Google)

The art of AI hallucinations

The more nonsensical we got, the more unsure and careful Google got. The term “fishy flapjacking” had it reaching for context clues to the point where it was essentially guessing at answers.

Screenshot: Google’s AI Overview answer for “fishy flapjacking meaning.” (Google)

“‘Fishy flapjacking’ is not a standard phrase,” it said, accurately. “‘Fishy’ usually implies something is suspicious or dishonest, while ‘flapjack’ is a type of pancake, or it could refer to the character Flapjack from the TV series The Marvelous Misadventures of Flapjack. Therefore, ‘fishy flapjacking’ could be a nonsensical phrase, a misuse of words, or a playful or satirical combination of two unrelated terms.”


It even tried when we made the words up altogether, as in our query for “flurp dorping.”

Screenshot: Google’s AI Overview answer for “flurp dorping meaning.” (Google)

“‘Flurp dorping’ doesn’t appear to be a recognized term or phrase,” it said. “It’s possible it’s a slang term or a typo, or it might be made up. Without more context, it’s difficult to determine its meaning.”

At the end, it encouraged us to share more information so that it could do a better job.


‘Trash booting’ AI Overview answer disappears an hour later

Screenshot: Google’s AI Overview answer for “trash booting meaning.” (Google)

A little over an hour after we asked Google the meaning of “trash booting,” which its AI Overview defined as “a scenario where a system or application is rebooted due to a problem, often related to a garbage collector (GC) struggling to keep up with memory management, potentially leading to issues or crashes,” the same query turned up nothing. “An AI Overview is not available for this search,” read the result.

Screenshot: Google’s result for “trash booting meaning,” with the message “An AI Overview is not available for this search.” (Google)

It is unclear why the result disappeared. Is the LLM figuring us out? Are too many other people using it? Or does it just not work very well in general?

Why is this happening?

Though they get better every day, LLMs are essentially very advanced word predictors trained on the entirety of the web. More accurately, they are “token” predictors: they predict not just whole words but pieces of words, and they will tell you the most likely definition of something (even if it isn’t real) because they’re mimicking the patterns of existing slang definitions.
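That pattern-mimicry can be sketched with a toy bigram model, a deliberately crude stand-in for a real LLM (which predicts over a huge token vocabulary with a neural network, not word-pair counts). The corpus and phrases below are invented for illustration. Even this tiny predictor, trained only on the shape of slang definitions, will confidently “define” a phrase it has never seen:

```python
from collections import Counter, defaultdict

# Toy corpus written in the style of web slang definitions. This is
# illustrative only -- not real training data, just the *pattern* an
# LLM absorbs at web scale.
corpus = [
    "yeet is a slang term for throwing",
    "vibing is a slang term for relaxing",
    "salty is a slang term for sulking",
]

# Words seen only once are mapped to "<unk>", the stand-in for
# anything the model doesn't really know.
word_counts = Counter(w for line in corpus for w in line.split())

def norm(word):
    return word if word_counts[word] > 1 else "<unk>"

# Count bigram frequencies: which token tends to follow which.
bigrams = defaultdict(Counter)
for line in corpus:
    tokens = [norm(w) for w in line.split()]
    for prev, nxt in zip(tokens, tokens[1:]):
        bigrams[prev][nxt] += 1

def define(phrase, max_words=8):
    """'Define' a phrase by greedily appending the most likely next token."""
    out = phrase.split()
    cur = norm(out[-1])  # an unseen word like "slurp" becomes "<unk>"
    while len(out) < max_words and bigrams[cur]:
        cur = bigrams[cur].most_common(1)[0][0]
        out.append(cur)
    return " ".join(out)

print(define("banana slurp"))
# -> banana slurp is a slang term for <unk>
```

The model has never encountered “banana slurp,” but because every definition it was trained on follows the same groove, the statistically likeliest continuation is the confident definitional frame, with a hallucinated filler where knowledge should be. Real LLMs fail in a far more fluent version of the same way.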

According to a statement emailed to us by a Google spokesperson, the AI is trying its best, but the vast catacombs of information can’t tell it everything.


“When people do nonsensical or ‘false premise’ searches, our systems will try to find the most relevant results based on the limited web content available,” the statement reads. “This is true of Search overall, and in some cases, AI Overviews will also trigger in an effort to provide helpful context. AI Overviews are designed to show information backed up by top web results, and their high accuracy rate is on par with other Search features like Featured Snippets.”

The cost of AI pretending to know everything

AI Overview’s attempts to define nonsense are costing Google more than its reputation. In a recent exchange on X, OpenAI CEO Sam Altman implied that users including polite phrases in their prompts is wasting millions of dollars in computational expenses.

“I wonder how much money OpenAI has lost in electricity costs from people saying ‘please’ and ‘thank you’ to their models,” mused @tomieinlove.

Screenshot: the exchange between @tomieinlove and Sam Altman. (@sama/X)

Altman answered that it could be “tens of millions of dollars well spent—you never know.”

Altman has offered no further comment on this issue, nor provided the math behind the claim. Regardless, AI critics found the idea of its users costing OpenAI that kind of money just by being nice to be both amusing and a great idea for praxis.

“Instead of poisoning AI models, let’s just say please and thank you,” wrote @thetymonbay.

Screenshot: @thetymonbay’s post. (@thetymonbay/X)

According to Quartz, citing Goldman Sachs, a ChatGPT-4 query uses ten times as much electrical power as a normal Google search. Meanwhile, Scientific American reports that an AI generating a text answer to a prompt uses 30 times more energy than a person looking the answer up on their own would.

All of this natural resource use fuels climate change, according to experts in the field.

The Daily Dot has reached out to OpenAI for comment via email.

