In mid-May 2024, Google rolled out its AI Overview feature in its search engine in the US. The launch was met with ridicule and criticism online, as the AI-generated search results contained questionable and even nonsensical answers. Even weeks later, Google does not seem to have the problem under control and, for example, still recommends glue as a pizza ingredient.

Google is relying more and more on artificial intelligence. In the US, for example, the company integrated the AI Overview feature into its search in mid-May 2024: an AI summarizes answers to search queries and displays them at the top of the results page. AI Overview is set to roll out in other countries by the end of the year, even though it has not exactly been convincing so far.

Google recommends: Glue as an ingredient for pizza

Shortly after the release of AI Overview, numerous screenshots of bizarre and downright nonsensical answers circulated on social media. The reason: in many cases, the AI behind the feature apparently cannot distinguish serious information from jokes or satire.

Google, meanwhile, told CNBC that many of the examples were unusual queries. The company added: “The vast majority of AI overviews provide high-quality information with links to further information online.”

But reports of nonsensical answers in Google's AI search have been piling up. The most bizarre: non-toxic glue as a pizza topping to prevent the cheese from slipping.

Google can't get its AI search under control

Even weeks later, Google does not seem to have gotten AI Overview's problems under control. As computer scientist Colin McMillen reports on Bluesky, when asked how much glue you should put on a pizza, the answer is not “none” but “an eighth of a cup”.

Admittedly, asking how much glue to put on a pizza is a somewhat unusual question. But it is not so unusual that Google's AI search should still be giving a wrong answer after all the previous uproar. The Verge, in turn, verified the authenticity of the screenshot from Bluesky user McMillen by reproducing the response.

But it gets even more bizarre: after AI Overview recommended glue as a pizza ingredient weeks ago, journalist and blogger Katie Notopoulos dared to try the experiment and baked and ate a pizza made with glue following Google's recipe suggestion, though, out of caution, she only took a few bites.

Google's AI search now bases its current answer on how much glue to add to a pizza on this very article by Katie Notopoulos. In other words: every time someone publicly reports that AI Overview has gotten something wrong, that coverage can feed back into the AI and make it even more wrong.

Source: https://www.basicthinking.de/blog/2024/06/12/google-empfiehlt-immer-noch-kleber-als-pizzazutat/
