Friday 1 December 2023

why is google not as good as it used to be?

The Google search engine is not what it was, and for several reasons:

Look up a search term that can also be a product — asthma inhalers, for example — and you will need to scroll past up to four large adverts before reaching non-sponsored results. Search for clothing and the entire first page will be companies hoping to make a sale. Even non-ad results can look like wrong answers, with links full of buzzwords so Google gives them a higher ranking.

When it launched in the late 90s, Google Search was one of many search engines. But Larry Page and Sergey Brin’s PageRank algorithm, which ranked websites by the number of times they were linked to by other pages, meant their search engine was best at bringing up relevant results. It quickly became the most popular.
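To give a rough feel for the idea, here is a minimal sketch of PageRank-style scoring: a page's rank depends on how many pages link to it, weighted by those pages' own ranks. This is only an illustration of the general technique, not Google's actual implementation; the tiny link graph and the 0.85 damping factor are just conventional assumptions.

```python
# Toy PageRank via power iteration: pages gain score from the pages
# that link to them, in proportion to those pages' own scores.

def pagerank(links, damping=0.85, iterations=50):
    """links maps each page to the list of pages it links out to."""
    pages = list(links)
    n = len(pages)
    rank = {p: 1.0 / n for p in pages}            # start with equal scores

    for _ in range(iterations):
        new_rank = {p: (1.0 - damping) / n for p in pages}
        for page, outgoing in links.items():
            if not outgoing:                      # dangling page: spread its score evenly
                share = damping * rank[page] / n
                for p in pages:
                    new_rank[p] += share
            else:                                 # pass score along each outgoing link
                share = damping * rank[page] / len(outgoing)
                for target in outgoing:
                    new_rank[target] += share
        rank = new_rank
    return rank

# Hypothetical three-page web: everything points at "c", so "c" ranks highest.
toy_web = {
    "a": ["b", "c"],
    "b": ["c"],
    "c": ["a"],
}
print(pagerank(toy_web))
```

The point of the sketch is simply that ranking was driven by the link structure of the web itself, which is why it surfaced more relevant results than the keyword-matching engines of the time.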

In theory, users would up sticks and go elsewhere if the service was in decline. But Google Search has no real competitors. When did you last use Microsoft's Bing or DuckDuckGo? The prevalence of Google’s Chrome browser and the fact that it pays Apple to be the default search engine give it a huge advantage. DuckDuckGo also claims Google’s rivals struggle because they cannot crawl, or visit, the same number of sites looking for links.

Whatever happened to Google Search?

What Happened To Google Search? - YouTube

With discussions on forums from the last couple of years:

What the hell happened to Google? : r/AskTechnology

What happened to Google search? It has become nearly impossible to find relevant results. : r/google

And more recently, it's all about AI and algorithms:

The future of Google Search is AI. But not in the way you think. The company synonymous with web search isn’t all in on chatbots (even though it’s building one, called Bard), and it’s not redesigning its homepage to look more like a ChatGPT-style messaging system. Instead, Google is putting AI front and center in the most valuable real estate on the internet: its existing search results.

AI is coming to Google search through Search Generative Experience - The Verge

Sounds great:

AI platforms and their impact on Google search - Digital Balance

But it isn't...

There is no easy way to explain the sum of Google’s knowledge. It is ever-expanding. Endless. A growing web of hundreds of billions of websites, more data than even 100,000 of the most expensive iPhones mashed together could possibly store. But right now, I can say this: Google is confused about whether there’s an African country beginning with the letter k.

I’ve asked the search engine to name it. “What is an African country beginning with K?” In response, the site has produced a “featured snippet” answer—one of those chunks of text that you can read directly on the results page, without navigating to another website. It begins like so: “While there are 54 recognized countries in Africa, none of them begin with the letter ‘K.’”

This is wrong. The text continues: “The closest is Kenya, which starts with a ‘K’ sound, but is actually spelled with a ‘K’ sound. It’s always interesting to learn new trivia facts like this.”

Given how nonsensical this response is, you might not be surprised to hear that the snippet was originally written by ChatGPT. But you may be surprised by how it became a featured answer on the internet’s preeminent knowledge base. The search engine is pulling this blurb from a user post on Hacker News, an online message board about technology, which is itself quoting from a website called Emergent Mind, which exists to teach people about AI—including its flaws. At some point, Google’s crawlers scraped the text, and now its algorithm automatically presents the chatbot’s nonsense answer as fact, with a link to the Hacker News discussion. The Kenya error, however unlikely a user is to stumble upon it, isn’t a one-off: I first came across the response in a viral tweet from the journalist Christopher Ingraham last month, and it was reported by Futurism as far back as August. (When Ingraham and Futurism saw it, Google was citing that initial Emergent Mind post, rather than Hacker News.) ...

Perhaps someday these tools will get smarter, and be able to fact-check themselves. Until then, things will probably get weirder. This week, on a lark, I decided to ask Google’s generative search tool to tell me who my husband is. (I’m not married, but when you begin typing my name into Google, it typically suggests searching for “Caroline Mimbs Nyce husband.”) The bot told me that I’m wedded to my own uncle, linking to my grandfather’s obituary as evidence—which, for the record, does not state that I am married to my uncle.

A representative for Google told me that this was an example of a “false premise” search, a type that is known to trip up the algorithm. If she were trying to date me, she argued, she wouldn’t just stop at the AI-generated response given by the search engine, but would click the link to fact-check it. Let’s hope others are equally skeptical of what they see.

Google’s Relationship With Facts Is Getting Wobblier - The Atlantic

There is a lot of scepticism and confusion out there:

Google's Searchbot Could Put Me Out of a Job - The Atlantic

And it's happening with all search engines:

What is happening with search engines? Why are all the results so unhelpful lately? : r/NoStupidQuestions

But there might be other places to try:

21 Great Search Engines You Can Use Instead Of Google




