(2023-09-15) Berjon Fixing Search

Robin Berjon: Fixing Search. We Don't Have To Put Up With Broken Search. This week and next, all of tech policy is dizzily watching Google be put on trial by the US government for monopolisation of search. (anti-trust)

the web would be a better place without Google Search. Google's quality has been steadily dropping over the years, to the point that no one is even surprised when it gets caught making up a completely fake Shrek 5 movie and ascribing it a 92% rating amongst Google users.

The web's implicit current architectural approach to search is broken. If we get rid of Google today but change nothing else, we'll just get another Google.

The historical relationship between publishers and search engines was a simple and rather fortuitous mutualistic affair.

Unfortunately, for this kind of mutualistic symbiosis to persist over time, there needs to be a quick feedback mechanism to punish a defecting party trying to get more out of the relationship than it puts in.

For search engines, the control mechanism is competition. So long as there's competition between search engines, if one of them starts being extractive with respect to publishers then publishers can simply exclude the engine using robots.txt.
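For illustration, a publisher that wanted to shut out one extractive crawler while staying open to everyone else could do so with a few lines of robots.txt; the crawler name below is a placeholder, not a real user agent:

```
# robots.txt at the site root
# Exclude one hypothetical extractive crawler entirely...
User-agent: ExtractiveBot
Disallow: /

# ...while leaving the site open to all other crawlers.
User-agent: *
Disallow:
```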

As search became increasingly monopolised, it started failing.

As Google came to dominate, it started simultaneously doing more with the content than just indexing it, demanding more of publishers, and sending less traffic.

At the very least we ought to restore competition, but we should also consider more intentional approaches to govern this cycle.

That's the second architectural problem: web search as supported today limits user agency and doesn't empower other parties to experiment with new interfaces, multi-sourcing, or new business models. We would be better off with standardised and reusable search infrastructure; it would liberate innovation. (A paying API?)
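As a rough sketch of what standardised, reusable search infrastructure could look like, here is a hypothetical response schema and client call in TypeScript; the field names and the /search endpoint are assumptions for illustration, not anything the article specifies.

```typescript
// Hypothetical shape of a standardised search response; nothing here is
// specified in the article, it only illustrates the idea that any engine
// or vertical could serve the same schema.
interface SearchResult {
  url: string;      // canonical URL of the result
  title: string;
  snippet: string;  // short, engine-generated summary
  source: string;   // which engine or vertical produced it
}

interface SearchResponse {
  query: string;
  results: SearchResult[];
}

// Any client (a browser, an aggregator, a vertical) could query any engine
// that implements the schema, e.g. GET https://engine.example/search?q=...
async function search(endpoint: string, query: string): Promise<SearchResponse> {
  const res = await fetch(`${endpoint}/search?q=${encodeURIComponent(query)}`);
  if (!res.ok) throw new Error(`search failed: ${res.status}`);
  return (await res.json()) as SearchResponse;
}
```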

And the third and final aspect is that web browsers are paid to select the default search engine for you.

Maintaining the current level of quality and pace of evolution in browsers requires about $2 billion per year.

However, browsers selling users to search engines to keep the web going is a bit like selling weapons in a war zone to bankroll cancer research.

Search defaults are the moat for search engines; they're basically all of the moat.

Worse, a search monoculture is much easier to optimise for than a competitive search market with different methodologies. This means that people will eventually become good at gaming it (because there is a high incentive to do so). In turn, 1) search result quality will invariably drop as it becomes dominated by content optimised for ranking rather than for people, and 2) the web itself will fill up with SEO-optimised garbage.

Having a single search default also means that there is only one search engine for all topics, when result quality and user experience would be better served by search verticals specialising in a given topic.

Effecting change at this scale isn't trivial, even if the technical solutions are relatively simple.

The first part of the fix is that search engines should not be exposed to people as web sites. Instead, they should be an API that the browser accesses and renders itself.
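Building on the hypothetical schema sketched above, a browser could then query several engines in parallel and render the merged results in its own UI, with no search-engine web page in the loop; the endpoints and the naive merge below are placeholders, not a proposal from the article.

```typescript
// Hypothetical browser-side use of the schema sketched above: query several
// engines in parallel, merge the results, and render them locally.
async function multiSourceSearch(query: string): Promise<SearchResult[]> {
  // Placeholder endpoints; in this model the user, not the browser vendor,
  // chooses which sources get queried.
  const endpoints = ["https://engine-a.example", "https://vertical-b.example"];

  const settled = await Promise.allSettled(endpoints.map((e) => search(e, query)));

  // Naive merge: concatenate whatever succeeded; a real browser could apply
  // its own ranking, deduplication, or user-chosen weighting here.
  return settled
    .filter((r): r is PromiseFulfilledResult<SearchResponse> => r.status === "fulfilled")
    .flatMap((r) => r.value.results);
}
```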

Of course, APIs are nice, but someone needs to pay the search engines, given that under this approach they can no longer show ads.

It needs to be coupled with a strict no-default-search policy: no browser is allowed to pick default search sources. (Like picking your Mastodon instance?)

