The Tyler Woodward Project

Why Search Feels Worse Now

Tyler Woodward Episode 4

Search shouldn’t feel like walking into a shopping mall when you asked for a library. We dig into why results seem to have slid downhill: crowded ad units, affiliate-heavy pages, and AI summaries that sound confident while averaging mediocre sources. Then we ask what it takes to find real answers again. From a broadcast engineer’s lens, noise rose across the web, and ranking complexity can’t magically create signal. The stakes are bigger than shopping; search is how we fix gear, choose tools, and check claims, so bad incentives become bad decisions.

We break down the mechanics in plain English: how monetization reshapes the first screen, how SEO evolved into an adversarial game, why click-based metrics misread satisfaction, and how AI made it cheap to scale polished but shallow content. We also unpack the zero-click trend and the erosion of source checking, where citations exist yet fail to back specific claims. The result is a feedback loop where high-effort content declines, walled gardens hoard practical knowledge, and users get served summaries of summaries.

Then we set a bar for what “good” should mean by 2026. A better search engine would optimize for task completion, long-term trust, transparent sourcing, spam resistance, and true diversity of sources and formats. Think receipts-first AI answers, penalties for content networks that scale junk, and a ranking objective that values whether you solved the problem, not whether you lingered on a page. To help right now, we share a practical toolkit: surgical search operators, bias toward vendor docs and standards, teardown-style reviews and long-term ownership notes, and a disciplined habit of verifying AI outputs with at least two strong sources. We finish with a simple habit that compounds: build your personal trust graph with bookmarks, RSS, and notes on who was right last time.

If this helped you cut through the noise, follow the show, share it with a friend who’s drowning in listicles, and leave a quick review so others can find smarter search tactics too.

Send me a text message with your thoughts, questions, or feedback

Support the show

If you enjoyed the show, be sure to follow The Tyler Woodward Project and leave a rating and review on Apple Podcasts or your favorite podcast app—it really helps more people discover the show.

Follow the show on Threads or Bluesky. Get in touch on the official Matrix Space for the podcast.

All views and opinions expressed in this show are solely those of the creator and do not represent or reflect the views, policies, or positions of any employer, organization, or professional affiliation.

Tyler:

You're not imagining it. Search results did get worse. Try this sometime. Search for a product you genuinely want, like "best router for apartment" or "how to fix laptop battery drain," and count how many results feel like they were written by someone who actually touched a damn thing. Now, here's the uncomfortable truth. The web didn't just get bigger; the incentives changed. And if we want good search again in 2026, we have to talk about what search engines are actually optimizing for, because it might not be what you think.

Welcome back to the Tyler Woodward Project. I'm Tyler, a broadcast engineer by trade, a Linux nerd by choice, and I really enjoy demystifying tech that's supposedly too complicated for people. Today, I'm answering two questions. First, why did Google search results feel like they slid downhill, especially over the last few years? Second, what would a good search engine optimize for in 2026, now that AI answers are sitting right at the top of the page? I'm gonna keep this practical. I'll explain the mechanics once in plain English, then we'll talk about incentives, and then I'll give you a set of tactics you can use immediately to get better results without needing a computer science degree or a ritual sacrifice to the algorithms.

One quick glossary up front. SEO means search engine optimization: basically, the art and science of getting a page to rank. SERP means search engine results page: the page you see after you search. And when I say AI answers, I mean the search engine's generated summary, often built with a technique called retrieval-augmented generation, or RAG. It retrieves some sources and then generates an answer from them. Or tries to.

All right. Let's talk about why search feels worse. At a high level, Google's job is to match your search to the most useful stuff on the web. That sounds simple enough, right? But it's a hostile environment.
Spammers want traffic, publishers want revenue, and Google wants to keep you searching while also making a little money. So the ranking system becomes an optimization problem: if you reward the wrong thing, you get the wrong behavior at internet scale.

The first big shift is monetization pressure. Search is a massive business, and ads are a massive part of it. Even when the ad labeling is technically present, the practical effect is that commercial searches, anything that smells like "I might buy something," get a SERP that's crowded with ad units, shopping modules, top picks, and other widgets before you ever reach what people traditionally think of as the results. And that changes what you experience as quality. The first screen becomes a product page for Google, not a directory of the web.

Second shift: SEO evolved from "help search engines understand my page" into an adversarial industry. If ranking is valuable, people will game it. If the game changes, they'll adapt. And now we have an ecosystem full of content designed to rank first and help later, if at all. A lot of this is affiliate arbitrage: pages optimized to catch a query like "best standing desk," funnel you through affiliate links, and collect a commission. Some of those pages are genuinely useful, but many are just rewritten summaries of other summaries, padded with expert phrasing and engineered to look authoritative.

Third shift: the measurement problem. Search engines can't directly measure "this helped the person." They approximate with signals like click-through rate, dwell time, and whether you come back and search again. Those signals are easy to misread. If you click a result, spend, let's say, three minutes on it because it's confusing as hell, and then come back and search again, is that a good outcome or a bad one? All the system sees is activity. What you experienced was frustration.

Fourth shift: content volume exploded. And not just from people.
AI tools made it cheap to produce plausible-looking text at a ridiculous scale. Even before modern generative AI, content farms could churn. Now it's worse, because the writing looks smoother while still being empty. Or worse, confidently wrong. This creates a pollution problem. The web is full of pages that look like answers. Search engines have to filter hard, and any filtering mistake is now amplified because there's so much junk competing for the same queries.

Fifth shift: the open web got a whole lot thinner. A lot of the best "how do I fix this weird thing?" knowledge moved into places that are hard to index well or hard to access at all: community platforms, walled gardens, private groups, chat servers, paywalled sites, and apps and tools that don't expose much to crawlers. Sometimes the knowledge exists, but it's not easily visible to classic search, so the algorithm overweights what it can see: SEO'd pages, aggregators, and big domains that publish a lot.

Sixth shift: Google started answering more queries directly on the SERP. That includes featured snippets, knowledge panels, People Also Ask, and now AI-generated summaries. If you get your answer without clicking anything, that's called a zero-click. From a user perspective, zero-click can be convenient. From the web ecosystem perspective, it reduces traffic to publishers. And when publishers lose traffic, they either monetize harder, go behind paywalls, or optimize more aggressively for whatever scraps of traffic remain, often by targeting queries that still trigger clicks. That feedback loop tends to make the remaining clickable web results more spammy, more affiliate-heavy, and more extreme in tone, because extremes unfortunately convert better.

This is the part that feels very broadcast-engineer-y to me. In broadcast, you can't cheat physics. When noise rises, you can crank up the gain, but you're just amplifying noise. Search is fighting the same battle.
When the web's signal-to-noise ratio drops, the ranking system can get more complex, but complexity doesn't magically create signal. You still have to find clean sources, or you're just making higher-resolution garbage. Now, to be fair, Google has fought spam for decades and keeps shipping updates that try to reward experience, credibility, and helpfulness, but spammers react fast. AI made that reaction time even faster. It's a cat-and-mouse game.

So why does it matter? Because search is infrastructure. It's how people troubleshoot their tech, choose products, learn skills, and verify claims. When search degrades, the cost isn't just annoyance. It's wasted time, worse decisions, and people getting nudged toward whatever's most profitable, not what's most accurate.

Next, I want to make this real with a concrete story: what the bad search experience looks like now, and how AI answers changed the failure modes. Let's run a typical modern search query. Say you search "best budget laser printer 2026" or "best wireless earbuds for calls." Here's what tends to happen. The first screen: shopping modules, sponsored placements, and a few big-name sites. Some are legitimate reviews, but many are listicles that all recommend the same handful of products, because affiliate programs and ad partnerships quietly shape what gets covered. Then you click one. You get a long page with a table of top picks, a bunch of filler paragraphs, and suspiciously similar phrasing across multiple sites. You scroll, you skim, and you realize you didn't learn anything you couldn't have guessed from the product page. So you refine the search: add "reddit," add "forum," add a model number, add a site: filter. Suddenly you find a thread where someone says, "I've owned this for 18 months, the rollers wear out, and the driver crashes on macOS after the latest update." That's what you wanted.
Not a top 10, but actual lived experience and specific failure modes. Now, layer AI answers on top. In the best case, AI gives you a quick summary that points you to solid sources, calls out trade-offs, and saves you time. In the worst case, it averages the internet. If the input sources are polluted, affiliate fluff, outdated pages, scraped content, then the AI can produce a confident, cleanly written summary of junk.

And AI adds a special new problem: provenance. With classic search, you might land on a page and at least know who said it, when, and in what context. With AI answers, that chain can get blurry. Even if citations are shown, they can be incomplete, misattributed, or not actually supporting the specific claim being made. That's not because AI is malicious. It's because language models are optimized to produce plausible text. Without careful grounding, the system can sound right while being absolutely wrong.

There's also a cultural angle here. We're training ourselves out of source checking. When the SERP becomes an answer box, people stop clicking through. When people stop clicking through, publishers get less incentive to maintain high-effort content. When high-effort content declines, the AI and the ranking systems have less good material to learn from. That's a recipe for a knowledge ecosystem that slowly replaces primary sources with summaries of summaries.

Why this matters is bigger than shopping. The same mechanics apply to health queries, finance questions, and technical troubleshooting. Bad search makes it harder to verify anything, and that pushes people toward either cynicism ("nothing's reliable") or blind trust ("the answer box said so, so it must be correct"). If we agree the problem is incentives plus pollution plus weird measurement, then the fix can't be "bring back the old Google," because, well, the web itself has changed.
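For the curious, the retrieval-augmented generation idea from the glossary can be sketched in a few lines. This is a toy under loud assumptions: every name here is made up for illustration, and real systems use embedding search plus a language model, not keyword overlap and string stitching.

```python
# Toy sketch of retrieval-augmented generation (RAG). Hypothetical names,
# not a real API: real systems use embedding search and a language model.
def retrieve(query, corpus, k=2):
    """Rank documents by how many words they share with the query."""
    words = set(query.lower().split())
    return sorted(
        corpus,
        key=lambda doc: len(words & set(doc["text"].split())),
        reverse=True,
    )[:k]

def answer(query, corpus):
    """'Generate' an answer from the top source and cite what was retrieved."""
    sources = retrieve(query, corpus)
    # The provenance risk: a real model paraphrases, and the paraphrase can
    # drift from what the cited sources actually say.
    cited = "; ".join(s["url"] for s in sources)
    return f"{sources[0]['text']} [sources: {cited}]"

corpus = [
    {"url": "vendor.example/docs", "text": "update the driver to fix battery drain"},
    {"url": "blog.example/listicle", "text": "top laptops you must buy right now"},
]
print(answer("how to fix battery drain", corpus))
# → update the driver to fix battery drain [sources: vendor.example/docs; blog.example/listicle]
```

Notice that the irrelevant listicle still shows up in the citation list, which is exactly the "citations exist yet fail to back the specific claim" failure mode: if the retrieved pool is polluted, the polished answer inherits the pollution.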
So what would a genuinely good search engine optimize for in 2026? Let's start with the big idea. A search engine is only as good as its objective function: the thing it tries to maximize. If you optimize for more queries, you can accidentally reward confusion. If you optimize for more clicks, you can accidentally reward clickbait. If you optimize for more ad revenue, you will eventually get a SERP that feels like a shopping mall with a library in the back.

In plain terms: a good search engine should optimize for successful task completion. Did you fix the issue, choose the right product, learn the thing, or find the page you meant to find? That means measuring satisfaction carefully, not just measuring activity.

A good search engine should optimize for long-term trust. Not "did this answer feel good in the moment," but "when people checked later, was it right?" That implies stronger feedback loops based on corrections, expert review where appropriate, and weighting sources with real track records.

A good search engine should optimize for source transparency. If you show an AI answer, it should be easy to see what it relied on, what's uncertain, and what's being inferred. "Here are the receipts" should be a first-class feature, not a footnote.

A good search engine should optimize for spam resistance as a core product feature. That means treating SEO spam like malware: detect patterns, penalize networks, and make it expensive to scale junk. This is especially important with AI-generated content, because "cheap to produce" changes the economics of spam.

A good search engine should optimize for diversity of sources and formats. Not diversity as a buzzword, but diversity as resilience. If every query funnels you to the same five mega-websites, you end up with a monoculture. And monocultures fail.

Now, practical takeaways you can use today, even if Google doesn't change anything tomorrow. Use search operators intentionally.
Try site: to restrict results to one domain, quotes for exact phrases, and exclusions with a minus sign to remove garbage domains or terms. If you're troubleshooting, add a specific error code, model number, or a log snippet in quotes.

Bias toward primary sources. For software and security topics, look for vendor docs, official changelogs, standards documents, and reputable communities. For products, look for teardown-style reviews, long-term ownership updates, and forums where people discuss the actual failures.

Treat AI answers like a draft, not a verdict. Use them to get vocabulary and maybe a starting plan, then verify with at least two solid sources. If the AI can't give you citations that actually back the key claim, assume it's shaky. Don't trust it.

Build your own trusted web. Bookmark a few sites and communities that consistently deliver, subscribe to RSS feeds where you can, and keep a small notes file of places that were right last time. This sounds old school because it is, but it still works. I can't tell you how many bookmarks I have of places that fixed a problem for me, saved so I could come back if I needed to.

Why does this matter? You can't control Google's incentives, but you can control your workflow. A small shift in how you search can save, I don't know, hours. And it reduces the chance you'll make a decision based on whatever content was most aggressively optimized this week.

So yeah, search got worse. Not because engineers forgot how to build a search engine, but because the web got a whole lot noisier and the incentives tilted toward monetization and scalable manipulation. The good news is we can also define what good should mean in 2026: task success, trust, transparency, and resistance to spam, especially when AI is generating the first thing you see.
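The operator habit from the toolkit above can be sketched as a tiny helper. The `refine` function is hypothetical, just a way to show how the pieces (site: restriction, exact-phrase quotes, minus-sign exclusions) combine into one query string you'd paste into any search box.

```python
# Hypothetical helper (not a real API): compose a refined search query
# from the operators in the toolkit: site: pins a domain, quotes force an
# exact phrase, and a leading minus excludes a term or domain.
def refine(terms, site=None, exact=None, exclude=()):
    parts = [terms]
    if exact:
        parts.append(f'"{exact}"')   # exact-phrase match
    if site:
        parts.append(f"site:{site}") # restrict to one domain
    parts.extend(f"-{word}" for word in exclude)  # drop junk terms
    return " ".join(parts)

print(refine("laptop battery drain", site="reddit.com",
             exact="drains overnight", exclude=("pinterest",)))
# → laptop battery drain "drains overnight" site:reddit.com -pinterest
```

The output is exactly the kind of query that skips the listicle layer and lands on first-hand reports.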
Visit Tylerwoodward.me, follow @tylerwoodward.me on Instagram and Threads, and subscribe and like the show on your favorite podcast platform. I'll catch you next week.

Podcasts we love

Check out these other fine podcasts recommended by us, not an algorithm.

The Why Files: Operation Podcast

Sightings
REVERB | Daylight Media

Darknet Diaries
Jack Rhysider

99% Invisible
Roman Mars

StarTalk Radio
Neil deGrasse Tyson