The Tyler Woodward Project

Dead Internet, Human Costs

Tyler Woodward Episode 3

Ever scroll past the same joke, the same cropped video, and replies that don’t quite sound human? We dig into why the web can feel hollow without falling for the doomsday take. As a broadcast engineer and Linux nerd, I frame the “dead internet theory” like a signal problem: automation raised the noise floor, ranking systems amplified low-cost content, and honest creators now compete with industrial output that’s optimized for clicks, not clarity.

We start by separating the soft claim from the hard one. The soft claim holds: bots, SEO farms, and AI pipelines flood feeds and search with mass-produced posts. The hard claim doesn’t: humans haven’t vanished. What changed is the layer you see, feeds and results curated by algorithms that maximize watch time and engagement. That shift rewires incentives. If you can publish 10,000 posts and only 10 need to hit to pay, volume wins. Add fake likes, coordinated replies, and engagement pods, and the ranking loop gets gamed. Trust takes the hit, and users feel the static.
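
To make that volume math concrete, here's a rough sketch using the 10,000-posts, 10-hits framing from the episode; the per-piece cost and per-hit revenue are invented placeholders, not real figures:

```python
# Back-of-the-envelope content-farm economics. Only the 10,000 posts / 10 hits framing
# comes from the episode; the cost and revenue numbers below are made-up placeholders.
pieces_published = 10_000
cost_per_piece = 0.05      # hypothetical: an automated pipeline costs cents per post
hits = 10                  # pieces that actually rank or go viral
revenue_per_hit = 100.0    # hypothetical: ad or affiliate income from one successful piece

profit = hits * revenue_per_hit - pieces_published * cost_per_piece
print(profit)  # 500.0 -> the other 9,990 posts earn nothing, and the operation still pays
```

Quality never enters that equation; only the hit rate and the cost of flooding the zone do.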

Then we tour the platforms. On search, “how to” results read like cloned pages with long intros and vague steps. On social, reply zones fill with generic praise, rage bait, and suspicious links seconds after posting. Short video feeds surface reposts, mirrored clips, and AI slideshows with confident claims and no sources. Forums and subreddits still show human texture thanks to moderation, but stealth marketing and AI-polished posts slip through. We also clear a key misconception: AI use doesn’t equal a bot; the problem is automation at deceptive scale.

Finally, we get practical. Treat feeds as outputs, not reality. Curate aggressively with mutes and pruned follows. Learn quick tells for bots and source-free videos. Use RSS, reputable newsletters, and search operators to bypass SEO sludge. Harden your browser with a content blocker, a password manager, and two-factor authentication. Keep one mental rule: authenticity costs something. Real people have constraints and histories; industrial content is smooth and interchangeable. Want the internet to feel alive again? Go where humans pay a cost to be present: moderated communities, creators with reputations, and spaces where conversation is the product.
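
As a rough illustration of those quick tells, here's a toy scoring sketch; the signals mirror the ones discussed in the episode, but the function, field names, and threshold are invented for the example and are not a real bot detector:

```python
# Toy heuristic for the "assume automation until proven otherwise" habit.
# Invented for illustration only; real bot detection is far messier than this.
def smells_automated(account_age_days: int, has_default_avatar: bool,
                     posts_per_day: float, replies_on_topic: bool) -> bool:
    tells = 0
    tells += account_age_days < 30   # brand-new creation date
    tells += has_default_avatar      # generic or missing profile photo
    tells += posts_per_day > 50      # a posting schedule no human keeps
    tells += not replies_on_topic    # replies that ignore what you actually said
    return tells >= 3                # several tells at once: treat it as automated

print(smells_automated(5, True, 120.0, False))  # True
```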

If this resonates, follow along, share it with a friend who’s tired of AI slop, and leave a review so more people can find the human web.

Send me a text message with your thoughts, questions, or feedback

Support the show

If you enjoyed the show, be sure to follow The Tyler Woodward Project and leave a rating and review on Apple Podcasts or your favorite podcast app—it really helps more people discover the show.

Follow the show on Threads or Bluesky. Get in touch on the official Matrix Space for the podcast.

All views and opinions expressed in this show are solely those of the creator and do not represent or reflect the views, policies, or positions of any employer, organization, or professional affiliation.

Tyler:

You scroll a little, and it's the same joke again. Same image, same phrasing, same replies, and somehow the comments are either empty, weirdly enthusiastic, or aggressively angry in ways that don't really feel human. And then you have a thought: wait, am I talking to bots? Today we're digging into the dead internet theory. What it claims, what's actually happening, what's sort of exaggerated, and how to stay sane online when the internet feels like it's being replaced with a vending machine full of content.

Welcome back to The Tyler Woodward Project. I'm Tyler, a broadcast engineer by trade, a Linux nerd by choice, and I enjoy demystifying the tech that's supposedly too complicated for people. The dead internet theory, if you've never heard that phrase, is basically the idea that most online activity is no longer driven by real humans. Instead, it's bots talking to other bots, auto-generated posts, fake engagement, and a handful of actual people wandering around inside it like they're in an abandoned mall food court. There are two versions of this story. The soft version says a lot of what you see is automated, manipulated, and optimized for engagement. The hard version says most people are basically gone and you're living in a synthetic internet. We're going to treat it like an engineering problem: define the claim, look at the signals, look at the incentives, and talk about what you can do.

Also, a quick glossary as we go. SEO stands for search engine optimization, making pages rank higher in search results. LLM is large language model, the kind of AI that predicts and generates text. And ad tech is the whole advertising technology machine that buys and sells your attention in milliseconds.

Here's the plan. First, we'll break down what's true and what's overstated. Second, we'll look at how this shows up across social platforms, video, and the open web. And finally, I'll give you some practical takeaways so you can tune your feeds and your brain back to something livable.

Let's steelman the theory first, because the feeling that the internet is dead didn't come from nowhere. A huge amount of online content is automated. Some of it is harmless: scheduled posts, weather bots, auto captions, repost accounts, news headlines syndicated everywhere. But a lot of it is built to capture money or influence: fake reviews, fake storefronts, affiliate spam, political astroturfing, scam DMs, and bot replies engineered to bait you into arguing. And man, I've seen a few of those on my own posts over the years. And yes, AI cranked the volume knob to eleven. Before, content farms had to pay actual writers. Now a person can spin up a pipeline where a model generates 500 articles, an image generator makes original thumbnails, and a script posts them across dozens of sites at a time. The cost per piece of content dropped through the floor. So the soft version of the dead internet theory is absolutely grounded in reality. Automation is everywhere, and the incentives reward it.

Now, the hard version, that most humans are gone, doesn't hold up as stated. The internet, you know, still has billions of real users. What's changed is that on many platforms, the most visible layer is that optimized content, not authentic conversation.

Here's the key idea: the internet isn't dead, but a lot of it is now mediated by ranking systems. On social platforms, you don't see what your friends said today; you see what the platform thinks will keep you there, keep your eyeballs on the screen. On search, you don't see the best answer; you see what a ranking system believes matches your query, influenced by SEO and site reputation signals. On video platforms, you don't see what's new; you see what will maximize screen time.

That distinction matters, because it means low-effort, high-output content has a structural advantage. If you can produce ten thousand pieces of content and only ten need to rank or go viral to pay the bills, you win. And the content doesn't even have to be good. It just has to be clickable and plausible long enough to earn an ad impression, an affiliate click, or even a follow. That's the economics underneath that, uh, quote unquote dead feeling: attention arbitrage.

One more layer: bots aren't just posting, bots also shape what gets promoted. Fake likes, fake follows, coordinated replies, engagement pods. These are all ways to push algorithmic systems. And once ranking systems get polluted, the incentives get even worse. Honest creators have to compete with noise that can be manufactured at scale.

This is why it matters in real life: when trust collapses, people spend more time verifying basic stuff. And let's be real, that's exhausting. It also changes what gets built. Platforms start optimizing for containment, not discovery, because discovery becomes dangerous when the map is full of traps. From a broadcast engineering lens, this looks like a classic signal-to-noise problem. If you raise the noise floor high enough, even a clean signal becomes hard to detect, and the viewer blames the TV, not the transmitter. Online, when the noise floor rises with spam, bots, and AI slop, people stop trusting the channel itself.

So what's the accurate way to say it? The internet isn't dead, but parts of it are industrial. They are mass-produced, optimized, and increasingly synthetic. And because we experience the internet through feeds and search results, the industrial layer can feel like the whole thing. Now, let's connect that to what you've probably seen this week.

Let's do a quick platform tour, because dead internet theory feels most convincing when you zoom into the specifics. First, the open web and search. You search for something simple, like best router settings or how to fix a washing machine error code, and you get ten pages that read like they were written by the same person wearing different mustaches: long intros, vague paragraphs, and then a list of steps that never actually solve the real problem. That's SEO spam and made-for-advertising content. Some of it is AI generated, some of it is human written but templated. The goal is the same: rank, show ads, earn clicks. You see this most with recipe websites, the whole long life story before actually getting to the recipe.

Second, let's talk about X, Facebook, Threads, the reply zones. You post a thought, and within minutes you get replies that are either oddly generic ("well said"), or aggressively polarizing, or they contain a link that you probably should never click. Those are often bot accounts or semi-automated accounts. Even when they're quote unquote real, they can be coordinated. The effect is to bend the vibe of the conversation: more heat, less light, and a lot more performative shouting.

Third, let's talk about TikTok, YouTube Shorts, and other video platforms. Reposts are the big tell: the same clip with the same captions, maybe slightly cropped, maybe mirrored, posted by 50 other accounts.
And then there's the AI angle: synthetic narration, AI-generated images, and content that looks like a documentary but is basically a slideshow of confident-sounding claims. Some channels do this responsibly. A lot don't. The platform doesn't know what's true; it does know what's getting watched, though.

Fourth, Reddit and other forums. These places still have some of the best of the human internet left, because communities moderate and people call nonsense out. But they're also targets for stealth marketing and influence campaigns. A genuine question post, let's say about a product, can be an ad. A heartfelt story can be a test balloon. And AI makes it easier to write posts that sound authentic enough to blend in with all the other garbage.

Now, let me be careful about a common misconception: AI text equals bot. Not necessarily. Plenty of real people use AI to edit, translate, or outline. That's not the problem. The problem is when automation is used to scale deception or junk. Also, bot percentages are hard to pin down. Platforms publish their own estimates and enforcement stats, and researchers publish different numbers depending on methodology and what they defined as a bot. If you want to verify claims, look for what data set, what time window, what definition, and whether the measurement can be reproduced.

Here's the cultural piece, though, the part that hits you in the chest. When every platform starts to feel like a mall kiosk selling the same phone case, people retreat. They move into private group chats, Discord servers, newsletters, small forums, and friend circles. The public web becomes more of a billboard, and the real conversation goes backstage. I've been moving in this direction for a little while now. It's, you know, it's a shame. That's the dead internet sensation. Not that humans vanished, per se, but that the public commons got commercialized and automated so hard that authentic human presence got harder to find. And the moment you believe you're mostly surrounded by bots, you behave differently. You assume bad faith faster. You stop posting, you stop helping strangers. That's a real consequence, no conspiracy required.

So next, let's talk about what you can do practically, without turning your life into a full-time fact-checking job. First, treat feeds like output, not reality. A feed is a ranking system's best guess at what will keep you scrolling. If it feels bleak, repetitive, or just flat-out bizarre, that's not necessarily what's happening; it's what performs. Curate aggressively: mute keywords, hide suggested posts, and prune follows. You're not being rude. You're doing maintenance.

Second, learn a few bot and slop detection habits. If an account has a brand-new creation date, a generic profile photo, an incoherent posting schedule, and replies that don't actually respond to what you said, assume automation until proven otherwise. If a video makes big claims but never shows sources, names, or primary material, treat it like entertainment, not information.

Third, use better discovery tools on purpose. RSS, really simple syndication, is still one of the healthiest ways to follow sites you trust without the algorithm deciding your mood. Email newsletters can be good too, if the writer has a reputation to protect. For search, try adding specific operators: quote key phrases, use site filters, or append forum or PDF when you want less SEO nonsense.

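A minimal sketch of that RSS habit in code, assuming the third-party feedparser package is installed; the feed URLs are placeholders for whatever sites you actually trust:

```python
# A tiny self-curated feed reader: you pick the sources, no ranking system in the loop.
# Assumes the third-party feedparser package (pip install feedparser); URLs are placeholders.
import feedparser

FEEDS = [
    "https://example.com/blog/feed.xml",
    "https://example.org/news/rss",
]

for url in FEEDS:
    parsed = feedparser.parse(url)
    print(parsed.feed.get("title", url))
    for entry in parsed.entries[:5]:
        # Titles and links come straight from the publisher, in publication order.
        print("  -", entry.get("title", "(untitled)"), "->", entry.get("link", ""))
```

The search-operator version of the same idea: quote the exact phrase you care about and add a site filter for a community you trust, for example "washer error code" site:reddit.com, instead of taking whatever ranks first.
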
Fourth, harden your browser a little. Add an ad blocker like uBlock Origin; it reduces the incentive structure of junk sites and protects you from malvertising. A password manager plus two-factor authentication, or 2FA, keeps account takeovers from turning your real profile into a bot megaphone. I mean, none of this is perfect, but it lowers your exposure.

Fifth, keep one mental rule: authenticity costs something. Real humans have constraints. They forget. They have uneven tone. They have a history that makes sense. Industrial content tends to be smooth, frequent, and strangely interchangeable. When you remember that, the internet feels less haunted and more like a marketplace with a lot of counterfeit goods. Still useful, but you check the labels a little more closely. You know what I mean?

And this matters because your attention is a finite resource. If the web trains you to expect manipulation, you'll eventually stop engaging with anything, including the good stuff. The goal isn't paranoia. The goal is, let's say, selective trust.

So is the internet dead? No. But the part you can easily bump into, the loud, shiny, highly ranked part, that has been flooded with automation, and AI made the flood cheaper and way faster. If you want the internet to feel alive again, go where humans pay a cost to be there: communities with moderation, creators with a reputation, and spaces where conversation is the product, not a side effect.

Visit tylerwoodward.me, follow @tylerwoodward.me on Instagram and Threads, and subscribe and like the show on your favorite podcast platform. I'll catch you next time.

Podcasts we love

Check out these other fine podcasts recommended by us, not an algorithm.

The Why Files: Operation Podcast
Sightings (REVERB | Daylight Media)
Darknet Diaries (Jack Rhysider)
99% Invisible (Roman Mars)
StarTalk Radio (Neil deGrasse Tyson)