The Tyler Woodward Project
The Tyler Woodward Project is a weekly show about the way technology, science, and culture actually collide in real life. Each episode breaks down the systems, tools, and ideas shaping how we work, communicate, and live, without the buzzwords, posturing, or fake hype. Expect smart, grounded conversations, a bit of sarcasm, and clear explanations that make complex topics feel human and relevant.
Tracking Eyes In Public
A camera that notices you, zooms in, and follows you sounds like a neat feature until that feed is viewable on the open internet with zero friction. We dig into AI-enabled PTZ systems, why they transform surveillance from passive recording into active selection, and how a single misconfiguration can turn a powerful tool into a public broadcast. Pulling from reporting by 404 Media and Benn Jordan, we connect the dots between “debug” interfaces, Shodan indexing, and real-world harms that scale with ease of access.
We start by translating the tech: what pan-tilt-zoom actually means, how AI-assisted tracking changes the risk profile, and why close-ups convert generic footage into identifying data. Then we take it to street level: how a network of cameras evolves from isolated views into coverage, coverage into routes, and routes into routines. We call out the flaw in “public roadway” defenses: a bystander can’t rewind last week, share a link, or search across locations. A world-viewable interface can. The result is frictionless replay that maps families, commuters, and kids with chilling granularity.
From there we get practical. We outline the exact questions residents, journalists, and city staff should ask: what’s deployed, who can access live and archived views, what authentication is enforced, whether an independent security assessment audited the real deployment, and how fast an emergency shutdown can happen. Then we set non-negotiable baselines: no public exposure of admin or troubleshooting endpoints, mandatory encryption, multi-factor authentication, rigorous logging and review, and architectures that prevent one bad setting from becoming a public feed. If a system is designed to track people, it must be designed to protect people with urgency measured in days, not quarters.
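For the technically inclined, the "one bad setting can't become a public broadcast" baseline boils down to a default-deny access policy. Here is a minimal sketch of that idea; the networks, path prefixes, and function below are purely illustrative assumptions, not Flock's (or any vendor's) actual implementation:

```python
from ipaddress import ip_address, ip_network

# Hypothetical values for illustration only -- not any real deployment's config.
TRUSTED_NETS = [ip_network("10.0.0.0/8"), ip_network("192.168.0.0/16")]
ADMIN_PREFIXES = ("/admin", "/debug", "/troubleshoot")

def allow_request(src_ip: str, path: str, authenticated: bool) -> bool:
    """Default-deny policy: every request needs authentication, and
    admin/debug endpoints are reachable only from trusted internal networks."""
    if not authenticated:
        return False  # no anonymous viewing, live or archived
    if path.startswith(ADMIN_PREFIXES):
        # Troubleshooting interfaces: internal management networks only.
        return any(ip_address(src_ip) in net for net in TRUSTED_NETS)
    return True
```

The design point is that denial is the default path: exposing anything requires an explicit, auditable allow, so a forgotten flag fails closed rather than open, which is the opposite of a debug interface that ships reachable from the internet.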
If you care about privacy, public safety, and smart governance, this one matters. Listen, share it with someone who manages infrastructure in your city, and tell us the one safeguard you think should be mandatory everywhere. Subscribe, leave a review, and help push this conversation into the rooms where deployment decisions get made.
Sources:
404 Media Podcast / https://youtu.be/DrGVGphD2L0?si=3AY9rL8cLdL2xm4a
Benn Jordan’s Video / https://youtu.be/vU1-uiUlHTo?si=udjf65lX3WZNi1JA
Send me a text message with your thoughts, questions, or feedback
If you enjoyed the show, be sure to follow The Tyler Woodward Project and leave a rating and review on Apple Podcasts or your favorite podcast app—it really helps more people discover the show.
⚠️ All views and opinions expressed in this show are solely those of the creator and do not represent or reflect the views, policies, or positions of any employer, organization, or professional affiliation.
Picture a camera that doesn't just record. It notices you, zooms in, and follows you. Now picture that same camera discoverable on the open internet like a public web page, so anybody with a link can watch. That's not sci-fi someday. That's deployment and configuration reality. And it needs to be addressed, and it needs to be addressed now.

Welcome back to the Tyler Woodward Project. I'm Tyler, a broadcast engineer by trade, a Linux nerd by choice, and I enjoy demystifying tech that's supposedly too complicated for people. This episode is coming outside my normal release schedule, so consider it a bonus episode. I'm doing this because this kind of stuff can't wait. Today we're talking about Flock Condor cameras and why this whole concept is terrifying when it's used broadly, and even more terrifying when the technical guardrails are nonexistent.

A quick and important credit here: huge thanks to 404 Media for reporting on this and bringing this situation to light, and big credit to Benn Jordan for documenting it in detail and pushing this story into the broader conversation. Getting this out into the daylight is the only reason any of it has a chance of getting fixed. If you want to read the original reporting and support the people doing the hard work, go check out 404 Media and Benn Jordan's video that helped spotlight all of this.

Before we go any further: because of my job, my day-to-day work, I can't advocate specific policy changes or tell any city what kind of ordinances it should be passing. But it doesn't take a policy position to say this: if surveillance infrastructure is exposed to the open internet, that's unacceptable, and the response time should be measured in days, not quarters.

Here's the plan. First, we'll define what these cameras are in plain English and why AI tracking changes the risk profile.
Then we'll connect it to real life: how this changes behavior in public, even if you've never done anything wrong in your life. Finally, we'll get to practical takeaways, questions to ask, and technical expectations that should be non-negotiable.

Let's transcode the buzzwords for you right out of the gate. PTZ means pan, tilt, zoom. Instead of a fixed camera that watches one wide scene, a PTZ camera can physically turn and zoom to keep a subject in frame. Now add the detail that makes this feel like a different category of surveillance: AI-assisted tracking. In the reporting around Flock Condor cameras, these are described as being designed to detect and track people, and 404 Media describes configurations where the camera automatically zooms toward faces as people move through the area. In Benn Jordan's description, these are PTZ cameras that use AI to zoom in and follow you around, whether you're a person of interest or not. That's the difference between a camera existing and a camera actively paying attention. It's not just recording anymore. It's selection. It's prioritization. It's a system deciding who gets the close-up.

Now let's talk about the open-internet part, because that's the accelerant. There's a search engine called Shodan that indexes internet-connected devices. 404 Media reported that exposed Flock Condor camera interfaces could be found via Shodan and viewed without any authentication. No password, no two-factor, nothing, just wide open out there on the internet. And without any kind of authentication, that's the whole ballgame. It's one thing for a city to say, "We've got cameras." It's another thing entirely if someone, somewhere, can passively watch people in real time without logging in. According to 404 Media, the exposure involved access to live streams, and they described access to archived video and settings through exposed interfaces.
In Benn Jordan's account, he describes finding administrative interfaces for dozens of cameras where the video wasn't even encrypted and no username or password was required to view live and archived footage. Now, a fair and important note: Flock publicly framed this as a limited configuration issue involving a troubleshooting-only debug interface that was temporarily accessible from the internet. Yeah, sure. The company stated it did not allow camera control, cloud access, account access, or analytics or search features. Go watch Benn Jordan's video. It proves otherwise. Flock also stated that what could be viewed was comparable to what can be observed from a public roadway. Okay, sure.

Here's why that doesn't settle the fear at all. A person standing near a roadway doesn't get frictionless replay. They don't get easy sharing. They don't get come-back-later access to see patterns across days. And when the camera can automatically zoom in and track, the footage becomes much more identifying. A wide shot is annoying-ish. A tight shot is a face, a tattoo, a logo on a uniform, a kid's backpack, a home's entryway. Details you can't unsee.

Here's the part people miss: this is a system that gets more powerful as you add more cameras. One camera is a view. Multiple cameras become coverage. Coverage becomes routes, and routes become routines. So the technical core of this episode is simple: if a system is designed to track people, its security has to be designed like it's protecting people, because it is.

Now let's connect this to real life. If you've ever watched someone walk down a sidewalk on a security-camera feed, you know the feeling. It's distant, it's grainy, it's abstract. But when a system can auto-follow and auto-zoom, that distance collapses. In Benn Jordan's framing, this wasn't just "you can see a camera feed."
He described being able to browse recent activity across locations like a library of footage, what he calls "Netflix for stalkers." And even if you never touch a setting, just watching is enough to do damage, because watching reveals patterns. This is where the fear about identifying families becomes real. You don't need a name if you can repeatedly see the same people, the same car, the same school pickup routine, the same playground schedule. Time plus video equals a profile.

Benn Jordan talks about the Hawthorne effect: people changing their behavior because they know they're being observed, and how surveillance doesn't just deter crime, it deters harmless, being-human moments. That matters because public space isn't only where we commute; it's where we decompress, learn, practice, and just exist. Now layer on the possibility that the audience isn't authorized staff but literally anyone. That's not oversight. That's a vulnerability. And to be clear, this isn't a story about one company being uniquely evil. It's a story about high-impact surveillance tech meeting real-world deployment mistakes. When the gear is powerful enough, the margin for error has to go way down.

All right, let's get practical, shall we? Without turning this into a guide for doing harm: if these systems exist in your town, your county, or your local shopping district, here are a few questions that are fair to ask. What exactly is deployed? Fixed cameras, license plate readers, an AI-enabled PTZ people-tracking camera? Who can access live views and archives, and what authentication is required? Was an independent security assessment performed on the deployed system, not just the sales demo? What's the emergency process? If exposure is discovered, who gets paged, how fast can access be cut off, and how is the public notified?
Now, here are the baseline expectations that shouldn't be controversial to anybody. This should be baseline stuff. Troubleshooting and admin interfaces should not be reachable from the public internet. Full stop, period, exclamation mark, whatever. Access should require some form of authentication, you would think, right? And access should be logged and reviewed. The system should be designed so a single misconfiguration can't turn into a public broadcast.

And this is the line that matters most, especially if you're hearing this as a parent, or a runner, or someone who just wants to exist in public without being tracked and feeling hunted. If a camera can follow you, the system is already intimate. So the security has to be intimate too: tight, deliberate, and constantly verified.

Last, the boundary again, clearly stated: I can't advocate for specific policy changes because of my role, my day-to-day work. But this needs to be addressed, and it needs to be addressed now, because the risk is immediate when the exposure is immediate. The hook was the camera that notices you, zooms in, and follows you. When that kind of system is exposed to the open internet for anybody to watch, the harm scales instantly.

Visit tylerwoodward.me, follow @tylerwoodward.me on Instagram and Threads, and subscribe and like the show on your favorite podcast platform. I'll catch you next week.
Podcasts we love
Check out these other fine podcasts recommended by us, not an algorithm.
The Why Files: Operation Podcast
The Why Files: Operation Podcast
Sightings
REVERB | QCODE
Search Engine
PJ Vogt
The 404 Media Podcast
404 Media
Darknet Diaries
Jack Rhysider
Techdirt
Techdirt
IT Horror Stories
NinjaOne
Hard Fork
The New York Times
The Ezra Klein Show
New York Times Opinion
Alive with Steve Burns
Lemonada Media
99% Invisible
Roman Mars
StarTalk Radio
Neil deGrasse Tyson
Primary Technology
Stephen Robles and Jason Aten
Uncanny Valley | WIRED
WIRED