Ad Tech’s Bielefeld Problem: What If None of It Is Real?
by Shirley Marschall on 15th Apr 2026 in News

In her latest column, Shirley Marschall looks at a decades-old conspiracy theory, and what it has in common with ad tech's 'frictionless' promise...
Ever heard of the town of Bielefeld in Germany? Probably not… but you might have heard of the Bielefeld Conspiracy. In 1994, a German student posted a simple claim online: the city of Bielefeld does not exist. Not metaphorically. Literally.
Every supposed detail of the city, he argued, from the roads to the university to the football club, was part of an elaborate illusion maintained by a shadowy entity known only as “SIE” (“THEM”). To separate believers from sceptics, he proposed three questions: Do you know anyone from Bielefeld? Do you know anyone who has ever been there? Have you been there yourself? If not, what exactly are you trusting?
The joke spread because it was obviously absurd. A shared fiction everyone could enjoy without consequence.
Ad tech is arriving at something similar, just without the self-awareness.
What’s emerging is not a conspiracy, but a system that no longer seems to need external reality to function. Not by design (at least not entirely), but as a side effect of AI, the mechanics have evolved into something frictionless and synthetic.
Frictionless. The magical version of the world the industry keeps promising: faster, easier, cleaner, with nothing getting in the way. The digital version looks a lot less magical. AI now generates supply en masse: content, impressions, audiences, entire environments of synthetic interactions. AI also bids on that supply, evaluates its performance, and feeds the results straight back into optimisation. And around it goes. An Ouroboros market: self-consuming, self-reinforcing, and increasingly self-sufficient. The loop closes neatly, efficiently, and entirely AI-internal.
As Natasha Whitfield-Niven, CEO & Founder of CertM8, points out: “If the same systems are generating signals, acting on them, and validating them, then any meaningful check has to come from outside that loop; and it needs to be something the system can’t influence or reinterpret.”
At no point does something definitively real have to happen. Only something measurable and convincing enough. Clicks resemble engagement. Conversions resemble outcomes. Optimisation resembles improvement. Each layer validates the next, and as long as the outputs remain plausible, the system holds together. Yes, that’s the bar now. Not truth. Not causality. Just plausibility.
There used to be distance between what happened and what was reported. That gap was inconvenient, sure, but it also acted as a form of discipline. Measurement pointed at reality, even if imperfectly.
As Thomas From, CPO at Adnami, described it: “There has always been a gap between what we measure and what happens in the real world. I think of it like a map: it doesn’t replicate reality, but it’s good enough to navigate it. With AI systems there is a twofold risk: that awareness of the gap fades, and that the gap itself widens as closed systems increasingly optimize against their own outputs. In other words, the map warps until any resemblance to reality is lost, and few will notice.”
Now that this gap is AI-ed away, the systems generating behaviour are increasingly the same systems interpreting it. The distinction between reality, simulation, and optimisation isn’t really being examined; it’s mostly being treated as irrelevant. After all, if the numbers move, the system is working. Right?
Which raises the question, not a philosophical one, but a practical one: if you step outside the interface, outside the dashboards, the models, the outputs, how much of what is being reported can actually be traced back to something independently verifiable? Or, put differently: how much of the system is observing the world, and how much of it is observing itself?
What takes shape is a market that is no longer tightly anchored to the outside world. Not fully detached, but sufficiently insulated to operate on its own internal logic. Synthetic supply can perform against synthetic demand. Modelled users can complete modelled journeys. Performance improves, because performance is defined within the same system that produces it.
The industry’s instinct is to respond by adding more data, more signals, more precision. But that only reinforces the loop. What’s missing isn’t yet another layer of optimisation, but a mechanism that forces the system to confront something external, something it didn’t generate, doesn’t control, and cannot quietly redefine. Because without that, internal coherence starts to pass for truth.
“The missing layer isn’t more data, it’s a mechanism that reconnects optimisation back to external reality before internal coherence is mistaken for truth,” says Whitfield-Niven. That’s exactly the Bielefeld problem. Not a lack of data but a lack of anything that forces the system to confront reality.
Reintroducing that kind of constraint is inconvenient (at best), unimaginable (at worst). It creates friction (which we just successfully eliminated). It breaks clean feedback loops, exposes the gap between modelled performance and actual impact. In other words, it does exactly what the current system has been engineered to eliminate. But that friction is exactly the point, because without it, we’re left with a market that works perfectly on its own terms and increasingly only on its own terms.
The Bielefeld Conspiracy was always a joke. Bielefeld exists. You can go there and meet people from there. The verification is trivial. Ad tech should, theoretically, be equally trivial to verify. An impression was served. A human saw it. A behaviour followed. The chain is traceable.
In practice, it often isn’t… somewhere between the AI-generated content, the AI bidding, the AI measurement and the AI optimisation, the connection to external reality gets severed. Add to that a supply chain that is too long and too automated, and a loop that is too closed and too dependent on self-reported signals from parties with incentives to report success.
At that point, it almost doesn’t matter whether any of it is real. The system works, the numbers move, the story holds. And if you’re not looking too closely, that’s usually enough.