How We Built a Truth Machine - and Broke It
Mass media made a shared reality; the attention economy shatters it. What does that shift mean for leadership?
We can’t unsee what’s vanished
We all know it: the facts and truths that people agree on are disappearing. A global pandemic happens, and people can’t agree whether actions taken in response count as sensible public health measures…or unwarranted attacks on personal freedom.
A U.S. presidential election happens, and citizens can’t agree whether it was ‘free and fair’…or blatant theft of their country’s highest office. Climate change is either a con job or an existential threat to life as we know it. We took our shared reality for granted. Now we’re losing it.
We built a truth-machine
In pre-modern times, sources of truth were rather singular. Truth was whatever those in authority (priests, monarchs) pronounced it to be. The Enlightenment democratised the manufacture of shared reality; scientific method turned truth into something that anyone* could generate by following accepted methods of observation and rational thinking. (Yes, it helped to have the right background and access to a printing press.)
The 19th and 20th centuries introduced a new pillar to our knowledge infrastructure: mass media (national newspapers and then television broadcasters). The economics of the industry dictated that many people be served the same content. (It cost a lot to broadcast one programme, but it cost nothing if one more person tuned into it.) Hence, “mass” media. It was a truth-machine that generated a background hummmmmm of shared reality and reinforced it daily to us, its audiences. “…And that’s the way it is,” Walter Cronkite famously signed off the CBS Evening News each night for 19 years, from 1962 to 1981. “All the news that’s fit to print” is still the slogan of The New York Times.
It hypnotized and harmonized us
20th-century media theorists (Noam Chomsky, Marshall McLuhan) strove to pull that hummmmmm into the foreground, to gird us against the power this machine had to shape what we thought and didn’t think. Looking back, it’s plain to see how much the mainstream missed.
But this background hummmmmm also had useful properties that we mostly took for granted. The most important of these included:
- Public trust in the quality of information we were getting. The economics and politics of mass media conspired to produce (and enforce) standards of quality and ethics. Reporting the news was a profitable, powerful privilege; the license to do so could be withdrawn by regulators or stolen by competitors. For all the medium’s flaws, audiences and readerships widely believed that, if they ever did take the time and energy to verify a given story for themselves, they would uncover the same bare facts [1].
- Public trust in public institutions dedicated to creating knowledge: journalism, science, academia.
- Public trust in experts, i.e., in people who follow generally accepted standards of practice and evidence to know what they know.
- A weighty notion of “the common good”—weighty enough to demand some self-sacrifice on my part, at least sometimes.
We were hypnotized, but also somewhat harmonized, by doing so much of our social discourse and debates on the same wavelengths.

Then we broke it
Today, everyone can own a truth-machine.
The new economics of information are:
- Costless creation. The internet drove the cost of producing content down to little more than the time it took someone to make it. Now generative and agentic AI are driving even that time toward zero.
- Costless, permissionless distribution. Any message can spread freely, instantly, everywhere. Even language barriers are dissolving, thanks to automated instant translation.
The supply of content is approaching infinity. But we each have only so many hours in a day to consume it. So, the content platforms and purveyors tug us and nudge us this way and that, competing for the one remaining scarce thing that sets the limit of their power and profitability: our attention [2].
The game has changed. Engagement, not accuracy, marks the quality of content now. Exhibit A: More people around the world use Facebook for news than any other platform, but Facebook is not built for news [3]. It is built for establishing a habit of use.
The algorithms that populate our social media feeds serve up whatever’s most likely to grab us, which, as the past twenty years of evidence have taught, often means pushing content that triggers our “dark passions”: anger, fear, or domination [4]. The new economics of media dictate that, ideally, no one be served the same diet of content. Instead, each person’s feed should be as unique as their fingerprint, tuned precisely to their prior consumption patterns. The UX (pull-to-refresh; new content every scroll; notification badges; read-receipt blue tick marks) is designed for addiction, the way junk foods exploit our taste for sugar and fat [5].
Until next time…
In the next post, we’ll survey the ledger of harms (to borrow Tristan Harris’s phrase): the dysfunctions across politics, science, and society that leading researchers link to this collapse of common knowledge.
For here and now:
- What’s this piece making you think about, from your own experiences?
- What “common knowledge” has vanished from your world – but shouldn’t?
This has been post #1 in the chapter Common Knowledge is Collapsing.
Further Reading
[1] Jonathan Rauch – The Constitution of Knowledge
[2] Rasmus Kleis Nielsen – The Changing Economic Contexts of Journalism
[3] Maria Ressa – Facebook Broke Democracy in Many Countries around the World, Including in Mine
[4] William Galston – Anger, Fear, Domination: Dark Passions and the Power of Political Speech
[5] Zeynep Tufekci – YouTube, the Great Radicalizer