Chapter 9 — The Knowledge Market
Why Truth Now Behaves Like a Financial Instrument
The modern world congratulates itself on having solved the scarcity of information.
This is an impressive achievement, and a deeply ambiguous one.
For scarcity was never merely a shortage to be overcome. It was also a constraint that did work. It limited noise. It limited entry. It created time, friction, and hierarchy. It ensured—often unfairly, sometimes brutally—that claims passed through gates before they acquired authority. The gates were imperfect. Sometimes corrupt. Often exclusionary. But they performed an elemental systemic task:
they made truth difficult to manufacture.
The central fact of the contemporary knowledge environment is that we have removed most of those gates while preserving the existential stakes that made gatekeeping a social necessity in the first place. The consequence is not merely “misinformation” in the casual sense. It is a structural transformation in how truth behaves.
Truth has begun to behave like a financial instrument.
Not because people became dumber. Not because institutions became wholly malignant. And certainly not because the public suddenly developed a taste for conspiracy as entertainment. Those are psychological readings of a structural shift.
The structural shift is this:
Knowledge now circulates at scale without reliable mechanisms of settlement.
In such an environment, claims do not win by being true. They win by being tradable.
And tradability is not a moral category.
It is a market property.
The older knowledge environment had two basic characteristics.
First, publication was scarce. Second, enforcement was real.
“Enforcement” here does not mean police or censorship. It means something more mundane and more fundamental: the ability of institutions to close disputes, stabilize reputations, and impose consequences for error.
If you published a claim in an academic journal, you did not merely speak. You entered a field in which counterclaims could be made, errors could be demonstrated, reputations could be damaged, and careers could be altered. The system was imperfect, but it did not treat knowledge as consequence-free performance.
The new environment reverses this ratio.
Publication is now frictionless. Enforcement is now optional.
This is what scale does when it outruns institutions. It detaches speech from consequence. It permits what might be called epistemic free entry—a condition in which the marginal cost of producing a claim approaches zero, while the marginal cost of evaluating it rises sharply.
This is the first asymmetry of the knowledge market:
- It is cheap to assert.
- It is expensive to verify.
When verification becomes expensive, it becomes selective. And when it becomes selective, it becomes partisan, tribal, and identity-bound—not because people are irrational, but because the system has made full verification impossible as a general practice.
The citizen, in earlier eras, could delegate substantial epistemic labor to institutions: newspapers, universities, courts, professional associations, licensing bodies. Those institutions were never neutral, but they performed a key coordinating function: they produced a reasonably shared reality at the level required for collective action.
Now the individual must do the institution’s work privately.
This is the epistemic equivalent of the pension-to-401(k) shift described in the previous chapter:
the guarantee is withdrawn; the stakes remain; the individual inherits the job.
The manifest function of the internet is democratization of access. The latent function is privatization of epistemic labor.
Under such conditions, the knowledge environment does not become “freer.”
It becomes more expensive—cognitively, emotionally, and socially.
And as with any cost that is not priced explicitly, it will not be distributed evenly.
Those with surplus time, high cognitive bandwidth, or strong ideological motivation will dominate the evaluation layer. Those without such resources will either outsource judgment to substitutes—tribes, influencers, charismatic simplifiers—or retreat entirely.
That is not ignorance.
That is rational triage under overload.
When an environment expands faster than its systems of evaluation, it adopts substitute signals.
In a knowledge market, those substitute signals are rarely epistemic. They are social.
The most important substitute signal is visibility.
In earlier knowledge regimes, visibility was downstream of validation. A claim became widely known because it survived institutional filtering. The filtration could be biased, but it existed.
In the current regime, visibility becomes upstream of validation. A claim becomes widely believed because it becomes widely seen. The seeing is then treated as evidence of merit.
This is the second asymmetry:
- Credibility is no longer primarily earned.
- Credibility is acquired through distribution.
The mistake, often made by people who wish to preserve optimistic narratives of modernity, is to treat virality as a defect of human psychology. “People like drama.” “People like outrage.” “People are addicted to dopamine.”
Those statements may be true and still miss the mechanism.
Virality functions as a price signal.
In a market flooded with supply, one needs a proxy for value. The proxy cannot be truth because truth is expensive to verify. So the proxy becomes engagement: clicks, shares, likes, comments, outrage, memes, repetition.
These are not epistemic signals. They are liquidity signals.
They tell you how easily the claim circulates—not whether it is accurate.
But the system interprets circulation as value because circulation is the only available public metric. Thus, claims with high tradability acquire the aura of truth. Their price rises. Their visibility increases. Their authority compounds.
This is not a corruption of the system.
It is the system’s adaptation to scale.
When everyone can publish, the system must rank.
And when it ranks without enforcement, it ranks by visibility.
This is the first step in transforming truth into an instrument: it becomes something traded for attention, not settled by proof.
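For readers who want the mechanism stated bluntly, here is a deliberately naive sketch in Python. The field names, weights, and example figures are invented for illustration; no real platform is being described. The structural point is only this: the ranking function can see circulation and cannot see accuracy.

```python
from dataclasses import dataclass

@dataclass
class Claim:
    text: str
    shares: int      # how often the claim circulates
    likes: int
    comments: int
    verified: bool   # whether anyone has checked it -- unused below, on purpose

def liquidity_score(claim: Claim) -> float:
    """Rank by circulation alone: engagement is the only public metric.

    The weights are arbitrary. What matters is that `verified` never
    enters the score -- accuracy is invisible to the ranking.
    """
    return claim.shares * 3.0 + claim.comments * 2.0 + claim.likes * 1.0

feed = [
    Claim("careful, qualified finding", shares=40, likes=300, comments=25, verified=True),
    Claim("one-sentence outrage", shares=4_000, likes=9_000, comments=1_200, verified=False),
]

# The unverified claim ranks first, because circulation is the proxy for value.
for claim in sorted(feed, key=liquidity_score, reverse=True):
    print(f"{liquidity_score(claim):>10.0f}  {claim.text}")
```

Nothing in the sketch is malicious. It is simply a ranking rule doing what ranking rules do when settlement is unavailable as an input.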
At this point, one must introduce an old sociological mechanism with renewed force:
the Matthew Effect.
It is often summarized as “the rich get richer,” but the deeper claim is structural:
Advantage compounds through visibility, not merit.
In knowledge markets, this means:
the more attention a claim receives, the more credible it appears; the more credible it appears, the more attention it receives.
This recursion produces something that looks like authority but behaves like market capitalization.
Authority becomes an asset class.
It can be accumulated, traded, leveraged, converted into money, converted into political power, converted into social immunity. It can be carried from domain to domain regardless of competence. It can be reissued in new contexts with minimal friction.
In older systems, expertise was local. A physician was not presumed to be a constitutional theorist. A physicist was not presumed to be a historian. A journalist was not presumed to be a virologist. Status did not travel easily because institutions enforced domain boundaries.
But in an attention market, boundaries dissolve.
A person with authority in one domain can cash it in everywhere, because the currency is not competence—it is visibility.
This is the third asymmetry:
- Expertise was once domain-bound.
- Authority is now portable.
The Matthew Effect accelerates because platforms are themselves compounding machines. They do not merely display what people choose. They amplify what is already amplified. They create a feedback loop between popularity and reach. The result is a knowledge environment that behaves like a winner-take-most market.
And in winner-take-most markets, a small number of actors dominate distribution regardless of quality, while the majority compete for attention with ever more extreme or stylized outputs.
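The compounding can be shown with a toy simulation, not a model of any real platform; every number in it is arbitrary. Each round, one unit of attention goes to a claim chosen in proportion to the attention it already holds—amplification of what is already amplified—and quality never enters the allocation at all.

```python
import random

def simulate_attention(n_claims: int = 100, rounds: int = 10_000, seed: int = 0) -> list[int]:
    """Toy rich-get-richer dynamic: visibility feeding back into visibility.

    Every claim starts with one unit of attention. Each round, one new unit
    is allocated with probability proportional to attention already held.
    No variable for accuracy or merit appears anywhere in the loop.
    """
    random.seed(seed)
    attention = [1] * n_claims
    for _ in range(rounds):
        winner = random.choices(range(n_claims), weights=attention, k=1)[0]
        attention[winner] += 1
    return attention

attention = simulate_attention()
total = sum(attention)
top_five = sum(sorted(attention, reverse=True)[:5])
print(f"Top 5 of {len(attention)} claims hold {100 * top_five / total:.1f}% of attention "
      f"(equal shares would give them {500 / len(attention):.1f}%)")
```

Which claims end up on top is decided by early, random luck, then locked in by the feedback loop. That is the Matthew Effect reduced to a dozen lines.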
Truth, under such conditions, becomes an underfunded competitor.
For truth is usually slower than spectacle, more qualified than certainty, and less attractive than simplification. It does not compress well. It does not go viral easily. It is not naturally optimized for algorithmic survival.
This is not merely a cultural problem. It is structural selection.
The environment selects for claims that travel efficiently through human cognition and platform incentives. Those claims then acquire authority through repetition. Authority then stabilizes itself through attention, regardless of epistemic content.
To describe this as “post-truth” is too theatrical.
It is not post-truth.
It is pre-settlement.
Once truth behaves like an instrument, the liberal instinct is to believe that the market can be corrected through information.
“If we just give people the facts.”
“If we just debunk the misinformation.”
“If we just teach media literacy.”
These are not stupid ideas. They are incomplete. They assume the environment still contains enforcement mechanisms strong enough that correction can close disputes.
But the knowledge market is defined precisely by the absence of binding closure.
Debunking fails for structural reasons.
First: correction is more expensive than assertion.
To correct a false claim, one must often do several things that the false claim did not have to do:
- define terms
- introduce context
- cite sources
- establish method
- show uncertainty
- explain inference
- anticipate counter-moves
The false claim, meanwhile, is a one-sentence asset designed for rapid circulation.
Second: correction rarely produces a state change.
In institutions, correction mattered because it carried consequence. A paper could be retracted. A license could be revoked. A reputation could be formally damaged. An actor could be forced to concede.
In the knowledge market, none of that is required.
A claim can be disproved and still circulate. It can lose on evidence and still win on attention. It can be “debunked” and gain strength because debunking itself increases visibility.
This is why the most distinctively modern, and most fatal, property of misinformation is not falsity.
It is immunity.
The claim is structurally protected not by secrecy but by liquidity. It is too easy to reissue. Too easy to remix. Too easy to relabel. Too easy to reframe as persecution: “Look how hard they are trying to silence me.”
Third: correction is experienced as domination.
In environments where there is no shared court, correction is not interpreted as clarification. It is interpreted as attack. The corrector becomes an aggressor because he attempts to impose settlement without legitimate authority.
This is the central coordination failure:
people no longer share the rules by which disputes end.
Therefore, disputes do not end.
They escalate. They moralize. They become identity battles.
“Debunking” then becomes a ritual performance, not a closure mechanism. It is a signal of group membership: I am aligned with the fact-checkers. I am aligned with the rationalists. I am aligned with the institutions.
But rituals do not settle. They merely reassure.
This produces a particularly modern irony:
the institutions of truth continue to speak the language of correction in a world where correction no longer binds.
They are performing the manifest function while the latent function has shifted beneath them.
This is why “debunking” has the peculiar taste of futility. It is not that the correctors are wrong. It is that the system is no longer designed to accept closure.
Correction is now a kind of unpaid labor performed by those still devoted to shared reality.
The market does not reward them.
It extracts from them.
We can now state the modern condition without melodrama.
Information increases. Truth decreases.
This sounds paradoxical only if one assumes that information naturally produces truth.
It does not.
Information is raw supply.
Truth is the result of settlement: procedures, standards, authority, enforcement, and closure.
A society can have an explosion of information and still lose truth if it loses the capacity to settle disputes and bind outcomes.
What is shrinking is not data.
What is shrinking is shared legibility.
The knowledge market produces abundance, but abundance without settlement produces what markets always produce when quality control fails:
a flood of low-grade instruments that can be traded but cannot be trusted.
This is why truth now behaves like a financial instrument.
It is no longer primarily a state of the world discovered through shared procedure.
It is a position taken, defended, leveraged, and signaled.
It becomes:
- a portfolio identity
- a tribe marker
- a weapon
- a brand
- a substitute for belonging
- a form of protection against humiliation
And when truth becomes that, people do not merely disagree.
They cannot coordinate.
Because they are not operating in the same epistemic currency.
They do not share settlement rules.
They do not even share what counts as evidence.
They do not share who is authorized to close the dispute.
This is the condition people experience in daily life as:
- “We can’t talk anymore.”
- “Nothing is agreed upon.”
- “Everything is politics.”
- “No one can be convinced.”
- “Everyone has their own reality.”
Those complaints are not merely cultural despair.
They are structural diagnosis.
They are anomie.
Norms remain visible—facts, expertise, institutions, credentials—but they no longer bind. The grammar remains; enforcement dissolves. The result is not chaos, but permanent argument without settlement.
This is why the knowledge market is not a side issue.
It is the primary modern engine of coordination collapse.
When you cannot share reality, you cannot fix problems. When you cannot close disputes, you cannot govern. When you cannot enforce epistemic standards, you cannot even maintain friendship across difference.
The system does not require bad actors to reach this state.
It requires only scale, cheap publication, and weak enforcement.