Bullshit Detector
Why people bullshit
"Never tell a lie when you can bullshit your way through it." — Eric Ambler
"One of the most salient features of our culture is that there is so much bullshit." — Harry Frankfurt
Frankfurt's 1986 essay On Bullshit draws a precise distinction that's easy to miss: a liar cares about the truth and is trying to hide it; a bullshitter doesn't care about the truth at all. The bullshitter's goal is the impression they create on the listener. Truth is at best a convenient side-effect, at worst an obstacle.
This distinction matters because the two require different defenses. You catch a liar by cross-referencing claims against evidence. You catch a bullshitter by noticing that they don't know, don't care, and aren't trying — they're performing. Most people's intuition conflates the two, which is why bullshit slips past detectors calibrated for lies.
People bullshit for four main reasons:
- They don't know but can't say so. Incentives in most professional settings punish "I don't know" and reward confident-sounding answers. Bullshit fills the gap.
- They're optimizing for perception, not accuracy. Sales pitches, VC updates, performance reviews — any context where the goal is a specific impression — reward bullshit structurally.
- They've never been punished for it. If a forecast has no accountability loop, bullshit is the rational strategy. The person who predicts 10 recessions in a row is indistinguishable from the person who correctly predicts one.
- They actually believe it. The most dangerous form. A sincere bullshitter has rationalized their position to the point where they no longer notice the claims are unsubstantiated.
The key indicator: asymmetry of risk
==A key indicator of bullshit is asymmetry of risk — when someone with no skin in the game advocates something that benefits them if it succeeds but hurts only you if it fails, they are bullshitting you.==
This is Nassim Taleb's formulation and it's the single highest-signal heuristic. It short-circuits most arguments about expertise, credentials, or eloquence. The question to ask isn't "is this person smart?" It's "what happens to them if they're wrong?"
- The analyst recommending the stock has no position in it. Asymmetry.
- The consultant selling a transformation has never operated the system they're reorganizing. Asymmetry.
- The pundit forecasting a recession has no portfolio exposure and no track record of prior calls. Asymmetry.
- The engineer pushing a rewrite will leave the company in 6 months. Asymmetry.
Not all asymmetry is bullshit — sometimes an advisor genuinely offers something of value without personal exposure. But in the absence of skin in the game, the burden of proof shifts. Track record, testability of claims, and willingness to stake something personal all become necessary signals.
Other practical detectors
Seven heuristics that have held up in practice:
- Ask for a falsifiable prediction. "The market is volatile" is bullshit. "This stock will be above $X by date Y" is a claim you can track. The instinctive move to hedge predictions into unfalsifiability is diagnostic.
- Ask what would change their mind. A serious position comes with conditions under which it would be revised. A bullshit position doesn't — it's a mood, not a model.
- Replace the jargon with plain words. If the plain version still makes sense, you've probably received a real idea. If it dissolves into nothing, you got bullshit. This is especially useful for consultant and academic jargon.
- Check for load-bearing metaphors. A single metaphor doing all the argumentative work ("it's the Uber of X," "synergy across the stack") usually means there isn't a real argument underneath.
- Track the concreteness gradient. Good arguments get more specific as you push on them. Bad arguments retreat to higher abstraction when questioned. A speaker who escalates to abstraction every time you ask a concrete question is bullshitting.
- Notice the claim-to-evidence ratio. A 30-minute pitch with zero numbers, zero sources, and zero caveats is a tell. Real analysis has texture.
- Watch what they do, not what they say. The consultant who won't invest their own money. The analyst who doesn't use the product. The founder who tells you the company is doing great but is selling personal shares. Actions are higher-bandwidth signals than words.
The inverse: calibrating your own output
A good bullshit detector applies inward as well as outward. Three failure modes to watch for:
- Generating confident answers to questions you don't actually know the answer to. The impulse is nearly universal; the discipline is catching yourself doing it and downgrading to "I don't know, but here's how I'd find out."
- Over-smoothed status updates. "Everything's on track" when three of five workstreams are orange is bullshit, even if unintentional. Learn to say "these two are at risk for these reasons" without flinching.
- Retrofitted narratives. Explaining a success with a reason you didn't hold at the time. This is hindsight bias wearing a story hat. If you can't find the pre-outcome written record, be skeptical of your own explanation.
Calibration — knowing how sure you are, and matching your stated confidence to your actual confidence — is the single biggest defense against generating bullshit yourself. Philip Tetlock's Superforecasting work is the clearest evidence that this can be trained, and that people who train it produce dramatically better predictions.
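Calibration can be made concrete by scoring forecasts. A minimal sketch in Python, assuming forecasts are stated as probabilities and outcomes as 0/1 (the function names here are illustrative, not from Tetlock's work): the Brier score penalizes confident misses, and a bucketed table shows whether "90%" events actually happen about 90% of the time.

```python
def brier_score(forecasts, outcomes):
    """Mean squared error between stated probabilities and outcomes.
    0.0 is perfect; always saying 50% scores 0.25."""
    return sum((p - o) ** 2 for p, o in zip(forecasts, outcomes)) / len(forecasts)

def calibration_table(forecasts, outcomes, buckets=10):
    """Group forecasts into probability buckets; for each bucket, compare
    mean stated confidence against the observed hit rate."""
    table = {}
    for p, o in zip(forecasts, outcomes):
        b = min(int(p * buckets), buckets - 1)
        table.setdefault(b, []).append((p, o))
    return {
        b: (sum(p for p, _ in rows) / len(rows),   # mean stated confidence
            sum(o for _, o in rows) / len(rows))   # observed frequency
        for b, rows in table.items()
    }

# Toy record: ten "90%" calls, nine of which came true — well calibrated.
forecasts = [0.9] * 10
outcomes = [1, 1, 1, 1, 1, 1, 1, 1, 1, 0]
print(brier_score(forecasts, outcomes))
print(calibration_table(forecasts, outcomes))
```

The point of the table is the gap, per bucket, between the two numbers: a bullshitter's "90%" bucket hits far below 0.9, and the score makes the accountability loop from the earlier section explicit.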
Why the detector matters
The cost of under-detecting bullshit compounds. A single bad hire based on a polished résumé and an unverified narrative. A quarter spent on a strategy recommended by someone with no stake in the outcome. A product bet made on a VC trend piece that aged badly. Each miss feels small in isolation; accumulated over a career, they make up the majority of career-damaging mistakes.
A working detector doesn't require cynicism. Most people are well-intentioned most of the time. The detector is about noticing the structural signals — incentives, accountability, falsifiability — that distinguish substance from performance, regardless of intent.
See also
- Bikeshedding — the cognitive asymmetry where confident opinions on trivial matters crowd out serious analysis.
- Steve Jobs: Managers and Bozos — the managerial failure mode where confident-sounding process displaces real expertise.
- Good to Great — Collins on confronting brutal facts, the organizational analogue of personal bullshit detection.