An Inquiry

How to Think
Clearly

· · ·

Notes on the systems, traps, and habits of mind for those who would build a better commons together.

Preface
The Premise

A city is built twice. First in the minds of its founders, then in the world.

When a community gathers to design something new (new institutions, new norms, a new commons), the quality of what gets built is bounded by the quality of the thinking that produces it. Bad maps make bad cities. Self-deception scales. The earliest decisions ossify into the streets people will walk for decades. So we must take seriously the discipline of seeing things as they are.

These notes draw on Kahneman, Galef, Anna Riedl, and the broader rationalist tradition, with the assumption that the reader is among the builders.
First Principle

Thinking is a trainable skill.

Most people treat their reasoning the way they treat their handwriting: a thing they do, never quite examined, never deliberately practiced. Yet the operations of the mind are trainable. Calibration improves with feedback. Bias awareness compounds. The habit of asking "how would I know if I were wrong?" becomes second nature, given time.

Three commitments shape what follows:

  1. Know your machinery. Understand the two systems by which the mind reasons, and where each fails.
  2. Build the habit. Learn the small rituals (thought experiments, calibration, steel-manning) that surface what you would otherwise miss.
  3. Keep your sovereignty. Stay the author of your own thinking, especially in tight-knit groups where memetic gravity is strong.
Part I
Two Modes of Mind

On the architecture of thought, and the lazy intelligence of the brain at rest.

Part I · Two Modes of Mind
Kahneman

There are two systems running in your head.

Daniel Kahneman, drawing on decades of work with Amos Tversky, proposed a useful fiction: the mind has two modes of operation. They are not anatomical structures. They are a way of speaking about two kinds of cognition that share the same skull and rarely agree on who is in charge.

"System 1 operates automatically and quickly, with little or no effort, and no sense of voluntary control. System 2 allocates attention to the effortful mental activities that demand it."
Daniel Kahneman, Thinking, Fast and Slow

The Two Systems

Fast and slow.

System 1
The Intuitive Mind
  • Fast, automatic, effortless
  • Recognizes faces, drives familiar roads
  • Reads emotion in a voice
  • Speaks in feelings and impressions
  • Confident, even when wrong
System 2
The Deliberative Mind
  • Slow, effortful, attentive
  • Multiplies 17 × 24, parses contracts
  • Holds counterfactuals in mind
  • Speaks in propositions and probabilities
  • Tires quickly, costs glucose
The Inconvenient Truth

System 2 is lazy. It outsources almost everything.

The intuition arrives first. Fully formed, certain, narratively complete. By the time conscious reasoning shows up, its job has been quietly downgraded to press secretary, defending what System 1 already concluded.

It is a metabolic compromise. Deliberation is expensive, and most of life does not warrant it. The cost is that we generate post-hoc rationalizations and call them reasoning. Clear thinking begins with the willingness to interrupt this default.

A useful question to ask in a heated discussion: "Did I work that out, or did it arrive whole?"
Cognitive Ease

We mistake fluency for truth.

Information that is easy to process (familiar, repeated, rhymed, well-typeset) feels more true. The brain takes ease as evidence. This is why slogans persuade, why repeated lies become beliefs, and why a clearly typeset claim is judged more credible than the same claim handwritten.

In a young community, this matters: ideas that get repeated in casual conversation acquire the patina of consensus, regardless of their merit. The clear thinker learns to feel the pull of fluency and discount it.

"Familiarity is not easily distinguished from truth." Kahneman
Part II
A Field Guide to Bias

The systematic errors of System 1, catalogued so they may be recognized when they occur in the wild, most of all in oneself.

Part II · A Field Guide to Bias
Six Common Biases

The mind's predictable distortions.

Confirmation Bias
We seek and remember evidence that fits what we already believe, and quietly skip past the rest.
Availability Heuristic
We judge how common something is by how easily examples come to mind. Vivid beats common.
Anchoring
The first number heard sets the gravity well. Subsequent estimates orbit it, even when arbitrary.
Sunk Cost Fallacy
We continue investing in a doomed plan because of what we have already spent. The past should not vote.
Halo Effect
A single attractive trait, like charm or success in one domain, colours every judgment we make of a person.
Hindsight Bias
After an event, we remember having predicted it. The past becomes legible in a way it never was in advance.
Biases of the Group

Where many minds gather, new errors emerge.

Social Proof
If others believe it, we lean toward believing it too, independent of evidence. Useful in restaurants, dangerous in beliefs.
In-Group Favouritism
Arguments from inside the tribe receive lenient treatment. Identical arguments from outside are scrutinized.
Status Quo Bias
Whatever currently exists feels right. New proposals carry an unfair burden of proof.
Mimetic Desire
We learn what to want by watching others want it. In tight communities, the desire-graph collapses fast.
Groupthink
A cohesive group converges on a view because dissent feels disloyal, and silence is read as agreement.
Optimism Cascade
Founders surrounded by other founders are unusually optimistic. The prior calibration of the room is already wrong.
Try It

Which kills more people each year?

Trust your gut, then check. The gap between your first answer and the actual numbers is the availability heuristic at work.

"In a typical year, which causes more deaths worldwide?"
Shark attacks: about 5 to 10. Cow-related deaths: roughly 20 in the US alone. Sharks star in films; cows rarely do. The mind catalogues vividness much more readily than frequency.
Part III
Methods of Rationality

A handful of practices, neither ornate nor mystical, that reliably produce better thinking when applied as habits.

Part III · Methods of Rationality
Method I

Hold beliefs as probabilities.

A clear thinker does not say "I believe X" as one might salute a flag. They say "I am 70% confident in X" and watch how that confidence moves as evidence arrives. This is the only honest grammar for an uncertain world.

Try it. Where does your confidence sit on the claim below? Pick a number, somewhere between impossible and certain. Notice how strange it feels to commit to one.

"This community will still exist, in some recognizable form, ten years from today."

50% is a coin flip: you have no information either way.
Method II

Calibration is a trainable skill.

A well-calibrated forecaster, when they say "70% confident," is right about 70% of the time. Most people are not. We are systematically overconfident on things we feel sure of, and miscalibrated in characteristic ways.

The fix is mundane. Make predictions. Write them down with probabilities. Months later, score yourself. Notice that your 90%s come true 65% of the time. Adjust.
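The scoring step of this ritual can be sketched in a few lines. The prediction log below is invented for illustration, and the Brier score is one standard way (not the only one) to compress a log into a single number:

```python
from collections import defaultdict

# An invented prediction log: (stated probability, did it come true?).
predictions = [
    (0.9, True), (0.9, True), (0.9, False), (0.9, False),
    (0.7, True), (0.7, True), (0.7, False),
    (0.5, True), (0.5, False),
]

# Bucket by stated confidence; compare each bucket's hit rate to its label.
buckets = defaultdict(list)
for p, outcome in predictions:
    buckets[p].append(outcome)

for p in sorted(buckets):
    hit_rate = sum(buckets[p]) / len(buckets[p])
    print(f"said {p:.0%}: right {hit_rate:.0%} of the time "
          f"({len(buckets[p])} predictions)")

# The Brier score summarizes the whole log in one number (lower is better):
# the mean squared gap between stated probability and what happened.
brier = sum((p - o) ** 2 for p, o in predictions) / len(predictions)
print(f"Brier score: {brier:.3f}")
```

A forecaster who said 90% and was right half the time would see it immediately in the first bucket; that is the whole point of writing the log down.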

"A bet is any decision in which you stand to gain or lose something of value, based on the outcome."
Julia Galef

If you are unwilling to bet on it at the odds you claim to believe, you do not actually believe it at those odds.
Method III

Steelman before you strike.

A straw man is a weak version of the opposing argument that is easy to defeat. A steel man is the strongest version of it, sometimes stronger than what its proponents themselves articulate. Defeat that one, if you can.

The discipline has two effects. First, it dramatically improves the quality of disagreement. Second, and this is the part most people miss, it occasionally reveals that the steelmanned argument is correct, and yours was not.

Practice
"What is the strongest case for the position I disagree with most? Could a thoughtful, well-informed person hold it?"
If you cannot construct such a case, you do not yet understand the disagreement. Suspend judgment until you can.
Method IV

Conduct the pre-mortem.

Before the project begins, gather the team and assume the project has already failed catastrophically. Treat the failure as completed fact. Now write the autopsy. Why did it fail? What did we miss? Who saw it coming and was not heard?

This small reframing of failure as completed fact bypasses the optimism that surrounds every founding moment. It loosens the tongue of the cautious member who would otherwise stay polite.

A good pre-mortem produces at least one objection that genuinely surprises the founders. If everyone reaches the same comfortable failure modes, the exercise has not bitten.
Method V

Update by degrees.

When new evidence arrives, the right question is "how should this nudge my probability?" Strong evidence moves the needle a lot. Weak evidence moves it a little. No evidence leaves it exactly where it was.

Most disagreements that look like contradictions are actually just different priors confronting the same data. Two reasonable people, starting from different places, can update toward different conclusions and both be doing it correctly.
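Updating by degrees is, at bottom, Bayes' rule in odds form: posterior odds equal prior odds times the likelihood ratio of the evidence. A minimal sketch, with invented numbers:

```python
def update(prior: float, likelihood_ratio: float) -> float:
    """Return the posterior probability after evidence that is
    `likelihood_ratio` times as likely if the hypothesis is true
    than if it is false."""
    prior_odds = prior / (1 - prior)
    posterior_odds = prior_odds * likelihood_ratio
    return posterior_odds / (1 + posterior_odds)

p = 0.50              # start at a coin flip
p = update(p, 4.0)    # strong evidence moves the needle a lot (to 0.80)
p = update(p, 1.2)    # weak evidence moves it a little
print(f"{p:.2f}")
```

The same function also shows why two reasonable people can diverge: feed it the same likelihood ratios from different priors and the posteriors differ, with no one reasoning badly.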

"If you are not changing your mind, you are doing something wrong."
Julia Galef

Part IV
The Scout Mindset

A précis of Julia Galef's The Scout Mindset (Penguin, 2021), with attribution and respect.

Part IV · The Scout Mindset
Galef's Distinction

Two postures toward your own beliefs.

Galef's central image: there are two stances one can take toward a question. The soldier defends a position; the scout draws a map. The soldier asks "can I believe this?" or "must I believe this?" The scout asks "is this true?"

The Soldier

Defends. Attacks. Concedes ground. Treats reasoning as combat in service of a side already chosen.

The Scout

Surveys the terrain. Maps it accurately. Reports back, even when the news is unwelcome.

Julia Galef, The Scout Mindset: Why Some People See Things Clearly and Others Don't (2021). Outline by Rob Bensinger, LessWrong.
Galef · Chapter 2

Why the soldier is the default.

Soldier mindset is protection. It guards things that genuinely matter to us. Galef organizes the protections into two halves:

Three Emotional

For the self

Comfort. Avoid unpleasant feelings.
Self-esteem. Feel good about who you are.
Morale. Stay motivated for hard things.

Three Social

For the group

Persuasion. Convince yourself to convince others.
Image. Believe what makes you look good.
Belonging. Hold the views your tribe holds.

Soldier mindset is rational, locally. It often serves you. The trouble is that we deploy it constantly, even when it does not.
Galef · Chapter 4

The signs of a scout are behavioural.

Almost everyone, asked, will report being open-minded. Galef's test is sharper: do not ask what someone believes about themselves. Ask what they have done.

  1. Do you tell people when you realize they were right? In the moment, plainly, without delay or face-saving qualifiers.
  2. How do you receive criticism? Can you point to feedback you acted on? To a critic you promoted?
  3. Do you ever prove yourself wrong? Have you, in the last month, sought evidence against a view you hold?
  4. Do you have good critics? Can you name people who disagree with you whom you nonetheless respect?
  5. Can you name times you were in soldier mindset? If you cannot, you are still in it.
Galef · Chapter 5

Five thought experiments to catch yourself.

Galef's central tool. You cannot detect motivated reasoning by inspecting your conclusion. It will look reasonable from the inside. You must compare your reasoning to the counterfactual world in which your motivations were different.

i.
The Double Standard
Am I judging this person by a standard I would not apply to one I liked?
ii.
The Outsider
If I had just stepped into this situation today, with no history, what would I do?
iii.
The Conformity
If no one I respected held this view, would I still hold it?
iv.
The Selective Skeptic
If this evidence supported the other side, how credible would I find it?
v.
The Status Quo
If my current arrangement weren't the default, would I actively choose it?
Galef · Chapter 6

How sure are you, really?

Galef borrows an analogy from the evolutionary psychologist Robert Kurzban. Inside every person there is a board of directors, which makes the actual decisions, and a press secretary, which makes the public statements. The press secretary makes claims. The board makes bets.

When you say "I'm sure of it," you are speaking as press secretary. Ask the board: would you stake $100 on this? $10,000? At what odds? The bet reveals what you actually believe.
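The board's bet can be turned into a number: the stakes at which you are just barely willing to bet imply a probability. A small sketch, with illustrative stakes:

```python
def implied_probability(win: float, lose: float) -> float:
    """If you would accept risking `lose` for a chance to win `win`,
    you believe the claim with probability at least lose / (win + lose),
    the break-even point where the bet's expected value is zero."""
    return lose / (win + lose)

# Even money: risking $100 to win $100 implies only 50% confidence.
print(implied_probability(win=100, lose=100))

# Risking $100 to win $10 implies roughly 91% confidence. If "I'm sure
# of it" flinches at these odds, the press secretary was exaggerating.
print(implied_probability(win=10, lose=100))
```

The derivation is one line: the bet breaks even when p × win − (1 − p) × lose = 0, which solves to p = lose / (win + lose).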

"The core skill is being able to tell the difference between the feeling of making a claim and the feeling of actually trying to guess what's true."
Julia Galef

Galef · Chapter 11

Lean into confusion.

When an observation conflicts with your model, the soldier explains it away. The scout writes it down. Anomalies, accumulated, are how paradigms shift, but only if you stop dismissing them.

Galef's recommendation is uncomfortable: be willing to stay confused. Resist the urge to resolve a puzzle prematurely. Keep a list of things that don't quite fit your worldview. Some will turn out to be noise. Some will turn out to be the most important data you possessed.

"All too often, we assume the only two possibilities are 'I'm right' or 'The other guy is right.' But there is often a hidden option C."
Galef · Chapters 13–14

Hold your identity lightly.

A belief becomes an identity when criticism of it feels like an attack on you. Identities make some thoughts much harder to think. They make some updates impossible. They are the deepest layer of soldier mindset, and the hardest to notice because they no longer feel like beliefs at all. They feel like who you are.

The remedy is to hold your identity lightly. As description. As current best guess. As a working position you would update if the evidence asked you to.

Diagnostic
Could you pass an ideological Turing test for the position you most disagree with?
Could you write its strongest case so well that an actual believer could not tell you from one of their own? If not, you do not yet understand it. Until then, your disagreement is unearned.
Part V
Thought Experiments
for Builders

Specific exercises for those engaged in the design of new institutions, communities, and the early architecture of a city in formation.

Part V · For Builders
Rawls' Test

Design from behind the veil.

Imagine you must design the rules of this place (its membership policies, its governance, its norms around conflict) without yet knowing who you will be inside it. You may turn out to be the founder, the newcomer, the long-tenured resident, the dissenter, the child, the elder. You will not know your role until after the rules are set.

Now design.

Apply It
"Would I sign off on this rule if I did not yet know which side of it I would land on?"
Cuts through the natural tendency of any founding group to design institutions that flatter the founders' own position.
The Civic Pre-Mortem

It is five years from now. The project has failed.

The community we are building has dissolved, fractured, or quietly become irrelevant. The most thoughtful early members have drifted away. A retrospective is being written. What does it say?

  1. The founder's circle never let go. Decisions remained legibly the property of three or four people; everyone else became spectators.
  2. Dissent was filtered. Polite disagreement was tolerated. Structural critique was discouraged, and the structural critics left first.
  3. The vibes calcified. An aesthetic became a litmus test. New people self-selected to match. Cognitive diversity collapsed.
  4. We mistook enthusiasm for evidence. The bar for new initiatives became charisma. Energy ran out faster than judgment improved.
Useful only if it produces objections you have not yet heard the founders say out loud.
The Newcomer Test

Would a newcomer recognize what we say we are?

A community has two descriptions: the one in its founding documents, and the one a thoughtful newcomer would write after a week of attendance. The first describes aspirations. The second describes practices. The gap between them is the most interesting thing about the community.

The discipline is to keep the gap small, and to know which document is more accurate when they disagree.

Practice
"If a stranger spent ten days here and wrote an honest field report, what would surprise me about it?"
If nothing would surprise you, either you are unusually self-aware or, more likely, you have stopped seeing the place clearly.
The Reverse Test

If you weren't already here, would you choose to come?

Every member of an established community quietly accumulates reasons not to leave: friendships, sunk effort, pride. These reasons are real. They are also not the same as reasons to join. Confusing one for the other is how communities calcify long after their best version has departed.

Once a year, perform the reverse test on yourself. Imagine you are an outsider with full information about who is here, what they are building, and how it has gone. Would you, today, choose to come?

"If your current arrangement weren't the status quo, would you actively choose it?"
Galef, the Status Quo Bias Test

Part VI
Cognitive Sovereignty

An introduction, drawing on the work of Anna Riedl, to the idea that one's capacity to think is itself a thing to be tended, defended, and chosen deliberately.

Part VI · Cognitive Sovereignty
Riedl · The Concept

The right and the practice of authoring your own thought.

Cognitive sovereignty, in Anna Riedl's formulation, is the capacity to remain the author of one's own thinking under the gravitational pull of social proof, algorithmic curation, and the deep pleasures of belonging. It is the precondition for being a useful member of any community.

A sovereign mind contributes more to a collective; a dependent mind only echoes it. Distinguishing the two requires effort.

A community of dependent minds reaches consensus quickly and is wrong slowly. A community of sovereign minds reaches consensus slowly and is wrong less often.

Drawing on the introductory work of Anna Riedl on cognitive sovereignty.
Riedl · The Information Diet

Your inputs are not neutral.

The mind is shaped by what it consumes, in proportions that surprise people who imagine themselves immune. The feeds you scroll, the conversations you walk past, the small frictions and rewards built into your tools. All of them are training you, whether you know it or not.

Sovereignty begins with deliberateness about inputs. The discipline is to give a clear answer to the question, "why is this thing in front of me right now, and is it serving the mind I want to have?"

  1. Audit weekly. What did you read, watch, and overhear? What of it would you choose again?
  2. Curate, don't consume. Subscribe to people. Choose your sources by hand.
  3. Leave room for silence. Original thought rarely arrives during scrolling.
Riedl · The Group

Tight communities have strong gravity.

In a high-bandwidth, high-trust group, beliefs spread the way dialect spreads, by exposure, repetition, and the soft pleasure of fitting in. This is mostly fine. It is also how communities of clever people sometimes converge on conclusions no individual member would have reached alone.

Contrarianism is no escape. Defining oneself against the group is still letting the group do the defining. The discipline is the practice of thinking through: deriving views from premises you have actually examined, holding them at the confidence the evidence warrants, and being willing to disagree without making a performance of it.

A useful question, asked privately and often: "Would I still believe this if I were not surrounded by people who believe it?"
Riedl · The Practice

Build your views from the ground up, at least once.

Take a belief that matters to you. Trace it back. Why do you hold it? Because you read it, because someone you respect holds it, because everyone around you holds it, or because you have actually worked through the argument and the evidence?

You will not have time to do this for every belief. You should not. But for the small set of beliefs on which your most important decisions ride, doing it once is the difference between a sovereign mind and a borrowed one.

"Hold few opinions, and hold those few from your own thinking."
A working paraphrase

Coda
In Summary

Clear thinking is a habit, and habits compound.

No one becomes a clear thinker by reading a deck. The transformations described here are slow, granular, and unglamorous. A small calibration exercise once a week. A pre-mortem before the next decision. The willingness, once a month, to ask whether you would still hold this view if no one around you did.

The reward, accumulated over years, is the rarest thing in any community: a person whose word carries weight because it has been earned. A friend whose disagreement is welcome because it is honest. A mind worth the company of other minds.

· · ·
Finis

See clearly.
Build well.

· · ·

Credit to Daniel Kahneman, Julia Galef, Anna Riedl, Rob Bensinger's outline on LessWrong, and the broader rationalist tradition. Written for those building something together.
