
How to Keep a Decision Journal (And Why Spreadsheets Don't Cut It)

I started tracking my decisions in a Google Sheet three years ago. It lasted about six weeks. Here is what I learned about why most decision journals fail, and what actually works.

What Is a Decision Journal?

A decision journal is exactly what it sounds like: a record of the important choices you make, captured at the moment you make them. Not after the outcome. Not when you are trying to remember what you were thinking. Right then, while your reasoning is fresh.

The idea has been floating around investment circles for decades, but it got a serious academic boost from Daniel Kahneman's work on noise in decision-making. His research showed that people are shockingly inconsistent in their own judgments. Present the same case to the same person on two different days, and they will often reach different conclusions. A decision journal forces you to confront that inconsistency because you have a written record you cannot retroactively edit in your memory.

Why Bother? Because Hindsight Is a Liar

Here is the core problem a decision journal solves: once you know the outcome, you unconsciously rewrite the story of how you got there. Psychologists call this hindsight bias. When your investment doubles, you remember being confident. When the hire does not work out, you remember having doubts. Neither memory is accurate.

A journal pins your actual state of mind to a specific moment. Three months later, when the outcome is known, you can compare what you believed against what happened. That comparison is where all the learning lives.

A Simple Template That Works

You do not need anything complicated. Five fields are enough to capture the information that matters:

1. Date and decision. What are you deciding, and when? Be specific. Not "hiring decision" but "hiring Maria as senior engineer for the payments team."
2. Your confidence level. How likely do you think this will succeed? Put an actual number on it. 70%? 90%? This is the most important field and the one people skip most often.
3. Your reasoning. Two to three sentences on why you are making this call. What evidence are you weighing? What alternatives did you consider?
4. What would change your mind. This one is powerful. Write down the specific signal that would tell you this decision was wrong. If you cannot name one, that is a red flag.
5. Outcome (filled in later). When the result is known, come back and record what happened. Did it match your confidence level?

That is it. Five fields. The whole entry should take less than two minutes.
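If you keep your journal in anything programmable, the five fields above map naturally onto a small record type. This is a hypothetical sketch, not part of any particular tool; the class and field names are my own:

```python
from dataclasses import dataclass
from datetime import date
from typing import Optional

@dataclass
class DecisionEntry:
    decided_on: date                # 1. when the call was made
    decision: str                   # 1. what, specifically, is being decided
    confidence: float               # 2. probability of success, 0.0 to 1.0
    reasoning: str                  # 3. two or three sentences on why
    would_change_mind: str          # 4. the signal that would prove this wrong
    outcome: Optional[bool] = None  # 5. filled in later, once the result is known

# Example entry, using the hiring example from the template above
entry = DecisionEntry(
    decided_on=date(2024, 3, 1),
    decision="Hire Maria as senior engineer for the payments team",
    confidence=0.75,
    reasoning="Strongest systems-design interview of the batch; strong references.",
    would_change_mind="She misses two onboarding milestones in the first quarter.",
)
```

Leaving `outcome` unset at creation time mirrors the template: everything except field 5 is captured at the moment of decision.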

The Three Mistakes Everyone Makes

I have watched dozens of people try to start decision journals. The failure modes are remarkably consistent.

Mistake 1: Skipping the confidence level. People write down what they decided and why, but they do not assign a probability. Without a number, you cannot measure calibration. And calibration is the entire point. Saying "I think this will work" is not a prediction. Saying "I am 75% confident this will work" is.

Mistake 2: Never reviewing. A decision journal you write but never read back is just a diary. The value comes from the comparison between your predictions and reality. Set a monthly reminder to review past entries and update outcomes.

Mistake 3: Only logging big decisions. The best practice ground is frequent, smaller decisions where you will know the outcome within weeks. Will this feature ship on time? Will the candidate accept our offer? Will this quarter's revenue hit target? High volume and fast feedback loops are how you actually improve.

Why Spreadsheets Eventually Fail

I am not against spreadsheets. I love spreadsheets. But for decision journaling specifically, they have three structural problems.

First, there is no built-in feedback mechanism. You log predictions in one column and outcomes in another, but the spreadsheet does not tell you whether your 80% calls are actually happening 80% of the time. You would need to build calibration charts yourself, and nobody does that.
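For the curious, the calibration check itself is only a few lines: bucket your predictions by stated confidence, then compare each bucket's confidence to its observed success rate. A minimal sketch (the function name and toy data are my own):

```python
from collections import defaultdict

def calibration_table(predictions):
    """Group (confidence, succeeded) pairs into 10% buckets and report
    the observed success rate for each stated confidence level."""
    buckets = defaultdict(list)
    for confidence, succeeded in predictions:
        buckets[round(confidence, 1)].append(succeeded)
    return {
        conf: sum(results) / len(results)
        for conf, results in sorted(buckets.items())
    }

# Toy history: four calls made at 80% confidence (three succeeded),
# two calls made at 60% confidence (one succeeded).
history = [(0.8, True), (0.8, True), (0.8, True), (0.8, False),
           (0.6, True), (0.6, False)]
print(calibration_table(history))  # {0.6: 0.5, 0.8: 0.75}
```

In this toy history, the 80% calls are landing at 75%, which is roughly calibrated; with real data you would want far more entries per bucket before drawing conclusions.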

Second, spreadsheets are passive. They do not remind you to record outcomes. They do not nudge you when a decision is due for review. The result is a trail of incomplete entries where you logged the prediction but forgot to close the loop.

Third, there is no pattern detection. After 50 or 100 logged decisions, the interesting question is not how any single prediction turned out. It is whether you have systematic biases. Are you consistently overconfident in hiring decisions? Do you underestimate technical timelines? A spreadsheet holds the data, but it does not surface those patterns.
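The pattern-detection step is the same idea one level up: compute your error within each decision category and compare across categories. A hypothetical sketch, using squared error between confidence and outcome (category names and data are invented for illustration):

```python
from collections import defaultdict

def error_by_category(entries):
    """Average squared error of (confidence - outcome) per category,
    where outcome is 1 if the decision worked out and 0 if it did not.
    Higher numbers mean worse-calibrated predictions in that category."""
    groups = defaultdict(list)
    for category, confidence, outcome in entries:
        groups[category].append((confidence - outcome) ** 2)
    return {cat: sum(errs) / len(errs) for cat, errs in groups.items()}

history = [
    ("hiring",   0.9, 0), ("hiring",   0.8, 1),
    ("timeline", 0.7, 1), ("timeline", 0.6, 0),
]
scores = error_by_category(history)
# hiring: (0.81 + 0.04) / 2 = 0.425; timeline: (0.09 + 0.36) / 2 = 0.225
```

Here the hiring calls score markedly worse than the timeline calls, which is exactly the kind of systematic bias a flat spreadsheet holds but never surfaces.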

What Automated Decision Tracking Gets You

This is the gap that purpose-built tools fill. When your decision journal understands what it is storing, it can do things a spreadsheet cannot: calculate your Brier score automatically, generate calibration curves that show you exactly where your confidence levels are off, flag decisions that are due for outcome review, and show you trend lines so you can see if you are actually improving over months.
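The Brier score mentioned above is just the mean squared difference between your stated probabilities and the actual outcomes, so it is easy to sketch (the data here is invented for illustration):

```python
def brier_score(predictions):
    """Mean squared error between stated probability and outcome
    (1 if the decision worked out, 0 if it did not). Lower is better:
    a perfect forecaster scores 0, and always guessing 50% scores 0.25."""
    return sum((p - o) ** 2 for p, o in predictions) / len(predictions)

history = [(0.9, 1), (0.7, 1), (0.8, 0), (0.6, 1)]
score = brier_score(history)
# (0.01 + 0.09 + 0.64 + 0.16) / 4 = 0.225
```

A score of 0.225 is barely better than coin-flipping at 50%, driven here by the one confident 80% call that failed; that is the kind of honest feedback the tools compute for you automatically.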

The difference is like tracking a workout in a notebook versus using a fitness app. The notebook works, technically. But the app shows you progress, spots plateaus, and keeps you honest. For something as important as the quality of your judgment, the feedback loop matters.

How Calibrated Are You Right Now?

Before you start a decision journal, get a baseline. Our free Calibration Challenge takes 2 minutes and measures your prediction accuracy across 10 questions. No account needed.