Take your ideas more seriously
I often think of myself as having an “activation energy” for different tasks: I need to inject some initial energy into an intention in order to overcome inertia and act on it. But I think this model is oversimplified. Dynomight had a good post on this recently that offered something slightly better: rather than an energy barrier, I have an accountant in my head named Jim, who dislikes certain ideas for specific reasons:
- Idea is unpleasant compared to alternatives
- Doubt that the idea will work or have a good outcome
- Idea isn’t important
- Timeline is too long
This last reason is especially weird. Why does Jim discount the future so hard? The canonical answer is that the future is hard to predict, but that isn’t always true. Maybe Jim just doesn’t understand that, but I think a better answer is that the last two bullet points should really be combined into one: “the idea doesn’t feel important,” where a key aspect of feeling important is feeling real. Jim’s steep time discount is actually a heuristic that says not to take seriously anything that doesn’t feel sufficiently real, and the far future often doesn’t feel real no matter how confident you are in your predictions.
The best illustration of ideas feeling real is the classic bowling-ball-pendulum physics demonstration. Any high school physics student can tell you that the ball can’t swing back to a position higher than the one it was released from without getting extra energy from somewhere, but this fact feels entirely unreal when the ball is heading for your face. You might manage not to move out of the way, but it’s very probably not because of some dispassionate, rational certainty about the behavior of the ball. The ball heading for your face (and, especially, your feelings about that situation) is far more real than any energy-conservation argument.
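For what it’s worth, the student’s argument fits in one line. This is just the standard conservation-of-energy sketch, with notation of my own choosing: writing $h_0$ for the release height (where the ball starts at rest),

$$\tfrac{1}{2}mv^2 + mgh = mgh_0, \qquad \tfrac{1}{2}mv^2 \ge 0 \;\Rightarrow\; h \le h_0,$$

and friction and air resistance only lower the peak further. None of which helps at all when the ball is an inch from your nose.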
A big part of this realness, I think, is availability: how easy it is to imagine an outcome. It’s so easy to imagine the ball swinging up just a little farther and hitting your nose that it doesn’t seem impossible at all, even though that’s exactly what it is. And to some extent that’s a good thing: the heuristic works most of the time, and you probably wouldn’t be here if your ancestors hadn’t evolved instincts that stopped heavy objects from hitting them in the face. And yet:
- For years, I subscribed to a free email service called “Scott’s Cheap Flights,” which sends you email alerts about good deals on flights from your home airport to cool vacation spots. Every now and then, I’d get an email telling me that I could travel to, say, Madrid for $200. I was in the fortunate position of having both the time and the money to do this, but in reality I never even came close to buying the tickets. Why? I think it’s because it didn’t feel real to me. I didn’t think of myself as the sort of person who did things like that, and so it was really hard to imagine myself actually dropping everything and going to Madrid.
- Many of the arguments from my post on ethical agency are relevant here. It’s often easy to know, intellectually, what the right thing to do is, but hard to actually do it unless you already feel like it. Another way to frame this is that your knowledge of the right thing to do doesn’t feel real to you.
- Most people struggle to ask other people out because they’re afraid of getting rejected. This is way illogical: if you suppose there’s a universal total ordering over dating preferences (note: I am aware that this is not how it works), then dating is a stable matching problem, and the best strategy is to ask people out in order of preference until one of them says yes (see the sketch after this list). This maximizes your rejections, in the sense that by the time you settle down, everyone you would have preferred to your eventual partner has already rejected you. Avoiding rejection is therefore the exact opposite of what you want to do. This is fairly obvious, and I have a very hard time imagining that I’m the first person to come up with it. There’s also another point, which is that getting shot down doesn’t actually feel very bad and can in fact be sort of fun. Despite all this, almost no one rejection-maxxes in real life.
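Here’s a minimal sketch of the rejection-maxxing strategy. Everything in it is hypothetical for illustration, including the names and the `says_yes` oracle, and it leans on the unrealistic total-ordering assumption above:

```python
def rejection_maxx(preferences, says_yes):
    """Ask candidates in strict preference order until one accepts.

    preferences: candidates, best first.
    says_yes: predicate for whether a candidate would accept.
    Returns (match, rejections). By construction, everyone you'd
    prefer to your match has already rejected you.
    """
    rejections = []
    for person in preferences:
        if says_yes(person):
            return person, rejections
        rejections.append(person)
    return None, rejections  # everyone said no

# Hypothetical example: only Casey and Drew would say yes.
accepts = {"Casey", "Drew"}
match, rejected = rejection_maxx(
    ["Alex", "Blake", "Casey", "Drew"],  # best first
    lambda person: person in accepts,
)
print(match)     # Casey: the best match actually available
print(rejected)  # ['Alex', 'Blake']: everyone preferable already said no
```

The point the code makes is that the rejections aren’t a cost of the strategy; they’re a certificate that no better match existed.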
In other words, I think most people systematically don’t take their ideas seriously enough, and that this leads them both to miss out on positive value and to generate negative value. That said, my experience is that people are spread across an extraordinarily broad, multidimensional space in how seriously they take their own ideas. A few of the dimensions:
- Abstract vs concrete - You should take abstract ideas about as seriously as concrete ones, after adjusting for confidence
- High vs low confidence - Penalizing low confidence seems reasonable, but remember to penalize consistently (i.e., you shouldn’t take low-confidence negative predictions very seriously either)
- Easy vs hard to visualize - You should take these equally seriously unless you’re intentionally using availability as a low-confidence heuristic
- Type of risk (physical, social, etc.) - Mostly you should just make sure you’re calibrated on these. Social risk in particular seems to get inappropriate weight (in both directions).
- Positive vs negative - You should take positive and negative forecasts about equally seriously
I’m a bit surprised to find myself advocating for taking your ideas seriously, because I sometimes hear “not taking your thoughts so seriously” cited as good mindfulness practice. I think there’s something to taking ideas seriously, but not necessarily thoughts in general and not necessarily all the time. The goal here is to be well-calibrated, and calibration failures can happen in either direction: for example, OCD sometimes manifests as frequent intrusive thoughts about unlikely disasters, accompanied by a feeling that you have to keep thinking about them or else they will actually happen. In general, when you hear something like “x is bad,” I think it can be useful to ask yourself: “does this imply that the optimal amount of x is zero? What is the optimal amount of x?” Often the heuristic that x is bad hides a lot of utility we can derive from x in moderation.
I’m not sure whether there are any good practices for taking your ideas more seriously, but a good place to start, as with most cognitive biases, is probably just being aware of the mistake.