A better way to combat knee-jerk biases and make smarter decisions, from Julia Galef, cofounder and president of the Center for Applied Rationality and host of the "Rationally Speaking" podcast. These biases have evolved to help us forget or ignore our painful mistakes, while fueling our irrational instincts. But what if we could train our minds to make more rational decisions, without any blow to our confidence? Julia Galef's insight is that most of us naturally have a "soldier" mindset: we protect our beliefs aggressively and ignore any evidence that we might be wrong. This happens when you read a headline suggesting an idea you support isn't as great as it's cracked up to be, and you immediately find flaws in the article. Your mind decides what you want to be true, so you concoct a justification for why, logically, that idea makes the most sense. Galef explains that to be more right more often, we need to approach ideas less like a soldier and more like a scout. A scout surveys the land, seeking accuracy and understanding, gathering all available information – good and bad – to gain a more holistic picture. While the soldier and the scout are both essential to an actual army, a scout mindset will benefit most of us more in decision-making. With fascinating stories ranging from Warren Buffett's investing strategies to subreddit threads and modern partisan politics, Galef explores why our brains deceive us and what we can do to change the way we think.

There is a kind of smug self-satisfaction to books like this that invariably makes me feel uncomfortable, regardless of how useful parts of them might prove to be. I used to think of political beliefs as existing on a continuum running from left to right. Then I thought of this continuum as being more like a circle, where the far left and far right end up virtually touching – something the current pandemic has made particularly clear to me as I've watched some Marxists start to sound much more like Q-Anon supporters. But I've started thinking that perhaps the geometric figure that most accurately describes ideas is the triangle. Aristotle liked to talk of the 'golden mean' – his rational position between two extremes – and this book certainly plays that idea for all it is worth. But actually, I sometimes feel that the centre can be just as extreme as any of the 'ends'. Surely we have all met one of these people in our travels: the sort who says things like 'correlation doesn't equal causation' or 'it seems like you are projecting' or 'can we stick to the facts and leave the ad hominem attacks at the door'. And don't get me wrong – I've been that person too.

I was struggling with this book anyway, but became completely put off at the end when she started spruiking her group – the 'effective altruists'. At one point the author says that all she is really doing is making the tools discussed in all those books popular in the early 2000s – books like Predictably Irrational; Thinking, Fast and Slow; and Mistakes Were Made (But Not by Me) – more accessible to people. But again, the problem isn't always what is said, but rather the unbearable smugness with which it is said. Look, I don't believe in God, and I feel uncomfortable around people who say very silly things about science – but you will definitely know something is deeply wrong if I ever join an evangelical atheist rationalist society, or if I start wearing a 'Science Rocks, but you are just too stupid to know… fart face' t-shirt. The author criticises people like this too – well, in part – although she has nice things to say about Richard Dawkins. All the same, it is hard to take seriously an organisation that spends its time praising itself for how wonderfully self-critical it can be.

She ends the book with a list of 'Scout Habits' for the reader to practice:

1. The next time you're making a decision, ask yourself what kind of bias could be affecting your judgment in that situation, and then do the relevant thought experiment (e.g., outsider test, conformity test, status quo bias test).
2. When you notice yourself making a claim with certainty ("There's no way…"), ask yourself how sure you really are.
3. The next time a worry pops into your head and you're tempted to rationalize it away, instead make a concrete plan for how you would deal with it if it came true.
4. Find an author, media outlet, or other opinion source who holds different views from you, but who has a better-than-average shot at changing your mind – someone you find reasonable or with whom you share some common ground.