emispaghetti's review against another edition
4.0
4 stars for actual quality of writing and content, 1 star for personal enjoyment… this was like a giant ball of anxiety for someone living in the U.S.
bjorng's review
4.0
A good, pointed discussion about the problems caused by cognitive dissonance. The only surprise about it is how pervasive the problems are and how little attention they get from this perspective.
Even though it contains a lot of depressing facts about people, and doesn't offer an easy answer for how to get out of the dilemma it describes, I enjoyed this book. It has a surprisingly light tone given the material, and it is very accessible.
Update: Having incorporated it into my thought processes for a few weeks now, I feel like many of my opinions are now informed by the dissonance/self-justification processes described in the book. It changes my perspective on people and events.
sumatra_squall's review
5.0
Mistakes Were Made (But Not by Me) explains why people are able to do terrible things that to the rest of us might seem downright unethical and unconscionable. How do these people live with themselves? Do they have no morals or integrity?
Tavris and Aronson argue that individuals who make mistakes don't see themselves as lacking scruples or integrity. Instead, they might engage in all kinds of twisted logic to justify themselves and minimise cognitive dissonance. Tavris and Aronson define cognitive dissonance as "a state of tension that occurs whenever a person holds two cognitions (ideas, attitudes, beliefs, opinions) that are psychologically inconsistent". Cognitive dissonance is deeply uncomfortable, so our instinct is to convince ourselves that there is no dissonance, that we didn't do anything bad and can therefore continue to see ourselves as competent and fundamentally good.
Our tendency to minimise cognitive dissonance means we only see and accept information that is consonant with our beliefs and ignore or dismiss information that contradicts our views. And so over time, we become more and more entrenched in our own views and cannot even begin to see why others might hold a different perspective (they must be stupid or crazy or both).
This can lead to laughable consequences - like doomsday groups whose predictions of the end of the world have fallen flat proclaiming that the world was in fact saved by their faith (we weren't wrong! We were not foolish! In fact, you should thank us for saving the earth!). But it can also lead to deeply disturbing outcomes - think of the divisive political landscape we see today. Or think about bullying - bullies don't see themselves as nasty, cruel sadists who make others miserable for kicks; no, those being bullied have asked for it in some way and deserve to be treated badly. Or wrongful convictions that police and prosecutors refuse to revisit because they cannot fathom that they could have made a mistake.
Looking at issues through the lens of cognitive dissonance is illuminating. It made so much sense. Like when Tavris and Aronson observe that it's "the people who almost decide to live in glass houses who throw the first stones"; that when people have made a choice either way (e.g. whether to cheat or not to cheat), they internalise their beliefs and convince themselves that this is the only possible way, that they always felt this way and there was no dilemma or ambiguity to navigate. "It is as if they had started off at the top of a pyramid, a millimeter apart; but by the time they have finished justifying their individual actions, they have slid to the bottom and now stand at opposite corners of the base."
Like when Tavris and Aronson note that "standing at the top of the pyramid, we are [often] faced not with a black-and-white, go/no-go decision, but with a gray choice whose consequences are shrouded. The first steps along the path are morally ambiguous, and the right decision is not always clear. We make an early, apparently inconsequential decision, and then we justify it to reduce the ambiguity of the choice. This starts a process of entrapment - action, justification, further action - that increases our intensity and commitment, and may end up taking us far from our original intentions or principles." People don't start out wanting to be corrupt. They start out accepting small tokens of appreciation, then small favours, and then one day they've somehow managed to justify to themselves why it's ok to accept an all-expenses-paid luxury vacation.
And how it explains why it is so difficult to overcome blind spots, biases and prejudices - we would rather "put a lot of energy into preserving [our] prejudice rather than having to change it". We actively seek confirming evidence or reasons to justify our prejudice and dismiss information that challenges our prejudices.
The urge to minimise cognitive dissonance also means we cannot fully trust our memories. We may create "self-serving memory distortions" that fit the stories we want to tell ourselves. These false memories aren't quite the same as lying; we might genuinely believe that the narrative we've spun is authentic, and we create memories that support this narrative. Tavris and Aronson suggest that accounts of alien abduction are really false memories that the tellers genuinely believe and have invested in, after deciding that abduction best fits the experience they went through.
So what does this mean for us? One, if we're looking for advice on a particular issue, don't approach someone who is already invested in a particular option (so don't ask for car advice from someone who has just splashed out on a car). Two, be careful about accepting incentives for doing certain things, because no matter how small these incentives are (think pharma companies offering doctors pens and notepads), they skew our judgement by evoking an implicit desire to reciprocate. Three, avoid getting caught in a "closed loop" where we set ourselves up for a self-fulfilling prophecy; instead, create more opportunities to illuminate blind spots. During investigations, this might mean not jumping to conclusions and zooming in on only one suspect while ignoring other possibilities. More generally, it might mean finding a few trusted naysayers in our lives, "critics who are willing to puncture our protective bubble of self-justifications and yank us back to reality if we veer too far off". Four, develop the habit of looking at our actions dispassionately and critically, the better to catch ourselves sliding into self-justification and then more committed action.
Mistakes Were Made centres on one core idea - that our efforts to minimise cognitive dissonance create a whole range of big problems. But it never feels like it is belabouring the point. Rather, each chapter sheds light on how self-justification to minimise cognitive dissonance can account for a different problem we see - blind spots and biases, flawed memories, deepening rifts and spiralling conflict, to name a few. A fascinating and illuminating read.
cari1268's review against another edition
4.0
I love the subject of cognitive dissonance and would definitely read more books on the same topic. Mistakes Were Made certainly made the human race (me included!) look stupid. This book made me want to justify myself less and try to be more objective.
The self-help portions of this book were light, almost non-existent. I wish there were more hand-holding from the authors about how to avoid the pitfalls of cognitive dissonance. I also HATED the chapter on Trump. As someone who avoids politics, I felt blindsided by the politics in that chapter. There was not enough relating the information back to cognitive dissonance. It felt like the authors needed to vent, and I wish they had just stuck to Twitter/X.
4 Stars.
pinkash3's review against another edition
3.0
This was an interesting read. I liked the information presented, and it made me think about myself. But after a certain point, things became redundant; in fact, I only skimmed the last chapter, looking for new information.
cmjustice's review against another edition
5.0
Brilliant exploration, fascinating, provocative and illuminating. Recommended for all skeptics.