yaltidoka24's review against another edition
3.0
Blinks:
1. Rationality is a means to an end or the ability to use knowledge to attain goals.
2. Rationality helps you decide between passions.
3. Ignorance and self-constraint can be rational choices.
4. Science applies rationality to the real world.
5. Institutions make us less partial - and more rational.
6. Punishing people for their own good creates a more rational commons.
7. Our most important moral idea is compelling because it's rational.
pigeon_brisk's review against another edition
Pinker's loyalty to rationality irks me in a way that's hard to put my finger on. Don't get me wrong, it's more than interesting, even FASCINATING at times. But his political analyses and explanations in particular feel off or misleading.
_annabel's review against another edition
3.0
It was ok. I think I was hoping for more. It was basically a wordy first-year probability and stats textbook, so it went over a lot of probability theorems and statistical interpretations. The last two chapters were more interesting. I think he could have made his point better without all the introductory maths, and he should have spent longer on the final conclusions.
brokensandals's review against another edition
4.0
At first I thought this was going to be just another book on fallacies and cognitive biases, but the inclusion of chapters that go through the axioms of rational choice theory and explain statistical decision theory makes it a bit unusual in the genre. So, bonus points for that. One thing I’m not sure I’d realized before is how making decisions by “process of elimination” can cause you to violate transitivity—that is, it can lead you to choose A over B, B over C, and C over A, implying you’ve failed to establish a coherent set of underlying preferences (see pages 186-187).
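To make that transitivity failure concrete, here is a minimal Python sketch (my own, not Pinker's example from those pages; the attributes, numbers, and "negligible difference" threshold are all invented) of a compare-the-most-important-attribute-first rule producing a preference cycle:

```python
# Hypothetical candidates scored on two attributes (numbers invented for illustration).
CANDIDATES = {
    "A": {"test_score": 120, "experience": 1},
    "B": {"test_score": 110, "experience": 2},
    "C": {"test_score": 100, "experience": 3},
}

NEGLIGIBLE = 10  # score gaps this small are treated as "about the same"


def prefer(x: str, y: str) -> str:
    """Pick the winner: use test_score unless the gap is negligible, then fall back to experience."""
    a, b = CANDIDATES[x], CANDIDATES[y]
    if abs(a["test_score"] - b["test_score"]) > NEGLIGIBLE:
        return x if a["test_score"] > b["test_score"] else y
    return x if a["experience"] > b["experience"] else y


for pair in [("A", "B"), ("B", "C"), ("A", "C")]:
    print(f"{pair[0]} vs {pair[1]}: prefer {prefer(*pair)}")
# Prints B, then C, then A: B beats A, C beats B, yet A beats C, so no single
# consistent ranking of the three candidates exists.
```

Each pairwise choice looks sensible on its own, but taken together they can't come from any one underlying ranking, which is exactly the incoherence the transitivity axiom rules out.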
Pinker sprinkles in plenty of fun wordplay to keep the book from being too dry:
The cluster illusion, like other post hoc fallacies in probability, is the source of many superstitions: that bad things happen in threes, people are born under a bad sign, or an annus horribilis means the world is falling apart. When a series of plagues is visited upon us, it does not mean there is a God who is punishing us for our sins or testing our faith. It means there is not a God who is spacing them apart.[1]
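(The "not spacing them apart" point is easy to check with a quick simulation. The sketch below is mine, not from the book, and the specific numbers, 20 events over a year and a one-week window, are arbitrary.)

```python
import random

random.seed(0)


def max_events_in_window(events, window):
    """Largest number of events that fall inside any interval of length `window`."""
    events = sorted(events)
    best = 0
    for i, start in enumerate(events):
        j = i
        while j < len(events) and events[j] < start + window:
            j += 1
        best = max(best, j - i)
    return best


# Scatter 20 independent, uniformly random events over a 365-day year, and
# count how often at least 3 of them land in the same 7-day window.
trials = 10_000
clustered = sum(
    max_events_in_window([random.uniform(0, 365) for _ in range(20)], window=7) >= 3
    for _ in range(trials)
)
print(f"Simulated years containing a 3-in-a-week cluster: {clustered / trials:.0%}")
# A substantial fraction of purely random years contain such a "cluster",
# even though nothing is causing the events to bunch together.
```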
One very surprising claim: that polygraphs are significantly more reliable than eyewitness testimony[2]. I thought, as Wikipedia says, that polygraphs are “junk science”. I don’t know what to make of this. Anyway, that’s just part of a generally very depressing discussion about estimating the (in)accuracy of jury decisions:
The heart-sinking conclusion is that juries acquit far more guilty people, and convict far more innocent ones, than any of us would deem acceptable.[3]
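(One standard way to see what a test's "reliability" does and doesn't buy you is Bayes' rule. The sketch below is my own illustration with hypothetical hit and false-alarm rates, not the figures from the NRC meta-analysis or Pinker's jury calculation.)

```python
def posterior_guilt(prior_guilt: float, hit_rate: float, false_alarm_rate: float) -> float:
    """P(guilty | test says deceptive), by Bayes' rule."""
    p_deceptive = hit_rate * prior_guilt + false_alarm_rate * (1 - prior_guilt)
    return hit_rate * prior_guilt / p_deceptive


# Hypothetical test: flags 85% of liars and 15% of truth-tellers as deceptive.
for prior in (0.5, 0.2, 0.05):
    post = posterior_guilt(prior, hit_rate=0.85, false_alarm_rate=0.15)
    print(f"prior guilt {prior:.0%} -> P(guilty | 'deceptive') = {post:.0%}")
# 50% prior -> 85%, 20% prior -> 59%, 5% prior -> 23%: the same instrument is
# far less conclusive when few of the people tested are actually guilty.
```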
Two political themes come up from time to time. One is Pinker’s disdain for Donald Trump. The other is his frustration with the left’s pressure toward ideological conformity:
The ad hominem, genetic, and affective fallacies used to be treated as forehead-slapping blunders or dirty rotten tricks. Critical-thinking teachers and high school debate coaches would teach their students how to spot and refute them. Yet in one of the ironies of modern intellectual life, they are becoming the coin of the realm. In large swaths of academia and journalism the fallacies are applied with gusto, with ideas attacked or suppressed because their proponents, sometimes from centuries past, bear unpleasant odors and stains. It reflects a shift in one’s conception of the nature of beliefs: from ideas that may be true or false to expressions of a person’s moral and cultural identity. It also bespeaks a change in how scholars and critics conceive of their mission: from seeking knowledge to advancing social justice and other moral and political causes.[4]
I emphasized the part about “the nature of beliefs” because, coming from a background where there was heavy pressure to conform to a particular conservative and religious ideology, I’m very sensitive to the harms that come from using a person’s beliefs as the indicator of their moral decency. I do think there’s a false dichotomy in that quote though; I would say that people both have high confidence that their beliefs are objectively true, and see those beliefs as core components of their identity—and these are mutually reinforcing. We feel comfortable judging people for their beliefs because we feel so certain that our beliefs are obviously correct, and at the same time our confidence is often subconsciously bolstered by the awareness that our social identity depends on maintaining our beliefs.
Pinker draws a distinction between “reality mindset” beliefs and “mythology mindset” beliefs; for the latter:
Whether they are literally “true” or “false” is the wrong question. The function of these beliefs is to construct a social reality that binds the tribe or sect and gives it a moral purpose.[5]
The telltale sign of mythology-mindset beliefs is a failure to fully act on them—e.g., “[t]hough millions of people endorsed the rumor that Hillary Clinton ran a child sex trafficking ring out of the basement of the Comet Ping Pong pizzeria…virtually none took steps commensurate with such an atrocity, such as calling the police.”[6]
I think there’s something to that distinction, but I also think it gives the wrong impression of the subjective experience of holding such beliefs. It seems to suggest that people don’t really care whether their mythology-mindset beliefs are objectively true. That might be right for the pizzagate case, but for other examples—such as “that if one doesn’t accept Jesus as one’s savior one will be damned to eternal torment in hell”[7]—it doesn’t fit my experience at all. Often people are so attached to the truth of a belief that they can’t really even entertain the possibility of it not being true. The fact that they don’t act on it in the way we think would be logical doesn’t mean much; they may also hold different beliefs from us on a number of other issues that are relevant to deciding what the logical way to act on any particular belief is. Besides, the fact that some act would be extremely abnormal is itself enough to make most people reluctant to do it, even if they can’t explain why they shouldn’t.
In setting up the “reality” vs “mythology” categorization, Pinker first mentions a “reflective” vs “intuitive” categorization (from Dan Sperber), and a “distal” vs “testable” categorization (from Robert Abelson)[8], which could be interesting to read more about.
Returning to Pinker’s complaints about academia:
Since no one can know everything, and most people know almost nothing, rationality consists of outsourcing knowledge to institutions that specialize in creating and sharing it, primarily academia, public and private research units, and the press. That trust is a precious resource which should not be squandered. Though confidence in science has remained steady for decades, confidence in universities is sinking. A major reason for the mistrust is the universities’ suffocating left-wing monoculture, with its punishment of students and professors who question dogmas on gender, race, culture, genetics, colonialism, and sexual identity and orientation. … On several occasions correspondents have asked me why they should trust the scientific consensus on climate change, since it comes out of institutions that brook no dissent. That is why universities have a responsibility to secure the credibility of science and scholarship by committing themselves to viewpoint diversity, free inquiry, critical thinking, and active open-mindedness.[9]
(Here’s an archived copy of the page cited for “confidence in universities is sinking”. Notably it says that “[t]he decline is most evident among Republicans…but Democrats and independents are also less confident now [2018] than they were three years ago.”)
(I assume Pinker means “active open-mindedness” in a precise sense: as he discusses a couple pages earlier, psychological studies have assessed this trait by questionnaires and found it to correlate with some desirable things.[10] I first read about this just last month in Effective Altruism and the Human Mind and it sort of blew my mind, because the questionnaire is essentially just asking whether you think open-mindedness is good, not whether you behave in open-minded ways or even see yourself as open-minded. Yet apparently that’s enough to predict performance on certain tasks.)
The bit about climate change really grabbed my attention, though I’d point out that widespread climate denialism predates the “left-wing monoculture”, and I would not be surprised if at least some of the people bringing up this objection had really already made their minds up for other reasons. (People who feel strongly attached to a position will often trot out every half-plausible-sounding argument in its favor that they can come up with, without being genuinely interested in that argument’s validity or being realistically open to changing their minds if it’s refuted.) But I share Pinker’s concern that intense social pressure not to voice dissenting opinions is ultimately destructive.
The book ends with a section called “Rationality and Moral Progress”, where Pinker looks at some examples of moral arguments from the 1500s onwards. (He’s quick to say he “cannot claim that good arguments are the cause of moral progress”[11] since correlation doesn’t imply causation, but… it sure seems like he wants us to conclude that arguments are an important contributor to moral progress.)
My greatest surprise in making sense of moral progress is how many times in history the first domino was a reasoned argument. A philosopher wrote a brief which laid out arguments on why some practice was indefensible, or irrational, or inconsistent with values that everyone claimed to hold. The pamphlet or manifesto went viral, was translated into other languages, was debated in pubs and salons and coffeehouses, and then influenced leaders, legislators, and popular opinion. Eventually the conclusion was absorbed into the conventional wisdom and common decency of a society, erasing the tracks of the arguments that brought it there. Few people today feel the need, or could muster the ability, to formulate a coherent argument on why slavery is wrong, or public disembowelment, or the beating of children; it’s just obvious. Yet exactly those debates took place centuries ago.
And the arguments that prevailed, when they are brought to our attention today, continue to ring true. They appeal to a sense of reason that transcends the centuries, because they conform to principles of conceptual consistency that are part of reality itself.[12]
[1] Steven Pinker, Rationality: What It Is, Why It Seems Scarce, Why It Matters, First edition (New York: Viking, 2021), 147.
[2] Ibid., 219, citing “a meta-analysis in National Research Council 2003, p. 122,” which appears to be available here: https://nap.nationalacademies.org/catalog/10420/the-polygraph-and-lie-detection.
[3] Ibid., 220.
[4] Ibid., 92–93, emphasis added.
[5] Ibid., 300.
[6] Ibid., 299.
[7] Ibid., 301.
[8] Ibid., 299.
[9] Ibid., 313–14, emphasis added.
[10] Ibid., 310–11.
[11] Ibid., 329.
[12] Ibid.
(crosspost)
jzkannel's review against another edition
3.0
This book was alright. I definitely picked it out thinking it would be on the level of Freakonomics or Jordan Ellenberg's books in terms of how engaging they are, and it wasn't quite there. Some good illustrative examples, but a lot of the explanations of the different theories of rationality got bogged down in jargon that was hard to follow.
carlynmarie's review against another edition
challenging, informative, reflective, medium-paced
4.0
Difficult to listen to as an audiobook because he refers to the graphs and charts frequently.
francisjshaw's review against another edition
3.0
If you are a scientist or mathematician, you are going to love this book. It will give you goosebumps of pleasure. You will likely bypass the many contradictions to rationality itself in this work. You will likely see the great leap forward of the industrial revolution and the 100 years that followed without remembering the millions killed in the two world wars because of its success. You will likely forget that the climate change threatening the world is caused by such wonderful advances. You will embrace the concept of life, liberty, and the pursuit of happiness made by men who owned slaves, denying such freedoms to some. You may conclude that if we were more rational, the German people would not have believed the Nazis, who persuaded them that their friends and neighbors of yesterday should now be eliminated. Rationality is and has always been a moving target, driven by some out of reason and by others out of self-interest and power. Rationality should accomplish much, but often doesn't because other virtues are more appealing. It would be rational if politicians were representative of the people or if wealth were shared, but power and control are more appealing. When I look at the biggest decisions in my life, none would meet the rationality test. Each has provided learning because I haven't judged it with reason, logic, or formulas. Rationality today will be seen as foolishness tomorrow, and if humanity is going to take a leap forward it's not going to be because of rationality, but because of compassion, which leads to better outcomes for all.
verdana's review against another edition
2.5
If you're thinking about reading this book, don't do it. Go and read Thinking, Fast and Slow by Daniel Kahneman instead. It explains the same things much better.
drwhere's review against another edition
2.0
Book is fine. But for those interested in the topic who likely have some education in probability and statistics, it's pretty pedantic. Having had many years of courses as a scientist and engineer, I found this book to be a best-of highlight reel of the interesting examples where probability theory does or doesn't break our intuition. There were some useful ideas about how many cognitive biases are exacerbated by context and might not look so irrational once we consider their potential utility in their original setting, or how they may not even be present in that setting and only become measurable when humans are asked to reason in the contrived manner of a psych study or are taken advantage of by manipulative advertising, technology, etc. If you haven't ever had stats/Bayesian reasoning/probability/game theory, this would be a decent read. But if you have, go check out The Signal and the Noise by Nate Silver; that's a way more interesting read on this same topic.