A review by generalheff
How Not to Be Wrong: The Hidden Maths of Everyday Life by Jordan Ellenberg
4.0
If you are unconvinced of how maths might be relevant to the world around you, this is the book for you. Jordan Ellenberg takes the reader gently through a ream of mathematical topics - such as linearity, inference and so on - and (this isn't a textbook, remember) puts these potentially horrible notions into concrete, intelligible terms.
The opening chapter is emblematic of the work as a whole. Ellenberg introduces the Laffer curve - which is just a line starting at zero at 0%, rising to some peak and then dropping to zero again at 100%. What does the graph show? It is a qualitative illustration of the fact that raising taxes from 0% generates some government income; you can keep raising taxes for a while, but eventually you hit a tipping point where additional taxes will actually reduce government revenue. Why? Because, in the limit of 100% taxation, no one would work at all. This graph, supposedly drawn by economist Arthur Laffer at a dinner with Dick Cheney and Donald Rumsfeld among others, demonstrates the non-linearity of the relationship between government revenue and income tax rates. Yes, raising rates below some threshold can raise revenues, but only to a point.
Why does the reader care? Because ignoring non-linearity leads to an awful lot of mistakes (the book is called 'How Not to Be Wrong', keep in mind). Taking just one example, Ellenberg notes media headlines reporting that Sweden was reducing its taxes. If soak-and-spend Sweden is doing this, then what a great advert for reducing taxes in America too. This, implicitly to be sure, assumes that the relationship between taxes and government revenues is linear: if revenues go up when Sweden lowers its taxes, then so too in the US (a negative linear relation). But this is entirely false: the US may well be to the left of the Laffer curve's hump, while Sweden might be to the right. As such, Sweden could lower its taxes and see a rise in government takings, while the US sees the opposite. Linearity cannot be assumed (and is indeed demonstrably false in this case, as Laffer showed).
From here the author works his way into discussions of inference, or the art of understanding statistical significance; expectation, giving the wonderful example of a lottery that could be gamed under certain conditions, a fact exploited by some students who actually calculated the 'expected value' of their takings; regression, in particular focussing on the incredibly important notion of regression to the mean; and existence, or the nature of mathematical objects (formalism and that kind of thing).
If that all sounds forbiddingly difficult, it is, for the most part, anything but. Ellenberg admirably explains difficult ideas in simple terms. The Law of Large Numbers, for instance, is just one such 'very formal sounding thing' (technical term). But in this author's hands, it is brilliantly (er) handled. First we are told about a highly counterintuitive finding: that North Dakota has one of the lowest rates of brain cancer in the US while South Dakota has one of the highest. This is explained first by pointing to the Law of Large Numbers - which says (in Wikipedia's terms) that "the average of the results obtained from a large number of trials should be close to the expected value and will tend to become closer to the expected value as more trials are performed".
This is hopelessly abstract. Instead, Ellenberg makes the point that small states like North or South Dakota will only have a few instances of a rare condition like brain cancer. Therefore, a few more cases in South Dakota can swing the proportion of cases in that state from the bottom to the top of the rankings (and vice versa for the North). By contrast, there are so many people in Texas or California that a few cases more or less will barely move the needle. This is the Law in action. With lots of people come more cases; small fluctuations are cancelled out and you're left with the number of cases you'd expect. As Ellenberg puts it: "smaller populations are inherently more variable". An intuitive result, but spun out of what appears to be a bizarre finding.
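The small-state effect is easy to see in a quick simulation (my own sketch, not from the book; the populations and incidence rate are invented round numbers). It repeatedly samples case counts for a small and a large 'state' and compares how widely their per-capita rates swing:

```python
import math
import random

random.seed(0)

def simulated_rates(population, incidence=2e-5, trials=2000):
    """Per-capita rates of a rare condition across simulated repeats.

    Case counts are roughly Poisson(population * incidence); a normal
    approximation is close enough at these sizes and keeps this fast.
    """
    lam = population * incidence
    rates = []
    for _ in range(trials):
        cases = max(0, round(random.gauss(lam, math.sqrt(lam))))
        rates.append(cases / population)
    return rates

small = simulated_rates(700_000)       # roughly Dakota-sized
large = simulated_rates(30_000_000)    # roughly California-sized

def spread(rates):
    return max(rates) - min(rates)

print(spread(small) > spread(large))  # True: small states swing far more
```

With a Dakota-sized population the expected count is so small that ordinary noise moves the per-capita rate dramatically; at California scale the same noise disappears into the denominator.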
Far from a series of disconnected examples, the author impressively manages to weave these illustrations of his subject together. Indeed, a key takeaway is how mathematics - the study of structures in some sense - is almost designed to provide transferable insight from one area to another. A real tour de force is found in the latter part of the book. The discussion initially is on expectation values. The application is to the lottery example mentioned above and why, under certain circumstances, it is actually worth playing: the expected value of a ticket exceeds its price, so just play a lot of tickets and you are highly likely to win overall.
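The students' reasoning can be sketched in a few lines (the odds and payouts below are invented for illustration; the book works through the real figures of the lottery in question):

```python
def ticket_ev(prize_table):
    """Expected value of one ticket: sum over tiers of probability * payout.
    The prize tiers here are hypothetical, not the book's real numbers."""
    return sum(p * payout for p, payout in prize_table)

TICKET_PRICE = 2  # dollars, illustrative

# An ordinary week: the jackpot is huge but absurdly unlikely
normal = ticket_ev([(1 / 9_000_000, 1_000_000), (1 / 40_000, 4_000)])

# A 'roll-down' week: an unclaimed jackpot inflates the smaller prizes
rolldown = ticket_ev([(1 / 40_000, 50_000), (1 / 800, 1_000)])

print(normal < TICKET_PRICE < rolldown)  # True: the gamed condition
```

Whenever the expected value of a ticket tops its price, buying tickets in bulk becomes, on average, a profitable enterprise - which is exactly the condition the students were exploiting.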
There remains a question about which numbers should be chosen. Random choices work pretty well given the scheme the students have going, but specific numbers are, it turns out, even better. Yet after a big discussion of expectation values, we take up a seemingly irrelevant issue: projective geometry. This leads to unusual geometries, including the seven-point Fano plane. This, it is shown, has precisely the property one would want when choosing numbers for a pick-3, 7-ball lottery (out of nowhere - it really comes as a surprise when you return to the lottery example).
Though the expected value of playing the Fano plane numbers equals that of choosing your 3 picks at random, there is a lot less variance in the Fano approach. In other words, you won't win as big, but you are also far less likely to walk away with next to nothing. The description of variance in the middle of all this is superb. The author then steps beyond 7-ball lotteries by describing how the Fano plane furnished a Bell Labs researcher (Richard Hamming) with an idea for correcting errors in an early computer. Finally (after some information theory) we return to larger error-correcting codes that can furnish sets of tickets with just the properties you want for the particular lottery strategy the students were employing.
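The Fano plane's trick is easy to verify by brute force (again my own sketch, assuming the standard labelling of its seven lines). Read its seven lines as seven tickets: every pair of numbers appears on exactly one line, so whichever three balls are drawn, at least one ticket matches two of them - a guaranteed lower-tier prize, which is what tames the variance:

```python
from itertools import combinations

# The seven lines of the Fano plane on points 1..7 (one standard labelling)
FANO_TICKETS = [
    {1, 2, 3}, {1, 4, 5}, {1, 6, 7},
    {2, 4, 6}, {2, 5, 7}, {3, 4, 7}, {3, 5, 6},
]

# Property 1: every pair of numbers lies on exactly one line...
pairs_covered = all(
    sum(pair <= ticket for ticket in FANO_TICKETS) == 1
    for pair in map(set, combinations(range(1, 8), 2))
)

# ...so whatever 3 balls are drawn, some ticket matches at least 2 of them
always_match_two = all(
    max(len(set(draw) & ticket) for ticket in FANO_TICKETS) >= 2
    for draw in combinations(range(1, 8), 3)
)

print(pairs_covered, always_match_two)  # True True
```

Seven tickets covering all 21 pairs exactly once is the efficiency that a random set of tickets cannot match.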
It really is astonishing how long the lottery example is threaded through the overall account, cropping up in the context of multiple different mathematical stories and concepts. This is characteristic of the book as a whole. The author, whose passion for the subject veritably leaps off the page, paints an inspiring picture of an intertwined subject full of amazing results and surprising linkages. I defy anyone not to be a little excited by the maths on display.
The book is a little long; it also tails off somewhat. The author's attempt to bring to life some of the more abstruse arguments around the reality of mathematics, by going into the details of mathematics' turn to formalism in the twentieth century, is a bit of a slog. I'm also not convinced it matches the style of the rest of the book - stepping as it does away from the pleasingly graspable examples (such as lotteries or Swedish taxes). There is also some overwrought discussion of elections and polls. In particular, the incredibly number-heavy discussion of different electoral schemes was mind-numbing and difficult to follow.
Aside from a few less than stellar sections and a bit of an unwelcome turn towards philosophy in the final pages of the book, this work is exceptionally well worth reading. For those who have scarcely thought of maths since school I think this is a must read, albeit a potentially challenging one in places. For those, like me, with an applied maths background, this book will likely be an easier read but, I still think, a very worthwhile one. I loved the recaps of (say) Bayes' theorem and all that though it wasn't anything new. But what really pulled me in were the discussions of the linkages and the connectedness of the subject as a whole. In short, I relished reading an expert's take on the field, particularly his comments on current maths and the future of the subject. In fact, I would welcome an entire Ellenberg book on modern mathematics as, in the hands of this author, I feel I might just about follow (some of) it.