The Signal and the Noise: Why So Many Predictions Fail--but Some Don't

Penguin, 2012 - Business & Economics - 534 pages
"Nate Silver's The Signal and the Noise is The Soul of a New Machine for the 21st century." --Rachel Maddow, author of Drift

Nate Silver built an innovative system for predicting baseball performance, predicted the 2008 election within a hair's breadth, and became a national sensation as a blogger--all by the time he was thirty. He solidified his standing as the nation's foremost political forecaster with his near perfect prediction of the 2012 election. Silver is the founder and editor in chief of the website FiveThirtyEight.

Drawing on his own groundbreaking work, Silver examines the world of prediction, investigating how we can distinguish a true signal from a universe of noisy data. Most predictions fail, often at great cost to society, because most of us have a poor understanding of probability and uncertainty. Both experts and laypeople mistake more confident predictions for more accurate ones. But overconfidence is often the reason for failure. If our appreciation of uncertainty improves, our predictions can get better too. This is the "prediction paradox": The more humility we have about our ability to make predictions, the more successful we can be in planning for the future.

In keeping with his own aim to seek truth from data, Silver visits the most successful forecasters in a range of areas, from hurricanes to baseball, from the poker table to the stock market, from Capitol Hill to the NBA. He explains and evaluates how these forecasters think and what bonds they share. What lies behind their success? Are they good--or just lucky? What patterns have they unraveled? And are their forecasts really right? He explores unanticipated commonalities and exposes unexpected juxtapositions. And sometimes, it is not so much how good a prediction is in an absolute sense that matters but how good it is relative to the competition. In other cases, prediction is still a very rudimentary--and dangerous--science.

Silver observes that the most accurate forecasters tend to have a superior command of probability, and they tend to be both humble and hardworking. They distinguish the predictable from the unpredictable, and they notice a thousand little details that lead them closer to the truth. Because of their appreciation of probability, they can distinguish the signal from the noise.

With everything from the health of the global economy to our ability to fight terrorism dependent on the quality of our predictions, Nate Silver's insights are an essential read.



About the author (2012)

At about the time The Signal and the Noise was first published in September 2012, "Big Data" was on its way to becoming a Big Idea. Google searches for the term doubled over the course of a year,1 as did mentions of it in the news media.2 Hundreds of books were published on the subject. If you picked up any business periodical in 2013, advertisements for Big Data were as ubiquitous as cigarettes in an episode of Mad Men.

But by late 2014, there was evidence that trend had reached its apex. The frequency with which Big Data was mentioned in corporate press releases had slowed down and possibly begun to decline.3 The technology research firm Gartner even declared that Big Data had passed the peak of its "hype cycle."4

I hope that Gartner is right. Coming to a better understanding of data and statistics is essential to help us navigate our lives. But as with most emerging technologies, the widespread benefits to science, industry, and human welfare will come only after the hype has died down.


I worry that certain events in my life have contributed to the hype cycle. On November 6, 2012, the statistical model at my Web site FiveThirtyEight "called" the winner of the American presidential election correctly in all fifty states. I received a congratulatory phone call from the White House. I was hailed as "lord and god of the algorithm" by The Daily Show's Jon Stewart. My name briefly received more Google search traffic than the vice president of the United States.

I enjoyed some of the attention, but I felt like an outlier--even a fluke. Mostly I was getting credit for having pointed out the obvious--and most of the rest was luck.*

To be sure, it was reasonably clear by Election Day that President Obama was poised to win reelection. When voters went to the polls on election morning, FiveThirtyEight's statistical model put his chances of winning the Electoral College at about 90 percent.* A 90 percent chance is not quite a sure thing: Would you board a plane if the pilot told you it had a 90 percent chance of landing successfully? But when there's only reputation rather than life or limb on the line, it's a good bet. Obama needed to win only a handful of the swing states where he was tied or ahead in the polls; Mitt Romney would have had to win almost all of them.

But getting every state right was a stroke of luck. In our Election Day forecast, Obama's chance of winning Florida was just 50.3 percent--the outcome was as random as a coin flip. Considering other states like Virginia, Ohio, Colorado, and North Carolina, our chances of going fifty-for-fifty were only about 20 percent.5 FiveThirtyEight's "perfect" forecast was fortuitous but contributed to the perception that statisticians are soothsayers--only using computers rather than crystal balls.
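The arithmetic behind that 20 percent figure can be sketched in a few lines. If you always pick the favorite in each state, your chance of a correct call in that state is max(p, 1 - p), and treating states as independent, the chance of a perfect sweep is the product over the competitive states. This is only a rough sketch: apart from the 50.3 percent figure for Florida quoted above, the per-state probabilities below are illustrative placeholders, not FiveThirtyEight's actual numbers, and real state outcomes are correlated, which the actual model accounted for.

```python
# Rough sketch of a "perfect call" probability under an (unrealistic)
# independence assumption. Safe states contribute roughly 1.0 each, so
# only the competitive states matter. All probabilities except
# Florida's are hypothetical placeholders.
from math import prod

swing_state_probs = {
    "Florida": 0.503,        # from the text: essentially a coin flip
    "Virginia": 0.79,        # hypothetical
    "Ohio": 0.91,            # hypothetical
    "Colorado": 0.80,        # hypothetical
    "North Carolina": 0.74,  # hypothetical
}

# Picking the favorite makes each state's chance of a correct call
# max(p, 1 - p); independence multiplies them together.
p_all_correct = prod(max(p, 1 - p) for p in swing_state_probs.values())
print(f"Chance of calling every swing state correctly: {p_all_correct:.0%}")
```

With these placeholder numbers the product comes out near one in five, illustrating how a forecaster can be well calibrated in every individual state and still be unlikely to run the table.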

This is a wrongheaded and rather dangerous idea. American presidential elections are the exception to the rule--one of the few examples of a complex system in which outcomes are usually more certain than the conventional wisdom implies. (There are a number of reasons for this, not least that the conventional wisdom is often not very wise when it comes to politics.) Far more often, as this book will explain, we overrate our ability to predict the world around us. With some regularity, events that are said to be certain fail to come to fruition--or those that are deemed impossible turn out to occur.

If all of this is so simple, why did so many pundits get the 2012 election wrong? It wasn't just on the fringe of the blogosphere that conservatives insisted that the polls were "skewed" toward President Obama. Thoughtful conservatives like George F. Will6 and Michael Barone7 also predicted a Romney win, sometimes by near-landslide proportions.

One part of the answer is obvious: the pundits didn't have much incentive to make the right call. You can get invited back on television with a far worse track record than Barone's or Will's--provided you speak with some conviction and have a viewpoint that matches the producer's goals.

An alternative interpretation is slightly less cynical but potentially harder to swallow: human judgment is intrinsically fallible. It's hard for any of us (myself included) to recognize how much our relatively narrow range of experience can color our interpretation of the evidence. There's so much information out there today that none of us can plausibly consume all of it. We're constantly making decisions about what Web site to read, which television channel to watch, and where to focus our attention.

Having a better understanding of statistics almost certainly helps. Over the past decade, the number of people employed as statisticians in the United States has increased by 35 percent8 even as the overall job market has stagnated. But it's a necessary rather than sufficient part of the solution. Some of the examples of failed predictions in this book concern people with exceptional intelligence and exemplary statistical training--but whose biases still got in the way.

These problems are not so simple, and so this book does not promote simple answers to them. It makes some recommendations, but they are philosophical as much as technical. Once we're getting the big stuff right--coming to a better understanding of probability and uncertainty; learning to recognize our biases; appreciating the value of diversity, incentives, and experimentation--we'll have the luxury of worrying about the finer points of technique.

Gartner's hype cycle ultimately has a happy ending. After the peak of inflated expectations there's a "trough of disillusionment"--what happens when people come to recognize that the new technology will still require a lot of hard work.


But right when views of the new technology have begun to lapse from healthy skepticism into overt cynicism, that technology can begin to pay some dividends. (We've been through this before: after the computer boom in the 1970s and the Internet commerce boom of the late 1990s, among other examples.) Eventually it matures to the point when there are fewer glossy advertisements but more gains in productivity--it may even have become so commonplace that we take it for granted. I hope this book can accelerate the process, however slightly.

This is a book about information, technology, and scientific progress. This is a book about competition, free markets, and the evolution of ideas. This is a book about the things that make us smarter than any computer, and a book about human error. This is a book about how we learn, one step at a time, to come to knowledge of the objective world, and why we sometimes take a step back.

This is a book about prediction, which sits at the intersection of all these things. It is a study of why some predictions succeed and why some fail. My hope is that we might gain a little more insight into planning our futures and become a little less likely to repeat our mistakes.

More Information, More Problems

The original revolution in information technology came not with the microchip, but with the printing press. Johannes Gutenberg's invention in 1440 made information available to the masses, and the explosion of ideas it produced had unintended consequences and unpredictable effects. It was a spark for the Industrial Revolution in 1775,1 a tipping point in which civilization suddenly went from having made almost no scientific or economic progress for most of its existence to the exponential rates of growth and change that are familiar to us today. It set in motion the events that would produce the European Enlightenment and the founding of the American Republic.

But the printing press would first produce something else: hundreds of years of holy war. As mankind came to believe it could predict its fate and choose its destiny, the bloodiest epoch in human history followed.2

Books had existed prior to Gutenberg, but they were not widely written and they were not widely read. Instead, they were luxury items for the nobility, produced one copy at a time by scribes.3 The going rate for reproducing a single manuscript was about one florin (a gold coin worth about $200 in today's dollars) per five pages,4 so a book like the one you're reading now would cost around $20,000. It would probably also come with a litany of transcription errors, since it would be a copy of a copy of a copy, the mistakes having multiplied and mutated through each generation.

This made the accumulation of knowledge extremely difficult. It required heroic effort to prevent the volume of recorded knowledge from actually decreasing, since the books might decay faster than they could be reproduced. Various editions of the Bible survived, along with a small number of canonical texts, like those of Plato and Aristotle. But an untold amount of wisdom was lost to the ages,5 and there was little incentive to record more of it to the page.

The pursuit of knowledge seemed inherently futile, if not altogether vain. If today we feel a sense of impermanence because things are changing so rapidly, impermanence was a far more literal concern for the generations before us. There was "nothing new under the sun," as the beautiful Bible verses in Ecclesiastes put it--not so much because everything had been discovered but because everything would be forgotten.6

The printing press changed that, and did so permanently and profoundly.
