(CNN) News events are a constant reminder that the world is full of risk. Jets crash or vanish into the sea. Terrorists perpetrate nearly unthinkable acts of violence on civilians. Nature sends us tornadoes, floods, sharks, snakes, bacteria, viruses and countless other deadly threats. Certain foods lead to increased risk of diabetes and heart disease. Others are contaminated, recalled from shelves.
Most of us are aware of these and other risks in our lives. Yet we are terrible at estimating how likely we are to be affected by them.
In a common example, many people get nervous flying in commercial jets, but when was the last time you buckled your seat belt in the back seat of a taxi? Riding unbuckled in a taxi is, of course, much riskier than flying, but we generally don't perceive it that way.
We fear a lot of things in nature -- those sharks, snakes and viruses -- but find it hard to adjust to a healthier diet despite the fact that heart disease has been the leading killer of Americans for decades and is largely preventable.
Psychologists have studied why we are so bad at assessing risk. For one thing, our emotions often play a role in our perception of risk, outweighing logic. Nuclear energy researchers have argued this for years, suggesting that the public perception of radiation risk far outweighs any real-world danger. In his book "Before It's Too Late: A Scientist's Case for Nuclear Energy," physicist Bernard Cohen states, "The public's understanding of radiation dangers has virtually lost all contact with the actual dangers as understood by scientists."
Sensational events also tend to frighten us disproportionately, driven in part, it seems, by their very rarity. Our brains read "unusual" as more "threatening."
For instance, psychologist Paul Slovic suggests that when people understand a system well, such as our highway or train system, and an unusual accident occurs, we know that this is a rare tragedy. But when we don't understand a system as well -- nuclear energy, for instance -- we may view an accident as a sign of more tragic things to come. A small accident "may have immense social consequences if it is perceived as a harbinger of further and possibly catastrophic mishaps," Slovic writes.
So the more mundane a risk is, the more likely we are to overlook its danger. For instance, one of the leading causes of death in the United States is the combination of influenza and pneumonia. Yet despite public health officials' efforts to encourage flu shots, fewer than half of US adults get one each year.
This means we tend to be afraid of immediate, unusual, high-impact risks such as terrorism and ignore long-term risks, such as our own health or climate change.
News media, with their emphasis on uncommon events (aka "news"), contribute to public misperceptions about risk as well. During the 2014 Ebola outbreak in West Africa, about 20% of Americans were concerned they would contract the disease, and as many as 43% were concerned a loved one might get it, despite only a few isolated cases occurring in the United States. Similar fear levels were reported for the biggest health news story of 2009: the H1N1/swine flu virus. It is hard to attribute those fears to anything other than media coverage.
Our inability to accurately assess risk would be problematic enough if it only caused people to make bad personal decisions, such as not wearing seat belts or skipping a flu shot. But it is also problematic on a larger scale, because misdirected fears have real-world policy implications.
For instance, we face an ongoing debate about the number of Syrian refugees the United States should allow into the country for resettlement. A common argument for limiting that number, or even reducing it to zero, is the notion that terrorists may sneak in among the refugees. The probability of that happening is close to zero, according to the Cato Institute, yet roughly half of Americans oppose Syrian refugee resettlement, with many specifically citing the fear of terrorists sneaking in.
There are constructive ways to consider risk. Businesses and humanitarian responders have a framework for looking objectively at threats. In essence, they consider both the likelihood of something terrible happening and the consequences of that event. By considering both the probability of, say, a flood occurring and how severe it might be, businesses and emergency response organizations prioritize where to spend their resources to limit the impacts of an emergency.
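To make that framework concrete, here is a minimal sketch in Python of a likelihood-times-consequence scoring; the threats, probabilities and impact scores are purely illustrative assumptions, not figures from any real risk register.

```python
# Minimal sketch of a likelihood-times-consequence risk ranking.
# All threats, probabilities and impact scores are illustrative assumptions.

threats = [
    # (name, chance of occurring in a given year, consequence score on a 1-10 scale)
    ("regional flood", 0.20, 6),
    ("terrorist attack", 0.01, 9),
    ("seasonal flu outbreak", 0.90, 4),
]

def risk_score(probability, consequence):
    """Expected impact: how likely the event is times how bad it would be."""
    return probability * consequence

# Rank threats by expected impact to decide where resources should go first.
for name, probability, consequence in sorted(
    threats, key=lambda t: risk_score(t[1], t[2]), reverse=True
):
    print(f"{name}: score {risk_score(probability, consequence):.2f}")
```

Under these made-up numbers, the everyday, high-probability threat (the flu) outranks the rare, dramatic one (a terrorist attack) -- exactly the reordering our instincts resist.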
It might be useful for us to use a similar approach for public policy efforts. For instance, a terrorist attack is extremely unlikely to affect our day-to-day lives and has taken relatively few American lives in the past decade. That said, if a terrorist attack did occur, the physical, psychological and financial effects would be devastating. So we may agree to continue our significant public investment in fighting terrorism.
This line of thinking might also encourage us to invest more into funding research into heart disease. It's both common and fatal, killing an average of 610,000 people annually in the United States.
In raw numbers, the US spent more than $16 billion in 2013 on counterterrorism efforts, according to the Pew Research Center. In that same year, we spent $1.3 billion on heart disease research, according to the National Institutes of Health.
Considering both the likelihood and consequence of a tragedy might also help us make better choices in how we invest money to begin with. For instance, around 2011, Bill Gates began strongly supporting investments in nuclear energy, and he has since launched a nuclear energy company, TerraPower. Part of his justification is that, in big-picture terms, nuclear power -- even counting its rare but extreme accidents -- kills fewer people per kilowatt-hour generated than fossil fuels do through air pollution.
It is important for the media to report risk data accurately and to give us head-on comparisons of the risks we face, in context. We need a more accurate perception of which risks are valid and which are overblown. And together with researchers, we can all better understand public threats, which will lead to better policy for reducing them.