Information Crisis: How a Better Understanding of Science Can Help Us Face the Greatest Problems of Our Time
Ebook · 784 pages · 9 hours


About this ebook

"This is a book every citizen should read." -SY MONTGOMERY, New York Times bestselling author of The Soul of an Octopus and editor of The Best American Science and Nature Writing 2019


The rapid and ubiquitous spread of information has f

Language: English
Publisher: Hill Press
Release date: Apr 16, 2024
ISBN: 9781735111353
Author

Julia Soplop

Julia Soplop is a science writer and the author of Equus Rising: How the Horse Shaped U.S. History, winner of an Independent Book Publisher (IPPY) Award and a Feathered Quill Book Award. Her work has appeared in numerous publications, including National Geographic, Summit Daily News, and Skiing. She also develops thought leadership for organizations that address issues of scientific or social concern. She holds a bachelor's from Duke University and a master's from the medical journalism program at UNC-Chapel Hill. She lives with her family outside of Chapel Hill, NC.


    Book preview

    Information Crisis - Julia Soplop

    Introduction

    Science Literacy Slices Through the Noise

    Saturated, boundless, glowing orange. My sister had texted me a photo one morning, but I couldn’t decipher the subject. Based on the foreground, I could tell she’d snapped the image from her home in the Berkeley Hills, but her usual view of the San Francisco Bay had evaporated—though not behind a typical layer of marine fog. It seemed her vantage point was from inside an illuminated pumpkin. Was this an odd sunset from the previous night?

    No. As it turned out, the western U.S. was ablaze during its worst fire season on record. Fire maps of the area were illegible—just giant blobs of forest fires, all smeared together, everywhere. Smoke from the fires had shot 55,000 feet into the atmosphere and converted into frozen clouds of ash that blotted out the sun and transformed the region into a heavy sea of burnt sienna.¹

    My sister had taken the photo moments before she had sent it, around nine o’clock in the morning on September 9, 2020, the day it seemed as if the sun never rose in the Bay Area. While the previous days had registered once-unusual temperatures in the region of more than 100 degrees Fahrenheit, this nuclear winter, as people had begun to call it, had precipitated a 40-degree drop in temperature along with the tangerine sky. The effect felt nothing short of apocalyptic.

    Throughout 2020, California experienced five of the 10 largest wildfires in its recorded history.² The 9,600 wildfires that lashed the state that year burned almost 4.2 million acres—a state record—destroying nearly 10,500 structures and killing more than 30 people.

    While climate science doesn’t usually allow us to assign causation of a specific event to climate change, such as an individual wildfire or hurricane, it does show us that a warming climate influences these natural occurrences in ways that make them more likely to explode into events of monstrous consequence.³

    The last several years have been a climate-fueled nightmare for many residents of the increasingly arid West and the water-logged South, as well as for populations around the globe. In North Carolina, my family now spends the fall with our eyes to the tropics, trying to anticipate whether the depressions forming off Africa’s coast will spin up into massive hurricanes and barrel our way, forcing us to decide: hunker down or flee?

    Meanwhile, our western family and friends are making their own daunting calculations. They’re tracking air quality, donning masks (even pre-pandemic), parking their cars strategically for the best chance to outrun an approaching fire. Under local instruction, they’re developing and memorizing lists of what they’ll carry with them depending on how much warning they receive to evacuate before a wildfire is expected to reach their homes: two hours (medications, passports, computers, a few changes of clothes, some food, a few family heirlooms), one hour (less stuff), ten minutes (LEAVE EVERYTHING AND RUN TO THE CAR).

    They’re also getting sick from wildfire smoke, which has been linked to exacerbating several underlying health conditions and contributing to new ones.⁴ They’re losing their home insurance time and again due to increased fire risk in the region. And if they can find new insurance plans, they’re paying astronomical premiums just to underinsure their homes. Meanwhile, their utilities are struggling to provide consistent power when once-rare, but now fairly common, extreme heat waves arrive.

    Others live under the simultaneous threats of fire and flood. When I traveled to Montana in June 2022 for a book reading, I tacked on a few days to visit my aunt, who lives just outside Yellowstone National Park. As we hiked through mountain terrain that had been scorched by fire in recent years, my aunt expressed hope that an abundance of late-season snow might lessen the summer’s heightened wildfire risk due to long-term drought. Another day, we drove through a winding river canyon into Yellowstone, where we experienced a blockbuster animal-sighting adventure: wolves, moose, black bears, a grizzly, a coyote mama and her pups, bison, bighorn sheep, marmots, and a badger. On the way home, we stopped in the small community of Gardiner, just outside the park’s northern gate, to buy gifts for my kids.

    One week later, a natural disaster decimated this entire area—not a wildfire, but a flood. Several inches of rain fell within a couple days and melted nearly six inches of that valuable snow in the process, resulting in severe flash flooding that submerged the region.⁵ (The prior year, a scientific assessment had predicted that this exact scenario, prolonged drought punctuated by flash flooding, would become the norm in this region, due to anthropogenic climate change.)⁶

    The first images of the widespread damage came from a helicopter gliding up the canyon in Yellowstone that we had driven through just days earlier. The camera zoomed in on a large swath of road the river had violently ripped away—the very spot we had slowed the car and craned our necks to look for bighorn sheep on the rocks above. Just outside the park, roads wiped out in all directions, tourism-reliant Gardiner had been cut off from the rest of the world.

    Scientific consensus—the agreement of scientific evidence—has existed for more than a quarter of a century that climate change is happening and that the consequences will be dire for just about every life form if left unchecked.⁷ The writing has been on the wall for considerably longer, though. Scientific evidence suggesting that pumping carbon dioxide into the atmosphere would cause the climate to warm has been accumulating for nearly 170 years, since the advent of industrialization itself.⁸

    We have long understood the work required to prevent these doom-and-gloom scenarios, which means that by neglecting to act based on our ever-growing scientific knowledge, we have, collectively, chosen this destructive path.

    Why? If the science has been clear for so long on both the risks of climate change and what it would take to mitigate them, why have we failed to dig ourselves out of this crisis of our own creation?

    For the same reason the U.S. fumbled our COVID-19 pandemic response, despite the remarkably rapid development of safe and effective vaccines and treatments: Our complicated relationship with information—and, in particular, scientific information—has prevented us from digesting and adequately confronting many of the greatest problems of our time.

    In numerous ways, the pandemic unfolded like an accelerated version of the climate crisis, boiling to the surface the elements that obstruct us from taking action to safeguard our health and well-being.⁹ As with climate change, the evidence that could have guided our actions to prevent untold morbidity and mortality from COVID-19 wasn’t enough to save us all from ourselves.

    Instead, a constellation of factors that influence how we interact with information converged to create confusion and hamper an effective response:

    Rampant distortion of information by industries, interest groups, and individuals for power and profit

    Dependence on social media and other unreliable information sources

    The post-truth era—the idea that a set truth doesn't exist, so competing views more or less always have merit

    Lack of trust in government, science, media, and expertise of any sort

    Extreme political polarization that makes identity exploitation easy

    Unclear communication from many scientists and doctors

    The inability of many media outlets to generate strong health and science reporting

    Some media outlets’ demonstrated history of devaluing science and spreading scientific misinformation, particularly on issues involving aversion to regulation

    The built-in processing shortcuts our brains use to simplify information

    Our fractured attention, which can prevent us from overriding these shortcuts to think more critically

    Widespread science illiteracy, resulting in a lack of understanding of the value of science and a vulnerability to pseudoscientific claims

    Widespread media illiteracy, hindering the public from seeking or finding credible sources of scientific information.

    Of course, structural issues, such as the absence of a unified health system, also prevent us from responding sufficiently to some of the biggest challenges we face, but the factors above can help us understand why we routinely choose to neglect the structures that could so obviously help us apply science to society’s greatest benefit.

    While some elements influencing how we take in information are age-old, others have more recently emerged, creating an increasingly complicated landscape we must learn to better navigate.

    For instance, it should be news to no one that the rise of social media has resulted in the spread and reception of misinformation at a scale that would have been unimaginable even a decade or two ago.

    (Misinformation is a broad term describing false or misleading information. A person may spread misinformation without understanding that it’s inaccurate. Disinformation, however, refers to false information that is intentionally created and spread to mislead. Since it can be difficult to determine the origin of and intention behind false information, I will typically refer to it as misinformation. If a source is known, I may refer to that false information as disinformation.)

    A study published in the journal Science investigated the circulation of true and false news stories on Twitter from 2006 to 2017 and found, "Falsehood diffused significantly farther, faster, deeper, and more broadly than the truth in all categories of information."¹⁰ In the social media realm, it’s harder than ever to find the signal amongst the noise.

    Some people are even creating more noise as their business. In mid-2016, BuzzFeed editor Craig Silverman noticed a strange new phenomenon online: More than 140 bogus news sites had popped up in Macedonia, propagating misinformation about the U.S. presidential election.¹¹ It turned out most of the creators of these sites were teenagers who didn’t care about the American election or Donald Trump at all. They were just out for a buck and had realized the posts that went viral on Facebook and drove massive volumes of traffic to their sites were those that targeted Trump supporters by fabricating stories or spreading misleading information. Silverman called this misinformation fake news, coining the original meaning of the term.

    As sociologist Zeynep Tufekci wrote, "the new, algorithmic gatekeepers aren’t merely (as they like to believe) neutral conduits for both truth and falsehood. They make their money by keeping people on their sites and apps; that aligns their incentives closely with those who stoke outrage, spread misinformation, and appeal to people’s existing biases and preferences. Old gatekeepers failed in many ways, and no doubt that failure helped fuel mistrust and doubt; but the new gatekeepers succeed by fueling mistrust and doubt, as long as the clicks keep coming."¹²

    Trump later co-opted the term fake news but, in doing so, added an additional meaning. During his presidency, he relentlessly referred to the press as the enemy of the people and declared any type of news that was unfavorable to him to be fake news, regardless of whether it was true.¹³ His administration further attacked the idea of objective truth by introducing the term alternative facts to describe false information they had issued that was easily contradicted with evidence.

    Soon fake news comprised three aspects: the actual crisis of false information circulating on social media; shoddy reporting; and accurate news that conveys facts you personally find unflattering or don’t wish to be true. It would be easy to blame Trump for catapulting us into what many refer to as the post-truth era, where the idea of a single, agreed-upon truth is questioned, and emotions and beliefs supersede facts when determining which truth to endorse. But this notion has a decades-long history in the U.S.

    The concept that truth is subjective has roots on the opposite side of the spectrum in the liberal academic philosophy of postmodernism, which became popular in the 1970s and 1980s. The word postmodern dates back to Jean-François Lyotard’s 1979 book, The Postmodern Condition: A Report on Knowledge, which sought to diagnose this transition of American thought away from objective truth toward subjectivity, rather than advocate for the phenomenon.¹⁴

    Whether postmodernism itself was a diagnosis of a growing American phenomenon, the origin of it, or both is less relevant than this fact: The myth that truth is what we want it to be was festering beneath the surface for decades and finally erupted into public consciousness during Trump’s presidency. Now we’re experiencing the bitter consequences of a post-truth society armed with incredibly powerful tools for disseminating misinformation.

    A variety of evidence shows that the ceaseless flood of information from every direction and in every format is chipping away at our ability to focus our attention, which can prevent us from moving past surface-level judgments to thinking deeply about the issues that matter most.¹⁵ In fact, an entire economy has sprung up with the distinct goal of nabbing our attention for power and financial gain by preying specifically on our diminishing capacity to focus. Those who seek to manipulate us have recognized that they can easily do so by concocting simplified narratives, knowing that few people will commit the brain power necessary to identify these claims as vastly reductive and misleading.

    Another feature of the societal landscape that makes us vulnerable to the classic strategies used to distort information is the growing devaluation of expertise. The dismissal of expertise is hardly new; experts have been distraught over this insult for a long while. In 1980, scientist Isaac Asimov wrote, "There is a cult of ignorance in the United States, and there always has been. The strain of anti-intellectualism has been a constant thread winding its way through our political and cultural life."¹⁶

    The denial of expertise has reached a fever pitch in recent years, though. Many observers have noted that the tenor and quality of public dialogue have degraded, despite the wealth of information now available to us on just about any subject. In his 2017 book, The Death of Expertise, Tom Nichols noted, "The death of expertise…is a different problem than the historical fact of low levels of information among laypeople. The issue is not indifference to established knowledge; it’s the emergence of a positive hostility to such knowledge. This is new in American culture, and it represents the aggressive replacement of expert views or established knowledge with the insistence that every opinion on every matter is as good as every other. This is a remarkable change in our public discourse."¹⁷

    Research has demonstrated that the worldview of rejecting expertise, also known as populism, is correlated with conspiratorial thinking.¹⁸ This attitude has consequences. For instance, evidence has shown that populism is associated with conspiratorial beliefs about COVID-19, and that those who held such beliefs were more likely than those who did not to disregard the guidance of public health officials during the pandemic.

    After years of reading about the devaluation of expertise without grasping why someone would want to ignore the professional opinions of people who spend their careers working to understand specific subjects, I came across an essay by novelist Richard Russo. In it, he explored his struggle to come to terms with the fact that several of his beloved family members had made poor, and even deadly, medical decisions by ignoring the advice of their doctors and public health officials in favor of pseudoscience:

    …the irony is that many people who behave foolishly consider themselves to be in the know, to be in possession of inside knowledge; access to it, for them, is a point of pride. The lesson that life seemed determined to teach my father on a daily basis was that he didn’t know anyone worth knowing, that he had no strings to pull. Because he had only a high-school education and worked with his hands, America seemed determined to make him understand just how unimportant he was in the larger scheme of things. So the possibility that in this particular instance he actually did know somebody worth knowing had to be very rewarding. And to his credit, he didn’t want to hoard his good fortune.¹⁹

    The dismissal of expertise finally made a semblance of sense to me as a by-product of American society, of our eternally inequitable systems.

    When it comes to science, however, there is a fundamental flaw in the dismissal of expertise and in the misconception that truth is subjective: there is an objective truth to how the physical universe operates. And it takes extensive training and experience to learn how to study the physical world in a way that can move us closer to understanding that reality. While some aspects of life are subjective, how the natural world functions is not. It works the way it works. Science is not truth, and the experts who perform it don’t serve up certainty. But history has shown us that science gives us a better shot at gleaning the realities of the natural world than any other human system.

    Today, growing distrust in the very existence of truth and in the value of expertise places us in a particularly precarious position of vulnerability to the tried-and-true strategies of players who distort information.

    Many players simultaneously reinforce and take advantage of, or at least benefit from, this susceptibility. Industries, interest groups, and individuals relentlessly distort scientific information, sometimes due to unawareness and sometimes due to deceitfulness. These players and their motivations have fallen into the same three broad categories for centuries: those who stand to gain power and/or profit if evidence supports their cause; those who stand to lose power and/or profit if evidence does not support their cause; and those who stand to benefit from stoking and disseminating scientific controversies, whether real or fabricated.

    They are widespread and often found in the following groups, frequently in collaboration: industry; political parties and associated think tanks; religious communities; sloppy and rogue scientists; predatory journals; popular media; and social media and its users.

    Of course, everyone has something to gain or lose, but that doesn’t mean they’re all willing to manipulate scientific information to get what they want. Cynicism toward any person or group who could possibly benefit from spreading misinformation won’t get us anywhere. Healthy skepticism can.

    And yet, our troubled relationship with information often precludes us from being able to distinguish players who twist science for their own benefit from those who apply and communicate it accurately and honestly.

    We use three general methods for gathering knowledge about the physical world, referred to as ways of knowing: authority, logic, and science.²⁰ When we examine authority and logic, especially in the context of our brains’ myriad processing shortcuts and limitations, we can see how these methods often fail to generate reliable information about the natural world. We can also see how science takes a step beyond our innate methods of knowledge acquisition.

    Much of the information we gather about the material world comes from sources of authority—that is, people and institutions in positions of power, such as political, community, and religious leaders; teachers; experts (whether credible or self-proclaimed); and family members. Ancient and sacred texts can also serve as authorities, as can perceived revelations from God or gods.

    No one has the time or ability to learn all we’d like to know about how the world functions, so we place our trust in authority figures to understand how some things work and to relay relevant information to us.

    Sometimes obeying and leaning on the knowledge of authority figures is essential. If my kindergartener didn’t obey my authority when I told her to stop doing something that could cause her injury, even if I’m sometimes wrong about the real risk of harm, she would often find herself in perilous situations because of her lack of life experience.

    But gaining knowledge from authority figures has serious limitations, too. Authority and credibility are not necessarily synonymous, and authority figures can be subject to as many biases and logical fallacies as the rest of us.

    In the presence of an authority figure, though, authority bias often sets in: our tendency to be more influenced by an authority’s opinion and to assign it unwarranted weight or accuracy.²¹ Being powerful or charismatic or ancient doesn’t cause an authority to understand how the physical world works. But in the face of power and enchantment, "Because I said so" becomes a magic phrase, when it should actually serve as a red flag.

    "…as humans, we seek certainty. Demagogues, dictators, cults, and even some religions offer it—a false certainty—that many find irresistible," wrote neuroscientist Daniel J. Levitin.²² Less nefarious sources of authority, such as those in our daily lives to whom we look for explanation, often provide us a false sense of certainty, too.

    It’s comforting to hear someone speak with certainty, especially when the subject is unfamiliar to us. But accepting information based solely on authority leaves us without a solution when authorities contradict each other. If we accept information on authority alone, we have no way to evaluate which source—if any—is correct, leaving us perhaps to accept the explanation of the one who shouts "Because I said so!" the loudest.

    Most of us probably take some type of information on authority every day. It’s a natural and necessary part of how we try to decipher the world, but it can hamper our ability to make accurate sense of it.

    Employing logic, or reasoning based on strict rules, is another way we gather knowledge about the material world. For instance, deductive reasoning allows us to draw a specific conclusion based on a general premise. All dogs are mammals. That animal is a dog. Therefore, that animal is a mammal.

    Logic is powerful. A conclusion is true when the initial premise is correct and the reasoning that follows is valid, and solid logic allows us to define rules and identify correlation. However, logic is limited. It can’t determine causation—a relationship in which one event, variable, or state brings about a change in another—and it falls apart if the premise is incorrect.

    We can also gain knowledge through critical thinking, which is often considered a type of informal logic. Experts disagree over the exact definition of critical thinking, but it entails skillfully gathering, synthesizing, and analyzing relevant information to guide beliefs and behavior.²³ Thinking critically requires a constant, active commitment to this particular way of making judgments about the world around us.

    One of the main challenges of critical thinking, however, is that whether we’ve been trained to think critically or not, we often fail miserably to do so, or at least to do so consistently. Social psychologist Jonathan Haidt described one reason for this failure in his book, The Righteous Mind: "Intuitions come first, strategic reasoning second."²⁴ In other words, instead of trying to gather and analyze information in an unbiased way to guide our conclusions, we often do the opposite. Our gut tells us we feel a certain way about something, then we work to drum up evidence to rationalize our feelings. This process is known as motivated reasoning. Cherry-picking evidence to try to prove our point of view instead of using evidence and reason to determine our point of view is antithetical to critical thinking. Rationalizing how we already perceive the world to work obstructs our ability to determine how it actually does.

    Our brains’ information processing shortcuts and logical fallacies also contribute to our incompetence at consistently thinking critically, or doing so at all. Even when we go through the correct motions of collecting evidence and trying to analyze it, we often stumble in our interpretation, because our biases can skew the inputs we seek to guide our decision-making.

    Science takes our ability to explore the material world beyond the false certainty of "Because I said so" and the limits of our biases. It does involve logic and critical thinking, but one characteristic that sets science apart as a way of knowing is that it allows us to test our ideas about the natural world against the natural world itself. In science, wrote physical anthropologist Eugenie C. Scott, "nature serves as the final arbiter."²⁵

    Scientist E. O. Wilson explained, "Natural selection built the brain to survive in the world and only incidentally to understand it at a depth greater than is necessary to survive. The proper task of science is to diagnose and correct the misalignment."²⁶

    Of course science is far from perfect and has many limitations of its own.

    "Even scientific truth is merely an approximation," Carl Sagan observed.²⁷

    Science literacy can cut a powerful path through the maze of information in which we live today.

    Science is neither magical nor miraculous; it is the continuous, ever-evolving product of centuries of curiosity, creativity, and critical thinking, fused with arduous, meticulous, and often-tedious work. It is also the most powerful tool humans have developed to understand and influence the physical world.

    Throughout most of human history, from hunter-gatherer societies until relatively recent times, studies estimate that less than three-quarters of infants survived to their first birthdays and only around half of children survived to age 15—survival rates equivalent to, or even lower than, those of the young of several modern non-human primates, including orangutans and bonobos.²⁸ Infectious diseases caused the majority of these childhood deaths.

    It was only after medicine began to shift from a pseudoscience to a scientifically based field in the mid-1800s in Europe that infant and child mortality rates began to plunge in countries with access to this new form of medicine.²⁹ The drop was primarily due to the development and widespread distribution of vaccines, as well as other public health measures, such as basic sanitation and, eventually, antibiotics, which together greatly reduced deaths from infectious diseases. A century later, more countries around the world began to see declines in infant and childhood mortality rates as they finally gained broader access to these innovations, particularly vaccines.³⁰ Today, about 97 percent of infants globally survive to their first birthdays and about 95 percent of children survive to age 15.³¹

    Science and scientifically derived technology have transformed our world in ways that can feel incomprehensible to us as 21st-century readers. Scientific innovation hasn’t only ensured the survival of nearly all of our infants and extended our average lifespans but also improved our quality of life as a species in innumerable ways. Thanks to science, we can easily treat many acute ailments that, not long ago, caused a lifetime of suffering. We can spread out and live just about anywhere on the globe thanks to the innovation of climate control. We can limit our family size through contraception. We can selectively breed foods to grow where they are needed most to reduce famine and store them for long periods of time thanks to refrigeration. We can explore the world using numerous types of transportation. We can launch satellites into space, allowing us to communicate with people across the planet to share ideas and warn each other of danger. We can inform policy with real-world data to improve social issues, such as childhood poverty. We can quantify and compute. We can analyze and predict. We can influence many aspects of the world for our benefit—and to our detriment when we don’t regulate it appropriately. Think climate change and nuclear weapons. We also have the capacity to fix many of the problems of our own making, if we choose to.

    The U.S. is the global leader in science and technology research.³² Yet for the majority of Americans, scientific understanding remains stubbornly elusive. If we received much scientific education in school at all, it often revolved around what I call the scientific-method-as-a-recipe curriculum: memorizing the scientific method and lists of facts, replicating a few classic lab experiments, and calculating some equations. None of these activities teaches us much about the character or value of science; the varying quality of evidence the process of science can produce; or the state of the systems that allow us to apply science to better society (or prevent us from doing so).

    A poll by the National Science Foundation (NSF) asked American adults whether they understood the term "scientific study."³³ Only 3 in 10 respondents said yes. By this measure, most Americans are scientifically illiterate.

    But what does it really mean to be scientifically literate anyway? This is my definition: understanding the overarching concept of science as a system comprising numerous features that, together, help us to uncover how the world works. Making sense of science also means understanding the value of applying it effectively to tackle not only problems of the hard sciences, but of the social sciences, as well—gun violence, the opioid crisis, educational achievement gaps, health care access, maternal mortality, systemic racism.

    Even those who pursue advanced degrees in the sciences are often trained how to perform science in their specific fields without being taught how to conceptualize and describe it as a broader system.³⁴ For instance, a scientist might have a doctorate in immunology and decades of experience developing vaccines, but not be able to articulate to a relative around the dinner table why we should put any stake in science when sometimes it turns out to be wrong.

    "The generalist study of science has sort of faded from view as we’ve taken this hyper-specialized look at the world, where we’re drilling down largely at the molecular level of things," Keene Haywood, a senior lecturer at the University of Miami’s Rosenstiel School of Marine, Atmospheric, and Earth Sciences, told me.³⁵ Our ability to investigate the natural world in a more detailed way than ever before has led to astonishing progress in many fields. "But you lose sight sometimes of how this tiny thing fits into the bigger picture of what’s going on," Haywood said. This loss of the broader picture has important societal implications.

    Revisiting our own grasp of science, and perhaps even reimagining our relationship with it, requires considering what science really boils down to and why we need it in the first place. Part I of this book describes modern science and its limitations, along with its social value. This section also examines how humans developed the characteristics that constitute modern science to help correct our own deficiencies in perception, allowing us to make systematic, though not linear, progress in understanding how the world operates.

    Part II investigates what science is not and offers several case studies demonstrating how industries, interest groups, and individuals use common, and therefore easily recognizable, tactics to manipulate scientific information for power and profit.

    Part III analyzes the COVID-19 pandemic as a powerful illustration of how the information crisis in which we exist hinders our ability to address our most pressing problems. It also digs into the pandemic’s revelations about the nature of scientific innovation itself: how a paradigm shift in our understanding of the transmission of respiratory viruses occurred; how one crucial factor—money—gives science a shot at producing what we want it to; and how attention on long COVID has the potential to shift the ways we consider and manage other similar mysterious chronic conditions.

    Part IV offers hopeful, real-world opportunities to further explore, engage in, and advocate for science.

    Since the popular media often serve as intermediaries between the scientific community and the public, media literacy is a necessary component of science literacy. The Appendix offers a practical guide for becoming a more discerning and less vulnerable consumer, and/or producer, of health and science news. Like the Appendix, the sidebars scattered throughout the book contain essential information.

    Transparency is key to building a sturdy foundation of trust, a central theme of this book, so I’ll offer a few housekeeping notes. I’m a science and medical writer, not a scientist or physician. This is a book about science, not a book of science. I don’t design or implement scientific studies myself; I base my analyses of the issues addressed here on a combination of scientific evidence, the professional opinions of credible scientific experts within the specific subject areas in question, and my own academic and professional experience observing and interpreting these subjects.

    I’ve sourced the heck out of this manuscript, even at times when facts may be considered common knowledge. My intent is to give the information needed for readers to further investigate a subject of interest, fact-check my claims, or evaluate whether the quality of evidence I cite is strong enough to substantiate my analyses. (We may not always agree.) Sometimes I cite primary sources of information, such as scientific journals. I recognize that many people don’t have subscription access to read the full articles from these journals, but an abstract can still provide useful information. Other times, I intentionally cite secondary sources that I think did a fine job interpreting those technical primary sources for a general audience, and therefore may be more useful to readers. Sometimes I cite secondary sources to show how they failed at this task.

    When I quote sources I’ve interviewed, I’m citing their direct quotations. However, in the few anecdotes depicting my own brushes with science, I’ve constructed quotations from my own memory, and they are not exact.

    One premise of this book is that there are no silver-bullet solutions to complex problems. This book isn’t a silver bullet; it’s a tool to help readers slice through our information crisis to locate, absorb, and accept the evidence needed to make informed decisions about the things that matter most.

    Regardless of our educational and professional backgrounds, re-examining our basic understanding of what science really is and isn’t at its core, its value to us personally and to society, and the enduring strategies different players use to distort science for power and profit can push us to become more astute information consumers, as well as more effective communicators.

    While individual understanding in isolation won’t result in the systemic changes needed to address urgent problems such as mitigating climate change and quashing pandemics, collective action can—and it stems from a rising tide of informed individuals. Building a stronger foundation of science literacy can empower us to make more accurate sense of the world, and to interact with the world in ways that can improve our lives both personally and collectively.

    Part I

    Science:

    What, Why, and How

    1

    Modern Science:

    Whittling Away Our Uncertainty

    The 3-year-old sat in a child-sized chair at a child-sized table, furiously coloring a piece of paper with a crayon. A wall of standing adults surrounded her, watching. Waiting.

    "Is her lip swelling?" one asked. "I think her lip is swelling."

    "No, I don’t think so," another said. "Usually once it starts, it spreads pretty fast."

    The child squirmed. Trying to ignore her anxious audience, she doubled down on her artwork. A blister began to form on her finger and thumb where the crayon was rubbing her skin.

    I sat next to her in another child-sized chair, trying to tamp down the rising nausea, trying to stop shaking. My husband, standing with the other adults, put his hand on my shoulder.

    This 3-year-old was our daughter, Cricket. She was also the research subject.

    I steadied my voice for her sake. "Are you doing okay?" I asked.

    Cricket fidgeted, leaned toward me, and whispered, "I have to go potty."

    "That’s normal," said Edwin Kim, the allergist-immunologist overseeing Cricket’s care. Kim was also one of the co-investigators of this study, which was taking place at the University of North Carolina at Chapel Hill and four other academic medical centers around the country. "When their stomachs start to cramp, it can make them feel like they need to use the bathroom."

    The research team conferred. It wasn’t great timing; we were at a point in the process when things could slide rapidly downhill. But they decided she could go to the restroom if the medical team accompanied us down the hall. I would go in with Cricket. They would wait just outside the door. We were all tense, plastering on fake smiles to try to keep Cricket calm as we walked the short distance together.

    "Keep the door unlocked," one of them called, as the two of us entered the single restroom. "We’ll be right here if anything happens."

    They called through the door every minute or so to make sure we were fine. Cricket was holding steady.

    I won’t forget opening the bathroom door to find them huddled just outside, still awash in false smiles, one holding a little cup of antihistamine measured out with the correct dosage for Cricket, the other clenching an epinephrine injector. (In my memory, it was raised and ready to stab Cricket’s leg if things went south, but this image might be a figment of my imagination.)

    Hold it together, I told myself.

    We walked back to the room with the crayons. A few minutes later, things did go south.

    We are fortunate that Cricket is healthy in almost every way. She has passed each physical exam with flying colors. She is on a normal developmental trajectory. She is strong and athletic. But she does have an allergy to peanuts that could end her life in minutes by anaphylaxis.³⁶

    We were here because we could not accept that possibility, and, since there was a research team just down the road that had pioneered the field of oral immunotherapy for children with peanut allergies, we might not have to. Oral immunotherapy isn’t a cure for a food allergy; it has the potential to offer a layer of protection against a fatal reaction by desensitizing the immune system to an allergen. This process involves ingesting trace amounts of an allergen, then increasing that amount slowly over time until the body can encounter a decent amount of it without mounting a severe reaction, or any reaction at all. (Don’t ever try desensitization without the careful supervision of a medical provider experienced in administering this intervention.)

    Cricket appeared eligible on paper for the IMPACT study, a three-year, double-blind, randomized, placebo-controlled clinical trial for peanut oral immunotherapy.³⁷ She was within the study’s designated age range at entry of 1 to 4 years old. Her initial blood and skin prick tests indicated that her body likely mounted an allergic response to peanuts.

    Then there was the matter of the final eligibility test, which required her to demonstrate a clinical, or real-life, allergic reaction to less than the equivalent of about one and a half to two peanuts here in the clinic. Eliciting this physical reaction was necessary, because it’s impossible to determine the efficacy of the treatment if the patient doesn’t have an allergy in the first place. Blood and skin prick tests don’t always provide an accurate diagnosis, plus a small percentage of kids outgrow their peanut allergies.

    For this last step, Cricket would be given a series of tiny amounts of peanut flour at 15- to 30-minute intervals until she reacted. The team would track how much accumulated peanut flour it took to induce a reaction, as well as the symptoms the reaction triggered.

    It didn’t take much—just a few sprinkles. Soon after we returned from the restroom, Cricket’s lips and face began to swell. Her eyes grew red. She became congested and agitated. She gripped her belly and complained that it hurt. Sure enough, she was experiencing an allergic reaction, and there were enough body systems involved to assume she could be heading toward anaphylaxis. This is what we had been watching for. We had the information we needed to confirm a clinical reaction to peanuts.

    Before we had begun the process, the nurse had encouraged me to be the one to administer the epinephrine if it was needed. Thankfully, I’d never had to do it before. Many parents don’t give their kids an epinephrine injection during a dangerous reaction, because they aren’t confident enough to do it. This was a teachable moment. I had told the nurse I would do it if it became necessary.

    Now it was time. The type of epinephrine injector we were using, an AUVI-Q, was newer than the type I carried around whenever we left the house. It was the size of a deck of cards, and once you pulled off the cap, it spoke the step-by-step instructions out loud. As Cricket’s reaction escalated, the nurse took my hand and guided me through positioning Cricket on my lap, wrapping one of my legs over her legs to brace her tiny body, firmly pressing the injector into her thigh and holding it for 10 seconds as it pumped epinephrine into the muscle. (It created a kickback I hadn’t expected!)

    I realized afterward that my own body had produced so much adrenaline, the natural form of epinephrine, and my ears were rushing so loudly, that I hadn’t even heard the AUVI-Q speaking the instructions.

    After the injection and a dose of oral antihistamines, Cricket turned white and began to shake. Her facial swelling dissipated almost immediately. The researchers sweetly tucked the two of us into a hospital bed in a private room in the clinic and told us to rest. I gripped Cricket tightly, and we fell asleep together as the adrenaline and epinephrine dispersed and her antihistamines kicked in, leaving us both utterly exhausted. The team would monitor her for several hours to make sure her immune response to the allergen didn’t rebound.

    After we’d slept a short while, Cricket began to cough. This would be a precursor to our adventures down the road. In the context of an allergic reaction, coughing is not good. It can indicate a closing airway. The team rushed in, examined her, and determined, thankfully, that the cause of her cough seemed to be post-nasal drip triggered by the reaction, rather than a closing airway. This new symptom reset the clock, though, and Cricket would need four more hours of monitoring to ensure she was past any danger. It was late in the afternoon by this point, but our kind nurse stayed that evening to keep us under her watchful eye and out of the emergency department.

    Cricket had just won a coveted spot in the IMPACT trial. The next day, she would start the years-long process that we hoped, if she was assigned to the intervention group, could lead to protection against a serious or fatal allergic reaction, and to an improved quality of life by lessening the detective work and anxiety that accompanies every meal ingested every day. (You can read more about this experience, as well as the quality of evidence that clinical trials can produce, in Chapter 4.)

    This was science at work, whittling away our uncertainty of how the world functions. It was simultaneously grueling, frightening, and fascinating. It also conferred some risk but offered the potential for great reward: to grow our ability to guide the world—including our own bodies—toward individual and collective benefit.

    Characteristics of Modern Science

    Science literacy requires developing a broad understanding of science as a system. But how can we describe science in general terms when it takes countless forms and extends across a wide variety of disciplines, including medicine, physical sciences, social sciences, and technology? By examining how it derives its power of discovery from a fundamental bundle of characteristics, all working in concert.³⁸ These features include the following:

    Science is a systematic way of developing knowledge about the physical world, which can help us to predict and influence aspects of nature. It’s a tool that helps us to correct for our biases and the limitations of our human reason and senses, edging us closer to understanding how our world functions, beyond how we individually perceive or interpret it. Science is also the body of knowledge gained by this process. It doesn’t advance in a regular, dependable, linear way. Eventually, though, it pushes us toward deeper understanding.

    Science is more than the scientific method. The recipe of the scientific method—define a question, generate a hypothesis, gather data, analyze data, form conclusions—is the way scientists report studies in journals, but science doesn’t always happen so cleanly or directly. There are often many failures, do-overs, and pivots within the process before a study occurs that produces significant, publishable results. Sometimes, a subject dead ends with nothing to show for it. Other times, discoveries happen by accident rather than through purposeful trials. Either way, science encompasses much more than a simple to-do list.

    Science involves testing ideas about nature against nature itself. Through a process called reductionism, science breaks multifaceted problems into small, testable questions. Once we answer smaller questions, we can assemble that evidence to begin to answer larger questions. As biologist E. O. Wilson explained, science is "the search strategy employed to find points of entry into otherwise impenetrable complex systems. Complexity is what interests scientists in the end, not simplicity. Reductionism is the way to understand it."³⁹ Over time, information derived from testing small questions against nature builds into a network of facts, hypotheses, laws, and theories. (The following section, Words Matter, defines these important terms.)

    Science is limited to exploring the physical world; it cannot examine the supernatural. Science helps us to explore how nature works. It can’t tell us why we’re here, what our purpose is, whether God exists, or how God does or doesn’t operate in our world. To test an idea about the physical world, we need to try to control all the variables except the one we’re testing. But we have no way to control or test God’s existence or actions using science. When cultures attributed physical mechanisms they didn’t understand to God, they didn’t manage to advance their understanding of nature. Removing supernatural explanations from the workings of the natural world allowed us to progress in this understanding. Whether God exists doesn’t impact the reality that searching for physical, testable answers is the only method that has ever moved us forward scientifically. Unlike in religion, there is no belief required in science; scientific progress relies on the generation and analysis of evidence.

    Science doesn’t produce certainty—and that’s a good thing. This concept is perhaps the most challenging aspect of science to grasp and communicate, yet it’s one of the most important: Uncertainty is not a weakness of science but rather one of its greatest strengths. Why? Because it means science is an ongoing process that continuously self-corrects in the long run. Ed Yong, who won a Pulitzer Prize for his COVID-19 pandemic coverage for The Atlantic, described science as "…a slow, erratic stumble toward ever less uncertainty."⁴⁰

    Aside from some mathematical and logic proofs, certainty about the workings of nature is unattainable to us mere humans, even with the aid of science. Whether the scientific community considers a matter of science settled—no longer in question for the time being—depends on whether the relevant body of evidence demonstrates a high or near-certain probability of accuracy, typically with multiple lines of converging evidence. Once a matter is considered settled science, we tend to talk about it as a certainty, but when we do this we’re using certainty as a lay term, not a scientific one. In the backs of our minds, we have to acknowledge a minuscule bit of theoretical uncertainty or we would cease to be practicing science. This opening of theoretical uncertainty—even if negligible given the strength of evidence—is what allows scientists to correct their conclusions over time if new, compelling evidence emerges that makes us reevaluate findings in a different light.

    The good news is that science has shown us that the unachievable bar of certainty isn’t required to make meaningful improvements in our lives. Science doesn’t prove or disprove ideas about nature; it yields a body of evidence
