Popularized during the 2016 election cycle by then-candidate Donald Trump, the term “fake news” has since evolved to encompass a broad range of patently false reporting that gains traction in popular culture. However, despite increased public awareness of the misinformation circulating in both liberal and conservative circles, fake news persists as a leading concern for politicians and journalists alike. A growing body of research suggests that susceptibility to fake news goes beyond changing social norms or the sudden pervasive influence of digital media. According to a number of recent studies, the tendency to dismiss evidence that contradicts a longstanding belief has cognitive, psychological and biological roots.
A new study from Ghent University in Belgium demonstrated the powerful effects of fake news. Participants in the study were split into two groups and given an identical biography about a fictitious nurse named Nathalie. The experimental group also received an extra paragraph explaining that Nathalie had been arrested for stealing drugs from the hospital at which she worked; in the control group, the extra paragraph was left out. After reading the entire biography, the experimental group was told that the material regarding Nathalie’s arrest was in fact false — she had not been arrested for stealing drugs. But later on, when asked a series of questions about Nathalie, the experimental group was significantly more likely to rate her unfavorably. These results indicate that mere exposure to fake news, stripped of any personal or partisan context, still has the potential to color a person’s viewpoint even after the information has been proven incorrect.
A Scientific American article published earlier this week attributes the experiment’s conclusions to differences in memory function. “In other words, some people are less able to discard (or ‘inhibit’) information from their working memory that is no longer relevant to the task at hand—or, as in the case of Nathalie, information that has been discredited.” The capacity to selectively accept or reject such information, the report claims, is likely due to varying levels of education and the development of critical thinking skills, which allow a person to better analyze potentially prejudiced sources, outcomes and opinions. With this type of understanding, individuals are more likely to reflect on and revise their own assumptions in light of new knowledge.
However, even when individuals can successfully overcome their subconscious biases, the task becomes considerably harder in group settings. Shankar Vedantam, NPR’s social science correspondent, says, “Very simply, being around other people seems to increase our propensity to believe in fake news.” New information, both true and false, is more easily reinforced when others can “confirm” its veracity, even if they have little authority on the subject.
Emma Horsfield, editor-in-chief of the Spartan Shield magazine, has experienced firsthand the dangers of prejudiced sources. According to Horsfield, student reporters are obligated to maintain a high standard of journalistic integrity for the stories circulating through the halls of Pleasant Valley High School. “We have an extensive editing process that each member of our staff follows,” she explained. “Each article goes through editing multiple times to be sure it is unbiased.”
The ramifications of fake news are far-reaching and may not be entirely solved by a Facebook algorithm. Instead, teaching people to be more aware of the bias appearing in their feeds could have a lasting impact on this issue.