“Doing your own research is a good way to end up being wrong,” insisted Washington Post analyst Philip Bump last week. Such a counterintuitive, even comical, claim cannot survive the close interrogation it richly deserves. The only reason for Bump and the Post to publish such a claim is that its implied application, outsourcing your own research to the mainstream media, enhances their own status.
The basis for Bump’s article was, at least at first glance, not quite as absurd as the conclusion he drew. He leaned heavily on a paper published December 20 in Nature titled “Online searches to evaluate misinformation can increase its perceived veracity.” The paper itself is significantly more informative and edifying than Bump’s armchair regurgitation.
The researchers ran five studies asking respondents to evaluate news articles at different points in time (either while the stories were breaking or months afterward). They selected the most popular stories from a variety of news websites, which they classified as liberal, conservative, or neutral, and also as “low quality” or “mainstream.” In some cases, they prompted participants to search online to help them evaluate the articles as “true” or “false/misleading.” They hired “six professional fact-checkers from leading national media organizations” to evaluate the same articles. “Across five studies, we found that the act of SOTEN [“searching online to evaluate news”] can increase belief in highly popular misinformation by measurable amounts,” they concluded.
The researchers suggested that “data voids” were the most likely explanation for their findings. In other words, “when individuals search online about misinformation, they are more likely to be exposed to lower-quality information than when individuals search about true news.” This theory blames the adverse finding on an unfortunate structural imbalance or erroneous search engine algorithms, rather than on the stupidity of average Americans, the explanation suggested by Bump’s headline.
Nevertheless, this explanation did not ring true. From my own experience “searching online to evaluate news,” it is often relatively simple to sort fact from fiction. Usually, patently false claims are only repeated by news sites of extremely low quality, and there are often sources suggesting them to be false within the first 10-20 results. It’s slightly more complicated when the false claims are spinning real details, but it’s also possible to correct for this by recognizing a publication’s slant, reading from sources on both sides of an issue, or skimming articles for the un-spin-able facts (such as direct quotes or statistics).
In defiance of Bump’s dictum, I decided to dig deeper and do my own research into the paper’s data. Fortunately, the authors had also made available 105 pages of supplementary information, including how they classified sources, which articles they selected, and how those articles were evaluated by their professional fact-checkers. (Publishing such information shows the researchers were trying to be intellectually honest with their audience.) Their data had several noteworthy issues.
First, there were some glaring issues with their classification of news sources. In dividing them into “mainstream” and “low-quality” categories, the researchers did not consider the actual quality of a site’s reporting, but only whether it was in “the top 100 news sites by US consumption” from 2016 to 2019. This inaccurate system classified The Huffington Post as mainstream and The Daily Wire as low-quality. To classify the partisanship of a website, the researchers “used scores of media partisanship from a previous study, which assigns ideological estimates to websites on the basis of the URL-sharing behavior of social media users.” Based on this proxy, they classified CNBC as a conservative-leaning news website, along with other centrist or split platforms such as The Wall Street Journal and Real Clear Politics.
The first indication that their classification system might have been inaccurate should have been the lopsided distribution: they sourced from 61 “low-quality” conservative sources, ranging from The Daily Wire to what are essentially no-name blogs, 58 “low-quality” sources of unclear ideological leaning, and only six “low-quality” liberal sources. Meanwhile, they acknowledged there were only 10 conservative sources in their top 100 list, including the aforementioned questionable cases.
One additional laugh (literally) was the inclusion of The Babylon Bee on their list of low-quality conservative news sources. This is funny because The Babylon Bee, a satire site, publishes the most trustworthy fake news on the internet.
A second problem with the data was a leftward bias evident among the professional fact-checkers, a problem distinct from the source-classification issues already described. If the researchers’ quality classifications were accurate, one would expect the fact-checkers to give similar scores to the liberal and conservative sources within each tier, mainstream and low-quality alike. That is not what happened.
The fact-checkers rated 51 articles from each category across the five studies. Their most common verdicts broke down as follows:
- Mainstream liberal: 50 rated “true” (98%) and one “could not determine.”
- Mainstream conservative: 47 rated “true” (92%) and three “false/misleading” (6%), with no most common answer on one.
- Low-quality liberal: 29 rated “true” (57%), 10 “false/misleading” (20%), and one “could not determine,” with no most common answer on 11.
- Low-quality conservative: 17 rated “true” (33%), 30 “false/misleading” (59%), and one “could not determine,” with no most common answer on three.
- Low-quality, unclear political tilt: 22 rated “true” (43%), 24 “false/misleading” (47%), and one “could not determine,” with no most common answer on four.
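For readers who want to check the arithmetic, here is a minimal Python sketch that recomputes these percentages from the counts above. (The counts are transcribed from the figures reported in this article; the script is purely illustrative and is not part of the study.)

```python
# Fact-checker verdicts per category, as reported above.
# Each category contained 51 rated articles across the five studies.
verdicts = {
    "mainstream liberal":       {"true": 50, "false": 0,  "undetermined": 1, "no majority": 0},
    "mainstream conservative":  {"true": 47, "false": 3,  "undetermined": 0, "no majority": 1},
    "low-quality liberal":      {"true": 29, "false": 10, "undetermined": 1, "no majority": 11},
    "low-quality conservative": {"true": 17, "false": 30, "undetermined": 1, "no majority": 3},
    "low-quality unclear":      {"true": 22, "false": 24, "undetermined": 1, "no majority": 4},
}

for category, counts in verdicts.items():
    total = sum(counts.values())  # 51 in every category
    print(f"{category}: {counts['true'] / total:.0%} true, "
          f"{counts['false'] / total:.0%} false/misleading")

# Articles on which the fact-checkers produced no most common answer:
no_majority = sum(c["no majority"] for c in verdicts.values())
print(f"no most common answer: {no_majority} of 255 ({no_majority / 255:.0%})")
```

Running it reproduces the shares above, including the 19 articles (7% of the 255 rated) on which the fact-checkers produced no most common answer, a point taken up below.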
Either conservative outlets were significantly more likely to publish “false/misleading” articles, or the fact-checkers were simply biased against the positions those outlets took. Anecdotally, there have been many well-publicized instances where fact-checkers dinged a substantively true report as “misleading” because it aided some conservative narrative, or because the fact-checker disliked the opinion presented. Therefore, the second option seems more likely.
A third problem with their data was the surprising degree of disagreement among the fact-checkers themselves. The researchers classified each article as “true” or “false/misleading” based on the most common answer of the four to six fact-checkers who reviewed it. That method implies that the fact-checkers did not always agree on whether an article should be rated “true” or “false/misleading.” Indeed, as outlined above, on 19 articles (7% of the whole) there was no most common answer. Even more shocking, the fact-checkers agreed unanimously on less than half (44.6%) of the articles they rated. The researchers acknowledged that study participants were less likely to rate a false story as true when more fact-checkers agreed on the verdict.
In summary, this study appeared to find that, after doing their own online research, people were more likely to believe that false stories, as evaluated by fact-checkers, were true, whether the stories came from high- or low-quality, conservative or liberal media. But the data contained numerous problems: erroneous media classifications, an apparent left-wing bias among the fact-checkers, and frequent disagreement over whether an article was true at all. The researchers’ conclusions cannot be more reliable than their data.
These data discrepancies underscore another issue with claims of rampant misinformation: the media in general, and fact-checkers in particular, seem confused about what is fact and what is opinion. Professional fact-checkers aren’t going to disagree over whether two plus two equals four, whether a certain person earned a degree at a certain university, or whether the Consumer Price Index rose 3.4% over the year ending in December 2023.
The types of “facts” that ignite disagreement among fact-checkers concern whether the way we teach math is rooted in white supremacy, whether that person is qualified for the office he holds or aspires to, or whether the economy has performed well under President Biden. Such questions are not facts at all, but rather premises, evaluations, judgments, and arguments. They should properly be categorized as opinion, and no professional fact-checker should rate a piece as “false” or even “misleading” because he disagrees with how a writer analyzes his facts. That isn’t fact-checking (a narrow discipline of parsing true and false claims) but a faulty appeal to authority, which cloaks a partisan argument in a facade of impartial objectivity.
In his piece, Bump acknowledged that people have different opinions, and he attempted to distinguish these from “claims,” a term that muddies the waters since it can refer to either facts or opinions. “There’s probably another factor at play, one not measured in the research: people who believe false claims often do so because those claims comport with their broader ideology or philosophy,” he said. “There is both a supply and a demand for nonsense or appealingly framed errors. Americans who have little trust in the system can easily find something to reinforce their skepticism. They often do.”
Bump claimed that some people embrace false claims to fit their preconceived opinions (what we might call a “worldview”). He complained about the threat posed by QAnon and other conspiracy theories (which, for him, can only be found on the Right). It’s only a small logical leap from his conclusions to the totalitarian implication that opinions which lead people to embrace false claims are themselves wrong, and that they should be belittled and suppressed, and the people who hold them reeducated.
Bump even criticized the House Oversight Committee’s investigation into the Biden family’s corruption. This insinuates that members of Congress shouldn’t do their own research either. They should leave it to the media professionals. Of course, the House investigation has exposed Biden family corruption that the media couldn’t be bothered to investigate.
In Federalist No. 10, James Madison addressed the same basic difficulty as Bump — different opinions driving people into political conflict — but came to a much different conclusion. “As long as the reason of man continues fallible, and he is at liberty to exercise it, different opinions will be formed,” he said. “As long as the connection subsists between his reason and his self-love, his opinions and his passions will have a reciprocal influence on each other; and the former will be objects to which the latter will attach themselves. The latent causes of faction are thus sown in the nature of man.”
Instead of positing Bump’s notion of a mutual reinforcement between wrong opinions and false claims, Madison more accurately described a mutual reinforcement of potentially flawed opinions and self-serving passions. Also in contrast to Bump, Madison rejected the censorship route as a “remedy … worse than the disease.” Instead of centralizing power and universalizing one opinion (one faction), Madison advocated for a multiplicity of opinions and factions, so that they would temper one another and mitigate their harmful effects. Madison would critique America’s current political climate not for being too partisan but for being too uniform.
Ironically, Bump has his own record of embracing false claims or wrong opinions because they fit into his preexisting ideological framework. He went off the deep end into the Russia collusion conspiracy. He ridiculed the New York Post’s reporting on Hunter Biden’s laptop, which the Department of Justice officially confirmed on Wednesday. He embraced COVID lockdowns long past their expiration date, arguing in April 2020, “There is a balance to be found between the negative effects of job losses and isolation and the rate of deaths of COVID-19. It’s just that these discussions are best undertaken by people such as Fauci during private meetings at the CDC and not between vitriolic television hosts and people with doctorates in unrelated fields.”
On the topic of doing your own research, Bump overemphasized the conclusion from a single study at the expense of the larger picture.
Here’s the big picture: when someone is deciding whether or not to do their own research on a topic, they are, of necessity, uninformed about that topic, and they know themselves to be so. They may have heard a fact, claim, or argument in passing, but they are unsure whether to believe it. Their three options are to remain in ignorance, to let others do the research for them, or to do their own research.
Bump’s own track record illustrates that professional researchers (like journalists) are not necessarily more likely to arrive at the truth than the average person. They aren’t even terribly likely to agree with each other about what the truth is, according to the data from the study Bump cited. In fact, letting the professionals do your research for you may be less reliable than doing your own, if they hold a hostile worldview or have a perverse incentive to mislead their readers. Chances are, that’s exactly what the people advising you against doing your own research are trying to do.
Can someone doing their own research arrive at the wrong conclusion? Sure. But that result is also possible with the alternatives. Sometimes, the reason why an idea sounds counterintuitive is because it’s mistaken.
With that said, there are some ways to make it less likely you’ll get taken in by fake news as you do your own research online. Here are a few tips I use in my own research:
1) Avoid questionable sources, or at least verify their information elsewhere. Have you ever heard of the website before, and has it been reliable in the past? Does the webpage attempt to manipulate your emotions through too much capitalization, lurid use of exclamation points, or overly flashy graphics? Would other people you know take the source seriously if you told them about it?
2) Cross-check information across sites. Are multiple sites reporting the same information? Do sources on the Left and Right both substantially agree?
3) Check for sources. Where did the article get its information? Does it provide a name, organization, or link for where the information came from? This is important for two reasons. First, it allows other reporters, and even people doing their own research, to follow the trail of information to verify that it is actually true. Second, if the information isn’t true (for instance, the claim that a congressman gave a particular quote, or that an organization’s report included a certain statistic), the alleged source has an opportunity to correct the record.
4) Look for facts, not narratives. Facts are the meat; narratives are the sauce. When you’re doing your own research, don’t be satisfied with a meal consisting of all sauce and no meat. That’s letting others do the research for you. The most important parts of any news report are the stats, actions, and quotes it contains. Without them, there’s no news. If you forget an article’s narrative but remember the facts, you’ve still read it successfully.
5) Ask critical questions. Does someone have an ulterior motive in publishing this information or creating this narrative? Could the information here be contradicted by anyone (if not, it’s probably not important)? What would change if this information were true or not true?
6) Keep an open mind about breaking news. Breaking news stories are harder to get right. Over time, more information comes to light, contradictions are resolved, and confusion is clarified. It’s fine to believe a story right away, but keep in mind that what we know could change in the next hour, day, or week.
7) Go to God’s Word to find out what is really true. The news can distract, disorient, and depress us. An un-Christian culture can marshal a seemingly impressive host of facts in support of an anti-Christian agenda. But God’s Word and what it says about the world is always truer than what the media writes. Sometimes, God’s Word helps us to interpret the facts according to a God-oriented moral framework. Sometimes it warns us not to believe reporting that can’t be true. So, when you’re doing your own research, be sure to spend time researching God’s Word, too.
Joshua Arnold is a senior writer at The Washington Stand.