“An insufficient taste for evidence regularly brings out the worst in us.”
There’s a disagreement within the profession right now about how much cooperation to offer the new Trump administration. It’s a delicate matter to advocate for libraries, and to assert that we have a President-elect whose positions run contrary to library values, without taking a blatantly partisan stance. Political ideologies aside, it seems pragmatic, as we all work for a brighter future, to both hope for the best and prepare for the worst.
A lot of discussion is also continuing about yellow journalism and fake news. Whether or not misinformation played a role in the election outcomes, as a librarian, particularly one who got in early on the Wikipedia bandwagon and other democratized information formats, I’m troubled by the impact of maliciously spread falsehoods and by where we seem to be headed when it comes to seeking and respecting the objective truth.
Filter bubbles have taken some flak too. Prejudiced and wishful thinking that warps our view of reality, along with selection and confirmation bias, is of course contrary to an accurate understanding of the world, much less to scientific advancement and the accumulation of knowledge. But filter bubbles can not only be good; to some extent, they must exist. When I’m driving a car, I need to be focused primarily, even if only with continuous partial attention, on the task at hand, not engrossed in studying the architecture and landscape around me. We must tune out extraneous data that would cause a mental overload.
I certainly can’t follow, let alone verify, every news source in existence. I look at Google News, as one example, which has personalized the stories it shows me based on my past reading behavior, search history, and demographic profile. I have placed my trust in Google’s filters because they save me time. And that’s exactly, by the way, why they exist and make Google a success. I can type out a restaurant name in the search box, without specifying Milwaukee, and be confident that results from nearby are going to show up ahead of links to businesses in Florida. This support of my search behavior, which is either efficient or lazy, depending on how you look at it, goes far beyond the deprecation of Boolean operators.
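To make the idea concrete, here is a toy sketch of that kind of location-boosted ranking. This is not Google’s actual algorithm (which is proprietary and vastly more complex); the scoring formula, the 0.5 boost weight, and the 50 km cutoff are all invented for illustration.

```python
# Toy model of personalized search ranking: combine a base text-relevance
# score with a proximity boost, so a restaurant-name query typed in
# Milwaukee surfaces the nearby business ahead of a same-name one in Florida.
from dataclasses import dataclass

@dataclass
class Result:
    title: str
    relevance: float    # text-match score, 0.0 to 1.0
    distance_km: float  # distance from the searcher's location

def personalized_rank(results, max_km=50.0, boost_weight=0.5):
    """Order results by relevance plus a boost that decays with distance."""
    def score(r):
        # Proximity falls linearly to zero at max_km and beyond.
        proximity = max(0.0, 1.0 - r.distance_km / max_km)
        return r.relevance + boost_weight * proximity
    return sorted(results, key=score, reverse=True)

hits = [
    Result("Same-name restaurant, Florida", 0.9, 2000.0),
    Result("Restaurant in Milwaukee", 0.8, 3.0),
]
ranked = personalized_rank(hits)
# The Milwaukee result wins despite its lower raw relevance,
# because the proximity boost outweighs the difference.
```

The same structure explains the filter-bubble trade-off: the boost that saves me typing “Milwaukee” is the same mechanism that narrows what I see.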
By showing us links that we’re more likely to click on, Google is doing its best at delivering relevant hits. But what happens when their links go to pages which are not only subjective but untrue? Does Google have a responsibility to vet its listings, and how would that even be possible to do for billions of pages without reverting to a pre-web publication model? They do ban advertisements for some products, and are taking steps, along with Facebook, to weed out “illegal, misleading or deceptive” content.
No word yet on whether those exclusions will apply to Donald Trump’s lies. It shouldn’t be a question of journalistic integrity to call attention to the fact that, as with other politicians, the next president has been caught fibbing rather frequently. I suppose “stating falsehoods” might be a better description, since it’s altogether possible, and equally alarming, that Mr. Trump’s own filter bubble permits him to genuinely think that, for instance, he won the popular vote, despite the lack of any supporting evidence. Regardless of inner beliefs or intent, the resulting spread of mistruths is the same.
As for an information provider’s standards and obligations, the analogous fine line in librarianship is the classic distinction between censorship and selection. Given our finite collection development budgets, we have to choose which books to buy, based mainly on our constituents’ reading preferences, but perhaps also by evaluating the reliability of the information those materials contain. Yet when we offer unfettered Internet access in the library, not to mention the fact that most people nowadays get their information outside of the library, our ability to exercise that prior restraint disappears.
Fabricated news, although nothing terribly new, is an unfortunate consequence of how the web allows everyone to contribute. It breaks the traditional model of only letting peer-reviewed sources be publicized, but I sure don’t want us to go back to being content gatekeepers. I’m fine with the cat being out of the bag, although some may not see it that way. The allure of controlling information from a more powerful position, much like the appeal of esoteric search interfaces that perhaps gave us a greater sense of worth precisely because they weren’t end-user friendly, may leave some librarians yearning for more restrictions on what is allowed to circulate.
Not only is it virtually impossible to suppress the spread of information, many conspiracy theories thrive by fashioning themselves as “what they don’t want you to see.” A line in A Clash of Kings puts it another way: “When you tear out a man’s tongue, you are not proving him a liar, you’re only telling the world that you fear what he might say.” So teaching everyone how to constantly and critically evaluate an array of publications, which are always going to be out there, is the only recourse we have left. To be clear, giving equal time to the Flat Earth Society and other irrational positions is indeed a waste, but educating people about why and how we know the planets are round is a far better approach than censoring or even ignoring such misinformation.
Moreover, deciding what to trust should never be a binary way of thinking. I remember this from early information literacy frameworks, where Wikipedia was treated as forbidden fruit while scholarly sources were presented as gospel. The problem with this approach isn’t just that Wikipedia often has some “good enough” information, but that fraudulent studies have been published by credentialed experts in reputable journals, to say nothing of the James Freys and Jayson Blairs of conventional media.
The myth of absolute certainty regarding empirical facts can be dangerous. One of the more cogent early criticisms of Wikipedia was that it was never guaranteed to be authoritative. As stated by Jerry Holkins, “the collaborative nature of the apparatus means that the right data tends to emerge, ultimately, even if there is turmoil temporarily as dichotomous viewpoints violently intersect. To which I reply: that does not inspire confidence. In fact, it makes the whole effort even more ridiculous. What you’ve proposed is a kind of quantum encyclopedia, where genuine data both exists and doesn’t exist depending on the precise moment I rely upon your discordant fucking mob for my information.”
Nonetheless, the same constraint applies to any observable truth. On a falsificationist philosophy, what makes a fact scientific, as opposed to, say, an ethical position or teleological claim, is that it can be proven wrong. Every source of information that we rely upon, beyond our own physical senses as a basis for confirmation, may be deceptive and cause us to adopt unjustified or untrue beliefs.
So what are we to do in our efforts to get past all of the bunk out there? First, we can jettison an unscientific way of looking at things. If someone’s answer to the question, “Under what conceivable circumstances of evidence presented to you would you change your belief [that my candidate is a reptile, etc.]?” is “Absolutely nothing,” then they do not hold a scientific position, and engaging in a debate over the facts with them is pointless. As the saying goes, it’s hard to win an argument with a smart person, but nearly impossible to win an argument with someone who’s ignorant, especially if they wish to remain so. You also cannot wake up a person who’s pretending to be asleep.
Second, a healthy dose of skepticism doesn’t hurt. As stated by Peter Conn, “Skeptical and unfettered inquiry is the hallmark of American teaching and research.” Extraordinary claims require extraordinary proof. When something sounds too good to be true, be wary of swallowing it whole. Having ideas is okay, but it clouds your judgment when the ideas have you. This is something to keep in mind when listening to agreeable sources or dismissing unfavorable reports. As another president once said, “Sometimes it’s hard to tell it if you listen to the filter.”
A third, and surefire, way to uncover and eliminate your own preconceptions in political disputes is essentially to pretend the circumstances were reversed:
- If Donald Trump had lost, would he and his adherents and apologists be complaining that a recount was a waste of time?
- Or, do you have a greater propensity to believe that voting results were hacked just because your candidate didn’t happen to win?
- Would an Obama Hotel near the White House have been called a conflict of interest?
- Would Secretary Condoleezza Rice’s undisclosed use of an e-mail server kept in her home’s basement (which didn’t happen) have raised eyebrows?
Hypotheses are strengthened by making gutsier predictions beforehand, social justice is more fairly achieved when principles are designed under a veil of ignorance, and impartiality is the best vantage point in sifting through what’s real and what’s propaganda.