We believe pretty much anything we’re told, which was great in a time when we were only ever told things by our close family and friends (who probably saw those things firsthand) but not so great now that we’re constantly being told things by strangers. Lots of people, including me, like to think of themselves as not so credulous; but there are so many things I’ve just accepted because my friends told me they were true, and so many things I’ve relayed from some random on the Internet–usually with enough epistemic honesty at least to mention my source and say “take this with a grain of salt”–without knowing whether they were backed up by evidence.
I’m thinking about this for two reasons. One is the election we’ve all already thought too much about–but at the same time can’t safely stop thinking about: like, the second I wanted to stop thinking about it, Trump named Steve Bannon chief strategist. I was used to thinking of the Breitbart people as ideological enemies of about the same calibre as 4channers, so this is almost more surreal than the presidency itself. Facebook fake news and blatant untruths in the far-right media were the subject of a lot of journalism leading up to the election. Leftists like to think we read the true media, but how much can we really pat ourselves on the back? We can think: I hear and believe things that are also heard and believed by NPR. But what are my reasons for believing NPR is a good source, and are those reasons any different, or any more objectively rational, than the reasons of somebody who trusted a Facebook news item about how Hillary went in as a commando to kill American personnel in Benghazi with nothing but a combat knife and a can of hairspray?
The other reason is that I’ve been reading about Yoruba epistemology in Knowledge, Belief and Witchcraft by Hallen and Sodipo. This happens to be my first introduction to formal epistemological philosophy outside a semantics classroom, so I’m sure some of the stuff I’m astonished by is old hat–the Quine chapter completely blew my mind right out of the gate, and there’s another blog post to be written about that. The writers complicate the traditional translations of mo (usually rendered as “knowing”) and gbagbo (usually rendered as “believing”). Mo, ideally, is seeing with one’s own eyes. Gbagbo they translate as agreeing to accept what you have heard. And of course there are many other languages where the speaker marks propositions to tell the hearer how they came by the information–hearsay, seeing with your own eyes, inferring from context, heard it on Facebook.
Once in an anthropology intro class my TA tried to explain the point of view of some Indigenous opposition to the Land Bridge Theory: “There’s different ways of knowing than just the scientific one.” The snide white jokester she was talking to was like, “You mean the truth?” I believe in the scientific project and in objective knowledge but, I mean, come on. That guy doesn’t mo that people moved from Siberia to America tens of thousands of years ago, he just gbagbos it. (From what I’ve read, I gbagbo it too, but what the hell, I wasn’t there.) Just accepting what you were taught by one person, as opposed to what you would have been taught by a different person if you had been born in 1200 in a Kwakwaka’wakw settlement, doesn’t necessarily make your thing true and their thing fake. Believing that it does seems to be the root of a lot of unnecessary smugness–yeah, yeah, including my liberal smugness. Either way you’re just accepting what you hear unless you’re willing to start digging much deeper, and digging deep on everything you hear isn’t really feasible. (Though a Year of Living Biblically–style epistemology project where I fact-check everything I hear for a year sounds surprisingly interesting.)
At the same time, it’s hard to dislodge your ego and take seriously material that contradicts your ideology. You have to kind of virtualize your belief set and run their OS alongside yours, so you can safely delete it when you want to go back to normal. And I don’t think I’d want it any other way–I don’t want to internalize material I believe to be false–but are my reasons for not wanting that even valid? How would I know if my belief was false and the OS I installed just to try out a few programs was true?
I feel glumly Sapir-Whorf about the idea that maybe languages with evidential markers encourage their speakers to avoid sorting material they hear into the same bin as material they experienced firsthand. That’s definitely a topic for further research, though, not a real theory. Don’t quote me. Definitely don’t think of that as something you know.