“Where is the Life we have lost in living?
Where is the wisdom we have lost in knowledge?
Where is the knowledge we have lost in information?” (T. S. Eliot, The Rock, 1934)
The question “What is truth?” has vexed theologians and philosophers for many centuries. According to the Biblical story, even Jesus did not have a spoken answer to the question when asked by Pontius Pilate at his trial. Scientifically, it is clear that human brains “short-circuit” at times when trying to sort truth from fiction, and the title of this post is one of these “logical fallacies” that plague us all from time to time:
“This is disturbing; therefore, it is not true.”
This fallacy is very easy to perceive in other people, but it is much harder to see in ourselves, which is why a lot of smart people get entrapped by it. Neurologist and science podcaster Steven Novella calls this the argument from personal incredulity. In other words, “I cannot explain or understand this, therefore it cannot be true.”
Of the two dozen logical fallacies documented in Novella’s excellent list with explanations, this is the one where I often find myself stepping back these days and asking myself, “Is this me? Am I crazy or am I one of the few sane ones around this place?” Every world and national event portrayed on television now seems to have two versions of “What is truth?” Do I have the “true truth” or is my mind playing tricks on me?
I’m going to delve into two tricky topics where, in my opinion “truth matters,” and yet social division remains strong. The first is the consensus scientific timeline of life on Earth, i.e., evolution and its implications. The second is the Trumpian social media assault on truth itself.
Education is a mindset; degrees are just credentials
Repeat after me: Correlation is NOT causation. But we need to start here with the demonstrable correlation between higher education and the rejection of literal interpretations of scriptural accounts of Earth’s creation across different religious traditions. The Judeo-Christian account in Genesis, for instance, really consists of just a few verses from the late Bronze Age that are more poetry than testable hypothesis, which, needless to say, was “not a thing” 3000 years ago.
The hostility between the fundamentalist religions of the world and science education itself is longstanding, although at least up through the time of Isaac Newton (1642–1727), the “great thinkers” were often simultaneously scientists, mathematicians, philosophers, and theologians. I wrote in more depth on this topic earlier this year. And yet the battle over science education continues in the present. Just last week the Ohio state House passed legislation that attempts to protect students who give religious answers to science class questions from being marked wrong.
I see this battle as more mindset than simplistic cause-effect. Last year I wrote a series of posts in which I argue that, ever since public libraries became available, college-level education has always been free. Educational institutions market and sell credentials, which have some correlation to price paid, education gained and social status, but a good reading list from your local library, cheap used textbooks from Amazon, and low-cost online e-books can make you as well-educated as most four-year college graduates. And so, education is mostly mindset, especially the desire to learn more broadly the history of human attempts to make sense of the world in which we live.
I assert that the majority of “people in the pew” do not realize that most biologists, and even theologians, at, say, the top Baptist universities work under the assumption that Darwinian evolution is “truth.” The biologist knows the beautiful “natural math” of DNA and its history. The theologian knows the very messy process through which Biblical writings reached their present form, even if both of them must publicly “nuance” their explanations to keep their positions.
I once encountered a physics professor from a Pentecostal college who stated his position this way: “God could have created life on Earth anyway He wanted, and He chose random mutation with natural selection” (i.e., Darwinian evolution without using the “fightin’ words”).
I often point out that, for the first time in history, humankind has at its fingertips a device that can instantaneously find and display the tested, consensus foundations of human knowledge on just about any subject, and yet we more often use that device to take pictures of our lunch.
Education matters (or, from the point of view of a religious fundamentalist, screws you up). Let me suggest to my more literal-minded religious friends that God (no matter how you define the word) created both time (and lots of it) and the human brains that reason out logical, testable explanations of how the universe appears to work, otherwise known as science. Don’t bet against it. In the spirit of this blog, the consensus four-billion-year scientific timeline for Darwinian evolution on Earth is “probably true within a very, very high degree of statistical confidence.” Much of modern biology and medical practice (the kind that actually works) falls out from that explanation and testable verification.
Newton’s “truth” of calculus and planetary motion was not the end-all explanation of gravity, but it was close enough to get us to Einstein’s more accurate “next step along the way.” And yet, Newton’s basic formulas are still all the “truth” most of us need to figure out how far we can throw that ball. Genesis was an important “step along the way” from the Bronze Age, as was Darwin’s 19th-century explanation, even though he did not know the mechanism of DNA. The Universe still continues to “reveal itself.”
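That “how far can we throw that ball” truth can be sketched in a few lines of Python. This is a toy model under the usual textbook assumptions (flat ground, no air resistance), and the speed and angle below are made-up illustrative numbers, not anything from the post:

```python
import math

def projectile_range(speed_m_s: float, angle_deg: float, g: float = 9.81) -> float:
    """Horizontal range of a projectile over flat ground, ignoring air
    resistance: R = v^2 * sin(2*theta) / g (plain Newtonian mechanics)."""
    theta = math.radians(angle_deg)
    return speed_m_s ** 2 * math.sin(2 * theta) / g

# A ball thrown at 20 m/s at a 45-degree angle lands about 40.8 m away.
print(round(projectile_range(20, 45), 1))
```

No relativity required: for ball-throwing speeds, Einstein’s corrections to this formula are immeasurably small, which is the sense in which Newton remains “true enough.”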
The firehose of information
On the political and social side of information, let me suggest that the “firehose data effect,” both a benefit and a curse of the internet, complicates this search for “the truth.” Mostly unseen by us, someone is “curating” the news and other information we get online, in print, or on television, drawn from a flood of good and bad data sources. In return, we have become the product in the classic marketing sense, sold with great efficiency to advertisers who want our eyes and ears.
As a result, we more often let our news sources “choose us” instead of the other way around. Most “newsreader” software is tracking you for advertisers and feeding you the stories that tend to reinforce your own biases. I have long used customized newsreader software running on my own server that processes a lot of online stories efficiently while minimizing tracking, but I have to do much of my own daily collation and culling from over two dozen science, political, and local online news sources. I also find that I have to intentionally pull from sources I normally would not want to hear from in order to see what other people on the internet are seeing. Those sources have been conveniently filtered out of most of our internet browsing.
The effect of this collation on most people is the creation of a “truth bubble” that is hard to break. Fox News watchers likely will not hear most stories documenting in detail President Trump’s frequent “untruths,” and MSNBC watchers won’t fully grasp that their “obvious” political corruption grievances will roll right past millions of voting Americans without registering.
Organizations that attempt to “rate the truth” online, such as Pulitzer Prize-winning PolitiFact (of “Pants on Fire!” fame) or urban-legend-busting Snopes.com, are often the first to be accused of political bias themselves by those who disagree with their findings. Both, however, are important parts of my own “truth-checking” when I see something outrageous cross my news feed. I have also written in the past on why the many alleged conspiracies, large and small, need to be taken not only with a big “grain of salt,” but with some understanding of the math involved.
Learning is anticipating the future
In his seminal 1998 book Emergence: From Chaos to Order, John Holland defines basic biological learning as the brain “anticipating the future” with increasing probability. Antonio Damasio has commented that, even earlier in biological history, “the gut was the first brain.” Since our earliest days as single-cell creatures (see above), the “learners” simply had a greater probability of passing on their genes. In that sense, evolution is really just a probability function of tiny procreation advantages along the way, multiplied over many millions of generations.
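That “probability function of tiny procreation advantages” can be made concrete with a toy calculation. The sketch below uses the standard haploid-selection update from population genetics; the 1% advantage and the starting frequency are illustrative assumptions, not measured values:

```python
def frequency_after(generations: int, advantage: float = 0.01,
                    start: float = 0.001) -> float:
    """Toy model: relative frequency of a heritable variant with a small
    per-generation reproductive advantage s, under simple selection.
    Each generation applies the haploid update p' = p(1+s) / (1 + p*s)."""
    p = start
    for _ in range(generations):
        p = p * (1 + advantage) / (1 + p * advantage)
    return p

# A variant starting in 0.1% of the population with a mere 1% advantage:
for gen in (100, 500, 1000, 1500):
    print(gen, frequency_after(gen))
```

Run it and the variant climbs from vanishingly rare to essentially universal within a couple thousand generations, which is exactly the “tiny advantages multiplied over many generations” point, on a compressed timescale.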
On the good side, the fruits of basic knowledge of the world around us have brought billions of people out of the worst levels of poverty and disease. That knowledge is winning the battle over the worst abuses of sexism and racism in some parts of the world, and yet still struggling in others. The best education is an inclusive club, not an exclusive one. But for good or for bad, the human species has also “changed the curve” a bit in recent generations, as its most educationally and economically successful members have learned how to increase their material wealth and comfort by not procreating, which perhaps does not bode well for future generations.
And on that disquieting and admittedly elitist note, I need to get back to parsing through the firehose of my news feed…
- I recommend here Karen Armstrong’s great 2001 book, The Battle for God: Fundamentalism in Judaism, Christianity and Islam. Armstrong demonstrates that the more literal-minded versions of these three religions have more in common than you might think, and their hostility to science is much more recent than you might think.
- The actual language of the bill appears to be inherently irreconcilable: “Assignment grades and scores shall be calculated using ordinary academic standards of substance and relevance, including any legitimate pedagogical concerns, and shall not penalize or reward a student based on the religious content of a student’s work.”
- Holland, John H. Emergence: From Chaos to Order. Addison-Wesley, 1998, p. 53.
- My classic simplification of the math of evolution is that “if your parents did not survive long enough to procreate, then you probably will not either.”