“I leave a lot out when I tell the truth. The same when I write a story.”
Is the truth still the truth once it has been edited? Every day we are presented with an abundance of information. We read it, we observe it, we watch it on TV. This is where we receive our cues on how to function effectively in the world. And we expect these cues to be, for the most part, true.
Oftentimes they are. We don’t turn left on red. We add the right amount of flour. We leave our message after the tone. This system works. Why on earth would we question it? But then there are the things that run beneath the surface, the things that can’t be reinforced by our own observations and behaviors.
These are the cases when we rely on outside information the most, whether it be from the media, academia, the government, or the general powers that be. We expect these sources to know what they’re talking about, and even further, to be looking out for the public’s best interest.
Of course, information can be deceiving. We interpret it based on our personal biases and assumptions about the world. And we remember it in the same way. In a recent New York Times article, Sam Wang and Sandra Aamodt explain this phenomenon:
“The brain does not simply gather and stockpile information as a computer’s hard drive does. Facts are stored first in the hippocampus, a structure deep in the brain about the size and shape of a fat man’s curled pinkie finger. But the information does not rest there. Every time we recall it, our brain writes it down again, and during this re-storage, it is also reprocessed. In time, the fact is gradually transferred to the cerebral cortex and is separated from the context in which it was originally learned. For example, you know that the capital of California is Sacramento, but you probably don’t remember how you learned it.
This phenomenon, known as source amnesia, can also lead people to forget whether a statement is true. Even when a lie is presented with a disclaimer, people often later remember it as true.”
So essentially, what we remember can be considered memories of memories of memories, compounding with the number of times recalled, and in turn reducing our ability to examine them critically. Self-deception is something that we all experience, often without realizing it. Is a fish story still a lie if we don’t recognize our distortion of the truth?
This extends to the information we receive from other people. Of the numerous complaints about media bias, governmental deception, and corporate control, whether framed as conservative, liberal, or otherwise, the primary concern is that the public is not receiving the truth. And this is huge.
But then again, what is “truth”? Is “the” truth different from “a” truth? Is truth the same as reality, or are these two completely separate concepts? And in any case, is truth with a capital “T” even a tangible, plausible construct that we can observe and understand? More on this below.
When we receive information from outside sources, we receive it in a very fragmented way. First, we need to trust that our source got the story straight. Then, we need to trust that they are remembering it in its entirety. This is extremely difficult, if not impossible. Because really, these sources are thinking in terms of their own biases, sorting details based on what they find important.
Of the many stories that could potentially be reported by the press, it has been said that 75% are never printed by the average local media outlet, with an even higher percentage for national media.[i] So what’s left out? Why is it left out? And is it important in the broader scope of being well-informed?
Then there is documentary film. Critics of the genre often argue that ethical “rules” for filmmaking require “objectivity,” or the presentation of “both” sides of an argument in an unbiased way (this is why Michael Moore is often discredited as a documentary filmmaker).
First off, what is meant by “both” sides? Where did these sides come from? For any given issue there are likely to be multiple perspectives. The belief that “black and white” analyses are accurate and thorough is extremely misleading. I can’t think of anything that doesn’t have some grey area, and generally “black” and “white” are two perspectives that lie on either side of a spectrum.
Secondly, is it possible to be unbiased? Documentary films present information as perceived by a specific ideology. Whether depicting historical events or attempting to persuade, this information is presented in a way that is relevant to the ideology of the filmmaker. Certain details are inevitably going to be left out. And even inadvertently, certain details are going to be overlooked entirely. Why did the filmmaker select the sources they did? Are these sources credible? Why is the footage edited in the way that it is? And of the hours of footage that were likely recorded, why was this hour or two chosen?
The more these questions are considered, the more a tangible argument can be identified from the source. And often, this argument is suggesting that this is the truth.
So, what is the truth? Is it objective fact? Or can there be subjective truths that exist as microcosms of a greater whole (in other words, truths that are relevant to a particular perspective)? If that is the case, then each merit granted to any given argument qualifies as a truth about the issue being addressed, combined with the other truths that dissenting opinions hold. This can get confusing and nonsensical very quickly.
Are truth and reality one and the same? Or can reality be socially constructed, serving as an assumption of what the truth is? What each of us sees as real is naturally going to contain some degree of fallacy. And if a society operates under these fallacies, then they inevitably become real to the people within that society. They serve as the cues by which people make their decisions.
Take Soviet Russia, for example (or any society operating under heavy propaganda, for that matter). History itself was falsified to benefit an ideological agenda, but to the people studying it, this was history, and it became real to them.
I know I shouldn’t trust what I see. And I know, to an even greater degree, that I shouldn’t trust what I hear. Because I personally am going to filter that information in a way that makes sense to my perspective, and as a result I am going to unconsciously remember it in a generalized way. Then, if I share that information with someone else, who knows what will be missing.
Ultimately, it’s like a game of telephone. While both deliberate and unconscious omissions still leave part of the original whole, is this still credible? And is it truthful?
[i] Pratkanis and Aronson in Age of Propaganda: The Everyday Use and Abuse of Persuasion—note that this news includes anything from spelling bee results to international armed conflicts.