Society / Justice and Law

Commentary: Is Truth Dead or Are We Inadvertently Killing It?


April 21, 2017

School of Theology and Ministry

Column by Mark Markuly, PhD, dean of the School of Theology and Ministry

In April, TIME magazine issued a simple, controversial cover with a screaming red headline against a black background.  For the second time in the magazine’s history, it asked a question on its cover with no image: “Is Truth Dead?”  TIME had run an identical cover back in April of 1966, but with a different question: “Is God Dead?”  Both articles received a great deal of attention upon publication, because each question touched a cultural nerve of its time that begged for attention.

Unfortunately, the quality and depth of the two articles are very different, and this alone says something about who we are as a national community.  The story from the 60s explored a contemporary ferment in theological thinking about a shifting global understanding of the concept of God.  This perceived shift in thinking about a Supreme Being resulted over the next 51 years in a dramatic alteration in Americans’ religious beliefs and practices.  TIME had the editorial insight to catch a cultural change as it entered the early stages of metamorphosis, an evolutionary transformation that happens very slowly when it comes to religious ideas.  Reporters and writers for the article consulted with a veritable “who’s who” of religious thinkers across the Jewish and Christian traditions.

In 1966, the TIME article opened a window of understanding into an important cultural transition about God and religion that most people had not noticed.  Indeed, as the article noted, an incredible 97% of Americans believed in God in the mid-1960s.  Denominational allegiance had risen in 1964 by 2% (even though the American population had only increased 1.5%), and 44% of those claiming religious affiliations attended church every week.  Moreover, experiments in liturgy, church structure and ministry in Protestant traditions, and the reforming spirit of Pope John XXIII and the Second Vatican Council in Roman Catholicism, suggested Christianity had entered a period of massive and hopeful revitalization.  So, to the casual observer, the nation’s dominant religion had no real concerns in 1966, and the concept of God seemed pretty secure as well.  The signs of the times certainly did not warrant wheeling God into an Intensive Care Unit.

The TIME article took readers into the heart of a complicated theological and philosophical debate among a growing number of Jewish and Christian thinkers.  The story looked at a canary in the coal mine: the idea of God—inherited from the Middle Ages—had become dreadfully ill by the second half of the twentieth century.  The concept struggled for breath and seemed to have slipped into a death rattle.  Some of the chief causes for this dying Western idea of the Divine were woven through the article: the processes of secularization and urbanization; the control of nearly half of the world’s population by the atheistic philosophies of totalitarian governments; the availability of meaning-making options other than religion and traditional God-talk and God-thinking; and the impact of scientific discoveries that explained many of the mysteries once ascribed to the direct action of an invisible Supreme Being.  Collectively, the article maintained, these forces had started eroding the edifice of the concept of “God” that had been slowly built up over thousands of years.

The 1966 TIME article notified American culture of a dramatic change in religious belief and practice that might soon arrive.  Throughout the 1970s and 1980s, religious affiliation and attendance started dipping, and institutional scandals chased others out of the pews in the 1990s.  By the end of the first decade of the new millennium, social scientists could track significant generational changes in religious belief: two generations expressed indifference about religion in principle, and the way many believers thought about the nature or purpose of God shifted along the changing contours discussed in the 1966 lead story.

The 2017 article on truth, on the other hand, is mostly a lamentation on the truth-distorting practices of Donald Trump, and the impact that the Internet, particularly social media, is having on the ability of women and men to recognize—or perhaps even care about—the differences between fact and fiction.  Too bad TIME magazine didn’t recognize the bigger story buried in its headline.  The relevance and identifying markers of our historic understanding of truth have been blurring over the past five decades, such that finding a common definition of the term has become problematic. 

Placing these two “red letter” articles next to one another reveals two other important causal forces that are part of our period’s challenges with truth.  First, in the half-century since the 1960s, American journalism has diminished in its ability to call out and interpret the complicated issues that are driving change in society.  Second, journalists themselves, and the entertainers who parody them, have played a key role in making truth more opaque by engaging in practices that blur the boundaries between politics, news reporting, and entertainment.

In 2017, TIME magazine did not provide the same level of analysis on the issue of truth as it did with the God issue in 1966. If it had, the story would have said a lot less about Donald J. Trump and more about the ancient foundations of Western thinking about truth.  The authors could have begun with the great mythical personifications from which our talk of “truth” descends—Aletheia in Greek and Veritas in Roman mythology. They could have continued with thinkers who sought to identify the features of truth and to build the capacities for finding and appropriating it, such as Aristotle and Thomas Aquinas.  For a more contemporary perspective, the story could have looked closely at the reasoning process that supports the hunt for truth, and taken up the debate over whether this is primarily a matter of the head or the heart.  The reporters could have examined the two great defenders of each side of the argument, Immanuel Kant for the former and David Hume for the latter, and followed both lines of argument from eighteenth-century philosophy to contemporary psychology and moral theology, where the debate continues at a fever pitch.  The article could have included thoughts on truth from religious leaders through the second half of the twentieth century, perhaps most notably popes John Paul II and Benedict XVI, both of whom bemoaned the West’s abandonment of the concept of “absolute truth” because of its eventual cultural and societal costs.  Many theologians, who attempt to defend or deconstruct this notion of “absolute truth”—in a time in which all seems increasingly relative to one’s perspective—would have gladly offered their insights to anyone asking questions.  Such a story might have noted that truth is elusive.  Sometimes it draws an adoring crowd, and sometimes it gets you crucified.  Truth penetrates to the heart of things, offering salve and strength to the human soul; it can also disrupt and fragment.
The truth will set you free, but only if you are willing and able to receive it.

It is sad that we are losing our ability even to talk about truth, perhaps the most important thing in our world, and nowhere is that loss more critical than in a democracy.  John Locke, one of the intellectual architects of American democracy, once commented on how difficult truth is to find: “It is hard to know what other way men (sic) can come to truth, to lay hold of it, if they do not dig and search for it as for gold and hid treasure.”

From a cultural perspective, one could assume we are not much into digging anymore.  But there are some who are too traditional to give up on truth, and some in the journalism profession have created International Fact-Checking Day.  Yes, we have such a day.  It is April 2.  Promoting the day, Angie Drobnic Holan, editor of PolitiFact, told the news show Reliable Sources:

I think we need to bring more attention to facts, reason, evidence, logic.  There is so much … misinformation out there right now (that) people need to think about what sources … they trust, how do they know if something is true or not? 

Drobnic Holan believes we have a desperate need for this day because inaccuracies, mischaracterizations of facts, and outright lies have never been more pervasive. But finding sources we can trust is becoming more and more difficult.  Even the comedian Bill Maher, who plays with facts for laughs, laments the loss of the “good ole days” when people got their news from credible journalism sources.  “Now they get it from chain emails, and chat rooms, and Facebook posts written by lunatics and sadists … The guy with the sandwich board used to be laughed at, now he’s linked to!”

To make it all more complicated, over the past nearly forty years the journalism craft, driven by the formats of cable and satellite radio news shows, has eroded its own legitimacy for delivering the truth by becoming a form of leisure entertainment rather than a profession dedicated to informing and improving citizenship.  In 1976, on the heels of the reporting on Vietnam and Watergate, about 72% of the American population trusted the information provided by journalists.  By 2016, trust in the media had dropped to 32%.  There are many reasons for this decline.  But certainly one of the reasons this trust in truthfulness has slipped is that journalists, particularly electronic members of the media, have allowed themselves to become part of fictional portrayals in television and film.

The process began as early as 1974, when the famed pioneer television journalist Walter Cronkite, who told the nation about the assassination of President John F. Kennedy and the first humans to walk on the moon, made a guest appearance as himself in one episode of the fictional television journalism sitcom “The Mary Tyler Moore Show.” In 1997, a half dozen CNN reporters, including senior anchor Bernard Shaw, played themselves in the popular film “Contact,” astrophysicist Carl Sagan’s story of humanity’s first contact with an alien culture.  CNN regretted the ethical lapse of seeking publicity in a fictional story and pledged to ban future appearances by its media personalities. But the moment of critical self-reflection did not last.  In 2010, no less serious a journalist than CNN International Correspondent Christiane Amanpour played herself in “Iron Man 2.”  The practice of nationally recognized journalists lending their “brand” to the believability of a fictitious story is now commonplace.  The political drama “House of Cards” has become a moonlighting venue for working journalists.  The show has drawn in CNN’s Ashleigh Banfield, John King, and Soledad O’Brien, NBC’s Kelly O’Donnell, MSNBC’s Rachel Maddow, and even Sean Hannity of FOX News.  In the latest DC Comics film, “Batman v Superman: Dawn of Justice,” we have another parade: Nancy Grace, Charlie Rose, Dana Bash, and blogger Andrew Sullivan.  Anderson Cooper plays himself giving a “live” report from his CNN desk on the fight between Batman and Superman and the appearance of the evil Doomsday.

The truthfulness barrier runs in the other direction, as well.  In the May issue of The Atlantic, Caitlin Flanagan offers a biting analysis of late-night comedy and “fake news” shows like Trevor Noah’s “Daily Show,” Samantha Bee’s “Full Frontal,” and John Oliver’s “Last Week Tonight.”  Flanagan worries about how the blending of fact and comedy has turned increasingly mean.

Trump and Bee (for instance) are on different sides politically, but culturally they are drinking from the same cup, one filled with the poisonous nectar of reality TV and its baseless values, which have now moved to the very center of our national discourse.

This is spilling over into other forms of entertainment, too.  A New York Times reporter recently gathered the writers of some of America’s most popular political television satires: “Scandal,” “House of Cards,” “Madam Secretary,” and “Veep.”  The writers bemoaned that writing fiction has become too complicated because fictional story lines are now less imaginative and outrageous than real life.  These shows often need to discard or entirely rework scripts because the idea for a segment appears in the real news before the absurdist fictional story can air.  It seems life’s imitation of fiction is creating complications for fiction’s imitation of life.  Lord Byron was right: “truth is stranger than fiction.”

How did we drift so far from a comfortable and shared sense of truth?  It is the side effect of the factors noted above and one other.  For much of human history, we have listened to only one voice of “truth”—the voice of the people in power.  Over the past 50 years, many cultures have become aware that there are multiple perspectives and voices, and our world is caught in the great task of deconstructing that once simple, singular perspective and replacing it with something much more complicated.  Since the issue appeared near Easter, an improved TIME article could have ended with the conversation between Jesus of Nazareth and Pontius Pilate before the Roman leader sent the Galilean preacher to the cross.  “I have come to testify to the truth,” Jesus told him.  “What is truth?” Pilate asked.  It seems many of us need to ask this question again.