
A Brief Background of Argument

Prologue: A living room, somewhere in mid-Missouri . . .

“Oh no; I don’t want to argue about it.”

“Why not?”

“What? Why would I? Do you want to argue about it?”

“Sure. You state your position, I’ll consider it. I’ll reply, and you think about my reply.”

“What?”

“Then you reply to my reply, and we’ll keep going until we get it worked out.”

“You mean you want to talk about it?”

“Yes; arguing is talking about it. Don’t you want to talk about it?”

“Yeah, that sounds fine, but I didn’t want to get into any drama with you shouting, and me shouting back, and you mentioning my mother, and it’s all very unpleasant.”

“Okay, let’s not do that, but that’s not really arguing in the way I mean it. Arguing is necessary for progress. We articulate why we think what we think, revising our views as necessary, until we’ve reached some resolution.”

“Fine, then, let’s argue about it.”

—–

While this conversational scenario might not seem very likely, what would our culture look like if it were? What if, instead of avoiding argument because it seems to provoke conflict, we approached disagreements as chances to explain ourselves and understand others more completely? The word “argument” has taken on all sorts of undesirable connotations in popular culture in the last thirty or forty years, but why? How did we get here, and how do we get out?

Is getting out even necessary?

In short, yes; yes, it is necessary. We, the faculty of MACC, want you [1], the students of MACC, to argue; we need you to argue. Our futures – mine, yours, everyone’s – depend on it, frankly. Of course, as we will discuss in the following handbook, we are talking about “argument” in a very particular and limited context. If you want to learn how to out-scream your opponent, this course is not designed for that sort of thing.

Some of the lessons we will be covering here are very old. In fact, many of them are more-or-less unchanged since the Greeks and Romans. This antiquity is important because it means these ideas have withstood the test of time; they have been tried, found useful, and kept for the next generation. By learning to argue, you are stepping into a tradition centuries old. Our word “argue” comes from the Latin verb arguere, which meant “to make known,” and from argumentum, whose meanings included “discussion.” So to argue is to discuss and make known. Those are important goals.

A Brief History of the Nature of “Authority” and Argument[2]

Here we are in the early 21st century, a time when many students have an idea about argument that might conjure up bloviating talking heads on news talk shows or people throwing chairs and shouting personal invective at each other. It is also possible students imagine people who simply refuse to engage, claiming perhaps a personal affront and withdrawing altogether. This image is unfortunate. We need to get beyond that unpleasantness, but to do so productively, we should spend a moment trying to figure out how we got here.

Let us agree that there was never really a “Golden Age” of argument during which everyone was nice to each other and always kept things professional, dispassionate, and productive. This idea that things (whatever “things” the speaker is evoking) used to be “better” is a common fallacy[3], and we should guard against comparing the present to a probably fictional rose-tinted past. We need to discuss and make clear.

Having said that, though, there are some significant ways in which argument was different “once upon a time.” If we go back to those Classical Greeks mentioned above, we discover that there was a dispute between a group called the Sophists and the group that eventually cohered around the thinking of Socrates[4]. The “Sophists” were so called because of their concern with the knowledge and skill of the argument process; since sophia is the Greek word for “wisdom,” that is the name that has stuck. To the Sophists, the only important thing was using the tactics of argument (the knowledge of the skills, etc.) to achieve their goals, whatever those were. What mattered most was that the arguer “won” the argument, regardless of what position that meant taking. If you paid a good Sophist (say, the confusingly named Isocrates) to argue that your company should be allowed to dump toxic sludge into a stream that runs through a nature preserve for adorable bunnies and past a playground for orphans, then Isocrates would be happy to make that argument for you with all the skill he could muster, bunnies and orphans notwithstanding.

Socrates and his adherents, though, believed in what we might call “The Truth,” with a capital “T.” For Socrates, as for Plato and Aristotle after him, there existed the Right Answer to the toxic sludge question, and our job as arguers was to find it. It was not a matter of winning and losing but of upholding the Truth. We could, through a series of interrogations, discover the Right Answer, and then everybody should follow it.

The Sophists suggested that “The Truth” was simply whatever the arguer wanted it to be and was often suspiciously aligned with the weak against the strong. Why, for instance, is it wrong to simply knock somebody down and take his lunch money? The Sophists would argue that it might be wrong sometimes, but it might also be the right thing to do at other times, depending on the situation or contingency. The Socratics said that some things – like taking advantage of another’s weakness for our own ends – were always inherently corrupt and Wrong.

Perhaps because the Socratics had the better publicity department in the form of Plato’s dialogues, their view of Right and Wrong eventually became the dominant thinking of the Western tradition. You are probably aware of this because knocking someone down and taking his lunch money just seems wrong (it does, doesn’t it?), just as other things, like causing harm to children, or athletes punching women in elevators, or terrorist attacks, or dumping toxic sludge, also are easy to label as Wrong.

The Sophists did not lose out entirely, of course; if you or a loved one has ever needed a defense attorney, it is possible you are acquainted with a legal system that requires a zealous defense on behalf of the client, notions of “right” or “wrong” set aside for the moment. For those who have ever wondered, “How could Attorney X defend this obviously guilty and reprehensible violent criminal?,” the answer is that the Sophistic tradition lives on in an adversarial legal structure, which forces the State to prove its argument rather than merely relying on public distaste or emotion.

But back to the Socratics. Many philosophers and theorists use the term “Totalizing Narrative” to describe the kind of thinking that teaches Right and Wrong are absolutes that can be discovered or revealed. A totalizing narrative, as the name suggests, is a story that explains everything. Most religions, for example, are totalizing narratives – and while the Hebrew tradition of a law-based religion is clearly older than Socrates, it did not merge with the Classical tradition until roughly six hundred years after him. Once Christianity became the dominant mode of thought in the West, the answer to every question, it was taught, could be found in its precepts. That did not keep Christians from arguing with each other — often quite bitterly, in fact — but what they were arguing about was who had the better grip on The Truth; they never really doubted that The Truth existed and could be known. Those who knew The Truth, then, were the “authorities,” with all the implications of control or authorship that word suggests. The “author,” after all, is the one who gets to tell the story. Some people rose to a position of authority by a lifetime of brilliant argument, like St. Augustine (354-430); others did so through lives of extraordinary piety (St. Anthony, 251-356), which relied more on what we will later call the emotional appeal.

Even as orthodoxy evolved, and heterodoxies (or, as the possessors of The Truth called them, “heresies”) came and went, the essential idea of the totalizing narrative remained. Even the advent of the Reformation in the sixteenth century, which proved to split the church permanently, never questioned The Truth; the reformers’ great revolt was questioning who had it. Luther, Knox, and Zwingli also embraced the totalizing narrative, but their versions were different from what had come before, just as they were increasingly different from each other. Having so many editions of The Truth proved to be psychically stressful and led to great conflicts, like the Inquisition and the Thirty Years’ War.

These horrors were almost inevitable when totalizing narratives were contradictory. If Person A is in possession of The Truth, but Person B believes something different just as absolutely, then one of them must be Wrong. Such error may even border on Evil, and it should be stamped out. Even small differences, like how many fingers one used to make the sign of the cross or in what language a service was delivered, could become the basis for persecution, arrest, or even death. Authorities felt their Truth was under question, and that could not be tolerated.

Even as the religious wars died down, and the Enlightenment came and went, the notion of the totalizing narrative clung to its dominance. During the 19th century, new theories arose to explain everything apart from religion, but Darwin’s “descent with modification,” Marx’s historical materialism (“the history of all hitherto existing societies . . .”), and Freud’s psychoanalysis still claimed to be The Truth.

Frankly, it would take World War II (1939-1945) to fully undermine the grip of the totalizing narrative. In many respects, that war was also a clash of competing versions of The Truth, like democracy and fascism, for example. When the war was over and the appalling death tolls were calculated, however, Western thought recoiled from the cost of The Truth. By the 1960s, an intellectual development called postmodernism had made a significant change: the question was no longer one of The Truth but of “the truths.”

Have you ever said or heard someone say, “Well, that may be true for you, but it’s not true for me”? If so, you have encountered postmodernism (or Sophistry, really, but they had been the bad guys for so long, most pretended this change was something new). The idea that something could be true over here but not true over there would not have made sense to a good totalizer. The Truth is the Truth, right? It is transcendent and stands outside of human influence and must be the same for everyone. But modern thinking embraces the idea of “living your truth” as something almost entirely dependent on you.

This is good, yes? No more oppression based on your beliefs (assuming they do not impinge on other members of society); no more Inquisitions; no more conflicts based on impossible-to-verify revelations from a mountain-top. No more “authorities” telling us what to do or what to believe about science or which movies are any good –

Wait, what? Ah, here we get to the difficulties for you, the modern arguer-to-be. It is almost certainly a good thing that we do not arrest people for being free thinkers anymore[5]. A “live and let live” attitude has no doubt kept the peace and allowed people to live alongside each other. But taken to its extreme, the “truths” may undermine any commonly shared assumptions in ways that make productive argument difficult, because argument requires, to some degree, an acceptance of levels of authority. The writer Isaac Asimov recognized this problem when he wrote, “There is a cult of ignorance in the United States, and there has always been. The strain of anti-intellectualism has been a constant thread winding its way through our political and cultural life, nurtured by the false notion that democracy means that ‘my ignorance is just as good as your knowledge.’” But some people do, in fact, know more about a given field than others, just as some modes of living are better than others.

It would be possible to argue about what makes a good father. Suppose that Father A works at a modest job for a modest income and spends lots of time in the evenings and on weekends with his kids; they do not have very nice things, and maybe do not go to the dentist as often as they should, but he is around to play with and read to his kids a lot. Father B works 60 hours a week and hardly ever sees his kids, but they have wonderfully straight and white teeth, and their college funds are paid up; the coaches and tutors make sure they can hit a curveball and play piano. Which one is the better father? We can argue about that, under the assumption that “better father” has an answer.

But surely if Father C puts out his cigarettes on his kids’ forearms, he is a bad father, right? A position that says there is no Truth, but truths, is going to have a harder time explaining why, exactly. “Well, you might think that makes him a bad father, but I don’t have to think so” is a difficult position to counter because there is little common ground. “You do your thing and I’ll do mine” is a good formula in many ways, but it works better at the interpersonal level than if we are trying to establish rules for social interaction. It is not the kind of approach that could be applied to behaviors like child abuse — one hopes.

Yet, we live in a time when many people are suspicious of claims to authority. Matters of settled science, like vaccines and climate change, are routinely disparaged, for example. But if those things sound too serious, we can see the same phenomenon at work in something more mundane: the film review.

—–

Websites that review movies and have comment sections, like Rotten Tomatoes, routinely feature posters’ dismissals of what the professional film critics have to say. The assumption of many of the commenters seems to be that having seen a lot of movies (or even one, perhaps) makes them as authoritative as a film critic like Roger Ebert (who died in 2013, though his website continues to post reviews of current films). For example, the commenter “Yih Dzelonh” writes, “I consider myself to be a very adept movie critic – like Ebert. I have rated hundreds, if not thousands of movies in my life . . . my expert impression . . . contradicts . . . other critics [. . . but] I feel that my rating is much more accurate.”[6] Notice how this poster claims that rating movies makes one an expert on movies – something a person could do without even seeing a film, theoretically.

Roger Ebert, though, spent his life studying film: he watched countless movies, yes, but he also taught university seminars; he interviewed producers, directors, actors, cinematographers, set designers, and on and on. He traveled the world visiting film sets, exploring international film styles; he could explain the significance of camera angles or what it meant when a character moved from left to right across a landscape; he could explain what “diegetic” music[7] was; he could even explain what a key grip or gaffer or best boy actually does. In short, Roger Ebert knew more about movies than 99% of people. That authority would normally (did normally?) mean that when he said, “Film X is a good movie you should go see,” it meant something different – that is, more authoritative – than if your cousin said the same thing (assuming your cousin is not also a film critic, of course). Believing that simply seeing a lot of movies makes one a film critic is akin to believing that driving a lot makes one a mechanic, having teeth makes one a dentist, or going to school makes one an expert on teaching.

This willingness to forgo traditional routes of authority is a complication for us because argument requires using authority. To be sure, there is a lot of bogus authority posing as the real thing, and learning to tell the good stuff from the ersatz will be one of the chief lessons of the course. Still, it does mean that you must be willing to embrace the idea that some people do know more than other people and that some modes of living are better than others.

And if they are not, then what is the point of ever trying to improve things? To make things “better” – however you define that term – is one of the major goals of argument. If everybody were to simply shrug and say, “That’s the way it is, and I’m not going to try to change it,” then how would we ever progress? Where would we be if Washington and Adams and Jefferson had thought that? Or Lincoln or Garrison or Stowe? Or the Suffragettes? Or Martin Luther King, Jr.? If national history sounds too grand, remember that somebody had to convince the boss that a “smart phone” was a good idea or that sending time-limited pictures and videos with captions would be a marvelous investment of time and energy and people would really love it.

To recap, we once had a system in which the existence of The Truth was taken for granted, though it was often disputed and easily abused. Those disputes eventually led us to where we are today — to a system with many truths. We still need authority, however, so in modern argument we will employ a kind of hybrid of truth-seeking and persuasion: we may not know what the absolute truth is, but we can make claims based on what we do know now, and in this halting fashion, stopping often to check our heading, we will move forward.

 


  1. Please note: Throughout the text, second-person address, “you,” is used. In standard academic writing, “you” is usually not appropriate; one of the primary reasons to avoid it is that directly addressing readers often makes assumptions about them or includes them in claims they may not share. “You” is used here, however, precisely because if “you” are reading this handbook, then we really do mean “you,” the student/arguer. Second-person is therefore deployed strategically and purposely, with an awareness of its rhetorical limitations.
  2. Which manages to be both too long and wildly incomplete.
  3. We will explain what a “fallacy” is in Section VII.
  4. This is a ludicrously over-simplified explanation, and you should take a Philosophy class to get the full version, complexities and all.
  5. We hope that does not seem like a controversial statement.
  6. Comment on the Rogerebert.com review of Frantic.
  7. If you are curious, diegetic music is music within a scene that both the viewers and the characters can hear, as opposed to more traditional “soundtrack” cues.

License


MACC Composition II Copyright © 2022 by Dustin C. Pascoe is licensed under a Creative Commons Attribution-NonCommercial 4.0 International License, except where otherwise noted.
