Sunday, May 1, 2011

This All Has Been Related Before

Sometimes hospitals are like airplanes for me, at least in regard to reading.

Because my mom was usually sleeping when I sat with her in her hospital room, I caught up on some reading of academic journals. Although it's possible that someone else has read "Decorous Spectacle: Mirrors, Manners, and Ars Dictaminis in Late Medieval Civic Engagement," "Rogerian Principles and the Writing Classroom: A History of Intention and (Mis)Interpretation," "'Breaking the Age of Flower Vases': Lu Yin's Feminist Rhetoric," and "Acts of Institution: Embodying Feminist Rhetorical Methodologies in Space and Time" within the walls of Allen Hospital in Waterloo, Iowa, it ain't probable.  So while she snoozed, I ventured into catching up on past issues of Rhetoric Review along with reading some magazines.

One article I found both interesting and aggravating is "We Can't Handle the Truth: The Science of Why People Don't Believe Science" by Chris Mooney in the latest issue of Mother Jones, a decidedly liberal public affairs magazine.

If you get a chance, give it a read. But my reader-response brain kept thinking about how many of the ideas and pieces of evidence presented in the article were related millennia ago by the ancient Greek and Roman rhetoricians, namely Isocrates, Aristotle, and Cicero, along with the modern Rhetorical Dude, Kenneth Burke. 

Here are some quotation nuggets from the article for enticement:
  • "We're not driven only by emotions, of course--we also reason, deliberate. But reasoning comes later, works slower--and even then, it doesn't take place in an emotional vacuum. Rather, our quick-fire emotions can set us on a course of thinking that's highly biased, especially on topics we care a great deal about." 
  • "In other words, when we think we're reasoning, we may instead be rationalizing."
  • "In other words, people rejected the validity of a scientific source because its conclusion contradicted their deeply held views--and thus the relative risks inherent in each scenario."
  • "And that undercuts the standard notion that the way to persuade people is via evidence and argument. In fact, head-on attempts to persuade can sometimes trigger a backfire effect, where people not only fail to change their minds when confronted with the facts--they may hold their wrong views more tenaciously than ever."
  • "Given the power of our prior beliefs to skew how we respond to new information, one thing is becoming clear: If you want someone to accept new evidence, make sure to present it to them in a context that doesn't trigger a defensive, emotional reaction."
When I read the last statement in the article, which is near the end, my internal response was, "Well, no shit, Sherlock." But emotional and defensive reactions will happen. Some "contexts" are going to create them no matter how hard you try.  

But the reason I related the classical rhetoricians above is that their treatises lay out similar injunctions and ideas about how effectively working with pathos--appeals to emotions--is crucial to persuading an audience to your cause, your ideas, and your evidence.

What the old Greek and Roman guys also point out in their tomes is that pathos isn't just about emotions. On a more complex and realistic level, pathos represents an audience's values, assumptions, and beliefs. And as Aristotle relates, the consummate rhetorician--or "persuader" in Dubyian terms--creates trust and belief in what he or she is saying. From the Greek, pistis can be translated as trust, belief, or reliability. And a persuader must create pistis to be successful.

But back to pathos relating to beliefs and assumptions. As the Mother Jones article intimates and as we have seen via examples in politics and elsewhere, you can give folks the exact same evidence, facts, studies, and data, but they'll come to very different conclusions as to what should or should not be done based on their core beliefs, assumptions, and values that pertain to how governments should work, what constitutes "life," how men should act, what "feminism" means, and so on.

For me, the reality that some people--whether they are right-wingers, Marxists, pro-life Democrats, Birthers, etc.--cannot and will not be persuaded by strong evidence calls up Burke's concept of terministic screens and how people have interpreted what he has to say about them. 

I've always thought of Burke's terministic screens as a set of beliefs, values, and assumptions about the world--mediated by language--that acts as almost a protective field around one's mind, letting in only the ideas and evidence that the "symbol-using animal" (Burke's definition of humans) will allow to persuade him or her. The "bad" ideas and evidence, well, they just bounce off our screens because we don't like what they're selling. We can't rationalize them the way we want to. 

However, I don't think the points brought up in the article or the rhetoricians' ideas about pathos and terministic screens mean that we can't persuade people. 

We can. 

We can do so if we use language and actions that cohere with and connect to shared beliefs and assumptions about whatever we're talking about. Or, to put things more succinctly, good arguments begin in agreement. 

The Rhetorical Dude abides. He wants identification to precede persuasion. 

4 comments:

fern said...

I was complaining about the tendency to reason backwards from one's gut reaction just the other day. (Apparently it's too late to copyright this under my name.) Our increased ability to target specific demographics and the fragmentation of communications aimed at them (tweets and headlines at the extreme end) seem to have a variety of interesting effects on "public" discourse. The down side of this is easy enough to see. But the up side might be that when people do get to talk things over with people with whom they share values, they are perhaps more likely to evolve in their thinking since their defenses are not fully activated. And participation in overlapping discourse communities may set up some cognitive dissonance that they work through. But, finally, if people don't see or feel that there is any reason they should second-guess their gut reactions (which are as often based on fears and insecurities as on values or beliefs), then information and alternative versions of the "truth" will not find a buyer. I guess that's why I find that "what are you afraid of?" is sometimes a good place to start a search for shared values.

Quintilian B. Nasty said...

Or maybe our emotional reactions represent at times a type of logic that we don't recognize? I can have a reaction of displeasure or happiness, but I "reason backwards" to understand why I felt the way I did/do.

What I find troubling is the lack of open-mindedness to test out ideas. Getting back to a "beginner's mind," as Buddhist philosophy puts it, can be applied to varied contexts. I know I'm guilty of falling into mindful ruts, so the article got me thinking about that too.

Dr. K said...

Sure. Reasoning backwards to figure out why you feel how you feel might end up revealing a subterranean logic and core principles, or it might turn out to be error analysis. Either way, respectfully exploring your feelings is more productive than building walls around them and installing trip wires along the perimeter.

(The example behind my comment was censorship, since that comes up in my youth literature classes. I ask students to ask themselves what terrible thing they think will happen if children see an accurate picture of a human body.)

Quintilian B. Nasty said...

I see what you mean about how these ideas connect to censorship.