Friday, September 21, 2012

Defending what you don't really believe?

Here's a fascinating study. By changing what was in front of the subjects on the questionnaire they were filling out, the researchers tricked the subjects into believing that they believed the opposite of what they had just affirmed. What is fascinating is that a slight majority not only would read aloud and affirm that opposite (say that something is permissible, when they had first said it was not), but would go on to defend it in argument.

I wonder what they were asserting when they seemed to affirm the opposite of their initial claim. It's tempting to say that they were simply misspeaking, and hence we should not attribute to them the assertion of something they didn't believe. But then they defended what they literally said, which suggests that this is what they were asserting.

I guess I am inclined to think they weren't asserting contrary to their beliefs, but they were arguing contrary to their beliefs. Maybe this is another way of seeing that one's arguments can come apart from why one believes what one does.

4 comments:

appearedtoblogly said...

Reminds me of Moore's Paradox and some of the literature thereabout.

thirdmillennialtemplar said...

I don't find that very surprising, actually. It seems to me that people often come to believe what they tell others they believe. People entertain inconsistent beliefs until it becomes clear to them that there is an inconsistency, but that usually happens only after one has stated one's beliefs (even if only in talking to oneself). Stating what you believe is partly a way of thinking your way through to your position.

I take it that this is one of the reasons that people with Alzheimer's disease will voice contrary opinions on the same subjects in different circumstances. More than that, it helps explain even the inconsistencies in the works of people like Richard Dawkins or Peter Atkins (Atkins is especially bad in this respect). They judge their views according to whether they seem to jibe with the narrative they have accepted, but the problem is that there is a variety of views compatible with strong atheism, and thus one can find each of these thinkers move from one position to a contrary position in different contexts without any recognition on their part that they are being inconsistent. Christian thinkers who aren't particularly practised at thinking through their theology systematically also fumble in this respect often enough.

Does not surprise me at all.

Heath White said...

It does not surprise me a great deal either, though it is pretty remarkable. It is a well-known piece of psychology that if you can get people to affirm something once, they will become much more committed to defending the view. The usual theory is that once you are publicly committed, you have a reputation to defend, and being right becomes less important than appearing confident. The researchers have simply produced the same psychological reaction by getting people to think that they have affirmed a statement and thus think that they are publicly committed.

I think it is wrong to regard the initial survey responses as expressions of some "real beliefs" which sit dormant in a belief box until called upon. Rather, the person has some general moral views, or appreciates the force of various considerations, and it is those that sit dormant in the belief box. Then particular stimuli induce the person to produce answers to particular questions, which perhaps involves some chain of reasoning. Other stimuli will trigger other lines of reasoning, which can often be manipulated to conflict with the first answer. Socrates and other good teachers are quite talented at producing such contradictions in most people. Philosophers are highly attuned to this sort of thing and are less likely to get snowed, I would predict. Being able to produce perfectly consistent answers under any variety of stimuli is highly valued in our profession, but not elsewhere.

Dagmara Lizlovs said...

"Defending what you don't really believe?" Politicians do it all the time.