[p2p-research] new article on p2p
Michel Bauwens
michelsub2004 at gmail.com
Wed Aug 11 17:52:16 CEST 2010
Dear friends,
our good friend Nikos is requesting comments on his draft before publishing
it on our blog and wiki,
it's long but as usual worth reading and commenting on,
Nikos and I also have a doc version you can request privately, since our
list doesn't take attachments.
Michel
On Wed, Aug 11, 2010 at 9:31 PM, Nikos Salingaros <salingar at gmail.com> wrote:
> Dear Michel,
>
> I have been working on the attached article to send you for the p2p
> foundation when it is ready. I originally asked Stefano to join me as
> co-author but he is still in Russia (with the fires) and people in
> Norway now want to translate it and publish it over there. So could
> you please distribute it for peer review to our friends? Then I can
> prepare a final version and send it to Norway and give you the English
> version.
>
> Maybe Stefano and I can prepare a follow-up article later.
>
> Best wishes,
> Nikos
>
*Cognitive dissonance, misinformation, and paradigm shifts.*
**preliminary version**
Nikos A. Salingaros.
*University of Texas at San Antonio.*
*Abstract*: The reason why intelligent persons acquire demonstrably false
beliefs and stubbornly persist in holding them is traced to defensive
reactions that prevent cognitive dissonance. Methods of handling
contradictory information within urgent settings, while obviously
appropriate at the evolutionary level of early humans, wreak havoc with our
present-day rationality. People apparently have inherited both a mechanism
for conforming to group beliefs, and a stock of tools to fight against any
idea that conflicts with already held beliefs. Rational arguments have no
effect whatsoever. Studies in political science and psychology provide the
basis for understanding this fundamental yet alarmingly neglected mechanism
for preserving misinformation. Understanding irrational groupthink and the
resistance to social change as being based upon social learning and
evolutionary adaptation can hopefully save us from the usual frustrations
and failures of trying to educate society. A peer-to-peer educational
framework is proposed as the only means of circumventing this built-in
resistance to new ideas that come from outside the conforming group.
*Introduction.*
Is today’s consumerist society headed for collapse because of its
exponentially growing, hence unsustainable needs? Despite the numerous
arguments urging a change from the catastrophic global waste of natural
resources, energy, and agricultural land, and from the loss of the
biosphere’s diversity, it is frustrating to find that inertia overrides
logic and reason (Wilson, 2006). The same explanation applies to the
continued universal embrace of inhuman architectural and urban typologies,
exemplified in the promotion of a group of famous architects who build
more or less the same non-adaptive buildings. Whereas rampant global
consumerism is based upon misinformation, it is less widely known that the
way we design and build our cities violates the principles of Biophilia
(Kellert et al., 2008) and human-scale urbanism (Duany et al., 2000).
Society continues to promote irrational ideologies and appears resistant
to any rational arguments; people hold irrational beliefs in the face of
logical evidence to the contrary.
I will summarize some results from political science and psychology which
suggest that groupthink and resistance to rationality may be part of an
evolutionary adaptation. And yet, what was an advantage in early tribal
human society is most probably leading us towards extinction. To help
those of us who wish to implement changes for the better in society, I
will review the mechanisms whereby people are induced into groupthink,
often adopting irrational and false ideas. Then, I will categorize the
techniques human beings normally use to fight against education: the tools
otherwise intelligent people employ to avoid revising their demonstrably
false beliefs. Anyone who wishes to effect social change must overcome
these obstacles; hence it is far better to be aware of them than to ignore
them.
Cognitive dissonance occurs when a person is faced with two contradictory
and incompatible thoughts (Tavris & Aronson, 2009). This state generates
tension and anxiety, and can lead to paralysis in action because the
decision mechanism cannot resolve the conflict and decide upon any proper
course to take. Clearly, this is a very dangerous state to be in, and human
beings must be able to avoid getting locked in a state of indecision
(analogous to a computer program freezing up). Situations where this
conflict can arise are usually social ones, when others hold an opinion
contrary to one’s own. If one has to decide alone, there is usually much
less conflict among irreconcilable ideas. Apparently, nature has already
resolved this problem by predisposing us to accept a decision conforming to
what the majority of a group believes. This mechanism comes from social
evolution and is the basis for groupthink, yet we can see how it resolves
a sticking point for action. Recent work on this topic posits a
biological/social co-evolution (Richerson & Boyd, 2005), so that we are in
fact hard-wired with this mechanism favoring conformity to the group.
A body of work shows how the unavoidable tendency to conform easily
overrides both rational behavior and moral inhibitions. Incontrovertible
evidence demonstrates again and again how normal persons ignore their own
sensory apparatus to trust a false piece of information only because it is
the accepted group opinion. In laboratory experiments, people were led to
abandon their own direct perception of phenomena and to instead adopt a
(deliberately false) groupthink opinion. In related experiments discussed
later, normal students were turned into sadistic prison guards (the Stanford
Prison experiment). I will conclude this paper with Thomas Kuhn’s review of
paradigm shifts in science, which can now be interpreted as just another
sudden change of group opinion. Thus, even in science, where one expects
rational and intelligent behavior, the acceptance of new theories is
delayed by group conformity, just as it is in ordinary society.
Implementing change through education is unlikely to occur within a majority
system, because the social setting guarantees conformity of thought. A
peer-to-peer framework, on the other hand, enables individuals to discover
and then disseminate innovative ideas outside the system of authority.
Using Information and Communications Technologies, it is today feasible to
construct a society of peers (the peer-to-peer educational model) that is
by its very nature free from the pressures of groupthink. Then, in
evolutionary terms, a paradigm shift may be expected to occur when the
peer-to-peer society reaches a size comparable to the existing system of
authority. Trying to educate individuals who have already accepted
misinformation as truth is probably a fruitless task; efforts should
instead be directed towards the enlightenment that occurs when a critical
number of people become involved.
In conclusion, these results indicate the validity of the old saying “might
makes right”, with the added surprise that human beings are hard-wired to
conform to group opinion, even if that opinion is based upon misinformation.
Despite the seemingly fatal blow to our dreams for a human society based
upon rationality, hope arises from a distinct means of organization, which
is represented by the peer-to-peer society. By building up a sufficiently
broad informational and social peer-to-peer network, we can bypass our own
evolutionary predisposition to conform to majority opinion, because in a
looser network there is simply less pressure to conform. In addition,
peer-to-peer networks have arisen from the efforts of individuals who have
questioned centralized authority, hence this framework privileges rational
individual thought.
*Stage one: conforming to a group belief.*
Solomon Asch (Asch, 2003; 2004) showed in a classic series of experiments
that a person is ready to mistrust his/her own perceptual apparatus and to
instead adopt a false belief just out of peer pressure. In one experiment,
subjects were consistently misled by biased group opinion and reported the
wrong relative length of a line. People thus accept the majority opinion
regardless of the evidence: conformity to group belief is stronger than
one’s own sensory apparatus. Admittedly, in these experiments the “group”
was selected and instructed to deliberately mislead the subject, but that
makes the result all the more frightening, since the relative length of
the lines the subjects were asked to compare was obvious to anyone.
Stanley Milgram performed distinct
experiments that confirmed the conformity effect (Milgram, 1961), and a more
advanced setting later confirmed and extended these original results (Berns
et al., 2005).
To give an example of conforming behavior, though not belief, let me remind
the reader of how susceptible everyone is to canned laughter in television
shows. This particular trigger of conformity to group action is universally
despised, yet omnipresent. It persists because psychological studies show
laughter to be contagious (Cialdini, 1993). Viewers simply cannot resist being
influenced, even though canned laughter is both obvious and silly. In terms
of audience manipulation, it is found that the worse the joke, the more
canned laughter helps to make it work. Television executives therefore
habitually override a director’s and actors’ requests not to include canned
laughter in their theatrical masterpiece.
Boyd and Richerson (Richerson & Boyd, 2005) argue that forms of groupthink
and ideological conformity were very useful in forming early human
societies. Conformist transmission in social learning is strongly favored in
natural selection. Even if an actual belief is wrong, it matters more for
the long-term survival of the group if it holds itself together during the
time required to reach a collective decision, thus conformity is one very
powerful factor in survival. By contrast, internal dissension within a
group over conflicting ideas weakens the group’s solidarity and purpose.
Indecision proves disadvantageous to the group in making short-range
decisions, such as in emergencies and in fighting competing groups. We do
not have to look very far for historical examples in which internal
bickering, while the enemy was at the city gates, allowed an invasion that
could have been prevented had the population shown more solidarity.
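The dynamic that Richerson and Boyd describe can be made concrete with a
minimal sketch in Python. This is an illustration only, assuming the
textbook recursion for conformist bias rather than their full models; the
trait, the starting frequency, and the bias strength D are all
hypothetical.

```python
def conformist_step(p, D):
    """One generation of conformist transmission: p is the frequency
    of a cultural trait in the group, and D > 0 biases each copying
    event towards whichever variant the majority already holds."""
    return p + D * p * (1 - p) * (2 * p - 1)

# A slight initial majority, amplified generation after generation.
p = 0.55            # 55% of the group starts with the trait
for generation in range(60):
    p = conformist_step(p, D=0.3)
print(round(p, 4))  # close to 1.0: the trait has become unanimous
```

Nothing in the recursion checks whether the trait is true or useful; the
only thing that matters is which variant the majority already holds, which
is exactly the point about conformity overriding rationality.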
Conformity to a common ideology is certainly good for fighting off an
invading army or another tribe competing for the same territory and
resources. Even in early civilization with a small population, however,
adoption of false beliefs may lead to extinction, and we see this in the
archaeological record. Societies drove themselves to collapse by holding
onto false beliefs about the natural environment upon which they depended,
deforesting and denuding agricultural land, losing their productive soil to
erosion, polluting their only water sources, etc. (Diamond, 2005). These
drawbacks arise on a far longer time scale than the immediate one of quick
decision-making. We have no knowledge of whether there were any dissenting
voices proposing more rational solutions and practices in a society headed
for extinction. All we know is that those societies drove themselves to
collapse by following groupthink and conforming to destructive beliefs
right to the very end (Diamond, 2005).
Just to emphasize how perceived authority and the urge to conform validate
misinformation, there are numerous studies in which pretend pollsters got
perfectly straight answers to fictitious questions. Students happily
answered questions about nonexistent places, nonexistent legislation, and
nonexistent political figures, and even gave directions to fictitious
locations (Prasad et al., 2009). The respondents mistook the act of asking
questions by some presumed authority (the pretend pollsters showed all the
signals of legitimacy) as proof that all these things existed. Respondents
went further and invented fantastical explanations so as to avoid looking
ignorant, and thus external to the group for not sharing its common
knowledge.
In another study, adults were asked to remember details about a medical skin
test performed at school (Mazzoni & Memon, 2003). Even though there was
never such a test, the subjects invented a detailed, convincing recollection
of the “event”.
*Setting the stage for atrocities.*
The experiments of Stanley Milgram following World War II tried to discover
a psychological basis for the atrocities committed during the war.
Researchers put ordinary, intelligent people in compromising situations to
see if they would do terrible things when ordered to do so. The results are
frightening: yes, perfectly normal people can be turned into monsters, and
it is not very difficult. All you require is a pretend power system that
grants authority, and the subject will follow orders to perform terrible
tasks. In Milgram’s experiments, individuals were ordered to deliver what
they believed were lethal electric shocks to another participant, and they
obeyed (Milgram, 2004). Those administering the shocks did not know that
the current was turned off, and that the “victim” was an actor screaming
from the supposed shock. Actually, the consequences were even more
frightening than they seem on the surface. The
subjects knew these were laboratory experiments carried out in a university
setting, and yet they followed orders against fundamental human morality. In
real-life situations, the power system giving orders often has the right of
life-and-death over the subject, which makes any objection to following
orders even less likely.
The work of Milgram was extended by Philip Zimbardo, who designed and ran
the infamous “Stanford Prison Experiment” (Zimbardo, 2007). In a similarly
unexpected scenario, ordinary students turned into sadistic prison guards
when given the appropriate rubric of power and conformity. Things got so
out of hand that the experiment had to be stopped after only a few days. Because
of his experience with this phenomenon, Zimbardo was asked to testify in the
Abu Ghraib prison scandal investigation (the terrible events perpetrated by
US servicemen and women in a prison in Iraq in 2004). As would be expected
from the mechanism of conformity, the individuals involved in those
sometimes-sadistic power games turned out to be no different from other,
psychologically normal soldiers.
The mechanism of conformity drives human beings to accept misinformation
and irrational beliefs, and the same mechanism makes a normal human being
do terrible things to other human beings because of peer pressure or
direct orders from some presumed authority. In all of these related but
distinct acts, our hoped-for internal checks seem to dissolve. People do
not reflect before adopting a group belief; they do not weigh the evidence
on whether the logic behind the belief is sound; they just accept it as
they accept advertising. When authority or society asks them to perform an
unspeakable act, their innate morality, which is their conscience grown from
lessons of ethics and compassion, simply vanishes.
*Truth is what the majority group decides it to be.*
George Orwell’s original conclusion is that society promotes groupthink,
which forces people to accept misinformation as truth (Orwell, 1949).
Validation of an idea or set of ideas is based on whether they are accepted
by the majority: a deceptive parallel to democratic governance. Orwell
described a totalitarian regime, but we now know that this conformity
mechanism applies just as well to a democratic society. Once misinformation
has been accepted by the majority, then it is almost impossible to correct
it in the public consciousness. Yet this syndrome is not supposed to occur
in a democracy.
The advertising industry flourishes in democracies, where its primary goal
is to misinform people so that they will consume a product. Granted, some
of the massively advertised products do what they claim to do, but a large
number of them are either of marginal value or actually pathogenic. In any
case, the global consumerist system promotes many
products of doubtful worth that replace more sustainable local products, and
this unfair competition (because the global companies are linked into an
economic power structure) kills regional industries. Specifically, the two
linked examples of interest in this paper, unsustainable consumerism and its
global non-adaptive architecture, are promoted using all the power of the
global media. These two cases represent crucial industries for global
industrialization (though they are certainly not the only examples).
Two related but distinct mechanisms influence people to accept
misinformation as truth: first, the passive conformity of beliefs to match
the majority opinion; and second, deliberate falsehood promoted by a system
of authority in order to further its ends. The second mechanism is active
and is implemented by the system, whereas the first mechanism is passive and
occurs because of human nature. Many individuals have tried to promote
irrational beliefs so as to win control of a group of people. We have
examples in cults, dangerous sects, extremist political movements, etc.
However innocuous the advertising industry may appear to be compared to a
totalitarian regime coming to power, the same techniques of persuasion are
used to sell fatburgers and soft drinks laden with synthetic fructose.
Here, I am not going to investigate the propaganda and conditioning
apparatus used by systems of authority to spread misinformation, which
requires a separate and independent study. The purpose of this paper is
strictly to bring to light the first, natural mechanism of conformity
precisely because it is not amenable to human control (although it provides
the basis for the second mechanism of deliberate misinformation).
*Lockout and the wall of mistrust.*
It is very easy to prejudice a person’s opinion about a subject or event by
saying something positive or negative before the person comes into contact
with the event. This effect is well known to political lobbyists, who will
rush to be the first to talk to an incoming politician. Whoever has the
first word can implant either positive or negative thoughts in the
politician’s mind, and those subconscious thoughts will influence decisions
during the rest of that person’s career. Some authors refer to its
negative application as “lockout” (McFadyen, 2000). This technique is used
in character assassination: say something nasty about person A to person B
before A and B meet, and person B will be forever aligned negatively
against A, who is the target of this “lockout”. The same technique works to
discredit an idea by making a derogatory comment before the idea comes up
for evaluation.
Lockout works to insulate a group’s beliefs from outside influence. A
society is defined by a set of mutually shared beliefs, whether those are
factually correct or not. At the same time, this commonality defines a group
in terms of its particular beliefs, and we can estimate different groups’
social proximity depending upon how close one set of beliefs comes to
another. Hence the seeming paradox of groups of people living in close
physical proximity to each other, but very distant in terms of belief
overlap. Genocides occur in which people who have lived together for
generations, but who belong to socially distant sets of beliefs, turn upon
and kill each other. Or the converse phenomenon is seen, where spatially
separated societies are culturally close because of the nearness of their
beliefs.
Every social group maintains cohesion through its beliefs, and therefore
wishes to protect existing beliefs from external influence. A wall of
mistrust is set up towards any competing beliefs found on the outside. In
cults, one of the primary messages to followers is to not believe anyone
outside the cult, which is but an extreme version of the natural
exclusiveness of any socially cohesive group. Only information that tends to
confirm the group’s beliefs is allowed to penetrate. The mechanism of
selectively justifying a previously held opinion in the face of data that
could disprove it is known in the psychology literature as “confirmation
bias” (Nickerson, 1998), and is explained as an innate mechanism for
avoiding cognitive dissonance.
Curiously, the more outrageous the belief, the more effectively it holds a
society together. This result is extracted from anecdotal evidence in the
study of religions, dangerous cults, and terrorist organizations. Believing
in something that is obviously good, and also promising not to do something
that is obviously bad, does not require the reinforcing rubric of the social
group, since it could easily be an individual decision. But if conforming to
the group requires an unusual, or unusually difficult action, then the act
of acceptance brands the initiate as something special and makes the group
reality even more relevant. This point has been confirmed by experiment
(Gerard & Mathewson, 1966). We can refer to initiation rites common to all
societies, more marked and arduous in some than in others. It is no surprise
that the more difficult the task of initiation, the more permanent is the
attachment to the group.
In terms of conforming by accepting misinformation, the more outrageous the
misinformation, the more tightly the individual holding it needs to cling to
the group. The reason is that those beliefs are “true” only within the
social context of the group but dangerously false outside, which makes
life outside the group problematic for persons holding such beliefs. The
implications turn out to be as pessimistic as they are unexpected: those
individuals holding the most erroneous beliefs are the most difficult to
approach, and certainly the most difficult to educate (Nyhan & Reifler,
2010).
Let me conclude by returning to the global consumerist society’s
dependence upon irrationality. A turn towards a more sustainable future
requires applying technology on the small, local scale, and treating with
great suspicion the large energy-consuming projects favored by global
industry, international funding agencies, and governments alike. The built
environment needs to be shaped to facilitate emotional nourishment and
human socialization, and to reject inhuman industrial typologies. The
obstacle that reform faces is represented
by a refusal to accept the falsity of many of the global consumerist
society’s basic assumptions. People simply refuse to see the truth. The ways
in which they accomplish this denial are outlined next.
*Stage two: six tactics in a strategy for resisting the truth.*
I am interested in developing a strategy for educating people who are stuck
with irrational beliefs. This has been a problem since the beginning of
recorded history. I wish to focus on the tools that people utilize to block
input that could change their ideas. Either programming or de-programming,
therefore, will have to somehow overcome these blocking techniques.
Cognitive dissonance arises when external information contradicts an already
held belief. The way we normally deal with this is NOT to rationally compare
the two competing theses so as to resolve the conflict based upon reason and
available evidence. Rather, we react in the same way we react to a physical
threat. We will fight against information that threatens our beliefs,
inventing any means of defense possible. This strategy has nothing to do
with rationality or truth. We normally accept information only if it
reinforces beliefs already held, and we reject information that conflicts
with something we already believe (Nickerson, 1998).
A recent paper on the sociology of political beliefs (Prasad et al., 2009)
lists techniques that people use to prevent cognitive dissonance. Another
list is given by Zuwerink-Jacks and Cameron (Zuwerink-Jacks & Cameron,
2003). I have combined the lists and changed some of their labels to be
clearer for a general audience (the original labels are given in
parentheses).
I will pay special attention to defensive techniques developed in the animal
kingdom that, in addition to lending colorful labels, reveal the biological
analogy of these tactics. I also wish to suggest that the methods of
blocking rational arguments, although they require human reasoning applied
towards an illogical end, basically work on a pre-human level. Only the last
one is uniquely human as far as I can tell. I will make my own conjectural
hypothesis linking these six tactics with the evolution of neural networks
in the brain.
1. TUNING OUT (Selective exposure) — the “Ostrich” technique.
2. SOURCE DEROGATION — the “Rhinoceros” technique.
3. DISPLACEMENT (Disputing rationality) — the “Eel” technique.
4. IRRATIONAL COUNTERARGUING (Counterarguing) — the “Squid” technique.
5. SELECTIVE SUPPORT (Attitude bolstering) — the “Lizard” technique.
6. INFERRED JUSTIFICATION — the “self-justifying prosecutor” technique.
For readers who don’t know the animal behaviors I allude to, let me
summarize them here. A popular myth is that the Ostrich reacts to threats by
digging a hole and hiding its head in the sand (in fact, the Ostrich lies
down to look like a lump). When annoyed or threatened in any way, the
Rhinoceros just puts its head down and charges the source of annoyance. The
skin of an Eel is covered by slimy mucus so that when someone tries to catch
one, it slips out of grasp. The Squid frustrates its predators by releasing
a cloud of ink in the water, making it impossible to see anything and
facilitating its escape. The Lizard drops its tail to divert attention
elsewhere while it escapes.
I will now discuss in turn the techniques for rejecting a rational result
that contradicts misinformation already held by a person.
*Tuning Out* occurs when you (the questioner) are talking to a person (the
subject) and present evidence that his/her beliefs about a topic are wrong.
Cognitive dissonance creates a high state of stress, which is unpleasant, so
the subject responds by blocking what is being said. A common response is
simply to tune out the message and stare back with a blank look, i.e. no
response at all. There is consequently no engagement with the
questioner.
*Source Derogation* means attacking the questioner while ignoring the
question. This action could range from politely disputing the interlocutor’s
credentials and expertise, to implying a corrupt or dishonest motive (a
deliberate ploy), to outright insults and violence. The questioner could be
accused of being brainwashed, even though the subject is more likely to be
so in this instance. Any pretext that can justify a personal attack on the
questioner is useful. An imagined social difference between the questioner
and subject can be brought up in a classic prejudicial attack: for example,
the questioner is accused of being fascist, totalitarian, communist,
anarchist, etc.
*Displacement* is a response that engages at a minimal level, but the
response is founded upon irrationality. The person holding the false belief
answers that the issue does not depend upon facts, but is instead purely a
matter of opinion. Supposedly, any rational discussion is extraneous to the
topic; hence logical argument would be futile. There is therefore engagement
but no analysis. The existing false belief is maintained intact and free of
any threat from revision because it has been displaced into the realm of
opinion (at least as far as the subject is concerned).
*Irrational Counterarguing* involves offering evidence that presumably
refutes what the questioner is claiming. The problem here is that the
subject is arguing for an irrational belief against the questioner’s
competing rational belief. If the basis for arguing is logic and
rationality, then the issue could be settled very quickly, but that is never
the case. The strategy’s goal is to retain the false belief, not to allow it
to be questioned. In protecting the irrational belief, the subject who holds
such a belief is forced to introduce irrational or irrelevant arguments.
It is impossible to produce a coherent argument to defend an irrationally
held belief.
*Selective Support* is a method of ignoring the evidence presented against a
false belief, and instead bringing in other peripheral pieces of information
that might seem to support the false belief. Here the subject tries to build
up a tangential logical edifice for supporting his/her false belief by going
around the main logical objections to the belief itself in a diversionary
tactic. There is no direct engagement on the fundamental issue, only clever
side-stepping.
*Inferred Justification* involves believing misinformation because it is
accepted by authority and/or by the majority in the group. No rational
reason is needed for this initial acceptance of misinformation, just
groupthink. What happens next is crucial, however: the subject’s brain
evolves circuits to create a seemingly rational explanation after the fact.
Once that stage has been accomplished, then to the subject holding the false
belief, any thought on the topic appears natural and obviously true. The
questioner is answered by saying that the consequences of this false belief
(which may be substantial and even catastrophic) justify the belief itself.
Of course this thought process follows a perverted inverse logic. The
subject assembles a fictitious backwards chain of reasoning to justify the
final piece of misinformation.
There are several points worth discussing. I draw attention to the physical,
visceral, and emotional feeling of knowing something to be true (Lakoff &
Johnson, 1999). The neural path that stores a packet of knowledge in the
brain (even if that knowledge is false) becomes part of someone’s physical
being, and is henceforth associated with a precise emotion. If the encoding
of a piece of misinformation is registered as “true”, then any successive
reference to that misinformation evokes the “true” physical and emotional
response. The subject is therefore hardly able to go against his/her bodily
signals confirming that something is intuitively true, even in the face of
rational evidence to the contrary. The reaction to having a basic belief
questioned by another person is irrational, since it is based upon an
emotional state generated by cognitive dissonance.
Another point has to do with the evolution of both complex neural circuits
and software. When a piece of software or a neural net evolves to “learn”
something set as the goal of the exercise (to solve a particular task, in
the case of software), the system goes through an evolutionary process
involving many steps. Each step in the evolution of a neural circuit or
genetic program generates many alternative candidates by some random
algorithm, and a selection process keeps the results that come closer to
satisfying the desired conditions. The end result is a circuit or program
that does what it is supposed to do. Here the crucial feature is that
evolved circuits and programs are very difficult if not impossible to
understand, since they were not built rationally step-by-step (Hillis,
1998).
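To make this evolutionary process concrete, here is a minimal
genetic-algorithm sketch in Python. It is an illustration only, not a
reconstruction of Hillis’s experiments: the bit-string “circuit”, its
target, and all parameter values are hypothetical.

```python
import random

TARGET = [0, 1, 1, 0, 1, 0, 0, 1]   # hypothetical goal the evolved "circuit" must match

def fitness(candidate):
    # How close a candidate comes to satisfying the desired conditions.
    return sum(c == t for c, t in zip(candidate, TARGET))

def evolve(pop_size=50, generations=100, mutation_rate=0.05):
    # Start from purely random candidates.
    population = [[random.randint(0, 1) for _ in TARGET]
                  for _ in range(pop_size)]
    for _ in range(generations):
        # Selection: keep the half that comes closest to the goal.
        population.sort(key=fitness, reverse=True)
        survivors = population[:pop_size // 2]
        # Variation: generate alternatives by random mutation.
        children = [[1 - bit if random.random() < mutation_rate else bit
                     for bit in parent]
                    for parent in survivors]
        population = survivors + children
    return max(population, key=fitness)

best = evolve()
print(best, fitness(best))   # typically reaches the target exactly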
Moving to the neurological mechanism for the above tactics, I conjecture
that the same evolved procedure applies to the brain circuits that “grow” a
spurious explanation for a particular piece of misinformation. Neither the
subject nor anyone else can explain the physical neural circuit in any
logical sense, because it was never grown logically. It evolved a
posteriori, and no one can guess what associations in the subject’s brain
were used to anchor it in permanent memory. Nevertheless, it FEELS true
and it is definitely associated with the visceral emotion of something that
is indeed true. Misinformation stored in this manner becomes embedded in a
pre-human consciousness: it becomes intuitive, one’s “gut feeling”,
something that cannot possibly be argued with logically or rationally. And
here we have the great obstacle to learning once false beliefs have become
embedded.
The label I have proposed for *Inferred Justification* is drawn from the
criminal justice system. Researchers documented the refusal of judges,
prosecuting attorneys, detectives, and police officers to admit to error
after a conviction was later reversed through DNA evidence (Tavris &
Aronson, 2009). It is very common for those in the system to simply
dismiss the DNA testing and to reinterpret the old evidence so as to
justify the original verdict, becoming very angry with others in the same
system who re-open cases already closed. The need for self-justification leads
prosecutors to use the inverted logic by which if a person actually went to
jail, or was executed, then this in itself is sufficient to justify the
process that led to that person’s conviction.
*Some examples from the author’s experience.*
The strategy for maintaining misinformation explains the bizarre reactions I
have come across in presenting my work on architecture and urbanism. In
developing a theoretical basis for designing buildings and cities, I have
had to fight against a profession that has no rigorous logical or rational
basis, a curious anomaly indeed (Salingaros, 2005; 2006; 2008). When I
present my results, they inevitably contradict accepted twentieth-century
typologies and models of what “good” architecture has come to mean. In fact,
what is “good” is defined strictly by what is currently fashionable and is
supported by a large group of architects, architecture critics,
architectural magazines, architecture prize boards, etc. Arguing against
this establishment is arguing against a group that has been formed by
conforming to accepted images and a group belief system.
I have experienced and thus recognize all of the six tactics listed
previously. When talking to students, I often mention that some
architectural or urban typology is dysfunctional, and that a particular
famous architect who applies it has made a serious mistake. A frequent
response is *Tuning Out*. The student who has been socialized into accepting
everything that famous architects do simply does not know how to respond to
my criticism, hence tunes out. I notice a characteristic frightened look
in the students’ eyes when this happens: they are frightened because they
are at a loss as to what to do in that situation. I have observed this
often, and every time it is disconcerting. These students were never
prepared for the possibility that something they were taught as absolute
truth might in fact be wrong.
When a similar thing occurs with a more senior architect or faculty
member, however, the typical reaction is *Displacement*, *Selective
Support*, or *Irrational Counterarguing*. I should mention that practicing
architects and architectural academics frequently become belligerent and
hostile, applying an extreme case of *Source Derogation*. This violent
response could be explained in terms of their emotional unease due to
cognitive dissonance. Their habitual position of authority in their closed
society is a result of everyone conforming, and their whole value system
is threatened when someone questions it. I have also experienced *Tuning
Out* with senior architects, where it has taken the form of belligerently
cutting off the dialogue.
In a large number of conversations and encounters with architects with
whom I have debated the scientific basis of architecture, nothing was ever
clarified, because my interlocutors presented irrelevant material to
counter the results (*Selective Support*). A recurring strategy is to deny
the very existence of a theoretical basis for architectural and urban
design (and thus all the published literature on the topic). This strategy
ignores the scientific and experimental basis for architecture, and claims
that this discovered body of knowledge is personal preference and not
rational at all (*Displacement*). Those who actually try to question the
scientific results that establish architectural design inevitably turn to
politics and argue around the topic altogether (*Selective Support*).
Mixed in with all of this is the tendency to present irrational
counter-arguments in such debates (*Irrational Counterarguing*). As a
result, architects argue in circles, and no understanding ever emerges.
But the desired result of protecting their false beliefs is achieved: the
strategy defends accepted dogma from any possible threat of revision.
The most disturbing reaction to questioning the paradigm of global
consumerism tied to inhuman architecture and city form approaches religious
conviction. Surrounded by the products of this industrial paradigm long
promoted as necessary for economic progress, common people assume that there
must be inevitable and logical reasons why these unsustainable practices and
inhuman built forms are all around us, but cannot articulate them
(*Inferred Justification*). Surely a superior intelligence has decided
that consumerism and alien-looking buildings are part of the natural
evolution of humankind.
It is INCONCEIVABLE that all of this could be based on misinformation and
misunderstanding, let alone something as shallow as a fashion. Architects
are forced to justify their own profession, but are no better at explaining
these contradictions.
Two interesting examples of a failure to abandon misinformation come to
mind, and they concern two recent books. In the first case, the authors
passionately argue for a peer-to-peer informational system, but introduce
images of inhuman architecture that conform to the worst that the global
consumerist system is promoting. In the second case, the author develops
logical arguments promoting an adaptive architecture on the human scale, but
then illustrates the book with images of inhuman buildings that are again
images promoted by the global architectural/economic consumerist system. I
can only interpret these examples of incongruity in terms of the
tremendous difficulty of breaking away from conditioned images that are
supported by misinformation. Those images are so deeply ingrained in one’s
subconscious that they undermine the efforts of even well-meaning authors.
*Stage three: recover tradition in a peer-to-peer educational framework.*
Faced with the pessimistic outlook for educating those who have accepted
misinformation as truth, we will need to develop a strategy for effecting
social change. What I have discussed above leads me to believe that
factual knowledge is more likely to be discovered and held by individuals
who are not central to the system. Otherwise, both the official education
system that practices “lockout” and the subsequent pressure to conform to
majority opinion will prevent any revision of presently held
misinformation. Individuals who have already realized to some extent that
beliefs promoted by the majority system are false are ready to learn the
truth, and are thus open to rational conversation. Being relatively
isolated (or better, insulated), they are less exposed to the pressures to
conform, but they also probably have difficulty in accessing factual
information, because the system has no interest in maintaining access to
sources that contradict its own beliefs. Information and Communications
Technologies enable the transmission of essential information that is
marginal to the system’s interests. In this way, anyone who has access to
the Internet has the possibility of unlearning misinformation. Note that I
am not proposing the substitution of one set of dogmas for another: what
is happening is a level comparison between misinformation and facts based
upon rationality, contingent upon the responsibility of the individual to
decide the truth for himself/herself.
As long as we can facilitate the self-education of a group of separated
individuals outside the system, we will have saved information that would
otherwise be lost. So far we have defined an intellectual escape from the
regime, and a process by which one can become educated about critical
issues. Yet that does not affect the system itself in any significant way.
The next step is to link those in possession of verifiable information
together into a network. That objective is now feasible through the
internet. Linking up informed individuals creates a peer-to-peer society,
and if this society ever grows to compete in numbers with the established
system, then we have a chance for social and cultural reform.
A peer-to-peer society is by its very definition free from the danger of
becoming just another regime, in which the majority would define dogma and
everyone would be prone to accepting misinformation. The self-correcting
architecture of the peer-to-peer society provides safeguards against this
eventuality (Bauwens, 2005), although that will not prevent wily individuals
or groups from propagating self-serving misinformation in trying to create a
power base out of a peer-to-peer society.
Another significant component of the proposed peer-to-peer solution is to
draw upon inherited knowledge from pre-industrial times. Many of the
problems we have with today’s global consumerist society are a product of
mindless industrialization on a massive scale. The forces that have driven
the unsustainable global consumer frenzy were present at a significantly
lower level in pre-industrial societies. Those societies took better care
of the Earth and its resources (although we have countless cases of
societal collapse through the foolish exhaustion of resources). Today we
can draw upon traditional smaller-scale living solutions, enhanced by the
smart application of the latest technology. In the West, there was a
resurgence of interest in, and a turn towards, smaller-scale solutions in
the 1960s-1970s, but the movement was marginal and was pushed aside by the
global consumerist system. Nevertheless, all the information from
traditional societies, plus the results compiled by the “back-to-the-land”
movement, forms an invaluable knowledge base for a sustainable future for
humankind.
In fact, the peer-to-peer society creates an alternative that needs to work
with the existing system and parallel to it, while protecting itself from
the dominant system’s biases. Simple, practical knowledge is an antidote to
the extremely expensive high-tech extravaganzas that represent the latest
offerings of the global consumerist system. There are many people now
connected in peer-to-peer groups, discovering, re-discovering, and applying
knowledge that is unpopular in the society that promotes global consumerism.
This marginalized information about self-sufficiency, low-tech
sustainability, biophilic design, etc. is currently out of fashion and
therefore must be documented, assimilated, and applied largely outside the
dominant system.
*The helpful nature of mass movements.*
The only way to counter a society that has accepted misinformation is to
create an alternative mass movement that maintains and promotes a set of
correct principles. It is difficult for a scientist to admit this solution,
because it says nothing about the truth of the ideas themselves: the
mechanism of correction is one of belief replacement that is made more
attractive through the force of numbers. Starting a mass movement also has
its own set of dangers, which I would like to discuss briefly. History is
full of examples of “bad” mass movements (dangerous cults, Geometrical
Fundamentalism, Nazism, etc.) and “good” mass movements (early Christianity,
the Civil Rights Movement, universal voting rights, etc.); there is
therefore no intrinsic check on the positive qualities of any mass movement.
Every mass movement proceeds because it promises hope for a new or improved
way of life (Hoffer, 1951). Criticizing the mistakes or erroneous beliefs of
the majority group never works in practice, because “confirmation bias”
blocks such strategies from having any effect. Reasonable and rational
explanations of the desirability of the intended changes don’t work either.
History teaches us that the appeal must above all be an emotional one. Here
the peer-to-peer educational network already has several advantages, while
at the same time being immune to the most dangerous of potential
disadvantages. The immediate appeal of a peer-to-peer network is one of
collaborating towards a better, more shared future: it is not the appeal of
power that draws converts to a dangerous cult or extremist political
movement.
A mass movement necessarily grows outside an existing system of power,
and it can do so at first only because it does not yet threaten the
existing power structure. Let me contrast the ways the two systems
operate. Members of the
majority power structure maintain their allegiance to their system through
vertical rewards: they support and are in turn rewarded by the system.
Members of the emerging mass movement, by contrast, are tied to each other
by a horizontal system of connections. Furthermore, their reward is an
emotional one, not a material one: for example, members of a peer-to-peer
educational network contribute their services for free. Reward comes in the
shared system of values, which, in the case of a peer-to-peer network, is
based upon verifiable truths. We are unlikely to wind up with a shared
system of misinformation, because that situation cannot survive the
open-source critical scrutiny of a peer-to-peer system.
The peer-to-peer educational network is a multiply-connected horizontal
system, constantly checked by its members, growing and adding new ideas that
are then checked, etc. I believe that there exist sufficient systemic checks
to prevent it from transforming into a vertical system of power like the
ones we find today based upon misinformation. Another factor is that
peer-to-peer systems are made possible via electronic media in an age when
social interaction through traditional, physical means is waning. This
brings us full circle: the inhuman architecture and urbanism promoted by
the system in power for a century (Geometrical Fundamentalism) has
intentionally cut daily connections among people, and thus made sure that no
mass movement that could threaten its hegemony would ever arise. It is only
now, almost a century after industrial modernist architecture was introduced
in the 1920s, that the internet finally circumvents the inhuman environments
created by those typologies.
*Kuhn and his paradigm shifts.*
When Thomas Kuhn introduced his famous “paradigm shifts” (Kuhn, 1970), he
described a discontinuous process whereby a scientific theory is suddenly
accepted by the majority of researchers, after a long period in which it
is neglected despite its sound evidentiary basis. Kuhn theorized that it
is necessary to build up some sort of “momentum” before one theory can
replace another, even if the newer theory has a perfectly rational
scientific base and demonstrably explains observed phenomena better than
the theory it will eventually replace. This is not the way science is
supposed to work,
however. Ideally, a better explanation supported by scientific data ought to
easily displace an older and cruder theoretical formulation of the same
observed phenomena. But it doesn’t happen that way.
All too often in the history of science, a much superior explanation is
resisted by the contemporary scientific community and is marginalized and
forgotten, to be re-discovered and appreciated only much later. This
phenomenon sounds very much like the behavior of non-scientists who switch
from one belief to another under the mechanism of groupthink. In this latter
case, there is frequently no basis for rationality: a segment of the
population may switch political alliances, or popular beliefs, or some key
aspect of cultural behavior. Fashions take over the minds of a nation, run
their course, and then give way to yet another fashion. The point is that
during the period when one fashion holds sway, it is nearly impossible to
convince its followers to switch to something else, and rational arguments
have no effect. When change eventually comes, it is sudden.
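The suddenness of such switches can be illustrated with a minimal
threshold-cascade sketch in Python, in the spirit of classic threshold
models of collective behavior rather than anything Kuhn himself proposed;
the threshold distribution and seed sizes below are hypothetical.

```python
import random

def cascade(thresholds, seed_fraction):
    # Each person switches to the new view once the fraction of
    # adopters reaches his/her personal threshold for conforming.
    n = len(thresholds)
    adopters = int(n * seed_fraction)
    while True:
        updated = sum(t <= adopters / n for t in thresholds)
        if updated == adopters:       # nobody else switches
            return adopters / n
        adopters = updated

random.seed(1)
# Most people switch only after roughly a third of the group has.
thresholds = [min(max(random.gauss(0.35, 0.10), 0.0), 1.0)
              for _ in range(10000)]
for seed in (0.20, 0.25, 0.35, 0.40):
    print(f"seed {seed:.2f} -> final {cascade(thresholds, seed):.2f}")
# Small seeds fizzle out towards zero; past a critical size,
# adoption jumps discontinuously towards unanimity.
```

Rational merit plays no role in this model: the only input is how many
others have already switched, which matches the observation that rational
arguments have no effect while a fashion holds sway, and that change, when
it comes, is sudden.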
Kuhn was talking about scientists, who naturally represent one of the most
intelligent and rational segments of any population. Yet scientists
apparently act in an irrational manner when it comes to accepting beliefs
about their own discipline, which itself is supposed to explain natural
phenomena rationally. Science, after all, has an experimental basis:
researchers measure phenomena in the laboratory; it is not just
philosophical speculation. Nevertheless, if scientists are not immune to
irrationality, how then are we to expect non-scientists to be influenced by
rational arguments? Kuhn introduced a term that has been discussed
steadily for several decades, but unfortunately he did not indicate how a
paradigm shift occurs, and, more importantly, how it could be sped up. I
believe that the pieces needed to answer this question lie in the topics
covered in this paper.
*Conclusion*.
This paper reviewed the near impossibility of correcting already held false
beliefs. Having laid out the problems we face in trying to educate people
who are happily following groupthink, it is time to develop strategies for
accomplishing this task. In my experience, individuals holding a worldview
founded upon misinformation occasionally come to an enlightening
breakthrough all by themselves, and they then turn to the available sources
of true information to enrich their knowledge base. I freely admit that I
have had little direct success in converting someone who has been following
groupthink. This pessimistic assessment is borne out by professional
psychologists who deprogram members of dangerous cults: unfortunately,
only a very small percentage of former followers ever succeed in resuming
normal life.
The solution proposed here is to build up an alternative mass movement. This
strategy combines several features that offer grounds for optimism about
educating people. First, developing a peer-to-peer educational network
concentrates and saves from oblivion knowledge that is essential for a
healthy society, but which is either neglected or suppressed by the global
consumerist system. Second, a peer-to-peer network works horizontally, thus
encouraging previously isolated individuals to link into a more supportive
greater whole. Third, the informational nature of the peer-to-peer society
can respond very quickly to the appropriate technological and scientific
developments that can make an enormous difference to the quality of life of
a large number of people. The global consumerist system is unlikely to
develop such innovations itself; more likely, it will attempt to control
them for commercial exploitation.
*References*.
Solomon E. Asch (2003) “Effects of Peer Pressure upon the Modification and
Distortion of Judgments”, Chapter 17 of: Lyman Porter, Harold Angle & Robert
Allen, Editors, *Organizational Influence Processes*, 2nd Edition, M. E.
Sharpe, Armonk, New York, pages 295-303.
Solomon E. Asch (2004) “Opinions and Social Pressure”, Chapter 3 of: Elliot
Aronson, Editor, *Readings About the Social Animal*, 9th Edition, Worth
Publishers, New York, pages 17-26.
Michel Bauwens (2005) “P2P and Human Evolution: Peer to peer as the premise
of a new mode of civilization”, <
http://www.networkcultures.org/weblog/archives/P2P_essay.pdf>.
Gregory S. Berns, Jonathan Chappelow, Caroline Zink, Giuseppe Pagnoni, Megan
Martin-Skurski & Jim Richards (2005) “Neurobiological Correlates of Social
Conformity and Independence During Mental Rotation”, *Biological Psychiatry*,
Volume 58, pages 245-253.
Robert B. Cialdini (1993) *Influence: Science and Practice*, 3rd Edition,
Harper Collins, New York.
Jared Diamond (2005) *Collapse: How Societies Choose to Fail or Succeed*,
Penguin, New York.
Andrés Duany, Elizabeth Plater-Zyberk & Jeff Speck (2000) *Suburban Nation:
The Rise of Sprawl and the Decline of the American Dream*, North Point
Press, New York.
Harold Gerard & Glover Mathewson (1966) “The Effects of Severity of
Initiation on Liking for a Group”, *Journal of Experimental Social
Psychology*, Volume 2, pages 278-287.
W. Daniel Hillis (1998) *The Pattern on the Stone*, Basic Books, New York.
Eric Hoffer (1951) *The True Believer*, Perennial Classics, New York.
Stephen R. Kellert, Judith Heerwagen & Martin Mador, Editors (2008) *Biophilic
Design: The Theory, Science and Practice of Bringing Buildings to Life*,
John Wiley, New York.
Thomas Kuhn (1970) *The Structure of Scientific Revolutions*, 2nd Edition,
University of Chicago Press, Chicago.
George Lakoff & Mark Johnson (1999) *Philosophy in the Flesh: The Embodied
Mind and Its Challenge to Western Thought*, Basic Books, New York.
Giuliana Mazzoni & Amina Memon (2003) “Imagination Can Create False
Autobiographical Memories”, *Psychological Science*, Volume 14, pages
186-188.
Ian McFadyen (2000) *Mind Wars*, Allen & Unwin, St. Leonards, NSW,
Australia.
Stanley Milgram (1961) “Nationality and Conformity”, *Scientific American*,
Volume 205, pages 45-51.
Stanley Milgram (2004) “Behavioral Study of Obedience”, Chapter 4 of: Elliot
Aronson, Editor, *Readings About the Social Animal*, 9th Edition, Worth
Publishers, New York, pages 27-40.
Raymond S. Nickerson (1998) “Confirmation Bias: A Ubiquitous Phenomenon in
Many Guises”, *Review of General Psychology*, Volume 2, No. 2, pages
175-220.
Brendan Nyhan & Jason Reifler (2010) “When Corrections Fail: The Persistence
of Political Misperceptions”, *Political Behavior*, Volume 32, No. 2, pages
303-330.
George Orwell (1949) *Nineteen Eighty-Four*, Penguin Books, London;
reprinted 2003.
Monica Prasad, Andrew J. Perrin, Kieran Bezila, Steve G. Hoffman, Kate
Kindleberger, Kim Manturuk & Ashleigh Smith Powers (2009) “There must be a
reason: Osama, Saddam, and Inferred Justification”, *Sociological Inquiry*,
Volume 79, No. 2, pages 142-162.
Peter J. Richerson & Robert Boyd (2005) *Not by Genes Alone: How Culture
Transformed Human Evolution*, University of Chicago Press, Chicago.
Nikos A. Salingaros (2005) *Principles of Urban Structure*, Techne Press,
Amsterdam, Holland.
Nikos A. Salingaros (2006) *A Theory of Architecture*, Umbau-Verlag,
Solingen, Germany.
Nikos A. Salingaros (2008) *Anti-Architecture and Deconstruction*, 3rd Ed.,
Umbau-Verlag, Solingen, Germany.
Carol Tavris & Elliot Aronson (2009) *Mistakes Were Made*, Harcourt,
Orlando, Florida (first published 2007).
Edward O. Wilson (2006) *The Creation: An Appeal to Save Life on Earth*, W.
W. Norton, New York.
Philip Zimbardo (2007) *The Lucifer Effect*, Random House, New York.
Julia Zuwerink-Jacks & Kimberly Cameron (2003) “Strategies for Resisting
Persuasion”, *Basic and Applied Social Psychology*, Volume 25, No. 2,
pages 145-161.
--
P2P Foundation: http://p2pfoundation.net - http://blog.p2pfoundation.net
Connect: http://p2pfoundation.ning.com; Discuss:
http://listcultures.org/mailman/listinfo/p2presearch_listcultures.org
Updates: http://del.icio.us/mbauwens; http://friendfeed.com/mbauwens;
http://twitter.com/mbauwens; http://www.facebook.com/mbauwens
Think tank: http://www.asianforesightinstitute.org/index.php/eng/The-AFI