Barriers to Accurate Information Gathering in Organizations
[Including Governments]
Martin G Evans
Professor Emeritus of Organizational Behaviour
Rotman School of Management, University of Toronto
M. G. Evans
48 Griswold Street
Cambridge, MA 02138
“I expect to get valid information ... I can’t
make good decisions unless I get valid information”
George W. Bush, April 13, 2004.
George Bush’s cry is echoed by every organizational manager in the world. Every manager would like to be sure that the information he or she receives is both timely and accurate. Every good manager knows that it is his or her responsibility to make sure that the information received is timely and accurate. Every good manager also knows the three major barriers to the realization of this heart’s desire: self-serving subordinates, Groupthink, and uncertainty absorption. Good managers act proactively to ensure that these barriers are overcome.
Self-Serving Subordinates
In organizations, information is power. By holding back information, an individual makes others dependent on him in their decision-making processes. In other situations, where there is a major difference in technical competence between a manager and a subordinate (so that the boss is unaware that the wool is being pulled over his eyes), the subordinate can manipulate the flow of information so as to make sure the boss makes decisions that are consistent with the subordinate’s best interests, even though they may be sub-optimal for the organization. The
classic description of this process was provided by Andrew Pettigrew many years ago in his
graphic description of the power of an information gatekeeper in affecting the outcome of a
strategic organizational decision (to make a major computer purchase). By carefully controlling the flow of information between a set of suppliers and the decision-making body (the Company's Board of Directors), the gatekeeper ensured that 'his' supplier received the contract for the purchase. The gatekeeper used the following tactics:
a) Providing full and timely replies to communications from the supplier that he favored, but only reluctant and tardy replies to communications from 'other' suppliers.
b) Refusing to visit or be "wined and dined" by the other suppliers.
c) Refusing to let his subordinate managers (each of whom favored a different supplier based upon their departmental interests; this was, after all, a multi-party political game) have direct access to the deliberations of the decision-making body. The only exception was an occasion when he knew that the most influential board member would be absent.
d) Providing a biased balance of positive and negative information about 'his' supplier and the other suppliers to the decision-making body. He did this by diagnosing his department's 'assessed stature' with the decision-making body (how good the department looked in the eyes of the Board); he then passed information that he wanted to be believed (positive information about 'his' supplier, negative information about other suppliers) when his stature was high, and passed information that he wanted to be ignored or discounted (negative information about 'his' supplier, positive information about the others) when his department's stature was low. Note that over the whole decision period he provided balanced positive and negative information about each supplier. The imbalance lay in his strategic shifting of the mix of positive and negative information depending on how his bosses perceived him.
How can managers prevent being manipulated in this way? The first shield against this kind of manipulation is to be as technically well informed as the subordinate. This enables the manager to undertake his own evaluation of the adequacy of the information. However, this shield is rarely available in a world where problems are multifaceted and it is impossible for the manager to be an expert in every facet. The second, more proactive, technique is for the boss to reach down the hierarchy and gain information directly from the subordinate's own subordinates (and the suppliers) rather than having it squeezed through the bottleneck of the subordinate's filters.
A further distortion introduced by self-aggrandizing subordinates is the suppression of bad news and the inflation of good news. This tendency increases when several subordinates are in competition for the support of the boss and when, as in this case, the boss has the power to decide the fate of those subordinates (in terms of budgetary outcomes). This is what appears to have happened in the famous August 6, 2001 President's Daily Brief (PDB). The seventy “full field” investigations that the FBI claimed to be undertaking seem to have been rather less than full, involving passive monitoring of suspects’ financial affairs rather than of their quotidian activities (like learning to fly). Again, a vigilant manager would look beyond the label “full field” and ask what exactly was being done, to whom, where, and why. Tough questioning might have revealed how hollow these “full field” investigations were, with a consequent increase in surveillance of the suspects. This tendency to inflate good news and suppress bad news can also be minimized by developing trust among the parties – between the boss and each subordinate, and among the subordinates themselves.
Groupthink
The Bay of Pigs, the decision to bomb North Vietnam, and the Iranian adventure provide
us with ample illustrations of Groupthink. This magnificently Orwellian term was coined by
psychologist Irving Janis to refer to a group's inability to tolerate dissent: a situation in which getting agreement around a solution becomes more important than developing the best solution.
President George W. Bush has yet to learn that tough action must be taken to prevent the forces of Groupthink from dominating. What does Groupthink look like? How would we recognize it in our own decision-making groups? Janis and, earlier, Norman Maier identified a number of symptoms that we should be on guard against.
It is often difficult, because of the Groupthink phenomenon, for us to recognize when
these symptoms occur in our own groups -- after all, we know better; we would never make the same mistakes. That illustrates the first and greatest problem: the group believes that it is right; that it is right both factually and morally, and that consequently the scenario of the operation (Bay of Pigs, Iran) will unfold as planned. This sense of invulnerability and moral correctness leads to four things that affect the group as it engages in the process of making a decision. First, there
exists a sense of unanimity within the group; everyone agrees on a plan. Indications of this are a
failure to generate more than one or two alternative courses of action, and an individual's
unwillingness to express his or her reservations -- individuals are their own self-censors. They
suppress and keep to themselves their doubts and reservations about the plan. Often, many
people share these reservations, and the group lacks true unanimity. In addition, members of the group put subtle pressure on those whose questions slip through the guard of their own self-censoring.
As Janis observes, they do this by limiting the bounds of criticism to details of the plan rather than to its underlying assumptions, and by isolating the dissenter in a slightly ridiculous role, as in Lyndon Johnson's name for Bill Moyers: "Let's hear what Mr. Stop-the-Bombing has to say." Finally, and most injuriously, the group develops mindguards. These are the people who
filter the information coming to the group. They make sure that outside information is suppressed
or reinterpreted if it fails to support the cherished assumptions of the group. As a result of this
process, the group makes its decision based only on information that supports that decision.
This builds up a self-fulfilling cycle of correctness. The illusion of rightness and unanimity is
preserved; no disruptive questioning or information is admitted by the group.
These processes are pervasive in decision-making groups. One of the few ways of
reducing their severity is to institutionalize dissent. Janis and Maier prescribe several mechanisms for doing this; the mechanisms represent a re-establishment of the traditional American system of checks and balances in the political process.
A group should routinely make decisions in a two-stage process. First, critical evaluation should be suspended and a wide variety of alternative courses of action generated; the final list should include the off-beat as well as the obvious. This can be done by having several subgroups or individuals brainstorm lists of alternatives, which are then brought back to the policy-making group. In addition, group members should be encouraged to discuss the issues
with their fellow workers and subordinates so as to bring a wider range of alternatives to the
group's attention.
Second, each alternative should be criticized in terms of its strengths and weaknesses, with equal time given to both aspects. Work should also be done on integrating several flawed solutions into a better one. This can be accomplished by the leader formally assigning to each group member the responsibility of criticizing the alternatives. This alone is not enough; the leader must accept with good grace and an open mind the criticisms made of his or her own ideas and proposals. Without this, all criticism will degenerate into a few pro forma comments. It is also helpful to augment the group, from time to time, with outside members who can bring a wider range of criticisms to bear upon the alternatives under consideration. These criticisms are apt to be less inhibited than those of the group members, as the outsiders will not have built up any ownership of the alternatives under discussion.
Third, when the group is close to reaching a decision about the best alternative, the leader
should appoint a Devil's Advocate for each of the two or three options that are under serious consideration. These individuals are to challenge the assumptions and expectations of the proponents of each alternative.
Finally, as a group approaches a final decision, it should be augmented by outsiders who
have not been involved in the decision so far. These people will be able to bring to bear a wider
variety of perspectives and their comments and criticisms will be less inhibited than those of
group members as they will not have built up any ownership of the proposed solution.
At the last step -- once a decision has been taken -- the group should meet one more time
to review doubts and challenge the correctness of the decision. We have all experienced what the French call l'esprit de l'escalier -- the "bottom of the stairs" feeling -- when one remembers the critical comment that might have changed the course of a discussion. This "last chance" meeting gives us the
opportunity to make that comment. It is like those Viking chiefs who used to make their
decisions twice over: once when drunk and once when sober. If the two agreed they would carry
out the decision; if they differed they would think again.
These practices will not completely eliminate errors of judgement, nor will they guarantee
success, as even the best laid plans may go awry. They will, however, minimize the chance of failure due to overlooking possible alternatives or failing to consider the potential side-effects of the alternative selected.
Of course, the successful implementation of each of these safeguards depends critically
upon the role of the group's leader. The leader must show that he/she values dissent and should
provide a strong role model of the acceptance of critical comment. This means that the group
leader should keep an open mind, be open to criticism, and not commit himself or herself to a course of action until all the alternatives have been thoroughly explored. Can George Bush adhere to this self-denying ordinance?
Uncertainty Absorption
The third prevalent source of information distortion in both interpersonal and
organizational communication is uncertainty absorption. This occurs when raw data are summarized, aggregated, and edited before being transmitted onward. Two processes seem to be involved: the selection of data from the plethora of incoming stimuli, and the packaging of these data for transmission. In both processes, the monitor/communicator's frame of reference or cognitive map is crucial in determining what information will be noticed and what information will be transmitted. In either process, information can be suppressed, attenuated, or enhanced.
It is clear that organizational position (functional specialty, staff/line, hierarchical level)
affects the frame of reference that individuals use to scan their environments and incoming
communication messages. These differences are found both in terms of the aspects of the
environment attended to and in terms of the complexity of the individual's cognitive map. For
example, higher-level managers in both staff and line functions had greater cognitive complexity than managers at lower levels; however, high-level staff managers were less complex (more single-tracked) than their line counterparts.
The problem is that information congruent with a particular frame of reference is more
easily noted or transmitted. When people with different frames of reference communicate, the
receiver will distort the incoming information to fit her/his frame of reference. In organizations,
different departments have different "funds of knowledge" and different "frames of reference."
For example, R&D people are concerned with technical and, to a lesser extent, business issues; Marketing people are more balanced, having almost equal concern with business issues, customer needs, selling, and technical issues; managers in Manufacturing, however, are mainly concerned with production issues and, to a lesser extent, technical ones. These different foci make it difficult to share insights. Deborah Dougherty found that successful product development occurred only when firms broke out of these habituated ways of thinking. Successful innovation resulted when someone, somehow, ensured that multiple frames of reference were considered. Only when issues and solutions were examined in this variety of ways was success ensured.
Similarly, messages are transmitted up organizational hierarchies, and each level has its own concerns: folk at the bottom are concerned with operational nitty-gritty; those in the middle worry about administrative issues; those at the top focus on strategy. Thus information selected at the bottom for its
operational relevance may be useless from a strategic perspective – especially after the middle
managers have put their “administrative” spin on it. Furthermore, information that is relevant
from a strategic perspective may never be noticed because it is irrelevant for operational
purposes. The enhancement of information for strategic purposes is demonstrated by the use Britain’s Joint Intelligence Committee made of raw intelligence in the run-up to the Iraq war. In that case, an unsupported piece of information -- that Hussein could deploy WMD at 45 minutes’ notice -- was bolstered to become a major plank in the British Government’s case for going to war. Chapter 6 of the Hutton report describes the process nicely (URL: http://news.bbc.co.uk/1/shared/spl/hi/uk/03/hutton_inquiry/hutton_report/html/chapter06.stm#a34).
Uncertainty absorption is probably the most difficult of the three barriers for an organization, especially a large, differentiated organization, to overcome. Doing so requires people in different divisions and at different levels to take the time and energy to understand the mindsets of people in other positions in the organization. Presidents need to talk to operatives, Marketers to Researchers, and, in government, everyone needs to understand the political implications of what they do.
To overcome each of these barriers to effective communication, Presidents and top managers need to be proactive. They need to actively explore frames of reference, challenge the assumptions underlying the options proposed to them, and get opinions from all parties affected by the decisions they make. Can George Bush change his incurious ways?