
6 ways to make dumb groups smarter

Since the beginning of human history, people have made decisions in groups. As the saying goes, two heads are better than one. If so, then three heads should be better than two, and four better still – hence the supposed wisdom of crowds.

The key is information aggregation: Different people take note of different “parts,” and if those parts are properly aggregated, they will lead the group to know more (and better) than any individual.

Unfortunately, groups all too often fail to live up to this potential. Companies bet on products that are doomed to fail and miss out on spectacular opportunities. In governments, policy judgments misfire, hurting thousands of people in the process.

WHY DO ERRORS OCCUR?

Groups err for two main reasons. The first involves “informational signals.” Naturally enough, people learn from one another; the problem is that groups often go wrong when some members receive incorrect signals from other members. The second involves “reputational pressures,” which lead people to silence themselves or change their views to avoid some penalty – often, merely the disapproval of others.

As a result of informational signals and reputational pressures, groups run into four problems:

+ Groups do not merely fail to correct the errors of their members; they amplify them.

+ They fall victim to cascade effects, as group members follow the statements of those who spoke first.

+ They become polarized, taking up positions more extreme than those they held before deliberations.

+ They focus on what everybody knows already – and thus don’t take into account critical information that only a few people have.

AMPLIFYING ERRORS

With the psychologists Daniel Kahneman and the late Amos Tversky in the vanguard, behavioral scientists have identified some common mental shortcuts (known as heuristics) and biases that lead individuals astray. The “planning fallacy,” for example, leads us to underestimate how much time projects will take and how much money they’ll cost. “Overconfidence” leads us to believe that our forecasts are more accurate than they are. The “availability heuristic” leads us to seize on whatever springs most readily to mind.

The central question is whether groups can avoid or mitigate these errors. Experimental evidence indicates that they usually do not. The psychologists Roger Buehler, Dale Griffin and Johanna Peetz have found, for example, that the planning fallacy is aggravated in groups. That is, groups are even more optimistic than individuals when estimating the resources necessary to complete a task.

CASCADING TO THE WRONG ANSWER

The human brain may be wired from birth to imitate other people. When it comes to group decisions and information flow, the favored term among social scientists is “cascade” – a small trickle in one direction that soon becomes a flood.

If a product, a business or a cause gets a lot of support early on, it can win over a group even if it would have failed otherwise. Many groups end up thinking that their ultimate convergence on a shared view was inevitable. Beware of that thought. The convergence may well be an artifact of who was the first to speak.

POLARIZING GROUPS

Why does group polarization occur? There are three reasons:

The first involves informational signals. Group members pay attention to the arguments made by other group members. Arguments in any group with an initial predisposition will inevitably be skewed in the direction of that predisposition. As a statistical matter, the arguments favoring the initial position will be more numerous than those pointing in another direction. Individuals will have thought of some but not all the arguments that emerge from group deliberation. Thus deliberation will naturally lead people toward a more extreme point in line with what they initially believed.

The second reason involves reputation. People want to be perceived favorably by other group members. Once people hear what others believe, they will adjust their positions at least slightly in the direction of the dominant position in order to preserve their self-presentation.

The third reason stresses the links among three factors: confidence, extremism and corroboration by others. When people lack confidence, they tend to be moderate. As people gain confidence, they usually become more extreme in their beliefs, because a significant moderating factor – their own uncertainty – has been eliminated. The agreement of others tends to increase confidence and thus extremism.

FOCUSING ON “WHAT EVERYBODY KNOWS”

Suppose a group has a great deal of information – enough to produce the unambiguously right outcome if that information is elicited. Even so, the group will not perform well if its members emphasize broadly shared information while neglecting information that is held by one or a few.

“Hidden profiles” is the technical term for accurate understandings that groups could but do not achieve. Hidden profiles are a product of the “common knowledge effect,” whereby information held by all group members has more influence on group judgments than information held by only a few.

MAKING GROUPS WISER

A central goal in group decision-making should be to ensure that groups aggregate the information their members actually have and don’t let faulty informational signals and reputational pressures get in the way. Here are six ways to achieve that goal:

SILENCE THE LEADER. Leaders often promote self-censorship by expressing their own views early, thus discouraging disagreement. High-status members can do groups a big service by indicating a willingness to hear uniquely held information. They can also refuse to take a firm position at the outset.

“PRIME” CRITICAL THINKING. Social scientists have done a lot of work on the importance of “priming” – that is, triggering some thought or association in such a way as to affect people’s behavior. In experiments on group decision-making, engaging participants in a prior task that involves either “getting along” or “critical thinking” has been shown to have a big impact. When people are given a “getting along” task, they shut up. When given a “critical thinking” task, they are far more likely to disclose what they know.

REWARD GROUP SUCCESS. People often keep silent because they receive only a fraction of the benefits of disclosure. Experiments have shown that incentives can be restructured to reward group success – and hence to encourage the disclosure of information. Cascades are far less likely when each individual knows that he has nothing to gain from a correct individual decision and everything to gain from a correct group decision.

ASSIGN ROLES. Experiments have found that the bias in favor of shared information is reduced when test subjects are openly assigned specific roles. If a group wants to obtain the information that its members hold, they should be told before deliberations begin that each has a different and relevant role.

APPOINT A DEVIL’S ADVOCATE. If hidden profiles and self-silencing are sources of group failure, a tempting approach is to ask some group members to act as devil’s advocates, urging a position that is contrary to the group’s inclination. Those who assume that role can avoid the social pressure that comes from rejecting the group’s dominant position.

ESTABLISH CONTRARIAN TEAMS. Another method is “red teaming.” Red teams come in two basic forms: those that try to defeat the primary team in a simulated mission, and those that construct the strongest possible case against a proposal or a plan.

USE THE DELPHI METHOD. This approach combines the virtues of individual decision-making with social learning. Individuals offer first-round estimates (or votes) in complete anonymity. Then a cycle of re-estimation (or repeated voting) occurs, with the requirement that second-round estimates fall within the interquartile range (25th to 75th percentile) of the first round. The process repeats until the participants converge on an estimate.
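The Delphi cycle described above is simple enough to sketch in code. The following is a minimal illustration, not a standard implementation: the `revise` function is a stand-in for each participant's private re-estimate (here, the usage example assumes participants move halfway toward the previous round's median), and the clamping step enforces the interquartile-range rule from the text.

```python
import statistics

def delphi(initial_estimates, revise, max_rounds=10, tol=1e-6):
    """Sketch of a Delphi estimation process.

    initial_estimates: anonymous first-round numbers, one per participant.
    revise: function (own_estimate, group_median) -> new estimate, standing
            in for each participant's private judgment in the next round.
    Each revised estimate is clamped to the interquartile range (25%-75%)
    of the previous round, per the rule described in the text.
    """
    estimates = list(initial_estimates)
    for _ in range(max_rounds):
        q1, median, q3 = statistics.quantiles(estimates, n=4)
        # Re-estimate, then clamp to the previous round's middle quartiles.
        new = [min(max(revise(e, median), q1), q3) for e in estimates]
        if max(new) - min(new) < tol:  # participants have converged
            return new
        estimates = new
    return estimates

# Hypothetical usage: five anonymous first-round votes, with each
# participant revising halfway toward the group median.
votes = [10, 20, 30, 80, 100]
result = delphi(votes, revise=lambda e, m: (e + m) / 2)
```

Because each round restricts estimates to an ever-narrower interquartile band, the spread of opinion shrinks from round to round, mimicking the convergence the method is designed to produce.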

Group failures often have disastrous consequences. The good news is that decades of empirical work, alongside recent innovations, offer some practical correctives that can make groups a lot wiser.

GROUP POLARIZATION IN ACTION – BOULDER VS COLORADO SPRINGS

To examine the phenomenon of group polarization, the two of us (along with the social scientist David Schkade) created an experiment in group deliberation – one that, we believe, accurately reflects much deliberation in the real world.

We recruited citizens from two Colorado cities and assembled them in small groups (usually six people), all from the same city. The groups were asked to deliberate on three of the most contested issues of the time: climate change, affirmative action and same-sex civil unions. The two cities were Boulder, known by its voting patterns to be predominantly liberal, and Colorado Springs, known by its voting patterns to be predominantly conservative. We did a reality check on the participants before the experiment started to ensure that the Boulder residents were in fact left of center and the Colorado Springs residents were right of center.

Group members were asked first to record their views individually and anonymously and then to deliberate together in an effort to reach a group decision. After the deliberations the participants were again asked to record their views individually and anonymously. Here’s what we found:

1. People from Boulder became a lot more liberal, and people from Colorado Springs became a lot more conservative. Not only were the group “verdicts” more extreme than the pre-deliberation averages of group members, but the anonymous views of individual members became more extreme as well.

2. Deliberation decreased the diversity of opinion among group members. Before the groups started to deliberate, many of them showed considerable divergence in individual opinions. Discussion brought liberals in line with one another and conservatives in line with one another. After a brief period of discussion, group members showed a lot less variation in the anonymous expression of their private views.

3. Deliberation sharply increased the disparities between the views of Boulder citizens and Colorado Springs citizens. Before deliberation, many people’s opinions overlapped between the two cities. After deliberation, group dynamics left liberals and conservatives much more sharply divided.


*Source: http://www.brw.com.au/p/leadership/ways_to_make_dumb_groups_smarter_bUHh7g9gSqwheLuZCG4uGM
