I really like the essay. The Swedish navy was absolutely convinced the Soviets were invading with submarines and divers, so it blasted the water with everything it had; the intruders turned out to be minks and seals, animals quite at home underwater. Perhaps that's a metaphor for our mesodermal friends, who blast every pain as if some contrary piece of mesoderm deep inside simply refuses to "heal" or bend or move the right way.
The Swedes remained vigilant, certain the Soviets were lying when they denied sending submarines into Swedish waters. Only after the Soviet regime changed completely did they unlearn their vigilance and realize they'd been fighting marine mammals. Hilarious.

Here is more that pertains to us:
Surprisingly perhaps, technical experts may be among the most resistant to new ideas and to evidence that contradicts their current beliefs and methods. Their resistance has several bases. Experts must specialize and their specialized niches can become evolutionary dead-ends (Beyer, 1981). Because experts' niches confer high incomes and social statuses, they have much to lose from social and technical changes. Expertise creates perceptual filters that keep experts from noticing social and technical changes (Armstrong, 1985). Even while experts are gaining perception within their domains, they may be overlooking relevant events just outside their domains.
Second, organizations make it more difficult to learn without first unlearning. People in organizations find it hard to ignore their current beliefs and methods because they create explicit justifications for policies and actions. Also, they integrate their beliefs and methods into coherent, rational structures in which elements support each other. These coherent structures have rigidity that arises from their complex interdependence. As a result, people in organizations find it very difficult to deal effectively with information that conflicts with their current beliefs and methods. They do not know how to accommodate dissonant information and they find it difficult to change a few elements of their interdependent beliefs and methods. The Swedish sailors who conducted the searches had been trained to interpret certain sounds as a submarine and rising bubbles as a diver; they had not been prepared for the sounds and bubbles made by animals. A Swedish navy that had just spent three weeks dropping depth charges and antisubmarine grenades in the belief that it had trapped an intruder was not ready for the idea that it had been deceived by playful young seals.
Tushman, Newman, and Romanelli (1986) characterized organizations' development as long periods of convergent, incremental change that are interrupted by brief periods of "frame-breaking change."
They said "frame-breaking change occurs in response to or, better yet, in anticipation of major environmental changes." However, even if abrupt changes do sometimes "break" people's old perceptual frameworks, the more common and logical causal sequence seems to be the opposite one. That is, people undertake abrupt changes because they have unlearned their old perceptual frameworks.
Third, unlearning by people in organizations may depend on political changes. Belief structures link with political structures as specific people espouse beliefs and methods and advocate policies (Hedberg, 1981).
Since people resist information that threatens their reputations and careers, it may be necessary to change who is processing information before this information can be processed effectively. Thus, a change in control of the Swedish government may have been essential before the Defense Ministry could concede the possibility of errors in the conduct of antisubmarine hunts. A change in control of the Soviet Union may have been essential before the Swedes could allow the possibility of Russian vulnerability or truthfulness.
Top managers' perceptual errors and self-deceptions are especially potent because senior managers can block actions proposed by their subordinates. Yet, senior managers are also especially prone to perceive events erroneously and to overlook bad news. Although their high statuses often persuade them that they have more expertise than other people, their expertise tends to be out-of-date. They have strong vested interests, and they know they will catch the blame if current policies and actions prove wrong (Starbuck, 1989).