CUNY Brooklyn College            Department of Computer and Information Science
CIS 763X: Software Methodology
with Dr. D. Kopec                           Fall, 2000



Normal Accidents

by Charles Perrow

Chapter 9: Living with High-Risk Systems: What Is to Be Done?

Presented by Joan Seabourne


Summary

Chapter 9: Living with High-Risk Systems: What Is to Be Done?

Charles Perrow proposes partitioning high-risk systems into three categories. The first contains systems that are hopeless and should be abandoned because the inevitable risks outweigh any reasonable benefits (nuclear weapons and nuclear power). The second contains systems that we are unlikely to be able to do without but which could be made less risky with considerable effort (some marine transport), or whose expected benefits are so substantial that some risks should be run, though not as many as we are currently running (DNA research and production). The third and final group includes systems that are self-correcting to some degree and could be further improved with modest effort (chemical plants, airliners and air traffic control, mining, fossil-fuel power plants, and highway and automobile safety). Perrow believes his recommendations are consistent with public opinion and public values.

However, Perrow acknowledges three substantial objections to his recommendation of abandoning some high-risk systems and making drastic and costly modifications to others. The first objection holds that his recommendations are wrong if the science of risk assessment, as currently practiced, is correct. Current risk-assessment theory suggests that nuclear power and nuclear weapons have done almost no harm to people, and are therefore low risk, while fossil-fuel plants, automobiles, and mining have done a great deal of harm, and are therefore high risk. Perrow believes the opposite: nuclear power plants are high risk and should be abandoned, whereas auto safety, fossil-fuel plants, and air traffic control are lower-risk systems that could be improved with modest self-correcting efforts. He further argues that since the science of risk assessment is new, it deserves scrutiny. The second objection holds that his recommendations could be wrong if they can be shown to be contrary to public opinion and values, or, where they are not contrary, that public opinion and values are themselves ill informed and should be corrected rather than respected. Some work in cognitive psychology has suggested that the public is ill informed and, because of the way it reasons, ill equipped to make important decisions about very complex matters; Perrow believes this work is in error. The third objection holds that high-risk systems can be run safely: doing so simply requires an authoritarian, rigidly disciplined, error-free organization, such as naval nuclear submarines appear to have. On this view there is an organizational solution, one the public is unwilling to put in place.

Perrow believes that living with risky systems means "keeping the controversies alive, listening to the public, and recognizing the essentially political nature of risk assessment." The issue, he believes, is not risk but power: "the power to impose risks on the many for the benefit of the few."
 
The first objection to Perrow's recommendations asks whether current risk-assessment theory, which judges nuclear power to be low risk while judging auto safety and fossil fuels to be high risk, is valid. Perrow observes that the appearance of potentially catastrophic man-made processes has caused public concern, which in turn has drawn responses from the vendors of these processes and from a number of social scientists. These inquiries have produced a new field called risk-benefit analysis, or risk assessment. Perrow believes that risk assessment carries its own risks. The activity itself is not new: people with power have always commissioned risk assessments when making important decisions, to calculate probable benefits and probable costs. Throughout history rulers and property owners have sought the advice of court advisors, priests, astrologers, and lawyers. As risk increasingly came from technological activities, scientists and engineers replaced these advisors. Many of our catastrophic systems today are not new and have already been subjected to simple forms of risk assessment. Mining has been with us for two centuries; the failures of bridges and ships, even railroad and aircraft disasters, are not new. But third- and fourth-party victims were not present in catastrophic numbers in the older high-risk systems. Vessels could not pollute the shoreline, and World War II bombers could not crash into buildings holding nuclear weapons, as happened at an unidentified overseas base in 1956. Chemical plants were not as large, as close to communities, or processing such explosive and toxic chemicals. It is only recently that almost every densely populated section of the country has risked a nuclear plant accident. Furthermore, nuclear power, nuclear weapons, and recombinant DNA are new systems with the potential for third- and fourth-party victims: innocent bystanders and future generations.

As the number of new risks grows, so does the number of risk assessors. Their function is not only to advise the rulers about the risks but, should the risks be taken, to reassure the public. With the increase in risks and in public concern, regulatory agencies have appeared in large numbers, so another function of risk assessors is to second-guess these agencies. Risk assessors usually call for less regulation and are severe in their criticism of regulatory agencies. The professionals in this field are generally engineers, scientists, and social scientists based in universities, research organizations, government regulatory agencies, military establishments, and industry trade groups. Private, profit-making research or consulting groups, such as management consulting firms, run this profitable business for the government and industry. Trade associations such as the Electric Power Research Institute conduct or sponsor risk-assessment studies. For example, General Motors sponsored a conference on risk and published the proceedings under the title "How Safe Is Safe Enough?" The leading experts in the field were there, but their primary concern was not the risks of military or industrial activity, but the risks of regulation. Some of the best scientific and social-science minds are at work on the problem of "how safe is safe enough." It is a narrow field, with mathematical models and the ALARA principle (as low as reasonably achievable) being debated at conferences. In these models everything has a price; if something does not have a price, it cannot be entered into the calculations.
For example, one study concluded that a life is worth roughly $300,000, less if you are over sixty. The assessors feel that a life is a life: death by diabetes should have the same impact on the public as death by murder, and they are upset that the public is unaware that more people die from diabetes than from murder. The assessors feel that the public's perception of risk is subject to biases, due largely to the sensationalism of the media. To the experts, fifty thousand highway deaths a year are equal to a single catastrophe with fifty thousand casualties, so they are upset that the public protests nuclear plants while estimating highway deaths at only half their actual number. The assessors then say that these biases may misdirect the actions of public-interest groups and government agencies, resulting in less than optimal control of risk. Perrow believes that the risks from risky technologies are not borne equally by the different social classes and that risk assessors ignore the social-class distribution of risk. Cost-benefit analysis relies on current market prices for evaluating costs and benefits; therefore, people with low earning power receive lower prices on their lives. Perrow notes, for example, that the current market price for temporary nuclear workers is quite low, given the long recession. Likewise, property values near a chemical plant are likely to be low because of odors, fumes, and fire and explosion risks. When an accident takes place, the damage to the environment is calculated in terms of values already depressed by the accident potential, rather than what the land would be worth if an electronics plant or a nice park were there.
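The pricing logic Perrow criticizes can be made concrete with a small sketch (Python). Only the $300,000 life valuation comes from the text; every other figure below is hypothetical, chosen to show how a market-price model values the same physical accident differently depending on the victims' earning power and on land prices already depressed by the plant's presence.

    # A minimal sketch of the market-price cost-benefit model described above.
    # The $300,000 valuation is the study's figure cited in the text; all
    # other numbers are hypothetical, chosen only to illustrate the bias.

    BASE_LIFE_VALUE = 300_000  # dollars per statistical life

    def value_of_life(age, earning_power_ratio):
        """Market-price models discount a life by earning power and age."""
        value = BASE_LIFE_VALUE * earning_power_ratio
        if age > 60:
            value *= 0.5  # "less if you are over sixty" (illustrative discount)
        return value

    def accident_cost(victims, acres_damaged, price_per_acre):
        """Total 'cost' of one accident: priced lives plus priced property."""
        lives = sum(value_of_life(age, ratio) for age, ratio in victims)
        return lives + acres_damaged * price_per_acre

    # Near a chemical plant: low earners, land already depressed by the risk.
    near_plant = accident_cost([(35, 0.6), (70, 0.6)], 100, price_per_acre=2_000)
    # The same physical accident beside an affluent suburb 'costs' far more.
    suburb = accident_cost([(35, 1.5), (70, 1.5)], 100, price_per_acre=20_000)
    print(near_plant, suburb)  # 470000.0 vs 2675000.0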
Risk assessors feel that the country must push ahead with risky endeavors or other companies or nations will beat the U.S. in the competitive race to the marketplace. This line of reasoning notes that our country was founded in risk and grew powerful by taking risks, and that the social benefits have been enormous; for example, the Japanese will otherwise beat the U.S. in genetic engineering. Perrow asks: why take the risks if the Japanese can do it with more safeguards? Does it matter all that much if we buy these benefits from Japanese firms rather than U.S. firms? He answers his own question: U.S. firms such as oil companies and the pharmaceutical and chemical companies would lose private profits and dividends to stockholders. These risks are taken to promote the private profit of a few, a distinction the economist's model does not capture: in the model a dollar saved is a dollar saved, no matter who gets the dollar or who lives a riskier life to save it for them. And to those who say this country was built on risk, Perrow replies that the risks that made this country great were not industrial risks, such as unsafe coal mines or chemical pollution, but the social and political risks associated with democratic institutions, decentralized political structures, religious freedom and plurality, and universal suffrage.
 
Risk-assessment and risk-benefit studies do not distinguish between addiction and free choice. Highway fatalities and lung cancer from smoking are treated as voluntary activities, like hang-gliding. But most people who smoke today do so because they were barraged with advertisements that soon addicted them. In World War II every packet of field rations held five cigarettes per meal, and sales of cigarettes to the armed forces were untaxed. Airlines used to pass out cigarettes to their passengers, presumably to calm their nerves after takeoff. Today young people see a large number of adults who have not been able to break the habit, and all are still barraged by smoking advertisements. Smoking is not a decision made by informed consumers but a government-supported program of addiction for private profit. Perrow therefore feels that an individual's addiction to smoking should not be compared to the costs industry must be forced to incur, such as making safer Christmas toys.

Risk assessors sometimes distinguish between active and passive risks. Active risks are those over which the individual performing the activity has some control. A person tends to accept risks more easily when he thinks his skills will play a part in avoiding the hazards, as in driving, skiing, and parachuting; these active risks are also considered voluntary. However, Perrow considers driving to work an involuntary risk, even though the person has some control over the activity. Active risks are generally not pursued for someone else's private profit. Perrow also notes two dangers of active risks. First, consumers will not always voluntarily pay for safer products. Second, active risks are attractive: a person likes to take some risk if he feels he has control over it. For example, as the risk of skiing declines with better equipment, ski resorts attract more novice skiers, who feel the risk has been reduced to a level they will tolerate. The end result is that the accident level may not change, because more novices mean more accidents, including accidents for the experienced skiers they run into. Therefore any assessment of an active risk must take into account the number of participants and the proportion of new, unskilled ones.
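A back-of-the-envelope calculation makes the skiing point concrete. All the rates and counts below are hypothetical; only the mechanism, that safer equipment draws in more and less-skilled participants, comes from the text.

    # Hypothetical illustration of risk compensation on a ski slope.
    # Every number is invented; only the mechanism is from the text.

    def total_accidents(n_skiers, novice_share, novice_rate, expert_rate):
        """Expected accidents per season for a given mix of skill levels."""
        novices = n_skiers * novice_share
        experts = n_skiers * (1 - novice_share)
        return novices * novice_rate + experts * expert_rate

    # Before better equipment: fewer skiers, fewer novices, higher rates.
    before = total_accidents(1_000, novice_share=0.2,
                             novice_rate=0.08, expert_rate=0.02)

    # After: equipment halves each per-skier rate, but the slope attracts
    # twice as many skiers, disproportionately novices.
    after = total_accidents(2_000, novice_share=0.5,
                            novice_rate=0.04, expert_rate=0.01)

    print(before, after)  # 32.0 vs 50.0: per-skier risk fell, total rose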
Passive or involuntary risks are those over which the individual has no control; a person tends to fear and reject risks of which he is a passive recipient, as when an air controller makes a mistake or a dam breaks. These passive risks are generally pursued for someone else's private profit. Perrow notes that where control rests with organizational leaders, the organizations must be policed. For example, even though airlines are interested in safety, since it is to their economic advantage (unsafe travel would cause usage to decline), the Federal Aviation Administration is still required to police them. One cannot count on any market to automatically incur the cost of more safety, Perrow argues, so the government must step in and police the market. As a result, the government is involved in more and more areas of our lives. In some cases Congress recognizes this danger, as with the NRC (Nuclear Regulatory Commission), which is trying to control nuclear power.

One unfortunate implication of risk assessment is that the public should be excluded from discussions that affect it. Few assessors say this outright, but most imply it; most take the middle road of bringing the public in but controlling it. The idea is to bring the public over to the side of the experts through education, since by definition the expert should know more than the public. Some experts believe that even when given the facts, the public is deficient in proper reasoning powers. Cognitive psychologists believe that humans in general do not reason well; even experts can make mistakes in probabilities and in the interpretation of evidence. The psychologists say that the public does not reason well because it minimizes some dangers, maximizes others, and does not calculate the odds as a statistician would recommend. For example, why would the public smoke cigarettes while voting against nuclear power? Perrow agrees that rationality in humans is limited, or "bounded" as it is called, but suggests that when we are confronted with disorderly data and discordant goals, this limitation may be our greatest strength.
There are three forms of cognitive thinking, or rationality. The first is absolute rationality, used mostly by economists and engineers. The second is bounded or limited rationality, which some risk assessors use. The third, called social and cultural rationality, is the one most people live by. Absolute rationality uses calculation to decide which activity one should choose, such as nuclear power over coal-fired power plants. The calculation shows that nuclear power is close to risk-free, because the probability of a meltdown is very small, while coal-fired power kills an estimated 10,000 people per year. In terms of absolute rationality the choice is obvious. So why are 20 to 40 percent of the public worried about nuclear power? The public may be irrational or hypercritical about nuclear power; if the public is irrational, the social harm is extensive, including protests, demonstrations, and a Congress that refuses to follow the experts. Cognitive psychologists, who study the process of thinking, or cognition, have conducted experiments showing that people are considerably less than absolute in their rationality. The limits on people's ability to consistently make fully rational decisions may be due in part to neurological limitations, to limits on memory and attention, to lack of education, and to lack of training in probability and statistics. They also seem to be due to hunches and rough estimates, which cognitive psychologists call "heuristics," from the word for discovery. The "availability heuristic," for example, suggests that rather than examining all existing cases and basing judgment on the whole of experience, people judge a situation by the most available case, the one most easily remembered: if there has recently been an airline crash, people focus on the crash and ignore the successful flights. Heuristics are useful, timesaving devices even if they sometimes get us into trouble. First, they prevent agonizing over every possible contingency. Second, they cut down on the "cost of search," the time and effort needed to examine all the possible choices and rank them. Third, they facilitate social life by giving others a good estimate of what we are likely to do, since we appear to share these heuristics. Heuristics appear to work because our world is loosely coupled, which allows for approximation rather than complete accuracy.
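The absolute-rationality comparison is, at bottom, an expected-value calculation. In the sketch below (Python), the 10,000 coal deaths per year come from the text; the meltdown probability, catastrophe size, and fleet size are hypothetical placeholders, since the summary gives none.

    # 'Absolute rationality': choose the option with the lower expected
    # annual death toll. Coal's 10,000/year is the estimate cited in the
    # text; all nuclear figures are hypothetical placeholders.

    coal_deaths_per_year = 10_000

    p_meltdown_per_plant_year = 1e-4   # hypothetical
    deaths_per_meltdown = 50_000       # hypothetical
    n_plants = 100                     # hypothetical

    nuclear_expected = n_plants * p_meltdown_per_plant_year * deaths_per_meltdown
    print(nuclear_expected, coal_deaths_per_year)  # 500.0 vs 10000

    # Under this arithmetic nuclear looks twenty times safer than coal.
    # Perrow's point is that the public weighs catastrophic potential and
    # dread, not just the expected value, and that this is not irrational.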

The third view of rationality, social and cultural rationality (social rationality for short), recognizes the cognitive limits on rational choice but holds that such limits do little to account for poor choices; in fact, they can be beneficial. There are at least two reasons a person might be thankful for limited cognitive abilities: social bonding and diversity. People vary in their thinking abilities for different tasks; one person may be good at math while another is good with models or three-dimensional space. Individual limitations bring about social bonding, because individuals will need each other whenever the specific tasks appear. This is a strong basis for social life.
The second reason a person might be thankful for limited cognitive ability is diversity: each individual sees the same problem from a different perspective. The individual who is good with numbers will look for a mathematical solution, while another person will look at potential consequences, not observed ones. If the individuals then share their solutions with each other, the group benefits from the different skills.

This view of social rationality is supported by a public opinion poll taken by Decision Research and members of a Clark University group. The poll compared the views of experts and the public on various technologies, such as nuclear power. The researchers wanted to know why the two differ so greatly; for example, the public ranked nuclear power as the most risky technology, whereas the experts ranked it low. They found that the public judged most risks, like nuclear power, on the possibility of a disaster rather than on historical data, while the experts judged most risks on historical data. That would explain why the public worries more about nuclear power than about car safety, regardless of the number of people killed on the highways. When the experts and the public were asked to judge risks on "dread" (lack of control over an activity, fatal consequences in a mishap, and high catastrophic potential), both the experts and the public chose nuclear power as the most risky.

The third objection to Perrow's recommendations (abandoning some high-risk systems and making drastic and costly modifications to others) holds that high-risk systems can be run safely by rigid, error-free organizations, like the naval nuclear submarine system. Perrow's objection to relying on rigid organization is that the failures of current high-risk systems are not, in general, organizational problems: well-run plants still have system accidents, and neither the airways nor aircraft are plagued with organizational problems. (The marine industry, however, was analyzed as an "error-inducing system"; there it was suggested that the authoritarian decision-making system aboard ship should be reorganized and the operators given more control, because faulty information and pressures for production can lead to problems.) The organizations at risk are the complexly interactive, tightly coupled ones in cell 2 of the interaction/coupling chart (fig. 9.1). Perrow states that complex but loosely coupled systems (cell 4, such as universities) are best decentralized. Linear and tightly coupled systems (cell 1, such as pharmaceutical plants) are best centralized. Linear and loosely coupled systems (cell 3, most manufacturing) can be either. But complex and tightly coupled systems (cell 2, including nuclear power) can be neither: the requirements for handling failures in these systems are contradictory, as the sketch below lays out.

Systems with interactive complexity (cells 2 and 4) will produce unexpected interactions among multiple failures, but these need not damage the subsystem or the system as a whole. Accidents will be avoided if the system is also loosely coupled (cell 4, universities), because loose coupling gives time, resources, and alternative paths to cope with the disturbance and limit its impact. But to make use of these advantages, those at the point of disturbance must be free to interpret the situation and take corrective action. Since disturbances are generally experienced first by the operators, including first-line supervisors and other on-duty personnel such as technicians and maintenance staff, this means the system should be decentralized. These personnel have two tasks: analyzing the situation and acting to prevent the propagation of errors. Unexpected and incomprehensible interactions will not allow immediate analysis of the cause of the accident, but given the slack in loosely coupled systems this is not a problem: even though the system's state is unexpected and mysterious, the operators can ask what might happen next and take action to prevent interaction with other subsystems. Organizational theorists generally find that complex systems are best decentralized and tightly coupled systems are best centralized.
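Perrow's cell-by-cell recommendations amount to a small decision table. The sketch below (Python) simply encodes the four cells as the text describes them; the function and its names are illustrative restatements, not anything from the book.

    # Perrow's interaction/coupling chart (fig. 9.1) encoded as a lookup
    # table. Cell numbers and examples follow the text; the code is only
    # an illustrative restatement.

    RECOMMENDATION = {
        # (interactions, coupling): (cell, best structure, example)
        ("linear",  "tight"): (1, "centralize", "pharmaceutical plants"),
        ("complex", "tight"): (2, "contradictory (needs both)", "nuclear power"),
        ("linear",  "loose"): (3, "either works", "most manufacturing"),
        ("complex", "loose"): (4, "decentralize", "universities"),
    }

    def best_structure(interactions, coupling):
        cell, structure, example = RECOMMENDATION[(interactions, coupling)]
        return f"cell {cell} ({example}): {structure}"

    print(best_structure("complex", "tight"))
    # -> cell 2 (nuclear power): contradictory (needs both)
    # Complexity demands decentralization (operators must interpret surprises)
    # while tight coupling demands centralization (fast, coordinated response).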

Perrow concludes that catastrophes send us warning signals, but that we have misread these signals too often, reinterpreting them to fit our preconceptions. Better training alone will not solve the problem, nor will more gadgets or promises that it won't happen again. Worse yet, we may accept the preconception that military superiority and private profits are worth the risks.