The Challenger Disaster: A Case of Subjective Engineering

From the archives: NASA’s resistance to probabilistic risk analysis contributed to the Challenger disaster.

Illustration: Barry Ross

Editor’s Note: Today is the 30th anniversary of the loss of the space shuttle Challenger, which was destroyed 73 seconds into its flight, killing everyone on board. To mark the anniversary, IEEE Spectrum is republishing this seminal article, which first appeared in June 1989 as part of a special report on risk. The article has been widely cited both in histories of the space program and in analyses of engineering risk management.

“Statistics don’t count for anything,” declared Will Willoughby, the National Aeronautics and Space Administration’s former head of reliability and safety during the Apollo moon landing program. “They have no place in engineering anywhere.” Now director of reliability management and quality assurance for the U.S. Navy, Washington, D.C., he still holds that risk is minimized not by statistical test programs, but by “attention taken in design, where it belongs.” His design-oriented view prevailed in NASA in the 1970s, when the space shuttle was designed and built by many of the engineers who had worked on the Apollo program.

“The real value of probabilistic risk analysis is in understanding the system and its vulnerabilities,” said Benjamin Buchbinder, manager of NASA’s two-year-old risk management program. He maintains that probabilistic risk analysis can go beyond design-oriented qualitative techniques in looking at the interactions of subsystems, ascertaining the effects of human activity and environmental conditions, and detecting common-cause failures.

NASA started experimenting with this program in response to the Jan. 28, 1986, Challenger accident that killed seven astronauts. The program’s goals are to establish a policy on risk management and to conduct risk assessments independent of normal engineering analyses. But progress is slow because of past official policy that favored “engineering judgment” over “probability numbers,” a policy that resulted in NASA’s failure to collect the type of statistical test and flight data useful for quantitative risk assessment.

This Catch-22 (the agency lacks appropriate statistical data because it did not believe in the technique requiring the data, and so never gathered it) is one example of how an organization’s underlying culture and explicit policy can affect the overall reliability of the projects it undertakes.

External forces such as politics further shape an organization’s response. Whereas the Apollo program was widely supported by the President and the U.S. Congress and had all the money it needed, the shuttle program was strongly criticized and underbudgeted from the beginning. Political pressures, coupled with the lack of hard numerical data, led to differences of more than three orders of magnitude in the few quantitative estimates of a shuttle launch failure that NASA was required by law to conduct.

Some observers still worry that, despite NASA’s late adoption of quantitative risk assessment, its internal culture and its fear of political opposition may be pushing it to repeat dangerous errors of the shuttle program in the new space station program.

Basic Facts

System: National Space Transportation System (NSTS)—the space shuttle

Risk assessments conducted during design and operation: preliminary hazards analysis; failure modes and effects analysis with critical items list; various safety assessments, all qualitative at the system level, but with quantitative analyses conducted for specific subsystems.

Worst failure: In the January 1986 Challenger accident, primary and secondary O-rings in the field joint of the right solid-fuel rocket booster were burnt through by hot gases.

Consequences: loss of $3 billion vehicle and crew.

Predictability: long history of erosion in O-rings, not envisaged in the original design.

Causes: inadequate original design (booster joint rotated farther open than intended); faulty judgment (managers decided to launch despite record low temperatures and ice on launch pad); possible unanticipated external events (severe wind shear may have been a contributing factor).

Lessons learned: in design, to use probabilistic risk assessment more in evaluating and assigning priorities to risks; in operation, to establish certain launch commit criteria that cannot be waived by anyone.

Other outcomes: redesign of booster joint and other shuttle subsystems that also had a high level of risk or unanticipated failures; reassessment of critical items.

NASA’s preference for a design approach to reliability to the exclusion of quantitative risk analysis was strengthened by a negative early brush with the field. According to Haggai Cohen, who during the Apollo days was NASA’s deputy chief engineer, NASA contracted with General Electric Co. in Daytona Beach, Fla., to do a “full numerical PRA [probabilistic risk assessment]” to assess the likelihood of success in landing a man on the moon and returning him safely to earth. The GE study indicated the chance of success was “less than 5 percent.” When the NASA Administrator was presented with the results, he felt that if made public, “the numbers could do irreparable harm, and he disbanded the effort,” Cohen said. “We studiously stayed away from [numerical risk assessment] as a result.”

“That’s when we threw all that garbage out and got down to work,” Willoughby agreed. The study’s proponents, he said, contended “‘you build up confidence by statistical test programs.’ We said, ‘No, go fly a kite, we’ll build up confidence by design.’ Testing gives you only a snapshot under particular conditions. Reality may not give you the same set of circumstances, and you can be lulled into a false sense of security or insecurity.”

As a result, NASA adopted qualitative failure modes and effects analysis (FMEA) as its principal means of identifying design features whose worst-case failure could lead to a catastrophe. The worst cases were ranked as Criticality 1 if they threatened the life of the crew members or the existence of the vehicle; Criticality 2 if they threatened the mission; and Criticality 3 for anything less. An R designated a redundant system [see “How NASA Determined Shuttle Risk”]. Quantitative techniques were limited to calculating the probability of the occurrence of an individual failure mode “if we had to present a rationale on how to live with a single failure point,” Cohen explained.
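
To make the ranking scheme concrete, the short sketch below restates it as a small function. This is purely illustrative; the function and argument names are invented and do not reflect NASA’s actual FMEA procedure or tooling.

```python
# Illustrative restatement of the Criticality 1/2/3 ranking described above.
# Names are invented; this is not NASA's actual FMEA process.

def criticality(threatens_crew_or_vehicle: bool,
                threatens_mission: bool,
                redundancy_present: bool) -> str:
    """Rank a failure mode by its worst-case consequence."""
    if threatens_crew_or_vehicle:
        rank = "1"   # loss of crew or vehicle
    elif threatens_mission:
        rank = "2"   # loss of mission
    else:
        rank = "3"   # anything less severe
    return rank + ("R" if redundancy_present else "")

# A redundant item whose worst-case failure would end the mission:
print(criticality(False, True, redundancy_present=True))  # -> "2R"
```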

The politics of risk

By the late 1960s and early 1970s the space shuttle was being portrayed as a reusable airliner capable of carrying 15-ton payloads into orbit and 5-ton payloads back to earth. Shuttle astronauts would wear shirtsleeves during takeoff and landing instead of the bulky spacesuits of the Gemini and Apollo days. And eventually the shuttle would carry just plain folks: non-astronaut scientists, politicians, schoolteachers, and journalists.

NASA documents show that the airline vision also applied to risk. For example, in the 1969 NASA Space Shuttle Task Group Report, the authors wrote: “It is desirable that the vehicle configuration provide for crew/passenger safety in a manner and to the degree as provided in present day commercial jet aircraft.”

Statistically an airliner is the least risky form of transportation, which implies high reliability. And in the early 1970s, when President Richard M. Nixon, Congress, and the Office of Management and Budget (OMB) were all skeptical of the shuttle, proving high reliability was crucial to the program’s continued funding.

OMB even directed NASA to hire an outside contractor to do an economic analysis of how the shuttle compared with other launch systems for cost-effectiveness, observed John M. Logsdon, director of the graduate program in science, technology, and public policy at George Washington University in Washington, D.C. “No previous space programme had been subject to independent professional economic evaluation,” Logsdon wrote in the journal Space Policy in May 1986. “It forced NASA into a belief that it had to propose a Shuttle that could launch all foreseeable payloads ... [and] would be less expensive than alternative launch systems” and that, indeed, would supplant all expendable rockets. It also was politically necessary to show that the shuttle would be cheap and routine, rather than large and risky, with respect to both technology and cost, Logsdon pointed out.

Amid such political unpopularity, which threatened the program’s very existence, “some NASA people began to confuse desire with reality,” said Adelbert Tischler, retired NASA director of launch vehicles and propulsion. “One result was to assess risk in terms of what was thought acceptable without regard for verifying the assessment.” He added: “Note that under such circumstances real risk management is shut out.”

‘Disregarding data’

By the early 1980s many figures were being quoted for the overall risk to the shuttle, with estimates of a catastrophic failure ranging from less than 1 chance in 100 to 1 chance in 100 000. “The higher figures [1 in 100] come from working engineers, and the very low figures [1 in 100 000] from management,” wrote physicist Richard P. Feynman in his appendix “Personal Observations on Reliability of Shuttle” to the 1986 Report of the Presidential Commission on the Space Shuttle Challenger Accident .

The probabilities originated in a series of quantitative risk assessments NASA was required to conduct by the Interagency Nuclear Safety Review Panel (INSRP), in anticipation of the launch of the Galileo spacecraft on its voyage to Jupiter, originally scheduled for the early 1980s. Galileo was powered by a plutonium-fueled radioisotope thermoelectric generator, and Presidential Directive/NSC-25 ruled that either the U.S. President or the director of the Office of Science and Technology Policy must examine the safety of any launch of nuclear material before approving it. The INSRP (which consisted of representatives of NASA as the launching agency, the Department of Energy, which manages nuclear devices, and the Department of Defense, whose Air Force manages range safety at launch) was charged with ascertaining the quantitative risks of a catastrophic launch dispersing the radioactive poison into the atmosphere. There were a number of studies because the upper stage for boosting Galileo into interplanetary space was reconfigured several times.

The first study was conducted by the J. H. Wiggins Co. of Redondo Beach, Calif., and published in three volumes between 1979 and 1982. It put the overall risk of losing a shuttle with its spacecraft payload during launch at between 1 chance in 1000 and 1 in 10 000. The greatest risk was posed by the solid-fuel rocket boosters (SRBs). The Wiggins author noted that the history of other solid-fuel rockets showed them undergoing catastrophic launches somewhere between 1 time in 59 and 1 time in 34, but that the study’s contract overseers, the Space Shuttle Range Safety Ad Hoc Committee, made an “engineering judgment” and “decided that a reduction in the failure probability estimate was warranted for the Space Shuttle SRBs” because “the historical data includes motors developed 10 to 20 years ago.” The Ad Hoc Committee therefore “decided to assume a failure probability of 1 × 10⁻³ for each SRB.” In addition, the Wiggins author pointed out, “it was decided by the Ad-Hoc Committee that a second probability should be considered… which is one order of magnitude less,” or 1 in 10 000, “justified due to unique improvements made in the design and manufacturing process used for these motors to achieve man rating.”
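
As a purely arithmetical aside (not part of the Wiggins study, whose own aggregation method is not described here), a per-booster failure probability can be related to a per-launch probability if the two boosters are assumed to fail independently:

```python
# How a per-booster failure probability scales to a per-launch figure,
# assuming the two SRBs fail independently (an assumption for illustration,
# not a finding of the studies discussed above).
def launch_loss_prob(p_per_booster: float, boosters: int = 2) -> float:
    return 1 - (1 - p_per_booster) ** boosters

print(launch_loss_prob(1e-3))     # ~0.002, i.e. about 1 in 500
print(launch_loss_prob(1 / 34))   # ~0.058 using the worst historical rate cited above
```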

In 1983 a second study was conducted by Teledyne Energy Systems Inc., Timonium, Md., for the Air Force Weapons Laboratory at Kirtland Air Force Base, N.M. It described the Wiggins analysis as consisting of “an interesting presentation of launch data from several Navy, Air Force, and NASA missile programs and the disregarding of that data and arbitrary assignment of risk levels apparently per sponsor direction” with “no quantitative justification at all.” After reanalyzing the data, the Teledyne authors concluded that the boosters’ track record “suggest[s] a failure rate of around one-in-a-hundred.”

When risk analysis isn’t

NASA conducted its own internal safety analysis for Galileo, which was published in 1985 by the Johnson Space Center. The Johnson authors went through failure mode worksheets assigning probability levels. A fracture in the solid-rocket motor case or case joints—similar to the accident that destroyed Challenger—was assigned a probability level of 2, which a separate table defined as corresponding to a chance of 1 in 100 000 and described as “remote,” or “so unlikely, it can be assumed that this hazard will not be experienced.”

The Johnson authors’ value of 1 in 100 000 implied, as Feynman spelled out, that “one could put a Shuttle up each day for 300 years expecting to lose only one.” Yet even after the Challenger accident, NASA’s chief engineer Milton Silveira, in a hearing on the Galileo thermoelectric generator held March 4, 1986, before the U.S. House of Representatives Committee on Science and Technology, said: “We think that using a number like 10 to the minus 3, as suggested, is probably a little pessimistic.” In his view, the actual risk “would be 10 to the minus 5, and that is our design objective.” When asked how the number was deduced, Silveira replied, “We came to those probabilities based on engineering judgment in review of the design rather than taking a statistical data base, because we didn’t feel we had that.”
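
Feynman’s 300-year remark is straightforward arithmetic; the back-of-the-envelope check below (an illustration, not from the article) shows what a 1-in-100 000 per-flight figure implies.

```python
# Back-of-the-envelope check of the 1-in-100 000 claim (illustrative only).
p = 1e-5                 # management's claimed per-flight loss probability
flights = 300 * 365      # one launch per day for 300 years
print(flights)           # 109500 flights
print(flights * p)       # ~1.1 expected losses over the whole period
```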

After the Challenger accident, the 1986 presidential commission learned the O-rings in the field joints of the shuttle’s solid-fuel rocket boosters had a history of damage correlated with low air temperature at launch. So the commission repeatedly asked the witnesses it called to hearings why systematic temperature-correlation data had been unavailable before launch.

NASA’s “management methodology” for collection of data and determination of risk was laid out in NASA’s 1985 safety analysis for Galileo. The Johnson Space Center authors explained: “Early in the program it was decided not to use reliability (or probability) numbers in the design of the Shuttle” because the magnitude of testing required to statistically verify the numerical predictions “is not considered practical.” Furthermore, they noted, “experience has shown that with the safety, reliability, and quality assurance requirements imposed on manned spaceflight contractors, standard failure rate data are pessimistic.”

“In lieu of using probability numbers, the NSTS [National Space Transportation System] relies on engineering judgment using rigid and well-documented design, configuration, safety, reliability, and quality assurance controls,” the Johnson authors continued. This outlook determined the data NASA managers required engineers to collect. For example, no “lapsed-time indicators” were kept on shuttle components, subsystems, and systems, although “a fairly accurate estimate of time and/or cycles could be derived,” the Johnson authors added.

One reason was economic. According to George Rodney, NASA’s associate administrator of safety, reliability, maintainability and quality assurance, it is not hard to get time and cycle data, “but it’s expensive and a big bookkeeping problem.”

Another reason was NASA’s “normal program development: you don’t continue to take data; you certify the components and get on with it,” said Rodney’s deputy, James Ehl. “People think that since we’ve flown 28 times, then we have 28 times as much data, but we don’t. We have maybe three or four tests from the first development flights.”

In addition, Rodney noted, “For everyone in NASA that’s a big PRA [probabilistic risk assessment] seller, I can find you 10 that are equally convinced that PRA is oversold… [They] are so dubious of its importance that they won’t convince themselves that the end product is worthwhile.”

Risk and the organizational culture

One reason NASA has so strongly resisted probabilistic risk analysis may be the fact that “PRA runs against all traditions of engineering, where you handle reliability by safety factors,” said Elisabeth Paté-Cornell, associate professor in the department of industrial engineering and engineering management at Stanford University in California, who is now studying organizational factors and risk assessment in NASA. In addition, with NASA’s strong pride in design, PRA may be “perceived as an insult to their capabilities, that the system they’ve designed is not 100 percent perfect and absolutely safe,” she added. Thus, the character of an organization influences the reliability and failure of the systems it builds because its structure, policy, and culture determine the priorities, incentives, and communication paths for the engineers and managers doing the work, she said.

“Part of the problem is getting the engineers to understand that they are using subjective methods for determining risk, because they don’t like to admit that,” said Ray A. Williamson, senior associate at the U.S. Congress Office of Technology Assessment in Washington, D.C. “Yet they talk in terms of sounding objective and fool themselves into thinking they are being objective.”

“It’s not that simple,” Buchbinder said. “A probabilistic way of thinking is not something that most people are attuned to. We don’t know what will happen precisely each time. We can only say what is likely to happen a certain percentage of the time.” Unless engineers and managers become familiar with probability theory, they don’t know what to make of “large uncertainties that represent the state of current knowledge,” he said. “And that is no comfort to the poor decision-maker who wants a simple answer to the question, ‘Is this system safe enough?’”

As an example of how the “mindset” in the agency is now changing in favor of “a willingness to explore other things,” Buchbinder cited the new risk management program, the workshops it has been holding to train engineers and others in quantitative risk assessment techniques, and a new management instruction policy that requires NASA to “provide disciplined and documented management of risks throughout program life cycles.”

Hidden risks to the space station

NASA is now at work on its big project for the 1990s: a space station, projected to cost $30 billion and to be assembled in orbit, 220 nautical miles above the earth, from modules carried aloft in some two dozen shuttle launches. A National Research Council committee evaluated the space station program and concluded in a study in September 1987: “If the probability of damaging an Orbiter beyond repair on any single Shuttle flight is 1 percent—the demonstrated rate is now one loss in 25 launches, or 4 percent—the probability of losing an Orbiter before [the space station’s first phase] is complete is about 60 percent.”
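
The committee’s figure comes from compounding a per-flight loss probability over repeated flights. The sketch below shows the standard calculation under an independence assumption; the exact number of flights behind the committee’s 60 percent estimate is not stated in the passage quoted above.

```python
# Probability of losing at least one orbiter in n flights, assuming each
# flight independently carries loss probability p (illustrative only).
def prob_at_least_one_loss(p: float, n: int) -> float:
    return 1 - (1 - p) ** n

print(prob_at_least_one_loss(0.04, 24))   # ~0.62 at the demonstrated 4 percent rate
print(prob_at_least_one_loss(0.01, 24))   # ~0.21 at a 1 percent rate
```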

The probability is within the right order of magnitude, to judge by the latest INSRP-mandated study completed in December for Buchbinder’s group in NASA by Planning Research Corp., McLean, Va. The study, which reevaluates the risk of the long-delayed launch of the Galileo spacecraft on its voyage to Jupiter, now scheduled for later this year, estimates the chance of losing a shuttle from launch through payload deployment at 1 in 78, or between 1 and 2 percent, with an uncertainty of a factor of 2.

Those figures frighten some observers because of the dire con­sequences of losing part of the space station. “The space station has no redundancy — no backup parts,” said Jerry Grey, director of science and technology policy for the American Institute of Aeronautics and Astronautics in Washington, D.C.

The worst case would be loss of the shuttle carrying the logistics module, which is needed for reboost, Grey pointed out. The space station’s orbit will subject it to atmospheric drag such that, if not periodically boosted higher, it will drift downward and within eight months plunge back to earth and be destroyed, as was the Skylab space station in July 1979. “If you lost the shuttle with the logistics module, you don’t have a spare, and you can’t build one in eight months,” Grey said, “so you may lose not only that one payload, but also whatever was put up there earlier.”

Why are there no backup parts? “Politically the space station is under fire [from the U.S. Congress] all the time because NASA hasn’t done an adequate job of justifying it,” said Grey. “NASA is apprehensive that Congress might cancel the entire program”—and so is trying to trim costs as much as possible.

Grey estimated that spares of the crucial modules might add another 10 percent to the space station’s cost. “But NASA is not willing to go to bat for that extra because they’re unwilling to take the political risk,” he said—a replay, he fears, of NASA’s response to the political negativism over the shuttle in the 1970s.

The NRC space station committee warned: “It is dangerous and misleading to assume there will be no losses and thus fail to plan for such events.”

“Let’s face it, space is a risky business,” commented former Apollo safety officer Cohen. “I always considered every launch a barely controlled explosion.”

“The real problem is: whatever the numbers are, acceptance of that risk and planning for it is what needs to be done,” Grey said. He fears that “NASA doesn’t do that yet.”

In addition to the sources named in the text, the authors would like to acknowledge the information and insights afforded by the following: E. William Colglazier (director of the Energy, Environment and Resources Center at the University of Tennessee, Knoxville) and Robert K. Weatherwax (president of Sierra Energy & Risk Assessment Inc., Roseville, Calif.), the two authors of the 1983 Teledyne/Air Force Weapons Laboratory study; Larry Crawford, director of reliability and trends analysis at NASA headquarters in Washington, D.C.; Joseph R. Fragola, vice president, Science Applications International Corp., New York City; Byron Peter Leonard, president, L Systems Inc., El Segundo, Calif.; George E. Mueller, former NASA associate administrator for manned spaceflight; and Marcia Smith, specialist in aerospace policy, Congressional Research Service, Washington, D.C.

This article first appeared in print in June 1989 as part of the special report “Managing Risk In Large Complex Systems” under the title “The space shuttle: a case of subjective engineering.” 

How NASA Determined Shuttle Risk

At the start of the space shuttle’s design, the National Aeronautics and Space Administration defined risk as “the chance (qualitative) of loss of personnel capability, loss of system, or damage to or loss of equipment or property.” NASA accordingly relied on several techniques for determining reliability and potential design problems, concluded the U.S. National Research Council’s Committee on Shuttle Criticality Review and Hazard Analysis Audit in its January 1988 report Post-Challenger Evaluation of Space Shuttle Risk Assessment and Management. But, the report noted, the analyses did “not address the relative probabilities of a particular hazardous condition arising from failure modes, human errors, or external situations,” so did not measure risk.

A failure modes and effects analysis (FMEA) was the heart of NASA’s effort to ensure reliability, the NRC report noted. An FMEA, carried out by the contractor building each shuttle element or subsystem, was performed on all flight hardware and on ground support equipment that interfaced with flight hardware. Its chief purpose was to identify hardware critical to the performance and safety of the mission.

Items that did not meet certain design, reliability, and safety requirements specified by NASA’s top management, and whose failure could threaten the loss of crew, vehicle, or mission, made up a critical items list (CIL).

Although the FMEA/CIL was first viewed as a design tool, NASA now uses it during operations and management as well, to analyze problems, assess whether corrective actions are effective, identify where and when inspection and maintenance are needed, and reveal trends in failures.

Second, NASA conducted hazards analyses, performed jointly by shuttle engineers and by NASA’s safety and operations organizations. They made use of the FMEA/CIL, various design reviews, safety analyses, and other studies. They considered not only the failure modes identified in the FMEA, but also other threats posed by the mission activities, crew-machine interfaces, and the environment. After hazards and their causes were identified, NASA engineers and managers had to make one of three decisions: to eliminate the cause of each hazard, to control the cause if it could not be eliminated, or to accept the hazards that could not be controlled.
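
The disposition logic is a three-way decision; the small sketch below simply restates it, with invented names (it is not NASA’s actual procedure).

```python
# Illustrative restatement of the hazard disposition choices described above.
def disposition(cause_can_be_eliminated: bool, cause_can_be_controlled: bool) -> str:
    if cause_can_be_eliminated:
        return "eliminate the cause"
    if cause_can_be_controlled:
        return "control the cause"
    return "accept the hazard"   # residual risk is formally accepted

print(disposition(False, True))  # -> "control the cause"
```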

NASA also conducted an element interface functional analysis (EIFA) to look at the shuttle more nearly as a complete system. Both the FMEA and the hazards analyses concentrated only on individual elements of the shuttle: the space shuttle’s main engines in the orbiter, the rest of the orbiter, the external tank, and the solid-fuel rocket boosters. The EIFA assessed hazards at the mating of the elements.

Also to examine the shuttle as a system, NASA conducted a one-time critical functions assessment in 1978, which searched for multiple and cascading failures. The information from all these studies fed one way into an overall mission safety assessment.

The NRC committee had several criticisms. In practice, the FMEA was the sole basis for some engineering change decisions and all engineering waivers and rationales for retaining certain high-risk design features. However, the NRC report noted, hazard analyses for some important, high-risk subsystems “were not updated for years at a time even though design changes had occurred or dangerous failures were experienced.” On one procedural flow chart, the report noted, “the ‘Hazard Analysis As Required’ is a dead-end box with inputs but no output with respect to waiver approval decisions.”

The NRC committee concluded that “the isolation of the hazard analysis within NASA’s risk assessment and management process to date can be seen as reflecting the past weakness of the entire safety organization” —T.E.B. and K.E.

Engineering Ethics Case Study: The Challenger Disaster

This online engineering PDH course provides instruction in engineering ethics through a case study of the Space Shuttle Challenger disaster. The minimum technical details needed to understand the physical cause of the Shuttle failure are presented. The disaster itself is chronicled through NASA photographs. Next the decision-making process, especially the discussions occurring during the teleconference held on the evening before the launch, is described. Direct quotations from engineers interviewed after the disaster are used to illustrate the ambiguities of the data and the pressures that the decision-makers faced in the months and hours preceding the launch. The course culminates in an extended treatment of six ethical issues raised by Challenger.

This 3 PDH online course is intended for all engineers who are interested in gaining a better understanding of the ethical issues that led to the Challenger disaster and how they could have been avoided.

This PE continuing education course is intended to provide you with the following specific knowledge and skills:

  • Common errors to avoid in studying the history of an engineering failure: the retrospective fallacy and the myth of perfect engineering practice
  • Shuttle hardware involved in the disaster
  • Decisions made in the period preceding the launch
  • Ethical issue: NASA giving first priority to public safety over other concerns
  • Ethical issue: the contractor giving first priority to public safety over other concerns
  • Ethical issue: whistle blowing
  • Ethical issue: informed consent
  • Ethical issue: ownership of company records
  • Ethical issue: how the public perceives that an engineering decision involves an ethical violation

In this professional engineering CEU course, you need to review the course document titled, "Engineering Ethics Case Study: The Challenger Disaster".

Upon successful completion of the quiz, print your Certificate of Completion instantly. (Note: if you are paying by check or money order, you will be able to print it after we receive your payment.) For your convenience, we will also email it to you. Please note that you can log in to your account at any time to access and print your Certificate of Completion.

Course Outline

This three-hour online course begins by presenting the minimum technical details needed to understand the physical cause of the Shuttle failure.  The failure itself is illustrated through NASA photographs.  Next the decision-making process—especially the discussions occurring during the teleconference held on the evening before the launch—is described.  Direct quotations from engineers interviewed after the disaster are used to illustrate the ambiguities of the data and the pressures that the decision-makers faced in the period preceding the launch.  The course ends by presenting ethical issues raised by Challenger. 

This course includes a multiple-choice quiz at the end, which is designed to enhance the understanding of the course materials.

Learning Objective

This course teaches the following specific knowledge and skills:

  • Common errors to avoid in studying the history of an engineering failure: the retrospective fallacy and the myth of perfect engineering practice;
  • Shuttle hardware involved in the disaster;
  • Decisions made in the period preceding the launch;
  • Ethical issue: NASA giving first priority to public safety over other concerns;
  • Ethical issue: the contractor giving first priority to public safety over other concerns;
  • Ethical issue: whistle blowing;
  • Ethical issue: informed consent;
  • Ethical issue: ownership of company records; and
  • Ethical issue: how the public perceives that an engineering decision involves an ethical violation.

Intended Audience

This course is intended for all types of engineers.

Benefit to Attendees

An attendee of this course will be familiar with various types of ethical issues that can arise in a large-scale, high-profile project, and with the difficulty of distinguishing unethical behavior from technical misjudgment.  

Course Introduction

On January 28, 1986, the NASA Space Shuttle Challenger burst into a ball of flame 73 seconds after take-off, leading to the death of all seven people on board.  Some months later, a commission appointed by the President to investigate the disaster declared that the cause was the failure of a seal in one of the solid rocket boosters.  The observations and accusations of the Commission and other critics at the time taken together became the standard interpretation of the cause of the Challenger disaster.  This interpretation routinely appears in popular articles and books about engineering, management, and ethical issues when Challenger is cited as a case study.  But the interpretation ignores much of the history of how NASA and the contractor’s engineers had actually recognized and dealt with the seal problems in advance of the disaster. 

Course Content

The course materials are based on the article, “The Challenger Disaster and Engineering Ethics” by Mark P. Rossow:

Engineering Ethics Case Study: The Challenger

Please click on the above underlined hypertext to view, download or print the document for your study. Because of the large file size, we recommend that you first save the file to your computer by right clicking the mouse and choosing "Save Target As ...", and then open the file in Adobe Acrobat Reader. If you still experience any difficulty in downloading or opening this file, you may need to close some applications or reboot your computer to free up some memory.

Course Summary

When the details of what Challenger engineers actually thought and did are closely examined, the conventional explanation of the disaster is not supported.  An alternative explanation is that Space shuttle technology was new, complex, and hazardous.  Decision-makers thought that they understood the risks and that the risks were acceptable.  The decision-makers were mistaken.  But they were not villainous.

Once you finish studying the above course content, you need to take a quiz to obtain the PDH credits.

The Boeing 737 MAX: Lessons for Engineering Ethics

  • Original Research/Scholarship
  • Published: 10 July 2020
  • Volume 26, pages 2957–2974 (2020)

  • Joseph Herkert
  • Jason Borenstein
  • Keith Miller

The crash of two 737 MAX passenger aircraft in late 2018 and early 2019, and subsequent grounding of the entire fleet of 737 MAX jets, turned a global spotlight on Boeing’s practices and culture. Explanations for the crashes include: design flaws within the MAX’s new flight control software system designed to prevent stalls; internal pressure to keep pace with Boeing’s chief competitor, Airbus; Boeing’s lack of transparency about the new software; and the lack of adequate monitoring of Boeing by the FAA, especially during the certification of the MAX and following the first crash. While these and other factors have been the subject of numerous government reports and investigative journalism articles, little to date has been written on the ethical significance of the accidents, in particular the ethical responsibilities of the engineers at Boeing and the FAA involved in designing and certifying the MAX. Lessons learned from this case include the need to strengthen the voice of engineers within large organizations. There is also the need for greater involvement of professional engineering societies in ethics-related activities and for broader focus on moral courage in engineering ethics education.

Introduction

In October 2018 and March 2019, Boeing 737 MAX passenger jets crashed minutes after takeoff; these two accidents claimed nearly 350 lives. After the second incident, all 737 MAX planes were grounded worldwide. The 737 MAX was an updated version of the 737 workhorse that first began flying in the 1960s. The crashes were precipitated by a failure of an Angle of Attack (AOA) sensor and the subsequent activation of new flight control software, the Maneuvering Characteristics Augmentation System (MCAS). The MCAS software was intended to compensate for changes in the size and placement of the engines on the MAX as compared to prior versions of the 737. The existence of the software, designed to prevent a stall due to the reconfiguration of the engines, was not disclosed to pilots until after the first crash. Even after that tragic incident, pilots were not required to undergo simulation training on the 737 MAX.

In this paper, we examine several aspects of the case, including technical and other factors that led up to the crashes, especially Boeing’s design choices and organizational tensions internal to the company, and between Boeing and the U.S. Federal Aviation Administration (FAA). While the case is ongoing and, at this writing, the 737 MAX has yet to be recertified for flight, our analysis is based on numerous government reports and detailed news accounts currently available. We conclude with a discussion of specific lessons for engineers and engineering educators regarding engineering ethics.

Overview of 737 MAX History and Crashes

In December 2010, Boeing’s primary competitor Airbus announced the A320neo family of jetliners, an update of their successful A320 narrow-body aircraft. The A320neo featured larger, more fuel-efficient engines. Boeing had been planning to introduce a totally new aircraft to replace its successful, but dated, 737 line of jets; yet to remain competitive with Airbus, Boeing instead announced in August 2011 the 737 MAX family, an update of the 737NG with similar engine upgrades to the A320neo and other improvements (Gelles et al. 2019 ). The 737 MAX, which entered service in May 2017, became Boeing’s fastest-selling airliner of all time with 5000 orders from over 100 airlines worldwide (Boeing n.d. a) (See Fig.  1 for timeline of 737 MAX key events).

Figure 1: 737 MAX timeline showing key events from 2010 to 2019.

The 737 MAX had been in operation for over a year when on October 29, 2018, Lion Air flight JT610 crashed into the Java Sea 13 minutes after takeoff from Jakarta, Indonesia; all 189 passengers and crew on board died. Monitoring from the flight data recorder recovered from the wreckage indicated that MCAS, the software specifically designed for the MAX, forced the nose of the aircraft down 26 times in 10 minutes (Gates 2018 ). In October 2019, the Final Report of Indonesia’s Lion Air Accident Investigation was issued. The Report placed some of the blame on the pilots and maintenance crews but concluded that Boeing and the FAA were primarily responsible for the crash (Republic of Indonesia 2019 ).

MCAS was not identified in the original documentation/training for 737 MAX pilots (Glanz et al. 2019 ). But after the Lion Air crash, Boeing ( 2018 ) issued a Flight Crew Operations Manual Bulletin on November 6, 2018 containing procedures for responding to flight control problems due to possible erroneous AOA inputs. The next day the FAA ( 2018a ) issued an Emergency Airworthiness Directive on the same subject; however, the FAA did not ground the 737 MAX at that time. According to published reports, these notices were the first time that airline pilots learned of the existence of MCAS (e.g., Bushey 2019 ).

On March 10, 2019, about four months after the Lion Air crash, Ethiopian Airlines Flight ET302 crashed 6 minutes after takeoff in a field 39 miles from Addis Ababa Airport. The accident caused the deaths of all 157 passengers and crew. The Preliminary Report of the Ethiopian Airlines Accident Investigation (Federal Democratic Republic of Ethiopia 2019), issued in April 2019, indicated that the pilots followed the checklist from the Boeing Flight Crew Operations Manual Bulletin posted after the Lion Air crash but could not control the plane (Ahmed et al. 2019). This was followed by an Interim Report (Federal Democratic Republic of Ethiopia 2020) issued in March 2020 that exonerated the pilots and airline, and placed blame for the accident on design flaws in the MAX (Marks and Dahir 2020). Following the second crash, the 737 MAX was grounded worldwide, with the U.S., through the FAA, being the last country to act on March 13, 2019 (Kaplan et al. 2019).

Design Choices that Led to the Crashes

As noted above, with its belief that it must keep up with its main competitor, Airbus, Boeing elected to modify the latest generation of the 737 family, the 737NG, rather than design an entirely new aircraft. Yet this raised a significant engineering challenge for Boeing. Mounting larger, more fuel-efficient engines, similar to those employed on the A320neo, on the existing 737 airframe posed a serious design problem, because the 737 family was built closer to the ground than the Airbus A320. In order to provide appropriate ground clearance, the larger engines had to be mounted higher and farther forward on the wings than previous models of the 737 (see Fig.  2 ). This significantly changed the aerodynamics of the aircraft and created the possibility of a nose-up stall under certain flight conditions (Travis 2019 ; Glanz et al. 2019 ).

Figure 2: Boeing 737 MAX (left) compared to Boeing 737NG (right), showing the larger 737 MAX engines mounted higher and farther forward on the wing. (Image source: https://www.norebbo.com)

Boeing’s attempt to solve this problem involved incorporating MCAS as a software fix for the potential stall condition. The 737 was designed with two AOA sensors, one on each side of the aircraft. Yet Boeing decided that the 737 MAX would only use input from one of the plane’s two AOA sensors. If the single AOA sensor was triggered, MCAS would detect a dangerous nose-up condition and send a signal to the horizontal stabilizer located in the tail. Movement of the stabilizer would then force the plane’s tail up and the nose down (Travis 2019 ). In both the Lion Air and Ethiopian Air crashes, the AOA sensor malfunctioned, repeatedly activating MCAS (Gates 2018 ; Ahmed et al. 2019 ). Since the two crashes, Boeing has made adjustments to the MCAS, including that the system will rely on input from the two AOA sensors instead of just one. But still more problems with MCAS have been uncovered. For example, an indicator light that would alert pilots if the jet’s two AOA sensors disagreed, thought by Boeing to be standard on all MAX aircraft, would only operate as part of an optional equipment package that neither airline involved in the crashes purchased (Gelles and Kitroeff 2019a ).
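
The single-sensor design can be sketched in a few lines to show why one faulty sensor was enough to drive repeated nose-down commands. The sketch below only illustrates the architecture described above: the threshold, trim value, and names are invented, and this is in no way Boeing’s actual flight control code.

```python
# Illustrative sketch of the single-sensor MCAS activation logic described
# above. Thresholds, trim values, and names are invented; this is not
# Boeing's implementation.
AOA_TRIGGER_DEG = 15.0        # hypothetical angle-of-attack trigger

def mcas_trim_original(aoa_left_deg: float, aoa_right_deg: float) -> float:
    """Nose-down stabilizer trim command, driven by a single AOA sensor."""
    aoa = aoa_left_deg                               # only one sensor is consulted
    return 2.5 if aoa > AOA_TRIGGER_DEG else 0.0     # hypothetical trim increment

# A failed left sensor reading 40 degrees keeps re-triggering the command
# on every cycle, regardless of what the healthy right sensor reports:
for _ in range(3):
    print(mcas_trim_original(aoa_left_deg=40.0, aoa_right_deg=5.0))  # 2.5 each time
```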

Similar to its responses to previous accidents, Boeing has been reluctant to admit to a design flaw in its aircraft, instead blaming pilot error (Hall and Goelz 2019 ). In the 737 MAX case, the company pointed to the pilots’ alleged inability to control the planes under stall conditions (Economy 2019 ). Following the Ethiopian Airlines crash, Boeing acknowledged for the first time that MCAS played a primary role in the crashes, while continuing to highlight that other factors, such as pilot error, were also involved (Hall and Goelz 2019 ). For example, on April 29, 2019, more than a month after the second crash, then Boeing CEO Dennis Muilenburg defended MCAS by stating:

We've confirmed that [the MCAS system] was designed per our standards, certified per our standards, and we're confident in that process. So, it operated according to those design and certification standards. So, we haven't seen a technical slip or gap in terms of the fundamental design and certification of the approach. (Economy 2019 )

The view that MCAS was not primarily at fault was supported within an article written by noted journalist and pilot William Langewiesche ( 2019 ). While not denying Boeing made serious mistakes, he placed ultimate blame on the use of inexperienced pilots by the two airlines involved in the crashes. Langewiesche suggested that the accidents resulted from the cost-cutting practices of the airlines and the lax regulatory environments in which they operated. He argued that more experienced pilots, despite their lack of information on MCAS, should have been able to take corrective action to control the planes using customary stall prevention procedures. Langewiesche ( 2019 ) concludes in his article that:

What we had in the two downed airplanes was a textbook failure of airmanship. In broad daylight, these pilots couldn’t decipher a variant of a simple runaway trim, and they ended up flying too fast at low altitude, neglecting to throttle back and leading their passengers over an aerodynamic edge into oblivion. They were the deciding factor here — not the MCAS, not the Max.

Others have taken a more critical view of MCAS, Boeing, and the FAA. These critics prominently include Captain Chesley “Sully” Sullenberger, who famously crash-landed an A320 in the Hudson River after bird strikes had knocked out both of the plane’s engines. Sullenberger responded directly to Langewiesche in a letter to the Editor:

… Langewiesche draws the conclusion that the pilots are primarily to blame for the fatal crashes of Lion Air 610 and Ethiopian 302. In resurrecting this age-old aviation canard, Langewiesche minimizes the fatal design flaws and certification failures that precipitated those tragedies, and still pose a threat to the flying public. I have long stated, as he does note, that pilots must be capable of absolute mastery of the aircraft and the situation at all times, a concept pilots call airmanship. Inadequate pilot training and insufficient pilot experience are problems worldwide, but they do not excuse the fatally flawed design of the Maneuvering Characteristics Augmentation System (MCAS) that was a death trap.... (Sullenberger 2019 )

Noting that he is one of the few pilots to have encountered both accident sequences in a 737 MAX simulator, Sullenberger continued:

These emergencies did not present as a classic runaway stabilizer problem, but initially as ambiguous unreliable airspeed and altitude situations, masking MCAS. The MCAS design should never have been approved, not by Boeing, and not by the Federal Aviation Administration (FAA)…. (Sullenberger 2019 )

In June 2019, Sullenberger noted in Congressional Testimony that “These crashes are demonstrable evidence that our current system of aircraft design and certification has failed us. These accidents should never have happened” (Benning and DiFurio 2019 ).

Others have agreed with Sullenberger’s assessment. Software developer and pilot Gregory Travis ( 2019 ) argues that Boeing’s design for the 737 MAX violated industry norms and that the company unwisely used software to compensate for inadequacies in the hardware design. Travis also contends that the existence of MCAS was not disclosed to pilots in order to preserve the fiction that the 737 MAX was just an update of earlier 737 models, which served as a way to circumvent the more stringent FAA certification requirements for a new airplane. Reports from government agencies seem to support this assessment, emphasizing the chaotic cockpit conditions created by MCAS and poor certification practices. The U.S. National Transportation Safety Board (NTSB) ( 2019 ) Safety Recommendations to the FAA in September 2019 indicated that Boeing underestimated the effect MCAS malfunction would have on the cockpit environment (Kitroeff 2019 , a , b ). The FAA Joint Authorities Technical Review ( 2019 ), which included international participation, issued its Final Report in October 2019. The Report faulted Boeing and FAA in MCAS certification (Koenig 2019 ).

Despite Boeing’s attempts to downplay the role of MCAS, it began to work on a fix for the system shortly after the Lion Air crash (Gates 2019). MCAS operation will now be based on inputs from both AOA sensors, instead of just one sensor, with a cockpit indicator light when the sensors disagree. In addition, MCAS will only be activated once for an AOA warning rather than multiple times. What follows is that the system would only seek to prevent a stall once per AOA warning. Also, MCAS’s power will be limited in terms of how much it can move the stabilizer, and manual override by the pilot will always be possible (Bellamy 2019; Boeing n.d. b; Gates 2019). For over a year after the Lion Air crash, Boeing held that pilot simulator training would not be required for the redesigned MCAS system. In January 2020, Boeing relented and recommended that pilot simulator training be required when the 737 MAX returns to service (Pasztor et al. 2020).
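
Contrasted with the earlier sketch, the revised behavior described above (cross-checking both sensors, firing at most once per event, and capping authority) might look like the following. Again, every value and name is invented for illustration; this is not Boeing’s implementation.

```python
# Illustrative sketch of the revised MCAS behavior described above: both
# AOA sensors must agree, the system fires at most once per event, and its
# trim authority is capped. Values and names are invented.
AOA_TRIGGER_DEG = 15.0
DISAGREEMENT_LIMIT_DEG = 5.0    # hypothetical sensor cross-check limit
MAX_TRIM_DEG = 1.0              # hypothetical cap on trim authority

def mcas_trim_revised(aoa_left_deg: float, aoa_right_deg: float,
                      already_fired: bool) -> float:
    """Nose-down trim only if both sensors agree and MCAS has not yet fired."""
    if abs(aoa_left_deg - aoa_right_deg) > DISAGREEMENT_LIMIT_DEG:
        return 0.0               # sensors disagree: inhibit MCAS (and alert the crew)
    if already_fired:
        return 0.0               # no repeated activations for the same event
    if min(aoa_left_deg, aoa_right_deg) > AOA_TRIGGER_DEG:
        return MAX_TRIM_DEG
    return 0.0

print(mcas_trim_revised(40.0, 5.0, already_fired=False))   # 0.0: disagreement inhibits it
```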

Boeing and the FAA

There is mounting evidence that Boeing, and the FAA as well, had warnings about the inadequacy of MCAS’s design, and about the lack of communication to pilots about its existence and functioning. In 2015, for example, an unnamed Boeing engineer raised in an email the issue of relying on a single AOA sensor (Bellamy 2019 ). In 2016, Mark Forkner, Boeing’s Chief Technical Pilot, in an email to a colleague flagged the erratic behavior of MCAS in a flight simulator noting: “It’s running rampant” (Gelles and Kitroeff 2019c ). Forkner subsequently came under federal investigation regarding whether he misled the FAA regarding MCAS (Kitroeff and Schmidt 2020 ).

In December 2018, following the Lion Air crash, the FAA (2018b) conducted a quantitative risk assessment estimating that fifteen more 737 MAX crashes would occur over the fleet’s expected 45-year life if the flight control issues were not addressed; this assessment was not publicly disclosed until Congressional hearings a year later, in December 2019 (Arnold 2019). After the two crashes, a senior Boeing engineer, Curtis Ewbank, filed an internal ethics complaint in 2019 alleging that management had squelched a proposed system that might have uncovered errors in the AOA sensors. Ewbank has since stated publicly that “I was willing to stand up for safety and quality… Boeing management was more concerned with cost and schedule than safety or quality” (Kitroeff et al. 2019b).
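
The kind of fleet-life figure cited in that risk assessment can be illustrated with a simple expected-value calculation. The sketch below shows only the generic structure of such an estimate; the fleet size, utilization, and per-flight probability are hypothetical placeholders, not the figures the FAA actually used.

```python
# Illustrative sketch of a fleet-life expected-event estimate, the generic
# structure behind a figure such as "fifteen crashes over a 45-year fleet life."
# Every input below is a hypothetical placeholder, not the FAA's actual data.

fleet_size = 4800           # hypothetical number of 737 MAX aircraft in service
flights_per_year = 1500     # hypothetical flights per aircraft per year
fleet_life_years = 45       # fleet-life horizon, as in the FAA assessment
p_event_per_flight = 5e-8   # hypothetical probability of a fatal event per flight

total_flights = fleet_size * flights_per_year * fleet_life_years
expected_events = total_flights * p_event_per_flight

print(f"Flights over the fleet life: {total_flights:,}")
print(f"Expected crashes if the flaw is left unaddressed: {expected_events:.1f}")
```

Even with a very small per-flight probability, the enormous number of flights accumulated over a fleet’s life makes the expected number of accidents non-negligible, which is what made the estimate, and its non-disclosure, so consequential.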

Boeing’s apparent reluctance to heed such warnings may be attributed, in part, to the gradual transformation of the company’s engineering and safety culture into a finance-oriented one, beginning with Boeing’s merger with McDonnell Douglas in 1997 (Tkacik 2019; Useem 2019). Critical changes after the merger included replacing many in Boeing’s top management, historically engineers, with business executives from McDonnell Douglas, and moving the corporate headquarters to Chicago while leaving the engineering staff in Seattle (Useem 2019). According to Tkacik (2019), the new management even went so far as “maligning and marginalizing engineers as a class”.

Financial drivers thus began to place inordinate strain on Boeing employees, including engineers. During the development of the 737 MAX, significant production pressure to keep pace with the Airbus A320neo was ever-present. For example, Boeing management allegedly rejected any design changes that would prolong certification or require additional pilot training for the MAX (Gelles et al. 2019). As Adam Dickson, a former Boeing engineer, explained in a television documentary (BBC Panorama 2019): “There was a lot of interest and pressure on the certification and analysis engineers in particular, to look at any changes to the Max as minor changes”.

Production pressures were exacerbated by the “cozy relationship” between Boeing and the FAA (Kitroeff et al. 2019a; see also Gelles and Kaplan 2019; Hall and Goelz 2019). Beginning in 2005, the FAA increased its reliance on manufacturers to certify their own planes. Self-certification became standard practice throughout the U.S. airline industry. By 2018, Boeing was certifying 96% of its own work (Kitroeff et al. 2019a).

The serious drawbacks of self-certification became acutely apparent in this case. Of particular concern, the safety analysis for MCAS that the FAA delegated to Boeing was flawed in at least three respects: (1) it underestimated the power of MCAS to move the plane’s horizontal tail, and thus how difficult it would be for pilots to maintain control of the aircraft; (2) it did not account for the system deploying multiple times; and (3) it underestimated the risk level if MCAS failed, thus permitting a design feature, the single AOA sensor input to MCAS, that lacked built-in redundancy (Gates 2019). Compounding these concerns, the ability of MCAS to move the horizontal tail was later increased without properly updating the safety analysis or notifying the FAA of the change (Gates 2019). In addition, the FAA did not require pilot training for MCAS or simulator training for the 737 MAX (Gelles and Kaplan 2019). Since the MAX grounding, the FAA has become more independent in its assessments and certifications; for example, the agency will no longer delegate to Boeing personnel the approval of individual newly built 737 MAX planes (Josephs 2019).

The role of the FAA has also been subject to political scrutiny. The report of a study of the FAA certification process commissioned by Secretary of Transportation Elaine Chao (DOT 2020), released January 16, 2020, concluded that the FAA certification process was “appropriate and effective,” and that certifying the MAX as a new airplane would not have made a difference to the plane’s safety. At the same time, the report recommended a number of measures to strengthen the process and augment the FAA’s staff (Pasztor and Cameron 2020). In contrast, a report of preliminary investigative findings by the Democratic staff of the House Committee on Transportation and Infrastructure (House TI 2020), issued in March 2020, characterized the FAA’s certification of the MAX as “grossly insufficient” and criticized Boeing’s design flaws and lack of transparency with the FAA, airlines, and pilots (Duncan and Laris 2020).

Boeing has incurred significant economic losses from the crashes and subsequent grounding of the MAX. In December 2019, Boeing CEO Dennis Muilenburg was fired and the corporation announced that 737 MAX production would be suspended in January 2020 (Rich 2019 ) (see Fig.  1 ). Boeing is facing numerous lawsuits and possible criminal investigations. Boeing estimates that its economic losses for the 737 MAX will exceed $18 billion (Gelles 2020 ). In addition to the need to fix MCAS, other issues have arisen in recertification of the aircraft, including wiring for controls of the tail stabilizer, possible weaknesses in the engine rotors, and vulnerabilities in lightning protection for the engines (Kitroeff and Gelles 2020 ). The FAA had planned to flight test the 737 MAX early in 2020, and it was supposed to return to service in summer 2020 (Gelles and Kitroeff 2020 ). Given the global impact of the COVID-19 pandemic and other factors, it is difficult to predict when MAX flights might resume. In addition, uncertainty of passenger demand has resulted in some airlines delaying or cancelling orders for the MAX (Bogaisky 2020 ). Even after obtaining flight approval, public resistance to flying in the 737 MAX will probably be considerable (Gelles 2019 ).

Lessons for Engineering Ethics

The 737 MAX case is still unfolding and will continue to do so for some time. Yet important lessons can already be learned (or relearned) from the case. Some of those lessons are straightforward, and others are more subtle. A key and clear lesson is that engineers may need reminders about prioritizing the public good, and more specifically, the public’s safety. A more subtle lesson pertains to the ways in which the problem of many hands may or may not apply here. Other lessons involve the need for corporations, engineering societies, and engineering educators to rise to the challenge of nurturing and supporting ethical behavior on the part of engineers, especially in light of the difficulties revealed in this case.

All contemporary codes of ethics promulgated by major engineering societies state that an engineer’s paramount responsibility is to protect the “safety, health, and welfare” of the public. The American Institute of Aeronautics and Astronautics Code of Ethics indicates that engineers must “[H]old paramount the safety, health, and welfare of the public in the performance of their duties” (AIAA 2013 ). The Institute of Electrical and Electronics Engineers (IEEE) Code of Ethics goes further, pledging its members: “…to hold paramount the safety, health, and welfare of the public, to strive to comply with ethical design and sustainable development practices, and to disclose promptly factors that might endanger the public or the environment” (IEEE 2017 ). The IEEE Computer Society (CS) cooperated with the Association for Computing Machinery (ACM) in developing a Software Engineering Code of Ethics ( 1997 ) which holds that software engineers shall: “Approve software only if they have a well-founded belief that it is safe, meets specifications, passes appropriate tests, and does not diminish quality of life, diminish privacy or harm the environment….” According to Gotterbarn and Miller ( 2009 ), the latter code is a useful guide when examining cases involving software design and underscores the fact that during design, as in all engineering practice, the well-being of the public should be the overriding concern. While engineering codes of ethics are plentiful in number, they differ in their source of moral authority (i.e., organizational codes vs. professional codes), are often unenforceable through the law, and formally apply to different groups of engineers (e.g., based on discipline or organizational membership). However, the codes are generally recognized as a statement of the values inherent to engineering and its ethical commitments (Davis 2015 ).

An engineer’s ethical responsibility does not preclude consideration of factors such as cost and schedule (Pinkus et al. 1997 ). Engineers always have to grapple with constraints, including time and resource limitations. The engineers working at Boeing did have legitimate concerns about their company losing contracts to its competitor Airbus. But being an engineer means that public safety and welfare must be the highest priority (Davis 1991 ). The aforementioned software and other design errors in the development of the 737 MAX, which resulted in hundreds of deaths, would thus seem to be clear violations of engineering codes of ethics. In addition to pointing to engineering codes, Peterson ( 2019 ) argues that Boeing engineers and managers violated widely accepted ethical norms such as informed consent and the precautionary principle.

From an engineering perspective, the central ethical issue in the MAX case arguably centers on the decision to use software (i.e., MCAS) to “mask” a questionable hardware design: the repositioning of the engines that disrupted the aerodynamics of the airframe (Travis 2019). As Johnston and Harris (2019) argue: “To meet the design goals and avoid an expensive hardware change, Boeing created the MCAS as a software Band-Aid.” Although software fixes of this kind are common, they place a safety burden on the software that it may not be able to bear, as illustrated by the Therac-25 radiation therapy machine. In the Therac-25 case, hardware safety interlocks employed in earlier models of the machine were replaced by software safety controls. In addition, information about how the software might malfunction was missing from the machine’s user manual, so when certain types of errors appeared on its interface, operators did not know how to respond. Software flaws, among other factors, contributed to six patients receiving massive radiation overdoses, resulting in deaths and serious injuries (Leveson and Turner 1993). A more recent case involves problems with the embedded software guiding the electronic throttle in Toyota vehicles. In 2013, “…a jury found Toyota responsible for two unintended acceleration deaths, with expert witnesses citing bugs in the software and throttle fail safe defects” (Cummings and Britton 2020).

Boeing’s use of MCAS to mask the significant change in hardware configuration of the MAX was compounded by not providing redundancy for components prone to failure (i.e., the AOA sensors) (Campbell 2019 ), and by failing to notify pilots about the new software. In such cases, it is especially crucial that pilots receive clear documentation and relevant training so that they know how to manage the hand-off with an automated system properly (Johnston and Harris 2019 ). Part of the necessity for such training is related to trust calibration (Borenstein et al. 2020 ; Borenstein et al. 2018 ), a factor that has contributed to previous airplane accidents (e.g., Carr 2014 ). For example, if pilots do not place enough trust in an automated system, they may add risk by intervening in system operation. Conversely, if pilots trust an automated system too much, they may lack sufficient time to act once they identify a problem. This is further complicated in the MAX case because pilots were not fully aware, if at all, of MCAS’s existence and how the system functioned.

In addition to engineering decision-making that failed to prioritize public safety, questionable management decisions were made at both Boeing and the FAA. As noted earlier, Boeing’s managerial leadership ignored numerous warning signs that the 737 MAX was not safe. Also, the FAA’s shift toward greater reliance on self-regulation by Boeing was ill-advised; that lesson appears to have been learned at the expense of hundreds of lives (Duncan and Aratani 2019).

The Problem of Many Hands Revisited

Actions, or inaction, by large, complex organizations (in this case, corporate and government entities) suggest that the “problem of many hands” may be relevant to the 737 MAX case. At a high level of abstraction, the problem of many hands involves the idea that accountability is difficult to assign in the face of collective action, especially in a computerized society (Thompson 1980; Nissenbaum 1994). According to Nissenbaum (1996, 29), “Where a mishap is the work of ‘many hands,’ it may not be obvious who is to blame because frequently its most salient and immediate causal antecedents do not converge with its locus of decision-making. The conditions for blame, therefore, are not satisfied in a way normally satisfied when a single individual is held blameworthy for a harm”.

However, there is an alternative understanding of the problem of many hands. In this version of the problem, the lack of accountability is not merely because multiple people and multiple decisions figure into a final outcome. Instead, in order to “qualify” as the problem of many hands, the component decisions should be benign, or at least far less harmful, if examined in isolation; only when the individual decisions are collectively combined do we see the most harmful result. In this understanding, the individual decision-makers should not have the same moral culpability as they would if they made all the decisions by themselves (Noorman 2020 ).

Both of these understandings of the problem of many hands could shed light on the 737 MAX case. Yet we focus on the first version of the problem. We admit the possibility that some of the isolated decisions about the 737 MAX may have been made in part because of ignorance of a broader picture. While we do not stake a claim on whether this is what actually happened in the MAX case, we acknowledge that it may be true in some circumstances. However, we think the more important point is that some of the 737 MAX decisions were so clearly misguided that a competent engineer should have seen the implications, even if the engineer was not aware of all of the broader context. The problem then is to identify responsibility for the questionable decisions in a way that discourages bad judgments in the future, a task made more challenging by the complexities of the decision-making. Legal proceedings about this case are likely to explore those complexities in detail and are outside the scope of this article. But such complexities must be examined carefully so as not to act as an insulator to accountability.

When many individuals are involved in the design of a computing device, for example, and a serious failure occurs, each person might try to absolve themselves of responsibility by indicating that “too many people” and “too many decisions” were involved for any individual person to know that the problem was going to happen. This is a common, and often dubious, excuse in the attempt to abdicate responsibility for a harm. While it can have different levels of magnitude and severity, the problem of many hands often arises in large scale ethical failures in engineering such as in the Deepwater Horizon oil spill (Thompson 2014 ).

Possible examples in the 737 MAX case of the difficulty of assigning moral responsibility due to the problem of many hands include:

1. The decision to reposition the engines;

2. The decision to mask the jet’s subsequent dynamic instability with MCAS;

3. The decision to rely on only one AOA sensor in designing MCAS; and

4. The decision not to inform or properly train pilots about the MCAS system.

While overall responsibility for each of these decisions may be difficult to allocate precisely, at least points 1–3 above arguably reflect fundamental errors in engineering judgment (Travis 2019). Boeing engineers and FAA engineers either participated in or were aware of these decisions (Kitroeff and Gelles 2019) and may have had opportunities to reconsider or redirect them. As Davis (2012) has noted, responsible engineering professionals make it their business to address problems even when they did not cause the problem, or, we would argue, did not solely cause it. As noted earlier, reports indicate that at least one Boeing engineer expressed reservations about the design of MCAS (Bellamy 2019). Since the two crashes, Boeing engineer Curtis Ewbank has filed an internal ethics complaint (Kitroeff et al. 2019b), and several current and former Boeing engineers and other employees have gone public with various concerns about the 737 MAX (Pasztor 2019). And yet, as is often the case, the flawed design went forward with tragic results.

Enabling Ethical Engineers

The MAX case is eerily reminiscent of other well-known engineering ethics case studies such as the Ford Pinto (Birsch and Fielder 1994 ), Space Shuttle Challenger (Werhane 1991 ), and GM ignition switch (Jennings and Trautman 2016 ). In the Pinto case, Ford engineers were aware of the unsafe placement of the fuel tank well before the car was released to the public and signed off on the design even though crash tests showed the tank was vulnerable to rupture during low-speed rear-end collisions (Baura 2006 ). In the case of the GM ignition switch, engineers knew for at least four years about the faulty design, a flaw that resulted in at least a dozen fatal accidents (Stephan 2016 ). In the case of the well-documented Challenger accident, engineer Roger Boisjoly warned his supervisors at Morton Thiokol of potentially catastrophic flaws in the shuttle’s solid rocket boosters a full six months before the accident. He, along with other engineers, unsuccessfully argued on the eve of launch for a delay due to the effect that freezing temperatures could have on the boosters’ O-ring seals. Boisjoly was also one of a handful of engineers to describe these warnings to the Presidential commission investigating the accident (Boisjoly et al. 1989 ).

Returning to the 737 MAX case, could Ewbank or others with concerns about the safety of the airplane have done more than filing ethics complaints or offering public testimony only after the Lion Air and Ethiopian Airlines crashes? One might argue that requiring professional registration by all engineers in the U.S. would result in more ethical conduct (for example, by giving state licensing boards greater oversight authority). Yet the well-entrenched “industry exemption” from registration for most engineers working in large corporations has undermined such calls (Kline 2001 ).

Engineers with safety concerns could be empowered if Boeing and other corporations strengthened their internal ethics processes, including sincere and meaningful responsiveness to anonymous complaint channels. Schwartz (2013) outlines three core components of an ethical corporate culture: strong core ethical values, a formal ethics program (including an ethics hotline), and capable ethical leadership. Schwartz points to Siemens’ creation of an ethics and compliance department following a bribery scandal as an example of a good solution. Boeing has had a compliance department for quite some time (Schnebel and Bienert 2004) and has made efforts in the past to evaluate its effectiveness (Boeing 2003). Yet it is clear that more robust measures are needed in response to ethics concerns and complaints. Since the MAX crashes, Boeing’s board has implemented a number of changes, including establishing a corporate safety group and revising internal reporting procedures so that lead engineers primarily report to the chief engineer rather than to business managers (Gelles and Kitroeff 2019b; Boeing n.d. c). Whether these measures will be enough to restore Boeing’s former engineering-centered focus remains to be seen.

Professional engineering societies could play a stronger role in communicating and enforcing codes of ethics, in supporting the ethical behavior of engineers, and in providing more educational opportunities for learning about ethics and the ethical responsibilities of engineers. Some societies, including ACM and IEEE, have become increasingly engaged in ethics-related activities. Initially, the societies’ ethics engagement focused primarily on macroethical issues such as sustainable development (Herkert 2004). Recently, however, the societies have also turned greater attention to microethical issues (the behavior of individuals). The 2017 revision to the IEEE Code of Ethics, for example, highlights the importance of “ethical design” (Adamson and Herkert 2020). This parallels IEEE activities in the design of autonomous and intelligent systems (e.g., IEEE 2018). A promising outcome of this emphasis is a move toward implementing “ethical design” frameworks (Peters et al. 2020).

In terms of engineering education, educators need to place greater emphasis on fostering moral courage, that is, the courage to act on one’s moral convictions, including adherence to codes of ethics. This is of particular significance in large organizations such as Boeing and the FAA, where the agency of engineers may be limited by factors such as organizational culture (Watts and Buckley 2017). In a study of twenty-six ethics interventions in engineering programs, Hess and Fore (2018) found that only twenty-seven percent had a learning goal of developing “ethical courage, confidence or commitment”. This goal could be operationalized in a number of ways, for example through a focus on virtue ethics (Harris 2008) or professional identity (Hashemian and Loui 2010). The need should be addressed not only within the engineering curriculum but also in lifelong learning initiatives and other professional development opportunities (Miller 2019).

The circumstances surrounding the 737 MAX could certainly serve as an informative case study for ethics or technical courses. The case can shed light on important lessons for engineers, including the complex interactions, and sometimes tensions, between engineering and managerial considerations. It also tangibly demonstrates that what seem to be relatively small-scale, and likely well-intended, decisions by individual engineers can combine to produce large-scale tragedy. No individual person wanted to do harm, but it happened nonetheless. Thus, the case can serve as a reminder to current and future generations of engineers that public safety must be the first and foremost priority. A particularly useful pedagogical method for considering this case is to assign students to the roles of engineers, managers, and regulators, as well as the flying public, airline personnel, and representatives of engineering societies (Herkert 1997). In addition to illuminating the perspectives and responsibilities of each stakeholder group, role-playing can shed light on the “macroethical” issues raised by the case (Martin et al. 2019), such as airline safety standards and the proper role of engineers and engineering societies in regulating the industry.

Conclusions and Recommendations

The case of the Boeing 737 MAX provides valuable lessons for engineers and engineering educators concerning the ethical responsibilities of the profession. Safety is not cheap, but careless engineering design in the name of minimizing costs and adhering to a delivery schedule is a symptom of ethical blight. Using almost any standard ethical analysis or framework, Boeing’s actions regarding the safety of the 737 MAX, particularly decisions regarding MCAS, fall short.

Boeing failed in its obligations to protect the public. At a minimum, the company had an obligation to inform airlines and pilots of significant design changes, especially the role of MCAS in compensating for repositioning of engines in the MAX from prior versions of the 737. Clearly, it was a “significant” change because it had a direct, and unfortunately tragic, impact on the public’s safety. The Boeing and FAA interaction underscores the fact that conflicts of interest are a serious concern in regulatory actions within the airline industry.

Internal and external organizational factors may have interfered with Boeing and FAA engineers’ fulfillment of their professional ethical responsibilities; this is an all too common problem that merits serious attention from industry leaders, regulators, professional societies, and educators. The lessons to be learned in this case are not new. After large scale tragedies involving engineering decision-making, calls for change often emerge. But such lessons apparently must be retaught and relearned by each generation of engineers.

ACM/IEEE-CS Joint Task Force. (1997). Software Engineering Code of Ethics and Professional Practice, https://ethics.acm.org/code-of-ethics/software-engineering-code/ .

Adamson, G., & Herkert, J. (2020). Addressing intelligent systems and ethical design in the IEEE Code of Ethics. In Codes of ethics and ethical guidelines: Emerging technologies, changing fields . New York: Springer ( in press ).

Ahmed, H., Glanz, J., & Beech, H. (2019). Ethiopian airlines pilots followed Boeing’s safety procedures before crash, Report Shows. The New York Times, April 4, https://www.nytimes.com/2019/04/04/world/asia/ethiopia-crash-boeing.html .

AIAA. (2013). Code of Ethics, https://www.aiaa.org/about/Governance/Code-of-Ethics .

Arnold, K. (2019). FAA report predicted there could be 15 more 737 MAX crashes. The Dallas Morning News, December 11, https://www.dallasnews.com/business/airlines/2019/12/11/faa-chief-says-boeings-737-max-wont-be-approved-in-2019/

Baura, G. (2006). Engineering ethics: an industrial perspective . Amsterdam: Elsevier.


BBC News. (2019). Work on production line of Boeing 737 MAX ‘Not Adequately Funded’. July 29, https://www.bbc.com/news/business-49142761 .

Bellamy, W. (2019). Boeing CEO outlines 737 MAX MCAS software fix in congressional hearings. Aviation Today, November 2, https://www.aviationtoday.com/2019/11/02/boeing-ceo-outlines-mcas-updates-congressional-hearings/ .

Benning, T., & DiFurio, D. (2019). American Airlines Pilots Union boss prods lawmakers to solve 'Crisis of Trust' over Boeing 737 MAX. The Dallas Morning News, June 19, https://www.dallasnews.com/business/airlines/2019/06/19/american-airlines-pilots-union-boss-prods-lawmakers-to-solve-crisis-of-trust-over-boeing-737-max/ .

Birsch, D., & Fielder, J. (Eds.). (1994). The ford pinto case: A study in applied ethics, business, and technology . New York: The State University of New York Press.

Boeing. (2003). Boeing Releases Independent Reviews of Company Ethics Program. December 18, https://boeing.mediaroom.com/2003-12-18-Boeing-Releases-Independent-Reviews-of-Company-Ethics-Program .

Boeing. (2018). Flight crew operations manual bulletin for the Boeing company. November 6, https://www.avioesemusicas.com/wp-content/uploads/2018/10/TBC-19-Uncommanded-Nose-Down-Stab-Trim-Due-to-AOA.pdf .

Boeing. (n.d. a). About the Boeing 737 MAX. https://www.boeing.com/commercial/737max/ .

Boeing. (n.d. b). 737 MAX Updates. https://www.boeing.com/737-max-updates/ .

Boeing. (n.d. c). Initial actions: sharpening our focus on safety. https://www.boeing.com/737-max-updates/resources/ .

Bogaisky, J. (2020). Boeing stock plunges as coronavirus imperils quick ramp up in 737 MAX deliveries. Forbes, March 11, https://www.forbes.com/sites/jeremybogaisky/2020/03/11/boeing-coronavirus-737-max/#1b9eb8955b5a .

Boisjoly, R. P., Curtis, E. F., & Mellican, E. (1989). Roger Boisjoly and the challenger disaster: The ethical dimensions. J Bus Ethics, 8 (4), 217–230.


Borenstein, J., Mahajan, H. P., Wagner, A. R., & Howard, A. (2020). Trust and pediatric exoskeletons: A comparative study of clinician and parental perspectives. IEEE Transactions on Technology and Society , 1 (2), 83–88.

Borenstein, J., Wagner, A. R., & Howard, A. (2018). Overtrust of pediatric health-care robots: A preliminary survey of parent perspectives. IEEE Robot Autom Mag, 25 (1), 46–54.

Bushey, C. (2019). The Tough Crowd Boeing Needs to Convince. Crain’s Chicago Business, October 25, https://www.chicagobusiness.com/manufacturing/tough-crowd-boeing-needs-convince .

Campbell, D. (2019). The many human errors that brought down the Boeing 737 MAX. The Verge, May 2, https://www.theverge.com/2019/5/2/18518176/boeing-737-max-crash-problems-human-error-mcas-faa .

Carr, N. (2014). The glass cage: Automation and us . Norton.

Cummings, M. L., & Britton, D. (2020). Regulating safety-critical autonomous systems: past, present, and future perspectives. In Living with robots (pp. 119–140). Academic Press, New York.

Davis, M. (1991). Thinking like an engineer: The place of a code of ethics in the practice of a profession. Philos Publ Affairs, 20 (2), 150–167.

Davis, M. (2012). “Ain’t no one here but us social forces”: Constructing the professional responsibility of engineers. Sci Eng Ethics, 18 (1), 13–34.

Davis, M. (2015). Engineering as profession: Some methodological problems in its study. In Engineering identities, epistemologies and values (pp. 65–79). Springer, New York.

Department of Transportation (DOT). (2020). Official report of the special committee to review the Federal Aviation Administration’s Aircraft Certification Process, January 16. https://www.transportation.gov/sites/dot.gov/files/2020-01/scc-final-report.pdf .

Duncan, I., & Aratani, L. (2019). FAA flexes its authority in final stages of Boeing 737 MAX safety review. The Washington Post, November 27, https://www.washingtonpost.com/transportation/2019/11/27/faa-flexes-its-authority-final-stages-boeing-max-safety-review/ .

Duncan, I., & Laris, M. (2020). House report on 737 Max crashes faults Boeing’s ‘culture of concealment’ and labels FAA ‘grossly insufficient’. The Washington Post, March 6, https://www.washingtonpost.com/local/trafficandcommuting/house-report-on-737-max-crashes-faults-boeings-culture-of-concealment-and-labels-faa-grossly-insufficient/2020/03/06/9e336b9e-5fce-11ea-b014-4fafa866bb81_story.html .

Economy, P. (2019). Boeing CEO Puts Partial Blame on Pilots of Crashed 737 MAX Aircraft for Not 'Completely' Following Procedures. Inc., April 30, https://www.inc.com/peter-economy/boeing-ceo-puts-partial-blame-on-pilots-of-crashed-737-max-aircraft-for-not-completely-following-procedures.html .

Federal Aviation Administration (FAA). (2018a). Airworthiness directives; the Boeing company airplanes. FR Doc No: R1-2018-26365. https://rgl.faa.gov/Regulatory_and_Guidance_Library/rgad.nsf/0/fe8237743be9b8968625835b004fc051/$FILE/2018-23-51_Correction.pdf .

Federal Aviation Administration (FAA). (2018b). Quantitative Risk Assessment. https://www.documentcloud.org/documents/6573544-Risk-Assessment-for-Release-1.html#document/p1 .

Federal Aviation Administration (FAA). (2019). Joint authorities technical review: observations, findings, and recommendations. October 11, https://www.faa.gov/news/media/attachments/Final_JATR_Submittal_to_FAA_Oct_2019.pdf .

Federal Democratic Republic of Ethiopia. (2019). Aircraft accident investigation preliminary report. Report No. AI-01/19, April 4, https://leehamnews.com/wp-content/uploads/2019/04/Preliminary-Report-B737-800MAX-ET-AVJ.pdf .

Federal Democratic Republic of Ethiopia. (2020). Aircraft Accident Investigation Interim Report. Report No. AI-01/19, March 20, https://www.aib.gov.et/wp-content/uploads/2020/documents/accident/ET-302%2520%2520Interim%2520Investigation%2520%2520Report%2520March%25209%25202020.pdf .

Gates, D. (2018). Pilots struggled against Boeing's 737 MAX control system on doomed Lion Air flight. The Seattle Times, November 27, https://www.seattletimes.com/business/boeing-aerospace/black-box-data-reveals-lion-air-pilots-struggle-against-boeings-737-max-flight-control-system/ .

Gates, D. (2019). Flawed analysis, failed oversight: how Boeing, FAA Certified the Suspect 737 MAX Flight Control System. The Seattle Times, March 17, https://www.seattletimes.com/business/boeing-aerospace/failed-certification-faa-missed-safety-issues-in-the-737-max-system-implicated-in-the-lion-air-crash/ .

Gelles, D. (2019). Boeing can’t fly its 737 MAX, but it’s ready to sell its safety. The New York Times, December 24 (updated February 10, 2020), https://www.nytimes.com/2019/12/24/business/boeing-737-max-survey.html .

Gelles, D. (2020). Boeing expects 737 MAX costs will surpass $18 Billion. The New York Times, January 29, https://www.nytimes.com/2020/01/29/business/boeing-737-max-costs.html .

Gelles, D., & Kaplan, T. (2019). F.A.A. Approval of Boeing jet involved in two crashes comes under scrutiny. The New York Times, March 19, https://www.nytimes.com/2019/03/19/business/boeing-elaine-chao.html .

Gelles, D., & Kitroeff, N. (2019a). Boeing Believed a 737 MAX warning light was standard. It wasn’t. New York: The New York Times. https://www.nytimes.com/2019/05/05/business/boeing-737-max-warning-light.html .

Gelles, D., & Kitroeff, N. (2019b). Boeing board to call for safety changes after 737 MAX Crashes. The New York Times, September 15, (updated October 2), https://www.nytimes.com/2019/09/15/business/boeing-safety-737-max.html .

Gelles, D., & Kitroeff, N. (2019c). Boeing pilot complained of ‘Egregious’ issue with 737 MAX in 2016. The New York Times, October 18, https://www.nytimes.com/2019/10/18/business/boeing-flight-simulator-text-message.html .

Gelles, D., & Kitroeff, N. (2020). What needs to happen to get Boeing’s 737 MAX flying again?. The New York Times, February 10, https://www.nytimes.com/2020/02/10/business/boeing-737-max-fly-again.html .

Gelles, D., Kitroeff, N., Nicas, J., & Ruiz, R. R. (2019). Boeing was ‘Go, Go, Go’ to beat airbus with the 737 MAX. The New York Times, March 23, https://www.nytimes.com/2019/03/23/business/boeing-737-max-crash.html .

Glanz, J., Creswell, J., Kaplan, T., & Wichter, Z. (2019). After a Lion Air 737 MAX Crashed in October, Questions About the Plane Arose. The New York Times, February 3, https://www.nytimes.com/2019/02/03/world/asia/lion-air-plane-crash-pilots.html .

Gotterbarn, D., & Miller, K. W. (2009). The public is the priority: Making decisions using the software engineering code of ethics. Computer, 42 (6), 66–73.

Hall, J., & Goelz, P. (2019). The Boeing 737 MAX Crisis Is a Leadership Failure, The New York Times, July 17, https://www.nytimes.com/2019/07/17/opinion/boeing-737-max.html .

Harris, C. E. (2008). The good engineer: Giving virtue its due in engineering ethics. Science and Engineering Ethics, 14 (2), 153–164.

Hashemian, G., & Loui, M. C. (2010). Can instruction in engineering ethics change students’ feelings about professional responsibility? Science and Engineering Ethics, 16 (1), 201–215.

Herkert, J. R. (1997). Collaborative learning in engineering ethics. Science and Engineering Ethics, 3 (4), 447–462.

Herkert, J. R. (2004). Microethics, macroethics, and professional engineering societies. In Emerging technologies and ethical issues in engineering: papers from a workshop (pp. 107–114). National Academies Press, New York.

Hess, J. L., & Fore, G. (2018). A systematic literature review of US engineering ethics interventions. Science and Engineering Ethics, 24 (2), 551–583.

House Committee on Transportation and Infrastructure (House TI). (2020). The Boeing 737 MAX Aircraft: Costs, Consequences, and Lessons from its Design, Development, and Certification-Preliminary Investigative Findings, March. https://transportation.house.gov/imo/media/doc/TI%2520Preliminary%2520Investigative%2520Findings%2520Boeing%2520737%2520MAX%2520March%25202020.pdf .

IEEE. (2017). IEEE Code of Ethics. https://www.ieee.org/about/corporate/governance/p7-8.html .

IEEE. (2018). Ethically Aligned Design: A Vision for Prioritizing Human Well-being with Autonomous and Intelligent Systems (version 2). https://standards.ieee.org/content/dam/ieee-standards/standards/web/documents/other/ead_v2.pdf .

Jennings, M., & Trautman, L. J. (2016). Ethical culture and legal liability: The GM switch crisis and lessons in governance. Boston University Journal of Science and Technology Law, 22 , 187.

Johnston, P., & Harris, R. (2019). The Boeing 737 MAX Saga: Lessons for software organizations. Software Quality Professional, 21 (3), 4–12.

Josephs, L. (2019). FAA tightens grip on Boeing with plan to individually review each new 737 MAX Jetliner. CNBC, November 27, https://www.cnbc.com/2019/11/27/faa-tightens-grip-on-boeing-with-plan-to-individually-inspect-max-jets.html .

Kaplan, T., Austen, I., & Gebrekidan, S. (2019). The New York Times, March 13. https://www.nytimes.com/2019/03/13/business/canada-737-max.html .

Kitroeff, N. (2019). Boeing underestimated cockpit chaos on 737 MAX, N.T.S.B. Says. The New York Times, September 26, https://www.nytimes.com/2019/09/26/business/boeing-737-max-ntsb-mcas.html .

Kitroeff, N., & Gelles, D. (2019). Legislators call on F.A.A. to say why it overruled its experts on 737 MAX. The New York Times, November 7 (updated December 11), https://www.nytimes.com/2019/11/07/business/boeing-737-max-faa.html .

Kitroeff, N., & Gelles, D. (2020). It’s not just software: New safety risks under scrutiny on Boeing’s 737 MAX. The New York Times, January 5, https://www.nytimes.com/2020/01/05/business/boeing-737-max.html .

Kitroeff, N., & Schmidt, M. S. (2020). Federal prosecutors investigating whether Boeing pilot lied to F.A.A. The New York Times, February 21, https://www.nytimes.com/2020/02/21/business/boeing-737-max-investigation.html .

Kitroeff, N., Gelles, D., & Nicas, J. (2019a). The roots of Boeing’s 737 MAX Crisis: A regulator relaxes its oversight. The New York Times, July 27, https://www.nytimes.com/2019/07/27/business/boeing-737-max-faa.html .

Kitroeff, N., Gelles, D., & Nicas, J. (2019b). Boeing 737 MAX safety system was vetoed, Engineer Says. The New York Times, October 2, https://www.nytimes.com/2019/10/02/business/boeing-737-max-crashes.html .

Kline, R. R. (2001). Using history and sociology to teach engineering ethics. IEEE Technology and Society Magazine, 20 (4), 13–20.

Koenig, D. (2019). Boeing, FAA both faulted in certification of the 737 MAX. AP, October 11, https://apnews.com/470abf326cdb4229bdc18c8ad8caa78a .

Langewiesche, W. (2019). What really brought down the Boeing 737 MAX? The New York Times, September 18, https://www.nytimes.com/2019/09/18/magazine/boeing-737-max-crashes.html .

Leveson, N. G., & Turner, C. S. (1993). An investigation of the Therac-25 accidents. Computer, 26 (7), 18–41.

Marks, S., & Dahir, A. L. (2020). Ethiopian report on 737 Max Crash Blames Boeing, March 9, https://www.nytimes.com/2020/03/09/world/africa/ethiopia-crash-boeing.html .

Martin, D. A., Conlon, E., & Bowe, B. (2019). The role of role-play in student awareness of the social dimension of the engineering profession. European Journal of Engineering Education, 44 (6), 882–905.

Miller, G. (2019). Toward lifelong excellence: navigating the engineering-business space. In The Engineering-Business Nexus (pp. 81–101). Springer, Cham.

National Transportation Safety Board (NTSB). (2019). Safety Recommendations Report, September 19, https://www.ntsb.gov/investigations/AccidentReports/Reports/ASR1901.pdf .

Nissenbaum, H. (1994). Computing and accountability. Communications of the ACM , January, https://dl.acm.org/doi/10.1145/175222.175228 .

Nissenbaum, H. (1996). Accountability in a computerized society. Science and Engineering Ethics, 2 (1), 25–42.

Noorman, M. (2020). Computing and moral responsibility. In Zalta, E. N. (Ed.). The Stanford Encyclopedia of Philosophy (Spring), https://plato.stanford.edu/archives/spr2020/entries/computing-responsibility .

Pasztor, A. (2019). More Whistleblower complaints emerge in Boeing 737 MAX Safety Inquiries. The Wall Street Journal, April 27, https://www.wsj.com/articles/more-whistleblower-complaints-emerge-in-boeing-737-max-safety-inquiries-11556418721 .

Pasztor, A., & Cameron, D. (2020). U.S. News: Panel Backs How FAA gave safety approval for 737 MAX. The Wall Street Journal, January 17, https://www.wsj.com/articles/panel-clears-737-maxs-safety-approval-process-at-faa-11579188086 .

Pasztor, A., Cameron.D., & Sider, A. (2020). Boeing backs MAX simulator training in reversal of stance. The Wall Street Journal, January 7, https://www.wsj.com/articles/boeing-recommends-fresh-max-simulator-training-11578423221 .

Peters, D., Vold, K., Robinson, D., & Calvo, R. A. (2020). Responsible AI—two frameworks for ethical design practice. IEEE Transactions on Technology and Society, 1 (1), 34–47.

Peterson, M. (2019). The ethical failures behind the Boeing disasters. Blog of the APA, April 8, https://blog.apaonline.org/2019/04/08/the-ethical-failures-behind-the-boeing-disasters/ .

Pinkus, R. L., Pinkus, R. L. B., Shuman, L. J., Hummon, N. P., & Wolfe, H. (1997). Engineering ethics: Balancing cost, schedule, and risk-lessons learned from the space shuttle . Cambridge: Cambridge University Press.

Republic of Indonesia. (2019). Final Aircraft Accident Investigation Report. KNKT.18.10.35.04, https://knkt.dephub.go.id/knkt/ntsc_aviation/baru/2018%2520-%2520035%2520-%2520PK-LQP%2520Final%2520Report.pdf .

Rich, G. (2019). Boeing 737 MAX should return in 2020 but the crisis won't be over. Investor's Business Daily, December 31, https://www.investors.com/news/boeing-737-max-service-return-2020-crisis-not-over/ .

Schnebel, E., & Bienert, M. A. (2004). Implementing ethics in business organizations. Journal of Business Ethics, 53 (1–2), 203–211.

Schwartz, M. S. (2013). Developing and sustaining an ethical corporate culture: The core elements. Business Horizons, 56 (1), 39–50.

Stephan, K. (2016). GM Ignition Switch Recall: Too Little Too Late? [Ethical Dilemmas]. IEEE Technology and Society Magazine, 35 (2), 34–35.

Sullenberger, S. (2019). My letter to the editor of New York Times Magazine, https://www.sullysullenberger.com/my-letter-to-the-editor-of-new-york-times-magazine/ .

Thompson, D. F. (1980). Moral responsibility of public officials: The problem of many hands. American Political Science Review, 74 (4), 905–916.

Thompson, D. F. (2014). Responsibility for failures of government: The problem of many hands. The American Review of Public Administration, 44 (3), 259–273.

Tkacik, M. (2019). Crash course: how Boeing’s managerial revolution created the 737 MAX Disaster. The New Republic, September 18, https://newrepublic.com/article/154944/boeing-737-max-investigation-indonesia-lion-air-ethiopian-airlines-managerial-revolution .

Travis, G. (2019). How the Boeing 737 MAX disaster looks to a software developer. IEEE Spectrum , April 18, https://spectrum.ieee.org/aerospace/aviation/how-the-boeing-737-max-disaster-looks-to-a-software-developer .

Useem, J. (2019). The long-forgotten flight that sent Boeing off course. The Atlantic, November 20, https://www.theatlantic.com/ideas/archive/2019/11/how-boeing-lost-its-bearings/602188/ .

Watts, L. L., & Buckley, M. R. (2017). A dual-processing model of moral whistleblowing in organizations. Journal of Business Ethics, 146 (3), 669–683.

Werhane, P. H. (1991). Engineers and management: The challenge of the Challenger incident. Journal of Business Ethics, 10 (8), 605–616.


Acknowledgement

The authors would like to thank the anonymous reviewers for their helpful comments.

Author information

Authors and Affiliations

North Carolina State University, Raleigh, NC, USA

Joseph Herkert

Georgia Institute of Technology, Atlanta, GA, USA

Jason Borenstein

University of Missouri – St. Louis, St. Louis, MO, USA

Keith Miller


Corresponding author

Correspondence to Joseph Herkert.



About this article

Herkert, J., Borenstein, J. & Miller, K. The Boeing 737 MAX: Lessons for Engineering Ethics. Sci Eng Ethics 26 , 2957–2974 (2020). https://doi.org/10.1007/s11948-020-00252-y


Received : 26 March 2020

Accepted : 25 June 2020

Published : 10 July 2020

Issue Date : December 2020

DOI : https://doi.org/10.1007/s11948-020-00252-y


  • Engineering ethics
  • Airline safety
  • Engineering design
  • Corporate culture
  • Software engineering

Engineering Ethics Case Study: The Challenger Disaster



About this course

This course provides instruction in engineering ethics through a case study of the Space Shuttle Challenger disaster. The minimum technical details needed to understand the physical cause of the Shuttle failure are given. The disaster itself is chronicled through NASA photographs. Next, the decision-making process, especially the discussions during the teleconference held on the evening before the launch, is described. Direct quotations from engineers interviewed after the disaster are used to illustrate the ambiguities of the data and the pressures that the decision-makers faced in the months and hours preceding the launch. The course culminates in an extended treatment of six ethical issues raised by Challenger. The specific topics covered are listed under the course description below. Publication Source: Mark Rossow, PhD, PE (retired)

Meets the standards of:

  • APEGA ( Association of Professional Engineers and Geoscientists of Alberta )
  • EGBC ( Association of Professional Engineers and Geoscientists of British Columbia )
  • APEGNB ( Association of Professional Engineers and Geoscientists of New Brunswick )
  • APEGS ( Association of Professional Engineers and Geoscientists of Saskatchewan )
  • APEGM ( Association of Professional Engineers and Geoscientists of the Province of Manitoba )
  • APENS ( Association of Professional Engineers Nova Scotia )
  • EPEI ( Engineers PEI )
  • NAPEG ( Northwest Territories and Nunavut Association of Professional Engineers and Geoscientists )
  • PEGNL ( Professional Engineers and Geoscientists of Newfoundland and Labrador )
  • PEO ( Professional Engineers Ontario )


Credits: 4 PDH

PDH Course Description:

  • Common errors to avoid in studying the history of an engineering failure: the retrospective fallacy and the myth of perfect engineering practice
  • Shuttle hardware involved in the disaster
  • Decisions made in the period preceding the launch
  • Ethical issue: NASA giving first priority to public safety over other concerns
  • Ethical issue: the contractor giving first priority to public safety over other concerns
  • Ethical issue: whistle blowing
  • Ethical issue: informed consent
  • Ethical issue: ownership of company records
  • Ethical issue: how the public perceives that an engineering decision involves an ethical violation





Are tomorrow’s engineers ready to face AI’s ethical challenges?


Elana Goldenkoff, Doctoral Candidate in Movement Science, University of Michigan


Erin A. Cech, Associate Professor of Sociology, University of Michigan

Disclosure statement

Elana Goldenkoff receives funding from National Science Foundation and Schmidt Futures.

Erin A. Cech receives funding from the National Science Foundation.


A chatbot turns hostile. A test version of a Roomba vacuum collects images of users in private situations. A Black woman is falsely identified as a suspect on the basis of facial recognition software, which tends to be less accurate at identifying women and people of color.

These incidents are not just glitches, but examples of more fundamental problems. As artificial intelligence and machine learning tools become more integrated into daily life, ethical considerations are growing, from privacy issues and race and gender biases in coding to the spread of misinformation.

The general public depends on software engineers and computer scientists to ensure these technologies are created in a safe and ethical manner. As a sociologist and a doctoral candidate interested in science, technology, engineering and math education, we are currently researching how engineers in many different fields learn and understand their responsibilities to the public.

Yet our recent research, as well as that of other scholars, points to a troubling reality: The next generation of engineers often seem unprepared to grapple with the social implications of their work. What’s more, some appear apathetic about the moral dilemmas their careers may bring – just as advances in AI intensify such dilemmas.

Aware, but unprepared

As part of our ongoing research, we interviewed more than 60 electrical engineering and computer science master’s students at a top engineering program in the United States. We asked students about their experiences with ethical challenges in engineering, their knowledge of ethical dilemmas in the field and how they would respond to scenarios in the future.

First, the good news: Most students recognized potential dangers of AI and expressed concern about personal privacy and the potential to cause harm – like how race and gender biases can be written into algorithms, intentionally or unintentionally.

One student, for example, expressed dismay at the environmental impact of AI, saying AI companies are using “more and more greenhouse power, [for] minimal benefits.” Others discussed concerns about where and how AIs are being applied, including for military technology and to generate falsified information and images.

When asked, however, “Do you feel equipped to respond in concerning or unethical situations?” students often said no.

“Flat out no. … It is kind of scary,” one student replied. “Do YOU know who I’m supposed to go to?”

Another was troubled by the lack of training: “I [would be] dealing with that with no experience. … Who knows how I’ll react.”

Other researchers have similarly found that many engineering students do not feel satisfied with the ethics training they do receive. Common training usually emphasizes professional codes of conduct, rather than the complex socio-technical factors underlying ethical decision-making. Research suggests that even when presented with particular scenarios or case studies, engineering students often struggle to recognize ethical dilemmas.

‘A box to check off’

Accredited engineering programs are required to “include topics related to professional and ethical responsibilities” in some capacity.

Yet ethics training is rarely emphasized in the formal curricula. A study assessing undergraduate STEM curricula in the U.S. found that coverage of ethical issues varied greatly in content, amount and how seriously it was presented. Additionally, an analysis of academic literature about engineering education found that ethics is often considered nonessential training.

Many engineering faculty express dissatisfaction with students’ understanding, but report feeling pressure from engineering colleagues and students themselves to prioritize technical skills in their limited class time.

Researchers in one 2018 study interviewed over 50 engineering faculty and documented hesitancy – and sometimes even outright resistance – toward incorporating public welfare issues into their engineering classes. More than a quarter of professors they interviewed saw ethics and societal impacts as outside “real” engineering work.

About a third of students we interviewed in our ongoing research project share this seeming apathy toward ethics training, referring to ethics classes as “just a box to check off.”

“If I’m paying money to attend ethics class as an engineer, I’m going to be furious,” one said.

These attitudes sometimes extend to how students view engineers’ role in society. One interviewee in our current study, for example, said that an engineer’s “responsibility is just to create that thing, design that thing and … tell people how to use it. [Misusage] issues are not their concern.”

One of us, Erin Cech, followed a cohort of 326 engineering students from four U.S. colleges. This research, published in 2014, suggested that engineers actually became less concerned over the course of their degree about their ethical responsibilities and understanding the public consequences of technology. Following them after they left college, we found that their concerns regarding ethics did not rebound once these new graduates entered the workforce.

Joining the work world

When engineers do receive ethics training as part of their degree, it seems to work.

Along with engineering professor Cynthia Finelli, we conducted a survey of over 500 employed engineers. Engineers who had received formal ethics and public welfare training in school were more likely to understand their responsibility to the public in their professional roles, and to recognize the need for collective problem solving. Compared with engineers who did not receive such training, they were 30% more likely to have noticed an ethical issue in their workplace and 52% more likely to have taken action.

Over a quarter of these practicing engineers reported encountering a concerning ethical situation at work. Yet approximately one-third said they have never received training in public welfare – not during their education, and not during their career.

This gap in ethics education raises serious questions about how well-prepared the next generation of engineers will be to navigate the complex ethical landscape of their field, especially when it comes to AI.

To be sure, the burden of watching out for public welfare is not shouldered by engineers, designers and programmers alone. Companies and legislators share the responsibility.

But the people who are designing, testing and fine-tuning this technology are the public’s first line of defense. We believe educational programs owe it to them – and the rest of us – to take this training seriously.

Engineering Ethics Case Study: The Challenger Disaster Course No: LE3-001 Credit: 3 PDH

Related Papers

tee shi feng

Junichi Murata

One of the most important tasks of engineering ethics is to give engineers the tools required to act ethically and to prevent possible disastrous accidents that could result from engineers' decisions and actions. The space shuttle Challenger disaster is referred to as a typical case in almost every textbook. It is seen as a case from which engineers can learn important lessons, as it shows impressively how engineers should act as professionals to prevent accidents. The Columbia disaster came seventeen years later, in 2003. According to the report of the Columbia Accident Investigation Board, the main cause of the accident was not individual actions that violated certain safety rules but rather was to be found in the history and culture of NASA, a culture seen as one that desensitized managers and engineers to potential hazards as they dealt with problems of uncertainty. This view of the disaster is based on Diane Vaughan's analysis of the Challenger disaster, where in...

12th AIAA Aviation Technology, Integration, and Operations (ATIO) Conference and 14th AIAA/ISSMO Multidisciplinary Analysis and Optimization Conference

Taiki Matsumura

MidHath Nigar

Human beings have long been curious to understand their surroundings and to use that knowledge to shape the environment to their benefit. That inherent spark of curiosity is so strong that people were never content to understand only their immediate environment; there has always been the urge to look up at the sky, to ask what lies beyond what our eyes can see, and to wonder whether we can reach the stars. Driven by the exponential growth of science and technology, humanity undertook the so-called “Space Race” in the 20th century, pushing the imagination and knowledge of engineers and scientists around the world toward one of the boldest objectives humanity has ever set for itself: sending people into space. The National Aeronautics and Space Administration (NASA) claimed victory in that race when, in 1969, it landed the first men on the Moon during the Apollo 11 mission. Space Shuttle Challenger leaps from the launch pad. Photo Credit: NASA [1]. It was expected that after such an achievement the space agencies would not stop; instead, the success became a source of motivation for what might follow if they continued to work well. In this context, NASA announced an ambitious project called the Space Shuttle: a reusable crewed spacecraft able to make repeated round trips to space. The dream was becoming a reality; the conquest of space was under way. The first flights of the Space Shuttle, although surrounded by uncertainties and details still to be improved, were promising, and with such enthusiasm NASA dared to fly successive missions within relatively short intervals. Then tragedy struck on Challenger's tenth flight, an event that marked and changed the history of space exploration forever. The catastrophe took place on January 28, 1986, when the vehicle was destroyed only 73 seconds after launch, in front of the entire organization and a large portion of the American public. The mission, designated STS-51-L [2], was to carry into orbit the second Tracking and Data Relay Satellite for American communication services, along with SPARTAN-Halley, an astronomical platform intended to observe Comet Halley, which at the time was near Earth. The accident claimed the lives of all seven crew members, including a schoolteacher who had been selected to teach children about space after returning from the mission [3]. The impact on the public and on the enthusiasm of the scientific community was so great that many media outlets called it “the greatest accident in the conquest of space” [4]. The crew of the Space Shuttle Challenger. Photo Credit: History.com. In this essay we make a thorough study of the technical and administrative factors that contributed to the failure of the Challenger mission, from the planning stage through the execution of the project to the consequences and subsequent investigations, in order to identify the lessons to be learned in both areas. It is expected that by completing this task we will be able to identify the critical factors that we, as students who will develop projects of our own, need to pay particular attention to.

Ramon Llull Journal of Applied Ethics

Robert E . Allinson

For the purpose of this analysis, risk assessment becomes the primary term and risk management the secondary term. The concept of risk management as a primary term is based upon a false ontology. Risk management implies that risk is already there, not created by the decision, but lies already inherent in the situation that the decision sets into motion. The risk that already exists in the objective situation simply needs to be “managed”. By considering risk assessment as the primary term, the ethics of responsibility for risking the lives of others, the environment and future generations in the first place comes into the forefront. The issue of risk heeding is especially important as it highlights the need to pay attention to warnings of danger and to take action to redress problems before disasters occur. In this paper, the decision making that led to the choice of technology utilized and the implementation of such technology in the case of the space shuttle Challenger disaster will be used as a model to illustrate the need to take ethical factors into account when making decisions regarding the safety of technological systems and the heeding of danger warnings. While twenty-five years separate the decision to launch the Challenger and the Fukushima Daiichi nuclear plant disaster, the lessons of the Challenger disaster are still to be learned.

Eric Aluoch

abdelrahman abdelraouf

7th IET International Conference on System Safety, incorporating the Cyber Security Conference 2012

Sanjeev Appicharla

This paper presents the results of a desktop study to model and analyse the Space Shuttle Challenger Accident using the Management Oversight and Risk Tree as a part of the SIRI Methodology. The study uses the NASA Summary Report of the Presidential Commission Investigations on the Space Shuttle Challenger Accident as an input document [8]. The aim of the case study is to learn all the causal factors that produced the accident. It is assumed that popular explanations of the accident suffer from errors, either blaming the launch decision or the behaviour of managers during the pre-launch decision-making activity. The utility of the case study lies in learning all the causal factors of the given effect, the Space Shuttle Challenger Accident, which occurred on 28th January 1986, using the doctrine of causation (cause-effect reasoning) as the guiding principle [8],[12],[14],[17]. Note: The paper title contains a spelling mistake.

Musavvir Mahmud

Engineering applies science and technology to meet the demands of society, but the resources available to engineers are limited, and scientific and technological knowledge is not perfect, so catastrophic incidents sometimes take place. This report discusses the Columbia Shuttle disaster, the Tay Bridge collapse, and the crash of Japan Airlines Flight 123. Each incident is presented with a brief background and a brief description of the disaster, the possible reasons behind it are described, and the aftermath of each incident is discussed at the end.

The Explosion of the Challenger

Part Seven of Seven Discussions Concerning the Challenger Disaster.

I wrote the following entry in my notebook after returning to my office. "I sincerely hope that this launch does not result in a catastrophe. I personally do not agree with some of the statements made in Joe Kilminster's written summary stating that SRM-25 is okay to fly."

As it turned out, I didn't agree with any of his statements after I had a chance to review a copy of the chart. A review of the chart will produce the following conclusions from anyone having normal powers of reason. The chart lists nine separate statements, seven of which are actually reasons against launch, while one is actually a neutral statement of engineering fact. The remaining statement concerning a factor of safety of three on seal erosion is not even applicable to the discussion which had ensued for over an hour. Therefore, Morton Thiokol senior management reversed a sound technical decision without any re-evaluation of the data they had promised when they requested the caucus.
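For context on the “factor of safety of three” language: a factor of safety on erosion is simply the ratio of the erosion depth a seal can tolerate to the worst erosion actually observed. The sketch below shows that arithmetic with placeholder numbers (the depths are illustrative, not the actual Morton Thiokol test or flight values) and notes why Boisjoly considered the figure beside the point.

```python
# Illustrative sketch of a "factor of safety on seal erosion" calculation.
# The depths below are placeholder values chosen only to show the arithmetic;
# they are not the actual Morton Thiokol test or flight data.
max_observed_erosion_in = 0.053  # hypothetical: deepest O-ring erosion seen on a recovered joint
tolerable_erosion_in = 0.159     # hypothetical: erosion depth at which testing suggests the seal still holds

factor_of_safety = tolerable_erosion_in / max_observed_erosion_in
print(f"factor of safety on erosion = {factor_of_safety:.1f}")

# Boisjoly's objection: this ratio concerns how much erosion a seated seal can
# survive. It says nothing about whether a cold, stiff O-ring will seat and
# seal at all at low launch temperatures, which was the question the
# teleconference had been debating for over an hour.
```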

The next morning I paused outside Arnie Thompson's office and told him and the manager of applied mechanics, who was my boss, that I hoped the launch was safe, but I also hoped that when we inspected the booster joints we would find all the seals burned almost through the joint, then maybe we could get someone with authority to stand up and stop the flights until we fixed the joints.

It was approximately five minutes prior to the launch as I was walking past the room used to view launches when Bob Ebeling stepped out to encourage me to enter and watch the launch. At first I refused, but he finally persuaded me to watch the launch. The room was filled, so I seated myself on the floor closest to the screen and leaned against Bob's legs as he was seated in a chair. The boosters ignited, and as the vehicle cleared the tower Bob whispered to me that we had just dodged a bullet. At approximately T+60 seconds Bob told me that he had just completed a prayer of thanks to the Lord for a successful launch. Just 13 seconds later we both saw the horror of destruction as the vehicle exploded. We all sat in stunned silence for a short time, then I got up and left the room and went directly to my office, where I remained the rest of the day. Two of my seal task-team colleagues inquired at my office to see if I was okay, but I was unable to speak to them and hold back my emotions, so I just nodded yes to them and they left after a short silent stay.

Roger Boisjoly made a number of choices in the months leading up to the Challenger accident. He consistently took an ethical course of action, often risking his job. Nevertheless, he was unable to avert the January 28 launch. In 1988 Roger Boisjoly was given the American Association for the Advancement of Science Award for Scientific Freedom and Responsibility for his extensive and well conceived efforts to avert the shuttle disaster.

