Monday, December 30, 2019
"THIS IS NOT A DRILL": Don't blame the button presser, blame the button designer
On Saturday morning, people living in Hawaii faced a false-but-terrifying nuclear threat after an employee at the Hawaii Emergency Management Agency erroneously sent a public emergency alert reading "BALLISTIC MISSILE THREAT INBOUND TO HAWAII. SEEK IMMEDIATE SHELTER. THIS IS NOT A DRILL" to people's smartphones, causing widespread panic and fear for the 38 minutes it took Hawaii authorities to correct the error.

How could such a massive error have happened? According to the Washington Post, the unidentified employee saw two options in a drop-down menu on a computer program, "Test missile alert" and "Missile alert," and incorrectly picked the latter, real-life option, which triggered alerts to the public's smartphones and television screens.

Design failure, not human error, to blame

The employee responsible for the push alert failure has reportedly been reassigned but not fired. Instead of blaming the person who pushed the wrong button, the government blamed the state system that allowed one person to have so much power. "Based on the information we have collected so far, it appears that the government of Hawaii did not have reasonable safeguards or process controls in place to prevent the transmission of a false alert," FCC Chairman Ajit Pai said in a statement on Sunday.

User interface expert Don Norman said the real error is the incompetent design of the alert system that allowed the mistake to occur. In 2003, Norman wrote an oft-cited essay on how system failures within organizations too often get attributed simply to human error, which allows the errors to continue unexamined.

"If we assume that the people who use technology are stupid, then we will continue to design poorly conceived equipment, procedures, and software, thus leading to more and more accidents, all of which can be blamed upon the hapless users rather than the root cause: ill-conceived software, ill-conceived procedural requirements, ill-conceived business practices, and ill-conceived design in general," Norman wrote. "It is far too easy to blame people when systems fail."

Under this logic, yes, humans can cause errors, but more attention and responsibility should be paid to the processes that led to such an error.

When a contractor can disable the president's account

Norman's philosophy applies to another recent incident in which one worker's mistake caused an outsized impact. In November, social media company Twitter blamed a contractor on his last day for disabling President Trump's Twitter account for 11 minutes.

In a later interview, the contractor, Bahtiyar Duysak, explained that his action was a mistake because he never thought the account would actually get deactivated, according to TechCrunch.

The New York Times reported that Twitter employees had long expressed concerns that high-level accounts like those belonging to the U.S. president were too easily accessible to hundreds of the company's workers. A procedure that allowed contractors like Duysak to access those accounts made a system failure more likely to occur.

The nuclear threat triggered by a dropped socket wrench

This is not the first time a nuclear scare has been caused by a human error that a flawed system made possible. In 1980, two airmen were performing maintenance on a U.S. Air Force Titan II ballistic missile in Arkansas when a socket wrench was accidentally dropped into the shaft of the missile silo. The socket hit a fuel tank and caused an explosion that lifted a warhead out of the ground and caused the deaths of two people.

If the warhead had actually detonated, the maintenance mishap could have made part of the state of Arkansas uninhabitable, and countless lives would have been lost. The safeguards for such a fluke event were not in place. As one of the technicians in the missile's control room put it, an accident like this wasn't on the checklist.

"If the system worked properly, someone dropping a tool couldn't send a nuclear warhead into a field," Eric Schlosser, who wrote a book on the Air Force incident, said in a PBS documentary.

These examples show us that blaming a person is easy when an error causes headlines and panic. What's more important, designers like Norman argue, is making sure the problem gets fixed so that mistakes and fluke events cannot be repeated by someone else.
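The design lesson generalizes: an irreversible action should never sit one ambiguous menu pick away from a harmless one. As a rough illustration of the kind of safeguard Norman's argument implies, here is a minimal Python sketch in which a live alert cannot go out unless the operator retypes a distinct confirmation phrase. Everything in it (the send_alert function, the AlertMode enum, the confirmation phrase) is hypothetical, not a description of Hawaii's actual system.

from enum import Enum

class AlertMode(Enum):
    TEST = "test"
    LIVE = "live"

# Hypothetical phrase the operator must retype before a live send.
CONFIRM_PHRASE = "SEND LIVE ALERT"

def send_alert(message: str, mode: AlertMode, typed_confirmation: str = "") -> bool:
    """Dispatch an alert, refusing a LIVE send without an explicit confirmation step."""
    if mode is AlertMode.TEST:
        print(f"[TEST ONLY] {message}")
        return True
    # A live alert requires a deliberate second action, so a slip on a
    # drop-down menu alone can never reach the public.
    if typed_confirmation != CONFIRM_PHRASE:
        print("Live alert blocked: confirmation phrase missing or incorrect.")
        return False
    print(f"[LIVE] {message}")
    return True

# A mistaken menu pick now fails safely instead of alerting the public:
send_alert("BALLISTIC MISSILE THREAT INBOUND. SEEK IMMEDIATE SHELTER.", AlertMode.LIVE)

A real deployment would layer further controls on top of this, such as a two-person rule or a mandatory preview of exactly who will receive the message. The point is that the system, not the operator's vigilance, absorbs the slip.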