- Mock Questions for UPSC Mains
- Introduction: Stampede disasters
- Body#1: Causes / triggers for stampedes
- Body#2: Disaster Preparedness: Before Stampede
- Body#3: Disaster Management: After Stampede
- (GSM3) Role of Science & Technology in Stampede Prevention
- (GSM3) Role of Media in man-made disasters
- (GSM4) Ethical code for Media during disasters
- (GSM2) Legal Provisions against stampede
- Conclusion: Stampede prevention (भगदड़ की रोकथाम)
Mock Questions for UPSC Mains
Answer the following questions in 200 words each, on blank A4-sized paper with 1″ margins on each side:
- (GSM3) Suggest measures for preparedness against human stampedes at public places and public events. (मानव भगदड़)
- (GSM3) Why do stampedes occur? Enumerate the application of science and technology in disaster response, recovery and mitigation of such manmade disasters.
- (GSM3) Discuss the role of media in the prediction, protection and preparedness for manmade disasters. (मानव निर्मित आपदाओं में मीडिया की भूमिका)
- (GSM4) Prescribe a code of ethics for the journalists during disasters and terrorist attacks.
- (Essay) India’s preparedness against man-made disasters. (Answer in 1000-1200 words)
Introduction: Stampede disasters
There are three ways to begin the answer:
- Define: It is difficult to define a human stampede (मानव भगदड़) in one or two lines. Besides, you'll have to mention those factors / triggers again in the body of the answer, so it will lead to unnecessary repetition of phrases and sentences. Therefore, it is better to start with "origin".
- Origin: From religious shrines to railway stations, frequent incidents of human stampedes are an unfortunate reality of Indian life. The latest entry in the list of such tragedies is the Elphinstone railway bridge in Mumbai, which claimed the lives of over 20 people in 2017. [At least this much current affairs you should follow: >20 had died.]
- Data: In the last decade, over __ people have lost their lives in stampedes, including the latest tragedy at Mumbai's Elphinstone railway bridge. [But you can't memorize decadal casualty numbers for each disaster in the real exam, so "data" is a difficult way to begin this answer.]
- Observe that I've given the "origin" in a verbose and flowery manner, because disaster management for human stampedes is not a 'big' topic unlike earthquakes or urban floods, where you can fill up many pages. So, to reach the 200-word mark, you may have to do padding like this.
- Yes, this article has more than sufficient points to even handle an essay or GTO tasks / GDPI for SSB / CDS interviews, but UPSC could ask about any random disaster, so "padding the introduction" is a skill you should also cultivate.
Body#1: Causes / triggers for stampedes
- Human stampedes result from the forces generated by panicked persons pushing each other in a large crowd.
- Since their movement is uncoordinated, they get injured or fall over each other, and become obstacles to the movement of others.
- A fallen person dies from intense compression of the lungs and subsequent suffocation. [The medical term is "traumatic asphyxia", but we are writing an answer for the GS paper, not an optional-subject paper, so no need to memorize it.]
NDMA has identified the following triggers / factors leading to stampedes:
- Structural: collapse of temporary structures, steep stairs, narrow exits because of illegal constructions, parking, hawkers etc.
- Fire / electric: usually from makeshift kitchens in the 'pandal', or inappropriate use of firecrackers / electrical wiring during the event.
- Human: underestimating the size of the crowd, overselling of tickets, lack of coordination with authorities, panic caused by rumors, rush to get a freebie / celebrity autograph etc.
Disaster Preparedness: Before Stampede
Planning & coordination
- Following Factors must be considered while planning for large events:
- 1) Type and duration of event.
- 2) Size of the expected crowd, gender and age profile of attendees.
- 3) Location characteristics.
- Accordingly, event organizers should do planning, rehearsals and safety drills with the help of the police, fire, health, forest, revenue and PwD departments.
- For example: in religious events, depending on the estimated crowd size, extend the duration of the 'melaa' or 'darshan' by more days to avoid a larger gathering.
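The planning factors above can be turned into a simple back-of-the-envelope check. The Python sketch below is illustrative only: the ~4 persons per square metre figure is a commonly cited crowd-safety rule of thumb, not an official NDMA number, and the function name is my own. It estimates how many sessions / 'darshan' slots an organizer would need so that the venue never exceeds a safe crowd density:

```python
# Hypothetical planning aid (not an official NDMA tool): split an expected
# crowd into sessions so that density stays below a commonly cited
# crowd-safety rule of thumb of ~4 persons per square metre.
SAFE_DENSITY = 4.0  # persons per m^2 (illustrative threshold)

def plan_sessions(expected_crowd: int, usable_area_m2: float,
                  safe_density: float = SAFE_DENSITY) -> int:
    """Minimum number of sessions / slots so that no single
    session exceeds the safe crowd density."""
    capacity_per_session = int(usable_area_m2 * safe_density)
    sessions = -(-expected_crowd // capacity_per_session)  # ceiling division
    return max(1, sessions)

# e.g. 100,000 expected devotees on 5,000 m^2 of usable space
print(plan_sessions(100_000, 5_000))  # -> 5
```

The same arithmetic also tells you whether to extend the 'melaa' by extra days instead of packing everyone into one.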
Volunteers, Paramedics & Security personnel
- Uniform dresses and id cards for volunteers.
- Give them training on crowd management: how to frisk visitors, how to operate metal detectors, how to identify troublemakers and suicide bombers.
- Order them to frisk visitors for bidis, cigars, matchsticks etc. to avoid any fire disaster.
- Paramedics should train the event volunteers in how to give first aid treatment, including CPR, and how to transport injured persons.
- There should be a disaster protocol and standard operating procedure for all nearby hospitals.
The full form of CPR is cardio-pulmonary resuscitation, but since we are not writing an answer for the medical science optional, you need not memorize it.
Control Room & Communication
- Central control room equipped with CCTVs to monitor and spot crowd buildup areas.
- Mike & public address system to make important announcements.
- Temporary observation towers may be erected, if required.
- UAVs / drones could be used for monitoring crowd movement & risk assessment at mega events such as the 15th August or 26th January parades.
- Don't rely on mobile phones, as a large gathering might create network problems. Better to use a dedicated walkie-talkie system for communication between personnel.
Prevent Traffic Congestion
- Traffic congestion hampers response, rescue and relief operations during a stampede / terror attack.
- Coordinate with rail and bus operators so they don't unilaterally divert vehicles to other routes; otherwise, many people may come in private vehicles.
- Charge high parking fee to discourage private vehicles.
- Provide shuttle bus service between venue and the nearest railway / bus stop to ensure less rickshaws and taxis crowding the venue.
- Strict enforcement of traffic and parking rules even for VIPs.
Entry Tickets & Display boards
- Provide seat numbers and entry passes. Don't allow general admission; otherwise it is difficult to control the movement of people.
- Assign parking and seating arrangements with color-coded pathways.
- Because "mad rush and casual attitude" is the Indian way of life, people don't read text instructions like "Exit from Nanaji gate via Deendayal pathway." They'll keep asking volunteers where to go.
- But if the ticket / signboard / public announcement says "follow the green path for entry and the red path for exit", it facilitates easier and more effective communication, even for illiterate visitors.
- Accordingly, design sign-boards / display boards in multiple languages to disseminate information about entry & exit routes, food, water & rest room facilities, and the helpdesk location for lost & found items and children.
Pathways & Hawkers
- Prohibit hawkers and sellers in corridors and pathways.
- Hawkers must not be allowed to use stoves or sell cigarettes / matchsticks / loud whistles / scary masks / fire crackers and other dangerous items which can be misused by unruly youth to scare people. If it’s a ‘melaa (fair)’ such shops should be at ‘exit’ rather than at entry points.
- If the shrine is atop hill/mountain, there should be separate track for pedestrians and separate track for ponies and horses. (else someone could light a firecracker to startle the horses and cause stampede)
- Adequate facilities for light, ventilation, drinking water and toilets- along the pathways.
Entry-Exit, Barricades & Punctuality
- Doors must not be suddenly opened or suddenly closed.
- Provide multiple exits. Routes of ingress and egress must be separate. Otherwise incoming and outgoing visitors /vehicle will collide during panic / rush.
- Entry and exit points must have strong but non-permanent barricades. (so that during rush, you can remove those barricades.) Such barricades must not have strong metal spikes, else it may cause more injuries during stampede.
- Strict adherence to time and punctuality, because delay causes anger and restlessness in the crowd, and the sudden entry of a train / celebrity triggers a rush. [Therefore, multiple entry-exits and non-permanent barriers help.]
Freebies and Autographs
- If there is any ‘free distribution’ of gift, toy, saree, blanket, food, prasad, alms etc. there must be multiple distribution points. Volunteers should ensure that all devotees don’t concentrate at a single distribution point.
- Such free gifts must not be thrown randomly at the crowd, like SRK did at Vadodara railway station during the promotion of his flop movie 'Raees'.
- A celebrity should not be allowed to mingle with the crowd, because 1) his personal security may get compromised; 2) it'll trigger a rush among the people to push each other for a glimpse / autograph (e.g. again, the SRK-'induced' stampede at Vadodara railway station).
- Don't allow more people than the weight-bearing capacity of the podium / lifts / temporary stairs. Sometimes a large number of people climb onto the podium for a group photo with the celebrity or politician, resulting in the collapse of the structure. Give strict instructions to the security personnel to forbid such entry.
Electricity and Fire hazards
- Kitchen and cooking facilities should be away from the main event.
- Take proper care in handling LPG stoves, firewood and electrical fittings. Only qualified and registered catering personnel should handle these.
- Circuit breakers, fuse boxes, switchboards, fuel tanks must be kept in isolated location with security guard, lest some unruly youth play with it. E.g. shutting down all lights during a night-concert may also trigger panic and stampede.
- Electrical wires should be underground or overhead and always away from the walking paths- to minimize tripping hazards.
- Forbid smoking, because 1) It can cause fire; 2) suffocation and discomfort may prompt others to push each other.
- Ensure fire extinguishers, fire hydrants, sand buckets, water tanks and first aid kits are available throughout the venue.
- Forbid visitors from throwing food waste, lest it attract dogs, monkeys and elephants – their violent behavior can cause crowd panic.
Disaster Management: After Stampede
- Event volunteers and paramedics must commence rapid first aid treatment.
- Distribute patients to area hospitals in a coordinated manner, so that relatives can easily find them.
- Set up a control room and helpdesk to handle all inquiries.
- Police personnel and relief workers should ensure proper storage and tagging of the mobiles, purses, footwear and other belongings of the victims.
- Psycho-social support and mental health services for the survivors, and the persons who lost their loved ones.
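The coordinated distribution of patients mentioned above can follow even a very simple rule: send casualties to hospitals in order of spare capacity, and escalate any overflow to the control room. A hypothetical sketch (hospital names and numbers are invented for illustration, not from any official protocol):

```python
# Illustrative dispatch helper: assign casualties to nearby hospitals by
# spare capacity, so no single hospital is flooded and relatives can be
# told exactly where each batch was sent.
def assign_casualties(casualties: int, hospitals: dict) -> dict:
    """hospitals: name -> spare beds. Returns name -> count assigned."""
    assignment = {}
    # Fill the hospital with the most spare beds first.
    for name, beds in sorted(hospitals.items(), key=lambda kv: -kv[1]):
        take = min(casualties, beds)
        if take:
            assignment[name] = take
            casualties -= take
    if casualties:
        assignment["UNASSIGNED"] = casualties  # escalate to control room
    return assignment

print(assign_casualties(40, {"KEM": 25, "Sion": 10, "Nair": 15}))
# -> {'KEM': 25, 'Nair': 15}
```

In practice, the same idea would also weigh distance and trauma-care capability, but the point is that the control room, not ambulance drivers, should make the allocation.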
(GSM3) Role of Science & Technology in Stampede Prevention
S&T, ICT: Before disaster
- Aadhaar-linked barcoded passes should be given, and verified at entry points, to check whether a person actually arrived or not. So, in case of a stampede or railway accident, insurance claims / relief money can be distributed with fewer hassles. [The Vaishnodevi and Sabarimala shrines already have e-registration.]
- In the long run, if Aadhaar is also linked with the crime database, we can prevent the entry of under-investigation / accused / convicted persons with a history of inciting violence or playing mischief at public places.
- Such e-registration will also give us an idea of the gender and age composition of the crowd, so that advance facilities such as wheelchairs and an adequate number of restrooms can be arranged accordingly.
- Special apps could be created for digital entry passes and GPS instructions for parking and entry / exit points.
- RFID / app-based visitor passes will help in monitoring crowd buildup with real-time geo-location, so that preventive measures can be taken accordingly.
- If RFID / app-based passes are not possible, we can install sensors to monitor heat, smoke and noise levels at various points; that too can help in estimating crowd buildup and taking preventive measures.
- CCTV / UAV / Drones / GIS for monitoring of the crowd.
- The above data will also be useful for big-data analytics with crowd simulation at a later time. This will help in better management of future events.
- SMS, social media and prerecorded voice messages to make important announcements (before the event and after a stampede).
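The RFID / sensor-based crowd-buildup monitoring described above boils down to a zone-occupancy counter. A toy sketch (class names and the 80% alert threshold are my assumptions, not any deployed system): each gate reports entries and exits, and the control room is alerted well before a zone reaches its safe capacity:

```python
# Illustrative crowd-buildup monitor: RFID gates / turnstiles / CCTV
# people-counters report entries and exits per zone; the control room
# gets an alert before occupancy reaches the zone's safe capacity.
from collections import defaultdict

class CrowdMonitor:
    def __init__(self, zone_capacity: dict, alert_ratio: float = 0.8):
        self.zone_capacity = zone_capacity   # zone -> max safe occupancy
        self.alert_ratio = alert_ratio       # warn at 80% of capacity
        self.occupancy = defaultdict(int)

    def record(self, zone: str, entries: int = 0, exits: int = 0) -> str:
        self.occupancy[zone] += entries - exits
        return self.status(zone)

    def status(self, zone: str) -> str:
        cap = self.zone_capacity[zone]
        if self.occupancy[zone] >= cap:
            return "CLOSE ENTRY"      # divert crowd, open extra exits
        if self.occupancy[zone] >= self.alert_ratio * cap:
            return "ALERT"            # send volunteers, slow down entry
        return "OK"

monitor = CrowdMonitor({"bridge": 500})
print(monitor.record("bridge", entries=350))  # -> OK
print(monitor.record("bridge", entries=100))  # -> ALERT
print(monitor.record("bridge", entries=60))   # -> CLOSE ENTRY
```

The same counters feed the big-data analytics and crowd simulation mentioned above, since every alert leaves a time-stamped record of where and when buildup happened.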
S&T, ICT: During / after disaster
- Dedicated website / social media account / toll-free IVRS number to disseminate authentic information and to prevent rumor mongering.
- RFID / barcodes and photographs of unidentified dead bodies and luggage should be displayed on a dedicated web portal, so relatives can find them without having to run from pillar to post to every hospital and morgue.
- Logistics: material, medicines, blood donors, inventory and personnel can all be managed efficiently using ERP software.
(GSM3) Role of Media in man-made disasters
I've written this for stampedes, but you should be intelligent enough to replicate these points for other events as well, e.g. a terrorist attack at a public place or rioting after the conviction of a religious guru.
Role Before Stampede:
- Educational: inform the public about venue timings and traffic routes. Don't just cover the event / celebrity appearances / speeches; journalists should also educate the public about the "dos and don'ts" to avoid stampedes, terror attacks and pickpocketing. Most importantly, youth must be told to help victims instead of taking selfies.
- Critical: evaluate and highlight any gaps in disaster preparedness before the event. If the media had kept highlighting the nuisance of hawkers & congestion at railway bridges (instead of tracking Honeypreet's whereabouts), then the railway police could have been pressured into taking preventive measures across all railway stations and bridges, including the Elphinstone bridge.
Role during / after stampede
- Investigative: Find the cause of the disaster; report the activities of anti-social elements and politicians who obstruct rescue and relief operations, e.g. AAP MLA Alka Lamba's photo-op on a fire brigade vehicle.
- Informative: Help people locate their missing relatives and loved ones. Prevent the spread of misleading rumors and distorted facts.
- (Again) Critical:
- Highlight the needs of survivors and family members to ensure that authorities act with full sincerity and empathy during rehabilitation.
- Organize debates and expert opinions to keep public pressure on the authorities, to ensure they initiate preventive measures against future disasters.
- Mobilize: Funds, relief material, blood donors and volunteers.
(GSM4) Ethical code for Media during disasters
- Investigative journalists should resist the temptation of quick bytes and half-baked truths to deliver 'breaking news'. They should publish news or put blame on a person or organization only after verifying the facts from authentic sources.
- For example, after the Elphinstone stampede, leading news websites showed a clip of a 'man molesting a dying woman on the railway bridge'; however, footage from a different camera angle showed that the person was actually trying to help. Later on, all newspapers had to retract the molestation story and apologize. Nonetheless, such "breaking news" harms India's image among foreign tourists, who may not see the retracted news later on.
- Photojournalists and cameramen should avoid voyeurism, brandishing of the personal grief, or publishing extremely gory and disturbing photographs.
- They should respect the privacy of the women and innocence of the children (unlike our kids reality shows.)
- In some cases, shocking photos / videos need to be published to highlight wrongdoings and shake people out of their lethargic attitude. A journalist should use their conscience for this decision.
- Journalists must weigh the pros and cons of covering a story versus acting as a 'volunteer'. There have been instances of freelance reporters shooting video of a dying / drowning person instead of saving him.
(GSM2) Legal Provisions against stampede
Unlikely question but GSM2 is known for its अतरंगी (crazy) questions.
The Disaster Management Act, 2005 mandates that:
- Authorities can restrict human and vehicular traffic in vulnerable area.
- Local authorities must ensure safety standards in all construction projects in their area. [After all, stampedes can occur in malls and theatres too, if they lack wide doors, fire exits etc.]
- Public officials / private companies neglecting their duties regarding safety standards= punishable.
- Person making false claims for insurance / relief = punishable.
Police Act, 1861 & state-specific city police acts:
- Police are given powers to regulate public assemblies and processions; organizers have to obtain a license.
- Police / fire officials can remove obstructing persons / structures during disasters.
- Penalties on organizers and participants for any bonfires, animal stampedes, celebratory firing etc.
State-specific Mela Act / Cinema Act
- Organizers have to obtain a license from the District Magistrate.
- The DM is given the power to specify safety standards, including for the entry, exhibition and sale of animals.
Shortcomings in the existing legal structure
- Existing laws provide only penalties for neglect / mischief, but no specific mechanism or comprehensive, uniform guidelines for compensation.
- So, victims have to depend on the government's mercy OR approach the civil courts, but that is a very time-consuming process, as seen in the Uphaar Cinema tragedy case of Delhi.
- There must be a law requiring all such events, places and shrines to compulsorily purchase "liability insurance for visitors", covering loss of life & belongings, injury & hospitalization. (Similar to the Civil Liability for Nuclear Damage Act, 2010.)
- If railway and bus stations are also required to buy such insurance, it'll reduce their profitability. But they can hike entry fees, which will automatically reduce crowd buildup. (i.e. only travelers will come; 50 relatives will not come to 'see him off' and enjoy free wifi.)
Conclusion: Stampede prevention (भगदड़ की रोकथाम)
- Natural and man-made disasters occur through omission and commission of the society and the administration.
- Therefore, any disaster can be prevented or at least its impact can be minimized by the coordinated actions of all the stakeholders (and depending on question, you can write, “including media” or “and with the help of science technology”).
TONY COX, host:
This is TALK OF THE NATION. I'm Tony Cox in Washington. Neal Conan is away.
Man-made disasters are a fact of life, from the BP Horizon oil spill to the West Virginia mine explosion earlier this year to the levee failures during Hurricane Katrina five summers ago.
What these and other incidents have in common, experts say, is a dangerous mix of arrogance, bad luck and shortsightedness by the people in charge. Managers often operate on the edge, mistakenly thinking that technology will keep things right, but too often, they're wrong.
Today, we explore the risk-taking culture that can and does lead to disaster. We will also examine what can be done about it. How do you and your colleagues try to keep disaster from happening where you work? Tell us your story. Our number here in Washington is 800-989-8255. The email address, email@example.com. And to join the conversation, just go to our website. Go to npr.org, and click on TALK OF THE NATION.
Joining us now by phone from San Francisco is William Reilly. He is the co-chairman of President Obama's national oil spill commission and co-chairman of the National Commission on Energy Policy. Mr. Reilly, welcome to the show.
Mr. WILLIAM REILLY (Co-chairman, BP Deepwater Horizon Oil Spill and Offshore Drilling Commission): Pleasure to be here, thank you.
COX: Let's begin with this. Typically, commissions are developed after disasters occur, such as the one that you are working on for the BP Horizon disaster, yet we continue to have more disasters happen. How do you make sure, if it's even possible to do that, that the findings will help in preventing future problems?
Mr. REILLY: One very unusual characteristic of this commission is that we were appointed while the disaster was continuing to unfold. This was a slow-motion catastrophe that just continued and continues today in terms of the impacts that are continuing to be noticed down in the Gulf.
We are focused mostly, however, on the future. I mean, we're instructed to find out the cause and the root cause of this calamity and recommend policies to absolutely reduce the chance of it ever happening again and then to recommend the policy for future of offshore oil and gas.
And our emphasis will be, while we'll answer those questions about how it happened and why it happened, very much on the future.
COX: Would you expect the and I know it's early, but would you expect that part of your commission recommendations would be to either lower or in some way change what I will call the culture of risk-taking that so often leads to these kinds of disasters?
Mr. REILLY: The culture both of the regulatory enterprise of MMS that was supposed to have realistic response plans approved and realistic well designs, protective well designs, as well as the behavior of the company, which has a history of having been challenged by process safety, those very much on our agenda, and we're getting into those as we speak.
COX: Do you think that we're too reliant on technology?
Mr. REILLY: I think we probably have a tendency to, and I guess it's a very human one, to assume that if things haven't gone wrong for an extended period of time, they won't go wrong, even though we are constantly intervening to improve and make more sophisticated the interrelationships among technology and human choices and decision.
We noticed that in the financial crisis, where the subprime problem was thought not to be possible. And we've also noticed it here, where the consensus really was that you could not have a spill like this, given the technology of today's oil and gas exploration, which is very, very sophisticated.
The response capability, however, and the containment technology were never developed in tandem with the tremendous advances made in the capacity to go deeper in the ocean. That is one reason this spill went on much longer than it should have, even after it happened.
COX: One final thing I'd like to ask you: Do you think that there is a fear of punishment on the part of those who work at places like BP and elsewhere, where these disasters occur, that prevents them or in some way affects their behavior on the job because they're so concerned that if something goes wrong that - well, we know what will happen if something goes wrong?
Mr. REILLY: I cannot really say that's true based upon what I know now with respect to BP. I can say that the best rig practices, the best oil and gas exploration practices, which I have looked into somewhat, give everybody, the lowest level worker a stop-work capacity. And there's no repercussions if he should exercise it.
That kind of thing, very alert sensitivity to safety, a problem that anybody sees that can cause a difficulty like a rig failure, I think needs to be built into the system. And whether it was with respect to the Macondo well, we will determine.
COX: William Reilly is the co-chairman of President Obama's national oil spill commission investigating the BP Horizon disaster and co-chairman of the National Commission on Energy Policy. He was also the administrator of the EPA under President George H.W. Bush. He joined us by phone from San Francisco. Mr. Reilly, thank you very much.
Mr. REILLY: Good to be with you, thank you.
COX: Thank you. Joining us now from member station WUOM in Ann Arbor, Michigan, is James Bagian. He is a former NASA astronaut who has participated in investigations of the Challenger and Columbia shuttle disasters. He is also the current director of the Veterans Administration's National Center for Patient Safety. Jim, nice to have you.
Mr. JAMES BAGIAN (Chief Patient Safety Officer, Veterans Health Administration): Same here.
COX: Let's get right into your career as an astronaut. NASA has had its problems with takeoffs and space travel. What procedures are used to prevent disasters in such a high-tech field as that?
Mr. BAGIAN: Well, I think it's not unlike many other industries. They try to initially analyze the things that are to be done and look for those steps or procedures that have potential risk or hazard associated with them, try to assess the risk.
And this is really the probability that a bad thing will happen, and then decide which risks you want to reduce and how to go about doing that as a cost effect to do so, and then go through that whole litany of things that could happen and then try to anticipate and practice for them.
COX: Well, Jim, your history has shown that you have worked in a number of areas - aerospace, you are now working in the health care industry. Life and death is a matter of life, so to speak, in both of those careers. Are you finding that there are more similarities than differences in terms of how the processes are gone through in order to try to prevent disaster?
Mr. BAGIAN: Well, I think my observation, at least, is in the fields that are more heavily engineering, rooted in engineering so, space flight, aviation - there tends to be a more methodical approach to what can go wrong, anticipation of that, and then to try to put robust systems in place to account for that and reduce the probability of a bad event.
In health care over the last 10 years or so, there's starting to be an awakening of how to do the same things, rather than concentrate on asking people to be more careful, try harder, don't make mistakes and be perfect. I think they're starting to understand that you can't just exhort people to be more careful, that you have to set up a whole system of things that works together to ensure that the right thing happens, not just diligence.
COX: We talked you write about, we have been talking about with Mr. Reilly before you, and you have written about the culture of safety and how some approach it and some don't. And you just gave us a brief description of the difference between aerospace, for example, and other industries.
How big a factor, how important a factor, Jim, is this culture that leads to the kinds of disasters that we saw in the Gulf and in other places?
Mr. BAGIAN: Well, I think culture certainly is important. I think as you hear Mr. Reilly speak just a few moments ago, part of it is the ability and the tendency, desire for people who work in a given area to be able to speak up. To stop the line, is sort of what he was just talking about, that people, anybody can say this doesn't look right, and it has a mechanism by which they can make that communication available up the chain of command.
Now, I'd hasten to add that no matter how good you make that, when you look retrospectively, when there's been a bad event, you can always say we could have communicated better, and that's always true.
I think things are relative and you have to see, do people generally speak up when they have a concern that something's wrong, not that they're sure, but they are concerned and then if they turn out to raise that concern, and it turns out nothing was, there wasn't a problem, do they feel ridiculed, embarrassed, humiliated, or do people say boy, I'm glad you brought that up? It wasn't anything this time, but I'd rather be safe than sorry.
And I think that's the thing that you always try to strike that happy balance, and I think you never do. You have to constantly be diligent and vigilant as far as management goes, to invite people to speak and to be able to do so without feeling not just punishment but being embarrassed and humiliated.
COX: Whenever there is a disaster, and there have been too many to enumerate, what tends to happen, and it happened again here with BP, was that management gets demonized because it appears as if they were they fell asleep at the wheel or even worse, were complacent or not caring about the, you know, the potential disastrous effects of their activity.
My question to you is: Do you find in your work that that is a legitimate reason and an explanation for why things go wrong, that management just doesn't give a damn, so to speak?
Mr. BAGIAN: I think that's seldom ever the case. As you can just witness - and, you know, I'm not an expert on what's happening with the Gulf oil spill, but you can be sure management wishes they weren't in the position they are today. It's not good from a business perspective, it doesn't help their profitability, it doesn't help their shareholders, and certainly it doesn't help management themselves. As we've seen, there's been a change in management at BP.
So if they could have the Aladdin's magic lamp and make a wish, I'm sure they would wish it didn't happen at all. So I think that's it's an easy thing for people to say, but just whether it's management at an oil company, whether it's a physician or a management of a hospital, they don't want to be a bad thing to happen.
I mean, that would be very, very rare. And it's easy to draw that conclusion, I think, erroneously so, when you're look at after a bad event occurs, people say, oh, something bad happened, somebody must be to blame. And it's seldom ever that simple.
And in fact, a comment that Mr. Reilly made, where he said we're looking for the root cause, I would hasten to add there isn't a root cause. It's a bad term. There are many causes and contributing factors, and to say that there's just one, I would doubt you could ever show an event that there was just one cause. There might be one principal cause, but there are many that, you know, that contribute to in sum total end up with a bad event. And you have to look at the myriad of things that contribute to a bad event.
COX: We're going to be joined shortly by another person to add another perspective to the discussion. And before we go to break, I want to ask you very quickly about the role that punishment plays in risk assessment and in leading up to these kinds of disasters. Is that a big factor, punishment?
Mr. BAGIAN: Well, I think the fear of punishment is. I think in general, what you're trying to do is prevent a bad event from happening. What has happened can't be undone, but you want to say, how do we get people to feel free to speak up when there's been a problem that they don't think they did anything heinous that they'll be treated fairly?
So you have to make criteria that are clear to people that if they cross this line, they could be punished. And we've done that in health care to make that clear to people, but that way, people can feel safe to speak up about other things.
COX: Speaking up is really a very important point, and we are going to talk about that more with Beverly Sauer, who is a consultant in strategic risk communication.
We're talking about risk management and preventing disasters. What measures does your company have in place to try and ward off accidents? Give us a call, 800-989-8255. I'm Tony Cox. It's TALK OF THE NATION from NPR News.
(Soundbite of music)
COX: This is TALK OF THE NATION. I'm Tony Cox, in Washington.
A major plane crash, a bridge collapse or an oil spill, man-made disasters often have several things in common: arrogance, bad luck and complacency among managers.
Today, we talk about the steps that can be taken to curb the risk-taking environment that leads to these kinds of tragedies. Our guests are James Bagian, a former NASA astronaut who participated in the investigations of Challenger and Columbia shuttle disasters. And in just a few moments, we'll be joined by Beverly Sauer, a consultant in strategic risk communication, who served as a member of West Virginia's Special Commission on the Sago Mine disaster back in 2006.
If you'd like to join the conversation, tell us: what systems does your company have in place to try to prevent disasters in the workplace? Tell us your story. Our number here in Washington, 800-989-8255; the email address, firstname.lastname@example.org. And to join the conversation, just go to the website npr.org, and click on TALK OF THE NATION.
Let's take a phone call first. This is David(ph) from Salt Lake City, Utah. David, welcome to TALK OF THE NATION.
DAVID (Caller): Yeah, hello, thank you for taking my call.
COX: You're welcome.
DAVID: I have - I've been in the field of disaster management and relief for 12, 13 years now, and one thing that I would say is that whenever there's a major setback, such as, you know, what we're seeing in the Gulf, developments follow very quickly. For instance - I believe it was after Three Mile Island - there was the creation of what's called the joint information center, which allows the stakeholders involved in the response to streamline their messaging to the public and also coordinate their messages internally.
We've also seen, you know, the adoption of what's called the incident command system, which now goes from the federal level down to your township level, so that responders can coordinate better.
And what I would think is that one of the outcomes we might see from what's happening here, is better coordination with the private-sector stakeholders. Because even 10 years ago - I won't say it was an afterthought, but we didn't see that as much. And now I think that may bring that special coordination to the fore.
And I'll go ahead and take my response off the air.
COX: Thank you very much, David, for that, which is a good question for me to put to our next guest, who is joining us here in Studio 3A, Beverly Sauer. Beverly Sauer, as we said, is a consultant and expert in strategic risk communication; served on West Virginia's Special Commission on the Sago Mine disaster back in 2006; also the author of "The Rhetoric of Risk: Technical Documentation in Hazardous Environments." Nice to have you here.
Ms. BEVERLY SAUER (Consultant in Strategic Risk Communication; Author, "The Rhetoric of Risk: Technical Documentation in Hazardous Environments"): Thank you, Tony.
COX: So communication is really big. Jim made reference to it, so did the caller make reference to intercommunication between private and the government sector. How big a factor is, Beverly, communication in terms of preventing these disasters?
Ms. SAUER: Well, Tony, without making communication the answer to every question, which is dangerous in itself, because technology is really important: you have to have communication within an organization so that the people who are on the ground - and that could be inside a coal mine, it could be on the rig - have the ability to communicate what they know to management, because they have a very special knowledge of it. It's like driving a car. You know your car. You know how it performs. And you have to trust those people to make really good decisions.
At the same time, you need a big-picture overview, so that people who are focused - in the BP disaster, for example, on the slurry and the mud and making changes - are not so focused on what they are doing that they stop communicating with other people who know the effects on the big system.
So it's a communication top-down, communicating that safety is important; and it's communication from the bottom up, what's happening on the ground, and how do we deal with it.
COX: Before we get to our next caller - who is going to be Ken(ph), by the way, so Ken, if you can hear me, just hold on, I'm coming to you in a second - Jim, I want to follow up on this idea of communicating between levels within a company, and whether or not the people at the bottom rung are heard, or whether the decision-makers at the top rung even care to hear them.
Mr. BAGIAN: Well, I think generally, they want to hear, but you can look at many studies that are done - we've done them, others have done them - where you ask people how good communication is up and down the hierarchy.
The higher you are in an organization, the managers tend to think communication's great, but when you go to the bowels of the ship, so to speak, they don't feel it's quite so good.
So it's generally the people at the top who think everything is great - I wish people would talk to me - while the people at the bottom have disincentives, obstacles, may feel intimidated, you know, a whole bunch of things that can cause them not to communicate. And that's where management, leadership has to constantly offer avenues, techniques to break down those barriers, because you might not hear the things you need from the coal face.
COX: Let's hold on, Beverly. I'm going to let you respond to that, but I want to get Ken in here, because we want to hear what the listeners have to say, as well. Ken, joining us from Alford, Florida. Is that right, Alford or Alfred?
KEN (Caller): Alford.
COX: Alford. Welcome to TALK OF THE NATION.
KEN: Yes, and thank you. I've been in health care, I'm an RN, and a lot of problems that face nurses every day are medication errors. And they can be catastrophic. The way we deal with them is not to lambaste the person who made the error, but there's a critical point here, and that is that 99.9 percent of the time, the nurse really tried not to make that error. And I think that's one of the things that's missing in corporate America today, is that lack of regard for the humans or the resources that are involved.
COX: Thank you very much, Ken, for that call. Beverly, you wanted to make a comment?
Ms. SAUER: Well, I want to say that a lot of the communication that occurs is simply different, and we have to understand that we're not always dealing with the same thing.
Engineers, who are sometimes at the top levels of management, very frequently talk in very complex ways, and they feel that if they're criticized for their communication, they're not trusted for their engineering.
So we need to create a culture where engineers are actually trained to communicate to others outside of their areas of expertise. And this was particularly critical in the blowout preventer, because there are very few engineers that I know or have talked to who actually know how it works. And then if you look at the New York Times articles on the financial issues, people will say, well, the blowout preventer didn't prevent the fire, the blowout preventer didn't prevent this - and very few people actually know what it was intended to do.
So you take that to the Minerals Management Service: they assume that if it's a blowout preventer, it prevents blowouts. And there are not enough questions asked and not enough communication about what's really going on in some of these complex technologies.
COX: Which brings us back to the beginning, communication, once again.
Ms. SAUER: Communication.
COX: Scott(ph) is joining us from Eagle River, Arkansas. Scott, welcome - or no, Alaska. Eagle River, Alaska. Welcome.
SCOTT (Caller): Hi, my name is Scott, and I've got two incidents to relate where there were problems in one spill and one potential spill, because management didn't want to listen to engineering.
In the mid-'90s, there was a (technical difficulties)...
COX: Scott, unfortunately, you are breaking up. Let me ask you to get to another phone, call us back, and we'll try to get you back on the air. Jim, you were an astronaut. Did you feel that people weren't listening to you? Have you heard something when you were, you know, about to go up into space - or you saw something, or you smelled something - that had not been in the book, so to speak?
Mr. BAGIAN: No, I think generally, you felt that when you fed back concerns, they were looked at. I won't say everybody always agreed with the answers, but as Beverly said, it's how you communicate what level of risk you're going to take.
And I think that's important to point out. There's always risk. There's never zero risk. There's nothing that is safe. That is, it's safe as compared to what? And I think sometimes when people are sloppy and talk about these things, that gets lost, because if you don't define what safe means, one person has the idea - say with the shuttle - oh, it's like flying a commercial airliner to Disney World. Well, it isn't like that at all. It's far more dangerous than that.
So one person's view of safe is an airliner, and an astronaut's is it's like getting shot out of a cannon. So it's a quite different view of the world.
COX: Let's talk about something else that's related to safety in the workplace, and that's cost. Cost is always an issue, and here is an email we got. As a recent civil engineering graduate, it seems to me that there is an unwillingness to pay for the cost of designing and maintaining the structures that we rely on.
As we begin more complex designs, the cost will increase if they are to be done safely. True or not true, Beverly?
Ms. SAUER: It's absolutely true. And one of the ironies of all of this is that if you look at the Columbia disaster, NASA estimated that it cost about $6.4 billion to do the cleanup and investigation. There were figures that came out this morning that BP is about $6.1 billion. So it seems to be the cost of a disaster.
The cost of good communication and preparation is much smaller than that and needs to be taken up front. And when you look at some of these companies - the Upper Big Branch mining disaster - there were over 1,000 violations, which were seen as simply the cost of doing business in mining engineering.
Among those, the first three were combustible gases, poor ventilation and arcing electrical wires. And I've often said, if you got on an airplane and you knew that there were arcing electrical wires, combustible gases and poor ventilation, you wouldn't get on. But you expect miners to go to work in conditions that are disastrous to begin with - fundamentally disastrous.
COX: Do you find that to be the issue for you in the places where you have studied and worked, Jim, that cost is a major factor with regard to the potential for safety violations, and ultimately, for disaster?
Mr. BAGIAN: I think you'll often hear that said. I'm not sure how true it is. Cost certainly is a factor. It must be considered, because it's not safety at any cost. It's safety at the appropriate cost. And you have to look at what the cost is to the organization.
If you spend money to make one thing safer, but then you don't have the ability, say, to buy new diagnostic or therapeutic equipment, there's a real human cost to that, as well. And you have to weigh those and decide which is the best for the population you serve, in general. And I think that's true whether you run an airline, whether you're trying to fly the Space Shuttle or drill a well.
And I think: how do you communicate that? How do you talk about what risk is acceptable? And you'll see NASA's had that problem, and so have others: before the disaster occurs, how well do you communicate to your stakeholders what the true risk of a disaster is and how much you're willing to spend to buy that risk down? And I think that often doesn't occur, in many organizations, until after the disaster occurs. And that's far too late.
COX: You know, we've been talking about space travel. We've been talking about BP. We've talked about the mine disasters. But there's another area where cost and safety are, you know, really close to all of us, and that's the food that we eat. Let me ask Phil from Sparta, New Jersey to join us. Phil, welcome. You're on TALK OF THE NATION.
PHIL (Caller): Thank you very much. I appreciate it. Yeah, you know, the food industry has had a program in place for 40 or 50 years. It was developed by Pillsbury for their food for the space flight program. And, basically, what it does is identifies the critical hazards and then identifies control points that control those hazards. So that way you don't waste your resources on the unimportant stuff. You focus on the important stuff.
COX: Does it work?
PHIL: It works, but people are going to say, why is all the - why do we have all these problems in the food industry?
COX: Well, I was going to ask you that. Yes. Well, what's the answer?
PHIL: It doesn't work. Oh, I'm sorry.
COX: I was going to ask you if it's working, then why do we have all of these problems in the food industry, then?
PHIL: In my 35 years' experience in the food industry, when it didn't work, it was because the resources were not given upfront to properly develop the program, or the program wasn't followed and the resources weren't given on a long-term basis to continue the program. The program itself is very rigorous. It's human error in choosing not to follow the program.
COX: All right. Phil, thank you very much for that. Once again, Beverly and Jim, back to the idea of the cost. Here's another caller. This is Mark from Grangeville, Idaho. Mark, welcome to TALK OF THE NATION.
MARK (Caller): Hi, yeah. I work in the telecommunications industry. I mean, we have to spend a significant amount of money to be prepared to respond to disasters: major power outages, you know, earthquakes, all kinds of stuff we have to be prepared to respond to. And it comes down to resources. You have to have the money available from the start, from design, from engineering, from construction, and then in maintenance.
I mean - and in my opinion, in most of these cases, it truly comes down to greed. People want to put the money in their pockets instead of investing it on safety and anti-disaster and disaster preparedness. And to say that safety at any cost isn't worth pursuing, I would vehemently disagree. Safety at any cost is what is necessary, and if you can't justify the safety expense, then we don't need to be doing that in the first place.
COX: Thank you very much for the call, Mark.
You're listening to TALK OF THE NATION, from NPR News.
We've got callers lined up. They want to talk. Let's get them in, then I'll come back to both you, Jim and Beverly, for your responses to what we are hearing. Andy from Blountville, Tennessee - welcome to TALK OF THE NATION.
ANDY (Caller): Yes, sir. Thank you. I think the point I'm going to make is probably the one the man from Alaska that you lost was going to say. Management in modern American business does not want bad news. And technical experts who are brave enough to raise alarms are perceived as not team players, and economics trumps everything.
COX: Andy, thank you for that. You know, let's go back to this communication thing, Beverly. Andy was suggesting, as others have, that the folks at the top don't want to hear what the people at the bottom are saying, and certainly not if it has a price tag attached to it.
Ms. SAUER: That's very frequently the case - not always the case, but very frequently. And if you look at the BP emails, when - you see a moment when one of the team members actually says: I don't want to be a bad team member, but I want to make a point. So there is this discussion going on.
But one of the things that happens with emails is that they're not terribly well organized, and so we don't know what the final version of events is. We see discussions happening, people making decisions on the fly, sometimes. And these aren't saved in accessible ways. When there's a disaster like the Columbia disaster - when people are in the air, we need to have information, and things are happening very quickly - we have emails spread all over the world, sometimes inaccessible on laptops. So...
COX: Jim, so that we don't end this conversation on a sour note - we've been, you know, bitching about the boss for the last 20 minutes or so, and probably rightfully so in some cases. And yet you have been researching this, working in this area, studying in this area. Are there no positive steps we can implement now, something that we can look forward to to try to prevent these kinds of disasters?
Mr. BAGIAN: Well, I think there's always risk, but I think there are a number of things. Take your caller who talked about the food industry, for instance. I believe he was talking about HACCP, which was developed for food. It's very much like the failure mode and effect analysis that the engineering world uses, and that we developed in health care. And there are tools out there that help people understand how to methodically look for the flaws or vulnerabilities in their system and then prioritize them for action - which one gives you the best utility to deal with first, et cetera. Those can be used. Many industries use them. Some don't.
I think the issue is about communication: how do you communicate, and what is the system for communicating concerns so that they don't get lost in emails that are not well organized or generally available? Do you have a separate pair of fresh eyes, not emotionally attached to what's going on, who can act as an ombudsman, in a way? There are systems like that, and many good organizations have just those. Are any of them foolproof? Absolutely not.
And I would add the one comment about, you know, if you can't do it safely, you shouldn't do it. The question is: What is safely? Nothing that we do is without risk. There are airliners that do crash. There are trains that do crash. And if we say there can never, ever be an incident, that means we will do nothing. We're not omniscient.
So I think we have to realize what those risks are, and it's always the one incident we seize on - here's a case, as you mentioned, the food problems. The percentage of food that is served or delivered that is defective or hurts a consumer is infinitesimally small, and you can't lose sight of that. You have to look at the denominators - what percent really cause problems - to understand that risk.
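As a rough illustration of the failure mode and effect analysis (FMEA) Bagian mentions: each failure mode is scored for severity, occurrence and detectability, and the product of the three scores - the risk priority number, or RPN - ranks which vulnerability to address first. The failure modes and scores below are hypothetical, loosely echoing the mine hazards Sauer listed, not figures from any real analysis.

```python
# Illustrative FMEA-style prioritization sketch. All failure modes and
# scores here are hypothetical examples, not real engineering data.
from dataclasses import dataclass

@dataclass
class FailureMode:
    name: str
    severity: int    # 1 (minor) .. 10 (catastrophic)
    occurrence: int  # 1 (rare)  .. 10 (frequent)
    detection: int   # 1 (easily detected) .. 10 (nearly undetectable)

    @property
    def rpn(self) -> int:
        # Risk Priority Number: the product of the three scores.
        return self.severity * self.occurrence * self.detection

def prioritize(modes):
    """Return failure modes sorted by descending RPN (worst first)."""
    return sorted(modes, key=lambda m: m.rpn, reverse=True)

modes = [
    FailureMode("arcing electrical wires", severity=9, occurrence=4, detection=3),
    FailureMode("combustible gas buildup", severity=10, occurrence=5, detection=6),
    FailureMode("blocked ventilation", severity=7, occurrence=6, detection=4),
]

for m in prioritize(modes):
    print(m.name, m.rpn)
```

Sorting by RPN captures the prioritization Bagian describes: spend first on the vulnerabilities with the worst combination of impact, likelihood and invisibility, rather than spreading resources evenly.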
COX: You know, it's been an interesting conversation, has it not, Beverly?
Ms. SAUER: Yes, it has.
COX: And our time is up, unfortunately. And the only question I can ask you and get a really quick answer to is: I'm assuming that we must continue to communicate.
Ms. SAUER: We must continue to communicate. And management, there are good stories of managers who make a point that they are the leaders. They're leaders in safety, and they're leaders in a safety culture.
COX: Beverly Sauer is the consultant in strategic risk communication. She joined me here in Studio 3A today. James Bagian is a former NASA astronaut who has helped investigate the Challenger and Columbia shuttle disasters. He joined us from member station WUOM in Ann Arbor, Michigan. Thank you both.
This is TALK OF THE NATION, from NPR News.
NPR transcripts are created on a rush deadline by Verb8tm, Inc., an NPR contractor, and produced using a proprietary transcription process developed with NPR. This text may not be in its final form and may be updated or revised in the future. Accuracy and availability may vary. The authoritative record of NPR’s programming is the audio record.