Rockets, Radios, and Risk: How NASA Manages Uncertainty in Orbit

This is a podcast episode titled Rockets, Radios, and Risk: How NASA Manages Uncertainty in Orbit. The summary for this episode is: Few careers involve managing as much risk as one where you're responsible for launching humans riding gigantic rockets into outer space. That's exactly what Barrios Technology Chief Strategy Officer Ginger Kerrick did during her three-decade career working for NASA.

On this episode of GRC & Me, Ginger joins LogicGate's Chris Clarke to discuss methods for developing methodical, standardized thought processes for risk decision-making in high-stakes scenarios, how NASA employees are trained to separate logic from emotion, how disasters can inform future mitigation planning, and why the most important part of managing risk is having the right leaders in place.

Chris Clarke: Hi, welcome to GRC & Me, a podcast where we interview governance, risk and compliance thought leaders on hot topics, industry-specific challenges and trends to learn about their methods, solutions, and outlook in the space, and hopefully have a little fun doing it. I'm your host, Chris Clarke. With me today is Ginger Kerrick, the Chief Strategy Officer at Barrios Technology, a woman-owned and operated small business that supports the aerospace community to advance humanity on and off planet. Prior to Barrios, Ginger worked for the National Aeronautics and Space Administration, or NASA as it's sometimes referred to, where she spent over 29 years in roles such as Flight Director, Deputy Director of the Exploration, Integration and Science Directorate, and Johnson Space Center Assistant Director for Vision and Strategy. Ginger, could you tell us a little bit more about yourself, like what's your journey been with NASA?

Ginger Kerrick: Oh, sure. Happy to do that. And thanks for having me on today. I always wanted to work for NASA. When I was five years old I decided I wanted to work for NASA. And instead of my mom thinking I was crazy, she's like, "All right, what's your plan?" And I didn't have one. Luckily, she did, which was study math and science and pay attention to your parents and do well in school and go to college. And so, sure enough, I did all those things and I was able to achieve my dream of working for NASA. I started there in 1991 and retired from there in 2021. So I had an awesome 30-year career and adventures at every turn.

Chris Clarke: I imagine NASA does a lot of things. What were some of the cool aspects of that that you worked on?

Ginger Kerrick: Oh, gosh, I had so many. One of the big ones is I got to live and work in Russia. I never thought I would go there, but I got sent there to support the very first crew that was going to fly onboard the International Space Station. So we've had the space station up there flying since 1998, but I started with NASA shortly before that. So it was really cool to be part of something that you're establishing from the ground up. But living in Russia, I didn't know Russian. I go out there, I know the alphabet and I realized I'm in trouble pretty fast. But I made friends and I learned technical Russian first because I was in classes with the crew, writing procedures, that kind of thing. And then we made friends, so then I started learning conversational Russian at home with those guys. So that was really great. Another exciting adventure was working in Mission Control. I was the first person that NASA ever allowed that was not an astronaut to sit at the console that's called CAPCOM, which is short for capsule communicator. It's the person that talks to the crew. So when they say, "Houston, we have a problem," it's the person that answers and says, "Hey, yo, what's your problem?" But a little bit more professionally than that. So I did that job for four years and it was awesome. And then I got to be a flight director, the person in charge of Mission Control. So those were three of my highlights in my career, but inside of each one of those are different things that I'm happy to have participated in. It was a wonderful career.

Chris Clarke: I'm actually genuinely speechless because that's something that you see in media, but it's incredible to talk to someone who's done that and seen that. And I mean, you mentioned at the beginning how you didn't have a plan for how to get there, but your mom did. And clearly you spent 30 years with NASA. What piece of career advice would you give to someone who's looking for their path?

Ginger Kerrick: Well, I think two things. One, map out a plan for it and be prepared for that plan to divert course from the original path that you laid out for yourself. But two, don't be afraid to try something new. I've come across a lot of people in my career that, "Oh no, I'm not going to apply for that job because I'm not ready. I don't know everything." Well, guess what? No one does, they just don't say it out loud. And so be bold and think about the last time that you started something new when you didn't know everything and how it turned out okay, and it's going to turn out okay again as long as you put forth the effort and put a plan together.

Chris Clarke: I love that. I appreciate that. I mean, I love trying new things. I think some people worry that the risk is too high when it comes to their careers and their... to really take those chances in a lot of ways.

Ginger Kerrick: Yeah, I think a lot of people shortchange themselves.

Chris Clarke: I do want to come back to this concept of risk in the career, but before we jump in there, I always like to start a little bit with risk management in real life and thinking about, we often talk about it in the terms of professional space, but we also make risk decisions every day. One, the example that I would give is I'm a big snacker. My wife and I love snacks and we love sweets. So anytime we go into a store where they have those around, we're like, "Oh, well, why not just get this bag of chips or this thing of candy?" And H-E-B, our Texas grocery store, they are awesome in that they do pick up. Well, one, they're awesome in the things that they have, and that's why we can't go into the store, so they'll do pick up. So one of the ways my wife and I avoid this risk of being snackers is we will only do store pickup from there. We will not go in.

Ginger Kerrick: Oh, well-played.

Chris Clarke: We only order online. And that's how, I would say, we avoid the snack risk, by only ordering online the things that we know are what we want. And so that's my risk in real life example. I don't know if you have something like that, Ginger?

Ginger Kerrick: Yeah, we all experience it every day. I mean, I wake up, my alarm goes off and I hit the snooze button and I'm thinking, okay, can I continue to hit this button and still walk the dog, eat breakfast and make it on time for work? That's risk management. Or, oh, definitely at the gym. I go to Orangetheory and they were trying to get me to do some exercise yesterday, and I'm like, "No. You want to know why? Because I have a SLAP tear and I know that if I do cleans or overhead presses, I run the risk of aggravating my SLAP tear. So no, I'm going to do hammer curls. Thank you very much. And I'll be happier. You'll be happier because I'll continue to keep coming here." So we make these basic decisions every day, but people don't realize that they are doing risk management, risk assessment, risk mitigation.

Chris Clarke: I do love that example. And it's interesting because you probably didn't go into Orangetheory thinking, I'm not going to do clean presses today, right?

Ginger Kerrick: Yeah, I'm going to do whatever they throw at me today.

Chris Clarke: Right, exactly. And I think this kind of goes to tie together a little bit with what you talked about as a flight director and in Mission Control, of you have to make these risk decisions in real time. So I'd be interested in how do you approach real-time risk trade-off decisions, and how do you think about making those decisions when you don't have time to prepare?

Ginger Kerrick: Yeah. So I'll start in Mission Control because that's an easier example for me. So you prepare yourself... You can never prepare yourself for everything you're going to see in Mission Control or in life, but you can prepare yourself for certain things, and the thought process that you use to do that will help you deal with events that are unexpected. So at NASA, we knew that we had three primary goals whenever we're sitting there in Mission Control: protect the crew, protect the vehicle, their spacecraft, and be able to execute the mission. So those are the three things we keep in the back of our head and we judge our risks associated with all of that. Now, in human space flight, the urgency factor is a little different than most office jobs, but not unlike what our military or first responders or healthcare professionals face, because they are faced with these life and death scenarios. And knowing that, it makes it easier for you to make a decision. And so say something will happen, something fails. My first thing is, is the crew safe? Okay, all right. I don't need to take any immediate action to save the crew. Is the vehicle safe? Yes. All right. Are we going to be able to execute today's mission? Yeah. Tomorrow's mission? No. Okay. Let's dig into that and figure out what can we do to make sure we can execute tomorrow's mission. But it's a methodical, non-emotionally based approach to assessing risk. And you have to be able to take those natural human emotions that you're feeling when you're faced with a situation like that and set them aside because they're not conducive to the logical thought process that you need to execute to arrive at a solution.

Chris Clarke: Are there, I don't want to say training, or what is the development that you go through to start to learn that methodology and almost to take emotion out of the decision-making process?

Ginger Kerrick: We have great training at work. And when I worked in Mission Control, I worked for an organization called the Flight Operations Directorate, and that had our astronauts, our people that work in Mission Control and the people that train both of those to do those jobs. And when we get new employees in, we actually send them to something called bootcamp. And we take their brains and their concept of the world that they knew and we remold them into almost, I don't want to diminish what our military folks do by saying it's military style, but the approach is very similar in that there's a reprogramming that occurs and now when you look at the world, you will think of the crew, the vehicle, and the mission. And we put them in scenarios where if they do have an emotional response and they can't complete their mission, they'll be physically removed. They know that you aren't allowed to do that. And so we train them, we keep giving them scenarios so that it almost desensitizes them from the emotional aspect because they realize they can't do their job if they're spiking emotionally. So we'll have simulations. We review anything that we think can possibly fail on the vehicle, and we document that: if this fails, we're going to run this procedure. If this fails, we're going to refer to this flight rule that will guide us on what to do next. So you have policies, procedures, and flight rules that govern your behavior and kind of help put a non-emotional framework, a data-based framework, on your response. And that training of the brain helps you when something really bad happens that is not documented, that you've never seen before, and you just remember, okay, that's startling. I don't have time to deal with that right now. I'm going to go through a logic-based approach to figure out what to do next and how to best deal with the risk or the manifestation of a risk.

Chris Clarke: I mean, that's such an interesting... Do you have folks doing that documentation in real time? Are they reviewing that documentation? Is that part of that reprogramming inaudible?

Ginger Kerrick: For real time, so say for example, we're going to go do a spacewalk. The team that's executing that spacewalk will go study the procedures months beforehand, conduct simulations months beforehand, before they go on console for the actual event. I remember going at home reading all my flight rules. Okay, if this happens, I'm going to do this. If this happens, I'm going to do this. And so everybody refreshes it when you go in there, and then say the spacewalk goes off without a problem, it's almost like, well, okay, all right. That was good. But I've had spacewalks where I've lost communication with my crew member. My crew member had a bag float away containing tools that we needed to do the next three spacewalks. So how do you go about replanning that? I had another time where their carbon dioxide levels were building up in the suit. Other flight directors have had water build up in the helmet where we almost had a crew member drown. That one we'd never planned for because that never happens. But the way the team dealt with it, it was because they were trained to handle other scenarios, life-threatening scenarios. But the more you practice, the better you get.

Chris Clarke: I mean, that's incredible. We talk about that a lot of your unknown unknowns, the things that you can't plan for. How do you go about the process of trying to identify everything that can go wrong? And do you just accept that there will always be some level of inaudible?

Ginger Kerrick: We do, but based on our history with the space shuttle and the space station, we're pretty good at it. Say you have a given box, a power box, all right, how many levels of redundancy are in this power box? Well, if this thing goes down, it loses power completely, but if this part goes down, it'll just shift over to this other channel and still... So we walk through all of that for each individual box. And then when you're conducting an activity that requires power to go from that box to this box to this box, then we'll say, all right, well, are there common computers that power and control all these boxes? Yes. Okay, what if that computer goes down and takes out that power? So it's kind of stacked in a way. But we go through all of that in the design phase of the vehicle. So it's not that we wait until the vehicle is designed to go and do this, because you're going to make design decisions based on how you think you're going to operate the vehicle to buy down risk. So we do that well, we've done that with a lot of our spacecraft. And then I find that I carry it over into my personal world too, where I want to have a dissimilar redundancy, we call it. I don't want to rely on two boxes of one thing. I want to have one box of this thing and another box that's a totally separate design, because what if this fails?

Chris Clarke: I like that term, dissimilar redundancy. That feels like a very strong risk management strategy. You had mentioned in a previous conversation this idea of next worst failure.

Ginger Kerrick: Oh, yeah.

Chris Clarke: I mean, what is that concept and how does it play into building dissimilar redundancies? Are those related?

Ginger Kerrick: It is related, yes. So say for example, we're planning a mission, and I say, we always ask ourselves, what if this box fails? So we address what we're going to do. Now if you're in that scenario where that box is down and you're limping along, what's the next worst thing that could happen? Well, then this box could go down, because if this box went down because this computer was malfunctioning, that computer takes out this thing too. Okay, so now I have this box and that box and I can still keep the crew safe, keep the vehicle intact, and execute the mission. Great. So now what's the next worst thing that could happen? And you go down this path until you've reached what you think is the point of absurdity, because what are the odds that five things in a row are going to happen, or four things in a row, and you kind of just get a feel for that. But I think it's a good way of looking at any situation. Say you take on a new job and, well, I don't want to do it because I don't know how. Try it out. What's the worst thing that could happen? Well, I could not be good at it. Okay, and if I'm not good at it, what's the next worst thing? Well, I could get moved to a different job or I could get fired. Oh, okay. You just walk yourself through it. So that's what it's all about.

Chris Clarke: I mean, it feels like such a... I think there's this concept in continuous improvement of the five whys analysis, where you just kind of ask that same question. And I love this approach of that's the risk management approach. Like, well, what's the next worst thing that can happen? And almost, to your point, it gets you all the way through. And even as you answer each of those, you can almost come up with mitigations around it. Like, well, inaudible example inaudible, well, what if I'm not good at it? Well, I can take training, I can-

Ginger Kerrick: Go find people that are good at it and make friends.

Chris Clarke: Exactly. Yeah. So I love that methodology in a way. So thank you. You talked a lot about these next worst failures, but what are some risks that you think about in Mission Control or your astronauts think about in space that us people on earth would never think about?

Ginger Kerrick: Well, let's see. I'll start with I guess something more fun. Just the astronauts, things that... Here, you need to go to the bathroom, you go to the bathroom. And the chances of you missing are pretty slim because there's gravity, you go to the same place every time. If you're up in space and you have to go to the bathroom, it is totally different. So there is risk associated with just using the toilet up there, that you could misfire, and now it is floating away and you've got this big turd bomb to pick up and clean up after. So little things like that, going to the bathroom in space is totally different. Or if you're not okay in the morning until you have that cup of coffee, remembering that it takes 20 minutes for the water to heat up in the heater onboard the space station, and so you've got to hit that bad boy the moment you wake up so you're not going to be cranky during the first conversation you have with Mission Control. So there's little day-to-day risks that they think of, but of course, human space flight has risks. They're in a vehicle that's roughly 230 miles above Earth and would take some time to get back in an emergency. We worry about space debris. There's all this junk that's out there that, if it penetrated, the hole could cause us to lose gas in the space station and, quite frankly, kill the crew. Spacewalks, it looks so much fun when the astronauts are in their suits and playing around outside. But that is one of the most dangerous things that we do, because that is a mini spacecraft, and if anything hits that or if they lose cooling or, apparently, if the cooling system malfunctions and has water coming into the helmet, we can drown a crew member, now that we've seen that.
But there's all these things that, because you're doing it in zero gravity and you're doing it in a hostile environment that sees temperature variations between 200 Fahrenheit and -200 Fahrenheit, it's just so different than going to work every day or going to the bathroom every day.

Chris Clarke: First question is, is turd bomb the official NASA terminology?

Ginger Kerrick: I was like, what word could I use that's a inaudible? But when you think about it, that's what it looks like if you have an accident in space because it just floats and it goes everywhere.

Chris Clarke: I can imagine. Well, I hopefully will only ever imagine that. I guess going to the part about space debris because that feels like something that you could never really truly control?

Ginger Kerrick: No, you can't.

Chris Clarke: Do you map those out where the risk areas are? How do you almost address and mitigate that risk?

Ginger Kerrick: Yeah, there is a special branch of the government that monitors space debris, not NASA, but NASA is one of the customers of that. So they monitor everything that's out there and they know the orbit of our spacecraft. And so if there is something that our spacecraft is going to encounter that this branch of the government is able to track, they will give us an alert and say, "Hey, 72 hours from now, this thing is going to be close to your orbit." And we will analyze and analyze and then determine whether or not... sometimes we need to adjust the orbit of the space station, so we have to reboost so that we can miss it. Other times, where the tracking is not as good on those things, we don't get a lot of advance notice. And so all we do is call up to the crew and say, get in your escape vehicle now and shut the hatch. We will tell you when you can come out. And those are scary days. I've only had that happen twice in my flight director career. I don't know if it's happened since I left. But we do keep track of it, but there's a lot of junk out there, and I wish somebody would come up with a technology, a big old Hoover vacuum cleaner or something, to go suck all the debris out of there, suck all the debris out of the ocean and clean up our environments. But so far that has not been developed.

Chris Clarke: Wow. And is the escape pod, is that more protected? Is that why, or is it just an in-case kind of thing?

Ginger Kerrick: Yeah. If it is going to hit, say you get a significant event that hits, you may not have time. Or it may hit a module that is on your way to your spacecraft. So it's better to take your chances in the spacecraft itself and just come home if the vehicle gets hit. So it's all about, again, risk mitigation: what's the size of the particle I'm tracking? Where could it impact? How can I buy down the risk by directing the crew members to go into their vehicle? And what risk do I accept if that debris hits that vehicle? And when I weigh that against it, yeah, okay, it's better for them to be in the vehicle.

Chris Clarke: Yeah. Wow. I mean, I love that you just walked me through the methodology you talked about earlier.

Ginger Kerrick: Yes. Yeah, I lived it every day.

Chris Clarke: Interestingly, we've kind of talked a lot about, in a way, physical risk. When you talk about protect the crew, protect the craft, those both feel very physical risks. I think a lot of folks in our industry now are focused on cyber and technological risks comparatively. Does NASA put more emphasis on one versus the other? How do you think about almost the dynamic of those two types of risks?

Ginger Kerrick: We never put more emphasis on one or the other, but we did have very different teams protecting us from those risks. So you have the physical risk in space, you have the physical risk at the Space Center. So we have security guards, we have a gated community. You have to be badged to get in. But we also do have a group that looks out for cybersecurity threats. I can't dive too much in detail about that, but bottom line is the Space Station's a target, Mission Control is a target, and there are bad actors out there every day trying to get into the system. So we have certain safeguards in place, both for Mission Control and for the Space Station. And as a flight director with the clearance that we had, we would get alerts, we would get read into certain scenarios that we need to be cognizant of in the event that they are successful in doing something. So we can lead the team through that. But the majority of the people that work in Mission Control and even our astronauts aren't read into a number of those scenarios.

Chris Clarke: That's a-

Ginger Kerrick: Your face.

Chris Clarke: Yeah. Mentally, I almost can't process that. The breadth of that kind of approach to things in a way. Yeah, it's almost-

Ginger Kerrick: You have to think about all of that these days. Again, what's the worst thing that could happen to me if... The way I designed my network for Mission Control, where are my vulnerabilities? How can I shore those up and reduce the risk that I could sustain a cyber attack? If I sustain it, how quickly can I identify it and isolate it? And how quickly can I recover from it, and do I have a backup system? There's all these things at play.

Chris Clarke: Just thinking between that, but then also the concept, like the juxtaposition of that approach with also I need to mitigate debris in space. It's such an interesting, almost opposite approach to that, which is just fascinating. I'd be interested in talking a little bit about organizational culture and leadership at NASA in general. And I mean, you were with NASA for a long time and I'm sure you saw the organization go through a lot of change. I'd be interested in, what was that experience like? What advice do you have for people trying to navigate change like that?

Ginger Kerrick: Yeah, no, happy to. So one of my favorite books on change management is Leading Change by Kotter. I have read that after every, what I determined to be a major significant event at NASA that has changed our culture. And it speaks to me in different ways every time that I read it. But one of the big events that I had the misfortune of being a part of was the Columbia accident. And watching what NASA became after that, well, just getting us to survive that. When you have that traumatic event where you lose seven of your family members, basically, because those crew members were our family, and you start to question, what contribution did I have that led to that? Was there something in the design that I should have seen? Was there something in a meeting where I should have spoken up? And watching the workforce go through that grieving process, but also marching forward with a resilience and a determination that I had not seen previously, that we will not let this happen to another one of our family members. And we questioned everything. It was due to foam, we learned, hitting the underside of the shuttle. So the people that are responsible for the manufacturing of that foam and the application of that foam started asking themselves, "All right, how can we mitigate this completely? How can we make sure no piece of foam ever falls off? Can we redesign the foam? Can we change how we apply it?" And they went through all that, did some tests. The foam still flies off. Okay, we've done everything that we can do here. We've mitigated it to an acceptable level, not to zero. So now we know foam is always going to fall off. How can we make sure we get eyes on it? Oh, we're going to install a lot more cameras on the launchpad. All right, great. When that thing takes off and clears the launchpad, how are we going to get eyes on it? Oh, we're going to add some cameras to the external tank so we can watch it. Oh, okay, great.
When that external tank goes away, what if we miss something? How are we going to tell if there was damage? Well, we'll get some cameras. All right, who's going to have these cameras? What's the fidelity of these cameras? Will there be infrared? So we marched through all these steps to the point where we could never really mitigate the risk to zero, but we did everything we could possibly think of to at least understand what problem we were faced with so we would never come back and just not know. So that was one of the big ones. And then I can also talk about end of shuttle program, that was interesting. And then the start of commercial was also interesting. But I didn't know if you had any questions about the Columbia?

Chris Clarke: I'm speechless once again. I don't know how to articulate what the experience must have been like. To a much lesser extent, organizations going through cultural change, I think, sometimes lose momentum. They try to experience this change and you see this initial burst of, we are going to do things. How did NASA sustain that mentality?

Ginger Kerrick: It was, granted, easier for us because the price we had to pay was so devastating and you couldn't forget about it. We wouldn't let people forget about it. We had people die. In most companies, cultural change is initiated because we want to pick up a new business line or we failed a product line and we need to reinvigorate. And so making it personal is how you keep it alive. Losing a crew member was obviously easy to keep personal, but when that company had that product line fail, what did it do to the employees? Did people lose jobs? Did people lose bonuses? Did people lose benefits? Keeping that in the forefront, because people will forget what initially kicked off this cultural change because they get so involved in the cultural change and then they get busy and they forget it. So with NASA, there was an 18-year cycle between some of our significant events. Between Challenger and Columbia, there were 18 years. And when 18 years rolled around again, we thought, oh my gosh, this generation of people here could be forgetting. And so how do you keep it in the forefront? So businesses just need to figure out, how do you make it personal? How do you make them remember what it felt like that caused you to kick off this organizational, cultural change?

Chris Clarke: Thank you for sharing that. I mean, that makes sense. And in a way, it ties to just a human nature of-

Ginger Kerrick: Yeah. No, you get busy, move on to the next thing.

Chris Clarke: But tying to an emotional response always makes things stronger. And I mean you see it through senses of when you smell something from childhood, you associate a stronger feeling in some way. You mentioned end of shuttle and the other changes there. And I think that's interesting in the sense of disruption to an industry. How do you roll with that kind of disruption? How have you seen, I guess, NASA handle that kind of disruption and the cultural change associated with that?

Ginger Kerrick: Yeah, the end of shuttle tied to the start of commercial. So our president had decided we're going to retire the shuttle and we're going to hire one of these commercial companies to build our next spacecraft. So anybody that had ever come to work at NASA who grew up watching the Apollo missions and had this NASA pride, because NASA is the only one in human space flight, and I was what I called a shuttle hugger. I'm like, don't take my shuttle away. For the love of God, don't take it away. That was the mentality when they ripped the shuttle from our grasp. And then we're looking around and we say, who are we if we're not the ones flying the shuttle? Oh, okay, we have this Space Station. Oh, but wait, we have no US vehicle to get our crew members into space. NASA has failed. This is how we were processing it. And why do they get to do it? Give us the money. We'll build something. And we were building something and you cut our funding. So there was this bitterness, we can still do this, we're still relevant. But I think reflecting back on that time, it was the right thing to do. You're trying to stimulate a new economy in low Earth orbit and lunar orbit and help these different companies come up with different ways of doing business. But at the time, the people working in NASA did not see that. You're taking our shuttle, you're taking my job. You're challenging my relevancy. So there was a mourning period that we had to allow people to have. And certain people could not come out of that. They went off and a lot of them quit and went into oil and gas because that was booming at the time. Others, after they finished mourning, thought, okay, I am the smartest person in the room with human space flight. I'm going to go help these guys. I'm going to go help, because eventually they're going to fly our crews and I don't want them to feel what I felt at Columbia.
So everybody then went in with this noble intent to help the commercial providers, some of which wanted our help and some of which did not. But we did our best. And once they started succeeding, then NASA felt better. And then NASA got some other funding to pursue the design of vehicles beyond low earth orbit, and it gave them another identity. So it was a challenging time, but it was something that absolutely needed to happen and we just needed to have leaders in place to be able to help people navigate the challenges and focus on why this change was occurring.

Chris Clarke: Yeah, I mean, the part you mentioned around focusing on the why is so strong. I think oftentimes when things just happen, there's distrust, but walking people through that why gets at the core of it: the decision wasn't made just to make a decision. The decision was made for reasons by smart people who you trust and follow. And that's really powerful, I think, in that change cycle for folks. I guess you're at Barrios now, do you find that there are parallels between handling change now versus handling change at a government agency like NASA? Are they similar? Are they different? What's that like?

Ginger Kerrick: The technical challenges and the personnel challenges are similar, leading people and solving technical problems. But I have learned so much two years outside of the federal government about things that we did as a federal agency that we thought were enabling companies to do business with us that are actually roadblocks to it. Things that are challenging for no reason at all. But I never knew because I was never on this side. And so I go back to the folks that I used to work with and say, " Hey, would you like to hear things I wish I would've known by Ginger Kerrick to make myself a better civil servant leader?" " Yes. Yes, I'd love to hear that." Okay. And so I go back with a positive spin on it like how could I have known? How could you have known? But let me help you bridge the gap so that we can find ways to make it easier to do business together. And similarly, I take lessons from the civil servant side of the house and try to help our folks that work at Barrios that have only worked as a contractor. " Hey, well you know what you don't know is we have to consider this when making this decision." " Oh, okay." So hopefully I can bridge the gap between the two worlds and help them work better together in the future.

Chris Clarke: That's interesting. This is one of my favorite risk management questions, but what keeps you up at night around what you do?

Ginger Kerrick: Oh. It's different now that I'm working for Barrios because I am the chief strategy officer for Barrios. So am I making all of the right decisions? Am I providing all the right suggestions to our leadership team to help us, not only stay relevant in the human space flight industry, but grow as respected leaders? So now that keeps me up at night. When I was at NASA, it was always, a decision I made at work today, is that going to manifest itself two years later, three years later, and kill the crew or destroy the vehicle or keep us from executing the mission? In every decision that I made at NASA, I always fast-forwarded. If I agree to this now and I say, it's okay for us not to do testing on this part because there's schedule pressure, what am I missing in that test that two years from now might manifest itself in an accident? So it's two very different things that used to keep me up at night and that keep me up at night now.

Chris Clarke: I love that you put a time horizon on it, of how will this decision now impact things two years down the line. I think it's a problem a lot of, I'm just going to call it humanity, struggles with: I don't know what decisions I'm making today may have an impact on my life two years from now. But how did you develop that mentality of thinking? Was it purely just through experience and seeing how those... How did you start to think in that manner?

Ginger Kerrick: I think I've always thought in that manner because I had the bottom ripped out once when I didn't. When I was 11 years old, I went to my dad's work and he was having a heart attack, and I watched him basically take his last breath. Although they say he died in the ambulance on the way to the hospital. I watched him struggle. They put him in the ambulance and my first thoughts were, " Hey, can I spend the night with dad? Can I eat his jello? Because he's going to be okay." And when he wasn't, I was not prepared for that. And I did not have a good reaction to that. And so I think that's where my next worst failure programming started, before I even got to NASA and had a term for it. It was, all right, this just happened. I need to be prepared if it goes well, if it goes okay, and if it goes poorly. And that mindset kind of carried forward into decision- making. All right, my friends invited me out to this party and I have a test tomorrow. All right, if I go and I have a good time and come home, or if I go and I have a really good time and forget to study or I could stay at home and study. And if I do that and I get an A and then I could get into college. So I've always looked that way because a lot of people are, particularly these days it seems, are in the instant gratification mode and you can do that, but at the expense of things. And I just like to think through them before I make a decision.

Chris Clarke: I am sorry. I appreciate you sharing that story.

Ginger Kerrick: Oh yeah, no, it made me who I am today. It's okay.

Chris Clarke: Yeah. Your point around instant gratification, I mean, it's been a theme of this conversation in a lot of ways: the massive benefits of taking the mental model just one extra step forward. Rather than just asking, okay, when I make this decision, what's the impact? It's, when I make this decision, what's the impact, and then what's the next impact after that, in every way. So yeah, that makes a ton of sense. And I'll go back to my snacking example.

Ginger Kerrick: Perfect.

Chris Clarke: I had to make the mental model decision just short- term, did not go to the store. Those were all the broad risk management questions that I had. Was there any other thoughts around risk management strategy or your time that you were hoping to share before we get to Risk or That?

Ginger Kerrick: No, just that I talk a lot about risk management, but risk management requires leadership. You have to have the right leaders in place to build a team around them that will execute this risk management model that we've been talking about. And I've had the opportunity to observe quite a number of different leaders in my time and different approaches. I'm actually, oh, I put it here in case. So I just got this book. I'm so excited. So this is Gene Kranz and it's his second book, Tough and Competent. So he was one of the original flight directors. I want to say he was number two or three. I was number 60, 6-0. So he was one of our founding fathers. But he led the team after the Apollo 1 fire and has a lot of experience. So I'm anxious to read his book, but his book is titled Tough and Competent. And I agree that those are characteristics of a good leader. You have to be resilient and be able, like I said, to disassociate emotion from the logical problem that you're trying to solve and lead your team through that in order to manage risk. But I think leaders today also need emotional intelligence, need to understand themselves, how other people see them, understand other people and how to adapt their leadership style to get people that think differently than they do to march down the same path that they want them to go. And then have some humility. None of us know everything, and we're never going to know everything, and we'll never be the smartest person in the room. It's about finding who is the smartest person in the room for certain subjects and bringing the best out of them so that the team can achieve their mission. You need people that can be good communicators and can enhance the performance of their team by providing, I hate the word constructive criticism, it drives me crazy, I call it performance enhancing feedback, because who doesn't want to enhance their performance? Who would not like some feedback?
But having leaders that will do that for people I think is really, really important.

Chris Clarke: I appreciate you inaudible. I agree with all of that. And I love this term performance enhancing feedback. I mean, I'm a fairly new-ish leader. I think oftentimes something I've struggled with is, I know it's my job to make other people better, but sometimes giving them criticism can feel bad, even though it's for their own good and for the good of the company. It's also something, candidly, as a new parent that I've experienced: I have to discipline my two-year-old, and that does not feel good, but I know it's for the best for him. So I mean, I love that concept of performance enhancing feedback, but almost through the lens of also being emotionally intelligent about it.

Ginger Kerrick: Yes. Yeah. Because some people that you know really well will be like, " Hey, that's not your A game. The way you said that didn't make any darn sense. I need you to really step it up and do this." " Oh, okay, I got you." And other people, " Okay, that was a really good try. I need you to inaudible. I have some suggestions if you're interested in hearing them that might make it easier next time." " Oh yeah. Yeah. Okay." Then, "Do you remember how that person looked at you and twisted their head when you got to this chart? All right, they weren't following you. So you need to be better about reading visual cues." " Oh, okay." You just walk them through it a different way, but adaptability.

Chris Clarke: Yeah, and I mean what you just said, even the way you position that, it's about how can you make the person receptive to that feedback and it takes that emotional intelligence to get them there. I mean, thank you for sharing that. I am going to start using all this.

Ginger Kerrick: Good. Good. I won't take credit for that terminology. I worked with a guy that provided training to us at NASA named Craig Divisio, and he taught us a class on performance enhancing feedback and that class changed my life. I'm like, wow, I never thought to look at it that way.

Chris Clarke: Yeah. Okay. Well, I'm going to look him up. I want to end a little bit on, we do this thing called Risk or That, where we talk about some risks that may go either way. So I didn't really find a good way to bring this into the conversation, but do aliens exist?

Ginger Kerrick: Oh gosh.

Chris Clarke: Or I can frame it in a different way of like, which-

Ginger Kerrick: Oh my God. Maybe. Maybe.

Chris Clarke: Then, which do you think is a riskier alien for us to discover? A less evolved bacteria that we don't know how to address or a highly evolved, more intelligent than us species?

Ginger Kerrick: Okay. If they're highly evolved, that means I could probably have a conversation with them. Bacteria, I can't. And I've seen what bacteria can do to this planet left unchecked. So I would take my chances on, with the risk trade, I would say it would be lower risk for the one that's smarter than us because I can at least have a conversation. I've talked to people that are a lot smarter than I am before and gotten what I needed, but I really don't usually win with bacteria because it requires antibiotics and I'm allergic to a lot of the good ones.

Chris Clarke: Huh, okay. That's fascinating. Yeah, I mean, yeah, it probably makes sense inaudible-

Ginger Kerrick: You get to see how my brain works with these questions too.

Chris Clarke: I love that thought process there. Yeah. Okay. I'm going to chew on that one now. Maybe the next one then, because I know you have a background in physics, is which do you think will have a bigger impact on the space community, artificial intelligence or quantum computing?

Ginger Kerrick: Ooh, a greater impact on the space community? I think artificial intelligence, particularly on missions beyond... well, to Mars and beyond our solar system. When you go to Mars, you can't really call Houston anymore when you have a problem. And so having smart systems on board that can tell you what you need to fix or create a solution for you would be good. But it's in its infancy right now. And if you think back to when social media came out, the intent of it was to provide connection and communication, and now it's being used as a tool for divisiveness. I would worry about my crew member on a ship with some AI that could be smart enough to take over. And they have all these movies where it does that. So I don't know. Quantum physics at least, it's based on actual physics, not... So maybe it's a safer bet. But I think if you treated AI the right way and built it the right way, you could get more bang for the buck.

Chris Clarke: You used your next worst failure on it.

Ginger Kerrick: Yes. Yes, I did. I just did that. I'm so sorry. Yeah.

Chris Clarke: No.

Ginger Kerrick: It takes over the spaceship and, yeah.

Chris Clarke: Well, I mean, I could have asked about HAL in 2001.

Ginger Kerrick: Yes. That made me nervous for a long time too when I saw that movie.

Chris Clarke: Yeah, that movie gives me the creeps, so I didn't want to talk about it again. So then I guess one more: when you think about cyber risks, are those more likely to come from inside your organization or outside? Is it from someone clicking the wrong link, or is it from an adversary, like a bad actor in some way?

Ginger Kerrick: Are they more likely to come from inside or outside? So let's start with, it depends. So inside your organization, the risk comes if you don't hire the right people, if you don't have the right screening process to know that you're not hiring somebody with malicious intent, if you don't provide the right training, or if you don't have the right internal safeguards. So if you hire the right people, provide the right training, and have some safeguards, you can buy down on insider risk. Outside your organization, however, you have no control over the people that are out there. All you can do is boost your fence line and try to make it difficult to get in. But I think knowing what I know about what is out there, I would be more fearful of external than internal, for the right organization with the right culture and the right leadership.

Chris Clarke: That makes sense. If you're building it the right way inside, then you have more control, you have more insight, you can strengthen things that way.

Ginger Kerrick: Yeah, there's more mitigations you can put in place than you can to external. And the talent out there external is quite significant.

Chris Clarke: Those are kind of all the Risk or That questions. You had mentioned Leading Change by John Kotter and Tough and Competent by Gene Kranz. Any other books or media that you'd recommend to folks?

Ginger Kerrick: Oh, yeah. What Got You Here Won't Get You There, which is by Goldsmith. I have it back on the shelf over there. And then I'm a huge fan of this lady. I met her and her name is Ann Rhoades, and it's Built on Values. So we did a values-based culture exercise at Barrios to try to really ensure that the values that our leadership team declared are important to us are adequately infused into the culture of the organization. And she describes an approach in this book that we used at Barrios and that I have participated in. I'm also a regent on the Texas Tech University System Board of Regents. And for all five of our universities, we conducted values-based culture exercises according to her program. And I think it yields really great results for companies that are interested in doing that. And then I have a book on Emotional Intelligence 2.0. I didn't like the 1.0, but the 2.0 is pretty good, and I forget who the author was. That was a great book and helpful in my leadership journey as well.

Chris Clarke: Real quick on Built on Values, do you all have your Barrios company values listed out? Do you mind sharing what your-

Ginger Kerrick: Yeah, we have our company values that we identify. And so the first one is Barrios family, that we want to make sure that our employees feel valued and supported, both personally and professionally. And then our second value is work-life balance, believe it or not. Because human space flight can be crazy at times, and we're not promising everybody's going to work 40-hour weeks, but we promise you that when we are working on a mission and you have to put in that 60-hour week, we're going to encourage you to take a couple of days off. Then our third value, it's weird because it has the word value in the value, but it's called extraordinary value, in that we want to invest so much in our employees that they're able to not just do their jobs, but to go above and beyond in doing their jobs, anticipating customers' needs and being innovative and just being the experts and the go-to people in the field. And then our last one is social responsibility. Wherever Barrios is, and we have a contingent in Huntsville and here in Houston, a small contingent in Florida, at Langley, and also in Colorado, we want to make the community around us wherever we are better and donate our time and our efforts. So if you think about those values, Barrios family, work-life balance, social responsibility, and extraordinary value, you'd never guess we were a human space flight company. And we are a people-focused company, and I love the journey that we went on to identify that about ourselves and how we've owned it in everything that we do. And I'm really fortunate. I had 30 years with NASA, and I'm fortunate to have found this particular company because their values really match my own quite well.

Chris Clarke: Yeah, that's amazing. I love those.

Ginger Kerrick: Me too.

Chris Clarke: Well, thank you for sharing. Those are all the questions I had. Any other last thoughts or things you wanted to share with inaudible?

Ginger Kerrick: Nothing. I just want to thank you for the time. I am very passionate about these subjects, risk management, leadership, culture change, if you can't tell. And I like to continue to share what I've learned in the journey of my career with others so that they don't have to learn it for themselves in some cases, like I did. Just hear some relevant experiences that they can apply to their own careers and their own personal lives. So thank you for the opportunity to speak to you today.

Chris Clarke: Well, thank you so much, Ginger. This was just an absolute pleasure to be a part of getting a chance to talk with you more.

Ginger Kerrick: Thanks.

Chris Clarke: That's all we got. Thank you all for listening.
