Human factors from desktop to deep space
Like lots of kids, Kritina Holden dreamed of working at NASA when she grew up. Not necessarily as an astronaut, just at NASA. However, unlike lots of kids, Tina’s dream actually came true.
Kritina Holden, Ph.D., Technical Fellow in Human Factors at Leidos, has spent most of her career at NASA's Johnson Space Center, focusing on how humans interact with technology.
“Unlike a technology-focused role, a human factors practitioner's role is to focus on the human user. It's not about what cool technologies can be brought to bear on a problem, but really about fully understanding what the needs of the user are in performing their task, and then what cool technologies might meet those needs.”
In this episode, Tina discusses how she got into the field, what being a Technical Fellow in Human Factors actually means, and what human systems integration is. She also shares some examples of how NASA is using human systems integration (HSI) in its programs today and into the future as it plans for missions to Mars.
“Focusing on the human means that you involve the user from the beginning. Try to understand their capabilities, the environment they work in, and the culture of the environment. We follow the well-known Human Factors maxim of know thy user, for they are not you.”
To learn more about how HSI applies to office applications as well as space travel, how movies often miss the mark when it comes to realistically usable technologies, and how someone aspiring to follow in Tina’s footsteps can achieve their NASA dream, don’t miss this enlightening episode.
On today’s podcast:
- How human systems integration (HSI) differs from typical engineering approaches
- The human-centered design (HCD) process
- How NASA has used the HCD process
- The benefits of HSI and HCD outside of NASA
Transcript
Kritina Holden (00:00): We know they're going to get deconditioned after a long duration in space. There's a potential for depression, loss of alertness, loss of situation awareness, all the things that you need to really focus and do a good job on these spacecraft tasks. So, it'll be a challenge.
Bridget Bell (00:24): Welcome to MindSET, a Leidos podcast. I'm your host, Bridget Bell.
Meghan Good (00:28): And I'm your host, Meghan Good. Join us as we talk with pioneers in science, engineering, and technology, to understand their creative mindset and share their stories of innovation.
Bridget Bell (00:43): Today, we're speaking with Dr. Kritina Holden, human factors technical fellow at Leidos. In our conversation, she shared her story of how she got into human systems integration, from a childhood dream to work with NASA, through her education, and now her position with Leidos and her work with NASA.
Meghan Good (01:01): We talked a lot about what is HSI and what does that really look like on an engineering effort? And then, how is that different than other engineering efforts?
Bridget Bell (01:11): Tina has spent most of her career at NASA's Johnson Space Center. And so, she shared with us some examples of how NASA is using HSI on programs today-
Meghan Good (01:21): As well as those programs that they're looking to in the future, the longer duration missions, those to Mars, and what challenges and constraints that they're trying to deal with as they're considering, “what is a human going to do in that environment?”
Bridget Bell (01:36): And then, she wrapped up the conversation by talking about how HSI can be used in a variety of different areas, not just space. And then, gave some really interesting resources for others looking into this topic or interested in a similar career. So, with that, let's get started with our conversation.
Bridget Bell (02:04): Welcome to MindSET. Today, we're speaking with Dr. Kritina Holden, Human Factors technical fellow with Leidos. Welcome, Tina.
Kritina Holden (02:12): Thank you. Thanks so much for having me.
Bridget Bell (02:15): Let's start with your role with Leidos and a little bit about your background.
Kritina Holden (02:18): Okay. Sure. I guess my story starts when I was about eight years old and I looked up at the moon and said, I want to work for NASA. I don't know what I want to do, but I know I want to work for NASA. So, I started college as a computer science major, but I was really always interested in psychology. So, near the end of my undergraduate career, I switched my major and minor because I was really more interested in the human using a computer than in programming the computer to do the task, and particularly interested in how humans perceive and process information from the computer and how they interact with it. And, near the end of my stay at the university, a professor suggested, hey, have you heard about this field that combines computer science and psychology? It's called human-computer interaction. Well, that sounded great to me and put me on my path to graduate school.
Kritina Holden (03:13): So, I ended up going to Rice University in Houston. And I got a masters and PhD in engineering psychology. That was the name of the degree, but it was essentially human-computer interaction. And then, conveniently, already being located in Houston, I had an opportunity to go to work at NASA. So, this was my first real job. And I guess my dream has come true because I've spent most of my career at NASA Johnson Space Center.
Kritina Holden (03:39): In the past, I have worked at a web-based training company. And I was the usability lead for an international enterprise software company. But I did come back to NASA. And currently I'm a human factors technical fellow with Leidos. I'm involved in space human factors research. And I'm also a displays and controls subject matter expert, working with spacecraft development programs.
Meghan Good (04:02): And wow, Tina. So, within your role and work with NASA, you focus on how humans interact with technology. What does it really mean to focus on the human?
Kritina Holden (04:13): Well, unlike a technology-focused role, a human factors practitioner's role is to focus on the human user. So, it's a very different mindset. It's not about what cool technologies can be brought to bear on a problem, but really about fully understanding what are the needs of the user in performing their task, and then what cool technologies might meet those needs.
Kritina Holden (04:37): Sometimes the best answer is not the cool technology at all. It's just a better process or better designed display. Focusing on the human also means that you involve the user from the beginning, try to understand their capabilities, the environment they work in, and the culture of the environment. We follow the well-known human factors maxim of “know thy user, for they are not you.” This is really important. I don't know how many times I've heard a developer say, “well, this design makes perfect sense to me. I don't see the problem.” But it doesn't ultimately work for the user. People tend to think, well, I'm a human. I know what makes sense to another human. But this is really a bad assumption, as many poor designs have shown. I'm sure everybody's had the experience of working with a badly designed piece of software or a control panel in your car. So, we really don't want to make assumptions for the user. We want to bring users in early to see and interact with conceptual prototypes, involve them in early iterative testing.
Bridget Bell (05:38): So, throughout the MindSET series, we've talked about how we can use technology to enable our customers to focus on their mission. But it seems like your work is taking it even to a more micro scale focused on that individual human. And I know I've heard you say the term human systems integration. So, what does that mean? What is HSI? And how do you look at the human as a system?
Kritina Holden (06:05): So, HSI stands for human systems integration. And it's an interdisciplinary management process, or basically an overall approach that ensures, throughout development, the human operator is considered on the same level as the hardware and software, taking into consideration human capabilities and human limitations.
Kritina Holden (06:27): So, if we use the example of designing a vehicle, there's a team usually dedicated to focus on the power system, let's say. They must determine what inputs or data are needed to come into the power system to perform the functions of the system, how information will be output from the power system, what dependencies and interrelationships there are between the power system and other systems. The same questions can be asked and should be asked about the human. What information does the human need to perform their task? How will they provide information to other systems through inputs to a computer system? How are their actions dependent on other events in the system, for example?
Kritina Holden (07:06): So, just as there might be a subsystem team set up to develop the power system, ideally, there's an HSI team who will focus on the humans' needs, inputs, and outputs. They are specialists who will provide guidance about what should be done by the human versus automation, depending on the strengths of each. So, for example, computers are good at rapid responses, repetitive operations, calculations, remembering things, where humans are better at complex problem solving and adapting to changing conditions. So, if humans are using a system that requires them to remember things from one display to another, or to do mental math, that can result in errors, frustration, and sometimes be a safety risk. So, it's important to think about what functions the computer should do and what functions the human should do. The overall goal is to design the system to fit the user and not make the user adapt to fit the system.
Meghan Good (07:59): So, I guess with that in mind, I'm wondering what's different. So, how does this look different than a typical engineering development effort?
Kritina Holden (08:07): So, we follow a process called the human centered design process. And, in doing HCD, human centered design, we focus on meeting the user's needs, not just cool technology, as I mentioned. We bring that user in early versus having demos and testing after the product's already done. There's a lot of iteration, a lot of prototyping early on. We follow an iterative test, redesign, retest cycle, where we test users in realistic scenarios and we collect performance data. This allows issues to be identified early so that we can make changes when they're cheaper and feasible.
Kritina Holden (08:46): And the truth is, everyone will end up doing usability testing. It's just whether it's in the lab, when it's cheap and easy to fix, or whether it's when the product is already out in the field, which means costly redesigns. We also have a lot of special methods for accomplishing human centered design. We have things like task analysis, cognitive walkthroughs, card sorting, rapid prototyping tools, Wizard of Oz techniques, and methods for measuring situation awareness, workload, and usability.
Meghan Good (09:16): Can you tell me more about what are Wizard of Oz techniques?
Kritina Holden (09:19): I knew that would intrigue you.
Meghan Good (09:21): I had to know. Do I click my heels together and I'm not in Kansas anymore?
Kritina Holden (09:26): You could, but actually it's a great technique for testing a prototype that's not fully mature. So, a lot of times during development, you might have a partially functional prototype, but you want to be able to test it with users. And maybe your prototype doesn't give actual responses. Maybe it doesn't actually function. So, Wizard of Oz technique means that you basically do smoke and mirrors. You fake it so that, as far as the user knows, the system is actually providing answers to the inputs they're giving. It's bringing up a display at their request, when actually there's no real background architecture going on. It's a person feeding them the appropriate display or sending the response. So, the user thinks it's real, but it's actually the man behind the curtain.
Bridget Bell (10:17): I love it. It's really bringing the user in early because it's before the system is even designed or ready.
Kritina Holden (10:24): Absolutely.
Bridget Bell (10:25): So, you talked through a lot of the benefits, and one of those being bringing the user in early and identifying those issues early. But what are more of those benefits? And what are some challenges that you've seen?
Kritina Holden (10:39): So, the benefits of this approach ultimately are you get designs that produce higher productivity. You have fewer service calls because users have been brought in early. They already have buy-in. They have had their needs identified, so the product works for them. You've identified user requirements up front, so you don't have the problem with requirements creep, which everybody has probably experienced in a development project. Early testing means that you identify issues early. So, overall, the life cycle cost is lower. And basically good designs reduce training costs, they reduce errors, and they increase safety.
Kritina Holden (11:17): Challenges are probably, at first, getting development teams to be open to a new way of doing business. And then, also educating management about how investment in HSI will yield life cycle cost savings.
Meghan Good (11:32): That makes a lot of sense because it is a switch, a different paradigm there. Now, I wonder, what are some examples from NASA where you've used this approach?
Kritina Holden (11:42): So, I've got a couple that I can talk about. The first one is an older example, but it's a great example. It was work that we did for the International Space Station, or ISS. And it has to do with the respiratory support pack cue card.
Kritina Holden (11:58): So, on space station, they have medical supplies that are put into kits. They're like small suitcases. And one of those is called the respiratory support pack. On the lid of the pack, you have a cue card that contains very brief procedures or instructions for how to put the equipment together, how to apply it to the patient or the person in need to stabilize them. So, in this case I'm talking about, if someone was in respiratory distress, you would pull out the respiratory support pack, you would read the cue card, you would follow the instructions, put the equipment together, stabilize the patient.
Kritina Holden (12:38): And so, we had a training group come to us and say, we've been working with this cue card, but the crews, during simulations, have not been able to stabilize a patient within the required amount of time. They're just having a lot of trouble with the cue card. And it's interesting to note that the crew members, the astronauts, are not all physicians. Many of them are not. They receive some medical training, but not a lot. And it may be that they received the training a long time ago. So, they were having a lot of trouble with this particular cue card.
Kritina Holden (13:10): Since our expertise, part of our expertise, is information design, we thought it was an interesting project. We looked at the cue card. And it had a lot of text, a lot of text, very cluttered layout. And, if you think about it, reading is probably the last thing you want to do in an emergency situation. So, we removed all the unnecessary text. We added a schematic of the different parts of the equipment that needed to be assembled. We added color coding. We did about four iterations of testing where we identified problems, we would do a redesign, we would take into consideration recommendations from the users that we ran. And the final test, we found that people were able to complete the procedure in four minutes. So, that's a savings of three minutes. That's a huge deal with something like this. And that hard data led ISS to adopt the new cue card design. So, that was a very exciting project.
Meghan Good (14:08): Wow. That's incredible that you were able to save three minutes in an emergency situation. And what I'm most struck by is that it wasn't necessarily something very technical, but it really changed the outcome because you were considering the people who are using it and the stress of the environment they were in. That's so cool.
Bridget Bell (14:27): Yeah. I agree. Very interesting. So, do you have other examples that you can share?
Kritina Holden (14:32): I have another example that's kind of currently in work. The Orion Spacecraft is the vehicle that's going to take astronauts to the moon and Mars. And they have been in development for a while. And I used to work with that group. And one of the early challenges was figuring out how the cockpit would be designed and how astronauts would interact with it. The space vehicle, being brand new, is mostly a glass cockpit. So, unlike the old shuttle, which had hundreds and hundreds of switches and knobs, this allows crew to do almost all of their commanding with software. So, they needed some sort of cursor control device to interact with that software. At that point in time, touchscreen had been ruled out, so we really needed to think about what kind of device would work.
Kritina Holden (15:25): And the real challenge was that this device has to operate under vibration, acceleration, microgravity, ungloved, gloved, and pressurized gloved. So, that's a tall order, a lot of challenging constraints. So, to take this project on, we involved astronauts early. We did testing of commercially available solutions and decided that we would probably need a custom device. We started by making clay models. We did 3D models with 3D printers looking at different alternatives. We had astronauts put their hands on some of these models to figure out the best ergonomic form. We had astronauts suggest different types of controls. Maybe they had worked with a particular type of button or switch in a jet aircraft that they used to fly. We put all this together in a bunch of prototypes. We did a lot of iterative testing in the lab in a pressurized glove box with gloved subjects and in high fidelity mock ups. So, we have a lot of data on this device. Now, this device is going to be used in high fidelity testing and preparation for flight very soon. So, it's on a good path.
Kritina Holden (16:40): Similarly, the displays for Orion are being developed with the same type of process. We had very early prototypes that we had the astronauts look at and give their thoughts about. They've been heavily involved in iterative testing. And these prototypes have increased in fidelity over time. So again, we have lots of testing and lots of data before the solutions are locked in.
Meghan Good (17:04): So, Tina, I cringe every time I see cybersecurity broached on a TV show or in a movie. And I wonder, for you, do you do the same when it's about a character using a computer? Do you kind of cringe at some of the displays and how people are interacting with it?
Kritina Holden (17:23): I do. Also, just in general about human computer interaction things. Like there's this famous line where we always laugh. I think it was in a Jurassic Park movie. So, in the middle of mayhem, the young girl gets on the computer and says, “It's a Unix system. I know this.” And she just starts typing away like she's got it all figured out. Because, if you know Unix, of course you know how to break into the Jurassic Park system. It's pretty funny.
Meghan Good (17:49): But I wonder with that too, do you feel like a lot of the expectations from your human users are actually driven by some of the Hollywood aspects of what they're seeing? Since displays really are in movies all over the place and they kind of generalize the functionality and make it simple, do you then find that users want that?
Kritina Holden (18:11): That's absolutely true. And it's especially true with spacecraft user interfaces because we see so many super cool science fiction movies. And you see these really advanced ways of interacting with the computer. And then, what you find out is, when you come to work for the space program, that we actually have a big challenge in terms of radiation. And radiation in space is not only very dangerous for the human, but it's really hard on electronics. It burns holes in the chips, and it flips bits, and it causes lots of issues. So, a lot of the very fancy interfaces with nice graphics, and streaming video, and all these things that you might expect, are things that we really can't do with the radiation hardened components that we have to use for space travel.
Kritina Holden (19:05): So, it's a little bit of a let down sometimes when you see what some of the real user interfaces look like, some of the vehicle displays. You're like, wow. Those are like '60s style. They're very simple and they're kind of boring. They're very basic. They're meant to be usable, not meant to be flashy. But we're just not able to put all that fanciness in because we have to use radiation hardened components.
Meghan Good (19:30): And right there is that intersection between the field that you've been discussing with us and engineering. Right? Where there's these constraints, there's limitations, and you're trying to find that balance between what the human needs and what the system actually has to deliver.
Kritina Holden (19:47): Exactly.
Bridget Bell (19:49): And that balance of the environment. Like you're saying, in space, it's a completely different set of challenges as far as the resiliency needed. And so, that's super interesting.
Meghan Good (20:00): And I can't even imagine those future pioneers who are going to be in that team of four that you've discussed on their way to Mars, out of touch, and it's on them in space. That's amazing.
Kritina Holden (20:12): It is. And that's one of the biggest challenges on the research side of the house that I work in: figuring out how do we mitigate the extreme conditions they are going to be under and the things that those conditions are going to do to them? How do we develop countermeasures for the isolation and the lack of communication with Earth? And, at times, they won't be able to see the Earth. So, they're going to lose this Earth-connected feeling. They're going to miss their families. It's just a lot to put on someone. And that ultimately is going to affect their performance, their ability to do their task, and do it with accuracy, and stay focused. So, we're very, very much thinking about, how can we help and support the crew that are going to be in this very challenging situation?
Bridget Bell (21:05): And you said your degree's official title is engineering psychology. And so, that statement kind of wraps it all up: it's not just about the engineering and making sure the systems work and the humans can use the systems, but the psychology behind it, of what the users are going through in their body, in their mind. That isolation, it's a huge problem to solve.
Kritina Holden (21:29): Yeah. Absolutely. We know they're going to get deconditioned after a long duration in space. There's a potential for depression, loss of alertness, loss of situation awareness, all the things that you need to really focus and do a good job on these spacecraft tasks. So, it'll be a challenge.
Bridget Bell (21:50): So, all of those examples make me think back to the motto you said, the “know thy user for they are not you.” Because everything from as small as a cue card to the cockpit design and displays, you're considering how that unique human is going to be interacting. And so, it also makes me wonder, as we look into the future, how will HSI help NASA meet the challenges of those future longer duration missions?
Kritina Holden (22:21): So, first of all, more so than in other domains, such as aviation, at NASA, when we build a spacecraft, we have very few opportunities to get the design just right. We build relatively few spacecraft as compared to airplanes, for example. So, once that design is locked down and the vehicle is put into operation, things are not very changeable. There will be minimal system upgrades, but in truth, once that spacecraft launches, updates and fixes are going to be a bit difficult. So, astronauts will have to live with the design as is for a potentially long period of time. So again, we need to get it right the first time as much as possible. That's why HSI and a focus on the human is so important.
Kritina Holden (23:04): And this becomes even more important with future long duration missions. Because, as we head to Mars, for example, there will be communication delays and blackouts. Crews will not have the near 24/7 access to mission control. Right now, mission control provides guidance, watches over crew tasks, corrects mistakes, provides reminders, helps with procedures, et cetera. And, on future missions, there are going to be times when four crew members are totally on their own to perform tasks and solve any problems that arise. Even more reason that we need good HSI to ensure things have an excellent design.
Kritina Holden (23:43): We also need this focus on the human to make sure that we provided the information and resources that these crews are going to need to operate autonomously. They're going to need things like intelligent information systems, decision aids, just in time training tools, troubleshooting software, and even simple software development tools. So, we also need HSI professionals to advocate for very simple designs, things that have small numbers of components so there are fewer things to fail. Things that are easy to train, that are easy to use, easy to repair. All of these things are extremely important as we develop vehicles that will go far beyond Earth.
Meghan Good (24:23): It's amazing to think about in those systems, those platforms, or whatever it's called, those vessels, those shuttles are in place for so long. I mean, you really do have to think about the design and how they'll be used in the future. But, as you've been talking about this, I wonder, with human systems integration and with human centered design, as you've been describing it, it seems to have a lot of implications even outside of NASA. How do you see this being applied across industries or other technology areas?
Kritina Holden (24:53): So, HSI absolutely applies across all kinds of development efforts, whether it's hardware, software, procedures, or training, anything really that involves human interaction. So, no matter the domain, we need to make sure solutions are effective, so they meet the needs of the user and they support task performance. They're efficient. That means they're easy to learn, easy to use. They don't result in errors that would require backtracking and redoing work. And satisfaction is also a goal. We want users to like and want to work with these products. Otherwise, what happens is they either won't use them or they'll find work arounds. And that can cause a lot of other problems. So, these types of goals apply to lots of domains and industries, control centers, aircraft, ships, spacecraft, office applications.
Bridget Bell (25:43): Yeah. It seems like that list of other applications could go on and on because humans are operating all kinds of systems. And so, having the user test it first and really having that human centered design, it seems like it could be very important across areas. So, I'm curious, switching gears a little bit, is there anything that stands out that really makes Leidos unique when it comes to HSI?
Kritina Holden (26:10): So, at Leidos, we have expertise, not only in how to work with development teams and facilitate human centered design and knowledge of the formal techniques and methods, but we also have researchers and behavioral scientists that can go a step further. So, if during development or before development, there is an open design question that requires a formal research study, our HSI team can do that. And let me give you a few examples.
Kritina Holden (26:37): So, you might say, we're building an automated system. How do we know if users are going to trust automation? How can we design it to encourage trust? Or you might say, we're designing a system with a complex control panel. How do we make sure users have adequate situation awareness? How do we make sure they don't have too high of a workload? Or we're building an electronic procedures product. How can we compare performance with the legacy system? What kind of data do we need to convince management that this is a better solution? To answer these types of questions, you kind of have to go beyond human centered design. You have to have expertise in formal research methods. And, at Leidos, we have that expertise as part of our HSI team.
Meghan Good (27:24): I love that idea, as we're definitely moving towards a more data driven world and where we want those comparisons. We want to be able to motivate the changes that we want to see happen. And really it's the users who are making that happen in the end. Now, beyond telling us a bit about Wizard of Oz techniques, I'm wondering, what recommendations do you have for someone interested in learning more or who is interested in a career in HSI?
Kritina Holden (27:50): So, the most common area of expertise among people working in HSI is human factors. So, if you're interested in getting a degree, you want to look for a degree program in human factors. Now, this can be tricky because different schools have different names for this general area. The program might be in the engineering department. It might be called industrial engineering. Or it might be in the psychology department and called human factors or engineering psychology. Or it might even be in the computer science department and called human-computer interaction. Most of these are graduate programs. And there's also a great HSI certification program at the Naval Postgraduate School. But, if you just want to read more about designing for the human, you can check out the Human Factors and Ergonomics Society website, or usability.gov. Or really you can just Google human systems integration or human factors. There are a lot of good books and online resources available.
Bridget Bell (28:49): I also listened to your interview, Tina, with NASA's podcast, “Houston, We Have a Podcast.” And I would say that's another great resource. And I think you gave a lot of interesting facts and tidbits about how NASA is using HSI during that interview. So, I'll make a plug for you there.
Kritina Holden (29:08): Well, thanks. Thanks.
Bridget Bell (29:10): So, as we wrap up, any final guidance or words that you want to share with our listeners?
Kritina Holden (29:16): I would say, whenever you're in the position of developing or creating something for human use, it might be hardware, software, procedures, training, anything, think hard about who is the user, what skills and abilities are they bringing to the table, what environment are they going to be using the product in, what is the culture, are there special considerations? Involve the users early and throughout the process. Develop multiple concepts or prototypes. And, most importantly, and something we see so often, instead of just asking them, hey, what do you think about this? Does this look good? Thumbs up. You really want to take a more formal approach to that. Be sure to include some testing where users are actually using the product in a realistic scenario and you're collecting objective data. So, you're collecting how many mistakes are they making, how long are they taking, any points of confusion along the way. That stuff is really important.
Kritina Holden (30:16): And finally, if you need to do something like this and it's really important, try to reach out to an HSI or human factors professional because they really have the knowledge and tools to make your product a success.
Meghan Good (30:30): That's wonderful advice. Thank you, Tina.
Kritina Holden (30:33): Thank you. I enjoyed it.
Meghan Good (30:35): Oh, well, thanks so much. And thanks to our audience for listening to MindSET. If you enjoyed this episode, please share with your colleagues and visit Leidos.com/MindSET.