Feb. 20, 2026

315 - The Side Effects of AI


Episode Link: https://www.humanfactorscast.media/315

News:

  1. This week we discuss AI’s indirect effects on work and mental health, plus a major update on US air traffic control modernization. We cover “AI replacement dysfunction,” described as stress, anxiety, insomnia, and identity loss driven by the narrative that AI will make workers obsolete, and debate whether it reflects AI-specific harm or broader workplace insecurity and poor management messaging. We also explore the concept of “cognitive debt,” where heavy chatbot reliance and cognitive offloading may erode skills or leave partially developed ideas and externalized reasoning trapped in chat histories.

Support us:

  1. Patreon: https://www.patreon.com/humanfactorscast
  2. Buy us a coffee: https://www.buymeacoffee.com/hfactorspodcast
  3. Merchandise Store: https://www.humanfactorscast.media/p/Store/

Human Factors Cast Socials:

  1. Join us on Discord: https://go.humanfactorscast.media/Discord
  2. Twitch: https://twitch.tv/HumanFactorsCast
  3. YouTube: https://www.youtube.com/HumanFactorsCast
  4. LinkedIn: https://www.linkedin.com/company/humanfactorscast
  5. Twitter: http://www.twitter.com/HFactorsPodcast
  6. Facebook: https://www.facebook.com/HumanFactorsCast
  7. Our official website: www.humanfactorscast.media

Reference:

  1. Our tools and software: https://www.humanfactorscast.media/p/resources/
  2. Our Ethics Policy: https://www.humanfactorscast.media/p/ethics-policy/
  3. Logo design by E Graphics LLC: https://egraphicsllc.com/
  4. Music by Kevin MacLeod: https://incompetech.com/music/royalty-free/

Feedback:

  1. Have something you would like to share with us? (Feedback or news): https://go.humanfactorscast.media/feedback

Disclaimer: Human Factors Cast may earn an affiliate commission when you buy through the links here!

Let us know what you want to hear about next week by voting in our latest "Choose the News" poll!

Thank you to our Human Factors Cast Honorary Staff Patreons: 

  • Michelle Tripp
  • Neil Ganey 

315 - The Side Effects of AI

===

[00:00:00]

[00:00:00] Nick Roome: Yes. Hello everybody, and welcome back to another episode of Human Factors Cast. This is episode 315. We're recording this live on February 19th, 2026. I'm your host, Nick Roome, and I'm joined today by Mr. Barry Kirby.

[00:00:15] Barry Kirby: Hi there Nick. How are you doing?

[00:00:17] Nick Roome: Hey, I told you in the pre-show, Barry, not to ask me that 'cause I'm not feeling too well today, but we're gonna get through it.

[00:00:23] Nick Roome: Uh, we got a great show for you lined up tonight. As reluctantly as last week, we have more AI stories, but they're not about AI itself this time; they're about the effects of AI. So it'll be an interesting take on it. But first, programming notes: for those of you who have been following along with us here at the show and the lab and their efforts, you know that we have a new series out called User Experience Points, where we look at the intersection of human factors and gaming.

[00:00:53] Nick Roome: The latest episode is out now. It's all about Marvel Rivals, which is a big, popular [00:01:00] character-based game, and it looks at engagement-optimized matchmaking. Have you ever heard of that term, Barry?

[00:01:06] Barry Kirby: No, I haven't. It's a first for me, so

[00:01:09] Nick Roome: yeah. So go check it out on our short-form platforms.

[00:01:14] Nick Roome: That's YouTube, Instagram, TikTok. Go take a look at episode two; it's out now. But Barry, I gotta know: what's going on over at 1202?

[00:01:24] Barry Kirby: Well, 1202's February episode went up about an hour before this show goes live. So, yeah, I recorded a chat with Martin Dy. Now, those of you who follow us over at 1202

[00:01:38] Barry Kirby: will recognize the name, because he's been a friend of the show for a long time, since the show started. He's currently the president of the Chartered Institute of Ergonomics and Human Factors. And because he was a long-time friend of the show, he was more than happy to come on and give me his reflections and his thoughts about what it means to be president.

[00:01:56] Barry Kirby: And we were able to chat a bit from our own different [00:02:00] experiences and perspectives. We're also considering doing some almost round-table-type activities, where we can try and have a bit of a wider discussion on the topics that we've spoken about in the main interviews. Almost a companion episode, if you will.

[00:02:12] Barry Kirby: So if anybody fancies being involved in any of them, then just get in touch once we work out how we're going to do them, 'cause we wanted to do them live, properly around the table. But actually we'll get much better interaction, I think, if we use the online platform. So if you wanna get involved, get in touch and we'll see what we do.

[00:02:29] Barry Kirby: And just a final reminder: if you are looking to use the website to access 1202, the website name has changed. The address is 1202pod.com.

[00:02:41] Nick Roome: All right. Well, let's get into the thick of it, huh?

[00:02:44] Nick Roome: Yes, that's right. This is the part of the show all about human factors news. Barry, we got two stories up. You wanna take 'em one by one?

[00:02:54] Barry Kirby: Yeah. So tonight we're diving into two timely and, some would suggest, troubling [00:03:00] developments at the intersection of artificial intelligence and psychological health.

[00:03:04] Barry Kirby: So first up, researchers are now talking about what they call AI replacement dysfunction: a set of stress, anxiety, insomnia, and identity-loss symptoms emerging not from direct AI usage per se, but from the constant narrative that AI will make people obsolete at work. This looming fear of displacement, even ahead of actual automation, is shaping how people feel about their roles, their purpose, and their mental wellbeing.

[00:03:35] Barry Kirby: Nick, do you feel that constant narrative? Do you feel constantly anxious, stressed, and losing sleep over AI taking over your job?

[00:03:43] Nick Roome: Personally? No. I can give a concrete example of why. Just this week I saw how a certain program translated something into something else that you would use as, like, a design tool.

[00:03:57] Nick Roome: And the translation didn't do [00:04:00] anything structural to the core files. And so it was bad. So no, I don't personally feel it. I think human factors work is going to be very human-centered for a very long time. I think we will work with AI as teammates, blah, blah, blah.

[00:04:19] Nick Roome: Let's get back to the article at hand, though. I feel that this sentiment is something you see fairly commonly. I mean, you can see this in the comments of any of the new AI-generated videos or pictures, anything convincing that's coming out. You see comments like, "we're cooked," or, "I'm not gonna have a job," or, "back in my day, we didn't have to worry about this type of thing."

[00:04:48] Nick Roome: And so I think there is definitely the sentiment there publicly. The article itself pulls out, you know, clinical [00:05:00] applications, sort of using this as a clinically treatable thing. I struggle with that, only because of our privileged position, and because of our position advocating for people to team with AI partners rather than having those AI partners completely erase the position.

[00:05:24] Nick Roome: Um, will it happen outright? Yes, absolutely. Is the fear justified in some cases? Sure. But I think we as a society should look beyond that a little bit and think about what opportunities then arise when those positions are taken over by AI. In so many cases, you then have supervisory roles over those AIs, and the role just changes.

[00:05:53] Nick Roome: I mean, the same thing happened with calculators. The same thing happened with automobiles. You name it, we've [00:06:00] adapted. And I firmly believe that we'll adapt again. I think in a lot of ways this article is sensational. But what are your thoughts on it?

[00:06:10] Barry Kirby: Yeah, it's interesting, 'cause I think when I first read the article,

[00:06:14] Barry Kirby: I was very much of the mind: well, we've been here before, we've been here loads of times before. Every time we get some sort of industrial or technological revolution, jobs are lost to technology. But then, as you've just said, new jobs are created.

[00:06:33] Barry Kirby: So life isn't ending; it's going to move on. And maybe, you know, we as humans, some people say we don't like change. Maybe that's it, but I don't think that's where this story is truly going. I think you're right, it is possibly a bit sensationalist. However, is this almost a real thing of AI bullying from management?

[00:06:56] Barry Kirby: Historically, I guess [00:07:00] the way I thought about it was the factory assembly line. You could see everything was by hand; people were on the line putting things together. And then people said, oh, we've gotta get these machines in to come and do your job.

[00:07:12] Barry Kirby: And now we've got factory lines that are pretty much autonomous. You've got robotics really doing things faster, longer, safer, and with greater precision. And then there's certain elements that only people can do, so they do those bits. And I was like, isn't that just it?

[00:07:33] Barry Kirby: Isn't that just this again? But I think you can see here that you've got bosses, you've got managers going: AI's gonna take over your job, why do we need you anymore? You're useless, AI's gonna do it. And you sit there and go, well, okay, AI could take over my job, but actually you don't know what my job is.

[00:07:52] Barry Kirby: And most people don't actually understand what AI is in its entirety and how it's used. So you've just said that you don't know what I [00:08:00] do. And then when it all goes wrong, you leave, or you become discontent and go and find another job, and then they need to hire you back because they realize AI can't do the job.

[00:08:10] Barry Kirby: So really, it's just a mess. And I can see how people feel that discontent because of what they're doing. And I did some research on this. Well, I say I did; Amanda pointed out some examples,

[00:08:27] Barry Kirby: because we were talking about it on the way back from an event today, and she highlighted probably the biggest example of this: Duolingo.

[00:08:36] Barry Kirby: So, you know, Duolingo is the language-training app with the cute little owl logo. And they fairly recently turned around and said, right, we are gonna be an AI-first company. They used to have lots of humans doing most of the roles, in terms of translation, in terms of their social media and all that sort of stuff. They were really on point in their [00:09:00] social media game.

[00:09:01] Barry Kirby: But the senior management at Duolingo turned around and said, no, we're gonna be an AI-first company. And that had a double effect. The employees were really annoyed at the fact that they were being pushed down this AI-first approach, so quite a lot of them left. But then when the subscribers heard about what was happening, the number of subscribers to Duolingo went through the floor, and the backlash on social media forced Duolingo to delete most of the content on its accounts because of this AI-first approach.

[00:09:38] Barry Kirby: So you can see how that affected both the employees, because they left, and also the end customers. The end customers felt awful about what had happened and responded to it, and basically the company suffered hugely as a result. And I don't think anybody was suggesting that the quality of [00:10:00] the work was any less at that point.

[00:10:02] Barry Kirby: But they just didn't like the way it was done. So that was one example. The other example I'll bring out is Klarna, obviously the credit company. They decided that they were going to replace a lot of contractors with AI. They thought they didn't need a lot of these people, that they could use AI to do a lot of that interface, do a lot of the investigative work, and make a lot of that stuff happen.

[00:10:30] Barry Kirby: They then went through almost a revelation that they weren't being as successful, and they needed that human touch in what they were delivering. So they then rehired the people that they'd got rid of, to say, look, we need to bring you back, because we recognize we need this input.

[00:10:51] Barry Kirby: So I guess there's two examples where employees didn't feel valued, didn't feel safe in what they were doing. So they voted [00:11:00] with their feet and left, and then the companies realized it was a failed idea.

[00:11:09] Nick Roome: Yeah. You know what's interesting about this change, versus some of the other changes we've mentioned, like cars or calculators or anything else that has changed an industry? Those were things that came out basically complete. A calculator does calculations. Sure, it's big and bulky, but it does what it does. Cars, same thing. There may not have been as many of them at first, but as more were produced, more people had access, and more people were using them. But the thing here I think that's different is that AI is here and the models are consistently changing.

[00:11:58] Nick Roome: And what you'll find is that [00:12:00] something that wasn't capable last week is suddenly capable this week of doing that thing.

[00:12:06] Neil Ganey: Yeah.

[00:12:06] Nick Roome: Um, you know, you mentioned vibe coding; I'll bring that up in one more thing. You know, developers laughed at this a few years ago, and now it's a real strategy. I know there are some reports out there of big, well-known teams at big, well-known companies not having coded a single thing since, like, December.

[00:12:29] Nick Roome: And they're basically just interacting with an AI to get the code into the place where it needs to be, and doing reviews of the AI's work. That's what they're doing now. So that's out there. And I bring that up to say that this rolling deployment of AI models and capabilities

[00:12:56] Nick Roome: is probably contributing to this [00:13:00] anxiety, because you never know when that shoe is gonna drop. When is it going to be capable of doing my job? When is it going to be capable of doing that thing that will essentially, quote unquote, replace me? And so I think that's probably one of the larger differences between this innovation and innovations in the past: this is forever evolving, or will be forever evolving.

[00:13:27] Nick Roome: And so there's this uncertainty, this anticipatory stress and anxiety: I might lose my job soon, but I don't know when it's gonna happen. And how do you prepare for that? Really, the best thing that you can do is understand what the models are capable of.

[00:13:46] Nick Roome: But I mean, that's a lot of overhead to ask of somebody who's not in the field, to sort of learn overnight. Um, I don't know. I find that with this one in [00:14:00] particular, especially if we lean towards more of a clinical diagnosis of this thing, that's where we get ourselves into a little trouble.

[00:14:08] Nick Roome: And I don't want to downplay anyone's stress or anxiety. I want to downplay tying it to this innovation particularly, because this is what I would consider workplace stress. If your employer is not providing an environment that makes you feel safe and secure, for you, your family, all that stuff, I think that's more related to work-related anxiety than AI-related anxiety.

[00:14:43] Nick Roome: Because I think now companies need to be a little bit more resilient in how they approach this; think about human-first companies, right? They have to be a little bit more resilient in how they treat the integration of AI tools, so that the people working [00:15:00] for those companies are assured that they are taken care of, and that these tools will be integrated respectfully, in a way that is efficient and not going to damage any sort of work satisfaction.

[00:15:16] Nick Roome: So, I have a lot of thoughts on this one. I tend to think that with the clinical approach, we can't just tie it to AI.

[00:15:27] Barry Kirby: Yeah, I mean, I think the anxiety, as you quite rightly point out, is over such a quickly changing technology. We're in this element now where it is advancing so rapidly that just when you think you can put a finger on it, it's moved on.

[00:15:47] Barry Kirby: I mean, ChatGPT is probably the most famous AI application that everybody talks about. Yes, it's an LLM, but let's go with it. When we started using it, just, what, a couple of years [00:16:00] ago? You know, I got into it at version 3. It's now at, what, 5.2, and the change in the way it works now, it is so much more capable than it was then.

[00:16:14] Barry Kirby: So, as you know, it is constantly evolving. And so I can sort of see, in certain fields, and we mentioned in the pre-show around software engineering, that the use of LLMs and AI in the software field is changing what it means to be a software developer.

[00:16:36] Barry Kirby: But I think that's changing, not deleting, because as a software developer, you've still gotta have somebody who is thinking about the overall picture of what you're trying to do, and able to check, robustly test, and develop the code. Here, I think you do get into a bigger discussion, really, about: are we working to live, or are we [00:17:00] living to work?

[00:17:01] Barry Kirby: Because you could argue that, as a society, if AI is doing what we feel it should be doing, which is taking the weight off, it's allowing us to do what humans do best. We make decisions, and things like that allow us to be more creative. Then is this anxiety almost a displacement of what it means to be human, and therefore a fear of how we live our lives,

[00:17:32] Barry Kirby: less focused on work? However, there is a reality here: most people need work to pay the bills. Money doesn't just magically come out of the air. So while I've seen some of these arguments around, you know, this will allow humans to be more creative and things like that,

[00:17:51] Barry Kirby: society, I don't feel, is at a place where we can take AI taking over our jobs and it'll all be fine, [00:18:00] because you need to pay the bills at the end of the week. And that's where I think a lot of this anxiety comes from: people don't see the jobs to change into that used to be there.

[00:18:11] Barry Kirby: So it's complex. But you're right, a lot of it is... we probably blame it on AI, but it's workplace-focused, you know, job security as a whole.

[00:18:23] Nick Roome: Yeah. And think about this too. I mean, this issue is largely white-collar focused. AI is not gonna replace your plumber, your electrician. You know, I tend to think about these as first-world problems in a lot of ways.

[00:18:41] Nick Roome: And they are, right? I think that, between the organizational approach and their responsibility to make sure that their employees are taken care of, or that these things are implemented in a way that changes the workforce but doesn't damage the [00:19:00] workforce, you know, and then who it actually affects,

[00:19:02] Nick Roome: I think we have to kind of take a step back and understand that, yeah, you know, it'll probably replace somebody taking your order at McDonald's, but it probably won't replace the fry cook in the back. I mean, maybe. Maybe.

[00:19:16] Barry Kirby: Yeah.

[00:19:17] Nick Roome: But, you know,

[00:19:19] Barry Kirby: and a lot of people are using it as force multipliers, not replacements.

[00:19:24] Barry Kirby: So some people are using it, and certainly I know that we have been using it, in a way that lets us do things that we weren't able to do before, or didn't have access to before. So it's not replacing a job, because we just wouldn't have done it in the first place, if that makes sense.

[00:19:40] Barry Kirby: So that's also an interesting angle. But looking at it historically, I completely agree with you around the trades: plumbing, electrical work, woodwork, joinery, that type of stuff. There are other traditional roles that [00:20:00] have been hugely impacted by technology that we possibly don't recognize or talk about.

[00:20:05] Barry Kirby: So take the farming industry. Farms used to have loads and loads of people working the ground, working the animals, all that piece. The personnel on a farm has now shrunk massively, because you've got GPS and automation within tractors to make a lot of that machinery work.

[00:20:32] Barry Kirby: And just the use of machinery and automation has reduced the need for hands-on personnel. And I think that's probably quite a decent perspective, or insight, into how technology can evolve an entire industry.

[00:20:48] Nick Roome: Yeah. As we sort of look at this story, I'm trying to think towards the future: what can we do to help with some of this anxiety, if [00:21:00] any organizations are listening?

[00:21:01] Nick Roome: I think, you know, one of the big things we talked about is the organizational responsibility to talk about it from the perspective of, rather than "half of you are gonna lose your job in 18 months," actually understanding what the AI tool you're implementing will actually change, and how the people themselves will be supported, rather than the other way around.

[00:21:25] Nick Roome: You know, not how will the AI be supported by the people, but how will the people be supported by the AI tool. And changing that narrative, I think, will go a long way toward making this type of discourse, this type of anxiety and stress associated with "AI's gonna take my job," a little bit less of a sensational stressor in the workplace.

[00:21:51] Nick Roome: And obviously I'm gonna go ahead and plug human factors here, just in general as a field, because, hi, Human Factors Cast here. [00:22:00] There's always going to be a place for human factors practitioners to step in and design these jobs with the humans being a key part of that, human in the loop, right?

[00:22:12] Nick Roome: And I think that is one thing that we have to remember: you can't just outright replace people. We're both probably coming at it with a very privileged lens of saying, yeah, we know how this is gonna turn out. If you just flat out replace people with AI, it's not gonna be good.

[00:22:29] Nick Roome: So it's easy for us to not get super wound around the axle for something like this, but I can see where, if not everybody is talking about it in the way that we do, then there could be some anxiety. So, yeah, I don't know. Any other closing thoughts from you, Barry, on this first story?

[00:22:46] Barry Kirby: Yeah, I guess for some people that anxiety becomes very real, because the nature of the workforce, the nature of business, the nature of a certain capitalism, is that there's a lot of people who are [00:23:00] willing to try this.

[00:23:01] Barry Kirby: And the two examples I gave were where companies had gone out and tried it, so it obviously did have real effects on staff, but it all came sort of right in the end. So I do think there is legitimacy around the anxiety, because people do try these things. But longer term, I think we are in the throes of a revolution, as it were.

[00:23:28] Barry Kirby: And things will come right in the end.

[00:23:33] Nick Roome: Alright. Why don't we get into this next story here.

[00:23:36] Barry Kirby: Yeah. So to follow on from that topic, we're coming to some of the most visceral concerns around AI, around the technology's role in extreme psychological reactions. A scientist who's been studying AI's effects on cognition warns that excessive reliance on chatbots, especially in creative or reasoning work, may be [00:24:00] creating a kind of what they term "cognitive debt," eroding mental skills and resilience over time.

[00:24:08] Barry Kirby: So, Nick, do you find that if you're using chatbots too much, you're refusing to think anymore, and you're just taking what the chatbot gives you as gospel and running with it?

[00:24:19] Nick Roome: I don't know if that's necessarily what they're talking about here. So let me back up and give some more context for this.

[00:24:31] Nick Roome: This is by the same person who predicted AI psychosis, which, if you're unfamiliar, is basically psychosis triggered by AI. You want a different definition? I don't know. And this person predicted it a while ago, right? And I think what they're trying to predict now is that for those who are in, as you say, these creative or reasoning roles, these scientists and intellectuals that are [00:25:00] engaging with these AI tools, what's going to happen is that if you lean too hard on AI, you will start to feel like there's all this ideation happening that you can't follow through on.

[00:25:18] Nick Roome: So that's the way I'm interpreting it. And at least for me, I've encountered this specific thing. I have lots and lots of ideas as a content creator in the human factors space. There's only so many that see the light of day. A lot of them I explore through, you know, AI, ChatGPT, whatever.

[00:25:44] Nick Roome: I go in and I write, like, hey, what would it take to do X, Y, Z? Or, give me a couple ideas for this, that, the other thing. And the ideation is nice and helpful, but the thing that I often find is that [00:26:00] I will come up with this really great concept or idea, and then I'll flesh it out a little bit and leave it half-baked. Because before, the thing that would stop me from doing it would be sort of a bandwidth issue.

[00:26:19] Nick Roome: And now what it is, is a bandwidth issue on a different piece. Now it's not necessarily that I can't do it; it's that I feel like I'm farther along than I would've been if I'd just cut it off to begin with, saying I don't have the time to do that. Now I can at least explore things a little bit.

[00:26:40] Nick Roome: And now that is this cognitive debt in my head. And I know that when we're talking about this concept of cognitive offloading, we're doing more and more of that with these AI chatbots. Like, I have this idea, boom, I put it into ChatGPT, and then boom, it's gone forever because I've [00:27:00] offloaded it.

[00:27:00] Nick Roome: If I remember it, I'll go back and search up that chat and it'll be there. I've offloaded that to the AI. And I think this is the concept of cognitive debt that they're talking about here: this accumulation of using tools like AI to externalize the memory functions and the reasoning functions.

[00:27:28] Nick Roome: Behind some of these things that require a lot of intellectual thought is this cognitive debt, where now I've offloaded that reasoning, or talking-through, and it's stuck in a chat somewhere, and I don't have it written down anywhere else. And then if I wanna retrieve it, I have to go and search for it, and I have to remember to retrieve it.

[00:27:50] Nick Roome: So I tend to think of it less as a crutch, and more as: I've put my information into this thing to explore a concept or to [00:28:00] do more thought on a thing, and then it's suddenly lost. And so this cognitive debt is the buildup of that lost stuff that you'll never work on again. At least that's my interpretation of it.

[00:28:14] Nick Roome: I feel like there's probably a little bit more nuance in the article, but this is me speaking from experience, and kind of the way that I'm seeing myself going. And sometimes I have to back up and be like, I'm never gonna touch this chat again, why am I even trying?

[00:28:27] Barry Kirby: Yeah. It's interesting. I sort of took it more from a methodological toolkit perspective.

[00:28:35] Barry Kirby: What I felt they were getting at was that a lot of people may be using online tools to get to a place where, if you don't necessarily have the scientific training to do this, or the years of research foundation to understand how good research comes together, um, they were suggesting that [00:29:00] we lose that.

[00:29:01] Barry Kirby: Um, and you're just then leaning on ChatGPT, et cetera, et cetera, to help do that. But I think they also undermine their own argument, because the example that they lean on was the AI researchers who highlight the potential for AI to assist in scientific discovery, um, when they did, um, prediction of the 3D structures of known proteins.

[00:29:28] Barry Kirby: But then they also go on to say, you know, if they hadn't used this, they're still building on that lifetime of work, that lifetime of research. So I don't think the argument necessarily hangs together. But even if it did, um, the way we use technology changes and evolves.

[00:29:55] Barry Kirby: We used to, you know, I say we, I certainly, my parents and before them, used to [00:30:00] use logarithmic tables, used to use slide rules and things like that, before calculators came along, and spreadsheets and things. I have no idea how a slide rule is used. I very rarely use long division or long multiplication.

[00:30:19] Barry Kirby: I rarely do long-form written mathematics in any of its forms. I'm using a calculator, I'm using an Excel spreadsheet, I'm using those sorts of tools. So my cognitive debt there is, I dunno how to use all those old techniques. I have no idea how to use them.

[00:30:42] Barry Kirby: So if I need to do that sort of thing, I have to rely on a calculator to do it, I have to rely on a spreadsheet to do it. I could go and find out if I wanted to, and that's a different thing, but I don't use it on a day-to-day basis. So is there just a point where we turn around and say, well, [00:31:00] actually, because we can, oh, I don't think we can do it yet, but in the future we will be able to, use AI-based models and tools to help us advance scientific research so much that, actually, does it matter as much that we don't necessarily know?

[00:31:14] Barry Kirby: Don't remember the far-flung basics of how to do things, 'cause things have moved on. There is an argument around, um, the difference between tools and methodology. And so if we're suggesting that, um, methodologies can change, then how we make sure that the findings are robust, and things like that, that's a skill we definitely need to keep.

[00:31:34] Barry Kirby: But in how we use tools and chatbots and things like that, are we at risk of holding ourselves back?

[00:31:42] Nick Roome: Yeah. I find it interesting that they target intellectuals, and this goes back to something that Neil said in the pre-show.

[00:31:50] Neil Ganey: So there's my observations of kind of that cohort age group [00:32:00] and looking at kind of what their performance level is like, kind of academically, et cetera.

[00:32:12] Neil Ganey: And it's this kind of inverse-shaped function: the higher a performer they are, the less attracted they are to heavy AI usage. I mean, they will get familiar with it, because they feel like they need to be, but in terms of just full utilization and hooking their grade to it, not at all.

[00:32:41] Neil Ganey: Kind of the more average student kind of goes towards the other end of that continuum. It seems like they like to utilize it more and more and more. And that's actually the end where you shouldn't be.

[00:32:54] Nick Roome: Uh, and sort of the inverse relationship of those who rely on AI.

[00:32:59] Barry Kirby: Mm. [00:33:00]

[00:33:00] Nick Roome: And so what Neil was saying, and forgive me if I'm paraphrasing incorrectly here, is that those who are more on the intellectually inclined end of the scale tend to rely on AI output less than those on the other end of the spectrum. And it almost seems like, you know, you need to flip that, where the ones who are intellectually inclined are the ones taking a look at an output.

[00:33:28] Nick Roome: I see, this is my problem, I think, with the claim here: they're claiming that this would be more for those who are in research-based roles or, you know, science-based roles, and I disagree with that slightly. I think if you were to actually plot this on a graph, you know, the name is escaping me right now, but do you know the graph where there's the amount of knowledge that somebody has in a [00:34:00] field and, you know, their confidence in how capable they are?

[00:34:05] Nick Roome: And then it goes up sharply, uh, at the beginning. Sorry, I'll do it this way: it goes up sharply at the beginning, and then you have that sort of trough of realization. I feel like this type of thing would affect those who are at that first peak, who feel like they know enough to do the job, but they're dangerous.

[00:34:29] Nick Roome: Yeah. In so many ways. I feel like this is who they're talking to because I feel like people like you and I can look at output and go, yeah, that's, that's bullshit. We're not gonna take that. Um, or we need to reprompt it to get to, you know, the certain output that we're looking for. But those type of people, I feel like might actually, you know, fall into this automation bias trap where you have, you know, just this, this.

[00:34:56] Nick Roome: Bias of accepting what comes back. [00:35:00] And they have these outputs that are pretty and, you know, well-spoken or whatever, but they're not deep. They're shallow, they're not in depth, and they're not giving you the thing that you actually need. And especially combine this with stuff like time pressure, and I can imagine it's very, very real that somebody in that situation would go, yep, this is good enough.

[00:35:23] Nick Roome: And, you know, when you have the people who are taking a look at the outputs and going, that's not quite right, then I don't think you have that same issue. And so I think it's the people with a medium-range familiarity with rigorous science who would fall into that trap. So I don't know how valid the claim is.

[00:35:45] Nick Roome: We might see it pop up more and more. And again, if you're like looking at if you're looking at AI psychosis, it affects a very specific target demographic. And I think this is likely the same thing that will happen [00:36:00] here. It will be an effect that we look at and we will see those that suffer from it.

[00:36:05] Nick Roome: But I don't think this is going to be something that many will suffer from. I think it's a warning for those in that position maybe. Um. I, I don't know if, if all that is conveyed in this article

[00:36:22] Barry Kirby: yeah,

[00:36:23] Nick Roome: yeah,

[00:36:24] Barry Kirby: Yeah, I'd agree. I mean, I think, so the term you were looking for earlier, that graph, is the Dunning-Kruger effect.

[00:36:29] Barry Kirby: Oh, thank

[00:36:30] Nick Roome: you.

[00:36:30] Barry Kirby: Yeah. Um, yes. And again, that's interesting, isn't it? Because you talk about people within their domain. So you've kind of got two people: the people who know, and the people who bluff, you know, the people who just try and talk a good game. And you do wonder whether using AI as a tool empowers the confident, bluffing people more, and gives maybe more weight to their argument or the work that they [00:37:00] do.

[00:37:00] Barry Kirby: 'cause what I sort of believe is, the more expert you get in your role, the more, um, you realize you don't know what you're doing anymore, and the imposter syndrome sets in. Uh, 'cause you realize that you don't know everything, or you know vastly less than what you think you do.

[00:37:18] Barry Kirby: So I think that's quite a, quite a thing. I do think that this is, um, an evolution of the technology. As I sort of said earlier, I think there is an element here that it is a tool. We've gotta get our heads around it. It's possibly the most powerful tool, or set of tools, um, that we've had in such a long time, almost since the first industrial revolution.

[00:37:45] Barry Kirby: That things are changing so dramatically. But there are so many people who don't know what this actually means and what it is. I mean, I'm seeing discussions, I'm seeing requirements where people [00:38:00] just turn around and say, you know, AI is gonna solve this for me. Right. Okay, but what type of AI? 'Cause there's many different things.

[00:38:07] Barry Kirby: Um, why do you think it's gonna do that? How do you know you can trust the output? How do you know how you're gonna work alongside it? And I think it's the people, um, who, with the greatest of respect, don't know what they're talking about, that will use this sort of stuff more without any, um, without any guilt, without any fear, without any thought.

[00:38:30] Barry Kirby: The intellectual people, or the strong academic people, that have been referenced in this article, they do recognize what they're doing, 'cause they are AI researchers. And the fact that this has been talked about in this way means they're clearly thinking about how they are using it, the limitations of it, and what it does.

[00:38:53] Barry Kirby: But as a society, I think we've got potentially bigger problems of not necessarily [00:39:00] replacing, as they do. I'll quote out of this here: they talk about how AI could replace the mental muscles, which is akin to what I said at the top of this, a little bit around the way we use tools. Um, some people are not replacing their mental muscles.

[00:39:13] Barry Kirby: Some people are using muscles that they didn't know they had, or muscles that they don't have but think they have, in the way that they use the AI elements of it.

[00:39:23] Nick Roome: Are we, are we just immune to this because we're human factors practitioners? Like, is, is that like we're aware of the pitfalls here?

[00:39:30] Barry Kirby: No, I don't think we're immune to it at all. I mean, again, Amanda and I, my wife and I, we talk about this on a regular basis, and I think this is quite healthy. She's not anti-AI. She's worked in the AI field for a very long time. She, um, played with neural networks when they were still cool.

[00:39:51] Barry Kirby: But I would say she's AI-skeptical, in that she thinks we need to be taking the work we [00:40:00] do with AI with a massive bucket full of salt. Whereas I'm quite pro: um, I see it as a tool that we should be experimenting with, learning about, understanding the limits of.

[00:40:11] Barry Kirby: And I think between the two of us, we are about right. But I think it's very easy, depending on what your motivations are, depending on the environment you are working in, to go too pro or too skeptical. So I don't think we're immune; we're probably more in a front seat position to understand the effects.

[00:40:41] Barry Kirby: So we recognize, and it's very much front and center, that these effects are gonna exist, and we have to really understand how we distribute tasks, what the effects can be, how humans interact with stuff. Especially, we're in this vein at the moment of, you know, we expect [00:41:00] a car to be driving along, and suddenly it does something wrong and goes, nope, I'm not in AI mode anymore.

[00:41:05] Barry Kirby: It's over to you. And as the driver, you've gotta pick up and drive this thing with milliseconds of notice, and then wonder why you crash. And even in the human factors domain, we have a range of very significant speakers, very significant people within the field.

[00:41:22] Barry Kirby: Some who are so anti-AI it worries me, uh, but other people who are so pro-AI it worries me. So, yeah, I don't think we're immune to it, but I think we do have a front row seat into the impact.

[00:41:36] Nick Roome: front row tickets to the end of the world. All right. Any other, any other, any other closing thoughts on this one?

[00:41:45] Barry Kirby: Oh, I think I would sum it up in this way: it's about how we understand the new toolkit, and there are gonna be some people who don't want to use the toolkit, but we need to.

[00:41:59] Nick Roome: [00:42:00] Yeah. Agree. Okay, well thank you to our friends over at Futurism for our news stories this week. If you wanna follow along, we do post the links to the original articles in our Discord where you can find us, join us for more discussion on these stories and much more.

[00:42:12] Nick Roome: We do post those as we find them. So you'll notice when these posts were timestamped, and then you can count back to how far before the episode that was, and you can know we did a good job. Uh, alright, we're gonna take a quick break and then we'll be back to see what's going on in, uh, in aviation.

[00:42:29] Nick Roome: Yeah. This week in aerospace. We'll be right back after this.

[00:42:35] Announcer: Human Factors Cast brings you the best in human factors news, interviews, conference coverage, and overall fun conversations in each and every episode we produce. But we can't do it without you. The Human Factors Cast Network is 100% listener supported.

[00:42:51] Announcer: All the funds that go into running the show come from our listeners. Our patrons are our priority, and we wanna ensure we're giving back to you for [00:43:00] supporting us. Pledges start at just $1 per month and include rewards like access to our weekly Q&As with the hosts, personalized professional reviews, and Human Factors Minute, a Patreon-only weekly podcast where the hosts break down unique, obscure, and interesting human factors topics in just one minute.

[00:43:18] Announcer: Patreon rewards are always evolving, so stop by patreon.com/humanfactorscast to see what support level may be right for you. Thank you. And remember, it depends.

[00:43:30] Nick Roome: Yes, huge. Thank you as always to our patrons. We especially thank you because you keep the lights on, you make the cool lights in the background happen. You make the lab happen, you make the platform happen. You make it happen. You, you make this whole thing happen. So thank you so much for everything that you do for us, uh, for your financial contributions.

[00:43:54] Nick Roome: And if you want to be a patron, uh, you can do that at [00:44:00] patreon.com/humanfactorscast. Uh, I mentioned it before, but, uh, we have a brand new segment as of last time called This Week in Aerospace. Uh, so we're going to, uh, take it away to our friends over from the HFES Aerospace Systems Technical Group. Phil and Elena, over to you.

[00:44:19] Nick Roome: What's the latest in aerospace?

[00:44:22] Elena Zhang: This is Elena and Phil from the Aerospace Systems Technical Group at HFES. So after our last recording, we started talking about the overhaul of the US air traffic system and all the changes happening right now, and it was just so fascinating. So today we focus our discussion on bringing listeners up to speed on air traffic management.

[00:44:42] Phil Doyon: On January 11th, 2023, all US

[00:44:46] Phil Doyon: Flights were grounded or delayed nationwide for close to two hours. It was the first time since 9/11 that the FAA issued a nationwide ground stop in the country. [00:45:00] What happened? During scheduled maintenance of the computers running the Notice to Air Missions system, or NOTAM, one employee mistakenly replaced one file with another, corrupting the database.

[00:45:15] Phil Doyon: This led to the outage of the NOTAM system, which provides advisories to all aircraft pilots about potential hazards along their routes and at airports. With the NOTAM system down, the FAA grounded all flights to ensure safety.

[00:45:33] Elena Zhang: It's so surprising that moving one file can disrupt the entire air traffic system.

[00:45:38] Elena Zhang: It took close to two hours to fix the error. And from a human factors standpoint, if a file location is that important, shouldn't the system be designed so that this type of user action is not allowed? That is something to consider.

[00:45:52] Phil Doyon: Then in May of 2025, the Denver en route center suffered a 90-second communication and radar blackout.[00:46:00]

[00:46:00] Phil Doyon: Now, to be clear, aircraft continued flying their assigned routes during that time, but ATC could not reach them to give them a new clearance. Similar blackouts occurred days before at Newark Airport. The blackouts were the result of faulty radio transmitters that failed one after the other, the primary and the backups.

[00:46:23] Phil Doyon: During the Denver incident, controllers used a guard line, which is usually reserved for distress calls, and reached one aircraft to inform them about the loss of communication at the center. That pilot then broadcast to the other aircraft, instructing them to change radio frequency to reach an operating center.

[00:46:46] Elena Zhang: Wow. I was not aware of how it unfolded. So to better understand the problem, can you tell me what the current state of ATC in the country is?

[00:46:55] Phil Doyon: So I wanna be clear: air traffic management is safe and working in the country, no doubt about [00:47:00] that, and everyone is committed to safety. You know, that is controllers, pilots, airlines, et cetera.

[00:47:04] Phil Doyon: But ATC is facing two issues: one is infrastructure, and the other is workload. About infrastructure: following the NOTAM outage, the FAA was tasked to conduct a risk assessment to evaluate the sustainability of all ATC systems. NOTAM is one such system, as are electronic flight strips or weather information.

[00:47:32] Phil Doyon: The report was released in September of 2024 by the Government Accountability Office. Its title was unequivocal: FAA Actions Are Urgently Needed to Modernize Aging Systems. I remember the report vividly, because I had just started the HFES newsletter for aerospace at the time, and it was one of the first news stories that I covered, and I was shocked by what I learned.[00:48:00]

[00:48:00] Phil Doyon: Out of the 138 systems surveyed, they identified that 76% of them are unsustainable or potentially unsustainable, meaning that they have outdated functionality, a lack of spare parts, or, if an update is available, there is no funding to support its acquisition. What is more concerning is that 58 of those systems are considered to have critical operational impacts on the safety and efficiency of the airspace if they were to fail.

[00:48:32] Phil Doyon: This is what happened with the NOTAM system that we talked about earlier.

[00:48:37] Elena Zhang: And from my understanding, this situation has gone on for a while, and the FAA is aware of it. Their problem is with funding. Over the past 15 years, the FAA's budget for ATC infrastructure has remained essentially flat at approximately $3 billion per year, meaning they were unable to upgrade equipment across the country.

[00:48:58] Elena Zhang: You can look at the FAA's report: [00:49:00] they share pictures of their work environment, using fans to cool an old radar system, using aluminum foil wrapped around ribbon cables to reduce interference. Some systems even use floppy disks to share files between computers running on Windows 95.

[00:49:17] Phil Doyon: I mean, when your operations rely on using floppy disks and Windows 95, that's a sign of an unsustainable system.

[00:49:23] Phil Doyon: Now, for people watching online, or if you go over to the YouTube channel, uh, we present pictures taken from the report for ATC. So clearly there is an infrastructure deficit, but what about air traffic controllers? I would imagine that working in such an environment must be a challenge for them.

[00:49:42] Elena Zhang: You are on the right track.

[00:49:43] Elena Zhang: The ATC workforce is facing significant strain due to staffing shortages, high workload, and demanding work schedules. There were just over 14,000 controllers in 2024, but close to 7,000 of them are expected to depart by 2028. [00:50:00] That's half of the controllers within four years. To mitigate this, the FAA plans to hire 9,000 new controllers by 2028, offering incentives such as educational support and a $5,000 graduation bonus.

[00:50:14] Elena Zhang: Although the FAA has expanded training beyond Oklahoma City to additional centers, including Embry-Riddle, training remains highly demanding, with a 26% dropout rate, which underscores ongoing human factors challenges related to fatigue, training burden, and workforce sustainability.

[00:50:32] Phil Doyon: Alright, so let's circle back to the ATC overhaul that is happening.

[00:50:38] Phil Doyon: So in May of last year, the US government announced that it will replace the air traffic control system. The program is called the Brand New Air Traffic Control System. It has a budget of $12.5 billion and three years to make major changes. It will replace copper analog cables to bring in fiber optics and [00:51:00] IP internet connections.

[00:51:01] Phil Doyon: It will replace radios with voice over IP to prevent issues like those reported previously, and will also install ground surveillance radars at more than 200 airports. It's a very ambitious program, and it will need to move at a fast pace. And this is only the first part, as Secretary of Transportation Sean Duffy has already announced that ATC modernization will need an additional $20 billion to complete the work.

[00:51:28] Phil Doyon: So this first part is to show Congress that they can deliver before asking for the next $20 billion.

[00:51:36] Elena Zhang: So actually, I've read that Peraton was named the prime system integrator for this modernization effort, but I've never heard of them before. Who are they?

[00:51:46] Phil Doyon: Well, it's totally normal that you've never heard of them before, because Peraton came into existence less than 10 years ago.

[00:51:53] Phil Doyon: So in 2017, Veritas Capital, an investment firm based in New York, [00:52:00] bought Harris's government IT services division and renamed it Peraton. At the time, Harris was also responsible for ATC IT infrastructure, but that division, the one for ATC, was not sold to Peraton. Peraton has continued growing by acquiring other business units in areas including space, intelligence, cyber, and defense.

[00:52:25] Elena Zhang: Wow. Sounds like a fast growing business. What else have they done so far?

[00:52:30] Phil Doyon: All right, so from what we know, Peraton was not involved in previous projects related to ATC. Their role as prime system integrator is to steer the entire acquisition program and to allocate contracts to suppliers. One of the first major announcements they made is the replacement of over 600 ground-based radars.

[00:52:51] Phil Doyon: Collins was awarded over $400 million to install two types of radars: the Condor MT3, which is a cooperative [00:53:00] surveillance radar, meaning that it uses aircraft transponders to locate their position on the ground. For aircraft without a transponder, they also install the ASR, a non-cooperative radar that detects aircraft using reflected signals. Intergroup USA received close to $250 million to install 46,000 new radios.

[00:53:26] Phil Doyon: So things are already moving. The FAA announced that their contract with Peraton will be made available on the sam.gov website. I have not seen it yet, but, uh, we'll keep you posted when it becomes public.

[00:53:39] Elena Zhang: All right, so thank you for everything. Today we explored the current state of US air traffic control, from recent system outages to broader challenges posed by aging infrastructure and workforce strain.

[00:53:53] Elena Zhang: You know, that reminds me of something that Adam Larry, a fellow human factors professional, said [00:54:00] following the announcement of ATC modernization: beyond installing new equipment and infrastructure, we also need to consider the training for the new equipment and how it will change operations on both sides, so ATC and the pilots, and how ATC modernization is really a system-wide change. It's an exciting time to observe its development.

[00:54:22] Elena Zhang: That is all we have for today. Thank you.

[00:54:24] Nick Roome: Oh, well thank, thank you so much. I ended it just a little bit early. I'm sorry. They always look so sharp. I feel like we're here in hoodies; they're like fully dressed up in makeup, and I'm like, we need to step up our, uh, our game here, Barry. We

[00:54:42] Barry Kirby: need to obviously put, um, get on with our makeup game

[00:54:45] Nick Roome: makeup, and they have the overlays and everything.

[00:54:47] Nick Roome: They go full out. Like, what are we doing here? Yes, they're, they're putting us to shame. Uh, all right. Well, thank you. Thank you, Phil. Thank you Elena, for the, uh, coverage of this week in aerospace. Really appreciate it as [00:55:00] always. If, if you love the coverage of this week in aerospace, uh, let us know. We wanna hear that.

[00:55:05] Nick Roome: But right now it's time for a part of the show that needs no introduction. So it's time for It Came From, Barry, what is your, uh, or sorry, not that one. What am I? What? I messed it up. I messed it up. Now it's time for One More Thing. What's your one more thing this week?

[00:55:22] Barry Kirby: So my, one more thing.

[00:55:23] Barry Kirby: This week I had, um, a meeting today with the First Minister of Wales, who's effectively the Prime Minister of Wales, um, within the UK. And she was great. She said, hello, so what is it you do? So, we do human factors. And what is that? So we have this, um, thing about where you have to explain in a single sentence to a politician, you know.

[00:55:54] Barry Kirby: Um, Eluned Morgan is not an unintelligent person [00:56:00] or anything like that. So, you know, you're trying to explain to an intelligent person, but they've got very, very limited time, um, and you've gotta try and get this concept over to them really, really quickly. Now, I find we have many, many conferences that still can't describe human factors.

[00:56:15] Barry Kirby: Yeah, in fifty sentences. So how was I going to do that in about 10 seconds? And so both I and Amanda gave it a good go. And, um, she turned around and said, so it's a bit like ergonomics? And I'm like, yeah, broader, sort of. And then she went on to the next person. So it was an interesting exercise. I struggle with explaining what we do at the best of times, but to try and explain it to the First Minister of Wales in a pithy sentence, in a way that she could grasp before going on to the next person, was a particular challenge I did not enjoy.

[00:56:48] Barry Kirby: And we were then trying to discuss it on the way home from this event, to try and work out, um, how can you do human factors in a one-liner that appeals to a politician. So we're still [00:57:00] working on that one. But it was a fantastic experience to, uh, meet her today and introduce her to the world of human factors.

[00:57:07] Nick Roome: The, the go-to I always go to, if I have to do like a one-liner, I always go to, it's designing for a human for their task, given the context that they perform that task in.

[00:57:19] Barry Kirby: Yeah.

[00:57:20] Nick Roome: No, not, not all encompassing, but it, it does enough to,

[00:57:26] Barry Kirby: yeah, get it across. We said a similar thing around, you know, fitting the task to the human rather than the human to the task. But then I did throw in some rather spicy examples that were very relevant to the rest of the audience that was there.

[00:57:42] Barry Kirby: If somebody else had heard me, I could have got into a bit of trouble, but anyway. No, it was a fun exercise. What about you, Nick? What's your, uh, what's your one more thing?

[00:57:50] Nick Roome: Well you had mentioned vibe coding a few weeks back and I thought, oh, you know what, it's, I, I should probably try [00:58:00] to do that.

[00:58:01] Nick Roome: I had tried like way early on, when AI was just a brand new thing and was not anywhere near where it needed to be, especially for someone like me who's familiar-ish with coding, but it's not my strength. Like, I don't have a background in coding. I know how to print Hello World, maybe, is how I could put my coding skills.

[00:58:25] Nick Roome: Right. Um. So what I've been doing is I've been working on a tool to help with some of the human factors news that we get. So it's no secret that the way that we conduct our news search is across a variety of different sources. We try to mention them on every show that we do. You can imagine that there's a lot of noise in the signal that comes through.

[00:58:49] Nick Roome: And so what we're trying to do is find the needles in the haystack: the pop-culture-ish [00:59:00] stories related to human factors. Those are good ones to talk about on the show, along with other types of stories that, you know, would have sort of mass appeal. Those are the types of stories that we try to pick for the show.

[00:59:12] Nick Roome: Not only to make it interesting to talk about, but interesting to listen to. So with all that being said, I've started vibe coding a human factors news triage tool that looks through multiple RSS feeds and uses a locally installed LLM to start rating and sorting the relevance of each of these individual items.

[00:59:38] Nick Roome: Uh, yeah. As, as it relates to human factors, and it'll come back with a score and then I can filter based on the score, I can filter based on how far back I want the dates to go. And it will tell me, you know, a summary of how it relates to human factors. So that way I don't have to think super hard about it as I'm trying to find these needles in the haystack.

[00:59:56] Nick Roome: It's again, meant as sort of a cursory glance. So [01:00:00] that way I don't have to go through headline by headline and go, oh, that's human factors. Can I read it? Oh, nope, that's actually not human factors go back to the list. It's actually meant to be just a, a short list of things that I look through. You know, you know, long time listeners of the show know that I used to put together a Human Factors news roundup every week, every month.

[01:00:20] Nick Roome: That got to be so tedious that for my own mental health I had to step away from it. So, you know, this is trying to get back at at least finding a tool that will help with some of that. So it's in an okay state. I need to do some further testing and validation of it. But it's interesting, and now I can talk about my own vibe coding.

[01:00:41] Nick Roome: Yeah, that's what I got going on. Oh. And new Star Wars trailer. Excited. Yay.

[01:00:48] Barry Kirby: Oh

[01:00:49] Nick Roome: yeah, well

[01:00:51] Barry Kirby: coming out soon. I'm quite excited. So anyway.

[01:00:53] Nick Roome: Oh yeah. Yeah. Lots of fun stuff. Let's talk about that in the post show. That's it for today, everyone. If you liked this episode [01:01:00] and enjoyed some of the discussion about AI,

[01:01:02] Nick Roome: I'll encourage you to go listen to last week's episode. We talked about AI there, so go do that. Comment wherever you're listening with what you think of the story this week. For more in-depth discussion, you can always join us on our Discord community, um, despite your feelings on Discord.

[01:01:16] Nick Roome: Yeah, you know, we're aware, we're tracking. We might find a new home, who knows. Anyway, we're there for now. Visit our official website and sign up for our newsletter to stay up to date with all the latest human factors news. If you like what you hear and you wanna support the show, there's a few things you can do.

[01:01:28] Nick Roome: One, you can leave us a five-star review wherever you're watching or listening right now. That's free for you to do and it really helps us out. Two, something that helps us out even more than that is telling your friends about the show and telling 'em that you like us, 'cause your friends will trust your judgment over an anonymous review.

[01:01:45] Nick Roome: And three, if you have the financial means to and wanna support the show, you can always support us on Patreon. Everything that you contribute goes right back into the show. Uh, I don't pocket any of it, although Barry skimmed off a lot one time to buy a [01:02:00] car. I still haven't got that money back, Barry.

[01:02:02] Nick Roome: Give it back. As always, links to all of our socials and our website are in the description of this episode. Mr. Barry Kirby, thank you for being on the show today. Where can our listeners go and find you if they wanna talk to you about AI taking their job?

[01:02:14] Barry Kirby: Well, AI's already taken my job, so at this point, come talk to me now.

[01:02:18] Barry Kirby: Um, but no, you can always come talk to me on LinkedIn, on Facebook, or on Instagram if you wanna see some recent pottery failures. But if you wanna come and listen to the newly revitalized podcast where I chat to individuals in the human factors domain about their expertise and experiences, then find me on 1202 - The Human Factors Podcast, which is 1202pod.com.

[01:02:37] Nick Roome: As for me, I've been your host, Nick Roome. You can find me on our Discord and across social media at Nick Roome. If you're watching live, stay tuned for the post show. For the rest of you, thanks again for tuning in to Human Factors Cast. Until next time. It depends.