April 3, 2026

E318 - SXSW 2026


Episode Link: http://www.humanfactorscast.media

News:

  • Human Factors Cast episode 318 (recorded April 2, 2026) - This week, Nick Roome is joined by Meghan Michaels to recap SXSW 2026 (March 12–18 in Austin) and its dominant themes, especially AI’s real-world impact, ethics, and the growing need to make “the human piece” explicit. Meghan describes SXSW as a convergence of tech, film, music, startups, and government/defense, shares her work as a startup pitch coach (63 teams; she coached about 30), and highlights examples like MyFleetAI (truck fleet safety via sensors and AI) and Tube Bender (efficient conduit-bending for electricians).

Disclaimer: Human Factors Cast may earn an affiliate commission when you buy through the links here!

Mentioned in this episode:

1202 - The Human Factors Podcast

Listen here: https://www.1202podcast.com

Listen to Human Factors Minute

Step into the world of Human Factors and UX with the Human Factors Minute podcast! Each episode is like a mini-crash course in all things related to the field, packed with valuable insights and information in just one minute. From organizations and conferences to theories, models, and tools, we've got you covered. Whether you're a practitioner, student or just a curious mind, this podcast is the perfect way to stay ahead of the curve and impress your colleagues with your knowledge. Tune in on the 10th, 20th, and last day of every month for a new and interesting tidbit related to Human Factors. Join us as we explore the field and discover how fun and engaging learning about Human Factors can be!
https://www.humanfactorsminute.com
https://feeds.captivate.fm/human-factors-minute/

Let us know what you want to hear about next week by voting in our latest "Choose the News" poll!

Vote Here


Thank you to our Human Factors Cast Honorary Staff Patreons: 

  • Michelle Tripp
  • Neil Ganey 

Feedback:

  • Have something you would like to share with us? (Feedback or news):

 


E318 - SXSW 2026

===

[00:00:00]

[00:00:01] Nick Roome: Hello everybody, and welcome back to another episode of Human Factors Cast. This is episode 318. We're recording this episode live on April 2nd, 2026. I'm your host, Nick Roome, dealing with technical issues all day. Joined today, not by Barry Kirby; he's on holiday. We're here with Meghan Michaels today.

[00:00:22] Meghan Michaels: Hello.

[00:00:23] Nick Roome: Hey, Meghan. Welcome back to the show. It's so good to see you back here on the show after a whole year off. Why are you just coming back now? You gotta come back sooner. We do have an awesome show for you lined up.

[00:00:35] Meghan Michaels: I mean, Nick, I've already told you. I most definitely... I mean, this is my third time.

[00:00:42] Meghan Michaels: We don't need to space things out so far, but I am, you know, I'm ready for my smoking jacket.

[00:00:47] Nick Roome: Is it really your third time, or is it your fourth? We need to investigate this in the post show. Let's check it out.

[00:00:53] Meghan Michaels: We'll check it out. I think it's my third. Well, I helped host one time and then I was a guest.[00:01:00]

[00:01:00] Meghan Michaels: Or was it, I guess, twice? I don't know.

[00:01:03] Nick Roome: We'll definitely investigate in the post show, but right now we're here to talk about South by Southwest, or, for those who have attended, as I learned last year, "South By."

[00:01:16] Meghan Michaels: Yes.

[00:01:17] Nick Roome: All right. Well, um, I don't have any program updates and Barry's not here to talk about 1202, so let's get into the news.

[00:01:23] Nick Roome: Yes. This is the part of the show all about human factors news, and this week's story is all about South by Southwest. I wasn't there, but Meghan was. So, Meghan, I'm so glad to have you back on the show to talk about South by Southwest. We wanted to have you on right after, but I think some scheduling conflicts made that a little impossible, because you were stuck in airports left and right trying to get outta there.

[00:01:48] Nick Roome: And so we decided that we would do it the next aired episode, which is now. So, South by Southwest happened, when was it? I have it on the thing: two weeks ago. Two weeks ago, yeah. March [00:02:00] 12th through the 18th, Austin, Texas. You know, I learned a lot about South by Southwest last year, I've followed it over time, but I want it straight from you.

[00:02:12] Nick Roome: An attendee What is South by Southwest.

[00:02:17] Meghan Michaels: So it is truly one of the most unique conferences that exist. It is an absolute convergence of tech, of film, music, startups, culture, and it's this place where you get to go through whatever track you want to for learning and exploration, but also you never know who you're gonna meet in an elevator, or whether you'll be eating wings next to a billionaire.

[00:02:46] Meghan Michaels: You just don't know. It's this place where no walls exist essentially. And it's designed that way so that there can be this true emergence of ideas and connection and networking in [00:03:00] ways that just truly don't exist anywhere else.

[00:03:03] Nick Roome: I gotta, I gotta know, did you eat wings next to a billionaire?

[00:03:07] Meghan Michaels: You may never know.

[00:03:09] Nick Roome: Oh, I may never know. I'll know after this 'cause I want you to text me.

[00:03:12] Meghan Michaels: No, I did not. But I do know that these things happen. Um, again, it's the kind of space where, you know, I went to a session where I was learning about AI, and then I went to another building and I was in a room with 50 people with the assistant deputy director of the Department of War.

[00:03:36] Meghan Michaels: Like, it's wild. The places and spaces that you get to enter into to learn and hear and participate in discussions with people is just unparalleled.

[00:03:49] Nick Roome: I'd love to hear a little bit about your experience this year, and maybe for those who have listened in the past and heard you last year, maybe compare and contrast [00:04:00] with your experience this year versus previous years.

[00:04:04] Nick Roome: You know, how did it all go? What was it like this year?

[00:04:07] Meghan Michaels: So the big thing that was different this year from last year is that last year, for those who might remember, the experience was wild and varied. I went with a colleague of ours, Erin, and so we kind of divided and conquered, and we were able to

[00:04:29] Meghan Michaels: attend this huge swath of events and panels, and then we did everything we could possibly get our hands on, including getting blowouts one day, because that was one of the random popups that appeared in the street. That was amazing. But on top of that, we ate at all the places we could eat.

[00:04:48] Meghan Michaels: We met as many people as we could meet, and we just took it all in, and it was fantastic to do it that way. Um, I got a real sense at [00:05:00] that point, and, you know, reflecting on it after, that South By truly is going to be the experience that you make it. The reason that I got to go last year was because I was a startup pitch coach, on the video side of things, for the startup pitch competition that is hosted by South By, and that is a major competition for young startups who are trying to make it.

[00:05:24] Meghan Michaels: And going again back to that idea of you never know who you're gonna meet: you never know who's gonna be in the room during those pitch competitions, and it's just a phenomenal experience for these folks. So I did that again this year. We had 63 different teams that were part of the competition.

[00:05:43] Meghan Michaels: I personally worked with about 30 of them, so I was participating in that. My focus was really intentional in that space, but I also had things that I wanted to learn and take back for my own professional space. So [00:06:00] I spent time with the DOD community in that area. And then there were things that I wanted to find overall with South By, in terms of what's the through line between startups and the government and defense industry.

[00:06:14] Meghan Michaels: Like, what is it? And it was AI all over the place. Um, and we kind of saw that last year, that was a common theme, but this year it was everywhere. Everywhere. And so really the big thing that I started to notice, in the startup pitches and in the panels that I was listening to over at Capital Factory and with Red ThreadX, was that it's not just talking about AI.

[00:06:46] Meghan Michaels: Now we're grappling with the real-world impact. There were a lot of conversations around ethics, and a lot of conversations about, like, where's the human piece of all of this? And so it was [00:07:00] fascinating to see how people were engaging with that idea across the entire week that I was there.

[00:07:08] Nick Roome: So some of the themes around AI, it sounds like some of the ethics around AI, might have been pulled on. Um, "where's the human?" and the human piece is where I want to kind of pull on. Last year we asked you about what HCD or UX or human factors representation looks like at this conference as a whole.

[00:07:32] Nick Roome: And hearing you say that the human element, of having the human as part of this AI ecosystem, or AI ecosystems just generally, is encouraging to me. Can you talk a little bit about the human factors, HCD, and UX representation at the conference?

[00:07:50] Meghan Michaels: Yeah, absolutely. So I think the thing that I noticed, because, you know, you and I had briefly discussed before I went that it might be a [00:08:00] possibility to, you know, have a recap conversation like this again. And so I specifically was looking out for the design element of things to see where it popped up. Were there some specific sessions that I could go to, that kind of thing. And what I found was: no, there weren't things being explicit in regard to HCD and UX. It wasn't quite the same as it was last year.

[00:08:30] Meghan Michaels: And what I did find instead was that there was that more implicit thread in every session that I went to. And so what it made me think about was: how does that shift our perspective in this space of human factors and human-centered design, for how we need to be speaking about it?

[00:08:50] Meghan Michaels: And what I came to the conclusion of is that we need to be really intentional about what HCD looks [00:09:00] like now. It's becoming just a natural part of much of the work that takes place, but to specifically call out: okay, stop, this is this strategy, and we're doing it for this specific reason, because we know it's going to impact our users in this way.

[00:09:14] Meghan Michaels: And bringing to light just the very specific language for why it is needed, so it's not just tossed to the wayside and forgotten about. I just think it's gonna be really important.

[00:09:33] Nick Roome: It sounds like in a lot of ways this is kind of a double-edged sword, where it's great news that the human is included as a consideration with any of the systems or products that we create going forward.

[00:09:48] Nick Roome: And that's a good overall theme: the integration of how does the human use this thing. But it sounds like the absence of a [00:10:00] specific track, or a specific set of presentations or panels about including the user, is sort of the drawback to that. Where, you know, having it more integrated is good, but then that means, okay, we don't necessarily have to do more outside of that.

[00:10:19] Meghan Michaels: Yeah, I mean, there were definitely design panels specifically, you know, for UX stuff. There just wasn't the same focus on the HCD end of it, to understand why it's so critical and how we get to a powerfully impactful user experience. I think one of the best ways that I can describe my feeling about this is when you're giving feedback to someone, right?

[00:10:55] Meghan Michaels: It's one thing like, Nick, you're doing a great job. [00:11:00]

[00:11:00] Nick Roome: Thanks.

[00:11:00] Meghan Michaels: Like, let's,

[00:11:01] Nick Roome: How do I improve? It's nice to hear.

[00:11:01] Meghan Michaels: Exactly. So in order for you to really feel like "I'm doing a great job," we need to scaffold what that piece of communication looks like, and it begins with: "Nick, I'd really like to give you a piece of feedback about how this conversation is going right now."

[00:11:23] Meghan Michaels: First of all, I wanna tell you that it's really positive. Second of all, I wanna tell you that the questions that you're asking me are evoking deep thought. Third... you get what I'm saying? There's intentionality and very clear focus to what it is we're communicating about and why.

[00:11:42] Meghan Michaels: And sometimes, um, I think right now with HCD specifically at South By this year, the "and why" piece was just an undercurrent. And like you said, that's not a bad thing, but I think there's a potential to kind of waver on it, and we don't wanna forget it. [00:12:00] Right.

[00:12:00] Meghan Michaels: It's really important to still call it out.

[00:12:02] Nick Roome: Let's bring it back. So, at least from your perspective, there were plenty of design panels and UX panels, but it sounds like the "and why" piece was missing. So I'm gonna slightly modify my next question: what were your favorite panels or presentations that you went to while you were there?

[00:12:29] Meghan Michaels: Okay, so I had two really, really big favorites. The first one was, I just got such a kick out of going to the startup pitches. And here's why: like I mentioned at the beginning, this is a space where you just don't know who is going to be there, right? And working with these teams, I started working with them back in January, and seeing the progression [00:13:00] of their pitch decks, of their confidence, you know, all of that, as they prepare to stand on a stage in front of 300, 500 people in a room, and they get three minutes to basically say: here's my company, it's doing amazing things, and here's why.

[00:13:16] Meghan Michaels: It's part of the greater good, and why you should invest in us. That in itself, I mean, there's a huge wave of emotion that comes along with, you know, cheering people on and feeling excited for them. But what struck me most this year was the range in the level of curiosity around: how do we make a deeper impact on the world that we live in?

[00:13:44] Meghan Michaels: It was just so cool. I mean, everything from, like, health aids to... I had a couple of teams that I really worked closely with, which were phenomenal. One was called MyFleetAI, and they dealt with sensors in [00:14:00] fleets for, you know, the trucking industry, to help improve coachability and improve the ability for drivers to stay safe and avoid accidents

[00:14:10] Meghan Michaels: through predictability using AI. Um, another company that I worked a lot with and had a great time getting to know is a company called Tube Bender. Basically, what they have figured out is how to better empower electricians: they developed a machine that can bend conduit in a way that is more efficient and cost-effective, and they're helping to fill the gap in a section of the industry that can't grow fast enough for the way that it is needed.

[00:14:38] Meghan Michaels: So things like that. You know, when we talk about how do we use technology, how do we use AI to make our world safer, make it better, give back in ways where there are gaps, that was really cool to see. So the curiosity, the excitement, the willingness to [00:15:00] put yourself out there,

[00:15:02] Meghan Michaels: that is a big deal. There are a lot of people in this world who can't stand in front of a room of 30 people, let alone a room of 300 people, and be vulnerable and say: I made this, it's great, you should think it's great too. So that was a big aspect. I really, really loved that.

[00:15:20] Meghan Michaels: I loved being part of that journey and seeing people grow. I think my other favorite session that I went to, and I've got the book here because the author was the speaker, it's a book called The Ethical Nightmare Challenge: How to Avoid the Worst of AI, by Reed Blackman.

[00:15:38] Meghan Michaels: Yay. Fantastic session. Um, let me read the... maybe I can't read it. I was gonna read a little bit. Nope, can't do it. Basically, he talks about how we can better navigate the ethics of AI at [00:16:00] this point, where we're going from very, very narrow, structured usage of AI to now having multi-agentic experiences that are occurring.

[00:16:13] Meghan Michaels: And we don't have, in many cases, ways to ensure that there's safety and there's protocols, and people aren't getting all twisted, and privacy isn't violated, you know, all of that kind of stuff. So he talks about ways that companies and teams, down to the project-level people, can address these moments where we can prevent

[00:16:37] Meghan Michaels: AI from getting outta control. It was very cool.

[00:16:42] Nick Roome: I think one of the things that I would like to know is sort of some of the key takeaways that you are gonna bring back to the things that you work on.

[00:16:56] Meghan Michaels: Great question. Because there was [00:17:00] a lot. Um, oh wow. Okay, great. So, no taking notes. Um,

[00:17:05] Meghan Michaels: I actually wrote some down on this. I'm looking at my screen to see. Okay.

[00:17:10] Meghan Michaels: One of the kind of themes: you know, at this point we're not facing a technology gap, right? It's a translation gap. We have powerful tools available to us. Now the challenge is going to be, especially on the human side: how do we translate those capabilities into experiences that people can understand and, most importantly, trust, so that they will use them, and can use them, in ways that are going to be beneficial for our lives?

[00:17:42] Meghan Michaels: So that's a big thing, I would say. In terms of my deepest reflection about the entire experience, thinking over to, like, the defense side of things, there [00:18:00]

[00:18:05] Meghan Michaels: were a couple of things that I heard which I found very interesting, in terms of the challenges that the government faces. For instance, one is bringing things into operation quickly, right? There are processes all over the place, and there's red tape and there's checks and balances, and things take a long time, especially when we're talking about research and development to put out new technology.

[00:18:28] Meghan Michaels: And there was a phrase that was used that I couldn't decide if I liked or not. Ultimately, I think maybe it was just an important way of phrasing it. But the phrase was: we have to stop keeping things in the scientific-experiment phase. Um, like, no more experiments, it's time to move.

[00:18:48] Meghan Michaels: Like, if it's stuck in this constant phase of experimental action, what's the line before we're able to say, [00:19:00] okay, this is where we need to be, we're ready to push this forward? And then not only that, but: cool if you have a great idea for the government, and cool if you've experimented on it, awesome.

[00:19:12] Meghan Michaels: But if you can't show how you are going to very quickly be able to scale, nothing's gonna happen. And that was something that really struck me, because it made me think about all the small businesses that are trying to get connected into the government space. I mean, it really is a way for businesses to develop a unique runway, but something that is stable.

[00:19:41] Meghan Michaels: But also, that's how our company works, right? We started out as a small business in that space. But the thing was, what we figured out was how do we scale, and do it quickly, and do it in a way that is not going to cause disruption. So those were biggies. [00:20:00]

[00:20:01] Nick Roome: Yeah, I'd like to talk a little bit about, you know, the translation gap.

[00:20:04] Nick Roome: I know you talked a little bit about how do we get somebody to trust the technology that's out there and that exists, and that's, like, core human factors: how do we trust an automated system? And now there's the whole subfield of how do you trust AI outputs, and how do you communicate what AI results are?

[00:20:26] Nick Roome: And I just find it fascinating that that was a theme or a takeaway that you took from this conference. Um, and I think one follow-up question would be: what are you going to do in your work to help ensure that people trust an AI piece of integration, right?

[00:20:51] Nick Roome: Like

[00:20:51] Meghan Michaels: Yeah. Ironically, I think one of the things that I connected with most in that space, which is kind of what made me get to that translation-gap idea, was I went to this session about, um, AI in the film industry, and this presenter basically showed a two-minute video that was completely put together with AI.

[00:21:23] Meghan Michaels: But the way that it was done was through, basically, an older film style, like you would see on, like, an eighties VHS-quality recorder. And the language that was used was similar to that time, and the music that was put into it, and the graininess of it, that connected with me at a root level.

[00:21:50] Meghan Michaels: I'm a child of the eighties, I was born then, so, like, the imagery that was coming across the screen, I immediately [00:22:00] connected with, and I had all of these memory flashbacks of being a kid and running around, you know, in the summertime. And it made me think about, okay, why?

[00:22:16] Meghan Michaels: I even got, like, a little bit teary at this video. And what was the video even about? I don't even know.

[00:22:23] Nick Roome: Nostalgia.

[00:22:25] Meghan Michaels: Yeah. I mean, essentially, yeah. But it made me think about, like, why did I? He told us at the beginning this was made by AI completely, but within, like, two seconds, that was

[00:22:41] Meghan Michaels: gone. I didn't even think about it. Like, you felt the human layer of that film, that short film. And what he came to talk about was how, yes, it was made by AI, but it had to be [00:23:00] told. It had to be provided with the boundaries and the parameters of where to go and what to draw from.

[00:23:09] Meghan Michaels: So there was still very much this human element, for that there was this machine out in the great ether. I keep hitting my microphone. That's so embarrassing. It must sound awful.

[00:23:23] Nick Roome: talk with your hands. It's okay.

[00:23:24] Meghan Michaels: I know. I'm sorry. So the piece that was just so, so interesting to me, in terms of going back to the idea of trust, is: yeah, there's this big machine in the sky that is taking all these parts and pieces and then creating this output that is fun to watch.

[00:23:45] Meghan Michaels: And that's super cool. But there was someone behind it initially, pulling the strings. And so, as we think about what that looks like moving forward in the work that we do, you know, you [00:24:00] and I have talked about using AI specifically at work. I want to use it for the things that don't necessarily need to have my human touch on them, right?

[00:24:12] Meghan Michaels: Like, it's very helpful for me to use, you know, an AI note taker to pull out the summary of a meeting. And then from there, that is where I add my element of my experience, my critical thinking, my background knowledge, my current knowledge of the projects that we're working on, to say: okay, AI,

[00:24:37] Meghan Michaels: Nice. Nice try. This is pretty good. You're doing

[00:24:40] Nick Roome: great, sweetie.

[00:24:41] Meghan Michaels: Yeah, doing great. Good job. Let's start, uh, to then, you know, shape and mold it into something that is going to be useful for our project and for our team and for our members. Um, what then comes from that is: I have, in our workplace, a [00:25:00] reputation, right?

[00:25:01] Meghan Michaels: I do the things I say I'm gonna do, I work really hard, and the output that I have is sound. So there's that trust, right? And I think one of the things that we can do, particularly as we continue to use AI, is to do it in really transparent ways. If you're gonna use tools, tell people: hey, I used this to help me, right?

[00:25:27] Meghan Michaels: You see it all the time. You see people who do all of their social media posting completely written by ChatGPT, and that's okay. That's fine. But wouldn't it be great if we also said "written by..."

[00:25:46] Nick Roome: Yeah.

[00:25:47] Meghan Michaels: You know, that sort of thing. My fear, which is, I think, what is already eroding the trust of the public, is that there's not the attribution of: [00:26:00] this didn't all come from my entire brain.

[00:26:03] Meghan Michaels: It came from a collection of brains all over the world that a computer pulled together for me and then spit out.

[00:26:12] Nick Roome: There's so much I could say about all that. I'm gonna go back to the video for a moment, because some of the things that you're talking about, I don't remember exactly when, but I know we talked about them on the show: how prompt writing,

[00:26:27] Nick Roome: or basically prompting the AI to generate a thing, can in itself be considered art, especially when you consider that AI-generated film that you just talked about. They mentioned that, sure, the whole thing was generated by AI, but I doubt that the editing was done by AI.

[00:26:49] Nick Roome: I doubt that the storytelling was done by AI. I imagine, right, that they prompted it for specific shots that were, like, five to six seconds [00:27:00] each, and then strung them together into a compelling message. And so you are using some elements of output from AI and stringing them together.

[00:27:11] Nick Roome: And I think that is the right way to incorporate it into our workflows. Right.

[00:27:15] Meghan Michaels: I agree. And they talked about that with respect to, you know, specifically the film industry: how AI is here, and we have a choice to embrace it or not. And really, at this point in Hollywood, there are two schools of thought.

[00:27:33] Meghan Michaels: It's the "let's use it" and the, like, staunch "absolutely not, ever." So it's gonna be really interesting to see how that community is able to navigate that moving forward. I mean, you look at the innovation of any piece of technology over the history of time. I mean, a pencil was considered technology.

[00:27:58] Nick Roome: I'm never gonna use a pencil. [00:28:00]

[00:28:01] Meghan Michaels: I know,

[00:28:02] Nick Roome: I do, yeah. There's not just a dichotomy within the film industry; just in general, there's a lot of thought about: I'm a never-AI person, I will never use AI. And to me, that's the wrong approach. I think you need to know when and how to use it appropriately.

[00:28:20] Nick Roome: And, you know, in the generative AI space for creativity, that is a very interesting conversation that, I think, when you deconstruct it, there's a lot going on, right? If you use AI to generate an image, for example, it may pull images as inspiration from art that's in datasets that the artist did not consent to. There's ethical issues with that.

[00:28:48] Nick Roome: The prompt itself, I would consider, is an art form, and somebody put the time and effort into it. Whether or not pulling the images and the inspiration from other things [00:29:00] is ethical, that's a different discussion. But the people who are never-AI need to understand that there are still some things that can come from AI that help us accelerate different things.

[00:29:13] Nick Roome: Right? So, just in terms of peeling back the curtain on show production: you know, one of the things that we do is generate stuff before the show as, like, potential talking points. We don't take them all; we modify them. We don't take them at face value. We think about some of those things.

[00:29:34] Nick Roome: We use those to ideate and generate. And I have more on this in my "one more thing," in terms of, like, an actual work example, which is hilarious, because I didn't think about it until we started talking about this. But understanding when to use it and how to use it is probably the better thing.

[00:29:52] Meghan Michaels: mm-hmm.

[00:29:52] Nick Roome: I also wanna talk about this thought about bringing [00:30:00] things up quickly that you mentioned, in the DOD, and kind of having, or, sorry,

[00:30:04] Meghan Michaels: Mm-hmm.

[00:30:04] Nick Roome: DOW uh, and having, all this, the science experiment piece of it, right. Having it not stay in that science experiment phase.

[00:30:12] Nick Roome: And I think that's also interesting, because there's a lot that can go wrong if you do things wrong. But I mean, the point I wanna bring up with this is that I think there is a temptation to go forward and use this to its fullest extent, which,

[00:30:38] Nick Roome: fine, you can. People have vibe-coded apps and programs, and they work, right? Barry is one of those people who has vibe-coded his entire web-based application. But he knows enough to make it work. And, um, you know, I think there's some concerns, especially in, like, [00:31:00] high-security projects, where that type of stuff might be too fast.

[00:31:06] Nick Roome: Mm-hmm. And so the counterpoint to that would be: maybe the agility when it comes to AI should not necessarily be within the development space, but rather in the human-centered design or human factors space, right? Mm-hmm. Use AI to iterate on different designs very quickly, to where you didn't need to do that whole thing, but you can just generate a concept and see if it works.

[00:31:34] Nick Roome: If not, move on. You know, you have, uh, when I say you, I mean the royal you, us. We all have the understanding of what will and what will not work for our end users. Mm-hmm. And so we can very quickly take a look at something, make a quick assessment, and be like, this is bullshit.

[00:31:52] Nick Roome: I'm not gonna use this. Or, this is actually a great idea, and yes, let's do this other thing too, [00:32:00] and then work it into some other piece. So I think there's a lot to be said there about the iteration when it comes to early stages. And then the development itself, maybe that is the longer train of thought, or the one that you take a little bit more seriously, especially in the high security space.
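The generate-and-quickly-assess loop Nick describes (generate a concept, check it against what you know about your end users, keep or discard, move on) can be sketched as a minimal filter. Everything here is a hypothetical stand-in: `iterate_concepts` and both callbacks are illustrative, not anything from Figma or the show.

```python
def iterate_concepts(generate, acceptable, rounds=5):
    """Generate candidate design concepts and keep only the ones
    that a quick human-factors assessment accepts."""
    kept = []
    for i in range(rounds):
        concept = generate(i)
        if acceptable(concept):  # the fast keep-or-discard judgment call
            kept.append(concept)
    return kept

# Toy stand-ins: the generator yields numbered concept names, and the
# acceptance check keeps only odd-numbered ones.
concepts = iterate_concepts(
    generate=lambda i: f"concept-{i}",
    acceptable=lambda c: int(c.split("-")[1]) % 2 == 1,
)
print(concepts)  # -> ['concept-1', 'concept-3']
```

The point of the shape is that the expensive part (careful development) is deferred: the loop only decides which ideas are worth carrying forward.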

[00:32:19] Nick Roome: Mm-hmm. Do they talk about any of that when talking about scaling in those presentations?

[00:32:24] Meghan Michaels: Oh, definitely. I mean, it's just a whole different ball of wax, right? I mean, we've experienced it in our own work, working in the government space and the various levels of security that we have to stay within.

[00:32:40] Meghan Michaels: I mean, it's serious business, like, serious. And I'm certainly not an expert on the security side of things. I just know that in our day-to-day work, every single year, and without fail every [00:33:00] single week, you know, we're receiving information about the various steps that we as a company are taking to stay in alignment with all of the necessary safety requirements that, you know, the government puts out, not only to keep their systems safe, but ultimately in an effort to keep people safe.

[00:33:20] Meghan Michaels: All of the work and the research that is happening in that space is done with those things in mind. And the companies that do that from the get-go, versus trying to retrofit all of those security elements, they're the ones who are going to be able to scale and move quickly through that, because it's a lot easier to build with the end in mind than to undo and rebuild.

[00:33:46] Meghan Michaels: Right.

[00:33:48] Nick Roome: Yeah. Okay. So, final thoughts. [00:34:00] South by Southwest 2026, what are your final thoughts?

[00:33:59] Meghan Michaels: You know what, so I've been reading the book by, uh, Ryan Holiday called Wisdom Takes Work. And one of the things it talks about is how a lot of people, when they're having conversations like this, will say, you know, what did you learn?

[00:34:16] Meghan Michaels: And that's fantastic. But he posed the idea: what if we asked people more often, when's the last time you changed your mind? You know? And I love that. I love that shift in perspective. And as I think about South by, the thing that makes it so powerful, aside from, you know, being in this space that you wouldn't normally be in with other people, is that it presents an opportunity for you to consistently ask yourself about the things you're hearing:

[00:34:52] Meghan Michaels: Is it in alignment with what I've always known? Or is this causing me to [00:35:00] shift the way that I'm going to view this idea, or approach this topic, or do this task? And the reminder that it's okay to change your mind. We change our minds all the time, but this space, especially with the convergence of all of these different people and industries, is the perfect way, and a psychologically safe space, to do so, because it's encouraged every moment of every day that you're there.

[00:35:29] Nick Roome: That's awesome. I'm glad you once again had a wonderful trip to South by, except for the whole exiting thing. That was,

[00:35:37] Nick Roome: that

[00:35:37] Meghan Michaels: was bananas.

[00:35:39] Nick Roome: Yeah. Yep. That was extra special. Thank you to my team for helping to carry the weight while I sat in a bunch of different airports.

[00:35:50] Nick Roome: Well, thank you to our friends over at South by Southwest for our news story this week.

[00:35:54] Nick Roome: And thank you to Meghan for attending South by Southwest so you could talk about your experience. If you wanna follow along, we post all the [00:36:00] links to our original articles in our Discord, where you can find us for more discussion on these stories and much more. We're gonna take a quick break, and then we'll be back to see what's going on in the Human Factors community right after this.

[00:36:14] Nick Roome: Now, I've been sworn to silence, but we have some exciting things coming for our patrons soon. I can't say anything at this time, but now would be a pretty great time to sign up for our Patreon. I can't say, as much as I want to, for reasons that will become obvious once these things are live on our Patreon.

[00:36:39] Nick Roome: But let's just say it has something to do with the lab and some of the projects that they're working on. And let's just say it's good stuff, uh, if I do say so myself, though I've only had slight input into it. I'm a producer, I'm not the creator of this stuff, and that is wonderful to see.

[00:36:59] Nick Roome: [00:37:00] So there's stuff coming soon. Stay tuned. Uh, but a huge thank you, as always, to our patrons. You keep the lights on, you keep the stuff I'm alluding to in the works, and you help support all of that. You help support, clearly, my audio-visual issues here on the show. What I'm saying is, we need a little bit more patronage, uh, to make sure that I have the right internet for the show.

[00:37:23] Meghan Michaels: You're looking and sounding better though, by the way.

[00:37:26] Nick Roome: Thank you. Alright, well, uh, I think we should go over to our friends at the HFES Aerospace Systems Technical Group. Mm-hmm. Um, This Week in Aerospace. Elena and Phil, over to you. What's the latest?

[00:37:44] Elena Zhang: Uh, thanks, Nick and Barry. This is Elena and Phil again, from the Aerospace Systems Technical Group at HFES. Today we're excited to be joined by Dr. Lila Berger from Embry-Riddle Aeronautical University. Lila's research focuses [00:38:00] on team science and human-AI teaming, particularly in complex environments such as aviation and space operations.

[00:38:07] Elena Zhang: Lila, thank you so much for joining us today.

[00:38:10] Lila Berger: Thanks for having me. I'm excited to chat with you all.

[00:38:13] Phil Doyon: So we're very excited to have you. And, uh, maybe to start: could you tell us a little about your research journey, and what drew you into studying teamwork and human-AI collaboration in high-stakes environments?

[00:38:26] Lila Berger: Yeah, absolutely. So I initially graduated with my bachelor's in psychology, and I knew that I wanted to focus on something related to human performance and the psychological impacts of extreme environments, although at the time I didn't necessarily know, um, what an extreme environment was, or that that was what it was called, right?

[00:38:47] Lila Berger: So those types of interests kind of brought me to the University of Calgary in Alberta, Canada, where I studied under Dr. Gibe Aria. Um, that's where I got my master's. And through that opportunity [00:39:00] I had a lot of amazing experiences that helped to shape and refine my interests, um, as they relate to human performance.

[00:39:07] Lila Berger: And so, uh, this ultimately brought me to my interest in teamwork. From there I moved to Houston to work under Dr. Eduardo Salas at Rice University. And, um, that's really where my love for team science grew and blossomed. After graduation, I knew that I wanted to continue to study teams in extreme environments, and there really felt like no better place to do that than at Embry-Riddle.

[00:39:34] Lila Berger: And so I'm currently here as an assistant professor in the Department of Human Factors and Behavioral Neurobiology. And that's kind of my all-over-the-place research journey, really. Yeah.

[00:39:46] Elena Zhang: Yeah, I think you had a very accelerated path, which was impressive. And something I wanted to start by asking: I know you have some recent papers that are more on AI's role and the human-AI collaboration piece.

[00:39:58] Elena Zhang: So we know that in [00:40:00] aviation, and a lot of other safety-critical domains, people have been advocating that automation might be better equipped than humans to perform these responsibilities. So in your opinion, or based on your research, how do you think we should view the role of AI in these systems?

[00:40:15] Elena Zhang: And do you think the goal of AI is to replace humans as a critical component?

[00:40:22] Lila Berger: It's a great question. So, sort of the way I see it is that AI is currently best suited to a supportive role. I think AI can support humans during a variety of tasks, especially during periods of high workload, stress, and time demands.

[00:40:40] Lila Berger: And so from a human factors perspective, of course, you know, we always think about the capabilities and limitations of people. And I think that in areas where humans might be naturally overwhelmed or stressed, or, um, when they need to make decisions under those types of conditions, [00:41:00] AI can serve in this complementary role.

[00:41:02] Lila Berger: But personally, you know, I advocate for its use in a supportive-style role rather than as a replacement.

[00:41:08] Phil Doyon: And, um, so where do you see human-AI teaming having the biggest impact, uh, especially in aviation? Are there particular applications where this research will be especially important?

[00:41:22] Lila Berger: Yeah. Um, I think that human-AI teaming is a really rapidly expanding field, right? So one area where I can see, you know, the research being especially important is in domains where AI is supporting humans with safety-critical tasks, and I think aviation's a great example of that. But honestly, regardless of domain, I think that

[00:41:44] Lila Berger: focusing on research questions where we highlight and develop a better understanding of how humans view AI, how we can develop trust, um, and really just how humans and this technology can complement one another best is a really [00:42:00] important direction, and the research is even starting to blossom in those types of directions.

[00:42:07] Lila Berger: But in aviation, and really in most domains, I think this type of research can be really important and have a really big impact. So yeah, absolutely, I definitely see aviation as one of those domains, for sure.

[00:42:20] Elena Zhang: And following up on that, I know recently you contributed to a paper on AI's role on the flight deck.

[00:42:26] Elena Zhang: Can you give us a bit more context on what the current state is? How is AI playing a role on the flight deck?

[00:42:33] Lila Berger: Yeah. I think that AI has a role in potentially many different areas. One component of that paper was just thinking about, uh, you know, where humans can be best supported by AI.

[00:42:46] Lila Berger: So, what types of roles and jobs where AI might be a good support system, and where humans are maybe better suited for that specific task. And from my perspective and my lens, I think extreme environments [00:43:00] are obviously settings where there are a lot of contextual features that are somehow unique or, I guess, essentially extreme, right?

[00:43:09] Lila Berger: So I think that during those periods where maybe there is increased workload or stress or time demands or uncertainty, um, and where risk and potentially danger are present, right, AI can help to serve in this complementary role. And I think we're still learning how best to place and calibrate our trust in these systems, and how we can work alongside these kinds of agents or tools and technology.

[00:43:37] Lila Berger: So I think that paper was really focusing on, you know, just thinking about that conversation in general: where AI might be best suited, and where maybe the human should really take the reins and continue on with that process. But yeah, that's kind of just an overarching view of the paper. Mm-hmm.

[00:43:53] Phil Doyon: So we were kinda lucky in aviation that, you know, we've been dealing with automation issues for, you know, the past [00:44:00] 30 or 40 years or so.

[00:44:02] Phil Doyon: And so we were able to learn a lot about, you know, what good human monitoring of autopilot and automation systems looks like. And I wonder, to what extent does this knowledge that we've had for the past 30 years still apply to these new AI tools? And to what extent do they produce the same results?

[00:44:22] Phil Doyon: Or is AI making that any different from what we've known, you know, since the eighties?

[00:44:29] Lila Berger: Yeah, it's a great question, and I think, unfortunately, we're still in the process of figuring that out, to my knowledge. Right? And so those are areas where we need to further explore the differences.

[00:44:42] Lila Berger: So yeah, absolutely, I agree with that sentiment: how can we, you know, better understand what those differences are, how they might impact human performance, and how we interact with AI. I mean, one area that I go back and forth on in my mind is thinking about [00:45:00] AI as a tool versus a teammate, this collaborative agent that can help support us as people.

[00:45:07] Lila Berger: And I think that line can be really challenging to discover: where that line is, and what makes sense when thinking about the agent's capabilities, and whether we consider it a tool or a teammate. And so from that team science lens and perspective, I think that's one of the major questions I've been grappling with.

[00:45:26] Lila Berger: And one of the large differences, potentially, between strict automation and this new AI sort of technology is where that line is and where we draw it. Yeah,

[00:45:39] Elena Zhang: that's definitely a very intriguing question. So in a sense, you're determining: is this autonomy, so that you get to be teammates with it, or is this just automation, where it's a tool for you? And based on your answer, it seems like that's still kind of blurry in the research.

[00:45:56] Lila Berger: Yeah, from my perspective at least. Yeah. I think that we still have some, [00:46:00] um, some findings to discover before we make any conclusions about that. But it's definitely a debate that I've seen and heard at conferences, and based on what I've heard from other people, there's a lot still to learn, essentially.

[00:46:16] Lila Berger: Yeah.

[00:46:17] Elena Zhang: Awesome. And then another thing I wanted to explore: I was at your dissertation defense not that long ago, where I know you applied the HFACS model to aviation accidents. Can you walk us through your main findings a little bit? Uh, what motivated that research, and, um, do you plan to do more in that direction in the future?

[00:46:37] Lila Berger: Yeah, it's kind of crazy. It feels like yesterday that you were at the defense. Yeah, absolutely. So my dissertation primarily focused on, um, analyzing the past decade's worth of aviation accident reports from the NTSB, or the National Transportation Safety Board. And to do that, I used a popular human error framework called HFACS, or the Human Factors Analysis and [00:47:00] Classification System.

[00:47:01] Lila Berger: And I used this framework to essentially look at the contributing factors to these events, classify them, and look to see whether there were any trends when comparing my findings from that current sample to previously published work using similar samples from earlier date ranges, to see, okay, how has the field evolved in terms of these contributing factors?

[00:47:24] Lila Berger: And because of my interest in team science, I am currently in the process of analyzing that further, looking to see if there were specific team constructs that might have contributed. So one of the elements in, uh, HFACS is communication, coordination, and planning. And so, for accidents where that might have been a contributing factor, I'm looking more deeply at those findings in a separate study to see, okay, what team-level construct is that referring to?

[00:47:55] Lila Berger: So maybe it's communication, or psychological safety, or trust, or decision [00:48:00] making at the team level. So, which construct in particular, uh, is it referring to? Maybe there's a pattern or a trend, um, where one of those constructs comes up more consistently than others, and I'm trying to see how that might inform potential team training interventions.

[00:48:16] Lila Berger: And so that's where I'm thinking of taking the research in the future. But yeah, I'm definitely very interested to continue in this area of human error. It's something I'm very passionate about studying, especially error that occurs within and between teams as well. So, looking at that not only within a single team, but what it looks like in these more complex multi-team systems that are very common in aviation.

[00:48:43] Lila Berger: So, for example, pilots and air traffic controllers that have to work together. And that's where I hope to continue and take this research in the future, in kind of that direction.
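The trend analysis Lila describes (code each report's contributing factors, then look for categories that recur across the sample) can be sketched roughly like this. The records and factor labels below are made up for illustration, not her data or actual NTSB codings.

```python
from collections import Counter

# Hypothetical coded accident records; the factor labels are
# illustrative HFACS-style categories, not actual NTSB codings.
reports = [
    {"id": "A1", "factors": ["skill_based_error", "communication_coordination_planning"]},
    {"id": "A2", "factors": ["decision_error"]},
    {"id": "A3", "factors": ["communication_coordination_planning", "adverse_mental_state"]},
]

def factor_frequencies(reports):
    """Count, per contributing factor, how many reports cite it."""
    counts = Counter()
    for report in reports:
        counts.update(set(report["factors"]))  # count at most once per report
    return counts

freqs = factor_frequencies(reports)
print(freqs["communication_coordination_planning"])  # -> 2
```

A factor that keeps surfacing across reports, the way the communication/coordination/planning category does here, is the kind of recurring pattern that could then be examined at the team-construct level.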

[00:48:53] Phil Doyon: Well, uh, thank you very much for, uh, sharing your insights, uh, with us today. And, uh, thank you all for [00:49:00] listening.

[00:49:00] Phil Doyon: So that's all we have for this week. And now, uh, back to the show.

[00:49:05] Nick Roome: Wow. Okay. That was creepy, how aligned that was. That was wild.

[00:49:12] Nick Roome: I have to say, that segment of the show is produced by the Human Factors and Ergonomics Society Aerospace Systems Technical Group. Mm-hmm. Um, Phil sends me the file every week.

[00:49:26] Nick Roome: And most weeks I watch it before it goes live, just to make sure that there are no issues, no playback issues. This one, we're talking all about trust, baby, 'cause I just uploaded it and said, let's go. And so, to my surprise, wow. How crazy is it that it was themed so well with the topics we talked about from South by Southwest? Not planned.

[00:49:51] Nick Roome: And how wonderful. So thank you once again to our friends over at the Aerospace Systems Technical Group, uh, for This Week in Aerospace. I love that segment. It's [00:50:00] so refreshing to get that perspective.

[00:50:02] Meghan Michaels: It was fantastic. It really was. I enjoyed it a lot. I know you said I could go take a quick break, but I watched it. I liked it.

[00:50:12] Nick Roome: Yeah, it's good. Good stuff. I don't need to introduce this next thing, 'cause it's just one more thing. Uh, it's where we talk about one more thing. Meghan, what's your one more thing this week? Okay.

[00:50:23] Meghan Michaels: So I guess I'll continue on the thread from last year. So, I'm a parent. I have two teenagers.

[00:50:31] Meghan Michaels: I have one in college and one in high school. My youngest just finished driver's training. Yes. Okay. Yeah. Yeah.

[00:50:43] Nick Roome: Uh, how, how are you feeling about that?

[00:50:45] Meghan Michaels: Okay, so here's the thing about it. It is a very interesting experience to, uh, let go of the control of being the one who takes your family everywhere.

[00:50:57] Meghan Michaels: And now suddenly you have to let [00:51:00] another human that you created do the driving. Uh, it's very interesting. But what I wanted to talk about is, I'm very excited, very excited for my kiddo. It's an excellent milestone. We did not get Nico into the course as early as a lot of people do, because it was really important for me that Nico do the course when Nico was ready.

[00:51:27] Meghan Michaels: And, uh, you know, that frontal lobe development, that's a real thing. So I, uh, I did not enroll Nico until it kind of came up naturally on its own in our conversations: hey, when do I get to do this? And there was actual interest in getting started. So Nico took the course.

[00:51:48] Meghan Michaels: It was done very differently from when I took driver's training, in that this was not in person. Um, it was, however, a fully synchronous [00:52:00] online course. And the parameters of it were really interesting. The kids in the class had to be seated at a table or a desk.

[00:52:10] Meghan Michaels: They could not have pets near them. They could not be eating or drinking while they were in the class. They got one bathroom break. You know, it was very structured. And he happened to be sitting in the kitchen, uh, this week for one of his last classes. And I was working away, and, um, all of a sudden he's speaking in this, like, really commanding voice. It was just a really weird, random statement about driving and a traffic law.

[00:52:46] Meghan Michaels: And after the class was done, I was like, what was that all about? You know, why did you start shouting about laws? And he goes, oh, the way that they make sure everyone is participating is [00:53:00] the instructor would randomly call on a student, and you have to either read the slide that's up on the screen and all of the options for the discussion, or you have to provide the answer to the question that's being posed.

[00:53:19] Meghan Michaels: And I was like, okay. You know, so not only were there really, you know, solid parameters for the kids to stay fully focused on the class, but then they had to stay engaged in order to participate. And it's 24 hours of classroom time. I mean, it was a significant amount of learning that happened.

[00:53:38] Meghan Michaels: But what I found to be so interesting was the integration with the actual physical element of things. So all the learning was happening online, but of course you have to do your behind-the-wheel lessons, right? So you go to a location, and you have six hours of driving and four hours of observation.

[00:53:58] Meghan Michaels: And when he would do [00:54:00] the lessons, we would get this notification right afterwards, and it was an evaluation of everything that Nico had done during that driving lesson. And there was Nico's signature on it, and the instructor's signature, and the observer's signature.

[00:54:19] Meghan Michaels: And all of this was facilitated through, um, this online platform they have. Really easy to follow. But there were two things that happened in this experience. First of all, um, there are locations all over the state where you can do this program, and we signed up through one of 'em. And you're supposed to do your driving lessons within a certain window of time so that you can take the test.

[00:54:46] Meghan Michaels: Nico signed up to take a driving lesson, but the next lesson was not available for, like, 35 days.

[00:54:53] Nick Roome: Oh.

[00:54:54] Meghan Michaels: So it was interesting to navigate that system [00:55:00] because it's largely driven by AI capabilities.

[00:55:04] Nick Roome: Oh boy.

[00:55:04] Meghan Michaels: And it could not figure out how to let me look in another location to sign up for additional lessons somewhere else.

[00:55:17] Meghan Michaels: That was not the location that we had signed up to take the class through. So that was an interesting experience to navigate. And it took a minute to actually get a person on the phone to let them know what was going on, but once we did, it was solved in two seconds.

[00:55:36] Meghan Michaels: So that was an interesting, um, moment for me. And it helped me understand the reality of, you know how it's very common, especially among an older generation, when they're having an issue with something and they call customer service and it's just, like, the robot, right? Mm-hmm. And they're just like, I wanna talk to a person.

[00:55:57] Meghan Michaels: I totally got that. [00:56:00] Um, because there is that human element of, if I just talk to a person, they would know exactly what I was saying. You'd hope. Not everybody is good at prompts. So yes, there was that. And, uh, oh, there was one other thing about it. I don't know. It's been a fascinating experience to see the difference in how

[00:56:26] Meghan Michaels: the learning has shifted to match, you know, the technological elements that are coming from the government. So, like, everything that Nico did is now in a system, and it's already been sent to the Secretary of State. So next week we'll go there, Nico will get his picture taken and get his temporary, and we're off to the races.

[00:56:48] Meghan Michaels: But the whole experience was very interesting, to see how it's evolved.

[00:56:56] Nick Roome: Yeah, sounds like it. I still remember going to the, [00:57:00] like, trailer classroom with the ramp, at the back part of a campus, and like, oh yeah.

[00:57:08] Nick Roome: Yeah. I took mine at Sears.

[00:57:10] Nick Roome: Yeah.

[00:57:11] Nick Roome: Sears driving school.

[00:57:14] Nick Roome: Yep. That

[00:57:15] Nick Roome: was fun. Well, for my one more thing: last week I talked about getting in the weeds with Figma, trying to create all these micro-interactions for the various components, and kind of reminding myself of how the micro affects the macro. There's a new task that I've been looking at within Figma, with friend of the show Frank. Longtime listeners will know who that is.

[00:57:40] Nick Roome: Folks who have watched our HFES coverage will know who Frank is as well. I'm gonna call this "Figma AI: A Tale of Two Experiences." You know, longtime listeners know my opinion on AI, how it all works and how it should work. And [00:58:00] Frank's kind of new to AI and is using AI sparingly.

[00:58:04] Nick Roome: But then we got this task to think about some future capabilities, uh, on a project. And we've been using Figma AI to see what a modern interface might look like for the various capabilities that we're, like, pie-in-the-sky looking at. And my experience has been: oh my God, why didn't they do the thing? Why didn't they just put these in and containerize these?

[00:58:28] Nick Roome: So that way I can take them and modify them and do the things that I wanna do with them. And meanwhile, Frank is on the line going, oh, this is really cool. Oh, it did this, and that's really neat. And so we were just having two completely different experiences with this capability within Figma.

[00:58:45] Meghan Michaels: Hmm.

[00:58:45] Nick Roome: And it's rather funny, because I had quite a similar experience when AI was new and novel, and I think a lot of people have this: like, oh wow, what can this thing really do? And I was reminded that's a good place [00:59:00] to stay in for some things. Especially, as I was talking about earlier, this iterative phase of including HCD, using AI in this early HCD iteration phase: okay, it doesn't necessarily matter that the components are broken.

[00:59:16] Nick Roome: You can always fix those later. But the iteration of the ideas is the important thing. And in this tale of two experiences, I was stuck in the weeds, like, but I just wanna put this over here and combine these two elements. And there were other ways to do it. It just reminded me of how we think about things differently.

[00:59:36] Nick Roome: And that's the lesson. There's not really, uh, I don't know. You can take what you want from that. That's it. Mm-hmm. Hope you enjoyed some of the discussion about South by Southwest. I encourage you to go listen to last year's coverage of South by Southwest, where we had both Meghan and Erin on the show to talk about their experience from the conference last year.

[00:59:56] Nick Roome: Comment wherever you're listening with what you think of the coverage this [01:00:00] week. For more in-depth discussion, you can join us on our Discord community. Visit our official website, sign up for our newsletter, and stay up to date with all the latest Human Factors news. If you like what you hear and wanna support the show so I can get better internet:

[01:00:12] Nick Roome: One, you can leave us a five-star review wherever you're listening; that's free for you to do and costs you nothing. Two, you can always tell your friends about us; that's how the show grows and how other people find us. Three, if you have the financial means and wanna support us on Patreon, we would greatly appreciate that.

[01:00:28] Nick Roome: That's another way you can support the show. As always, links to all of our socials and our website are in the description of this episode. I wanna thank Meghan Michaels for being on the show today. Where can our listeners go and find you if they wanna talk about South by Southwest?

[01:00:43] Meghan Michaels: Feel free to find me on LinkedIn, Meghan Michaels, just like my name's spelled on the screen.

[01:00:49] Nick Roome: As for me, I've been your host, Nick Roome. You can find me, uh, screaming at my ISP, and across social media at Nick Roome. Thanks again for tuning into Human Factors Cast. Until next [01:01:00] time.

[01:01:01] Nick Roome: It depends.

[01:01:02] Nick Roome: It depends.