Human Factors Minute is now available to the public as of March 1st, 2023. Find out more information in our: Announcement Post!
May 6, 2022

E244 - Fixing Humanity's Broken Risk Perception to Save the Planet

This week on the show, we talk about humanity's broken risk perception and how it is reversing global progress in (what the UN is calling) a 'spiral of self-destruction'. We also answer some questions from the community about being lowballed in compensation from internship opportunities, what to do if you're not enjoying your new role after making the transition to the field, and how to address the feeling of hitting a plateau.


Recorded live on May 5th, 2022, hosted by Nick Roome with Barry Kirby.

Check out the latest from our sister podcast - 1202 The Human Factors Podcast - on Farming, Decision Making and IOT - An interview with John Owen:



It Came From:

Let us know what you want to hear about next week by voting in our latest "Choose the News" poll!

Vote Here

Follow us:

Thank you to our Human Factors Cast Honorary Staff Patreons: 

  • Michelle Tripp
  • Neil Ganey 

Support us:

Human Factors Cast Socials:



  • Have something you would like to share with us? (Feedback or news):


Disclaimer: Human Factors Cast may earn an affiliate commission when you buy through the links here.


Welcome to Human Factors Cast, your weekly podcast for Human Factors, Psychology and Design.



Hello everybody. It is episode 244. We're recording this live on May the Fifth... be with you. Wait, no. Revenge of the Fifth. I messed it up in the intro. Great. 2022. This is Human Factors Cast. I'm your host, Nick Roome. I'm joined across the way by Mr. Barry Kirby. Good evening, and how are you? I'm really upset that I messed up that reference. May the Fifth doesn't quite work. We got a great show for you all. Tonight we'll be talking about humanity's broken risk perception and how it's reversing global progress in what the UN is calling a spiral of self-destruction. And later we're going to answer some questions from the community about being lowballed in compensation from internship opportunities, what to do if you're not enjoying your new role after making the transition into the field, and how to address the feeling of hitting a plateau. But first we got some programming notes and a community update here. Just want to say from the Human Factors Cast side of the house, we are going to have a little bit of scheduled programming weirdness here. Next week will be a normal show. After that, on the 19th, we're aiming to bring you some coverage from the wonderful conference that Barry went to last week. We're still kind of getting all that together for you, but we'll have kind of a tag team effort there, and then the 26th we'll be on a short hiatus, and then we'll be back on the second, I think. Or wait, maybe I just shifted all those down one. I think it's the regular show on the 12th, then the 19th, then the 26th we'll do that. Yes, that's what we did. I'm messing everything up tonight, Barry. It's almost like I should have made notes for this. But all that being said, what's going on over at 1202? So 1202, talking about mess-ups: we are on a slight hiatus as well because of me not taking my cables to EHF 2022. So we are pulling the coverage that we expected to have but didn't get, and we're backfilling that, and we've got a whole bunch of interviews that we're matching together. 
What that has meant is that we've got about a two-week break on actual interviews, normal interviews, which we are sorting out. There will be more interviews coming up, but I do implore you to go and have a listen to Farming, Decision Making and IOT, which is an interview we did with a local educational college. A farming educational college, and the project manager there, a chap called John Owen, where they're looking at teaching students around how people can use IoT to make their farms a lot more useful. Really quite an interesting one. Slightly dodgy sound quality, so just setting that expectation up there, but really well worth a listen. Yeah. Well, anyway, we know why you're here. You're here for the news, so let's get into it.



Yes. This is the part of the show all about human factors news. I hear we have a really uplifting topic tonight. Barry, what is the story this week? So our uplifting topic is: humanity's broken risk perception is reversing global progress in a spiral of self-destruction, according to the UN. The world could undo social and economic advances and face one and a half disasters a day by the year 2030, according to the UN's flagship Global Assessment Report. Human activity and behavior is contributing to an increasing number of disasters across the world, putting millions of lives and every social and economic gain in danger. The Global Assessment Report, released by the UN Office for Disaster Risk Reduction, or the UNDRR, reveals that between 350 and 500 medium- to large-scale disasters took place every year over the past two decades, and this is projected to reach approximately 560 a year, or one and a half disasters a day, by 2030. The report blames these disasters on a broken perception of risk based on optimism, underestimation and invincibility, which leads to policy, finance and development decisions that exacerbate existing vulnerabilities and put people in danger. The report found that the implementation of disaster risk reduction strategies had reduced both the number of people impacted and the number killed by disasters in the last decade. But the scale and intensity of disasters are increasing, with more people killed or affected by disasters in the last five years than in the previous five. The report was drafted by a group of experts from around the world as a reflection of the various areas of expertise required to understand and reduce complex risks. The good news, says the head of the UNDRR, is that human decisions are the largest contributors to disaster risk, so we have the power to substantially reduce the threats posed to humanity, and especially the most vulnerable among us. So, Nick, do you find this a bit of a risky topic? What are your thoughts? 
Yeah, my initial thoughts here, when seeing this, when reading this: yes, this report is quite detailed, and we'll go over these details in detail. But my first initial thought was, yikes, how do we even start to fix this? This brought me back immediately to some of the stuff that you and I connected over early on with climate ergonomics, kind of taking the bottom-up approach rather than the top-down approach from a policy perspective. And it's going to take all of that to really fix some of this. There are some really sharp minds, Barry, who are thinking about the climate ergonomics issue and how to really fix this. And that's my initial thought. How do we fix this? This is crazy. What are your thoughts? Yes, I'm kind of the same. You've just brought up climate ergonomics. That is one really good example of where we think it's somebody else's problem all the time, that it's just something people are moaning about and it doesn't actually affect us. But look at what happened during the pandemic. Normally, when an illness comes around, vaccines take a long time to get to the point where they're actually usable; you're talking years, if not decades. Yet everybody expected, when the pandemic happened: it's bad, but don't worry, we'll have a vaccine soon. We'll be fine. We'll let this disease rip through our communities, and it will be everybody else that's at risk. I'll be fine. Therefore, we'll be fine. It won't take me and my loved ones. And it showed us just how wrong we were. I mean, the fact that we've still got it going on now in various formats, and we're now just getting into that bit of, well, we'll just have to carry on, won't we? We sort of have this idea that science and everything will just solve everything, that there's nothing new in the world. But actually, that's just not true. 
So there is definitely something here around that invincibility piece. For me, of the three factors that were mentioned, that's the biggest one, because we just think it will come to somebody else and not to us.
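As a quick sanity check on the figures quoted in the story above, the headline "one and a half disasters a day" follows directly from the projected annual count (the numbers come from the report as quoted; the script is just illustrative):

```python
# GAR2022 figures as quoted above: roughly 350-500 medium- to
# large-scale disasters per year over the past two decades,
# projected to reach approximately 560 per year by 2030.
projected_per_year = 560

# 560 disasters a year works out to about 1.5 a day, which is
# where the UN's headline figure comes from.
per_day = projected_per_year / 365
print(f"{per_day:.2f} disasters per day")
```

Running this prints roughly 1.53, which the report rounds to the "one and a half disasters a day" figure discussed throughout the episode.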



Yeah. I just want to mention that invincibility piece: the thumbnail for tonight's episode shows a guy flexing and says, don't do this. Because flexing. I don't know. Anyway, I feel like I explained that. But yes, I think you're right. And the thing that I kind of want to talk about tonight, in light of this dark article, is really taking a look at risk perception, what it means not only from the individual perspective, but really thinking about the societal perspective on risk perception too. And then I think maybe later we can talk a little bit about some of the sort of self-destructive behaviors that we're all engaging in, and kind of how that works, again, individual versus society. We kind of know what they are, but thinking about them from this perspective, we really get under the hood of the psychology, and then we'll kind of bring it back to the article. Let's start with risk perception. Do you want to kind of give everybody an overview of what risk perception is? Yeah. So risk perception, when we talk about it in this context, is really based on the observation that experts and lay people often disagree about how risky various technologies and natural hazards are. A key paper was written in 1969 by Chauncey Starr, and Starr used a revealed preference approach to find out what risks are considered acceptable by society. He assumed that society had reached an equilibrium in its judgment of risks, so whatever risk levels actually existed in society were evidently acceptable. His major finding was that people will accept risks 1,000 times greater if they are voluntary (I drive the car, that type of thing) than if they are involuntary (a nuclear disaster). So that is the basic element of it. Do you want to talk about the psychological approach to this? Yeah. I want to talk briefly on that point first, though: accepting risks 1,000 times greater if they are voluntary. Take engaging in non-prosocial behavior, so not recycling. 
Is that a voluntary action, versus the involuntary thing of being subject to a climate disaster or somebody else's actions? Right. Contributing to a climate disaster. I just want to comment on that because you have kind of two examples there. But this situation of us collectively contributing to something that is impacting all of us: how does our risk perception fall into that category? That's what I keep coming back to here. So let's talk about the psychological approach. When we talk about the psychological approach to risk perception, we're really trying to understand how people process this information about risk. Right. We talked about heuristics not last week, but two weeks ago, when we talked about consumer products and reviews. Go listen to that episode if you want a refresher on some of these heuristics, because we talked about them there. We'll talk a little bit about them in the context of risk perception a little later, but that's where you want to listen for the detail. Obviously, these heuristics, by which you sort and simplify this information, lead to biases in your comprehension of the risk. And there are a lot of different factors responsible for influencing these perceptions of risk, including feelings like dread, novelty, stigma, and others. And so if you really break down the risk perception factors, there are cognitive factors, affective factors, contextual factors, and individual factors as well. We'll kind of go through some of them, but really they're all just attributes of the risk. Right. So the gravity of events, or how much media coverage something is getting: those might be cognitive factors. You might have emotions and feelings that play into affective factors, or the framing of information or the availability of information from other sources, which might be a contextual thing you want to look at, and then individual factors. 
So personality traits, age, gender, those typical ones. When thinking about the psychological approaches, there's research that suggests that risk perceptions are influenced by the emotional state of the perceiver. I already mentioned some of those affective factors. And overall, risk and benefit tend to be positively correlated across activities in the world, but negatively correlated in people's minds and judgments. And so when you think about the benefit of engaging in something risky, a hazardous activity in the world, i.e., not recycling, the two actually go together in the world, but in people's minds and judgments the perceived benefit pushes the perceived risk down. Do you want to talk about heuristics and biases a little bit, because I mentioned them? Yeah. So to hear this in detail, go back, what did we say, two episodes, where we go over different heuristics in a whole lot of detail, but just to cover them briefly. For those of you unfamiliar with heuristics and biases, the early research was done by psychologists who performed a series of gambling experiments to see how people evaluate probabilities. And the point of this was that they found a number of people used heuristics to evaluate information. So what heuristics are: useful shortcuts for thinking, but you might get inaccurate or biased judgments in some situations, in which case they become cognitive biases. So a few that we want to pick up; we won't go over them in detail, like we said, we've done that before. So, some of your biases. You have representativeness, which is usually employed when people have to judge the probability that an object or an event will come to pass. And really, one of the things around this is that people become insensitive to predictability. So when we're talking about risk, their insensitivity to predictability means that their engagement with risk will be different. 
We also have this idea of anchoring and adjustment. So people often start with one piece of known information and then adjust it to create an estimate of an unknown risk. But that adjustment might not be big enough, so you might not give it sufficient adjustment, and the biases in your evaluation of conjunctive and disjunctive events will lead you to make a wrong assessment. And then the anchoring in the assessment of subjective probability distributions means that, in terms of risk, you'll place it in the wrong area. So when we're using heuristics and bringing them into the assessment of risk, we've got to be almost doubly aware of the methods we are using to make that assessment and how fallible they can be. So with that heuristics piece over, do you want to take us into the cognitive psychology element of it? Yeah, I want to talk about a couple of other aspects here with heuristics first. Right. So one thing that we didn't talk about a couple of weeks ago was this asymmetry between gains and losses. And really this is kind of a bias type of thing rather than a heuristic. But you're really looking at these gambling behaviors. Right. And you can kind of think about the climate as one of these gambles, where you're engaging in something that's really going to affect you in serious ways. And so people are risk averse with respect to gains. They prefer a sure thing over a gamble with a higher expected utility. And when we talk about that, the gamble is perhaps something that's more valuable to them. Right. They want the sure thing over the thing that could potentially have a higher value, because with that higher-value thing, you also have the chance of getting nothing. So it's the all or nothing, but I'd rather take the sure small thing than the chance at the big thing. 
And so when you think about that in terms of our self-destructive tendencies in the environment, we really do want an all-in-one solution. It's not going to happen that way. The end goal here is to get back to the baseline level of disasters per year, but really, when we talk about it, the goal is to get anything less than 1.5 disasters a day. So if we get to 1.4, we've still done something to improve the situation. The risk is that we don't get back to the baseline, but we've still got to try. And I just find that fascinating. Then there's the flip side of it, where people are risk seeking about losses: they prefer the chance of losing nothing over taking a sure but smaller loss. So this is something like insurance. Right. And when you think about the loss of environment, life, things like that, when you're risk seeking, you're hoping for the chance of losing nothing. That means you hope that everything will go the way you hope it will go, that everything will kind of stay the way it is. It's not going to happen that way, because we're on a trajectory of self-destruction. And so we're hoping, and because of that, we're losing. There are also threshold effects that I want to talk about really quickly in the biases. People prefer to move from uncertainty to certainty over making a similar gain in probability that doesn't lead to full certainty. So we're trying to slowly understand how we can better this process of saving the environment. Right. An example: people would rather choose a vaccine, you brought up vaccines earlier, that reduces the incidence of disease A from 10% to zero over one that reduces disease B from 20% to 10%. And that's kind of what I'm talking about there. When we look at the environment, when we look at the effectiveness of the way we implement things, is it not worth trying the thing that's going to reduce the number of disasters from 1.5 to 1.4, versus the thing that's going to do 1.5 to 1.0? 
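The gains-and-losses asymmetry described above can be sketched numerically. This is a minimal illustration, not anything from the episode itself; the dollar amounts and probabilities are made up for the sake of the example:

```python
def expected_value(lottery):
    """Expected value of a lottery given as (probability, payoff) pairs."""
    return sum(p * x for p, x in lottery)

# Risk aversion for gains: a sure $50 is typically preferred to a
# 50/50 shot at $110, even though the gamble has the higher
# expected value.
sure_gain = [(1.0, 50)]
gamble_gain = [(0.5, 110), (0.5, 0)]
print(expected_value(sure_gain), expected_value(gamble_gain))  # 50.0 55.0

# Risk seeking for losses: a 50/50 shot at losing $110 is typically
# preferred to a sure loss of $50, even though the gamble has the
# worse expected value. This is the insurance case discussed above,
# turned around: we gamble on losing nothing.
sure_loss = [(1.0, -50)]
gamble_loss = [(0.5, -110), (0.5, 0)]
print(expected_value(sure_loss), expected_value(gamble_loss))  # -50.0 -55.0
```

In both cases the typical human choice is the one with the worse expected value, which is the bias being described: sure small gains over risky big ones, risky big losses over sure small ones.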
I think that's what we're looking at right there. Right. So, thinking about those threshold effects, I'll let you talk about cognitive psychology, because I kind of took the second half of heuristics and biases there. Well, it would have helped if I hadn't actually just forgotten to talk about it. In terms of cognitive psychology, the majority of the public express greater concern for problems which appear to have an immediate effect on everyday life, such as hazardous waste or pesticides, than for longer-term problems that may affect future generations, such as climate change and population growth. Exactly what we've been talking about with climate ergonomics, and why we want to make an impact with it in that way. So people rely greatly on the scientific community to assess the threat of environmental problems, because they usually do not directly experience the effects of phenomena such as climate change. Another good example of this is air pollution, where you struggle to see the air pollution itself. You rely on experts to tell you what the air pollution levels are, and it's only when you get there that you go, oh, so that is bad, because generally, unless it's smog, you don't really see air pollution. So you're relying on somebody else to tell you about it. The exposure most people have had to climate change has been impersonal. Most people have only had a virtual experience, through documentaries and news media. It can seem like it's happening in some remote area of the world, so it doesn't naturally apply to them. However, this cultivates the population's wait-and-see attitude. People do not understand the importance of changing environmentally destructive behaviors, even when experts provide detailed and clear risks caused by climate change. Which is absolutely right, and that just follows all the research we've just talked through. 
There is another bit on top of that which I will throw in, in that people need to have a path, people need a guide on what they can do, because if the target is seen as too big and there's no direct line from their action to solving the target, then there's a blocker there as well. So you do your day-to-day recycling, and the global temperature being kept within 2%, sorry, within two degrees of where we're at: there's no direct link, no visible causation in there. Psychometric paradigm. Psychometric paradigm. Yeah. I mean, that last point that you made is just, how do we fix this? Right? You did kind of break it down a little bit. We need to make it into manageable chunks and directly link it. Right? So we're already getting to some of the solutions, and I like that. So let's talk about the psychometric paradigm. This really focuses on the roles of affect, emotion, and stigma in influencing risk perception. So again, thinking about how we think, feel, and perceive climate change in this example. People generally see most risks in society as being unacceptably high, i.e., the climate is changing, it is getting too out of control, we're seeing too many disasters, this is unacceptable. All things being equal, the greater the benefit people perceive, the greater their tolerance for a risk. So let's say there's, I don't know, some new technology that comes around and could potentially harm people because it's, I don't know, taking carbon emissions out of the air. I'm trying to think of a scenario here. But basically, it's the perceived benefit weighed against the risk: you might accept a riskier technology that could take us down to 1.0 instead of a less risky one that'll only take us from 1.5 to 1.4. Go ahead. A good example of this at the moment, actually, is electric vehicles. 
So people use electric vehicles, or are using electric vehicles, because they perceive that they are better for the environment. But one of the locked-in issues we've got at the moment is the battery technology and the lithium batteries. How are we going to recycle those lithium batteries? We don't know yet. All we know is that we're going to store them, and at some point we expect to develop the technology to recycle out that lithium, so it's not that bad, and we'll get different battery technologies. But right now, we are happy to take that risk, because we derive pleasure from using our electric vehicles. We want to use electric vehicles because we've been told that helps the environment, and that's good, and therefore we feel we get all our self-fulfillment, all that good behavior stuff. But we are putting aside a risk that we are willing to take because we think it's a problem that will be solved down the line. We're basically biased toward that, because experts have told us it will be solved. Yeah. Well, so thinking about risk perception in the psychometric paradigm specifically, it's highly dependent on intuition, experiential thinking, and emotions. Again, we're looking at the way we think, feel, and perceive things. And when you look at this broad domain of characteristics, there are three higher-order categories in which we think about risk: the degree to which a risk is understood, the degree to which it evokes a feeling of dread, and the number of people exposed to the risk. So thinking about climate change, how well do we understand that problem? How bad do we feel about it? I know my wife and I have had multiple conversations about whether we bring another child into this world, knowing what kind of world we're bringing them into. That's real dread. And then the number of people exposed to the risk? 
Well, it's everybody in the world, and so that's really high. This is dread risk. When we talk about dread risks, these are things that elicit visceral feelings of terror: uncontrollable, catastrophic, inequitable, those types of emotions. I certainly think that climate change fits that bill for a lot of people. And the more a person dreads an activity, the higher its perceived risk, and the more that person wants that risk reduced. And so that is one glimmer of hope in all this: the more that we collectively as a society feel that this climate disaster, this climate change, is in fact something we dread, a feeling of terror, an uncontrollable catastrophe, the more I think we will start to see that we need to do something. But when is the point where it gets too late? Right. It's a glimmer of hope, but is it too late? I don't think so. We'll have to live with some of these effects, but can we reduce them over time? Maybe. I don't know. Let's talk about culture. Yeah. So cultural theory in terms of risk is really interesting, because in cultural theory there are four ways of life. Think of them in a group/grid arrangement: each way of life corresponds to a specific social structure and a particular outlook on risk. The grid categorizes the degree to which people are constrained and circumscribed in their social role; the tighter the binding of those social constraints, the more limited the individual negotiation. The group refers to the extent to which individuals are bounded by feelings of belonging and solidarity; the greater the bonds, the less individual choices are subject to personal control. These four ways of life are hierarchical, individualist, egalitarian, and fatalist. 
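The group/grid arrangement described above can be written out as a small lookup table. This is a sketch of the standard mapping in the cultural theory literature, not something spelled out in the episode:

```python
# "Grid" = how constrained and circumscribed your social role is.
# "Group" = how bound you are by feelings of belonging and solidarity.
# Each (grid, group) cell corresponds to one of the four ways of life.
WAYS_OF_LIFE = {
    ("high", "high"): "hierarchical",   # strong role constraints, strong bonds
    ("high", "low"):  "fatalist",       # constrained role, weak bonds
    ("low",  "high"): "egalitarian",    # loose roles, strong bonds
    ("low",  "low"):  "individualist",  # loose roles, weak bonds
}

grid, group = "low", "low"
print(WAYS_OF_LIFE[(grid, group)])  # individualist
```

Each cell carries a characteristic outlook on risk, which is what makes the grid useful for the project-management risk-appetite assessments mentioned next.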
So if you imagine that as a grid, depending on where you're at within it, you're bound by the community, you're bound by the social constructs, to almost go along with things. Or, depending on how you fit within that, you might find yourself, even though you might push against what's going on, going along with them for the sake of it. Some of this comes up really quite strongly in project management, when you do project management risk assessment, because there is a different adaptation of this cultural theory grid which you use to assess your risk appetite, and we use that. And I find it really interesting that culturally, we then normalize the risk. We talk about the risk as an objective thing, because then we monetize it. For me, that's quite interesting, and it's one particular culture that we play with it in. There has also been a national culture and risk survey. So the first national culture and risk survey of cultural cognition found that a person's worldview on the two social and cultural dimensions, hierarchy versus egalitarianism and individualism versus solidarism (it's been a long day), was predictive of their response to risk. So that's really quite interesting, the fact that there is survey work out there that backs up this thinking. Do you want to talk about the social amplification of risk framework? Yeah. So this is a framework on the social amplification side of things, thinking about society as a culture. Right. They abbreviate it SARF, S-A-R-F. It combines research from a bunch of different domains: psychology, sociology, anthropology, communications. And really, it outlines how communications of risk events pass from a sender, through intermediate stations, to a receiver, and in the process serve to amplify or attenuate the perception of risk. In layman's terms, how do we communicate about risky things so that we engage less in risky behavior? 
So when we think about the environment, how are we, or any other media outlet, communicating through these channels to people, getting the message across that your behaviors have these consequences? Right. So really, you're thinking about all the links in a communication chain: individuals, groups, media. They contain filters through which that information is sorted and understood. And so this framework attempts to explain the process by which risks are amplified after receiving public attention, or attenuated after receiving less public attention. So all that being said, I think ultimately the main thesis here is that risk events interact with individual, psychological, societal, and other cultural factors in ways that increase or decrease public perceptions of risk. There are ripple effects, obviously, when you think about the amplification of these messages, and these include enduring mental perceptions. So the way that we think about climate change is slowly changing over time, and we need to match that. And basically you can think about some of these in terms of core human factors work. Right. Training, education, all that stuff. Long story short, traditional risk analysis might neglect the ripple effects that these communication impacts have, and so really underestimate the adverse effects from certain risk events. So you have this public distortion of risk signals, and the framework provides a sort of corrective mechanism, so we can look at these problems, these risky behaviors, in a better light. We've spent a lot of time on risk perception. I do want to make sure that we talk a little bit about self-destructive behavior. I think maybe we all kind of know what self-destructive behavior is, and in this case, we're specifically talking about contributing to an environmental disaster. I'm going to skip over a lot of this. 
We have come prepared with notes, but we are running late on time. So I think the main point we want to make about self-destructive behavior is that you can diagnose it on the individual scale, but it's really difficult to diagnose it on a global scale, because some people are doing everything they can to contribute to a world in which they want to see this 1.5 number come down, and other people just don't, either because they lack awareness or because they don't care. And so it's hard to diagnose it from a societal standpoint. There are obviously ways to test it at an individual level, but it doesn't scale. Right. Correct. The outlook for these self-destructive behaviors is obviously risky: self-destructive behaviors can increase the risk of poor mental health. But you want to give us this little glimmer of hope here with self-destructive behavior? I think you can fully recover from self-destructive behavior. We know this on an individual basis, and there is no reason at all why we shouldn't think that scales up. So you can fully recover from it. How long it takes depends on a number of factors, including the frequency and severity of symptoms, whether you've had other conditions such as depression or PTSD, your specific self-destructive behavior, and whether it's linked to such things as alcohol or an eating disorder. And your outlook depends on your individual circumstances. So we know that therapy and medication can be effective in treating a variety of mental health disorders, and your doctor will be able to give you an overview of what you can expect. But when we talk about it in terms of society, then actually we're talking about a societal outlook based on individual circumstances, because it is the sum of the parts. So there is clearly a lot of work here, but this does give us that bit of hope. 
I mean, fundamentally, self-destructive behavior is when you repeatedly do things that are going to harm you physically, mentally, or both. And when we talk about this in terms of climate, then we are talking about the things that we can do, or neglect to do, such as recycling, such as planning, such as that type of thing. So if you think you're engaging in this self-destructive behavior in terms of disasters, well, you probably are. The fact that you think you're doing it means you're not that far away from it. But you don't have to be like that. You can do things differently. So if you find new coping skills and practice them, particularly practicing alternative behaviors, we can hopefully live a less self-destructive life. And that could be one of the key things to take us forward on things like climate change. Yeah, right. We can't medicate everybody, we can't give everybody therapy, but we can certainly think about practicing some of these coping skills and alternative behaviors to, I'd say antisocial, but anti-environmental practices. So let's get back to the article, because there are some really good points that they make in here. And I guess, Barry, do you have any key takeaways that you want to make sure we talk about before we get out of here? I know we're short on time, but I do want to take the time here to revisit the thing that this echoed for me, and it's about how we project risk and how we communicate risk to the public. So we talked at the top about people's reactions to the pandemic, and how we thought that scientists would come up with a solution, and that, as always, it's somebody else's problem. And it isn't the first time that's happened. The first time I think it happened in a big way was around the Millennium Bug, when for years people were talking about the Millennium Bug and what was going to happen. And then basically engineering managed to solve it, so actually the repercussions were very small. 
And so then people suddenly started having that talk of, well, everyone said the Millennium was going to be a disaster and it didn't happen, did it? We expect disasters to happen, and when they don't, then we tend to mock them. Whereas actually the Millennium Bug could have been a really big disaster. Lots of people recognized the risk. Lots of people did some really good, solid chunks of work to mitigate that risk, because it wasn't solved, it was mitigated against. And I wonder whether that was the start of modern day risk apathy, because that's kind of what this is talking about. We hear the risk, we watch the news, we all get depressed about it, but then we become quite apathetic about it. So, yeah, that's something I wanted to get across before we ran out of time on this. Well, that's quite depressing. I'm going to leave us on a slightly more optimistic note. Rather than apathetic attitudes towards climate disaster, think about this: disasters can be prevented. And this comes with a slight warning. Obviously we need, as not only countries but as a world, to invest the time and resources to understand and reduce the risks that we're engaging in. And ultimately, when we deliberately ignore risk and fail to integrate it into the decision making that we make daily, the world is effectively bankrolling its own destruction. That is a quote by the head of the UNDRR. Critical sectors, from government to development and financial services, must urgently rethink how they perceive and address disaster risk. So this is putting the onus on these higher level functioning bodies of government, financial services, development, those types of things. But from an individual perspective, the good news here is, again, we're going to restate this because it is kind of that glimmer of hope: human decisions are the largest contributors to disaster risk.
And so we have the power, if we understand humans, which psychologists and human factors professionals do, to substantially reduce the threats posed to humanity, and especially the most vulnerable among us. So that's kind of where we want to end. Positive note: we can fix this, but we've got to work at it. So let's do it. Thank you to our patrons and everyone on social media this week for selecting our news topic. And thank you to our friends over at the United Nations Office for Disaster Risk Reduction for our news story this week. If you want to follow along, we do post the links to the original articles on our weekly roundups and our blog. You can also join us on our Discord for more discussion of these stories. We're going to take a quick break. We'll be back to see what's going on in the human factors community right after this. Human Factors Cast brings you the best in human factors news, interviews, conference coverage, and overall fun conversations into each and every episode we produce. But we can't do it without you. The Human Factors Cast Network is 100% listener supported. All the funds that go into running the show come from our listeners. Our patrons are our priority, and we want to ensure we're giving back to you for supporting us. Pledges start at just $1 per month and include rewards like access to our weekly Q&A with the hosts, personalized professional reviews, and Human Factors Minute, a Patreon-only weekly podcast where the hosts break down unique, obscure, and interesting human factors topics in just one minute. Patreon rewards are always evolving, so stop by our Patreon to see what support level may be right for you. Thank you. And remember, it depends. Yeah, huge thank you, as always, to our patrons. We especially want to thank our honorary Human Factors Cast staff patron, Michelle Tripp. Hey, did you know, as we're talking about ways to support the show, that we have a merch store? We have some really awesome designs over there.
We're always updating them. We've got pins and shirts, if you want those. We've got the show logo, and there's other cool designs based in human factors culture. So if you want to support the show, look good doing it. I won't spend too much time on this. You can go support the store; you can find it on our website. Anyway, it's time we get into this next part of the show we like to call.



Yes, this part of the show is called It Came From. This is where we find topics that the community is talking about all over the Internet. If you are listening or watching, wherever you're at, give us a like to help other people find this content. The first one we're going to be talking about tonight is actually from our Discord. This is "Am I being lowballed?" This is by V in our Discord. I'm taking out names; you can see the details in our Discord if you join us. "I'm being offered a Human Factors and VR development internship for a government contractor on a government project, but the pay is only $15 to $20 an hour at 20 hours a week. As far as market value, I'm a first year grad student in a terminal Master's. I have three years of undergraduate research, with one being a self-led study on relevant themes, another year of research here out of the gate, and two internships relating to user research and observational methods. Not a lot in VR development, but a bit in VR research. Seems like a lowball, but I'm not a professional by any means. Am I being lowballed? Should I just accept it for the experience and the name?" We're not going to say the name here on the show. Again, you can go check that out in our Discord for the full conversation. Barry, what do you say to this? Is V getting lowballed? In short, yes, which is the official answer. However, there's an element here about looking at the entire package, because you've got to work out what the internship is there to do. Is it there to pay the bills, which it's obviously going to do some of? But as an intern, you're also looking at the value of the experience that you're getting. So, the contract that you're going to do, or the company that you're going to go and work with: would having them on your CV give you sufficient experience?
Just having a bit of pizzazz or a bit of wow on your CV: what is the value of that to what you're doing? So I do believe that any sort of internship should pay the bills, that type of thing. If you're going to be offered that, you should be offered a decent wage. But it's a very personal decision as to, dare I say, it depends: what do you think you're going to get out of it? Is it a long period, a short period, whatever? So the official answer: yes, I think you're being lowballed. But is it worth it to you? I jokingly said take out a bunch of student loans to pay for this because they're about to be forgiven. Maybe. And so that's probably the best advice. No, not really. Don't do that. That's terrible advice. We are not independent financial advisors. All loans are at your own risk. Look, this is a really interesting question, because yes, to me it is a lowball. I mean, when I did my internship years ago, this is less than I got paid then. And you're right, there are some things to consider that are outside of the money. But I do find it really icky when you think about companies saying you're getting paid in experience. I hate that. And it's the reason why in the lab I put no hard requirements over it, because you're volunteering your time. I don't want you to feel like you're getting paid in experience. I want you to get out what you put in. All that being said, what value do you put on the things that are not money related? Are you going to starve? First off, that's a pretty important question to answer. The other questions to answer are: who would you be working with? Would they be valuable connections in the future? How much is that worth to you? I think a tremendous amount of consideration should go into that. The name: again, if it is a reputable name, then that is also something to consider. Looks good on your resume, fine. Again, paying for the experience and name, though, shouldn't really be the deciding factor. I don't know. I feel a little icky about that.
The last piece of the pie here is the topic that you're working on. Is this closely aligned to your goals long term? If yes, then I think it could be a good career step. I'm actually talking to this person tomorrow, so hopefully some of these answers help; it sounds like they're going to accept it. But anyway, I do want to mention that this question actually prompted Blake, you remember that guy, to enter the chat. I've never actually met him, so I have no idea who you're talking about. Yes, that guy, he actually entered the chat. He came out from the shadows to answer this question. There's a lot of good advice that he gave. Again, I'm going to plug our Discord; go check out the full conversation there. We also just opened up a new channel for career advice. So if you do have any similar questions to this, I think that's a great place to ask them. And again, based on the way that I'm handling this, we're going to say maybe what the topic is, but we'll leave out certain details so you don't feel so called out in a public manner. Again, our Discord is quite intimate, so I don't think that's a problem. All right, the next one here is from the UX research subreddit. This is by mystery to me 120: "Not enjoying my new job. I'm new to UX, transitioned from academia." They said they made a career change: they were in a PhD program, enrolled in a boot camp, and landed an OK-paying job in UX, which is supposed to be about 80% research, 20% design and educating employees on best UX practices, like documentation. The issue is that their product really sucks. "It's the most annoying product I've ever used in my life, and I haven't managed to understand 10% of it." They got training on it, but the trainer felt disrespectful, publicly shamed students, and never explained any context or provided any presentation while teaching. On top of this, they're feeling like only 20% of the job is research, not 80% as advertised.
"Since it's my first job, I feel like I have to stick it out, but I'm just looking for reassurance that this happens sometimes, and any suggestions on how to find a better role next time. Thanks." Barry, how do you deal with this when you've just transitioned into the field and you're getting met with all this stuff? First off, do you think their complaint is valid? And second, how do you approach this? So firstly, I think it is possible that they perhaps didn't do as much research into the job as they could have. If this is the company's only product and it sucks and you didn't pick that up, then maybe you should have done slightly more research into it. But they might have a range of products, so let's just go with that assumption. Sometimes the hardest thing you can do is admit you might have made a mistake, for whatever reason. But the sooner you can make that admission, the better it is, or at least come to be at peace with it. Some people might argue that quitting a job within a year is a bad thing, because if you do that too many times, you have a really spotty CV, and that's unhelpful when you're looking at continuity. And there is that sort of advice that you should try to stick in a role for at least a year to two years consistently. However, if you are in a role that you just don't like, that isn't bringing you joy going to work every day, then quite frankly, it's a tough decision to make, but I would pull the rip cord. I would get out of it personally, and the sooner you can, the better. So it is up to you in terms of how you do it. But I've been in two situations: one where I had somebody working for me in a different role who was going to be taking over from me. And then they realized the breadth and depth of work that they would be doing, and they made what I thought was a really brave decision to say, you know what?
I think I've bitten off more than I can chew, or they didn't necessarily tell me about the full breadth of what this role was going to be. I don't think this is where I want to be. And actually they left before I did, and I thought that was really brave, but the right move for them. And then I've seen other people do similar things. I did it myself. I went in for a short project where they wanted to keep it going for longer, but I was like, actually, no, I've done what I can. I'm stopping. And so whilst there was money on the table and all that sort of stuff, I felt that the way it was going wasn't where I wanted it to go. And so we completely ended it; basically, we didn't renew that contract because it just didn't feel right. And actually, that was about ten years ago, and I've since been proven right, which is quite nice. But yeah, this is one of these things. My gut feel, for the way that it's described, is I would get out of there. What do you think, Nick? I don't know. I find that a lot of the stuff that this person is describing is part of the enjoyment I get. If a product sucks, then it's your job to fix it. It's your job to understand why. To me, it's a challenge. Again, I don't know where the expectation differences occurred, and this kind of highlights the importance of really understanding what you're getting into before accepting a position. So I don't know. I think the best question to ask is: if this position didn't exist before, what sort of things are they anticipating? If this position didn't exist and you are filling it for the first time, what are the expectations of everyone around you? If they cited research, then that's what they had in mind for you. And I'm just kind of wondering why you're at 20% right now. Is that because there's some sort of blockage within the company that you need to break through to get to that? I don't know. I feel like I don't have all the pieces of the puzzle here.
Just try your best to ask questions, understand what you're getting into, and communicate effectively when you get into the role. Coming from academia, you always have that backup: if you don't like it, go back to what you know. That's kind of the nice thing about switching fields, you can always go back to that other thing, right? I don't know. It's not great advice, but it's there. Let's get into this last one here: "Feeling like I've hit a plateau. I'm seven months in." Sorry, this is by Deftones5554 on the UX research subreddit. "I'm seven months into my first job as a UX/UI designer at a marketing agency. I feel like I'm hitting a plateau. I'm the only UX person here on a design team of six. Before I got here, they didn't care much about research, but I tried to change that. I'm the only researcher here. I'm trying to beef up our research process, but it seems impossible by myself." They go on to talk a little bit about the details of what they do. They say the process they use feels like it's lacking a lot, but it's mostly just user interviews and testing. "I've thought a lot about using other tools, but we just don't have the space in our timelines. Am I biting off more than I can chew with this job? Is this the best place for me so early on in my career? What I'm scared of, though, is that all this weird, anecdotal work I've been working on super hard won't be displayable on my portfolio. I'm very proud of most of it, and I feel big companies will only want to hire people who started their career in a more traditional position at a company that trained them." They're proud of themselves for the research process. Anyway, feedback? Barry, what's going on? Hitting a plateau. Yeah, I think this happens to everybody at some point or another. I mean, I've been there, I've done that, and certainly in my role at the moment, in terms of running my own company, I hit that quite often. It's how you're going to stretch yourself.
Are you going to push yourself? I think there's a certain element here that you haven't been there that long in the grand scheme of things, but it is hard. We've said this quite a lot of times before: if you're the one person doing your role, it can feel like quite a lonely place at times. So if you think there is more stuff that you can do, different tools and things like that, bring it up. Talk to the rest of the team about it, talk to your manager about what your ambitions are, and ask whether the role can be developed in a way that allows you to express yourself in the way that you want to. You said that you're worried about how you're going to display it on your portfolio. Quite frankly, for me, that would be a secondary consideration at this point, because the first thing you want to do is good work. Your portfolio is kind of important, but it's not the be-all and end-all, because a lot of things you're not going to be able to talk about in the grand scheme of things anyway. So I wouldn't be putting quite as much emphasis on that. This one's more difficult than the previous question, I think, because they clearly want to be able to do some stuff. They have some ambitions; they clearly have some ideas about how they can do the job differently and do the job better. But it just feels like either they don't have a receptive team there, or they're nervous about putting up new ideas. And my view would be: just put yourself out there, and if you want to change things, recommend it. The worst thing that can happen is they say no, but I bet you might realize some of your ambitions. I have a lot of hope for you around that. Nick, what do you think? Have you got something a bit more concise, a bit better advice than what I was just giving?



Yeah. I also agree that this is too soon to feel a plateau. To me, this feels like maybe you didn't understand the assignment. UX and research are always about advocating for their importance, and if you're the only person there doing research, then you've got to do your job and advocate for it. I don't know. I feel like setting up a research process is some of the most fun. I mean, I'm a different kind of person, but setting up a research process is the most fun thing that I've ever had experience with. I've done it at multiple companies, and I think really the biggest challenge can be to get people to use that process. That is a challenge, but it's also a fun thing for me to do. I really enjoy that aspect of my work, and I wonder, if you're not enjoying that aspect of it, is this the right field for you? I don't know. Maybe another company can help; if they already have an established process, that might be a better fit. I think to me that is the most fun part, and if you're having a problem with it, then maybe find a different area within UX. I don't want to tell you to quit your job and go look elsewhere, but to me, that is the most exciting part of it. And I can't really give great advice other than find what you love and kind of stick with it. Anyway, it's not great advice. I'm on a roll with not-great advice tonight, but that's it. All right, this is the time we get into this thing we just call One More Thing. It needs no introduction. Barry, what's your one more thing this week? So this week it's just me talking about my ultimate job interview that I've just finished, which is standing for local government office: knocking on hundreds of doors and talking to people and just trying to sell yourself to every single one of them, trying to be your authentic self, but also having to tailor it slightly for what their individual wants and needs are.
And you've got about, I don't know, 30 seconds to a minute to make an impression, hundreds of times over, rather than just a normal sort of interview where you've got maybe a panel of three, maybe a panel of five, and you're doing it all day. And I've literally spent the past 48 hours highly focused on this, on top of the work that I've done over the past few months. So, yeah, it feels like the end of a really weird period of time. The polls closed about an hour and a half ago, but we don't find out until tomorrow. So I get the results of the job interview tomorrow, and what the public at large have thought about me. Yeah, it's just really weird. It's very public as well, because everybody will know the outcome. Do your cheeks hurt? My cheeks hurt. My jaw hurts. My feet hurt. My knees hurt. There's hills around here; I've never had hills like it. But what a thoroughly enjoyable thing to do, the ability to go and talk to people in this way. It's like the weirdest, biggest piece of UX research you'll ever do. It's mad, but yeah. No, it's been great. Yeah. My one more thing this week: last week I talked about getting my 3D printer out, starting projects, and going to a Star Wars convention next month. This month. It's this month. Wow. My son is going to dress up, and I've been working on his helmet. Those who are watching can see the progress here. Isn't that cool? Yeah. So it's got some paint on it. It's still a work in progress, obviously, but something I'm very proud of. It looks very good. Anyway, that looks great. Yeah, I'm still right in the middle of it. But anyway, that's it for today, everyone. If you liked this episode and enjoyed some of the discussion about our doomed planet, I'll encourage you to go listen to, you know what, I've got a short homework assignment for you this week. It's "Climate Ergonomics." It's a Human Factors Minute episode. So go listen to that.
It's only a minute. Comment wherever you're listening with what you think of the story this week. For more in-depth discussion, you can join us on our Discord community. Like I said, we've got that new career advice section in there for you to throw questions at us. You can always visit our official website and sign up for our newsletter to stay up to date with all the latest human factors news. If you like what you hear and you want to support the show, we do have a merch store, so go and do that, or you can leave us a five star review; that's free for you to do. Tell your friends about us, or consider supporting us on Patreon. As always, links to all of our socials and our website are in the description of this episode. Mr. Barry Kirby, thank you so much for being on the show today, despite knocking on doors all day. Where can our listeners go and find you if they want to break out of this self-destructive cycle? Well, any time now they can see me in bed for the next two days. You can find me on Twitter, and come and listen to some interviews that I haven't done yet but will be up soon on 1202 - The Human Factors Podcast. As for me, I've been your host, Nick Roome. You can find me on our Discord and across social media at nickrome. Thanks again for tuning in to Human Factors Cast. Until next time... It Depends!

Barry KirbyProfile Photo

Barry Kirby

Managing Director

A human factors practitioner, based in Wales, UK. MD of K Sharp, Fellow of the CIEHF and a bit of a gadget geek.