Episode Link: https://www.humanfactorscast.media/242
Recorded live on April 21st, 2022, hosted by Nick Roome & Barry Kirby.
Check out Barry’s Interview with Nick Rutter about fire alarms on our sister podcast - 1202 The Human Factors Podcast:
It Came From:
Thank you to our Human Factors Cast Honorary Staff Patreons:
Human Factors Cast Socials:
Disclaimer: Human Factors Cast may earn an affiliate commission when you buy through the links here.
Let us know what you want to hear about next week by voting in our latest "Choose the News" poll!
Welcome to Human Factors Cast, your weekly podcast for Human Factors, psychology, and design.
Hello, everyone. Welcome back to another episode of Human Factors Cast. We're recording this live on April 21st, episode 242. I'm your host, Nick Roome. I'm joined today by Mr. Barry Kirby. And a great good evening to you, sir. Great good evening to you, sir, as well. We've got a great show for you tonight. We're going to be talking about how top reviews, not average ratings, sway consumer decision making. Later, we're going to answer some questions from the community about representation in UX research, transitioning into the field and getting a mentor, and how to apply for jobs. But first, Barry, what's going on over at 1202? So at 1202 this week we are interviewing Chris Reed, who is the current President of the CIEHF. We're really looking forward to chatting to him, because maybe he can give me some pointers about how to do my presidency next year. But also on Sunday, we're going on tour. So for the first time, we're taking 1202 out on the road, and more precisely, we're going to Birmingham for the second phase of the CIEHF Ergonomics conference. Not entirely sure what's going to happen, if I'm brutally honest, but it's going to be a learning experience for all involved. Well, that sounds awesome. We know Chris. He's been on the show. Yeah, friend of the show. So I'm excited for that episode and excited to see what comes out of your trip. But we know why everyone else is here. They're here for the news. So let's get into it.
That's right. This is the part of the show all about Human Factors news. Barry, what is our news story this week? So this week, a study has found that top reviews, not average ratings, sway customer decision making. When you're shopping on your favorite online store, a listing generally shows things like a picture, a title, and a description, as well as the price and shipping, and it may also have customer feedback: a star rating and some sort of qualitative review, something typed in about what you thought of the product. Collective wisdom dictates that consumers gravitate towards the highest rated products. The difference between a four star average rating and a four and a half star average rating could potentially play a massive role when buyers are trying to decide to hit that add to cart button. But a recently accepted research paper from a USF Muma College of Business researcher shows that the half star chasm may not be all that important. It turns out the top reviews carry way more sway in a customer's final buying decision when they're comparing products. This research debunks a widely held notion that serious online consumers buy products with a high star rating. So Nick, is this a five star article for you, or do you want to throw some shade in your review? You know what, I think this is a four and a half star article for me, but I did read a review of this article that I thought really changed my mind on it. Look, this is an interesting one to me because we haven't really done much on the consumer purchasing side here on the podcast, and this one gets us a little out of our comfort zone. It's less human factors, more marketing, but there's a lot of overlap and some interesting psychology that happens behind decision making. So we thought it was a good kind of springboard. But really, when it comes down to it, it's communication.
How do we communicate with others about products, and how do we interpret other people's signals about those products? But Barry, what are your initial thoughts on the article? Yes, I think it's interesting because it's not only, at its most basic, about how we interpret things and make those decisions, but it's also about the assumptions about what our shopping habits are. You think you're taking in all the knowledge and information, but actually you might not be. For my own personal preference, I look at the stars first and see if they've got a high number of stars, and then I go and look at the top reviews. But then I also look at the bottom reviews, for the people who maybe had a bad experience, to sort of balance them off. And then people game reviews as well, don't they? Which I'm sure we'll come on to. But it's interesting, I think. Yes, you're right, it is out of our comfort zone, so we'll probably wander around the shop of it tonight, but there's definitely some bits that I think we need to pull in and get into the nitty-gritty of. Yeah, I mean, let's start by bringing the human factors perspective to it. Let's talk about decision making. There's a lot of psychology that goes into decision making, and we could talk about some of it: past experiences, cognitive biases, escalation of commitment and sunk outcomes, individual differences, and personal relevance. I think we could dig into each one of those and talk a little bit about some of the context for how these impact our decision making. Barry, do you want to start with past experiences here? Yeah, because past experiences can really impact future decision making. They go into influencing the decisions people make in the future: it stands to reason that when something positive results from a decision, people are more likely to decide in a similar fashion in the future.
On the other hand, people tend to avoid repeating past mistakes. So if you made a purchase and it's gone badly for you, you're not likely to go and make that same purchase again. In financial decision making, highly successful people do not make investment decisions based on past sunk outcomes, but rather by examining choices with no regard for past experience. This approach conflicts with what we just talked about, what we may expect. Nick, do you want to dive into cognitive biases? Yeah. So even with those past experiences, there are some things going on in your head that are biasing you and influencing your decision making here. Cognitive biases, just to remind everyone, are thinking patterns based on observations and generalizations that might actually not be true; they can amount to memory errors as they come up. What's happening is that you're making inaccurate judgments based on faulty logic. There are a couple of different ways in which cognitive biases manifest, but here are a few, to name some. There's belief bias, the overdependence on prior knowledge in arriving at decisions. There's hindsight bias, where people tend to readily explain an event as inevitable once it has happened. There's omission bias, where people have a propensity to omit information perceived as risky, and confirmation bias, in which people observe what they expect in observation. So those are a couple of biases you can take into decision making processes. It's kind of that self-fulfilling prophecy, outcome determinism thing: if you think something is going to happen and you make the decision to make that thing happen, then "I knew it was going to happen. It happened." That's what cognitive biases play into. They do influence poor decisions, but they can work well too. We'll talk about heuristics later.
And that's when cognitive biases work. So just take that with a grain of salt: we can make bad decisions, but we can also make good decisions based on them. Barry, talk a little bit about escalation of commitment and sunk outcomes. Yeah. So in addition to past experiences and cognitive biases, decision making may be influenced by an escalation of commitment and sunk outcomes, which are unrecoverable costs. Juliusson, Karlsson and Gärling in 2005 concluded that people make decisions based on an irrational escalation of commitment. That is, individuals invest large amounts of time, money and effort into a decision to which they feel committed. Further, people will tend to continue to make risky decisions when they feel responsible for the sunk costs: the time, money, and effort spent on the project. As a result, decision making may at times be influenced by how far in the hole the individual feels they are. Do you want to dive into individual differences? Yeah, let's get into it. So obviously with any person, there are going to be different things about that person. You have age, which is a big one. Life experience, let's not call it age, let's call it life experience: the older we get, the more experience we have with living, and we take different things into decision making with that experience. We also have things like socioeconomic status, especially when you're talking about purchasing a product, let's say on a massive retailer on the web. You might be more inclined to buy a less expensive option, or be more swayed to buy a more expensive option if, let's say, there are certain aspects of it that are qualities, like being durable, so you're not wasting your money. Anyway, that's what socioeconomic status brings into it. But basically what we're talking about here with decision making is that these differences between us really do influence decision making.
So we talked about age here, and I don't even know how to say this nicely, but our minds kind of decline as we age. With that, as you're trying to make, let's say, larger decisions, you might need some assistance. I'm trying to be really nice around this because I will be part of an aging population one day, too. So older people may be overconfident when you think about their ability to make decisions, which ultimately might inhibit their ability to apply some of these strategies. There's also some evidence to support the notion that older adults actually prefer fewer choices than younger adults. It's kind of that whole choice paralysis thing: if you give somebody too many choices, it might be too much for them to pick. There's also socioeconomic status, which correlates with education and resources, and might leave people more susceptible to experiencing negative life events. These are often beyond their control; society as a whole is kind of imposing these experiences on them. And so they might make poor decisions based on some of those past experiences, which is a self-fulfilling circle in a lot of ways. But everyone is different. You want to talk a little bit about personal relevance and what it means to individuals? Yeah. So personal relevance is the belief that what they decide actually matters. And if they believe what they decide actually matters, they're more likely to make a decision. When analyzing individual voting patterns, people vote more readily when they believe their opinion is indicative of the attitudes of the general population, as well as when they have regard for their own importance in the outcome. So people vote when they believe their vote counts.
It's pointed out that this voting phenomenon is quite ironic: the more people vote, the less each individual vote counts in the electoral math. Something that's very pertinent for me right now, but that is quite true. We see that all the time. There is a lot of apathy at the moment, generally around the world, where people don't want to vote because they don't feel that they can actually have an input; their vote doesn't count for anything, and therefore, what's the point? Yeah, all these things. So I mentioned heuristics before; I want to get back into that, and really that's when cognitive biases perform well. Heuristics are shortcuts. When you're looking at the decision making strategies people are using, especially in situations where you have very little information to operate on, heuristics can often be very correct. Like I said, these are mental shortcuts that reduce the cognitive load, so to speak, when you are trying to make these decisions. And it really works in a couple of different ways. They allow the user to scrutinize a few signals or alternative choices rather than everything: instead of weighing the pros and cons of every single thing about whatever the decision is, you're just operating off a few of them, and you have to make a call ultimately. Heuristics diminish the work of retrieving and storing information in the brain, and so that, again, reduces cognitive load. And lastly, streamlining all this stuff reduces the amount of information a human needs to integrate to make the correct choice or to pass judgment. So all of this together is heuristics. These are involved in decision making all the time, all over the place, and they range.
There are some very specific heuristics that you can get into, and there are some very broad ones, and they also serve various functions. Let's talk a little bit about heuristics. Barry, what's this first one here? This one was very relevant to our discussion today. Yes, very much so. The first one is the price heuristic. In essence, people judge higher priced items to have higher quality and lower priced items to have lower quality. It is specific to consumer patterns, while the outrage heuristic is where people consider how contemptible a crime is when deciding on a punishment. So basically, with the price heuristic, if it's higher priced, then you deem it to have higher quality, and therefore you're going to go for that. Yeah, we messed up the show notes here, to be transparent. So there are a lot of different heuristics. There's the price heuristic, there's this one that you mentioned, the outrage heuristic, and then there are other important heuristics like representativeness, availability, and anchoring and adjustment. I think these have been heard in Psych 101 for many folks, but for some folks who might be coming from the engineering side or the design side of the house, let's get into some of these. The first one up here is the representativeness heuristic. Oftentimes, a lot of these heuristics are based on the convenience and speed of being able to make a decision, and the representativeness heuristic is an economical one. When you have things that are recognizable, people tend to choose the recognizable thing because they are familiar with it; you're not putting in a whole lot of effort to understand what it is. It's really difficult to research and answer definitively whether an individual is using this representativeness heuristic alone, but we kind of think that it exists.
Or whether they're using it in conjunction with another piece of information that they're drawing a conclusion from. There's some mixed research on the representativeness heuristic. Basically, what it comes down to is recognition memory, the memory of the decision option, if you will, being recognizable, perceptible, reliable, and more accurate than chance alone. Those are attributes of the choice. What else about the representativeness heuristic? There's another conflicting piece of evidence here: people don't solely rely on this recognition piece alone. The thinking is that sound recognition might be a larger player, used alongside additional information, although when it comes to consumer reviews, I don't know how much sound plays into it, other than actual reviews like you might find on YouTube or something. The availability heuristic is a little bit easier. Barry, you want to talk about that one? I'll take the easy one, yeah. According to this heuristic, people are inclined to retrieve the information that's most readily available in making a decision. It's an important heuristic, as it's a basis for many of our judgments and decisions. For example, when people are asked to read a list and then identify names from the list, they often identify the names of famous individuals with which they are familiar. In the field of medicine, missed medical diagnoses are often attributable to heuristics, the availability heuristic being one of those responsible. Researchers explain that heuristics are beneficial as they're cognitively economical, but caution clinicians and practitioners that they need to recognize when heuristics need to be overridden in favor of more comprehensive decision making processes.
So it's having that idea that, yes, you can use the heuristics, but also knowing where a heuristic may not work properly, recognizing that fact, and knowing when you've got to basically go back to basics again. So, Nick, do you want to take us into anchoring and adjustment? Yeah. This will be the last one we're talking about tonight in terms of heuristics and decision making: the anchoring and adjustment heuristic. This is where you need some sort of value, and this, I think, is what's happening with the other aspect of consumer reviews, which we can talk about in the article discussion piece, but specifically when it comes to stars. Some of the discussion around this article is that stars, the overall rating, still make a difference, but the actual text reviews themselves are different. So let's talk about this anchoring and adjustment heuristic, because I think it'll be important for later. Basically, whenever there's a value attached, the person making the decision is being anchored to that value, whichever is presented to them to begin with. Let's use an example: in what year did John F. Kennedy take office? This is one for us here in the States. You might use the anchoring and adjustment heuristic, where you think about a known date, like when he was assassinated, November 22, 1963, and from there work out how many years back. With that anchoring, you're making an estimate based off a piece of known information about that person, but it may not be an anchor that is close enough to the thing you need. The practical application here is, honestly, when you're thinking about salary negotiations: the counteroffers are based on the anchor that you set. So if they ask you what your preferred salary is, you give them a number and they will adjust based on that, or vice versa.
That's why oftentimes they will ask you what you want for your salary, because then they anchor off of your number. So really, what you need to ask and push on is: what's the base salary that you're hoping to get away with here? There's a lot there, especially when it comes to making mistakes in decision making, because folks will likely gravitate towards the anchor that's been put in place. So if someone lowballs you and says $30,000, that's pretty low no matter what field you're in, and if you say, okay, well, 35, well, that's 5,000 more than 30, but it's still low. You might think it's more because you're anchored to $30,000. Anyway, getting back to the heuristic: it requires a lot of effort to get out of, and it's important to think about that when trying to avoid anchoring bias. So we talked a little bit about these cognitive human factors issues when it comes to decision making, but let's bring it back to the article. Let's talk a little bit about consumer decisions. Really, this is a mix of human factors and usability, and there's a lot going on here. Barry, you want to break it down for us? Yeah, and we can go through this relatively quickly, I think, but it breaks down into four or five different areas. The first one, and some of this will repeat what we've already said, is psychological factors: the motivation behind why you're making these decisions in the first place, the perception of what it is you think you're getting, the learning, and the attitudes and beliefs around not only you but the environment that you're buying in. Then you also dive into the social factors. We as human beings live around people; we tend to like to congregate.
And so really, if you're trying to buy something, you'll take some of that influence from your family, a local reference group, which might be your work group or some sort of social group, and also any roles and status that you've got. That could be your local chapter of the HFES, for example. Then you've got cultural factors: not only just the culture you live in, but the subcultures that exist, and depending on where you're at, how much social class has to play. But then it's a very personal choice, so there's a bunch of personal factors that come into play. We've talked before about age, or rather life experience, the income that you have, the occupation that you've got, and the lifestyle that you lead. And then finally, and I guess it's probably quite key to a consumer decision: can you afford it? Those economic factors. You've got your personal income; if it's a big purchase, how does the family income come into play? Have you got any credit, liquid assets, savings? That's been a very quick rattle through them, but those are the key elements that come in and influence how we make consumer decisions. Yeah. I mean, it really is a multifaceted approach to making these decisions. When you break it down, there are so many different things going through your mind at one point. Not only are you faced with all these different attributes and factors that come into the decision making process of whether or not you need this product (you mentioned it: the motivation, perception, all these things), but then also, can you afford it? And then all these heuristics are stacked on top of that. That's a whole other layer on this decision making process. So there's a lot going on here. What we're talking about tonight really comes down to this: what other people say about a product is more influential than the average rating.
So let's get back to the article here. I want to talk specifically about this, because that anchoring bias is going to be key to some of the points from the article that I want to bring up. One of the authors said it's the text at the top reviews that made a difference: the swaying effect only happened for the text reviews. Without text, people were not swayed. It's the concrete details that are driving this impact. The research is not saying that average ratings don't matter. If a product has a low average rating, consumers will not consider the product, much less read the product reviews. But in cases where buyers were comparing different products and reading their reviews, a few of the top reviews can easily sway their purchase decisions. He also added that the study findings are not limited to app or product reviews, and there are some key takeaways for retailers. But I just want to pause there; that's a lot of key takeaways. When we talk about heuristics and anchoring, that's exactly what's happening. When you see a one star review, you're anchored to that one star review; no review that you read will change your mind on it, regardless of how positive that review is. But a difference of half a star at the higher end doesn't really matter that much, because you're reading more detailed reviews about those products. Whichever one you find more helpful, whichever one has more detail, that's what you're going to stick with. It's a huge point for decision making because it does get at some of those other factors that we talked about, especially some of the psychological factors. I think it would be really interesting to try to do a review, and I've kicked around this idea before of Human Factors Cast actually doing reviews. It'd be interesting to make sure that we touch on a lot of these different points.
Like, what is your motivation for buying it? What's your perception of it? I don't know, I find it fascinating. Barry, what are some of the talking points you want to take away from this article? So there are some, I think, factual bits that we need to maybe dive into. We focus on Amazon for this, but obviously there are others out there; as the article mentioned, things like TripAdvisor use similar mechanisms. But an interesting thing about Amazon came from another article we found in Wired magazine: in 2015, Amazon began weighting stars using a proprietary machine learning model. Some reviews now count more than others in the total average, based on factors like how recent they are and whether they came from verified purchases, meaning that Amazon can actually say that the reviewer bought the item they claimed to love or hate. People who write about Amazon believe that it also takes into consideration things like the age of the reviewer's account and the average star rating they usually leave. So if they usually leave three stars for everything and they leave a five star, then that means something more. There also appears to be some sort of discount applied to reviewers who predominantly leave negative reviews. So even just looking at that basic review mechanism, a star isn't just a star; we are being manipulated in the background when we're making these decisions. I think that's quite interesting. The other bit that I think is quite interesting is that you have a five star rating system. Whenever we do any sort of rating, your center, your "okay, perfectly fine," should be three, the middle, and that should be the standard: anything above that is amazing, anything below that is bad. Yet with the star system, we are constantly being pushed to go for the five star.
And we constantly go for that five star look when we try to discern it. So is the star system the right method, from our perspective, for truly discerning the right measure? Yeah, it really comes down to a larger discussion about products themselves and what goes into a review. We've talked about all these psychological factors here, but the experience can be largely subjective when it comes to pieces of art. Let's say you purchase some piece of entertainment, like a movie or a video game or music. I'd be interested in how the star system, or just some sort of numeric rating system, really impacts reviews of those. Think about something like a video game, and I use that as an example because you have this large thing to explore. Not only is it a piece of art, but you're also exploring it in a way that perhaps other people are not; you might experience something that somebody else doesn't, and that might resonate more with your personal experiences than with somebody else's. It's insanely fascinating to me when you start thinking about ratings and reviews for larger, complex things that require more thought and effort to put into them, because a lot of people just want to slap a score on it and call it done. You bring up a good point with the stars out of ten. I know IGN, a gaming site, uses seven as their base, because five to ten really is their rating even though it's out of ten, and so it has to be really, really bad for it to be a one. I just find it fascinating. I want to talk about some key takeaways here for retailers, or anyone trying to bring this into practice. There's this ongoing effort by retailers to game the system by paying people for fake reviews.
So it doesn't really make sense to do that, especially with products that are already rated highly, because people are just going to read the reviews that are more detailed, and that's what will ultimately sway their decision.
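As an aside for readers: the review weighting Barry described a moment ago (recency, verified purchases, and how far a rating deviates from the reviewer's usual rating) can be sketched as a toy model. Amazon's actual model is proprietary and machine-learned, so every weight below is an invented assumption purely for illustration of how a weighted average can diverge from the plain average.

```python
# Toy sketch of a weighted star rating. The specific weights (0.5, 1.5,
# 0.25) are made up for demonstration; only the *factors* come from the
# discussion above.

def weighted_average_rating(reviews):
    """Each review is a dict: stars (1-5), days_old, verified (bool),
    and reviewer_mean (that reviewer's typical star rating)."""
    total_weight = 0.0
    total_score = 0.0
    for r in reviews:
        weight = 1.0
        if r["days_old"] > 365:       # discount stale reviews
            weight *= 0.5
        if r["verified"]:             # boost verified purchases
            weight *= 1.5
        # A five star from a habitual three star reviewer "means more":
        deviation = abs(r["stars"] - r["reviewer_mean"])
        weight *= 1.0 + 0.25 * deviation
        total_weight += weight
        total_score += weight * r["stars"]
    return total_score / total_weight

reviews = [
    {"stars": 5, "days_old": 30,  "verified": True,  "reviewer_mean": 3.0},
    {"stars": 1, "days_old": 800, "verified": False, "reviewer_mean": 1.5},
]
print(round(weighted_average_rating(reviews), 2))  # 4.2, vs. a plain mean of 3.0
```

The point of the sketch is the gap between the two numbers: a recent, verified, out-of-character five star pulls the displayed rating well above the plain average, which is exactly why "a star isn't just a star."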
Really, the author says here that they shouldn't spend a whole lot of time gaming these rating systems. The effort is not actually very meaningful or effective based on their findings, which suggest that as long as your average reviews are fine, what matters is the top reviews. So there's another way you could game the system: pay people to like a certain review that puts your product in a favorable light. That might be the new way to game the system based on this research. Anything else? Any other points that you want to bring up here? The interesting thing for us will be whether people are going to rate the podcast and give us favorable reviews, having read our in-depth piece and what they take out of it. But the researchers have also recommended that online review platforms such as Yelp and Amazon could benefit consumers by designing a layout that spotlights individual reviews, with less emphasis on the average rating. So that's, I guess, UX input into how we could put some of these sites together and make the most out of reviews. Yeah, and I do want to say you stole my thunder there, Barry. I was going to ask everyone listening; there's a call to action here, everybody. We're talking about reviews because we obviously want you to rate the podcast. No, but seriously, I think there are some good ones out there. If you want to make those top reviews, you can, or you can not; that's fine, too. There's one comment here on Twitch re: expectations. There are similar expectations with classroom feedback: a three star review is considered bad for instructors, and I know that's fairly commonplace in several different domains. Cashiers also get rated the same way; if it's not five stars, it's bad. So there's this weird skewing of ratings, and we hold so much importance on it, and ultimately it just doesn't matter. One last time, Barry, any closing thoughts here?
No, I think we've pretty much captured it all on this one. All right. Well, thank you to our patrons this week for selecting our topic, and thank you to our friends over at the University of South Florida for our news story this week. If you want to follow along, we do post links to all the original articles on our weekly roundups in our blog. You can also join us on our Discord for more discussion on these stories. We're going to take a quick break, and then we'll be back to see what's going on in the Human Factors community right after this. Human Factors Cast brings you the best in Human Factors news, interviews, conference coverage, and overall fun conversations in each and every episode we produce. But we can't do it without you. The Human Factors Cast Network is 100% listener supported. All the funds that go into running the show come from our listeners. Our patrons are our priority, and we want to ensure we're giving back to you for supporting us. Pledges start at just $1 per month and include rewards like access to our weekly Q&A with the hosts, personalized professional reviews, and Human Factors Minute, a Patreon-only weekly podcast where the hosts break down unique, obscure, and interesting Human Factors topics in just one minute. Patreon rewards are always evolving, so stop by Patreon.com/humanfactorscast to see what support level may be right for you. Thank you. And remember, It Depends. Yeah, a huge thank you, as always, to our Patreon patrons. We especially want to thank our honorary Human Factors Cast staff patron, Michelle Tripp. Patrons like you keep the show running, and it really does mean a whole lot to us. Now we're going to ask you for more. I don't like asking for more, but there are some purchases behind the scenes that are going to make things really cool around here, and we need money to do so. I'm out of pocket again now. I'm just kidding. You don't have to. Look, if you have the financial means and want to, you can.
But we are reinvesting that money into the show. I need you to know that none of that goes into the pocket of Barry or myself, although I suspect that Barry might be siphoning some off to pay for his electric vehicle. I don't know where that one came from. Yeah, it's been a rumor. But I do want to say there's another way to support the show. If you can't do Patreon, you can always go to our merch store. We have some fun things over there, like our It Depends shirts. I'm wearing one tonight. I've worn this one so much it's started to fade. It's nothing to do with the quality of the shirt; I've worn it a lot. There's the show logo, obviously. Speaking of reviews, you can get the worst review we've ever gotten in the form of a T-shirt. I'm thinking about wearing that to HFES this year. A one star review. So if you want to support the show and look good doing so, we have stickers over there, too. If you want the pin sticker, or the "I'm going to Human Factors the shit out of this" one, we have those over there, too. There's a bunch of cool designs over there, and we're always trying to make more. So that's another way to support the show. Anyway, I think it's time that we look out to the community and get into this next part of the show we like to call It Came From.
That's right. It Came From. This is the part of the show where we search all over the Internet to bring you topics the community is talking about. If you find any of these answers useful, give us a like, or hearts, wherever you're listening or watching; it doesn't matter. Help other people find this content. Really, that's what we're looking for here. I do want to make one quick shout out. We mentioned this in the pre-show, but we got some traction on the Human Factors subreddit this week when somebody asked which Human Factors podcast or book recommendations we have. We jumped in kindly and said, hey, there's two, there's three. You can listen to Human Factors Cast. There's 1202, The Human Factors Podcast. And there's Human Factors Minute, if you want to support the show again with that theme. So we don't really ask a whole lot of our Human Factors Cast army, but if you do want to go onto that post and give it a like and a boost, we would certainly appreciate that. Anyway, we got three for you tonight. This first one here comes from Zenaxia, I hope I said that right, on the User Experience Research subreddit, one that we don't typically frequent but might more now that I fixed the bot. They're asking specifically about representative sampling in qualitative UX research. Greetings, I'm learning a bit about representative sampling and changing the way I recruit users accordingly. I was reading Think Like a UX Researcher and saw this statement. I'm going to truncate here. Engaging a representative sample of participants in UX research sounds like a good idea, but it's flawed.
Theoretical sampling provides a more practical alternative. It's a long quote. Anyway, this is quite an eye opener for me because where we work, I often aim for a representative sample rather than a focused sample. Me being a junior UX researcher, changing my recruiting because I read something in a book will probably not be taken kindly, though, especially since I work at an agency. We sometimes just do one day of user testing on a product, and that's all we ever do for a product, so narrowing my aim might seem dangerous. I'm curious about your opinion or personal experience about screening for users and whether this is just a radical idea or if there's a lot of truth to it. Barry, let's talk about representation in UX research. What is your experience with it? What do you do? From my experience, I come from a slightly odd background, working primarily on the defense side of things. I kind of take what I'm given, and it is a bit of a get-out to a certain extent, but we do try and do what we can to get as representative a sample as we possibly can. But in the grand scheme of things, we are working on some platforms where literally anybody could be operating the platform. The people you get are the people who just happened to be on site, or whatever the case may be. I do think that, from my perspective, you try and get the edge cases as best as you can, certainly when you talk about a physical type of product as well, to make sure that you can do the fit, form, and function checks. I can wimp out of this one to a certain extent because I don't normally have the luxury of choosing. I do think it's right what they say about the Agile environment. Agile sometimes, I think, gets used as a slight excuse for this, because if the Agile environment is stopping you from doing what you need to do, then you need to change how you've customized Agile. 
But fundamentally, I get what they're saying, that it can seem very fast paced and we don't have the time to do things necessarily how we would from a human factors perspective. And this is one of the differences between human factors and UX, potentially, because I think UX can get away with this to a certain extent with the way they do it. So, yeah, I think for me, it depends. And I want to push the big red button. Push the big red button for me. I think this really opens up an interesting discussion, because for me there's three levels, and it's all a speed-accuracy trade-off. Access to some users is better than access to no users, but having a UX person on hand is better than not having anyone on hand. And so you have kind of these various steps that you can take. Step one, hire a UX or human factors person. Step two, get users. If you can't get users, then rely on the expertise of the human factors person. If you can't get users but you can get subject matter experts, that is where we're kind of talking about this other piece: easy access to people that are familiar with the domain and can provide the answers that you're looking for. So that's kind of a preference over no users, although actual users are better. When we get all the way up to the cream of the crop, we're looking at a representative sample of real users across the various different domains that you're looking at, right, or various different companies, whatever it is that you're measuring. That's kind of where we're at with a representative sample, and that's the best. But I think the point here is that it doesn't always work when you have internal processes and procedures that are trying to encourage you to get feedback quickly. And so when it does come to UX, or sort of these companies that engage in Agile-style processes, you're looking at, what can you get quickly that's going to give you enough information to get by? Is it going to be the best solution? 
Probably not, because you haven't talked to everybody, but it's going to be good enough. And I think that good-enough approach is kind of what industry UX or industry human factors is pushing a lot of the time. Any other thoughts on that one, Barry? I guess I'll just repeat something that you said in previous episodes as well: it's pick your battles sometimes. Yes, it might be a nice opportunity just to save some of your resources, because we generally have limited resources and things like that as well. So if, as part of an engagement, you've actually got more important things to be thinking about, there might be more fundamental issues you want to solve, so you might want to pick your battles there as well. Yeah. All right. This next one here is about transitioning into UX, landing a first job, and finding a mentor. This is by Wonderful Quality 34 on the User Experience subreddit. There's some backstory here, but I'm going to skip straight to the questions because I think these questions are kind of self explanatory and are pretty great. So there's three of them here. First question is, what is the best way to gain work experience? Pro bono design work, or some other way? Second question is, what should I expect as an applicant when applying to an entry-level or intern generalist position? And three, finding a mentor that has a similar career path. How do you find those? And so with those three questions, let's break them down one by one. Barry, what is the best way to gain work experience for someone who's potentially getting into UX for the first time? I suggest you go and get some work experience. It's a tough one. I don't necessarily always agree with a lot of people who say go and get that voluntary work and things like that. It isn't the only thing you can do, but sometimes it works. But get yourself out there and try and find those entry level positions. 
They're not always there, but as you're going through your degree or your training, try to get in with some companies at that point, particularly if you can get an industrial placement or something like that. Make the most of any sort of opportunity you've got to engage with the industry, because they might be willing to give you some early ins, and maybe some holiday work and things like that will do it. That's where the volunteer work does come in. My fear with volunteer work, to some extent, is you don't necessarily get the drive and the pressure of having to get work done on time to a certain level of quality, because with pro bono work or voluntary work, sometimes you just don't get that same pressure. But anything is better than nothing. I would also suggest that you go and, I don't know, look up a good design lab or some sort of voluntary lab. They're normally quite good to do. I don't know if you know of any, Nick, that might be worthwhile looking at. Wow, you almost took my answer there. We have the Human Factors Digital Media Lab. You can look that up; it is pro bono work. You're right, there is not sort of that external pressure to perform well, although we only take people that perform well. We have great lab members. We could provide that pressure, but we don't; we're very lax in the lab for a good reason. But yes, and I've gone over some of these examples before: go out looking for existing problems and try to solve them. Use those to pad your portfolio. You can send them to us if you want. But I really need to make a class on this, because this is one of the most common questions. And it's not real world experience, or it can be real world experience if you do it correctly. Anyway, look for problems where they exist and try to solve them, and that might be a good way to flex your skills. Find those examples. Second question here. 
What should I expect as an applicant when applying to an entry level or intern generalist position? This one actually goes very well with the next question, so maybe we'll skip it for now and get back to it in a second. Last one here: finding a mentor that has a similar career path. How do you find those? Barry? So certainly on the UK side of things, my first port of call would be the Chartered Institute of Ergonomics and Human Factors, who run mentorship programs and set you up with people, probably not in your own company, but people alongside who can then help you and walk you through it. So that would be my first port of call. But when I have new members join my teams, I generally try and find them a mentor, particularly if they're new to industry, and I will generally try and find people who will work with them who are not part of my company. It doesn't actually matter if you're in the same field, really, though it is helpful if they've worked in the defense field before. But fundamentally you're looking for people who will try to match what you're doing and basically nudge you in the right direction. Just because you get put in front of a mentor doesn't necessarily mean you have to take them. The relationship has to work, and that's probably the biggest thing I would take out of this. Just because somebody sticks their hand up to be a mentor doesn't necessarily mean they're the right person for you, because you've got to have that person who is a shoulder to cry on when things are going bad and helps pick you up when you're maybe not feeling at your best, but also, when you've done some really great work, they'll be the ones to help applaud you and channel that into something good. But they're also the person you want to be able to go to and say, actually, maybe things at work aren't going so well; how can I approach my boss, et cetera. So they become your confidant as well as anything else. 
And so it's not a friendship, though there's a good chance a friendship will evolve; it is a professional relationship. But if you get a really good mentor early on, then it's absolutely invaluable. So talk to people. Like I say, in the UK, go to the CIEHF; they will sort you out. I don't know whether HFES has a similar sort of setup or anything like that. Nick, how do you go about that? I believe so. I think of mentorship in two different categories. There's kind of forced mentorship, and then there's sort of, I guess, what is it, a natural inclination for mentorship. So the forced one is obviously any institution that pairs you with somebody else internally and says, you are going to work with this person and they are going to mentor you. And that can work. That can certainly work. But I think you're right, you do lose some of that confidentiality when it comes to it. You can't say "my boss is being a jerk" to somebody that you work with. You could, but it's probably not the best decision. And so if you want to get the most out of it, I think you want to look outward towards people at other companies. You also kind of have the situations where you go to grad school and you're looking for somebody that has a similar interest to you to begin with, and you're kind of pairing up with them because of the work that they're doing. And so in a grad school environment, it's almost forced in a lot of ways, because you are looking for someone that is in your same vein and you have to work with them because you need your degree. And I feel like that's kind of forced, too. Then there's the natural ones that form as you're working with somebody who might be more senior to you. You can kind of say, hey, look, I have these questions; can I run them by you? I've had a couple of those along the way, too, that I really trust. And actually, one of them is in chat tonight. 
And so I really appreciate their opinion on a lot of things that I can bring to them and say, hey, look, I'm in this situation, which would be awkward if you worked together, because you can't say this, that, and the other, especially if they're experiencing some of the same pain points. So that's another way. Just one more point on that: as long as we're talking about the lab, there are mentors in the lab that you can reach out to. In fact, we have some people in the lab who solely mentor. Barry and myself do more than just mentoring, but we are part of that crew. So please reach out to us. The last one here, what should I expect as an applicant when applying to an entry level or intern generalist position, I'm going to tie in with this next question from The Dying Sailor on the User Experience subreddit. They ask, how should I be applying for UX jobs? So I'm going to read The Dying Sailor's question and then kind of reframe the other one. I'm going to be graduating soon and I'm looking for a junior UX position. How should I be applying for jobs or engaging with recruiters to get an interview? I've only managed to get one interview so far, though a friend said my portfolio is in good shape; it's exactly what they were looking for. I haven't been able to grab anyone else's attention. Sitting down and applying for jobs doesn't seem to be doing it. Do you guys have any tips on how to go about the process? For me, again, how do you get these interviews or apply to the jobs, and what should you expect as an applicant, I think, are very, very similar. So let's talk about it. What do you do from a junior perspective to apply to these jobs? Barry? It's about 20 years since I've sat down as a junior. But from somebody who's on the receiving end of this quite a lot now,
you've just got to keep going and you've got to keep putting yourself out there. So yes, keep applying for the jobs. Keep searching; with the Internet now, you can get yourself to the right sort of places to see the adverts. Talk to recruiters. The more you talk to recruiters and recruitment agencies, the more they know you're there, and there's more of a chance that they will then be able to try and do some matching with you. But then, going back to it, and we do seem to be pushing things like the Digital Media Lab quite a lot tonight, but again, get into those sorts of conversations, get into those sorts of communities. So HFES, CIEHF, wherever you're at in the world, there is an ergonomics association near you, with either a local chapter or a regional group or an online forum or something like that. Just get talking to people. I've met interesting people who've got something to offer whose CV I maybe wouldn't have picked up and paid as much attention to as when you meet the person; maybe they've got some interesting ideas or anything like that, and they just pique your interest, and I'd like to see more of them. For me, there is no golden thing here, no magic trick that will solve this for you. It is hard work. But let's respect that actually the UX/HF world is still comparatively small, and there are going to be more things out there; certainly there are more jobs than people out there at the moment. So you've just got to keep on pushing yourself out there. Don't lose heart. It's still a great place to be.
Yeah. I think you hit it right on the head, and I think that was the point that I was going to bring up. We say it all the time, but it's about who you know. I think that's critical for getting into the field, because for me, I don't know about you, Barry, but my first job wouldn't have happened the way it happened if I didn't know somebody that knew somebody. And so it's one of those things where you really want to get involved in those communities, get involved with other people, because you might meet somebody that says, oh, we have an opening over here for a junior position, I'd love for you to join us, and just kind of rope you in. And I think, really, when you're considering what the application process is, if you don't know anybody and you're just applying kind of out of the blue, you're going to be one of those people who hit Easy Apply on LinkedIn just a million times. At that point, it's difficult because you don't really have the portfolio of work to back up your ability to perform. And so it is one of those things where you really just need to either know somebody or keep banging your head against the wall until you get in. And I hate that that's the way it is. And there's no mechanism professionally, well, there are the professional organizations, but again, you have to get involved with those, but there's no mechanism that takes you out of school or out of your design work courses or whatever you're doing and puts you into a job. There's no kind of transition piece. You're kind of on your own unless you have a good mentor, unless you have a body of work that you can present, unless you know people. So that's kind of it for that question. Great questions tonight. We're going to get to this last part of the show we like to call One More Thing. We're running out of time here, so let's keep it short tonight. Barry, what's your One More Thing? So my One More Thing is, it's the CIEHF conference on Monday, which I'm thoroughly looking forward to. 
It's going to be great to get back to a physical conference. And I thought I was just going to lounge around, go around with my microphone interviewing people and all that sort of thing, and just have a great time. However, with my new responsibilities, I'm now chairing sessions, supporting people delivering keynotes, and all this sort of stuff. So, yeah, my lazy two days of just enjoying the ambience have kind of gone out of the window. Oh, man, I'm very excited. I'm excited to see what you come back with. And honestly, I kind of expected all those responsibilities to creep up on you, but not so quickly. All right. My One More Thing this week is a small win, but it's a win. I've kind of been in this weird spot with my hobbies recently where I haven't wanted to do much. But this week we gave my son a large Hot Wheels monster truck for Easter, and I modified it by putting LEDs in. It's a crocodile monster truck, so you have, like, you know, red eyes. So I put red LEDs in the lights, and a green LED pointing down, so it has this green glow below it, and a little switch on it. I don't know, it's cool. It's a small win, and no other kid has that truck, so that's kind of made me feel good. Anyway, that's it for today, everyone. If you liked this episode, especially the bits about how our minds make decisions, I will go ahead and encourage you to go listen to episode 219, where we talk about mental math and problem solving. That's actually Barry's first Human Factors Cast episode. Go check that out. Comment wherever you're listening with what you think of the story this week. For more in-depth discussion, you can always join our Discord community. Visit our official website, sign up for our newsletter, and stay up to date with all the latest Human Factors news. If you like what you hear and you want to support the show, leave us a five star review that has a lot of details about why people might like this show. You can always tell your friends about us. 
Word of mouth really helps us grow, and if you have the means, you can support us on Patreon or buy stuff from our merch store. Links to all of our socials and our website are going to be in the description of this episode. I want to thank Mr. Barry Kirby for being on the show today. Where can our listeners go and find you if they want to talk about making big decisions? So if you want to make big decisions, head on down to my Twitter at Basmaskokay, or come to 1202, The Human Factors Podcast, at 1202 Podcast.com. As for me, I've been your host, Nick Rome. You can find me across social media at Nickrom. Thanks again for tuning in to Human Factors Cast. Until next time, it depends.