Sept. 8, 2023

E293 - Neurorights for Brain-Computer Interfaces

This week, we delve into the fascinating topic of how new neurotechnology is challenging the boundaries of mental privacy. We also address community questions on HF traveling jobs, crafting research questions, and the length of time researchers keep their recordings. Tune in to gain insight into these thought-provoking subjects. 💡💬✨

#neurotechnology #mentalprivacy #humanrights #HFjobs #researchquestions #researchers #recordings #communityquestions #podcast

Recorded live on September 7th, 2023, hosted by Nick Roome with Barry Kirby.

Check out the latest from our sister podcast, 1202 The Human Factors Podcast, on balancing academia and industry in human factors: an interview with Dr Mark Young:


Let us know what you want to hear about next week by voting in our latest "Choose the News" poll!

Vote Here

Follow us:

Thank you to our Human Factors Cast Honorary Staff Patreons: 

  • Michelle Tripp
  • Neil Ganey 

Support us:

Feedback:

  • Have something you would like to share with us? (Feedback or news):

 

Disclaimer: Human Factors Cast may earn an affiliate commission when you buy through the links here.

Transcript

Hi,

 

[00:00:00] Nick Roome: everybody. Hi, this is September 7th, 2023. This is Human Factors Cast, episode 293. I'm your host, Nick Roome, and I'm joined today by Mr. Barry Kirby. Hello. Hi, Barry. Your head's not wet.

 

[00:00:15] Barry Kirby: Oh, it is. You just can't see. Okay. All right. Oh,

 

[00:00:21] Nick Roome: nice. For context, go watch the pre show. We've got an amazing show for you all lined up tonight.

 

We're going to be diving into the fascinating world of neurotechnology once again: how it's blurring the lines around mental privacy, and what it means for our neuro rights. And later on, we'll be answering some questions from our community about traveling jobs for human factors professionals, creating research questions, and when to delete usability test recordings.

 

A nice, refreshing change from some of the normal questions we get. But before we jump into all that, let's kick things off with some programming notes. Hey, we're not gonna do a show next week; no show September 14th.

 

And if you do have the means to, we would love for you to go leave us a review and tell your friends about the show. And if you have the money and you want to do that too, we have a Patreon; you'll find out all about that at the link. But Barry, I have to know what's going on over at 1202. So

 

[00:01:17] Barry Kirby: 1202, the latest interview is an interview with Dr.

 

Mark Young. And he's one of these people who spent a fair bit of time in academia and in industry, and he's just flipped back again to go back into academia. So we talk about that balance between academia and industry and what it means, what both sides could learn from each other in terms of what our priorities are and what our drivers are.

 

Also, Mark is the incoming president, the president-elect, of the CIEHF. So we talk a bit about what his ambitions are for the Institute. Well worth a listen.

 

[00:01:46] Nick Roome: Perfect. Let's not bury the lede. Let's get into the show, into the article of the show, and the thing with...

 

I love it when I cut myself off because I'm just stumbling over the words. This is the part of the show all about human factors news. Barry, what is the story

 

[00:01:59] Barry Kirby: this week? As long as you promise not to cut me off before I finish. The story this week is about new,

 

[00:02:03] Nick Roome: okay, I promise not to cut you off. Oh, sorry.

 

[00:02:05] Barry Kirby: I asked for that, didn't I? Totally. The story this week is about how new neurotechnology is blurring the lines around mental privacy, but are new human rights the answer? In the article, the author discusses the advancements of neurotechnology and the ethical and privacy concerns that come along with it.

 

They highlight the growing number of companies developing brain-computer interfaces, BCIs, as we've referred to them in the past, and the potential benefits for patients with paralysis or neurological disorders. However, the article also raises important questions around mental privacy and autonomy. They argue that while neurotechnologies can record brain activity with great specificity, interpreting and reading that activity is complex and not straightforward.

 

They compare the privacy risks of neurotechnology to those of more familiar data collection technologies, like online surveillance and wearable devices. The article also delves into the concept of cognitive liberty, which is the right of individuals to think independently and autonomously. Proponents of cognitive liberty argue that greater regulation of neurotechnology is needed to protect individuals' freedom to control their own thoughts.

 

However, the author suggests that the way cognitive freedom is discussed neglects the relational aspects of who we are and how we think. They emphasize the importance of acknowledging the many influences and forces that shape our thoughts and advocate for a holistic approach to protecting privacy and freedom in an era of advancing neurotechnologies.

 

This article raises important questions around the social and ethical implications of neurotechnology, and calls for thoughtful consideration and regulation in order to protect individual rights. So Nick, what are your thoughts on... oh, wait, hold on. Can I not just plug a USB into you and read your thoughts directly, and it'd be there for all to see?

 

[00:03:46] Nick Roome: Wouldn't that be scary? So, this topic might sound a little familiar to listeners of the show. There's going to be some overlap with another episode we did back in March of this year, episode 279, talking about neuro privacy. So we're going to try to focus on the neuro rights rather than the technology piece.

 

Like I said, there'll be some blurring and some overlap here, but it'll be a continuation of that discussion. My first thought is: yes, we should have human rights to our thoughts, our privacy, and our brains, because no way am I letting Elon Musk inside my head. I don't know how you feel, but if we're going to continue last week's theme of hitting Elon Musk over the head about bad decisions, no way am I letting him in my brain.

 

I... I get worried about the aggregation of data from things like my browser history or my search queries. Like, how can you give me a targeted ad when I didn't even search for that thing? I may have said it, and my thoughts have somehow connected. When I say thoughts, I mean my search patterns, my browsing history on social media, or whatever, has somehow indicated to these algorithms that I needed an ad for deodorant or something.

 

I don't know, right? How does it do that? It aggregates all the data, but how does it transform that data? We don't know. And if you can do that with brains and thoughts, that's a lot scarier, especially when you start to get to neurodivergence, where you might have more errant thoughts than someone who's neurotypical.

 

I was bringing it up in the pre-show: invasive, intrusive thoughts. What if you have those? Are you going to get an ad for something? I don't know, but to me, I think this just calls for an incognito mode for your mind. Can you turn it on and off?

 

You brought up that question also in the pre show. I don't know. I'm cart before the horse here, Barry. What are your initial

 

[00:05:52] Barry Kirby: thoughts on this? My initial thoughts are quite long. To reinforce a point that you made, let's not worry about specific technologies. Let's give Elon a break.

 

Assume fundamentally, for the sake of this discussion, that this is a thing we can do, or that we're close to it. As we get close to it, it is absolutely right that it is something we should be talking about. We have learned, through the examples you've just highlighted, about the evolution of the way that people can use and abuse data, and really interrogate it in a really deep way.

 

In ways that we didn't know about before. So a lot of this is evolutionary, but in a similar way to when we talked about the space stuff the FAA were doing: we've learned that we need to be doing some of this early, and we should really use that example here as well.

 

And yes, let's talk about it, and let's put some rights in place, with that piece around it that we don't restrict what we can do, because I think there are some amazing things we can do with this. We talk about neurodiversity; I think this will lead us to understand better what neurodiversity is all about.

 

And actually, I think we'll find out that the idea of being neurotypical is probably quite a rare thing, and that we'll understand more about the wonderful thing we currently call the spectrum, which I think is way too simplistic. This will help us learn more, and we don't want to restrict that type of understanding. But equally, we are a long way,

I think, from what the movies would have us believe: the simplistic way of understanding this, where somebody puts a spike, a la The Matrix, inside your head, and we just get a verbatim stream or a movie of what your thoughts are. I certainly don't think like that.

 

We don't really know how anybody else thinks; it's our extrapolation. So what that looks like, and how we interpret it, is still miles and miles away. But it does frame the question around neuro rights and what they actually consist of. We've quite rightly said we have freedom of speech; what about freedom of thought? If you're having nice, pleasant thoughts, fluffy clouds and nice beer, that type of thing, they're all fine thoughts.

 

But what about thoughts that might be close to the edge? You might be having violent thoughts, or thoughts about other people that are fine in your own head, or you might have an opinion that you never voice but still think, where you have the personal self-control not to do anything about it.

 

Is it right that we could actually censor the way that you think? The obvious film correlation here is 1984, where you have Newspeak and thoughtcrime and things like that. We need to make sure we remember that our democracy relies on people being able to think freely and have that freedom of speech.

 

But I think also, this is more than just "simple", and I do the whole inverted-commas thing with that word, civil liberties. It's not just about being able to communicate with other people. What does it mean for communicating at home with loved ones? Will there almost be a demand to be able to hook up on that mental level?

 

And there'd be that level of expectation to do that. What could that mean for relationships? What could that mean for building up relationships, and for relationships that are breaking down? Do we have the right to have a secret, to have our own personal thoughts, our own personal opinions about things that we never pass on to anybody else?

 

So that's one whole package that I would like to explore. The second bit is: how do these thoughts get stored? Not talking about specific technologies, but if we are thinking these things, they get uploaded somewhere to the cloud. Whatever that cloud looks like, who owns the cloud, and what rights do they have to do with it?

 

You mentioned the whole Google ads type of thing. What would Google, or any other company who gets their hands on it, do with that? And then, if somebody else can get hold of your experiences, what if they use your experiences for their own gain? Say you've done something, and they use the way that you've done it, because they've basically shared your experience, hacked into it or whatever, and they use it for their own benefit without asking your permission.

 

Or without you knowing about it. Isn't that theft? Is there something around that? I don't quite know where that goes. I know there's been loads of work done on various aspects of doing this in a positive way, being able to share information, that type of thing. But what about the nefarious side?

 

I think I've rambled on quite a lot about a whole bunch of different things. Where do you wanna start?

 

[00:10:47] Nick Roome: Yeah, that's a good question. I'm thinking of this multi-dimensionally as well. To get at your point about your own thoughts, I feel like there's this right to autonomy.

 

We should be able to think freely, without influence or interference from outside sources, right? And I think that should be the foundation: we have the right to think freely. There are also some of the things you brought up around data ownership, and where that data is stored.

 

If you're thinking about almost a bill of rights for our thoughts, or our mental privacy, or neuro rights as this article puts it, I think we must think about who owns that data. And it should absolutely be the person with those thoughts, because they are the originator of those thoughts.

 

Is there a shared ownership? I don't think there should be. Is there licensing? I don't think there should be. But again, how do you use that for some of the technology? I don't know. I just think you should have ownership of those thoughts wherever they're stored in a data center, because you are the originator, and there shouldn't be any gotchas in the...

 

I think there's also an element of consent with collecting thoughts. You should have a right to consent, for lack of a better term, to what data can and can't be collected. I mentioned incognito mode earlier; to me, that's "I don't want you collecting anything right now. I am in a space where I am brainstorming ideas for, I don't know, offshoot podcasts, and I don't want anybody stealing those ideas because they're stored in a database without good privacy." Whatever it is, you should be able to turn it on and off, the right to consent, when those technologies come in.
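None of this maps to any real neurotech API, but the consent model being described here, explicit opt-in data scopes plus an "incognito" switch that halts all collection, can be sketched in a few lines of Python. Every name below (`NeuroConsent`, the scope strings) is hypothetical:

```python
from dataclasses import dataclass, field

@dataclass
class NeuroConsent:
    """Hypothetical per-user consent record for a neural data stream."""
    allowed_scopes: set = field(default_factory=set)  # e.g. {"motor_intent"}
    incognito: bool = False  # "I don't want you collecting anything right now"

    def grant(self, scope: str) -> None:
        self.allowed_scopes.add(scope)

    def revoke(self, scope: str) -> None:
        self.allowed_scopes.discard(scope)

    def may_collect(self, scope: str) -> bool:
        # Incognito overrides every grant: nothing is collected at all.
        return not self.incognito and scope in self.allowed_scopes

consent = NeuroConsent()
consent.grant("motor_intent")
print(consent.may_collect("motor_intent"))   # True: explicitly opted in
print(consent.may_collect("inner_speech"))   # False: never granted
consent.incognito = True
print(consent.may_collect("motor_intent"))   # False: incognito mode wins
```

The point the hosts are making maps to the defaults here: collection is off unless a scope was explicitly granted, and the incognito flag wins over everything else.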

 

Along the same lines as the data center, I think there's gotta be a right to be able to erase that data, and protections against having your data stored against your will. And there are also the transparency elements about data centers and all that stuff, too.

 

So I think there's a lot of different things that I'm thinking about from that perspective as well. And it would almost be fun to think up a bill of rights. I don't think we're fully capable of doing that right now on this podcast, on the fly; it's not like we've prepared that extensively for this show.

 

Not to the extent that we would go through and look at a bill of rights and everything. But you can think about some of these things in connection to older technologies, right? So, like the autonomy: you can think about how some of these algorithms on social media are influencing our thinking and decision making. Like I said, with those advertisements, those are coming from somewhere, and it's the connection of browsing history from another data perspective.

 

I can see it being like, yeah, we need to avoid that, and I think that should be absolutely true for social media too; you should have that sort of autonomy. When you compare this to other rights, codified in various governments across the world and not so much in others, this is very similar to freedom of speech.

 

You have the right to express your thoughts without interference, not without consequence. That's another interesting piece though.

 

[00:14:24] Barry Kirby: Although, I was going to say, and I'm going to use the wrong terminology, but freedom of speech isn't an absolute right.

 

It's a conditional right. We're seeing that more and more at the moment around social approval: the way we talk about race, for example, where we've got the protected characteristics. So when we talk about an absolute right to freedom of speech, actually, we don't necessarily have that.

 

In that same way. So, in the same way that we have protected characteristics in terms of freedom of speech and things like that, how would protected characteristics work for a thought? Wow, head blown. Let's go a bit simpler than that, because I think we need to come back to that one.

 

[00:15:09] Nick Roome: Okay, let's come back to that one. Where do you want to go? Where else? A couple of

 

[00:15:12] Barry Kirby: simple scenarios. When you get arrested, you can take the Fifth in America; in the UK, we have the right not to say anything unless it harms your defense, blah, blah, blah. Thankfully I've never had it read to me, so I don't know the Miranda rights off by heart, but the police have a right to question you when you've been arrested, and all that sort of stuff.

 

And then they can take whatever you say and use it as evidence. If you've got a device, and say it's actually a fairly simple device, the equivalent of USB or whatever it is, and they can just hook you up, should they have the right to do that? What do you think?

 

[00:15:56] Nick Roome: I thought you were going simpler!

 

This is, to me, a more complex situation.

 

[00:16:00] Barry Kirby: Actually, let's split up police from justice for the moment. Okay. So just the police: you've been arrested, Nick. You've done whatever you've been doing, and they want to know what you were doing at a particular time, et cetera, et cetera.

 

And we have the technology, it's easy, blah, blah, blah. They're a trusted entity, because we elect them and fund the police force. Should they be able to plug into you and get where you were at a particular time, and what you were doing and thinking,

 

[00:16:33] Nick Roome: et cetera.

 

Okay. This kind of goes back to my point about consent. I think there's a right to consent. If you say, "I've done nothing wrong, you can plug in," and they then find something, then you did do something wrong; you've consented, and that was your choice. But I think this connects to other things, like end user license agreements.

 

Okay. Let me

 

[00:16:55] Barry Kirby: take that a little bit further, because there's two things you've raised there which I think are really interesting. Firstly, if I don't consent, does that imply guilt, because I'm hiding something? It shouldn't.

 

But if you give a "no comment" interview, for example, then that can be characterized and used against you in the courtroom. I think your Miranda rights still say that you don't have to say anything at all.

 

Cause you've got the, in the UK, we've got it's our. They change slightly differently. You can no comment, but if you later rely, want to rely on something in court that you haven't said previously, then that makes it more difficult. So they're trying to encourage discussion. So that's one thing.

 

So you can do the equivalent of a "no comment": no, you can't plug in. What are the consequences of that? I think that's almost captured already in the idea of taking the Fifth. Then, what happens if they plug in after you've consented? You've said, "you can plug in and prove that I wasn't jaywalking, or I wasn't speeding, or whatever it was, or I didn't murder that individual."

 

And they find that, yes, that's legit; that's proven your alibi, you didn't do it. But then they accidentally find something else. Because, as we said, is the technology going to be the sort that says, "I only want to look at thoughts between 9:32 and 9:47"?

 

I can't see it being as simple as that. There's going to have to be some level of analysis, some level of interpretation, almost a psychological-interview type of thing. And what if the person doing the interpretation finds, "oh no, they didn't do that, but actually they've been doing tax fraud for the past 40 years"?

 

Is that admissible? That's one thing, and I think it's interesting, because you can go down a whole web of things there. Twist it, though; let's make it a force for good. You've had knee complaints, and you're going to go and see the doctor. You get to the doctor and they say, "it's okay, just let me hook in, and then I'll be able to understand what you were doing, what you were thinking at the time, why you did such an activity." Or even, would this work if you're unconscious?

 

That'd be cool, or not. Do you have to give positive consent to plug in? Because if you rock up to the ER unconscious, and they have the ability to plug in and work out what's going on, for genuinely good reasons, but then they find stuff out. Do you have to consent before that happens?

 

Or can people just jack into your brain, just for fun? What do you think? Yeah,

 

[00:19:37] Nick Roome: I think you've brought up a lot of really interesting questions. Let's start with the medical, because that's where my head is at right now. Okay. So there are informed consent practices in medicine.

 

And I think that is a good analogy to take when it comes to plugging in at the doctor's office. Do you give consent? Yes. Patients must be informed, prior to whatever procedure they're getting done, or in this case a BCI being used, what data is being accessed, and they've gotta give consent for that.

 

And I think that is going to be fairly analogous to the rights that are present today; I feel like they're just going to be built on. When it comes to other things, like if you're unconscious, I feel like you must have other safeguards in place. In some cases, there are people with DNRs, do-not-resuscitate instructions, basically saying that if they go offline, don't reboot them.

 

So I think something like that could potentially exist for this: if I'm unconscious, do not plug me in, even if it could be life-saving. And I think that falls under that right to consent, too. Now, when it comes to the law, I don't know all the answers, and I feel like... Before we jump into the law a bit? Okay, go.
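There's no real medical standard behind any of this, but the DNR-style directive Nick describes is easy to picture as a lookup the ER would do before connecting anything. A toy sketch, with every identifier and field name invented for illustration:

```python
# Hypothetical advance-directive check: mirrors a DNR, but for BCI hookups.
DIRECTIVES = {
    "patient-123": {"if_unconscious_allow_bci": False},  # "do not plug me in"
    "patient-456": {"if_unconscious_allow_bci": True},
}

def may_connect_bci(patient_id: str, conscious: bool) -> bool:
    """A conscious patient consents in person; otherwise honor the directive.

    Default is False: with no directive on file, do not plug in.
    """
    if conscious:
        return True  # informed consent is obtained directly, like any procedure
    directive = DIRECTIVES.get(patient_id, {})
    return directive.get("if_unconscious_allow_bci", False)

print(may_connect_bci("patient-123", conscious=False))  # False: directive says no
print(may_connect_bci("patient-456", conscious=False))  # True: patient opted in
print(may_connect_bci("patient-789", conscious=False))  # False: no directive, default deny
```

The design choice worth noting is the default: absence of a directive means "don't plug in", which matches the opt-in framing the hosts keep returning to.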

 

[00:21:02] Barry Kirby: With this informed consent with the doctors, what am I consenting to? The reason I ask the question is, if I consent to a blood test, they take the blood, but I'm consenting to a blood test for a specific purpose: you are using my blood to do these four tests.

 

They're not allowed to then go off and do, say, a pregnancy test, or another test that you haven't approved the use of. So if I'm saying yes, of course you can jack in, what am

 

[00:21:42] Nick Roome: I... okay, be careful with your language here. We don't want to get...

[00:21:45] Barry Kirby: That's true. But it's a good Matrix phrase.

 

So that's fine. If I'm hooking up, what am I allowing you to see? And how do I know whether my rights have been infringed or not? Because it's not like you can say, "look, I've got evidence: you took five tests instead of four, you took two vials of blood instead of one." At the moment, we assume the thoughts are not getting sucked out of your head.

 

They're being read, or interpreted in some way. You presumably don't know what has and has not been read or interpreted, whatever the terminology is. Okay.
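Barry's worry, that unlike counting vials of blood you can't tell what was actually read, is essentially an auditability problem. Here is a purely hypothetical sketch (no real device works this way, and all the scope names are made up) of reads that refuse anything outside the consented list and leave a patient-inspectable trail either way:

```python
from datetime import datetime, timezone

class AuditedReader:
    """Hypothetical BCI reader: only consented scopes, every attempt logged."""

    def __init__(self, consented_scopes):
        self.consented = set(consented_scopes)
        self.audit_log = []  # the patient-visible record of what was accessed

    def read(self, scope: str):
        entry = {
            "scope": scope,
            "time": datetime.now(timezone.utc).isoformat(),
            "allowed": scope in self.consented,
        }
        self.audit_log.append(entry)  # denials are logged too
        if not entry["allowed"]:
            raise PermissionError(f"no consent on file for scope {scope!r}")
        return f"<{scope} data>"  # stand-in for actual neural data

reader = AuditedReader(["alibi_window_0932_0947"])
reader.read("alibi_window_0932_0947")       # fine: this is what was consented to
try:
    reader.read("tax_records_memory")       # the "accidental tax fraud" probe
except PermissionError as e:
    print(e)
print(len(reader.audit_log))                # 2: both attempts are on record
```

The audit log is the equivalent of counting the vials afterward: even a refused read leaves evidence that someone tried.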

 

[00:22:28] Nick Roome: All right, let's cart-and-horse this, because I think there's two things going on here. One, we're talking about rights in the sense of: in a perfect world, this is what should happen.

 

I think some of the stipulations you're throwing out there could be seen as this middle ground, where technology could potentially overstep when we haven't necessarily figured out how to pull the correct thoughts out. So I guess: what comes first, the rights or the devices? The devices are already out there, or they're starting to be developed, which I think is why we need these rights, because the rights will drive the limitations of the devices.

 

If we say that you cannot access these parts of the brain, or whatever it is, then these companies will build the devices to do exactly that. Now, they might fail, they might need to get government approval for some things, there might be red tape and caution tape everywhere to get those things done, but they'll happen in a way that will be in adherence with that bill of rights, those neuro rights. If it goes the other way, it happens like AI, where there's this big boom and everyone's "oh, look what we can do with this", including all these nefarious purposes: we can fake your grandma's voice and get your PIN without any laws in place, right?

 

And I'm sure that breaks some laws, but that's up for interpretation right now, because it's not codified in an AI bill of rights. But if we had done the AI bill of rights before all this technology, and we had specified who's liable in those situations, then there might be some limitations on the technology.

 

So I think we're putting the cart before the horse. I'm just talking from a "what should be" perspective, and not a "what happens if" perspective,

 

[00:24:36] Barry Kirby: if that makes sense. I think the way a lot of good policy, and all these rights, get made is that, rather than being told in the abstract, we throw in real-life,

or supposedly real-life, cases that we don't understand yet. And actually, somebody is doing some of this on social media. Sean Sabatini on my LinkedIn raises a couple of interesting points that I think we should reflect on. So, you've got a non-neurotypical person.

 

How can you obtain the consent of a non-verbal person about getting their thoughts accessed, scraped, whatever? So that's going to have to be something that's brought into this: how do we get that to happen? There's got to be other ways of doing it. Can

 

[00:25:20] Nick Roome: non-verbal people not write their signature on a piece of paper?

 

Possibly. That outline? Yeah.

 

[00:25:26] Barry Kirby: Okay. But again, I think it's acknowledging the different types of people that we have, and how do...

 

[00:25:32] Nick Roome: Yeah, there's more than one way to gather consent. And I'm...

 

[00:25:36] Barry Kirby: An interesting one here: we talk in terms of reading brain patterns, reading your thoughts, but we're really talking about the activity within synaptic pathways.

 

What about people who have non-typical synaptic pathways, or who have more of them, like people who are more multilingual, people who are musical, things like that? There's elements around that, but I honestly don't know the answer to that, and that is probably an entire new episode in itself.

 

Presumably once the technology gets a bit better. I think that's interesting, but the bit that he goes on to say is: if we can understand and read all of that, and we promised we won't get into the technology, but I think this is cool, how far would we then be from influencing people?

 

So not necessarily giving them specific thoughts via technology, but influencing what they're doing. It goes back to what you said around people being able to put the right sort of adverts in front of you. If we understand what their thoughts are, could we activate specific pathways in order to influence what they're thinking, to get them to do things like

 

buy things or do certain actions. So yeah, interesting. And then there's a whole cybersecurity element that he highlights that I think is a very good point. Let's save that one for another time as well.

 

[00:26:53] Nick Roome: Good social media thought. Yeah, I think a lot of that comes back to that autonomy: being able to think without interference.

 

And then the consent piece, that's an easy solve for me. The only scenario I can visualize is somebody who's blind, deaf, and cannot read Braille, and has no other way to communicate; in that case, there might be some issues with consent. But I think in most cases, a piece of paper that says "we are collecting this data", similar to what you sign when you go to the doctor for blood work:

 

"We're conducting these tests, and with that we'll be accessing your memories for the last six weeks." I don't know what the phrasing for this data would be like. What is a memory, versus, say, your vitals for the last six weeks?

 

Is that stored somewhere in your brain? Can you access that? I don't know. What do we have access to? How do we categorize that? A whole other question, I think,

 

[00:27:59] Barry Kirby: outside of this. It is and it isn't, because you're right: we've majored here largely on conscious thought.

 

What about subconscious thought? What about the other data that exists within your brain that isn't about your conscious or subconscious thought, all the other stuff that is going on? And then you've got dreams as well.

 

Cause, yeah, I possibly don't want to go down

 

[00:28:27] Nick Roome: that. And then you have different brain functions happening in different parts of your brain. So what if you gave access to everything except for your prefrontal cortex?

 

You don't know what critical thinking is happening; you're just getting the raw input. Or specific brain functions, right? You might tap in only to my prefrontal cortex, get the input, recreate my prefrontal cortex, and have it create an output based on an input.

 

[00:29:06] Barry Kirby: So then, given a set of context, it would be mapping how you think, not what you think. I've done a project on that, just saying. It's quite cool stuff. Not as blatant as the reading-of-the-brain thing, but certainly the other side of it. And it sparks a lot of conversations, which we have had previously, but it is again worth bringing up: what happens when you die, and all these things are stored?

 

Do they get reused? I think you and Heidi did the actual episode on this sort of thing, because I was on holiday and commenting in the background. But again, this is something I've certainly spoken to David Burden about, who's a bit of an expert in this type of thing as well.

 

So this comes up again around not only your right to privacy, but your right to be deleted, your right to die. Do you need to be able to put into your will, for example, the same sort of provision as whether you give somebody else access to your Facebook account?

 

Is it still called Facebook this week? I think it is; I don't think that's trended off anywhere. You give somebody access to your Facebook account, that's fine. But do you want other people to have access to your thoughts that have been beautifully stored in the cloud, not abused in any way?

 

But they're there, and therefore people can find out truly what you thought about them after you've died, and there's no consequence for you. Or do you have the right to turn around and say, on my death, press that big red delete button? That's something else that should go into this neuro bill of rights.

 

[00:30:42] Nick Roome: Yeah, I agree. I think the right to opt out is one that rings true with those statements, right? Any system, technology, or platform that collects your data, you should be able to say no thank you to. I have some closing thoughts that I want to get to, but I want to hear from you.

 

What are your closing thoughts on this, because I can't believe we've already talked for almost 40 minutes on

 

[00:31:07] Barry Kirby: this. Wow, that went quick. I think my last thought is around government usage. We've transitioned a lot in the past, I would say 10 to 15 years, to where everything you access from government is now online; online is standard.

 

So from working out when your bins are being collected all the way through to accessing critical services, online is the way you do it, and anything else is by exception. If you can't access the internet, there is a paper version that you can find somewhere, in a very hard-to-do way.

 

When BCIs become the norm, how far away are we from the government saying the way that you access this service is through a BCI, no other way, because it's the easiest way for us to implement it and to know that you're a legitimate person doing their thing?

 

They'll make the case for it. It might be a hundred years away from now, but access by BCI is the only thing. Are we seeing the thin end of the wedge at the moment? We better

 

[00:32:15] Nick Roome: make some good bill of rights for that, because that's what it's going to be built on. My final thoughts here are along the same lines of the bill of rights: there are some rights that we didn't even bring up.

 

We touched on security a little bit, but there's the right to have your data stored in a secure facility. I think the right to transparency, knowing what your data is being used for, how it's being used, and for what purposes, is really important as well. And we only talked a little bit about non-discrimination: you shouldn't be discriminated against because of the presence or absence of certain thoughts.

 

And I think that touches on a lot of things; you can compare that to other equality laws out there, non-discrimination laws, those types of things, to make sure everyone's treated fairly. Just to end this out, I brought up the opt-out thing, but then there's also the right to remediation.

 

If any of these rights are broken, then you should have the right to seek some sort of remediation. I think this is a really interesting topic, and I can't believe we talked about it for what seems like not that long at all, and it's already been 40 minutes. So with that, thank you to our patrons and all of you for selecting our topic this week, and thanks to our friends over at Scientific American for our news story this week. We put all the links to the original articles in our weekly roundups on our blog.

 

You can also join us on our discord for more discussion on these stories and much more.

 

we're going to take a quick break.

 

We'll be back to see what's going on with the human factors community right after this. We also want to give a huge thank you, as always, to our patrons. We especially want to thank our Human Factors Cast All Access and VIP patrons. Patrons like you truly keep the show running. And you know what?

 

I will go ahead and just plug the website, humanfactorscast.media. We post more than just our news roundups on there; you can find every episode we've ever done, every interview we've ever done. It's all indexed there for your convenience.

 

So if you go in and search something like BCIs, you'll find every episode that we've ever done on BCIs. It's a great resource if you want to look into a certain topic area, like AI, BCIs, cars, transportation, I don't know, name something else. Barry, what else have we talked about on the show? Human factors

 

[00:34:42] Barry Kirby: comes up

 

[00:34:43] Nick Roome: occasionally.

 

Yes. Do that. Go do a search on human factors on the Human Factors Cast website, humanfactorscast.media. All right, let's switch gears and get to the next part of the show, simply called...

 

It Came From. Yes, this is the part of the show where we search all over the internet to bring you topics the community, the ominous community, as it's referred to, is talking about. If you find any of these answers useful, give us a like wherever you're watching or listening to help other people find this content.

 

All right. This first one here is by user L. Timpierre10 on the Human Factors subreddit. They write: Human Factors traveling jobs. Are there traveling jobs for Human Factors professionals, especially in aerospace and aviation? I want to avoid desk roles and would appreciate any insights or recommendations. Barry,

 

Do you know of any jobs, roles that travel for work?

 

[00:35:36] Barry Kirby: Yeah, mine. I do a fair bit of travel. It's one of these things that, what is it you're trying to get out of it? It really depends. Are you talking about where you do cool things like go and sit in the cockpits and do analysis of pilot workload, things like that?

 

That's one thing I've done a fair bit of, and I'll give you a hint: after a while it gets a bit samey. Or are you wanting to go to far-flung places and just talk to a broad range of different people? I know a lot of people who do the conference circuit, or down the academic side of things, where people go to conferences and do a lot of foreign travel.

 

Certainly a good colleague of mine has spent a lot of time everywhere from Australia to New Zealand to Japan just in the past few weeks, and they get around quite a lot. If you want to have that sort of variation, rather than going to a large company you might want to think more about going freelance and becoming a consultant; that will get you out and about.

 

But I guess it's more about looking at the sort of things where you're engaging with people, coming up with reasons why you have to travel rather than just being sat at a desk. That's doing activities like workshop facilitation, interviews, focus groups, that type of thing, where you might want to do more of that.

 

But I think no matter what, you still end up sitting at a desk for a good chunk of time, because no matter where you go or what you do, you've still got to write stuff up, you've still got to do the analysis, that type of thing. And, at the risk of sounding slightly cynical, once you've done so much travel, it gets boring really quickly.

 

Certainly with what I've learned since COVID, you can get so much more done using Teams, Zoom, whatever type of facilities, but getting that balance right between remote and face-to-face is absolutely key, because you can't do it all remotely.

 

That was a really long way around saying it depends, doesn't it? What do you think, Nick?

 

[00:37:25] Nick Roome: It depends, times two. So I have two things and a secret. The first thing is it depends on who you're talking to and their location, and whether or not it's specialized. The second thing really is just that: who it is and where they're at.

 

If you have a user base that's everywhere, then it's unlikely that you'll need to travel. And there's always going to be desk work, so I agree with you there; you're always going to be doing something, unless you're just in a data-gathering role, but you still need to process that information and do something with it.

 

So I can't imagine a scenario where someone would be doing all the fun stuff and somebody else would be doing all the analysis on that fun stuff; I just don't see that. So there's the who and there's the where. And there are scenarios where you have very specific users located in very specific types of places. If you need to do contextual inquiry, figuring out what they're doing, what they have on their desk, and what's nearby as they reference this thing that you're building, or, as you said, Barry, doing a cockpit evaluation or a workstation or workspace evaluation, that type of thing, there's a need to go out and do that stuff.

 

Here's the secret: if you want to go to fairly exotic locations and travel the world, get into naval defense contracting. Naval bases are usually located by a body of water, and usually there's a lot to do by those bodies of water, and usually they're in fairly populous locations.

 

Not always, but usually. There's gotta be a community to support that base. And so if you need to go out to very specific users at these very specific bases that do these very specific things, you're going to need to travel. So there you go.

 

[00:39:28] Barry Kirby: I would qualify that, for those that know: last time I did that, I went to Barrow-in-Furness for four years.

 

If all

 

[00:39:35] Nick Roome: Let's get into this next one here. This next one is by IDesirePickles on the UX Research subreddit. They write: Crafting research questions. As a researcher in a new role, the design manager expects me to create research questions and validate assumptions about creating an app.

 

I am struggling with this and seeking advice on how to approach it. Barry. Can I

 

[00:40:02] Barry Kirby: just use it depends again? So actually I like this question; it's a decent question because it's so different from "what does my portfolio look like?" Fundamentally, it goes to the crux of things: you're basically setting your own requirement at this point, and getting the requirement right at the beginning of the project, the beginning of the research element, is absolutely key to getting good research out the back end, because if you understand what the question is you're trying to ask, then you'll understand whether you've answered it.

 

It sounds a simple thing to say, but actually the face-value question, the obvious question, might not be the question that you truly need to get to the bottom of. So if you're trying to build an app, whatever you're doing, fundamentally, what does success look like? Is it the number of downloads?

 

Is it good customer feedback? Is it the number of sales? What is the actual driver? The app doesn't exist just to exist; the app exists to do something. So how do you know if it's been successful? There will be an overarching picture of what success looks like, and that is really the question that you're trying to drive to, but that will be a very high-level question.

 

I then tend to break it down into between four and six sub-themes, because things normally fit into that sort of structure, or maybe that's just the way I think about them. And you will be driven, because people love this, to produce some sort of numeric output, because people think it's useful for them.

 

That's not necessarily useful for you. So this is where you not only define the question, but are quite stringent about why you're trying to get the type of answer you're trying to get. It might be a scalar. It might be something binary: yes/no, like it or dislike it.

 

It might be numeric. It might be something emotive or subjective, qualitative rather than quantitative. As long as you know why you're getting that type of answer and what you're going to do with the data in order to answer your research questions, then that's entirely within your remit, but everything has got to roll up to answering your research question, which is fundamentally achieving your requirement.

 

Nick, what about you?

 

[00:42:10] Nick Roome: I think that's right. Look at the goals of what you're trying to accomplish and start to think backwards into what questions will answer those goals. Ultimately, you're looking at hypothesis validation or invalidation at the base assumption, right? You might have multiple hypotheses.

 

You're trying to figure out whether or not your assumption of how people will do a certain thing, or feel about a certain thing, is true or untrue, and what questions will get at that. So that's the process: you work backwards from that. Just in general, I don't think this is a researcher asking this.

 

I feel like they've been pushed into this role, or somebody is asking them to do this, and they're asking researchers who know how to do this for advice. So, some practical things to look at here. Question banks are a great place to start; I know Maze has a good one.

 

There are tons of question banks and question batteries out there. Pick and choose from those and modify them for validating or invalidating those hypotheses that you have; those might be a good place to start. Alternatively, if you don't know of any question banks, or you're just not hitting the right ones, I don't always advocate for this, but use something like an AI language model.

 

You can say something along the lines of: here are the goals for this study, what are some questions that I can ask users to figure out these things? It'll come up with some things. They're not all going to be great, but it might give you a starting point for thinking about what other questions you can ask.

 

Or what sort of categorizations you can start to put together to measure those things. And if you need to, you can always follow up and ask, how does this get to the goal? That might actually give you some good thoughts on how to approach it. Okay, let's get into this last question here.

 

All right. Researchers, how long do you keep your recordings? This is by Kim Yu on the User Experience subreddit. They write, when do researchers usually delete their usability test recordings? Or in this case, we'll just say any data collection. I have recordings from two studies and my laptop is running out of space.

 

Should I get an external drive? Barry, what do you think?

 

[00:44:33] Barry Kirby: So I think the test recordings here are relatively important, because it is something where you will be thinking, actually, I'll hold onto that, because that's your core. That's your raw material, isn't it? That's the actual words of the people that you've got.

 

However, I'm going to suggest you really be careful with just how long you keep this stuff for. Firstly, legalities: there are legal things around here. In Europe particularly, under GDPR, the General Data Protection Regulation, and general data protection in and of itself, you should only keep data on people for as long as you need to use that data; you shouldn't be keeping data just for the sake of it.

 

So when you do your initial recording, you should be upfront with your subjects: I'm using the data to do this, I will be keeping the data for this long, and at the end of it I will be doing this with it. You should already know what your data plan is; that should be part of your briefing.

 

And really, for recordings in particular, I would say as short as possible; don't hang on to them once you've transcribed them and done your notes. But make sure you've done your transcription, your notes, and your analysis before you delete them, because you'll transcribe them and then think, actually, I need to go back and look at that. Particularly because it's not just about what they've said, it's about what they're doing.

 

It's about their body language, all that sort of stuff. But it should be as short as possible, because people do have a right to be forgotten. People do have a right to make a data access request, in the UK a freedom of information request, whatever you want to call it, depending on where you're at.

 

The more of this data you're holding, the more of a headache you're creating for yourself. So once you've transcribed, noted, and analyzed, basically once you've done what you need to do with the actual recordings, I would delete them. Get rid of them as soon as possible, because from a logistical perspective, as the original poster highlighted, it takes up space and you have to think about where you're going to store it.

 

Yeah, it has to be stored correctly; you have to store it securely. You can't just have it on random thumb drives, et cetera. If you've got nothing else in place, once a project is finished, there's a general rule of thumb I always go to.

 

I would close a project down and keep all data for five years, because there might be legal issues coming up, there might be anything where you have to produce evidence, particularly if you've been involved in design sign-offs, anything like that. As a general rule of thumb, keep things for five years.

 

So on project shutdown, do a cleanse and keep the basics, the basic information that you need, for five years. You will determine what that is based on the type of project and the type of evidence you might need to produce. And then at the end of five years, get rid of everything but the essentials.

 

That was a long way around it, but that's the entire data management program. Nick, what do you think?
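Barry's five-year retention rule is essentially a scheduled cleanup job. Here's a minimal sketch of what that could look like in code, assuming recordings sit in one folder and retention is keyed off each file's modification time (the folder layout, the `.mp4` extension, and the function names are illustrative, not something from the show):

```python
from datetime import datetime, timedelta
from pathlib import Path

# Barry's rule of thumb: keep project data for five years, then purge.
RETENTION = timedelta(days=5 * 365)

def expired(recorded_on: datetime, now: datetime,
            retention: timedelta = RETENTION) -> bool:
    """True once a recording has outlived its retention window."""
    return now - recorded_on > retention

def purge_recordings(folder: Path, now: datetime) -> list[Path]:
    """Delete expired .mp4 recordings under `folder`; return what was removed."""
    removed = []
    for recording in folder.glob("*.mp4"):
        recorded_on = datetime.fromtimestamp(recording.stat().st_mtime)
        if expired(recorded_on, now):
            recording.unlink()  # the "big red delete button"
            removed.append(recording)
    return removed
```

In practice the retention window should come from the data plan promised in your informed consent, not a hard-coded constant, and raw recordings would typically get a much shorter window than anonymized transcripts.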

 

[00:47:17] Nick Roome: So as not to incriminate myself: look, here's the thing. It depends. First and foremost, you have a data plan. Specify in your informed consent how long you keep that data.

 

I guess there's one sort of edge case where it's okay to keep data indefinitely: if it's your data, you can keep that. If you've accessed it from some other usability researcher, you can keep that. I think you should be entitled to your data. And we talked about data all this episode.

 

So I'm just going to leave it at this. If the people that you are talking to are internal, and you're storing that internally, then I don't think there necessarily needs to be a limitation on it. Although I'm talking from a U.S. perspective, where companies come first and we don't quite yet have the GDPR over here.

 

I'm a hoarder, so I personally am really bad about this. I don't.

 

[00:48:18] Barry Kirby: Actually, I think this would make a fascinating episode in its own right, because we have such a drive not to throw stuff away. What happens if you could get a nice little niche bit of data out of it that you didn't get on the first pass, or, if something does come back, you want to refer to it?

 

Why wouldn't you? But equally, I've been through so many projects now where the data never gets touched again. You have the best of intentions, all that sort of stuff, but as soon as that project's over, you're out of there and onto the next project. Ah, but see,

 

[00:48:49] Nick Roome: here's the thing: for me, what I want to do is go through and do a massive search across all the transcripts, search for any time somebody has mentioned this phrase, and be like, oh, this thing that we're looking at now, let's go back and check that out, define it.
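The cross-transcript search Nick describes is straightforward once transcripts are kept as plain text. A minimal sketch, assuming the transcripts are held as an episode-to-text mapping (the function name and data shape are illustrative):

```python
def search_transcripts(transcripts: dict[str, str], phrase: str) -> list[str]:
    """Return the episodes whose transcript mentions `phrase`, case-insensitively."""
    needle = phrase.lower()
    return [episode for episode, text in transcripts.items()
            if needle in text.lower()]

# Hypothetical usage against a couple of transcript snippets:
archive = {
    "E279": "BCIs pose a threat to mental privacy...",
    "E293": "Neurorights for brain-computer interfaces...",
}
hits = search_transcripts(archive, "mental privacy")  # -> ["E279"]
```

Note that, as Barry points out, this kind of secondary use only stays on the right side of GDPR if the transcripts are suitably anonymized and the reuse was covered in the original consent.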

 

[00:49:03] Barry Kirby: I think that's where having your transcripts or your end analysis comes in; I wouldn't throw those away.

 

Cause it will be suitably anonymized and all that sort of stuff. The raw recordings I would definitely throw

 

[00:49:19] Nick Roome: anyway. All right. Good question. All right. Time for one more thing, Barry, what's your one more

 

[00:49:23] Barry Kirby: thing? I can't remember; I need to scroll down now, cause I got so excited by having three It Came Froms that we actually had to think about.

 

So I guess for me this week, it's a level of honesty, or a level of reflection. I wrote an article recently in The Ergonomist, which is just hitting doorsteps now, around the value of leadership. And I like to think I'm not a bad leader.

 

I can do all this sort of stuff, but I was caught up short the other day when we were doing some work. I do get into a habit, and I can be really guilty of it, of not delegating out work, because I think, A, it's quicker to do it myself, and B, I know it will be done to the standard that I want.

 

I do it myself to the point that 24 hours in a day just isn't enough. Long-time listeners to the show will know that my wife and I work together in my company, and she did turn around the other day and say: you're an amazing leader, you can motivate people, but you're a rubbish manager.

 

You can't delegate. You talk to other people about delegating, but when it comes to it, you just can't let go, and you need to work on that. Having been running my own company for over 11 years now, it's nice of her to leave it until now to tell me this. But I just thought it was really interesting, cause she is absolutely right.

 

It takes me a long time to be able to trust people enough to delegate stuff off and still keep the right level of support in place. And it struck me because we talked about burnout last episode, and this episode my one more thing is just: I need more than 24 hours in a day.

 

I just haven't got enough hours in the day to do all the things I want to do, and actually part of that is because I need to reflect more on my ability to delegate work off and let people get on with it. I feel better for saying that. Good. What's your one more

 

[00:51:16] Nick Roome: thing? If you need more than 24 hours in a day, I need more than five minutes for a one more thing.

 

So I'm going to try to keep this short. If you want to hear the full story, go listen to the pre show. But I managed to sprain both of my knees at the same time, and it's a hilarious story. I was squatting, and I came out of the squat, heard both my knees pop, and woke up on the floor face up with my wife over me saying, what happened?

 

I said, I don't know. My legs still hurt. So that's one thing. The other thing that I want to say is just a huge shout out to one of our listeners, Noah, who sent a lovely voicemail after I had gone on last week about burnout, giving me some advice and sharing some similar thoughts.

 

I'm not going to play it on the episode; I don't think that's right. It was a very nice thing to receive. So thank you, Noah, for that. And just an update on the burnout thing: I'm freaking back, baby. We're back.

 

[00:52:07] Barry Kirby: Oh no. I don't think I can take it.

 

[00:52:10] Nick Roome: All right. And with that, we'll go ahead and say,

 

[00:52:12] Barry Kirby: I think that's very good to hear.

 

And I'm pleased. Firstly, thank you to Noah for reaching out. It makes it feel like the community it is, that people can realize everybody needs a pat on the

 

[00:52:22] Nick Roome: back occasionally, and that we're human too. We're not just personalities that come on and talk about human factors every week.

 

Yes. Anyway, that is it for today, everyone. If you liked this episode and enjoyed the discussion about BCIs and the tech they use, and maybe even about mental privacy, I'll encourage you all to go listen to episode 279, BCIs Pose a Threat to Mental Privacy. Comment wherever you're listening with what you think of the story this week.

 

For more in-depth discussion, you can always join us on our Discord community. Visit our official website, sign up for our newsletter, and stay up to date with all the latest human factors news. If you like what you hear and want to support the show, there are a couple of things you can do. One, you can stop what you're doing.

 

Leave us a five-star review right now; that's free for you to do, and we really appreciate those. Tell your friends about us; that also costs you nothing. And if you want to send money our way, we have a Patreon with a bunch of different benefits that you can look at on our website.

 

So please go do that if you have the financial means to and want to support us that way. As always, links to all of our socials and our website are in the description of this episode. Mr. Barry Kirby, thank you for being on the show today. Where can our listeners go and find you if they want to talk about your invasive thoughts?

 

[00:53:31] Barry Kirby: You can just plug into my head on Twitter, or X, depending on how rebellious you're feeling; I'm Baz underscore K on there and across all the socials. Or if you want to listen to some interviews with esteemed people, head around to 1202 The Human Factors Podcast at 1202podcast.com.

 

[00:53:48] Nick Roome: As for me, I've been your host, Nick Roome.

 

You can find me across social media at Nick underscore Roome. Thanks again for tuning into Human Factors Cast. Remember, no show next week. Until next time: it depends.

 

 


Barry Kirby

Managing Director

A human factors practitioner, based in Wales, UK. MD of K Sharp, Fellow of the CIEHF and a bit of a gadget geek.