Ethics and Empathy in Technical Communication

Room 42 is where practitioners and academics meet to share knowledge about breaking research. In this episode, Dawn Armfield explains how empathy and ethics create technical communication for real humans rather than monolithic audiences.

Airdate: September 29, 2021

Season 2 Episode 7 | 48 min

Watch on our YouTube Channel | Listen on Apple Podcasts | Google Podcasts | Amazon Music | Spotify | Stitcher | iHeartRadio | Buzzsprout

Transcript

[00:00:10.640] Liz Fraley

Good morning everyone, and welcome to Room 42. I’m Liz Fraley from Single-Sourcing Solutions. I’m the moderator. This is Janice Summers from TC Camp. She’s our interviewer. And welcome to Dr. Dawn Armfield, today’s guest in Room 42.

[00:00:24.330] Liz Fraley

Dawn Armfield, PhD, is an associate professor of Technical Communication in the Department of English, where she teaches usability, user experience, research methods, visual communication in technical communication, instructional design, travel writing, and prototyping. She’s amazing.

[00:00:41.490] Liz Fraley

Her research focus is on human-centered design in emerging immersive and embodied technologies, with a focus on empathy and ethics.

[00:00:49.630] Liz Fraley

She has published in interdisciplinary fields with an emphasis in emerging technologies, visual communication, online collaboration, and educational technology.

[00:00:59.940] Liz Fraley

Her most recent publication is a co-authored chapter, Human-Centered Content Design in Augmented Reality. Prior to becoming a professor, she was an instructional technologist, a systems analyst, a project manager, and a web content developer. And today she’s here to help us start answering the question: How do ethics and empathy create better communication? Welcome.

[00:01:19.970] Dawn Armfield

Thank you.

[00:01:21.270] Janice Summers

We are thrilled to have you here with us today. I want to kick things off by talking about the shift from user to human. When did that shift start to happen? Because I was thinking about that, and I thought, she’s right. We have started to shift, I think.

[00:01:44.750] Dawn Armfield

Yeah. Well, I know in education, maybe about eight or nine years ago, I had some colleagues who were talking about user experience and experience architecture and things like that, and they were avoiding that term.

[00:02:11.280] Dawn Armfield

And at that same time, I was reading documents from people who were actually out in the field saying, user can be one of those terms that people start to think about in derogatory ways, and it has negative connotations because of drug use and all of these other things. And it also dehumanizes the people that we’re actually creating things for. So there was a shift, probably about a decade ago, I would say.

[00:02:46.860] Janice Summers

And it’s really interesting that you’re bringing all these things up, because words have meaning. I know that sounds ironic, but words have meaning. And with them come a lot of other cultural meanings that affect us.

[00:03:00.460] Dawn Armfield

Yeah.

[00:03:01.280] Janice Summers

So that word user means more than just that. And it’s also a detached word, isn’t it?

[00:03:07.980] Dawn Armfield

Right. Well, and user, it could be a negative connotation in terms of relationships. That person’s a user, right?

[00:03:17.580] Janice Summers

Right.

[00:03:18.610] Dawn Armfield

So if we want to be better communicators, then we need to think about the people at the other end of whatever it is that we’re creating as humans, rather than just as users of the stuff that we’re making.

[00:03:34.260] Dawn Armfield

And I think that’s where the empathy and the ethics come in because we’re trying to think about them as people, not just as somebody who’s doing a function or creating something from whatever it is that we’ve developed.

[00:03:51.600] Janice Summers

Yeah. Like somebody’s here just to use and move along.

[00:03:55.230] Dawn Armfield

Right. If we want them to engage with whatever it is that we’re making or that we’re sharing, then we have to think about them in a better light.

[00:04:07.300] Janice Summers

Right. I was listening to an interview with John Carroll (Director of Penn State’s Center for Human-Computer Interaction) that was done not too long ago, and he was saying, They’re learners, right?

[00:04:18.660] Dawn Armfield

Yeah.

[00:04:19.760] Janice Summers

That was the word that he was using. So that’s the shift into humans. Now let’s talk about empathy and ethics. You pull this into your teaching?

[00:04:29.900] Dawn Armfield

Yes.

[00:04:30.820] Janice Summers

Tell me more about that. How do you teach empathy? That’s a tough one.

[00:04:36.570] Dawn Armfield

It really is. I was actually thinking about this this morning as I was getting ready to come here, and I was thinking about, when did I start doing this?

[00:04:46.670] Dawn Armfield

And I would say, let’s see, I was in another position and I was at a conference in Ireland, and I was wearing wearable technology because we were presenting wearable tech, and the wearable tech that I was wearing was actually a camera that took snapshots. And it was pretty innovative at the time, so that tells you about how long ago this was. It was probably eight or nine years ago.

[00:05:15.980] Dawn Armfield

So it’s taking continuous photographs while I’m at lunch, but I’m at lunch with people that I’ve known since grad school. Some of them were my professors in grad school. And all of a sudden, the conversation stops, and one of my former professors turns to me and says, That does not take audio, does it? And I said, No, why? And she said, Because I don’t want what we’re saying here to be recorded and shared back.

[00:05:39.730] Dawn Armfield

And I started thinking about that in terms of the technologies that I was testing, to talk about, to share about, and saying, We do all of these things, we test all of these things, but are we thinking about who it is that we are testing with, and the participants who may or may not know the full extent of whatever it is that we’re testing?

[00:06:04.720] Dawn Armfield

And they all knew that I was wearing wearable tech and they knew what it was. They knew, for the most part, what it was doing, that I was taking pictures, they just didn’t know that it didn’t take audio as well.

[00:06:15.130] Dawn Armfield

And I started just really thinking about that. So I came back to my classroom and I said, Let’s talk about this because this is really important.

[00:06:22.510] Dawn Armfield

When we’re doing usability testing, are we thinking about how the people that are participants, how they feel at that moment, or how they are responding to this in a visceral way? Are we thinking about them in terms of being human? And are we thinking about that they might come into this situation scared or nervous or tired, or they’ve been forced into it because their mom said This is a good thing for you to do or whatever, right?

[00:06:54.640] Dawn Armfield

Or they’re a classmate and they feel like this is the way that they will get a good grade. Whatever it is there are these outside factors that can affect them.

[00:07:05.500] Dawn Armfield

So I want you to start to put yourself in their place and think, how can I make this situation better for them? And I extend that past my usability and my UX (user experience) classes. And I think in visual technical communication, we start off with this whole cultural identity section.

[00:07:26.660] Dawn Armfield

And everybody has to… what we do is we all share a piece of our culture, because, number one, we’re all in this class together, so we have a classroom culture. But everybody is coming to the class from a different space, so I want us all to be appreciative of culture and the differences that shape it, and of why a classmate might respond in this way or that way to something that you post visually.

[00:07:57.090] Dawn Armfield

So it can take many forms when I’m doing it, but really, it’s about understanding others that we are working with or that we are working around or just people.

[00:08:13.520] Janice Summers

So that empathy is saying, Look beyond just the user of this technology and the technology’s functions.

[00:08:22.700] Dawn Armfield

Right.

[00:08:23.900] Janice Summers

But the person as a whole human, with all of the information that they have. It’s more than just… I was teaching classes the other day, and we looked beyond demographics, because we all have the typical demographics, but that doesn’t describe the human. What is their life like?

[00:08:46.140] Janice Summers

You can say, well, this is a worker, but what is their whole life like? Their hobbies, their interests, their social life. What drives and motivates them? Because that influences our interaction with technology, right?

[00:09:02.910] Dawn Armfield

It does. And I think even in my UX class, when students have to create personas, one of the things that I ask them to do is get deep into this person. Who are they really? Not just what are their demographics? Who are they? Why would they respond to this site in this way? Understand them because that’s important.

[00:09:26.310] Dawn Armfield

It’s easy to just do cursory usability testing or think about our audience as just this faceless mass out here who is reading or doing something, but they’re more than that.

[00:09:41.980] Dawn Armfield

And once we understand them, then we can actually create better materials for them in different ways that will reach them where they’re at. Because really it’s about them, it’s not about us.

[00:09:59.140] Janice Summers

Yeah. Because the technology, ideally, is there to create some solution for them.

[00:10:05.570] Dawn Armfield

Right.

[00:10:06.760] Janice Summers

But why are they there? I like the fact that you’re looking beyond just the surface and you’re asking: Why? What are their motivations? More than just what they’re trying to do. So let’s talk about the other side, the ethics side.

[00:10:26.320] Dawn Armfield

Yeah. So the ethics actually came in my first year at this position, I was talking about empathy, and I was teaching an Ethics in Emerging Technologies class. It was a special topics class, and I was really excited about it.

[00:10:44.960] Dawn Armfield

And we were talking about drones, and we were talking about AI. And we were talking about immersive technologies. But then one of the conversations turned to How do we make sure that drones are used responsibly? How do we make sure that AI is used responsibly?

[00:11:05.100] Dawn Armfield

And so then we started talking about all of the laws that govern technologies or don’t govern technologies, as the case may be, because there’s a lot that’s left to chance as we’ve seen in the last year with a lot of the court cases and what-not.

[00:11:22.800] Dawn Armfield

And my students started saying, Well, we need to have lay people on these governing boards or these spaces so that they can say how real people feel about these things.

[00:11:35.670] Dawn Armfield

And I said, isn’t that why we elect our elected officials? Aren’t they supposed to be? And they said, Yeah, but they’re in people’s pockets. So then I said, okay, let’s start thinking about the ethical implications of where the money is, how we’re talking about technologies, how we’re sharing information about technologies, how we’re writing the documentation for technologies, all of this. Where are those ethics?

[00:12:03.140] Dawn Armfield

And anybody who has taken Technical Communication… especially in my area, I know we came up against the Challenger memo time and time again: was it ethical that it wasn’t read properly, or was it ethical… so the whole memo about the O-rings and the Challenger is brought up over and over again to talk about ethics, but ethics goes deeper.

[00:12:35.460] Dawn Armfield

And so one of the things that I’ve been doing is I pull up documentation that talks about these guidelines to go through to think about: Is what I’m creating ethical? Does it do any harm? Does it give people space to think about things in different ways? Is it allowing them to be ethical themselves? Is it asking them to do something that might be illegal?

[00:13:11.180] Dawn Armfield

So with every step of whatever it is that they’re creating, are they thinking about how the audience may perceive it and may use whatever they’re creating to do something that might not be proper?

[00:13:33.130] Dawn Armfield

The way that I bring this up to my students is, we hear this all the time with social media. I was one of the first, I think, 100,000 people on Twitter, and back in that day, Twitter was not what it is today; it was more of what the originators thought it should be. These different kinds of communication.

[00:13:56.240] Dawn Armfield

And you hear this conversation all the time. Well, Twitter, it’s not supposed to be used this way, or it’s not supposed to be used this way. Or Facebook isn’t supposed to be used this way. iPhones aren’t supposed to be used this way, but we can use it this way.

[00:14:13.750] Dawn Armfield

Who is the arbiter of how it should be used? So if we’re thinking ethically, if we’re creating these things, or if we’re creating the documentation for these, or if we’re creating spaces for these (because I do immersive experiences), are we creating them and thinking about the ethics that could be involved? You might use it this way as the tester, but how might our audience think about using it? And is it going to be ethical?

[00:14:45.520] Dawn Armfield

That’s where it all came from. It came from these long conversations with students because they’re brilliant and they bring things out of me.

[00:14:54.790] Janice Summers

Well, they think outside the box, because I think we get set in our patterns. So if you want your pattern shaken up, talk to a bunch of college students.

[00:15:04.220] Dawn Armfield

Exactly.

[00:15:05.710] Janice Summers

Because they’ll get right outside of that box. They haven’t been boxed in so much. And it’s interesting because when you create things and you have a code of ethics, then somebody goes out and uses them in unethical ways.

[00:15:17.440] Dawn Armfield

Yes. So this week, my usability class was doing the ethics module. And we went over the UXPA (User Experience Professionals Association) code of conduct, or the code of ethics that they have. And we went over the… I’m trying to think… Usability Geek had a page about ethics. There are a few good links about ethics out there, so we went over all of them and we said, okay, so how realistic is it to be able to do this in every single usability test that you’re going to do? Can you think about this ethical approach?

[00:16:03.210] Dawn Armfield

And they’re like, Yeah, because it’s pretty common sense. And I said, It is common sense. And I’ll share those with you, Liz. And they are pretty common sense, but are you thinking about them when you’re writing your questions for your testing? Are you thinking about them as you’re setting up the environment for your testing? Are you thinking about them as you’re collecting the data and writing it up for your final audience?

[00:16:35.280] Janice Summers

Well, and it’s interesting. I would say that there’s nothing common about common sense, right?

[00:16:40.720] Dawn Armfield

True.

[00:16:41.540] Janice Summers

And I think you’re bringing up some good points as far as, well, maybe you need to have a list of ethics thought-provokers as you’re writing, to help you think in ethical terms that are not so common, right?

[00:16:59.790] Dawn Armfield

Yeah.

[00:17:09.370] Janice Summers

Something that triggers you to think consciously about that. What do you think about that?

[00:17:15.210] Dawn Armfield

And I think that’s really important. So the interesting thing is, a lot of my students are actually in computer science, because those are the students I get the most of.

[00:17:28.930] Dawn Armfield

My undergraduate students are in computer science. My graduate students… and we have combined classes, so they’re undergraduate/graduate, at least in this class… my graduate students are Tech Comm (Technical Communication) master’s students.

[00:17:43.320] Dawn Armfield

So I have them work in teams, but the undergraduates will always say, I’m not getting this approach in computer science. But you should be because you are creating things that are going to affect people.

[00:17:57.460] Dawn Armfield

So maybe that’s my job as the person in the humanities. I’m bringing the humans back into whatever it is that you’re doing.

[00:18:04.120] Janice Summers

Right, back into the science: the humans who are going to be interacting. So when you go out into the commercial world, companies have ethical codes, and I think tech writers have ethical codes. So there are a lot of different ethical codes out there.

[00:18:22.460] Janice Summers

How would you make sure that you’re adhering to a higher ethical standard? Would you combine these and create your own code of ethics as you’re working to help guide you?

[00:18:35.520] Dawn Armfield

That’s a really good point. And I think that if you’re working for a company, of course, you’re going to use theirs, because that’s your higher authority at that point.

[00:18:45.630] Janice Summers

But what if there’s a silence?

[00:18:50.280] Liz Fraley

Then there are ethics for your discipline and your industry also?

[00:18:55.900] Dawn Armfield

Yeah. Right. And I think that there are organizations… So STC (Society for Technical Communication), sorry…

[00:19:11.860] Liz Fraley

We understood what you said.

[00:19:12.810] Janice Summers

The American Medical Writers Association has a really comprehensive set of ethics, for sure.

[00:19:17.320] Dawn Armfield

You know where I actually started looking at codes of ethics? I belong to the Association of Internet Researchers, and early on, they were creating a code of conduct for the research that we were doing on the Internet.

[00:19:35.510] Dawn Armfield

And we were starting to think about that and to think about the ways that we do research with people who may not ever see us. We’re studying them but they may never even know that we’re studying them because everything they’re doing is out in the public. How can I be ethical about that, and how can I ethically approach that research?

[00:20:00.040] Dawn Armfield

Because IRBs (Institutional Review Boards) at the time didn’t really know how to deal with that either. They’re getting better, but still, in immersive technologies, they have no idea what things might go on in those. And sometimes it’s creepy, and sometimes it’s really cool.

[00:20:22.520] Dawn Armfield

Yeah. But I think if you’re trying to be an ethical technical communicator, there are many, many ways to do so. And if you want some governing rules to follow, I thought UXPA’s are some of the best out there.

[00:20:42.220] Dawn Armfield

And yeah, I think theirs are some of the best because they’re just so clear and they’re really geared toward the kinds of work that we do.

[00:21:00.620] Janice Summers

Okay. So what do you do… this is just a thought thing. What do you do… so say I’m a writer. I’m working out in the field. The company has a code of ethics. I have a code of ethics as a professional writer, and I’ve incorporated some of the better ethics into my way of operating. What do you do if you run into something where what you’re being asked to do crosses that line, or you feel like it crosses that line?

[00:21:31.910] Dawn Armfield

So this is something I actually talk to my students all the time about because I say, we’re kind of living in an idealized world in a classroom.

[00:21:41.640] Janice Summers

It’s nice.

[00:21:43.520] Dawn Armfield

I can say you need to be ethical, but we can go into the workplace, and those lines are a lot more gray or blurred than they are in the classroom.

[00:21:55.310] Janice Summers

And what you’re working on.

[00:21:57.550] Dawn Armfield

Right. Exactly.

[00:21:59.850] Janice Summers

That has an impact, right?

[00:22:01.450] Dawn Armfield

Yeah. Well, I’ve had some students go to work for the military-industrial complex. So they’re working in spaces that could be iffy, depending on your perspective.

[00:22:17.200] Dawn Armfield

And so what I say to them is, you need to think about the context, you need to think about who you’re creating this for. Is it ethical to that audience? If you’re feeling like it’s crossing a line, then you talk to your supervisor. If you’re not getting any feedback from them, or if they’re just blowing it off, whatever, you have to decide: is this a point where I become a whistleblower or not? Is this the point where it could actually cause damage, or people could be hurt? Because I joke around with my students and I say that technical communication is not brain surgery, it’s not life or death, but it can be, because some of the things that we create can be.

[00:23:11.740] Dawn Armfield

So we have to think very carefully about what it is that we’re creating. And if it has the ability to do harm or to create chaos or however it might cross some of those ethical lines, then you need to make a decision about it and to always keep in mind those people at the other end; how will this affect them? Because that’s where that empathy comes back into play with the ethics. How can my decisions right at this point make a difference in their lives?

[00:23:54.810] Janice Summers

And I think that one of the really good litmus tests is, will it cause harm? Does it have the potential to cause harm? And I think in that situation there’s a higher authority called the law. A lot of times these companies will have legal departments, and if it’s that egregious, that might be a good path to pursue: what are the potential liabilities to the company? It might help them wake up to something; you’re exposing a potential liability.

[00:24:33.690] Dawn Armfield

Yes, definitely. If you approach it from that perspective or an economic loss, then they might be more willing to see it as something that needs to be addressed right away.

[00:24:47.060] Janice Summers

Right.

[00:24:51.620] Dawn Armfield

Because who cares about those humans? It’s about the money.

[00:25:02.090] Janice Summers

And it’s important that we bring that all back into perspective. Now, with all of the artificial intelligence, how is that with empathy for humans? There’s a lot of gray area, I think, in artificial intelligence, isn’t there?

[00:25:20.200] Dawn Armfield

Definitely. So when I’m thinking about AI, I’m thinking about, is it this small thing? Do my lights turn on automatically for me so that I’m heading into a lit apartment rather than a dark apartment so I feel safer? That’s a small thing, and that’s automated. And it’s not really an extensive AI.

[00:25:54.660] Dawn Armfield

But now my Pixel phone, which is designed by Google and has amazing AI, can answer my phone for me and tell the person on the other end that it is checking the call to make sure they are a valid caller and to find out what they are calling about. Or it can call and make reservations for me.

[00:26:21.740] Janice Summers

I never knew my phone could do that.

[00:26:24.330] Dawn Armfield

Yes. So when I start to think about those, I think, does the person on the other end know that they are not talking to me, but that they are talking to artificial intelligence? And if they did know, how would that make them feel?

[00:26:46.460] Dawn Armfield

So do I use it or do I not use it? Is it a thing of convenience for me but a thing that could be really creepy for them, or they might start wondering, Is my voice being recorded? Do they now have my voice in their files?

[00:27:10.270] Dawn Armfield

So am I thinking about how this might affect that person that I’m using the AI with? And that’s just one really small thing that we can use AI for in everyday life.

[00:27:31.820] Janice Summers

But look how far-reaching it is.

[00:27:34.880] Dawn Armfield

Right. And I think that it’s up to us as those people who might be employing it or deploying it to think about who is on the other end. And that’s not even necessarily technical communication. That’s just human interaction and thinking about the ways that we’re interacting with our fellow human beings out there in the world, and what they may or may not want.

[00:27:58.740] Janice Summers

Right, because that’s taking the human to the nth degree, because you’re communicating to a human who’s using the Pixel phone and trying to decide: Do I turn this feature on or off? What does the feature mean? And that human is going to affect other humans with the decision that they make. So you’re thinking about that human factor two or three layers deep.

[00:28:27.280] Dawn Armfield

And it can snowball, because how many layers will it go through to reach… I mean, that’s not to say that when I’m pretty sure I’m getting a spam call, I don’t use the screening, because I do, because they’re using automated services themselves.

[00:28:49.800] Janice Summers

Automated dialers with recorded voices.

[00:28:54.100] Dawn Armfield

That’s right. So we have AI speaking to AI or whatever.

[00:29:00.390] Janice Summers

Now we’re really writing for the machines?

[00:29:03.940] Dawn Armfield

Right. Well, and machine learning these days is just creating amazingly human-like responses. But we need to think about those kinds of things. We need to think about those connections and the way that they are reaching the people on the other end.

[00:29:29.300] Dawn Armfield

I like that 90 percent of the emails are written by Gmail, and I totally feel that. My text messages are often written by whatever is popping up in the prediction because Google knows me so well. I tell people, I’m pretty much owned by Google and Amazon at this point. They know everything about me.

[00:29:54.400] Janice Summers

They suckered me in for convenience and comfort. I sold my soul.

[00:30:01.360] Dawn Armfield

And I’m not sure if that’s a good thing or a bad thing.

[00:30:05.440] Janice Summers

Because honestly, when you think about it, there’s so much that automation helps us with in our daily lives, and at its highest level it really does help us do better and be better. It’s this other side.

[00:30:22.030] Janice Summers

And I think it’s important, what you’re talking about: we need to consider all of these things. So maybe we need to find a way to communicate to that person who’s deciding whether they want Google to rule their world or not. Here are the pros and here are the cons.

[00:30:40.480] Janice Summers

Maybe if we confront them with choices and potential… the risk and the reward… so that they can weigh the decision for themselves, because ethics are colored and influenced by the individual as well, because we all have a driving compass in us as individual humans.

[00:31:06.440] Dawn Armfield

And I think that’s the important thing to think about: I might not want to use that service that allows me to make reservations, because I want to think about the person on the other end. Would it be convenient? Sure.

[00:31:28.620] Dawn Armfield

I also don’t live in a big city and don’t have to make too many reservations. So it might not impact me as much as it might impact somebody who lives in LA or San Francisco or New York City because everything is done with reservations.

[00:31:47.940] Dawn Armfield

And in those places, the people on the other end might be used to it. They might be well aware that that’s what is happening in any case. So I think that it does come down to an individual responsibility.

[00:32:06.010] Dawn Armfield

There’s a lot of this conversation going on out there right now. How do our individual responsibilities affect the greater good, or how do we affect one another? And I’m trying to talk to my students about this all the time: every little decision that you make in technical communication can affect thousands of others.

[00:32:30.240] Dawn Armfield

Every little choice that you make, every word choice, can affect however many other people will be at the other end of whatever it is that you’re creating. Or my computer science students who are developing things and thinking about the word choices in whatever it is that they’re developing; I ask them to take responsibility and to care for the people at the other end, because if we’re not doing it, then who will?

[00:33:00.360] Janice Summers

Right. That’s true. Because oftentimes the creators of technology are so wrapped up in the act of creation that they don’t step away and think about it. And I think as writers, they’re trained. They’re trained from the beginning to think outside of that, to think not just about the features and the functions of whatever technology you’re developing, but about the human on the other end. Thank goodness we’ve transitioned from user to human on the other end, right?

[00:33:31.220] Dawn Armfield

Yeah. And I think it makes us better people when we’re… we talk about these ways that we’re in these online spaces, and we have an entire generation who’s growing up not feeling super connected.

[00:33:51.170] Dawn Armfield

But if you start to connect yourself in those ways and you start to think about anybody else… When we first started on the Internet early on, when the chat rooms were going on, people would be like, it’s just another person on the other end, or whatever. It might not even be a person. It could be a bot.

[00:34:17.350] Dawn Armfield

But the whole thing was, you have to think about the people at the other end of that chat conversation. It’s an actual person. There’s a real person, and we’ve been doing this ever since the Internet came around: thinking about the people at the other end.

[00:34:30.680] Dawn Armfield

Technical communication is no different than that. We are thinking about the audience at the other end all of the time. And if we think about them as human beings, as the people that might surround us; our families, our friends, if we think about how they might use this, whatever it is that we’re creating, I think it just makes us better people because we are creating a space that is more human-friendly and human-facing, thinking about the ways that it’s really focused on the humanity of it and not necessarily the technology of it.

[00:35:16.310] Dawn Armfield

How is it helping the human rather than elevating the technology? And now I’m going to get all philosophical.

[00:35:30.430] Liz Fraley

Yeah. Well, it’s been a really good conversation because you’ve got me thinking about the word suggestions. In email, the word suggestions. They’ve got it in Google Docs now, and word suggestions on my iPhone that figure out what the next word should be. And then I was thinking about Microsoft’s bot that went horribly, horribly wrong.

[00:35:58.240] Dawn Armfield

Yeah.

[00:36:00.640] Liz Fraley

So those suggested words can… I’m going to start watching and paying attention to those now, because I’m curious: what direction do they go?

[00:36:10.580] Dawn Armfield

Well, I don’t know if you’ve noticed this, but on Facebook these days… I post on Facebook once a day, I post a picture, but Facebook now has suggested responses to posts.

[00:36:28.450] Liz Fraley

Really?

[00:36:29.500] Dawn Armfield

Yes, not just emoticons or emojis or the specialized emoji, but they actually have worded suggestions now. And I looked at them and I was like…

[00:36:44.350] Janice Summers

I notice that little ribbon, now that you bring it up. Yeah.

[00:36:47.650] Dawn Armfield

Yes. How is this impacting our actual conversation with one another?

[00:36:54.690] Janice Summers

Yes. Oh, my goodness. Yes. I think I saw it on LinkedIn too.

[00:36:59.880] Liz Fraley

I was thinking more of the… There are all kinds of issues about communication on Facebook and the funneling of people to specific ideas. And I wonder how that will play into it.

[00:37:15.800] Liz Fraley

What are the suggested comments going to be? Are they going to differ between groups, between people, between kinds of people? And I don’t know that any independent researcher can see that because it’s all tuned.

[00:37:29.060] Dawn Armfield

Well, I was actually wondering that. I was wondering if my suggested responses would be different from, say, my brother’s. And because he and I do research together, I thought this might be a fun place to do some research and to think about those kinds of things, because those algorithms, how are they…

[00:37:48.420] Janice Summers

How are they deciding what to say?

[00:37:49.550] Dawn Armfield

Yeah. Well, and the thing is that it sounds a lot like me. And so then I think, What are they doing? Do they know how I respond to people?

[00:38:02.520] Liz Fraley

It makes you want to create some false personas to see how it tunes.

[00:38:09.300] Janice Summers

How it tunes to different personas; that would be interesting. And it’s funny that you bring that up, because I noticed the other day I was sending out an email, and I use Gmail, and it was finishing my sentence. I looked up and it had started finishing my sentence. I’m like, no, that’s taking it in the wrong direction; those aren’t the words that I would choose for this particular sentence. Good choice, but not the ones I want right now.

[00:38:31.340] Janice Summers

And I thought, wow, what if I just automatically went along? It’s guiding my whole conversation, basically. When it’s deciding what I’m going to say, it guides my entire conversation so that I lose my voice.

[00:38:48.760] Dawn Armfield

Yeah.

[00:38:49.760] Janice Summers

But on the flip side of that, boy, is it handy to have an automatic response. I can just send it off for the times where I don’t have time to really think.

[00:38:58.960] Dawn Armfield

But then I start to question, okay, is it the system that has agency, or do I have agency here? And yes, I get to choose those, but we’ve seen autocorrect do all kinds of crazy things.

[00:39:17.430] Janice Summers

Yeah.

[00:39:19.320] Dawn Armfield

So when those kinds of things… and Jessica brings this up, where the suggestions can work because they’re focused on a specific situation… then I start to think, what if that specific situation is set up in this way, but that wouldn’t be my response to that situation in this particular case? Because content matters, and the context of that situation matters, and understanding the nuances of the situation matters. That’s where we are actually experts. That’s the kind of stuff that we think about.

[00:40:01.900] Janice Summers

Yes. Context rules.

[00:40:04.820] Dawn Armfield

Yeah.

[00:40:06.640] Janice Summers

Because that is so frustrating when people lift things out of context and then misuse and misinform, even with the best intentions.

[00:40:17.500] Dawn Armfield

And don’t you think that the bots can do that? Because they are pulling out suggestions based on what they think that you’re going to say, and you might just click and not realize that it actually added that to your conversation or to your response, right?

[00:40:34.170] Janice Summers

Yeah.

[00:40:36.060] Dawn Armfield

So it requires those of us who think about the context of a situation to really pay attention to that kind of thing at all times, and not to let the suggestions have the agency over our knowledge of the context.

[00:40:58.740] Janice Summers

Yeah.

[00:41:00.700] Liz Fraley

Some cool research coming along from there I can see.

[00:41:10.410] Dawn Armfield

There was a comment earlier in here about how, if you hire young white guys to do these things, it will focus things in a different way. And there’s some truth to this.

[00:41:20.460] Dawn Armfield

We hire different kinds of people, or we hire homogeneous groups of people, and they’re not thinking about how a woman might respond to this one area or a person of color might respond to this kind of thing in a totally different way than that person who created the algorithm might.

[00:41:43.740] Dawn Armfield

So we need to really think about these things and go deeper when we’re thinking about, why am I making this choice? Is it because it is easy and it gets me through that email quicker? Or is it because that actually was the word choice that I wanted to use? They can spark the thought process, definitely. I don’t know what the original intended purpose was.

[00:42:17.360] Dawn Armfield

But we’ve had them, we’ve had these automated responses since Clippy. Clippy was doing some of that for us.

[00:42:29.340] Janice Summers

Yeah, everybody remembers Clippy.

[00:42:29.340] Liz Fraley

I think that Clippy is lost on anyone outside a certain age group.

[00:42:34.830] Dawn Armfield

But obviously people will respond to Clippy. And really, if we think back, Clippy was the precursor to all of this.

[00:42:44.040] Janice Summers

Right. And it just keeps getting… Clippy just gets more sophisticated as artificial intelligence grows.

[00:42:52.170] Dawn Armfield

Clippy is back. Oh, no.

[00:42:58.440] Janice Summers

All hail Clippy.

[00:43:04.030] Dawn Armfield

Yeah.

[00:43:05.040] Janice Summers

What are Clippy’s potentials for empathy and for ethics?

[00:43:11.330] Dawn Armfield

You get that smiley face and you can’t help but feel good. Or frustrated. It’s one or the other.

[00:43:22.110] Dawn Armfield

It was all in the googly eyes. So some people… Clippy made me mad. That’s right, because it would be frustrating. You would say, no, I want this to happen. But Clippy would say, no, you really want this to happen. Is that what the auto suggestions are doing to us? They’re telling us, no, you really want to write in this way.

[00:43:47.170] Janice Summers

They’re trying to rule us. Well, it’s like those automated chat bots.

[00:43:55.540] Dawn Armfield

It’s the rise of the machine. We should just throw it in now; it’s the rise of the machine. And all of those sci-fi movies are going… Terminator is coming. That’s all there is.

[00:44:11.320] Janice Summers

We just have to make sure that we program ethics into them so that they stop and think too. When machines start to rule, how do you make sure they understand ethics and empathy?

[00:44:25.490] Dawn Armfield

How would you program empathy into AI? Yeah.

[00:44:29.450] Janice Summers

That’s why I started there, because teaching empathy is hard. Teaching empathy to humans… from a psychology perspective, there are psychologists who may argue it is impossible to teach empathy.

[00:44:42.160] Dawn Armfield

So one of the things that I do, to get back to that, is I come back all the time, and my students might create something, and I’ll say, okay, that is not affecting me in quite the way I think that you think that it should be.

[00:44:58.600] Dawn Armfield

This is how I feel when I read this or when I view this or when I’m immersed in this. This is how it’s making me feel. Is that what you meant to do? Or ask your testers or the participants in your usability testing, how are they feeling in this? Because that matters.

[00:45:23.790] Janice Summers

How do you feel?

[00:45:25.640] Dawn Armfield

And that’s how I think that we start to teach empathy is to start to care about those other people on the other end.

[00:45:35.380] Janice Summers

That should be a question: How do you feel?

[00:45:40.330] Liz Fraley

I’m bringing it back to that one-to-one person, complex individual to complex individual. And you can practice.

[00:45:50.400] Janice Summers

Yeah.

[00:45:51.740] Dawn Armfield

Exactly. And I see that as my role. I see that as my role: to influence good human beings in the end. I can teach them everything I know about technical communication. But really, in the end, what I tell them is, this isn’t really a writing class, it’s not really a communication class, it’s really a class about being a good human. How can we take that forth into the world? And that might seem all wishy-washy to STEM people. But my STEM students respond to it really well. And I think a part of that is because they’re not getting it anywhere else.

[00:46:38.060] Janice Summers

Well, and everybody who creates something wants it to be used and wants it to be appreciated. They want their creations to be appreciated and used properly. So from that perspective, I think they do care.

[00:46:53.080] Dawn Armfield

Yeah.

[00:46:55.950] Janice Summers

Our time is up. I can go on and on.

[00:46:59.420] Liz Fraley

Yeah.

[00:47:00.160] Janice Summers

There are so many other…

[00:47:01.430] Liz Fraley

I keep trying to stop it, and I don’t want to.

[00:47:04.090] Janice Summers

I know, right? I cannot believe my time is up already, but I’m getting that red flashy thing. I have so enjoyed this conversation, and there’s so much here for people. I hope we’ve communicated some techniques that some practitioners can implement in their practical lives.

[00:47:21.670] Janice Summers

And I’m so thrilled that you’re teaching from that perspective, before they get out into the working world: let’s implant this desire to be empathetic and ethical from the very first, right out of the gate. And that makes us all better.

[00:47:38.740] Dawn Armfield

Yeah. It does.

[00:47:40.770] Liz Fraley

Thank you so much for being here.

[00:47:43.130] Dawn Armfield

Thanks for the opportunity.

[00:47:43.440] Janice Summers

Thank you so much for your time. Thank you. Thanks everybody for attending. And we’ll see you the next time.

[00:47:50.900] Liz Fraley

Bye-bye.

[00:47:52.310] Dawn Armfield

Bye.

In this episode

Dawn M. Armfield, PhD, is an Associate Professor of Technical Communication in the Department of English where she teaches usability, user experience, research methods, visual communication in technical communication, instructional design, travel writing, and prototyping. Her research focus is on human-centered design in emerging, immersive, and embodied technologies with a focus on empathy and ethics. She has published in interdisciplinary fields with emphasis in emerging technologies, visual communications, online collaborations, and educational technologies. Her most recent publication is a co-authored chapter, Human-Centered Content Design in Augmented Reality. Prior to becoming a professor, she was an instructional technologist, systems analyst, project manager, and web content developer.

The semantic shift from user to human has created a space in which the people, the audiences, that we create content for have more depth and diversity than those of us in technical communication used to focus on. In order to create spaces, documents, and environments that appeal to that diversity, we need to look at the ways people connect with one another, the problems that arise in those connections, and the solutions that we can deliver. By incorporating empathy and ethics studies into all of my classes, students and future technical communicators gain an understanding of why we’re creating not for a monolithic audience but for real humans.

Resources

Faculty Page: https://carts.mnsu.edu/academics/english/english-facultystaff/dawn-m-armfield/

Twitter: @dawn_armfield


Mentioned during the session

These are the links her students used in class this week:


Support TC Camp

We at TC Camp work hard to produce educational public service projects. We hope you’ll consider supporting our efforts to continue to produce innovative programs for the TPC community.
