Collisions in Patient Education: Surveillance, Medical Devices, and Communication

Room 42 is where practitioners and academics meet to share knowledge about breaking research. In this episode, Krista Kennedy and Noah Wilson discuss the complexities of algorithmic ecologies in medical wearables and navigating data surveillance disclosure in patient education materials for compulsory medical wearables.

Airdate: April 28, 2021

Season 1 Episode 21 | 42 min


Transcript

[00:00:13.310] – Liz Fraley

Good morning, everyone, and welcome to Room 42. I'm Liz Fraley from Single-Sourcing Solutions, I'm your moderator. This is Janice Summers, our interviewer, and welcome to Krista Kennedy and Noah Wilson, today's guests in Room 42. Krista is an Associate Professor of Writing and Rhetoric at Syracuse University. She is PI, I don't know if I know what that means, PI of the Disability, Data, and Surveillance Project, Principal Investigator I'm guessing, affiliated with Syracuse University's Autonomous Systems Policy Institute, and for this 2020/2021 year she's an NEH Visiting Professor of Writing and Rhetoric at Colgate University.

[00:00:53.270] – Liz Fraley

She is fascinated by the ways humans work closely with technology and the rhetorical implications of the policies and laws that shape that work. Her experience as a deaf academic informs her current project, which examines the intersection of deafness, artificial intelligence, passing, and the ethics of medical data collection.

[00:01:12.440] – Liz Fraley

Noah is a Ph.D. candidate in Syracuse University's Composition and Cultural Rhetoric program, and he's a Visiting Instructor of Writing and Rhetoric at Colgate University, where he teaches first-year writing, rhetorical history and theory, and surveillance rhetorics. I love this next part: he's curious about the ways technologies shape our rhetorical actions, and his dissertation addresses recent trends in social media content recommendation algorithms that have led to increasing political polarization in the US and the proliferation of radicalizing conspiracy theories like QAnon and Pizzagate. Today, Krista and Noah are here to help us start answering the question, how do medical wearables and privacy concerns intersect with patient materials and communication? Welcome.

[00:02:00.150] – Krista Kennedy

We are so happy to be here.

[00:02:01.470] – Noah Wilson

Yeah, this will be a lot of fun.

[00:02:03.540] – Janice Summers

Yes, we’re very happy to talk with both of you. So first, let’s talk about how you guys got together. If you could share with people how that origin story.

[00:02:17.660] – Noah Wilson

For this project specifically? Or just how I met Krista? Trying to be more specific. So I had taken a grad seminar class with Krista on writing and technology, and I started to drift into looking at, like, surveillance. I was really interested in this; I didn't know there was something called surveillance studies at the time. So I'm working on that project, getting feedback on it, and she proposed that we should work together on this because it was something she was interested in. I don't know where to fill in all the gaps in between, but that work kept growing. We've written a couple of things together; I ended up doing my exam article about some of that work. But there was still this nagging thing about surveillance and medical wearables and A.I. that we hadn't really scratched, and my dissertation was going in a slightly different direction, but we had done a chapter about online aggression in a collection called Digital Ethics.

[00:03:08.720] – Noah Wilson

And that’s where some of this thinking started happening, and then we started extending it more towards the article that we have now. I don’t know if I missed anything Krista, if that’s a good broad overview of how we started with this project at least. But I’ve worked with Krista for a while, she’s a great mentor and we talk a lot. We do a lot of projects and I’m glad she’s at Colgate with me right now.

[00:03:32.580] – Krista Kennedy

Yeah, that is pretty much what happened. You know, somewhere around 2016, I started writing about hearing aids. Actually, even though I've been deaf since I was two, I was not remotely interested in writing about disability or the connection between disability and technology. So my earlier research was on digital and textual technology and what it meant to author in crowdsourced contexts. But then I began to write about this and got interested in human-machine collaboration and the way that working very, very closely with an assistive technology results in an interesting kind of relationship, in which the machine shapes human performance but human performance also shapes technological performance.

[00:04:16.100] – Krista Kennedy

So I was thinking about that while teaching the seminar that Noah just mentioned, and, you know, we just started talking, one of those wonderful conversations that spring up. He was doing very separate work on surveillance. So I thought, hey, here is this really interesting student, and I've got a database I need to build, you can help me do that. So he did, that one summer. And we just kept talking, and we've kept doing a lot of thinking, both together and separately, about this ever since, and it's definitely been productive. So this is the work we do as part of the larger Disability, Data, and Surveillance Project. And we have another partner, Charlotte Tschider, who is an Assistant Professor of Law at Loyola University Chicago School of Law. She's the legal mind on this team. She and I have known each other since grad school, and for a long time we were just friends. We went to the farmer's market every week when we were in graduate school together and then went on to separate careers, me as a professor and her in various corporate contexts.

[00:05:13.970] – Krista Kennedy

And then at some point I was babbling on Facebook about working on this with Noah, and she said, oh, hey, what about this interesting aspect of it? And we've all been working together ever since.

[00:05:29.210] – Janice Summers

So with medical devices, they’re gathering all of this information, but they’re not telling you that they’re gathering this information?

[00:05:37.970] – Krista Kennedy

It really depends on the type of device. We're interested in compulsory medical wearables, the medical wearables that you don't necessarily opt into wearing. For me, that's a smart hearing aid, but we're also thinking about things like, you know, all of a sudden the words are not coming to me, insulin pumps and things like that. Those typically don't provide a whole lot of data directly to the wearer, although, you know, it depends; insulin pump wearers do need some access to their data so they can calibrate things. But that's very different from a Fitbit that you opt into wearing, that you're not going to die if you take off, where the entire purpose is to feed you data on your body's performance so that you can use that in order to improve in all the ways you'd like to improve. So in the case of the smart hearing aids that we look at, almost no data actually is provided to the wearers. It goes to the audiologist, and then it also goes back to the manufacturer in the form of big data for R&D purposes.

[00:06:47.210] – Janice Summers

So they’re not giving you, they’re not communicating this, they’re not explaining all of this to the end consumer. Like in Fitbit, you can log in but probably a bad comparison, but I happen to have a Fitbit on, so I don’t even think about the data that it’s collecting on me, you know, and I was reading in your article that said Fitbit and I’m like, oh, hey, I wear one of those. So, brings up a really interesting because what is all of the data that they’re collecting and am I seeing all of the data? And can I opt out of them? like this is an option, right, to wear this. It doesn’t, like you bring up a really good point. My life does not depend on the Fitbit, right?

[00:07:28.960] – Krista Kennedy

Right.

[00:07:30.670] – Janice Summers

 So I could choose not to wear it. But you can’t choose not to have your insulin pump.

[00:07:36.340] – Krista Kennedy

No, you really can't make that decision; you can make a decision to wear a different insulin pump. But there are two sides to this, two aspects of it. One is the difference between American and European privacy doctrine. You know, we get our devices, whether it's your Fitbit, your hearing aid, your insulin pump, and it asks, do you agree to this, and you click yes. And if you click no, then basically you're saying, I'm not going to use this device.

[00:08:02.830] – Janice Summers

Right.

[00:08:02.830] – Krista Kennedy

And there’s not really a meaningful way for you to opt-out of data collection. The GDPR in Europe is a little more fine-grained in which it does mandate that you have some sort of access to data, that you have a meaningful way to opt-out, and that there is room for at least some sort of exchange of information between the wearer and the corporation that’s actually making the device. So part of that is just our overall expectation when it comes to what you might call bio capitalism and the sort of data that you continue to exchange with the device, but maybe it’s also useful for me to say a little bit about the types of data.

[00:08:50.110] – Janice Summers

Yeah.

[00:08:52.390] – Krista Kennedy

Yeah, I mean, our case study focuses on the Starkey Halo hearing aid, which was the first smart hearing aid that was really widely adopted internationally, I want to say in 2013. I got mine in 2014, and it was part of what got me so interested in this topic. And I also want to say that we're not picking on Starkey by using this as a case study. They're not an evil company; they are a large international company that makes fantastic products in a heavily regulated industry, and they're following the guidelines. We use them as a case study precisely because they are absolutely unremarkable when it comes to the sort of data collection they're doing and the way they don't disclose it to their wearers.

[00:09:39.730] – Krista Kennedy

That hearing aid has multiple algorithms within it, and the current generation, which is the Livio, has even more; it's actually marketed as an A.I. hearing aid. But, I mean, for instance, in mine there are algorithms that work with my hearing range. I don't have a lot of high range, so it's grabbing all those higher sounds, which include women's voices, and compressing them down into my range. So when you're talking, I'm probably not hearing your actual voice.
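For readers who want a concrete feel for the frequency-lowering Krista describes here, a minimal sketch in Python follows. It is an illustration only, not Starkey's implementation: the cutoff frequencies, the linear remapping, and the function name are all assumptions made for the example.

```python
# Illustrative frequency compression: fold energy above the wearer's
# usable range down into it. Not Starkey's algorithm; all parameters
# here are hypothetical.
import numpy as np

def compress_high_frequencies(signal, sample_rate,
                              audible_max_hz=2000.0, capture_max_hz=8000.0):
    """Remap spectral energy in (audible_max_hz, capture_max_hz] down into
    the upper half of the wearer's audible band, then resynthesize."""
    spectrum = np.fft.rfft(signal)
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / sample_rate)
    out = np.zeros_like(spectrum)
    for i, f in enumerate(freqs):
        if f <= audible_max_hz:
            out[i] += spectrum[i]  # already audible: keep as-is
        elif f <= capture_max_hz:
            # squeeze [audible_max, capture_max] into [audible_max/2, audible_max]
            frac = (f - audible_max_hz) / (capture_max_hz - audible_max_hz)
            target_hz = audible_max_hz / 2 + frac * (audible_max_hz / 2)
            j = int(round(target_hz * len(signal) / sample_rate))
            out[j] += spectrum[i]  # fold the high bin onto a lower one
    return np.fft.irfft(out, n=len(signal))
```

With these example settings, a 4 kHz component, above the assumed 2 kHz usable range, lands at roughly 1.3 kHz, which is exactly why a compressed voice does not sound like the speaker's actual voice.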

[00:10:07.480] – Krista Kennedy

It’s also running an algorithm that distinguishes between syllable that separates out your speech a little bit, that filters out all the ambient sound around me and especially behind my head, so there is a lot of directional mics which in that happens.

[00:10:24.340] – Janice Summers

Right.

[00:10:25.450] – Krista Kennedy

There is feedback that’s being throttled, which is really useful if you’re wearing a hat in the winter in upstate New York. So there’s a lot that’s happening there, and the real innovation about the Halo was also that you could set up what were called dumpedgate memories that are geotagged, because that particular hearing aid, all of my hearing aid, tend to work with a smartphone. And you no longer have all the little switches and buttons that you would use to control volume, you do that from the app on your phone, which means you have telemetry, which means you can do things like set up, in my case, a memory or the grocery store down the street that has a lot of tile floors that makes carts rattle really, really loudly. I can filter that out and it knows when I’m in the store and it just automatically toggles to that setting. So that also means that your hearing aid is in some ways, I’m talking about the ecology of the hearing aid knows where you are in tracking that data.

[00:11:26.270] – Krista Kennedy

So all of your sound levels are being tracked, where you are is being tracked, your rate of speed is being tracked. Because one of the handy things about this hearing aid is that if you're going more than five miles an hour, if there is too much wind in the car, it starts to throttle out road noise and car noise and things like that. So these are things that are really, really useful. I mean, I wear one of these, I like it, I don't want to go back to wearing another hearing aid, but there is a trade-off.
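The speed trigger can be sketched the same way, under the same caveats: the threshold and program names below are illustrative, not Starkey's.

```python
# Hypothetical speed-based program switch: above ~5 mph, assume wind and
# road noise and throttle them. Threshold and setting names are made up.
CAR_MODE_MPH = 5.0

def select_program(speed_mph, geofence_settings=None):
    """Pick a sound program from phone telemetry."""
    if speed_mph > CAR_MODE_MPH:
        return {"wind_suppression": True, "road_noise_filter": True}
    return geofence_settings or {"default_program": True}
```

Simple as it is, it shows why this ecology needs a running speed estimate from the phone: one more stream of telemetry in the trade-off Krista names.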

[00:11:53.650] – Janice Summers

And all of those things make your life better.

[00:11:56.860] – Krista Kennedy

Yeah.

[00:11:57.280] – Janice Summers

Right.

[00:11:58.150] – Krista Kennedy

It does make my life a lot better.

[00:11:59.350] – Janice Summers

Some great features that really can help make things a lot more comfortable for you. So they’re beneficial for sure.

[00:12:07.090] – Krista Kennedy

And I willingly opt in to wearing one of these. Before my feet hit the floor in the morning, my hearing aid is on, and it is the last thing I take off at night, so depending on how long my day is, that's 14 to 18 hours of data. And I'll keep doing that; I like these hearing aids, I like the way they let me run my life, I'm glad they exist. At the same time, I do think that it's possible to be more transparent about the data that's collected, how it's transferred, how it's stored, if it's resold, what purposes it's put to, and to let people know what's being collected as they wear these devices that are really good.

[00:12:52.950] – Janice Summers

Right, and I think that was the advice you had in the conclusion. There were several things you found from studying all of this that you felt, if they acted on them, would make things better for people and more transparent. I think transparency is kind of key. I hate to keep using the Fitbit, but that's the only device I have. One of the things I can see: my scale is actually a Fitbit scale, so it all communicates together over Bluetooth, and I can see it on my smartphone, and I can track all that, and I know what data they're collecting. I don't know if there's additional data, right? So, knowing all this, transparency, I think, is one of the key things in communicating that and writing clear communications.

[00:13:51.150] – Krista Kennedy

Starkey and Fitbit, I’m assuming I don’t have a Fitbit, but I have an Apple Watch doing really well, is there Legal discourse exist in great plain language they’re short they’re accessible, now these are not the long legal documents that people just scroll scroll scroll and say yes on. And I think it’s possible to take–there are still a lot of challenges that we can talk about in a second, but one would hope it would be possible that it could point to maintain that great plain language practice and say, here are the types of data we’re collecting, here’s how we store it, here’s how we transfer it. We’re going to resell that because that’s a commodity for us, you know, and make those sorts of disclosures, you know, also let people know what sort of algorithms are in play, what do they do. We reverse-engineered the Halo actually to a preferrable functionality and because Charlotte for a number of years with a member of the target team, which also collects data in a number of ways that she was familiar with a lot of functionality of an app and data collection. And the nice thing is that she does have acquaintance with Starkey.

[00:15:01.990] – Krista Kennedy

So after we were done, she was able to kind of informally go ask, did we get this right, and they went, yeah. So simply saying, you know, here are the really interesting features. I'm fascinated by things like syllable distinction and the sort of filtering that happens, and I think it's possible to say, this is what we're doing, without inching over into trade secret areas that really do need to be kept quiet or proprietary.

[00:15:32.970] – Noah Wilson

I was just kind of thinking, with all this too, it's about a relationship between a technology user, a company, and, in this case, an audiologist. I think having those disclosures, that transparency, helps to strengthen that relationship. That's one of the big arguments, too: it's not that people, if they know these things, are certainly going to opt out. If anything, they would feel more comfortable opting in when they know. But the fact that these things aren't even listed, that's kind of the rub. What I really like about the European law is that they label algorithmic processes as something that has to be disclosed.

[00:16:05.490] – Noah Wilson

And so that was something I didn’t know much about the law, I know it’s changed a lot of things even with websites like that whole provision, but this idea of automated processes and you have to disclose that they’re happening like they are these algorithmic decisions. But then the ability to object I think is huge under limited circumstances. Some of that is just offering them an explanation for what’s actually being automated decision right, like with a hearing aid I could see a lot of that’s going to be like a trade secret kind of thing. But this idea of like you’re able to change your consent, I think is one of the recommendations we make at the end is that if something were to change on circuits, then you should have the option to be like, I don’t want to be part of your data cell anymore.

[00:16:43.750] – Janice Summers

Right.

[00:16:45.200] – Noah Wilson

I think a lot of times with data, you're under terms you feel good about. But then that company gets bought out or their trajectory changes, and sometimes that's not communicated well. Laws kind of don't necessarily require them to ask you again every time, or to be transparent. Maybe all you get is an e-mail, like, by the way, our data was bought by whoever. It should be better than that. I think some of the recommendations we make will help with that. It's really about the relationship between people and technology, kind of--

[00:17:18.590] – Janice Summers

Right, and I think, you know, there's whether or not you can use my data to sell, for lack of a better phrase, versus you're going to be collecting data. I should be able to opt to say it's OK to share this information with others, and what I would want to share with others. Because as a participant, and I am a participant, even with the Fitbit or a device, I'm a participant. Like you said, you could choose different hearing aids. I'm participating, and it's OK, you're being clear with me about the information you're gathering. And I can choose to allow you to use that data to help you, or I can choose to say, no, please keep my data private. That, I think, would be more fair and equitable.

[00:18:07.490] – Krista Kennedy

I think so, I mean, I like that my audiologist has access to my data because it helps us do a lot of really personal calibration for this particular hearing aid. I don’t necessarily loathe the fact that Starkey uses it for R&D because I want me and other people to have better hearing aids.

[00:18:26.000] – Janice Summers

Yes.

[00:18:26.590] – Krista Kennedy

Would I like to know how much is stored? Would I like to know the extent to which it's anonymized? You know, does anonymizing data even do anything, can you truly anonymize data? There are a lot of varying theories out there about that. I would like to know when it's resold and to whom. So it's not that I'm against data collection wholesale, although there are some things I would like to not have collected, just for general privacy concerns, not because I'm up to anything particularly interesting. But I would like to know what happens to my data.

[00:19:05.090] – Janice Summers

Right.

[00:19:05.960] – Krista Kennedy

And I think there’s a certain level, I wish that impact patient or where dignity and that’s the next stage of what we’re working on is thinking in a more theoretical way about how that sort of disclosure or lack of transparency, general opacity surrounding information in any context, how that connects with maintaining patient dignity across multiple medical contexts where you’re working very closely with a compulsory medical wearable.

[00:19:33.650] – Liz Fraley

Right, you said you had to reverse engineer some of it? How clear were they? And what did you find or not find?

[00:19:47.390] – Krista Kennedy

Well, they’re not very clear at all. They weren’t for the Halo, we’ve not yet looked at the material for the Livio. And the Livio will be marketed completely differently because it’s been marketed as an AI hearing aid, whereas automation and  algorithm were not mentioned really in any of the literature connected to the Halo. So it’s a very different way of marketing a hearing aid because we don’t normally talk about the hearing aid, and when you sell a hearing aid, you say here’s how your life’s going to be different, you’re going to be more social, you get to be with your grandkids. You won’t have to retire early, you know, all these sorts of things. But really, there was nothing said about how it worked, either in patient literature and the literature that would apply to my audiologist, she  requires a medical admin to look at it or in the patent filing because we were thinking, OK, what are all the possible ways that a patient is really determined might find this information.

[00:20:43.440] – Krista Kennedy

Probably most people are not going to go for the patent, but we did, partly because we have a lawyer on the team, and it's not disclosed there either in any meaningful way. So we spent a lot of time mapping, figuring things out from white papers. Starkey is great about publishing their white papers, so we gathered what we could from that portion of the website, thinking about how this works across other apps. Noah, you did a lot of research to figure out exactly how things like cell towers and Wi-Fi and satellites come into this.

[00:21:28.760] – Noah Wilson

There’s even like a document just finding like the official government document for talking about GPS satellites and the official way they describe how they work with them, thinking about how triangulation works with different things. And a lot of times a GPS satellite might be in play. But I was surprised that I found out that like cell towers are kind of part of that triangulation tool. But it’s still not like–I think the accuracy is pretty good I mean, we’re able to navigate by it. But even that is not like a 100%. Like I live right near a cell phone tower, and that kind of messes with my triangulation sometimes in some interesting ways where, like, it thinks I’m closer to the cell phone tower than I actually am. Because, it’s weird, like I’m in such a close proximity that it thinks I’m close to that, but that’s either here or there. But if I have the little things there and it’s sometimes like Wi-Fi access points might come into play, and so it depends on each I guess platforms like Apple’s a big part of it, like their way of triangulating your space is something that’s very very heavily tied to it. And correct me if I’m wrong, Krista, but I think it only works with an Apple iPhone right? It only works with like that app or do they have it on Android now?

[00:22:35.460] – Krista Kennedy

They've probably got Android now; that program only worked with an iPhone when we were running the study. I mean, it was an iPhone-only app. That was one of the surprises for me. In the first article I wrote about this, for Communication Design Quarterly, I was only thinking about the ecology that is the human, the hearing aid and all the algorithms and nanocoatings and other things that happen there, and the phone. And then when we really started to think about data for this larger study that we're starting to get out, I was really surprised at how much you have to expand out into the world to think about how this hearing aid actually functions.

[00:23:14.450] – Janice Summers

I think that was, sorry, that was one of the things that struck me when I was reading through your article: when you look at the network, because for me pictures speak volumes, and when you look at the network pictures, it's like all these other pieces that are in it. Because I was just thinking in terms of, well, person to the company. But there are all of these other data points, or places in that connection, where that big data is being shuffled around.

[00:23:45.860] – Krista Kennedy

Yeah.

[00:23:46.700] – Noah Wilson

Yeah, and it all fits through the one app. So you have this whole big, like, collect everything, and then it gets translated through software that figures out what to do with that data. And then, is that the only thing transmitted? But then you have your GPS coordinates. I don't think necessarily that Apple has recorded the TruLink data, but it's interesting, it still is mediated through an app, which is then also taking data from the phone's regular workings.

[00:24:13.982] – Janice Summers

Yes the phones yeah.

[00:24:14.450] – Noah Wilson

Yeah, it’s really I was really surprised at how much the phone is necessary for that, and I think like other technologies, like I even think of like VR, how a lot of the advances in VR because the things we could do with phones, I think it’s really interesting that smartphones have opened up a–I guess, a lower cost of entry to do things like this, like I don’t I think that ubiquity of the smartphone is what makes something like this possible worldwide because it has all this technology built into it.

[00:24:41.580] – Noah Wilson

But again, it’s like, how do you map that out? Like how do they incorporate and talk about maybe the ways the phone is used? I mean, I would expect maybe some discussion of like the different components which we didn’t see but like a different component and how it collects data and filters it through. But then stuff like, rewards like a phones operations in this and like me, I was surprised that’d be a question I asked it’s like, how is the phone integrated? And how do you consider like, the data you’re getting from that and what hurdles do you have for that Starkey? And I wouldn’t think they asked that before we actually kind of mapped it out and saw that like there’s a lot of different networks of tech that are kind of working together in this one middle point. I think the only thing that actually interfaces were like the human is a very very small part of it where like you would think they would be the most important part. But its like all these other systems that make that part work.

[00:25:27.880] – Krista Kennedy

And one of the questions we really, you know, still, I don't think we'll ever answer within the scope of this team, is how much of that data does Apple also have, especially when it comes to the location end of the telemetry. I don't know; it's their device, they have access to it.

[00:25:43.950] – Noah Wilson

Yeah, so how do they count it? Is that Apple's, or is it just you doing, like, a coordinates thing? Is that data the same for them as if you're looking up, you know, directions somewhere? Because the TruLink app is the mediator of that. So, yeah, I wonder, to Apple, is it just like you use your GPS every day for everywhere you go and you're constantly tracking yourself, is that just the way it sees it? And again, we'll never really know what happens to that data or anything.

[00:26:10.500] – Janice Summers

Yeah. When we come into the world of surveillance, boy! You can go down a lot of rabbit holes with that one.

[00:26:17.700] – Krista Kennedy

You can.

[00:26:18.550] – Liz Fraley

Yes, and I was just thinking that, too. We're talking about this in the context of patient materials and communications, and some of this is almost out of scope, or, I mean, in some ways it feels like it's out of scope, right? It's the, who should explain that?

[00:26:37.950] – Janice Summers

And how. So, Liz, you're right on where I was going, which is, how are you supposed to communicate this? There are various people, right? So there's me, there are the me's of the world, who are like, look, I want the benefit of the Fitbit. I don't care about all the other stuff, I really don't. I know my right to privacy; for me personally, I make that decision, and I can personally abdicate any rights to my data.

[00:27:07.260] – Janice Summers

And it’s fine because I want this trade-off. Now, there’s other people that want to know exactly what’s being collected, exactly how you’re handling it and who’s handling my data right, how do you communicate this to such a broad audience from the extreme people who are like, whatever, I’ll sign away, I don’t care to the people who really do want to know all of the details. How do you address that and communicate that?

[00:27:34.060] – Krista Kennedy

That is a great question. You know, as we prepared for this, I really was thinking about the fact that it's all very well and good for a bunch of academics to go, here's what we think you should do about this in a perfect world. And then there are the actual rhetorical problems, and legal problems, on the ground for both the corporation and the technical writers.

[00:27:51.730] – Janice Summers

Yes.

[00:27:52.600] – Krista Kennedy

I mean, I think there are two sets of problems here. One is persuading people to wear hearing aids, and one is documenting this. First, hearing aids have a very low adoption rate. I'm sure a lot of you in the audience have had an older relative who you think would benefit from a hearing aid, who doesn't want to get one, or you have someone you live with who doesn't want to wear them. Because it's a long process to learn to wear hearing aids, right? And as I said a little while ago, you don't sell a hearing aid by going, wow, don't you want the shiniest new hearing aid, like you do a car or an iPhone or something like that.

[00:28:30.670] – Krista Kennedy

Because, honestly people don’t. No human really wants to put something mechanical in their ear, we don’t like having things in our ears remember when you were kids and we run around giving each other wet willies to irritate each other? We don’t like to put things in our ears.

[00:28:46.220] – Janice Summers

Just the thought of it.

[00:28:49.030] – Noah Wilson

Yeah, cringing a little bit.

[00:28:50.710] – Krista Kennedy

Yeah and humans also don’t like amplified sound a whole lot. We do, we like our music loud.

[00:28:55.284] – Janice Summers

Right.

[00:28:56.220] – Krista Kennedy

Right, but learning to hear or process amplified sound is a whole different thing. So there are a lot of barriers, plus the cost. I mean, these hearing aids sell for up to $5,000 a pair, basically, and you need an iPhone on top of that, and whatever other peripherals you need. So there's also a significant financial barrier to successfully adopting hearing aids. So you sell these not by going, don't you want to buy shiny technology, but by going, don't you want to be more social, don't you want to be able to connect with your family?

[00:29:26.500] – Janice Summers

It’s the quality of life.

[00:29:28.720] – Krista Kennedy

Right, it’s absolutely a quality of life pitch and that’s been the most effective thing. Quality of life and also — the part of quality of life that has helped. Don’t You want to stave off All-timers by making sure you keep your language center more active and therefore you are more social, you’re constantly problem-solving, you’re keeping your brain more like active. That’s really the persuasive move that a lot of hearing aid brochures, advertisement, you know, what-not, make to actually sell hearing aids. So saying don’t you want a robot in your ear is not the way if people really become most receptive to this. So there is the whole problem of selling this and then there’s the problem of we spend a lot of time making something that’s really proprietary. I’m Starkey, I don’t necessarily want my competitors stealing my algorithms, I’m not going to talk too much about how they work, right?

[00:30:26.930] – Krista Kennedy

And then there’s the problem of documenting what’s essentially a black box right. That if it’s really based on smart learning, humans may not have a really deep understanding of how it’s working at this point, I don’t know if that’s true in the case of hearing aids, its not, but it’s not unrealistic to wonder about it. And then also, how do you protect the proprietary aspects of this, especially if you’re relying on trade secrets. So as a technical writer, I’m very curious what your audience would have to say about this either the folks that are here right now or folks that might hear this podcast later on, how do you actually deal with these issues on the ground in an ethical kind of way?

[00:31:15.640] – Krista Kennedy

It’s a very different scenario than — like when I was learning to teach technical communication, we had all the case studies for Ethics day right, and it was like you work for a company that’s making heinous chemicals and dumping them into the waterway, do you say something about that about what’s like your ethics scenario? This is a world away from that, right.

[00:31:38.020] – Liz Fraley

Yeah.

[00:31:38.020] – Krista Kennedy

How do you document a black box that's proprietary, that's not hurting people and is really helping them in some ways, but also has other potential problems down the line? And what's your ethical obligation in that? You know, it's just a completely different set of problems from that angle, and I don't know the answer to it.

[00:31:59.120] – Liz Fraley

That’s a tough question, and half the time, I don’t think that the software engineers know the answer to that question either and, or even really what the full implications are going to be when they’re thinking it up at the time, right

[00:32:11.790] – Krista Kennedy

Future here, right?

[00:32:13.940] – Liz Fraley

Yeah.

[00:32:14.840] – Krista Kennedy

Yeah.

[00:32:15.290] – Noah Wilson

That’s one thing that struck me about the EU lost of two is like they’re trying to account for this downstream kind of stuff and we talk a lot, even before we kind of dug in for the EU loss of–I know we were first talking about this article. We are thinking about, like, here’s the immediate benefit, but then there’s like these trickle-down things that happen, so like if all the preferences for this device are made for a particular kind of user, what about the other users who can’t afford it yet but might be buying technology down the line that is informed by this research and it’s made more accessible.

[00:32:43.740] – Noah Wilson

Maybe they have less customization. And so there are a lot of downstream kinds of things, and it's hard to think about that. If you ask anyone to think 10, 50, 100 years into the future, it's tough, let alone when you have something that is, like, a literal black box. How do you ethically talk about the downstream effects of something we're just kind of getting a grasp on? It's difficult.

[00:33:06.340] – Noah Wilson

Even as someone trying to theorize it, it's really difficult to talk about something like, could this happen or will this happen? You know, a lot of this deals with statistics or probabilities, even the way surveillance kind of works under surveillance capitalism. It's all about the probability that something will happen, or the chance to do that, and, like, nudging. So you're just dealing with a lot of uncertainty, with numbers to kind of give us some direction.

[00:33:28.840] – Noah Wilson

But I feel like some of the downstream stuff we talk about in the article is stuff to think about, maybe to start opening up channels for people to at least get information when they need it. So I think that audiologist dialog would be really, really important for the human users, and also having someone to talk to if you are a user. Like, I want to know how my data is being used; maybe you're the only user out of 100 who does, but to have a channel to ask those questions, even if you get stuck with the dial-one-then-six menu but eventually get to talk to somebody, at least having that, I think, is important.

[00:34:01.890] – Janice Summers

Yeah, having a touchpoint to go to, to ask for information.

[00:34:09.350] – Noah Wilson

Yeah, and then at least there has to be an answer to give to people, rather than, well, no one is asking.

[00:34:16.250] – Liz Fraley

I know you two had a good question you wanted to pose to the technical writers who are listening and watching, and we've got a comment coming in that I will forward back around. But I don't want to, we are getting close on time; it sort of zipped past us.

[00:34:32.390] – Janice Summers

We want to make sure we can get their questions in now --

[00:34:34.590] – Noah Wilson

Yeah, yeah, yeah. A question I had, you know, thinking as tech writers: what parameters are you given for the writing that you do? And how much can you push back or ask questions, or are there directives you have to work within? Like, how much dialog do you get, I guess, with the people setting the parameters for you? That's something I've always been curious about. And then how do you do that? What does that look like when you do push back or ask questions, if you have done that? What kind of responses have you gotten? Again, I've never been put in that situation per se, so I'm curious about that relationship with the person that you're writing for, or I guess on behalf of.

[00:35:14.200] – Janice Summers

And I think that question could extend for the technical and professional writer beyond those who are doing medical devices.

[00:35:20.770] – Noah Wilson

Yeah, definitely.

[00:35:21.910] – Janice Summers

It could be anybody, and it could be that your company is making some kind of surveillance-type equipment or not. It doesn't matter, because I think all technical and professional writers run into this dilemma, and all people would have valid input to give to Noah.

[00:35:37.480] – Noah Wilson

Yeah, definitely.

[00:35:39.220] – Janice Summers

I think everybody should kind of, this is a callout: all you professional practitioners, academia is asking you to answer that question. And Noah's contact information is on the page along with this podcast, so hit him up, blow up his e-mail.

[00:36:00.690] – Noah Wilson

Yeah, this is something that I talk about with my students; we talk more about, like, social media platforms and privacy policies. And so, again, thinking about where the points are that you get pushback and what that looks like, I have them try to remediate privacy policies with some of the stuff we learn in my surveillance class. It's like, how would you tweak a policy to be a little more transparent? Maybe it's just listing, you know, the name of someone you could contact, or maybe it's just having a little more specificity. It's still pretty broad, but at least it's more than just, like, third parties. What would be more specific? And they kind of have to think about that as someone writing these policies themselves and what that would look like.

[00:36:39.570] – Noah Wilson

A lot of them say, well, what if the company won't tell me any of that stuff, what do I do then? So I'd love to do a little more speculation. They can't ask Facebook, what would be more precise terminology for your privacy policy? But they kind of have to think about some of that stuff, and that's partly why I'm mentioning the question; it's a curiosity about how to navigate this.

[00:36:57.150] – Janice Summers

Yeah, and they can answer that question without divulging the company name.

[00:37:00.840] – Noah Wilson

Yeah.

[00:37:00.930] – Janice Summers

You don’t have to tagline your employer or tattle on a past employer, just,you know, just a scenario and, you can keep the company names confidential. Cause there is with some people I can imagine that there would be a confidentiality thing that they’d be concerned with. So they can answer the question without divulging that, because you’re more interested in the practices and procedures and what happens in the real world or you know, the commercial world.

[00:37:28.650] – Noah Wilson

Yeah, it makes me think of when I was teaching professional and technical writing. I taught a section of that, and we talked about instructions, and some of my students said this is good information to have; it better prepared my students who might go on to do this writing.

[00:37:41.920] – Janice Summers

Yes, cause they are going to be hiring your students.

[00:37:43.340] – Noah Wilson

Yeah, and it blows their minds when we talk about ethics, because they're expecting, like Krista said, that case study of, you know, someone's dumping chemicals, what do you do? But more so it's like, well, how do you make sure that you've given enough information to users, and how do you talk about it? My one student told me that having those conversations was beneficial for her later on, like in an interview, because she was thinking about those questions. I want to do more of that in, like, the technical writing class, to help my students think about the problems now, and then they can incorporate them into their work or even look into it--

[00:38:17.060] – Janice Summers

Right, well, hopefully people will take the time to respond to you, because part of the goal of Room 42 is to bridge that gap between academia and practitioners too, right?

[00:38:27.020] – Liz Fraley

Yeah.

[00:38:30.470] – Janice Summers

So was there another question you wanted to shout out to the audience to get their help in?

[00:38:37.980] – Krista Kennedy

Oh, that was me, actually. I asked earlier about the on-the-ground difficulties of documenting black-box algorithms and figuring out what you can document when you're also dealing with a legal team who is very concerned about proprietary information and trade secrets.

[00:38:57.160] – Krista Kennedy

I mean, we’re saying, oh, this should be documented. How much can really be documented? I think there are multiple factors here that limit that logical level.

[00:39:08.390] – Janice Summers

Right.

[00:39:08.920] – Krista Kennedy

I'd love to hear what people are actually dealing with when they try to document these sorts of things.

[00:39:16.930] – Janice Summers

Yeah, where do you draw the line?

[00:39:19.820] – Krista Kennedy

Yeah, and where is the corporation drawing the line for you because of legal factors?

[00:39:26.600] – Janice Summers

Right.

[00:39:28.900] – Liz Fraley

Who drew that line for whom? And I think that one of the things we didn't really talk about, but that I want to close with, is that it's really neat how you all came together. It came out of just conversations, talking with a colleague, and you opened up this whole new place to find insight and dig into new policy research and communications. That is so very important, and it is just so neat to see that that was a trigger for you guys.

[00:40:11.010] – Krista Kennedy

Thanks, I don’t know about Noah, but I feel really lucky, and I’ve learned a ton. I mean, Noah and I are both rhetoricians, but we work in different ways and in slightly different parts of the field.

[00:40:21.270] – Janice Summers

Yes.

[00:40:22.050] – Krista Kennedy

And Charlotte did a master's in rhetoric and of course the J.D., so I think it's fair to say that we all really learn from each other in this situation. And I mean, we each have our own solo work, but I really love doing this work with this team, and it's been really productive.

[00:40:40.050] – Liz Fraley

Yeah.

[00:40:40.950] – Noah Wilson

I think about, like, earlier on, we had this whiteboard of all the projects we wanted to do with this thing, and it was like this three-year or four-year plan. I think we did almost everything on it; there was, like, one article we didn't quite get around to. But I think about that, and I remember a session where Charlotte actually came up and visited us in Syracuse, and we just buried ourselves in the library and were using whiteboards, and we had to take pictures of these things because we had so many intersecting lines. When I look back at the pictures, they don't completely make sense to me anymore because I don't know where my eye was going with it, but it was just really cool to see, like, that's the raw stuff, the conversation in this graphical form. Those are the fun moments: just trying to hash out an idea, and it takes a while and it's messy, and then you eventually have something that makes sense to other people.

[00:41:30.540] – Liz Fraley

Yeah

[00:41:31.380] – Krista Kennedy

We had a lot to–, and we keep finding out. You know what Rhetoric of Health and Medicine ask us to do an interview, a written interview about the interdisciplinary nature of this, and we thought it would be like five pages and then we turned it into 13. You know, it was productive, but sometimes we have to wrangle ourselves, so I think– you know what I’m saying.

[00:41:54.570] – Liz Fraley

Awesome, well, this was great. Thank you for doing this and sharing this; you two are really interesting to get ideas from.

[00:42:04.230] – Noah Wilson

Yeah, thanks for having us on, this was awesome.

[00:42:06.730] – Janice Summers

Very thought-provoking, yeah, we appreciate it.

In this episode

In this episode of Room 42, we discuss the Disability, Data, and Surveillance Project, a joint project of researchers at Syracuse University and Loyola University Chicago, and the results of our ongoing study of algorithmic data collection in compulsory medical wearables.

Device manufacturers and other high-tech companies increasingly incorporate algorithmic data surveillance in next-gen medical wearables. These devices, including smart hearing aids, leverage patient data created through human-computer interaction to not only power devices but also increase corporate profit.

Although US and EU data protection laws establish privacy requirements for personal information collection and use, these companies continue to legally rely on patients' personal information with little notice or education, significantly curtailing the agency of wearers. Join us to learn more about the complexities of algorithmic ecologies in medical wearables and navigating data surveillance disclosure in patient education materials.

Krista Kennedy is fascinated by the ways that humans work closely with technologies and the rhetorical implications of policies and laws that shape that work. Her experience as a deaf academic informs her current project, which examines intersections of deafness, artificial intelligence, passing, and ethics of medical data collection. She is the author of Textual Curation: Authorship, Agency, and Technology in the Chambers’ Cyclopaedia and Wikipedia as well as more than 25 essays. Kennedy is Associate Professor of Writing & Rhetoric at Syracuse University, PI of the Disability, Data, and Surveillance Project, affiliated with SU’s Autonomous Systems Policy Institute, and, for the 2020-21 year, NEH Visiting Professor of Writing & Rhetoric at Colgate University. She teaches courses on information design, cultural history of robotics, rhetorics of technology, and professional and technical writing.

Noah Wilson is curious about the ways technologies shape our rhetorical actions, particularly how we make connections with other people. He is currently a PhD candidate in Syracuse University's Composition and Cultural Rhetoric program and a Visiting Instructor of Writing & Rhetoric at Colgate University, where he teaches first-year writing, rhetorical history and theory, and surveillance rhetorics. His dissertation addresses recent trends in social media content recommendation algorithms that have led to increased political polarization in the United States and the proliferation of radicalizing conspiracy theories such as QAnon and Pizzagate.

Resources

Krista’s email: krista01@syr.edu

Krista’s Website: kristakennedy.net

Noah’s email: npwilson@syr.edu

Noah’s website: npwilson.net

Read the paper: “Balancing the Halo: Data Surveillance Disclosure and Algorithmic Opacity in Smart Hearing Aids” Rhetoric of Health & Medicine. Vol. 4, No. 1, pp. 33–74. DOI: 10.5744/rhm.2021.1003


