Room 42 is where practitioners and academics meet to share knowledge about breaking research. In this episode, Yoel Strimling discusses what documentation quality means and gives reliable methods and metrics for measuring it. Airdate: September 1, 2021
Transcript
[00:00:10.250] – Liz Fraley
And good morning, everyone. Welcome to Room 42. I'm Liz Fraley from Single-Sourcing Solutions. I'm your moderator. This is Janice Summers from TC Camp. She's our interviewer. And welcome to Yoel Strimling. He is the first practitioner-researcher we've had here in Room 42. He's been spinning straw into gold for over 20 years. He currently works as the senior technical editor and documentation quality SME for CEVA Inc. He's in Israel, and we're super appreciative that he could be here with us. Over the course of his career, he has successfully improved the content, writing style, and look and feel of his employers' most important and most used customer-facing documentation by researching and applying principles of documentation quality and survey design.
[00:00:55.530] – Liz Fraley
He's an Associate Fellow of the STC. He's a member of Tekom Israel. He's the editor of Corrigo, the official publication of the STC Editing SIG. And today he's here to help us start answering the question: what does documentation quality mean, and how do we improve it? Welcome.
[00:01:12.910] – Janice Summers
Yoel, we're so excited to have you here. Really thrilled. Now, the first question I have is, how did you, I mean, I look at your academic background in psychology and you're talking about quality. What drew you to this topic of quality in communication and documentation?
[00:01:31.240] – Yoel Strimling
Well, first, let me thank you for inviting me here, both Liz and Janice. I was looking at the past archives of Room 42, and I'm very impressed by the quality of the people you've interviewed, and I'm really honored and humbled to be on that list. I would recommend to the listeners: after this, go back and listen to the other episodes. Lots of great, great stuff there in the archives. So, Janice, you asked me how I got into this from academia. Let me explain a little bit about my background in academia.
[00:02:07.780] – Yoel Strimling
I have a master's degree in developmental psychology, ages zero to five, and I started working– because I was at Hebrew University, I started working as an editor for the Psychology Department, because all the professors were writing articles and things like that. And I was doing my own research. As an interesting piece of knowledge, my master's thesis was actually on the development of personal space in young children. So basically, I went and sat too close to little kids until they moved away. Why did I do that? Because I'm always interested in how people react to their environment.
[00:02:51.650] – Yoel Strimling
And because I was working as an editor for the Psychology Department, and being a psychologist– well, not a psychologist, a psychologist-in-training, right– I was fascinated by the people who read these articles. I'd read an article and say to myself, who's gonna read this? It's a well-known joke in technical communication that nobody reads the documentation. We know that's not true, because at least an editor reads the documentation.
[00:03:22.980] – Yoel Strimling
I might be the last reader, but I'm always the first reader. And because I'm the first reader, I always saw myself as the reader advocate.
[00:03:36.040] – Yoel Strimling
And that's what I brought when I worked as an editor in a psych department. And I said to myself, you know what? I don't have the skills really to be a therapist. I don't have the empathy. That's how it is. Sometimes you do, sometimes you don't. But I said, I really have a feeling for this kind of work, for editing material that will help readers understand what they need to understand or do what they need to do. So I got into technical communication. But because I come from an academic background, having done research myself, scaring little children in the sandbox, I come with a certain skill set that I think is very important for practitioners.
[00:04:17.880] – Janice Summers
Yes, it is very important. And that's why I wanted to ask you that, because I think your research is very solid and very sound. And I think it's that background that you have, even though you're a practitioner and not the typical person who comes here. Your research is very practical, very sound. So what specifically attracted you to the topic of quality?
[00:04:45.180] – Yoel Strimling
As I said, because I'm an editor, I have to be a reader advocate. But I can't be a reader advocate if I don't know what my readers want.
[00:04:53.950] – Yoel Strimling
So in psychology, and in any other field, people who work in the field should look at the literature to see what's going on. Doctors always have to read the current journals. Lawyers always read the cases. What do technical communicators do? Well, they read Intercom, and they read magazines and technical journals. They read Technical Communication and other scholarly and trade journals. But I found there was a lack of connection between what we as practitioners do, especially editors, and what's going on in the academic world.
[00:05:28.240] – Yoel Strimling
So I said, there's a niche here that needs to be filled. There's something missing. There's often a disconnect, unfortunately, between information consumers and information producers. As information producers, as technical communicators, we often don't have the access we need, if at all, to our readers. It's weird. When I was doing my research, one of the hardest parts was finding readers– finding readers to participate in my study– because I didn't have access. I had to go to customer support groups at different companies around the world and say, hey, would you be willing to send my surveys to your readers?
[00:06:13.880] – Yoel Strimling
And what's interesting is there's often a policy of not asking readers, customers, about documentation quality. It could be a lack of time, a lack of resources, a lack of interest. But there's a disconnect there, and it really bothered me. I said, listen, I can't be a good editor, I can't be a good reader advocate, if I don't know what my readers want.
[00:06:38.930] – Yoel Strimling
So, pulling on my psychology background, I started looking in the literature for anything I could find about information quality and documentation quality. And what's interesting is, I found there are at least 50 different definitions of documentation quality out there. I'm sure everybody listening to the podcast can think of some off the top of their head right now.
[00:07:00.920] – Janice Summers
And that was the thing, too. It's like, well, you know, I just assume I know what quality is, right? You would assume.
[00:07:10.880] – Yoel Strimling
Yeah. You know what quality is. But do you know what your readers think quality is?
[00:07:16.471] – Janice Summers
See, that's the key thing.
[00:07:16.480] – Yoel Strimling
We can't ask ourselves what readers want. We're not the readers. We're not the right audience. And that's one of the other things I found in a lot of the technical communication research on quality. It's from a technical writer's point of view– based, of course, on what they think readers want and some research done in the past. But it wasn't enough for me. I said, I need to find out what my readers want.
[00:07:49.460] – Yoel Strimling
I need to find out. And before you can find out what readers want– if you want to know what readers say documentation quality is– you need to say what quality is in general. What is quality? Quality can be anything. In Zen and the Art of Motorcycle Maintenance, Robert Pirsig, who used to be a technical communicator himself, says that you know what quality is, but you can't name it; you can only know what it's not. How do you measure that, then? How do you measure quality if you don't know what it is?
[00:08:24.830] – Yoel Strimling
And if you can't measure something, you can't improve it. So if we want to improve the quality of our documentation, we have to be able to find something we can use to say: this is quality according to our readers. It has to be a reader-oriented definition of documentation quality.
[00:08:39.070] – Yoel Strimling
And, of course, our definition has to cover all aspects of quality. Because quality has an objective side to it, objective aspects, and subjective aspects. Objective meaning it meets requirements, and subjective meaning it meets expectations. Any definition of documentation quality that we have has to keep both of those in sight.
[00:09:05.940] – Janice Summers
Balancing both halves. Right. So, were you able to define quality? To come up with an applicable definition?
[00:09:24.900] – Yoel Strimling
I was. What, Liz?
[00:09:25.530] – Liz Fraley
No, go ahead.
[00:09:26.020] – Yoel Strimling
Yeah, I was. I found a fascinating study by Professors Wang and Strong at MIT from 1996. Theirs is relatively well-known research in the field of what used to be called data quality and is now called information quality: Total Data Quality Management, now Total Information Quality Management. There's a difference between data and information. I'll talk about that later.
[00:09:46.900] – Janice Summers
Yeah, there is. And I'm glad you're going to– We have to remember to come back to that. Let's not forget. OK, good.
[00:09:51.620] – Yoel Strimling
Okay. They came up with an empirically based, hierarchical conceptual framework for defining information quality. And actually, the idea is that their framework of quality categories and dimensions can be applied to any kind of quality. Let me give you an example. We talk about intrinsic quality, representational quality, contextual quality, and accessibility quality. Four different quality categories. Intrinsic quality means information has quality in its own right. Representational quality means information has to be well represented. Contextual quality means information has to be understood within the context of the task at hand.
[00:10:36.180] – Yoel Strimling
And accessibility quality means that information has to be easy to find and retrieve. To make this more concrete, I like to give an example of two pens. I have here two pens. A fancy pen, which I got after ten years of work at my previous employer. I got my name on it, spelled right and everything. Funny story: a previous company I worked at gave me a pen with the company name spelled wrong, which is an ironic gift to give an editor.
[00:10:58.480] – Janice Summers
That is. It's very memorable.
[00:11:02.120] – Yoel Strimling
Yeah, well, I remember it. All right. So I have here two pens. This is a fancy pen. It's good, strong material, well built. And this is a cheap plastic pen. Flimsy. I stole it from my boss. Don't tell her. So this one has high intrinsic quality, and this one has low intrinsic quality. Representational quality: it's a fancy pen. I go to a meeting or an interview, I take it out, and people say, hey, it's a fancy pen, must be a fancy guy. As soon as I saw this pen, I said, it's a nice pen.
[00:11:30.080] – Yoel Strimling
And this pen here is dirty. It's cracked. Maybe my boss had it under her couch or something, I don't know. So you've got high representational quality and low representational quality. Contextual quality– hold on, my computer is– contextual quality. This pen writes well. Doesn't smudge, doesn't smear. Works well. This pen here has no ink. I can't write with it. So this one has high contextual quality, and this one has low contextual quality. Fitness for use, which is how most people define quality, is a purely contextual idea. I can only use one of these pens to write with, but in theory– in theory– I can use both of these pens to clean my ears.
[00:12:15.570] – Yoel Strimling
Don't worry, I'm not going to, because you should never put anything in your ear except your elbow. And then, of course, there's accessibility quality. If I can't find my pen– if I can't find my pen, I can't use my pen. Right? So these are the four categories of quality: intrinsic, representational, contextual, accessibility. Wang and Strong break these down for information into 15 different dimensions. You can look at my published research for the list– I don't remember them off the top of my head– there's 15 different information quality dimensions.
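For readers following along, the framework Yoel describes can be laid out as a simple lookup table. The 15 dimension names below come from the published Wang and Strong (1996) paper, not from this interview, so treat the exact wording as a reader's aid rather than a quote:

```python
# The Wang and Strong (1996) categories and dimensions, as commonly
# cited in the information quality literature. Dimension names are
# reconstructed from the published paper, not from this interview.
WANG_STRONG_FRAMEWORK = {
    "intrinsic": ["accuracy", "objectivity", "believability", "reputation"],
    "contextual": ["value-added", "relevancy", "timeliness",
                   "completeness", "appropriate amount of data"],
    "representational": ["interpretability", "ease of understanding",
                         "representational consistency",
                         "concise representation"],
    "accessibility": ["accessibility", "access security"],
}

# Four categories, 15 dimensions in total.
total_dimensions = sum(len(dims) for dims in WANG_STRONG_FRAMEWORK.values())
print(len(WANG_STRONG_FRAMEWORK), total_dimensions)  # 4 15
```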
[00:12:49.490] – Yoel Strimling
So my study– what I did was basically replicate Wang and Strong's study to find out what readers thought was the most important dimension per category. And then I could use that as a way to define documentation quality from the readers' perspective. So my research shows that readers define high-quality documentation as being accurate, from the intrinsic quality category; easy to understand, from the representational quality category; relevant, from the contextual quality category; and accessible. So high-quality documentation, according to readers, is accurate, easy to understand, relevant, and accessible, and that covers everything.
[00:13:32.080] – Yoel Strimling
Now that sounds relatively self-evident, but it provides a strong empirical basis for the claim that documentation quality can be measured using a small yet comprehensive set of clear and distinct information quality dimensions. And now that we have this definition of documentation quality, we can use it for all sorts of things. We can use it for metrics. We can say, okay, last year we had five accuracy issues, and now we have two accuracy issues. Last year we had 100 easy-to-understand issues; now we have 200. You know where the problems are, and you know how to solve them, because everybody knows what these terms mean.
[00:14:06.060] – Yoel Strimling
When you sit with your SMEs, you can say, okay, we got accuracy issues. We got relevancy issues. These are clear things that readers themselves used to define documentation quality.
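The metrics idea Yoel describes is straightforward to operationalize. A minimal sketch in Python– the feedback records and dimension tags below are invented for illustration, not taken from CEVA's actual process:

```python
from collections import Counter

# Tag each piece of reader feedback with one of the four reader-defined
# quality dimensions, then compare counts across reporting periods.
DIMENSIONS = ("accurate", "easy to understand", "relevant", "accessible")

def tally(feedback):
    """Count feedback items per quality dimension."""
    return Counter(item["dimension"] for item in feedback
                   if item["dimension"] in DIMENSIONS)

last_year = tally([
    {"dimension": "accurate", "note": "wrong register address"},
    {"dimension": "accurate", "note": "outdated pinout diagram"},
    {"dimension": "accessible", "note": "search missed the topic"},
])
this_year = tally([
    {"dimension": "accurate", "note": "typo in a code example"},
])

# Accuracy issues dropped from 2 to 1; the metric is comparable year
# over year because everyone means the same thing by "accurate".
print(last_year["accurate"], "->", this_year["accurate"])  # 2 -> 1
```

The point of the shared vocabulary is exactly this comparability: a count of "accuracy issues" only works as a metric if writers, SMEs, and readers all mean the same dimension by it.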
[00:14:17.920] – Janice Summers
Once you can measure it, then you can make the improvement.
[00:14:20.150] – Yoel Strimling
Once you can measure it, then you can improve it. If you can't measure it, you can't improve it. Now, I know for a fact that there are people who are using this model of how readers define documentation quality in their technical communication teams. Tom Johnson, of the I'd Rather Be Writing blog, has used this as a basis for API documentation rubrics, which is fascinating. I really recommend looking up his blog. I don't have the link available, but you can just Google it. It's quite fascinating. And there are other teams, like I said, around the world who are implementing this to see if it works in real life, because the proof of any research is whether it actually works in real life, right?
[00:15:02.420] – Janice Summers
It's the application of it, from a practitioner's perspective.
[00:15:05.840] – Yoel Strimling
Which really emphasizes a disconnect– again, there's that word– between academic researchers and practitioners. We as practitioners need to follow the research, but if the research isn't interesting to us or doesn't apply to us, it's no good. What's the point of that? There has to be a two-way conversation between academics and practitioners. Academics should be in contact with practitioners to see what it is we're looking at. I did see an article– CIDM did a survey about what academics and practitioners think is important, and there was some overlap, but a lot of divergence.
[00:15:52.740] – Yoel Strimling
One of the things both academics and practitioners are interested in is user behavior, which, of course, is what we're all here for. Most of us, I hope.
[00:16:04.460] – Janice Summers
I think that's the recent study– the recent questionnaire that CIDM put out?
[00:16:11.280] – Liz Fraley
Rebekka and Joanne's new article.
[00:16:14.920] – Yoel Strimling
Yeah. Rebekka and Joanne, that's correct. I don't have it in front of me, so I don't remember the names but that's what I recall.
[00:16:23.180] – Janice Summers
So now people are applying this. And how is that going with people?
[00:16:28.550] – Yoel Strimling
It seems to be working really well. Seems to be working really well. Again, this is a preliminary model based on my population. My survey population was 81 readers, which is pretty good for social science research. Obviously, I want to make the model more robust, so I do need to apply it to real life. And I'm currently doing research to make the model more robust using something called the Kano model of customer satisfaction. The Kano model of customer satisfaction is quite fascinating. It's well known in quality management spheres, and also in marketing, if I recall correctly.
[00:17:07.000] – Yoel Strimling
The Kano model is based on the idea that there are four types of features. There are must-be features– Jared Spool likes to talk about the Kano model; the idea is delighting customers versus frustrating customers. The Kano model takes different features and asks customers: how would you feel if this was there? How would you feel if it wasn't there? And then you can plot the answers into different types. There are must-be features, things that you have to have, and there are performance features, where the more you implement them, the more satisfied customers are.
[00:17:42.460] – Yoel Strimling
There are what's called attractive features, which are things that customers didn't expect, but when they're there, even if they're not 100% done, they're really great. And then there are indifferent features, which customers don't care whether they're there or not. I'm applying the Kano model to the definition of documentation quality to see what frustrates and what delights our readers, which is really the next step to make this definition more robust. I'm finding some interesting things. It's only a pilot test, so there's no statistical significance yet.
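The classification step Yoel describes works by pairing a "functional" answer (how would you feel if the document had this quality?) with a "dysfunctional" one (how would you feel if it didn't?). Here is a sketch of the commonly published Kano evaluation logic, simplified; it is not taken from Yoel's own survey instrument:

```python
# Standard Kano answer scale: Like, Must-be, Neutral, Live-with, Dislike.
LIKE, MUST_BE, NEUTRAL, LIVE_WITH, DISLIKE = (
    "like", "must-be", "neutral", "live-with", "dislike")
MIDDLE = {MUST_BE, NEUTRAL, LIVE_WITH}

def classify(functional, dysfunctional):
    """Map one reader's answer pair to a Kano feature type (simplified)."""
    if functional == LIKE and dysfunctional == DISLIKE:
        return "performance"       # more of it, more satisfaction
    if functional == LIKE and dysfunctional in MIDDLE:
        return "attractive"        # unexpected delighter
    if functional in MIDDLE and dysfunctional == DISLIKE:
        return "must-be"           # expected; its absence frustrates
    if functional in MIDDLE and dysfunctional in MIDDLE:
        return "indifferent"       # readers don't care either way
    return "questionable/reverse"  # contradictory or inverted answers

# e.g. a quality like "accurate" tends to come out must-be:
print(classify(NEUTRAL, DISLIKE))  # must-be
print(classify(LIKE, NEUTRAL))     # attractive
```

Aggregating these per-reader classifications across a survey population is what lets you say which documentation qualities frustrate when absent versus delight when present.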
[00:18:18.790] – Yoel Strimling
In my original study, there was statistical significance to these things. Because this is a pilot study, I just have to point that out. What I found interesting is, in my first study, readers defined documentation quality as being accurate, relevant, easy to understand, and accessible. Using the Kano model, they say it's accurate, easy to understand, and accessible, but they say it's more important to be complete rather than relevant. That's an interesting finding, because a document can be relevant but not complete, or complete but not relevant, depending on the savviness of the reader.
[00:18:53.030] – Yoel Strimling
A reader may be experienced enough to fill in missing information but not know if the information is relevant to them or not. Or a reader may know what isn't relevant to them, but not be able to fill in missing information. So I'm doing more research on complete versus relevant, and it's quite interesting.
[00:19:15.710] – Yoel Strimling
Because if you think about it, it comes down to this: documentation is always used in context. Nobody reads documentation for fun– except for me, as an editor, and I like reading. So I like reading documentation that maybe nobody else ever reads. But certainly our readers never read documentation for fun. They read it because they want to do something or want to know something. So you always have to see information in context, which is why the contextual category of documentation quality– it's not more important than the others, but if your document is accurate yet not relevant or easy to understand, then it's no good to the reader.
[00:19:51.690] – Yoel Strimling
I like to say there's an infinite amount of inaccurate information out there. Our job as technical communicators is to provide only accurate information. And to paraphrase Jakob Nielsen: I don't care if it's right if it's not what I want. Right? So it can be accurate, but if it's irrelevant, who cares? It's nice, thank you, but it doesn't apply to me. So relevance is the idea of personalization: make it personalized for me. But on the other end, there's completeness, which is also a contextual thing. And in the research I did, I found the dimensions that were closest together were all the contextual ones.
[00:20:34.500] – Yoel Strimling
Relevant, complete, and valuable. Readers want our documentation to be valuable, and I published on this as well, in one of my articles. What does it mean to say documentation is valuable? Okay, I've got relevant documentation, right? A relevant document tells me how to set up a system. A relevant document tells me how to manage my clusters. A relevant document explains the hardware architecture to me. But a valuable document goes beyond that. A valuable document helps me set up my system more efficiently.
[00:21:05.440] – Yoel Strimling
A valuable document helps me manage my clusters more effectively. And a valuable document tells me why the hardware architecture is the way it is, and why it's not some other way. So a relevant document helps me do my job, but a valuable document helps me do my job better. That's another interesting side bit that came out of the research. But in the end, the way readers define documentation quality: it has to be accurate, it has to be relevant, it has to be easy to understand, and it has to be accessible. All four of them.
[00:21:36.320] – Yoel Strimling
If I can't find it, I don't care how accurate and relevant it is. If I don't understand it, I don't care how relevant it is. If it's not relevant, I don't care how accurate it is. If it's not accurate, I don't care how relevant it is. You use all four of them together. And as an editor, I use this. I use this a lot when I edit. I say, okay, I look for accuracy issues. I look for relevancy issues. I look for easy-to-understand issues. I look for accessibility issues. Is information easy to find? Is the search working?
[00:22:05.210] – Yoel Strimling
Are the links working? These are the things I check. And I know that when I send a document out after reviewing it for all four of these things– accurate, relevant, easy to understand, accessible; intrinsic, representational, contextual, accessibility– I know that my document will, in theory, hopefully, make my readers happy.
[00:22:26.580] – Janice Summers
And then the feedback that you need to find out from that is the effectiveness of it, from a reader's perspective.
[00:22:32.320] – Yoel Strimling
That's right. So we also use this model to get meaningful and actionable feedback. We ask– readers are busy people. They don't have time to answer a lot of questions.
[00:22:43.440] – Janice Summers
No, they don't. And they don't appreciate having to answer a lot of questions.
[00:22:46.740] – Yoel Strimling
So you ask only four questions: Could you find the information? Was it accurate? Was it relevant? Was it easy to understand? You start with closed yes-or-no questions: Was the information easy to find, yes or no? If no, what was wrong? So you follow up with an open question. That's basic survey design: use the minimum number of questions, use closed questions, and then follow up with open questions where things aren't clear. And we find it very helpful, because readers understand what these terms mean, and because we know the terms are important to readers, they're willing to answer.
[00:23:25.590] – Yoel Strimling
And there are only four questions, you know? Was it accurate? Was it easy to understand? Could you find it? Was it relevant? Yes, no. And then we can take that information and sit with the SMEs and say, okay, listen, this guy complained that the information was irrelevant. Why was it irrelevant? Maybe because it's the wrong model, the wrong whatever– I don't know– or the wrong audience.
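The four-question survey flow described above can be sketched as follows; the question wording is paraphrased from the interview, and the data structures are purely illustrative:

```python
# One closed yes/no question per reader-defined quality dimension,
# with an open follow-up asked only when the answer is "no".
QUESTIONS = [
    ("accessible", "Could you find the information?"),
    ("accurate", "Was it accurate?"),
    ("relevant", "Was it relevant?"),
    ("easy to understand", "Was it easy to understand?"),
]

def follow_ups(closed_answers):
    """Return the open follow-up prompts triggered by 'no' answers."""
    return [f"{question} No. What was wrong?"
            for dimension, question in QUESTIONS
            if closed_answers.get(dimension) == "no"]

answers = {"accessible": "yes", "accurate": "no",
           "relevant": "yes", "easy to understand": "yes"}
for prompt in follow_ups(answers):
    print(prompt)  # only the accuracy follow-up fires
```

Keeping the closed questions to four, and gating the open questions behind a "no", is what keeps the survey short enough that busy readers actually answer it.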
[00:23:48.280] – Yoel Strimling
So we are finding, certainly here at CEVA where I work, that it actually does provide measurable and improvable results.
[00:24:02.110] – Janice Summers
And you're not over-burdening your content readers with trying to get a lot of information from them. It's quick and easy for them to answer, right? And then if you take action– I think that's the key thing: you have to take action on it.
[00:24:19.340] – Yoel Strimling
It doesn't do any good to collect feedback and not do anything about it. In fact, I think it might be worse to collect feedback and not do anything about it. Like, oh, we care. No, we don't care, right? Your call is important to us. No, it's not. But we can use this model that's been empirically proven– this is how readers define quality– and give readers what they want. It's a novel idea: give our readers what they want. And because many technical communicators can't contact their readers, they can say, okay, you know what?
[00:24:55.880] – Yoel Strimling
They can rely on my research. This has been proved. It has been empirically shown that this is what readers want, and they can go to their management and say, listen, we need to focus on this. And it kind of solves the problem of not talking to your– if you can't talk to your readers, talk to people who have talked to readers.
[00:25:13.400] – Janice Summers
So okay, you bring up a really good point because there are times and it's real that people don't have access to their readers.
[00:25:25.200] – Yoel Strimling
Most of the time.
[00:25:26.010] – Janice Summers
Yeah. So in that situation, then, you're saying to talk to people who do talk to the readers.
[00:25:32.730] – Yoel Strimling
Which is why practitioners and academics have to work together. Because how are practitioners going to find out what readers want? Academics often have access to people, to populations, to audiences that we as practitioners don't always have. I mean, they can do these studies– I don't know how they get their populations, but they do. They do research about what readers want, and they have access, whereas we practitioners don't. The problem is, when you open a technical journal, you've got to find something that's interesting to you.
[00:26:09.740] – Yoel Strimling
And honestly– and I come from the academic world– there's a lot of boring stuff in these journals. No offense to the people who do the publishing, but I read some of this stuff and I think to myself, why? I mean, okay, it's a very interesting and very small point, but where's the big picture? How can I apply this? What's in it for me?
[00:26:29.970] – Janice Summers
Right. And I kind of want to back up a little bit and point to the fact that academia is a different world, in that academics have time to do a longitudinal study. And when they approach companies to do studies, it's different than you as a practitioner approaching other companies. So I think that's to your point: all the more reason to have practitioners involved in the early stages of research from an academic's perspective, because they can help guide what is practical from a practitioner's perspective. But academics have access that practitioners won't, right?
[00:27:11.850] – Janice Summers
Because, like you said, you had to go and contact these customer support departments, and it wasn't easy. I'm sure you hit a lot of walls.
[00:27:20.140] – Yoel Strimling
You should see my Excel. Of all the people I contacted, it's like three quarters red and a quarter green. Yes. Yes. Yes.
[00:27:30.710] – Yoel Strimling
On the plus side, the companies who did respond, who were interested, were very interested in the quality of their documentation. That's already a plus– that they want to improve their documentation quality.
[00:27:43.970] – Janice Summers
Yeah. Yeah. And I would also put out there because the show speaks to both practitioners and academics. If you're a practitioner and you have something that you want researched, reach out to the academics and see if you can partner with somebody in academics who's doing research who might want a new research topic.
[00:28:04.160] – Yoel Strimling
Twitter is a very good place for that kind of thing. There are certainly a lot of academic technical communicators on Twitter, and they're usually quite open to conversations. I've had a number of conversations with academics and practitioners on Twitter about this kind of stuff. Another interesting possibility is that maybe practitioners could comment on articles that are published and have the responses published as well. That would be interesting, because practitioners don't have time to do all the research and the literature review the way I did my research.
[00:28:42.910] – Yoel Strimling
I'm not paid to research. I'm a practitioner, right? So it took me seven years to do all this research.
[00:28:49.380] – Janice Summers
Because you were researching on the side. But that has to do with your training, too. You're really interested in research, so you have a drive for it.
[00:28:57.370] – Yoel Strimling
So practitioners who don't necessarily have the time or the resources or the experience to do this can certainly respond– they can send a response to the journal and say, this article is interesting, and I wonder, maybe we could apply it here, apply it there. They wouldn't need to do any research themselves. They wouldn't need to do any lit review or peer review. They could send a letter to the journal editor. And honestly, journal editors– technical communication journal editors– I would recommend they approach practitioners and say,
[00:29:28.210] – Yoel Strimling
Hey, listen, let's have a feature where practitioners respond to articles. That would be great, because then the practitioners would feel (a) they'd be published, and (b) their needs would be known to the academics, and the academics would also be able to have a conversation. And in the other direction– I don't know how feasible this is– academics could maybe contact practitioners and say, hey, I wrote an article, what do you think? Let's maybe amplify this on Twitter or LinkedIn or something like that. There needs to be some sort of two-way conversation between academics and practitioners.
[00:30:09.930] – Janice Summers
Right. Well, and hence Room 42, right? That's our goal: to help bridge the gap between academics and practitioners, because academics are very interested in having their work practically applied and in having their work get results. That's why they do what they do, right? And practitioners really want some good, solid research so that they can go back and say why we want to make a change.
[00:30:40.260] – Yoel Strimling
But it has to be research that can then be applied, and research that's interesting to practitioners. The article by Rebekka and Joanne showed that both academics and practitioners aren't that interested in tools. So research into tools wouldn't be useful. Research into training may or may not be interesting, but certainly research into content strategy and user behavior and processes. These are the things. As an editor, I'll tell you what interests me, what I would like to see more research on: how users respond to different modalities of information. Meaning, a user in this particular field would like to see XYZ in their documentation.
[00:31:35.170] – Yoel Strimling
A user in this field would like to see ABC. Which was one of the triggers, again, for my own research, because I wanted to know what readers wanted. I thought I knew what readers wanted, and I wanted to know if that was true, right?
[00:31:53.600] – Janice Summers
Yeah. I mean, you can make some assumptions and you can have a general understanding, but you do want to know straight from them–that feedback straight from the readers–because you might be making an intelligent assumption. Right?
[00:32:08.780] – Janice Summers
But you kind of want to have that backed up at least once in a while just to check and make sure.
[00:32:14.540] – Yoel Strimling
Well, certainly. We as technical communicators have lots of tools available. We have personas. We have use cases. We have journey maps, all sorts of tools that we use to sort of simulate a real reader. But it's just a simulation. And while it's based on data – certainly, I mean, you can't make a user profile out of thin air; you have to do research, you know, who's your reader, who's your audience – you get big-picture kind of stuff sometimes.
[00:32:48.490] – Janice Summers
Hey, let's get to that question earlier that we put a pin in–
[00:32:53.510] – Yoel Strimling
Data vs Information?
[00:32:58.560] – Yoel Strimling
Okay. There are three levels. Let me put it this way. In the information quality literature, there are three levels. There's data, which is just facts. There's information, which is the facts applied to something. And there's knowledge, which is how you use that information to do something else. So data, information, and knowledge are different things. Data are dots; information helps you connect the dots. And knowledge shows you that it's a unicorn or a turtle or something. Okay, to us as practitioners – and certainly, I'm seeing a lot in the more modern information quality literature –
[00:33:43.230] – Yoel Strimling
They're using data and information interchangeably. Why? Because they realize that data in and of itself isn't really meaningful. Data quality – is quality an objective aspect of data? Can data be accurate? Can data be believable? Not really. Because as soon as you look at the data, you're turning it into information, because you change it. It's like the Heisenberg uncertainty principle: you can know how fast something is going or where it is, but you can't know both. So as soon as you observe it –
[00:34:20.450] – Yoel Strimling
I'm sorry, Schrödinger. As soon as you observe something, it changes. So as soon as I observe the data, it becomes information that I'm looking at. That said, information can be both objective and subjective. Believable, right? Information believability. Do I believe what you're saying to me? Fake news, right? Do I believe it or don't I believe it? That's one of the problems with medical websites. I am not a trained doctor. So when I look online for my symptoms, I have to believe the information. I don't know if it's accurate or not, but I can know if it's believable.
[00:34:56.320] – Yoel Strimling
Accuracy is objective. Believability is subjective. So when I talk about information – when I start looking at data, it automatically becomes information. Which is why the current literature is moving away from data quality and toward information quality. Documentation is always information, because it leads to knowledge. Because, again, documentation is always used in context. As soon as you use something in context, it becomes information. I think it was Ray Gallon who said information is purely static; knowledge is what you do with it.
[00:35:36.880] – Yoel Strimling
So I think that's a good definition. Data is kind of ethereal. It doesn't exist in and of itself.
[00:35:45.670] – Yoel Strimling
The observed data becomes information. The information is static. What you do with it becomes knowledge. And that's what documentation is for. Documentation is there to help readers do what they want to do or know what they need to know.
[00:35:57.580] – Janice Summers
Right, to inform or instruct. And I think one of the key points that you hit on, and it just stuck in my head, is accuracy. Believability is one thing; accuracy is the key thing. And I think that's where a lot of misinformation happens: a lack of accuracy.
[00:36:14.610] – Yoel Strimling
But I can't know if it's accurate or not. I have to believe the information. As a technical communicator, I know that my audience doesn't know the stuff that my SMEs know. They have to believe that it's accurate. But believability and accuracy are not the same thing.
[00:36:36.270] – Yoel Strimling
So to readers, it's more important that we be accurate. Believability isn't as important to readers, according to my study at least, as accuracy is. Believability and accuracy are both intrinsic quality dimensions, but accuracy is statistically more significant than believability, because it's objective versus subjective. At the end of the day – and I titled my article based on Wang and Strong's, whose title is "Beyond Accuracy" – accuracy is a given. As soon as you say, oh, the information is not accurate, I don't care how clear it is. Right?
[00:37:15.020] – Yoel Strimling
In the world of information quality, people say accuracy is the starting point. We see, though, that accuracy in and of itself is not enough. It has to be relevant, it has to be easy to understand, it has to be accessible. They're all equally important. Believability is above and beyond that. Consider this: another one of the intrinsic quality category dimensions is objectivity. Information has to be objective; it has to be even-handed and unbiased. But to readers – readers don't care if information is objective or not. They want to know if it's accurate. Accuracy trumps all in the intrinsic category.
[00:37:58.860] – Yoel Strimling
In other categories, for example, in the contextual quality category, there's complete versus relevant versus valuable. They're all very, very close to each other. But certainly in the intrinsic quality category – accuracy versus believability versus all the others – accuracy wins all. Because our readers are knowledgeable. They know. They know if it's accurate or not, because they're the ones doing the things, working with the things. To a medical website user, maybe believability is more important, because I can't know. I can't know if it's accurate or not.
[00:38:38.770] – Yoel Strimling
I have to believe that it's accurate.
[00:38:46.000] – Janice Summers
Yeah. That's interesting. Context is important. What you just pointed out, I think, is a key factor.
[00:38:56.090] – Yoel Strimling
That's because documentation is always used in context.
[00:39:04.090] – Liz Fraley
And what a great conversation. Yeah. Right?
[00:39:08.540] – Yoel Strimling
I mean, like I said, nobody picks up documentation for fun.
[00:39:14.060] – Yoel Strimling
Documentation is not a novel. And nobody but me, as an editor, reads it cover to cover. Somebody once told me we should consider our readers to be attention-deficit, irritable teenagers. They don't care. They want to go to the information, find what they want, and go away and do their work. Reading documentation is often a last resort. People are busy. People are tired. They want to go home. Reading documentation isn't fun. Unless you're an editor, you know, then it's fun.
[00:39:46.130] – Yoel Strimling
Documentation quality is very important. If you focus on those four things, your reader will be very happy.
[00:39:50.000] – Yoel Strimling
You will have made your reader's day. They can go home early and say, oh, I did my job because the documentation helped me. I saw a meme just this week. Five minutes of reading documentation can save 6 hours of debugging. Again, it's a meme and it's funny, but there's a kernel of truth in there. If you read the documentation first, maybe you'll know something that will save you time later. But it only works if the documentation is accurate, relevant, easy to understand, and accessible. If not any of those, too bad.
[00:40:21.480] – Yoel Strimling
Or as they say in Hebrew, "Chaval… chaval al hazman" (roughly, "what a waste of time").
[00:40:27.950] – Janice Summers
Yeah. Time – the time it takes to read something? That, I guess, would fall under easy. Right?
[00:40:37.100] – Yoel Strimling
Easy to understand?
[00:40:40.760] – Yoel Strimling
Yeah, I would say so. An interesting finding from the Kano model study I'm doing now is concise. Readers want the information to be easy to understand, but they'd like it to also be concise. What is concise? Minimalism. Minimalism doesn't mean write less. Minimalism means write more concisely.
[00:41:05.641] – Janice Summers
Get to the point.
[00:41:05.650] – Yoel Strimling
Readers would like to have information that's concise. But if it's not concise, no big deal. As long as it's easy to understand. I hope to be publishing my Kano model findings…I don't know. Maybe another year. It's not so easy because again, I have a real job.
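Yoel's distinction here – readers like concise but tolerate its absence, while easy to understand is essential – maps onto the standard Kano evaluation table. The sketch below is purely illustrative, assuming the usual five-point Kano scale; it is not Yoel's actual survey instrument, and the example questions are hypothetical.

```python
# Standard Kano classification: each respondent answers a "functional"
# question ("How do you feel if the documentation IS concise?") and a
# "dysfunctional" one ("...if it is NOT concise?") on a five-point scale.
SCALE = ["like", "expect", "neutral", "tolerate", "dislike"]

def kano_category(functional: str, dysfunctional: str) -> str:
    """Look up the Kano category for one respondent's answer pair.
    A = Attractive, M = Must-be, P = Performance, I = Indifferent,
    R = Reverse, Q = Questionable (contradictory answers)."""
    f = SCALE.index(functional)
    d = SCALE.index(dysfunctional)
    table = [
        # dysfunctional:  like  expect neutral tolerate dislike
        ["Q", "A", "A", "A", "P"],   # functional = like
        ["R", "I", "I", "I", "M"],   # functional = expect
        ["R", "I", "I", "I", "M"],   # functional = neutral
        ["R", "I", "I", "I", "M"],   # functional = tolerate
        ["R", "R", "R", "R", "Q"],   # functional = dislike
    ]
    return table[f][d]

# "Concise" as described: liked when present, tolerated when absent
# -> Attractive (a delighter; "no big deal" if missing).
print(kano_category("like", "tolerate"))   # A
# "Easy to understand": liked when present, disliked when absent
# -> Performance (satisfaction scales with how well you deliver it).
print(kano_category("like", "dislike"))    # P
```

Classifying each dimension this way is what lets a study say "concise is nice to have" rather than "concise is required."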
[00:41:26.630] – Janice Summers
Well, you know, and good research is not always… I mean, it takes time.
[00:41:33.650] – Janice Summers
It does take time. It's not something you can just push out right away. You want to get the right amount of exposure.
[00:41:38.000] – Yoel Strimling
One of the funny things about research is you can always read another paper, right? As a grad student, that was one of the things that was always a problem. Right? I'll read the paper. You go down a rabbit hole of just reading one paper, then another paper, then another paper. If you read everybody else's papers, you'll never publish your own.
[00:41:53.780] – Yoel Strimling
And, you know, as an editor, I love reading. So it's always here's an article and there's an article. And I've got a stack of articles this big. Half of them aren't important. Half of them don't make any sense. But, you know.
[00:42:10.880] – Liz Fraley
There we go. So we have one question that I do want to work in. What are your thoughts about video?
[00:42:20.560] – Liz Fraley
And video replacing written documentation because at least it always seems to be what the push is.
[00:42:29.380] – Janice Summers
Right, which gets into that modality question that you brought up earlier: what mode do the content readers, or consumers, prefer? I guess they'd be content consumers at that point.
[00:42:40.700] – Yoel Strimling
It's "information," don't use "content." Information.
[00:42:45.630] – Yoel Strimling
I don't like content. Content is a very vague term. Information. Information leads to knowledge. So information quality, information consumers, information producers. I like to use the word information, not content.
[00:43:03.450] – Janice Summers
How do you deliver the information?
[00:43:05.070] – Yoel Strimling
Well, that's the interesting thing. That's not my research. My research is just not on how people prefer their information delivered. People want information to be accurate, relevant, easy to understand, and accessible. If the information in the video is accurate, if the information in the video is easy to understand, if the information in the video is relevant to me in context, and if the information is accessible, then the quality of the information should still be good. But a grainy– I'm always reminded of, who was it, Ann Rockley saying once, what was it, bad information reused is still bad information, right?
[00:43:46.700] – Yoel Strimling
It's true. You can have the most accurate, relevant, easy to understand, and accessible information. But if your video's choppy, or the sound's no good, or it's an endless talking head like this, which is really boring... You know, I was at a presentation once where they said, if all your videos are a talking head, don't bother. There's no point to that. I can't speak to how people prefer the modality of delivery. I talk about information, not delivery. Me, personally, I hate videos. You know why?
[00:44:18.770] – Yoel Strimling
Because of the rewinding back and forth, rewind, back and forth. Oh, wait, huh? What? I can't do that. I need to have the paper in front of me. Check, check, check. I print it out. I'm old school. I'm a tree killer. You know, I print it out. Check, yeah, I did this. And that, okay. I have to read it in front of me. So I can't use videos for instruction. But my opinion would be that if your information is accurate, relevant, easy to understand, and accessible, then work on the quality of the video.
[00:44:48.940] – Janice Summers
So no matter what modality you're using to deliver information, the quality model stands.
[00:44:57.730] – Yoel Strimling
Exactly. That's the whole point. How do we define information quality, or as I call it, documentation quality? Documentation quality – information quality – according to readers, must be accurate, relevant, easy to understand, and accessible. Intrinsic, contextual, representational, accessibility. And just as an aside, Wang and Strong's framework is used extensively throughout the quality management literature. Any number of studies have shown that it's really good at identifying and solving information quality problems. Ever since it was published in 1996, it's been used in lots of places, and it's been fine-tuned here and there.
[00:45:37.780] – Yoel Strimling
But if people are interested, I highly recommend looking up Wang and Strong's original paper.
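For listeners who want the framework Yoel keeps returning to in one place, here is a minimal sketch of Wang and Strong's four categories with abridged dimension lists, pairing each category with the dimension Yoel's research found mattered most to readers. The Python structure is just an illustrative summary, not code from the paper or from Yoel's study.

```python
# Wang and Strong (1996), "Beyond Accuracy": four categories of
# information quality, each with several dimensions (abridged here).
WANG_STRONG = {
    "intrinsic":        ["accuracy", "believability", "objectivity", "reputation"],
    "contextual":       ["relevancy", "value-added", "completeness", "timeliness"],
    "representational": ["ease of understanding", "interpretability", "conciseness"],
    "accessibility":    ["accessibility", "access security"],
}

# The four dimensions readers care about most, per this episode,
# one drawn from each category: accurate, relevant, easy to
# understand, and accessible.
TOP_FOR_READERS = {
    "intrinsic":        "accuracy",
    "contextual":       "relevancy",
    "representational": "ease of understanding",
    "accessibility":    "accessibility",
}

for category, dim in TOP_FOR_READERS.items():
    # Sanity check: each top dimension belongs to its category.
    assert dim in WANG_STRONG[category]
    print(f"{category:>16}: {dim}")
```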
[00:45:44.810] – Janice Summers
And the thing is, if you want to start implementing this model, you need to find a feedback mechanism. Right? You need to create a way to get feedback. Correct? Because you want something measurable and actionable. This is the thing: you want to measure, right? You want to implement, you want to measure it, and then you want to act on that. So what are some of the ways that people could create that in their practice, in their work environment?
[00:46:16.820] – Yoel Strimling
It depends. If you have access to your readers, surveys are the easy way, but they're not the best way, often not even close to the best way. The best way, of course, is to sit and talk to your readers face to face. I know people who do it in training sessions: the users come to the training sessions, and then they say, okay, let's talk about the accuracy. Let's talk about the easy to understand, about the relevance. Let's talk about the accessibility.
[00:46:47.050] – Yoel Strimling
If they focus on these four categories, these four dimensions, they're more likely to get meaningful and actionable feedback. If you can't talk to your readers and all you have are your SMEs or your manager, you can bring my papers and say, here, look, research has been done. Let's talk to the SMEs, and let's look at the documentation ourselves and mark accuracy issues, relevancy issues, easy-to-understand issues, and accessibility issues. Personally, I find that when I sit with my SMEs to go over documentation, it's very helpful to use these terms.
[00:47:22.070] – Yoel Strimling
They know what these terms mean and they know how to solve them. These are actionable things. This is inaccurate. Oh, why is it inaccurate? Because you put an X instead of a Y. Okay. This is not easy to understand? Why? Because it's written badly; write it better. Easier said than done, but that's my job as an editor. I make it very clear when I sit with my SMEs that they're the accuracy people and I'm the easy to understand person. Together we work as a team to create information quality for our readers.
[00:47:54.040] – Yoel Strimling
And that actually helps because SMEs who don't really care about readers sometimes, say you know what? That makes sense. Because me as a reader, says the SME, I also want it to be accurate, relevant, easy to understand, and accessible.
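The measure-then-act loop described above could be sketched as follows: ask readers to rate each of the four dimensions, aggregate, and act on the weakest one first. The 1-to-5 scale, field names, and sample responses here are hypothetical assumptions for illustration, not a prescribed survey design.

```python
# Hypothetical feedback aggregation over the four reader-facing
# quality dimensions: accurate, relevant, easy to understand,
# accessible. Scale and data are invented for illustration.
from statistics import mean

DIMENSIONS = ["accurate", "relevant", "easy_to_understand", "accessible"]

# Each response rates every dimension from 1 (poor) to 5 (excellent).
responses = [
    {"accurate": 5, "relevant": 4, "easy_to_understand": 2, "accessible": 4},
    {"accurate": 4, "relevant": 5, "easy_to_understand": 3, "accessible": 5},
    {"accurate": 5, "relevant": 3, "easy_to_understand": 2, "accessible": 4},
]

# Mean score per dimension, then pick the weakest to act on first.
scores = {d: mean(r[d] for r in responses) for d in DIMENSIONS}
weakest = min(scores, key=scores.get)

for d in DIMENSIONS:
    print(f"{d:>20}: {scores[d]:.2f}")
print("act first on:", weakest)   # easy_to_understand
```

The point of framing feedback this way is that each low score names a fixable problem in terms both writers and SMEs understand.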
[00:48:06.830] – Janice Summers
Right. So that kind of also helps overcome the "I want to explain everything." Because sometimes, in some situations, you run into a subject matter expert who wants a lot of features explained rather than the functions that a user might be interested in.
[00:48:24.120] – Yoel Strimling
Right. So, then you say, okay, listen, you say as a user, is this relevant? And they'll say, well, not really, no. And you'll say, well, what's relevant to the user? When I train writers, I often say, don't document the tool, don't document what the tool does, document what I can do with the tool. Think like a reader. As a reader advocate, that's how I have to think.
[00:48:47.440] – Yoel Strimling
So. I say, Is this really relevant to the reader. If the answer is no, goodbye, delete it.
[00:48:55.520] – Yoel Strimling
Liz, I'd like to make a comment. I saw something going on in the chat about usability. I'd like to say something about the term usability. We as technical communicators like to use the word usability a lot. And usability is one of those terms that doesn't have a lot of meaning. Ray Gallon once told me that saying documentation is usable is like going to a restaurant and saying the food is edible. It doesn't mean anything. A document's usable. What do you mean, usable?
[00:49:22.840] – Yoel Strimling
What do you mean, usable? I could use it. What do I use it for? I could use it to prop my door open. I could use it to line the birdcage. What does it mean, usability? Usability doesn't mean anything out of context. Documentation, to be usable, must be accurate, relevant, easy to understand, and accessible. Usability is a vague, generic term that I try to avoid using.
[00:49:52.600] – Janice Summers
Context is missing there. Right.
[00:49:57.670] – Liz Fraley
And we are out of time. What a way to end.
[00:50:00.750] – Janice Summers
Are we, already? Oh, wow.
[00:50:02.480] – Liz Fraley
We are. Well past, but it's been a really good conversation, so I'm not upset. And everyone's engaged. So, we apologize for taking extra time, folks, but this was great.
[00:50:13.630] – Liz Fraley
Thank you Yoel.
[00:50:14.460] – Janice Summers
And thank you, Yoel, for coming. And the article is fantastic. It's very well written. It's very accessible and it's very easy to understand.
[00:50:25.390] – Yoel Strimling
I hope it's both relevant and valuable.
[00:50:28.220] – Janice Summers
It is relevant and very valuable. Highly recommend. There are links for it. Everybody should go and check it out. I really appreciated the amount of work that went into doing this, especially considering that this is not your job to do.
[00:50:43.058] – Yoel Strimling
Well, it is my job. I'm a reader advocate. It's my job.
[00:50:46.710] – Janice Summers
But I mean the research work. This is above and beyond what you're doing in an already busy life. So I really, really appreciate it. Thank you very much for taking the time to come here as our first international guest.
[00:51:04.550] – Liz Fraley
Alright. Great. Everyone.
In this episode
Yoel Strimling has been spinning straw into gold for over 20 years, and currently works as the Senior Technical Editor/Documentation Quality SME for CEVA Inc. in Herzelia Pituach, Israel. Over the course of his career, he has successfully improved the content, writing style, and look and feel of his employers’ most important and most used customer-facing documentation by researching and applying the principles of documentation quality and survey design. Yoel is an STC Associate Fellow, a member of tekom Israel, and the editor of Corrigo, the official publication of the STC Technical Editing SIG.
As technical communicators, we put a lot of time and effort into creating the highest-quality documentation we can. We write because we want to help our readers do the tasks they need to do or understand the concepts they need to know. But what do we mean when we talk about “documentation quality”? What do our readers mean when they talk about it? And is it the same thing we mean?
For the past seven years, Yoel has been researching these questions. As a practicing technical editor, he doesn’t always have time available to investigate what readers want. But as a “reader advocate”, he feels that it is critical that technical communicators have solid and empirical evidence to help them do their jobs better.
Yoel's published research into how readers define documentation quality is being used by technical communication departments around the world to collect meaningful and actionable feedback, and provide reliable methods and metrics for measuring documentation quality.
Contact him at firstname.lastname@example.org
Mentioned during the episode
Don Norman's Emotional Design
Wang, R. and Strong, D. (1996, Spring). Beyond Accuracy: What Data Quality Means to Data Consumers. Journal of Management Information Systems, 12(4), 5-33. You can get access to the PDF here: http://mitiq.mit.edu//publications.aspx
Kano Model of customer satisfaction: https://www.career.pm/briefings/kano-model
Jared Spool on the Kano Model: https://www.youtube.com/watch?v=ewpz2gR_oJQ
Support TC Camp
We at TC Camp work hard to produce educational public service projects. We hope you’ll consider supporting our efforts to continue to produce innovative programs for the TPC community.