4 Reasons You Should Regression Test Your Docs (and 5 Ways to Make it Happen)


I’ve been a technical writer for almost six years, and I’ve spent most of that time as the only technical writer in sight. For being the only one, I think I’ve done a pretty good job: I’ve set up docs development cycles, sprinted with Agile development teams, released alongside software deployments, created the documentation architecture for three major products, and generally got a lot of docs out in a short time.

All that said, my docs effort hasn’t been flawless. I’ve typically operated like a developer with no QA – reviewing my docs myself, but with little or no outside testing. Without a comprehensive QA step, errors have crept in and compounded. Recently, when I finally regression tested my own docs, I found an error on nearly every page. The most egregious errors provided information that was outdated or wrong.

I had expected to find errors, but I was shocked by their number and frequency. As I fixed them, I realized that regression testing the docs isn't something that's commonly added to documentation effort estimates, but it's a vital step.

Why regression test the docs?

It's good tech hygiene

You have a good product. You’ve hired the best developers you can get to create the best system they can make, and you have QA test it until it purrs.

You do all this because the code is parsed by a machine, and when the machine can't parse the code – or when the code is written to do something other than what you’ve intended – it causes errors that frustrate your users. The docs are the same way, but instead of being parsed by a machine, they're parsed by the human minds of your users. And just like errors in the code produce user frustration, errors in the docs do the same.

Errors in the docs erode trust in your brand

Okay, so the user encounters an error in the docs – so what? A typo here or there is trivial. But documentation errors can be much more insidious.

If the error is an outdated description of the product or a process, it misleads users when they need help the most.

If the error is a poor explanation of how to resolve an issue, it can make the issue even more stressful.

The user came to the docs because they needed help. They were likely frustrated or annoyed before they accessed the docs at all. If the help is wrong, they'll only become more frustrated. After all, if they can't trust the quality of the docs, how can they trust the quality of the product?

As soon as the help isn't helpful, your customer starts looking for a different product that can do what they want better, faster, and with less stress.

The docs are subject to errors

I can follow along with user stories and make updates just fine. I can even follow along if the developers pull stories from the backlog and add them to the current sprint. No biggie! But unless both your sprint hygiene and your team communication are excellent, a miss here or there is inevitable. Are there minor changes made between sprints? Any cracks in the stories? How about just this one little kink to work out a client request? What if a dev gets a little bored and, rather than pull something from the backlog, works to correct a personal pet peeve? What if I go on vacation and I just miss something when I come back? (I'm human too.)

The docs regress. These errors and missteps often mean almost nothing to start with, but they compound quickly. This makes the docs inaccurate, which makes them less trustworthy. When your docs aren’t trustworthy, you and your product aren’t trustworthy.

I can only police my own work so much

There's a reason you have a QA team, and it's not because your developers are too lazy to QA their own work. It's because, even though devs do QA their own work, they frequently miss edge cases or don't notice gaps in their own code. This isn't bad programming, this is being human. The docs are the same way. I have technique after technique for editing my own work, but errors still slip through. I need a different set of eyes to look at the docs critically and let me know where I've gone wrong. Far from being upset, I'll be happy to know that the errors have been caught and can be corrected.

How can you regression test the docs?

So in practical terms, how do you regression test the docs? After all, it's not like your team is swimming in free time, and the writer can't effectively test their own docs any more than a developer can effectively test their own code.

Download the Docs Regression Checklist and get started!

Put docs testing on the schedule

The foundation of making sure anything on the project gets done is putting it on the schedule. Docs regression is no different.

This is where people get nervous. But what about the velocity? What about client expectations? What about our commitments? If you give your writer good access to project resources and the team, chances are good you won’t need to lengthen sprints or the time between releases.

Remember: clear, concise, correct documentation is as much of a deliverable as the software itself, and it's frankly amateur to release without any documentation for your users. (Do you think Microsoft or Google release anything without updated docs? Hell no, they don’t.)

Make docs testing a hard gate

In addition to putting docs testing on the schedule, make sure it's a hard gate the team must clear before release. Make it policy that the docs must be edited and tested before release: the product doesn't go out the door till the docs do.

Just like putting docs testing on the schedule, this can be scary at first. You have critical release dates, client commitments, contractual obligations – the very lifeblood of your organization depends on timely releases. All the more reason to make sure the docs are right before they go out the door. With the hard gate in place, your team can prioritize correct and current docs in a way that reflects their importance to your mission (i.e., giving your customers a great experience with your product).
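Part of that hard gate can even be automated so the human testers start from a cleaner baseline. As a minimal sketch (the directory layout, version numbers, and marker words below are all hypothetical placeholders, not a prescription), a pre-release script could fail the build whenever the docs still reference a stale product version or contain an unresolved draft marker:

```python
"""Minimal automated docs gate (a sketch, not a full QA pass).

Scans a docs directory for two classes of regressions that humans
often skim past: references to superseded product versions, and
leftover draft markers. The patterns below are placeholders -- adapt
them to your own product's versioning scheme.
"""
import re
from pathlib import Path

# Hypothetical current release; anything matching an older 2.x is stale.
STALE_PATTERNS = [
    re.compile(r"\bv?2\.[0-3]\b"),      # versions before the current 2.4
    re.compile(r"\bTODO\b|\bFIXME\b"),  # unresolved draft markers
]

def find_issues(docs_dir: str) -> list[str]:
    """Return 'path:line: text' entries for every flagged doc line."""
    issues = []
    for path in sorted(Path(docs_dir).rglob("*.md")):
        for lineno, line in enumerate(path.read_text().splitlines(), 1):
            for pattern in STALE_PATTERNS:
                if pattern.search(line):
                    issues.append(f"{path}:{lineno}: {line.strip()}")
                    break  # one report per line is enough
    return issues

# In CI, the gate is just the exit code, e.g.:
#   python check_docs.py docs/ && release.sh
```

A script like this doesn't replace a human regression pass, but wiring it into the same pipeline that gates the code makes "the product doesn't ship till the docs do" a mechanical fact rather than a policy people can skip under deadline pressure.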

Set up peer reviews between existing writers

If you have multiple writers, try setting up peer reviews between them. (In fact, if you have more than one writer, they've probably already started peer reviews.) Make sure these reviews include a full regression on a regular basis. Even if the writers work on different projects, getting a second (or third, or fourth) set of eyes on the docs will help curb problems with clarity, accuracy, and organization.

Hire another tech writer (or better yet, a technical editor)

If you don’t have multiple writers, it may be worth considering hiring an additional writer or editor. Even if this person is only part time, this is the gold standard answer, and you'll get a gold standard result. A technical writer or technical editor has the advantage of understanding how the tech writing process works, what to look for in a good document, and how to make constructive suggestions for your existing writer. And because they didn’t write the doc themselves, they’ll spot errors and unclear passages that the author can’t.

Encourage the rest of the team to help

If you don’t have multiple writers to help with testing, get the team involved. Set expectations, split up the work, get it done, and get your product out the door. If docs testing is on the schedule and is a hard gate, it won't be difficult to get the rest of the team on board. Indeed, the faster they test the docs, the faster the software gets out. (Just don't let them get sloppy any more than you'd let QA get sloppy.) And if they gripe enough, perhaps it'll help justify the need for that technical editor.