Your team is strong – you eat software development for breakfast. You get things done for the client, and the client loves you. Hell, your software is so good it doesn't even need documentation! Right? Despite your confidence, I'd argue you need a writer anyway (or at least a point person on documentation). That said, I think some of that belief that you don't need a writer stems from a misunderstanding of what a tech writer does and how they function within the team. Here are four ways a technical writer can make a strong team even stronger.
So you've been reading my blog and have taken to heart my message that you need docs. Aww! Thank you! I'm flattered. But now, down to brass tacks: how do you do it? How does this person fit into your team? How do you make them feel welcome?
I'm passionate about getting high-quality feedback on my documentation. This is my profession, after all – I don't want to turn in shoddy work and I don't want customers to suffer with sub-par help. I'm happy to get whatever feedback I can; I'll even take spelling and grammar feedback and eat it like the delicious cookie it is. But as I've worked more and more on software development projects, I've set my sights on a new type of documentation feedback: user acceptance testing (UAT).
I don't think I've ever heard of a software team that performs docs UAT. But the more I think about it, the more absurd that seems. After all, we do software UAT to give customers a chance to tell us whether our code meets their needs, flag any bugs we missed, and suggest how it could be further improved. That would be equally valuable information for the documentation! And yet I've never heard of a place that does this. But why? Or better yet: why not?
Both the internet and the bookstore are full to the brim with information about how to set up, run, manage, and get value from development teams. But for all that information, there's comparatively little about how to do the same for documentation. Docs seem to be a dark spot in the software development universe – one that can be hard to navigate. How do you know you're doing what's right? How do you know you're avoiding the pitfalls? And does any of it matter anyway? After all, docs don't have a profound effect on your business – right?
Let's dispel 6 documentation myths that might be hurting your business.
Good user experience (UX) is quickly becoming a differentiating factor for websites and software alike. With so many options at their disposal, users aren't incentivized to use your system when there's something else out there that's easier to use. But UX principles don't just apply to software; your docs need to be well-designed as well. After all, what's more frustrating than going to the help only to discover that you can't find the help you need? Use these 7 techniques to find out what your users need and to put UX best practices into effect for your documentation.
So you've taken it to heart that you need documentation. It's good for your business, it's good for your brand, you stood it up and put it out there... And still nothing is happening. Your customers are still unhappy; your support staff is still overwhelmed. But why? You have docs – aren't they doing their job? And that's exactly the question. How do you know if your docs are crap, and what do you do about it if they are?
Get your support agents ready to help gather clues: you'll need information about the exact problem to know where your docs are falling short.
I’ve been a technical writer for almost six years, and I’ve spent most of that time as the only technical writer in sight. For being the only one, I think I’ve done a pretty good job: I’ve set up docs development cycles, sprinted with Agile development teams, released alongside software deployments, created the documentation architecture for three major products, and generally gotten a lot of docs out in a short time.
All that said, my docs effort hasn’t been flawless. I’ve typically operated like a developer with no QA – reviewing my docs myself, but with little or no outside testing. Without a comprehensive QA step, errors have crept in and compounded. Recently, when I finally regression tested my own docs, I found an error on nearly every page. The most egregious errors provided information that was outdated or wrong.
I had expected to find errors, but I was shocked by the number and frequency. As I fixed them, I realized that docs regression isn’t something that’s commonly added to documentation effort estimates, but it’s a vital step.
What do we mean by "working documentation"? It's difficult to draw a parallel with software. After all, working software is software that works: it functions and doesn't have bugs. Does that mean "working documentation" is documentation that doesn’t have any "bugs," like grammatical or description errors?
That doesn't feel right. Nor should it. After all, it's hard to say documentation is "working" if it's simply error-free. For documentation, the definition of "working" is "serving the needs of the user." To serve the user, the docs need to be accessible – technologically, physically, and verbally. This goes for every member of the audience at which it’s directed.
I've been thinking about documentation in Agile for a long time, but only recently have I thought about what Agile documentation development itself might look like. Organizations have adopted Agile for its ability to empower teams to create high-quality products faster. From that angle, I wondered whether a documentation team might see some gains by adopting "docs Agile."
To clarify, I'm not talking about including technical writers on Agile teams (although I’ve written about that in the past). Instead, I'm doing a thought experiment to see what the Agile Manifesto would look like if, instead of applying to development teams, it applied to documentation teams. Throughout the experiment, I’ve tried to keep the docs team integrated with development and testing. Overall, I think these ideas are instructive in terms of what writing teams can do within an Agile framework.