I love reviews of my work. (Yes, I'm a glutton for punishment). When it gets close to time for documentation reviews, it's like my Work Christmas: I get nervicited (nervous + excited) as I start to dream of all the comments I'll get back. Will people like the design? I wonder how many typos they'll catch... I hope they notice how careful I was to make the wording consistent! I really hope they catch a couple of errors, so I'll know everything else is definitely right....
Then the dread sneaks in: what if no one responds? When I don't have good feedback (or, almost worse, I get the single phrase "looks good"), it's hard to know if my work provided any value at all. I'm a perfectionist about my work, but I'm not delusional: there's definitely something wrong in there. Not even NASA gets it perfectly right the first time. But after weeks or months of staring at the docs, my eyes glide right over the problems. I know mistakes are there, and I need the team to help me find them.
For code, we have exhaustive testing of every function, every release. But in the sprint to catch the next person in the Agile relay (or to dive headlong across the tape in Waterfall), documentation reviews often fall by the wayside. Yet failing to test the docs is just as short-sighted as failing to test the code. The docs are our customers' companion as they traverse the world of the release. Good UX is a godsend, and flawless programming is great, but if you find yourself stuck on a desert island with a well-designed and thoughtfully constructed raft, you'll still want the instructions so you know you're putting it together properly.
That we need buy-in for doc reviews is obvious. But how can we get it? Well...