Subject: RE: Quality metrics for documentation?
From: SteveFJong -at- aol -dot- com
To: TECHWR-L -at- lists -dot- raycomm -dot- com
Date: Sat, 4 Mar 2000 16:43:50 EST

After I saw the original posting by Susan Peradze <susan -dot- peradze -at- peri -dot- com>, I
was well into composing a response when I read Geoff Hart's
<Geoff-H -at- MTL -dot- FERIC -dot- CA> masterful answer. But the question--is there a quality
metric that shows we technical writers write better documentation than
applications engineers?--cuts right to the heart of the matter, doesn't it? I
sense a real threat brewing, Susan! (The use of the word "objective," in
quotes, is the tip-off.) If we can do better work, shouldn't metrics show it?
If not, what is their validity? I would like to suggest another answer,
though it will ultimately be the same answer as Geoff's.

Your users pass final judgment on the effectiveness of your product,
including its documentation; they ARE the objective! Who does the better
work? A usability study would be definitive. Another route would be to submit
documents to an STC publications competition, without identifying either
party as an STC member. I am confident you would get the feedback you expect.
However, that isn't a usability test, only an examination to see how well
entries meet professional standards.

Short of that, there are objective metrics, but as Geoff points out, be careful
what you're measuring: we're self-aware lab rats, and we can jigger any
metric to advantage. If the measure of effective documentation is pages per
day, then your average secretary is the best writer of all (and the best
programmer, too 8^). Writing is not typing any more than programming is.
Anyway, you already realize productivity is a process metric; what you want are
product--that is, document--metrics.

The more different things you measure, the more likely you are to get a
coherent, unjiggerable picture. Document attributes commonly cited by users
as important are readability, clarity, accessibility, examples, graphics, and
indexes. There are easy metrics in those areas; offhand, I would suggest a
combination of the Flesch readability index, the percentage of passive-voice
sentences, headings per page, examples and graphics per page, and index hits
per page. Knowing engineers as I do, I'd be very surprised if they turned out
documents that measured up better in all those areas.
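
If you want to put rough numbers on the first two of those, here is a quick
sketch in Python that scores a block of text for Flesch reading ease and the
percentage of passive-voice sentences. The syllable counter and the passive
pattern are crude heuristics of my own, not what any real grammar checker
does, so treat the results as ballpark figures.

import re

def count_syllables(word):
    # Rough estimate: count vowel groups (real tools use pronunciation dictionaries).
    word = word.lower()
    count = len(re.findall(r"[aeiouy]+", word))
    if word.endswith("e") and count > 1:
        count -= 1
    return max(count, 1)

def flesch_reading_ease(text):
    # Flesch Reading Ease = 206.835 - 1.015*(words/sentence) - 84.6*(syllables/word)
    sentences = [s for s in re.split(r"[.!?]+", text) if s.strip()]
    words = re.findall(r"[A-Za-z']+", text)
    syllables = sum(count_syllables(w) for w in words)
    return 206.835 - 1.015 * len(words) / len(sentences) \
                   - 84.6 * syllables / len(words)

def passive_percentage(text):
    # Crude passive test: a form of "to be" followed by a word ending in -ed/-en.
    sentences = [s for s in re.split(r"[.!?]+", text) if s.strip()]
    passive = re.compile(r"\b(is|are|was|were|be|been|being)\s+\w+(ed|en)\b", re.I)
    hits = sum(1 for s in sentences if passive.search(s))
    return 100.0 * hits / len(sentences)

if __name__ == "__main__":
    sample = ("The file is copied to the server by the installer. "
              "Click Start, then choose Run.")
    print("Flesch reading ease: %.1f" % flesch_reading_ease(sample))
    print("Passive sentences:   %.1f%%" % passive_percentage(sample))

Run it over a comparable chapter from each side and see who comes out ahead;
headings, examples, graphics, and index hits per page are easy enough to tally
by hand.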

More abstractly, if you're creating end-user documents, an examination of
procedures might show a significant difference in approach. Do the procedures
support user tasks, or just product functions? Are tasks presented clearly
and consistently, with a reasonable number of steps appropriate to the target
audience?

Finally, a professional editor can cut through verbiage and improve clarity.
How red are your pages after editing? I'd bet the engineers' work would bleed
a lot more. (This probably isn't "objective" enough for you, though.)

-- Steve

Steven Jong, Documentation Team Manager ("Typo? What tpyo?")
Lightbridge, Inc., 67 South Bedford St., Burlington, MA 01803 USA
mailto:jong -at- lightbridge -dot- com -dot- nospam 781.359.4902 [voice]
Home Sweet Homepage: http://members.aol.com/SteveFJong



