Subject: STC Competitions - evaluating usability
From: Laura Zupko <ZUPKO -at- STLVM27 -dot- VNET -dot- IBM -dot- COM>
Date: Tue, 1 Feb 1994 14:21:58 PST

Regarding Brian Daley's comment about usability testing of manuals
at the STC competitions:

In STC's 1993 Northern California Pubs Competition, the folks who judged
the Online Information categories conducted mini-usability tests
as part of their evaluations. The guidelines for conducting these
tests were part of the Online Judge's Handbook.

Entrants were asked to provide "tasks" that the judges would perform
with the product. Judges had to rely on the information provided in
the online help to complete each task (rather than guessing or
experimenting with the interface until they figured it out). They
were also asked to document the search paths they used to find the
information in the help: the search facility, the online index, the
table of contents, and so on.

While this approach worked well for some judges and entries, we
ran into problems when a judge received an entry targeted at an
industry the judge knew little about. For example, one judge had to
evaluate a drawing tool for chemists but had very little background
in basic chemistry. This kind of mismatch can unfairly color a
judge's opinion of an entry: the judge does not fit the product's
user profile and may find the information unusable simply for lack
of domain knowledge.

As one of the judges, I thought this was a great way of evaluating the
usability of the information. Obviously, we need to work out some of
the problems we ran into, but it was a good start.

Any comments on this, anyone?

