Evaluate

HCD Activity 4: Evaluate the designs against requirements

User-centred evaluation is a required activity of the HCD standard and should be carried out throughout the project. It should involve [ISO 9241-210, p.17]:

“a) allocating resources both to obtain early feedback in order to improve the product, and at a later stage, to determine whether the requirements have been satisfied;

b) planning the user-centred evaluation so that it fits the project schedule;

c) carrying out sufficiently comprehensive testing to provide meaningful results for the system as a whole;

d) analysing the results, prioritising issues and proposing solutions;

e) communicating the solutions appropriately so that they can be used effectively by the design team.”

The level of formality with which this is conducted increases as the project proceeds to completion. The final iteration of HCD Activity 4 is a summative evaluation; it confirms that the design meets the minimum level of the previously specified usability requirements (or the project fails). Prior iterations are formative; the results are used to shape the design.

Evaluation may proceed through the following stages:

4.1 Develop evaluation plan

This involves determining how to evaluate the design against the usability requirements. There are two aspects to this:

  1. Specifying the protocol for carrying out the evaluation: observational evaluation, survey, etc.
  2. Arranging the logistics of the evaluation activities: checking availability of users for test sessions, booking usability labs, etc. (a minimal plan structure is sketched below)
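Neither the HCD standard nor the CIF prescribes a format for the plan itself, but as an illustration a plan could be captured in a lightweight structure such as the Python sketch below. The field names and example values are assumptions made for illustration, not anything taken from ISO 9241-210.

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class EvaluationPlan:
    """Illustrative sketch of an evaluation plan covering protocol and logistics."""
    iteration: str                               # e.g. "formative round 1" or "summative"
    protocol: str                                # observational test, survey, expert review, ...
    usability_requirements: List[str] = field(default_factory=list)
    participants_required: int = 8               # assumed recruitment target
    lab_bookings: List[str] = field(default_factory=list)
    fits_project_schedule: bool = False          # ISO 9241-210 item (b)

plan = EvaluationPlan(
    iteration="formative round 1",
    protocol="moderated observational test with think-aloud",
    usability_requirements=["task completion rate", "time on task"],
    lab_bookings=["usability lab, week 12"],
    fits_project_schedule=True,
)
```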

4.2 Provide design feedback

This can be done in a number of ways:

  • Analytic evaluation of designs by experts
  • Involvement of the designers in observational evaluation sessions
  • Video recordings of ‘critical incidents’ during evaluation sessions shown to designers
  • Written reports from a separate evaluation team

4.3 Assess whether objectives have been achieved

After a number of design/evaluate iterations the designers will believe that they have succeeded in meeting the requirements and that their work is done. At this point a formal summative evaluation is carried out. The results are presented to the project sponsor, and agreement is sought that the usability objectives really have been met.
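As a minimal sketch of what the pass/fail core of such a summative check might look like, assuming the usability requirements were specified as measurable thresholds, consider the following; the metric names and threshold values are purely illustrative.

```python
# Measured results compared against previously specified usability requirements.
# Metric names and thresholds are assumptions for illustration only.
requirements = {
    # metric: (threshold, "at_least" or "at_most")
    "task_completion_rate": (0.90, "at_least"),
    "mean_time_on_task_s": (120.0, "at_most"),
    "sus_score": (70.0, "at_least"),
}

measured = {
    "task_completion_rate": 0.93,
    "mean_time_on_task_s": 105.0,
    "sus_score": 74.5,
}

def requirement_met(metric: str) -> bool:
    threshold, direction = requirements[metric]
    value = measured[metric]
    return value >= threshold if direction == "at_least" else value <= threshold

results = {metric: requirement_met(metric) for metric in requirements}
print(results)
if all(results.values()):
    print("All specified usability objectives have been met.")
else:
    print("Objectives not met; a further design/evaluate iteration is needed.")
```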

4.4 Field validate

This requires at least a working beta version of the software to be available. The actual usability of the product in use in its target context is examined.

4.5 Monitor long-term

A characteristic of software is that, with exposure to the working application, users change their behaviour and the product’s usability may change. The context of use may also change, and the task that the software supports is highly likely to change over time. Maintaining the usability of software is therefore an important aspect of the maintenance cycle.
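As a small, hedged sketch of what long-term monitoring might involve, the snippet below compares periodically collected measurements of a single usability metric against the value accepted at the summative evaluation; the metric, sampling interval and tolerance are all assumptions.

```python
from statistics import mean

# Baseline accepted at the summative evaluation (assumed value).
baseline_completion_rate = 0.93
tolerance = 0.05                       # assumed acceptable drift before investigating

# Task completion rate sampled periodically after release (assumed quarterly).
quarterly_samples = [0.94, 0.92, 0.88, 0.85]

recent = mean(quarterly_samples[-2:])  # average of the two most recent samples
if baseline_completion_rate - recent > tolerance:
    print(f"Usability drift detected ({recent:.2f} vs baseline "
          f"{baseline_completion_rate:.2f}); schedule a re-evaluation.")
else:
    print("Usability remains within tolerance of the summative baseline.")
```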

4.6 Report results

There is a standard form of documentation to report the outcome of a summative usability evaluation test: ‘BS ISO/IEC 25062:2006 Software engineering – Software product Quality Requirements and Evaluation (SQuaRE) – Common Industry Format (CIF) for usability test reports’. The format for the report has four major sections:

  1. Executive Summary: A high-level overview of the test and results for decision makers who may not read the remainder of the report.
  2. Introduction: Gives a description of the product under test and the test objectives.
  3. Method: Provides sufficient information to allow replication of the procedure and details of the data recorded.
  4. Results: Contains an analysis of the data obtained and a presentation of the results under the headings of performance and satisfaction.

Any additional material relevant to the report may be placed in appendices. The standard contains a template for formatting the summative evaluation report. Formative evaluation reports need not be nearly so formal; however, the rationale and background preparation of the kind documented in the standard report must in principle be available should the design team require it.
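The four-section structure lends itself to a simple report skeleton. The sketch below emits an outline using the CIF headings; the bullet points under each heading are drawn from the descriptions above rather than from the full template in BS ISO/IEC 25062:2006.

```python
# CIF headings with the content noted in the section descriptions above.
CIF_SECTIONS = {
    "Executive Summary": ["High-level overview of the test and results for decision makers"],
    "Introduction": ["Description of the product under test", "Test objectives"],
    "Method": ["Procedure, in sufficient detail to allow replication", "Data recorded"],
    "Results": ["Performance", "Satisfaction"],
}

def cif_skeleton(product: str) -> str:
    """Return a plain-text outline for a summative usability test report."""
    lines = [f"Usability Test Report: {product}", ""]
    for section, items in CIF_SECTIONS.items():
        lines.append(section)
        lines.extend(f"  - {item}" for item in items)
        lines.append("")
    return "\n".join(lines)

print(cif_skeleton("Example Product"))
```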

For some useful advice on report writing from another profession, see Report Writing.

Evaluation Tools

A useful listing of evaluation tools is available at 30 Useful User Experience (UX) Tools and 8 Usability Testing Tools When On A Budget.
