ERCIM News No.36 - January 1999
EINS-Web: User Interface Evaluation in Digital Libraries
by Silvana Mangiaracina and Pier Giorgio Marchetti
The construction of new user interface paradigms, or metaphors, for the interaction of humans with huge collections of information is currently a hot topic in both HCI and Digital Libraries (DL). We describe the EINS-Web user interface, designed and evaluated within the framework of the BRIDGE and CIME projects, co-financed by the European Union and coordinated by the European Space Agency and the EINS consortium (European Information Network Services).
The EINS-Web interface enables users to access distributed collections of bibliographic and textual databases, and provides seamless interaction with the World Wide Web. The interface design was guided by heuristic evaluation within a spiral design approach. This methodology was adopted because it is largely software-independent and proactive, making it relatively easy to incorporate suggested adaptations during the design and testing process.
The Design-Evaluation Spiral Process
The Library of the Italian National Research Council (CNR) in Bologna was selected as the test site for the evaluation of EINS-Web. The evaluation team consisted of a number of researchers from the CNR campus, each bringing their own information problems from a varied set of disciplines (chemistry, materials science, electronics, physics, geology, environmental science, etc.), plus a mixed group of evaluators made up of information specialists and user interface experts.
In constructing the EINS-Web interface, we reused the design effort invested in the previous version of the interface, BRAQUE PC (BRAQUE = BRowse And QUEry), developed for the Windows environment.
A heuristic evaluation method was then used to assess user satisfaction, ease of learning, ease of use, error prevention and efficiency of the interface, and served as a feedback tool to drive the spiral design process. The evaluation began in April 1996, when BRAQUE 1.2 was released. Problems identified by our set of users were analysed against the above heuristic criteria. A number of issues were signalled, mainly of an aesthetic nature, although some were serious and heavily influenced the user interaction. Where possible, suitable solutions were proposed. As the results of the BRAQUE PC evaluation exceeded expectations, it was decided to assess and improve the methodology in the design of the next-generation Web interface. This new version was called EINS-Web, as the international EINS consortium had decided to adopt and test this interface.
All the information collected during the BRAQUE evaluation sessions, such as the evaluators' opinions and the implementers' replies, was used as input to drive the design process for the EINS-Web interface. Two different evaluation sessions were conducted: one involving a mixed group of experts (evaluators) and users, and one with experts only. A new heuristic evaluation form was used by the evaluators: for each usability problem a rating was assigned, on an agreed scale from 1 (the interface does not take this problem into consideration at all) to 5 (this problem has been completely solved).
The evaluators were requested to identify potential usability problems and to link each problem to the heuristic(s) it violated; multiple heuristics could be linked to a given violation. In the experts-only evaluation session, the same evaluation, covering the same information problems, was run by four different groups of evaluators.
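By way of illustration, the short Python sketch below shows one way such an evaluation record could be represented: each reported problem carries a rating on the agreed 1-5 scale and a list of the heuristics it violates. The heuristic labels, field names and example data are our own assumptions for illustration; the article does not specify the exact layout of the form.

from dataclasses import dataclass, field

# Illustrative sketch only: the heuristic labels and record fields are
# assumptions, not the actual EINS-Web evaluation form.
HEURISTICS = [
    "visibility of system status",
    "match with the user conceptual model",
    "error prevention",
    "aesthetic and minimalist design",
]

@dataclass
class UsabilityProblem:
    description: str
    rating: int  # 1 = not taken into consideration at all ... 5 = completely solved
    violated_heuristics: list = field(default_factory=list)  # a problem may violate several heuristics

    def __post_init__(self):
        if not 1 <= self.rating <= 5:
            raise ValueError("rating must be between 1 and 5")

# Hypothetical record from an evaluation session
problem = UsabilityProblem(
    description="No feedback while a distributed search is running",
    rating=2,
    violated_heuristics=["visibility of system status"],
)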
Assessment of the Evaluation Methodology and Lessons Learned
The evaluation was useful in detecting certain design issues that had been somewhat neglected. In particular, two problem areas were identified: the first related to feedback and visibility of system status; the second to the users' background knowledge and conceptual model.
When the results of the four groups of evaluators were compared, it emerged that they had used all the heuristics present in the evaluation form, either in positive matches (ie a score >= 3) or in negative matches (ie one or more recognized problems were linked to the heuristic). This result shows that all the pre-selected heuristics were relevant.
We also noticed slight differences in the evaluation results depending on whether real users took part. A pure heuristic evaluation session (interface experts only, no real users) tended to detect problems relating to the interactive behaviour of the interface, such as user behaviour, the conceptual user model and aesthetic design, whereas the evaluation with real users made it possible to examine concrete, real-world information seeking problems. Matching interface design to user expectations is difficult when the information space is dispersed over very large collections: expert users want more interface functionality to achieve their goals, whilst non-expert users want less functionality in favour of intuitive and simple features.
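As a minimal sketch of the comparison described above (and assuming ratings and problem links are collected per evaluator group), the fragment below flags a heuristic as used when it received either a positive match (score >= 3) or at least one linked problem. The data shown is invented for illustration and is not the authors' actual analysis.

# Illustrative aggregation only, under the assumptions stated above.
def heuristics_used(ratings, problem_links, threshold=3):
    """Heuristics matched positively (score >= threshold) or negatively
    (linked to at least one reported problem)."""
    positive = {h for h, score in ratings.items() if score >= threshold}
    negative = set(problem_links)
    return positive | negative

# Hypothetical data for one evaluator group
ratings = {"visibility of system status": 2, "error prevention": 4}
problem_links = {"visibility of system status": ["no feedback during search"]}

print(sorted(heuristics_used(ratings, problem_links)))
# -> ['error prevention', 'visibility of system status']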
As a preliminary conclusion, we feel that the interaction among evaluators, implementers and designers contributed significantly to the success of the spiral design methodology, and is essential to cope with the requirements of designing interfaces targeted at the rapidly evolving Internet world.
Please contact:
Silvana Mangiaracina - CNR, Bologna
Tel: +39 051 639 8026
E-mail: mangiaracina@area.bo.cnr.it