Thursday 21 October 2010

Primo Usability Testing - Details Post #1

To try to keep this detailed write-up of our usability testing more manageable, we've broken it down into 3 posts. This is the first one.

Introduction / Background

The initial implementation of Primo took place in mid-July 2010, and the usability testing followed within 2-3 weeks, after staff had attended a two-day Ex Libris training session. The aim of the testing was to see how intuitive the Primo interface was, and to gauge how useful students thought some of the additional functionality would be, e.g. facets, the merging of records and FRBRisation, and Reviews & Tags.

We knew that it would be difficult to recruit students in the middle of the summer vacation, so we used the contact details we had collected during the name survey earlier in the year. This gave us all of our student testers, and we recruited 2 members of academic staff by asking Subject Librarians to contact lecturers they knew were helpful. We conducted 10 tests in total, with 7 of the testers coming from the Business, Environment & Society faculty and 3 from Engineering & Computing. We were unable to recruit students from the Health & Life Sciences faculty, possibly due to course structures and placements. All testers had used the library catalogue and/or the eLibrary (MetaLib) before, but had mixed library search skills.

• 2 x academic staff
• 5 x undergraduate students (two 2nd years and three 3rd years)
• 3 x postgraduate students

We conducted the usability tests in a usability lab within the university. The lab allowed us to record audio and video of the participants, along with a close-up screen capture which showed how the participants navigated around the screen. Within the analysis below, only the audio and screen-captures were used.

To become familiar with the usability lab environment, and to test the questions that we had written, we asked a member of library staff with some experience of enquiry work to run through the questions for us as a pilot study. This provided valuable feedback on the wording of some questions which we were able to amend before beginning the testing for real.

The tests themselves consisted of 12 scripted questions along with time for discussion at the end (including questioning about the location of facets and the participants' understanding of Reviews & Tags). Each test took around 45 minutes to complete, including discussion. We tried to use scenario-based questions so that students would feel that the tasks they were being asked to do would represent how they would approach the library search interface in 'real life'.

Analysis and Recommendations

Section 1 - Searching for books and e-books

Question 1 - You want to find out if the Library has the book “Operations management” by Nigel Slack from your reading list. Can you check if the Library holds this book?

All 10 testers were able to complete the task of recognising whether the library stocked a particular book; however, 4 testers required prompting to confirm that they had found the specific book in question. Of these 4 testers, 2 picked a book with the same title by a different author, 1 picked a differently titled book by the same author, and 1 became distracted by the ‘view 3 versions’ link on a different book. 5 of the 6 testers who completed the task without prompting did so in less than 1 minute. All testers recognised the search box immediately, and 1 tester commented on already being in the Books etc tab.

Question 2 - Can you tell us what floor it is on and what number it’s at in the Library?

In this question we wanted to see where on the screen the testers would find the location of the book. 9 out of 10 testers were able to give verbal confirmation of the floor and number the book was held at; the other tester confirmed the floor but not the classmark. The majority of testers found this information in the availability line in either the brief or full details screen, and 2 testers found it in the Locations tab. 2 testers needed prompting to give either the floor or the classmark, and the average time taken to find this information was around 19 seconds.

Question 3 - Can you tell us how many copies are out on loan?

Only 3 of the testers managed to find out how many copies were on loan without prompting. 4 testers completed the task only after prompting – many clicked on every tab except the Locations tab, and some commented that they would not look there because it would only tell you where the book was held, not how many copies we had. Many testers expected these details to be in the Details or More tabs. 3 testers were unable to complete the task at all; one of these failures was due to not recognising the ‘expand’ buttons in the Locations tab when there was more than one sub-library. 3 testers thought that the availability line status was clickable and that this was all the information available.
Recommendations:
• Re-label the Locations tab as something more intuitive, e.g. one tester suggested ‘availability’ – the ‘Locations’ tab has been re-named ‘Availability’.
• Ensure that if there are multiple sub-libraries, they expand automatically so that all copies are displayed – Ex Libris responded that this is not possible at this time.
• For Ex Libris: the possibility of making the availability line status clickable, taking users into the ‘Locations’ tab?

Question 4 - If all copies of this edition were out on loan, can you see if there are any other editions available to borrow and if so how many editions?

9 out of 10 testers completed the task by using the ‘View x versions’ link and confirming the number of editions available. Of these, 4 testers had already clicked on the ‘View x versions’ link as part of the previous task and had noted that it took them to a list of editions of the book. This indicated that we may need to change the labelling of the link to make clear that it meant ‘editions’ of the book and not ‘copies’. The one tester who did not complete the task was unsure what an ‘edition’ meant and did not notice the link in either the full or brief details screens. During test 7, the ‘View x versions’ link disappeared from the full details screen, so the tester was prompted to return to the brief results; other testers had seen this link when the system was working correctly.
Recommendations:
• Re-name ‘View x versions’ as ‘View x editions’ – this has been done.
• Ensure that the edition statement appears within the record – we have added this to the ‘Details’ tab.

Question 5 - You are at home and need to find some books and don’t have time to come into the Library. Can you find a list of e-books on the subject of human resource management?

5 of the 10 testers were able to find a list of e-books using the e-book facet on the results screen, 4 testers completed the task with prompting, and 1 tester did not complete the task but then found the answer whilst completing task 6. Several testers used the ‘Show Online Resources’ link but did not notice that this included journals as well as e-books. Several testers tried to find an ‘e-books’ option before attempting their search: 3 used the advanced search and looked in the format option expecting an ‘e-book’ choice. 3 testers thought that e-books would be in the e-shelf area and were confused by this. 7 of the testers saw the e-book facet – 4 without prompting.
Recommendations:
• Include brief instructions to users on the front page telling them that they can refine their search after retrieving a list of results.
• Re-name ‘e-shelf’ as this caused confusion as testers believed this was where they would find e-resources – this has been re-named as My Favourites.
• Explore further ways of distinguishing between journals and books in the ‘Show Online Resources’ results list.

Question 6 - Please try to access one of these items

In this task we wanted to see whether users could open an e-book from the results list, and whether they would be happy viewing it in the ‘letterbox’ preview screen that opens automatically when a user clicks on ‘View Online’. There were two ways of opening the e-book in a bigger screen: an icon on the right-hand side that enlarged the e-book within a bigger Primo window (the same view a user would get from the full details screen rather than the brief results list), and a link on the left-hand side labelled ‘Open Source in a New Window’ (OSINW) that opened the e-book in a full, non-Primo window. All of the testers noticed the ‘View Online’ link in either the brief or full details screens and understood it to mean opening the e-book. Almost all of the testers were unhappy about viewing the e-book in the small letterbox view, and many were also reluctant to view it in the bigger Primo window. 5 testers clicked on the icon from the brief view to open the e-book in a bigger screen. Only 1 tester saw the OSINW link from the brief details screen, and only 4 testers in total saw it in either the brief or full details screens. 4 testers attempted to use the internal options within the e-book itself to make it bigger, and several became quite frustrated at not being able to open the e-book in a non-Primo window.
Recommendations:
• Configure Primo so that clicking on ‘View Online’ automatically opens the e-book in a full-screen, non-Primo window (preferred option) – this is being looked into by Ex Libris and a fix is due to be released in an upcoming Service Pack.
• If the above is not possible, swap the behaviour of the OSINW link and the icon so that the icon opens a non-Primo window and the OSINW link opens a Primo window (or preferably both open in a non-Primo window).
• Possibly re-name the OSINW link text to something clearer.
