A Look at Combined Search

May 12, 2011

The Usability Group and User Experience Department have partnered on a project to improve the display of library website search results. A search of the library's website using the default "MLibrary" tab currently retrieves:

  • databases
  • items listed in the Mirlyn catalog
  • online journals
  • research guides
  • webpages from the library's website
  • collections (both digital and other collections)
  • items from the library's database of government documents
  • items listed in the UM institutional repository, Deep Blue
  • relevant library contacts, such as specific subject specialists and services

Each category displays at most 4 to 10 matches, depending on the category and the number of matches found.

MLibrary search results page

MLibrary search box

This represents the library's combined search. Adjacent to the MLibrary search tab are separate tabs for ArticlesPlus and Mirlyn, which retrieve results from those systems only. Combined search, in contrast, is designed to aid discovery by displaying diverse kinds of search results.
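
For illustration only, the combined search described above can be sketched as a set of per-category searches whose results are capped before display. Everything in the sketch is hypothetical; the category labels mirror the list above, but the types and functions are not the library's actual code.

```typescript
// Illustrative sketch only: hypothetical types and functions, not the library's real code.

interface Result {
  title: string;
  url: string;
}

// One search function per category; in practice each would call its own
// backend (Mirlyn, Deep Blue, the website index, and so on).
type CategorySearch = (query: string) => Promise<Result[]>;

interface Category {
  label: string;          // e.g. "Databases", "Online Journals"
  search: CategorySearch;
  maxDisplayed: number;   // each category shows at most 4-10 matches
}

// Run every category search in parallel and trim each list for display.
async function combinedSearch(
  query: string,
  categories: Category[]
): Promise<Map<string, Result[]>> {
  const perCategory = await Promise.all(
    categories.map(async (c) => {
      const matches = await c.search(query);
      return [c.label, matches.slice(0, c.maxDisplayed)] as const;
    })
  );
  return new Map(perCategory);
}
```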

The project undertaken by the User Experience Department and the Usability Group developed in part out of the library's implementation of ArticlesPlus in Fall 2010. For its initial implementation, ArticlesPlus was integrated into the library website as a separate search tab. The current project considers other options: should combined search include results from ArticlesPlus? If so, should those results be integrated into the existing design of search results pages or should we take a new approach?

To help us explore these questions, we examined how other websites, from peer libraries to commercial sites, organize multiple categories of results on a single search results page. We also compared search box designs, which inform the design of search results pages. The information collected below represents the variety of design strategies used to display complex and varied search results.

Combined Search Design & Feature Matrix

Number of search categories displayed
  • NCSU: 6 (Articles, Journals, Books & Media, Databases, Library Website, More Search Options)
  • Villanova: Combined search includes “Books & More” and “Articles & More”
  • UCSF: Combined search with results on 4 separate tabs

Number of columns
  • NCSU: 3
  • Villanova: 2
  • UCSF: 4 tabs, one column each

Collapsed results lists?
  • NCSU: Yes, “More Search Options” (4): IR, NCSU theses & dissertations, Historical State, Google Scholar
  • Villanova: None
  • UCSF: None

Descriptive text with search results?
  • NCSU: Yes, for Articles, Books & Media, and Library Website
  • Villanova: Yes, similar to native Mirlyn search results pages
  • UCSF: No

Lines between columns? Shading?
  • NCSU: Lines – yes. Shading – no.
  • Villanova: Lines – yes. Shading – yes, alternating between items.
  • UCSF: N/A

Notes

Penn State

  • "Search for databases by name or Try These First" link under database search box provides straightforward directions for user.  UM equivalent of "try these first" could be ArticlesPlus.
  • The search box area is clean and well-organized.

NCSU

  • Search box under "ALL" search indicates exactly what is being searched: "search books, articles, journals and library website"
  • Search box is clean, prominent, and clearly organized.
  • Search result page has a lot of information on it but is surprisingly easy to look at.
  • Features that contribute to readability:
    • vertical lines between columns
    • contrasting colors to distinguish headings
    • collapsed results for "More search options" such as the institutional repository
    • few results returned for each type with more information provided for each
  • Best bets at top of results is a cool feature.
  • "Books & Media" is a good label to describe catalog results.
  • Databases are presented in two places and the distinction is confusing.
  • Website search results are visually busy.

Villanova

  • Drop down of fields in the search box visually delineates what will be searched.
  • Books and articles are very prominent.
  • Where is other content, such as databases?

UCSF

  • Tabbed design keeps search results displays visually clean and clear.
  • Little – if any – descriptive information is provided for search results. More use could be made of the space that is gained through the tabbed design.

Syracuse

  • "Help me choose a search" link above the Search box on the homepage is useful.

Oxford

  • A tab toggles between Oxford collections (catalog) and journal articles (article discovery tool).
  • "Show only" links to limit to online or physical holdings.
  • Institutional repository is subsumed in catalog (?), eliminating the need for a separate section of results from that source.

University of California

  • Advanced search first displays hit counts for each search term; clicking a count displays the corresponding results. This is progressive disclosure of functionality (see the sketch below).
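
A minimal sketch of that count-first pattern, with hypothetical names throughout (this is not the UC interface's actual code); the backend calls are left behind an interface because only the interaction pattern matters here:

```typescript
// Hypothetical sketch of a count-first ("progressive disclosure") search UI.
// SearchBackend stands in for whatever service a real site would call.

interface Hit {
  title: string;
  url: string;
}

interface SearchBackend {
  countHits(term: string): Promise<number>;
  fetchResults(term: string): Promise<Hit[]>;
}

// Step 1: display only a hit count next to each search term.
async function loadCounts(
  backend: SearchBackend,
  terms: string[]
): Promise<Map<string, number>> {
  const counts = new Map<string, number>();
  await Promise.all(
    terms.map(async (t) => {
      counts.set(t, await backend.countHits(t));
    })
  );
  return counts;
}

// Step 2: fetch the full result list only when the user clicks a count.
function onCountClick(backend: SearchBackend, term: string): Promise<Hit[]> {
  return backend.fetchResults(term);
}
```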

Other Interfaces

Isotope.metafizzy.com

  • Dynamic filtering

Authenticjobs.com

  • Results refresh dynamically when checkboxes are unchecked (see the sketch below).
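
Both sites rely on the same client-side pattern: keep the full result set in memory and re-render the visible list whenever a filter control changes. A rough, generic sketch follows; the element IDs, Job type, and rendering are hypothetical and not taken from either site.

```typescript
// Generic sketch of checkbox-driven dynamic filtering; everything here is
// hypothetical and simply illustrates the pattern seen on the sites above.

interface Job {
  title: string;
  category: string; // e.g. "design", "development"
}

function renderList(container: HTMLElement, jobs: Job[]): void {
  container.innerHTML = jobs.map((j) => `<li>${j.title}</li>`).join("");
}

function wireFilters(allJobs: Job[]): void {
  const container = document.getElementById("results")!;
  const boxes = Array.from(
    document.querySelectorAll<HTMLInputElement>("input.category-filter")
  );

  const refresh = () => {
    // Show only the categories whose checkboxes are currently checked.
    const active = new Set(boxes.filter((b) => b.checked).map((b) => b.value));
    renderList(container, allJobs.filter((j) => active.has(j.category)));
  };

  boxes.forEach((b) => b.addEventListener("change", refresh));
  refresh(); // initial render; later changes update the list without a page reload
}
```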

Search Box Designs

NCSU

NCSU Search Box

http://www.lib.ncsu.edu/

Villanova

Villanova Search Box

http://library.villanova.edu/

UCSF

UCSF Search Box

http://www.library.ucsf.edu/

Syracuse

Syracuse Search Box

http://library.syr.edu/

Oxford

Oxford Search Box

http://solo.bodleian.ox.ac.uk/

Penn State

Penn State Search Box

http://www.libraries.psu.edu/psul/home.html

BYU

BYU Search Box

http://lib.byu.edu/

Authentic Jobs

Authentic Jobs Search Box

http://www.authenticjobs.com/

Posted by Julie Piacentine at 10:48 AM.

Library Gateway Usability Testing

July 22, 2010

The Usability Group & its Usability Task Force conducted a series of
evaluations of the Library Gateway (http://www.lib.umich.edu/) during the Fall 2009 and Winter 2010 semesters. We used a number of different methods, some new to us, to conduct our evaluations.

Participatory Design
This method was designed to gain a better understanding of which parts of the Gateway users find most and least useful, and to help inform our follow-up evaluations. (Discussed more fully in a later post.)

Card Sorting
This method was designed to help us re-categorize content currently grouped under Services, Departments and Libraries.

For the card sorting, we purchased a license for OptimalSort, which allowed us to put a card sorting exercise in front of many individual users. We sent this exercise to all of our Library staff and received 104 responses, an excellent rate of return. We also ran group card sorting sessions, a new method for us, with undergraduates and graduate students. Groups of up to 5 people sorted paper cards into categories through consensus.

Several similar categories surfaced across the various user groups, whether they sorted paper cards or used the online tool.
* Physical Locations: libraries and/or services with a physical location and hours of operation.
* Publishing: MPublishing, SPO and University of Michigan Press.
* Services: a broad category used by all groups which ranged from getting help with library resources to internal services for library staff.
* Administration: background support for library staff or, as one student said, “Stuff that students wouldn’t necessarily need.”

As a group, the Task Force also developed "unified" categories that captured the general scope of the categories our participants suggested, drawing on both the categories participants created and the comments they made during the card sort. Both the similar groupings and the "unified" categories were proposed as bases for further tests.

Guerrilla Tests
This method was designed a) to help determine the order of the headings on our search results and browse results pages, and b) to fine-tune the contents & labels for our Quick Links section.

We have used this method for many years. We call this "guerrilla testing" because we hope to get quick and short answers to quick and short questions. Five minutes is our goal!

For the search and browse results pages, we found that the section labels were confusing and inconsistent across the results templates, and that there was not enough metadata available for users to make informed choices. Participants in our guerrilla tests also wanted to see sections in a different order (e.g., Databases before Catalog). Our recommendations were to add more metadata to the catalog results (e.g., author, publication information, format) and to change the order on the results pages according to participant consensus.

For the Quick Links section, we found that our Library Outages link (which reports when databases are down or not working correctly) was neither understood nor considered useful within this section. More than half of users also requested the addition of a University-wide Webmail link. The Quick Links section was modified to take into account what we heard from participants.

You may access the full reports of the evaluations:
* Organization of Services, Departments and Libraries: http://www.lib.umich.edu/files/services/usability/libs-svces-depts-card-sort-report.pdf
* Search and Browse Results: http://www.lib.umich.edu/files/services/usability/Search_Browse.pdf
* Quick Links: http://www.lib.umich.edu/files/services/usability/QuickLinks.pdf

We were also fortunate enough to have a poster accepted at ALA Annual 2010 detailing our year's work: "Budget Usability without a Usability Budget".

Many thanks to the Task Force project managers (Kat Hagedorn & Ken Varnum) and the group members (Gillian Mayman, Devon Persing, Val Waldron, Sue Wortman, and Karen Reiman-Sendi) for all their hard work!

Posted by Kat Hagedorn at 09:48 AM.

LibGuides Usability Report

December 16, 2009

The Usability Group has been working hard! Our last task force project was to evaluate the recently implemented LibGuides.

LibGuides is a commercial, web-based content management system used to present the library's various subject and technology-based guides. Librarians can quickly and easily create guides to resources in a simple, modular format. LibGuides provides "boxes" in different content format types, designed to better display certain types of information (for example, links to web resources, RSS feeds, or Delicious tags).

First, we conducted a focus group to better understand the research habits of undergraduate students. Then we conducted a "guerrilla" test (aka simple or discount usability) to build on the focus group findings, which demonstrated that the language currently used to describe our LibGuides (e.g., "research guides") is confusing and misleading about the actual content found on LibGuides pages.

The two reports for this project can be found here.

Thanks to the task force project managers (Shevon Desai & Julie Piacentine) and the group members (Barbara Beaton, Jennifer Bonnet, Bill Dueber, and Karen Reiman-Sendi) for all their hard work!

Posted by Suzanne Chapman at 02:44 PM.