This is the text of a presentation I did last year at the Access Conference in Regina. Emma and I had plans to write this up as a paper, but life intervened and that didn’t happen. I wanted to keep some record beyond the video of the presentation, so here it is.
This morning I’m going to talk about a user research project I did with my colleague Emma Cross. We observed the user experience of students doing academic research online and then looked at that UX from the perspective of technical services staff. I’ll start by talking about the research we did with the students and the results that seemed most relevant to Technical Services staff, and then I’ll talk a bit about the reaction that Technical Services staff at Carleton had to those results.
I’m sorry that Emma can’t be here. She was the Technical Services brain behind it all; I was the user research monkey.
So, why did we want to do this? Mixing Technical Services and User Experience is not done very often. A Technical Services supervisor at Carleton told us she was finding it difficult to prioritize work for her staff, and she wanted some insight into what was likely to have the most impact for our users. Fantastic.
Emma and I designed the research to be student-led; we didn’t have specific questions but we were interested in where students searched, how they searched, and what kinds of things they looked at in their results. In our sessions, we asked students to search for something they needed for a research assignment, and to try as much as possible to do what they would normally do, not what they thought they “should” do. We emphasized that even though we were from the library, they didn’t have to use library tools or resources if they normally wouldn’t for the kinds of searches they were doing.
I moderated the sessions, asking the students to think aloud throughout their searches, prompting them with questions if they were quiet. We let them search until they seemed to finish but let them know when we neared 30 minutes. The sessions lasted anywhere from 10-40 minutes, but most were 20-30 minutes.
Emma took notes and we also captured the sessions on video, so we were able to go back and fill in gaps when people worked too quickly for Emma to capture everything.
We did the research in March of 2017 and saw 10 undergraduate and 10 graduate students. Emma coded the results and found 4 themes that she thought were most relevant to Technical Services.
Result #1: Overwhelming use of the single search box
Summon and/or Google Scholar were used by most of the students, and the catalogue not much at all. 7 people used various specialized databases, and there were also regular Google, Wikipedia, and Tumblr, but Summon and Google Scholar were really the most used.
There was little difference between grad and undergrad use of tools, except for catalogue use. The 2 people who used the catalogue were undergraduates. Kinda weird. But this is a good time to emphasize that this was a qualitative study, not a quantitative one; we’re not going to extrapolate that 20% of undergrads use the catalogue and 0 grad students do. The numbers don’t matter – it’s that when observing how students search and listening to how they approach looking for information, the library catalogue doesn’t often come up. It’s not part of their process.
Result #2: Popularity of the “get it” button
Emma’s second theme is the logical corollary to the overwhelming use of single search: the popularity of the “Get it” button and the link resolver in general.
I love the “Get it” link – it makes my life much easier. (Graduate student)
“Get it” is really useful (Graduate student)
“Get it” is helpful! (Undergraduate)
HEY LOOK Carleton offers to “get this” in Google scholar – HEY THAT IS GREAT! (Undergraduate)
Even when students didn’t mention it explicitly, they used it seamlessly. Maybe that seems obvious, but I have seen user research results from other university libraries where students had a hard time understanding their Get It links. Our students got “get it.”
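For readers outside Technical Services: a “Get It” button is typically a link-resolver link built from the citation’s metadata, following the general shape of OpenURL (ANSI/NISO Z39.88). Here’s a minimal sketch of how such a link gets assembled; the resolver address and the citation are hypothetical, and real resolvers accept many more keys than shown:

```python
from urllib.parse import urlencode

def build_openurl(resolver_base, **metadata):
    """Build a link-resolver URL from citation metadata.

    Follows the general key-value shape of OpenURL 1.0
    (ANSI/NISO Z39.88); real resolvers accept many more keys.
    """
    params = {
        "url_ver": "Z39.88-2004",
        "ctx_ver": "Z39.88-2004",
        "rft_val_fmt": "info:ofi/fmt:kev:mtx:journal",
    }
    # Map common citation fields to OpenURL "rft." keys.
    key_map = {"title": "rft.atitle", "journal": "rft.jtitle",
               "date": "rft.date", "issn": "rft.issn"}
    for field, value in metadata.items():
        params[key_map.get(field, "rft." + field)] = value
    return resolver_base + "?" + urlencode(params)

# Hypothetical resolver address and citation:
url = build_openurl(
    "https://library.example.edu/resolver",
    title="A study of student search behaviour",
    journal="Journal of Academic Librarianship",
    date="2017", issn="0099-1333",
)
```

The resolver at the other end takes those key-value pairs, checks the knowledge base for holdings, and sends the student to full text. That dependence on the knowledge base matters later in this story.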
Result #3: Metadata looked at: title, date, abstract; Metadata searched: keyword, keyword, keyword
A pattern we saw repeated over and over in student research was:
Scanning the search results list:
- Quickly review the title for relevant keywords
- Check the date; the majority of students weren’t interested in old material
- If interested, click on the record to read the abstract
- If the title, date, and abstract check out, download or print for further reading
Students are so used to this pattern, and to seeing abstracts or snippets of content, that when they don’t see an abstract (usually when they’re looking at a monograph record) they’re confused, and then they move on.
And although students look at different metadata fields, they rarely search them. Aside from a couple of author searches and one really heartbreaking subject search, most of the searches we saw were keyword, keyword, keyword.
Result #4: Speed, impatience and ease of access
Students quickly skimmed results lists and rarely went beyond the first page of results (or with Summon’s infinite scroll, the first 10 or so). Undergrads tended to look at fewer results than grad students.
Many students had no qualms saying they were busy and they didn’t want to waste time. There was a general tendency to skip over materials that were harder to access – things on reserve, in storage, or borrowed, documents that take a long time to download. Even when they did pursue these harder to access items, they weren’t necessarily happy about it. This is probably Emma’s favourite quote:
This is useful IF I can find it. It is not online so I will have to search the Library itself. This makes me cry a little.
Generally, the students we saw were easily able to find other things that seemed just as good, so skipping over hard-to-access items didn’t seem to create much of a problem.
Reaction from Technical Services staff
So these were the findings we thought were most relevant to Technical Services staff. There are no big surprises here, but we wanted to know how our own Technical Services staff would react to what we’d found. What would they take from our results?
In July, we gave a presentation for Library technical services staff, followed by a discussion.
Here are some of the first comments from staff, to give you a flavor:
- “On the library website, we now have a Summon search box instead of a catalogue search box, and maybe that’s why catalogue use was low.”
- “Are users even aware of the catalogue?”
- “Students don’t seem to be aware of subject headings. They should be taught about the catalogue and how to do subject searches.”
- “Maybe all first years could be given a booklet about how to search properly.”
So that was sort of the tone at the beginning. Then our Head of Cataloguing said something like “I’m not buying into this discussion that keyword searching is a bad search. Remember that keyword searches subject. Indexing is the most important part of this.”
Then the Technical Services supervisor whose questions started the project said something like “I found the part about Summon and the link resolver very interesting. This validates where we need to spend time. We can call out vendors where there is a consistent problem. Now I can be pushy to get issues resolved. If that is what students are relying on, then we have to make sure what we have is right.”
Yes, having a Head and a Supervisor weigh in like this is bound to change the tone, but things did become much more positive and proactive from here on in, with comments and suggestions like this:
- “I’m wondering about loading e-book records. Sometimes we have good records but they don’t have subjects. Perhaps now I can load these records as they have summaries so they would get picked up in a keyword search.”
- “Cataloguers can change the way we work and include table of contents and summaries in monograph records when we find them. Perhaps we could make this an official policy and procedure.”
- “Perhaps we can take more time to see how Summon pulls information and where that information is pulled from.”
Out of that discussion came some concrete directions:
- A move to better understand how our discovery system handles our records
- A push to enrich print and ebook records to improve keyword searching
- A renewed focus on making sure the knowledge base is accurate so the link resolver works
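The record-enrichment suggestions rest on how discovery-layer keyword search generally works: a query is matched against all of a record’s indexed fields, so a record with a summary or table of contents simply has more text available to match. A toy sketch of that idea (the field names and records are made up, and real discovery systems do ranking, stemming, and much more):

```python
def keyword_match(query, record):
    """Return True if every query term appears in some indexed field.

    A toy model of discovery-layer keyword search: terms are matched
    case-insensitively across all of a record's text fields, so a
    summary or table of contents gives the record extra ways to match.
    """
    haystack = " ".join(record.values()).lower()
    return all(term.lower() in haystack for term in query.split())

# Two hypothetical monograph records: one bare, one enriched.
bare = {"title": "Urban Planning in Canada", "subjects": "City planning"}
enriched = dict(bare, summary="Covers zoning, transit policy, and housing.")

keyword_match("transit policy", bare)      # no match: terms aren't in the record
keyword_match("transit policy", enriched)  # matches via the added summary
```

This is also why the Head of Cataloguing’s point lands: keyword search covers subject headings too, so indexing work still pays off even when students never run an explicit subject search.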
I know these aren’t necessarily ground-breaking ideas but less than an hour earlier, this same group suggested giving first year students a booklet on how to search!
Hearing that students mostly do keyword searches in Summon and Google Scholar was understandably a little threatening to staff who have a very catalogue-centric view of the library (because that’s where they spend most of their time). But very quickly, they moved on and were suggesting new ways of doing things, and new ways of thinking about their work. It was wonderful.
Technical Services and User Experience don’t usually cross over, but we saw that it can be a really good fit. Our students do their research online. Technical Services staff make decisions that affect how library resources are found online. So they are perfectly positioned to improve the user experience of our students. I’ll give the last word to one of our staff members, who after seeing our results said what I think we all want:
“Now I can attack the right problems with purpose.”