All posts by Jayne Lees

Expert Search Early Adopters pilot – training evaluation

In mid-September 2020, HEE began a pilot to understand how best to help LKS in England move from using HDAS to using provider interfaces (EBSCOhost, ProQuest and Ovid).

We’re now heading towards the end of the project and are currently analysing all of the data we’ve collected so we can pull together a report with recommendations for moving forward. Today we’re sharing some of the results from our training evaluation survey – the valuable feedback we’ve collected from pilot participants will shape the way training is rolled out to the rest of LKS in preparation for migration from HDAS to provider interfaces.

Training was offered by all three interface providers (EBSCO, ProQuest and Ovid). The team at the University of Cambridge Medical Library also very kindly provided online training that was open to all pilot participants, regardless of geographical region. All of the training sessions were recorded and made available to those who couldn’t attend the live sessions. Of the 68 participants who completed the training evaluation survey, 93% had attended some training as part of the pilot.

When asked whether the training provided left them feeling ready to switch from HDAS to provider interfaces for literature searching, participants were split almost equally, with 51% answering yes and 49% answering no. We asked for further detail to help clarify the issues around preparedness, as this will be key to helping staff feel confident about moving away from HDAS. Those who answered ‘yes’ said either that a) the training sessions were a good starting point and they were ready to dive in and start practising (38%), or b) they already had some familiarity with provider interfaces, so the training acted as a refresher (34%). The responses from those who answered ‘no’ were more difficult to categorise. There was a feeling that the training on its own was not enough, and that people needed time to consolidate their learning and practise. There were also comments about the need for follow-up sessions, to give people an opportunity to ask questions and share experiences after the initial sessions.

When asked what they would like to change about the training offered, 68% of participants said that the sessions needed to be held earlier. There were definitely slippages in the timing of the whole project, and we appreciate that the training schedule didn’t fit with the expectations of the pilot participants. We know from other comments that people needed time to process what they had learnt, to practise searching so they could increase their familiarity with the interfaces, and to have the chance to come back to training if they needed to. All of the sessions were demonstrations with time for questions; 10% of participants would have liked hands-on training, with a chance to try a live search for themselves, although there was a recognition that this could be difficult in an online format.

There was a marked difference in the feedback for the training delivered by provider representatives and the sessions held by the University of Cambridge Medical Library. Although satisfaction levels were not drastically different, participants commented positively on the content and structure of the Cambridge sessions. Their training focussed on taking a search from start to finish in an interface, whereas participants felt that the interface providers were demonstrating functionality and features that weren’t necessarily relevant to the search process.

The next blog post for the pilot will be sharing some of the results from the data we’ve collected around ‘post-switch’ searches – those carried out in provider interfaces. Participants captured information about 296 searches, which is a fantastic resource for us to draw from.

For any questions about the project, please email

Emily Hurt, Lancashire Teaching Hospitals NHS Foundation Trust
Vicky Price, Vicky Price Consulting

To fine or not to fine? That is the question…

In the SWIMS Network we’ve recently put fining for overdue items under the spotlight. We’ve been looking into whether SWIMS Network library services charge fines or not, their reasons for doing so, and trends in library fining generally.

Here’s why we did that, and what we found out.

In March 2020 we went live with a replacement regional library management system. A new system needs a lot of configuration, and with limited capacity available we needed to decide priorities for the work. Most of the configuration work benefits all library staff and end users, and so this is our highest priority, but some libraries also request local configuration. One of these requests is the management of fines, as a small number of our libraries charge fines for overdue items.

It became clear that configuring the system for fines management would require a fair amount of work, which prompted us to review library fining generally. We carried out a survey of our 29 library services. How many libraries charge fines? What are their reasons? Have any stopped recently, and if so have they noticed any consequences? We also looked at literature reviews on the subject.

A literature review carried out for the BASE Patch in the West Midlands in 2019 didn’t find any particular trend in health libraries; however, it did reveal a trend away from charging fines in both public and higher education libraries.

An update to that review, carried out by the Health Education England Knowledge Management team in 2020, confirmed this trend away from library fining. The review summarised a number of arguments in the literature for not charging fines:

  • they are discriminatory from a socioeconomic angle
  • they are unnecessary barriers to membership; and
  • they pose a reputational risk, especially if seen as income generation (even if the income is re-invested in resources).

The review concluded that it is hard to see a case for them.  

These arguments were also reflected in the results of a 2020 survey of libraries in the SWIMS Network. Reasons not to fine, or to stop fining, also included:

  • they can jeopardise customer relations, and staff can feel uncomfortable imposing them
  • they incur an administrative burden and cost, including that of maintaining a cash register
  • they may deter people from accessing resources they need
  • they may be seen as a kind of fee to use the service, and so actually encourage users to keep items longer
  • they may not be in line with trust guidelines
  • with increasing use of cashless payments – especially in light of coronavirus – there is a barrier to collection for libraries without the necessary technology

Reasons to fine included:

  • they encourage people to return their books in good time so that items circulate
  • they generate some income
  • historical reasons (unspecified!)

Both libraries which fine and those which don’t mentioned the need to consider alignment with other services in the locality, including higher education libraries. However, a further consideration with a regionally shared NHS library management system like ours – which enables users to move easily between library services as they move between employers – is the administrative complexity involved when users with outstanding fines move to libraries which don’t fine.

Neither the literature reviews nor the survey provided any concrete evidence for the benefits of fining, either on stock circulation or on user relations. In terms of the impact of stopping fining, one comment from the SWIMS Network survey stands out:

“Along with letting people eat in the library, it’s one of the best things we’ve ever done.”

We would be interested to know if colleagues in different parts of the country have differing views on this question!

Jenny Toller

Library and Knowledge Services Development Manager, South West and South East (Thames Valley and Wessex) 


Expert Search Early Adopters pilot – pre-switch search data

In mid-September 2020, HEE began a pilot to understand how best to help LKS in England move from using HDAS to using provider interfaces (EBSCOhost, ProQuest and Ovid).

Phase one of data collection ended in November 2020 and we’re able to share some of the results from our survey around pre-switch searches (carried out primarily on HDAS). This was undertaken to capture ‘normal’ search behaviour, so we could do some comparing and contrasting with searches carried out on provider interfaces after the pilot groups switched. The data collected is a great snapshot of search activity and is fascinating reading if you’re interested in search behaviour.

We had 68 searches recorded during this phase. We asked participants to briefly describe their search – purpose, level of complexity etc. As expected, topics were wide ranging and search requesters were from a multitude of staff groups.

The most frequently used resources were HDAS Medline (76% of searches), HDAS CINAHL (63% of searches) and HDAS EMBASE (50% of searches). Results were collated using reference management software for 13% of searches, and EndNote Desktop was the most frequently used reference management tool.

Exactly 50% of searches were completed in a single session and the other 50% over multiple sessions. The time it took to complete a search varied wildly, with the shortest taking just 20 minutes and the longest 15 hours – this was a search to support a systematic review. Two of the searches captured fitted into the systematic review category, and as their times vastly skewed the average search time they were removed from the calculations. Across the remaining 66 searches the average time to complete was 2 hours 51 minutes, with most taking either 1–2 hours or 2–3 hours (44% and 29% respectively).

We asked participants to tell us what had gone well with their search, what didn’t go so well, and what changes could be made to improve their search experience. Things that worked well included being able to search multiple resources without switching interfaces, being able to collate results and search history into one document for the search requester, and searches where the topic was straightforward and therefore easy to find results for. There were common issues around glitches in HDAS, de-duplicating results, and the search topic either proving difficult to search for or being outside the scope of the databases available to the searcher. Possible improvements included increased stability (fewer interface glitches), less scrolling and a cleaner interface, and having access to reference management software to de-duplicate and collate results.

Finally, we asked people to give their search experience a star rating, where 1 is poor and 5 is excellent. The average rating was 3.7, with 34% rating 3 stars, 42% rating 4 stars and 21% rating 5 stars.

Phase two of data collection is now well under way, and we are asking participants to fill out a similar survey for any searches they carry out on one specific day of their working week. We’ll be sharing the results from this phase over the next few months.

For any questions about the project, please email

Emily Hurt
Lancashire Teaching Hospitals NHS Foundation Trust

Vicky Price
Vicky Price Consulting