
Metrics Update

The Metrics Task and Finish group is largely resting after the publication of their Report and the accompanying Quality Metrics Template.

I spoke about the work we had completed at HLG2016 Scarborough and it brought home to me the need to make the materials we had produced more accessible.  The strong attendance and active debate impressed on me that people were definitely interested but struggling to get to grips with how to proceed.

To address this I took advantage of the annual poster competition at #NHSHE2016 and submitted the poster below on Metrics.  I went with trying to hammer home the message about the four principles and what they mean in practice. Using MARC as an acrostic had the bonus of chucking in a feeble nerdy library pun.

The poster was well received (though only placed 6th in the competition) and I was particularly pleased to see a tweet afterwards sharing the poster with a group of other libraries after it had been raised at a network meeting.

I am hoping that people will share with me examples of how they have used the metrics work. Development is underway on a submission tool for sharing metrics, similar to the one being developed for impact case studies. In the meantime – do get in touch with thoughts, questions and comments.

Alan Fricker
Metrics TaF Chair

Building on good metrics

The Metrics Task and Finish Group is happy to be able to share the results of their work looking at principles for good metrics.  You can read our report now on the TaF Reports page.

The goal of this report was to create a shared understanding of what makes a good metric.  Through examining practice within NHS libraries and elsewhere we agreed a set of four core requirements:

* Meaningful

* Actionable

* Reproducible

* Comparable

There is more detail about what each of these means in practice in the report so do give it a read.

To build on the report we now want to encourage the recognition, creation and sharing of good metrics.  To this end we have also prepared a Quality Metrics Template. This is a brief document that can help you structure your thinking as you consider a metric. It also provides the kind of information others would need to be able to see if your metric will also work in their setting.  Please do share your completed templates with us – you can drop them in an email to alan.fricker@kcl.ac.uk as we work out how best to present and share them.  Hopefully we should have some to share with you shortly.


LKDN Statistics – what can we learn from them?

I have been trying to turn statistical data from the national collection into information that tells me something about trends and/or the health of our libraries in the south. My thoughts from this exercise may help the Metrics Task and Finish group as one of our next tasks is to review the statistics collection.

First of all I had to decide which of the 139 lines of figures submitted would work well in comparison across the years. I couldn’t compare everything as there probably aren’t enough hours in the day to go through the whole lot, besides which I think a certain boredom factor might intrude on the thinking processes. I worked with Tricia Ellis to decide which lines to include. We wanted some analysis that would identify trends and patterns of progress, investment and activity. We tried to work out which statistics would show our successes and went for:

* Income and expenditure

* Staffing levels

* Library activities, i.e. loans from stock, user education sessions, literature searches undertaken, etc.

* Changes to library infrastructure, e.g. WiFi access

These are the specific areas I looked at, comparing three years of submissions for the South West: 2012-13, 2013-14 and 2014-15.
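
To give a feel for the mechanics, here is a rough sketch in Python of the kind of year-on-year comparison this involved. The library names, statistics lines and figures are invented placeholders rather than real LKDN submissions; only the shape of the comparison matters.

```python
# A rough sketch of a year-on-year comparison of selected statistics lines.
# All library names and figures below are invented placeholders, not real
# LKDN submissions.

YEARS = ["2012-13", "2013-14", "2014-15"]

submissions = {
    "Library A": {
        "Mediated literature searches": [310, 355, 402],
        "Loans from stock": [5400, 5350, 5380],
    },
    "Library B": {
        "Mediated literature searches": [120, 118, 140],
        "Loans from stock": [2100, 1950, 1990],
    },
}

for library, stats in submissions.items():
    for line, figures in stats.items():
        first, last = figures[0], figures[-1]
        change = (last - first) / first * 100
        print(f"{library} | {line}: {YEARS[0]} -> {YEARS[-1]} = {change:+.1f}%")
```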

What did I learn?

Income and Expenditure – Library income has not changed very much over this period. Some libraries have lost income where staff have moved to other Trusts, but this meant that another library in the area then gained that income. The non-recurring income, however, has fluctuated wildly; only one library’s non-recurring funds remained fairly static. For some libraries this may be significant as they rely on these funds for their service. I felt that these funding figures didn’t really tell me a lot, so I had the idea of looking at the number of users and working out what each library spent per user.

Total number of users – is this an indicator of the busy-ness of the library? Of course this depends on data cleansing – if some libraries do not clear expired users from their systems they may be over-counting memberships.  To get a more meaningful comparison of libraries I split them into two groups – small and large – based on library staffing figures. I compared memberships with expenditure and was able to work out the average spend per user.  This made me ask a question: to show value for money, should we be increasing our library membership and decreasing our expenditure so that the average spend per user actually gets smaller?
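
To illustrate the calculation, here is a minimal sketch in Python of the "average spend per user" comparison. The library names, figures and the staffing cut-off used to split small from large libraries are all invented for the example and are not taken from the LKDN returns.

```python
# A minimal sketch of the "average spend per user" comparison, splitting
# libraries into small and large groups by staffing. All names, figures and
# the staffing cut-off below are invented for illustration only.

libraries = [
    # (name, wte_staff, expenditure_gbp, registered_users)
    ("Library A", 2.0, 45000, 1200),
    ("Library B", 6.5, 160000, 4800),
    ("Library C", 1.5, 38000, 950),
    ("Library D", 8.0, 210000, 6100),
]

SMALL_LIBRARY_WTE = 3.0  # assumed cut-off between "small" and "large" libraries

groups = {"small": [], "large": []}
for name, wte, spend, users in libraries:
    group = "small" if wte <= SMALL_LIBRARY_WTE else "large"
    groups[group].append((spend, users))

for group, members in groups.items():
    total_spend = sum(spend for spend, _ in members)
    total_users = sum(users for _, users in members)
    print(f"{group} libraries: average spend per user = £{total_spend / total_users:.2f}")
```
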
Staffing levels (wte and staffing mix) – the data includes staff with library qualifications, other qualifications or none.  I did not pick up on any trends in staffing levels; most remain static, with some discrepancies due to reported vacancies.  In the South West over 50% of staff have a professional qualification and are paid at Band 5 or above, and 12% of staff have a paraprofessional qualification. I found that we have had, and have maintained, a well-qualified set of staff.

Loans from stock – includes renewals, though there are variations as each library allows a different number of loans and renewals. In some libraries book loans to their own readers have gone down, but overall figures look fairly static, showing that our book stock is still important to library users. Loans to local networks have gone down slightly, as would be expected with the increase in ebooks.  This figure could be an indicator of the importance of the collection to others; shared resources remain vital to the cost-effectiveness of libraries and ease of access for our readers.

Copies supplied by other libraries – local networks, British Library, or others – I was looking for trends. Most libraries show a downward trend for document supply but there are exceptions with two or three libraries trending upward.  No conclusions to be drawn here.

Literature searches – total number of mediated searches – are these increasing or decreasing?  In the South West there has been a steady increase which is encouraging as this is one of the areas where some analysis of searches can show how the service impacts on management and clinical decision making.

User education and induction – numbers being made aware of our services. There appears to be a lack of consistency in the way librarians collect these figures and they vary greatly – for example, one library has done nearly 5000 inductions in one year whereas all the other libraries have figures nearer to 700. Two libraries simply can’t supply these figures but don’t say why.  Without comparing membership figures it is hard to tell whether the smaller libraries are doing just as well as the larger ones in providing user education.

Current awareness – bulletins, blogs, RSS feeds and social media. The number of blogs has increased, RSS feed figures are static, and social networking has increased, but some Trusts don’t allow libraries to use it, so it feels unfair to compare them. As we are simply counting yes or no in this area I can see the activity, but this figure does not tell me about outreach or impact. Do library managers analyse the activity in this area to get a more meaningful result?

Journals – print titles, electronic only, print with electronic. Electronic only titles have increased.  Nothing really useful to be learned here so why record it?

Collaborative purchase scheme – figures went up and then back down. Although collaborative purchasing is key to getting good value for money, the way we record it doesn’t tell us enough: how many resources were made available to how many libraries through these different schemes?

eBooks and Databases – these figures don’t have a story to tell, so why count them?

WiFi – most libraries have WiFi now; can we assume that this is the norm and stop counting it?

I would be interested to know if anyone else has tried to get a “story” from the annual statistics collection and if not, are there other statistics that you have used to plan services or make a decision?

Dorothy Curtis
Deputy Library Service Manager
Gloucestershire Hospitals NHS FT
Dorothycurtis@nhs.net