Introduction
In the challenging and constantly changing environment
of higher education, academic libraries are increasingly being tasked with
demonstrating the value they provide to the university and its wider
stakeholders. In the past, the assessment of library value was measured by inputs
and outputs, which are mostly concerned with internal library processes and
outcomes; however, to better align with institutional mission and goals,
academic libraries are being forced to look outward and must now articulate and
provide evidence of their value outside of the library. It is therefore no longer reasonable for academic libraries to take for granted their role as “the heart of the university” (Stemmer and Mahan, 2016, p. 359). This shift in the assessment of library value has necessarily also led to a shift in the research, presenting an opportunity for impact studies to examine the relationship between use of library resources and student outcomes.
Impact studies can be described as
‘analyses that [seek to] demonstrate an alignment of library activity with the
mission of the institution’ (Revill in Allison, 2015, p.31). The purpose of
this research is to provide an overview of the current state of the literature
on one aspect of library activity – that is, library use. In this way, this
paper seeks to describe the current state of research into the relationship
between use of library resources and student outcomes. This paper also sets out
the current research agenda for library assessment and provides an overview of opportunities
and future steps for professional development, so that librarians have the core
competencies to better document and communicate to stakeholders the impact the
library has on student outcomes.
As the main focus is on library use,
research that looks at factors such as library expenditure, collection
development and staffing is out of scope for the purpose of this paper. Furthermore,
studies that look at library impact on teaching effectiveness, the research environment, and overall institutional quality and assessment are also out of scope. As this review is concerned with student outcomes at higher education
institutions, only research into use of resources at academic libraries has
been considered; however, it should be noted that academic librarians can also
learn from their counterparts in other types of libraries. A variety of
publication types were sourced including articles, conference papers, case
studies, reports and websites, which cover the experience of academic libraries
in international and Australian contexts. The material reviewed surveys the
various methodologies being employed in the research, such as surveys, focus
groups, data from library and institution systems, and other measures which compare
library usage with evaluations of student success.
The many-faceted
meanings of use
At the outset, it is important to understand what is meant by use in the context of the use of library resources. Fleming-May (2011) has conducted interesting
research into understanding the various discursive meanings and construction of
the use concept in the professional
and scholarly journal literature. Whilst use is deployed in the literature as a
concept with a seemingly universal meaning, in practice there is no agreement
on what a use is and the concept of library use
as presented in the LIS literature has ‘several separate and appreciably
different facets, or meanings’ (Butkovich, 1996; Fleming-May, 2011, p. 306). To
clarify and illuminate its polysemic meanings and construct a typology, Fleming-May
(2011, p. 301) applies Beth L. Rodgers’s Evolutionary
Concept Analysis (ECA), which is
‘an approach that considers the ways in which a concept is applied within a
given context in order to identify its attributes within that context.’ ECA has
similar epistemological foundations to Foucauldian discourse analysis, which is
a methodology commonly applied in the social sciences (Fleming-May, 2011).
In the analysis of the literature,
Fleming-May (2011) suggests that library use can be organised into four
categories:
use of the
library as an abstraction, or general idea; use of the library as an implement,
or tool; use of the library as a transaction or occurring within a discrete
instance; and use of the library as a complex process (p. 306).
For the purposes of this paper we will only
be concerned with use of the library as a transaction or instance. Use as an instance refers to the
‘transactional instances of the library or information that can be recorded and
quantified’; for example, circulation, interlibrary loan requests, database
usage, gate counts, reference questions answered, etc. Whilst this approach
provides quantitative data on the transactional use of the library’s resources
it does not provide any qualitative examination of the user’s motivation in
choosing to access those resources (Fleming-May, 2011). When discussing the
transactional instances of electronic resources there are a number of
challenges, so any discussion of usage of electronic resources should consider
the differences in the way vendors generate reports as well as the
inconsistencies in defining transactional instances such as clicking on or
downloading material – both of which are frequently referred to as “usage” or “use” (Fleming-May, 2011).
Directly relevant to this review,
Fleming-May (2011) also notes how interest in better understanding library use
has intensified in recent times ‘due to changing opinion about appropriate
methods for measuring library effectiveness.’ There has been a shift away from
the traditional quantifiable measures of inputs and outputs (for example,
through counting transaction uses) to more qualitative approaches such as
outcomes-based assessment. This is being driven at the institutional level, and
as academic libraries are increasingly being tasked with demonstrating the
value they provide to the university and its wider stakeholders, ‘measuring
inputs and outputs has become an increasingly inadequate method of
demonstrating the ways in which libraries contribute to their communities’
(Fleming-May, 2011, p. 300).
The higher
education landscape
A review of the literature quickly shows
that academic libraries are increasingly being faced with the challenge of
proving their value to the institution and its stakeholders. Oakleaf (2010, p.7)
cites how higher education providers have had to adopt corporate values and
practices, which have caused an ‘internal paradox between assessment to improve
academic programs and assessment for external audiences designed to answer
calls for accountability from policy makers and the public.’ Driven by
increasing demands for accountability, colleges and universities are under
pressure from their stakeholders to prove value ‘in terms of student outcomes such as persistence, graduation, and employment, as well as student learning outcomes’ (Matthews, 2012; Saunders, 2015, p. 285).
This change in the higher education
landscape has also disrupted the symbolic and seemingly protected status of
academic libraries as the “heart of the university” (Oakleaf, 2010; Allison,
2015; Stemmer and Mahan, 2016). In the same way that higher education
institutions are required to demonstrate evidence they are achieving their goals,
academic libraries must also prove their value. Academic libraries can ‘no
longer rely on their stakeholders’ belief in their importance’ and must now ‘demonstrate
their value’. Therefore, ‘[l]ibrarians are increasingly called upon to document
and articulate the value of academic and research libraries and their
contribution to institutional mission and goals’ (Oakleaf, 2010, p.4). So,
academic libraries must now ask the ultimate question: “How does the library
advance the missions of the institution?” (Oakleaf, 2010, p.11).
The quest
for data and its challenges
In light of these changes in the landscape,
library leadership has had to find data that demonstrate the value of the
library in institutional terms (Stemmer and Mahan, 2016). At the 2010 Library
Assessment Conference, the keynote speaker suggested to the audience that:
In this digital
age you are in possession of a valuable resource, library transactions data for
your student, staff and faculty patrons. That data can be used to evaluate the
impact of library services and resources on outcomes of value to the university
(Shulenburger, 2010, p. 4).
And as Oakleaf (2010) notes,
until libraries
know that student #5 with major A has downloaded B number of articles from
database C, checked out D number of books, participated in E workshops and
online tutorials, and completed courses F, G, and H, libraries cannot correlate
any of those student information behaviours with attainment of other outcomes.
Until librarians do that, they will be blocked in many of their efforts to
demonstrate value (p. 96).
With this aim in mind, Matthews (2012,
p.257) suggests that it would be useful to build a data warehouse that could
pull together a large array of data across various systems and silos of the
institution, and from this central data repository, ‘the library would be able
to prepare a wide variety of data analysis and correlations to help determine
the value of library resources’. In some cases, rather than the library
developing its own data warehouse, a campus data repository may already exist,
so libraries should discover what other resources are available and work in
partnerships with other campus departments (Matthews, 2012). To address privacy concerns, Oakleaf (2010) recommends that data systems strip out individual identifiers from information records so that individuals cannot be identified.
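To make the idea concrete, the following minimal sketch (in Python, using pandas) shows how library transaction records might be combined with registrar data through a pseudonymised identifier of the kind Oakleaf describes. The file names, column names and hashing approach are illustrative assumptions, not a description of any particular institution’s systems.

```python
import hashlib
import pandas as pd

# Hypothetical extracts: loans.csv from the library system and students.csv
# from the registrar. File and column names are illustrative assumptions.
loans = pd.read_csv("loans.csv")        # columns: student_id, item_id, loan_date
students = pd.read_csv("students.csv")  # columns: student_id, major, gpa, retained

def pseudonymise(student_id, salt="institution-secret"):
    """Replace the raw student ID with a salted one-way hash so records can
    still be joined across systems without exposing identities."""
    return hashlib.sha256((salt + str(student_id)).encode()).hexdigest()

for df in (loans, students):
    df["student_key"] = df["student_id"].apply(pseudonymise)
    df.drop(columns=["student_id"], inplace=True)

# Aggregate library transactions per (pseudonymised) student, then join with
# the outcome data held by the registrar.
usage = loans.groupby("student_key").size().rename("loan_count").reset_index()
merged = students.merge(usage, on="student_key", how="left").fillna({"loan_count": 0})

# A simple illustrative correlation between borrowing and an outcome measure.
print(merged["loan_count"].corr(merged["gpa"]))
```

In practice this joining would happen inside an institutional data warehouse rather than in ad hoc scripts, and hashing alone is not sufficient de-identification, but the shape of the problem is essentially the one described above.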
Value and
impact of academic libraries
In 2010, to understand and meet these new
challenges, the Association of College and Research Libraries commissioned the
report, Value of Academic Libraries: A
Comprehensive Research Review and Report. The report provides a thorough
review into the current state of the literature on the value of academic
libraries within an institutional context and sets out a research agenda that
has sparked new research in impact studies, which is assisting academic libraries to better articulate their value in institutional terms to their stakeholders (Oakleaf, 2010, p. 25; Stemmer and Mahan, 2016). Based on the
literature, the report also presents recommendations for how academic libraries
should demonstrate value, identifies potential surrogates for library value,
and suggests possible areas of correlation for the collection of library data
(Oakleaf, 2010). It should be noted that the report does not provide an
overview of methods for assessing library value within a library context.
However, this review does provide an overview of various methodologies in the
section below. The importance of the report can be seen in the frequency it is
cited in the literature, and by virtue of its reference in the leading
statement on “Value and Impact of University Libraries” on the website of the
Council of Australian University Librarians (CAUL).
Implications
for professional development
The report also suggests that ACRL create a professional development
program to build the profession’s capacity to ‘document, demonstrate, and
communicate library value in alignment with institutional goals’ (Brown and
Malenfant, 2012, p. 4). Based on this recommendation, ACRL, in partnership with other professional organisations, convened two summits under the auspices of the “Building Capacity for Demonstrating the Value of Academic Libraries” project. An outcome of the summits was a report titled, Connect, Collaborate, and Communicate: A Report from the Value of Academic Libraries Summit (Brown and Malenfant, 2012). The report provided a number of important recommendations, some of which are set out below.
Recommendation 1: Increase
librarians’ understanding of library value and impact in relation to various
dimensions of student learning and success.
The report recommends that assessment of student learning should take
into consideration a number of factors including demographics, learning styles,
educational goals, motivations and instructional format using a variety of
qualitative and quantitative methodologies, such as surveys, testing,
comparative data, interviews, etc. Based on this recommendation, the report
suggests a number of actions for the library profession, which include
developing a research agenda that considers the key questions raised in
Oakleaf’s (2010) report; investigating how the library can increase library
impact; and identifying common data sources available at the institution that
can be combined with library data to document student learning and success
(Brown and Malenfant, 2012, p. 12).
Recommendation 2:
Articulate and promote the importance of assessment competencies necessary for
documenting and communicating library impact on student learning and success.
As well as identifying a need for skills in incorporating outcomes into
library planning and evaluation, and leadership in being able to lead
conversation about assessment, the report highlights a need for data
competencies so that librarians can apply ‘knowledge of assessment data, including the different roles of quantitative and qualitative data, sources of data, and the analysis and interpretation of data’ (Brown and Malenfant, 2012, p. 12). This is echoed by Matthews (2012, p. 257), who says that ‘being a good
“data jockey” will increasingly become a real marketable skill for librarians.’
Oakleaf (2010, p. 29) points out the positive opportunities for the
profession noting that ‘the current higher education environment offers
librarians an opportunity to accelerate change.’ By taking this opportunity to update their roles, ‘librarians can reconceptualise their expertise, skills, and roles in the context of institutional mission, not traditional library functions alone.’ Therefore, professional development of
current and future librarians is necessary so that librarians can better
articulate – in institutional terms – the impact and value the library has in
contributing to the institutional mission.
The research
agenda in practice
Having understood the factors driving the
need for academic libraries to demonstrate their value in institutional terms
to stakeholders, and the research agenda at large, we can now look at specific
examples of research into the use of library resources and student outcomes. The following research has been reviewed in the context of the preceding discussion. Whilst
the demand for research into the value
of academic libraries that is articulated in institutional terms is relatively
recent, studies of the impact of
academic libraries on student outcomes can be traced further back. The need for
research in this area was first articulated by Lane in 1966. In his seminal paper, “Assessing the Undergraduates’ Use of the University Library,” he recognised that assessing the library in terms of its physical facilities, collections and budget was not a sufficient ‘measure of the library’s effectiveness as an instrument of education’ (Lane, 1966, p. 277). He argued that ‘such measures can be obtained only by assessing the extent to which students use the library and the extent to which use relates to academic growth’ (Lane, 1966, p. 277). Lane also acknowledged some of the difficulties in
conducting this type of research noting issues that continue to plague
researchers and the methodologies employed to the present day. In particular,
he notes how these types of assessment are time-consuming, expensive and
difficult to achieve with complete objectivity (Lane, 1966, p. 277). However, he does note that these types of studies can produce worthwhile results for the library’s stakeholders by providing ‘information useful to administrators, students and faculty’ (Lane, 1966, p. 277).
A few years later, Kramer and Kramer (1968)
published a study that investigated the connection between library use,
retention and GPA scores among college freshmen at California State Polytechnic
College in the United States. Library use was measured by looking at library
loan records, which indicated how many books were checked out; and data that
showed factors such as name, sex, major, return or non-return to school in fall
1964, and grade point average (GPA) were obtained outside the library from the
registrar’s office (Kramer and Kramer, 1968). The study found that students who
borrowed no books during the period achieved a lower GPA than students who used
the library. The data also strongly indicated that on-campus residence was associated with higher persistence or retention. According to
Kramer and Kramer (1968, p. 312), their research appeared to show ‘a strong and
statistically significant correlation between library use and student
persistence.’ Based on the findings that a high proportion of students did not
borrow any library books during the research period, they suggest that counseling
and orientation could be productive in improving academic success and
persistence (Kramer and Kramer, 1968).
Almost two decades later, Hiscock (1986) directly
posed the question: “Does library usage affect academic performance?” At the
time Hiscock (1986, p.207) acknowledged the lack of interest in the library by
the institution, noting ‘a degree of ignorance of what really happens in
libraries and an absence of research to investigate the relationship between
usage of libraries and academic performance.’ The aim of Hiscock’s (1986)
research was to examine whether library use affected the academic outcomes of
the students surveyed in the study. Similar to Kramer and Kramer (1968),
Hiscock (1986) wanted to understand whether students who used libraries
performed better academically than students who did not.
Data was gathered through a survey of 196
students across selected first and second year undergraduate courses at the
Underdale campus of the South Australian College of Advanced Education in
Australia. The questionnaire asked questions about various types of library
usage including: usage of library staff; the catalogue; resources such as
encyclopaedias, indexes and abstracts; photocopying facilities; and use of the
library as a place for private study (Hiscock, 1986, p. 208).
Looking at various information-seeking
behaviour models the study sought to adopt an existing model ‘to aid in the
construction of hypotheses to evaluate the effect of library usage on academic
performance’ (Hiscock, 1986, p. 209). Hiscock (1986) arrived at nine
hypotheses, which she tested using statistical methods and reported on the
results for each hypothesis. Overall, the results were generally disappointing; however, she did identify two areas that were associated with positive academic performance: previous experience of using libraries and use of the library catalogue (Hiscock, 1986, p. 213).
In 1992, the search for methods for better understanding
library impact on student outcomes continued with Powell’s study, “Impact
Assessment of University Libraries: A Consideration of Issues and Research
Methodologies.” Powell (1992, p. 249) notes that since the early 1970s, much of
the interest in measuring the effectiveness of libraries has focused on performance and/or output measures; however, while these measures are valid, ‘librarians must somehow document that the use of library services and resources actually has a beneficial impact on the user.’ Powell (1992) identifies a number of problems
in determining what needs to be measured and suggests that the nature of the
use must first be determined. Similar to Fleming-May (2011), he identifies the difficulty of defining what is meant by use, owing to the wide variance in the literature in describing the categories of use and the reasons or purposes for library use (Powell, 1992). From the various methodologies reviewed, Powell (1992, p. 253) identifies several that are capable of measuring impact and ‘permit adequate testing of causal relationships without sacrificing too much external validity.’ In applying better methodologies, libraries will be able ‘to know
how students’ use of libraries affect their academic performance’ (Powell,
1992, p.245).
Although the early studies are important
and provide a foundation for understanding the emerging need for research in
this area, Oakleaf’s (2010) report sets out the research agenda and provides a
call to action, and in doing so, marks the start of a fervent period of growth
in impact studies, which aim to assist academic libraries to better articulate
their value in institutional terms to appropriate stakeholders. In 2010, around
the time that Oakleaf was setting the research agenda, Haddow and Joseph
published findings of their research into library use and student retention at
Curtin University in Australia. The specific aims of the study were: ‘to
explore if an association between library use and student retention is evident,
and to investigate whether socio-economic status (SES) and age at entry are
influencing factors in library use and retention’ (Haddow and Joseph, 2010, p.
234).
To achieve these aims, Haddow and Joseph
(2010) analysed enrolment, demographic and library use data for students
enrolled in Semester 1, 2010 at the university. Enrolment and demographic information was provided by the university’s student database and was used to identify students who were retained or had withdrawn by the end of Semester 1. Two spreadsheets were generated from the database and included data such as student ID number, postcode, address and mature age status, with retained and withdrawn students identified using their unique student ID numbers (Haddow and Joseph, 2010). The Library Management System provided library use data for commencing
students measured at three points in the semester. Use data collected included:
number of items borrowed (loans); number of logins to a library workstation (PC
logins); and number of logins to the catalogue, databases, metasearch tool, and
eReserve (other logins) (Haddow and Joseph, 2010). The researchers note that
ethics approval was required to conduct the study with particular consideration
‘to ensure individual students were not identified or identifiable and the
secure storage of data’ (Haddow and Joseph, 2010, p. 236).
Quantitative analyses were applied to the data using SPSS, a statistical software package. Haddow and Joseph (2010) found that regardless of whether students were retained or had withdrawn, a large proportion (64.6%) had not borrowed items from the library during the semester. Library use as indicated by PC logins and other logins was higher during the semester, at 74.6% and 83.7% respectively (Haddow and Joseph, 2010). When all three types of library use
were analysed against retention it was found that ‘retained students showed
higher levels of loans, PC logins, and other logins’ (Haddow and Joseph, 2010,
p. 238). In terms of demographics, the study found little difference in library use in relation to loans and other logins. However, significant differences were found in PC logins for students from low to medium SES backgrounds; and surprisingly, students from the high SES group showed no or low use of library workstations (Haddow and Joseph, 2010, p. 239). When it came
to mature age students, the results showed statistically significant
differences in the number of loans between mature age students and those under
21 years, with mature age students borrowing books at higher rates than younger
students (Haddow and Joseph, 2010, p. 239). Due to the apparent association
between library use and student retention, Haddow and Joseph (2010, p. 240)
suggest there are ‘implications for the planning of orientation and information
literacy activities.’
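Although Haddow and Joseph (2010) do not publish their SPSS syntax, the kind of cross-tabulation they describe can be sketched in a few lines. The sketch below uses Python with invented values and a chi-square test of association between borrowing and retention; it illustrates the general approach rather than reproducing their analysis.

```python
import pandas as pd
from scipy.stats import chi2_contingency

# Hypothetical per-student records of the kind described above: retention
# status plus three library-use measures. Values are invented for illustration.
df = pd.DataFrame({
    "retained":     [1, 1, 0, 1, 0, 1, 1, 0, 1, 1],
    "loans":        [3, 0, 0, 5, 0, 2, 0, 0, 7, 1],
    "pc_logins":    [12, 4, 0, 20, 1, 8, 3, 0, 15, 6],
    "other_logins": [30, 9, 2, 41, 0, 17, 11, 1, 28, 13],
})

# Cross-tabulate retention against whether a student borrowed at all,
# then test for association with a chi-square test.
df["borrower"] = df["loans"] > 0
table = pd.crosstab(df["retained"], df["borrower"])
chi2, p_value, dof, expected = chi2_contingency(table)
print(table)
print(f"chi2={chi2:.2f}, p={p_value:.3f}")

# Mean use levels for retained versus withdrawn students.
print(df.groupby("retained")[["loans", "pc_logins", "other_logins"]].mean())
```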
Around the same time research was being
conducted in Australia, Goodall and Pattern (2011) published a case study from
research that was in progress at Huddersfield University in the North of
England. Librarians at the university had identified ‘a historical correlation between library usage and degree classification’, which, on a priori assumptions, suggested that students who borrowed more books and accessed more electronic resources achieved better grades (Goodall and Pattern, 2011, p. 160).
Preliminary research showed that some student groups were not using library
facilities and resources as much as was expected (Goodall and Pattern, 2011).
Three sets of data were collected on use of library resources: use of
electronic resources, book loans, and visits to the library. These variables
were then graphed, which showed ‘consistent amounts of no and low use at
campus, academic school, degree-type and course level’ (Goodall and Pattern,
2011, p. 159). When these results were combined with data showing academic
achievement it raised the question of whether there was a positive correlation
between library use and student attainment.
This research opened up a new area of
interest in impact studies, because at the time, whilst there had been previous
studies that looked at linking grades and retention to use of library resources
in investigating the impact of the library on student outcomes, engagement of non-users was relatively uncharted territory (Goodall and Pattern, 2011, p. 162). In drawing attention to the importance of understanding non/low use of
the library, Goodall and Pattern (2011, p. 162) identified it as ‘a central
issue for individual students concerned about their grades, for academic staff
concerned about attainment, and for institutions concerned about retention.’ In
this way, the researchers hoped that if they could understand the reasons
behind non/low use then effective interventions could be developed and
trialled, and strategies could be implemented that would improve ‘the grades of
all students, from the bottom up, rather than just continuing to support those
which are already high flyers’ (Goodall and Pattern, 2011, p. 160).
The findings of this research were presented at the 2010 UKSG Conference in Edinburgh, where they attracted interest from other universities keen to benchmark. However, at the time the data had not been tested for statistical significance; therefore, it was unknown whether the findings at Huddersfield were due to the particular sample used, rather than reflecting a relationship that held across a wider population (Stone and Ramsden, 2013). Based on the initial research, the project was expanded across eight universities in the
United Kingdom. The Library Impact Data Project (LIDP) was a six-month project
funded by Jisc to investigate the hypothesis that: “There is a statistically
significant correlation across a number of universities between library
activity data and student attainment” (Stone and Ramsden, 2013, p. 546). The
LIDP looked at usage data of 33,074 undergraduate students across the
participating universities with e-resources usage, borrowing statistics and
gate counts measured against final degree award. In supporting the hypothesis, the LIDP aimed ‘to give a greater understanding of the link between library activity data and student attainment, which would show a tangible benefit to the higher education (HE) community’ (Stone and Ramsden, 2013, p. 550).
In line with Oakleaf (2010), the project
was concerned with data protection issues, which were seen as a potential risk.
Due to the sensitive nature of the data, it was important that it was obtained in a way that met legal and university regulations, and students were informed that their library use might be measured. The data was also fully anonymised and made available to the project as part of an open data agreement, and any courses with only a few students were excluded from the data to prevent identification (Stone and Ramsden, 2013).
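A minimal sketch of this kind of pre-release preparation is given below; the column names and the suppression threshold are assumptions for illustration, not details taken from the LIDP itself.

```python
import pandas as pd

# Hypothetical anonymised usage extract (student_key is already a hashed ID);
# the column names and threshold below are illustrative assumptions.
usage = pd.read_csv("usage_anonymised.csv")
# columns: student_key, course, loans, eresource_logins, gate_entries

# Exclude courses with very small enrolments, where individuals might be
# re-identified even from anonymised records.
MIN_COURSE_SIZE = 10
course_sizes = usage.groupby("course")["student_key"].nunique()
keep = course_sizes[course_sizes >= MIN_COURSE_SIZE].index
usage = usage[usage["course"].isin(keep)]
print(f"{usage['course'].nunique()} courses retained after suppression")
```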
The researchers used both qualitative and
quantitative methodologies in the project. For example, qualitative data was
collected using focus groups and followed up with a brief questionnaire, which
helped to qualify issues that were identified. The transcripts were examined
for any apparent themes and statements were coded (Stone and Ramsden, 2013). Quantitative
data were analysed with statistical methods, which showed a positive
relationship between use of e-resources and degree result; book borrowing and
degree result; but not between gate counts and degree result. The data
suggested that the more an e-resource or book is used, ‘the more likely a
student is to have attained a higher-level degree result’ (Stone and Ramsden, 2013,
p. 554).
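Stone and Ramsden (2013) report the outcome of their statistical analysis rather than the code behind it; purely for illustration, the sketch below assumes a non-parametric comparison (a Kruskal-Wallis test) of each usage measure across final degree classifications, using invented data. It shows the shape of such an analysis, not the LIDP’s actual method.

```python
import pandas as pd
from scipy.stats import kruskal

# Hypothetical per-student records: final degree classification plus usage
# counts. Values are invented, and the choice of test is an assumption.
df = pd.DataFrame({
    "degree": ["First", "2:1", "2:2", "Third", "2:1", "First", "2:2", "2:1", "Third", "First"],
    "eresource_logins": [320, 180, 90, 40, 210, 400, 60, 150, 30, 280],
    "loans": [25, 14, 6, 2, 18, 30, 4, 11, 1, 22],
    "gate_entries": [120, 110, 130, 90, 100, 140, 95, 125, 80, 115],
})

# Compare each usage measure across degree-classification groups.
for measure in ["eresource_logins", "loans", "gate_entries"]:
    groups = [g[measure].values for _, g in df.groupby("degree")]
    stat, p = kruskal(*groups)
    print(f"{measure}: H={stat:.2f}, p={p:.3f}")
```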
Whilst the project was regarded as
successful in demonstrating a statistically significant relationship between
use of library resources and final degree award and thereby substantiating the
initial hypothesis, researchers also identified a number of issues similar to
those discussed above. In terms of data reliability there are inherent issues
when it comes to use data. For example, data for use of e-resources and
borrowing of books does not reveal whether the item has actually been read,
understood and referenced, and in the case of e-resources, counting clicks and
downloads is problematic and variable across different databases, so heavy
usage does not necessarily relate to high information-seeking or academic
skills. The project also underestimated the time needed to analyse the data, with collection and analysis taking four of the project’s six months (Stone and Ramsden, 2013).
In December 2011, the project secured
another tranche of funding to extend the LIDP into phase II, which as per
Oakleaf (2010) looked at demographic factors such as gender, age, ethnicity,
and country of origin to further enrich the quality of data to identify
additional causal links (Stone and Ramsden, 2013). Research in the U.K. has
continued and the project has since expanded into a partnership between Jisc,
Mimas (at the University of Manchester) and the University of Huddersfield, and
is now named the Library Analytics and Metrics Project (LAMP).
Another recent example of research
combining library transaction data with student performance data was undertaken
by the University of Wollongong Library (UWL) in Australia. Like many libraries
around the world, the library has used client satisfaction surveys to collect feedback from its users, with the information gained used to drive continuous improvement in the quality of services (Jantti and Cox, 2012). However, whilst
useful, the researchers argue that there are significant limitations to
surveys, including
they are
naturally biased towards library users; they are not run frequently enough to
support marketing; and they do not measure the impact of the library on
client’s success, only respondents’ subjective assessment of value and
performance (Jantti and Cox, 2012, p. 69).
With these limitations in mind, UWL undertook
a project in conjunction with the university’s Performance Indicator Unit to
develop a data warehouse and reporting function (a Cube) that combines library
usage of electronic resources with students’ demographic and academic
performance data (Jantti and Cox, 2012; Pepper and Jantti, 2014). The aim of the project was to help the library ‘improve the impact of its resources and teaching activities with respect to student academic performance, and student engagement’ and, in line with much of the discussion above, to ‘unambiguously demonstrate the contribution [the library] is making to institutional learning, teaching and research endeavours’ (Jantti and Cox, 2012, p. 69).
UWL originally built a number of ‘Cubes’ to
aid in data collection and analysis. The Library Cube is a dataset that
combines usage of library resources with student demographic data and
performance, using student numbers as a unique identifier. Because the university’s Privacy Policy allows for the use of personal information, the project was able to legally and ethically make use of student information. However, in constructing the cube, the project ensured that only aggregated data could be viewed, so that the library could not drill down to see a specific student’s personal information, except in the unlikely situation where there was only a small number of students in a variable within the cube (Jantti and Cox, 2011).
The Value Cube is a dataset that is
structured around academic teaching sessions used to assess the impact of
library resources on student outcomes measured by Weighted Average Marks (WAM).
Note that GPA scores are generally not used in Australia with most tertiary
institutions using WAM for academic grading. The Value Cube also allows the
library to review demographics by level of usage giving much more granular
analysis from the data set than has been achievable in previous studies (Jantti
and Cox, 2012). Data for the Library Cube was pulled from the Library Management System and included loans data and usage data for electronic resources, with EZproxy logs used to determine which databases, ebooks and ereading materials were being used by which students (Jantti and Cox, 2012).
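As an illustration of the general approach (not UWL’s actual implementation), the sketch below shows how parsed EZproxy-style access records might be aggregated into a cube-like dataset and joined to academic performance data. All file names, column names, the weekly grain and the usage bands are assumptions.

```python
import pandas as pd

# Hypothetical inputs: parsed proxy access records (one row per access) and a
# registrar extract of weighted average marks (WAM).
accesses = pd.read_csv("ezproxy_accesses.csv", parse_dates=["timestamp"])
# columns: student_number, resource (e.g. database name), timestamp
marks = pd.read_csv("marks.csv")  # columns: student_number, faculty, wam

# Aggregate accesses per student, per resource, per ISO week - the kind of
# grain a "cube" would expose for slicing.
accesses["week"] = accesses["timestamp"].dt.isocalendar().week
cube = (
    accesses.groupby(["student_number", "resource", "week"])
    .size()
    .rename("access_count")
    .reset_index()
)

# Join with academic performance so usage bands can be compared against WAM.
per_student = cube.groupby("student_number")["access_count"].sum().reset_index()
joined = marks.merge(per_student, on="student_number", how="left").fillna({"access_count": 0})
joined["usage_band"] = pd.cut(joined["access_count"], bins=[-1, 0, 10, 50, float("inf")],
                              labels=["none", "low", "medium", "high"])
print(joined.groupby("usage_band", observed=True)["wam"].mean())
```

A production cube would live in the institution’s data warehouse and be refreshed automatically, but the underlying aggregation and join logic is essentially of this form.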
As with the LIDP project (Stone and
Ramsden, 2013), the researchers observed the same limitation with usage, noting
that ‘just because someone borrowed a book does not mean they read, understood
or used the book’ (Jantti and Cox, 2012). Furthermore, the issue of correlation
and cause was raised as there could be many other factors that help contribute
to students’ academic performance (Jantti and Cox, 2012). However, the project
did find that there was very strong evidence that the library was positively
impacting on students’ academic success; for example, the researchers found
that students who used the library’s collection – through borrowing books and
using electronic resources – were more likely to achieve higher WAMs (Jantti
and Cox, 2012).
Having been able to see that the library
was indeed providing value to students who used library resources, the next
phase of the project sought to answer further questions and solve problems
around which students were using resources; assist with interventions in
conjunction with faculty to encourage non/low users to engage with resources;
and more ambitiously, to test if the library had been successful in influencing
behaviour by looking at post-intervention data (Pepper and Jantti, 2014). To assist in these new endeavours, a Marketing Cube was built, which replicated the demographic elements of the Value Cube and contained information on which specific databases were being accessed on a weekly basis. This provides a more immediate view of resource use and is allowing the library to understand the context in which library resources are being used in relation to information need (Pepper and Jantti, 2014).
Conclusion
This paper has provided an overview of the
state of research into the value of academic libraries and impact studies in
higher education. Due to increased demands for accountability, academic
libraries are challenged with demonstrating the value they provide to the
university and its wider stakeholders. This shift in the higher education
landscape has necessitated a corresponding shift in impact studies, which must look beyond internal library measures so that libraries can document and articulate the value they contribute to institutional mission and goals. This need has accelerated
the research agenda and produced a growing body of research that specifically
investigates the links between use of library resources and student outcomes.
All of the studies reviewed indicated at least some correlation between library use and academic achievement; however, correlation does not necessarily mean causation. The research is also challenged by the difficulties
around the meaning of use and the
inconsistencies in methodologies, as well as the inherent issue around use data
itself – the data does not reveal whether library resources such as books have
actually been read, understood or referenced – and when it comes to e-resources
there is variability across platforms and vendors in how usage is counted. To
better achieve the aims of future research in this important area of LIS studies,
librarians and researchers will need to take heed of Oakleaf’s recommendations
for professional development. Librarians who have the necessary data competencies will be able to design better research that demonstrates, in institutional terms, the value their library provides. Ultimately, it is these
librarians of the future who will be able to go beyond their traditional
library functions and take advantage of the challenges and changes in the
higher education landscape, and lead conversations by confidently responding to
the questions set forth by the research agenda.
References
Allison, D. (2015). Measuring the academic impact of libraries. Portal: Libraries and the Academy, 15(1), 29-40. Retrieved from https://www.press.jhu.edu/journals/portal_libraries_and_the_academy/
Brown, K., & Malenfant, K. J. (2012) Connect, Collaborate, and
Communicate: A Report from the Value of Academic Libraries Summits. Association
of College and Research Libraries. Retrieved from: http://www.ala.org/acrl/sites/ala.org.acrl/files/content/issues/value/val_summit.pdf
Butkovich, N. J. (1996). Use studies: A selective review. Library Resources and Technical Services, 40(4), 359-368.
CAUL. (n.d.). Value and Impact of University Libraries.
Retrieved May 8, 2016, from http://www.caul.edu.au/caul-programs/best-practice/cqaac-resources/library-value
Fleming-May, R. A. (2011). What is library use? Facets of concept and a typology of its application in the literature of library and information science. The Library Quarterly, 81(3), 297. Retrieved from: http://www.journals.uchicago.edu/toc/lq/current
Goodall, D., & Pattern, D. (2011). Academic library non/low use and undergraduate student achievement: A preliminary report of research in progress. Library Management, 32(3), 159-170.
Haddow, G., & Joseph, J. (2010). Loans, logins, and lasting the course: Academic library use and student retention. Australian Academic and Research Libraries, 41(4), 233-244.
Jantti, M. & Cox, B. (2011) Measuring the value of library
resources and student academic performance through relational datasets. Paper
presented at Library Assessment Conference: Building Effective, Sustainable,
Practical Assessment. Retrieved from: http://ro.uow.edu.au/cgi/viewcontent.cgi?article=1120&context=asdpapers
Jantti, M. & Cox, B. (2012) Capturing business intelligence
required for targeted marketing, demonstrating value, and driving process improvement.
Paper presented
at 9th Northumbria International Conference on Performance Measurement in
Libraries and Information Services. Retrieved from: https://www.libqual.org/documents/LibQual/publications/2013/9th_Northumbria_Conference_Proceedings.pdf
JISC (n.d.). LAMP to be integrated into Jisc’s Learner and Business
Analytics R&D activities. Retrieved May 21, 2016 from http://jisclamp.mimas.ac.uk/
Kramer,
L. A., & Kramer, M. B. (1968). The college library and the drop-out. College
& Research Libraries, 29(4), 310-312. doi:10.5860/crl_29_04_310
Lane, G.
(1966). Assessing the undergraduates' use of the university library. College
& Research Libraries, 27(4), 277-282. doi:10.5860/crl_27_04_277
Matthews, J. R. (2012) Assessing library contributions to university outcomes: the need for
individual student level data. Paper presented at 9th Northumbria International
Conference on Performance Measurement in Libraries and Information Services.
Retrieved from: https://www.libqual.org/documents/LibQual/publications/2013/9th_Northumbria_Conference_Proceedings.pdf
Oakleaf, M. (2010) Value of
Academic Libraries: A Comprehensive Research Review and Report. Association
of College and Research Libraries. Retrieved from: http://www.acrl.ala.org/value/?page_id=21
Pepper, A. & Jantti, M. (2014). The tipping point. Paper presented at ALIA Information Online. Retrieved from: http://information-online.alia.org.au/sites/default/files/thetippingpoint.docx
Powell, R. (1992) Impact Assessment of University Libraries: A
Consideration of Issues and Research Methodologies. Library and Information Science Research, 14, 245-257.
Saunders,
L. (2015). Academic libraries' strategic plans: Top trends and under-recognized
areas. Journal of Academic Librarianship, 41(3), 285-291.
doi:10.1016/j.acalib.2015.03.011
Shulenburger, D. (2010) The relationship between university
assessment and library assessment. Paper presented at Library Assessment
Conference, Baltimore. Retrieved from: http://libraryassessment.org/bm~doc/shulenburger_david.pdf
Stemmer, J. K., & Mahan, D. M. (2016). Investigating the relationship of library usage to student outcomes. College & Research Libraries, 77(3), 359-375. doi:10.5860/crl.77.3.359
Stone, G., & Ramsden, B. (2013). Library Impact Data Project: Looking for the link between library usage and student attainment. College & Research Libraries, 74(6), 546-559.
White, S. & Stone, G. (2010). Maximising Use of Library Resources at the University of Huddersfield. Paper presented at UKSG 33rd Annual Conference and Exhibition. Retrieved from: http://eprints.hud.ac.uk/7248/