Article

Library Assessment Research: A Content Comparison from Three American Library Journals

by Ethan J. Allen 1,*,†, Roberta K. Weber 2,† and William Howerton 1,†

1 Florida Atlantic University Libraries, Florida Atlantic University, Jupiter, FL 33458, USA
2 Department of Curriculum, Culture, and Educational Inquiry, College of Education, Florida Atlantic University, Jupiter, FL 33458, USA
* Author to whom correspondence should be addressed.
† These authors contributed equally to this work.
Publications 2018, 6(1), 12; https://doi.org/10.3390/publications6010012
Submission received: 21 October 2017 / Revised: 20 February 2018 / Accepted: 12 March 2018 / Published: 15 March 2018

Abstract: Improvement of academic library services as an outcome of continuous assessment is an aim of libraries of higher education institutions. Academic libraries are realizing the need to document evidence of their value to the institutions and the patrons they serve. Publications that include assessment research are reaching library decision makers, who seek to apply evidence to improve services or implement best practices that benefit all stakeholders. Following two previous studies that reported longitudinally on front-line library services, this paper investigates five-year trends in the publication of assessment studies across three prestigious academic library journals. Data for this study were drawn through a content analysis process, in which the investigators selected studies for inclusion using a set of criteria developed in a pilot exercise. Of the 649 research articles published between 2012 and 2016 that the investigators examined individually, 126 met the study’s selection criteria and were categorized according to the type of service they studied. Papers on information literacy instruction dominated, while reference services, technology, and general assessment studies saw less representation in the three journals. This finding reflects the priority placed upon information literacy instruction and describes how three American library journals are responding to current trends across academic libraries.

1. Introduction

Academic libraries within institutions of higher education can offer an impressive array of front-line services, including reference, instruction, and computer technology, but they are also moving toward leading-edge services and offerings, such as big data curation, digitization, makerspaces, and scholarly communications services, to name a few. Not all academic libraries, however, offer these services uniformly, and depending on whether an institution grants degrees at the doctoral, bachelor’s, or associate’s level, the nature and quantity of services may vary in response to differentiated needs. How library patrons use the services and resources of their home institution, and how satisfied they are with them, are documented concerns for library decision-makers. Service and resource assessments provide library decision-makers with evidence as they strive to make improvements, and when assessment studies are published, that knowledge base expands. Hernon and Dugan describe the rationale for studying service quality and patron satisfaction:
Either service quality or satisfaction can be an end in itself; each is worthy of examination as a framework for evaluating library services from a customer’s or user’s perspective. By paying proper attention to assessment, service quality, and satisfaction, libraries are, in effect, promoting continuous quality improvement. Improvements lead to change, and library leaders must manage that change and ensure that the library’s assessment plan is realistic and realized. Both service quality and satisfaction should be part of any culture of assessment and evaluation.
[1] (p. 120)
The improvement of services, particularly in the front-line areas of information literacy instruction, reference, and technology, has become the focus of both informal self-study and published assessments. Studies of this type may be found not only in journals devoted to specialized services, but also in journals of high impact and wide readership. Examples of the latter are Journal of the Association for Information Science and Technology, Journal of Documentation, and New Review of Academic Librarianship. Examples of specialized journals that cover assessments and are not limited to an academic library readership are the Journal of Interlibrary Loan, Document Delivery, & Electronic Reserve and Cataloging & Classification Quarterly. There are many more examples of each type of journal dedicated to the promotion of best practices through assessments. Further, there are professional gatherings, such as the International Conference on Performance Measurement in Libraries (formerly called the Northumbria Conference), the Association of Research Libraries (ARL) Library Assessment Conference (LAC), and the Evidence Based Library and Information Practice Conference, which sponsor journals or publish proceedings. These developments are indicative of the desire among library professionals to grow and improve services offered to their patrons.
The relationship between scholars and custodians of texts with respect to service has changed significantly over time. As recently as the early 20th century, service to scholars was very limited. In the current academic environment, however, library patrons are valued as “customers”, and consequently their satisfaction with library services takes high priority. The supporting pillar for library customer satisfaction is a culture of assessment, which takes into consideration how well library services are perceived by patrons. Further, assessments are a vehicle for communicating value to the parent institution’s administrators, particularly through the measurement of learning outcomes following library instruction and information literacy programming. The professional literature covers an array of articles on library services, but only within the last four decades has evidence-based research on front-line services become more abundant and available to decision-makers. The problem, however, is that assessment research on front-line academic library services in the last decade reflects a top-heavy presence of studies focusing upon information literacy and learning outcomes. While this content is important, and in fact vital to demonstrating value, studies on other service areas have relatively less representation.
The present study was undertaken to examine the availability of published assessments to college and university library decision-makers as they draw from a data-rich knowledge base. By examining a recent sample of the professional library literature, it aims to identify what may be needed to expand the literature, while providing a snapshot of what is trending within published assessments on library services. The study examines both the coverage of site-specific general library assessments and assessments of three front-line services that have been covered within three prominent Library and Information Science (LIS) journals over a recent five-year period.

2. Literature Review

2.1. Value

The value of academic libraries may be defined in different ways. In Value of Academic Libraries (VAL), Oakleaf summarizes relevant definitions in terms of internal and external focuses. The internal definitions center upon use, return-on-investment, and the production of a commodity, whereas the external focus extends to a more experiential value or impact upon the user—that is, on those activities that the “library enables them to do” [2] (p. 23). Lakos and Phipps, with Wilson, describe the assessment culture as “an organizational environment in which decisions are based on facts, research, analysis, and where services are planned and delivered in ways that maximize positive outcomes and impacts for customers and stakeholders” [3] (p. 352).
Value is not only to be experienced by day-to-day student and faculty populations, but also to be communicated to institutional leaders, who perceive the library as a partner in realizing the institution’s goals, outcomes, and missions [4]. Coinciding with the need to advance the value of libraries through more complex assessment methodologies and communication skills, the number of library positions requiring assessment skills has increased [5], with new proficiency standards being published for assessment librarians [6]. With the growth of the assessment movement, a burgeoning of professional development opportunities has taken hold through international conferences, specialized training, and expansion within professional associations to develop this area of librarianship [7], the goals of which are service improvement and an outward demonstration of value.

2.2. Developing a Service Orientation

Assessment policies and procedures vary widely within higher education disciplines and programs. The library, working as an integral partner to fulfill the larger institutional mission, participates by supporting faculty, students, and its community, with numerous services specific to the needs of all its stakeholders. Libraries are also included in the parent institution’s accreditation evaluation process for program licensure and certification requirements. According to Town [8], measurement and evaluation of libraries as a means of demonstrating value to parent institutions and stakeholders has developed significantly over the last one hundred years. Citing Thompson [9], Town segments this historical development into three phases: storehouse, service, and education. Storehouse refers to evaluation in terms of inputs, that is, quantified data describing what a library has (the size of the collection, the number of books acquired, salaries paid, expenditures, and the like), as compiled in measures such as the ARL Index or its early predecessor, the Gerould Statistics. It may also refer to evaluation in terms of standards, such as the 1928 College Library Standards or the 1959 ARL Standards [10].
The momentum for the service phase might be traced to F.W. Lancaster’s works on quality improvement and his reliance upon Ranganathan’s Five Laws of Library Science [11]. Lancaster’s background in systems analysis, combined with Duane Webster’s human resources experience at ARL, in the words of Kyrillidou and Cook, “contributed to an increased awareness of libraries as symbolic entities manifesting elements of effect of service, information control, and library as place that generate perceptions and expectations as library users come into contact with these entities” [11] (p. 290). Within this same phase, and particularly in the 1980s and 1990s, Total Quality Management (TQM) gained acceptance among academic library managers as a strategic effort to effect improvements. TQM principles, developed for business management, were transferred to libraries in pursuit of improved customer satisfaction and service quality [12]. Service quality and patron satisfaction, however, were not always an important concern for library managers. McElderry described, from an historical perspective, how library service in American higher education institutions developed through the 19th and early 20th centuries [13]. Essentially, academic libraries were operating on something of a self-service basis, in which practically no services were provided to scholars. Only when demand for higher education grew did the number of colleges and universities increase; to support expanding curricula and growing subject specializations, larger library collections were needed, and with them, reference services to guide readers through library catalogues. At first, reference service models were slow to develop, but over time, philosophies of service emerged, ranging from the notion that librarians should facilitate self-sufficiency to models that operationalized full delivery on requests for reliable information. By the 1930s and 1940s, service as we understand it today was in its infancy, and “the products and services needed to satisfy reader requirements were not well understood” [13] (p. 418). Academic libraries of the 21st century have undergone rapid change, but with these changes has come a greater awareness of the need to improve services.

2.3. Measuring Quality

As with the application of TQM business principles to academic library assessments in the 1980s and 1990s, SERVQUAL, a quality assessment tool developed for the retail industry, was also adapted to academic libraries. By design, the instrument identifies discrepancies between “the minimum, perceived, and desired levels of performance across five dimensions (tangibles, reliability, responsiveness, assurance, and empathy)” [14] (p. 243). By 1999, ARL’s New Measures Initiative (NMI) was developing a new “toolbox”, or suite of assessment products, that measured not only user satisfaction with library services, but also the efficiency of electronic resource investment. The NMI was renamed the StatsQUAL Gateway, and incorporated both quantitative and qualitative methodologies through products such as LibQUAL+™, DigiQUAL, MINES for Libraries, and ClimateQUAL. LibQUAL+™ was piloted in 1999 [15], and has been used subsequently as an assessment tool in academic libraries worldwide.
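For concreteness, the following minimal sketch shows how gap scores of the kind these instruments report are typically computed, assuming the three-scale (minimum, desired, perceived) rating design used by LibQUAL+™; the function and variable names are illustrative, not taken from any of the instruments.

```python
# Minimal sketch of SERVQUAL/LibQUAL+-style gap scoring (illustrative only).
# Each respondent rates a survey item on three 9-point scales: the minimum
# acceptable, the desired, and the perceived level of service.

def gap_scores(minimum: float, desired: float, perceived: float) -> dict:
    """Return the two standard gap measures for a single item rating."""
    return {
        # Positive when perceived service exceeds the minimum expectation.
        "service_adequacy": perceived - minimum,
        # Usually negative; zero or above means desired service is met.
        "service_superiority": perceived - desired,
    }

# Example: a patron rates an item minimum=5, desired=8, perceived=6.
print(gap_scores(5, 8, 6))
# {'service_adequacy': 1, 'service_superiority': -2}
```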
With these tools in hand, many academic libraries have been developing and strengthening cultures of assessment. Studies examining what it takes to sustain a culture of assessment have concluded that library managers play an important role in supporting that culture [16,17], that a customer focus is essential [3,16], and that reliance upon external and evidence-based research should inform decisions [3,18]. Sources for evidence-based research, such as assessment studies, systematic reviews, and evidence summaries, are typically searched in the journal literature. It has also been suggested that a database for evidence summaries be created to assist decision makers and librarianship as a whole [19].

2.4. Assessment Trends

The educational phase, extending to learning metrics and the alignment of assessments with institutional goals, became a significant challenge to academic libraries in institutions of higher education. Interest in assessing information literacy instruction was heightened after the 2005 release of the U.S. Department of Education’s report “A Test of Leadership: Charting the Future of United States Higher Education” [20]. This document affected regional accreditation organizations’ standards, which in turn shaped the focus of assessment within academic and research libraries. The overall effect was to place a greater emphasis upon improvements in student learning in the post-2005 LIS literature. Hufford’s study, published in 2013, reviewed all types of assessment literature between 2005 and 2011. He noted an emphasis on assessments of information literacy instruction, attributing the increase to librarians’ belief that this is one of the most important services they provide, and further predicted that the trend would continue [20] (p. 20).
In 2010, the Association of College and Research Libraries (ACRL) published Value of Academic Libraries: A Comprehensive Research Review and Report (VAL), which covered existing research, research gaps, and the “most promising best practices and measures correlated to performance” for an academic librarian readership [21] (p. 1/1). Oakleaf et al. found, subsequent to the 2010 report, that “In general, the research linking libraries with student learning and success has pursued a correlation approach in which librarians use correlation methodologies to explore connections between library services and resources and the needs, goals, and outcomes of their institutions” [22] (p. 454). The alignment of an assessment research agenda with that of the parent institution, as outlined in VAL, encompasses the following areas: student enrollment, student retention and graduation, student success, student achievement, student learning, student experience, faculty research productivity, faculty grants, faculty teaching, and institutional reputation [2] (p. 17). This course of action is not without obstacles: owing to the siloed nature of units and departments in higher educational institutions, there may be challenges to obtaining the analytic learning data needed to establish correlational evidence of a library’s impact on student success [22]. To date, a resolution to this problem is not widely available.
Earlier evidence of LIS publications carrying instructional research studies comes from the 2007 study by Crawford and Feldt. They discovered that between 1971 and 2002, four LIS journals published a combined total of more than 50 research articles on library instruction: Research Strategies (RS), Reference Services Review (RSR), Journal of Academic Librarianship (JAL), and College & Research Libraries (CRL). Among them, RS published the highest number of research studies on library instruction, and JAL published the highest number of essays on the same subject [23]. A content analysis by Luo and McKinney in 2015 also identified JAL as a leader in the publication of information literacy research, finding that between 2004 and 2013, information literacy was its most popular topic [24].
The 2012 study by Mahraj reported a content analysis of a six-year span (2006–2011) of issues of Reference Services Review (RSR). Although the study did not distinguish between empirical research and other types of articles, it provided a topical breakdown. Information literacy and instruction articles accounted for 49% of the journal’s content, articles on reference service for 28%, and emerging technologies for 18%. Mahraj states: “While the data does not explain this pattern, the overall volume of content on information literacy is a potential signal of librarians’ shifting roles and priorities within academic institutions and may speak to trending in the profession at large” [25] (p. 185). Clark replicated the Mahraj study in 2015 with a three-year data set consisting of articles from the same journal (2012–2014). The findings showed very similar representation of information literacy and instruction topics in two of the three years under study, but emerging technology articles accounted for 30% of articles published between 2012 and 2014, with the largest number published in 2012. Articles on reference topics dropped to 12% from the 28% reported in the earlier study. Clark credits the dynamism of the library field, and notably changing technologies, for shifts toward the publication of more diversified topics [26] (pp. 74–75).

3. Methodology

Historically, academic and research libraries have served varied patrons, ranging from university scholars to the incoming undergraduate student body. The perception of value to a library’s stakeholders in the current climate may be enhanced through the delivery of service quality to end-users, or through the demonstration of successful learning outcomes to the administrative leaders of its institution. A solid assessment culture is supported by administrative leadership, but also by the examination of empirical research, whether drawn from periodical literature or through original research. Assessment studies in the areas of overall library satisfaction, information literacy instruction, reference service, and technology within academic libraries are being published in a wide range of specialized and non-specialized LIS journals. However, while it has been noted that information literacy assessments are a means of demonstrating participation in institutional missions and communicating value to administrative stakeholders, published studies on other services to patrons may not share the same level of priority on library research agendas. Recent evidence of the growing movement to demonstrate value through assessment studies is found in the Association of College and Research Libraries’ (ACRL) Assessment in Action (AiA) program and its accompanying RoadShow of traveling workshops [27]. The AiA’s participants in North America and Australia have carried out assessment projects through action research using multiple methodologies. These newer and externally-focused assessment studies are designed to demonstrate value through an alignment with institutional goals and missions. The AiA report, co-authored by Brown and Malenfant, provides the context for the AiA initiative in its Executive Summary. Essentially, the document provides a framework for assessments linking what libraries have to offer (i.e., instruction, reference, collections, space and facilities) to student learning outcomes and overall academic success [28].
The following questions guided the present research:
  • To what extent have site-specific, front-line library service and general satisfaction assessment studies been published by three internationally recognized, high-impact academic library journals within a recent 5-year period?
  • How does the coverage of criteria-qualifying assessment content compare among the three journals?

3.1. Materials

This study was designed to provide an in-depth investigation into the practices adopted by academic libraries specific to published assessments. The framework for continuous improvement within academic libraries is a culture of assessment, focusing on the collection of evidence for the decisions that ultimately affect numerous stakeholders. Three well-recognized, peer-reviewed American journals covering topics of interest to academic library practitioners were selected for this descriptive content analysis study. The Journal of Academic Librarianship (JAL) has a five-year impact factor of 1.395, portal: Libraries and the Academy (portal) has an impact factor of 1.29 over a four-year period, and College & Research Libraries (CRL) had a 2016 impact factor of 1.515. CRL is the only one of the three published open access. These journals represent major publications that may influence the direction of assessment and the implementation of change within higher educational institutions. Furthermore, the articles published within these three journals were suitable for meeting the study’s selection criteria. They provide a foundation of policies and procedures that may close the gap between front-line library services and the evidence-based documentation needed to envision their improvement. Other peer-reviewed LIS journals with similar scopes were initially considered for inclusion in the study, but due to the granular nature of the data collection process and time limitations, they were excluded.
A literature search exercise, designed to lead staff to empirical evidence for reducing reported library barriers, was the starting point for the current study. The term “empirical” is defined here broadly as any type of data collected for the purpose of research. As searching progressed, the topic was narrowed to assessments that produced data that could be used to implement improvements in services to patrons. Initially, the search was conducted by scanning the tables of contents of LIS journals with narrow aims and scope, specifically those covering management, interlibrary loan, cataloging and classification, reference, collections, and circulation. The sampling then progressed to LIS journals of wider appeal and reader interest. It was observed that research studies on a broader array of services were being published within the journals of wider scope. A decision was made to investigate the contents of three top peer-reviewed journals of the latter type. Studies considered for inclusion were narrowed to those focused on general patron satisfaction assessments and those that examined information literacy, reference, and usability of technology, to the extent that these services had a direct impact on library patrons. Excluded were studies that assessed other library assets and functions, such as collections, facilities, and operations and processes. Studies that involved multiple sites, reviews of the literature, meta-analyses, essays, or opinion articles were also excluded. The finalized criteria for examination comprised the following components: publication dates 2012–2016; empirical studies; satisfaction assessment studies of academic libraries; and studies specific to information literacy instruction, reference, and technologies used in academic libraries.
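To make the selection logic concrete, the sketch below expresses these criteria as a filter predicate. The record fields and topic labels are hypothetical conveniences for exposition; the authors applied the criteria by reading each article, not programmatically.

```python
# Hypothetical encoding of the study's inclusion/exclusion criteria.
# All field names are illustrative assumptions, not the authors' codebook.

QUALIFYING_TOPICS = {"general_satisfaction", "information_literacy",
                     "reference", "technology"}
EXCLUDED_TYPES = {"literature_review", "meta_analysis", "essay", "opinion"}

def qualifies(article: dict) -> bool:
    """True if an article meets the finalized selection criteria."""
    return (
        2012 <= article["year"] <= 2016           # publication window
        and article["is_empirical"]               # empirical studies only
        and not article["is_multi_site"]          # single-site studies only
        and article["article_type"] not in EXCLUDED_TYPES
        and bool(QUALIFYING_TOPICS & set(article["topics"]))
    )

example = {"year": 2014, "is_empirical": True, "is_multi_site": False,
           "article_type": "research", "topics": ["information_literacy"]}
print(qualifies(example))  # True
```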
The three journals selected for investigation are read by a large and varied population. JAL is indexed by 16 abstracting and indexing (A&I) services, and has been in publication since 1974. JAL is an international and refereed journal, and publishes articles that focus on problems and issues germane to college and university libraries. JAL provides a forum for authors to present research findings and, where applicable, those findings’ practical applications and significance. Authors also analyze policies, practices, issues, and trends; speculate about the future of academic librarianship; and present analytical bibliographic essays and philosophical treatises [29].
The portal journal is indexed in over 50 A&I databases. From the journal’s website: “Focusing on important research about the role of academic libraries and librarianship, portal also features commentary on issues in technology and publishing. Written for all those interested in the role of libraries within the academy, portal includes peer-reviewed articles addressing subjects such as library administration, information technology, and information policy” [30]. Portal is published by Johns Hopkins University Press and has had a continuous run since 2001.
CRL is the official publication of the ACRL, and is published online-only on a bi-monthly schedule. It has had a longstanding publication run, having published its first issue in 1939. The journal is indexed by no fewer than eight indexers, and its articles are freely accessible through internet searching. The scope of the journal extends to “all aspects of academic and research librarianship”, but its main focus is original research [31]. The authors believed these three publications were suitable for content analysis and comparison, due to their respective audiences, reputations, and similar yet distinct aims and scopes.
The methodology for this study originates in Krippendorff’s definition of content analysis, “a research technique for making inferences from texts (or other meaningful matter) to the contexts of their use” [32] (p. 16). Precedents for similar content analysis studies of LIS journals are found in Mahraj and Clark [25,26]. Content analysis is an unobtrusive data collection activity, undertaken in the absence of directly observable evidence [32] (p. 39). It is an indirect method of observation and as such, research questions are answered inferentially.

3.2. Procedures

The initial procedure for developing a codebook and the textual materials under investigation have been described above. The investigators worked together, rather than independently, over many sessions, examining and coding research articles published in JAL, portal, and CRL between 2012 and 2016. A total of 323 research articles published in JAL, 130 articles published in portal, and 196 in CRL were examined. Qualifying articles were documented in spreadsheet entries containing journal volume and issue numbers, year of publication, each article’s title, the relevant area of library service, and the country where the research was conducted. Authors’ names were omitted. Where ambiguities arose, differing interpretations of a given text were discussed in light of the inclusion criteria. Throughout the coding process, and particularly in these instances, texts were keyword-searched for critical terms, such as “assess”, “service”, and “improve”, in order to reach a reasoned determination. Differences of opinion were resolved in this manner, rather than by settling disagreements after independent scoring and calculations of inter-rater consensus estimates [32,33,34]. Due to the descriptive character of the study, no correlational statistical measures were undertaken.
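As an illustration of the coding record described above, consider the following minimal sketch; the field names and the keyword screen are assumptions for exposition, since coding was carried out jointly in spreadsheets rather than in code.

```python
# Illustrative model of one spreadsheet entry from the coding process.
# Field names are assumptions, not the authors' actual column headings.
from dataclasses import dataclass

@dataclass
class CodedStudy:
    journal: str   # "JAL", "portal", or "CRL"
    volume: int
    issue: str
    year: int
    title: str     # authors' names were omitted from the spreadsheet
    service: str   # "GEN", "IL", "REF", or "TECH"
    country: str   # site where the research was conducted

# Critical terms used to keyword-search ambiguous texts during coding.
CRITICAL_TERMS = ("assess", "service", "improve")

def needs_discussion(full_text: str) -> bool:
    """Flag a text containing none of the critical terms for a closer joint read."""
    text = full_text.lower()
    return not any(term in text for term in CRITICAL_TERMS)
```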

4. Findings

This study was undertaken to identify general satisfaction and front-line library service assessments found within three distinguished, American peer-reviewed journals available to college and university library decision-makers, as they draw from an empirical knowledge base. Studies meeting the selection criteria are identified in Appendix A, Appendix B, and Appendix C, and these entries constitute the full dataset. All data entries were taken from articles published between 2012 and 2016. Each study listed in the appendices was judged to be empirical and fell within the study’s description of an assessment study. Topics covered within the data set were general satisfaction assessments (e.g., LibQUAL+), assessments of information literacy instruction, reference services, and technologies used in academic libraries. Of the 323 articles examined in JAL, 59 (18%) met the coding criteria; of the 130 articles in portal, 23 (18%) met the criteria; and of the 196 articles in CRL, 44 (22%) qualified for inclusion within the data set. (During the data collection procedure, it was discovered that in portal there was a gap between volume 14, number 1 and volume 15, number 2 in which no studies covering the library services investigated in the present study were published. This is likely attributable to a transitional period between editors. In the last issue of volume 14 (2014), the editor announced the end of her tenure and named the immediate successor to the position. With volume 15, number 3 (2015), the authors were able to resume identification of criteria-qualifying studies. Had there not been a break in editorial continuity, the study would likely have identified a larger number of qualifying studies. The authors also speculate that empirical, service-study manuscripts not accepted by portal through the transitional gap were either held for future publication or published elsewhere.)
Assessment studies in information literacy dominated all three journals at rates of 61% (n = 36 in JAL), 48% (n = 11 in portal), and 52% (n = 23 in CRL). Six studies on reference services appeared in portal (26%), eight were identified in JAL (14%), and eight (18%) in CRL. Library technology studies had the greatest presence in CRL, with 11 studies (25%). In portal, five technology studies (22%) were identified, and JAL followed with nine studies (15%). General library satisfaction assessments constituted 10% of studies in JAL (n = 6), 5% (n = 2) in CRL, and 4% (n = 1) in portal through the same period (see Figure 1, Figure 2 and Figure 3). Five-year composite trending data from the three journal sources (n = 126) for general satisfaction assessments, information literacy instruction, reference, and library technologies are represented in Figure 4.
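As a quick arithmetic check, the sketch below recomputes the per-journal percentages from the category counts tallied in Appendices A, B, and C; rounding to whole numbers reproduces the rates reported above.

```python
# Category counts per journal, tallied from Appendices A-C (GEN = general
# satisfaction, IL = information literacy, REF = reference, TECH = technology).
counts = {
    "JAL":    {"IL": 36, "REF": 8, "TECH": 9,  "GEN": 6},   # n = 59
    "portal": {"IL": 11, "REF": 6, "TECH": 5,  "GEN": 1},   # n = 23
    "CRL":    {"IL": 23, "REF": 8, "TECH": 11, "GEN": 2},   # n = 44
}

for journal, by_service in counts.items():
    total = sum(by_service.values())
    shares = {s: round(100 * n / total) for s, n in by_service.items()}
    print(journal, total, shares)
# JAL 59 {'IL': 61, 'REF': 14, 'TECH': 15, 'GEN': 10}
# portal 23 {'IL': 48, 'REF': 26, 'TECH': 22, 'GEN': 4}
# CRL 44 {'IL': 52, 'REF': 18, 'TECH': 25, 'GEN': 5}
```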
In comparison to one another, the three journals bore much similarity. Each carried assessment studies in all topical areas of the current investigation: JAL published 59, portal 23, and CRL 44. The number of information literacy studies ranged from 36 in JAL, to 23 in CRL, to 11 in portal. General satisfaction assessments were minimally covered, with six in JAL, one in portal, and two in CRL. Reference service studies numbered eight in JAL, six in portal, and eight in CRL. Technology studies saw slightly more coverage than reference service studies, with nine in JAL, five in portal, and 11 in CRL.
Some unique publishing characteristics are noted. JAL carried the largest number of qualifying studies, while also taking the lead in the number of information literacy studies. JAL also published studies from a wider international array of sites, extending to all continents except South America. An illustration of the countries outside the United States represented in JAL appears in Figure 5. The qualifying studies from CRL represented work at sites in the United States, but also in Canada, China, and Norway. The qualifying studies in portal were conducted only in the U.S. and Canada. When viewed in composite, and excluding studies from U.S. sites, studies from Australia, Canada, and China were published with greater frequency than those from other contributing countries (see Figure 6).
Sites where the selected studies took place demonstrate an inclusion of international participation in these primarily American, English-language journals. JAL published the widest variety of international studies, originating in North America, Europe, Africa, Australasia, and Asia. Three information literacy assessments were accepted from Australia, and three reference service assessments came from sites in China. Canada and New Zealand each contributed two information literacy studies, with the remainder of non-U.S. sites each producing one qualifying article in general library satisfaction, information literacy, reference service, and technology (see Figure 5). The majority of studies in JAL naturally originated in the U.S., with four general assessments, 26 on information literacy, five on reference, and six technology studies.
By contrast, CRL and portal published notably fewer studies from sites around the globe. Qualifying studies in CRL came mainly from the United States, with much less representation from Canada (four), China (one), and Norway (one) (see Figure 6). Portal’s publication of studies beyond the U.S. was limited to Canada (two). These numbers are not necessarily a reflection of a publication bias, but rather are the findings based upon the criteria used to select the assessment studies.

5. Discussion

Studies on information literacy dominate the assessment literature in each of the three journals. Mahraj [25] reported parallel findings in her 2006–2011 content analysis of Reference Services Review (RSR) articles. Over the six-year publication spread, 49% of articles covered information literacy topics, 28% reference service topics, and 18% emerging technologies topics. In 2015, Clark [26] continued the retrospective work begun by Mahraj, tracking RSR topical trending between 2012 and 2014. An important difference between the Mahraj and Clark studies and the current research is that methodologically, the former categorized according to broader topical areas, whereas the current study categorized topically but investigated assessment studies exclusively. Relevant findings of Mahraj, Clark, and the current study are compared in Table 1.
Mahraj’s and Clark’s findings of a proportionately greater amount of topical article content about information literacy parallel the findings of the current study and corroborate those of Hufford [20], who noted an emphasis upon student learning library assessments in the post-2005 literature, a trend the current findings confirm. The emphasis upon information literacy and learning outcomes within the literature is easily understood in light of library units’ interest in aligning research agendas to the institutional mission.
The number of reference service assessments in the current study is small. JAL’s eight studies, portal’s six, and CRL’s eight within this time frame might indicate a change in the role of reference services within academic librarianship. That traditional reference service is in decline is also evidenced at non-research two- and four-year colleges, as highlighted in a study by Davies and Thiele. Their 2013 study examined article content from Community & Junior College Libraries and College & Undergraduate Libraries published between 2008 and 2010. The report found that among 97 published articles, there was “a dearth of articles published pertaining to reference questions”, and that this phenomenon may have been attributable to a change in “professional focus toward less reference oriented job tasks” [35] (p. 8).
VanScoy and Fontana [36] are of the opinion that the downward trend of published studies in this area may be due to a change in program evaluation needs. As such, new forms of reference, whether embedded within courses, moved to ubiquitous and virtual “spaces”, or tied to information literacy instruction, endure. Reference services, despite competition from popular search engines, continue to meet student needs. The innovations that accompany these new forms must remain relevant, and so are subject to continuous assessment. However, as the findings in the current study indicate, published reference service assessments remain under-represented in top LIS journals [25,37].
The challenges of keeping pace with the latest technological innovations have had a profound effect on libraries. In the present study, only 25 assessments on library technology were identified. The presence of technology studies in portal and CRL, at 22% and 25%, respectively, approaches the 30% level discovered by Clark, which may suggest an ongoing increase in the number of technology assessments; however, more data are needed to corroborate this observation. Websites, new and existing software programs, devices, and equipment undergo continuous development, and directly support student and faculty work. The technological future of libraries envisioned by Noh in 2015 is one that follows the progression of web development. She posits that “Library 4.0 must include not only software-based approaches but also technological development such as makerspace, Google Glass, context aware technology, digitization of contents, big data, cloud computing, and augmented reality” [38] (p. 791). While assessment studies of emerging and existing library technologies are not plentiful in the three journals investigated here, they are found within the issues of more specialized library technology publications (e.g., Information Technology and Libraries).

Further Considerations

The current study distinguished 126 criteria-qualifying assessment studies out of 649 articles, drawn from three publications of wide aims and scopes, through a labor-intensive method. As the authors examined the selected studies, it was observed that, for the most part, findings tended to inform researchers of factual conditions that might eventually lead to improvements. Some studies affirmed that interventions were successful and generalizable; some stated that results were not transferable to other library situations. What was not observed were studies that presented the full assessment research cycle, that is, the implementation of data-driven decisions from initial findings that were then once again assessed. This observation may be the result of a cursory reading of the body of literature examined, but the overview points to a prominence of study findings that potentially lend themselves to implementation and further study.
One of the challenges to the use of empirical evidence in support of decision making is the research-to-practice gap [39]. However, this is being mitigated in publications such as Evidence Based Library and Information Practice (EBLIP), which provides evidence summaries; Health Information and Libraries Journal (HILJ); and journals published by the Emerald Group (e.g., Performance Measurement and Metrics), which provide structured abstracts, or in the case of HILJ, both a structured abstract and “key messages” for quick perusal. Systematic reviews for the purpose of improving customer service are another evidence resource for manager and practitioner decision making. There is a recently published systematic review of information literacy instruction outcomes [40], but there appeared to be a gap in the literature when the authors searched for current systematic reviews of reference service or holistic library technology assessments. The appendices below identify studies in each of the current study’s focus areas. These may be useful as an index for librarians seeking lists of assessment studies from high-quality publications. The lists may also have the potential to help build the empirical database envisioned by Koufogiannakis [19].
A limitation of the current study is that it examines only three prestigious LIS journals, with fairly wide topical coverage. The limitation to three American LIS journals may have biased the findings of the study and, as a consequence, the findings may be less generalizable. Future research might employ more liberal criteria, such as multi-site studies, or extend data collection to additional journals published outside the United States. Additionally, the study could be expanded to include coverage of other service concerns—for instance, interlibrary loan and document delivery, collection development, access services, technical services, facilities, or non-traditional innovations.

6. Conclusions

Published assessments on general patron satisfaction, information literacy instruction outcomes, reference service, and library technologies at academic institutions were the chief focus areas of this study. The coverage of assessments in these areas by three well-recognized LIS journals points to an ongoing commitment to continuous improvement, in pursuit of satisfied patrons and heightened perceptions of library value. The initial problem for this study was that published assessments of library services in LIS journals concentrated heavily on information literacy instruction studies, leaving assessments of other key service areas sparsely covered. The study identified 59 studies in the Journal of Academic Librarianship, 23 studies in portal: Libraries and the Academy, and 44 studies in College & Research Libraries that satisfied the defined criteria. The findings also suggest a growing culture of assessment in many libraries, by virtue of the abundance of assessments within the literature.
The review of the literature revealed a second problem: assessment research results are not presented in LIS publications in a form easily consumable by those who need to make evidence-based decisions. Academic library practitioners seeking to improve services or customer satisfaction measures can turn to an external knowledge base, consisting in part of many library and information science journals. These sources contain empirical studies, which may lead to insights and potential solutions to problems. However, because research agendas favor information literacy assessments, studies on other front-line services and innovations are published less frequently in journals with wide scopes. The knowledge base is already large and growing, but it is not readily accessible to busy library managers. As a partial solution to this problem, some LIS journals have been publishing evidence summaries that provide structured abstracts and key messages beneath article titles, but this practice is not yet as widespread as it might be.
Evidence-based assessments examining general satisfaction and specific front-line services within the three LIS journals investigated in this study demonstrate that librarians within cultures of assessment are actively contributing to the knowledge base. This is seen primarily in the area of information literacy instruction, but secondarily in reference and technology services, as well as general satisfaction studies. These findings represent a snapshot of the depth of data available to decision makers in order to advance their mission of providing valuable library services to all patrons, while also demonstrating value to stakeholders, extending to administrators of the parent institutions.
These findings suggest that editors of journals with wider aims and scope might consider including a greater number of assessments on services beyond information literacy instruction, and developing innovative models for reporting evidence summaries. For library managers and administrators, the findings suggest encouraging staff in all areas to further cultures of assessment in a manner that expands research and publication agendas.

Author Contributions

Ethan J. Allen and Roberta K. Weber conceived and wrote the paper. William Howerton was a full partner in the data collection procedure.

Conflicts of Interest

The authors declare no conflict of interest.

Appendix A

Journal: Journal of Academic Librarianship

Issue | Article Title | Service | Site
v.38n.1 (2012) | Still Digitally Divided? An Assessment of Historically Black College and University Library Web Sites | TECH | USA
v.38n.2 (2012) | NA
v.38n.3 (2012) | An Investigation of Affect of Service Using a LibQUAL+™ Survey and an Experimental Study | IL | USA
v.38n.4 (2012) | Approaches to Learning Information Literacy: A Phenomenographic Study | IL | Australia
v.38n.5 (2012) | Beyond the Web Tutorial: Development and Implementation of an Online, Self-Directed Academic Integrity Course at Oakland University | IL | USA
v.38n.5 (2012) | Assessing the Research Needs of Graduate Students at Georgetown University | GEN | USA
v.39n.1 (2013) | NA
v.39n.2 (2013) | International Students’ Perception of Library Services and Information Resources in Chinese Academic Libraries | REF | China
v.39n.2 (2013) | Reference Reviewed and Re-Envisioned: Revamping Librarian and Desk-Centric Services with LibStARs and LibAnswers | REF | USA
v.39n.2 (2013) | Now it’s Necessary: Virtual Reference Services at Washington State University, Pullman | REF | USA
v.39n.2 (2013) | Measuring the Disparities between Biology Undergraduates’ Perceptions and Their Actual Knowledge of Scientific Literature with Clickers | IL | USA
v.39n.2 (2013) | Designing Authentic Learning Tasks for Online Library Instruction | IL | USA
v.39n.3 (2013) | Web 2.0 and Information Literacy Instruction: Aligning Technology with ACRL Standards | TECH | USA
v.39n.3 (2013) | Guided and Team-Based Learning for Chemical Information Literacy | IL | USA
v.39n.4 (2013) | NA
v.39n.5 (2013) | Wide Awake at 4 AM: A Study of Late Night User Behavior, Perceptions and Performance at an Academic Library | GEN | USA
v.39n.6 (2013) | LibQUAL Revisited: Further Analysis of Qualitative and Quantitative Survey Results at the University of Mississippi | GEN | USA
v.39n.6 (2013) | Developing Data Management Services at the Johns Hopkins University | TECH | USA
v.40n.1 (2014) | NA
v.40n.2 (2014) | Integrating Information Literacy into Academic Curricula: A Professional Development Programme for Librarians at the University of Auckland | IL | New Zealand
v.40n.2 (2014) | Tying Television Comedies to Information Literacy: A Mixed-Methods Investigation | IL | USA
v.40n.2 (2014) | Higher Education and Emerging Technologies: Shifting Trends in Student Usage | TECH | USA
v.40n.3-4 (2014) | Four Pedagogical Approaches in Helping Students Learn Information Literacy Skills | IL | USA
v.40n.3-4 (2014) | Student Engagement in One-Shot Library Instruction | IL | USA
v.40n.3-4 (2014) | Library Value in the Classroom: Assessing Student Learning Outcomes from Instruction and Collections | IL | USA
v.40n.3-4 (2014) | Distance Students’ Attitude Toward Library Help Seeking | REF | USA
v.40n.3-4 (2014) | A Library and the Disciplines: A Collaborative Project Assessing the Impact of eBooks and Mobile Devices on Student Learning | TECH | USA
v.40n.5 (2014) | Copyright and You: Copyright Instruction for College Students in the Digital Age | IL | USA
v.40n.6 (2014) | Web-based Citation Management Tools: Comparing the Accuracy of Their Electronic Journal Citations | TECH | USA
v.41n.1 (2015) | Library Instruction and Themed Composition Courses: An Investigation of Factors that Impact Student Learning | IL | USA
v.41n.2 (2015) | Language in Context: A Model of Language Oriented Library Instruction | IL | USA
v.41n.2 (2015) | The Effectiveness of Online Versus In-person Library Instruction on Finding Empirical Communication Research | IL | USA
v.41n.2 (2015) | NICE Evidence Search: Student Peers’ Views on their Involvement as Trainers in Peer-based Information Literacy Training | IL | UK
v.41n.2 (2015) | Library Instruction for Romanized Hebrew | IL | Canada
v.41n.2 (2015) | Integrating Information Literacy, the POGIL Method, and iPads into a Foundational Studies Program | IL | USA
v.41n.3 (2015) | Exploring Chinese Students’ Perspective on Reference Services at Chinese Academic Libraries: A Case Study Approach | REF | China
v.41n.3 (2015) | A New Role of Chinese Academic Librarians—The Development of Embedded Patent Information Services at Nanjing Technology University Library, China | REF | China
v.41n.4 (2015) | “It’s in the Syllabus”: Identifying Information Literacy and Data Information Literacy Opportunities Using a Grounded Theory Approach | IL | USA
v.41n.4 (2015) | Student, Librarian, and Instructor Perceptions of Information Literacy Instruction and Skills in a First Year Experience Program: A Case Study | IL | USA
v.41n.4 (2015) | Mapping the Roadmap: Using Action Research to Develop an Online Referencing Tool | TECH | Australia
v.41n.4 (2015) | Beyond Embedded: Creating an Online-Learning Community Integrating Information Literacy and Composition Courses | IL | USA
v.41n.5 (2015) | The Effect of a Situated Learning Environment in a Distance Education Information Literacy Course | IL | USA
v.41n.6 (2015) | Comparison of Native Chinese-speaking and Native English-speaking Engineering Students’ Information Literacy Challenges | IL | Canada
v.41n.6 (2015) | Standing By to Help: Transforming Online Reference with a Proactive Chat System | REF | USA
v.41n.6 (2015) | Faculty and Librarians’ Partnership: Designing a New Framework to Develop Information Fluent Future Doctors | IL | Qatar
v.42n.1 (2016) | Measuring the Effect of Virtual Librarian Intervention on Student Online Search | IL | USA
v.42n.1 (2016) | Surveying Users’ Perception of Academic Library Services Quality: A Case Study in Universiti Malaysia Pahang (UMP) Library | GEN | Malaysia
v.42n.1 (2016) | Research Consultation Assessment: Perceptions of Students and Librarians | REF | USA
v.42n.1 (2016) | Information Behavior and Expectations of Veterinary Researchers and their Requirements for Academic Library Services | GEN | South Africa
v.42n.2 (2016) | Impact of Assignment Prompt on Information Literacy Performance in First-year Student Writing | IL | USA
v.42n.3 (2016) | Student Use of Keywords and Limiters in Web-scale Discovery Searching | IL | USA
v.42n.3 (2016) | Flipped Instruction for Information Literacy: Five Instructional Cases of Academic Libraries | IL | USA
v.42n.3 (2016) | Finding Sound and Score: A Music Library Skills Module for Undergraduate Students | IL | Australia
v.42n.3 (2016) | A Collaborative Approach to Integrating Information and Academic Literacy into the Curricula of Research Methods Courses | IL | New Zealand
v.42n.4 (2016) | Information Literacy Training Evaluation: The Case of First Year Psychology Students | IL | Slovenia
v.42n.4 (2016) | Use It or Lose It? A Longitudinal Performance Assessment of Undergraduate Business Students’ Information Literacy | IL | USA
v.42n.5 (2016) | Assessing and Serving the Workshop Needs of Graduate Students | GEN | USA
v.42n.5 (2016) | Heuristic Usability Evaluation of University of Hong Kong Libraries’ Mobile Website | TECH | China
v.42n.5 (2016) | A Pragmatic and Flexible Approach to Information Literacy: Findings from a Three-Year Study of Faculty-Librarian Collaboration | IL | USA
v.42n.6 (2016) | Assessing Graduate Level Information Literacy Instruction with Critical Incident Questionnaires | IL | USA
v.42n.6 (2016) | Effects of Information Literacy Skills on Student Writing and Course Performance | IL | USA
v.42n.6 (2016) | Providing Enhanced Information Skills Support to Students from Disadvantaged Backgrounds: Western Sydney University Library Outreach Program | IL | Australia
v.42n.6 (2016) | User Acceptance of Mobile Library Applications in Academic Libraries: An Application of the Technology Acceptance Model | TECH | Korea

Appendix B

Journal: portal: Libraries and the Academy

Issue | Article Title | Service | Site
v.12n.1 (2012) | Moving Beyond Assumptions: The Use of Virtual Reference Data in an Academic Library | REF | USA
v.12n.1 (2012) | Evaluating Open Source Software for Use in Library Initiatives: A Case Study Involving Electronic Publishing | TECH | USA
v.12n.1 (2012) | Guiding Design: Exposing Librarian and Student Mental Models of Research Guides | REF | USA
v.12n.2 (2012) | McGill Library Makes E-books Portable: E-reader Loan Service in a Canadian Academic Library | TECH | Canada
v.12n.3 (2012) | Performance-based Assessment in an Online Course: Comparing Different Types of Information Literacy Instruction | IL | USA
v.12n.3 (2012) | Implementing the Customer Contact Center: An Opportunity to Create a Valid Measurement System for Assessing and Improving a Library’s Telephone Services | REF | USA
v.12n.3 (2012) | Incoming Graduate Students in the Social Sciences: How Much Do They Really Know About Library Research? | IL | USA
v.12n.4 (2012) | Rising Tides: Faculty Expectations of Library Websites | TECH | USA
v.13n.1 (2013) | The Apprentice Researcher: Using Undergraduate Researchers’ Personal Essays to Shape Instruction and Services | IL | USA
v.13n.2 (2013) | NA
v.13n.3 (2013) | Talking About Information Literacy: The Mediating Role of Discourse in a College Writing Classroom | IL | USA
v.13n.4 (2013) | Assessing Affective Learning Using a Student Response System | IL | USA
v.14n.1 (2014) | NA
v.14n.2 (2014) | NA
v.14n.3 (2014) | NA
v.14n.4 (2014) | The Measureable Effects of Closing a Branch Library: Circulation, Instruction, and Service Perception | GEN | Canada
v.15n.1 (2015) | NA
v.15n.2 (2015) | The Student/Library Computer Science Collaborative | TECH | USA
v.15n.3 (2015) | Serving the Needs of Performing Arts Students: A Case Study | REF | USA
v.15n.3 (2015) | “I Never Had to Use the Library in High School”: A Library Instruction Program for At-Risk Students | IL | USA
v.15n.3 (2015) | Impacting Information Literacy Learning in First-Year Seminars: A Rubric-Based Evaluation | IL | USA
v.15n.3 (2015) | Examining Mendeley: Designing Learning Opportunities for Digital Scholarship | TECH | USA
v.15n.4 (2015) | Standing Alone No More: Linking Research to a Writing Course in a Learning Community | IL | USA
v.15n.4 (2015) | Learning by Doing: Developing a Baseline Information Literacy Assessment | IL | USA
v.16n.1 (2016) | “I Felt Like Such a Freshman”: First-Year Students Crossing the Library Threshold | IL | USA
v.16n.1 (2016) | The Value of Chat Reference Services: A Pilot Study | REF | USA
v.16n.2 (2016) | NA
v.16n.3 (2016) | The Impact of Physically Embedded Librarianship on Academic Departments | REF | USA
v.16n.3 (2016) | Assessment for One-Shot Library Instruction: A Conceptual Approach | IL | USA
v.16n.4 (2016) | NA

Appendix C

Journal: College & Research Libraries

Issue | Article Title | Service | Site
v.73n.1 (2012) | NA
v.73n.2 (2012) | NA
v.73n.3 (2012) | Citation Analysis as a Tool to Measure the Impact of Individual Research Consultations | REF | USA
v.73n.4 (2012) | Why One-shot Information Literacy Sessions Are Not the Future of Instruction: A Case for Online Credit Courses | IL | USA
v.73n.5 (2012) | NA
v.73n.6 (2012) | NA
v.74n.1 (2013) | NA
v.74n.2 (2013) | NA
v.74n.3 (2013) | How Users Search the Library from a Single Search Box | TECH | USA
v.74n.3 (2013) | The Daily Image Information Needs and Seeking Behavior of Chinese Undergraduate Students | TECH | China
v.74n.3 (2013) | Trends in Image Use by Historians and the Implications for Librarians and Archivists | TECH | USA
v.74n.4 (2013) | Revising the “One-Shot” through Lesson Study: Collaborating with Writing Faculty to Rebuild a Library Instruction Session | IL | USA
v.74n.4 (2013) | The Research Process and the Library: First-Generation College Seniors vs. Freshmen | IL | USA
v.74n.5 (2013) | Instructional Preferences of First-Year College Students with Below-Proficient Information Literacy Skills: A Focus Group Study | IL | USA
v.74n.5 (2013) | Paths of Discovery: Comparing the Search Effectiveness of EBSCO Discovery Service, Summon, Google Scholar, and Conventional Library Resources | IL | USA
v.74n.5 (2013) | Where Do We Go from Here? Informing Academic Library Staffing through Reference Transaction Analysis | REF | USA
v.74n.6 (2013) | Assessment in the One-Shot Session: Using Pre- and Post-tests to Measure Innovative Instructional Strategies among First-Year Students | IL | USA
v.74n.6 (2013) | Why Some Students Continue to Value Individual, Face-to-Face Research Consultations in a Technology-Rich World | REF | USA
v.75n.1 (2014) | Making a Case for Technology in Academia | TECH | USA
v.75n.2 (2014) | They CAN and They SHOULD: Undergraduates Providing Peer Reference and Instruction | REF | USA
v.75n.3 (2014) | NA
v.75n.4 (2014) | Undergraduates’ Use of Social Media as Information Sources | IL | USA
v.75n.5 (2014) | Faculty Usage of Library Tools in a Learning Management System | TECH | USA
v.75n.5 (2014) | Plagiarism Awareness among Students: Assessing Integration of Ethics Theory into Library Instruction | IL | USA
v.75n.6 (2014) | The Whole Student: Cognition, Emotion, and Information Literacy | IL | USA
v.76n.1 (2015) | “Pretty Rad”: Explorations in User Satisfaction with a Discovery Layer at Ryerson University | TECH | Canada
v.76n.1 (2015) | Maximizing Academic Library Collections: Measuring Changes in Use Patterns Owing to EBSCO Discovery Service | TECH | USA
v.76n.2 (2015) | An Information Literacy Snapshot: Authentic Assessment across the Curriculum | IL | USA
v.76n.3 (2015) | Question-Negotiation and Information Seeking in Libraries | REF | USA
v.76n.3 (2015) | The Role of the Academic Library in Promoting Student Engagement in Learning | IL | USA
v.76n.4 (2015) | Library Catalog Log Analysis in E-book Patron-Driven Acquisitions (PDA): A Case Study | TECH | USA
v.76n.4 (2015) | The Perceived Impact of E-books on Student Reading Practices: A Local Study | TECH | USA
v.76n.4 (2015) | Universal Design for Learning (UDL) in the Academic Library: A Methodology for Mapping Multiple Means of Representation in Library Tutorials | TECH | USA
v.76n.5 (2015) | Degrees of Impact: Analyzing the Effects of Progressive Librarian Course Collaborations on Student Performance | IL | USA
v.76n.6 (2015) | Getting More Value from the LibQUAL+® Survey: The Merits of Qualitative Analysis and Importance-Satisfaction Matrices in Assessing Library Patron Comments | GEN | Canada
v.76n.6 (2015) | Integrating Library Instruction into the Course Management System for a First-Year Engineering Class: An Evidence-Based Study Measuring the Effectiveness of Blended Learning on Students’ Information Literacy Levels | IL | Canada
v.76n.6 (2015) | Changes in Reference Question Complexity Following the Implementation of a Proactive Chat System: Implications for Practice | REF | USA
v.77n.1 (2016) | Metadata Effectiveness in Internet Discovery: An Analysis of Digital Collection Metadata Elements and Internet Search Engine Keywords | TECH | USA
v.77n.1 (2016) | Exploring Peer-to-Peer Library Content and Engagement on a Student-Run Facebook Group | REF | USA
v.77n.2 (2016) | Examining the Relationship between Faculty-Librarian Collaboration and First-Year Students’ Information Literacy Abilities | IL | USA
v.77n.2 (2016) | Assessing the Value of Course-Embedded Information Literacy on Student Learning and Achievement | IL | USA
v.77n.2 (2016) | Personal Librarian for Aboriginal Students: A Programmatic Assessment | REF | Canada
v.77n.2 (2016) | Mixed or Complementary Messages: Making the Most of Unexpected Assessment Results | IL | USA
v.77n.2 (2016) | Making Strategic Decisions: Conducting and Using Research on the Impact of Sequenced Library Instruction | IL | USA
v.77n.2 (2016) | Identifying and Articulating Library Connections to Student Success | GEN | USA
v.77n.2 (2016) | Beyond the Library: Using Multiple, Mixed Measures Simultaneously in a College-Wide Assessment of Information Literacy | IL | USA
v.77n.3 (2016) | The Librarian Leading the Machine: A Reassessment of Library Instruction Methods | IL | USA
v.77n.4 (2016) | Academic Librarians in Data Information Literacy Instruction: A Case Study in Meteorology | IL | Norway
v.77n.5 (2016) | Undergraduates’ Use of Google vs. Library Resources: A Four-Year Cohort Study | IL | USA
v.77n.6 (2016) | A Novel Assessment Tool for Quantitative Evaluation of Science Literature Search Performance: Application to First-Year and Senior Undergraduate Biology Majors | IL | USA
v.77n.6 (2016) | Assessing the Scope and Feasibility of First-Year Students’ Research Paper Topics | IL | USA

References

1. Hernon, P.; Dugan, R.E. An Action Plan for Outcomes Assessment in Your Library; American Library Association: Chicago, IL, USA; London, UK, 2002; ISBN 0-8389-0813-16.
2. Oakleaf, M. The Value of Academic Libraries: A Comprehensive Research Review and Report. Available online: http://www.ala.org/acrl/sites/ala.org.acrl/files/content/issues/value/val_report.pdf (accessed on 6 December 2017).
3. Lakos, A.; Phipps, S.E. Creating a culture of assessment: A catalyst for organizational change. Portal Libr. Acad. 2004, 4, 345–361.
4. Albert, A.B. Communicating library value—The missing piece of the assessment puzzle. J. Acad. Librariansh. 2014, 40, 634–637.
5. Oakleaf, M. Building the assessment librarian guildhall: Criteria and skills for quality assessment. J. Acad. Librariansh. 2013, 39, 126–128.
6. The Association of College and Research Libraries (ACRL). Proficiencies for Assessment Librarians and Coordinators. Available online: http://www.ala.org/acrl/standards/assessment_proficiencies (accessed on 11 December 2017).
7. Nitecki, D.A.; Wiggins, J.; Turner, N.B. Assessment is not enough for libraries to be valued. Perform. Meas. Metr. 2015, 16, 197–210.
8. Town, J.S. Value, impact, and the transcendent library: Progress and pressures in performance measurement and evaluation. Libr. Q. 2011, 81, 111–125.
9. Thompson, J. Redirection in Academic Library Management; Library Association: London, UK, 1991; ISBN 0851574688.
10. Heath, F. Library assessment: The way we have grown. Libr. Q. 2011, 81, 7–25.
11. Kyrillidou, M.; Cook, C. The evolution of measurement and evaluation of libraries: A perspective from the Association of Research Libraries. Libr. Trends 2008, 56, 888–909.
12. Wang, H. From “user” to “customer”: TQM in academic libraries? Libr. Manag. 2006, 27, 606–620.
13. McElderry, S. Readers and resources: Public services in academic and research libraries, 1876–1976. Coll. Res. Libr. 1976, 37, 408–420.
14. Coleman, V.; Xiao, Y.; Blair, L.; Chollett, B. Toward a TQM paradigm: Using SERVQUAL to measure library service quality. Coll. Res. Libr. 1997, 58, 237–249.
15. Hiller, S. Another tool in the assessment toolbox: Integrating LibQUAL+™ into the University of Washington Libraries assessment program. J. Libr. Adm. 2004, 40, 121–137.
16. Hiller, S.; Kyrillidou, M.; Self, J. When the evidence is not enough: Organizational factors that influence effective and successful library assessment. Perform. Meas. Metr. 2008, 9, 223–230.
17. Farkas, M.; Hinchcliffe, L.; Houk, A. Bridges and barriers: Factors influencing a culture of assessment in academic libraries. Coll. Res. Libr. 2015, 76, 150–169.
18. McClure, C.R.; Samuels, A.R. Factors affecting the use of information for academic library decision making. Coll. Res. Libr. 1985, 46, 483–498.
19. Koufogiannakis, D.; Kloda, L.; Pretty, H. A big step forward: It’s time for a database of evidence summaries in library and information practice. Evid. Based Libr. Inf. Pract. 2016, 11, 92–95.
20. Hufford, J.R. A review of the literature on assessment in academic and research libraries, 2005 to August 2011. Portal Libr. Acad. 2013, 13, 5–35.
21. American Library Association. Value of Academic Libraries Report. Available online: http://www.acrl.ala.org/value/?page_id=21 (accessed on 7 December 2017).
22. Oakleaf, M.; Whyte, A.; Lynema, E.; Brown, M. Academic libraries & institutional learning analytics: One path to integration. J. Acad. Librariansh. 2017, 43, 454–461.
23. Crawford, G.A.; Feldt, J. An analysis of the literature on instruction in academic libraries. Ref. User Serv. Assoc. 2007, 46, 77–88.
24. Luo, L.; McKinney, M. JAL in the past decade: A comprehensive analysis of academic library research. J. Acad. Librariansh. 2015, 41, 123–129.
25. Mahraj, K. Reference Services Review: Content analysis, 2006–2011. Ref. Serv. Rev. 2012, 40, 182–198.
26. Clark, K.W. Reference Services Review: Content analysis, 2012–2014. Ref. Serv. Rev. 2016, 44, 61–75.
27. The Association of College and Research Libraries (ACRL). RoadShows. Available online: http://www.ala.org/acrl/conferences/roadshows (accessed on 4 January 2018).
28. Documented Library Contributions to Student Learning and Success: Building Evidence with Team-Based Assessment in Action Campus Projects. Available online: http://www.ala.org/acrl/sites/ala.org.acrl/files/content/issues/value/contributions_y2.pdf (accessed on 11 December 2017).
29. Guide for Authors—The Journal of Academic Librarianship. Available online: https://www.elsevier.com/journals/the-journal-of-academic-librarianship/0099-1333/guide-for-authors (accessed on 5 August 2017).
30. Portal: Libraries and the Academy. Available online: https://www.press.jhu.edu/journals/portal-libraries-and-academy (accessed on 5 August 2017).
31. Submissions. Available online: http://crl.acrl.org/index.php/crl/about/submissions#authorGuidelines (accessed on 3 January 2018).
32. Krippendorff, K. Content Analysis: An Introduction to Its Methodology, 2nd ed.; Sage Publications: Thousand Oaks, CA, USA, 2004; ISBN 0-7619-1544-3.
33. Stemler, S.E. A comparison of consensus, consistency, and measurement approaches to estimating interrater reliability. Pract. Assess. Res. Eval. 2004, 9, 1–11.
34. Oakleaf, M. Using rubrics to assess information literacy: An examination of methodology and interrater reliability. J. Assoc. Inf. Sci. Technol. 2009, 60, 969–983.
35. Davies, K.; Thiele, J. Library research: A domain comparison of two library journals. Community Jr. Coll. Libr. 2013, 19, 1–9.
36. VanScoy, A.; Fontana, C. How reference and information science is studied: Research approaches and methods. Libr. Inf. Sci. Res. 2016, 38, 94–100.
37. Koufogiannakis, D.; Slater, L.; Crumley, E. A content analysis of librarianship research. J. Inf. Sci. 2004, 30, 227–239.
38. Noh, Y. Imagining library 4.0: Creating a model for future libraries. J. Acad. Librariansh. 2015, 41, 786–797.
39. Booth, A. Is there a future for evidence based library and information practice? Evid. Based Libr. Inf. Pract. 2011, 6, 22–27.
40. Erlinger, A. Outcomes assessment in undergraduate information literacy instruction: A systematic review. Coll. Res. Libr. 2017, accepted.
Figure 1. Service and general assessments in Journal of Academic Librarianship (JAL).
Figure 2. Service and general assessments in portal: Libraries and the Academy (portal).
Figure 3. Service and general assessments in College & Research Libraries (CRL).
Figure 4. Composite five-year trending of service and general assessments published in JAL, portal, and CRL.
Figure 5. JAL studies originating outside the U.S.
Figure 6. Composite assessment contributions from outside the United States to JAL, portal, and CRL, 2012–2016.
Table 1. Comparison of Findings.

Study & Publication | Information Literacy | Reference | Technology
Mahraj, RSR (2006–2011) | 49% | 28% | 18%
Qualifying articles | 119 | 68 | 43
Clark, RSR (2012–2014) | 47% | 12% | 30%
Qualifying articles | 54 | 14 | 35
Current study, JAL (2012–2016) | 61% | 14% | 15%
Qualifying studies | 36 | 8 | 9
Current study, portal (2012–2016) | 48% | 25% | 27%
Qualifying studies | 11 | 6 | 5
Current study, CRL (2012–2016) | 52% | 18% | 25%
Qualifying studies | 23 | 8 | 11
