
Information 2013, 4(4), 351-366; doi:10.3390/info4040351

Suborganizations of Institutions in Library and Information Science Journals
Dalibor Fiala
Department of Computer Science and Engineering, University of West Bohemia, Univerzitní 8, Plzeň 30614, Czech Republic; Tel.: +420-377-63-2429; Fax: +420-377-63-2402
Received: 25 June 2013; in revised form: 19 August 2013 / Accepted: 26 September 2013 / Published: 9 October 2013


Abstract: In this paper, we analyze Web of Science data records of articles published from 1991 to 2010 in library and information science (LIS) journals. We focus on the addresses of these articles’ authors and create citation and collaboration networks of departments, which we define as the first suborganization of an institution. We present various rankings of departments (e.g., by citations, times cited, PageRank, publications, etc.) and highlight the most influential of them. The correlations between the individual rankings are also shown. Furthermore, we visualize the most intense citation and collaboration relationships between “LIS” departments (many of which are not genuine LIS departments but merely affiliations of authors publishing in journals covered by the specific Web of Science category) and give examples of two basic research performance distributions across departments of the leading universities in the field.
Keywords: departments; ranking; PageRank; citations; collaborations

1. Introduction and Related Work

Bibliometric studies can roughly be conducted at three levels: individual researchers (micro-level), institutions (meso-level), and countries (macro-level). Of course, these “basic” levels can have their own sublevels (e.g., regions of a country) or be grouped into supralevels (such as continents). There have been many bibliometric analyses at various levels, but at the meso-level those analyses have mainly concentrated on institutions as such, or they have not really been large-scale, i.e., involving tens or hundreds of thousands of items to analyze. This study tries to bridge this gap in the field of library and information science (LIS) by analyzing several tens of thousands of bibliographic records at the meso-level and concentrating on the suborganizations of institutions. An institution (or primary organization) usually has an organizational structure comprising suborganizations (level 1) that may themselves consist of other suborganizations (level 2). The depth of this hierarchy varies: some institutions have a relatively flat structure, while other hierarchies include suborganizations of even higher levels. A typical academic institution (a university) may be divided into faculties, schools, departments, laboratories, and research groups, which are difficult to capture in scientometric studies due to the inconsistent way they are present (or absent) in authors’ addresses. For the sake of simplicity, we will call level-1 suborganizations “departments”. The main research questions of this study are the following: (a) Do Web of Science (WoS) data contain enough information to analyze the scientific performance and collaboration of the departments with which authors of journal articles in the LIS research area are affiliated (hereafter called “LIS” departments)? (b) What are the most intense citation and collaboration relationships between “LIS” departments? (c) Which “LIS” departments are ranked most highly by various indicators based on publications from 1991 to 2010? Responses to these questions will be given in the next sections.

Bibliometric analysis of library and information science institutions has a long history in the United Kingdom. For instance, Bradley et al. [1] measured the publication patterns of the Department of Information Studies at the University of Sheffield, Holmes and Oppenheim [2] analyzed the citation impact of British LIS departments, and Oppenheim [3] ranked British LIS schools by citation impact. Seng and Willett [4] conducted a citation analysis of a small number of LIS departments in the UK, and UK LIS departments were also investigated by Webber [5]. British LIS departments were additionally analyzed webometrically, by Thomas and Willett [6] and by Arakaki and Willett [7]. As for other regions of the world, Aina and Mooko [8] analyzed a small set of top African LIS researchers and identified the centers of African LIS research. Another small group of LIS publications was investigated by Herrero-Solana and Ríos-Gómez [9] to identify the most productive Latin American universities and departments. Meho and Spurgin [10] ranked American LIS schools by the visibility of their faculty in various databases, and Yazit and Zainab [11] reported on the LIS publication productivity of some Malaysian institutions. There have been two large-scale studies: Yan and Sugimoto [12] explored citation patterns of various LIS institutions, and He et al. [13] examined tens of thousands of LIS publications, but both remained at the institutional level. The present study is the only large-scale one at the departmental level. The visualization tools used in this article are discussed by Shannon et al. [14].

2. Data and Methods

In November 2012, we manually queried the Web of Science web interface to obtain the records of all articles published in the period 1991–2010 and indexed in the Social Sciences Citation Index in the research area “Information Science & Library Science” (ISLS). We were interested in the “article” document type only. In this way, we acquired plain-text metadata on 46,800 journal articles. (Saving to plain text took about 50 min because a maximum of 500 records can be saved at once by anyone with a Web of Science subscription.) These metadata typically include an article’s title, journal name, volume, issue, pagination, and year, as well as its authors’ names, addresses, times cited count, and some other information. An example of a journal record is presented in Figure 1. As we can see, only some of the cited references (CR) can be identified unambiguously; in this case, with a digital object identifier (DOI). The remaining references can be identified using the volume, issue, and pagination, or cannot be identified at all. To create a citation network from the retrieved article records (a basic, root, or seed set of articles), we need one more tool.

Figure 1. A sample journal article record.

Therefore, in the next step, we used the Web Services Lite application programming interface (API) to retrieve the records of articles citing the articles in the basic set. This API is available for free to anyone with a Web of Science subscription after registration. In total, we obtained 175,139 citing article records. The information contained in the citing article records is somewhat less abundant than in the plain-text seed article records; in particular, all author address information is missing. On the other hand, citing article records are structured as XML-like records. See Figure 2 for an example of a citing article record. In the example, an article with ID (UT) 000283981500004 is cited by an article with ID 000283981500001. These IDs can then be matched with the “UT WOS” field in seed article records (see the bottom of Figure 1) and, as a result, a complete citation network of the articles in the root set can be constructed. This citation graph had 94,836 edges, i.e., slightly over 54% of all citations were citations within the seed set.
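The ID-matching step can be sketched in a few lines of Python. This is only an illustration of the principle: the UT identifiers below are invented, not actual WoS record IDs.

```python
# Keep only citations whose citing and cited article IDs both belong to the
# seed (root) set, as when constructing the internal citation graph.
# The UT identifiers below are invented for illustration.

def internal_citations(seed_ids, citation_pairs):
    """Return (citing, cited) edges where both articles are in the seed set."""
    seed = set(seed_ids)
    return [(src, dst) for src, dst in citation_pairs
            if src in seed and dst in seed]

seed = ["UT0001", "UT0002", "UT0003"]
pairs = [("UT0002", "UT0001"),  # internal citation: kept
         ("UT9999", "UT0001"),  # citing article outside the seed set: dropped
         ("UT0003", "UT0002")]
print(internal_citations(seed, pairs))
```

Citations from outside the seed set are dropped here, which is exactly why only about 54% of all citations survive as edges of the internal graph.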

Figure 2. A sample citing article record.

Since this paper is concerned with departments, the research depends on the extent to which affiliations and addresses of article authors are systematically present in the records we analyzed. There is no genuine affiliation information in the records, but there is often information on authors’ addresses, denoted by C1 and RP as in Figure 1. RP means a “reprint address”, which is the address of the corresponding author (usually, but not always, the first author), and C1 is a field containing authors’ addresses. Reprint and “normal” addresses may sometimes be the same, for instance when there is only one author. In total, almost 88% of publications had some address information associated with them, and 65% had both a reprint and a normal address. Some 85% of publications had a reprint address and 68% had at least one normal address, but the latter percentage varied considerably over the years under study, as can be seen in Figure 3. While the share of publications with some address information was about 90% throughout the period, the share of publications with at least one normal address has only been similar since 1998. Before 1998, a high percentage of publications (from 45% to 70%) had a reprint but no normal address; this share was almost negligible in later years, as was the number of articles having a normal address but no reprint address throughout the whole period 1991–2010.

Figure 3. Numbers of publications with different types of addresses.

As can be seen in Figure 1, addresses have a relatively clear structure, starting with an institution, followed by suborganizations (from bigger to smaller ones), and ending with a city and a country. Organizations (institutions) and suborganizations are written using standardized abbreviations and are delimited with commas, as are cities and countries. In our experience, reprint addresses often also include other information, such as street names and numbers or state or province names. This additional information can distort the common address pattern “institution, suborganization1, …, suborganizationN, city (+ZIP), country”, but based on our experiments with random address samples and manual checking of pattern correctness, the pattern is violated in only a few percent of cases even when reprint addresses are included. As a result, we made an approximation and considered all addresses in all publications in the period 1991–2010 as having an institution as their first item, a city and a country as their last items, and suborganizations in between. The number of suborganizations can vary, as shown in Table 1. In the data under study, an institution (main organization) can have up to seven suborganizations associated with it, but most affiliations consist of an institution and one suborganization. Thus, before running the experiments whose results are reported in the next section, we retained suborganization 1 in each address and discarded the suborganizations of higher levels. We will call the pair “institution; suborganization 1” a “department” because this is typically what such a pair represents.
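A minimal sketch of parsing this address pattern, under the simplifying assumptions stated above (ZIP codes and state names are ignored, and the sample addresses are illustrative):

```python
def parse_address(address):
    """Split a comma-delimited address "institution, suborg1, ..., city, country"
    into (institution, department, city, country), keeping only suborganization 1
    as in the study. State/province names and ZIP codes are not handled here."""
    parts = [p.strip() for p in address.split(",")]
    institution, city, country = parts[0], parts[-2], parts[-1]
    suborg1 = parts[1] if len(parts) > 3 else None
    department = f"{institution}; {suborg1}" if suborg1 else institution
    return institution, department, city, country

# Illustrative addresses following the pattern described in the text:
print(parse_address("Indiana Univ, Sch Lib & Informat Sci, Bloomington, USA"))
print(parse_address("Indiana Univ, Bloomington, USA"))
```

The second call shows the case of an address with no suborganization at all, which the study treats as the institution itself.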

Table 1. Examples of various suborganizations of an institution.
Organization | Suborganization 1 | Suborganization 2 | Suborganization 3
Indiana Univ | | |
Indiana Univ | Sch Lib & Informat Sci | |
Indiana Univ | Sch Business | Decis & Informat Syst Dept |
Indiana Univ | Sch Med | Dept Med | Div Gen Med & Geriatr

3. Results and Discussion

The citation graph of departments we obtained had 18,291 nodes and 154,744 edges. The graph is directed and its edges are weighted, with an average weight of 2.62 per edge. The total sum of edge weights in the graph (404,755) is the total number of citations between departments. In Table 2 we can see the departments that received the most citations: “Indiana Univ; Sch Lib & Informat Sci”, “Leiden Univ; Ctr Sci & Technol Studies”, and “Univ Sheffield; Dept Informat Studies”. However, the numbers of publications by which the departments are represented (see the last column in Table 2) vary significantly, so “Leiden Univ; Ctr Sci & Technol Studies” with 3722 citations and 84 publications is actually relatively more cited than “Indiana Univ; Sch Lib & Informat Sci” with 4334 citations and 243 publications (44 citations per publication compared to 18). The citations-per-publication measure is, however, obviously biased towards departments with fewer publications. For instance, the relatively most cited department in Table 2 is “Lib Hungarian Acad Sci; Bibliometr Serv” (position 33) with 165 citations per publication.

As far as the citations between individual departments are concerned, we can see the most intense of them in Figure 4. The size of nodes is based on the “times cited” (see below for an explanation) of a department and the thickness of edges depends on the number of citations from one department to another. We can notice that there are two big components—one centred around “Wolverhampton Univ; Sch Comp & Informat Technol” and the other one around “Penn State Univ; Sch Informat Sci & Technol”. The most intense citations as such are those from “Wolverhampton Univ; Sch Comp & Informat Technol” to “Indiana Univ; Sch Lib & Informat Sci”, “Victoria Univ Wellington; Sch Commun & Informat Management”, and “Univ Western Ontario; Fac Informat & Media Studies”. There are also intra-institutional citations such as from “Wolverhampton Univ; Sch Comp & Informat Technol” to “Wolverhampton Univ; Sch Comp & Informat Sci” or from “Penn State Univ; Coll Informat Sci & Technol” to “Penn State Univ; Sch Informat Sci & Technol”, but these may sometimes be self-citations of departments that changed their names or whose names are used inconsistently. These errors are inherent in the Web of Science data and they could be removed only by means of a huge amount of manual effort. In total, we found that 4.3% of all citations were intra-institutional.

Table 2. Top 40 “library and information science (LIS)” departments by citations.
Rank | Department | Citations | Publications
1 | Indiana Univ; Sch Lib & Informat Sci | 4334 | 243
2 | Leiden Univ; Ctr Sci & Technol Studies | 3722 | 84
3 | Univ Sheffield; Dept Informat Studies | 3606 | 195
4 | Rutgers State Univ; Sch Commun Informat & Lib Studies | 3413 | 144
5 | Penn State Univ; Sch Informat Sci & Technol | 3361 | 56
6 | Univ Maryland; Robert H Smith Sch Business | 3013 | 52
7 | Univ Minnesota; Carlson Sch Management | 2835 | 71
8 | Univ Tennessee; Sch Informat Sci | 2661 | 118
9 | Drexel Univ; Coll Informat Sci & Technol | 2288 | 101
10 | Univ Tampere; Dept Informat Studies | 2285 | 96
11 | City Univ London; Dept Informat Sci | 2162 | 192
12 | Univ Western Ontario; Fac Informat & Media Studies | 2125 | 138
13 | Wolverhampton Univ; Sch Comp & Informat Technol | 2068 | 109
14 | Univ British Columbia; Fac Commerce & Business Adm | 1821 | 26
15 | Univ Illinois; Grad Sch Lib & Informat Sci | 1710 | 167
16 | Queens Univ; Sch Business | 1651 | 24
17 | Univ N Carolina; Sch Lib & Informat Sci | 1630 | 102
18 | Harvard Univ; Sch Med | 1516 | 143
19 | Univ Georgia; Terry Coll Business | 1484 | 38
20 | Florida State Univ; Coll Business | 1447 | 36
21 | Univ Virginia; Mcintire Sch Commerce | 1413 | 18
22 | Syracuse Univ; Sch Informat Studies | 1273 | 162
23 | Georgia State Univ; Coll Business Adm | 1266 | 24
24 | Univ Calif Irvine; Grad Sch Management | 1261 | 25
25 | Univ Wisconsin; Sch Lib & Informat Sci | 1195 | 71
26 | Royal Sch Lib & Informat Sci; Dept Informat Studies | 1158 | 31
27 | Univ Pittsburgh; Sch Informat Sci | 1150 | 84
28 | Univ So Calif; Marshall Sch Business | 1139 | 28
29 | City Univ Hong Kong; Dept Informat Syst | 1064 | 64
30 | Univ N Texas; Sch Lib & Informat Sci | 1053 | 60
31 | Univ Calif Los Angeles; Grad Sch Educ & Informat Studies | 1015 | 42
32 | Univ S Florida; Coll Business Adm | 992 | 17
33 | Lib Hungarian Acad Sci; Bibliometr Serv | 991 | 6
34 | Katholieke Univ Leuven; Steunpunt O&o Stat | 984 | 20
35 | Univ Arkansas; Sam M Walton Coll Business | 973 | 11
36 | Florida State Univ; Sch Informat Studies | 971 | 53
37 | Csic; Cindoc | 966 | 32
38 | Georgia State Univ; Dept Comp Informat Syst | 966 | 27
39 | Univ Wisconsin; Sch Lib & Informat Studies | 946 | 74
40 | Univ N Carolina; Kenan Flagler Business Sch | 926 | 13
Figure 4. Most intense citations between “LIS” departments.

The citations shown in Table 2 are based on the citation graph of departments, which was generated from the core 46,800 publication records retrieved. Citations from publications outside of this core are not counted, but they are included in the “Times Cited” indicator that is present in each retrieved publication record (TC in Figure 1). The ranking of departments by times cited looks different from that in Table 2; the top departments are presented in Table 3. The best three departments are “Univ Minnesota; Carlson Sch Management”, “Harvard Univ; Sch Med”, and “Univ Maryland; Robert H Smith Sch Business”. Again, departments with fewer publications often have higher times cited counts. An extreme case is “Univ So Calif; Knowledge Syst Lab” with only one publication and the highest times cited per publication in Table 3. Note that the times cited count is not always greater than or equal to citations because the two indicators are based on different citation graphs: the citation graph of articles and the citation graph of departments, respectively. Imagine a department affiliated with one article only that is cited just once, from an article with which three distinct departments are affiliated. In that case, the cited department’s times cited count is 1 and its citations indicator is 3. Thus, the ranks of individual departments in the two rankings can differ significantly. For example, “Univ So Calif; Knowledge Syst Lab” is ranked 10th by times cited but 396th by citations, and “Lib Hungarian Acad Sci; Bibliometr Serv” is 33rd by citations but 155th by times cited. One interpretation may be that “Univ So Calif; Knowledge Syst Lab” is relatively more cited by researchers from other scientific fields than from the library and information science community, whereas “Lib Hungarian Acad Sci; Bibliometr Serv” is relatively more cited from within the community than from outside of it.
There is also one highly ranked “department” by times cited, namely “The Scientist; 3600 Market St”, which is wrongly identified as a department from frequent addresses associated with “The Scientist” journal articles in the WoS data and which is ranked very low by citations. Nevertheless, the correlation between the department rankings by citations and by times cited is still rather high, as will be shown later on. Incidentally, many of the departments listed are not genuine LIS departments but affiliations of authors publishing in journals categorized as ISLS by WoS, which shows the multidisciplinarity of this field. On the other hand, some LIS research is also published in other WoS categories not covered by this study.
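The toy case described above (one article, cited once, with three citing departments) can be checked numerically; the article and department identifiers are invented:

```python
# One cited article A1 affiliated with some department; a single citing
# article A2 carries three distinct departments, so the article-level
# "times cited" is 1 while the department-level "citations" indicator is 3.

article_citations = [("A2", "A1")]               # (citing, cited) article pairs
citing_departments = {"A2": ["D1", "D2", "D3"]}  # departments per citing article

times_cited = sum(1 for _, dst in article_citations if dst == "A1")
dept_citations = sum(len(citing_departments[src])
                     for src, dst in article_citations if dst == "A1")
print(times_cited, dept_citations)  # 1 3
```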

Table 3. Top 40 “LIS” departments by times cited.
Rank | Department | Times Cited | Publications
1 | Univ Minnesota; Carlson Sch Management | 4756 | 71
2 | Harvard Univ; Sch Med | 4051 | 143
3 | Univ Maryland; Robert H Smith Sch Business | 3860 | 52
4 | Indiana Univ; Sch Lib & Informat Sci | 3475 | 243
5 | Queens Univ; Sch Business | 3070 | 24
6 | Rutgers State Univ; Sch Commun Informat & Lib Studies | 2950 | 144
7 | Univ Virginia; McIntire Sch Commerce | 2942 | 18
8 | The Scientist; 3600 Market St | 2922 | 569
9 | Univ Sheffield; Dept Informat Studies | 2761 | 195
10 | Univ So Calif; Knowledge Syst Lab | 2696 | 1
11 | Leiden Univ; Ctr Sci & Technol Studies | 2673 | 84
12 | Univ Arkansas; Sam M Walton Coll Business | 2169 | 11
13 | Univ British Columbia; Fac Commerce & Business Adm | 2167 | 26
14 | Univ Georgia; Terry Coll Business | 2022 | 38
15 | Penn State Univ; Sch Informat Sci & Technol | 2017 | 56
16 | Florida State Univ; Coll Business | 2008 | 36
17 | Georgia State Univ; Coll Business Adm | 1967 | 24
18 | Harvard Univ; Sch Publ Hlth | 1707 | 38
19 | Univ Illinois; Grad Sch Lib & Informat Sci | 1669 | 167
20 | Wolverhampton Univ; Sch Comp & Informat Technol | 1669 | 109
21 | Univ Tampere; Dept Informat Studies | 1621 | 96
22 | City Univ London; Dept Informat Sci | 1580 | 192
23 | Drexel Univ; Coll Informat Sci & Technol | 1535 | 101
24 | Univ Tennessee; Sch Informat Sci | 1488 | 118
25 | City Univ Hong Kong; Dept Informat Syst | 1459 | 64
26 | Univ So Calif; Marshall Sch Business | 1446 | 28
27 | Georgia State Univ; Robinson Coll Business | 1421 | 25
28 | Univ Calif Irvine; Grad Sch Management | 1385 | 25
29 | Univ Western Ontario; Fac Informat & Media Studies | 1332 | 138
30 | Univ S Florida; Coll Business Adm | 1233 | 17
31 | Univ N Carolina; Sch Lib & Informat Sci | 1180 | 102
32 | Syracuse Univ; Sch Informat Studies | 1178 | 162
33 | Stanford Univ; Sch Med | 1162 | 76
34 | Univ Penn; Wharton Sch | 1141 | 49
35 | Georgia State Univ; Dept Comp Informat Syst | 1076 | 27
36 | Brigham & Womens Hosp; Div Gen Med & Primary Care | 1074 | 16
37 | Univ N Carolina; Kenan Flagler Business Sch | 1064 | 13
38 | McGill Univ; Fac Management | 1063 | 20
39 | Univ Western Ontario; Sch Business Adm | 1056 | 2
40 | Carnegie Mellon Univ; Grad Sch Ind Adm | 1046 | 15

We made no attempt to disambiguate and/or unify the names of institutions and suborganizations; we used them as they appear in the WoS data. Instead, we tried to estimate the share of possible duplicate departments. The easiest way to do so was to compute the similarities of all department names in three random samples of 500 departments using a well-known string similarity algorithm and then manually check the department pairs whose similarity reached a certain threshold. The share of duplicate departments determined in this way was always below 1%. Thus, we believe that the absence of name disambiguation and unification (a very time-consuming task) does not significantly affect the results of this study.
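The paper does not name the similarity algorithm used. As one plausible stand-in, Python’s difflib.SequenceMatcher ratio with an assumed threshold can flag candidate duplicate department names for manual checking:

```python
import difflib

def likely_duplicates(names, threshold=0.85):
    """Return pairs of department names whose similarity ratio reaches the
    threshold. Both the algorithm and the threshold value are assumptions,
    standing in for the unnamed method used in the study."""
    pairs = []
    for i, a in enumerate(names):
        for b in names[i + 1:]:
            ratio = difflib.SequenceMatcher(None, a.lower(), b.lower()).ratio()
            if ratio >= threshold:
                pairs.append((a, b))
    return pairs

names = ["Univ Minnesota; Carlson Sch Management",
         "Univ Minnesota; Curtis L Carlson Sch Management",
         "Harvard Univ; Sch Med"]
print(likely_duplicates(names))
```

The two Minnesota name variants (which actually appear in Table 6) are flagged as a candidate pair, while the unrelated Harvard department is not.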

Figure 5. Most intense collaborations between “LIS” departments.

Apart from citations, we can also inspect collaboration patterns. The most intense collaborations between departments are depicted in Figure 5, where the node size depends on the publication count of a department and the edge thickness depends on the number of collaborations. The three most intense collaborations occur between “Univ Illinois; Coordinated Sci Lab” and “Univ Illinois; Grad Sch Lib & Informat Sci” (an intra-institutional collaboration), “Brigham & Womens Hosp; Div Gen Med & Primary Care” and “Harvard Univ; Sch Med”, and “Harvard Univ; Sch Med” and “Harvard Univ; Sch Publ Hlth” (also an intra-institutional collaboration). “Harvard Univ; Sch Med” is the “centre” of the biggest community in Figure 5 collaborating with four “Brigham & Womens Hosp” departments, with another “Harvard Univ” department, and with “Childrens Hosp; Div Emergency Med”. The share of intra-institutional interactions is substantially greater with collaborations than with citations—we found that almost 22% of all 22,569 collaborations were intra-institutional. As for the strength of the relationship between citations and collaborations, it does not seem meaningful to draw any conclusions from our data since only about 6% of collaborations occurred more than once and only about 1.5% of citations occurred more than ten times.
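A minimal sketch of how such collaboration links, and the share of intra-institutional ones, can be counted from per-publication department lists; the two publications below are invented:

```python
from collections import Counter
from itertools import combinations

def collaboration_edges(publications):
    """Count collaborations between departments co-occurring on a publication.
    Each publication is a list of "Institution; Suborganization" strings."""
    edges = Counter()
    for depts in publications:
        for a, b in combinations(sorted(set(depts)), 2):
            edges[(a, b)] += 1
    return edges

pubs = [["Harvard Univ; Sch Med", "Harvard Univ; Sch Publ Hlth"],
        ["Harvard Univ; Sch Med",
         "Brigham & Womens Hosp; Div Gen Med & Primary Care"]]
edges = collaboration_edges(pubs)
# An edge is intra-institutional when both departments share the institution part.
intra = sum(n for (a, b), n in edges.items()
            if a.split(";")[0] == b.split(";")[0])
print(intra, sum(edges.values()))  # 1 2
```

Here one of the two collaboration links is intra-institutional, mirroring (on a toy scale) the 22% share reported above.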

In addition to the rankings by citations or times cited, we also created other rankings of “LIS” departments based on other indicators: Publications (the number of publications), Indegree (like citations but with all weights in the citation graph of departments set to 1), AvgTimesCited (average times cited per publication), HindexByTimesCited (the h-index as defined by Hirsch [15], based on times cited), HindexByEdges (based on citations within the graph), HITS [16], PageRank [17], and Weighted PageRank [18]. From these other eight rankings, we only show the top 40 departments by PageRank and weighted PageRank in Table 4 and Spearman’s rank correlations between all the rankings in Table 5 (all significant at the 0.01 level, two-tailed).
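Of the indicators listed above, the h-index is the simplest to state precisely. A minimal implementation of Hirsch’s definition (the largest h such that h publications each have at least h citations), with hypothetical per-publication counts:

```python
def h_index(citation_counts):
    """Largest h such that h publications have at least h citations each."""
    counts = sorted(citation_counts, reverse=True)
    h = 0
    for i, c in enumerate(counts, start=1):
        if c >= i:
            h = i
        else:
            break
    return h

# Hypothetical per-publication citation counts for one department:
print(h_index([10, 8, 5, 4, 3]))  # 4: four publications have >= 4 citations
print(h_index([25, 8, 5, 3, 3]))  # 3
```

Fed with times cited counts this gives HindexByTimesCited; fed with within-graph citation counts it gives HindexByEdges.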

The PageRank and weighted PageRank rankings are the most highly correlated rankings of all, with a rank correlation coefficient of 0.996; the first difference between the two rankings occurs at rank 5, where “Haifa Univ; Dept Geog” appears by PageRank and “Univ Minnesota; Carlson Sch Management” by weighted PageRank. Otherwise, the rankings in Table 4 are quite similar to each other but less similar to the ranking by citations (correlation about 0.83) and even less to the ranking by times cited (around 0.69). PageRank-like algorithms (and also HITS) are iterative recursive methods dependent on the structure of the citation graph of departments and are therefore much more closely related to citations than to times cited. Although the top departments shown in Table 4 do not resemble those in Table 2 and Table 3, the overall rankings are still quite strongly correlated with all other rankings except Publications. The weakest correlation we found was between Publications and AvgTimesCited (only about 0.2). Publications is also the ranking most distant from all others, with an average correlation of 0.483.
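To illustrate the iterative nature of these methods, here is a minimal power-iteration PageRank on a tiny weighted graph. This is a generic textbook sketch rather than the exact formulations of [17] or [18]: dangling nodes and convergence checks are ignored, and the toy graph is invented.

```python
def pagerank(nodes, edges, d=0.85, iters=100):
    """Power-iteration PageRank on a weighted directed graph.
    `edges` maps (source, target) to a positive weight; with all weights
    set to 1 this reduces to the unweighted variant (cf. Indegree vs.
    citations). Assumes every node has outgoing edges (no dangling nodes)."""
    n = len(nodes)
    pr = {v: 1.0 / n for v in nodes}
    out_weight = {v: sum(w for (s, _), w in edges.items() if s == v)
                  for v in nodes}
    for _ in range(iters):
        new = {v: (1 - d) / n for v in nodes}
        for (s, t), w in edges.items():
            new[t] += d * pr[s] * w / out_weight[s]
        pr = new
    return pr

# Toy department graph: A and B cite each other heavily, both cite C, C cites A.
nodes = ["A", "B", "C"]
edges = {("A", "B"): 2, ("B", "A"): 2, ("A", "C"): 1, ("B", "C"): 1, ("C", "A"): 1}
ranks = pagerank(nodes, edges)
print(max(ranks, key=ranks.get))  # A accumulates the most rank mass
```

Because the scores are redistributed along citation edges at every iteration, a department’s rank depends on the ranks of its citers, which is why these rankings track the internal citation graph much more closely than the external times cited counts.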

Finally, to conclude the section on results, in Table 6 we present examples of the most influential departments (by times cited) of the four leading universities with the greatest times cited counts in our LIS data set. These universities are “Univ Maryland”, “Indiana Univ”, “Georgia State Univ”, and “Univ Minnesota”. We can notice that there are basically two types of performance distribution at institutions: either there is one dominant department, like “Carlson Sch Management” at “Univ Minnesota” or “Robert H Smith Sch Business” at “Univ Maryland” or, to a lesser extent, “Sch Lib & Informat Sci” at “Indiana Univ”, or there are several comparably well performing departments, like “Coll Business Adm”, “Robinson Coll Business”, and “Dept Comp Informat Syst” at “Georgia State Univ”. Although this sample is small, we can assume that influential institutions whose research influence is investigated at the level of departments will generally fit into one of these two basic performance distribution schemes.

Table 4. Top 40 “LIS” departments by PageRank and weighted PageRank.
Rank | PageRank | Weighted PageRank
1 | Inst Studies Res & Higher Educ; Munthes Gt 29 | Inst Studies Res & Higher Educ; Munthes Gt 29
2 | Norwegian Radium Hosp; Inst Canc Res | Norwegian Radium Hosp; Inst Canc Res
3 | Univ Missouri; Med Informat Grp | Univ Missouri; Med Informat Grp
4 | Univ Missouri; Program Hlth Serv Management | Univ Missouri; Program Hlth Serv Management
5 | Haifa Univ; Dept Geog | Univ Minnesota; Carlson Sch Management
6 | Univ Maryland; Dept Geog | Indiana Univ; Sch Lib & Informat Sci
7 | Enea; Cr Casaccia | Haifa Univ; Dept Geog
8 | Univ Washington; Coll Educ | Univ Hull; Inst European Publ Law
9 | Washington State Univ; Edward R Murrow Sch Commun | Univ Hull; Sch Law
10 | Cornell Univ; Coll Agr & Life Sci | Rutgers State Univ; Sch Commun Informat & Lib Studies
11 | Cornell Univ; Coll Vet Med | Enea; Cr Casaccia
12 | Univ Hull; Sch Law | Univ Maryland; Dept Geog
13 | Univ Hull; Inst European Publ Law | Univ Washington; Coll Educ
14 | Univ Minnesota; Carlson Sch Management | Univ Sheffield; Dept Informat Studies
15 | Enea; Res Ctr Casaccia | Cornell Univ; Coll Vet Med
16 | Univ Hamburg; Inst Ethnol | Queens Univ; Sch Business
17 | Univ Calabria; Ctr Ingn Econ & Sociale | Leiden Univ; Ctr Sci & Technol Studies
18 | Enea; Ente Nuove Tecnol Energia Ambiente | Cornell Univ; Coll Agr & Life Sci
19 | Indiana Univ; Sch Lib & Informat Sci | Washington State Univ; Edward R Murrow Sch Commun
20 | Rutgers State Univ; Sch Commun Informat & Lib Studies | Univ British Columbia; Fac Commerce & Business Adm
21 | Queens Univ; Sch Business | Penn State Univ; Sch Informat Sci & Technol
22 | Univ Vermont; Sch Business Adm | Univ Illinois; Grad Sch Lib & Informat Sci
23 | Univ Sheffield; Dept Informat Studies | Univ Maryland; Robert H Smith Sch Business
24 | Univ Virginia; Mcintire Sch Commerce | Harvard Univ; Sch Med
25 | Leiden Univ; Ctr Sci & Technol Studies | Enea; Res Ctr Casaccia
26 | Univ Maryland; Hlth Sci Lib | Univ Tennessee; Sch Informat Sci
27 | Univ Illinois; Grad Sch Lib & Informat Sci | Univ Vermont; Sch Business Adm
28 | Univ Michigan; Alfred Taubman Med Lib | Univ Virginia; Mcintire Sch Commerce
29 | Univ Texas; Grad Sch Business | Univ Penn; Wharton Sch
30 | Harvard Univ; Sch Med | Univ Tampere; Dept Informat Studies
31 | Natl & Univ Lib Iceland; Interlib Loans Document Delivery Dept | Univ Calif Irvine; Grad Sch Management
32 | Reykjavik Univ; European Documentat Ctr | Univ Maryland; Hlth Sci Lib
33 | Georgia State Univ; Coll Business Adm | Georgia State Univ; Coll Business Adm
34 | Univ Western Ontario; Sch Business Adm | Univ Georgia; Terry Coll Business
35 | Univ Calif Irvine; Grad Sch Management | Carnegie Mellon Univ; Grad Sch Ind Adm
36 | Univ British Columbia; Fac Commerce & Business Adm | City Univ London; Dept Informat Sci
37 | Syracuse Univ; Sch Informat Studies | Univ Michigan; Alfred Taubman Med Lib
38 | Univ Michigan; Head Hlth Sci Lib | Univ N Carolina; Sch Lib & Informat Sci
39 | Oregon State Univ; Dept Journalism | Drexel Univ; Coll Informat Sci & Technol
40 | Carnegie Mellon Univ; Grad Sch Ind Adm | Syracuse Univ; Sch Informat Studies
Table 5. Spearman’s rank correlation coefficients between various rankings.
Ranking | AvgTimesCited | Citations | Indegree | Publications | TimesCited | HindexByEdges | HindexByTimesCited | HITS | PR | PR weighted
AvgTimesCited | 1 | 0.7009 | 0.7055 | 0.2045 | 0.9513 | 0.6944 | 0.7048 | 0.6785 | 0.6358 | 0.6340
HindexByEdges | 0.6944 | 0.7805 | 0.7790 | 0.4974 | 0.7765 | 1 | 0.7879 | 0.7648 | 0.6881 | 0.6887
HindexByTimesCited | 0.7048 | 0.6355 | 0.6312 | 0.6126 | 0.8224 | 0.7879 | 1 | 0.6153 | 0.5978 | 0.6011
PR weighted | 0.6340 | 0.8342 | 0.8385 | 0.3981 | 0.6923 | 0.6887 | 0.6011 | 0.7999 | 0.9958 | 1
Table 6. Top 20 “LIS” departments of four leading universities by times cited.
Univ Maryland | Times Cited | Indiana Univ | Times Cited
Robert H Smith Sch Business | 3860 | Sch Lib & Informat Sci | 3475
Rh Smith Sch Business | 755 | Kelley Sch Business | 1035
Coll Lib & Informat Serv | 597 | Sch Med | 709
Coll Informat Studies | 565 | Sch Business | 254
Asian Div | 480 | Dept Telecommun | 227
Coll Business & Management | 407 | Slis | 221
Dept Decis & Informat Technol | 387 | Grad Sch Business | 213
Dept Comp Sci | 221 | Ctr Social Informat | 142
Coll Lib & Informat Sci | 151 | Sch Publ & Environm Affairs | 141
Inst Adv Comp Studies | 145 | Kelly Sch Business | 121
Dept Informat Syst | 136 | Sch Informat | 89
Dept Geog | 123 | Sch Educ | 70
Sch Med | 100 | Regenstrief Inst Hlth Care | 62
Human Comp Interact Lab | 96 | Sch Journalism | 51
Amer Use Time Project | 72 | Dept Geog | 44
Joint Program Survey Methodol | 69 | Inst Commun Res | 38
Ctr Comp Sci | 65 | Dept Instruct Syst Technol | 35
College Pk | 62 | Dept Polit Sci | 26
Rh Smith Sch | 62 | Dept Amer Studies | 24
Hlth Sci Lib | 58 | Roudebush Va Med Ctr | 21
Georgia State Univ | Times Cited | Univ Minnesota | Times Cited
Coll Business Adm | 1967 | Carlson Sch Management | 4756
Robinson Coll Business | 1421 | Curtis L Carlson Sch Management | 609
Dept Comp Informat Syst | 1076 | Dept Informat & Decis Sci | 100
J Mack Robinson Coll Business | 697 | Sch Journalism & Mass Commun | 91
Comp Informat Syst Dept | 675 | Mis Res Ctr | 66
Robinbson Coll Business | 220 | Dept Geog | 46
Dept Management | 210 | Sch Law | 40
Ctr Proc Innovat & Comp Informat Syst | 194 | Digital Technol Ctr | 38
Ctr Proc Innovat | 119 | Informat & Decis Sci Dept | 35
Coll Business | 77 | Biomed Lib | 32
J Mack Robinson Coll Business Adm | 45 | Dept Psychol | 30
Cis Dept | 40 | E Asian Lib | 24
Business Adm | 36 | Coll Educ & Human Dev | 23
Dept Comp Informat Ssyt | 36 | St Paul Campus Lib | 18
Dept Commun | 34 | Dept Comp Sci & Engn | 17
Policy Res Ctr | 24 | Sch Med | 17
Coll Educ | 12 | Sch Nursing | 14
Pullen Lib | 11 | 1445 Gortner Ave | 13
William Russell Pullen Lib | 11 | Sci & Engn Lib | 13
Dept Sociol | 8 | Walter Lib 108 | 13

4. Conclusions and Future Work

Most large-scale scientometric research at the meso-level is concerned with primary research organizations (institutions), but only a few studies analyze the scientific impact and collaboration of the suborganizations of these institutions. These suborganizations may be called schools, departments, divisions, laboratories, etc., and they themselves may be divided into further suborganizations at lower levels of the organizational hierarchy of an institution. Varying organizational structures, along with ambiguities in the names of suborganizations, may be the reason for the lack of large-scale scientometric analyses at the level of departments. This article tries to bridge this gap in the field of library and information science. The main contributions of this study are the following:

  • We analyzed the bibliographic records of 46,800 journal articles indexed in the Web of Science category “Information Science & Library Science” that were published between 1991 and 2010.

  • We created citation and collaboration networks of level-1 suborganizations that we call departments and we visualized the most intense citations and collaborations between departments.

  • We produced various rankings of “LIS” departments using ten well-known methods and computed the correlations between these rankings.
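One of the ranking methods used in such studies, PageRank, can be computed on a department citation network by simple power iteration. The sketch below is not the paper's implementation; it assumes a damping factor of 0.85 and uniform redistribution for dangling nodes, and the edge list is an invented toy example:

```python
def pagerank(edges, d=0.85, iterations=100):
    """PageRank by power iteration on a directed edge list.

    edges: iterable of (citing, cited) pairs between departments.
    Departments with no outgoing citations (dangling nodes)
    redistribute their score uniformly over all nodes.
    """
    nodes = sorted({n for e in edges for n in e})
    out = {v: [] for v in nodes}
    for src, dst in edges:
        out[src].append(dst)
    n = len(nodes)
    pr = {v: 1.0 / n for v in nodes}
    for _ in range(iterations):
        dangling = sum(pr[v] for v in nodes if not out[v])
        new = {v: (1.0 - d) / n + d * dangling / n for v in nodes}
        for src in nodes:
            for dst in out[src]:
                new[dst] += d * pr[src] / len(out[src])
        pr = new
    return pr


# Toy citation network: departments A, B, D cite C, and C cites A.
scores = pagerank([("A", "C"), ("B", "C"), ("D", "C"), ("C", "A")])
```

In this toy network, the most-cited department C ends up with the highest score, and the scores sum to 1 by construction.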

The main findings of our study confirm that WoS data are sufficient for this purpose and are as follows:

  • Almost 88% of the publications had some address information associated with them, but prior to 1998 only a few publications included addresses other than reprint addresses.

  • “Indiana Univ; Sch Lib & Informat Sci” ranks first by citations, and “Univ Minnesota; Carlson Sch Management” ranks first by times cited.

  • The most intense departmental citation relationship is that of “Wolverhampton Univ; Sch Comp & Informat Technol” citing “Indiana Univ; Sch Lib & Informat Sci”, and the most intense departmental collaboration occurs between “Univ Illinois; Coordinated Sci Lab” and “Univ Illinois; Grad Sch Lib & Informat Sci”.
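Collaboration intensities such as the Illinois pair above can be obtained by counting, for each article, every unordered pair of distinct departments appearing in its author addresses. A minimal sketch with made-up input data:

```python
from collections import Counter
from itertools import combinations

def collaboration_counts(papers):
    """Count co-occurrences of departments across papers.

    papers: iterable of department lists, one list per article.
    Returns a Counter keyed by alphabetically sorted department pairs.
    """
    counts = Counter()
    for depts in papers:
        for pair in combinations(sorted(set(depts)), 2):
            counts[pair] += 1
    return counts


# Hypothetical input: two articles co-authored across departments.
papers = [
    ["Univ Illinois; Coordinated Sci Lab",
     "Univ Illinois; Grad Sch Lib & Informat Sci"],
    ["Univ Illinois; Coordinated Sci Lab",
     "Univ Illinois; Grad Sch Lib & Informat Sci",
     "Indiana Univ; Sch Lib & Informat Sci"],
]
top_pair, weight = collaboration_counts(papers).most_common(1)[0]
```

Deduplicating departments within each paper (via `set`) ensures that multiple authors from the same department do not inflate the pair counts.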

In our future work on scientific performance and collaboration at the level of departments, we would like to focus on other fields of science, other publication sources (e.g., conference proceedings), and other time periods.


Acknowledgments

This work was supported by the European Regional Development Fund (ERDF), project “NTIS—New Technologies for Information Society”, European Centre of Excellence, CZ.1.05/1.1.00/02.0090.

Conflicts of Interest

The author declares no conflict of interest.


References

  1. Bradley, S.J.; Willett, P.; Wood, F.E. Publication and citation analysis of the Department of Information Studies, University of Sheffield, 1980–1990. J. Inf. Sci. 1992, 18, 225–232.
  2. Holmes, A.; Oppenheim, C. Use of citation analysis to predict the outcome of the 2001 research assessment exercise for unit of assessment (UoA) 61: Library and information management. Inf. Res. 2001, 6. Available online: (accessed on 5 October 2013).
  3. Oppenheim, C. The correlation between citation counts and the 1992 research assessment exercise ratings for British library and information science university departments. J. Doc. 1995, 51, 18–27.
  4. Seng, L.B.; Willett, P. The citedness of publications by United Kingdom library schools. J. Inf. Sci. 1995, 21, 68–71.
  5. Webber, S. Information science in 2003: A critique. J. Inf. Sci. 2003, 29, 311–330.
  6. Thomas, O.; Willett, P. Webometric analysis of departments of librarianship and information science. J. Inf. Sci. 2000, 26, 421–428.
  7. Arakaki, M.; Willett, P. Webometric analysis of departments of librarianship and information science: A follow-up study. J. Inf. Sci. 2009, 35, 143–152.
  8. Aina, L.O.; Mooko, N.P. Research and publication patterns in library and information science. Inf. Dev. 1999, 15, 114–119.
  9. Herrero-Solana, V.; Ríos-Gómez, C. Producción latinoamericana en biblioteconomía y documentación en el social science citation index (SSCI) 1966–2003. Inf. Res. 2006, 11, 21–45. (in Spanish).
  10. Meho, L.I.; Spurgin, K.M. Ranking the research productivity of library and information science faculty and schools: An evaluation of data sources and research methods. J. Am. Soc. Inf. Sci. Technol. 2005, 56, 1314–1331.
  11. Yazit, N.; Zainab, A.N. Publication productivity of Malaysian authors and institutions in LIS. Malays. J. Libr. Inf. Sci. 2007, 12, 35–55.
  12. Yan, E.; Sugimoto, C.R. Institutional interactions: Exploring social, cognitive, and geographic relationships between institutions as demonstrated through citation networks. J. Am. Soc. Inf. Sci. Technol. 2011, 62, 1498–1514.
  13. He, B.; Ding, Y.; Yan, E. Mining patterns of author orders in scientific publications. J. Informetr. 2012, 6, 359–367.
  14. Shannon, P.; Markiel, A.; Ozier, O.; Baliga, N.S.; Wang, J.T.; Ramage, D.; Amin, N.; Schwikowski, B.; Ideker, T. Cytoscape: A software environment for integrated models of biomolecular interaction networks. Genome Res. 2003, 13, 2498–2504.
  15. Hirsch, J.E. An index to quantify an individual's scientific research output. Proc. Natl. Acad. Sci. USA 2005, 102, 16569–16572.
  16. Kleinberg, J.M. Authoritative sources in a hyperlinked environment. J. ACM 1999, 46, 604–632.
  17. Brin, S.; Page, L. The anatomy of a large-scale hypertextual Web search engine. Comput. Netw. ISDN Syst. 1998, 30, 107–117.
  18. Fiala, D. Time-aware PageRank for bibliographic networks. J. Informetr. 2012, 6, 370–388.