Article

The Relationship between Administrative Intensity and Student Retention and Success: A Three-Year Study

1 Department of Learning Technologies, University of North Texas, Denton, TX 76207, USA
2 Department of Counseling and Higher Education, University of North Texas, Denton, TX 76203, USA
* Author to whom correspondence should be addressed.
Educ. Sci. 2018, 8(4), 159; https://doi.org/10.3390/educsci8040159
Submission received: 16 August 2018 / Revised: 12 September 2018 / Accepted: 19 September 2018 / Published: 22 September 2018

Abstract

Through the lens of complexity theory and by utilizing the methodological framework set forth in Gander’s 1999 article regarding internal and external organizational elements of administrative intensity, this secondary data analysis study linked the internal organizational elements of administrative intensity to institutional results as evidenced by higher education student retention and graduation rates. Representing institutional investments, efforts, and outcomes from 2004 to 2014, three years of data reporting were gathered from the Integrated Postsecondary Education Data System (IPEDS) and were then cleaned per secondary data analysis techniques. Using canonical correlation analysis, the internal elements of administrative intensity were correlated with student retention and success. Findings indicate relationships between the internal elements of higher education institutions and student retention and success, with success measured by four-year, six-year, and eight-year graduation rates. The discussion includes education policy implications.

1. Introduction

Today’s highly competitive, scrutinized, and politicized higher education environment has shined an intense spotlight on the cost of higher education. In the last four decades, “state policymakers and college administrators have struggled to find a means to compensate for the discrepancy between college expenditures and college revenues. The situation has worsened over time because college expenditures have risen more rapidly than inflation…” ([1]; [2] (p. 720)). The media and academia alike have asserted that ‘administrative bloat’ has reached unprecedented levels and administrative costs are growing at an unsustainable rate [3]. Though not all agree with this, it is certainly common thinking that borders on conventional wisdom [3]. In this environment, institutions of higher education are constantly assessing the effectiveness of academic leadership and striving for higher levels of accountability within the organization [4].
There are perceptions that out-of-control rising administrative costs at institutions of higher education directly contribute to the rising cost of attendance for students in the form of higher tuition [3]. It is the intent of this article to explore why it is important that administrative costs are controlled and scarce resources are allocated efficiently. The purpose of this study is to determine if there is a predictable relationship of administrative intensity, defined as the “empirical fact of using, over time, relatively more administrative resources compared to faculty” [5] (p. 309), to student retention and success, which is measured as four-year, six-year, and eight-year graduation rates. We are seeking to determine the influence of internal administrative intensity and its elements rather than external funding influences since we believe internal administrative intensity to be the broad, overarching construct that can affect student success and retention and the overall performance of the institution. However, several different terms dealing with subcategories of administrative intensity, such as administrative costs and administrative bloat, will be introduced in the paper. Each factor comes with its own unique use, traits, and definition. It was hypothesized in this study that the internal elements of administrative intensity would be related to the levels of student retention and success, with retention and the three graduation-rate measures of success representing the outcome set.
Furthermore, this research sought to answer the question: Does complexity theory explain the relationship between administrative intensity and student retention and success? There is an assumed Newton-esque equal and opposite reaction to administrative bloat [3]. Building from this logic, it is the assertion of this study that, when administrative costs increase beyond a legitimate level, there is inefficient division of resources and a corresponding negative effect on services that directly affect the outcome of the student’s learning. Utilizing the lens of complexity theory, it is proposed that an increase in administrative costs leads the institution to become more autocratic, top-heavy, and bureaucratic, which leads to many open administrative practices and results in lower student retention and success. The independent variables (Set 1) of this study include administrative costs and organizational characteristics and the dependent variables (Set 2) are student retention and success. Student success is indicated by reported percentages for four-year, six-year, and eight-year graduation rates.
The research design is a secondary data analysis utilizing the Integrated Postsecondary Education Data System (IPEDS), which is an industry-standard data set reported by all institutions of higher education in the United States. IPEDS reports on a multitude of measures for higher education including (but not limited to) admissions and enrollment, graduation, retention, financial measures, human resource measures, and test scores [6]. Using these data and the theoretical model introduced by Gander [5], a canonical correlation analysis was conducted on three years of data collected from IPEDS for fiscal years 2012 to 2014, with the fiscal year mirroring the traditional academic year of September to August.

2. Literature Review

The review of existing literature was performed within the EBSCOHost database and covered the last 10 years. Seminal articles from prior to this period were introduced as well. The search criteria centered on “higher education administration” and “complexity theory.” For purposes of this study, only peer-reviewed academic articles were included in the review of literature.

2.1. Higher Education Administration

If one were to do an Internet search of “higher education administration” (or something similar), it would result in a wide variety of media related to the cost of higher education. Hedrick et al. [3] produced a study that actually questioned the common narrative of out-of-control administrative expenses in higher education. The results ran quite contrary to conventional wisdom. A more benign, flat trend in higher education administrative expenses was exposed and it was asserted that the entire media narrative regarding higher education administrative expenses may be overblown [3]. However, Hedrick et al. [3] is closer to the exception than the rule. Leslie and Rhoades [7] examined rising administrative costs from 1975 to 1986 and found faster growth in administrative categories than in those related to instructional faculty. Shin and Milton [2] published an empirical review focused on how tuition increases were viewed by and affected students in six different academic majors. The results were split on how each major viewed and dealt with tuition increases. A recent article regarding “middle leadership” in higher education in New Zealand found that higher education management is a fluid and highly complex endeavor [8]. In two separate publications, a researcher out of London has studied the principles put forth by Wilhelm von Humboldt in 1810 that charted the path for the founding of the University of Berlin and today’s model for higher education. One article examined Humboldt’s original writings and applied them to today’s British universities [9]. The second study specifically applied Humboldt’s principles to the issues of assessment and university management through the lens of complexity theory [10]. Other researchers have studied higher education administration in China and its hierarchical administrative structure and proposed four different models for Chinese institutions to follow [11].
Another common sub-discussion that arises when reviewing higher education is that of funding. A study from Arizona State University attempted to explain the failure of performance-based funding for higher education due to the complexity of the higher education ecology [12]. Gander [5] considered funding as a method of control for the institution and established a link to administrative intensity. Klein [13] argued that higher education was actually at a disadvantage with lawmakers over increased funding due to the ability of institutions to simply raise tuition at the students’ expense. Several articles that presented the intricacies of higher education funding in Europe [14,15,16,17] and Africa [18] were included in the literature. A longitudinal analysis observed the ratio of appropriations to personal income over a twenty-year period and identified distinctive patterns of government support for higher education [19].
The link between administrative efficiency and student or institution effectiveness is elusive at best in the literature. The most prescient article found in the literature sought to establish a relationship between “administrative intensity” and institutional control using a model that partitioned administrative and faculty ratios from funding dependency [5]. Gander’s [5] model informed this study and helped to guide the focus on the internal and external influences in the relationship between administrative intensity and student success and retention. Researchers from Pace University examined the relationship between donations and the administrative efficiency of the University and postulated that, as administrative expense ratios increased, giving decreased [20]. It was postulated within Gander’s [5] premise that the reason for “using relatively more administrative resources compared to faculty has to do with the increased regulations and reporting requirements imposed on higher education institutions as a result of government policy actions over the past 30 years” [7,21], as cited in Reference [5] (p. 310). As such, it is expected that increased administrative expenses related to management and decreased investment in faculty and student support and instruction have the potential to draw resources away from efforts directly related to student retention and success.

2.2. Complexity Theory

The theoretical foundation used for this study to explain the predictability of administrative intensity upon student success and retention is complexity theory. Tong and Arvey defined complexity theory broadly as “an evolving system driven simultaneously by forces of order and disorder” [22] (p. 653) and explained that a complex system undergoes interactions among multiple agents or individuals. This produces a random dynamic that veers toward chaos but eventually yields to emerging patterns that prevent self-destruction. Wilhelm von Humboldt’s memorandum that became the roadmap for the University of Berlin was the basis for a study by Elton [9] of the University of Manchester, which reinforced Humboldt’s elements as still applicable to higher education today. Elton [9] maintained that Humboldt knew the trials and issues facing higher education even in the early 19th century were caused by a complex intertwining of the educational and research responsibilities of a university with the governance required of a structured, mission-oriented organization. Within complexity theory, a paradoxically divergent, yet parallel, interaction exists between educational instruction and research on the one hand and university governance on the other, which tends to lead a university closer to the edge of chaos while ultimately claiming the vision of an institution of higher education [9,23]. The expectation is that the more complex the organization, the more chaotic the systems could be, which results in a number of independent agents that could retain their disorder as well as spontaneously order themselves into a coherent system that serves the mission and vision of the organization.

3. Materials and Methods

3.1. Functional and Natural Classification of Expenses: The NACUBO FARM

One of the most influential reporting requirements to IPEDS is the reporting of expenses. IPEDS requires institutions to report expenses by functional and natural classification [6]. Natural classification includes expenses in categories such as salaries, wages, benefits, and operational expenses [6]. However, functional classification requires institutions to classify expenses by the intended use or beneficiary of the expense rather than by what the particular expense is [24]. Taken directly from the Financial Accounting and Reporting Manual for Higher Education [24], functional expense classification is delineated into the following categories: Instruction, Research, Public Service, Academic Support, Student Services, Institutional Support, Scholarships and Fellowships, Auxiliary Enterprises, Hospitals, and Independent Operations. Institutions report these functional expense classifications in a variety of ways and to a number of different agencies and organizations.
The National Association of College and University Business Officers (NACUBO) defines each of these expense categories in the Financial Accounting and Reporting Manual [24,25], as follows within §700 Reporting Expenses. Instruction expenses include those for credit and non-credit courses, including academic, vocational, and/or technical instruction, remedial and tutorial instruction, and regular, special, and extension sessions. The Research classification is for all expenses related to the specific organization and intention to produce research, including the administration of research-related grants. Public service activities are non-instructional activities established for the benefit of individuals and groups external to the institution. Academic Support falls into a couple of different subclasses: (a) the preservation, retention, or display of educational and historical items such as a library, gallery, or museum and (b) academic administration personnel such as an academic dean and his/her administrative support personnel. Student Services are those offices of admissions, registrar, financial aid, student organizations, or any other service that exists, as its primary purpose, to “contribute to students’ emotional and physical well-being and intellectual, cultural, and social development outside the context of the formal instruction program” (Student Services, Para. 1). Institutional Support includes activities related to executive management of the institution, fiscal operations, general administration, and public relations and development. Scholarships and Fellowships are generally considered a reduction to revenue and, therefore, would not be classified as an expense and would not be reported as such. However, “if the applied aid exceeds charges to the student (tuition and fees, dormitory, and food service) and the excess is disbursed to the student. The excess disbursed is reported as an expense in the financial statements” (Scholarships and Fellowships, Para. 1). This would include grants-in-aid, trainee stipends, tuition and fee waivers, and prizes to undergraduate and/or graduate students. Auxiliary Enterprises are those activities that produce goods and services for the primary benefit of students, faculty, and staff of the institution for a charged fee. The intent of an auxiliary is to be financially self-supporting and includes services such as housing and dining, parking, printing services, and more. Hospital expenses “include all expenses associated with the patient care operations of a hospital including nursing and other professional services, general services, administrative services, and fiscal services” (Hospitals, Para. 1). Independent Operations include expenses of separately owned and controlled organizations that operate under the fiscal umbrella of the institution but whose assets are wholly owned by the agency, not the institution.

3.2. IPEDS

For this study, data were collected from the Integrated Postsecondary Education Data System (IPEDS), which is a survey collection of data from all institutions of higher education in the United States. As a product of the National Center for Education Statistics, which is a division of the U.S. Department of Education, IPEDS requires institutions of higher education to submit various measures of higher education including enrollment figures, institutional characteristics, financial and human resources related data, student success, and many more [6]. IPEDS data collection varies depending on the measure but includes three collection dates per year with different measures available at different times (see Figure 1 for the 2016–2017 survey collection schedule). From these survey submissions, IPEDS is able to display various critical measures of higher education performance and general information to the general public [6].
Data collection from IPEDS uses a dynamic query system. The user has the option to choose one institution at a time or to compare multiple institutions simultaneously. The user then must choose the variables to view and the years in which to view them. Once all information has been entered, IPEDS then outputs the data into a Microsoft (MS) Excel file. For this analysis, the data within the MS Excel files were categorized and organized within MS Access due to the sheer size and volume of the data. The variables (with few exceptions) that Gander [5] introduced were replicated within the database. To facilitate entry into IBM SPSS Statistics 24, database tables were exported to an MS Excel workbook. Lookup tables were used within MS Excel to append the variable data to a single MS Excel worksheet for each of the years of interest. Due to the reporting requirements and completion of the reporting cycles, the data used for this study were from the 2012, 2013, and 2014 academic years. The initial data files were downloaded in Fall 2016 and additional data for these years were downloaded from IPEDS during the summer of 2017. Further checks for completion and finalization of the 2012, 2013, and 2014 data were made in March 2018. The use of the finalized data sets ensured that the analysis included the most accurate data for each of the universities included in the study. The variables based upon the prior administrative intensity work and their descriptive statistics are presented in Table 1.
As mentioned above, there were instances where the data gathered did not allow for a direct replication of Dr. Gander’s work and necessitated a change in calculation or definition. Among these are variables Q1 and Q2, which were presented as full-time equivalent (“FTE”) students in Dr. Gander’s article and were essentially calculated by adding all student credit hours together and dividing by a full-time, 30-hour-per-year course load. This analysis used pure headcount since it gives a much clearer picture of the complexity of the organization by including all students enrolled. It was postulated that the FTE calculation may “water down” the number of students. Headcount gives a direct number of students the institution must serve and, therefore, could increase the administrative and faculty resources required to serve these students. This increases the administrative intensity of the university. Another area where this analysis built on Gander’s definitions is in the addition of the FACCOST variable, which is a measure of salaries and benefits for instruction, research, and public service. This was done to make a more equitable comparison between the variables for average compensation for faculty and administrators, PF and PA. In Gander [5], PF was a data point gathered from the original source. In this study, PA and PF were calculated as average administrative or faculty pay with benefits divided by the total full-time staff for each variable (administrative or instructional staff). Another additional variable is S, which denotes whether a school is selective or not in its admissions processes.
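These derivations are straightforward to reproduce from an IPEDS export. The sketch below is in Python rather than the MS Access/Excel and SPSS workflow the study actually used, and the column names are hypothetical placeholders for the corresponding IPEDS fields.

```python
import pandas as pd

# Minimal sketch of the variable derivations described above, using
# hypothetical column names for an IPEDS export (real IPEDS field
# names differ by survey year and file).
ipeds = pd.DataFrame({
    "faccost": [40_436_681, 12_500_000],   # salaries/benefits: instruction, research, public service
    "admcost": [14_036_123,  4_200_000],   # salaries/benefits: academic + institutional support
    "f_fulltime": [600, 180],              # full-time instructional/research/public-service staff
    "a_fulltime": [450, 120],              # full-time administrative/professional staff
    "q1_headcount": [20_000, 4_000],       # undergraduate headcount (not FTE)
    "q2_headcount": [6_000, 500],          # graduate headcount
})

# Average faculty and administrative pay with benefits (PF, PA).
ipeds["PF"] = ipeds["faccost"] / ipeds["f_fulltime"]
ipeds["PA"] = ipeds["admcost"] / ipeds["a_fulltime"]

# Total headcount (Q), administrative intensity (A/F), and the
# academic organizational complexity index (Q2/Q1).
ipeds["Q"] = ipeds["q1_headcount"] + ipeds["q2_headcount"]
ipeds["A_F"] = ipeds["a_fulltime"] / ipeds["f_fulltime"]
ipeds["Q2_Q1"] = ipeds["q2_headcount"] / ipeds["q1_headcount"]

print(ipeds[["PF", "PA", "Q", "A_F", "Q2_Q1"]])
```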

3.3. Respondents and Variables Used

Of the more than 7500 schools that report to IPEDS, the number of universities and colleges of interest to this study was 3146. The types of universities and colleges included within this study were 4-year, private and public non-profit schools offering bachelor’s and advanced degrees. The number of schools whose data are included within each of the study’s variables is shown in Table 2. These range from 1780 to 3146 schools. This difference in the number of schools is related to data entry values of zero (0) returned from IPEDS, which were likely default entries caused by no entry rather than reported values. These zero values were recoded as missing data for full-time retention and the four-year, six-year, and eight-year graduation rates. For entries that contained only zeroes in the calculation of administrative intensity, lnA/F resulted in a missing data entry in the school’s data row because the natural logarithm of zero is undefined.
The minimum and maximum values for each of the variables provide the endpoints to calculate the ranges. The mean and standard deviation offer additional information to estimate the variability of the reported measures by the schools. Each of the 50 states within the United States was represented by the schools’ data exported from IPEDS for this analysis. The type of institutional control (TOC) variable indicated that 76% of the schools included were privately funded. Between 5% and 6% of the schools reporting indicated that they had a medical school. Research expenditures were reported by between 15% and 16% of the schools each year.
Although the data were collected and considered, the values for TOC, enrollment selectivity, public sources of funds per $ revenue (PUBSOR), and private sources of funds per $ revenue (PRISOR) were not used in the final analysis. The TOC variable was considered for inclusion. Preliminary exploratory analysis indicated that TOC would not be an adequate predictor for differences related to student retention and success. There were no differences in predicting student retention and success when comparing the publicly and privately funded schools. The determination of selectivity is based upon the use of qualifying scores for selection. However, schools that were categorized as not selective included qualifying score values within IPEDS. As such, there was not a definitive method to extract data that would positively identify schools that were selective and correctly identify schools that were categorized as not selective. Therefore, the enrollment selectivity variable was removed from the statistical analysis. Due to the large amount of missing data for PUBSOR and PRISOR, it was determined through regression analysis that these values would result in little explanation of the variation. This was consistent with Gander’s [5] findings.

3.4. Secondary Data Analysis Techniques

Secondary data analysis techniques include the recoding of variables into dummy codes or missing data in order to facilitate the accuracy of the statistical analysis processes and to organize the data in such a way as to perform analyses that answer the research questions. A key element of secondary data analysis is that the research questions and analysis are different from the original intention behind the collection and analysis of the data. The primary purpose of IPEDS data collection is to maintain a repository of data to be used to connect research, policy, and practice [26]. This analysis is meant to examine the relationships between the internal factors related to administrative intensity, student retention, and success.
Recoding of the variables was necessary to mitigate analysis of entries that were incomplete and misleading. For example, a college reports 100% retention and does not report graduation rates. The data row would indicate that the school had 100% retention and then zero values for each of the graduation rates. This is not likely to have occurred. In such a situation, the recode of zero values for graduation rates to missing data allowed for the analysis with the retention rate when using pairwise deletion of the data. Any missing data would remove the entire row of data when using listwise deletion during an analysis procedure. Another similar unlikely event is a data row with positive graduation rates for the four-year, six-year, and eight-year entries and a zero entry for retention. The zero entry for retention was recoded as missing data.
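The recoding rules above can be expressed programmatically. The sketch below is illustrative only (the study performed the recoding during data preparation for SPSS); the frame and column names are hypothetical.

```python
import numpy as np
import pandas as pd

# Hypothetical extract illustrating the recoding rules described above.
df = pd.DataFrame({
    "FTRetention": [100.0, 0.0, 85.0],
    "grad4": [0.0, 45.0, 40.0],
    "grad6": [0.0, 60.0, 55.0],
    "grad8": [0.0, 62.0, 58.0],
})

grad_cols = ["grad4", "grad6", "grad8"]

# Rule 1: positive retention with all-zero graduation rates is implausible;
# treat the zero graduation rates as missing rather than true zeros.
implausible_grads = (df["FTRetention"] > 0) & (df[grad_cols] == 0).all(axis=1)
df.loc[implausible_grads, grad_cols] = np.nan

# Rule 2: positive graduation rates with zero retention is equally
# implausible; treat the zero retention value as missing.
implausible_ret = (df["FTRetention"] == 0) & (df[grad_cols] > 0).all(axis=1)
df.loc[implausible_ret, "FTRetention"] = np.nan

print(df)
```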

3.5. Statistical Analyses

The reporting of data to IPEDS is encouraged by the National Center for Education Statistics to add to the repository of data. The completeness of the data must be assessed before completing an analysis. Values of zero could indicate a zero value was reported or could be a default entry. Care must be applied to zero values to determine if they were logically correct or required recoding to missing values. Due to the skewness of the data, the natural logarithmic values for the measures for administrative intensity (A/F), average faculty compensation (PF), average administration compensation (PA), and the number of students were calculated to transform the datasets to be more Gaussian in their distributions. The coding for bachelor’s degree institutions (DEG1), master’s degree institutions (DEG2), and doctoral degree institutions (DEG3) was dummy coded with DEG1 as the reference category.
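A minimal sketch of these transformations is shown below, assuming hypothetical column names; it mirrors the log transforms and reference-category dummy coding described above rather than the exact SPSS steps used in the study.

```python
import numpy as np
import pandas as pd

# Hypothetical frame with the raw measures; column names are illustrative.
df = pd.DataFrame({
    "A_F": [0.5, 1.2, 3.4],
    "PF": [90_000.0, 120_000.0, 150_000.0],
    "PA": [60_000.0, 80_000.0, 95_000.0],
    "Q": [2_500, 12_000, 45_000],
    "degree_level": ["DEG1", "DEG2", "DEG3"],   # bachelor's, master's, doctoral
})

# Natural-log transforms to pull in the right-skewed distributions.
# np.log of zero returns -inf; replace it with NaN so the value is
# treated as missing, consistent with the recoding rules above.
for col in ["A_F", "PF", "PA", "Q"]:
    df[f"ln{col}"] = np.log(df[col]).replace(-np.inf, np.nan)

# Dummy-code the degree level with DEG1 (bachelor's) as the reference
# category, so only DEG2 and DEG3 enter the analysis.
dummies = pd.get_dummies(df["degree_level"], drop_first=True)  # drops DEG1
df = pd.concat([df, dummies], axis=1)

print(df.filter(regex="^ln|^DEG"))
```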
The model informing this study [5] used logit analysis to assess the prediction of administrative intensity by internal and external factors related to administrative costs and procedures. A first analysis to assess the relationship of administrative intensity, calculated using the modified internal factors for this study, with student retention and success was a linear regression analysis. Even though the analyses resulted in statistically significant relationships (p < 0.001) for each of the 12 regression results (four regression solutions for each of the three years of data), the low R2 values for each outcome (0.147 < R2 < 0.233) led to the adoption of a different analysis protocol, canonical correlation analysis, to answer the research question. This approach would allow for the assessment of the influence of each of the internal factors used in the calculation of administrative intensity and would combine the four elements of student retention and success as a single set instead of individual dependent variables.
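For illustration, the preliminary step of regressing each outcome on the administrative-intensity score can be sketched as follows: one OLS regression per outcome, repeated for each of the three years in the study. The data here are synthetic and the column names hypothetical; the reported R2 range (0.147 to 0.233) comes from the study’s actual IPEDS data, not from this sketch.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm

# Synthetic stand-in for one year of the prepared data.
rng = np.random.default_rng(0)
df = pd.DataFrame({
    "lnA_F": rng.normal(1.0, 0.5, 500),
    "FTRetention": rng.uniform(40, 100, 500),
    "grad4": rng.uniform(10, 80, 500),
    "grad6": rng.uniform(20, 90, 500),
    "grad8": rng.uniform(25, 95, 500),
})

# One regression per retention/success outcome on the intensity score.
X = sm.add_constant(df[["lnA_F"]])
for outcome in ["FTRetention", "grad4", "grad6", "grad8"]:
    model = sm.OLS(df[outcome], X, missing="drop").fit()
    print(f"{outcome}: R^2 = {model.rsquared:.3f}, p = {model.f_pvalue:.4f}")
```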

3.5.1. Distributional Normality

Multivariate normality can be assessed using either the data that created a scale or the total score. Within this analysis, the transformed variables and the percentages for student retention and success were used within a graphical procedure to calculate Mahalanobis distances to ensure the variables met the assumptions of multivariate normality. The resulting Mahalanobis distances were compared to an expected chi-square value. Cases that exceeded the value were deleted from the data set for analysis. The graphical procedure completed within ordinary least squares regression used chi-square random values generated within IBM SPSS Statistics 24 as the predicted values. The graphics generated were a histogram of regression standardized residuals overlaid with a normal curve, a normal P-P plot of the observed cumulative probability of regression standardized residuals against the expected cumulative probability, and a scatterplot of the regression standardized predicted values against the regression standardized residual values. Observation of the graphical displays confirmed the presence of the outliers before deletion and their absence from the data after deletion.
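A simplified, non-graphical version of this outlier screen can be written directly from the Mahalanobis-distance definition, as sketched below with synthetic data; the study itself used the regression-based graphical procedure in IBM SPSS Statistics 24.

```python
import numpy as np
import pandas as pd
from scipy.stats import chi2

def mahalanobis_screen(df: pd.DataFrame, alpha: float = 0.001) -> pd.DataFrame:
    """Drop rows whose squared Mahalanobis distance exceeds the chi-square
    critical value at the given alpha (degrees of freedom = number of columns).
    A simplified stand-in for the SPSS graphical procedure described above."""
    data = df.dropna()
    center = data.mean().to_numpy()
    cov_inv = np.linalg.pinv(np.cov(data.to_numpy(), rowvar=False))
    diffs = data.to_numpy() - center
    d2 = np.einsum("ij,jk,ik->i", diffs, cov_inv, diffs)  # squared distances
    cutoff = chi2.ppf(1 - alpha, df=data.shape[1])
    return data[d2 <= cutoff]

# Example with hypothetical transformed variables.
rng = np.random.default_rng(1)
sample = pd.DataFrame(rng.normal(size=(1000, 4)),
                      columns=["lnPF", "lnPA", "lnQ", "FTRetention"])
kept = mahalanobis_screen(sample)
print(f"retained {len(kept)} of {len(sample)} rows")
```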

3.5.2. Bivariate Correlation Analysis

The bivariate correlation analysis was conducted to determine the intercorrelations for the study’s variables. Intercorrelations for the internal variables used to calculate the administrative intensity were important to assess possible multicollinearity and an expectation of the potential influence on student retention and success. Due to the large number of responses in the data set, care must be given to look beyond statistical significance and observe practical significance. It is possible that meaningless levels of correlation such as those less than 0.5 could be statistically significant at the p < 0.01 level. Pairwise deletion allowed for the maximum number of comparisons for each calculation to test the study’s hypothesis.
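The sketch below illustrates this step with synthetic data: Pearson intercorrelations computed with pairwise deletion, followed by a screen that keeps only coefficients large enough to be practically meaningful. The 0.5 threshold follows the discussion above; the variable names are placeholders.

```python
import numpy as np
import pandas as pd

# Synthetic stand-in for a few of the study's variables.
rng = np.random.default_rng(2)
df = pd.DataFrame(rng.normal(size=(300, 4)),
                  columns=["lnPF", "lnPA", "FTRetention", "grad6"])
df.iloc[::17, 0] = np.nan  # some missing values, handled pairwise

corr = df.corr(method="pearson")           # pairwise deletion of NaNs (pandas default)
practical = corr.where(corr.abs() >= 0.5)  # keep only |r| >= 0.5 as practically meaningful
print(corr.round(3))
print(practical)
```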

3.5.3. Canonical Correlation Analysis

After completion of 12 ordinary least squares regression analyses and the low levels of R2 obtained for each, other techniques that could provide evidence for or against the research hypothesis were considered. Canonical correlation analysis was determined to be the most appropriate statistical method with which to analyze the three years of data. The research sought to examine the predictive validity of the eight internal elements of administrative intensity, as noted in the hypothesis, for student retention and success. As a multivariate method of analysis, canonical correlation analysis subsumes other parametric methods such as multivariate analysis of variance and regression in the general linear model. The advantages of using canonical correlation analysis for this study are that the multivariate method allows for the interconnectedness of the elements within administrative intensity and the lack of independence of the student retention and success measures. There is an assumption that variables in the sets are on a continuum and may have multiple causes and multiple effects. Canonical correlation analysis permits the comparison between multiple predictors and multiple dependent variables and the shared correlations between the variables in each set. The hypothesis for this study is one in which a combination of the eight elements of administrative intensity might predict student retention and success.
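As an illustration of the technique (not the study’s SPSS procedure), a canonical correlation analysis between an eight-variable predictor set and a four-variable criterion set can be sketched with scikit-learn on synthetic data; the squared canonical correlations correspond to the eigenvalue effect sizes reported in the Results.

```python
import numpy as np
from sklearn.cross_decomposition import CCA

# Synthetic data standing in for the IPEDS-derived variables:
# 8 internal elements (predictor set) and 4 retention/success measures.
rng = np.random.default_rng(3)
n = 1000
X = rng.normal(size=(n, 8))
Y = 0.5 * X[:, :4] + rng.normal(size=(n, 4))

cca = CCA(n_components=4, scale=True)
X_c, Y_c = cca.fit_transform(X, Y)

# Canonical correlations: correlation between paired canonical variates;
# their squares are the eigenvalue effect sizes discussed in the Results.
canon_r = [np.corrcoef(X_c[:, i], Y_c[:, i])[0, 1] for i in range(4)]
eigenvalues = [r ** 2 for r in canon_r]
print("canonical correlations:", np.round(canon_r, 3))
print("squared canonical correlations:", np.round(eigenvalues, 3))
```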

4. Results

The means, standard deviations, minimum, and maximum values for the study’s variables are presented in Table 2. The intercorrelations are presented in Table 3. Although the measure of administrative intensity was a calculation including the eight internal elements, the levels of correlation vary in magnitude. This was evidence that supported the completion of the canonical correlation analysis with the elements of the internal measure as the predictor set and the student retention and success measures as the criterion set to assess their multivariate, shared relationship. Standardized canonical correlation coefficients greater than one often indicate that there is likely a high degree of multicollinearity in the data. Observation of the bivariate correlation coefficients in Table 3 for four-year, six-year, and eight-year graduation rates indicates that there are high levels of correlation between the measurements. As such, it is suspected that these values consistently predict student success and are multicollinear. When completing the distributional normality graphical procedure to calculate the Mahalanobis distances to assess the data for outliers, it was noted that the tolerance values for six-year and eight-year graduation rates were less than 0.1 and the VIF values were greater than 10.0.
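The tolerance and VIF screen can be reproduced as follows. The sketch uses synthetic, deliberately collinear graduation rates to illustrate the 0.1 tolerance and 10.0 VIF rules of thumb mentioned above; it is not the study’s output.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm
from statsmodels.stats.outliers_influence import variance_inflation_factor

# Synthetic, highly correlated graduation rates (hypothetical columns).
rng = np.random.default_rng(4)
base = rng.uniform(20, 90, 500)
df = pd.DataFrame({
    "grad4": base + rng.normal(0, 5, 500),
    "grad6": base + rng.normal(0, 2, 500),
    "grad8": base + rng.normal(0, 2, 500),
})

X = sm.add_constant(df)
for i, col in enumerate(df.columns, start=1):   # index 0 is the constant
    vif = variance_inflation_factor(X.values, i)
    tolerance = 1.0 / vif
    flag = "  <-- multicollinear" if vif > 10 or tolerance < 0.1 else ""
    print(f"{col}: VIF = {vif:.1f}, tolerance = {tolerance:.3f}{flag}")
```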
Each of the three years’ analyses resulted in four canonical functions. The values of the output are displayed in Table 4. Years 2012 and 2013 yielded three statistically significant functions and year 2014 resulted in four statistically significant functions. Although these results were statistically significant, only the first two functions for each year yielded interpretable squared canonical correlation (Rc2) effect sizes, known as eigenvalues, of 0.396 < λ < 0.823. Note in Table 4 that the eigenvalues for the third and fourth functions for each year are quite small: 0.006 < λ < 0.027. The third and fourth functions for each year were omitted from the interpretation and the data are presented in Table 5.
The full model for all three years across all functions was statistically significant with p < 0.001 for each. See Table 4 for the statistical values. The Wilks’s Λ of 0.370 < Λ < 0.380 for functions 1–4 for each year represents the variance unexplained in the model. Therefore, it can be used to calculate the effect size in an r2 metric, 1 − Λ, with 0.620 < 1 − Λ < 0.630. This indicates that the full model can explain approximately 62% to 63% of the variance shared between the variable sets for each year: 2012, 2013, and 2014.
Table 5 presents the canonical solutions. The data included are for functions 1 and 2 for each year of the study. The structure coefficients are reported in the canonical loadings output in IBM SPSS Statistics 24. Structure coefficients greater than an absolute value of 0.3 are regarded as noteworthy and should be considered as relevant criterion variables. Within the table, the squared structure coefficients and communalities (h2) are given for the two functions of each variable. Within the SPSS output, the set 1 and set 2 cross-loadings were smaller in absolute value than the corresponding set 1 and set 2 canonical loadings for functions 1 and 2 in each year: 2012, 2013, and 2014.
In the first function for each year (2012, 2013, and 2014), the variables lnPF, lnPA, lnQ, M, and Deg3 are significantly related to the overall meaning. Additionally, in the first function of each year, the variables FTRetention, 6yrgrad, and 8yrgrad are significantly related to the overall meaning. The second function for each year suggests that the ratio Q2/Q1, measuring the academic organizational complexity index, and DVRES are relevant criterion variables. For function 2, lnPF in 2012 and lnPA in 2013 have structure coefficients with absolute values greater than 0.3, indicating that they could be regarded as noteworthy. Some texts suggest using 0.4 as the value for consideration. Applying this criterion to this study, these two values would not be considered in function 2 and all other predictor variables in function 1 and function 2 would retain their significant relationship to the overall meaning. The 0.4 level would also remove 6yrgrad from function 1 for all three years. The values for FTRetention, 4yrgrad, 6yrgrad, and 8yrgrad are relevant criterion variables for function 2 for all three years.
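Applying the |0.3| and |0.4| cutoffs to a set of structure coefficients is a simple filter, sketched below with placeholder values rather than the coefficients reported in Table 5.

```python
import pandas as pd

# Placeholder structure coefficients (canonical loadings), not Table 5 values.
loadings = pd.Series({
    "lnPF": -0.35, "lnPA": -0.52, "lnQ": -0.61,
    "Q2_Q1": -0.12, "M": -0.44, "DEG2": 0.08, "DEG3": -0.47, "DVRES": -0.21,
})

# Screen against the two commonly cited cutoffs discussed above.
for cutoff in (0.3, 0.4):
    noteworthy = loadings[loadings.abs() > cutoff]
    print(f"|loading| > {cutoff}: {', '.join(noteworthy.index)}")
```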
Structure coefficients that have the same sign are positively related. Those that have the opposite sign are negatively related. In function 1 for all three years, the values for lnPF, lnPA, lnQ, M, and Deg3 are negative, as are the structure coefficients for FTRetention, 6yrgrad, and 8yrgrad, which indicates a positive relationship. In function 2, the sign of the structure coefficients for Q2/Q1 and DVRES is the same as that for FTRetention, 4yrgrad, 6yrgrad, and 8yrgrad. Therefore, these variables have a positive relationship.

5. Discussion

The purpose of this study was to examine the relationship between the internal elements of administrative intensity and student retention and success. The values for the dimensions were drawn from IPEDS to describe the level of the average faculty compensation, the average administration compensation, the number of students at undergraduate and graduate levels, academic complexity, the presence of a medical school, research expenditures, student retention, and student success as measured by graduation rates. The combination of the internal elements was a measure of administrative intensity. The following discussion is organized by the internal element and its potential for influencing student retention and success. All are positively related, which means that, as any of the internal elements increases, so do full-time student retention and graduation rates.

5.1. Average Faculty Pay with Benefits (FACCOST/F)

It was hypothesized that the average pay with benefits for faculty would be related to student retention and success. The FACCOST variable included the salaries and benefits for instruction, research, and public service. These data values do not separate the expenses into full-time and adjunct faculty values. The canonical correlation relationship indicates a positive relationship between the costs associated with faculty pay and student retention and success. The bivariate correlations between lnPF and the measures of student retention and success are less than 0.4, a relationship that supports the canonical correlation findings. Educational policy decisions will need to weigh the potential gains associated with the types of faculty selected for instructional purposes. Earlier shifts from full-time faculty to adjunct faculty to realign institutional resources led to a decline in student retention and success [27]. Compared with adjunct faculty, full-time faculty were noted to have a more positive effect on student retention through encouragement and faculty/student interactions [28].

5.2. Average Administrative Staff Pay with Benefits (ADMCOST/A)

The ADMCOST variable included the salaries and benefits for academic support and institutional support. These values included the administration as well as services such as the library and student affairs. This study hypothesized that the average pay with benefits for administration would be related to student retention and success. A positive relationship was determined within this study. The delineation of the individual elements of academic support and institutional support that most influence student retention and success cannot be completed with the data from IPEDS because the data are combined into a single total. Even though the categories are listed separately in the NACUBO Financial Accounting and Reporting Manual, the data elements in the database aggregate these expense values. The positive relationship observed is related to the combined elements of academic support and institutional support. As such, prior research can help explain the relationships and possible influence on student retention and success. The levels of persistence and academic outcomes have not been found to be related to interactions with administrative offices such as the registrar, financial aid, or the bursar [29]. In another study, first-generation students in counseling and support programs showed increased motivation and self-regulation at the conclusion of their freshman year and graduated at the same rate as other students admitted to the same college or university [30]. In studies related to administrative bloat [3], unsustainable levels of administrative costs could result in higher tuition rates and reduced student retention and success. Financial issues were a reason that students identified as a cause for leaving higher education [31]. Education policy decision makers should consider the services provided by the administration and their return on investment. Taken together, academic support and institutional support are necessary to provide students with the services they need. If decision makers do want to consider specific aspects, they can use this prior research to make investment decisions. Within the administrative support of a college or university, counseling and support services are an area that prior research has found to make a difference in student retention and success. Advising would fall within the positive relationship found in this study.

5.3. Number of Undergraduate and Graduate Student Headcount

It was hypothesized that the total student headcount would be related to student retention and success. The outcomes of the canonical correlation analysis indicate a positive relationship between the number of undergraduate and graduate students and student retention and success. The bivariate correlation analysis indicated that there were no statistically significant intercorrelations between the number of students and the faculty compensation, the administration compensation, and master’s degree institutions for each of the three years in the study. This lack of a relationship with master’s degree institutions is understandable because master’s degree institutions have a wide range of enrollment counts and many master’s degrees are now available online. Unlike prior research [32], an increase in library expenditures was not found to be associated with an increase in the retention of students at the master’s degree level. This is likely due to the growth in the number of online degree programs at the master’s level, for which students are not interacting with librarians [33], an element of the administration compensation calculation.

5.4. Academic Organizational Complexity Index

The academic complexity index was determined by calculating a ratio of graduate student headcount to undergraduate student headcount. This complexity is expected due to the additional resources required for master’s and doctoral level students compared to undergraduate level students. The canonical correlation analysis determined a positive relationship within function 2 between the academic organizational complexity index (Q2/Q1) and student retention and success. The growth of undergraduate research [34] has likely closed the gap in the resources required for undergraduate education. The bivariate correlation between Q2/Q1 and research expenditures (DVRES) indicates that there is a statistically significant intercorrelation between the two variables. However, review of the coefficients’ magnitudes indicates that this is not a practically significant relationship. Examination of the bivariate correlation of master’s institutions (DEG2) and DVRES indicates that these intercorrelations are not statistically significant. Many master’s degree programs are practitioner-based and do not require a research-intensive experience that would include research expenditures.

5.5. Medical School Exists

Using a multiproduct Cobb-Douglas specification, the model of university cost inefficiency revealed that schools with medical education would have higher operating costs and those schools with tenure-track faculty operated more efficiently [35]. The presence of a medical school was hypothesized to be related to student retention and success. This was confirmed in the canonical correlation analysis to be a positive relationship. The statistically significant bivariate correlations between M with lnPA and lnPF reinforce the relationship between the increased costs for faculty and administration and the presence of a medical school.

5.6. Doctoral Institutions

Using data from the 2010 and 2011 Association of College and Research Libraries (ACRL) Metrics database, the strongest positive relationship between library expenditures and student retention was found for doctoral degree granting institutions [33]. The canonical correlation analysis confirmed the hypothesis that there is a relationship between doctoral institutions, DEG3, and student retention and success. The relationship is positive. The bivariate correlation analysis indicates coefficients greater than 0.5 for the relationship between DEG3 and the presence of a medical school. Thus, doctoral degree granting institutions often have a medical school within the institution.

5.7. Research Expenditures

The hypothesis for this study expected a relationship between research expenditures and student retention and success. The canonical correlation analysis confirmed this hypothesis and resulted in a positive relationship. Universities that support participation in undergraduate research opportunities have recorded increased student retention [34]. Engagement in student activities and social groups are known sources of support for students to continue their education [31]. Positive encounters during a student’s enrollment encourage him or her to invest in the college experience [36]. Therefore, the student could be retained and could experience success by continuing their education and graduating.

5.8. Complexity

Colleges and universities are evolving systems that are working to balance order and disorder. Just as Humboldt [23] had proposed about educational institutions, they are still a complex intertwining of education and research. This complexity influences the level of retention and success. Much is known about why students leave a college or university, but more needs to be known about why students continue. It is a “complex web of events that shape student leaving and persistence” [37] (p. 1). The “leaders with high levels of authenticity, cultural intelligence, and trust drive impressive outcomes in higher education in the 21st century” [38] (p. 75). As such, a balance between order and disorder and between chaos and patterns needs to be established in the education policies and decision-making by administration members and faculty. In hierarchical organization structures, the pace of decision-making is much slower than in other organization types because only a select number of members have the authority and responsibility to make decisions. The individual employees have limits on their responsibility for the organization. The adaptation to changing business conditions is reduced [39]. To manage the complexity, the emergence of leaders at all levels, including the administration, the faculty, and the students, needs to be developed. The second research question asked if complexity theory explained the relationship between administrative intensity and student retention and success. Although there was an expectation of finding a negative relationship involving average administration compensation, the actual result was a positive relationship in the canonical correlation analysis between the internal elements of administrative intensity and student retention and success. These complex and chaotic systems of universities with higher levels of administrative intensity seemingly created order within their organizations and retained a positive relationship where a negative relationship was expected between administrative intensity and student outcomes. The more complex universities retained the order, pattern, and structure necessary for student retention and success.

6. Conclusions

The purpose of this study was to determine if there is a predictable relationship between administrative intensity and student retention and success. The canonical correlation analysis confirmed that a positive relationship does exist between the internal elements of administrative intensity and student retention and success. An aspect for education policy decision-makers to consider is that the retention of students in a college or university directly affects revenue flows in areas such as housing, food, tuition, fees, and bookstore purchases. Successful students often return to their alma mater for sporting events and, possibly, donate funds for scholarships or other activities. A complex relationship exists between the student and the college or university. Education policy makers should consider how to maintain this balance.
One limitation of this study is with regard to how IPEDS data is captured in varying organizational structures. Institutions that belong to larger governance or corporate structures often pay for services to be provided by sister entities/institutions. Expenses that are in-sourced are recorded as transfers instead of actual operating expenses for the institution. For example, if University A belongs to a System and the System provides administrative services such as payroll, human resources, information technology, finance, etc., the cost of those services would be a transfer for University A and the expenses (salaries, wages, benefits, maintenance and operational costs, etc.) would fall under the System, which artificially deflates institutional support expenses for University A. This limitation could be mitigated in further study by comparing institutions at the top level of their governance structures (i.e., at the system/corporate level). Making this comparison would shed further light on whether these organizational models, which have become the norm in recent decades, have contributed to student outcomes or inhibited them.
Not all of the reasons why students leave their college or university are within the control of the school [33]. However, interventions such as programs that help “with life skills, academic strategies, and a sense of a belonging to help students succeed beyond their first semester” [30] (p. 321) have been found to be beneficial for student retention and success.

Author Contributions

Conceptualization, K.D.R. and R.B.; Methodology, K.D.R. and R.B.; Data Curation, K.D.R., R.B., and K.A.R.; Formal Analysis, K.D.R. and R.B.; Writing-Original Draft Preparation, K.D.R., R.B., and K.A.R.; Writing-Review & Editing, K.D.R. and R.B.

Funding

This research received no external funding.

Acknowledgments

The authors gratefully acknowledge the helpful comments by James P. Gander during the initial conceptualization of the research design and the feedback by the reviewers of the paper.

Conflicts of Interest

The authors declare no conflict of interest.

References

  1. St. John, E.P. Changes in pricing behavior during the 1980s: An analysis of selected case studies. J. High. Educ. 1992, 63, 165–187. [Google Scholar] [CrossRef]
  2. Shin, J.C.; Milton, S. Student response to tuition increase by academic majors: Empirical grounds for a cost-related tuition policy. High. Educ. 2008, 55, 719–734. [Google Scholar] [CrossRef]
  3. Hedrick, D.W.; Wassell, C.S.; Henson, S.E. Administrative costs in higher education: How fast are they really growing? Educ. Econ. 2009, 17, 123–137. [Google Scholar] [CrossRef]
  4. Cardno, C. Images of academic leadership in large New Zealand polytechnics. J. High. Educ. Policy Manag. 2013, 35, 124. [Google Scholar] [CrossRef]
  5. Gander, J.P. Administrative intensity and institutional control in higher education. Res. High. Educ. 1999, 40, 309–322. [Google Scholar] [CrossRef]
  6. U.S. Department of Education. Institute of Education Sciences, National Center for Education Statistics. Integrated Postsecondary Education Data System. Available online: http://nces.ed.gov/ipeds/surveys/2005/pdf/f1a_form.pdf (accessed on 15 November 2016).
  7. Leslie, L.L.; Rhoades, G. Rising Administrative Costs: Seeking Explanations. J. High. Educ. 1995, 66, 187–212. [Google Scholar] [CrossRef]
  8. Branson, C.; Franken, M.; Penney, D. Middle leadership in higher education: A relational analysis. Educ. Manag. Adm. Leadersh. 2016, 44, 128–145. [Google Scholar] [CrossRef]
  9. Elton, L. Collegiality and complexity: Humboldt’s relevance to British universities today. High. Educ. Q. 2008, 62, 224–236. [Google Scholar] [CrossRef]
  10. Elton, L. Complexity theory—an approach to assessment that can enhance learning and—more generally—could transform university management. Assess. Eval. High. Educ. 2010, 35, 637–646. [Google Scholar] [CrossRef]
  11. Berger, J.B.; Hudson, K.E.; Blanco Ramírez, G. How universities work: Understanding higher education organization in northwest China. Educ. Policy Anal. Arch. 2013, 21, 64. [Google Scholar] [CrossRef]
  12. Nisar, M.A. Higher education governance and performance based funding as an ecology of games. High. Educ. 2015, 69, 289–302. [Google Scholar] [CrossRef]
  13. Klein, M.W. Settling a U.S. Senatorial Debate: Understanding Declines in State Higher Education Funding. J. Educ. Financ. 2015, 41, 1–29. [Google Scholar]
  14. Barr, N. Higher education funding. Oxford Rev. Econ. Policy 2004, 20, 264–283. [Google Scholar] [CrossRef]
  15. Cretan, G.C.; Gherghina, R. Funding higher education in a few EU countries: Implications for competition and competitiveness in higher education. J. Knowl. Manag. Econ. Inf. Technol. 2015, 1, 1–12. [Google Scholar]
  16. Dragoescu, R.M.; Oancea, B. Trends in funding higher education in Romania and EU. Manager 2014, 19, 7–17. [Google Scholar]
  17. Johnston, R. England’s new scheme for funding higher education through student fees: ‘Fair and progressive’? Polit. Q. 2013, 84, 200–210. [Google Scholar] [CrossRef]
  18. Teferra, D. Funding higher education in Africa: State, trends and perspectives. J. High. Educ. Afr. 2013, 11, 19–51. [Google Scholar]
  19. McLendon, M.K.; Hearn, J.C.; Mokher, C.G. Partisans, professionals, and power: The role of political factors in state higher education funding. J. High. Educ. 2009, 80, 686–713. [Google Scholar] [CrossRef]
  20. Tinkelman, D.; Mankaney, K. When is administrative efficiency associated with charitable donations? Nonprofit Volunt. Sect. Q. 2007, 36, 41–64. [Google Scholar] [CrossRef]
  21. Volkwein, J.F.; Malik, S.M. State regulation and administrative flexibility at public universities. Res. High. Educ. 1997, 38, 17–42. [Google Scholar] [CrossRef]
  22. Tong, Y.K.; Arvey, R.D. Managing complexity via the Competing Values Framework. J. Manag. Dev. 2015, 34, 653–673. [Google Scholar] [CrossRef]
  23. Humboldt, W.V. On the spirit and organisational framework of intellectual institutions in Berlin. Minerva 1970, 8, 242–267. [Google Scholar]
  24. Financial Accounting and Reporting Manual for Higher Education. Available online: http://www.nacubo.org/Products/Financial_Accounting_and_Reporting_Manual_FARM.html (accessed on 15 November 2016).
  25. National Association of College and University Business Officers (NACUBO). Available online: http://www.nacubo.org (accessed on 15 November 2016).
  26. Institute for Education Sciences. Available online: https://ies.ed.gov/aboutus/ (accessed on 15 March 2018).
  27. Umbach, P.D. How effective are they? Exploring the impact of contingent faculty on undergraduate education. Rev. High. Educ. 2007, 30, 91–123. [Google Scholar] [CrossRef]
  28. Hutto, P.N. The Relationship Between Student Retention in Community College Courses and Faculty Employment Status. Community Coll. J. Res. Pract. 2017, 41, 4–17. [Google Scholar] [CrossRef]
  29. Southwell, K.H.; Whiteman, S.D.; MacDermid Wadsworth, S.M.; Barry, A.E. The Use of University Services and Student Retention: Differential Links for Student Service Members or Veterans and Civilian Students. J. Coll. Stud. Retent. Res. Theory Prac. 2016, 19, 394–412. [Google Scholar] [CrossRef]
  30. Wibrowski, C.R.; Matthews, W.K.; Kitsantas, A. The Role of a Skills Learning Support Program on First-Generation College Students’ Self-Regulation, Motivation, and Academic Achievement: A Longitudinal Study. J. Coll. Stud. Retent. Res. Theory Prac. 2016, 19, 317–332. [Google Scholar] [CrossRef]
  31. Martin, J.M. It Just Didn’t Work Out. J. Coll. Stud. Retent. Res. Theory Prac. 2015, 19, 176–198. [Google Scholar] [CrossRef]
  32. Mezick, E.M. Return on Investment: Libraries and Student Retention. J. Acad. Librariansh. 2007, 33, 561–566. [Google Scholar] [CrossRef]
  33. Eng, S.; Stadler, D. Linking Library to Student Retention: A Statistical Analysis. Evid. Based Libr. Info. Prac. 2015, 10, 50–63. [Google Scholar] [CrossRef] [Green Version]
  34. Gregerman, S.R.; Lerner, J.S.; Hippel, W.V.; Jonides, J.; Nagda, B.A. Undergraduate Student-Faculty Research Partnerships Affect Student Retention. Rev. High. Educ. 1998, 22, 55–72. [Google Scholar] [CrossRef]
  35. Sav, G.T. Stochastic Cost Inefficiency Estimates and Rankings of Public and Private Research and Doctoral Granting Universities. J. Knowl. Manag. Econ. Info. Technol. 2012, 2, 1–13. [Google Scholar]
  36. Pascarella, E.T.; Terenzini, P.T. How College Affects Students: A Third Decade of Research, 2nd ed.; Jossey-Bass: San Francisco, CA, USA, 2005. [Google Scholar]
  37. Tinto, V. Research and Practice of Student Retention: What Next? J. Coll. Stud. Retent. Res. Theory Prac. 2006, 8, 1–19. [Google Scholar] [CrossRef]
  38. Lee, R.B. Comparative Case Study Analysis of Collaboration and Perceptions of Merging Academic Affairs and Student Affairs Divisions and the Impact on Student Success. Ph.D. Thesis, Delaware State University, Dover, Delaware, 2017. [Google Scholar]
  39. Petrou, P.; Demerouti, E.; Schaufeli, W.B. Crafting the Change: The Role of Employee Job Crafting Behaviors for Successful Organizational Change. J. Manag. 2018, 44, 1766–1792. [Google Scholar] [CrossRef]
Figure 1. The IPEDS 2016–2017 Data Reporting and Release dates. Note that the final files have an 18-month lag from reporting to release.
Table 1. Theoretical variable values by year: 2012, 2013, and 2014.
Variable | Definition/Calculation | Year | Mean | Min | Max | Std. Deviation
ADMCOST | Salaries and Benefits for Academic Support and Institutional Support ($) | 2012 | 14,036,123 | 0 | 801,099,000 | 41,269,634
 | | 2013 | 14,717,889 | 0 | 851,853,000 | 43,703,407
 | | 2014 | 15,404,125 | 0 | 890,335,000 | 45,429,712
FACCOST | Salaries and Benefits for Instruction, Research and Public Service ($) | 2012 | 40,436,681 | 0 | 1,650,127,000 | 134,862,339
 | | 2013 | 41,898,823 | 0 | 1,759,393,528 | 43,074,849
 | | 2014 | 50,827,954 | 0 | 2,632,559,000 | 45,436,924
Q1 | Undergraduate Headcount Enrollment | 2012 | 3877 | 1 | 69,380 | 6642
 | | 2013 | 3804 | 2 | 66,298 | 6547
 | | 2014 | 3847 | 1 | 155,872 | 7263
Q2 | Graduate Headcount Enrollment | 2012 | 1440 | 1 | 41,513 | 2764
 | | 2013 | 1406 | 1 | 42,811 | 2749
 | | 2014 | 1417 | 1 | 43,228 | 2907
F | Total All Faculty Instructional, Research, and Public Service Full-time | 2012 | 209 | 1 | 5947 | 430
 | | 2013 | 212 | 1 | 5971 | 441
 | | 2014 | 217 | 1 | 6068 | 452
A | Administrative and Professional Staff Full-time | 2012 | 349 | 0 | 21,258 | 1169
 | | 2013 | 381 | 0 | 13,981 | 1002
 | | 2014 | 309 | 0 | 13,896 | 986
Q | All Students (Q1 + Q2) | 2012 | 4178 | 0 | 77,734 | 7928
 | | 2013 | 4209 | 0 | 77,338 | 7912
 | | 2014 | 4308 | 0 | 195,059 | 8777
TOC | Type of Institutional Control | 2012 | 0.76 | 0 = “Public” | 1 = “Private” | 0.43
 | | 2013 | 0.76 | | | 0.43
 | | 2014 | 0.76 | | | 0.43
S | Enrollment Selectivity | | | 0 = “Not Selective” | 1 = “Selective” |
M | Medical School Exists | 2012 | 0.05 | 0 = “No” | 1 = “Yes” | 0.22
 | | 2013 | 0.06 | | | 0.23
 | | 2014 | 0.05 | | | 0.22
A/F | Adm/Faculty Intensity | 2012 | 4.24 | 0 | 1191 | 24.53
 | | 2013 | 3.47 | 0 | 189 | 6.84
 | | 2014 | 3.75 | 0 | 290 | 10.84
PF | Average Faculty Pay with Benefits (FACCOST/F) ($) | 2012 | 129,191 | 0 | 14,101,292 | 329,733
 | | 2013 | 126,227 | 0 | 6,507,428 | 191,464
 | | 2014 | 176,859 | 0 | 7,673,085 | 246,935
PA | Average Administrative Staff Pay with Benefits (ADMCOST/A) ($) | 2012 | 18,693 | 0 | 3,570,789 | 88,109
 | | 2013 | 95,110 | 0 | 27,240,083 | 715,111
 | | 2014 | 18,419 | 0 | 301,328 | 18,722
DEG1 | Bachelor Institutions | 2012 | 0.24 | 0 = “No” | 1 = “Yes” | 0.42
 | | 2013 | 0.24 | | | 0.42
 | | 2014 | 0.24 | | | 0.42
DEG2 | Master’s Institutions | 2012 | 0.21 | 0 = “No” | 1 = “Yes” | 0.41
 | | 2013 | 0.21 | | | 0.41
 | | 2014 | 0.21 | | | 0.41
DEG3 | Doctoral Institutions | 2012 | 0.12 | 0 = “No” | 1 = “Yes” | 0.32
 | | 2013 | 0.12 | | | 0.32
 | | 2014 | 0.12 | | | 0.32
Q2/Q1 | Academic Organizational Complexity Index (Graduate Headcount Enrollment/Undergraduate Headcount Enrollment) | 2012 | 0.78 | 0 | 395.67 | 10.47
 | | 2013 | 0.92 | 0 | 510.00 | 12.62
 | | 2014 | 1.04 | 0 | 797.00 | 18.06
DVRES | Research Expenditures | 2012 | 0.15 | 0 = “No Expenses” | 1 = “Yes Expenses” | 0.36
 | | 2013 | 0.16 | | | 0.36
 | | 2014 | 0.15 | | | 0.36
PUBSOR | Public Sources of Funds per $ Revenue | 2012 | 0.11 | 0 | 1.35 | 0.22
 | | 2013 | 0.11 | 0 | 1.02 | 0.22
 | | 2014 | 0.11 | 0 | 0.99 | 0.22
PRISOR | Private Sources of Funds per $ Revenue | 2012 | 0.11 | −0.35 | 1.00 | 0.22
 | | 2013 | 0.11 | −0.08 | 1.00 | 0.22
 | | 2014 | 0.11 | 0.00 | 1.00 | 0.22
Note: N = 3146 cases for each year. Both complete and incomplete data reports are included in the descriptive statistics.
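For readers who want to reproduce the derived measures in Table 1, the following minimal sketch shows how the ratio and per-capita variables could be computed from institution-level IPEDS extracts. It assumes a pandas DataFrame with hypothetical column names (ADMCOST, FACCOST, Q1, Q2, F, A) that mirror the table's variable labels; it is an illustration under those assumptions, not the authors' code.

```python
import pandas as pd

# Minimal sketch of the derived measures in Table 1, assuming a DataFrame with
# one row per institution and hypothetical columns named after the table's
# variables: ADMCOST, FACCOST, Q1, Q2, F (full-time faculty), A (admin staff).
def derive_intensity_measures(df: pd.DataFrame) -> pd.DataFrame:
    out = df.copy()
    out["Q"] = out["Q1"] + out["Q2"]        # all students (Q1 + Q2)
    out["A_F"] = out["A"] / out["F"]        # Adm/Faculty intensity
    out["PF"] = out["FACCOST"] / out["F"]   # average faculty pay with benefits
    out["PA"] = out["ADMCOST"] / out["A"]   # average administrative staff pay with benefits
    out["Q2_Q1"] = out["Q2"] / out["Q1"]    # academic organizational complexity index
    return out                              # zero denominators would need handling in real data

# Made-up example values for two institutions
example = pd.DataFrame({
    "ADMCOST": [14_000_000, 2_500_000],
    "FACCOST": [40_000_000, 6_000_000],
    "Q1": [12_000, 1_500],
    "Q2": [3_000, 200],
    "F": [600, 90],
    "A": [850, 60],
})
print(derive_intensity_measures(example)[["Q", "A_F", "PF", "PA", "Q2_Q1"]])
```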
Table 2. Descriptive statistics of study variables by year: 2012, 2013, and 2014.
Variable | Definition/Calculation | Year | N | Mean | Min | Max | Std. Deviation
lnA/F | Natural logarithmic transformation of the sum of lnPF, lnPA, lnQ, Q2/Q1, M, Deg2, Deg3, DVRes | 2012 | 2827 | 2.82 | 0 | 6.03 | 0.70
 | | 2013 | 2899 | 3.00 | 0.83 | 6.30 | 0.72
 | | 2014 | 2845 | 2.83 | 0 | 6.03 | 0.70
lnPF | Natural logarithmic transformation of PF | 2012 | 2991 | 7.78 | 0 | 16.46 | 5.63
 | | 2013 | 2992 | 7.92 | 0 | 15.69 | 5.59
 | | 2014 | 2977 | 7.88 | 0 | 16.46 | 5.60
lnPA | Natural logarithmic transformation of PA | 2012 | 2991 | 3.90 | 0 | 15.09 | 4.85
 | | 2013 | 2992 | 7.69 | 0 | 16.44 | 4.98
 | | 2014 | 2977 | 3.93 | 0 | 15.09 | 4.86
lnQ | Natural logarithmic transformation of Q | 2012 | 2991 | 6.77 | 0 | 11.22 | 2.36
 | | 2013 | 2992 | 6.92 | 0 | 11.26 | 2.13
 | | 2014 | 2977 | 6.83 | 0 | 11.22 | 2.27
Q2/Q1 | Academic Organizational Complexity Index | 2012 | 2991 | 0.66 | 0 | 395.67 | 9.88
 | | 2013 | 2992 | 0.81 | 0 | 510.00 | 12.05
 | | 2014 | 2977 | 0.67 | 0 | 395.67 | 9.90
M | Medical School Exists | 2012 | 2991 | 0.05 | 0 = “No” | 1 = “Yes” | 0.22
 | | 2013 | 2992 | 0.06 | | | 0.23
 | | 2014 | 2977 | 0.06 | | | 0.23
DEG1 | Bachelor Institutions | 2012 | 2991 | Reference category | | |
 | | 2013 | 2992 | | | |
 | | 2014 | 2977 | | | |
DEG2 | Master’s Institutions | 2012 | 2991 | 0.21 | 0 = “No” | 1 = “Yes” | 0.41
 | | 2013 | 2992 | 0.22 | | | 0.41
 | | 2014 | 2977 | 0.21 | | | 0.41
DEG3 | Doctoral Institutions | 2012 | 2991 | 0.12 | 0 = “No” | 1 = “Yes” | 0.32
 | | 2013 | 2992 | 0.12 | | | 0.33
 | | 2014 | 2977 | 0.12 | | | 0.33
DVRES | Research Expenditures | 2012 | 2991 | 0.15 | 0 = “No Expenses” | 1 = “Yes Expenses” | 0.36
 | | 2013 | 2992 | 0.16 | | | 0.37
 | | 2014 | 2977 | 0.15 | | | 0.36
FTRetention | Full Time Retention (%) | 2012 | 2007 | 70.90 | 8.00 | 100 | 16.41
 | | 2013 | 2020 | 72.22 | 8.00 | 100 | 15.69
 | | 2014 | 2012 | 73.17 | 9.00 | 100 | 15.64
4yrgrad | 4-year graduation rate (%) | 2012 | 1625 | 36.65 | 1.00 | 100 | 21.78
 | | 2013 | 1677 | 36.51 | 1.00 | 100 | 21.74
 | | 2014 | 1685 | 36.90 | 1.00 | 100 | 21.75
6yrgrad | 6-year graduation rate (%) | 2012 | 1715 | 50.40 | 1.00 | 100 | 20.34
 | | 2013 | 1777 | 50.01 | 1.00 | 100 | 20.64
 | | 2014 | 1799 | 49.94 | 1.00 | 100 | 20.59
8yrgrad | 8-year graduation rate (%) | 2012 | 1734 | 52.34 | 2.00 | 100 | 20.15
 | | 2013 | 1790 | 51.94 | 2.00 | 100 | 20.40
 | | 2014 | 1805 | 51.54 | 1.00 | 100 | 20.41
Note: N values appear in the table for each variable because the amounts of complete and missing data differ across variables. Outliers were deleted from each year’s data set after being identified with a graphical procedure based on Mahalanobis distances. Natural logarithmic transformations were applied to manage the skewness of the data and bring the distributions closer to Gaussian.
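The note above describes two preprocessing steps: natural-log transformation of the skewed measures and removal of multivariate outliers identified with Mahalanobis distances. The sketch below illustrates one way these steps could be implemented. It assumes log(x + 1) because several variables contain zeros, and it substitutes a chi-square cutoff for the authors' graphical screening, so it approximates the procedure rather than reproducing the study's actual code.

```python
import numpy as np
import pandas as pd
from scipy.stats import chi2

# Sketch of the two preprocessing steps described in the note. Assumptions:
# log(x + 1) handles zero values, and a chi-square cutoff stands in for the
# authors' graphical inspection of Mahalanobis distances.
def log_transform(df: pd.DataFrame, cols) -> pd.DataFrame:
    out = df.copy()
    for c in cols:
        out["ln" + c] = np.log1p(out[c])    # natural log transformation of skewed measures
    return out

def mahalanobis_outlier_flags(df: pd.DataFrame, cols, alpha=0.001) -> np.ndarray:
    X = df[cols].to_numpy(dtype=float)
    diff = X - X.mean(axis=0)
    cov_inv = np.linalg.pinv(np.cov(X, rowvar=False))
    d2 = np.einsum("ij,jk,ik->i", diff, cov_inv, diff)   # squared Mahalanobis distances
    cutoff = chi2.ppf(1 - alpha, df=len(cols))            # chi-square criterion
    return d2 > cutoff                                     # True flags a candidate outlier
```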
Table 3. Intercorrelation matrix for study variables by year: 2012, 2013, and 2014.
Variable (Year) | 1 | 2 | 3 | 4 | 5 | 6 | 7 | 8 | 9 | 10 | 11 | 12 | 13
1. lnA/F (2012) | --
1. lnA/F (2013) | --
1. lnA/F (2014) | --
2. lnPF (2012) | 0.916 ** | --
2. lnPF (2013) | 0.939 ** | --
2. lnPF (2014) | 0.912 ** | --
3. lnPA (2012) | 0.696 ** | 0.519 ** | --
3. lnPA (2013) | 0.929 ** | 0.876 ** | --
3. lnPA (2014) | 0.691 ** | 0.510 ** | --
4. lnQ (2012) | 0.614 ** | 0.582 ** | 0.400 ** | --
4. lnQ (2013) | 0.512 ** | 0.561 ** | 0.405 ** | --
4. lnQ (2014) | 0.617 ** | 0.570 ** | 0.390 ** | --
5. Q2/Q1 (2012) | 0.155 ** | 0.004 | −0.017 | 0.006 | --
5. Q2/Q1 (2013) | 0.160 ** | 0.022 | 0.018 | 0.005 | --
5. Q2/Q1 (2014) | 0.154 ** | 0.002 | −0.018 | 0.004 | --
6. M (2012) | 0.217 ** | 0.189 ** | 0.152 ** | 0.237 ** | 0.023 | --
6. M (2013) | 0.207 ** | 0.188 ** | 0.181 ** | 0.255 ** | 0.062 ** | --
6. M (2014) | 0.203 ** | 0.185 ** | 0.151 ** | 0.236 ** | 0.021 | --
7. DEG2 (2012) | 0.339 ** | 0.304 ** | 0.247 ** | 0.363 ** | −0.011 | −0.090 ** | --
7. DEG2 (2013) | 0.295 ** | 0.285 ** | 0.240 ** | 0.362 ** | −0.013 | −0.089 ** | --
7. DEG2 (2014) | 0.334 ** | 0.295 ** | 0.240 ** | 0.361 ** | −0.012 | −0.087 ** | --
8. DEG3 (2012) | 0.286 ** | 0.239 ** | 0.219 ** | 0.345 ** | 0.049 ** | 0.571 ** | −0.191 ** | --
8. DEG3 (2013) | 0.256 ** | 0.237 ** | 0.216 ** | 0.360 ** | 0.038 * | 0.568 ** | −0.194 ** | --
8. DEG3 (2014) | 0.285 ** | 0.235 ** | 0.214 ** | 0.351 ** | 0.049 ** | 0.567 ** | −0.193 ** | --
9. DVRES (2012) | 0.271 ** | 0.283 ** | 0.142 ** | 0.157 ** | 0.048 ** | 0.158 ** | 0.030 | 0.157 ** | --
9. DVRES (2013) | 0.285 ** | 0.272 ** | 0.293 ** | 0.144 ** | 0.035 * | 0.171 ** | 0.026 | 0.162 ** | --
9. DVRES (2014) | 0.268 ** | 0.277 ** | 0.139 ** | 0.149 ** | 0.047 ** | 0.158 ** | 0.023 | 0.157 ** | --
10. FTRetention (2012) | 0.355 ** | 0.366 ** | 0.235 ** | 0.277 ** | 0.043 | 0.203 ** | 0.057 * | 0.253 ** | 0.337 ** | --
10. FTRetention (2013) | 0.316 ** | 0.287 ** | 0.339 ** | 0.251 ** | 0.051 * | 0.203 ** | 0.026 | 0.246 ** | 0.335 ** | --
10. FTRetention (2014) | 0.259 ** | 0.259 ** | 0.222 ** | 0.228 ** | 0.069 ** | 0.201 ** | 0.034 | 0.238 ** | 0.303 ** | --
11. 4yrgrad (2012) | 0.128 ** | 0.165 ** | 0.026 | −0.014 | 0.082 ** | 0.116 ** | −0.150 ** | 0.090 ** | 0.485 ** | 0.709 ** | --
11. 4yrgrad (2013) | 0.182 ** | 0.156 ** | 0.210 ** | 0.000 | 0.078 ** | 0.119 ** | −0.144 ** | 0.101 ** | 0.486 ** | 0.695 ** | --
11. 4yrgrad (2014) | 0.135 ** | 0.171 ** | 0.034 | −0.006 | 0.148 ** | 0.131 ** | −0.145 ** | 0.103 ** | 0.475 ** | 0.686 ** | --
12. 6yrgrad (2012) | 0.328 ** | 0.316 ** | 0.183 ** | 0.260 ** | 0.097 ** | 0.216 ** | −0.035 | 0.242 ** | 0.444 ** | 0.783 ** | 0.918 ** | --
12. 6yrgrad (2013) | 0.347 ** | 0.320 ** | 0.345 ** | 0.293 ** | 0.086 ** | 0.220 ** | −0.036 | 0.253 ** | 0.455 ** | 0.782 ** | 0.916 ** | --
12. 6yrgrad (2014) | 0.363 ** | 0.367 ** | 0.211 ** | 0.316 ** | 0.076 ** | 0.236 ** | −0.018 | 0.258 ** | 0.442 ** | 0.745 ** | 0.916 ** | --
13. 8yrgrad (2012) | 0.330 ** | 0.321 ** | 0.186 ** | 0.267 ** | 0.105 ** | 0.220 ** | −0.026 | 0.253 ** | 0.421 ** | 0.775 ** | 0.896 ** | 0.992 ** | --
13. 8yrgrad (2013) | 0.367 ** | 0.332 ** | 0.366 ** | 0.313 ** | 0.109 ** | 0.224 ** | −0.024 | 0.266 ** | 0.440 ** | 0.785 ** | 0.890 ** | 0.992 ** | --
13. 8yrgrad (2014) | 0.387 ** | 0.385 ** | 0.235 ** | 0.345 ** | 0.074 ** | 0.244 ** | −0.009 | 0.277 ** | 0.438 ** | 0.754 ** | 0.891 ** | 0.993 ** | --
Note: * p < 0.05. ** p < 0.01. All other entries were not statistically significant. N values for relationships are presented with the variable name or combination. 2012: N = 2827 lnA/F, N = 2007 FTRetention, N = 1625 4yrGradRate, N = 1715 6yrGradRate, N = 1734 8yrGradRate, N = 1574 4yrGradRateXFTRetention, N = 1651 6yrGradRateXFTRetention, N = 1665 8yrGradRateXFTRetention, N = 1625 4yrGradRateX6yrGradRate, N = 1625 4yrGradRateX8yrGradRate, N = 1715 6yrGradRateX8yrGradRate, N = 2991 for all other listed pairs. 2013: N = 2899 lnA/F, N = 2020 FTRetention, N = 1677 4yrGradRate, N = 1776 6yrGradRate, N = 1789 8yrGradRate, N = 1602 4yrGradRateXFTRetention, N = 1693 6yrGradRateXFTRetention, N = 1703 8yrGradRateXFTRetention, N = 1677 4yrGradRateX6yrGradRate, N = 1677 4yrGradRateX8yrGradRate, N = 1777 6yrGradRateX8yrGradRate, N = 2992 for all other listed pairs. 2014: N = 2845 lnA/F, N = 1974 FTRetention, N = 1684 4yrGradRate, N = 1788 6yrGradRate, N = 1794 8yrGradRate, N = 1616 4yrGradRateXFTRetention, N = 1713 6yrGradRateXFTRetention, N = 1719 8yrGradRateXFTRetention, N = 1685 4yrGradRateX6yrGradRate, N = 1685 4yrGradRateX8yrGradRate, N = 1799 6yrGradRateX8yrGradRate, N = 2977 for all other listed pairs.
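Because the available N differs across variable pairs, the correlations in Table 3 are most naturally computed with pairwise deletion. The sketch below shows one way such an intercorrelation matrix with significance flags could be built; the column names are placeholders, and the function is illustrative rather than the authors' implementation.

```python
import pandas as pd
from scipy.stats import pearsonr

# Sketch of a pairwise-complete intercorrelation matrix with significance flags,
# mirroring the layout of Table 3. Pairwise deletion is used because the N
# available differs across variable pairs.
def correlation_table(df: pd.DataFrame) -> pd.DataFrame:
    cols = list(df.columns)
    out = pd.DataFrame("", index=cols, columns=cols)
    for i, a in enumerate(cols):
        out.loc[a, a] = "--"
        for b in cols[:i]:
            pair = df[[a, b]].dropna()                   # pairwise deletion
            r, p = pearsonr(pair[a], pair[b])
            stars = "**" if p < 0.01 else ("*" if p < 0.05 else "")
            out.loc[a, b] = f"{r:.3f} {stars}".strip()
    return out
```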
Table 4. Function canonical correlation coefficients, eigenvalues, and statistical significance.
Function | Correlation | Eigenvalue | Wilks Λ | F | Num df | Denom df | Sig.
2012 (1) | 0.670 | 0.813 | 0.370 | 55.74 | 32 | 5761.97 | 0.000
2012 (2) | 0.558 | 0.451 | 0.671 | 31.92 | 21 | 4488.64 | 0.000
2012 (3) | 0.142 | 0.021 | 0.973 | 3.58 | 12 | 3128.00 | 0.000
2012 (4) | 0.083 | 0.007 | 0.993 | 2.19 | 5 | 1565.00 | 0.053
2013 (1) | 0.664 | 0.787 | 0.375 | 55.91 | 32 | 5865.23 | 0.000
2013 (2) | 0.555 | 0.445 | 0.669 | 32.64 | 21 | 4569.04 | 0.000
2013 (3) | 0.162 | 0.027 | 0.967 | 4.43 | 12 | 3184.00 | 0.000
2013 (4) | 0.080 | 0.006 | 0.994 | 2.05 | 5 | 1593.00 | 0.069
2014 (1) | 0.672 | 0.823 | 0.380 | 55.54 | 32 | 5916.86 | 0.000
2014 (2) | 0.532 | 0.396 | 0.692 | 30.01 | 21 | 4609.24 | 0.000
2014 (3) | 0.144 | 0.021 | 0.966 | 4.67 | 12 | 3212.00 | 0.000
2014 (4) | 0.116 | 0.014 | 0.986 | 4.42 | 5 | 1607.00 | 0.001
Note: The test statistics are tabled together to improve readability. Only functions 1 and 2 for each year have eigenvalues large enough for consideration within the canonical solution.
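The columns of Table 4 are linked by the usual canonical correlation identities: each function's eigenvalue is Rc²/(1 − Rc²), and each Wilks Λ is the product of (1 − Rc²) over the functions remaining in that test. The short check below, a sketch rather than the authors' code, reproduces the 2012 values approximately from the reported canonical correlations; small discrepancies reflect rounding of the correlations.

```python
import numpy as np

# Quick check of how the Table 4 columns relate under standard canonical
# correlation conventions: eigenvalue = Rc^2 / (1 - Rc^2), and Wilks' lambda for
# the k-th test is the product of (1 - Rc^2) over the remaining functions.
rc = np.array([0.670, 0.558, 0.142, 0.083])   # 2012 canonical correlations from Table 4

eigenvalues = rc**2 / (1 - rc**2)
wilks = np.array([np.prod(1 - rc[k:]**2) for k in range(len(rc))])

print(np.round(eigenvalues, 3))   # compare with the reported 0.813, 0.451, 0.021, 0.007
print(np.round(wilks, 3))         # compare with the reported 0.370, 0.671, 0.973, 0.993
```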
Table 5. Canonical solution for administrative intensity elements predicting student retention by year: 2012, 2013, 2014.
Variable (Year) | Function 1 Coefficient | Function 1 rs | Function 1 %rs² | Function 2 Coefficient | Function 2 rs | Function 2 %rs² | %h²
lnPF (2012) | −0.246 | −0.537 | 28.84 | −0.273 | −0.354 | 12.53 | 41.37
lnPF (2013) | 0.006 | −0.544 | 29.59 | 0.481 | −0.285 | 8.12 | 37.72
lnPF (2014) | −0.202 | −0.514 | 26.42 | −0.205 | −0.292 | 8.53 | 34.95
lnPA (2012) | −0.062 | −0.496 | 24.60 | 0.014 | −0.052 | 0.27 | 24.87
lnPA (2013) | −0.298 | −0.536 | 28.73 | −0.723 | −0.361 | 13.03 | 41.76
lnPA (2014) | −0.110 | −0.536 | 28.73 | −0.021 | −0.056 | 0.31 | 29.04
lnQ (2012) | −0.716 | −0.953 | 90.82 | 0.078 | 0.020 | 0.04 | 90.86
lnQ (2013) | −0.730 | −0.937 | 87.80 | 0.113 | 0.080 | 0.64 | 88.44
lnQ (2014) | −0.730 | −0.954 | 91.01 | 0.155 | 0.045 | 0.20 | 91.21
Q2/Q1 (2012) | 0.195 | −0.221 | 4.88 | −0.275 | −0.355 | 12.60 | 17.49
Q2/Q1 (2013) | 0.213 | −0.156 | 2.43 | −0.283 | −0.322 | 10.37 | 12.80
Q2/Q1 (2014) | 0.200 | −0.206 | 4.24 | −0.246 | −0.343 | 11.76 | 16.01
M (2012) | −0.042 | −0.425 | 18.06 | 0.093 | −0.200 | 4.00 | 22.06
M (2013) | −0.045 | −0.437 | 19.10 | −0.087 | −0.185 | 3.42 | 22.52
M (2014) | −0.037 | −0.438 | 19.18 | −0.139 | −0.235 | 5.52 | 24.71
DEG2 (2012) | −0.157 | −0.264 | 6.97 | 0.356 | 0.270 | 7.29 | 14.26
DEG2 (2013) | −0.123 | −0.219 | 4.80 | 0.347 | 0.294 | 8.64 | 13.44
DEG2 (2014) | −0.108 | −0.235 | 5.52 | 0.315 | 0.292 | 8.53 | 14.05
DEG3 (2012) | −0.228 | −0.608 | 36.97 | 0.184 | −0.163 | 2.66 | 39.62
DEG3 (2013) | −0.233 | −0.619 | 38.32 | 0.171 | −0.145 | 2.10 | 40.42
DEG3 (2014) | −0.224 | −0.617 | 38.07 | 0.121 | −0.185 | 3.42 | 41.49
DVRES (2012) | 0.011 | −0.052 | 0.27 | −0.796 | −0.905 | 81.90 | 82.17
DVRES (2013) | −0.017 | −0.104 | 1.08 | −0.761 | −0.897 | 80.46 | 81.54
DVRES (2014) | −0.020 | −0.084 | 0.71 | −0.816 | −0.913 | 83.36 | 84.06
Rc² (2012) | | | 63.00 | | | 32.90 |
Rc² (2013) | | | 62.50 | | | 33.10 |
Rc² (2014) | | | 62.00 | | | 30.80 |
FTRetention (2012) | −0.488 | −0.574 | 32.95 | −0.352 | −0.777 | 60.37 | 93.32
FTRetention (2013) | −0.462 | −0.622 | 38.69 | −0.245 | −0.725 | 52.56 | 91.25
FTRetention (2014) | −0.360 | −0.569 | 32.38 | −0.192 | −0.730 | 53.29 | 85.67
4yrgrad (2012) | 1.683 | 0.019 | 0.04 | −10.261 | −0.978 | 95.65 | 95.68
4yrgrad (2013) | 1.654 | −0.027 | 0.07 | −10.276 | −0.987 | 97.42 | 97.49
4yrgrad (2014) | 1.440 | 0.007 | 0.00 | −10.170 | −0.993 | 98.60 | 98.61
6yrgrad (2012) | 1.323 | −0.342 | 11.70 | 0.870 | −0.877 | 76.91 | 88.61
6yrgrad (2013) | 0.790 | −0.393 | 15.44 | 0.622 | −0.876 | 76.74 | 92.18
6yrgrad (2014) | 2.546 | −0.351 | 12.32 | 0.538 | −0.898 | 80.64 | 92.96
8yrgrad (2012) | −2.822 | −0.404 | 16.32 | −0.298 | −0.860 | 73.96 | 90.28
8yrgrad (2013) | −2.332 | −0.458 | 20.98 | −0.126 | −0.853 | 72.76 | 93.74
8yrgrad (2014) | −3.946 | −0.425 | 18.06 | −0.207 | −0.875 | 76.56 | 94.63
Note: Function 3 was statistically significant (p < 0.001) in both 2012 and 2013, and functions 3 and 4 were statistically significant (p < 0.001) in 2014. However, their eigenvalues were λ = 0.021 in 2012, λ = 0.027 in 2013, and λ = 0.021 and λ = 0.014 in 2014; because these functions explain very little variance, they were judged not to be practically significant. Coefficient = standardized canonical function coefficient. rs = structure coefficient; canonical loadings > 0.3 are considered useful to the equation. %rs² = squared structure coefficient (percentage of variance explained). %h² = sum of %rs² across functions 1 and 2.
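The variance columns in Table 5 follow directly from the structure coefficients: %rs² is the squared canonical loading expressed as a percentage, and %h² sums those quantities over the two interpreted functions. The snippet below, a simple illustration rather than the authors' code, works through the arithmetic for the 2012 lnPF row.

```python
import numpy as np

# Arithmetic behind the variance columns of Table 5, shown for the 2012 lnPF row:
# %rs^2 is the squared structure coefficient as a percentage, and %h^2 is the
# sum of those values over the two interpreted functions.
rs = np.array([-0.537, -0.354])   # structure coefficients for functions 1 and 2

pct_rs2 = 100 * rs**2             # per-function variance explained
h2 = pct_rs2.sum()                # communality across the two functions

print(np.round(pct_rs2, 2))       # approximately 28.84 and 12.53
print(round(h2, 2))               # approximately 41.37
```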
