Article

Factors Influencing Public Higher Education Institutions’ Performance Reporting in the Romanian Context

by Adriana Tiron-Tudor *, Cristina Silvia Nistor, Szilveszter Fekete and Andreea Alexandru
Faculty of Economics and Business Administration, Babes-Bolyai University, 500591 Cluj-Napoca, Romania
* Author to whom correspondence should be addressed.
Adm. Sci. 2022, 12(4), 163; https://doi.org/10.3390/admsci12040163
Submission received: 21 September 2022 / Revised: 1 November 2022 / Accepted: 9 November 2022 / Published: 15 November 2022
(This article belongs to the Special Issue Exploring the Role of Universities in Entrepreneurship Education)

Abstract:
Our study aims to facilitate a deeper understanding of the factors influencing performance reporting in the specific context of the hybrid higher education system in Romania, a former communist country in Eastern Europe with little experience in managing the notion of public sector performance. Performance reporting impacts higher education institutions’ development. The study’s approach offers opportunities to understand the main factors that influence, and are influenced by, the mandatory elements stipulated in the specific norms governing the public-university domain. Institutional and operant theories explain and sustain a multilevel (institutional, organizational, and individual) analysis of performance reporting. In terms of research design, the theoretical exploration led us to formulate hypotheses, while empirical data were collected from 23 Romanian public universities, ensuring the results’ reliability. The results indicate that the performance-reporting concept and its practical demand in public universities depend on both exogenous causes (isomorphic pressures) and endogenous factors (the different behaviors of organizations and individual performers). The performance reporting of Romanian public higher education institutions enriches the scientific literature and the practical sphere by offering insight into the evolution of a European country with roots in a communist system and many country-specific approaches, as a basis for comparison with similar Eastern European entities or more experienced countries.

1. Introduction

Public universities are evolving to exhibit business-like behavior in order to meet the demands of the market as well as national and international competition. Defined in this article as ‘hybrid universities’, our approach is in accordance with the work of Grossi et al. (2020), who used the hybrid concept to represent the application of business-like mechanisms in university management as a means of reform. In this context, an overview of the overall ‘health’ of universities can be managed through the performance concept.
Providing performance-related information concerning economy and efficiency is extremely important in public universities’ performance reporting, becoming one of the pillars of external accountability (Grossi et al. 2019). Performance funding and budgeting add institutional performance to the traditional considerations in state allocations to public higher education institutions (PHEIs) of current costs or student enrollments (Curaj et al. 2015) by allocating resources for achieved rather than promised results (Burke et al. 2002). Moreover, performance reports can be more comprehensive than performance budgeting and funding. The performance information is reported to the government and often disclosed to the media. Publicity is used rather than funding or budgeting to stimulate PHEIs to improve their performances (Burke et al. 2002). Thus, performance reporting may be considered as a method of demonstrating public accountability and encouraging improved PHEIs’ performance.
In response to the repeated calls for accountability, PHEIs are attempting to improve methods of measuring and reporting their performance (Alach 2017; Gordon and Fischer 2018). Financial and non-financial performance indicators were explored as early as the 1960s (Choong 2013); however, no consensus has been reached on which model better displays the efficacy of higher education institutions’ (HEIs’) teaching, research, and public service achievements. For PHEIs, measuring and reporting performance is even more critical in the current environment of significant financial constraints. The popularity of such indicators grew, as they provided bases for better resource allocation and better methods of informing the external public, and allowed organizational benchmarking (Turner 2011), mostly when they were aligned with the mission and strategy that created the performance context (Kauppila et al. 2015).
Many jurisdictions and standard-setting bodies, such as the International Public Sector Accounting Standards Board (IPSASB), state that performance reporting should be included in public sector entities’ annual reports. Moreover, the IPSASB (2015), in ‘Recommended Practice Guideline 3: Reporting Service Performance Information’, notes that service performance information helps users assess how efficiently and effectively public sector entities are using resources to provide services and achieve their objectives, which is an important part of general-purpose financial reports (Rossi and Aversano 2015).
Performance has long been a concern in universities, especially in PHEIs. There are many initiatives concerning PHEIs’ performance reporting in the United States, Canada, and New Zealand (Gordon et al. 2002; Alach 2017). In Europe, there are examples from developed Western countries, such as Finland (Orr et al. 2014), Italy (Bonollo and Merli 2018), Spain and Greece (Brusca et al. 2019; Garde Sanchez et al. 2020), and a few in an Eastern European former communist context (Scott 2007; Dobija et al. 2018).
In particular, as performance-reporting systems are part of New Public Management (NPM) reform, their development is influenced by context and by the successful introduction of NPM. Countries that have been pioneers in introducing NPM have more developed systems. The impact of NPM in any country depends on the country’s traditional administrative culture and any administrative regimes inherited from and ingrained in the past (Pollitt and Dan 2013). In recent decades, HEIs have been required to respond to multiple stakeholders’ interests and to the intensely governmental nature of public sector organizations’ decision making. Institutional theory explains the actual integration of, and balance between, different performance measures and reporting within organizations providing public services.
Based on these circumstances, the study aims to investigate the factors influencing the reporting of PHEI performance in the Romanian context. The novelty and relevance of the article’s major aim are based on a combination of concepts and approaches defined by the scientific literature, transposed into a practical approach specific to the public-university sphere in an emerging former communist country. Thus, in the first stage of the research, the current state of knowledge regarding performance reporting in PHEIs was reviewed, followed by an introduction to the Romanian PHEIs’ performance-reporting challenges. The theoretical framework of NPM and institutional-theory isomorphism, augmented with operant theory, allowed us to analyze the factors (coercive, mimetic, normative, and operant) influencing public performance reporting in the case of PHEIs. The significant role of human capital in this performance disclosure is analyzed on two levels: as a determinant of the level of funding (the teaching and research process) and, as a consequence, through its possibility of stimulating employees (remuneration). In the second stage, through empirical research applied to Romanian universities, we validate the hypotheses developed on the theoretical framework and their relevance in the Romanian context. We determine the effective impacts of the influencing factors on performance reporting, subsequently generating a performance-reporting model.
The research results justify the Romanian PHEIs’ behavior and their capacity to adapt to the new external conditions, such as mandatory performance reporting, competition, and stakeholders’ increased requirements for information. This remark leads to the second significant result of the study: recognizing a real need for voluntary-reporting items in Romania based on HEIs’ autonomy and the dependence on mandatory reports based on state funding policies.
The following aspects ensure the originality of the study. Firstly, it focuses on an essential research topic regarding PHEIs’ performance reporting, considering that the improvement and modernization of the public system (Grossi et al. 2020) have traditionally received international support (e.g., rules, funds, assistance) regarding entrepreneurial behavior (Capella-Peris et al. 2020). Moreover, performance measurement has been a major path for preserving public trust and securing continued funding and resources (Lee 2021). Exploring the factors that influence the credibility of the performance measurement system is a serious step in understanding and improving the effectiveness of performance in the public sector (Ghosh and Wu 2012). In this context, we want to fill the gap in the literature regarding the entrepreneurial behavior of HEIs in terms of performance and its measurement. The connections between the roles of financial and non-financial reporting/information (Grossi et al. 2020), the influence of internal/external factors (Ghosh and Wu 2012), and the demand for a realistic reporting system (Caputo et al. 2021) are justified by global reform theories. Secondly, in the empirical sphere, we create a unique performance-reporting model, grounded in the particularities of an Eastern European ex-communist country that is, at the same time, a member of the European Union. The paper enriches the scientific literature by taking a holistic approach to performance and its reporting in universities, assembling comprehensive puzzle elements related to realities and perspectives of evolution, influencing factors, and reporting models. The achievement of a hybrid view in public universities’ performance reporting may be more significant, both as a behavior and in its effects, than in a country where the tradition and experience of a well-established reporting model have created stability and coherence.
Moreover, the study addresses a broad range of users. First, theoreticians can use it to understand the evolution of public performance reporting. Second, practitioners can familiarize themselves with the implications of the concept analyzed through the eyes of a specialist. Third, for professional bodies and legislators, the study offers a concrete, precise basis for future analysis to improve the mandatory items in national HEIs’ performance-reporting regulations.
The remainder of the paper is structured as follows. The next section presents the debate concerning the reporting performance of PHEIs and then reporting as viewed in the Romanian context, followed by this study’s theoretical framework. The methodological section describes the research design, research hypotheses’ development, sample selection and data collection, and variable description. The results and discussion section begins with the sample description and then discusses each of the analyzed dimensions: teaching, research, interaction with the external environment, and funding. Results for the entire university are presented, and the hypothesis validation is discussed. The conclusion section highlights the theoretical and practical contributions of the study, limitations, and further developments.

2. The Context of the Research

2.1. Debates in the Literature Concerning the Reporting Performance in PHEIs

Reporting the performance of organizations may be a critical element of overall performance. Performance reports involve information about how effectively the organization is fulfilling its mission, expressed in specified goals and objectives (Grossi et al. 2020). Hatry (2013) claims that reporting performance measures to external stakeholders allows citizens, elected officials, and interested parties to understand what public organizations are doing with their allocated resources. External reporting may encourage the organization to perform better on the measures it reports. Further, comparisons made of similar reported measures allow one organization to measure or benchmark success against another.
As has happened in many public service organizations, the last decade has seen unprecedented pressure to reform universities. The most relevant of these reforms are a shift from an elite to a mass higher education system; cuts in state funding and the resulting difficulty of financing the institutions exclusively with public funds; the emergence of new approaches, such as NPM; and greater competition between universities (Siegel and Wright 2015). These changes push universities toward hybrid behaviors that use private sector mechanisms and tools within the public sector (Grossi et al. 2020).
Growing demands to become more competitive, efficient, effective, and accountable have led to an increased interest in introducing control mechanisms to assess organizational performance. Consequently, performance management systems have been implemented in some universities, and the measurement and reporting of research and teaching performance have become increasingly common within universities. Wide varieties of performance indicators have been developed in different jurisdictions, mostly by government initiatives, to monitor the quantitative aspects of performance (Ter Bogt and Scapens 2012). Here, we mention some examples in the United States (Gordon et al. 2002), Finland (Orr et al. 2014), Italy (Bonollo and Merli 2018), and a few in Eastern European former communist contexts (Scott 2007; Dobija et al. 2018).
Concerning performance reporting at the international level, the IPSASB (2015) advances a principle-based recommended practice guideline (RPG3) regarding reporting the service performance information that may be considered a useful reference for a harmonized performance measurement and reporting system across EU member states (Aversano et al. 2018). RPG3 defines effectiveness, efficiency, inputs, output, outcome, performance indicators, and service performance objectives. Moreover, the implementation examples that accompany RPG3 illustrate the terms defined above. RPG3 mentions that the reporting of service performance information should be annual, and it should cover the same period of reference as the financial reporting covers. IPSASB encourages the disclosure of all additional information relevant to the users.
Performance has long been a concern in higher education, especially in PHEIs, as it is connected with accountability, quality assessment, and international rankings. In this context, PHEIs’ voluntary or mandatory established performance measurements are useful in assessing the progress towards established goals (Kyrillidou 2002). Moreover, there is an increased demand for PHEIs to disclose their contributions to society as part of the third mission regarding teaching and research (Maingot and Zeghal 2008).
In Europe, the United States, Canada, and Australia, central governments have been involved directly in developing ‘indicators’. Thus, the managers of public organizations may not have complete freedom to choose their performance measures. They may have to pay attention to the measures chosen by the government. Even when they must respond to outsiders’ measures, however, the managers of a public organization are responsible for establishing an internal performance management system, including measuring and reporting performance information that will allow them to manage the organization (Behn 2003). In the case of universities, performance reporting needs to be connected to the stated mission by revealing whether the goals were achieved (Kauppila et al. 2015). Accordingly, the performance report might include a section dedicated to teaching, a section dedicated to research, and another dedicated to the relationship with the external environment (Bonollo and Merli 2018).
The performance-reporting initiatives in PHEIs represent a set of changes in the relationship between governments and PHEIs. The state entrusts universities to meet the needs of the national economy in a dynamic global marketplace. In the meantime, however, the state carefully monitors universities’ overall progress and performance against national needs and objectives, using financial incentives and disincentives. On the other hand, universities are trying to attract other financial resources from the state, different funders, or private partners. In this context, tensions are possible between state funding based on the state’s pre-established performance indicators and the university. To obtain state funds, universities must adjust their behavior (Capella-Peris et al. 2020). In this utilitarian environment, governments will inevitably seek greater accountability and performance (Alexander 1998). States are not driven in this direction by an authoritative desire to control and regulate (Levine 1997); rather, they attempt to better monitor and assess PHEIs because they are responsible for acquiring more value for resources.
In the particular case of HEIs, some studies have demonstrated that voluntary disclosure results from multiple factors interacting with each other, including regulatory oversight, market forces, costs of disclosure, and organizational structure and governance (Hyndman and Eden 2001). An organization’s mandatory reporting and voluntary disclosure are influenced by the coercion of legislation, reporting and funding regulations, and other stakeholder requirements for information.
By adopting entrepreneurial behaviors and operating in a competitive market, universities have become aware that it is not enough to focus only on short-term economic and financial results to gain credibility and become competitive in the medium and long term. Non-financial information on governance, their social and environmental impact, and universities’ strategies could provide a better approach to addressing stakeholder concerns.
However, the role of financial information and its interdependence with non-financial information should not be overlooked. Favorable performance on non-financial measures seems to be irrelevant when performance on financial measures is unfavorable (Ghosh and Wu 2012). At the same time, financial results indicate the consequences of decisions made in the past and are sometimes unable to show the causal factors that could lead to the given outcomes. To resolve this tension, universities would no longer have to publish their activities through disconnected financial and non-financial reports but would produce a single integrated report (Caputo et al. 2021; Lee 2021). The harmonization of financial and non-financial standards and research on universities’ integrated reporting (IR) are steps toward sustainability reports.
Moreover, in public organizations, the state can exercise its coercive powers directly or indirectly. For PHEIs, the state can enact laws and regulations that guide performance disclosure (Bonollo and Merli 2018). The government can also indirectly exercise its coercive powers through institutions or establish specific governmental policies. Another significant fact refers to market forces. For universities, the rankings represent an essential indicator of their position in the national, regional, or worldwide educational market (Urdari et al. 2017). Moreover, performance reporting may be used as an inexpensive marketing tool because the managers may voluntarily disclose that information on performance that gives the organization advantages compared with other educational competitors in the market. Other factors influencing the HEIs’ performance reporting are the ones that are organizational in nature, such as structure, implemented performance measurements and reporting procedures, internal politics (Aversano et al. 2018), and governance in line with its declared mission.

2.2. Romanian PHEIs’ Main Characteristics

Since 1990, Romania has undergone a dramatic transformation, from a highly centralized totalitarian regime to democratic governance, following a radical change in its political and governing system. These changes affected the education sector as much as the economic sectors (Curaj et al. 2015). The first 15 years (1990–2006) of the new higher education system were characterized by the elements and changes imposed by the new legislative framework, including university autonomy, public financing mechanisms, and a performance approach (Education Law 84 1995). University autonomy offers HEIs the right to establish and implement their developmental strategies and policies. However, autonomy was limited in certain aspects, such as the personnel and financial policies that remained under state control. In this period, the number of public HEIs and specialized programs increased. Moreover, Romania became an attractive destination for international students (Pricopie et al. 2009), but the Romanian authorities were not prepared to accommodate them. The alignment with the Bologna process started in 2005, increasing the European comparability of the Romanian higher education system and improving the position of Romanian HEIs in the international higher education market (Pricopie et al. 2009).
A change in the national educational strategy intervened in 2007, focusing on promoting excellence and scientific production. All universities were classified according to their mission into three main groups: advanced research and education, education and science, and education-centered (Coste and Tiron-Tudor 2015). Another critical change refers to the ranking of study programs and the classification of universities. This process aimed to provide information to potential beneficiaries regarding the quality of teaching, research, student services, community services, and internationalization, delineated into five categories (A > B > C > D > E, where > means ‘better results than’). Stakeholders argued against the process because the data processing methodology considered in the evaluation was not made public.
Regarding the public funding of universities, the law introduced different types of financing, depending on their objectives, such as principal financing, complementary financing, institutional development financing, and so forth. The principal financing is allotted according to a per capita cost-differentiation formula, as the main part of the overall universities’ public funding. Complementary financing is based on a qualitative component (i.e., calculated by considering qualitative indicators, which were updated regularly).
The principal financing of universities, according to Education Law 2011, considers the results of the national classification exercise and the different ranking processes. Unfortunately, the implementation of this issue was unsuccessful, as the link between these instruments and the funding methodology was not well maintained. Moreover, even if the law stipulated other forms of disbursing higher education funding, such as institutional development financing that was not influenced by the classification, the subsequent methodologies were developed and implemented only after 2015.
To increase university autonomy and public responsibility, the law proposed that universities establish their mission, institutional development strategy, curricula design and implementation, quality-assurance mechanisms, and financial and human resources management protocols. All these elements must be operationalized using performance measurements and must be periodically reported.
According to the National Education Law No. 84 adopted in 1995, reporting Romanian universities’ performance is mandatory. The rector is responsible for preparing a report describing the institutional state. Over time, the law has undergone several changes, and Law No. 1/2011 repealed it. The new education law brought significant changes, one of which was the rectors’ obligation to publish the annual report on the university’s website (Education Law 2011, art. 130). The report presents the state of the university and must be publicly available to all interested parties. The minimum information to be provided in the report refers to the following issues: the financial situation of the university by funding sources and types of expenses; study programs; staff structure and evolution; results of the research activity; the internal quality-assurance system; compliance with ethics in all university activities; and the professional insertion (employment) situation of graduates from previous cohorts.
The information presented in the report on HEI status is based on relevant performance indicators for each category, demonstrating how financial and human resources have been used to fulfil the missions of teaching and learning, research and community-based services, or the impact on the economic and social environments (Coste and Tiron-Tudor 2015). In addition to demonstrating the performance obtained, the rector’s report is an element of public responsibility and a primary condition for public funding (Education Law 2011, art. 150, paragraph 3, p. 32).
The rector’s report represents the document in which the essential information that defines the HEI is presented. The law does not establish the specific performance indicators for each group of information; the rector can opt for specific indicators. Additionally, many universities provide voluntary information, concerning significant elements of the previous year’s performance activity, in addition to the mandatory requirements. As an example, for 2019, the rector’s report of Babes-Bolyai University (UBB) includes voluntary information concerning the following issues: non-traditional education; the University Publishing House (Cluj University Press); university extensions; relationships with the business environment, student and alumni practices; international cooperation; computerization and data communications; communication and public relations; relationships with students; administration and patrimony; and the UBB Centenary.

2.3. Theoretical Framework

The literature supports the implementation of performance reporting from several other theoretical frameworks, such as agency theory, public choice theory, and institutional theory (Grossi et al. 2019). For our purpose, we use the institutional theory, as it explains why organizational structures and practices become entrenched and how and why changes occur (Greve and Argote 2015). According to this theory, institutions impose norms or social coherence on human activity by producing and reproducing an environment for thinking and acting (Burns and Scapens 2000). Research focuses on extra-organizational (social, economic, and political) influences on organizational practices (Fligstein 1998). The theory describes the organization’s ability to change, following how institutionalized norms and values affect assumptions (Liguori and Steccolini 2012) and espouses that the process of change finally generates an isomorphic equilibrium (Dumitru et al. 2014).
The PHEI system is pressed for greater performance and quality. Efforts are made to adopt business-like attitudes that will keep institutions sustainable in a competitive economy and turn them into hybrid entities (Grossi et al. 2020). This global reform movement is sustained by New Public Management (NPM) principles. According to this theory, public sector organizations can be managed and evaluated in the same way as private organizations, namely through demands for accountability, transparency, efficiency, and responsiveness (Gomes and Yasin 2017).
Performance indicators are intended to measure progress towards established national/international goals (Kyrillidou 2002), and HEIs are asked to describe their contribution to society, often in relation to the quality of the university’s teaching and research processes (Maingot and Zeghal 2008). These indicators are the cornerstone of adequate governance mechanisms in universities, based on performance management, measurement, and reporting.
Romanian higher education reforms are the result of both global pressures and local demands. Since Romania is a European Union member, we consider that the main conditions of the primary higher education system reforms are justified by global reform ideologies. As part of this, the factors that influence performance reporting are subject to global isomorphic pressures concerning the transformation of governance models in higher education.
DiMaggio and Powell (1983) defined isomorphism as the factor that encourages the similitude by which institutions tend to adopt the same structure and practices, resulting in their homogeneity (Dobija et al. 2018). There are three processes of institutional isomorphism: coercive, mimetic, and normative. Coercive isomorphism refers to the influence of political and governmental regulations on organizations. In our case, the universities are likely to implement changes within their policies to adjust to the government’s requirements due to coercive pressures (Najeeb 2013). Mimetic isomorphism occurs when actors face uncertainty and try to emulate successful organizations as a solution. With an increasing level of competition and internationalization in the higher education context, universities have tried to model themselves on other prototypes in similar contexts through mimetic processes. Normative isomorphism (Paauwe and Boselie 2003) arises primarily from professionalization. Within the higher education context, professionalization involves two aspects: one is the homogenizing influence of established norms (regulatory bodies), and the other is the professional organizations (e.g., accreditation agencies).
Following Chen et al. (2010) and Dobija et al. (2018), the study identifies the following theoretical decision-making mechanisms related to the isomorphism of Romanian national policy (factors) regarding performance reporting: the level of implementation of performance-reporting systems; the size of the institution/type of university (coercive and normative); mobile/immobile resources; financing (coercive); and personnel recruitment/staff remuneration (coercive and normative). In all these items, we can find mimetic learning (Cai 2010), as long as national higher education systems attempt to imitate prosperous nations when they face uncertainties or ambiguous development goals (e.g., the Bologna process, European/international university rankings).
However, some studies (e.g., Gonzales 2012) show that institutional analysis in higher education research focuses mainly on policy and management issues. Given that fund allocation (financing) for higher education institutions depends on the quality of human resources evaluated through the research (number/type/indexation of the article) and teaching outputs (Agyemang and Broadbent 2015), we also consider that employees need to have the potential to impact the performance that is measured (Bouckaert 1993). Performance evaluation is a mechanism by which individual goals and behaviors are aligned with organizational objectives, a process that helps employees understand and accept organizational norms (Ayers 2015).
In this sense, our research combines institutional theory with operant theory. The reporting systems to which operant theory may be applied include performance evaluations, whether they be quality control reports, personnel evaluations, or variance reports (Lovata 1992). Many studies conclude that specific behavior results from its consequences (Ulrich et al. 1974). The theory is suitable for our context: universities’ funding levels depend on their employees’ results. Additionally, the individual income level is influenced by this individual factor. Thus, rewards such as money are considered positive reinforcement if their presence increases the likelihood that the behavior will recur. At the same time, behavior is most easily modified when it produces a negative consequence. If the expected reward is satisfactory, the motivation to deliver higher-quality work will increase. If, on the other hand, the system ignores or criticizes the employee’s results (e.g., changing the framing of articles, excluding journals/databases, diminishing the article/journal ranking), this consequence is likely to make them avoid working hard in the future. This mix of institutional and operant theories is meant to justify and analyze the main factors that influence the form and evolution of the performance reporting used in Romanian higher education.

2.4. Research Hypotheses Development

The hypotheses developed further refer to the factors likely to influence the reporting of information on the performance of PHEIs in Romania. The main arguments considered for the factors’ inclusion refer to scientific references from the literature and their relevance in the Romanian context.
Hypothesis 1:
Performance reporting is associated with the financial resources attracted: the more resources a PHEI attracts/obtains, the more information is reported.
The literature often mentions the level of financial resources as having a positive influence on the reporting of information in the public sector (Maingot and Zeghal 2008; Gallego et al. 2010; Gallego-Álvarez et al. 2011). The link between the financial resources attracted by a PHEI and the level of information presented is a topic that, at first glance, seems to have no significance in the public sector. However, the decreasing capability of public resources to satisfy the level, quality, and extent of the population’s public needs is an indisputable reality (Manolescu 2009). Thus, the fundamental nature of the traditional relationship between the government and higher education is undergoing significant changes to sustain more students while attempting to maximize economic returns (Alexander 2000; Garde Sanchez et al. 2020). This unequal ratio can be rebalanced by identifying solutions needed to supplement public funding. State reporting and funding mechanisms for HEIs are in the midst of a significant transformation from an input-based system to a more competitive outcomes-based approach (Aversano et al. 2018). Through mimetic isomorphism, PHEIs borrow solutions from private entities’ behaviors (e.g., marketing and promotion policies, public–private partnerships between universities and the community, tuition fee strategies, other forms of collaboration with the private environment, and the use of bank loans granted to the university/students), which, through coercive and normative isomorphism, are then implemented as legal rules in universities.
In this context, PHEIs might be concerned with finding the proper balance between institutional autonomy and performance-based assessments to become competitive in the market. With greater expectations being placed on it, higher education is obliged to examine itself or be examined by others (Alexander 1998). Accordingly, the growing societal necessity dictates that universities must become more responsive to economic needs and governmental demands for increased performance (Sangiorgi and Siboni 2017; Brusca et al. 2019; Garde Sanchez et al. 2020). In Romania, higher education is funded from the state budget according to the size and category of the university, in the form of granted basic and additional financing; for the latter, the higher education institution’s performance is an essential criterion. For primary funding, the government typically determines, at a centralized level, the value of resources allocated per student in each field of study.
Hypothesis 2:
Performance reporting is associated with salary costs.
Universities are ‘communities’, where individuals gather to invest in their human capital (Alexander 2000). Overall, the regulatory framework of higher education systems has become more complex and expensive to sustain, particularly regarding, on the one hand, employee salaries and rights and, on the other hand, employer obligations and contributions related to employees. Moreover, equity provisions concerning disability, race, and gender have entered into force (Gordon and Whitchurch 2007).
The reality also shows an increasing diversification of academic tasks (teaching, scholarship, research, consultancy, community service, and administration). Kogan et al. (1994) note that the range of roles that an academic may be expected to undertake can include ‘teacher, scholar, practitioner, demonstrator, writer, model, discoverer, inventor, investigator, designer, architect, explorer, expert, learner, developer, collaborator, transformer, facilitator, enabler, evaluator, critic, assessor, setter, guide, colleague, supervisor, mentor, listener, adviser, coach, counselor, negotiator, mediator, juggler’. Therefore, the historical trilogy (teaching, research, and administration) of academic work (Garde Sanchez et al. 2020) would appear to have been enlarged. Since the public institution’s financial resources did not increase directly proportional to the work’s complexity, the effect was to increase the level of quality/involvement required to correspond to a different reward based on performance criteria.
Thus, performance disclosure appears as a mediating, bidirectionally positioned factor. Through coercive isomorphism, the educational institution’s financing has been conditioned on the performance of its human capital, which in turn, according to operant theory, can be motivated or demotivated by the level of remuneration received. A favorable expectation/reward will determine motivated behavior in the future, with a favorable effect on the increase in financing sources, while a negative one will demoralize the human factor, with adverse effects on future performance indicators. Through normative isomorphism, these human capital policies and rules are framed in a particular way to sustain institutional or personal interests, based on academic particularities.
In too many cases, the primary performance-reporting quantifiable item is considered to be the quality or quantity of research (Dunkin 2005; Siegel and Wright 2015). This approach can decrease the importance of other duties, roles, and functions, especially teaching, serving, and displaying good academic citizenship. As a general remark, Alexander (2000) notes that the tension between the numbers and quality dominates higher education debates in most advanced countries.
Each university is responsible for having the capacity and intelligence to stimulate human capital and build on this capacity, both academic and professional. As Dunkin (2005, p. 8) notes, ‘The capacity to develop business/earn one’s salary/manage ‘client’ relationships, once missing from academics, is now part of the skills repertoire of our next generation of academics’.
In the empirical study, we considered the ratio between costs with salaries and the number of students as a factor that can influence the degree of reporting of information on public service performance. The scientific literature uses this item to a lesser extent (Suryadi 2007), but in this research, we want to show that costs with salaries in the Romanian public sector positively influence the reporting of performance information.
Salary costs were included in the case study because, after studying the annual reports on the universities’ states, we came across several documents claiming that the university’s performance was encouraged by providing financial benefits to teachers. Through mimetic behavior, salary costs can also influence the reporting of information on service performance in public entities because, in the private sector, an entity’s staff can obtain bonuses tied to the performance achieved; their payment correlates with the success achieved. Thus, the main arguments supporting the inclusion of the salary-cost variable in the empirical study are the financial advantage for university employees and the university’s performance objectives, specified in the annual report on the university’s state.
The universities’ management was often interested in the result. The pecuniary benefits granted to the teachers encouraged the performance, thus obtaining at least two advantages for the two parties involved: one for the institution and the other for the academic staff. The publication of scientific research in internationally renowned journals is encouraged by the management of higher education institutions, because, on the one hand, it receives international recognition for research activity, and on the other hand, several performance indicators are met, and more information on scientific research will be published. First, accessing research projects is a method of attracting financial resources, an essential activity for the higher education institution and those involved in developing the projects. Secondly, with the help of research projects, academic staff publish scientific research in international databases and participate in international conferences, thus receiving international recognition for their studies; they are highlighted by the number of citations. The advantage of universities, in this case, is to meet performance indicators.
Hypothesis 3:
Performance reporting is associated with the size of the higher education institution.
One of the most used factors in empirical research, identified as having a significant effect on the level of information presentation, is the institution’s size. In the private sector, this factor has a positive impact on information reporting (Glaum et al. 2013), and one of the reasons is the need to inform shareholders about the position and the obtained performance so that investments are made under the best conditions (Gallego-Álvarez et al. 2011).
In private sector research, the size of a company has been determined by total assets, a form also found in empirical studies in the public sector, in which the positive link between the two has been demonstrated (Gordon et al. 2002; Gordon and Fischer 2018). Researching the university environment has led some studies to choose a specific factor for quantifying the size of a university, and in the literature (Maingot and Zeghal 2008; Gallego et al. 2010; Suhaiza and Nur 2011; Gallego-Álvarez et al. 2011), the number of students is an indicator of the size and has a positive influence on information reporting.
Regardless of how the size of public sector institutions was determined, by total assets or the number of students, the studies above proved a positive link between the institution’s size and a high degree of information reporting. The arguments presented above led us to set this hypothesis for the study, with total assets being the size indicator of higher education institutions.
Hypothesis 4:
Performance reporting is associated with the quality category of the higher education institution.
Performance disclosure can be used as a tool for HEIs to compete in the international university arena and attract students and researchers (Brusca et al. 2019). This target depends on the university’s profile: teaching or research. Thus, research universities have a more substantial opportunity to obtain research development funds through national or European competitions. For teaching universities, the theoretical and practical implications of the future profession, such as collaborations with employers, are just some of the specific information that increases this category’s market credibility. In this context, in line with Maingot and Zeghal (2008), we consider that these entities are stimulated to disclose performance information following their mission (type).
Based on specific legal rules and professionalization implications (coercive and normative isomorphism), university performance disclosure in two directions (research or teaching) becomes a critical factor in stakeholders’ visions. More specifically, Maingot and Zeghal (2008) state that students’ outcomes are the results of universities’ developed educational offerings and activities rather than the results of inputs (selection effects) or exogenous influences, such as economic conditions.

3. Methodology

3.1. Research Design

The investigation performed consists of the following steps. Firstly, in the previous section, based on the literature, we argued that PHEIs report information concerning their performance to demonstrate their accountability, responsibility, and transparency, and that different factors influence the level of performance-reporting disclosure. Then, we formulated the hypotheses according to which factors such as size, resources attracted per student, salary costs, and category influence the reporting of performance measurement information. Secondly, a performance-reporting disclosure index (DI) is proposed, based on existing national legislation that includes mandatory reporting elements, international ranking indicators, and recommended practices issued by the International Public Sector Accounting Standards Board (IPSASB). The DI captures the general and specific elements required for the preparation of performance measurement reporting and may be a useful tool for Romanian universities in terms of reporting performance information. Thirdly, the data were collected from a sample of PHEIs with programs in economic sciences. Fourthly, we test the hypotheses by running a linear regression. We then correlate the results with the literature, and we discuss and contextualize them.

3.2. Sample Selection and Data Collection

Of the total, 23 Romanian PHEIs are included in the sample, all of them with at least one specialization belonging to the field of economic sciences. In the case of each university included in our sample, we connected to their official website and identified the reporting section, where we found and downloaded the reports for the last five years (2015–2019). We conducted this analysis between September and November of 2020. We acknowledge that online information is continuously updated, and we marked the date of verification (within the formerly mentioned period of analysis). The cached versions of the websites in the specified interval provide the validity of our data. Afterward, we proceeded to analyze the content of the documents and identify the disclosure index items. Ultimately, we performed the corresponding calculations to attain the disclosure levels concerning each of the dimensions analyzed for each university and each year.
The universities’ distribution, according to the classification made by the Romanian Ministry of Education and Scientific Research (MENCS), is as follows:
  • Universities of advanced research and education: 4;
  • Universities of education and scientific research: 6;
  • Universities focused on education: 13.

3.3. Description of Variables

3.3.1. Dependent Variable

The first step to achieve the research objective was to determine the performance-related disclosure index (DI), acting as the dependent variable. The DI might be an acceptable tool for measuring a series of elements in documents published by institutions (Banks and Nelson 1994; Gallego et al. 2010). The information index model used in this study measures the appearance of information in the report: an item is marked with ‘1’ if the information is presented and ‘0’ if it is not. The calculation formula of the index is as follows:
DI = (Σ_{i=1}^{m} d_i) / (Σ_{i=1}^{n} d_i)
where:
  • DI = performance-reporting disclosure index;
  • d_i = element i presented (a performance indicator);
  • m = number of items submitted or disclosed;
  • n = number of items expected to be presented.
In order to refine our study, we divided this disclosure index into components:
DITotal = DITeach + DIResearch + DIExt + DIFin
where:
  • DITotal = performance-reporting disclosure index (DI);
  • DITeach = performance-reporting disclosure index regarding teaching activity;
  • DIResearch = performance-reporting disclosure index regarding research activity;
  • DIExt = performance-reporting disclosure index regarding external environment;
  • DIFin = performance-reporting disclosure index regarding financials.
In the construction of DI (Table A1), performance information was selected based on the following arguments:
In the literature, there is a series of studies using different DI constructs. For this study, the relevant ones are those that used the indicators requested by the legislation in force (Gordon and Fischer 2018; Maingot and Zeghal 2008; Suhaiza and Nur 2011; Montesinos et al. 2013; Gomes and Yasin 2017) or indicators proposed by certain empirical studies (Gordon et al. 2002; Gallego et al. 2010; Gallego-Álvarez et al. 2011; Dobija et al. 2018). On these grounds, the performance indicators included in the disclosure index comprise 46 elements (listed in Table A1), grouped into the following four categories (see the illustrative sketch after the list):
  • Teaching process: 15 performance indicators;
  • Research: 13 performance indicators;
  • External environment: 11 performance indicators;
  • Financial resources: 7 performance indicators.
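To illustrate how the index works in practice, the following minimal Python sketch (not the authors’ code; the binary scores are hypothetical) computes the four dimensional indexes and an overall index from hand-coded disclosure items, using the item counts listed above.

# Illustrative sketch of the disclosure-index calculation.
# An item scores 1 if the rector's report discloses it and 0 otherwise.
# Item counts per dimension follow the paper: 15 teaching, 13 research,
# 11 external environment, 7 financial (46 items in total).
# The binary scores below are hypothetical, for one university-year.

def disclosure_index(scores):
    """Share of expected items actually disclosed: sum of 1s over number of items."""
    return sum(scores) / len(scores)

teach    = [1] * 10 + [0] * 5   # 10 of 15 teaching indicators disclosed
research = [1] * 6  + [0] * 7   # 6 of 13 research indicators disclosed
ext      = [1] * 5  + [0] * 6   # 5 of 11 external-environment indicators disclosed
fin      = [1] * 5  + [0] * 2   # 5 of 7 financial indicators disclosed

di_teach, di_research = disclosure_index(teach), disclosure_index(research)
di_ext, di_fin = disclosure_index(ext), disclosure_index(fin)

# Overall index computed over all 46 items (one plausible reading of DITotal,
# consistent with the ~54% average disclosure reported in the Results section).
di_total = disclosure_index(teach + research + ext + fin)

print(f"DITeach={di_teach:.2f}  DIResearch={di_research:.2f}  "
      f"DIExt={di_ext:.2f}  DIFin={di_fin:.2f}  DITotal={di_total:.2f}")

With these hypothetical scores, the overall index is roughly 0.57, i.e., about 57% of the 46 expected items are disclosed.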
The information on the performance included in the report on the state of the university has the central role of demonstrating how the material, human, financial, and informational resources were managed but also the degree of fulfillment of the objectives proposed by the rector in the management contract and the commitment to fulfilling them once taking office for four years. Thus, we analyzed to what extent the rector’s annual report covers the mandatory disclosure requirements stipulated by law (Education Law 2011, no. 1, art. 130, paragraph 2), and those included in addition to our index based on IPSAS recommended good practices concerning service performance reporting and international educational rankings.

3.3.2. Independent Variables

Based on the elements used in the studies in the literature, the hypotheses were defined. To determine the level of reporting of information on the performance of services of higher education institutions in Romania, we used the following factor variables: financial resources attracted per student, personnel costs, size, and university category. An element of novelty consists of testing the influence of salary costs on the reporting of information on Romanian universities’ performance.
To determine the financial resources attracted, the personnel costs, and the size of the higher education institution, we collected the data from the financial statements, documents that are required to be published on each university’s website. In order to eliminate the size effect of the financial information, we scaled incomes, salary costs, and total assets by the number of students, with the student numbers collected from the annual reports. The university’s categorization into advanced research and education, education and scientific research, and education-centered was extracted from the national classification (Education Law 2011), and three control variables were considered. The first two control variables were introduced into the regression, while the third, not included, serves as the reference for interpretation. The choice of the first two variables is motivated by these higher education institutions’ membership in categories that include scientific research activity. We show the dependent and factorial variables of the statistical model used to test the hypotheses in Table 1.
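As an illustration of the variable construction described above, the short Python sketch below (file and raw column names are hypothetical) scales the financial figures by the number of students and derives the category dummies, with education-centered universities as the reference group.

# Illustrative sketch: building the independent variables.
# The file name and raw column names are hypothetical; the transformations follow
# the text: incomes, salary costs, and total assets are divided by the number of
# students, and the university category is recoded into dummy variables
# (education-centered universities serve as the reference group).
import pandas as pd

raw = pd.read_csv("phei_raw.csv")  # one row per university-year (hypothetical)

df = pd.DataFrame({
    "DI":    raw["di_total"],                        # disclosure index (dependent variable)
    "RES":   raw["income"] / raw["students"],        # resources attracted per student
    "SAL":   raw["salary_costs"] / raw["students"],  # salary costs per student
    "SIZE":  raw["total_assets"] / raw["students"],  # total assets per student
    "CAT_1": (raw["category"] == "advanced research and education").astype(int),
    "CAT_2": (raw["category"] == "education and scientific research").astype(int),
})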

3.4. Development of the Linear Regression

In order to determine the influences on the DI in Romania and to test the hypotheses, we formulated a hierarchical linear model, through which the dependent variable (DI) is linked to its factors (explanatory variables) as discussed in the literature and the design of the hypotheses. Hierarchical linear regression is the statistical tool that reorganizes information and examines the consistency of the underlying theory (Lindenberger and Pötter 1998), similar to the least squares method (Osborne 2000). The analysis that can be performed—namely, modeling both within and between individual variations (Terracciano et al. 2005)—motivates the hierarchical linear model’s choice. We used the SPSS 2.0 program to develop the model and explore statistical data.
The variables included later were selected to improve the model and explain the model’s variation. The following factor variables were included in the first level: financial resources attracted per student (RES) and costs with salaries (SAL). These explanatory variables are indicators important for measuring the institution’s financial performance, showing the economic and financial aspects of the operational activity. In the second level, the variable size of the university (SIZE) was introduced. We chose to include the university category factor variable in the third level, used as the control variable. The advantage of using hierarchical linear regression is the gradual highlighting of the influence of factors on reporting information on service performance.
The hierarchical linear regression used in determining econometric models has the following form:
DI = α0 + α1 RES + α2 SAL + α3 SIZE + α4 CAT_1 + α5 CAT_2 + ε
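The following sketch illustrates the blockwise ('hierarchical') estimation using statsmodels as an assumed tool; it is not the authors’ SPSS procedure, and the file name and column layout are hypothetical.

# Illustrative sketch of the hierarchical (blockwise) linear regression:
# variables are entered in three successive blocks, as described in the text.
# "phei_panel.csv" is a hypothetical file holding the pooled university-year
# observations with the columns DI, RES, SAL, SIZE, CAT_1, CAT_2.
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("phei_panel.csv")

blocks = [
    "DI ~ RES + SAL",                         # Level 1: financial factors
    "DI ~ RES + SAL + SIZE",                  # Level 2: add institution size
    "DI ~ RES + SAL + SIZE + CAT_1 + CAT_2",  # Level 3: add category dummies
]

models = [smf.ols(formula, data=df).fit() for formula in blocks]
for i, m in enumerate(models, start=1):
    # R-squared and F-statistic indicate how much each block improves the fit.
    print(f"Model {i}: R2={m.rsquared:.3f}, F={m.fvalue:.2f}, p={m.f_pvalue:.4f}")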

3.5. Sources of Data

We studied the annual reports of HEIs for five years and hand-collected data for each year. Since this study does not follow the evolution of performance information reporting but rather examines the causal relationship between the dependent and factor variables in terms of direction, level, and significance, we merged these years into one single period, because the conditions of analysis are identical throughout the years, with no fundamental legislative changes and no changes from an organizational and accounting perspective. In this way, our pooled sample increased in observations and provided the opportunity for better data analysis.
Pooled data analysis is a modified cross-sectional method for analyzing fixed and random effects (Johnson 1995). The choice of this technique is justified by the increase in the statistical significance of the model and the possibility of comparing institutions’ effects (Johnson 1995). The purpose of applying this analysis is to determine the random effects, which allows the estimation of the differences (Jesilow and Ohlander 2010) between higher education institutions. Thus, each university has one record per year in the pooled sample. Fixed effects, resulting from the model’s application, exhibit the characteristic of not showing variations over time, regardless of the measured and unmeasured effects (Johnson 1995).
Based on the content analysis of 115 reports published by the 23 public universities in Romania, we obtained a statistically significant model over 5 years.
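A minimal sketch of the pooling step follows, under the assumption that each year’s observations sit in a separate (hypothetical) file.

# Illustrative sketch: pooling the five annual cross-sections (2015-2019) into a
# single sample. File names are hypothetical; with 23 universities per year this
# yields 23 x 5 = 115 university-year observations.
import pandas as pd

years = range(2015, 2020)
frames = [pd.read_csv(f"phei_{year}.csv").assign(year=year) for year in years]
pooled = pd.concat(frames, ignore_index=True)

print(pooled.shape)  # expected: (115, number_of_columns)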

4. Results

The empirical research highlighted the extent to which the information on service performance is presented in the annual report prepared by the HEIs. Linear regressions, composed of factors whose influence has been demonstrated in various studies found in the literature, were tested, and from a statistical point of view, their significance on the presentation of information on service performance was reconfirmed.

4.1. Sample Description

The sample of PHEIs consisted of 23 public universities. For each university, the rector’s reports for the previous five years were analyzed in detail. The average level of reporting of performance information in the Romanian PHEI environment is 54%, meaning that PHEIs present only half of the relevant information that reflects the performance obtained. The lack of a standardized model for reporting information on performance leaves the assessment of the content and format of the report on the institution’s state to the management. Some of the reports studied were rich in content and described several procedures and events, but the information considered essential by international classifications, the literature in the field, and national and international bodies was not presented. The sections on the teaching process and the financial dimension presented 60% and 71%, respectively, of the indicators followed in this study. Scientific-research activity is described in the annual reports, but the level of presentation that reflects performance is 49%.
Regarding the interaction with the external environment, the information is presented in a proportion of 42%, with data most often provided on the number of partnerships with private entities and with other higher education institutions. In contrast, the number of international students and the number of study programs taught in languages of international circulation were rarely presented. The variables’ descriptive statistics (Table 2) reflect the results described above.
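For readers unfamiliar with disclosure indices, the following sketch shows how an index of this kind is typically computed: the share of checklist indicators (see Appendix A, Table A1) actually disclosed in a rector’s report. The item names and the example coding are hypothetical, not the study’s data.

```python
# Toy computation of a performance-reporting disclosure index (PRDI) for one report.
from typing import Dict, List

def disclosure_index(coding: Dict[str, int], items: List[str]) -> float:
    """Fraction of checklist items coded 1 (disclosed) rather than 0 (not disclosed)."""
    return sum(coding.get(item, 0) for item in items) / len(items)

teaching_items = ["students_bachelor", "students_master", "graduates_bachelor"]  # excerpt only
report_coding = {"students_bachelor": 1, "students_master": 1, "graduates_bachelor": 0}
print(f"DITeach = {disclosure_index(report_coding, teaching_items):.2f}")  # 0.67 in this toy case
```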
Firstly, the hierarchical linear regression was tested using the PRDI for the teaching process (DITeach) as the dependent variable. Secondly, we used the PRDI for research activity (DIResearch); thirdly, the PRDI for the relationship with the external environment (DIExt); and fourthly, the financial dimension index (DIFin). The regression was thus run for the four dimensions into which the initially established performance indicators were grouped, a choice motivated by the homogeneity of the indicators within each dimension. Subsequently, the hierarchical linear regression was tested using, as the dependent variable, the presentation index containing the performance indicators established for all dimensions (DITotal), in order to test how the information presented varies with the selected factor variables, as sketched below.
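A hedged sketch of this step is given below: the same level-3 specification is estimated once for each disclosure index. The data file and the dimension column names are the same hypothetical ones assumed in the earlier sketch.

```python
# Run the full (level-3) specification for every disclosure index (assumed column names).
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("performance_panel.csv")  # hypothetical pooled data set
dimensions = ["DITeach", "DIResearch", "DIExt", "DIFin", "DITotal"]
results = {dep: smf.ols(f"{dep} ~ RES + SAL + SIZE + CAT_1 + CAT_2", data=df).fit()
           for dep in dimensions}
for dep, res in results.items():
    print(f"{dep}: R2 = {res.rsquared:.3f}, F = {res.fvalue:.3f}")
```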

4.2. The Level of Disclosure Concerning the Entire University Performance

According to the Pearson correlation coefficient, the university DI for the four dimensions shows a strong correlation with the factor variables: resources attracted per student, salary costs, university size, and classification in the category of advanced research and education institutions. The correlation matrix for the information presentation index is shown in Table 3.
The Enter method was used to introduce all the factor variables into the statistical model in order to explain their connection with the dependent variable. Level 1 included the factor variables (1) financial resources attracted per student, because empirical studies in the literature have demonstrated the importance of this variable, and (2) salary costs, because, on the one hand, this variable has not been found in public sector studies (a novelty of this study) and, on the other hand, we wanted to demonstrate its implications for performance reporting. In levels 2 and 3, new factor variables were introduced: university size and university category, respectively (Table 4). The factor variables listed above have a positive impact on the level of reporting of information on higher education institutions’ performance. The collinearity between the factor variables is accepted because the VIF value is smaller than 3, as illustrated below.
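The collinearity screen mentioned above can be illustrated as follows. The toy data only mimic the descriptive statistics in Table 2 and are not the study’s data; variable names follow the earlier sketches.

```python
# Variance inflation factors (VIF) for the factor variables, computed on toy data.
import numpy as np
import pandas as pd
import statsmodels.api as sm
from statsmodels.stats.outliers_influence import variance_inflation_factor

rng = np.random.default_rng(0)
toy = pd.DataFrame({
    "RES": rng.normal(8201, 3238, 115),
    "SAL": rng.normal(4429, 924, 115),
    "SIZE": rng.normal(23646, 7904, 115),
    "CAT_1": rng.integers(0, 2, 115),   # category dummies drawn independently, for illustration only
    "CAT_2": rng.integers(0, 2, 115),
})
X = sm.add_constant(toy)
vif = pd.Series([variance_inflation_factor(X.values, i) for i in range(X.shape[1])],
                index=X.columns)
print(vif.drop("const"))                # values below 3 are treated as acceptable in the paper
```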
The factor variables included in the statistical model explain 68% of the level of reporting of information on the performance of higher education institutions in Romania (R2 = 0.677). The model’s representativeness gradually improved, increasing significantly in model 3. The value recorded by F is 24.365, with a significance of 0.000. In model 2, the variation of DITotal is influenced by wage costs, as in model 1. In model 3, the significant increase in the index of information on service performance is influenced by category 1 and category 2 universities, compared with education universities (category 3). Compared with higher education institutions classified as universities of education, those in the categories of advanced research and education and of education and scientific research present more performance information.
The robustness of the model was tested by two methods: the homoscedasticity test and the Shapiro–Wilk test. The regression model is robust because all the hypotheses regarding the normal distribution of the residuals were accepted.

4.3. The Level of Disclosure Concerning Each of the Four Components of University Performance Disclosure

4.3.1. Teaching Dimension of PHEIs’ Performance Reporting

The correlation between the variables was tested using the Pearson coefficient (Table 5). The test results reveal a strong correlation between DITeach and the factor variables RES, SAL, and CAT_1. Thus, we can conclude that the resources attracted per student, salary costs, and the inclusion of universities in the advanced research and education category positively impact the level of reporting of information on higher education institutions’ performance. The correlation between DITeach and university size (SIZE) is moderate. DITeach is correlated at the 0.32 level with the financial resources attracted per student; in other words, the higher the financial resources per student, the higher the level of reporting of information on the teaching process’s performance, and vice versa. The collinearity between the factor variables is accepted because the value of the variance inflation factor (VIF) is below 3, so the collinearity is within the tolerated limits.
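As a sketch (not the authors’ code) of how a correlation matrix with significance stars, in the style of Tables 3 and 5–8, can be produced, consider the following; the input here is random toy data.

```python
# Pearson correlation matrix annotated with */**/*** markers at the 0.1/0.05/0.01 levels.
import numpy as np
import pandas as pd
from scipy.stats import pearsonr

def corr_with_stars(data: pd.DataFrame) -> pd.DataFrame:
    """Lower-triangular Pearson correlations with significance stars."""
    cols = list(data.columns)
    table = pd.DataFrame("", index=cols, columns=cols)
    for i, a in enumerate(cols):
        for j, b in enumerate(cols):
            if j > i:
                continue                                   # keep only the lower triangle
            r, p = pearsonr(data[a], data[b])
            stars = "***" if p < 0.01 else "**" if p < 0.05 else "*" if p < 0.1 else ""
            table.loc[a, b] = f"{r:.3f}{stars}"
    return table

rng = np.random.default_rng(1)
toy = pd.DataFrame(rng.normal(size=(115, 3)), columns=["DITeach", "RES", "SAL"])
print(corr_with_stars(toy))
```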
In model 1, the variation of the dependent variable DITeach is influenced by salary costs, so a high level of salary costs leads to more information on the teaching process’s performance. In model 2, the disclosure index remains significantly influenced by wage costs. In model 3, the university category influences DITeach. In none of the three models does the size of the university influence the variation of DITeach. The tested model can be considered moderate because the coefficient of determination is 0.43, meaning that the selected factor variables explain 43% of the degree of reporting of information on the teaching process’s performance; this shows how much of the variation in the performance index is explained by the analyzed factors. The significance test has a limit of 0.05, and the result obtained falls within this limit (F = 8.804, with a significance of 0.002), thus yielding a significant regression model. The model’s robustness was verified with the Shapiro–Wilk test and the homoscedasticity test, with all hypotheses regarding the normal distribution of the residuals being accepted.

4.3.2. Research Dimension of PHEIs’ Performance Reporting

Regarding the level of research activity performance reporting, salary costs and classification as an advanced research and education university positively influence DIResearch. Classification as a university of education and scientific research negatively influences the level of reporting of information on research activity performance. The correlation of the variables is shown in Table 6. The value recorded by the VIF indicator is below 3, which allows us to accept the model because the collinearity falls within the tolerated limits.
The level of reporting of information on research activity performance is influenced by the costs with salaries, as demonstrated by all three models. The result is explained by the financial benefits teachers receive for publishing scientific research in internationally renowned journals. Thus, the first benefit resulting from the publication of articles accrues to the teachers. The second benefit is for the university, which, in addition to receiving international recognition for its research activity, meets several performance indicators and will publish more information on scientific research. The research projects obtained attracted financial resources, and the people involved were remunerated for the activity carried out within the projects. At the same time, the projects’ performance indicators give the university the opportunity to excel in research activity by publishing articles and participating in various conferences, indicators demonstrating the performance of higher education institutions for the research mission. The higher the salary costs, the more performance information the university will publish on its research activity. In model 3, the variation of DIResearch is influenced by the salary costs, the financial resources attracted per student, and the classification of the university in the category of advanced research and education institutions. R2 is 0.487, meaning that the selected factor variables explain 49% of the variation in the degree of reporting of information on research activity performance. The value shows a significant jump from model 2 to model 3, demonstrating the gradual improvement of the model’s representativeness; however, the significance of the coefficients is more important for our study, since these indicate the direction and level of influence of the factor variables.
The hierarchical linear regression model is robust because all assumptions regarding the normal distribution of the residuals have been accepted.

4.3.3. Interaction with External Environment Dimension of PHEIs’ Performance Reporting

The index of presentation of information on university performance concerning the external environment is positively correlated with the factor variables: resources attracted per student, salary costs, university size, and university category (Table 7). The correlations between the factor variables are strong or moderate, but they are accepted because the VIF indicator’s value is below 3, with the collinearity falling within the tolerated limits.
Following the same procedure as in the previous cases, all three models were tested. In model 3, the value of R2 is 0.354; the factor variables explain 35% of the degree of reporting of information regarding university performance in the interaction with the external environment. The model’s representativeness gradually improved, and the value of R2 increased significantly from model 2 to model 3. The value of F is 6.366, with a significance of 0.000. In model 1, the variation of DIExt depends on salary costs, and model 2 retains the same influence. In model 3, classification as an advanced research and education institution and as an education and research institution are the factor variables that influence the variation of DIExt.

4.3.4. Financial Dimension of PHEIs’ Performance Reporting

The PRDI for the financial dimension is positively correlated with the resources attracted per student, salary costs, the university’s size, and classification in the category of advanced research and education institutions; the results are presented in Table 8. There is a negative correlation between the factor variables university size and classification as an educational and research institution. Moreover, in this case, the VIF indicator’s value is below 3, so the variables included in the model are accepted because they fall within the tolerance limit.
Model 3 was formed to test the variation of DIFin according to the factor variables, which explain it in a proportion of 51%; the representativeness of the regression gradually improved from one model to another. Costs with salaries are statistically significant in all three models. The university’s category influences the presentation of financial results: compared with universities focused on education, higher education institutions in a higher category present more information on funding activity. This result is explained by the ability of advanced research and education universities to attract financial resources from various activities. In this case, too, the model’s robustness was tested, with all the hypotheses regarding the normal distribution of the residuals being accepted.

5. Validation of Hypotheses and Discussion

The resources attracted per student is a controversial variable for the public sector, being included in only a small number of empirical studies, but its positive influence on the degree of information reporting has often been demonstrated using the Pearson coefficient (Maingot and Zeghal 2008; Gallego-Álvarez et al. 2011). According to the Pearson correlations, the reporting of information on service performance is positively associated with the level of financial resources attracted by public universities in Romania. The exception concerns research activity: the financial resources attracted per student do not significantly influence the reporting of information on research activity performance.
The novel factor variable ‘costs with salaries’ positively influences the level of reporting information on the performance of the teaching process, research activity, the relationship with the external environment, financial activity, and activities listed above as a whole.
The size of higher education institutions in Romania, determined as total assets relative to the number of students, is positively associated with the reporting of performance information. As in the other empirical studies found (Maingot and Zeghal 2008; Gordon and Fischer 2018; Gallego-Álvarez et al. 2011), we demonstrated that in Romania, too, the size of the university positively influences information reporting. This result is interpreted as follows: higher education institutions with more assets report performance information at a significantly higher level, and vice versa, the cause-and-effect relationship being inferred by logical reasoning. The exception to the rule was encountered in the reporting of information on research activity performance, where the university’s size does not influence the reporting process.
In most cases, the university category is a factor with a significant influence on the presentation of information on service performance, as demonstrated in the literature (Gallego-Álvarez et al. 2011). The variable was constructed using the classifications made by MENCS, and higher education institutions included in the category of advanced research and education universities present more performance information, this category positively influencing reporting in all situations analyzed.
The reporting of information on the teaching process’s performance is positively influenced by the financial resources attracted per student, salary costs, and the size and category of higher education institutions. The larger the educational institutions, and the more they belong to the advanced research and education category, the more information they present. This result indicates that universities that have gained a reputation over time are also those that have developed because they met users’ expectations and maintained their position in academia, bringing improvements and diversifying their educational offerings. Reporting of information on performance in scientific research is influenced by salary costs and the category of the university. The stability of higher education institutions has brought experience in scientific approaches and in proposing new research projects. The presentation of information attesting to scientific research performance would not be possible if several indicators were not met. The publication of research in renowned journals, indexed in databases, is encouraged by the university, and in some cases financial benefits are offered. The financial benefits obtained by the academic staff for exceptional results were confirmed, firstly, through the information presented in the annual report on the state of the university and, secondly, through the empirical research conducted.
The resources attracted per student, salary costs, and the size and category of the university shape the relationship with the external environment and positively influence information reporting. The results obtained in this case depend on the number of partnerships concluded with universities in different states. The number of international students who choose to study at a partner university in Romania increases, thus validating teaching and research performance and attracting financial resources. Partnerships with private entities and the attraction of resources from this environment also show the trust of private entities, building a strong and lasting relationship with the external environment while validating the education and research conducted at the university.
In Romania, financing from the state budget is allocated according to the university’s size and category, in the form of basic and additional financing, the latter based on the higher education institution’s performance, aspects supported by MENCS. From the empirical research, we can conclude that the presentation of information on financial performance is positively influenced by the resources attracted per student, salary costs, and the size and category of higher education institutions, classified as universities of advanced research and education.
The results obtained by applying the model with all four dimensions taken as a whole demonstrate that the level of reporting of information on the higher education institution’s performance is positively influenced by the financial resources attracted per student, salary costs, size, and membership of the best university category, namely, universities of advanced research and education (Table 9).
The variation in the level of reporting of information on higher education institutions’ performance has different characteristics, depending on each dimension considered in the study. The formulated hypotheses were partially validated and are briefly presented in Table 10.
Hypothesis 1 has not been validated in any model; thus, based on our data sample, the resources attracted per student are not associated with the reporting of performance information. The results obtained are contrary to expectations, but we can say that, in Romania, the financial resources attracted per student do not influence the reporting of information on service performance.
Hypothesis 2 was validated most of the time but, contrary to expectations, in the regression in which the variable ‘university category’ was introduced, salary costs were shown not to influence performance information reporting. Nevertheless, the variation in the level of information presented on research and funding performance is influenced by wage costs.
The assertion presented by hypothesis 3 was validated in a single version of the models (DIResearch). Thus, the reporting of information on research performance is influenced by the university’s size. In the other models, the hypothesis was not confirmed. As described before, in Romania, higher education is funded from the state budget according to the size and category of the university. Therefore, especially in the research domain, the visibility of results helps not only the international recognition of the university (rankings and databases) but can also attract corresponding national and international financial resources.
Hypothesis 4 was validated in all five cases, the university category thus being an essential factor in determining the level of presentation of performance information and in influencing the variation of reporting.
The regression efficiency test is an important element underlying the model and the conclusions drawn from the data analysis, and the homoscedasticity test was performed to validate the results. The homoscedasticity test checks that the dispersion of the dependent variable’s values (the residuals) does not depend on the values of the factor variables (Bai et al. 2016; Balkin et al. 2016). Based on the data obtained from the homoscedasticity verification, we can state that the tested model is an efficient one, and the conclusions formulated based on the obtained results are validated. The Shapiro–Wilk test was also performed to determine whether the residuals have a normal distribution. This test validates the hypothesis that the residuals are normally distributed; otherwise, the hypothesis is rejected (Shapiro and Wilk 1965). The hypothesis is accepted if the value of Sig. > 0.05. We obtained a normal residual distribution for all the models run, so the normal distribution hypothesis was accepted.
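The sketch below illustrates robustness checks in the spirit of those described above, assuming a fitted statsmodels OLS result `model` (for example, model 3 from the earlier sketch). The Shapiro–Wilk test follows the paper; the Breusch–Pagan test is used here only as one possible homoscedasticity test, since the text does not name a specific statistic.

```python
# Normality and homoscedasticity checks for a fitted OLS model (illustrative only).
from scipy.stats import shapiro
from statsmodels.stats.diagnostic import het_breuschpagan

def robustness_checks(model) -> None:
    """Print Shapiro-Wilk (normality) and Breusch-Pagan (constant variance) test results."""
    _, p_normal = shapiro(model.resid)                                    # H0: residuals are normal
    _, p_homosk, _, _ = het_breuschpagan(model.resid, model.model.exog)   # H0: constant variance
    print(f"Shapiro-Wilk p = {p_normal:.3f} -> "
          f"{'normality accepted' if p_normal > 0.05 else 'normality rejected'}")
    print(f"Breusch-Pagan p = {p_homosk:.3f} -> "
          f"{'homoscedasticity accepted' if p_homosk > 0.05 else 'homoscedasticity rejected'}")
```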
In the university context, which follows entrepreneurial behavior, the content of (mainly) voluntary and mandatory information is adapted to stakeholders’ requirements to increase the university’s attractiveness and credibility on the market.

6. Conclusions

The university is a remarkable framework for study. During the past few decades, universities’ behavioral contexts have changed from an exclusively public principle to a hybrid view. These changes reflect adjustments at the national level that have had several effects at the organizational and individual levels. Our study poses an essential argument regarding performance measurement and reporting in HEIs, analyzed from a theoretical and empirical point of view through the factors that influence it. Customization to the particularities of Romania’s HEIs increases the final value of the research because it fills an essential gap in the literature regarding this subject in the former communist countries of Eastern Europe.
Our results sustain that universities’ performance management systems must be designed to respond to the information requirements of various external and internal stakeholders. In universities with hybrid behavior, the content of voluntary and mandatory information is adapted to stakeholders’ requirements to increase the university’s attractiveness and credibility in the market.
Hence, performance reporting is directed towards improving universities’ quality (i.e., in teaching and scientific research). Under the pressure for rationality and efficiency in the use of public financial resources, we noticed that the establishment of a performance-driven system for higher education is determined by linking universities’ governmental funding to an assessment of results.
Our study reveals that the financial resources attracted per student, salary costs, size, and category influence performance reporting in Romanian HEIs. To increase their accountability, responsibility, and transparency, public universities are more exposed to coercive isomorphism linked to legislation: primary funding, salary costs, and size are established according to the Ministry’s rules. The university category and employees’ behavior (which affects the salary cost in the research component) also reflect significant normative isomorphism, starting, for example, from the influence of international rankings or accreditations. This is why we also insist on examining employees’ attitudes through operant theory. The link between institutional and operant theories contributes to a better understanding of the dependence between HEIs and their employees. In general, HEIs are less likely to be affected by mimetic isomorphism, although we can find them assuming behaviors specific to the private sphere, given their evolution towards a hybrid one. In this approach, based on mimetic isomorphism, we found that size is significant in explaining the total extent of disclosure, in accordance with most previous research, which has found that public university size (like corporate size) positively influences the amount of information disclosed on websites and webpage navigability. Therefore, institutions may experience all three types of isomorphism to legitimize themselves, although to different degrees.
Based on previous approaches, Romanian higher education is subject to global isomorphic processes in the context of internationalization. The isomorphic pressure has been felt by Romanian HEIs mainly since accession to the European Union and to several international organizations in the field. Coercive forces lead to the homogenization of organizations’ behaviors and internal structures, as the law imposes a single set of criteria, standards, and indicators. The reference to European models and the adoption of their characteristics (mimetic isomorphism) create similar behaviors at the international level (e.g., the Bologna system, article rankings, subject categories, or teaching plans). Human resources, through employees’ attitudes (normative isomorphism), are expected to sustain institutional or personal interests, based on academic particularities.
Based on these factors and on the empirical research, our study proposes a performance-reporting model grounded in the analysis of Romanian HEIs, with applicability to all public HEIs. Our performance-reporting model aims to fix responsibility for performance results by suggesting a limited list of common indicators for use in institutional performance reports. These indicators are in accordance with IPSASB principles and thus also have international approval.
The paper adds value to the scientific literature from several points of view. Firstly, it contributes to the development of knowledge by covering a gap in the literature concerning Eastern European PHEIs’ performance reporting. From a theoretical viewpoint, our paper combines, in an original way, institutional theory with operant theory to highlight and explain the factors that influence HEIs’ performance-reporting systems. In HEIs, the particular involvement of the human factor in quantifying the performance level generates effects for both the behavior of the institution and the academic staff; this is why we combine the two theories we consider useful for the elaborated research. Secondly, from an empirical viewpoint, the study proposes a performance-reporting model based on mandatory performance-reporting issues, international rankings indicators, and internationally recommended practices. The reporting model is developed along four university dimensions: teaching, research, interaction with the external environment, and financial resources.
The practical implications of the study are all the more important because performance is connected with the factors that influence it, and these factors become key elements in the development and assessment objectives of any public institution. HEIs are no exception, being linked to a competitive process not only at the national level but especially at the international level. The conception of a performance-reporting model based on the particularities of the Romanian system and on the insertion of Romanian universities into international systems (e.g., Bologna) or rankings increases its applicability to other HEIs.
Nevertheless, this research has certain limitations. Despite its advantages (uniqueness and low visibility), an Eastern European country’s analysis may contain some specific particularities due to history or political, economic, or social influences that can affect comparability/generalization with HEIs with long traditions of democratic regimes. From an empirical point of view, the restriction related to the number of public universities and the ability to collect other material can be considered a limitation. For future research, we consider the obvious potential for cross-country and longitudinal studies on performance-reporting systems.

Author Contributions

Conceptualization, A.T.-T.; methodology, S.F.; validation, S.F.; formal analysis, S.F.; investigation, C.S.N.; resources, A.A.; data curation, A.A.; writing—original draft, C.S.N.; writing—review & editing, A.A.; supervision, A.T.-T.; project administration, C.S.N.; funding acquisition, A.T.-T. All authors have read and agreed to the published version of the manuscript.

Funding

This research received no external funding.

Institutional Review Board Statement

Not applicable.

Informed Consent Statement

Not applicable.

Conflicts of Interest

The authors declare no conflict of interest.

Appendix A

Table A1. Performance-reporting disclosure index.
Teaching dimension Performance InformationARWUTHEQSU-MultirankAuthors
1.Number of students at bachelor level X Lukman et al. (2010); Ter Bogt and Scapens (2012); Asif and Searcy (2013)
2.Number of students at master level X
3.Number of students at doctoral level X
4.Number of graduates at bachelor level X XLukman et al. (2010); Shin (2010); Ter Bogt and Scapens (2012); Asif and Searcy (2013)
5.Number of graduates at master level X X
6.Number of graduates at doctoral level X
7.Number of graduates at bachelor and master level X
8.Number of graduates with prestigious prizesX
9.Number of PhD obtained abroad or in joint supervision XAl-Ashaab et al. (2011); Ramos-Vielba et al. (2010); Albats et al. (2018)
10.Number of PhD with prestigious prizes X
11.Insertion in the labor market XXAsif and Searcy (2013); Lukman et al. (2010)
12.Teachers with prestigious prizesX
13.Number of programs of studies Suryadi (2007)
14.Didactic staff periodical assessment X Suryadi (2007); Asif and Searcy (2013); Ter Bogt and Scapens (2012); Lukman et al. (2010)
15.University’s reputation in teaching X
Research dimension16.University’s reputation in research X
17.Number of publications XSuryadi (2007); Guthrie and Neumann (2007); Lukman et al. (2010); Al-Ashaab et al. (2011); Ter Bogt and Scapens (2012); Asif and Searcy (2013)
Kauppila et al. 2015; Albats et al. 2018; Bonollo and Merli (2018)
18.Number of publications with coauthors from abroad X
19.Ratio between publications number and teaching staff X
20.Number of publications with teachers from the region X
21.Number of publications in Nature and ScienceX
22.Number of interdisciplinary publications X
23.Number of publications indexed in Science Citation Index and Social Science Citation Index X
24.Number of citations XXXSuryadi (2007); Guthrie and Neumann (2007); Lukman et al. (2010); Al-Ashaab et al. (2011); Perkmann et al. 2011; Rossi and Rosli 2015; Asif and Searcy (2013)
25.Number of research projects X
26.Number of patent cited publications X
27.Number of inventions certificates and patents X
28.Number of post-doctoral places XSeppo and Lilles (2012)
External environment dimension29.Number of international students X Lukman et al. (2010); Hegarty (2014)
30.Ratio between number of students with Romanian residence and international students X
31.Number of students from mobilities XLukman et al. (2010); Hegarty (2014)
32.Number of international professors XXBonollo and Merli (2018)
33.Ratio between the didactic staff with Romanian residence and from abroad X
34.Number of study programs at bachelor and master level in foreign languages XAsif and Searcy (2013)
35.Number of partnerships with other universities XAlbats et al. (2018)
36.Number of partnerships with companies and not for profit entities XSeppo and Lilles 2012; Asif and Searcy (2013); Rossi and Rosli 2015
37.Number of students in internships XLukman et al. (2010); Al-Ashaab et al. 2011; Bonollo and Merli (2018)
38.Number of spin-offs XSeppo and Lilles 2012; Rossi and Rosli 2015; Asif and Searcy (2013); Bonollo and Merli (2018)
39.Number of start-ups X
Financial dimension40.Financial resources obtained from the state budget Seppo and Lilles (2012); Francesconi and Guarini (2018)
41.Financial resources obtained from research activities X XGuthrie and Neumann (2007); Shin (2010)
42.Ratio between financial and research resources X
43.Financial resources obtained from private sector XSeppo and Lilles 2012; Asif and Searcy (2013); Rossi and Rosli 2015
44.Financial resources obtained from lifelong learning trainings and professional programs XGuthrie and Neumann (2007); Hegarty (2014)
45.Financial resources obtained from local partners XAsif and Searcy (2013)
46.Ratio between financial resources and didactic staff X Guthrie and Neumann (2007); Hegarty (2014)

References

  1. Agyemang, Gloria, and Jane Broadbent. 2015. Management control systems and research management in universities: An empirical and conceptual exploration. Accounting, Auditing and Accountability Journal 29: 1018–46. [Google Scholar] [CrossRef] [Green Version]
  2. Alach, Zhivan. 2017. The use of performance measurement in universities. International Journal of Public Sector Management 30: 102–17. [Google Scholar] [CrossRef]
  3. Al-Ashaab, Ahmed, Myrna Flores, Athanasia Doultsinou, and Andrea Magyar. 2011. A balanced scorecard for measuring the impact of industry–university collaboration. Production Planning and Control 22: 554–70. [Google Scholar] [CrossRef] [Green Version]
  4. Albats, Ekaterina, Irina Fiegenbaum, and James A. Cunningham. 2018. A micro level study of university industry collaborative lifecycle key performance indicators. Journal Technological Transformation 43: 389–431. [Google Scholar] [CrossRef]
  5. Alexander, F. King. 1998. The Endless Pursuit of Efficiency: The International Movement to Increase Accountability and Performance in Higher Education. Paper presented at the ASHE Annual Meeting Paper, Miami, FL, USA, November 5–8. [Google Scholar]
  6. Alexander, F. King. 2000. The Changing Face of Accountability: Monitoring and Assessing Institutional Performance in Higher Education. The Journal of Higher Education 71: 411–31. [Google Scholar] [CrossRef]
  7. Asif, Muhammad, and Cory Searcy. 2013. Determining the key capabilities required for performance excellence in higher education. Total Quality Management and Business Excellence 25: 22–35. [Google Scholar] [CrossRef]
  8. Aversano, Natalia, Francesca Manes-Rossi, and Paolo Tartaglia-Polcini. 2018. Performance Measurement Systems in Universities: A Critical Review of the Italian System. In Outcome-Based Performance Management in the Public Sector. Edited by Borgonovi Elio, Anessi-Pessina Eugenio and Bianchi Carmine. Berlin: Springer, pp. 269–88. [Google Scholar]
  9. Ayers, Rebecca S. 2015. Aligning Individual and Organizational Performance: Goal Alignment in Federal Government Agency Performance Appraisal Programs. Public Personnel Management 44: 169–91. [Google Scholar] [CrossRef]
  10. Bai, Zhidong, Guangming Pan, and Yanqing Yin. 2016. Homoscedasticity tests for both low and high-dimensional fixed design regressions. arXiv arXiv:1603.03830. [Google Scholar]
  11. Balkin, Richard S., Katelyn M. Richey Gosnell, Andrew Holmgren, and Jason W. Osborne. 2016. Nonlinear Analysis in Counseling Research. Measurement and Evaluation in Counseling and Development 50: 109–15. [Google Scholar] [CrossRef]
  12. Banks, William, and Morton Nelson. 1994. Financial disclosures by Ontario universities: 1988–1993. Journal of International Accounting Auditing and Taxation 3: 287–305. [Google Scholar] [CrossRef]
  13. Behn, Robert D. 2003. Why measure performance? Different purposes require different measures. Public Administration Review 63: 586–606. [Google Scholar] [CrossRef]
  14. Bonollo, Elisa, and Mara Zuccardi Merli. 2018. Performance Reporting in Italian Public Universities: Activities in Support of Research, Teaching and the “Third Mission”. In Outcome-Based Performance Management in the Public Sector. Edited by Borgonovi Elio, Anessi-Pessina Eugenio and Bianchi Carmine. Berlin: Springer, pp. 307–30. [Google Scholar]
  15. Bouckaert, Geert. 1993. Measurement and meaningful management. Public Productivity and Management Review 17: 31–43. [Google Scholar] [CrossRef]
  16. Brusca, Isabel, Sandra Cohen, Francesca Manes-Rossi, and Giuseppe Nicolò. 2019. Intellectual capital disclosure and academic rankings in European universities: Do they go hand in hand? Meditari Accountancy Research 28: 51–71. [Google Scholar] [CrossRef]
  17. Burke, Joseph C., Henrik Minassians, and Po Yang. 2002. State Performance Reporting Indicators: What Do They Indicate? Planning for Higher Education 31: 15–30. [Google Scholar]
  18. Burns, John, and Robert W. Scapens. 2000. Conceptualizing management accounting change: An institutional framework. Management Accounting Research 11: 3–25. [Google Scholar] [CrossRef]
  19. Cai, Yuzhuo. 2010. Global isomorphism and governance reform in Chinese higher education. Tertiary Education and Management 16: 229–41. [Google Scholar] [CrossRef]
  20. Capella-Peris, Carlos, Jesús Gil-Gómez, Manuel Martí-Puig, and Paola Ruíz-Bernardo. 2020. Development and validation of a scale to assess social entrepreneurship competency in higher education. Journal of Social Entrepreneurship 11: 23–39. [Google Scholar] [CrossRef]
  21. Caputo, Fabio, Lorenzo Ligorio, and Simone Pizzi. 2021. The contribution of higher education institutions to the SDGs—An evaluation of sustainability reporting practices. Administrative Sciences 11: 97. [Google Scholar] [CrossRef]
  22. Chan, Vivian. 2015. Implications of key performance indicator issues in Ontario universities explored. Journal of Higher Education Policy and Management 37: 41–51. [Google Scholar] [CrossRef]
  23. Chen, Huifa, Qingliang Tang, Yihong Jiang, and Zhijun Lin. 2010. The role of international financial reporting standards in accounting quality: Evidence from the European Union. Journal of International Financial Management and Accounting 21: 220–78. [Google Scholar] [CrossRef]
  24. Choong, Kwee Keong. 2013. Understanding the features of performance measurement system: A literature review. Measuring Business Excellence 17: 102–21. [Google Scholar] [CrossRef]
  25. Coste, Andreea Ioana, and Adriana Tiron-Tudor. 2015. Performance indicators in Romanian Higher Education. SEA: Practical Application of Science 3: 177–80. [Google Scholar]
  26. Curaj, Adrian, Liviu Matei, Remus Pricopie, Jamil Salmi, and Peter Scott. 2015. The European Higher Education Area: Between Critical Reflections and Future Policies. Berlin: Springer Nature, p. 898. [Google Scholar]
  27. DiMaggio, Paul J., and Walter W. Powell. 1983. The iron cage revisited: Institutional isomorphism and collective rationality in organizational fields. American Sociological Review 48: 147–60. [Google Scholar] [CrossRef]
  28. Dobija, Dorota, Anna Maria Górska, and Anna Pikos. 2018. The impact of accreditation agencies and other powerful stakeholders on the performance measurement in Polish universities. Baltic Journal of Management 14: 84–102. [Google Scholar] [CrossRef]
  29. Dumitru, Valentin Florentin, Andrei Stanciu, Madalina Dumitru, and Liliana Feleaga. 2014. Pressure and isomorphism in business education. Amfiteatru Economic Journal 16: 784–99. [Google Scholar]
  30. Dunkin, Ruth. 2005. The HR challenge: Some more thoughts. Paper presented at the OECD Conference on Trends in the Management of Human Resources, Paris, France, August 25–26. [Google Scholar]
  31. Education Law. 2011. Education Law No 1 of 2011, Romanian Official Gazette, XXIII, No 18, 10 January 2011. Available online: https://www.edu.ro/sites/default/files/legea-educatiei_actualizata%20august%202018.pdf (accessed on 8 November 2022).
  32. Fligstein, Neil. 1998. The politics of quantification. Accounting Organizations and Society 23: 325–33. [Google Scholar] [CrossRef]
  33. Francesconi, Andrea, and Enrico Guarini. 2018. Performance-Based Funding and Internal Resource Allocation: The Case of Italian Universities. In Outcome-Based Performance Management in the Public Sector. Edited by Borgonovi Elio, Anessi-Pessina Eugenio and Bianchi Carmine. Berlin: Springer, pp. 289–306. [Google Scholar]
  34. Gallego, Isabel, Isabel-María García, and Luis Rodríguez. 2010. Universities’ websites: Disclosure practices and the revelation of financial information. The International Journal of Digital Accounting Research 9: 153–92. [Google Scholar] [CrossRef]
  35. Gallego-Álvarez, Isabel, José Manuel Prado-Lorenzo, and Isabel-María García-Sánchez. 2011. Corporate social responsibility and innovation: A resource-based theory. Management Decission 49: 1709–27. [Google Scholar] [CrossRef]
  36. Garde Sanchez, Raquel, Jesús Mauricio Flórez-Parra, María Victoria López-Pérez, and Antonio Manuel López-Hernández. 2020. Corporate governance and disclosure of information on corporate social responsibility: An analysis of the top 200 universities in the shanghai ranking. Sustainability 12: 1549. [Google Scholar] [CrossRef] [Green Version]
  37. Ghosh, Dipankar, and Anne Wu. 2012. The Effect of Positive and Negative Financial and Nonfinancial Performance Measures on Analysts’ Recommendations. Behavioral Research in Accounting 24: 47–64. [Google Scholar] [CrossRef]
  38. Glaum, Martin, Peter Schmidt, Donna L. Street, and Silvia Vogel. 2013. Compliance with IFRS 3-and IAS 36-required disclosures across 17 European countries: Company-and country-level determinants. Accounting and Business Research 43: 163–204. [Google Scholar] [CrossRef]
  39. Gomes, Carlos F., and Mahmoud M. Yasin. 2017. Toward promoting effective strategic performance: The relevance of the alignment of performance measurement and competitive strategic choices. International Journal of Business Excellence 12: 329–50. [Google Scholar] [CrossRef]
  40. Gonzales, Leslie D. 2012. Responding to mission creep: Faculty members as cosmopolitan agents. Higher Education 64: 337–53. [Google Scholar] [CrossRef]
  41. Gordon, Gus, and Mary Fischer. 2018. The need for performance management systems in public higher education. Journal of Higher Education Theory and Practice 18: 79. [Google Scholar]
  42. Gordon, George, and Celia Whitchurch. 2007. Managing human resources in higher education: The implications of a diversifying workforce. Higher Education Management and Policy 19: 1–21. [Google Scholar] [CrossRef]
  43. Gordon, Teresa, Mary Fischer, David Malone, and Greg Tower. 2002. A Comparative Empirical Examination of Extent of Disclosure by Private versus Public Colleges and Universities in the United States. Journal of Accounting and Public Policy 21: 235–75. [Google Scholar] [CrossRef]
  44. Greve, Henrich R., and Linda Argote. 2015. Behavioral theories of organization. In International Encyclopedia of the Social and Behavioral Sciences, 2nd ed. Amsterdam: Elsevier, pp. 481–86. [Google Scholar]
  45. Grossi, Giuseppe, Dorota Dobija, and Wojciech Strzelczyk. 2020. The Impact of Competing Institutional Pressures and Logics on the Use of Performance Measurement in Hybrid Universities. Public Performance and Management Review 43: 818–44. [Google Scholar] [CrossRef] [Green Version]
  46. Grossi, Giuseppe, Kirsi-Mari Kallio, Massimo Sargiacomo, and Matti Skoog. 2019. Accounting, performance management systems, and accountability changes in knowledge-intensive public organizations: A literature review and research agenda. Accounting, Auditing and Accountability Journal 33: 256–80. [Google Scholar] [CrossRef]
  47. Guthrie, James, and Ruth Neumann. 2007. Economic and non-financial performance indicators in universities: The establishment of a performance-driven system for Australian higher education. Public Management Review 9: 231–52. [Google Scholar] [CrossRef]
  48. Hatry, Harry P. 2013. Sorting the relationships among performance measurement, program evaluation, and performance management. New Directions for Evaluation 137: 19–32. [Google Scholar] [CrossRef]
  49. Hegarty, Niall. 2014. Where We Are Now-The Presence and Importance of International Students to Universities in the United States. Journal of International Students 4: 223–35. [Google Scholar] [CrossRef]
  50. Hyndman, Noel, and Ron Eden. 2001. Rational management, performance targets and executive agencies: Views from agency chief executives in Northern Ireland. Public Administration 79: 579–98. [Google Scholar] [CrossRef]
  51. IPSASB. 2015. Recommended Practice Guideline (RPG3) on Reporting Service Performance Information. IFAC. Available online: http://www.ifac.org/system/files/publications/files/IPSASBRPG-3-Reporting-Service-Performance-Information.pdf (accessed on 1 November 2022).
  52. Jesilow, Paul, and Julianne Ohlander. 2010. The impact of the National Practitioner Data Bank on licensing actions by state medical licensing boards. Journal of Health and Human Services Administration 33: 94–126. [Google Scholar] [PubMed]
  53. Johnson, David R. 1995. Alternative methods for the quantitative analysis of panel data in family research: Pooled time-series models. Journal of Marriage and the Family 77: 1065–77. [Google Scholar] [CrossRef]
  54. Kauppila, Osmo, Anu Mursula, Janne Harkonen, and Jaakko Kujala. 2015. Evaluating university–industry collaboration: The European foundation of quality management excellence model-based evaluation of university– industry collaboration. Tertiary Education and Management 21: 229–44. [Google Scholar] [CrossRef]
  55. Kogan, M., I. Moses, and E. El-Khawas. 1994. The changing framework for the academic profession. In Staffing Higher Education Meeting New Challenges. Report on the IMHE Project on Policies for Academic Staffing in Higher Education. Bristol: Jessica Kingsley Publishers Ltd., pp. 12–32. [Google Scholar]
  56. Kyrillidou, M. 2002. Current Context for Performance Indicators in Higher Education. Washington, DC: ARL Statistics and Measurement Program Home, Association of Research Libraries. [Google Scholar]
  57. Lee, Chongmyoung. 2021. Factors influencing the credibility of performance measurement in nonprofits. International Review of Public Administration 26: 156–74. [Google Scholar] [CrossRef]
  58. Levine, Arthur. 1997. Higher Education’s New Status as a Mature Industry. Washington, DC: Chronicle of Higher Education. [Google Scholar]
  59. Liguori, Mariannunziata, and Ileana Steccolini. 2012. Accounting change: Explaining the outcomes, interpreting the process. Accounting, Auditing and Accountability Journal 25: 27–70. [Google Scholar] [CrossRef]
  60. Lindenberger, Ulman, and Ulrich Pötter. 1998. The complex nature of unique and shared effects in hierarchical linear regression: Implications for developmental psychology. Psychological Methods 3: 218. [Google Scholar] [CrossRef]
  61. Lovata, Linda M. 1992. The use of operant theory in the design of performance reporting systems. Management Accounting Research 3: 273–89. [Google Scholar] [CrossRef]
  62. Lukman, Rebeka, Damjan Krajnc, and Peter Glavič. 2010. University ranking using research, educational and environmental indicators. Journal of Cleaner Production 18: 619–28. [Google Scholar] [CrossRef]
  63. Maingot, Michael, and Daniel Zeghal. 2008. An Analysis of Voluntary Disclosure of Performance Indicators by Canadian Universities. Tertiary Education and Management 14: 269–83. [Google Scholar] [CrossRef] [Green Version]
  64. Manolescu, G. 2009. Uncertainty in Financing Higher Education. Working Papers. Bucharest: Spiru Haret University, Faculty of Finance and Banking, Center for Advanced Economic and Financial Research, vol. 3. [Google Scholar]
  65. Montesinos, Vicente, Isabel Brusca, Francesca Manes Rossi, and Natalia Aversano. 2013. The usefulness of performance reporting in local government: Comparing Italy and Spain. Public Money and Management 33: 171–76. [Google Scholar] [CrossRef]
  66. Najeeb, Ali. 2013. Institutional perspectives in HRM and MNC research: A review of key concepts. International Journal of Employment Studies 21: 79–100. [Google Scholar]
  67. Orr, Dominic, Alex Usher, and Johannes Wespel. 2014. Do Changes in Cost-Sharing Have an Impact on the Behaviour of Students and Higher Education Institutions? Evidence from Nine Case Studies, Volume II: National Reports. Luxembourg: Publications Office of the European Union. [Google Scholar]
  68. Osborne, Jason W. 2000. Advantages of hierarchical linear modeling. Practical Assessment, Research and Evaluation 7: 1–3. [Google Scholar]
  69. Paauwe, Jaap, and Paul Boselie. 2003. Challenging ‘strategic HRM’ and the relevance of the institutional setting. Human Resource Management Journal 13: 56–70. [Google Scholar] [CrossRef]
  70. Perkmann, Markus, Andy Neely, and Kathryn Walsh. 2011. How should firms evaluate success in university–industry alliances? A performance measurement system. R&D Management 41: 202–16. [Google Scholar]
  71. Pollitt, Christopher, and Sorin Dan. 2013. Searching for impacts in performance-oriented management reform. Public Performance and Management Review 37: 7–32. [Google Scholar] [CrossRef] [Green Version]
  72. Pricopie, Remus, Luminiţa Nicolescu, Zeno Reinhardt, and Oana Almăŝan. 2009. Dynamics in the Internationalization of Higher Education at the Global Level and Specific Trends in Romania. Revista Romana de Comunicare Si Relatii Publice 11: 103–9. [Google Scholar]
  73. Ramos-Vielba, Irene, Manuel Fernández-Esquinas, and Elena Espinosa-de-los-Monteros. 2010. Measuring university–industry collaboration in a regional innovation system. Scientometrics 84: 649–67. [Google Scholar] [CrossRef]
  74. Rossi, Federica, and Ainurul Rosli. 2015. Indicators of university–industry knowledge transfer performance and their implications for universities: Evidence from the United Kingdom. Studies in Higher Education 40: 1970–91. [Google Scholar] [CrossRef] [Green Version]
  75. Rossi, Francesca Manes, and Natalia Aversano. 2015. Advancing performance measurement Italian local government vis-à-vis the IPSASB project. The International Journal of Productivity and Performance Management 64: 76–93. [Google Scholar] [CrossRef]
  76. Sangiorgi, Daniela, and Benedetta Siboni. 2017. The disclosure of intellectual capital in Italian universities: What has been done and what should be done. Journal of Intellectual Capital 18: 354–72. [Google Scholar] [CrossRef]
  77. Scott, Peter. 2007. Higher education in Central and Eastern Europe. In International Handbook of Higher Education. Edited by James J. F. Forest and Philip G. Altbach. Berlin: Springer, pp. 423–42. [Google Scholar]
  78. Seppo, Marge, and Alo Lilles. 2012. Indicators measuring university-industry cooperation. Discussions on Estonian Economic Policy: Theory and Practice of Economic Policy 20: 204–25. [Google Scholar]
  79. Shapiro, Samuel Sanford, and Martin B. Wilk. 1965. An analysis of variance test for normality (complete samples). Biometrika 52: 591–611. [Google Scholar] [CrossRef]
  80. Shin, Jung Cheol. 2010. Impacts of performance-based accountability on institutional performance in the US. Higher Education 60: 47–68. [Google Scholar] [CrossRef]
  81. Siegel, Donald S., and Mike Wright. 2015. Academic entrepreneurship: Time for a rethink? British Journal of Management 26: 582–95. [Google Scholar] [CrossRef]
  82. Suhaiza, Ismail, and Barizah Abu Bakar Nur. 2011. Reporting practices of Malaysian public universities: The extent of accountability disclosure. African Journal of Business Management 5: 6366. [Google Scholar]
  83. Suryadi, Kadarsah. 2007. Framework of measuring key performance indicators for decision support in higher education institution. Journal of Applied Sciences Research 3: 1689–95. [Google Scholar]
  84. Ter Bogt, Henk J., and Robert W. Scapens. 2012. Performance management in universities: Effects of the transition to more quantitative measurement systems. European Accounting Review 21: 451–97. [Google Scholar] [CrossRef]
  85. Terracciano, Antonio, Robert R. McCrae, Larry J. Brant, and Paul T. Costa Jr. 2005. Hierarchical linear modeling analyses of the NEO-PI-R scales in the Baltimore Longitudinal Study of Aging. Psychology and Aging 20: 493. [Google Scholar] [CrossRef] [Green Version]
  86. Turner, D. 2011. Quality in Higher Education. Rotterdam: Sense. [Google Scholar]
  87. Ulrich, R., T. Stachnik, and J. Mabry, eds. 1974. Control of Human Behavior: Behavior Modification in Education. Glenview: Scott, Foresman, vol. 3. [Google Scholar]
  88. Urdari, Claudia, Teodora Viorica Farcas, and Adriana Tiron-Tudor. 2017. Assessing the legitimacy of HEIs’ contributions to society: The perspective of international rankings. Sustainability Accounting, Management and Policy Journal 8: 191–215. [Google Scholar] [CrossRef]
Table 1. Description of the variables used in the statistical model.
Symbol | Variable | Computation | Hypothesis | Authors
DI | Disclosure index | Disclosure index for information related to performance | Dependent variable | Gallego-Álvarez et al. 2011; Suhaiza and Nur 2011
RES | Financial resources attracted per student | Ratio between total revenue and total number of students | H1 | Guthrie and Neumann 2007; Chan 2015
SAL | Costs with salaries | Ratio between total costs with salaries and total number of students | H2 | Gordon and Whitchurch 2007
SIZE | University’s size | Ratio between total assets and total number of students | H3 | Ter Bogt and Scapens 2012; Chan 2015
CAT_1 | University’s category | Control variable with value ‘1’ for universities from the category advanced research and education; value ‘0’ for other categories | H4 (control) | Law 1/2011
CAT_2 | University’s category | Control variable with value ‘1’ for universities from the category teaching and scientific research; value ‘0’ for other categories | H4 (control) | Law 1/2011
CAT_3 | University’s category | Control variable with value ‘1’ for universities from the category education centered; value ‘0’ for other categories | H4 (control) | Law 1/2011
Table 2. Variables’ descriptive statistics.
Variable | Average | Mean square deviation
DI or DITotal | 0.54 | 0.14
DITeach | 0.60 | 0.21
DIResearch | 0.49 | 0.14
DIExt | 0.42 | 0.24
DIFin | 0.71 | 0.15
RES | 8201.38 | 3237.99
SAL | 4429.14 | 923.99
SIZE | 23,645.61 | 7904.41
CAT_1 | 0.25 | 0.44
CAT_2 | 0.28 | 0.45
CAT_3 | 0.47 | 0.50
Table 3. Correlation matrix for RPDI.
Variable | DITotal | RES | SAL | SIZE | CAT_1 | CAT_2
DITotal | 1
RES | 0.349 *** | 1
SAL | 0.541 *** | 0.583 *** | 1
SIZE | 0.236 ** | 0.302 *** | 0.255 ** | 1
CAT_1 | 0.727 *** | 0.246 ** | 0.429 *** | 0.276 ** | 1
CAT_2 | 0.072 | 0.099 | 0.209 ** | −0.271 ** | −0.361 *** | 1
Significance test: ** significant at level 0.05; *** significant at level 0.01.
Table 4. Performance-reporting model characteristics.
Variable | Model 1 coefficient | Model 1 t | Model 2 coefficient | Model 2 t | Model 3 coefficient | Model 3 t
RES | 0.22 × 10−5 | 0.391 | 0.12 × 10−5 | 0.21 | 0.21 × 10−5 | 0.539
SAL | 0.75 × 10−4 | 3.3858 *** | 0.73 × 10−4 | 3.741 *** | 0.12 × 10−4 | 0.724
SIZE | - | - | 0.17 × 10−5 | 0.880 | 0.14 × 10−5 | 0.952
CAT_1 | - | - | - | - | 0.246 | 8.203 ***
CAT_2 | - | - | - | - | 0.107 | 3.846 ***
R2 | 0.294 | | 0.303 | | 0.677 |
F | 12.711 *** | | 8.701 *** | | 24.365 *** |
Significance test: *** significant at level 0.01.
Table 5. Correlation matrix for teaching dimension.
Variable | DITeach | RES | SAL | SIZE | CAT_1 | CAT_2
DITeach | 1
RES | 0.32 *** | 1
SAL | 0.38 *** | 0.583 *** | 1
SIZE | 0.26 ** | 0.302 *** | 0.255 ** | 1
CAT_1 | 0.54 *** | 0.246 ** | 0.429 *** | 0.276 ** | 1
CAT_2 | 0.10 | 0.099 | 0.209 ** | −0.271 ** | −0.361 *** | 1
Significance test: ** significant at 0.05; *** significant at 0.01.
Table 6. Correlation matrix for research dimension.
Variable | DIResearch | RES | SAL | SIZE | CAT_1 | CAT_2
DIResearch | 1
RES | 0.144 | 1
SAL | 0.413 *** | 0.583 *** | 1
SIZE | −0.015 | 0.302 *** | 0.255 ** | 1
CAT_1 | 0.623 *** | 0.246 ** | 0.429 *** | 0.429 *** | 1
CAT_2 | −0.215 ** | 0.099 | 0.209 ** | −0.271 ** | −0.361 *** | 1
Significance test: ** significant at level 0.05; *** significant at level 0.01.
Table 7. Correlation matrix for external environment.
Variable | DIExt | RES | SAL | SIZE | CAT_1 | CAT_2
DIExt | 1
RES | 0.252 ** | 1
SAL | 0.342 *** | 0.583 *** | 1
SIZE | 0.184 * | 0.302 *** | 0.255 ** | 1
CAT_1 | 0.407 *** | 0.246 ** | 0.429 *** | 0.276 *** | 1
CAT_2 | 0.228 ** | 0.099 | 0.209 ** | −0.271 ** | −0.361 *** | 1
Significance test: * significant at level 0.1; ** significant at level 0.05; *** significant at level 0.01.
Table 8. Correlation matrix for financial dimension.
Variable | DIFin | RES | SAL | SIZE | CAT_1 | CAT_2
DIFin | 1
RES | 0.261 ** | 1
SAL | 0.549 *** | 0.583 *** | 1
SIZE | 0.187 * | 0.302 *** | 0.255 ** | 1
CAT_1 | 0.640 *** | 0.246 ** | 0.429 *** | 0.276 ** | 1
CAT_2 | −0.34 | 0.09 | 0.209 ** | −0.271 ** | −0.361 *** | 1
Significance test: * significant at level 0.1; ** significant at level 0.05; *** significant at level 0.01.
Table 9. Variables’ influence over the level of performance-reporting disclosure index.
Variable | DITeach | DIResearch | DIExt | DIFin | DITotal
RES | + | 0 | + | + | +
SAL | + | + | + | + | +
SIZE | + | 0 | + | + | +
Cat_1 | + | + | + | + | +
Cat_2 | 0 | − | + | 0 | 0
Table 10. Hypotheses validation.
Hypotheses | DITeach | DIResearch | DIExt | DIFin | DITotal
Hypothesis 1: Performance reporting is associated with the financial resources attracted. | Reject | Reject | Reject | Reject | Reject
Hypothesis 2: Performance reporting is associated with salary costs. | Partially accept | Accept | Partially accept | Accept | Partially accept
Hypothesis 3: Performance reporting is associated with the size of the higher education institution. | Reject | Partially accept | Reject | Reject | Reject
Hypothesis 4: Performance reporting is associated with the quality category of the higher education institution. | Accept | Accept | Accept | Accept | Accept
Publisher’s Note: MDPI stays neutral with regard to jurisdictional claims in published maps and institutional affiliations.
