Article

The Performance and Qualitative Evaluation of Scientific Work at Research Universities: A Focus on the Types of University and Research

by Dmitry A. Radushinsky 1,*, Egor O. Zamyatin 2, Alexandra I. Radushinskaya 3, Ivan I. Sytko 1 and Ekaterina E. Smirnova 1

1 Department of Metrology, Instrumentation and Quality Management, Empress Catherine II Saint Petersburg Mining University (Saint Petersburg Mining University), 199106 St. Petersburg, Russia
2 Department of Scientific Research, Empress Catherine II Saint Petersburg Mining University (Saint Petersburg Mining University), 199106 St. Petersburg, Russia
3 Department of Russian Policy, Saint Petersburg State University (SPbGU), 191124 St. Petersburg, Russia
* Author to whom correspondence should be addressed.
Sustainability 2024, 16(18), 8180; https://doi.org/10.3390/su16188180
Submission received: 12 August 2024 / Revised: 11 September 2024 / Accepted: 14 September 2024 / Published: 19 September 2024

Abstract: The successful implementation of scientific research is one of the key factors for sustainable development, including the development of tertiary education. Today, a leading or “world-class” university transfers knowledge into innovation, embodying the concept of “academic excellence”, and the features of “research” and “entrepreneurial” universities closely match the SDGs. This article presents an analysis of indicators for assessing the quality of scientific work at research universities. It also studies university science models in different countries, the features of two university types—engineering (technical or technological) and comprehensive (multidisciplinary)—and the problems arising in university science as a whole, together with their possible solutions. The authors suggest a composition of indicators for assessing the quality of fundamental scientific research and engineering project results and the performances of universities’ specialized scientific units. Survey respondents weighted the significance of the indicators. The research used methods of analysis, brainstorming, Ishikawa diagram building, and a survey of specialists. The results obtained can be useful for the improvement of quality management systems (QMSs) at research universities. Some implications of this study could be extended to improve the QMSs and management processes of specialized scientific organizations belonging to the state, business, and non-profit sectors of science.

1. Introduction

Successfully developing universities are centers not only of higher education but also of science and technology, and the most advanced of them are also centers of technological implementation and innovation. According to H. Etzkowitz’s well-known concept of the “triple helix of innovation”, universities can actively participate in the technological and innovative development of territories and industries, using their scientific and social potential [1,2]. This combination closely matches the sustainable development goals (SDGs).
Public policy in economically leading countries stimulates the commercialization of innovations through cooperation among educational, scientific, and industrial organizations. Tertiary or higher-education units become a valuable part of applied research. In many countries, fundamental science also receives support through so-called research universities of an engineering nature (technical or technological universities) as well as through multidisciplinary (comprehensive and general education) universities. University science, in this system, contributes to local sustainable development, including the development of industries and of tertiary education itself, for national and international sustainability.
Research universities are associated with better opportunities for young specialists and with prospects for interaction with developers of medium- and high-level “scientific products” of both fundamental and applied natures [3,4,5].
The levels of funding channeled through the higher-education sector, as percentages of gross domestic expenditure on research and development (GERD), are displayed for selected national university science models (Table 1).
According to the data in Table 1, we can conclude that there are several distinct levels of GERD funding through universities: the highest levels are demonstrated by countries such as Serbia and Turkey (18–35%); the middle levels include Russia and China (8–10%), and possibly Japan (5.3%); the EU as a whole and the USA demonstrate the lowest levels (from 1 to 3.4%). However, one should keep in mind that substantial government and business enterprise research contracts funded through universities in EU countries and the USA are counted not as higher-education-sector contributions but as business and government contributions [3,4].
It is also notable that in several European countries, e.g., Serbia, North Macedonia, Turkey, Croatia, Cyprus, Spain, Portugal, and Iceland (Table 1, https://ec.europa.eu/eurostat/databrowser/view/rd_e_fundgerd/default/table?lang=en (accessed on 10 September 2024)), the higher-education sector contributed to GERD much more in 2000–2023 than the EU average, which Eurostat put at 1–3% (Figure 1).
The greater part of the GERD in the EU is funded by business enterprises, the government sector, and foreign funds from the rest of the world (Figure 1).
In comparison, in Russia (the Russian Federation), the share of the higher-education sector in the GERD reached 5–11% in 2000–2023, gradually increasing (Figure 2). Business enterprises covered 57–70% of the R&D costs; the government, 25–33%; and the non-profit organizations’ share was the lowest (0.2–0.8%). These figures largely coincide with the EU data, except for foreign funds, whose input was negligible in Russia during the period of this study.
According to the diagram (Figure 2), the share of the higher-education sector in Russia has doubled since 2000, reaching 9–11% in 2015–2023. This increase can be attributed, in particular, to investments connected with the national project “Science and Universities” (per a decree of the President of the Russian Federation of 21 July 2020, No. 474). The goal of that national project is to ensure the presence of the Russian Federation among the ten leading countries in the world in science.
By transferring financial resources for conducting research and purchasing equipment to universities, the Russian Federation demonstrates one model for financing university science. In a number of Western countries that are leaders in R&D, universities receive large research contracts from private, state, and municipal organizations. In fact, both models create conditions under which scientists at universities can conduct independent research using modern equipment, involving undergraduate and graduate students in their work. The possibility of transferring knowledge and innovation ensures the sustainable development of industries and territories and of higher (tertiary) education itself.
There is a lot of evidence that for different areas of research, the scientific base that is created at universities can be of great importance for the transfer of innovations [8,9]; in particular, it contributes to sustainable development in mineral complexes [10,11,12].
The purpose of this article is to identify the significance of indicators for assessing the quality of university scientific units’ (SUs’) performances from the point of view of their employees—professional researchers. The study took into account the differences between fundamental research and engineering projects and the features of the two major university types—engineering (technical or technological) and comprehensive (multidisciplinary).
The objectives of this study include the development of a list of indicators, the provision of an indicator assessment for fundamental research and engineering projects, an analysis of typical problems in university research as a whole, and an analysis of the differences in university science models and processes at engineering and comprehensive universities.

2. Materials and Methods

The quality and volume of R&D activities significantly influence the socioeconomic role of the university in its region and its position in university rankings, in particular, as developed by the Shanghai Ranking Consultancy (SRC), Quacquarelli Symonds (QS), Times Higher Education (THE), and the Center for World University Rankings (CWUR) [13,14,15].
Not all universities perform a scientific function or hold the status of “research university”; many do not meet the criteria for this title. A distinctive feature of “research universities” is the significant weight of scientific departments, both in the structure of the university’s staff and in the sources of its income. For universities long positioned as research universities, another important feature is the distribution of premises: the total area of the laboratories, offices, and training and other spaces intended for scientific work can be as large as the total area of the classrooms [16,17,18].
In Russia, a plan has been developed for the creation of 25–30 modern university campuses, with a total area of 2.3 million square meters, by 2030. This plan (a federal project of the Ministry of Science and Higher Education of Russia, “Advanced Engineering Schools”, planned for implementation in 2022–2030, https://engineers2030.ru/ (accessed on 10 September 2024)) was designed to strengthen the scientific activities of universities while providing them with the necessary premises and equipment for research.
Besides premises, specialized scientific units (SUs) or scientific and technical units (STUs) at universities are created in the form of scientific R&D centers, laboratories, bases, etc. These organizational structures perform research on a permanent basis. In countries such as China and Russia, the model of university science presumes that government and business contracts and financial resources provide better research opportunities for both researchers and lecturers employed at the SUs and STUs of universities that are national science leaders. Such universities are incorporated into special projects, for example, “Project 985”, “Project 211”, and “Double First-Class” in China [19,20,21] and the “5–100” and “Priority 2030” projects in Russia [22,23,24].
For the purposes of this article, we consider it appropriate to distinguish between specialized SUs/STUs, which conduct research on a permanent basis, and education units (EUs), which may conduct research as a supplement to their major teaching tasks (Figure 3).
As indicated in Figure 3, the categories reflect the following concepts:
(a) fundamental research that does not provide an immediate commercial effect but may have significant innovation and implementation potential in the future; the SUs/STUs where such research is primarily carried out have the status of “centers of fundamental research”;
(b) engineering projects (connected with applied works and consulting projects), for which demand from the business sector is likely to arise and commercial contracts can be signed; the SUs/STUs where they are primarily carried out have the status of “engineering divisions” and usually act as “profit centers” for university research as a whole.
The role of education units (EUs) is considered supplemental to that of specialized SUs/STUs, which conduct the main body of research. However, the research performance of EUs can be evaluated in the same way as that of SUs.
The authors suggested 6 groups comprising a total of 39 indicators for assessing the performance of SUs during the reporting period as well as the quality of fundamental scientific research and engineering project results. A single project is formed by a research contract, grant, etc. The significance of the indicators differs depending on the nature of the SU and the status of the project (contract)—fundamental or engineering (Table 2).
The brainstorming method was used to identify the indicators. The Ishikawa diagram construction method was used to obtain the indicators’ structure. Five associate professors with research and/or quality management experience of 12–20 years took part in developing the list of indicators.
The authors consider it optimal to use the Table 2 indicators as a checklist for evaluating the performance quality of SUs over a period of 1 to 5 years (the reporting period) for retrospective comparison. However, the list is also recommended for assessing the results of individual research projects.
Interviews with 12 SU leaders indicated that the optimal reporting period for specialized SUs/STUs is 3 to 5 years, while a 1- or 2-year evaluation period may suffer from incomplete information. Retrospective comparison of the same SU’s performance over time is the most usable approach: because of the high degree of specialization, even within one university, it is often inadequate to compare the performances of different SUs—at least across the fundamental and engineering specializations.
This study was conducted from December 2023 to April 2024. The research was based on the results of a survey of the research staff of 5 universities in St. Petersburg (Table 3). To participate, researchers were asked to complete an electronic (spreadsheet-based) or paper questionnaire (Appendix A). The total number of respondents was 246. In addition, 12 interviews were held with SU leaders.
The universities’ research statistical figures were also analyzed.
This study analyzed the opinions of researchers at a number of leading universities in St. Petersburg (the Empress Catherine II St. Petersburg Mining University (or Mining University), St. Petersburg State University (or SPbGU), St. Petersburg Polytechnical University, ITMO University, and LETI University) to determine the significance (weights summing to 100%) of the indicators presented in Table 2. The 5 universities were selected as research universities of technical (engineering) and general education (comprehensive) profiles. The total number of students at the selected universities exceeds 100 thousand, approximately 37.5% of all St. Petersburg students. The 5 universities carry out more than 60% of the research in the higher-education sector in St. Petersburg. All of them attract students as well as postgraduate students to carry out some types of research.
The indicators in Table 2 are presented in the following 6 groups (a schematic sketch of this structure follows the list):
1. Main scientific performance indicators;
2. Student cooperation indicators;
3. Quantitative economic indicators;
4. Quantitative scientometric indicators;
5. International cooperation indicators;
6. Qualitative assessment (comprehensive multifactorial assessment) indicators.
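Before the hypotheses, it may help to make this structure concrete. The following minimal Python sketch shows one possible machine-readable representation of such an indicator list, where each indicator carries separate weights for fundamental research (FR) and engineering projects (EP), plus a weighted score for an SU over a reporting period. This is an illustration under our own assumptions: the indicator names, weights, and values are placeholders rather than the actual content of Table 2, and the article does not prescribe any particular implementation.
```python
# Illustrative sketch only: a possible data structure for the 6 groups of
# 39 indicators, with separate weights for fundamental research (FR) and
# engineering projects (EP). Names and numbers are placeholders, not the
# actual Table 2 values.
from dataclasses import dataclass

@dataclass
class Indicator:
    name: str
    group: int        # 1..6, per the group list above
    weight_fr: float  # weight (%) when evaluating fundamental research
    weight_ep: float  # weight (%) when evaluating engineering projects

def su_score(catalog, values, column="FR"):
    """Weighted score of an SU; `values` maps indicator name -> value in [0, 1]."""
    attr = "weight_fr" if column == "FR" else "weight_ep"
    return sum(getattr(ind, attr) * values.get(ind.name, 0.0) for ind in catalog) / 100.0

catalog = [
    Indicator("registered patents", 1, 2.0, 7.0),
    Indicator("defended PhD dissertations", 1, 5.0, 1.0),
    Indicator("net profit of contract work", 3, 2.0, 7.0),
    # ...the remaining 36 indicators of Table 2 would follow here
]
values = {"registered patents": 0.4,
          "defended PhD dissertations": 0.6,
          "net profit of contract work": 0.8}
print(su_score(catalog, values, column="EP"))  # partial score over 3 of 39 indicators
```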
The research hypotheses are as follows:
Hypothesis 1 (H1). 
Assessing the quality of engineering projects (EPs) is less complex than assessing the results of fundamental research (FR). EPs rely on well-defined and reliable bases and, thus, can be evaluated using a brief list of significant indicators—economic, main performance, and possibly scientometric indicators (depending on the contract type);
Hypothesis 2 (H2). 
Economic indicators will be of greater importance for assessing the quality of EPs than the results of FR;
Hypothesis 3 (H3). 
For fundamental research, the full list of indicator groups is significant;
Hypothesis 4 (H4). 
The qualitative assessment indicators and student cooperation groups of indicators play a greater role in fundamental research than in engineering projects;
Hypothesis 5 (H5). 
Research universities of technical profiles, over the same reporting period, conduct more engineering-profile works (EPs) under contracts with business entities than comprehensive-profile universities. The latter are likely to conduct more FR under contracts with state entities over the same reporting period than universities of technical profiles;
Hypothesis 6 (H6). 
Research universities of technical profiles have a greater part of research conducted by specialized scientific units (SUs) rather than by education units (EUs). Comprehensive profile universities use their educational staff (EU) for research more often than research universities of technical profiles.

3. Literature Review

Scientific activities of universities are associated with the concepts of “research university” [25,26,27], “academic excellence” [5,28,29], and “world-class university” [30,31,32]. The processes of research implementation and innovation transfer, which are considered as the basis of sustainable development in the knowledge economy, also constitute the key content of the “entrepreneurial university” concept [33,34,35].
The financial support of national research universities in Russia and the realization of the “academic excellence” concept in the “5–100” and “Priority 2030” projects stimulated university research in the country in the 2010s [36,37,38]. Some relevant features of research quality assessment were studied within mineral resource complex production processes [39,40,41]. The latter works state the need to consider environmental and socially significant factors and their indicators when establishing quality management systems (QMSs) and undertaking research at mineral resource enterprises.
Problems in university research and their possible solutions have been analyzed substantially by many authors. We have grouped them in a table (Table 4).
In Table 4, the specified problems are presented in descending order of importance, in the opinion of the authors of this article.
The qualitative evaluation of university research could use some available government statistical indicators. To evaluate the efficiency of university science, the Russian Ministry of Science and Higher Education and the government statistical bureau, Rosstat, currently use the following indicators (in particular):
- the internal costs for research and development in priority areas of technology development, science, and technology;
- the number of small innovative enterprises created with the participation of an organization (university);
- the shares of the development and production (in the region) using critical technologies (including universities);
- the share of patents whose scientific results have been put into practice.
These indicators were introduced into statistical practice in the last 10–15 years and reflect the widespread global approach of stimulating practice-oriented research [90,91,92].
However, the official statistical approach is not sufficient for internal evaluations, which should help university administrations support and improve the quality management systems (QMSs) of university research. The quality of engineering projects usually takes less effort to assess: the primary assessment is made by the head of the specialized department (SU) and then by qualified staff at the customer’s company. The assessment of the quality of fundamental research is characterized by greater uncertainty. Its many risks and its complexity are reflected in the following issues and problems [93,94,95]:
- possibilities for integration with the results of previous and related studies;
- maintaining existing achievements and the general culture, and expanding the activities of the scientific school;
- the possibility of testing/partially implementing results in practice in different industries—“knowledge transfer”—on a test or production basis;
- the possibility of publishing results, with inclusion in regional/sectoral research and national information systems;
- other direct and indirect long-term impacts of the results obtained on various areas of scientific and practical activity.
Thus, in order to assess the quality of fundamental research, universities and scientific organizations use temporary or permanent advisory bodies of “encyclopedic profile” specialists capable of complex expert judgement/evaluation [96,97,98]. Preferred experts for qualitative assessments should have actual experience in interdisciplinary research and a high degree of awareness of various scientific fields. We should note that such interdisciplinary assessments have been practiced, for example, in the Research Excellence Framework (REF) program implemented in the UK since 2014 [99,100,101]. The REF results determine the degree of financial support for universities’ science in that country.
Though bibliometric or scientometric indicators have substantial limitations (as we pointed out earlier in Table 4), they are the most popular and most widely used criteria for evaluating fundamental research. Many researchers have repeatedly established that scientometric indicators increase noticeably when a university receives government funding for scientific work [102,103,104]. In this case, researchers (SUs) and lecturers (EUs) have the opportunity to pay for expensive publications in leading scientific journals; they may also receive salary bonuses for publishing.
In many countries, the state system for financing university science is based on recording high-quartile publications and their citations. Thus, the following dependence operates: the state funds fundamental science at universities, using the number of high-quartile publications and their subsequent citations as the most important reporting indicator. In turn, scientometric indicators decline sharply with a decrease in or cessation of government funding [102,103,104].
The importance of scientometric indicators is greater for fundamental research quality assessment than for engineering projects because the latter may not presume open publication of research results.

4. Results

4.1. Description of the Research Object and University Research Data Analysis

We analyzed available data and research statistics from five selected leading universities in St. Petersburg. Table 5 presents some indicators for evaluating their scientific work. The composition of the indicators was determined by the authors using the brainstorming method.
Based on the data presented in Table 5, the following conclusions can be drawn:
- comparatively numerous scientific units belong to ITMO University and to the Mining University (line 4);
- the same universities were mostly focused on non-state contracts (line 6);
- the share of lecturers who published at least one scientific article in journals of Scopus/WoS quartiles 1–2 in the period 2021–2023 ranged from 14 to 39%, which is less than the corresponding share of researchers (39–64%) (lines 7 and 8);
- from 62 to 85% of the patent authorships belong to university researchers (line 10);
- a higher volume of scientific work per researcher was performed at the Mining University (line 11).
Comparing the universities, one should take into account their specialization. This sample deals with two major types: St. Petersburg State University is a comprehensive (multidisciplinary) university, while the Mining and ITMO (specializing in precision mechanics and optics) universities are examples of engineering (technical or technological) ones. The other two universities, SPb Polytechnical University and LETI University, are much closer to the engineering type while retaining some characteristics of comprehensive universities because of a substantial share of non-technical specialties, students, and postgraduates (Table 6).
We can see that engineering universities have a greater share of researchers and perform more scientific work with the business sector. For example, in the research-funding structure of Mining University over the last 6 years, Russian legal entities (the business sector) have prevailed, while the average annual share of government funding was only about 12% (Figure 4).
This is atypical of the other studied universities. For comparison, the share of government funding of research at St. Petersburg State University in 2021–2023 was significantly higher, at 69.7%; for LETI, it was 78.9% (Table 5, line 6).
Other differences and dependences were identified by analyzing the universities’ scientific (SU/STU) and education unit (EU) performance data (Table 7).
Based on the data presented in Table 7, the following conclusions can be drawn:
- the share of fundamental research is greater than the share of engineering projects at the comprehensive-type St. Petersburg State University and at the mixed-type LETI; also, according to Table 5 (line 6), these two universities had the largest shares of state-ordered research—69.7% and 78.9%. (This proves research hypothesis H5, formulated in the Materials and Methods section: a larger share of government funding characterizes comprehensive universities);
- the share of engineering projects exceeded the average level of 68.2% at engineering-type universities, including the “mixed-type” SPb Polytechnical University;
- specialized scientific units (SUs), on average, performed the larger part of the research work, nearly 84%; at the engineering-type universities—Mining and ITMO—this indicator exceeded 90%. (Thus, we proved research hypothesis H6, which stated that comprehensive-profile universities use their educational staff for research more often).

4.2. Survey Result Analysis

As a result of the conducted survey, the respondents assigned weights to the “indicators for assessing the quality of the fundamental scientific research and engineering project results…” in two columns (Table 2, Appendix A). Some indicators were not marked by respondents, but the average respondent weighted 33.8 indicators across the two columns. This indicates the respondents’ substantial interest in the questionnaire and supports the quality of the obtained results. The quality of the survey is further confirmed by the absence of questionnaires in which no indicators were marked or a single indicator was marked as 100%.
As a first step, it was crucial to determine how the respondents’ weight estimates for the indicator groups differed between fundamental research and engineering projects. We calculated the averages in percentages (Table 8, Figure 5).
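As an illustration of this averaging step, the sketch below computes per-group mean weights separately for the fundamental research (FR) and engineering project (EP) columns, assuming each questionnaire has already been normalized to 100% (see the note in Appendix A). The data layout and the two respondents are invented for the example; only their rough magnitudes echo Table 8.
```python
# A minimal sketch (hypothetical data layout, not the authors' actual
# processing script) of the averaging behind Table 8.
import statistics

GROUPS = ["main performance", "student cooperation", "economic",
          "scientometric", "international", "qualitative"]

def group_averages(questionnaires, column):
    """Mean weight (%) of each indicator group over all respondents for one column."""
    return {g: round(statistics.mean(q[column][g] for q in questionnaires), 1)
            for g in GROUPS}

# Two invented respondents, each normalized so every column sums to 100%.
survey = [
    {"FR": {"main performance": 11, "student cooperation": 8, "economic": 30,
            "scientometric": 32, "international": 3, "qualitative": 16},
     "EP": {"main performance": 11, "student cooperation": 13, "economic": 65,
            "scientometric": 5, "international": 2, "qualitative": 4}},
    {"FR": {"main performance": 11, "student cooperation": 7, "economic": 29,
            "scientometric": 32, "international": 4, "qualitative": 17},
     "EP": {"main performance": 11, "student cooperation": 14, "economic": 66,
            "scientometric": 4, "international": 1, "qualitative": 4}},
]
print(group_averages(survey, "FR"))
print(group_averages(survey, "EP"))
```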
We should note the following:
- basic scientific performance indicators have almost equal weights of about 11% in both cases. We will show below, however, that the values of the sub-indicators differ between fundamental research and engineering projects;
- student cooperation appears to be more valuable for engineering projects than for fundamental research—13% and 8%, respectively. We suppose the reason is that the SU teams, which perform most of the engineering projects, more actively seek new members among students and young specialists;
- quantitative economic indicators determine nearly two-thirds (65%) of the quality value for engineering projects, while they determine only about 30% of the quality value in fundamental research evaluation. The economic group is the most numerous of all the groups. For engineering work, carried out under contracts with business, economic indicators best reflect the effectiveness of an individual project and of an SU’s performance as a whole for the reporting period;
- in contrast, scientometrics are much more valuable for fundamental research than for engineering projects (32% against 4%);
- international cooperation indicators were rated at a low level in both cases, though for fundamental research they exceed 3%, which is still appreciable;
- the qualitative assessment group for fundamental research has three times more weight than that for engineering projects (17% against 5%).
In the authors’ opinion, the qualitative assessment group of indicators could gain a larger weight in the future. For example, initiative-driven fundamental research (dissertations and scientific articles) is usually assessed by experts in the field, who sit on national academic councils awarding scientific degrees, editorial boards, and so on, which are permanent bodies. If the practice of using temporary expert councils and commissions with the involvement of external experts expands, the weight of the qualitative assessment for fundamental research should rise.
To address the qualitative assessment group of indicators, the authors previously developed a questionnaire containing assessments of the problems, risks, prospects, and opportunities of university scientific (scientific and technical) units [105].
Within group 1 (“basic scientific performance indicators”), the significant difference between fundamental research and engineering project evaluations is reflected in the number of registered patents (1.1) and the numbers of defended dissertations (1.3; 1.4) (Figure 6).
The quality of fundamental research is better reflected by the number of defended dissertations, while the number of registered patents is more important for engineering projects. The single indicator “number of patents”, with 7.8% of the total significance according to the survey results, appears to be the most important for engineering projects (Table 9). However, one should take into account that the other five indicators of the top six belong to group 3, with a total significance of 30.8%. Thus, economic factors, on the whole, have greater importance for engineering projects.
In analyzing Table 9, we also note that for fundamental research, four of the six most valuable single indicators belong to group 4, “scientometrics”. Their weight totals 24.9%, which points to this group having the highest significance for the respondents.
Further, we examine the three indicator groups with the most significant differences between the fundamental research and engineering project assessments.
Group 3, “quantitative economic indicators”, has 14 indicators (Table 2), with a total importance of 29.8% for fundamental research and 65.4% for engineering projects, according to the survey results. The major differences (more than threefold) are as follows (the first figure is for fundamental research; the second, for engineering projects):
3.1. total number of researchers involved in the project—0.9% and 3.0%;
3.6. costs for maintaining laboratory and office spaces—0.9% and 2.7%;
3.11. other costs—1.5% and 5.0%;
3.12. net profit or pure income (proceeds minus all the costs and taxes)—1.9% and 6.9%.
This group of economic indicators provides a reliable tool for assessing effectiveness, but its importance for engineering projects is overwhelming. The quality of fundamental research does not depend on cost optimization to the same degree: higher costs may lead to better results, which is reflected in the lower weighting of the economic indicators for fundamental research. Nevertheless, the majority of the indicators in this group retain substantial significance.
Group 4, “quantitative scientometric indicators”, has eight indicators (Figure 7). As we can see, this indicator group was not recognized as valuable by the respondents for engineering projects. Scientometric accounting may not be carried out at all in the case of typical and repetitive engineering and consulting contracts. In the absence of customer interest in scientometric indicators, engineering work may be assessed only in terms of its technical significance and economic efficiency.
If no scientometric indicators are provided for by the contractual relationship and publications are not encouraged (for example, when a trade secret regime is established), their weight for engineering projects could be zero.
However, in the case of fundamental research, the total weight of the quantitative scientometric indicators is about one-third (31.7%), which makes this set the most valuable among the indicator groups. This reflects the modern practice of evaluating scientific work described in the Literature Review section.
Group 6, “qualitative assessment (comprehensive multifactorial assessment)”, indicators play a much greater role in fundamental research (Figure 8). The chosen indicators were compiled by the authors based on the literature analysis. Together, the possibilities for integration with the results of previous and related studies and for implementing results in industrial practice account for 9.1% of the total significance, which points to the importance of long-term studies and scientific collaboration (Table 4, item 5) for fundamental research.
Student cooperation indicators and international cooperation indicators (groups #2 and #5) do not differ radically between fundamental scientific research and engineering projects. However, student cooperation was preferred by the survey respondents as an everyday practice. International cooperation is seen as a valuable tool for selected specialists and managers who possess experience of and contacts with specialists from abroad. This was also confirmed by the interviews with the SU leaders.
We should note that the student cooperation indicator group, whose values for fundamental research and engineering projects were estimated at 7.6% and 13.2%, respectively (Table 8), is quite important for transformative learning. Involving students in research teams contributes greatly to the formation of sustainable education models and furthers innovation transfer, which supports SDG achievement.
The international cooperation indicators have their own peculiarity, concerning the disruption of specialist and knowledge exchange flows caused by the war in Ukraine and Western sanctions. In the 2000s and 2010s, international cooperation implied partnerships with universities, companies, editorial offices, and authors from a large number of Western countries. At present, however, the priority of such cooperation has shifted to partnerships with the BRICS and other Russia-friendly countries. For example, until 2022, certain SUs of Mining University carried out work for enterprises from Europe, and students and teachers underwent regular internships and defended their scientific degrees at EU universities. Today, substantial obstacles have made such academic exchanges almost impossible. Restoring the developed ties and scientific cooperation is a mutually beneficial prospect for university science and business; in the event of such a restoration, the importance of international cooperation factors would be likely to increase.
University research data analysis (Section 4.1) helped to prove hypotheses H5 and H6 of this study. Below, we analyze the justifications of H1–H4. The threshold of statistical significance for an indicator group was set at α = 5.0%.
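A small sketch of how this threshold enters the hypothesis checks below. The group weights are those quoted from Table 8 in the text; the international cooperation weight for engineering projects is not stated explicitly, so it is inferred here as the remainder to 100% and should be treated as our assumption.
```python
# Illustrative threshold check: a group counts as significant for a given
# project type when its average weight exceeds 5.0%.
THRESHOLD = 5.0  # percent

weights_fr = {"main performance": 11.0, "student cooperation": 7.6,
              "economic": 29.8, "scientometric": 31.7,
              "international": 3.2, "qualitative": 16.8}
weights_ep = {"main performance": 11.0, "student cooperation": 13.2,
              "economic": 65.4, "scientometric": 4.4,
              "international": 1.3,  # inferred as 100% minus the stated weights
              "qualitative": 4.7}

def significant(weights):
    return sorted(g for g, w in weights.items() if w > THRESHOLD)

print(significant(weights_fr))  # five of the six groups pass (cf. H3)
print(significant(weights_ep))  # only three groups pass (cf. H1)
```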
H1: Assessing the quality of engineering project results is less complex, as such projects rely on a reliable basis and can thus be evaluated using a brief list of significant indicators—economic, main performance, and possibly scientometric.
H1: We consider it proved partially (70%), the exception being scientometric indicators, whose value for engineering projects (4.4%) is lower than those of other indicator groups: student and young specialists’ cooperation—13.2%; basic scientific performance—11.0%; qualitative assessment—4.7%.
H2: Economic indicators will be of greater importance for engineering projects.
H2: We consider it to be proved. The value of economic indicators for engineering projects is 65.4%. For typical engineering projects, the effectiveness of the result is measured by obtaining it quickly and cheaply using cost optimization on the basis of a reproducible technology (algorithm).
H3: For fundamental research, the full list of indicator groups is significant.
As can be seen from Table 8 and Figure 5, the spread of indicator group significance for fundamental research is more gradual. Five of the six groups are statistically significant (>5%), the exception being international cooperation (3.2%). In contrast, for engineering projects, only three of the six groups exceed the significance threshold. We consider H3 to be proved partially (90%), as one of the six indicator groups proved not to be significant.
H4: The qualitative assessment indicators and student cooperation groups of indicators play a greater role for fundamental research than for engineering projects.
The first part of this hypothesis is proved (16.8% > 4.7%), while the second part is not proved (7.6% < 13.2%).
Table 10 contains the results of the hypotheses’ justifications.
Thus, the six hypotheses of this study have been proved or partially proved.
It is worth mentioning the conclusions and results obtained by the authors of works on similar research topics. M. A. Kiseleva, who studied the scientific activities of technical universities in Moscow and Novgorod (Russia), focused on describing the mechanism for attracting promising specialists to research teams (groups) within universities and on improving the organizational and economic aspects of the management mechanism [55]. The author identified six groups of factors influencing efficiency in accordance with the well-known PESTEL model but did not note differences in the types of research or types of universities.
The study by Yukun Shi and co-authors, which concerned the 32 “double first-class” universities directly under the Ministry of Education in China, showed a significant difference in the efficiency of the scientific research conducted at leading (7) and other universities (25) [106]. The authors examined five groups of first-level indicators for efficiency assessment: personnel, funding, projects, scientific and technological achievements, and achievement awards. A general observation is that basic research is less efficient than applied research if they are assessed using criteria with equal weights. Following that, the authors suggested that universities should change the traditional way of scientific research evaluation, distinguishing basic research from applied research. This suggestion is consistent with the approach of our article, which proposes a similar set of factors for assessment but different weights.
The study by Zhe Cheng and co-authors developed an analytical framework based on the idea of Education for Sustainable Development (ESD) and scientific research activity [107]. The 42 world-class universities in China were selected as research objects. The authors of [107] state in Section 4.3.2 (of [107]) that they evaluated research performance by the quantity and quality of research papers related to the topic of sustainable development. The quality was measured using a field-weighted citation index (FWCI), which exceeded 1 for most of the examined papers. We note that, in that section, the authors articulate only one indicator group (scientometric indicators) of the six suggested in this paper. However, further in Section 4.3.3 (of [107]), the authors state that the sustainability of scientific research includes the contributions of Ph.D. graduates, research papers, and the amount of science and technology transformation—a set that is much closer to the one suggested herein. Notably, Z. Cheng found that the average number of papers on sustainable development published by universities in China is greater than that published by the best-performing universities in the USA (Purdue University) and the UK (The University of Nottingham).
A study by Lei Hou and co-authors offers novel implications not only for the theoretical understanding of research collaboration but also for the strategic management of research universities [108]. Its conclusions concern whether the research topic specializations of two universities make their collaboration closer, and the impact of such collaborations. Ten research topics were identified, but the authors did not note differences in the types of research (fundamental and engineering) or the types of universities they studied. In contrast, W. Chen and Y. Yan, who studied the internal collaboration networks of scientific teams, stated that team specialization, for example on engineering projects, matters a lot [93], while the university types may differ.
Roba Elbawab, who studied university rankings, used QS-methodology-ranking indicators (international student ratio, international faculty ratio, faculty–student ratio, citations per faculty, academic reputation, and employer reputation) [109]. The academic reputation and employer reputation metrics were assessed using surveys; hence, these metrics were based on the perceptions of academics and employers. The other four metrics were based on numerical data. The author found that one of the four university clusters he derived excels in research through citations, while the other clusters excel in other, student-related scores. Thus, only the scientometric-type citation indicator is responsible for research evaluation in this widespread methodology.
The work by Claudiu Kifor and co-authors proposes a KPI framework for sustainable universities in which assessment and research are placed at the second and third positions, while education takes fourth place of the ten [110]. Thus, the authors underline the importance of research for sustainability-oriented universities, which are described and analyzed in this paper.
Guironnet and Peypoch base indicators of research productivity on the numbers of academic publications and successful Ph.D. students [111]. Some authors, preferring the DEA approach, included the faculty’s role, Ph.D. degrees, and some other decisive indicators in research effectiveness [112,113,114]. A total of 525 indicators of quality performance for higher-education institutions were classified and distributed into 94 intermediate groups by the authors of [115]. We consider that, as regards academic research, their conclusions coincide with the results of our research.

5. Discussion

We assume that the suggested composition of indicators for assessing the quality of the fundamental scientific research and engineering project results and the performances of universities’ specialized scientific units would be useful for the improvement of quality management systems (QMSs) at research universities. Some implications of this study could be extended to improve the QMS and management processes of specialized scientific organizations (scientific institutes, centers, laboratories, experimental bases, etc.) that belong to the state, business, and non-profit sectors of science.
The choice of whether to evaluate the full set or just a part of the indicator groups and the evaluation itself for each indicator depend on the project agreement (contract, grant, etc.). The indicators of the performances of SUs in the long term should become a part of universities’ science policies and be publicly available as management documentation.
The retrospective analysis of the values of the indicator groups for previous reporting periods allows for the establishment of effective individual values for the reporting period and for some future periods (for example, years) for certain SUs. Establishing individual values makes it possible to take into account the features of projects, contracts, and grants implemented by certain SUs and select the most appropriate combination and weights for indicators.
The international cooperation indicator for university research in Russia reflects the current disruption of exchange flows caused by the war in Ukraine and Western sanctions. The priority of cooperation has shifted from Western academic institutions and businesses to partners from the BRICS and other Russia-friendly countries. The possibility of restoring ties and scientific cooperation is a mutually beneficial prospect for university science development. The importance of international cooperation indicators for the qualitative evaluation of university research will rise with such a restoration.

6. Conclusions

The main goal of this study is to fill the gap in knowledge on the assessment of university research quality, with a focus on university type and research type. This study assumed that the scientific effectiveness of higher-education institutions is determined by the performance of their specialized scientific units. To support this, the internal research processes of several universities were evaluated from the perspective of professional researchers—the employees of specialized scientific units. A schematic diagram of university research was also developed (Figure 3, Appendix B).
As a result of this research, the differences in the respondents’ estimates for suggested quality indicator groups for fundamental research and engineering projects were identified. The most significant indicators were determined, as well as the indicators that researchers do not consider as particularly important.
The university research data analysis helped to identify the differences in and features of scientific work conducted at comprehensive-profile universities and engineering-type universities.
The six hypotheses of this study were proved or partially proved.
The results obtained would be useful to provide ongoing quality control of university research and to form and improve quality management systems for scientific work at research universities.
However, one should realize that these research results reflect the opinions of specialists and may not represent a complete judgement of university research. To obtain more precise results, further studies should analyze the research performances of universities’ specialized scientific units and education units, focusing on the key indicators revealed in the current research.
The analysis of problems in university research and the results of the survey help to identify ways in which the university science system could improve its effectiveness. In particular, the survey confirmed the high importance of involving students in the research process, which is a valuable instrument for transformative learning. On the whole, the sustainability of tertiary education contributes to local, national, and international SDG achievements via the “triple helix of innovation” and transfer processes in the knowledge economy.

Author Contributions

Conceptualization, writing—original draft preparation, writing—review and editing, and project administration, D.A.R.; conceptualization, validation, methodology, and supervision, E.O.Z.; investigation, resources, formal analysis, visualization, and data curation, A.I.R., I.I.S. and E.E.S. All authors have read and agreed to the published version of the manuscript.

Funding

This research received no external funding.

Institutional Review Board Statement

Not applicable.

Informed Consent Statement

Not applicable.

Data Availability Statement

The data presented in this study are available on request from the corresponding author. The data are not publicly available because of ethical restrictions.

Acknowledgments

The authors thank Vice Rector for Scientific and Innovation Activities Ivanov M.V. for useful discussions and valuable comments that served to improve this article.

Conflicts of Interest

The authors declare no conflicts of interest.

Appendix A. Survey Form for Indicators for Assessing the Quality of the Scientific Research

1. The names of the university and laboratory (research center)
2. Your position (line employee, middle manager, or leading specialist) and the head of the laboratory (research center)
3. Age and gender
4. Academic degree
5. Please indicate the value (weight) in the two columns of the table below for each of the highlighted indicators (it is assumed that the weights of the given indicators differ for fundamental research, on the one hand, and engineering projects, on the other*) according to one of the following options:
(A) in percentages so that the sum of all the factors is 100% (use an Excel table), or
(B) a value for each indicator on a scale from 0 to 10 (paper form), or
(C) a value for each indicator on a scale from 0 to 100 (paper form).
Note: If you choose option (B) or (C), all the values will be summed when processing the questionnaire and converted to percentages (totaling 100%).
Please refer to Table 2 presented in the Methodology section.
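For illustration, here is a minimal sketch of the conversion described in the note above: raw scores from option (B) or (C) are summed per column and rescaled so that the resulting weights total 100%. The function name and example data are ours, not part of the original questionnaire processing.
```python
# Minimal sketch (our assumption of the processing step, not the authors'
# actual script): convert one respondent's raw 0-10 or 0-100 scores to
# percentage weights summing to 100%.
def to_percent_weights(raw_values):
    total = sum(raw_values)
    if total == 0:
        raise ValueError("empty questionnaire: no indicator was scored")
    return [100.0 * v / total for v in raw_values]

# Example: three indicators scored 2, 3, and 5 on the 0-10 scale
# become weights of 20%, 30%, and 50%.
print(to_percent_weights([2, 3, 5]))  # [20.0, 30.0, 50.0]
```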

Appendix B. University Research Processes

Figure A1. Schematic of university research processes performed by scientific units (SUs) and education units (EUs). Note: Figure was created by the authors.

References

  1. Etzkowitz, H. The Triple Helix: University–Industry–Government Innovation in Action; Routledge: New York, NY, USA, 2008; p. 225. ISBN 978-0415964500. [Google Scholar]
  2. Tang, H.H. The strategic role of world-class universities in regional innovation system: China’s Greater Bay Area and Hong Kong’s academic profession. Asian Educ. Dev. Stud. 2020, 11, 7–22. [Google Scholar] [CrossRef]
  3. National Research Council, USA. Research Universities and the Future of America: Ten Breakthrough Actions Vital to Our Nation’s Prosperity and Security (Report); National Research Council: Washington, DC, USA, 2012; p. 225. ISBN 978-0-309-25639-1.
  4. Powell, J.J.W.; Dusdal, J. The European Center of Science Productivity: Research Universities and Institutes in France, Germany, and the United Kingdom. Century Sci. Int. Perspect. Educ. Soc. 2017, 33, 55–83. [Google Scholar] [CrossRef]
  5. Intarakumnerd, P.; Goto, A. Role of public research institutes in national innovation systems in industrialized countries: The cases of Fraunhofer, NIST, CSIRO, AIST, and ITRI. Res. Policy 2018, 47, 1309–1320. [Google Scholar] [CrossRef]
  6. Vlasova, V.V.; Gokhberg, L.M.; Ditkovsky, K.A.; Kotsemir, M.N.; Kuznetsova, I.A.; Martynova, S.V.; Nesterenko, A.V.; Pakhomov, S.I.; Polyakova, V.V.; Ratay, T.V.; et al. Science Indicators: 2023: Statistical Collection; National Research University Higher School of Economics: Moscow, Russia, 2023; p. 416. ISBN 978-5-7598-2765-8. (In Russian) [Google Scholar]
  7. Textor, C. Breakdown of R&D spending in China 2017–2022, by Entity. Available online: https://www.statista.com/statistics/1465556/research-and-development-expenditure-in-china-distribution-by-entity/ (accessed on 12 August 2024).
  8. Chen, K.; Zhang, C.; Feng, Z.; Zhang, Y.; Ning, L. Technology transfer systems and modes of national research institutes: Evidence from the Chinese Academy of Sciences. Res. Policy 2022, 51, 104471. [Google Scholar] [CrossRef]
  9. Vasilev, Y.; Vasileva, P.; Batova, O.; Tsvetkova, A. Assessment of Factors Influencing Educational Effectiveness in Higher Educational Institutions. Sustainability 2024, 16, 4886. [Google Scholar] [CrossRef]
  10. Ilyushin, Y.V.; Pervukhin, D.A.; Afanaseva, O.V. Application of the theory of systems with distributed parameters for mineral complex facilities management. ARPN J. Eng. Appl. Sci. 2019, 14, 3852–3864. [Google Scholar]
  11. Raupov, I.; Burkhanov, R.; Lutfullin, A.; Maksyutin, A.; Lebedev, A.; Safiullina, E. Experience in the Application of Hydrocarbon Optical Studies in Oil Field Development. Energies 2022, 15, 3626. [Google Scholar] [CrossRef]
  12. Rudnik, S.N.; Afanasev, V.G.; Samylovskaya, E.A. 250 years in the service of the Fatherland: Empress Catherine II Saint Petersburg Mining university in facts and figures. J. Min. Inst. 2023, 263, 810–830. [Google Scholar]
  13. Olcay, G.F.; Bulu, M. Is measuring the knowledge creation of universities possible? A review of university rankings. Technol. Forecast. Soc. Chang. 2017, 123, 153–160. [Google Scholar] [CrossRef]
  14. Lapinskas, A.; Makhova, L.; Zhidikov, V. Responsible resource wealth management in ensuring inclusive growth [Odpowiedzialne zarządzanie zasobami w zapewnieniu wzrostu włączającego]. Pol. J. Manag. Stud. 2021, 23, 288–304. [Google Scholar] [CrossRef]
  15. Peris-Ortiz, M.; García-Hurtado, D.; Román, A.P. Measuring knowledge exploration and exploitation in universities and the relationship with global ranking indicators. Eur. Res. Manag. Bus. Econ. 2023, 29, 100212. [Google Scholar] [CrossRef]
  16. Ponomariov, B.L.; Boardman, P.C. Influencing scientists’ collaboration and productivity patterns through new institutions: University research centers and scientific and technical human capital. Res. Policy 2010, 39, 613–624. [Google Scholar] [CrossRef]
  17. Isaeva, N.V.; Borisova, L.V. Comparative analysis of national policies for developing research universities’ campuses. Univ. Manag. Pract. Anal. 2013, 6, 74–87. (In Russian) [Google Scholar]
  18. Ponomarenko, T.V.; Marinina, O.A.; Nevskaya, M.A. Innovative learning methods in technical universities: The possibility of forming interdisciplinary competencies. Espacios 2019, 40, 1–10. Available online: http://www.revistaespacios.com/a19v40n41/19404116.html (accessed on 10 September 2024).
  19. Zhang, H.; Patton, D.; Kenney, M. Building global-class universities: Assessing the impact of the 985 Project. Res. Policy 2013, 42, 765–775. [Google Scholar] [CrossRef]
  20. Shang, J.; Zeng, M.; Zhang, G. Investigating the mentorship effect on the academic success of young scientists: An empirical study of the 985 project universities of China. J. Informetr. 2022, 16, 101285. [Google Scholar] [CrossRef]
  21. Chen, Y.; Pan, Y.; Liu, H.; Wu, X.; Deng, G. Efficiency analysis of Chinese universities with shared inputs: An aggregated two-stage network DEA approach. Socio-Econ. Plan. Sci. 2023, 90, 101728. [Google Scholar] [CrossRef]
  22. Mindeli, L.E. Financial Support for the Development of the Scientific and Technological Sphere; Mindeli, L.E., Chernykh, S.I., Frolova, N.D., Todosiychuk, A.V., Fetisov, V.P., Eds.; Institute for Problems of Science Development of the Russian Academy of Sciences (IPRAN): Moscow, Russia, 2018; p. 215. ISBN 978-5-91294-125-2. (In Russian) [Google Scholar]
  23. Guba, K.S.; Slovogorodsky, N.A. “Publish or Perish” in Russian social sciences: Patterns of co-authorship in “predatory” and “pure” journals. Issues Educ. Educ. Stud. Mosc. 2022, 4, 80–106. (In Russian) [Google Scholar] [CrossRef]
  24. Slepykh, V.; Lovakov, A.; Yudkevich, M. Academic career after defending a PhD thesis on the example of four branches of Russian science. Issues Educ. Educ. Stud. Mosc. 2022, 4, 260–297. (In Russian) [Google Scholar] [CrossRef]
  25. Mohrman, K.; Ma, W.; Baker, D. The Research University in Transition: The Emerging Global Model. High. Educ. Policy 2008, 21, 5–27. [Google Scholar] [CrossRef]
  26. Altbach, P.G. The Road to Academic Excellence: The Making of World-Class Research Universities; Altbach, P.G., Salmi, J., Eds.; World Bank: Washington, DC, USA, 2011; p. 381. ISBN 978-0-8213-9485-4. [Google Scholar] [CrossRef]
  27. Powell, J.J.W.; Fernandez, F.; Crist, J.T.; Dusdal, J.; Zhang, L.; Baker, D.P. Introduction: The Worldwide Triumph of the Research University and Globalizing Science. Century Sci. Int. Perspect. Educ. Soc. 2017, 33, 1–36. [Google Scholar] [CrossRef]
  28. Colina-Ysea, F.; Pantigoso-Leython, N.; Abad-Lezama, I.; Calla-Vásquez, K.; Chávez-Campó, S.; Sanabria-Boudri, F.M.; Soto-Rivera, C. Implementation of Hybrid Education in Peruvian Public Universities: The Challenges. Educ. Sci. 2024, 14, 419. [Google Scholar] [CrossRef]
  29. Fernandez, F.; Baker, D.P. Science Production in the United States: An Unexpected Synergy between Mass Higher Education and the Super Research University. Century Sci. Int. Perspect. Educ. Soc. 2021, 33, 85–111. [Google Scholar] [CrossRef]
  30. Jamil, S. The Challenge of Establishing World-Class Universities: Directions in Development; World Bank: Washington, DC, USA, 2009; p. 115. ISBN 082-1378-767, 978-0821-3787-62. [Google Scholar] [CrossRef]
  31. Mudzakkir, M.; Sukoco, B.; Suwignjo, P. World-class Universities: Past and Future. Int. J. Educ. Manag. 2022, 36, 277–295. [Google Scholar] [CrossRef]
  32. Tian, L. Rethinking the global orientation of world-class universities from a comparative functional perspective. Int. J. Educ. Dev. 2023, 96, 102700. [Google Scholar] [CrossRef]
  33. Clark, B.R. Creating Entrepreneurial Universities: Organizational Pathways of Transformation; Emerald Group Publishing Limited: London, UK, 1998; p. 200. ISBN 978-0-08-043342-4. [Google Scholar]
  34. Etzkowitz, H. The Entrepreneurial University: Vision and Metrics. Ind. High. Educ. 2016, 30, 83–97. [Google Scholar] [CrossRef]
  35. Kazin, F.A.; Kondratiev, A.V. Development of the concept of an entrepreneurial university in Russian universities: New assessment tools. Univ. Upr. Prakt. Anal. Univ. Manag. Pract. Anal. 2022, 26, 18–41. (In Russian) [Google Scholar] [CrossRef]
  36. Berestov, A.V.; Guseva, A.I.; Kalashnik, V.M.; Kaminsky, V.I.; Kireev, S.V.; Sadchikov, S.M. The “national research university” project is a driver of Russian higher education. Vyss. Obraz. V Ross. High. Educ. Russ. 2020, 29, 22–34. (In Russian) [Google Scholar] [CrossRef]
  37. Matveeva, N.; Ferligoj, A. Scientific Collaboration in Russian Universities before and after the Excellence Initiative Project “5–100”. Scientometrics 2020, 124, 2383–2407. [Google Scholar] [CrossRef]
  38. Matveeva, N.; Sterligov, I.; Yudkevich, M. The Effect of Russian University Excellence Initiative on Publications and Collaboration Patterns. J. Informetr. 2021, 15, 101110. [Google Scholar] [CrossRef]
  39. Semenov, V.P.; Mikhailov, Y.I. Challenges and trends of quality management in the context of industrial and mineral resources economy. J. Min. Inst. 2017, 226, 497. [Google Scholar] [CrossRef]
  40. Litvinenko, V.S.; Bowbrick, I.; Naumov, I.A.; Zaitseva, Z. Global guidelines and requirements for professional competencies of natural resource extraction engineers: Implications for ESG principles and sustainable development goals. J. Clean. Prod. 2022, 338, 130530. [Google Scholar] [CrossRef]
  41. Semenova, T.; Martínez Santoyo, J.Y. Economic Strategy for Developing the Oil Industry in Mexico by Incorporating Environmental Factors. Sustainability 2024, 16, 36. [Google Scholar] [CrossRef]
  42. Rozhdestvensky, I.V.; Filimonov, A.V.; Khvorostyanaya, A.S. Methodology for assessing the readiness of higher educational institutions and scientific organizations for technology transfer. Innov. Innov. 2020, 9, 11–16. (In Russian) [Google Scholar] [CrossRef]
  43. Bakthavatchaalam, V.; Miles, M.; de Lourdes, M.-T.; Jose, S. Research productivity and academic dishonesty in a changing higher education landscape: The example of technical universities in India (translated from English). Issues Educ. Educ. Stud. Mosc. 2021, 2, 126–151. (In Russian) [Google Scholar] [CrossRef]
  44. Aven, T. Perspectives on the nexus between good risk communication and high scientific risk analysis quality. Reliab. Eng. Syst. Saf. 2018, 178, 290–296. [Google Scholar] [CrossRef]
  45. Marinina, O.A.; Kirsanova, N.Y.; Nevskaya, M.A. Circular economy models in industry: Developing a conceptual framework. Energies 2022, 15, 9376–9386. [Google Scholar] [CrossRef]
  46. Siegel, D.; Bogers, M.; Jennings, D.; Xue, L. Technology transfer from national/federal labs and public research institutes: Managerial and policy implications. Res. Policy 2022, 52, 104646. [Google Scholar] [CrossRef]
  47. Snegirev, S.D.; Savelyev, V.Y. Quality management for scientific activities. Stand. I Kachestvo Stand. Qual. 2014, 3, 54–57. (In Russian) [Google Scholar]
  48. Leontyuk, S.M.; Vinogradova, A.A.; Silivanov, M.O. Fundamentals of ISO 9001:2015. J. Phys. Conf. Ser. 2019, 1384, 012068. [Google Scholar] [CrossRef]
  49. Wieczorek, O.; Muench, R. Academic capitalism and market thinking in higher education. In International Encyclopedia of Education, 4th ed.; Elsevier: Amsterdam, The Netherlands, 2023; pp. 37–47. [Google Scholar] [CrossRef]
  50. Overchenko, M.N.; Marinin, M.A.; Mozer, S.P. Quality improvement of mining specialists training on the basis of cooperation between Saint-Petersburg mining university and Orica company. J. Min. Inst. 2017, 228, 681. [Google Scholar] [CrossRef]
  51. Hernandez-Diaz, P.M.; Polanco, J.-A.; Escobar-Sierra, M. Building a measurement system of higher education performance: Evidence from a Latin-American country. Int. J. Qual. Reliab. Manag. 2021, 38, 1278–1300. [Google Scholar] [CrossRef]
  52. Zharova, A.; Karl, W.; Lessmann, H. Data-driven support for policy and decision-making in university research management: A case study from Germany. Eur. J. Oper. Res. 2023, 308, 353–368. [Google Scholar] [CrossRef]
  53. Lubango, L.M.; Pouris, A. Is patenting activity impeding the academic performance of South African University researchers? Technol. Soc. 2009, 31, 315–324. [Google Scholar] [CrossRef]
  54. de Jesus, C.S.; Cardoso, D.d.O.; de Souza, C.G. Motivational factors for patenting: A study of the Brazilian researchers profile. World Pat. Inf. 2023, 75, 102241. [Google Scholar] [CrossRef]
  55. Kiseleva, M.A. Development of an organizational and economic mechanism for managing the research activities of national research universities. Vestn. YURGTU NPI Bull. SRSTU NPI 2021, 3, 182–190. (In Russian) [Google Scholar] [CrossRef]
  56. Alakaleek, W.; Harb, Y.; Harb, A.A.; Shishany, A. The impact of entrepreneurship education: A study of entrepreneurial outcomes. Int. J. Manag. Educ. 2023, 21, 100800. [Google Scholar] [CrossRef]
  57. Rudko, V.A.; Gabdulkhakov, R.R.; Pyagay, I.N. Scientific and technical substantiation of the possibility for the organization of needle coke production in Russia. J. Min. Inst. 2023, 263, 795–809. Available online: https://pmi.spmi.ru/index.php/pmi/article/view/16246?setLocale=en_US (accessed on 10 September 2024).
  58. Gromyka, D.S.; Gogolinskii, K.V. Introduction of evaluation procedure of excavator bucket teeth into maintenance and repair: Prompts. MIAB. Mining Inf. Anal. Bull. 2023, 8, 94–111. (In Russian) [Google Scholar] [CrossRef]
  59. Michaud, J.; Turri, J. Values and credibility in science communication. Logos Epistem. 2018, 9, 199–214. [Google Scholar] [CrossRef]
  60. Oblova, I.S.; Gerasimova, I.G.; Goman, I.V. The scientific career through a gender lens: A contrastive analysis of the EU and Russia. Glob. J. Eng. Educ. 2022, 24, 21–27. [Google Scholar]
  61. Chiware, E.R.T.; Becker, D.A. Research trends and collaborations by applied science researchers in South African universities of technology: 2007–2017. J. Acad. Librariansh. 2018, 44, 468–476. [Google Scholar] [CrossRef]
  62. Palavesm, K.; Joorel, S. IRINS: Implementing a Research Information Management System in Indian Higher Education Institutions. Procedia Comput. Sci. 2022, 211, 238–245. [Google Scholar] [CrossRef]
  63. Litvinenko, V.S.; Petrov, E.I.; Vasilevskaya, D.V.; Yakovenko, A.V.; Naumov, I.A.; Ratnikov, M.A. Assessment of the role of the state in the management of mineral resources. J. Min. Inst. 2023, 259, 95–111. [Google Scholar] [CrossRef]
  64. Carillo, M.R.; Papagni, E.; Sapio, A. Do collaborations enhance the high-quality output of scientific institutions? Evidence from the Italian Research Assessment Exercise. J. Socio-Econ. 2013, 47, 25–36. [Google Scholar] [CrossRef]
  65. Chen, S.; Ren, S.; Cao, X. A comparison study of educational scientific collaboration in China and the USA. Phys. A Stat. Mech. Its Appl. 2021, 585, 126330. [Google Scholar] [CrossRef]
  66. Arpin, I.; Likhacheva, K.; Bretagnolle, V. Organising inter- and transdisciplinary research in practice. The case of the meta-organisation French LTSER platforms. Environ. Sci. Policy 2023, 144, 43–52. [Google Scholar] [CrossRef]
  67. Liew, M.S.; Tengku Shahdan, T.N.; Lim, E.S. Enablers in Enhancing the Relevancy of University-industry Collaboration. Procedia—Soc. Behav. Sci. 2013, 93, 1889–1896. [Google Scholar] [CrossRef]
  68. Tunca, F.; Kanat, Ö.N. Harmonization and Simplification Roles of Technology Transfer Offices for Effective University—Industry Collaboration Models. Procedia Comput. Sci. 2019, 158, 361–365. [Google Scholar] [CrossRef]
  69. Sciabolazza, V.L.; Vacca, R.; McCarty, C. Connecting the dots: Implementing and evaluating a network intervention to foster scientific collaboration and productivity. Soc. Netw. 2020, 61, 181–195. [Google Scholar] [CrossRef]
  70. Ovchinnikova, E.N.; Krotova, S.Y. Training mining engineers in the context of sustainable development: A moral and ethical aspect. Eur. J. Contemp. Educ. 2022, 11, 1192–1200. [Google Scholar] [CrossRef]
  71. Duryagin, V.; Nguyen Van, T.; Onegov, N.; Shamsutdinova, G. Investigation of the Selectivity of the Water Shutoff Technology. Energies 2023, 16, 366. [Google Scholar] [CrossRef]
  72. Mohamed, M.; Altinay, F.; Altinay, Z.; Dagli, G.; Altinay, M.; Soykurt, M. Validation of Instruments for the Improvement of Interprofessional Education through Educational Management: An Internet of Things (IoT)-Based Machine Learning Approach. Sustainability 2023, 15, 16577. [Google Scholar] [CrossRef]
  73. Tuan, N.A.; Hue, T.T.; Lien, L.T.; Van, L.H.; Nhung, H.T.T.; Dat, L.Q. Management factors influencing lecturers’ research productivity in Vietnam National University, Hanoi, Vietnam: A structural equation modeling analysis. Heliyon 2022, 8, e10510. [Google Scholar] [CrossRef]
  74. Akcay, B.; Benek, İ. Problem-Based Learning in Türkiye: A Systematic Literature Review of Research in Science Education. Educ. Sci. 2024, 14, 330. [Google Scholar] [CrossRef]
  75. Cherepovitsyn, A.E.; Tretyakov, N.A. Development of New System for Assessing the Applicability of Digital Projects in the Oil and Gas Sector. J. Min. Inst. 2023, 262, 628–642. Available online: https://pmi.spmi.ru/pmi/article/view/15795?setLocale=en_US (accessed on 10 September 2024).
  76. Murzo, Y.; Sveshnikova, S.; Chuvileva, N. Method of text content development in creation of professionally oriented online courses for oil and gas specialists. Int. J. Emerg. Technol. Learn. 2019, 14, 143–152. [Google Scholar] [CrossRef]
  77. Sveshnikova, S.A.; Skornyakova, E.R.; Troitskaya, M.A.; Rogova, I.S. Development of engineering students’ motivation and independent learning skills. Eur. J. Contemp. Educ. 2022, 11, 555–569. [Google Scholar] [CrossRef]
  78. Rijcke, S.D.; Wouters, P.F.; Rushforth, A.D.; Franssen, T.P.; Hammarfelt, B. Evaluation Practices and Effects of Indicator Use—A Literature Review. Res. Eval. 2016, 25, rvv038. [Google Scholar] [CrossRef]
  79. Cappelletti-Montano, B.; Columbu, S.; Montaldo, S.; Musio, M. New perspectives in bibliometric indicators: Moving from citations to citing authors. J. Informetr. 2021, 15, 101164. [Google Scholar] [CrossRef]
  80. García-Villar, C.; García-Santos, J.M. Bibliometric indicators to evaluate scientific activity. Radiología Engl. Ed. 2021, 63, 228–235. [Google Scholar] [CrossRef]
  81. Guskov, A.E.; Kosyakov, D.V. National fractional counting and assessment of scientific performance of organizations. Nauchnyye Tekhnicheskiye Bibl. Sci. Tech. Libr. 2020, 1, 15–42. (In Russian) [Google Scholar] [CrossRef]
  82. Khuram, S.; Rehman, C.A.; Nasir, N.; Elahi, N.S. A bibliometric analysis of quality assurance in higher education institutions: Implications for assessing university’s societal impact. Eval. Program Plan. 2023, 99, 102319. [Google Scholar] [CrossRef] [PubMed]
  83. Buehling, K. Changing research topic trends as an effect of publication rankings—The case of German economists and the Handelsblatt Ranking. J. Informetr. 2021, 15, 101199. [Google Scholar] [CrossRef]
  84. Kremcheev, E.A.; Kremcheeva, D.A. The content of the operation quality concept of the scientific and technical organization. Opción 2019, 35, 3052–3066. [Google Scholar]
  85. Nyondo, D.W.; Langa, P.W. The development of research universities in Africa: Divergent views on relevance and experience. Issues Educ. Educ. Stud. Mosc. 2021, 1, 237–256. (In Russian) [Google Scholar] [CrossRef]
  86. Marozau, R.; Guerrero, M. Impacts of Universities in Different Stages of Economic Development. J. Knowl. Econ. 2016, 12, 1–21. [Google Scholar] [CrossRef]
  87. Fayomi, O.O.; Okokpujie, I.P.; Fayomi, O.S.I.; Udoye, N.E. An Overview of a Prolific University from Sustainable and Policy Perspective. Procedia Manuf. 2019, 35, 343–348. [Google Scholar] [CrossRef]
  88. Do, T.H.; Krott, M.; Böcher, M. Multiple traps of scientific knowledge transfer: Comparative case studies based on the RIU model from Vietnam, Germany, Indonesia, Japan, and Sweden. For. Policy Econ. 2020, 114, 102134. [Google Scholar] [CrossRef]
  89. See, K.F.; Ma, Z.; Tian, Y. Examining the efficiency of regional university technology transfer in China: A mixed-integer generalized data envelopment analysis framework. Technol. Forecast. Soc. Chang. 2023, 197, 122802. [Google Scholar] [CrossRef]
  90. Dusdal, J.; Zapp, M.; Marques, M.; Powell, J.J.W. Higher Education Organizations as Strategic Actors in Networks: Institutional and Relational Perspectives Meet Social Network Analysis. In Theory and Method in Higher Education Research; Huisman, J., Tight, M., Eds.; Emerald Publishing Limited: Bingley, UK, 2021; Volume 7, pp. 55–73. [Google Scholar] [CrossRef]
  91. Silva, M.D.C.; de Mello, J.C.C.B.S.; Gomes, C.F.S.; Carlos, I.C. Efficiency analysis of scientific laboratories. Meta Aval. 2020, 2, 625–645. [Google Scholar] [CrossRef]
  92. Vinogradova, A.; Gogolinskii, K.; Umanskii, A.; Alekhnovich, V.; Tarasova, A.; Melnikova, A. Method of the Mechanical Properties Evaluation of Polyethylene Gas Pipelines with Portable Hardness Testers. Inventions 2022, 7, 125. [Google Scholar] [CrossRef]
  93. Chen, W.; Yan, Y. New components and combinations: The perspective of the internal collaboration networks of scientific teams. J. Informetr. 2023, 17, 101407. [Google Scholar] [CrossRef]
  94. Corcoran, A.W.; Hohwy, J.; Friston, K.J. Accelerating scientific progress through Bayesian adversarial collaboration. Neuron 2023, 111, 3505–3516. [Google Scholar] [CrossRef] [PubMed]
  95. Wu, L.; Yi, F.; Huang, Y. Toward scientific collaboration: A cost-benefit perspective. Res. Policy 2024, 53, 104943. [Google Scholar] [CrossRef]
  96. Ilyushin, Y.; Afanaseva, O. Spatial Distributed Control System Of Temperature Field: Synthesis And Modeling. ARPN J. Eng. Appl. Sci. 2021, 16, 1491–1506. [Google Scholar]
  97. Cossani, G.; Codoceo, L.; Caceres, H.; Tabilo, J. Technical efficiency in Chile’s higher education system: A comparison of rankings and accreditation. Eval. Program Plan. 2022, 92, 102058. [Google Scholar] [CrossRef]
  98. Marinin, M.A.; Marinina, O.A.; Rakhmanov, R.A. Methodological approach to assessing influence of blasted rock fragmentation on mining costs. Gorn. Zhurnal 2023, 9, 28–34. [Google Scholar] [CrossRef]
  99. Sutton, E. The increasing significance of impact within the Research Excellence Framework (REF). Radiography 2020, 26 (Suppl. S2), S17–S19. [Google Scholar] [CrossRef]
  100. Basso, A.; di Tollo, G. Prediction of UK Research excellence framework assessment by the departmental h-index. Eur. J. Oper. Res. 2022, 296, 1036–1049. [Google Scholar] [CrossRef]
  101. Groen-Xu, M.; Bös, G.; Teixeira, P.A.; Voigt, T.; Knapp, B. Short-term incentives of research evaluations: Evidence from the UK Research Excellence Framework. Res. Policy 2023, 52, 104729. [Google Scholar] [CrossRef]
  102. Reddy, K.S.; Xie, E.; Tang, Q. Higher education, high-impact research, and world university rankings: A case of India and comparison with China. Pac. Sci. Rev. B Humanit. Soc. Sci. 2016, 2, 1–21. [Google Scholar] [CrossRef]
  103. Shima, K. Changing Science Production in Japan: The Expansion of Competitive Funds, Reduction of Block Grants, and Unsung Heroes. Century Sci. Int. Perspect. Educ. Soc. 2017, 33, 113–140. [Google Scholar] [CrossRef]
  104. Li, D. There is more than what meets the eye: University preparation for the socio-economic impact requirement in research assessment exercise 2020 in Hong Kong. Asian Educ. Dev. Stud. 2021, 11, 702–713. [Google Scholar] [CrossRef]
  105. Radushinsky, D.A.; Kremcheeva, D.A.; Smirnova, E.E. Problems of service quality management in the field of higher education of the economy of the Russian Federation and directions for their solution. Relacoes Int. No Mundo Atual 2023, 6, 33–54. [Google Scholar]
  106. Shi, Y.; Wang, D.; Zhang, Z. Categorical Evaluation of Scientific Research Efficiency in Chinese Universities: Basic and Applied Research. Sustainability 2022, 14, 4402. [Google Scholar] [CrossRef]
  107. Cheng, Z.; Xiao, T.; Chen, C.; Xiong, X. Evaluation of Scientific Research in Universities Based on the Idea of Education for Sustainable Development. Sustainability 2022, 14, 2474. [Google Scholar] [CrossRef]
  108. Hou, L.; Luo, J.; Pan, X. Research Topic Specialization of Universities in Information Science and Library Science and Its Impact on Inter-University Collaboration. Sustainability 2022, 14, 9000. [Google Scholar] [CrossRef]
  109. Elbawab, R. University Rankings and Goals: A Cluster Analysis. Economies 2022, 10, 209. [Google Scholar] [CrossRef]
  110. Kifor, C.V.; Olteanu, A.; Zerbes, M. Key Performance Indicators for Smart Energy Systems in Sustainable Universities. Energies 2023, 16, 1246. [Google Scholar] [CrossRef]
  111. Guironnet, J.P.; Peypoch, N. The geographical efficiency of education and research: The ranking of U.S. universities. Socio-Econ. Plan. Sci. 2018, 62, 44–55. [Google Scholar] [CrossRef]
  112. Ma, Z.; See, K.F.; Yu, M.M.; Zhao, C. Research efficiency analysis of China’s university faculty members: A modified meta-frontier DEA approach. Socio-Econ. Plan. Sci. 2021, 76, 100944. [Google Scholar] [CrossRef]
  113. Tavares, R.S.; Angulo-Meza, L.; Sant’Anna, A.P. A proposed multistage evaluation approach for Higher Education Institutions based on network Data envelopment analysis: A Brazilian experience. Eval. Program Plan. 2021, 89, 101984. [Google Scholar] [CrossRef] [PubMed]
  114. Le, M.H.; Afsharian, M.; Ahn, H. Inverse Frontier-based Benchmarking for Investigating the Efficiency and Achieving the Targets in the Vietnamese Education System. Omega 2021, 103, 102427. [Google Scholar] [CrossRef]
  115. Adot, E.; Akhmedova, A.; Alvelos, H.; Barbosa-Pereira, S.; Berbegal-Mirabent, J.; Cardoso, S.; Domingues, P.; Franceschini, F.; Gil-Doménech, D.; Machado, R.; et al. SMART-QUAL: A dashboard for quality measurement in higher education institutions. Int. J. Qual. Reliab. Manag. 2023, 40, 1518–1539. [Google Scholar] [CrossRef]
Figure 1. Gross domestic expenditure on R&D (GERD) by sources of funds, %, EU, 2000–2023. Note: Figure was created by the authors using data from https://ec.europa.eu/eurostat/databrowser/view/rd_e_fundgerd/default/table?lang=en (accessed on 10 September 2024) and https://ec.europa.eu/eurostat/statistics-explained/index.php?title=R%26D_expenditure#R.26D_expenditure_by_source_of_funds (accessed on 10 September 2024). The values for 2022–2023 are forecasts.
Figure 2. GERD by sources of funds, %, Russia, 2000–2023. Note: Figure was created by the authors using data for 2000–2023 ([6]; Figures 5.1.8 and 5.4.23) and an approximate trend for 2022–2023. The non-profit sector is not shown separately because its share did not exceed 0.2% for most of the observation period.
Figure 3. Schematic diagram of the basic model of university science (see also Appendix A). Note: Figure was created by the authors.
Figure 4. Diagram of the average annual share of research funding at Mining University in 2018–2023. Note: Figure was compiled by the authors from data in research activity reports.
Figure 5. Diagram of group indicator analysis. Source: Created by the authors.
Figure 6. Diagram of indicator group 1 analysis. Source: Created by the authors.
Figure 7. Diagram of indicator group 4 analysis. Source: Created by the authors.
Figure 8. Diagram of indicator group 6 analysis. Source: Created by the authors.
Table 1. Higher-education sector's share of GERD in selected countries, 2011–2023, %.

| Country | 2011 | 2015 | 2018 | 2019 | 2020 | 2021 | 2022 | 2023 | Average |
|---|---|---|---|---|---|---|---|---|---|
| USA | 3.4% | 3.4% | 3.4% | 3.2% | 3.2% | 3.0% | 3.0% | 3.0% | 3.2% |
| China | 7.1% | 8.0% | 8.0% | 8.2% | 8.5% | 8.5% | 8.5% | 8.5% | 8.2% |
| Japan | 5.9% | 5.4% | 5.1% | 5.2% | 5.2% | 5.2% | 5.2% | 5.2% | 5.3% |
| Russia | 8.0% | 9.0% | 9.5% | 10.7% | 10.1% | 10.2% | 11.0% | 11.0% | 9.9% |
| Turkey | 20.4% | 19.2% | 18.9% | 18.4% | 18.6% | 16.4% | 15.7% | 15.5% | 17.9% |
| Serbia | 25.1% | 24.0% | 25.3% | 25.4% | 44.7% | 45.9% | 41.9% | 43.0% | 34.4% |
| Spain | 4.1% | 4.3% | 4.4% | 4.2% | 3.9% | 4.0% | 4.0% | 4.0% | 4.1% |
| France | 1.0% | 2.8% | 3.1% | 2.9% | 3.0% | 3.0% | 3.0% | 3.0% | 2.7% |
| EU | 0.8% | 0.8% | 1.2% | 1.2% | 1.2% | 1.2% | 1.2% | 1.2% | 1.1% |

Note: Table was compiled by the authors from data at https://ec.europa.eu/eurostat/databrowser/view/rd_e_fundgerd/default/table?lang=en (accessed on 10 September 2024) and Figure 5 at https://ec.europa.eu/eurostat/statistics-explained/index.php?title=R%26D_expenditure#R.26D_expenditure_by_source_of_funds (accessed on 10 September 2024); data for Russia are from [6] and for China from [7]. The values for 2022–2023 are forecasts.
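The note flags the 2022–2023 figures as forecasts but does not state the method. A minimal sketch of one plausible approach, offered purely as an assumption, is a least-squares linear trend over the observed years; for the Russia row it lands close to the tabulated values:

```python
import numpy as np

# Observed higher-education shares for Russia from Table 1 (2011-2021), %.
years = np.array([2011, 2015, 2018, 2019, 2020, 2021])
share = np.array([8.0, 9.0, 9.5, 10.7, 10.1, 10.2])

slope, intercept = np.polyfit(years, share, 1)  # least-squares linear trend
for y in (2022, 2023):
    # Extrapolate the fitted line; prints 10.7 and 11.0 (the table shows 11.0 for both years).
    print(y, round(slope * y + intercept, 1))
```

For several other rows the 2022–2023 cells simply repeat the 2021 value, so a last-value carry-forward may also have been used; the table itself does not say.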
Table 2. Indicators for assessing the quality of the fundamental scientific research and engineering project results and the performances of universities' specialized scientific units (SUs) in retrospect.

| Indicators for Assessing the Quality of Project Results and the Performances of Specialized SUs | Significance, % (Fundamental) | Significance, % (Engineering) |
|---|---|---|
| 1. Basic scientific performance indicators: | | |
| 1.1. the number of patents registered | | |
| 1.2. the number of original computer programs registered | | |
| 1.3. the number of defended dissertations (master/science candidates) by employees of SUs | | |
| 1.4. the number of defended dissertations (Ph.D./doctoral) by employees of SUs | | |
| 2. Student cooperation indicators (the statistics of the students attracted to the project teams/the work of the SUs during the reporting period: the number of persons and percentages of staff and of the total working time): | | |
| 2.1. students | | |
| 2.2. postgraduate students | | |
| 2.3. young specialists (25–35 years) | | |
| 2.4. foreign students and postgraduates | | |
| 3. Quantitative economic indicators: | | |
| 3.1. total number of researchers involved in the project | | |
| 3.2. working time of researchers, hours | | |
| 3.3. working time of researchers, costs (if available) | | |
| 3.4. constantly used spaces of laboratories, m² | | |
| 3.5. constantly used office spaces, m² | | |
| 3.6. costs for maintaining laboratory and office spaces | | |
| 3.7. residual value of the laboratory equipment used, which belongs to SUs/STUs | | |
| 3.8. cost of specially purchased equipment for the project | | |
| 3.9. laboratory equipment use of other departments (SUs) and organizations (costs and hours) | | |
| 3.10. costs of materials used for laboratory experiments | | |
| 3.11. other costs | | |
| 3.12. net profit or pure income (proceeds minus all the costs and taxes) | | |
| 3.13. proceeds per researcher on a project or in a reporting period | | |
| 3.14. net profit per researcher on a project or in a reporting period | | |
| 4. Quantitative scientometric indicators: | | |
| 4.1. the quantity of scientific publications indexed by Scopus/WoS 1–2 quartile | | |
| 4.2. the quantity of scientific publications indexed by Scopus/WoS 3–4 quartile | | |
| 4.3. the quantity of scientific publications indexed by Scopus/WoS, without quartile | | |
| 4.4. the quantity of scientific publications indexed by national citation databases (for example, the Russian Science Citation Index, RSCI) | | |
| 4.5. the quantity of citations in Scopus/WoS databases * | | |
| 4.6. the quantity of citations in the national citation databases * | | |
| 4.7. the quantity of reviews for Scopus/WoS performed | | |
| 4.8. the quantity of reviews performed for publications, indexed in national citation databases | | |
| 5. International cooperation indicators: | | |
| 5.1. foreign researchers attracted to the project teams/the work of SUs during the reporting period (the number of persons and percentages of staff and of working hours) | | |
| 5.2. researchers of SUs attracted to work with foreign partners during the reporting period (the number of persons and percentages of staff and of working hours) | | |
| 6. Qualitative assessment (comprehensive multifactorial assessment): | | |
| 6.1. possibilities for integration with the results of previous and related studies | | |
| 6.2. maintaining existing achievements, general culture, and expanding the activities of the scientific school | | |
| 6.3. the possibility for testing/the partial implementation of the results in practice in different industries—"knowledge transfer"—on a test or stream basis | | |
| 6.4. the possibility for publishing results with inclusion in regional/national or sectoral research information systems | | |
| 6.5. invitations to SU researchers to become constant members of national and international scientific associations | | |
| 6.6. invitations to SU researchers to participate in national academic councils which are awarding the scientific degrees | | |
| 6.7. other direct and indirect positive impacts in various areas | | |
| TOTAL | 100.0% | 100.0% |

* Citations of works published over the past 5 years. Note: Table was created by the authors.
Table 3. Respondents' characteristics.

| Characteristic | Mining University | St. Petersburg State University | SPb Polytechnical University | ITMO University | LETI University |
|---|---|---|---|---|---|
| Total number of researchers (employees of SUs/STUs) | 180 | 230 | 250 | 200 | 50 |
| Total number of researchers who took part in the survey (246) | 103 | 59 | 29 | 27 | 18 |
| of them: SU leaders | 5 | 4 | 1 | 1 | 1 |
| of them: middle managers and specialists | 79 | 40 | 19 | 15 | 10 |
| of them: post-graduate students | 12 | 7 | 5 | 5 | 3 |
| of them: students | 7 | 8 | 4 | 6 | 4 |
| aged 20–25 | 20 | 19 | 9 | 11 | 7 |
| aged 25–35 | 31 | 12 | 6 | 3 | 3 |
| aged 35–50 | 40 | 18 | 11 | 9 | 3 |
| aged >55 | 12 | 10 | 3 | 4 | 5 |

Note: Table was created by the authors.
Table 4. Problems in university research and their possible solutions.

| Problem | Possible Solution |
|---|---|
| 1. The insufficient involvement of students, postgraduates, and young specialists in research, which complicates the transfer of innovations in the long term and threatens the sustainability of the development of both the university and its macroenvironment (region, industry, and country) [27,28,34]. | The creation of conditions for the development of university science by the state: the construction of laboratory premises, the acquisition of equipment, and support for engineering schools [36,37,39]; attracting students to research via the entrepreneurial activities of the university [34,35]. |
| 2. The risk of unjustified investment in university research: "the system for identifying promising developments at universities is retroactive, which leads to a low potential for their commercialization... and to unjustified investments" [42]; "falsification of research at technical universities can not only deprive the university of the trust of sponsoring companies but also lead to emergency situations when trying to implement it" [43]; publication of results in "predatory" journals is a research management risk [44,45]. | Correctly defining the task, drawing up detailed technical specifications, and taking responsibility for the results of research [46]; implementing terms from the ISO 9000 series of international quality standards and their analogs for science products in research contracts and technical specifications: "product"—"scientific result"; "requirement"—"scientific criteria"; "quality"—"the degree of scientific validity of a research result" [47,48]. |
| 3. The separation of the functions of research contracting and contract execution: "the creation of scientific products and their successful sale as products or services on the market are different types of activities that require separate management and organizational efforts and structures" [47,49]. | Attracting managers from international companies to university science contract and sales divisions [5,50,51] and implementing support schemes and promotional programs that reward the key specialists who can present, sell, and execute research [52]. |
| 4. The incomplete reflection of specialists' competencies: shortcomings in realizing the potential of temporary and permanent scientific teams (SUs, engineering centers, etc.) in patent and grant activities [53,54,55]. | Involving researchers, lecturers, and students in the work of "entrepreneurial university" small enterprises and encouraging them to register patents and IT-industry products and to apply for grants [56,57,58]. |
| 5. Low levels of scientific collaboration and communication among researchers, universities, and production companies: insufficient trust and cooperation for joint scientific research between university units [59,60]; the absence or shortcomings of academic research communication and management systems (RCMSs), such as the European EuroCRIS, complicate the exchange of experience within and between universities and production companies and hinder research result implementation [61,62,63]. | Stimulating scientific collaboration within and between universities and production companies by organizing inter- and trans-disciplinary research [64,65,66]; organizing internships for employees of universities and production companies [67,68,69]; creating personalized algorithms and systems of research communication and management with universities' high-tech partner companies [70,71]; introducing an Internet-of-Things (IoT)-based machine-learning approach [72]. |
| 6. The weak involvement of lecturers in scientific activities: "lecturers (teachers) are, for the most part, interested in educational activities, and conducting scientific research is perceived as something forced" [73]; current real-world problems and scenarios are insufficiently used in educational practice [74]. | Shifting the focus to the formation of lecturers' "interdisciplinary competencies" and problem-solving skills, which allows them to carry out desk research on their own and to involve talented students in scientific work [75,76,77]. |
| 7. Limitations of scientometric (bibliometric) indicators: quantitative methods of the integer counting of publications for assessing the effectiveness of academic research are not sufficiently objective and need additional qualitative diversification [78,79,80]. | The use of the "fractional counting" of scientific publications to increase the objectivity of scientific result evaluation [81] (a minimal counting sketch follows this table), taking into account the societal impact, research topic, and other qualitative factors when ranking publications [82,83,84]. |
| 8. Problems of small (regional) universities in attracting qualified scientific personnel who can "make a significant contribution to … the production of knowledge and its transfer" [85,86]. | Regional universities should stress the areas of research most relevant to their territory, with the partial involvement of qualified specialists from local production leaders as consultants [87,88,89]. |

Note: Table was compiled by the authors.
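To make the distinction in problem 7 concrete, here is a minimal sketch of integer versus fractional counting [81]. The publication data and institution names are invented for illustration; under fractional counting each paper distributes exactly one unit of credit across its author slots, so institutional totals always sum to the number of papers:

```python
from collections import defaultdict

# Hypothetical publications: each lists the institutional affiliation of every co-author.
publications = [
    ["Mining University", "SPbSU"],
    ["Mining University", "Mining University", "ITMO"],
    ["SPbSU"],
]

integer_counts = defaultdict(float)     # whole counting: 1 per institution per paper
fractional_counts = defaultdict(float)  # fractional counting: co-authors share 1 credit

for authors in publications:
    for inst in set(authors):           # whole counting ignores author multiplicity
        integer_counts[inst] += 1
    for inst in authors:                # each author slot carries 1/n of the paper
        fractional_counts[inst] += 1 / len(authors)

print(dict(integer_counts))     # Mining University: 2.0, SPbSU: 2.0, ITMO: 1.0 (sums to 5)
print(dict(fractional_counts))  # Mining University: ~1.17, SPbSU: 1.5, ITMO: ~0.33 (sums to 3)
```

Integer counting inflates totals for heavily co-authored output; fractional counting removes that inflation, which is why [81] proposes it for inter-organizational comparisons.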
Table 5. Main indicators of university research: average annual figures for 2021–2023 for the chosen universities.

| Characteristic | Mining University | St. Petersburg State University | SPb Polytechnical University | ITMO University | LETI University |
|---|---|---|---|---|---|
| 1. Number of undergraduate and graduate students, thousands of people | 16.7 | 32.1 | 34 | 14.5 | 9.1 |
| 2. Number of lecturers (employees of education units, teaching staff, and support staff), thousands of people | 2.5 | 3.3 | 2.5 | 1.3 | 1.1 |
| 3. Number of researchers (employees of scientific units), thousands of people | 0.18 | 0.23 | 0.25 | 0.2 | 0.05 |
| 4. Ratio of the number of researchers to the number of lecturers, % | 12% | 7% | 8% | 15% | 5% |
| 5. Annual volume of scientific work performed, millions of rubles | 1500–1950 | 580–650 | 710–790 | 650–780 | 130–170 |
| 6. Share of research ordered by the government and organizations with state participation, % of the total volume of contracts | 20.7% | 69.7% | 59.5% | 48.5% | 78.9% |
| 7. Share of lecturers who published research in Scopus/WoS quartile 1–2 journals | 36% | 14% | 29% | 39% | 17% |
| 8. Share of researchers who regularly publish the results of their research in Scopus/WoS quartile 1–2 journals | 53% | 44% | 57% | 64% | 39% |
| 9. Number of patents registered to the university | 187–298 | 55–112 | 312–628 | 215–365 | 89–178 |
| 10. Share of patent authorship attributable to researchers/lecturers | 65/35% | 85/15% | 78/22% | 62/38% | 82/18% |
| 11. Annual volume of scientific work per employee of the SU, thousands of rubles (average estimate) | 9444 | 2652 | 3000 | 3625 | 3000 |

Note: Table was developed and compiled by the authors from available reporting data and estimates received from the university management.
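Row 11 is consistent with dividing the annual volume of scientific work (row 5) by the number of researchers (row 3). A quick cross-check, assuming the midpoint of each volume range (the exact point estimates the authors used are not stated, so small deviations are expected):

```python
# Figures from Table 5: row 3 (researchers, thousands of people),
# row 5 (volume range, millions of rubles), row 11 (reported, thousands of rubles).
researchers = {"Mining": 0.18, "SPbSU": 0.23, "Polytech": 0.25, "ITMO": 0.20, "LETI": 0.05}
volume_mln = {"Mining": (1500, 1950), "SPbSU": (580, 650), "Polytech": (710, 790),
              "ITMO": (650, 780), "LETI": (130, 170)}
reported = {"Mining": 9444, "SPbSU": 2652, "Polytech": 3000, "ITMO": 3625, "LETI": 3000}

for uni, (lo, hi) in volume_mln.items():
    midpoint = (lo + hi) / 2                # million rubles per year
    estimate = midpoint / researchers[uni]  # million rub / thousand people = thousand rub per person
    print(f"{uni}: midpoint estimate {estimate:.0f} vs reported {reported[uni]}")
```

The midpoint estimates land within roughly 2% of the reported values for all five universities.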
Table 6. University types.

| Characteristic | Mining University | SPb State University | SPb Polytechnical University | ITMO University | LETI University |
|---|---|---|---|---|---|
| The share of students and postgraduates who study technical specialties | 93% | 44% | 68% | 94% | 78% |
| University type (EE—engineering; C—comprehensive; E—mixed, closer to engineering) | EE | C | E | EE | E |

Note: Table was developed and compiled by the authors.
Table 7. Scientific and education units' performances in 2021–2023 for the chosen universities: shares of fundamental research and engineering projects by cost (share of the total volume, %).

| Performed by Units | Mining University | St. Petersburg State University | SPb Polytechnical University | ITMO University | LETI University | Weighted Average * |
|---|---|---|---|---|---|---|
| 1. Scientific units (SUs/STUs), total | 90.3% | 62.8% | 79.8% | 93.6% | 65.3% | 83.7% |
| (a) fundamental research | 19.8% | 16.8% | 14.6% | 12.6% | 27.8% | 17.3% |
| (b) engineering projects | 70.5% | 46.0% | 65.2% | 81.0% | 37.5% | 66.4% |
| 2. Education units (EUs), total | 9.7% | 37.2% | 20.2% | 6.4% | 34.7% | 16.3% |
| (a) fundamental research | 9.1% | 35.0% | 15.3% | 6.0% | 28.0% | 14.4% |
| (b) engineering projects | 0.6% | 2.2% | 4.9% | 0.4% | 6.7% | 1.9% |
| TOTAL | 100% | 100% | 100% | 100% | 100% | 100.0% |
| (a) fundamental research | 28.9% | 51.8% | 29.9% | 18.6% | 55.8% | 31.8% |
| (b) engineering projects | 71.1% | 48.2% | 70.1% | 81.4% | 44.2% | 68.2% |

* Weighted according to the annual volume of scientific work performed (Table 5, row 5). Note: Table was developed and compiled by the authors.
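The footnote's weighting can be reproduced from Table 5, row 5. A minimal sketch, assuming the midpoint of each university's volume range as its weight (the midpoints are our assumption; the authors' exact volumes are not published):

```python
# Weights: annual volume of scientific work (Table 5, row 5), range midpoints, million rubles.
weights = {"Mining": 1725, "SPbSU": 615, "Polytech": 750, "ITMO": 715, "LETI": 150}
# Row 1 of Table 7: share of work performed by scientific units (SUs/STUs), %.
su_share = {"Mining": 90.3, "SPbSU": 62.8, "Polytech": 79.8, "ITMO": 93.6, "LETI": 65.3}

total_volume = sum(weights.values())
weighted_avg = sum(su_share[u] * weights[u] for u in weights) / total_volume
print(f"{weighted_avg:.1f}%")  # -> 83.7%, matching the "Weighted Average" column
```

The same calculation reproduces the other cells of the last column (e.g., 17.3% for fundamental research by SUs), which confirms the weighting scheme described in the footnote.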
Table 8. Results of group indicator analysis.

| Groups of Indicators | Significance, % (Fundamental) | Significance, % (Engineering) |
|---|---|---|
| 1. Basic scientific performance indicators | 10.9% | 11.0% |
| 2. Student cooperation indicators | 7.6% | 13.2% |
| 3. Quantitative economic indicators | 29.8% | 65.4% |
| 4. Quantitative scientometric indicators | 31.7% | 4.4% |
| 5. International cooperation indicators | 3.2% | 1.3% |
| 6. Qualitative assessment (comprehensive multifactorial assessment) | 16.8% | 4.7% |
| TOTAL | 100.0% | 100.0% |

Note: Table was created by the authors.
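The paper does not spell out how the individual questionnaires were combined into these column percentages. A natural aggregation, shown here only as an assumption with invented respondent data, is to normalize each respondent's allocation to 100% and then average across respondents:

```python
import numpy as np

# Hypothetical allocations by three respondents across the six indicator groups (%).
responses = np.array([
    [10, 8, 30, 32, 3, 17],
    [12, 7, 28, 33, 4, 16],
    [11, 8, 31, 30, 3, 17],
], dtype=float)

# Force each respondent's row to sum to exactly 100%, then average across respondents.
normalized = responses / responses.sum(axis=1, keepdims=True) * 100
group_weights = normalized.mean(axis=0)
print(group_weights.round(1), group_weights.sum())  # six group weights; total = 100.0
```

Because every normalized row sums to 100%, the averaged weights also sum to 100%, matching the TOTAL row of Table 8 by construction.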
Table 9. Top six most important single indicators from different groups for fundamental research and for engineering projects.

| Indicators for Fundamental Research | % | Indicators for Engineering Projects | % |
|---|---|---|---|
| 4.1. the quantity of scientific publications indexed by Scopus/WoS 1–2 quartile | 8.8% | 1.1. the number of patents registered | 7.8% |
| 4.5. the quantity of citations in Scopus/WoS databases | 7.8% | 3.12. net profit or pure income (proceeds minus all the costs and taxes) | 6.9% |
| 6.1. possibilities for integration with the results of previous and related studies | 5.6% | 3.4. constantly used spaces of laboratories, m² | 6.4% |
| 1.3. the number of defended dissertations (Ph.D./science candidate) by employees of SUs | 5.3% | 3.2. working time of researchers, hours | 6.1% |
| 4.7. the quantity of reviews for Scopus/WoS performed | 4.5% | 3.3. working time of researchers, costs (if available) | 5.7% |
| 4.8. the quantity of reviews performed for publications, indexed in national citation databases | 3.7% | 3.8. cost of specially purchased equipment for the project | 5.7% |
| Subtotal | 35.7% | Subtotal | 38.6% |

Note: Table was developed and compiled by the authors.
Table 10. Results of the hypothesis evaluation.

| Hypothesis | Conclusion |
|---|---|
| H1 | Partially supported (70%) |
| H2 | Supported |
| H3 | Partially supported (90%) |
| H4 | Partially supported (50%) |
| H5 | Supported |
| H6 | Supported |

Source: Created by the authors.