Russian University Teachers’ Perceptions of Scientometrics

This article examines the attitude of Russian university teachers toward scientometrics and its indicators, which university administrations and the state have imposed on them since 2012. In addition to substantiating the urgency of the problem, the article offers a brief critical outline of the main scientometric parameters and their practical application in Russia. To evaluate attitudes, 283 people from leading Russian universities (those included in Program 5-100) were surveyed. As the study showed, faculty members at Russian universities understand the specifics of scientometrics, regard it relatively positively, and in recent years have adapted to the administration's new requirements regarding scientometric tasks and standards. The higher a respondent's position and scholarly qualification, the more complete their knowledge of scientometrics. Scholars in the humanities know scientometrics comparatively better, and regard it more favorably, than representatives of technical and general scientific specialties.


Introduction
Digitalization affects ever more areas of human life; quantitative indicators are becoming dominant even in areas that previously seemed unsuited to them [1]. Scientometrics is a discipline that studies scholarship using mathematical methods, data collection, and statistical processing of bibliographic information (the number of published scientific papers, citations, etc.). Despite its use of mathematical methods, equations, and mathematical analysis, scientometrics can hardly be classified as a full-fledged science, since its main indicators rarely give a complete and objective picture of the scientific achievements of specific researchers (as discussed below). Nevertheless, scientometric data are now widely used in various countries, and in Russia since 2012 there has been a real fetishization of scientometric parameters, at least in the country's leading universities.
The origin of bibliometrics dates back to the end of the 19th century, while scientometrics took shape in the mid-1950s in the United States, founded by the American linguist Eugene Garfield [2]. In 1960, he organized the Institute for Scientific Information (ISI), which since 1963 has regularly published bibliographic indexes of scholarly citation (the Science Citation Index). Such bibliographic information was much needed by universities, rating agencies, and other organizations directly or indirectly connected with education and scholarship, which were therefore willing to pay for it [3]. It is not surprising that three decades later scientometrics in the United States had become a thriving and profitable discipline, especially after the Thomson Reuters Corporation acquired ISI in 1992 and formed the world-famous bibliographic database (BDB) Web of Science (WoS), which Clarivate Analytics took over in October 2016.
In addition to scientometric indicators, expert assessment is also used in Russia and other countries, but usually in the form of journal peer reviews (or reviews of monographs). Of course, this kind of expert analysis indirectly supplements and corrects scientometric statistics. In university practice, however, expert analysis is usually not applied; instead, primitive quantitative indicators are used, for example, the number of publications indexed in the Scopus or WoS databases over the previous three calendar years, or the Hirsch index according to Scopus or the RSCI.
Let us start with the total number of publications, the most common indicator of the effectiveness of a scholar's scientific activity. Not all of a scholar's works are recorded in bibliographic databases, which sometimes reduces the final figure, and significantly. The leading international databases, Web of Science and Scopus, have introduced artificial restrictions on the registration of scholarly papers: articles published in peer-reviewed journals, mainly in English, have absolute priority, which results in discrimination against representatives of non-English-speaking countries [5,6].
In addition to frequently underestimating a scholar's total number of publications, this main scientometric indicator has another important disadvantage: it ignores the complexity, volume, and quality of scholarly work. After all, a simple mechanical count of publications erases the difference between a monograph, a journal article, a review, etc. Therefore, such a simple indicator as the total number of publications must, on the one hand, be detailed by indicating the nature of the publications and, on the other hand, be supplemented with other scientometric indicators.
Here it is also necessary to touch upon the issue of co-authorship. This problem is especially relevant for the natural, medical, and technical sciences, since humanities specialists usually write their work individually or in small teams. A classic example is an article published after research at the Large Hadron Collider and the discovery of the Higgs boson: almost 3000 physicists are formally listed as the authors of this article! [7] (p. 7). It is impossible to imagine that all these people wrote one article. Although fractional counting methods have already been developed in scientometrics [8], they are rarely used in university practice. The main bibliometric databases also give equal co-authorship credit to all contributors (real and, more often, nominal) of an indexed publication.
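Fractional counting, mentioned above, divides each paper's credit equally among its k co-authors instead of granting each a full publication. A minimal sketch (the helper name and the citation figures are illustrative, not from any particular database):

```python
def fractional_credit(author_counts_per_paper):
    """Sum of 1/k over an author's papers, where k is the total number
    of co-authors on each paper (fractional counting)."""
    return sum(1.0 / k for k in author_counts_per_paper)

# An author of three papers written by 1, 2, and 4 people in total:
# full counting credits 3 publications, fractional counting only 1.75.
print(fractional_credit([1, 2, 4]))  # 1.75
```

Under this scheme, a 3000-author collider paper contributes only 1/3000 of a publication to each name on it, which is why full counting inflates the output of large collaborations so dramatically.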
The problem of co-authorship, especially in Russia, also lies in the possibility of various abuses, for example, when co-authors are forced (or volunteer, to assist in the publication of an article) to include the leadership of a department or university despite the absence of any real contribution to the research [9] (p. 276). In addition, false co-authorship distorts scientometric results, sometimes very significantly, leading to citation fraud, contributing to corruption, and so on.
Even this brief outline of the most universally used scientometric indicator demonstrates that the number of an author's works appearing in a given bibliographic database rarely reflects the real number of his or her publications (underestimating it or, conversely, overestimating it through the mechanism of co-authorship). In parallel, scientometrics offers another indicator designed to measure research impact: the citation, that is, a reference to the author's work by another author or group of authors. It is believed that the more citations a work receives, the higher the demand for it, and thus its quality. Currently, the citation index (CI) is a generally recognized indicator of the significance of a scholar's work in the scholarly world, although it has no fewer disadvantages than the previous indicator. As before, not all of an author's works are recorded in the leading databases Web of Science and Scopus, so the number of citations counted is artificially reduced. Again, there is the problem of co-authorship: the citation index does not take into account the personal contribution of the author (when calculating the CI, it does not matter whether fifty people wrote the article or there was only one author) [10] (pp. 135-141).
In scientometrics, it has long been noted that citation depends on the branch of scientific knowledge and its citation culture: physicians and biologists are cited most often, historians and mathematicians least often. In addition, the number of citations depends very significantly on the scholarly topic and specialization, even within a single science. For example, it is obvious that in ethnography, a broader topic or a study devoted to a larger ethnic group will gather a larger "crop" of citations: references to an article about the material culture of the Chinese will be significantly more numerous than references to a publication about totemism among the Haida-Kaigani Indians. Among other things, the number of citations received can be influenced by "scientific fashion" and by personal relationships between scholars (the role of the subjective factor is especially great in Russia, with its tradition of informal scholarly relations and corruption).
There are several other obvious disadvantages to the citation index. In particular, the CI registers references even to works subjected to fair criticism, reaching the point of complete absurdity: a negative reference, which in theory (after verification by experts) should be credited with a minus sign, instead brings the criticized author an additional citation. However, the "father of scientometrics," Eugene Garfield, did not consider this a significant problem since, in his opinion, scholars are not inclined to be distracted by refuting frivolous works [11] (p. 45). He also believed that there is no need to be afraid of self-citation, which from his point of view is to a certain extent justified and is counted by all bibliographic databases without exception. Indeed, references to one's own work can be useful (within reasonable limits) by directing the reader to more detailed information. However, it is clearly unnecessary to count them in scientometric calculations, since they make an artificial increase in citations possible. Thus, the introduction in Italy in 2010 of a rule counting citations in appointments to professorships led to a sharp increase in self-citation, especially among sociologists [12]. According to a special study, the more people cite themselves, the more often they are cited by other scholars [13] (p. 433). Moreover, men are about one and a half times more likely than women to refer to their own work [14].
A very significant disadvantage of the citation index is, in our opinion, that a reference to a specific work is usually counted once, from the article's reference list, even though the text may cite that publication several times. An equally serious disadvantage of the CI is that it can be artificially increased by various manipulations, for example, when colleagues agree to cite each other's results. Such unethical methods are not uncommon in Russia and sometimes occur abroad as well.
Despite all these disadvantages, the citation of scholarly papers formed the basis of another scientometric index: the impact factor (IF) of journals. The impact factor is a formal numerical indicator of the importance of a scientific journal, showing how many times, on average, each article published in it is cited over the two years following publication. The introduction of the impact factor contributed to a better selection of scientific journals by the WoS database, where they are divided into four categories (quartiles), from Q1 (highest) to Q4 (lowest). Currently, the SCImago Journal Rank (SJR) is a convenient and visual system demonstrating journal quartiles. It is a publicly accessible portal developed by the University of Granada (Spain), with ratings of journals and countries, and is associated with the Scopus database [15].
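The two-year impact factor described above reduces to a simple ratio; the sketch below (the function name and figures are illustrative, not a real API) shows the calculation:

```python
def impact_factor(citations_in_year, citable_items_prev_two_years):
    """Two-year impact factor: citations received in year Y to items
    published in years Y-1 and Y-2, divided by the number of citable
    items published in those two years."""
    return citations_in_year / citable_items_prev_two_years

# Illustrative: a journal's 120 articles from 2019-2020 were cited
# 300 times during 2021.
print(impact_factor(300, 120))  # 2.5
```

Note that the denominator counts only "citable items" (articles and reviews), a definitional detail that journals can exploit, and the numerator inherits every citation-level distortion discussed above.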
In addition to the impact factor, the citation of scholarly papers forms the basis of the last of the main scientometric indicators, the Hirsch index (h-index). It was developed in 2005 by Jorge Hirsch of the University of California, San Diego (USA) to assess a researcher's scientific productivity more adequately than such simple characteristics as the total number of publications and citations allow. By its definition, a scientist has index h if h of his or her Np papers have at least h citations each and the other (Np-h) papers have ≤h citations each [16] (p. 16569). Although the h-index has undeniable positive qualities, such as ease of calculation and a relatively adequate assessment of a researcher's scholarly productivity, it is not without disadvantages. Thus, the h-index cannot exceed the total number of the author's works; it does not reflect the author's most important, highly cited works; and two people with the same h-index may have total (summary) citation counts that differ tenfold, etc. [10].

As we can see, none of the three main scientometric parameters is perfect, and their thoughtless use can lead to various misunderstandings and abuses (see in detail [17]). However, the warnings of scientometrics experts had no effect in Russia, where a real fetishization of scientometric indicators began in 2012, which could not but affect the position of teachers in the country's leading universities. The pressure of the state bureaucracy intensified under President Vladimir Putin, including through the strengthened powers of the Ministry of Higher Education and university administrations. To prevent a further decline in the country's prestige in international publication activity, decree no. 599, "On Measures to Implement State Policy in the Field of Education and Science," was signed by President Putin on 7 May 2012.
It set a goal of increasing the share of Russian researchers' publications in the total number of publications in world scholarly journals indexed in the Web of Science to 2.44% by 2015.
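Returning to the h-index defined above: the definition translates directly into a short calculation. The sketch below (a hypothetical helper with illustrative citation counts) computes h from a list of per-paper citation totals:

```python
def h_index(citations):
    """Largest h such that the author has h papers with at least
    h citations each (Hirsch, 2005)."""
    h = 0
    for rank, c in enumerate(sorted(citations, reverse=True), start=1):
        if c >= rank:
            h = rank  # the rank-th most-cited paper still clears the bar
        else:
            break
    return h

# Six papers with illustrative citation counts:
# four of them have at least 4 citations each, so h = 4.
print(h_index([25, 8, 5, 4, 3, 0]))  # 4
```

The sketch also makes two of the disadvantages noted above tangible: h can never exceed the number of papers, and replacing the 25-citation paper with a 2500-citation one leaves h unchanged, so a researcher's most important work is invisible to the index.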
After the publication of the 2012 presidential decree, all subsequent state policy in the field of scholarship began to adapt to it, including the forced introduction of basic scientometric indicators, which began receiving much more attention. Another consequence of the decree was the inclusion of Russian universities in the rating race both at home and abroad. In 2013, the ambitious government program "5-100-2020" was launched, according to which five of the best Russian universities were to enter the top 100 universities globally by 2020. Considerable funds were allocated to 21 elite Russian universities for this project, including funds to increase the publication activity and citation rates of their employees and scholars. In general, the experience was quite successful: according to WoS data for 2017, Russia's share was 2.56% of the world's scholarly publications, corresponding to 13th place globally [18,19] (p. 828). Now, in 2021, according to the SJR rating, Russia is in 12th place globally in terms of the number of scientific publications, slightly ahead of South Korea and behind Spain.
In this situation, Russian scholars and teachers are forced to adapt to the regulations of the state and university bureaucracy. Similar processes are observed in other countries. For example, the rush toward rating indicators in Pakistan, where a similar program to place five of the country's best universities in the world's top 300 higher educational establishments had been adopted, led not only to monetary stimulation of publication activity but also to various abuses and a deterioration in the quality of scientific publications [20] (pp. 442-447). Likewise, in China, young scientists are forced to publish in journals indexed by prestigious international databases [21]. Even in Italy, some authors, albeit an insignificant number, either unknowingly or quite deliberately give their works to so-called "predatory" journals for faster publication and indexing of articles in order to gain citations [22] (pp. 14-15, 26).
The same is observed in Russia. To meet targets, methods that are not always honest are used, allowing one to achieve, and sometimes even exceed, the formal requirements of superiors. In recent years, the number of articles with multiple authors has increased significantly (especially among scholars in the humanities). A common method has been to divide a large article into several smaller ones to increase the total number of publications; publication of the same work under different titles with minimal changes in content has the same effect. Unrestrained mutual citation has also been used [23] (pp. 64-66).

Methods
In preparing this article, standard theoretical methods of scholarship were used: induction and deduction, analysis and synthesis, a systematic approach, and comparative-typological and comparative-analytical methods.
In addition to theoretical methods, practical methods were widely used in writing this article, such as working with documents, analysis of printed and electronic sources of information, and especially computer-assisted web interviewing. The latter helped to gather the main blocks of information on the research topic. Statistical and mathematical methods were also used when processing the questionnaires and respondents' answers.
For a more detailed study of the attitude of Russian university teachers to scientometrics and its indicators, the authors developed a 22-question survey, the results of which are given in Appendix A (see Table A1). The survey, conducted in the first three months of 2020, involved 283 respondents. In 2020, 227 thousand teachers worked in all universities of the Russian Federation; this figure was published in the statistical collection "Education in Figures: 2020," the main source of information on the entire system of Russian education. The collection uses data from the Ministry of Education of the Russian Federation, the Ministry of Education and Science of the Russian Federation, and the Federal Treasury, as well as the in-house analyses of the Institute for Statistical Research and Economics of Knowledge of the National Research University Higher School of Economics. Thus, at a confidence level of 90% and a confidence interval of 5%, the minimum required sample size is 272 respondents (283 university teachers were interviewed in the study). In addition, the share of teachers with an academic degree in the sample was intended to match the average for the Russian Federation (74.1%). The universities surveyed are mainly educational institutions included in Program 5-100, and their teachers were therefore generally better oriented in the problems of scientometrics than representatives of ordinary Russian universities (the Herzen State Pedagogical University of Russia, which is not included in Program 5-100, was chosen as a control university and provided 10% of respondents). Statistical materials were collected both through direct questionnaires and through Internet surveys. The number of respondents from provincial universities (150 people) slightly exceeded the number of respondents from St. Petersburg (133).
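The stated minimum of 272 respondents follows from the standard sample-size formula for a proportion (Cochran's formula) with a finite-population correction; the sketch below reproduces the calculation (depending on the exact z value and rounding convention, the result comes out as 271 or 272, so the figures agree to within rounding):

```python
import math

def min_sample_size(population, z=1.645, margin=0.05, p=0.5):
    """Cochran's sample-size formula for a proportion, with a
    finite-population correction; p=0.5 is the conservative choice."""
    n0 = z ** 2 * p * (1 - p) / margin ** 2   # infinite-population size
    n = n0 / (1 + (n0 - 1) / population)      # finite-population correction
    return math.ceil(n)

# 227,000 university teachers, 90% confidence (z ~ 1.645), 5% margin:
print(min_sample_size(227_000))  # 271
```

With 283 actual respondents, the survey clears this threshold comfortably; at the more common 95% confidence level (z = 1.96), the required sample would rise to about 384.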
Among the respondents, just over half were women (50.5%); men made up 45.6%; a very few refused to answer the question about their gender (3.5%) or indicated another gender (0.4%). The age distribution was as follows: young people under 34 years of age made up 36.7%, the middle-aged (35-49 years) 39.9%, and those 50 and older 23.4%, which roughly reflects the age structure of teachers at Russian universities. As for ethnicity, the vast majority of respondents identified themselves as Russian (96.5%); several people identified themselves as Jews, Kazakhs, Tatars, or Ukrainians. As for the respondents' professional and official structure, professors made up 10.6%, associate professors 44.2%, senior teachers 19.8%, and assistants 25.4%, which approximately corresponds to the proportion of each category in a typical Russian university. Half of the respondents (49.8%) identified themselves as scholars in the humanities, and 50.2% as representatives of the natural and technical sciences. Of the respondents, 46.3% preferred not to reveal their specialty, indicating only the branch of science to which it could be attributed. Information on those who indicated their specialty is presented in Appendix A, together with information on the universities (see Appendix A, Table A2).

Analysis of the Perception of Scientometric Indicators in Leading Russian Universities
Interest in the problem of scholars' attitudes toward scientometric indicators arose initially in the West in the 1990s [24]. One of the most extensive studies of this phenomenon was done in 2012, when an Internet survey of 1704 researchers representing all branches of scholarship from 86 countries was conducted. However, this survey concerned only one scientometric indicator, the impact factor. The results showed that scholars' positive attitude toward it only slightly exceeded the negative one, but for 88% of the respondents the IF was important or very important for evaluating scholarly performance in their country [25] (pp. 286-289). Research in this direction continues, although not very intensively. For example, a comparative analysis of surveys of 420 humanities scholars in Australia and Sweden concerning bibliometric (scientometric) indicators was recently conducted. The survey found, in particular, that a third (32%) of respondents used scientometric parameters for evaluation or self-promotion in applications and résumés [26] (p. 927).
In Russia, almost the only work devoted to the topic of interest is an article published in 2016 by Igor Filippov, "How Scholars of Humanities Profile Evaluate Scientometrics," in the journal Siberian Historical Research. Forty humanities scholars took part in the interviews he conducted, so the representativeness of the study is clearly insufficient, as the author himself admitted. At the same time, all of Filippov's respondents were united in the opinion that the proposed new ways of evaluating scientific activity using scientometric indicators are unsatisfactory, since they are highly formalistic, do not allow evaluation of the merits of scholars' work, and are therefore unfair. The interviews also revealed that many respondents were not well informed about scientometrics, its goals, heuristic capabilities, limits, and experience of application: many, even experts, were unable to report their own data in the Russian Science Citation Index or in foreign bibliometric databases (specific figures are not given in Filippov's article) [27] (p. 14).
After a preliminary review of the main parameters of the respondents, let us proceed directly to the results of the survey. To the question, "Do you know what 'scientometrics' is?" the most popular answer was "Yes, I know very well" (54.8% of responses); another 35% of respondents chose "I vaguely imagine," and 10.2% chose "I do not know." Thus, a little more than half of Russian university teachers are well familiar with scientometrics, and only 10% do not know about it, although the share of those without clear knowledge of this discipline is significant: just over one-third. Most respondents have a neutral attitude toward scientometrics (56.9%); 26.9% have a positive attitude to the discipline, and 16.2% a negative one. These figures show that, in general, the teaching community has already adapted to the administration's requirements and is relatively favorable toward scientometrics (definitely favorable: about a quarter), with opponents outnumbered. It is also likely that this situation reflects the desire of some teachers, especially from provincial universities, to indirectly demonstrate their loyalty to the university authorities (whose primary concern is to formally increase scientometric indicators for reporting to the ministry).
Of the respondents, 63.3% could boast good knowledge of all three bibliometric databases (the RSCI, Web of Science, and Scopus). "Something is familiar, but I have no clear idea" was the answer of 17% of respondents. Only 14.1% of respondents know well only the Russian Science Citation Index, 3.2% do not know any of the bibliometric databases at all, and 2.4% gave their own version of the answer, which in semantic content is closest to good knowledge of all three databases. Thus, only a very small percentage of Russian teachers know nothing about bibliometric databases.
Next in our questionnaire were clarifying questions related to scientometric indicators. To the question, "Do you know what the 'h-index' is?" 77.4% answered "yes," 17% "very approximately," and 5.6% "no." It is interesting to note that respondents know the h-index much better than scientometrics itself. Perhaps the reason lies in the exotic-sounding term, which periodically pops up in the scientific press, administration orders, and teachers' private conversations. As for knowing the value of their own h-index, Russian university teachers show much less awareness: about a third (32.5%) do not know their own h-index in the RSCI database. The situation is even worse for international databases: more than half of the respondents have no idea of their h-index in the Web of Science (61.5%) or Scopus (56.2%). This is not surprising, given that not all Russian teachers, especially young ones, have registered citations in international databases and, accordingly, a nonzero h-index.
To the question, "Do you know what the journal's impact factor and its quartile are?" a little more than half of the respondents answered: "I know both terms very well and understand their meaning" (55.1%). Almost 30% of respondents have a vague idea about them ("I have encountered them somewhere"-29.7%), and 11.3% first learned about the existence of these terms during the survey. The last figure agrees well with the negative answer about knowledge of the term "scientometrics" (10.2%) in one of the previous questions.
The next question was: "Do you keep track of your scientometric indicators in the main bibliographic databases?" It turned out that a little more than half of the respondents monitor their scientometric indicators from time to time (55.1%), 20.9% monitor them constantly, and almost a quarter (24%) never do. The last figure is very significant and surprising. An additional review of the questionnaires showed that of the 67 people who ignore their data in WoS, Scopus, and the RSCI, most occupy the lower levels of the official hierarchy (assistants and senior teachers: 88%), and most are young men who, due to their age, do not yet have publication indicators worth monitoring.
To the question, "How does your university's management treat scientometric indicators?" 44.2% of respondents answered that the university management pays some attention to scientometric indicators, 41% believe that the administration regularly monitors teachers' scientometric indicators, and according to 14.8% of respondents, it ignores them. Thus, most teachers (85.2%) are aware of administrative control and are probably trying to adapt to it, including by correcting their publication strategy. At the same time, the authors should note the relatively high percentage of responses (almost 15%) denying that the university administration monitors scientometric parameters. It is obvious that the respondents who answered this way occupy the lowest levels of the official hierarchy and simply do not know the administrative policy of the university management. It is also striking that most such respondents (56%) are concentrated at the Herzen State Pedagogical University of Russia, where control over teachers' scientometric results is of secondary importance since the university is not included in Program 5-100.
According to our survey, most universities have incentives for high scientometric achievements. Almost half of the respondents (46.7%) reported that their superiors sometimes reward subordinates for good scientometric reporting, and 31.4% said that such incentives are given regularly, based on the results of the academic year and during recertification. At the same time, 21.9% of respondents stated that there is no reward for high scientometric indicators. Most likely, we are again dealing with the responses of young teachers and specialists, who still have very few publications and minimal scientometric indicators; they cannot count on any awards from their superiors and thus have the illusion that the university has no incentive system for scholarly achievements.
To the question, "What specific sanctions are applied to stimulate the increase of scientometric indicators in your university?" the following information was received: the administration of more than a quarter of universities pays bonuses for publications indexed in the WoS and/or Scopus databases (28.6%), and almost the same number of universities do not use any sanctions at all (28.3%). One-fifth of respondents (19.8%) pointed to a link between scientometric indicators and wages, while another 3.2% found it difficult to answer. The remaining 20.1% of the surveyed teachers offered their own options, including dismissal or failure to pass the competition for reappointment due to the lack of publications indexed in the WoS and Scopus databases, as well as sanctions that vary across academic branches or departments of the university.
At the end of the general survey, respondents were asked to evaluate the attitude toward using scientometric indicators in universities on a scale from negative to positive in the range from 1 to 5. The results are as follows: 1: 13.5%, 2: 19.2%, 3: 33.9%, 4: 21.7%, 5: 11.7%. Summing up the negative and positive responses, we get 32.7% and 33.4%, which indicates that the number of respondents who are sympathetic to using scientometric indicators is only 0.7% higher than the number of opponents of this practice. This information is presented in Figure 1.
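The arithmetic behind these aggregates can be checked directly; the snippet below (figures taken from the survey results above) reproduces the 32.7% and 33.4% totals and their 0.7-point gap:

```python
# Distribution of 1-5 ratings from the survey (percent of respondents)
ratings = {1: 13.5, 2: 19.2, 3: 33.9, 4: 21.7, 5: 11.7}

negative = round(ratings[1] + ratings[2], 1)  # scores 1-2
positive = round(ratings[4] + ratings[5], 1)  # scores 4-5
print(negative, positive, round(positive - negative, 1))  # 32.7 33.4 0.7
```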
After reviewing the overall indicators and figures, let us form a more detailed picture and analyze the answers to the main questions of the questionnaire by gender, age, department, and position. The data show that women are better acquainted with scientometrics than men. For example, almost 63% of women but only 45% of men know this discipline very well; 30.1% and 41%, respectively, have vague ideas about it; and 7% of women and 14% of men are not familiar with scientometrics at all. This ratio in favor of women can be explained by the greater responsibility inherent in women (a quality developed by generations of women who bear the main concern for the welfare of their offspring). The greater responsibility and awareness of women also affected the answers to other questions in our questionnaire. Although a neutral attitude toward scientometrics prevails in almost equal proportions among women (55.9%) and men (58.1%), a significant share of women (32.2%) perceive the discipline positively, compared with only 21.7% of men, and the share of men with a negative attitude toward scientometrics (20.2%) is almost double that of women (11.9%).
In addition, women are more familiar with bibliometric databases and with the h-index, and they know the value of this index in their author profile in the RSCI better than men do, though slightly worse in the WoS and Scopus databases. Women are also more attentive to the dynamics of their scientometric indicators in bibliometric databases: 27.3% carry out constant monitoring (men: only 14.7%).
Moreover, while the opinions of both sexes mostly coincide in assessing the control of the university administration over scientometric indicators, there are clear discrepancies in the responses to the question about incentives for high scientific achievements (for example, 41.3% of women but only 21.75% of men noted the presence of incentives at the end of the academic year). Some, though not critical, variation between the sexes is observed in responses to the question about sanctions used to stimulate scientometric indicators. Finally, as expected, the proportion of women who positively evaluate the use of scientometric indicators in universities (36.5%) exceeds the share of those who perceive such practices negatively (26.4%); among men, the proportion is quite different: 40.6% think negatively about the use of scientometric indicators, and only 29.4% approve of them.
Age also has a certain impact on the perception of scientometrics and its main parameters. Middle-aged people (34-50 years) know best what scientometrics is (66.3%), and only 1.8% of them have no idea about it; older people (over 50) have slightly weaker knowledge, while the weakest indicators are among young people: only 39.4% of them are well versed in scientometrics, and 18.3% know almost nothing about it. Middle-aged respondents also have the most positive attitude to scientometrics (34.6%, with only 8.8% negative, while the negative figure for young people is twice as high: 16.3%). Again, it is primarily middle-aged people who demonstrate knowledge of the bibliometric databases (RSCI, WoS, and Scopus): 77.9% of respondents; awareness is slightly lower among the elderly (62.1%), and young people are noticeably weaker at orienting themselves in the databases (48.1%). In answer to the question "Do you know what the 'h-index' is?" the leaders are again middle-aged people, with 88.5% positive responses; ignorance of this index was shown primarily by the elderly, 15% of whom have no idea about it, whereas among the youth only 4.8% do not. At the same time, all three age categories show poor knowledge of the h-index value in their own author profiles in the various bibliometric databases. Even here, however, representatives of the middle generation are ahead: only 22.8% do not know the value of their h-index in the RSCI, while the share of young people who have no idea about it reaches 40%, and of the elderly, 35.8%. The situation with the personal Hirsch index in foreign databases is even worse: among young people, 71.4% do not know their figure in WoS; among the middle-aged, about half (50.9%); and among the elderly, 61.2%. In light of what has already been said, the answers to the question "Do you keep track of your scientometric indicators in the main bibliographic databases?" were fairly predictable.
Among young people, only 17.3% of respondents constantly carry out such monitoring, and 37.5% are never interested in it; among middle-aged people, the figures were 26.6% and 10.6%, respectively, and among the elderly, 16.7% and 25.8%. Here we can repeat what was already said: young people, for natural reasons, usually have nothing or almost nothing to track in bibliometric databases. Nevertheless, the number of young people with a negative attitude toward the use of scientometric parameters in universities is only slightly higher than the number of supporters (27.1% against 25.5%), whereas among the elderly the number of opponents of scientometric indicators is almost twice as high: 52.4%. Evidently, the more conservative older generation is skeptical of scientometric innovations, and there are therefore half as many supporters of scientometric standards among them: 25.3%. In contrast, in the middle age group, the number of adherents of scientometric parameters is 43%, while the number of opponents is only 21%, i.e., a mirror image of the ratio for the older generation.
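Since the h-index figures prominently in the questions above, it may be worth recalling how the indicator is defined: an author's h-index is the largest h such that the author has h papers with at least h citations each. A minimal sketch of that calculation (the citation counts in the usage comment are invented for illustration):

```python
def h_index(citations):
    """Largest h such that the author has h papers with >= h citations each."""
    cited = sorted(citations, reverse=True)  # most-cited papers first
    h = 0
    for rank, count in enumerate(cited, start=1):
        if count >= rank:   # the paper at this rank still has enough citations
            h = rank
        else:
            break
    return h

# Hypothetical author with five papers cited 10, 8, 5, 4, and 3 times:
print(h_index([10, 8, 5, 4, 3]))  # 4
```

The example returns 4 because four papers have at least four citations each, but the fifth paper has only three.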
The analysis of the questionnaires by the humanities/technical criterion produced a definite surprise. It turned out that scholars in the humanities know relatively better what scientometrics is than representatives of technical and general scientific specialties: 59.6% against 50%, while 14.8% of the latter and only 5.6% of humanities scholars know nothing about this discipline. Moreover, while the shares with a negative attitude toward scientometrics are approximately the same (15.5% and 17%), the share of humanities scholars with a positive attitude (32.6%) significantly exceeds that of technical specialists (21.1%). The weaker interest of representatives of natural and technical disciplines in scientometrics may stem from the fact that they are accustomed to collective scientific work, which brings equal scientometric bonuses to all participants of a project and therefore does not stimulate interest in evaluating personal scientific contributions. Some technical specialists, for example at ITMO University, work in other firms (often with higher earnings) in parallel with their university work, so scientometric data are not critical for their career and material prosperity. On other questions of the questionnaire, the differences between humanities scholars and technical specialists are generally insignificant. Still, as our study shows, humanities scholars are twice as aware of their data in the RSCI and monitor their indicators in the main bibliometric databases more often. On the issue of using scientometric indicators in universities, there is again a discrepancy: only 27.5% of humanities scholars perceive the introduction of metrics negatively, against 36.5% of technical specialists, while, conversely, 40.1% of humanities scholars and 29% of technical specialists welcome the introduction of scientometric indicators.
Now let us look at the main questions of our questionnaire through the prism of answers from people at different levels of the professional hierarchy. Analysis of the answers about scientometrics and familiarity with the discipline reveals a linear pattern: the higher the position and scholarly qualification, the more complete the knowledge. Thus, only 16.7% of assistants know what scientometrics is, and 30.6% have not the slightest idea about it, while among professors the corresponding figures are 86.7% and 0%. The attitude toward scientometrics is also very revealing in this regard: among university employees of the lower categories (assistants, senior teachers), a negative perception of the discipline clearly predominates: 22.2%, against 13.9% and 14.3% of responses in which scientometrics was evaluated from a positive point of view. Associate professors, by contrast, perceive scientometrics positively (36%), and only 8.8% of them have a negative attitude toward it. Professors perceive it even better (43.3%), though many of them also view scientometrics negatively: 20%, a figure close to that of assistants and senior teachers. The opposition to scientometrics among the lower categories of university employees was explained above: these are usually young people who do not yet have a significant number of publications and citations and are therefore, so to speak, unworthy of scientometric attention. Professors and associate professors, on the other hand, usually have sufficient symbolic "capital" in the form of publications and citations. This thesis is confirmed by the answers about knowledge of bibliometric databases: 29.2% of assistants, 48% of senior teachers, 82.4% of associate professors, and 93.4% of professors know all the databases.
A similar linear progression appears in the answers to the question about knowledge of the h-index (52.8% of assistants and 96.7% of professors are familiar with it), and a similar result is observed for other questions. Only occasionally do associate professors challenge professors for the lead, in particular on the question "Do you keep track of your scientometric indicators in the main bibliographic databases?": 32% of associate professors reported constant monitoring of their scientometric data, while only 23.3% of professors gave such answers. At the same time, a significant share of assistants (20.8%) and senior teachers (28.6%) believe that the university administration ignores scientometric indicators, which confirms the above hypothesis about the relationship of age/position (youth/low status) with denial of the system of remuneration for high scientometric indicators. Characteristically, only 4.8% of associate professors deny the existence of such a system in their universities, and not a single professor said that the university administration ignores scientometric data. It is not surprising that 37.3% of assistants and 42% of senior teachers do not approve of the use of scientometric indicators in universities (24.2% and 22%, respectively, hold the opposite view), while associate professors are clear supporters of the use of scientometric data: 51.2% (against only 24%). Professors, however, more often evaluate metrics negatively (46.7%, with only 36.6% positive). This may be due to a certain conservatism inherent in older people, or to a deeper understanding of the shortcomings and formalism of scientometric criteria. More detailed information on these issues is presented in Figure 2.
Finally, we need to consider the attitude toward scientometrics and its indicators in terms of the respondents' capital/regional affiliation. It is hardly necessary to analyze in detail all the answers of teachers of St. Petersburg and provincial universities, since the latter are superior in all parameters. Among representatives of universities in the northern capital, only 44.4% know what scientometrics is, while among their provincial colleagues this figure reaches 64%; all three bibliometric databases are well known to 55.6% of St. Petersburg respondents and 70% of teachers at provincial universities. At the same time, St. Petersburg respondents evaluate scientometrics mostly negatively (39.2% negative reviews against 29.3% positive), while in the provinces the opposite picture is observed (27.4% and 36.8%). These figures can be explained by the desire of teachers in provincial universities to make a better impression and protect themselves from the displeasure of their superiors. The most depressing indicators among St. Petersburg teachers were given by employees of the Herzen State Pedagogical University, which was selected as a control institution not included in the 5-100 Program.
For example, out of 30 respondents from this university, only 3 know well what scientometrics is; the rest either have a vague idea of it (16) or do not know it at all (9). Again, only three respondents know all three bibliometric databases, and so on. This indicates that scientometrics and its data have not yet taken adequate root in ordinary Russian universities, in contrast to a limited number of the country's leading universities.

Discussion
The research conducted has shown that the vast majority of teachers at the leading Russian universities have more or less clear ideas about scientometrics. In general, respondents perceive this discipline relatively positively and do not object to the use of its indicators. This shows that in just a few years, the teaching corps has managed to adapt to the requirements of the university and ministerial administrations. The survey revealed two statistical patterns: the better teachers know scientometrics, the better they feel about it, and vice versa; and the younger the respondent and the lower their position, the more negatively they feel about scientometrics and the less they know about its parameters. At the same time, the better attitude toward scientometrics, knowledge of scientometric standards, and monitoring of them among representatives of the humanities, in contrast to the natural and technical disciplines, turned out to be surprising. Similarly, but even more clearly, the teachers of provincial universities are superior in these respects to their colleagues in St. Petersburg.

Table A2. Data on the main universities studied (columns: University, Full and Abbreviated Name; Year of Foundation; Number of Students).