Peer-Review Record

On the Periphery of the European Social Sciences—A Scientometric Analysis of Publication Performance, Excellence, and Internal Bias in Social Sciences in the Visegrad Countries

Soc. Sci. 2024, 13(10), 537; https://doi.org/10.3390/socsci13100537
by Péter Sasvári and Gergely Ferenc Lendvai
Reviewer 1: Anonymous
Reviewer 2: Anonymous
Submission received: 19 July 2024 / Revised: 20 September 2024 / Accepted: 4 October 2024 / Published: 11 October 2024

Round 1

Reviewer 1 Report

Comments and Suggestions for Authors

This article describes an analysis of the scientific production of the four Visegrad countries in the social sciences.

I think the article can offer some interesting insights that could potentially inform actionable policies. However, I also reckon the paper could be improved in a number of ways to make it more effective.

Below I report some observations:

- I am troubled by the use of Scopus, SciVal, and Scimago, as this prevents sharing the data used in the analysis and replication of the results. This is against Open Science practices and should be tackled somehow. Perhaps open sources for scientometrics could help, such as OpenAIRE, OpenAlex, or Dimensions, or at least include a section where the authors explain why this is not feasible.

- On pages 2/3, the list of the 24 disciplines could be removed. I'd say it is not very useful; a table with the number of journals broken down by discipline would suffice, IMHO. Also, please indicate the number of domestic journals out of the total and for each discipline.

- The analysis is multifaceted, and the authors wisely outline RQs. However, the results are presented in a somewhat confusing way, as the reader has to bounce back and forth between the relevant RQs. Perhaps having them addressed in an orderly fashion would help the reader in parsing the information, which is very dense in the paper.

- There is jargon overload, as I presume the same concept is used with different wordings. E.g., are (domestic | regional | Visegrad) journals used as synonyms? I found several occurrences of "national journals"; is this another way to refer to domestic journals? The concept of being "domestic" also extends to researchers and publications, which, as far as I can see, are never defined openly and are left to the reader to sort out. From what I gathered, "domestic researchers" are the ones affiliated with organisations in the Visegrad countries (what about multiple affiliations? What about authors changing affiliations during the observed period?), while for "domestic publications" I assume they are the ones published in domestic journals by domestic authors (to what extent is a mix of domestic/non-domestic authors accounted for?).

- I have a concern with how the concept of "domestic journals" is operationalised, especially because, at some point, the authors seem to imply that these journals are of lower quality and serve as a stepping stone for academics to establish themselves before moving on to more international venues. However, as the authors state, this classification also covers prestigious domestic journals, such as Scientometrics. Perhaps the authors could describe the quality and impact of the domestic journals using SJR indicators as a proxy and show more clearly to what extent this operationalisation supports their claims later in the paper.

- When the authors introduce the three stages on page 5, it is not clear whether they were derived from analysing individuals' publishing patterns over time. Isn't it a stretch to claim this otherwise?

- On page 15, the text seems to repeat itself in a couple of places.

- When the authors assert, "This leadership is underpinned by well-established academic infrastructures and supportive research policies that foster a conducive environment for scholarly output," the reader is left with little support to validate the claim or dig deeper into the details.

- "low, medium, and high rates of internal bias" how?

Author Response

Comments 1:

“I am troubled by the use of Scopus, SciVal, and Scimago, as this prevents sharing the data used in the analysis and replication of the results. This is against Open Science practices and should be tackled somehow. Perhaps open sources for scientometrics could help, such as OpenAIRE, OpenAlex, or Dimensions, or at least include a section where the authors explain why this is not feasible.”

Response 1:

Thank you for raising this important point. We have revised the manuscript to provide a more detailed explanation for the use of Scopus, SciVal, and Scimago in our research. Specifically, we have added a section in the methodology to justify the selection of these tools based on their widespread usage, data quality, and comprehensiveness in the field of scientometrics.

In the revised manuscript, we explain that:

  • Scopus is a globally recognized and trusted database for high-quality scientometric data (Baas et al., 2020). The extensive coverage and rigorous indexing criteria make it a reliable source for this type of research.
  • SciVal provides robust tools for analyzing research performance and collaboration patterns, facilitating comprehensive trend evaluations (Hao, 2021). It allows for deeper insight into research productivity, which is essential for our comparative analysis.
  • Scimago is freely accessible and complements the analysis by offering additional metrics and data on journals indexed in Scopus (Andalia and Contreras, 2010).

However, we recognize the limitations of using proprietary tools in the context of Open Science, as they limit replicability. In response to this concern, we have added a paragraph to the Limitations section addressing Open Science practices and replicability. While the raw data from Scopus, SciVal, and Scimago cannot be directly shared, we have ensured full transparency of our methodology. We provide detailed instructions for replicating the steps of our analysis for any user who has access to these platforms. Additionally, we suggest how similar research can be conducted using open-access datasets such as OpenAIRE, OpenAlex, or Dimensions, although these platforms may not provide identical results due to differences in data scope and indexing (an illustrative sketch follows the change list below).

We will continue to explore ways to balance the use of proprietary datasets with Open Science principles in future work, particularly by incorporating open-access alternatives where possible.

Changes made to the manuscript:

  1. We added a section to the methodology (Section [X]) explaining the rationale for selecting Scopus, SciVal, and Scimago.
  2. A new paragraph was included in the Limitations section where we explicitly address the challenges posed by proprietary databases for Open Science and replicability and propose alternative approaches using open-access sources.
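
To illustrate the open-access route mentioned in the response above, the sketch below shows how a comparable corpus could be assembled from the OpenAlex API. It is a minimal sketch only: the endpoint, filter names, and cursor paging follow the public OpenAlex documentation, while the date range, page cap, and the choice of Visegrad country codes are illustrative assumptions rather than the exact parameters of the study.

import requests

# Minimal sketch: collect works by authors affiliated with institutions in
# the four Visegrad countries from the open OpenAlex API. Filter names and
# cursor paging follow the public OpenAlex documentation; the date range
# and page cap are illustrative assumptions.
BASE_URL = "https://api.openalex.org/works"
V4_CODES = "cz|hu|pl|sk"  # Czechia, Hungary, Poland, Slovakia (OR-joined)

params = {
    "filter": (
        f"authorships.institutions.country_code:{V4_CODES},"
        "from_publication_date:2016-01-01,"
        "to_publication_date:2023-12-31"
        # A field/topic filter could narrow this to the social sciences;
        # see the OpenAlex docs for the current filter names.
    ),
    "per-page": 200,
    "cursor": "*",  # cursor paging handles result sets beyond 10,000 works
}

works = []
for _ in range(50):  # page cap to keep the sketch bounded
    response = requests.get(BASE_URL, params=params, timeout=30)
    response.raise_for_status()
    payload = response.json()
    works.extend(payload["results"])
    next_cursor = payload["meta"].get("next_cursor")
    if not next_cursor:
        break
    params["cursor"] = next_cursor

print(f"Retrieved {len(works)} works")

Because OpenAlex and Scopus differ in coverage and indexing, such a query would approximate rather than reproduce the Scopus-based corpus, which is the caveat noted in the revised Limitations section.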

 
Comments 2:

“On pages 2/3, the list of the 24 disciplines could be removed. I'd say it is not very useful; a table with the number of journals broken down by discipline would suffice, IMHO. Also, please indicate the number of domestic journals out of the total and for each discipline.”

Response 2:

We have removed the list and included a comprehensive table with all the necessary data on page 5.

 

Comments 3:

“The analysis is multifaceted, and the authors wisely outline RQs. However, the results are presented in a somewhat confusing way, as the reader has to bounce back and forth between the relevant RQs. Perhaps having them addressed in an orderly fashion would help the reader in parsing the information, which is very dense in the paper.”

Response 3:

Thank you for pointing out this structural issue. We have restructured the discussion to better reflect the RQs. We hope that the new structure, in which RQ2 is addressed in a separate segment, provides better guidance for readers.

 

Comments 4:

“There is jargon overload, as I presume the same concept is used with different wordings. E.g., are (domestic | regional | Visegrad) journals used as synonyms? I found several occurrences of "national journals"; is this another way to refer to domestic journals? The concept of being "domestic" also extends to researchers and publications, which, as far as I can see, are never defined openly and are left to the reader to sort out. From what I gathered, "domestic researchers" are the ones affiliated with organisations in the Visegrad countries (what about multiple affiliations? What about authors changing affiliations during the observed period?), while for "domestic publications" I assume they are the ones published in domestic journals by domestic authors (to what extent is a mix of domestic/non-domestic authors accounted for?).”

Response 4:

Thank you very much for this comment; we have thoroughly revised the manuscript to address this problem. To make matters more convenient for readers, we have also included a small summary table under Section 2 which explains our definitions. We opted for the consistent use of “domestic” and deleted or rephrased every instance where national/regional/Visegrad were used interchangeably.

 

Comments 5:

“I have a concern with how the concept of "domestic journals" is operationalised, especially because, at some point, the authors seem to imply that these journals are of lower quality and serve as a stepping stone for academics to establish themselves before moving on to more international venues. However, as the authors state, this classification also covers prestigious domestic journals, such as Scientometrics. Perhaps the authors could describe the quality and impact of the domestic journals using SJR indicators as a proxy and show more clearly to what extent this operationalisation supports their claims later in the paper.”

Response 5:

We appreciate the reviewer’s insightful feedback regarding the operationalization of domestic journals. We understand the concern about potentially implying that these journals are inherently of lower quality or serve only as stepping stones for researchers. In response, we have revised our manuscript to clarify that the classification of journals as “domestic” does not equate to a judgment of quality.

We have strengthened the section discussing the quality and impact of domestic journals by incorporating a more detailed analysis using SJR indicators. Specifically, we now emphasize that while some domestic journals may have lower SJR values, this is not universally the case. We used SJR as a proxy for journal quality to demonstrate that domestic journals represent a spectrum of impact, from outlets serving regional and emerging scholars to internationally recognized venues. We also specifically highlighted journals with higher SJR values and added a figure (Figure 2) for easier interpretation.
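
As a concrete illustration of this proxying step, the sketch below summarises the SJR distribution of a journal set. The file names, separator, and column labels ("Title", "SJR") are assumptions about a locally saved SCImago rankings export and a hypothetical list of domestic journals, not the study's actual pipeline.

import pandas as pd

# Sketch: summarise the SJR spread of a set of domestic journals.
# Assumes a rankings file exported from the SCImago Journal Rank site
# (semicolon-separated, comma decimals) and a hypothetical local list of
# domestic journal titles; adjust names and separators to the real files.
sjr = pd.read_csv("scimagojr_2023.csv", sep=";", decimal=",")
domestic = pd.read_csv("domestic_journals.csv")  # hypothetical input list

merged = domestic.merge(sjr[["Title", "SJR"]], on="Title", how="left")
print(merged["SJR"].describe())  # shows a spectrum of impact, not one tier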

 
Comments 6:

“When the authors introduce the three stages on page 5, it is not clear whether they were derived from analysing individuals' publishing patterns over time. Isn't it a stretch to claim this otherwise?”

Response 6:

We have added a segment clarifying that the “three stages” concept is indeed derived from analysing the publishing patterns of individuals over the given period.

 

Comments 7:

“On page 15, the text seems to repeat itself in a couple of places.”

Response 7:

We resolved this issue, thank you for pointing it out!

 

Comments 8:

“When the authors assert, "This leadership is underpinned by well-established academic infrastructures and supportive research policies that foster a conducive environment for scholarly output," the reader is left with little support to validate the claim or dig deeper into the details.”

Response 8:

We have deleted the statement.

 

Comments 9:

"low, medium, and high rates of internal bias" how?

Response 9:

We have added a segment in the Methodology section that defines these rate distinctions.
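
For readers of this record, the banding logic could look like the sketch below. The 10% and 25% thresholds are hypothetical placeholders for illustration, not the actual cut-offs defined in the revised Methodology section.

def bias_band(internal_share: float) -> str:
    """Classify a country's internal-bias rate, i.e. the share of its
    output published in domestic journals, into a band.

    The 0.10 / 0.25 thresholds are hypothetical placeholders for this
    sketch; the actual cut-offs are defined in the Methodology section.
    """
    if internal_share < 0.10:
        return "low"
    if internal_share < 0.25:
        return "medium"
    return "high"

print(bias_band(0.18))  # -> "medium"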

 

Reviewer 2 Report

Comments and Suggestions for Authors

This is an excellent submission and as such I have just a few remarks.

 The authors discuss the Scopus classification system (p.2). Are these classes overlapping (are there journals that belong to two or more classes)? If so, does this influence the results?

Minor points

p.1 line 32 Do the authors mean quantitative?

Same line: it is  de Rijcke, not Rijcke,

p. 3 The SJR is an indicator based on the citation network, not just on the number (or percentages) of received citations.

 p. 12 Please give a reference for the Norwegian List

p. 18 line 558. Typo: Garfield, not Garfiels

Author Response

Comments 1:

“The authors discuss the Scopus classification system (p.2). Are these classes overlapping (are there journals that belong to two or more classes)? If so, does this influence the results?”

Response 1:

In line with other comments, we have completely revised this part and included a comprehensive table outlining the structure.

 

Comments 2:

“p.1 line 32 Do the authors mean quantitative?

Same line: it is de Rijcke, not Rijcke.”

Response 2:

Corrected, thank you!

 

Comments 3:

“p. 3 The SJR is an indicator based on the citation network, not just on the number (or percentages) of received citations.”

Response 3:

Corrected, and a detailed SJR definition has been added.
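
For readers of this record, the network-based idea behind SJR can be sketched as a PageRank-style recursion; this is a simplified form, not SCImago's exact formula, which adds further normalisation and correction terms:

P_j = \frac{1 - d}{N} + d \sum_{i \neq j} \frac{c_{ij}}{C_i} \, P_i

where P_j is the prestige of journal j, c_{ij} the citations from journal i to journal j, C_i the total references given by journal i, N the number of journals, and d a damping factor. A journal's score thus depends on where its citations come from, not only on how many it receives.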

 

Comments 4:

“p. 12 Please give a reference for the Norwegian List”

Response 4:

Added Aarstad’s (2010) work. Thank you for the suggestion!

 

Comments 5:

“p. 18 line 558. Typo: Garfield, not Garfiels”

Response 5:

Corrected, thank you!
