Metrics, Volume 2, Issue 2 (June 2025) – 5 articles

  • Issues are regarded as officially published after their release is announced to the table of contents alert mailing list.
  • You may sign up for e-mail alerts to receive the tables of contents of newly released issues.
  • Papers are published in both HTML and PDF formats; the PDF is the official version. To view a paper in PDF format, click the "PDF Full-text" link and open it with the free Adobe Reader.
30 pages, 479 KiB  
Review
Comprehensive Review of Metrics and Measurements of Quantum Systems
by Hassan Soubra, Hatem Elsayed, Yousef Elbrolosy, Youssef Adel and Zeyad Attia
Metrics 2025, 2(2), 9; https://doi.org/10.3390/metrics2020009 - 19 Jun 2025
Viewed by 289
Abstract
Quantum computing promises to offer significant computational advantages over classical computing, leveraging principles such as superposition and entanglement. This necessitates effective metrics and measurement techniques for evaluating quantum systems, aiding in their development and performance optimization. However, due to fundamental differences in computing paradigms and the current immaturity of quantum software abstractions, classical software and hardware metrics may not directly apply to quantum computing, where the distinction between software and hardware can still be somewhat indiscernible compared to classical computing. This paper provides a comprehensive review of existing quantum software and hardware metrics in the scientific literature, highlighting key challenges in the field. Additionally, it investigates the application of Functional Size Measurement (FSM), based on the COSMIC ISO 19761 FSM Method, to measure quantum software. Three FSM approaches are analyzed by applying them to Shor’s and Grover’s algorithms, with measurement results compared to assess their effectiveness. A comparative analysis highlights the strengths and limitations of each approach, emphasizing the need for further refinement. The insights from this study contribute to the advancement of quantum metrics, especially software metrics and measurement, paving the way for the development of a unified and standardized approach to quantum software measurement and assessment.
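As a point of reference for readers unfamiliar with the COSMIC ISO 19761 method named in the abstract: COSMIC expresses functional size as a count of data movements (Entry, Exit, Read, Write), each worth one COSMIC Function Point (CFP). The short Python sketch below illustrates that counting principle on a hypothetical "run Grover search" functional process; the mapping of circuit elements to data movements is an assumption for illustration only and is not one of the three FSM approaches analyzed in the paper.

```python
# Minimal sketch of COSMIC-style functional size measurement (ISO 19761):
# the size of a functional process is the number of its data movements,
# one COSMIC Function Point (CFP) per Entry (E), Exit (X), Read (R), or Write (W).
# The quantum example below is a hypothetical mapping for illustration only.

from collections import Counter

VALID_MOVEMENTS = {"E", "X", "R", "W"}

def cosmic_size(data_movements):
    """Return total CFP and a per-type breakdown for one functional process."""
    counts = Counter()
    for name, movement_type in data_movements:
        if movement_type not in VALID_MOVEMENTS:
            raise ValueError(f"Unknown data movement type {movement_type!r} for {name}")
        counts[movement_type] += 1
    return sum(counts.values()), dict(counts)

# Hypothetical functional process: "run Grover search" as seen by a classical caller.
grover_process = [
    ("search predicate (oracle specification)", "E"),  # Entry: input from the user
    ("number of qubits and iterations", "E"),          # Entry: configuration data
    ("stored circuit template", "R"),                  # Read: persistent data retrieved
    ("measurement outcome", "X"),                      # Exit: result returned to the user
    ("execution log", "W"),                            # Write: persistent data stored
]

total_cfp, breakdown = cosmic_size(grover_process)
print(f"Functional size: {total_cfp} CFP, breakdown: {breakdown}")
# Expected output: Functional size: 5 CFP, breakdown: {'E': 2, 'R': 1, 'X': 1, 'W': 1}
```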
12 pages, 351 KiB  
Article
HOTGAME: A Corpus of Early House and Techno Music from Germany and America
by Tim Ziemer
Metrics 2025, 2(2), 8; https://doi.org/10.3390/metrics2020008 - 29 May 2025
Viewed by 259
Abstract
Many publications on early house and techno music have the character of documentation and include (auto-)biographical statements from contemporaries of the scene. This literature has led to many statements, hypotheses, and conclusions. The weaknesses of such sources are their selective and subjective nature, and the danger of unclear memories, romanticization, and constructive memory. Consequently, a validation through content-based, quantitative music analyses is desirable. For this purpose, the HOuse and Techno music from Germany and AMErica (HOTGAME) corpus was built. Metrics from the field of data quality control show that the corpus is representative and explanatory for house and techno music from Germany and the United States of America between 1984 and 1994. HOTGAME can serve as a reliable source for the analysis of early house and techno music using big data methods, like inferential statistics and machine learning.
30 pages, 315 KiB  
Article
Pre-Service Secondary Science Teachers and the Contemporary Epistemological and Philosophical Conceptions of the Nature of Science: Scientific Knowledge Construction Through History
by Abdeljalil Métioui
Metrics 2025, 2(2), 7; https://doi.org/10.3390/metrics2020007 - 22 May 2025
Viewed by 641
Abstract
In this research, we aim to synthesize the complex issue of students’ and science teachers’ conceptions of the nature of science (NOS). We identified the conceptions about the NOS of ninety-five pre-service science teachers (PSTcs) enrolled in the Qualifying Master’s Program in Teaching Science at the secondary level in Quebec (Canada), particularly relative to the development of science through history and approaches to constructing scientific knowledge, especially regarding the relationship between observation, hypothesis, experiment, measurement and theory. To this end, we constructed a multiple-choice questionnaire (MCQ) comprising 11 statements to characterize their conceptions. The qualitative data analysis underscores the intricate nature of scientific knowledge construction. The conceptions identified among the PSTcs are as follows: 1. Scientific theories today correspond to improvements of ancient theories; 2. Science progresses by accumulation; 3. Scientific advancement results from improving current theories through experimentation; 4. Observation is pure observation that is preconceived; 5. We must experiment with scientific equipment in a laboratory to disprove a theory; and 6. Experiments precede scientific theory. Identifying these conceptions is crucial not only for developing training programs that help pre-service science teachers (PSTs) study the science concepts prescribed in the curriculum within the history and epistemology of science, but also for underscoring the urgency and importance of addressing these conceptions.
17 pages, 1815 KiB  
Article
Region Partitioning Framework (RCF) for Scatterplot Analysis: A Structured Approach to Absolute and Normalized Data Interpretation
by Eungi Kim
Metrics 2025, 2(2), 6; https://doi.org/10.3390/metrics2020006 - 8 Apr 2025
Viewed by 532
Abstract
Scatterplots can reveal important data relationships, but their visual complexity can make pattern identification challenging. Systematic analytical approaches help structure interpretation by dividing scatterplots into meaningful regions. This paper introduces the region partitioning framework (RCF), a systematic method for dividing scatterplots into interpretable regions using k × k grids, in order to enhance visual data analysis and quantify structural changes through transformation metrics. RCF partitions the x and y dimensions into k × k grids (e.g., 4 × 4 or 16 regions), balancing granularity and readability. Each partition is labeled using an R(p, q) notation, where p and q indicate the position along each axis. Two perspectives are supported: the absolute mode, based on raw values (e.g., “very short, narrow”), and the relative mode, based on min–max normalization (e.g., “short relative to population”). I propose a set of transformation metrics (density, net flow, relative change ratio, and redistribution index) to quantify how data structures change between modes. The framework is demonstrated using both the Iris dataset and a subset of the airquality dataset, showing how RCF captures clustering behavior, reveals outlier effects, and exposes normalization-induced redistributions.
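To make the partitioning idea concrete, a minimal sketch follows, assuming only what the abstract states: points receive R(p, q) labels on a k × k grid, with the absolute mode binning raw values against fixed axis bounds and the relative mode binning against the min–max range of the data. The function names, the choice of bounds, and the simple share-of-points "density" are illustrative assumptions, not the paper's own definitions.

```python
# Minimal sketch of k x k region partitioning with R(p, q) labels, assuming only
# what the abstract states. Bounds, names, and the "density" definition are
# illustrative assumptions, not the framework's definitions.

from collections import Counter
import numpy as np

def region_labels(x, y, k=4, x_bounds=None, y_bounds=None):
    """Assign each point an R(p, q) label on a k x k grid.

    Absolute mode: pass fixed raw-value bounds for each axis.
    Relative mode: leave bounds as None, which is equivalent to min-max
    normalizing each axis before binning.
    """
    x = np.asarray(x, dtype=float)
    y = np.asarray(y, dtype=float)
    x_lo, x_hi = x_bounds if x_bounds is not None else (x.min(), x.max())
    y_lo, y_hi = y_bounds if y_bounds is not None else (y.min(), y.max())
    # k - 1 interior cut points per axis; digitize gives bin indices 0..k-1,
    # and clip keeps points at or beyond the stated bounds inside the grid.
    x_cuts = np.linspace(x_lo, x_hi, k + 1)[1:-1]
    y_cuts = np.linspace(y_lo, y_hi, k + 1)[1:-1]
    p = np.clip(np.digitize(x, x_cuts), 0, k - 1) + 1  # 1..k along x
    q = np.clip(np.digitize(y, y_cuts), 0, k - 1) + 1  # 1..k along y
    return [f"R({pi}, {qi})" for pi, qi in zip(p, q)]

def region_density(labels):
    """Share of points falling in each region (one simple notion of density)."""
    counts = Counter(labels)
    return {region: count / len(labels) for region, count in counts.items()}

# Example: the same synthetic data partitioned in absolute and relative mode.
rng = np.random.default_rng(0)
x = rng.exponential(scale=2.0, size=200)      # skewed "raw value" axis
y = rng.normal(loc=10.0, scale=3.0, size=200)
absolute = region_density(region_labels(x, y, k=4, x_bounds=(0, 12), y_bounds=(0, 20)))
relative = region_density(region_labels(x, y, k=4))
print(sorted(absolute.items()))
print(sorted(relative.items()))
```

Comparing the two dictionaries region by region is the kind of mode-to-mode change that the paper's transformation metrics are meant to quantify.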
76 pages, 16124 KiB  
Article
Mapping Data-Driven Research Impact Science: The Role of Machine Learning and Artificial Intelligence
by Mudassar Hassan Arsalan, Omar Mubin, Abdullah Al Mahmud, Imran Ahmed Khan and Ali Jan Hassan
Metrics 2025, 2(2), 5; https://doi.org/10.3390/metrics2020005 - 2 Apr 2025
Cited by 1 | Viewed by 1467
Abstract
In an era of evolving scholarly ecosystems, machine learning (ML) and artificial intelligence (AI) have become pivotal in advancing research impact analysis. Despite their transformative potential, the fragmented body of literature in this domain necessitates consolidation to provide a comprehensive understanding of their applications in multidimensional impact assessment. This study bridges this gap by employing bibliometric methodologies, including co-authorship analysis, citation burst detection, and advanced topic modelling using BERTopic, to analyse a curated corpus of 1608 scholarly articles. Guided by three core research questions, this study investigates how ML and AI enhance research impact evaluation, identifies dominant methodologies, and outlines future research directions. The findings underscore the transformative potential of ML and AI to augment traditional bibliometric indicators by uncovering latent patterns in collaboration networks, institutional influence, and knowledge dissemination. In particular, the scalability and semantic depth of BERTopic in thematic extraction, combined with the visualisation capabilities of tools such as CiteSpace and VOSviewer, provide novel insights into the dynamic interplay of scholarly contributions across dimensions. Theoretically, this research extends the scientometric discourse by integrating advanced computational techniques and reconfiguring established paradigms for assessing research contributions. Practically, it provides actionable insights for researchers, institutions, and policymakers, enabling enhanced strategic decision-making and visibility of impactful research. By proposing a robust, data-driven framework, this study lays the groundwork for holistic and equitable research impact evaluation, addressing its academic, societal, and economic dimensions.
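For readers who want to see what the BERTopic step of such a pipeline looks like in practice, here is a minimal sketch. The authors' actual corpus of 1608 articles, their preprocessing, and their model settings are not described in the abstract, so the tiny synthetic corpus and the parameters below are placeholders rather than a reproduction of their workflow.

```python
# Minimal sketch of the thematic-extraction step with the BERTopic library named
# in the abstract. The synthetic corpus and parameters are placeholders; they are
# not the authors' corpus of 1608 articles or their settings.
# Requires: pip install bertopic

from bertopic import BERTopic

# Tiny synthetic stand-in corpus with three crude themes so that the model has
# something to separate; in practice this would be the list of article abstracts.
themes = [
    "citation counts, impact indicators and bibliometric evaluation",
    "co-authorship networks, institutional collaboration and influence",
    "topic modelling of abstracts with transformer embeddings",
]
docs = [
    f"This article investigates {theme} in research impact analysis (study {i})."
    for theme in themes
    for i in range(40)
]

topic_model = BERTopic(min_topic_size=10)        # default embedding model
topics, probs = topic_model.fit_transform(docs)  # one topic id per document

print(topic_model.get_topic_info())              # overview of the discovered themes
print(topic_model.get_topic(0))                  # top terms of the largest topic
```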