1. Introduction
In elite athletes, the balance between specific training-induced adaptations and recovery between sessions is crucial to minimize injury risk and enhance performance. Many tools are employed to monitor the effects of internal and/or external load applied to athletes. Notably, to monitor internal load, the rating of perceived exertion, heart rate, lactate concentration, and certain biomarkers are well documented in sports science [1]. Proteins, metabolites, electrolytes, and other molecules are also increasingly utilized by sports team scientists as biomarkers.
To assess skeletal muscle status (i.e., muscle damage), the most common biomarkers remain creatine kinase and urea due to their cost-effectiveness and daily applicability [2,3]. However, molecules related to the endocrine regulation of muscle repair, muscle excitability, and metabolic homeostasis have also gained attention [4]. Biomarker monitoring should occur at multiple time points throughout training, the off-season, and competition cycles. For chronic changes across a season, athletes may be tested every 4–6 weeks or at key training transition points [1,5].
Recently, sports scientists have increasingly used point-of-care (POC) methods to identify and analyze reliable biomarkers that aid in making rapid, critical decisions regarding training load management. Among these biomarkers, a variety of novel molecules (e.g., CD163, heat shock proteins [HSPs], cell-free DNA [cfDNA], blood cell ratios) are suggested to add value in this context. However, the cost and effort involved in measuring these parameters regularly remain high, limiting their convenience for monitoring purposes [6,7]. In contrast, evidence for emerging biomarkers related to the immune system, such as leukocytes and pro- and anti-inflammatory cytokines, is growing [8,9]. According to Haller et al. [7], the potential use of these proteins as biomarkers in exercise settings is of particular interest, as immunological markers indicate differential disturbances in physiological homeostasis or tissue integrity. Additionally, markers of oxidative stress (i.e., malondialdehyde, protein carbonyls, and antioxidant enzymes) have also been studied due to their close connection to the immune system. High-intensity exercise is known to lead to excessive production of reactive oxygen species (ROS) through mitochondrial electron leakage or intracellular enzymes such as NADPH oxidases, xanthine oxidase, and phospholipase A2 [10]. ROS production alters the normal physiological environment of skeletal muscle fibers and vascular endothelial function, both of which play a central role in the inflammatory process [11]. Nevertheless, few studies have investigated biomarkers in Paralympic athletes, and the few studies presented in the literature are all associated with salivary assessment (cortisol, testosterone, the cortisol:testosterone ratio, or secretory immunoglobulin A) [12,13,14].
In a study of four Paralympic swimmers, Sinnott-O'Connor et al. [13] found significant increases in salivary markers associated with two weeks of intensified training load (38.3%), with a subsequent decrease after a 49.5% reduction in training load. These data suggest a higher risk of upper respiratory tract infections in these athletes when submitted to intensive and prolonged training. In addition to upper respiratory tract infections, training intensification can also induce mitochondrial impairments. Flockhart et al. [15] progressively increased high-intensity interval training (HIIT) loads over four weeks in healthy subjects: during the first three weeks, the frequency of exercise sessions increased, and during the fourth week, the training load was reduced to allow for recovery. As a result, the authors observed reduced ROS emission, closely linked to a decrease in mitochondrial respiration, after the fourth training week. This could be a compensatory mechanism to counteract increases in non-mitochondrial ROS production as a protective strategy against oxidative stress. Similarly, Cardinale et al. [16] found that endurance athletes exhibited increases in several markers of mitochondrial autophagy and density and in cytosolic proteins, together with decreases in mitochondrial respiration (20%) and aconitase activity, indicating reduced mitochondrial quality following four weeks of intensified training. This may also help explain the increase in injury incidence associated with HIIT programs after their popularization in the 2000s. Rynecki et al. [17] found a 144% increase in all injuries, including a 137% increase in lower extremity injuries, which may be due to an imbalance between training load and recovery time.
Aconitase, commonly used to assess mitochondrial adaptation, is an enzyme that catalyzes the reversible isomerization of citrate to isocitrate as part of the citric acid (TCA) cycle and is present in both the mitochondria and the cytosol. The TCA cycle is a closed loop of reactions closely connected to carbon dioxide production and the bicarbonate ion in skeletal muscle and red blood cells [18,19,20], suggesting a possible connection between mitochondrial function and acid-base status. The assessment of acid-base status is commonly used in inpatient settings to diagnose gas exchange abnormalities in various pulmonary and non-pulmonary diseases, and over the past few decades, technological advancements have enabled the increasing use of point-of-care (POC) tests in clinical settings, allowing for the analysis of blood gases, electrolytes, and metabolites from capillary blood samples in addition to arterial and venous blood. Compared to venipuncture, capillary blood can be easily obtained by pricking the skin of a fingertip or earlobe [21]. Many studies have shown good agreement between fingertip and venous or arterial blood gas analysis across many devices, suggesting that these devices can be used as a viable alternative to arterial blood gas measurement [22,23,24].
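The acid-base variables reported by such devices are linked by the Henderson-Hasselbalch equation. As a minimal illustrative sketch (not taken from the study; the function name is ours, and the standard constants pKa = 6.1 and a CO2 solubility coefficient of 0.03 mmol·L−1·mmHg−1 are assumed), blood pH can be estimated from bicarbonate and pCO2:

```python
import math

def estimate_ph(hco3_mmol_l, pco2_mmhg, pka=6.1, s=0.03):
    """Henderson-Hasselbalch: pH = pKa + log10([HCO3-] / (S * pCO2)).

    S is the CO2 solubility coefficient in mmol·L^-1·mmHg^-1.
    """
    return pka + math.log10(hco3_mmol_l / (s * pco2_mmhg))

# Typical arterial values (HCO3- = 24 mmol/L, pCO2 = 40 mmHg) give pH ≈ 7.40,
# while the same bicarbonate with a hyperventilation-level pCO2 raises pH.
print(round(estimate_ph(24.0, 40.0), 2))  # ≈ 7.40
print(round(estimate_ph(24.0, 27.0), 2))  # ≈ 7.57
```

This relationship is why a device that measures pH and pCO2 can report a full acid-base profile from a single capillary sample.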
The use of blood gas analysis in exercise has gained increasing attention due to its ability to provide precise insights into acid-base status, respiratory function, and metabolic adaptations [25,26]. Although promising, studies associating arterial, venous, or even capillary blood gas variables with performance in high-level athletes remain scarce. Martínez et al. [27] and Lucía et al. [28] found no differences in capillary blood gas markers in well-trained cyclists after an 8-week off-season period (reduced training volume and intensity). Recently, Lourenço et al. [29] showed that resting acid-base status can be a useful indicator of endurance performance, finding strong relationships between blood bicarbonate concentration, ventilatory threshold parameters (ventilatory threshold and respiratory compensation point), and 10 km performance.
Despite this, reproducibility studies serve as a foundational step to distinguish true physiological adaptations from measurement noise [30]. This is particularly relevant in high-performance and Paralympic sports, as assessing blood gas variables can not only support accurate monitoring protocols but also contribute to evidence-based decision-making in both training and medical care. To our knowledge, few studies have reported reliability data in athletes, and this lack of evidence limits the application of capillary blood gas analysis.
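To illustrate the kind of reproducibility analysis such studies rely on, the single-measures consistency intraclass correlation coefficient, ICC(3,1), can be sketched as follows (a minimal NumPy implementation under our own naming; the athlete pH readings are hypothetical, not data from this study):

```python
import numpy as np

def icc_3_1(data):
    """ICC(3,1): two-way mixed effects, consistency, single measures.

    `data` has shape (n subjects, k repeated sessions).
    ICC = (MS_subjects - MS_error) / (MS_subjects + (k - 1) * MS_error)
    """
    data = np.asarray(data, dtype=float)
    n, k = data.shape
    grand = data.mean()
    ss_total = ((data - grand) ** 2).sum()
    ss_subj = k * ((data.mean(axis=1) - grand) ** 2).sum()   # between subjects
    ss_sess = n * ((data.mean(axis=0) - grand) ** 2).sum()   # between sessions
    ms_subj = ss_subj / (n - 1)
    ms_err = (ss_total - ss_subj - ss_sess) / ((n - 1) * (k - 1))
    return (ms_subj - ms_err) / (ms_subj + (k - 1) * ms_err)

# Hypothetical resting pH readings: 5 athletes x 3 sampling days
ph = [[7.40, 7.41, 7.40],
      [7.45, 7.46, 7.45],
      [7.38, 7.39, 7.38],
      [7.50, 7.51, 7.50],
      [7.42, 7.43, 7.42]]
print(icc_3_1(ph))  # close to 1: athletes keep their rank across days
```

An ICC near 1 indicates that between-athlete differences dwarf session-to-session noise, which is what reliability studies aim to establish.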
We believe that the use of capillary blood gas analysis in sports science, particularly in Paralympic sport, may offer significant advantages due to its ease of application, rapid diagnostic capability, minimal discomfort, and minimally invasive sampling method. To contribute to this scenario and provide important sensitivity information for trainers and scientists, the aim of this study was to verify the intra- and interday reliability of acid-base variables at rest in high-performance Paralympic sprinters. We hypothesized that the athletes' impairments would produce blood acid-base alterations relative to non-athlete subjects, which may call for population-specific interpretation of these variables.
4. Discussion
The aim of the present study was to evaluate the intra- and interday reliability of acid-base variables at rest in high-level Paralympic sprinters. Although a limited number of studies have examined the effectiveness of point-of-care devices for analyzing acid-base status using capillary blood in clinical settings, to the best of our knowledge, this is the first study to provide reliability data for high-performance athletes.
Our imprecision results (Table 2 and Table 5) agreed with others [35,36,37], indicating the high reliability of these measurements for quantifying acid-base status using point-of-care devices. No significant differences were found in any variable analyzed when comparing the three samples during intra- or interday analysis, and we also found high consistency (ICC > 0.89) among samples for all blood variables analyzed, indicating excellent reliability of these parameters (Table 4). The only exception was base excess (BE), which showed a moderate effect size (η² = 0.09), suggesting slightly higher variability in this specific parameter. Nonetheless, even in this case, statistical significance was not reached (p = 0.34), further supporting the consistency of the data. These results reflect high reproducibility across the repeated measurements, as the differences between conditions were statistically negligible and of limited practical relevance.
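For readers less familiar with this effect-size scale, η² expresses the share of total variance attributable to the session effect, and the conventional benchmarks (0.01 small, 0.06 medium, 0.14 large) place the BE value of 0.09 in the moderate band. A minimal sketch (helper names are ours; the two-row array below is toy data, not study data):

```python
import numpy as np

def eta_squared(data):
    """η² for the session effect in an (n subjects x k sessions) array."""
    data = np.asarray(data, dtype=float)
    n, _ = data.shape
    grand = data.mean()
    ss_session = n * ((data.mean(axis=0) - grand) ** 2).sum()
    ss_total = ((data - grand) ** 2).sum()
    return ss_session / ss_total

def effect_size_label(eta2):
    # Conventional benchmarks for eta squared
    if eta2 >= 0.14:
        return "large"
    if eta2 >= 0.06:
        return "medium"
    return "small"

print(effect_size_label(0.09))  # "medium", as observed for base excess
```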
Although the subjects analyzed in our study differ from those of previous studies [21,23,38], our results, collected with the same device, corroborate their findings. In a cohort of 250 participants with various health conditions, including cancer, pneumonia, and metabolic acidosis, Kim et al. [21] found that, with the exception of the partial pressure of oxygen (pO2), all other parameters showed equivalent values or strong correlations with reference methods. The error values for pH, pCO2, HCO3−, and Hb were ±0.08 (±0.04%), ±8.66 mmHg (±8%), ±2.63 mmol·L−1 (±15%), and ±1.67 g·dL−1 (±7%), respectively. These findings align with those of Cao et al. [23], who reported similar error values for pH (0.03%), pCO2 (3.97%), and Hb (1.02%).
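Device-versus-reference errors of this kind are typically summarized as a mean bias with 95% limits of agreement. A minimal Bland-Altman sketch (the paired pH readings below are hypothetical illustrations, not data from these studies):

```python
import numpy as np

def bland_altman(device, reference):
    """Mean bias and 95% limits of agreement between paired measurements."""
    diff = np.asarray(device, dtype=float) - np.asarray(reference, dtype=float)
    bias = diff.mean()
    sd = diff.std(ddof=1)  # sample SD of the paired differences
    return bias, bias - 1.96 * sd, bias + 1.96 * sd

# Hypothetical paired pH readings: POC capillary device vs. arterial reference
poc = [7.40, 7.42, 7.38, 7.41, 7.44]
ref = [7.39, 7.43, 7.38, 7.40, 7.45]
bias, lo, hi = bland_altman(poc, ref)
print(bias, lo, hi)
```

A bias near zero with narrow limits of agreement indicates the device can stand in for the reference method.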
When comparing our mean values with previous studies [21,23,31], we observed slightly higher pH (7.49–7.55) and lower pCO2 (20.91–29.31 mmHg), suggesting that high-level athletes may exhibit mild blood alkalinization at rest. These data differ from those of Lourenço et al. [29] in high-level endurance athletes, whose blood pH and pCO2 were lower than in the athletes evaluated in this study. A pCO2 value below 35 mmHg is commonly associated with hyperventilation and an increase in blood pH. Unfortunately, we did not measure ventilation in the present study. Nevertheless, these data may be explained by the significantly higher pulmonary vital capacity (i.e., the total amount of air exhaled after maximal inhalation) of high-level athletes, resulting from increased strength and endurance of the respiratory muscles [39,40]. Exercise training can also increase mitochondrial function [41], which could increase tissue CO2 production in the citric acid cycle. According to Faull et al. [42], athletes are more accurate at interpreting interoceptive signals, such as pCO2 and H+ alterations, allowing greater ventilatory control compared to sedentary people. This may be a consequence of repeated exposure to acidosis during training sessions, which could trigger greater pCO2 loss through ventilation, increasing blood pH at rest. Further studies should be conducted to confirm these hypotheses.
Increased pCO2 levels can also stimulate the conversion of CO2 into bicarbonate ions, primarily in erythrocytes, through the catalytic action of carbonic anhydrase [20]. A recent study by Lourenço et al. [29] found higher bicarbonate concentrations in high-level endurance runners compared to amateurs. Contrary to this hypothesis, the bicarbonate values in our study were similar to those found in the emergency room [21] but lower than those of high-level endurance athletes [29]. This discrepancy could be attributed to differences in training and physical characteristics between sprinters and endurance athletes, as well as the time of the season during which the analysis was conducted. Since our measurements were taken at the beginning of the season, athletes may have maintained the central (cerebral and ventilatory) but not the peripheral (metabolic) adaptations following a period of rest [43]. Future studies should consider monitoring these parameters throughout the entire season to investigate potential training-induced adaptations in sprinters.
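It is worth noting that POC blood gas analyzers typically measure pH and pCO2 directly and derive bicarbonate from them. A sketch of that derivation via the rearranged Henderson-Hasselbalch equation (standard constants pKa = 6.1 and S = 0.03 mmol·L−1·mmHg−1 assumed; the function name is ours):

```python
def derived_hco3(ph, pco2_mmhg, pka=6.1, s=0.03):
    """[HCO3-] in mmol/L from measured pH and pCO2:
    [HCO3-] = S * pCO2 * 10 ** (pH - pKa)."""
    return s * pco2_mmhg * 10 ** (ph - pka)

# At pH 7.40 and pCO2 40 mmHg the derived value is ~24 mmol/L; at the same
# pH, a lower pCO2 yields a lower bicarbonate, consistent with low-pCO2
# sprinters showing bicarbonate below that of endurance athletes.
print(round(derived_hco3(7.40, 40.0), 1))  # ≈ 23.9
```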
Another key factor in maintaining acid-base balance is hemoglobin. During the conversion of CO2 into bicarbonate, the hydrogen ions produced in this reaction are buffered by hemoglobin, whose buffering capacity increases upon oxygen dissociation, giving it a crucial role in the blood buffer system. Like all variables analyzed here, hemoglobin levels were highly consistent and reliable; however, we found values slightly higher than those reported in previous studies [23,24,31,38] but similar to those found in high-level endurance athletes [29], reflecting classic training-induced adaptations in high-level sprinters [44].
To assess the contribution of non-carbonic buffers (i.e., proteins) to pH maintenance, base excess (BE) is commonly used in point-of-care analysis. BE is defined as the amount of strong acid that must be added to fully oxygenated blood to return the pH to 7.40 at a temperature of 37 °C and a pCO2 of 40 mmHg. Although its application in sports is rare, our data demonstrated that BE is a reliable measure, consistent with reference values (±2 meq·L−1). However, although resting BE values agreed with reference values, BE was less reliable than the other variables analyzed here. Furthermore, we found lower BE values than in endurance athletes [29], which may reflect a specific training-induced adaptation rather than methodological inconsistency. From a practical standpoint, coaches and sports scientists should use BE cautiously as an indicator of plasma protein buffering capacity, and future studies may provide more information about it in high-level athletes.
Creatinine and urea are also commonly measured with point-of-care devices to assess kidney function and muscle catabolism. Like the other parameters, creatinine and urea showed high reliability in our high-level athlete sample, demonstrating potential for use in training load monitoring throughout the season. Banfi and Del Fabbro [45] showed a correlation between body mass and creatinine concentration in athletes from different sports. These findings support the work of Haller et al. [7], who highlighted the value of creatinine and urea, along with creatine kinase, as markers of training load.
The predominance of small effect sizes reinforces that the blood gas parameters remained stable across the five-day interval, an important finding when considering clinical and athletic applications in Paralympic athletes. The reliability of these measures may be essential for physiological monitoring and for informed decision-making regarding training and recovery strategies. However, despite these promising results, certain limitations of the study should be addressed in future investigations. Although none of the intraday comparisons reached statistical significance, the observed effect sizes for some variables ranged from medium to large, particularly for acid-base balance markers (e.g., pCO2, HCO3−, BE) and hematological variables (Hb, URE). These findings suggest potentially meaningful physiological changes that may not be captured through p-values alone, especially in studies with limited sample sizes such as ours. Therefore, the magnitude of these effects warrants further investigation in larger cohorts and may inform individualized monitoring strategies in high-performance or clinical athletic populations.