1. Introduction
Bone disease has emerged as a clinically relevant non-communicable comorbidity in people living with HIV (PLWH), largely because contemporary antiretroviral therapy (ART) has transformed HIV into a chronic condition with prolonged survival. In this setting, osteopenia, osteoporosis, and morphometric vertebral fractures are encountered earlier than expected, reflecting an overlap between traditional risk factors (smoking, low BMI, hypogonadism, vitamin D deficiency), HIV-related immune–inflammatory perturbations, and cumulative ART exposure across decades of care [1,2,3,4]. Current expert guidance therefore frames skeletal health as part of routine comprehensive HIV management, not an “optional” add-on [5].
A key clinical nuance is that bone loss in HIV is not purely a late complication: a reproducible drop in bone mineral density (BMD) occurs soon after ART initiation, with trials and cohorts reporting roughly a 2–6% decline over the first 24–48 weeks, followed by relative stabilization [2]. This early “reset” in skeletal homeostasis is clinically meaningful because it may move borderline patients into higher-risk zones, particularly when compounded by tenofovir disoproxil fumarate-associated renal–phosphate effects or other regimen- and host-related factors that accelerate bone turnover [2,3]. These dynamics motivate pragmatic risk stratification strategies that go beyond a single densitometric snapshot.
International recommendations emphasize targeted screening rather than universal DXA, typically prioritizing individuals with older age, prior fragility fracture, chronic glucocorticoid exposure, high falls risk, or other major fracture predictors [5]. In Europe, the latest major update of the European AIDS Clinical Society (EACS) guidelines similarly reinforces structured comorbidity assessment, including bone health, within routine HIV care pathways, which is particularly relevant for real-world clinics balancing competing metabolic and cardiovascular priorities [6]. However, even with guideline-driven screening, the operational problem remains: fracture risk in HIV is frequently multifactorial, and purely “bone-only” algorithms may under-capture functional contributors to fragility.
Dual-energy X-ray absorptiometry (DXA) remains the cornerstone for BMD measurement, yet it quantifies areal density and cannot directly capture trabecular microarchitecture or bone material properties, domains that may be disproportionately affected by inflammation, endocrine disruption, and ART-related remodeling [5]. Trabecular bone score (TBS), derived from lumbar DXA images, offers a practical surrogate for microarchitectural integrity without additional radiation. Importantly, in PLWH, impaired TBS has been linked to subclinical vertebral fractures and may provide incremental information even when BMD is not frankly osteoporotic, supporting the concept of a “bone quality gap” in HIV-related fragility phenotypes [7,8].
Access constraints also matter. In resource-limited environments, such as parts of Romania, DXA availability and throughput may be limited, creating demand for scalable adjuncts. Calcaneal quantitative ultrasound (QUS) is attractive because it is portable and radiation-free, and HIV cohorts have shown clinically relevant correlations between QUS indices (e.g., stiffness) and DXA-derived BMD, supporting QUS as a screening or triage tool when densitometry access is constrained [9,10]. Recent work in PLWH also highlights the value of combining DXA-derived metrics (BMD, TBS) with QUS and bone turnover markers to improve morphometric vertebral fracture profiling, reinforcing a multimodal approach that remains feasible outside highly resourced centers [11].
Even a refined skeletal assessment can still miss a broader fragility phenotype driven by falls propensity and reduced functional reserve. Sarcopenia, conceptualized as impaired muscle strength and/or performance with clinically relevant outcome links, has evidence-based operational definitions emphasizing grip strength and gait speed because these measures best discriminate adverse outcomes [12]. In HIV-relevant populations, validated cut-points for weakness and slowness have been tested in men and women with or at risk for HIV infection, demonstrating the practicality of incorporating simple bedside performance metrics into risk stratification pathways [13]. Furthermore, systematic syntheses suggest that sarcopenia is not rare in PLWH and is associated with adverse clinical trajectories, while in the general older-adult literature sarcopenia is consistently associated with falls and fractures, providing a mechanistic and epidemiologic rationale for integrating muscle function into fragility profiling [14,15]. Therefore, this cross-sectional study in a Romanian tertiary HIV center evaluates whether adding a feasible sarcopenia screening layer improves bone fragility characterization beyond DXA, TBS, QUS, and routine biomarkers; it was hypothesized that sarcopenia-positive individuals would exhibit worse densitometric and microarchitectural profiles and that TBS impairment would be only partially explained by BMD categories.
2. Materials and Methods
2.1. Study Design and Setting
This was a single-center, cross-sectional observational study conducted in the HIV outpatient services and affiliated imaging/laboratory units at Victor Babeș University of Medicine and Pharmacy Timișoara. The Timișoara HIV outpatient service functions as one of the principal regional referral centers for HIV care in western Romania, managing approximately 900–1000 adults living with HIV, and its patient demographic—including both perinatally and horizontally infected individuals across a wide age range—broadly reflects the epidemiologic profile of PLWH in the region, although referral enrichment toward more complex cases is possible given its tertiary, university-affiliated status. The design prioritized feasibility within a resource-limited Romanian context: all assessments were performed during routine care visits or a closely aligned “one-stop” evaluation pathway.
Data collection was planned over an approximate 18–24-month window to allow consecutive recruitment without disrupting clinical workflows. The protocol was designed to be non-interventional: imaging and laboratory tests were part of standard metabolic and comorbidity assessment, while sarcopenia screening used low-cost bedside tools (handgrip dynamometer and timed walk).
2.2. Participants and Eligibility Criteria
Adults (≥18 years) with confirmed HIV infection were eligible if they were receiving stable combination ART for at least 12 months and were referred for bone health evaluation (age threshold, clinician concern, prior fracture history, prolonged corticosteroid exposure, or other standard indications). All participants were required to provide written informed consent and be able to complete the functional testing protocol.
Exclusion criteria were selected to limit major confounding of bone assessment: pregnancy, active malignancy with known bone involvement, advanced chronic kidney disease (stage 4–5), primary metabolic bone disease (including uncontrolled hyperparathyroidism and severe renal osteodystrophy), current or recent prolonged supraphysiological systemic glucocorticoid therapy, recent orthopedic surgery affecting DXA or QUS assessment sites (within 12 months), and current use of bone-active pharmacotherapy other than vitamin D supplementation. Menopausal status was recorded but was not an exclusion criterion, as postmenopausal women represent a clinically relevant subgroup in HIV bone health assessment. Patients with acute severe illness at assessment were deferred. The final analytic sample comprised 98 participants.
2.3. Bone, Laboratory, and Sarcopenia Assessments
DXA measured lumbar spine (L1–L4) and proximal femur BMD, recording site-specific T-scores. BMD categories followed WHO thresholds applied to the lowest site T-score: normal (T ≥ −1.0), osteopenia (−2.5 < T < −1.0), and osteoporosis (T ≤ −2.5). TBS was computed from lumbar DXA images using dedicated software; to keep reporting consistent with one-decimal tables, TBS was presented as TBS × 100 (e.g., 127.6 corresponds to TBS 1.276). Degraded TBS was defined as TBS × 100 < 124.0, partially degraded as 124.0–130.0, and normal as >130.0.
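For illustration, the categorization rules above can be expressed as a minimal Python sketch (function names are hypothetical; thresholds are those defined in this study):

```python
def classify_bmd(lowest_t: float) -> str:
    """WHO categories applied to the lowest site T-score (lumbar spine or proximal femur)."""
    if lowest_t >= -1.0:
        return "normal"
    return "osteopenia" if lowest_t > -2.5 else "osteoporosis"


def classify_tbs100(tbs100: float) -> str:
    """Study cut-offs applied to TBS x 100 (e.g., 127.6 corresponds to TBS 1.276)."""
    if tbs100 < 124.0:
        return "degraded"
    return "partially degraded" if tbs100 <= 130.0 else "normal"
```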
Calcaneal QUS was performed on the non-dominant heel with standardized operator procedures and device quality checks. Parameters included speed of sound (SOS, m/s) and broadband ultrasound attenuation (BUA, dB/MHz). Fasting blood tests included serum CTX and P1NP (reference bone turnover markers) and 25(OH) vitamin D.
A pragmatic sarcopenia screen was implemented to match resource constraints. Handgrip strength was measured with a dynamometer using standardized posture and best-of-three attempts; low grip was defined by sex-specific cutoffs consistent with common European practice. Gait speed was measured over 4 m at usual pace; slow gait was defined as <0.8 m/s. Participants were categorized as sarcopenia screen-positive if they met low grip strength and/or slow gait speed criteria.
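A minimal sketch of the screen-positive rule follows; the grip cut-offs shown are assumed EWGSOP2-style values (<27 kg for men, <16 kg for women), since the text specifies only sex-specific cutoffs consistent with common European practice:

```python
# Assumed EWGSOP2-style grip thresholds (kg); the slow-gait cut-off is as defined in the text.
GRIP_CUTOFF_KG = {"M": 27.0, "F": 16.0}
SLOW_GAIT_MS = 0.8  # usual-pace gait speed over 4 m, in m/s


def sarcopenia_screen_positive(sex: str, grip_kg: float, gait_ms: float) -> bool:
    """Screen-positive if low grip strength and/or slow gait speed."""
    return grip_kg < GRIP_CUTOFF_KG[sex] or gait_ms < SLOW_GAIT_MS
```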
2.4. Statistical Analysis
Continuous variables were summarized as mean ± SD; categorical variables as n (%). Normality was assessed by visual inspection and Shapiro–Wilk testing. Two-group comparisons (sarcopenia-positive vs. negative) used independent-samples t-tests for approximately normal variables and Mann–Whitney U tests for non-normal distributions. Three-group comparisons (TDF exposure strata) used one-way ANOVA with post hoc testing where appropriate; categorical outcomes used chi-square tests (or Fisher’s exact test when expected counts were small).
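The test-selection logic for two-group comparisons can be sketched as follows (Python with SciPy; the helper name is hypothetical):

```python
import numpy as np
from scipy import stats


def compare_two_groups(x, y, alpha: float = 0.05):
    """Shapiro-Wilk on each group, then an independent-samples t-test if both
    look approximately normal, otherwise a Mann-Whitney U test."""
    x, y = np.asarray(x, dtype=float), np.asarray(y, dtype=float)
    normal = stats.shapiro(x).pvalue > alpha and stats.shapiro(y).pvalue > alpha
    if normal:
        return "t-test", stats.ttest_ind(x, y).pvalue
    return "Mann-Whitney U", stats.mannwhitneyu(x, y).pvalue
```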
Pearson correlation coefficients quantified linear associations between TBS × 100 and relevant clinical/biochemical variables. A multivariable logistic regression model examined predictors of degraded TBS (TBS × 100 < 124.0), reporting adjusted odds ratios (aOR) with 95% confidence intervals. Model predictors were chosen a priori based on feasibility and clinical plausibility (age, BMI, HIV duration, nadir CD4+ category, viral suppression, cumulative TDF exposure category, sarcopenia status, CTX). Statistical significance was set at two-sided p < 0.05.
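A minimal sketch of the a priori logistic model, assuming hypothetical column names for the analytic dataset (statsmodels):

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("cohort.csv")  # hypothetical analytic dataset
df["degraded_tbs"] = (df["tbs100"] < 124.0).astype(int)

fit = smf.logit(
    "degraded_tbs ~ age + bmi + hiv_years + C(nadir_cd4_cat)"
    " + viral_suppressed + C(tdf_cat) + sarcopenia + ctx",
    data=df,
).fit()

print(np.exp(fit.params))      # adjusted odds ratios (aOR)
print(np.exp(fit.conf_int()))  # 95% confidence intervals
```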
3. Results
Patients who screened positive for sarcopenia were notably older (53.4 vs. 42.8 years, p < 0.001) and had a longer HIV duration (12.6 vs. 10.3 years, p = 0.03), with evidence of more advanced historical immunosuppression (lower nadir CD4+: 206.3 vs. 258.7 cells/mm³, p = 0.029). Viral suppression was less frequent in the sarcopenia group (60.7% vs. 85.7%, p = 0.006). Bone and muscle-related measures showed the clearest separation: CTX was higher (0.6 vs. 0.4 ng/mL, p < 0.001), grip strength was substantially lower (23.1 vs. 33.7 kg, p < 0.001), and gait speed was slower (0.7 vs. 1.1 m/s, p < 0.001). Skeletal quality also appeared worse in those with sarcopenia, with a lower lumbar spine T-score (−1.7 vs. −1.2, p = 0.014), lower TBS (123.8 vs. 127.7, p = 0.02), and reduced calcaneal SOS (1523.3 vs. 1548.8 m/s, p = 0.002) (Table 1).
Longer TDF exposure showed a clear dose–response pattern toward worse bone density, weaker microarchitecture, and higher bone turnover. Compared with never-exposed patients, those with >5 years of TDF were older (49.3 vs. 42.3 years, p = 0.004) and had substantially lower DXA T-scores across all sites: lumbar spine (−1.7 vs. −0.9), total hip (−1.3 vs. −0.6), and femoral neck (−1.6 vs. −0.9), all p < 0.001. TBS declined stepwise (123.1 vs. 129.8, p < 0.001) and calcaneal SOS also decreased (1524.2 vs. 1557.7 m/s, p < 0.001), with BUA lower as exposure increased (104.8 vs. 111.6 dB/MHz, p = 0.034). Bone turnover markers were higher with prolonged exposure: CTX rose (0.6 vs. 0.4 ng/mL, p < 0.001) and P1NP increased (60.0 vs. 48.8 µg/L, p = 0.02). Clinically, the proportion with degraded TBS (<124.0) was much higher in the >5-year group (82.1%) than in never-exposed patients (40.0%), p = 0.001 (Table 2).
Bone quality (TBS) worsened as the BMD category declined, and the distribution of TBS categories differed significantly across BMD groups (chi-square p = 0.006). Even among patients with normal BMD, over one-quarter (27.0%) had degraded TBS, indicating that microarchitectural impairment can exist despite normal density. In osteopenia, degraded TBS was common (38.6%), and in osteoporosis it was the dominant pattern (58.8%). Vertebral fracture prevalence increased numerically from normal BMD (8.1%) to osteopenia (18.2%) and osteoporosis (23.5%), but this trend did not reach statistical significance (p = 0.236), consistent with limited power for fracture comparisons in the smallest subgroup (Table 3).
TBS correlated most strongly with classic skeletal measures and functional performance. Higher TBS was moderately to strongly associated with higher lumbar spine T-score (r = 0.6, p < 0.001) and total hip T-score (r = 0.5, p < 0.001), supporting that better density aligns with better microarchitecture. TBS also correlated positively with calcaneal SOS (r = 0.4, p < 0.001) and with grip strength and gait speed (both r = 0.4, p < 0.001), linking microarchitecture to overall musculoskeletal health. In contrast, older age (r = −0.4, p < 0.001), longer HIV duration (r = −0.3, p = 0.002), and higher CTX (r = −0.3, p = 0.001) were associated with lower TBS. Vitamin D showed a small but statistically significant positive correlation (r = 0.2, p = 0.045) (Table 4).
After adjustment for multiple factors, the only predictor that remained statistically significant was viral suppression, which was associated with substantially lower odds of degraded TBS (aOR 0.3, 95% CI 0.1–0.9, p = 0.034). Other variables showed trends but did not meet significance: age (aOR 1.4 per 10 years, p = 0.106) and BMI (aOR 0.9 per 1 kg/m², p = 0.072) suggested possible directionality, but uncertainty remained. Notably, prolonged TDF exposure (>5 years) and sarcopenia were not independently significant in this multivariable model (p values 0.3–0.5), implying that their effects on TBS may overlap with other covariates or emerge more clearly in stratified or non-linear analyses than in a single binary endpoint model (Table 5).
Adding DXA T-score information improved fracture discrimination more than adding TBS or sarcopenia in this dataset. The clinical-only model (A) had an AUC of 68.3, which increased to 72.5 after adding the lowest T-score (Model B; ΔAUC +4.2), although this improvement was borderline (LRT p = 0.081). Adding TBS (Model C) produced only a small further gain (AUC 74.0; ΔAUC +1.5; p = 0.484), and adding sarcopenia (Model D) again increased the AUC modestly (75.6; ΔAUC +1.6; p = 0.377), without statistically significant incremental improvement (Table 6).
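For readers reproducing this type of nested-model comparison, a minimal sketch follows; the Model A covariates and column names are hypothetical placeholders, and the likelihood-ratio test compares nested logistic models as described above:

```python
import pandas as pd
from scipy import stats
from sklearn.metrics import roc_auc_score
import statsmodels.formula.api as smf

df = pd.read_csv("cohort.csv")  # hypothetical analytic dataset


def auc_and_llf(formula: str):
    """Fit a logistic model; return its AUC for vertebral fracture and its log-likelihood."""
    fit = smf.logit(formula, data=df).fit(disp=0)
    return roc_auc_score(df["vert_fx"], fit.predict(df)), fit.llf


def lrt_p(llf_small: float, llf_big: float, df_diff: int) -> float:
    """Likelihood-ratio test comparing nested models."""
    return stats.chi2.sf(2 * (llf_big - llf_small), df_diff)


auc_a, ll_a = auc_and_llf("vert_fx ~ age + bmi + hiv_years")             # Model A (clinical only)
auc_b, ll_b = auc_and_llf("vert_fx ~ age + bmi + hiv_years + lowest_t")  # Model B (+ lowest T-score)
print(auc_b - auc_a, lrt_p(ll_a, ll_b, df_diff=1))                       # delta-AUC and LRT p-value
```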
This analysis assessed whether factors influence TBS at the overall mean and in the lower (more vulnerable) tail of the distribution. Cumulative TDF exposure was consistently associated with lower TBS both in the mean model (β = −0.5 per year, p = 0.018) and at the 25th percentile (β = −0.6, p = 0.023), indicating a robust relationship that persists even among patients already in the lower TBS range. Age also showed a significant negative association in both approaches (mean β = −0.2, p = 0.013; lower-tail β = −0.3, p = 0.034). In contrast, probable sarcopenia, CTX, vitamin D, BMI, and viral suppression were not significant in either model (all p > 0.2), suggesting their relationships with TBS may be weaker, non-linear, or mediated through other pathways in this dataset (Table 7).
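A minimal sketch of fitting the mean (OLS) and 25th-percentile (quantile regression) models, assuming hypothetical column names:

```python
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("cohort.csv")  # hypothetical analytic dataset
formula = ("tbs100 ~ age + bmi + tdf_years + sarcopenia"
           " + ctx + vit_d + viral_suppressed")

mean_fit = smf.ols(formula, data=df).fit()            # mean model
q25_fit = smf.quantreg(formula, data=df).fit(q=0.25)  # lower-tail (25th percentile) model

# Compare the cumulative-TDF coefficient at the mean vs. in the lower tail
print(mean_fit.params["tdf_years"], q25_fit.params["tdf_years"])
```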
After controlling for key confounders (age, BMI, and HIV duration), the strongest remaining relationships centered on the agreement between microarchitecture, density, and ultrasound-based bone quality. TBS had a strong positive partial correlation with calcaneal SOS (partial r = 0.61, p < 0.001, q < 0.001) and also remained significantly linked to lumbar spine T-score (partial r = 0.36, q = 0.002). Lumbar spine T-score was likewise significantly associated with calcaneal SOS (partial r = 0.38, q = 0.001). Most other associations did not survive multiple-testing correction (q-values generally > 0.2) (Table 8).
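A minimal sketch of covariate-adjusted partial correlations with Benjamini–Hochberg q-values (pingouin and statsmodels; column names are hypothetical):

```python
import itertools
import pandas as pd
import pingouin as pg
from statsmodels.stats.multitest import multipletests

df = pd.read_csv("cohort.csv")  # hypothetical analytic dataset
covars = ["age", "bmi", "hiv_years"]
measures = ["tbs100", "ls_t_score", "hip_t_score", "sos", "bua", "ctx"]

rows = []
for x, y in itertools.combinations(measures, 2):
    res = pg.partial_corr(data=df, x=x, y=y, covar=covars)  # Pearson partial correlation
    rows.append({"pair": f"{x} ~ {y}",
                 "r": res["r"].iloc[0], "p": res["p-val"].iloc[0]})

out = pd.DataFrame(rows)
out["q"] = multipletests(out["p"], method="fdr_bh")[1]  # Benjamini-Hochberg q-values
print(out.sort_values("q"))
```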
Figure 1 shows a clear age gradient in microarchitectural deterioration, with the predicted probability of degraded TBS rising across all strata and separating meaningfully by TDF > 5 years and sarcopenia. At 35 years, predicted risk was lowest in the reference profile (TDF ≤ 5y/never + no sarcopenia) at 0.12, compared with 0.24 in TDF > 5y without sarcopenia, 0.21 in sarcopenia without prolonged TDF, and highest with combined exposure (TDF > 5y + sarcopenia) at 0.38. By 55 years, probabilities increased substantially and the between-group divergence persisted, reaching 0.65 (TDF ≤ 5y/never + no sarcopenia), 0.81 (TDF > 5y + no sarcopenia), 0.79 (TDF ≤ 5y/never + sarcopenia), and 0.90 (TDF > 5y + sarcopenia).
Figure 2 indicates that fracture risk aligns with a multidimensional pattern (not a single parameter), and that combining skeletal and functional features yields recognizable clusters that may help identify higher-risk patients even when any one measurement alone appears borderline.
Figure 3 demonstrates a non-linear (spline) association between CTX and the predicted probability of degraded TBS, with the CTX–risk curve shifted upward by TDF > 5 years and further accentuated in sarcopenia (dashed lines). At clinically interpretable CTX points, prolonged TDF exposure markedly increased predicted risk in the non-sarcopenic stratum: at CTX = 0.30 ng/mL, the predicted probability was 0.29 for TDF ≤ 5y/never versus 0.86 for TDF > 5y, and at CTX = 0.70 ng/mL, the probability was 0.43 versus 0.60, respectively. The wide confidence bands at the extremes reflect fewer observations in the tails, but the consistent upward displacement of the TDF > 5y curves supports effect modification, implying that for similar turnover levels, individuals with prolonged TDF exposure (especially with sarcopenia) may experience disproportionate microarchitectural vulnerability.
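A minimal sketch of the type of spline logistic model underlying such a figure, assuming hypothetical column names (patsy natural cubic spline with interaction terms):

```python
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("cohort.csv")  # hypothetical analytic dataset
df["degraded_tbs"] = (df["tbs100"] < 124.0).astype(int)

# Natural cubic spline in CTX (patsy cr), with TDF > 5y and sarcopenia
# allowed to shift and reshape the CTX-risk curve via interaction terms.
fit = smf.logit(
    "degraded_tbs ~ cr(ctx, df=3) * (tdf_gt5y + sarcopenia) + age + bmi",
    data=df,
).fit()

# Predicted probabilities at interpretable CTX points for one stratum
grid = pd.DataFrame({"ctx": [0.30, 0.70], "tdf_gt5y": 1,
                     "sarcopenia": 0, "age": 50, "bmi": 24})
print(fit.predict(grid))
```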