With the progress in HIV combination antiretroviral therapy (cART) and the aging of people living with HIV (PLWH), there is increasing evidence that living with HIV is a significant risk factor for low bone mineral density (BMD) and fragility fractures [1]. A meta-analysis of 20 studies comparing PLWH to uninfected controls [9] showed pooled odds ratios (ORs) of 6.4 and 3.7 for having osteopenia and osteoporosis, respectively. A previous case-control analysis of Canadian women living with HIV (WLWH) (mean age of 38 years) and age-matched controls derived from the national Canadian Multicentre Osteoporosis Study (CaMos) showed that WLWH had an OR of 1.7 for having experienced a prevalent fragility fracture (i.e., a low-trauma fracture, such as with a fall from standing height) [6]. Similarly, a Spanish population-based study including both men and women reported that an HIV diagnosis increased the risk of hip fracture 5-fold, independent of sex, age, tobacco or alcohol use, and comorbidities [10]. Of note, despite osteoporosis and incident fractures being more common in menopausal women in the general population [11], studies of bone disease in PLWH have been skewed primarily towards men [12]. Given that >50% of PLWH globally are women, the majority of whom are of reproductive age [14], a greater understanding of the pathophysiology of bone health in WLWH is needed to develop optimal preventative and treatment strategies as they age.
Reasons for this increased prevalence of bone fragility in PLWH are multifactorial and complex. Aside from the direct effect of the virus on immune homeostasis and its potential contribution to bone loss [7], PLWH also often have a higher prevalence of risk factors for low BMD and fractures, including poverty and poor nutrition, higher rates of drug use (including illicit and intravenous drugs, tobacco, and alcohol), lower body weight, lower vitamin D levels, and a higher risk of hypogonadism (ovulatory and cycle disturbances in women and lower testosterone levels in men) [3]. In addition to these factors, initiation of cART is associated with a short-term (<2 year) accelerated bone loss of between 2% and 6% that varies with the type of cART [18].
It is also increasingly evident that both HIV infection and cART exposure are associated with accelerated aging, a process described as the declining ability of an organism to resist stress, damage, and disease [19]. Mitochondrial aging and telomere shortening are the basis of two widely accepted theories of aging [20]. HIV infection induces inflammation, causing oxidative stress that can damage both telomeres and mitochondrial DNA (mtDNA) [15]. Treatments, including nucleoside reverse transcriptase inhibitors (NRTIs), can induce mtDNA alterations [22] and may accelerate telomere loss [23], a phenomenon that has been linked with age-associated diseases, such as osteoporosis, in some [24], but not all, studies [25]. In a whole-population-based study in the United Kingdom that included health administrative data on 2150 women (aged 18–79), shorter leukocyte telomere length (LTL) was reported to be associated with decreased BMD of the spine and forearm, but not of the femoral neck [24]. Thus, the relationship between markers of accelerated aging and bone fragility requires further investigation in WLWH, since it may provide insight into the pathophysiology of metabolic bone disease in this at-risk group and lead to interventions to reduce that risk.
In this cross-sectional study, we investigated the prevalence of abnormal BMD values in a cohort of WLWH in British Columbia (BC), Canada. We examined the relationships of BMD values with clinical and HIV-specific osteoporosis risk factors, and compared data from both younger and older (aged ≥50 years) WLWH with a reference, age-appropriate, population-based cohort.
The results of this study support the perspective that HIV infection and its treatment are related to accelerated bone aging [19]. Compared to the local CaMos reference population of similarly aged women, WLWH had lower BMD, particularly lumbar spine (LS) BMD, although it remained within the population normal range. The LS comprises at least 50% trabecular (or cancellous) bone. This relative reduction in spinal BMD was independent of age and was most strongly linked to shorter LTL. Conversely, femoral neck (FN) BMD, with its greater cortical bone component [35], was not significantly different between controls and WLWH. Trabecular bone has an accelerated metabolism and responds more rapidly to bone loss and turnover than cortical bone [36]. This may explain the observation that the most pronounced differences in BMD were seen in the LS of WLWH in this age group, most of whom would be expected to be either pre- or perimenopausal. WLWH in this study also had relatively reduced total hip (TH) BMD compared to the regional reference cohort of women. The TH has an intermediate composition of trabecular and cortical bone, and TH BMD was primarily dependent on age and BMI.
The link between shorter LTL, usually taken as a measure of cellular senescence [38], and osteoporosis has previously been shown in a general population of 2150 randomly selected women [24]. Shorter LTL in PLWH compared to HIV-negative controls has also been previously observed [26]. This is the first demonstration that shortening of LTL is associated with reduced LS BMD in WLWH. Interestingly, a recent study found that telomere length was shorter in osteogenic precursor cells in the peripheral blood of HIV-positive men between 20 and 25 years of age, particularly those who were infected perinatally and had lower BMD values [40]. However, the bone site most affected in this relatively young cohort of HIV-positive men was not the LS, but rather the distal radius, followed by the FN and TH [40], suggesting that there may be gender differences between HIV-positive men and women with respect to the type and location of bone loss.
Both HIV status and cART use can contribute to cellular senescence of bone marrow-derived mesenchymal stem cells, which serve as osteoprogenitor cells and are important for bone formation and remodeling [41]. Of our available data, the variable most strongly linked to LTL was exposure to TDF, an increasingly common component of cART worldwide that is known to affect bone density; this association overpowered the expected relationship between LTL and age [44]. This is noteworthy, as TDF has been demonstrated to exhibit the most potent inhibition of telomerase activity and the greatest telomere shortening of five nucleoside reverse transcriptase inhibitors tested at physiological concentrations in vitro [45]. Our findings suggest that one consequence of this drug's effect relates to compromised bone health in WLWH.
Many studies have now linked cART to low BMD in PLWH, and there is general consensus that a net bone loss of between 2% and 6% can be expected within the first 2 years of cART initiation, depending on the pharmacological combination [1]. The majority of the available data, however, involve relatively short periods of follow-up (most studies lasting only 2 years), and these data overwhelmingly derive from studies in men.
An important question is whether lower BMD translates clinically into an increased risk of fractures in WLWH. This question was addressed by some of the studies mentioned above, as well as by Triant and colleagues [48]. That population-based study, conducted in the USA between October 1996 and March 2008, compared fracture prevalence in 8525 HIV-positive and 2,208,792 HIV-negative patients. Similar to our findings [6], the authors reported an increased risk of fractures in WLWH compared to controls [48]. Among women, the overall fracture prevalence was 2.49 vs. 1.72 per 100 persons in HIV-positive vs. HIV-negative patients (p = 0.002). WLWH had a higher prevalence of vertebral fractures (0.81 vs. 0.45; p = 0.01), but a similar prevalence of hip fractures to their HIV-negative counterparts (0.47 vs. 0.56; p = 0.53) [48]. However, fracture risk was elevated starting at age 30 (i.e., in both pre- and postmenopausal WLWH). This suggests that, to prevent future fractures, awareness and prevention of osteopenia/osteoporosis need to start very early for women living with HIV.
Our study aimed to address some of these gender gaps by providing further data on how the LS, TH, and FN BMD of women living with HIV compare with those of the regional reference population, and on which factors are associated with their relatively poor bone health. It is a strength of this study that it compared local WLWH with a population-based regional CaMos control cohort and that, for the first time, we showed a relationship between an aging-related variable, leukocyte telomere length, and BMD in WLWH. This study is limited, however, by the retrospective nature of the data obtained in these WLWH, by the cross-sectional design, and by the fact that more African-Canadian and Aboriginal women were in the HIV-positive group than in the CaMos cohort; thus, we were unable to adjust for ethnicity in our standardized BMD assessment in the younger women. It is also limited by the lack of a control group similar in ethnicity and lifestyle to the WLWH.
Bone health is compromised in WLWH, even in those remaining premenopausal. Spinal BMD, which has a significant trabecular bone component, is notably lower, especially in WLWH over age 50, independent of age. Shorter leukocyte telomere length (LTL) was strongly associated with reduced LS BMD, which may relate to the role of cellular senescence in the loss of bone renewal potential, especially in the context of specific cART therapies, such as tenofovir, that may further accelerate telomere shortening. Guidelines for the care of WLWH need to take this new information into consideration.
Further research, awareness, and new strategies for maintaining and improving the strength and health of women’s bones are needed, and should begin the day a woman is diagnosed with HIV. Only then will we be able to better prevent fragility fractures in WLWH as they age.