
J. Clin. Med., Volume 10, Issue 3 (February-1 2021) – 182 articles

Cover Story: Ventricular assist device (VAD)-specific infections, in particular driveline infections, are a concerning complication of VAD implantation, often resulting in significant morbidity and even mortality. The presence of a percutaneous driveline at the skin exit-site and in the subcutaneous tunnel allows biofilm formation and migration by many bacterial and fungal pathogens. Biofilm formation is an important microbial strategy, providing a shield against antimicrobial treatment and human immune responses; biofilm migration facilitates the extension of infection to deeper tissues such as the pump pocket and the bloodstream. The purpose of this review is to improve the understanding of VAD-specific infections, from basic “bench” knowledge to clinical “bedside” experience, with a specific focus on the role of biofilms in driveline infections.
  • Issues are regarded as officially published after their release is announced to the table of contents alert mailing list.
  • You may sign up for e-mail alerts to receive table of contents of newly released issues.
  • PDF is the official format for papers, which are published in both HTML and PDF forms. To view a paper in PDF format, click on the "PDF Full-text" link and use the free Adobe Reader to open it.
Open Access Article
Mobile Anatomical Total Ankle Arthroplasty—Improvement of Talus Recentralization
J. Clin. Med. 2021, 10(3), 554; https://doi.org/10.3390/jcm10030554 - 02 Feb 2021
Abstract
Introduction: Total ankle arthroplasty (TAA) is becoming a more frequent treatment option for end-stage ankle osteoarthritis (OA) as outcome measures are improving. However, there is concern that malalignment of TAA can result in premature failure of the implant. One of the malalignment issues is sagittal malposition of the talus. However, a consensus on the significance of the sagittal translation of the talus in TAA has yet to be established. The aim of this study was, therefore, to clarify whether OA-related subluxation of the talus is normalized after the implantation of a mobile TAA. Methods: Forty-nine consecutive patients with symptomatic end-stage ankle OA underwent 50 cementless three-component mobile-bearing VANTAGE TAAs (21 right ankles (42%) and 29 left ankles (58%)). Clinical and radiographic outcomes were assessed. Clinical variables: American Orthopedic Foot and Ankle Society (AOFAS) ankle-hindfoot score (0–100), visual analogue scale (VAS, 0–10), and ankle range of motion (ROM). Radiological variables: medial distal tibial articular angle (mDTAA), anterior distal tibial articular angle (aDTAA), and lateral talar station (LTS). Results: The mean AOFAS hindfoot score improved from 42.12 ± SE 2.42 (range: 9–72) preoperatively to 96.02 ± SE 0.82 (range: 78–100) at a mean follow-up of 12 months, a highly statistically significant difference (p < 0.00001). The pain score (VAS) was 6.70 ± SE 0.28 (range: 0–10) preoperatively and 0.26 ± SE 0.12 (range: 0–3) at 12-month follow-up, a highly statistically significant difference (p < 0.00001). ROM showed a preoperative mean of 22.55° ± SE 1.51° (range: 0–50°), with a statistically significant improvement (p < 0.0001) to 45.43° ± SE 1.56° (range: 25–60°) 12 months postoperatively. 
The radiological analyses revealed the following results: on the coronal view, the mDTAA was 88.61 ± SE 0.70 (range: 78.15–101.10) preoperatively and 89.46 ± SE 0.40 (range: 81.95–95.80) at 12 months (not statistically significant; p = 0.94). On the sagittal view, the aDTAA improved from 82.66 ± SE 0.84 (range: 70.35–107.47) preoperatively to 88.98 ± SE 0.47 (range: 82.83–96.32) at 12 months postoperatively, a highly statistically significant difference between preoperative and 12-month values (p < 0.00001). The mean LTS was 3.95 mm ± SE 0.78 (range: −11.52 to 13.89) preoperatively and 1.14 mm ± SE 0.63 (range: −10.76 to 11.75) at 12 months, a statistically significant difference between preoperative and 12-month follow-up (p = 0.01). Review of radiological TAA osteointegration at 12 months showed no cases of loosening of the implanted TAAs. Two cases (4%) showed radiolucency and one case (2%) a cyst on the tibial component; no case showed a change on the talar component. No TAA complications or revision surgeries were documented. Conclusion: In the present study, the lateral talar station of anteriorly subluxated ankles showed a significant improvement, i.e., physiological centralization of the talus, postoperatively when a mobile-bearing TAA was performed. The anterior/posterior congruency between the talar component and the mobile polyethylene insert of the mobile-bearing VANTAGE TAA allows sagittal translation of the talus relative to the flat tibial component, reducing prosthesis strain and the risk of failure. Full article
(This article belongs to the Special Issue Recent Advances in Arthroplasty)

Open Access Article
Alcohol Intake and Alcohol–SNP Interactions Associated with Prostate Cancer Aggressiveness
J. Clin. Med. 2021, 10(3), 553; https://doi.org/10.3390/jcm10030553 - 02 Feb 2021
Abstract
Excessive alcohol intake is a well-known modifiable risk factor for many cancers. It is still unclear whether genetic variants or single nucleotide polymorphisms (SNPs) can modify the impact of alcohol intake on prostate cancer (PCa) aggressiveness. The objective was to test the alcohol–SNP interactions of 7501 SNPs in four pathways (angiogenesis, mitochondria, miRNA, and androgen metabolism-related pathways) associated with PCa aggressiveness. We evaluated the impacts of three excessive alcohol intake behaviors in 3306 PCa patients of European ancestry from the PCa Consortium. We tested the alcohol–SNP interactions using logistic models with a discovery-validation study design. None of the three excessive alcohol intake behaviors was significantly associated with PCa aggressiveness on its own. However, the interactions of excessive alcohol intake and three SNPs (rs13107662 [CAMK2D, p = 6.2 × 10−6], rs9907521 [PRKCA, p = 7.1 × 10−5], and rs11925452 [ROBO1, p = 8.2 × 10−4]) were significantly associated with PCa aggressiveness. These alcohol–SNP interactions revealed contrasting effects of excessive alcohol intake on PCa aggressiveness according to the genotypes of the identified SNPs. PCa patients with the rs13107662 (CAMK2D) AA genotype, the rs11925452 (ROBO1) AA genotype, or the rs9907521 (PRKCA) AG genotype were more vulnerable to excessive alcohol intake for developing aggressive PCa. Our findings support that the impact of excessive alcohol intake on PCa aggressiveness varies with the underlying genetic profile. Full article
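The kind of interaction test described in this abstract can be sketched in a few lines. This is an illustrative toy example, not the authors' pipeline: the data are simulated, the additive genotype coding (0/1/2 minor-allele copies), the binary alcohol indicator, and the effect sizes are all assumptions, and the model is an unadjusted logistic regression with a single product term.

```python
import numpy as np

def fit_logistic(X, y, iters=25):
    """Maximum-likelihood logistic regression via Newton-Raphson."""
    beta = np.zeros(X.shape[1])
    for _ in range(iters):
        p = 1.0 / (1.0 + np.exp(-X @ beta))
        grad = X.T @ (y - p)                       # score vector
        hess = (X * (p * (1 - p))[:, None]).T @ X  # observed information
        beta += np.linalg.solve(hess, grad)
    return beta

# Simulated cohort: a binary excessive-alcohol indicator and an
# additively coded SNP genotype (0/1/2 minor-allele copies).
rng = np.random.default_rng(0)
n = 3306  # cohort size taken from the abstract
alcohol = rng.integers(0, 2, n).astype(float)
snp = rng.integers(0, 3, n).astype(float)

# True model includes an alcohol-by-SNP interaction (log-OR 0.8, assumed).
logit = -1.0 + 0.1 * alcohol + 0.1 * snp + 0.8 * alcohol * snp
y = (rng.random(n) < 1.0 / (1.0 + np.exp(-logit))).astype(float)

# Design matrix: intercept, both main effects, and the product term.
X = np.column_stack([np.ones(n), alcohol, snp, alcohol * snp])
beta_hat = fit_logistic(X, y)
print(f"estimated interaction log-OR: {beta_hat[3]:.2f}")
```

The fitted coefficient on the product term recovers the simulated interaction; in a real analysis it would be tested (e.g. by a Wald or likelihood-ratio test) in the discovery set and re-tested in the validation set.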

Open Access Article
Development of a Single-Step Antibody–Drug Conjugate Purification Process with Membrane Chromatography
J. Clin. Med. 2021, 10(3), 552; https://doi.org/10.3390/jcm10030552 - 02 Feb 2021
Abstract
Membrane chromatography is routinely used to remove host cell proteins, viral particles, and aggregates during antibody downstream processing. In the field of antibody–drug conjugates (ADCs), however, it has been applied only in a limited capacity and in specialized scenarios. Here, we utilized the characteristics of the membrane adsorbers Sartobind® S and Phenyl for aggregate and payload clearance while polishing the ADC in a single chromatographic run. The Sartobind® S membrane was used to remove excess payload, while the Sartobind® Phenyl was used to polish the ADC by clearing unwanted drug-to-antibody ratio (DAR) species and aggregates. The Sartobind® S membrane reproducibly achieved log-fold clearance of free payload with a 10-membrane-volume wash. Application of the Sartobind® Phenyl decreased aggregates and higher-DAR species while increasing DAR homogeneity. The Sartobind® S and Phenyl membranes were then placed in tandem to consolidate the process into a single chromatographic run. With optimized binding, washing, and elution conditions, the tandem membrane approach was performed on a shorter timescale with minimal solvent consumption and high yield. This tandem membrane chromatography system presents a novel and efficient purification scheme that can be realized during ADC manufacturing. Full article

Open Access Review
Tracking Down the Epigenetic Footprint of HCV-Induced Hepatocarcinogenesis
J. Clin. Med. 2021, 10(3), 551; https://doi.org/10.3390/jcm10030551 - 02 Feb 2021
Abstract
Hepatitis C virus (HCV) is a major cause of death and morbidity globally and a leading cause of hepatocellular carcinoma (HCC). The incidence of HCV infections, as well as of HCV-related liver diseases, is increasing. Although HCV is now a curable cancer-associated infectious agent thanks to direct-acting antiviral (DAA) therapy, HCC prevalence is expected to continue to rise because HCC risk persists after HCV cure. Understanding the factors that lead from HCV infection to HCC, pre- and post-cure, may open up opportunities for novel strategies for HCC prevention. Herein, we provide an overview of the reported evidence for the induction of alterations in the transcriptome of host cells via epigenetic dysregulation by HCV infection and describe recent reports linking the residual risk for HCC post-cure with a persistent HCV-induced epigenetic signature. Specifically, we discuss the contribution of the epigenetic changes identified following HCV infection to HCC risk pre- and post-cure, the molecular pathways that are epigenetically altered, the downstream effects on the expression of cancer-related genes, the identification of targets to prevent or revert this cancer-inducing epigenetic signature, and the potential contribution of these studies to early prognosis and prevention of HCC as an approach to reducing HCC-related mortality. Full article
(This article belongs to the Special Issue Viral Hepatitis and Risk of Hepatocellular Carcinoma)

Open Access Article
Surgical Performance Is Not Negatively Impacted by Wearing a Commercial Full-Face Mask with Ad Hoc 3D-Printed Filter Connection as a Substitute for Personal Protective Equipment during the COVID-19 Pandemic: A Randomized Controlled Cross-Over Trial
J. Clin. Med. 2021, 10(3), 550; https://doi.org/10.3390/jcm10030550 - 02 Feb 2021
Abstract
(1) Background: During the COVID-19 pandemic, shortages in the supply of personal protective equipment (PPE) have become apparent. The idea of using commonly available full-face diving (FFD) masks as a temporary solution spread quickly across social media. However, it was unknown whether an FFD mask would considerably impair complex surgical tasks. Thus, we aimed to assess laparoscopic surgical performance while wearing an FFD mask as PPE. (2) Methods: In a randomized controlled cross-over trial, 40 laparoscopically naive medical students performed laparoscopic procedures while wearing an FFD mask with ad hoc 3D-printed connections to heat and moisture exchange (HME) filters vs. wearing a common surgical face mask. Performance was evaluated using global and specific Objective Structured Assessment of Technical Skills (OSATS) checklists for suturing and cholecystectomy. (3) Results: For the laparoscopic cholecystectomy, both global OSATS scores and specific OSATS scores for the quality of the procedure were similar (Group 1: 25 ± 4.3 and 45.7 ± 12.9 vs. Group 2: 24.1 ± 3.7 and 43.3 ± 7.6; p = 0.485). For the laparoscopic suturing task, the FFD mask group required times similar to those of the surgical mask group (3009 ± 1694 s vs. 2443 ± 949 s; p = 0.200). Some participants reported impaired verbal communication while wearing the FFD mask, as it muffled the sound of speech, as well as discomfort in breathing. (4) Conclusions: FFD masks do not affect the quality of laparoscopic surgical performance, despite being uncomfortable, and may therefore be used as a substitute for conventional PPE in times of shortage, i.e., during the global COVID-19 pandemic. Full article
(This article belongs to the Special Issue Recent Advances in Minimally Invasive Surgery)

Open Access Article
Opportunity Costs or Not? Validating the Short Gambling Harm Screen against a Set of “Unimpeachable” Negative Impacts
J. Clin. Med. 2021, 10(3), 549; https://doi.org/10.3390/jcm10030549 - 02 Feb 2021
Abstract
Assessing the harmful consequences of gambling is an area of active investigation. One measure intended to capture gambling-related harm is the 10-item Short Gambling Harm Screen (SGHS). Although good psychometric properties have been reported, it has been suggested that the screen’s less severe probes may not represent genuinely harmful consequences but rather may reflect rational opportunity costs. Consequently, it has been argued that the screen may lead to overestimation of the extent of gambling-related harm in the population. The current study sought to examine the psychometric performance of three less severe suspect items in the SGHS. Associations between each of these items and a specially constructed scale of relatively severe “unimpeachable” gambling harms were calculated from archival data from 5551 Australian and New Zealand gamblers. All three suspect items, both individually and upon aggregation, predicted greater endorsement of “unimpeachable” harms and indicated the presence of gambling problems. Moreover, the SGHS as a whole was highly correlated with “unimpeachable” gambling harms. Including the suspect items in the SGHS was found to improve predictions of low- and moderate-risk gambling status but slightly decreased predictions of severe gambling problems. The results are inconsistent with the notion that SGHS harm probes capture either inconsequential consequences or opportunity costs. They confirm prior findings that harm symptomatology is unidimensional and that the report of multiple more prevalent, but less severe, harms serves as an effective indicator of the spectrum of experienced harm. Full article
(This article belongs to the Special Issue Gambling, Gaming and Other Behavioural Addictions)

Open Access Article
Unique Metabolomic Profile of Skeletal Muscle in Chronic Limb Threatening Ischemia
J. Clin. Med. 2021, 10(3), 548; https://doi.org/10.3390/jcm10030548 - 02 Feb 2021
Abstract
Chronic limb threatening ischemia (CLTI) is the most severe manifestation of peripheral atherosclerosis. Patients with CLTI have poor muscle quality and function and are at high risk for limb amputation and death. The objective of this study was to interrogate the metabolome of limb muscle from CLTI patients. To accomplish this, a prospective cohort of CLTI patients undergoing either a surgical intervention (CLTI Pre-surgery) or limb amputation (CLTI Amputation), as well as non-peripheral arterial disease (non-PAD) controls, was enrolled. Gastrocnemius muscle biopsy specimens were obtained and processed for nuclear magnetic resonance (NMR)-based metabolomics analyses, using solution-state NMR on extracted aqueous and organic phases and 1H high-resolution magic angle spinning (HR-MAS) on intact muscle specimens. CLTI Amputation specimens displayed classical features of ischemic/hypoxic metabolism, including accumulation of succinate, fumarate, lactate, and alanine and a significant decrease in the pyruvate/lactate ratio. CLTI Amputation muscle also featured aberrant amino acid metabolism marked by elevated branched-chain amino acids. Finally, both Pre-surgery and Amputation CLTI muscles exhibited pronounced accumulation of lipids, including cholesterol, triglycerides, and saturated fatty acids, suggesting the presence of myosteatosis. Taken together, these metabolite differences add to a growing body of literature characterizing profound metabolic disturbances in the failing ischemic limb of CLTI patients. Full article
(This article belongs to the Section Vascular Medicine)

Open Access Article
Low-Intensity Resistance Training with Moderate Blood Flow Restriction Appears Safe and Increases Skeletal Muscle Strength and Size in Cardiovascular Surgery Patients: A Pilot Study
J. Clin. Med. 2021, 10(3), 547; https://doi.org/10.3390/jcm10030547 - 02 Feb 2021
Abstract
We examined the safety and the effects of low-intensity resistance training (RT) with moderate blood flow restriction (KAATSU RT) on muscle strength and size in patients early after cardiac surgery. Cardiac patients (age 69.6 ± 12.6 years, n = 21, M = 18) were randomly assigned to the control (n = 10) or the KAATSU RT group (n = 11). All patients received a standard aerobic cardiac rehabilitation program. The KAATSU RT group additionally executed low-intensity leg extension and leg press exercises with moderate blood flow restriction twice a week for 3 months. RT intensity and volume were increased gradually. We evaluated the anterior mid-thigh thickness (MTH), skeletal muscle mass index (SMI), handgrip strength, knee extensor strength, and walking speed at baseline, 5–7 days after cardiac surgery, and after 3 months. A physician monitored the electrocardiogram, rate of perceived exertion, and the color of the lower limbs during KAATSU RT. Creatine phosphokinase (CPK) and D-dimer were measured at baseline and after 3 months. There were no side effects during KAATSU RT. CPK and D-dimer were normal after 3 months. MTH, SMI, walking speed, and knee extensor strength increased after 3 months of KAATSU RT compared with baseline. Patients with relatively low baseline physical function tended to improve more after 3 months of KAATSU RT than those with higher baseline function. Low-intensity KAATSU RT as an adjuvant to standard cardiac rehabilitation can safely increase skeletal muscle strength and size in cardiovascular surgery patients. Full article
(This article belongs to the Section Cardiology)

Open Access Article
Trends and Clinical Impact of Gastrointestinal Endoscopic Procedures on Acute Heart Failure in Spain (2002–2017)
J. Clin. Med. 2021, 10(3), 546; https://doi.org/10.3390/jcm10030546 - 02 Feb 2021
Abstract
Introduction: Heart failure decompensation can be triggered by many factors, including anemia. In cases of iron deficiency anemia or iron deficiency without anemia, endoscopic studies are recommended to rule out the presence of gastrointestinal neoplasms or other associated bleeding lesions. Objectives: The aims of this study were to (i) examine trends in the incidence, clinical characteristics, and in-hospital outcomes of patients hospitalized with heart failure from 2002 to 2017 who underwent esophagogastroduodenoscopy (EGD) and/or colonoscopy, and (ii) identify factors associated with in-hospital mortality (IHM) among patients with heart failure who underwent an EGD and/or a colonoscopy. Methods: We conducted an observational retrospective epidemiological study using the Spanish National Hospital Discharge Database (SNHDD) between 2002 and 2017. We included hospitalizations of patients with a primary discharge diagnosis of heart failure. Cases were reviewed if there was an ICD-9-CM or ICD-10 procedure code for EGD or colonoscopy in any procedure field. Multivariable logistic regression models were constructed to identify predictors of IHM among HF patients who underwent an EGD or colonoscopy. Results: A total of 51,187 (1.32%) non-surgical patients hospitalized with heart failure underwent an EGD and another 72,076 (1.85%) patients had a colonoscopy during their admission. IHM was significantly higher in those who underwent an EGD than in those who underwent a red blood cell transfusion (OR 1.10; 95%CI 1.04–1.12). However, the use of colonoscopy seems to decrease the probability of IHM (OR 0.45; 95%CI 0.41–0.49). In patients who underwent a colonoscopy, older age seems to increase the probability of IHM. However, EGD was associated with a lower mortality (OR 0.60; 95% CI 0.55–0.64). Conclusion: In our study, a decrease in the number of gastroscopies relative to colonoscopies was observed in patients with heart failure. The significant ageing of the hospitalized HF population seen over the course of the study could have contributed to this. Both procedures seemed to be associated with lower in-hospital mortality, but in the case of colonoscopy, the risk of in-hospital mortality was higher in elderly patients with heart failure and associated neoplasms. Colonoscopy and EGD seemed not to increase IHM in patients with heart failure. Full article
(This article belongs to the Section Gastroenterology & Hepatopancreatobiliary Medicine)

Open Access Editorial
Analgesic Drugs and COVID-19
J. Clin. Med. 2021, 10(3), 545; https://doi.org/10.3390/jcm10030545 - 02 Feb 2021
Abstract
The COVID-19 pandemic represents a major challenge for health care systems [...] Full article
(This article belongs to the Special Issue Analgesic Drugs and COVID-19)
Open Access Article
Multistate Modeling of COVID-19 Patients Using a Large Multicentric Prospective Cohort of Critically Ill Patients
J. Clin. Med. 2021, 10(3), 544; https://doi.org/10.3390/jcm10030544 - 02 Feb 2021
Abstract
The mortality of COVID-19 patients in the intensive care unit (ICU) is influenced by their state at admission. We aimed to model COVID-19 acute respiratory distress syndrome state transitions from ICU admission to day-60 outcome and to evaluate possible prognostic factors. We analyzed a prospective French database that includes critically ill COVID-19 patients. A six-state multistate model was built, and 17 transitions were analyzed using either a non-parametric approach or a Cox proportional hazards model. The effects of corticosteroids and IL-antagonists (tocilizumab and anakinra) were evaluated using G-computation. We included 382 patients in the analysis: 243 patients were admitted to the ICU with non-invasive ventilation, 116 with invasive mechanical ventilation, and 23 with extracorporeal membrane oxygenation. The predicted 60-day mortality was 25.9% (95% CI: 21.8%–30.0%), 44.7% (95% CI: 48.8%–50.6%), and 59.2% (95% CI: 49.4%–69.0%) for a patient admitted in these three states, respectively. Corticosteroids decreased the risk of being invasively ventilated (hazard ratio (HR) 0.59, 95% CI: 0.39–0.90), and IL-antagonists increased the probability of being successfully extubated (HR 1.8, 95% CI: 1.02–3.17). Antiviral drugs did not impact any transition. In conclusion, we observed that the day-60 outcome in COVID-19 patients is highly dependent on the first ventilation state upon ICU admission. Moreover, we illustrated that corticosteroids and IL-antagonists may influence intubation duration. Full article
(This article belongs to the Special Issue Management of Acute Respiratory Failure)
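The G-computation step this abstract mentions can be illustrated with a minimal sketch: fit an outcome model that adjusts for confounders, then predict every patient's risk under "everyone treated" and "no one treated" and average the difference. This toy example is simulated and deliberately simplified (a single confounder and a binary outcome rather than the paper's multistate transitions); all variable names and effect sizes are illustrative assumptions.

```python
import numpy as np

def fit_logistic(X, y, iters=25):
    """Maximum-likelihood logistic regression via Newton-Raphson."""
    beta = np.zeros(X.shape[1])
    for _ in range(iters):
        p = 1.0 / (1.0 + np.exp(-X @ beta))
        beta += np.linalg.solve((X * (p * (1 - p))[:, None]).T @ X,
                                X.T @ (y - p))
    return beta

rng = np.random.default_rng(1)
n = 5000
severity = rng.normal(size=n)  # confounder: baseline illness severity
# Confounding by indication: sicker patients are treated more often.
treated = (rng.random(n) < 1.0 / (1.0 + np.exp(-severity))).astype(float)
# Outcome: severity raises risk (log-OR +1.0); treatment lowers it (-0.7).
logit = -0.5 + 1.0 * severity - 0.7 * treated
died = (rng.random(n) < 1.0 / (1.0 + np.exp(-logit))).astype(float)

# Step 1: fit an outcome model that adjusts for the confounder.
X = np.column_stack([np.ones(n), treated, severity])
beta = fit_logistic(X, died)

# Step 2: predict risk for the whole cohort under both counterfactuals.
X1 = X.copy(); X1[:, 1] = 1.0   # everyone treated
X0 = X.copy(); X0[:, 1] = 0.0   # no one treated
risk = lambda M: (1.0 / (1.0 + np.exp(-M @ beta))).mean()

# Step 3: marginal (standardized) risk difference; negative = protective.
rd = risk(X1) - risk(X0)
print(f"marginal risk difference: {rd:+.3f}")
```

Averaging predictions over the full cohort, rather than reading off the conditional odds ratio, is what makes the estimate marginal; in the paper the same idea is applied to transition hazards rather than a single binary outcome.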

Open Access Review
Etiology and Diagnosis of Permanent Hypoparathyroidism after Total Thyroidectomy
J. Clin. Med. 2021, 10(3), 543; https://doi.org/10.3390/jcm10030543 - 02 Feb 2021
Abstract
Postoperative parathyroid failure is the commonest adverse effect of total thyroidectomy, a widely used surgical procedure to treat both benign and malignant thyroid disorders. The present review focuses on the scientific gap and lack of data regarding the period between the immediate postoperative phase, when hypocalcemia is usually detected by the surgeon, and permanent hypoparathyroidism, often seen by an endocrinologist months or years later. Parathyroid failure after thyroidectomy results from a combination of trauma, devascularization, inadvertent resection, and/or autotransplantation, all resulting in an early drop of iPTH (intact parathyroid hormone) requiring replacement therapy with calcium and calcitriol. There is very little or no role for other factors such as vitamin D deficiency, calcitonin, or magnesium. Recovery of parathyroid function is a dynamic process evolving over months and cannot be predicted on the basis of early serum calcium and iPTH measurements; it depends on the number of parathyroid glands remaining in situ (PGRIS), i.e., neither autotransplanted nor inadvertently excised, and on early administration of full-dose replacement therapy to avoid hypocalcemia during the first days and weeks after thyroidectomy. Full article

Open Access Article
High Agreement between Barrett Universal II Calculations with and without Utilization of Optional Biometry Parameters
J. Clin. Med. 2021, 10(3), 542; https://doi.org/10.3390/jcm10030542 - 02 Feb 2021
Abstract
Purpose: To examine the contribution of anterior chamber depth (ACD), lens thickness (LT), and white-to-white (WTW) measurements to intraocular lens (IOL) power calculations using the Barrett Universal II (BUII) formula. Methods: Measurements taken with the IOLMaster 700 (Carl Zeiss Meditec AG, Jena, Germany) swept-source biometer for 501 right eyes of 501 consecutive patients undergoing cataract extraction surgery between January 2019 and March 2020 were reviewed. IOL power was calculated using the BUII formula, first through the inclusion of all measured variables and then using partial biometry data. For each calculation method, the IOL power targeting emmetropia was recorded and compared for the whole cohort and stratified by axial length (AL) of the measured eye. Results: The mean IOL power calculated for the entire cohort using all available parameters was 19.50 ± 5.11 diopters (D). When comparing this to the results obtained with partial biometry data, the mean absolute difference ranged from 0.05 to 0.14 D (p < 0.001). The optional variables (ACD, LT, WTW) had the least effect in long eyes (AL ≥ 26 mm; mean absolute difference ranging from 0.02 to 0.07 D; p < 0.001) and the greatest effect in short eyes (AL ≤ 22 mm; mean absolute difference from 0.10 to 0.21 D; p < 0.001). The percentage of eyes with a mean absolute IOL dioptric power difference of more than 0.25 D was highest (32.0%) in the short-AL group when using AL and keratometry values only. Conclusions: With partial biometry data, the BUII formula in small eyes (AL ≤ 22 mm) resulted in a clinically significant difference in the calculated IOL power compared to full biometry data. In contrast, the contribution of the optional parameters to the calculated IOL power was of little clinical importance in eyes with an AL longer than 22 mm. Full article
(This article belongs to the Section Ophthalmology)

Open Access Review
Gut Microbiota at the Intersection of Alcohol, Brain, and the Liver
J. Clin. Med. 2021, 10(3), 541; https://doi.org/10.3390/jcm10030541 - 02 Feb 2021
Abstract
Over the last decade, increased research into the gut–liver–brain axis in medicine has yielded powerful evidence suggesting a strong association between alcoholic liver diseases (ALD) and the brain, including hepatic encephalopathy and other similar brain disorders. In the gut–brain axis, chronic, alcohol-drinking-induced, low-grade systemic inflammation is suggested to be the main pathophysiology of cognitive dysfunctions in patients with ALD. However, the role of the gut microbiota and its metabolites has remained unclear. Eubiosis of the gut microbiome is crucial, as dysbiosis between autochthonous bacteria and pathobionts leads to intestinal insult, liver injury, and neuroinflammation. Restoring eubiosis using modulating factors such as alcohol abstinence, promoting commensal bacterial abundance, maintaining short-chain fatty acids in the gut, or vagus nerve stimulation could be beneficial in alleviating disease progression. In this review, we summarize the pathogenic mechanisms linked with the gut–liver–brain axis in the development and progression of brain disorders associated with ALD in both experimental models and humans. Further, we discuss the therapeutic potential and future research directions as they relate to the gut–liver–brain axis. Full article
(This article belongs to the Special Issue Alcohol, Brain, and the Liver: A Troubled Relationship)
Open Access Article
Mortality and Medical Complications of Subtrochanteric Fracture Fixation
J. Clin. Med. 2021, 10(3), 540; https://doi.org/10.3390/jcm10030540 - 02 Feb 2021
Abstract
The aim of this study was to define the incidence of, and investigate the associations with, mortality and medical complications in patients presenting with subtrochanteric femoral fractures subsequently treated with an intramedullary nail, with special reference to advancing age. Materials and Methods: A retrospective review, covering an 8-year period, of all patients admitted to a Level 1 Trauma Centre with the diagnosis of a subtrochanteric fracture was conducted. Normality was assessed for the data variables to determine the further use of parametric or non-parametric tests. Logistic regression analysis was then performed to identify the most important associations for each event. A p-value < 0.05 was considered significant. Results: A total of 519 patients were included in our study (age at time of injury: 73.26 ± 19.47 years; 318 female). The average length of hospital stay was 21.4 ± 19.45 days. Mortality was 5.4% and 17.3% at 30 days and one year, respectively. Risk factors for one-year mortality included: low albumin on admission (odds ratio (OR) 4.82; 95% confidence interval (95%CI) 2.08–11.19), dementia (OR 3.99; 95%CI 2.27–7.01), presence of pneumonia during the hospital stay (OR 3.18; 95%CI 1.76–5.77), and a Charlson comorbidity score (CCS) > 6 (OR 2.94; 95%CI 1.62–5.35). Regarding the medical complications following the operative management of subtrochanteric fractures, the overall incidence of hospital-acquired pneumonia (HAP) was 18.3%. Patients with an increasing CCS (CCS 6–8: OR 1.69; 95%CI 1.00–2.84/CCS > 8: OR 2.02; 95%CI 1.03–3.95), presence of asthma/chronic obstructive pulmonary disease (COPD) (OR 2.29; 95%CI 1.37–3.82), an intensive care unit (ICU)/high dependency unit (HDU) stay (OR 3.25; 95%CI 1.77–5.96), and a length of stay of more than 21 days (OR 8.82; 95%CI 1.18–65.80) were at increased risk of this outcome. The incidence of post-operative delirium was 10.2%. This was associated with pre-existing dementia (OR 4.03; 95%CI 0.34–4.16), urinary tract infection (UTI) (OR 3.85; 95%CI 1.96–7.56), need for an increased level of care (OR 3.16; 95%CI 1.38–7.25), pneumonia (OR 2.29; 95%CI 1.14–4.62), and post-operative deterioration of renal function (OR 2.21; 95%CI 1.18–4.15). The incidence of venous thromboembolism (VTE) was 3.7% (pulmonary embolism (PE): 8 patients; deep venous thrombosis (DVT): 11 patients), whilst the incidence of myocardial infarction (MI)/cerebrovascular accident (CVA) was 4.0%. No evidence of the so-called “weekend effect” was identified for either morbidity or mortality. Regression analysis of these complications did not reveal any significant associations. Conclusions: Our study opens the field for the investigation of medical complications within the subtrochanteric fracture population. Early identification of the associations of these complications could aid prognostication for those at risk of a poor outcome. Furthermore, these could serve as “warning shots” prompting clinicians to act early to manage, and in some cases prevent, these devastating complications that could otherwise lead to an increased risk of mortality. Full article
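As a point of reference for the odds ratios reported above: an OR and its 95% confidence interval follow from a fitted logistic-regression coefficient β and its standard error as OR = e^β and CI = e^(β ± 1.96·SE). A minimal sketch with a hypothetical coefficient (not a value from the study):

```python
import math

def odds_ratio_ci(beta, se, z=1.959964):
    """Odds ratio and 95% CI bounds from a logistic-regression
    coefficient: OR = exp(beta), bounds = exp(beta +/- z * SE),
    with z ~ 1.96 for a 95% confidence level."""
    return (math.exp(beta),
            math.exp(beta - z * se),
            math.exp(beta + z * se))

# Hypothetical coefficient and standard error for illustration only
or_, lo, hi = odds_ratio_ci(1.383, 0.307)
```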
(This article belongs to the Special Issue Severely Injured Patient in Older Age)
Open Access Article
Bioelectrical Impedance Analysis in Patients Undergoing Major Head and Neck Surgery: A Prospective Observational Pilot Study
J. Clin. Med. 2021, 10(3), 539; https://doi.org/10.3390/jcm10030539 - 02 Feb 2021
Abstract
Background: Head and neck patients are prone to malnutrition. Perioperative fluid administration in this patient group may influence nutritional status. We aimed to investigate perioperative changes in patients undergoing major head and neck surgery and to examine the impact of perioperative fluid administration on body composition and metabolic changes using bioelectrical impedance. Furthermore, we sought to correlate these metabolic changes with the postoperative complication rate. In this prospective observational pilot study, bioelectrical impedance analysis (BIA) was performed preoperatively and on postoperative days (POD) 2 and 10 in patients who underwent major head and neck surgery. BIA was completed in 34/37 patients; mean total intraoperative and post-anesthesia fluid administration was 3682 ± 1910 mL and 1802 ± 1466 mL, respectively. Total perioperative fluid administration was associated with high postoperative extracellular water percentages (p = 0.038) and a low phase-angle score (p < 0.005), which indicates low nutritional status. Patients with a phase angle below the 5th percentile at POD 2 had higher local complication rates (p = 0.035) and a longer hospital length of stay (LOS) (p = 0.029). Multivariate analysis failed to demonstrate that high-volume fluid administration and phase angle are independent factors for postoperative complications. High-volume perioperative fluid administration impacts postoperative nutritional status, with a fluid shift toward the extracellular space, and is associated with factors that increase the risk of postoperative complications and a longer LOS. An adjusted, low-volume perioperative fluid regimen should be considered in patients with comorbidities in order to minimize postoperative morbidity. Full article
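The phase angle reported by BIA devices is derived from the measured resistance (R) and reactance (Xc) as arctan(Xc/R), expressed in degrees. A minimal sketch; the values in the example are illustrative, not from the study:

```python
import math

def phase_angle_deg(resistance_ohm, reactance_ohm):
    """BIA phase angle in degrees: atan(Xc / R) * 180 / pi.
    Lower values are generally interpreted as poorer cell-membrane
    integrity / nutritional status."""
    return math.degrees(math.atan(reactance_ohm / resistance_ohm))

# Illustrative 50 kHz measurement: R = 500 ohm, Xc = 50 ohm
example_angle = phase_angle_deg(500.0, 50.0)
```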
(This article belongs to the Section Otolaryngology)
Open Access Article
An MRI-Based Radiomic Prognostic Index Predicts Poor Outcome and Specific Genetic Alterations in Endometrial Cancer
J. Clin. Med. 2021, 10(3), 538; https://doi.org/10.3390/jcm10030538 - 02 Feb 2021
Abstract
Integrative tumor characterization linking radiomic profiles to corresponding gene expression profiles has the potential to identify specific genetic alterations based on non-invasive radiomic profiling in cancer. The aim of this study was to develop and validate a radiomic prognostic index (RPI) based on preoperative magnetic resonance imaging (MRI) and assess possible associations between the RPI and gene expression profiles in endometrial cancer patients. Tumor texture features were extracted from preoperative 2D MRI in 177 endometrial cancer patients. The RPI was developed using least absolute shrinkage and selection operator (LASSO) Cox regression in a study cohort (n = 95) and validated in an MRI validation cohort (n = 82). Transcriptional alterations associated with the RPI were investigated in the study cohort. Potential prognostic markers were further explored for validation in an mRNA validation cohort (n = 161). The RPI included four tumor texture features, and a high RPI was significantly associated with poor disease-specific survival in both the study cohort (p < 0.001) and the MRI validation cohort (p = 0.030). The association between RPI and gene expression profiles revealed 46 significantly differentially expressed genes in patients with a high RPI versus a low RPI (p < 0.001). The most differentially expressed genes, COMP and DMBT1, were significantly associated with disease-specific survival in both the study cohort and the mRNA validation cohort. In conclusion, a high RPI score predicts poor outcome and is associated with specific gene expression profiles in endometrial cancer patients. The promising link between radiomic tumor profiles and molecular alterations may aid in developing refined prognostication and targeted treatment strategies in endometrial cancer. Full article
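The RPI described above is, in essence, a weighted combination of the selected texture features (with weights estimated by LASSO Cox regression), dichotomized into high- and low-risk groups for survival comparison. A minimal sketch of the scoring and grouping steps, with hypothetical feature values and weights; the study's actual features and coefficients are not reproduced here:

```python
def radiomic_index(features, weights):
    """Radiomic prognostic index as a linear combination of texture
    features. `features` and `weights` are equal-length sequences;
    in practice the weights would come from a LASSO Cox fit
    (hypothetical values are used here)."""
    return sum(f * w for f, w in zip(features, weights))

def dichotomize(scores):
    """Split patients into 'low' and 'high' RPI groups at the
    median score."""
    s = sorted(scores)
    n = len(s)
    median = s[n // 2] if n % 2 else (s[n // 2 - 1] + s[n // 2]) / 2
    return ["high" if sc > median else "low" for sc in scores]
```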
(This article belongs to the Special Issue Clinical Advances on Endometrial Cancer)
Open Access Article
Vitamin D Concentrations at Term Do Not Differ in Newborns and Their Mothers with and without Polycystic Ovary Syndrome
J. Clin. Med. 2021, 10(3), 537; https://doi.org/10.3390/jcm10030537 - 02 Feb 2021
Abstract
Studies suggest that non-pregnant women with polycystic ovary syndrome (PCOS) may be at elevated risk of 25-hydroxyvitamin D (25(OH)D) deficiency. Furthermore, there is evidence suggesting that 25(OH)D may also play an important role during pregnancy. Data regarding 25(OH)D deficiency during pregnancy in PCOS patients and its association with perinatal outcome are scarce. The aim of the study was to investigate whether mothers with and without PCOS have different 25(OH)D levels at term, how maternal 25(OH)D levels are reflected in their offspring, and whether 25(OH)D levels are associated with an adverse perinatal outcome. We therefore performed a cross-sectional observational study and included 79 women with PCOS according to the ESHRE/ASRM 2003 definition and 354 women without PCOS, all with an ongoing pregnancy ≥ 37 + 0 weeks of gestation, who gave birth in our institution between March 2013 and December 2015. Maternal serum and cord blood 25(OH)D levels were analyzed on the day of delivery. Maternal 25(OH)D levels did not differ significantly between women with and without PCOS (p = 0.998), nor did the 25(OH)D levels of their respective offspring (p = 0.692). 25(OH)D deficiency (<20 ng/mL) was found in 26.9% and 22.5% of women with and without PCOS, respectively (p = 0.430). There was a strong positive correlation between maternal and neonatal 25(OH)D levels in both investigated groups (r ≥ 0.79, p < 0.001). Linear regression estimated cord blood 25(OH)D levels at about 77% of the maternal serum 25(OH)D concentration. Compared to healthy controls, the risk of maternal complications was increased in PCOS women (48% vs. 65%; p = 0.009), while there was no significant difference in neonatal complications (22% vs. 22%; p = 1.0). However, 25(OH)D levels were similar between mothers and infants with and without perinatal complications. Although the share of women and infants with 25(OH)D deficiency was high both in women with and without PCOS, the incidence of adverse perinatal outcome did not appear to be affected. The long-term consequences for mothers and infants with a 25(OH)D deficiency have to be investigated in future studies. Full article
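The ~77% relationship reported above corresponds to the slope of an ordinary least-squares regression of cord blood levels on maternal levels. A minimal sketch with hypothetical paired measurements, constructed so that cord levels are exactly 77% of maternal levels:

```python
def ols_slope_intercept(x, y):
    """Ordinary least-squares fit y ~ a + b*x; returns (a, b)."""
    n = len(x)
    mx = sum(x) / n
    my = sum(y) / n
    sxy = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
    sxx = sum((xi - mx) ** 2 for xi in x)
    b = sxy / sxx          # slope
    a = my - b * mx        # intercept
    return a, b

# Hypothetical maternal (x) and cord blood (y) 25(OH)D levels in ng/mL
maternal = [10.0, 15.0, 20.0, 25.0, 30.0]
cord = [7.7, 11.55, 15.4, 19.25, 23.1]
intercept, slope = ols_slope_intercept(maternal, cord)
```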
(This article belongs to the Special Issue Polycystic Ovary Syndrome in Gynecologic Endocrinology)
Open Access Review
Thrombotic Thrombocytopenic Purpura: Pathophysiology, Diagnosis, and Management
J. Clin. Med. 2021, 10(3), 536; https://doi.org/10.3390/jcm10030536 - 02 Feb 2021
Abstract
Thrombotic thrombocytopenic purpura (TTP) is a rare thrombotic microangiopathy characterized by microangiopathic hemolytic anemia, severe thrombocytopenia, and ischemic end organ injury due to microvascular platelet-rich thrombi. TTP results from a severe deficiency of the specific von Willebrand factor (VWF)-cleaving protease, ADAMTS13 (a disintegrin and metalloprotease with thrombospondin type 1 repeats, member 13). ADAMTS13 deficiency is most commonly acquired due to anti-ADAMTS13 autoantibodies. It can also be inherited in the congenital form as a result of biallelic mutations in the ADAMTS13 gene. In adults, the condition is most often immune-mediated (iTTP) whereas congenital TTP (cTTP) is often detected in childhood or during pregnancy. iTTP occurs more often in women and is potentially lethal without prompt recognition and treatment. Front-line therapy includes daily plasma exchange with fresh frozen plasma replacement and immunosuppression with corticosteroids. Immunosuppression targeting ADAMTS13 autoantibodies with the humanized anti-CD20 monoclonal antibody rituximab is frequently added to the initial therapy. If available, anti-VWF therapy with caplacizumab is also added to the front-line setting. While it is hypothesized that refractory TTP will be less common in the era of caplacizumab, in relapsed or refractory cases cyclosporine A, N-acetylcysteine, bortezomib, cyclophosphamide, vincristine, or splenectomy can be considered. Novel agents, such as recombinant ADAMTS13, are also currently under investigation and show promise for the treatment of TTP. Long-term follow-up after the acute episode is critical to monitor for relapse and to diagnose and manage chronic sequelae of this disease. Full article
(This article belongs to the Special Issue The Latest Clinical Advances in Thrombocytopenia)
Open Access Article
Reliability and Reproducibility of Landmark Identification in Unilateral Cleft Lip and Palate Patients: Digital Lateral Vis-A-Vis CBCT-Derived 3D Cephalograms
J. Clin. Med. 2021, 10(3), 535; https://doi.org/10.3390/jcm10030535 - 02 Feb 2021
Abstract
Background: The aim of this retrospective observational study was to compare the precision of landmark identification and its reproducibility between cone beam computed tomography-derived 3D cephalograms and digital lateral cephalograms in unilateral cleft lip and palate patients. Methods: Cephalograms of thirty-one (31) North Indian children (18 boys and 13 girls) with a unilateral cleft lip and palate, who were recommended for orthodontic treatment, were selected. After a thorough analysis of peer-reviewed articles, 20 difficult-to-trace landmarks were selected, and their reliability and reproducibility were studied. These were subjected to landmark identification to evaluate interobserver variability; the coordinates for each point were traced separately by three different orthodontists (OBA, OBB, OBC). Statistical analysis was performed using descriptive and inferential statistics, with paired t-tests to compare the differences measured by the two methods. Real-scale data are presented as mean ± SD. A p-value less than 0.05 was considered significant at a 95% confidence level. Results: On comparison, the differences in plotting the points posterior nasal spine (PNS) (p < 0.05), anterior nasal spine (ANS) (p < 0.01), upper 1 root tip (p < 0.05), lower 1 root tip (p < 0.05), malare (p < 0.05), pyriforme (p < 0.05), porion (p < 0.01), and basion (p < 0.05) were statistically significant. Conclusion: In patients with a cleft lip and palate, interobserver identification of cephalometric landmarks was significantly more precise and reproducible with cone beam computed tomography-derived cephalograms than with digital lateral cephalograms. Full article
(This article belongs to the Special Issue New Approaches and Technologies in Orthodontics)
Open Access Review
ECMO in Cardiac Arrest: A Narrative Review of the Literature
J. Clin. Med. 2021, 10(3), 534; https://doi.org/10.3390/jcm10030534 - 02 Feb 2021
Abstract
Cardiac arrest (CA) is a frequent cause of death and a major public health issue. To date, conventional cardiopulmonary resuscitation (CPR) is the only efficient method of resuscitation available that positively impacts prognosis. Extracorporeal membrane oxygenation (ECMO) is a complex and costly technique that requires technical expertise. It is not considered standard of care in all hospitals and should be applied only in high-volume facilities. ECMO combined with CPR is known as ECPR (extracorporeal cardiopulmonary resuscitation) and permits hemodynamic and respiratory stabilization of patients with CA refractory to conventional CPR. This technique allows the parallel treatment of the underlying etiology of CA while maintaining organ perfusion. However, current evidence does not support the routine use of ECPR in all patients with refractory CA. Therefore, appropriate selection of the patients who may benefit from this procedure is key. Reducing the duration of low blood flow by performing high-quality CPR and promoting access to ECPR may improve the survival rate of patients presenting with refractory CA. Indeed, patients who benefit from ECPR seem to have better neurological outcomes. The aim of the present narrative review is to present the most recent literature available on ECPR and to clarify its potential therapeutic role, as well as to provide an in-depth explanation of the equipment and its setup, the patient selection process, and patient management post-ECPR. Full article
Open Access Review
Learning the Ropes of Platelet Count Regulation: Inherited Thrombocytopenias
J. Clin. Med. 2021, 10(3), 533; https://doi.org/10.3390/jcm10030533 - 02 Feb 2021
Abstract
Inherited thrombocytopenias (IT) are a group of hereditary disorders characterized by a reduced platelet count, sometimes associated with abnormal platelet function, which can lead to bleeding but also to syndromic manifestations and predispositions to other disorders. Currently, at least 41 disorders caused by mutations in 42 different genes have been described. The pathogenic mechanisms of many forms of IT have been identified, as have the gene variants implicated in megakaryocyte maturation or platelet formation and clearance, while for several forms the pathogenic mechanism is still unknown. A range of therapeutic approaches is now available to improve the survival and quality of life of patients with IT; it is thus important to recognize an IT and establish a precise diagnosis. ITs may be difficult to diagnose, and an initial accurate clinical evaluation is mandatory. A combination of clinical and traditional laboratory approaches together with advanced sequencing techniques provides the highest rate of diagnostic success. Despite advancements in the diagnosis of IT, around 50% of patients still do not receive a diagnosis; therefore, further research in the field of ITs is warranted to further improve patient care. Full article
(This article belongs to the Special Issue The Latest Clinical Advances in Thrombocytopenia)
Open Access Review
Making Sense of Intracellular Nucleic Acid Sensing in Type I Interferon Activation in Sjögren’s Syndrome
J. Clin. Med. 2021, 10(3), 532; https://doi.org/10.3390/jcm10030532 - 02 Feb 2021
Abstract
Primary Sjögren’s syndrome (pSS) is a systemic autoimmune rheumatic disease characterized by dryness of the eyes and mucous membranes, which can be accompanied by various extraglandular autoimmune manifestations. The majority of patients exhibit persistent systemic activation of the type I interferon (IFN) system, a feature that is shared with other systemic autoimmune diseases. Type I IFNs are integral to anti-viral immunity and are produced in response to stimulation of pattern recognition receptors, among which are nucleic acid (NA) receptors. Dysregulated detection of endogenous NAs has been widely implicated in the pathogenesis of systemic autoimmune diseases. Stimulation of endosomal Toll-like receptors by NA-containing immune complexes is considered to contribute to the systemic type I IFN activation. Accumulating evidence suggests additional roles for cytosolic NA-sensing pathways in the pathogenesis of systemic autoimmune rheumatic diseases. In this review, we provide an overview of the functions and signaling of intracellular RNA- and DNA-sensing receptors and summarize the evidence for a potential role of these receptors in the pathogenesis of pSS and the sustained systemic type I IFN activation. Full article
Open Access Correction
Correction: Feola, M., et al. Six-Month Predictive Value of Diuretic Resistance Formulas in Discharged Heart Failure Patients after an Acute Decompensation. J. Clin. Med. 2020, 9, 2932
J. Clin. Med. 2021, 10(3), 531; https://doi.org/10.3390/jcm10030531 - 02 Feb 2021
Abstract
There was an error in the original article [...] Full article
Open Access Review
Testosterone and Bone Health in Men: A Narrative Review
J. Clin. Med. 2021, 10(3), 530; https://doi.org/10.3390/jcm10030530 - 02 Feb 2021
Abstract
Bone fracture due to osteoporosis is an important issue that decreases quality of life for elderly men in the current aging society, making the prevention of osteoporosis and bone fracture a clinical concern for many clinicians. Testosterone has an important role in maintaining bone mineral density (BMD) in men, and several molecular mechanisms by which testosterone acts on bone metabolism have been established from experimental data. Concurrent with the decrease in testosterone with age, various clinical symptoms and signs associated with testosterone decline, including decreased BMD, are known to occur in elderly men. However, findings on the relationship between testosterone levels and the development of osteoporosis have been conflicting in human epidemiological studies. Testosterone replacement therapy (TRT) is a useful tool for managing clinical symptoms caused by hypogonadism. Many recent studies support the benefit of TRT on BMD, especially in hypogonadal men with osteopenia and osteoporosis, although a few studies failed to demonstrate its effects. However, no evidence supports the hypothesis that TRT can prevent the incidence of bone fracture. Currently, TRT should be considered as one of the treatment options to simultaneously improve hypogonadal symptoms and BMD in symptomatic hypogonadal men with osteopenia. Full article
(This article belongs to the Special Issue Osteoporosis and Related Bone Metabolic Disease)
Open Access Article
Assessment of Renal Function Based on Dynamic Scintigraphy Parameters in the Diagnosis of Obstructive Uro/Nephropathy
J. Clin. Med. 2021, 10(3), 529; https://doi.org/10.3390/jcm10030529 - 02 Feb 2021
Abstract
This study evaluates the usefulness of parameters allowing assessment of renal function in absolute values in dynamic renal scintigraphy (DRS) with 99mTc-ethylenedicysteine (99mTc-EC): the uptake constant (K), mean transit time (MTT), and parenchymal transit time (PTT), in the diagnosis of obstructive uro/nephropathy. The study included 226 people: 20 healthy volunteers, for whom normative values of the assessed parameters were determined, and 206 patients. Reproducibility of results obtained by two independent operators, specificity, correlation with estimated GFR (eGFR), and Cohen’s kappa were used to evaluate the reliability of the assessed parameters. Normative values were as follows: K ≥ 1.6, MTT ≤ 250 s, and PTT ≤ 225 s. Reproducibility of the determination of K (rs = 0.99) and MTT (rs = 0.98) was significantly higher than that of PTT (rs = 0.95) (p = 0.001). Specificity was 100% for K, 81% for MTT, and 91% for PTT. The correlation of eGFR with K (rs = 0.89) was significantly higher than with PTT (rs = 0.53) and with split function (SF) (rs = 0.66) (p < 0.0001). Cohen’s kappa was κ = 0.89 for K, κ = 0.88 for MTT, and κ = 0.77 for PTT. In the group of patients in whom standard DRS parameters are unreliable (bilateral obstructive uro/nephropathy or a single functioning kidney), the use of K (the most effective of the assessed parameters) changed the classification of 23/79 kidneys (29%). K enables reproducible assessment of absolute, individual kidney function without modifying the routine DRS protocol. The diagnostic value of MTT and PTT is limited. Full article
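Cohen's kappa, used above to quantify agreement, corrects the observed agreement between two raters (or two classification methods) for the agreement expected by chance from their marginal label frequencies. A minimal implementation:

```python
def cohens_kappa(ratings_a, ratings_b):
    """Cohen's kappa for two raters over the same items.

    kappa = (p_o - p_e) / (1 - p_e), where p_o is the observed
    agreement and p_e is the agreement expected by chance from the
    raters' marginal label frequencies. Undefined (division by zero)
    when chance agreement is already perfect."""
    n = len(ratings_a)
    labels = set(ratings_a) | set(ratings_b)
    p_o = sum(a == b for a, b in zip(ratings_a, ratings_b)) / n
    p_e = sum(
        (ratings_a.count(label) / n) * (ratings_b.count(label) / n)
        for label in labels
    )
    return (p_o - p_e) / (1 - p_e)
```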
(This article belongs to the Section Nuclear Medicine & Radiology)
Open Access Article
The Influence of Family History of Neurodegenerative Disease on Adolescent Concussion Outcomes
J. Clin. Med. 2021, 10(3), 528; https://doi.org/10.3390/jcm10030528 - 02 Feb 2021
Abstract
Evidence suggests that factors associated with a family history of neurodegenerative disease (fhNDD) may influence outcomes following a concussion. However, the relevance of these findings in adolescent populations has not been fully explored. Therefore, the present study sought to evaluate the relationship between fhNDD and neurological outcomes following an adolescent concussion. Data from a local pediatric concussion clinic were used to compare adolescents with (n = 22) and without (n = 44) an fhNDD. Clinical symptom burden, emotional health, cardio-autonomic function, and cognitive performance were assessed at initial (~2 weeks) and follow-up (~5 weeks) post-injury evaluations. Cardio-autonomic function was assessed at rest and during isometric handgrip contraction (IHGC). Results indicated no significant group differences in emotional health or cognitive performance. Across evaluations, those with an fhNDD exhibited greater somatic symptom severity, alterations in HRV at rest, and early blunted cardio-autonomic reactivity during IHGC compared to those without an fhNDD. These findings suggest that a positive fhNDD is negatively associated with clinical symptomatology and cardio-autonomic functioning following an adolescent concussion. Further, these findings encourage clinicians to utilize a comprehensive neurological evaluation to monitor concussion recovery. Future studies should explore the role of specific neurodegenerative processes and conditions on concussion outcomes in adolescents. Full article
(This article belongs to the Special Issue Children Behavior and Psychophysiology)
Open Access Article
Impact of 1-Hour Bundle Achievement in Septic Shock
J. Clin. Med. 2021, 10(3), 527; https://doi.org/10.3390/jcm10030527 - 02 Feb 2021
Abstract
This study aimed to address the impact of 1-h bundle achievement on outcomes in septic shock patients. A secondary analysis of multicenter, prospectively collected data on septic shock patients who had undergone protocolized resuscitation bundle therapy at emergency departments was conducted. In-hospital mortality according to 1-h bundle achievement was compared using multivariable logistic regression analysis. Patients were also divided into three groups according to the time of bundle achievement, and outcomes were compared to examine the difference in outcome for each group over time: group 1 (≤1 h, reference), group 2 (1–3 h), and group 3 (3–6 h). In total, 1612 patients with septic shock were included. The 1-h bundle was achieved in 461 (28.6%) patients. The group that achieved the 1-h bundle did not show a significant difference in in-hospital mortality compared to the group that did not, on multivariable logistic regression analysis (<1 vs. >1 h) (odds ratio = 0.74, p = 0.091). However, 3-h and 6-h bundle achievement showed significantly lower odds ratios of in-hospital mortality compared to non-achievement (<3 vs. >3 h and <6 vs. >6 h; odds ratio = 0.604 and 0.458, respectively). There was no significant difference in in-hospital mortality over time for groups 2 and 3 compared to that of group 1. One-hour bundle achievement was not associated with improved outcomes in septic shock patients. These data suggest that further investigation into the clinical implications of 1-h bundle achievement in patients with septic shock is warranted. Full article
(This article belongs to the Section Emergency Medicine)
Open Access Article
Impact of the 25-Hydroxycholecalciferol Concentration and Vitamin D Deficiency Treatment on Changes in the Bone Level at the Implant Site during the Process of Osseointegration: A Prospective, Randomized, Controlled Clinical Trial
J. Clin. Med. 2021, 10(3), 526; https://doi.org/10.3390/jcm10030526 - 02 Feb 2021
Abstract
Introduction: The most important factor responsible for the positive course of implant treatment is the process of osseointegration between the implant structure and the host’s bone tissue. The aim of this study was to assess the effect of the 25-hydroxycholecalciferol concentration and vitamin D deficiency treatment on changes in the bone level at the implant site during the process of osseointegration in the mandible. Materials and Methods: The study included 122 people qualified for implant surgery, who were assigned to three research groups (A, B, and C). Laboratory, clinical, and radiological tests were performed on the day of surgery and after 6 and 12 weeks. The bone level in the immediate proximity of the implant was determined by radiovisiography (RVG). Results: The bone level after 12 weeks in Groups B and C was significantly higher than after 6 weeks. The bone level in study Group B was significantly higher than in Group A. The study showed that the higher the 25-hydroxycholecalciferol level observed on the day of surgery, the higher the level of bone surrounding the implant 6 and 12 weeks after surgery. Conclusion: A correct level of 25-hydroxycholecalciferol on the day of surgery and vitamin D deficiency treatment significantly increase the bone level at the implant site during radiologically assessed osseointegration. Full article
(This article belongs to the Special Issue Clinical Advances in Implant Dentistry)
Open Access Article
Age Differences in Motor Recruitment Patterns of the Shoulder in Dynamic and Isometric Contractions. A Cross-Sectional Study
J. Clin. Med. 2021, 10(3), 525; https://doi.org/10.3390/jcm10030525 - 02 Feb 2021
Viewed by 472
Abstract
Aging processes in the musculoskeletal system lead to functional impairments that restrict participation. Purpose: To assess differences in the force and motor recruitment patterns of shoulder muscles between age groups in order to understand functional disorders. A cross-sectional study compared 30 adults (20–64 years) and 30 older adults (>65 years). Surface electromyography (sEMG) of the middle deltoid, upper and lower trapezius, infraspinatus, and serratus anterior muscles was recorded. Maximum isometric voluntary contraction (MIVC) was determined at 45° of glenohumeral abduction. For sEMG signal registration, concentric and eccentric contractions with and without a 1 kg load, as well as an isometric contraction, were requested. Participants abducted the arm from 0° up to 135° for the concentric and eccentric contractions, and from 0° to 45° for the isometric contraction, holding at 80% of the MIVC level while pushing against a handheld dynamometer. Differences between age groups in sEMG amplitudes (root mean square, RMS) of all contractions, as well as in onset latencies during concentric contraction of each muscle, were analyzed. Statistically significant differences in strength existed between groups (Adults > Older adults; p < 0.05). No significant differences in RMS values of dynamic contractions were detected, except for the serratus anterior, but significant differences were found for isometric contractions of all muscles analyzed (Adults > Older adults; p < 0.05). The recruitment order varied between age groups, with a general tendency towards delayed onset times in older adults, except for the upper trapezius muscle. Age differences in muscle recruitment patterns were found, underscoring the importance of developing musculoskeletal data to prevent and guide treatment of geriatric shoulder pathologies. Full article
(This article belongs to the Special Issue New Frontiers in Geriatric Diseases)
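The RMS amplitude reported in the abstract above is a standard envelope measure for sEMG signals. The following is a minimal illustrative sketch of windowed RMS computation, not the authors' actual processing pipeline; the function name, window scheme (non-overlapping windows), and lack of filtering are assumptions for illustration only.

```python
import numpy as np

def rms_amplitude(semg, window):
    """Root-mean-square envelope of an sEMG signal over non-overlapping windows.

    semg: 1-D sequence of signal samples.
    window: number of samples per window (trailing partial window is dropped).
    Returns one RMS value per full window.
    """
    semg = np.asarray(semg, dtype=float)
    n = len(semg) // window * window          # drop the trailing partial window
    frames = semg[:n].reshape(-1, window)     # one row per window
    return np.sqrt(np.mean(frames ** 2, axis=1))
```

In practice, sEMG is typically band-pass filtered and the RMS window length chosen relative to the sampling rate before amplitudes are compared across muscles or groups.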
