Search Results (62)

Search Parameters:
Keywords = retransplantation

14 pages, 2179 KiB  
Article
Hepatic Artery Thrombosis After Orthotopic Liver Transplant: A 20-Year Monocentric Series
by Vincenzo Tondolo, Gianluca Rizzo, Giovanni Pacini, Luca Emanuele Amodio, Federica Marzi, Giada Livadoti, Giuseppe Quero and Fausto Zamboni
J. Clin. Med. 2025, 14(13), 4804; https://doi.org/10.3390/jcm14134804 - 7 Jul 2025
Viewed by 416
Abstract
Background/Objectives: Hepatic artery thrombosis (HAT) is a serious vascular complication in patients undergoing orthotopic liver transplantation (OLT). It is associated with a high risk of graft loss, re-transplantation (re-OLT), and mortality. This study aimed to evaluate the incidence and management of HAT, analyzing potential risk factors. The secondary objectives included quantifying 90-day postoperative morbidity and mortality rates. Methods: In this retrospective, observational, single-center study, data from liver transplant donors and recipients who underwent OLT between 2004 and 2024 were analyzed. HAT was classified as early (e-HAT, ≤30 days) or late (l-HAT, >30 days). Univariate statistical analysis was performed to identify the risk factors associated with HAT occurrence. Multivariate analysis was not performed due to the small number of HAT events, which would increase the risk of model overfitting. Results: Over the 20-year study period, a total of 532 OLTs were performed, including 37 re-OLTs. The rates of major morbidity, reoperation, and mortality within 90 days were 44.5%, 22.3%, and 7.1%, respectively. HAT occurred in 2.4% of cases (e-HAT: 1.6%; l-HAT: 0.7%). Among e-HAT cases, 66.6% were asymptomatic and identified through routine postoperative Doppler ultrasound. All e-HAT cases were surgically treated, with a re-OLT rate of 33.3%. Three l-HAT cases required re-OLT. Overall, the HAT-related mortality and re-OLT rates were 7.6% and 46.1%, respectively. At a follow-up of 86 months, the rate of graft loss was 9.2%, and the rate of post-OLT survival was 77%. Patients who developed HAT had a higher donor-to-recipient body weight ratio and longer warm ischemia times (WITs). Additionally, patients undergoing re-OLT had a higher risk of developing HAT. Conclusions: Although the incidence of HAT is low, its clinical consequences are severe. Early Doppler ultrasound surveillance is crucial for detecting e-HAT and preventing graft loss. A high donor-to-recipient body weight ratio, a prolonged warm ischemia time, and re-OLT seem to be associated with a high risk of HAT.
(This article belongs to the Section General Surgery)

11 pages, 418 KiB  
Article
Impact of Primary Diagnosis on the Outcome of Heart Transplantation in Children
by Csaba Vilmányi, Zsolt L. Nagy, György S. Reusz and László Ablonczy
J. Cardiovasc. Dev. Dis. 2025, 12(6), 205; https://doi.org/10.3390/jcdd12060205 - 29 May 2025
Viewed by 426
Abstract
Introduction: Pediatric heart transplantation (HTX) remains the only therapeutic option for end-stage heart failure not amenable to conventional surgical or catheter interventions. We reviewed our pediatric HTX outcomes according to primary diagnosis. Patients and Methods: Sixty-two patients underwent HTX between 01/2007 and 12/2022. Patients were divided into congenital heart disease (CHD, n = 20) and cardiomyopathy (CMP, n = 42) groups. All potential variables relevant to patient recovery and long-term survival with endpoints of retransplantation or death were analyzed. Results: CHD patients underwent HTX after significantly more prior major cardiac surgeries per patient (2.5 [0–5]) than CMP patients (0.5 [0–2], p < 0.01), without notable allosensitization. Post-HTX recovery was longer in CHD (mean mechanical ventilation 7 vs. 3 days, p = 0.001), likely due to longer surgical time (468 vs. 375 min, p = 0.037). There were no significant differences in the frequency of rejections between the two groups (4/20 vs. 9/42). Midterm survival was slightly better in CMP (85% vs. 70%, p = NS; median follow-up 44.5 [0–177] months). Conclusion: Our study confirmed good short- and long-term outcomes of pediatric HTX in both CMP and CHD. The longer postoperative recovery in CHD did not lead to higher mortality. No higher pretransplant hypersensitization was observed, possibly explaining the lack of difference in the number and severity of rejections.

16 pages, 311 KiB  
Review
Genomic and Biomarker Innovations in Predicting Kidney Transplant Rejection
by Rachana Punukollu, Sandesh Parajuli, Harshad Chaudhari and Girish Mour
J. Clin. Med. 2025, 14(11), 3642; https://doi.org/10.3390/jcm14113642 - 22 May 2025
Cited by 2 | Viewed by 968
Abstract
Currently, approximately 90,000 patients are on the kidney transplant waitlist in the United States, including 10,000 individuals awaiting re-transplantation due to prior graft failure. Allograft rejection remains a leading cause of kidney transplant failure. While the current gold standard for diagnosing rejection is tissue biopsy, it is invasive and impractical for routine or longitudinal graft surveillance. This review summarizes the current landscape of non-invasive biomarkers for detecting and predicting kidney transplant rejection, with a focus on both historical context and recent advancements. In particular, we highlight the roles of donor-derived cell-free DNA (dd-cfDNA) and gene expression profiling (GEP) in identifying acute rejection. We also discuss emerging biomarkers such as torque teno virus (TTV), which has shown potential as an indirect indicator of immunosuppression levels and rejection risk. Importantly, this review excludes biomarker studies that rely on tissue biopsy, emphasizing non-invasive approaches to rejection monitoring.
(This article belongs to the Section Nephrology & Urology)

17 pages, 4212 KiB  
Review
Decellularized Liver Matrices for Expanding the Donor Pool—An Evaluation of Existing Protocols and Future Trends
by Marcin Morawski, Maciej Krasnodębski, Jakub Rochoń, Hubert Kubiszewski, Michał Marzęcki, Dominik Topyła, Kacper Murat, Mikołaj Staszewski, Jacek Szczytko, Marek Maleszewski and Michał Grąt
Biomolecules 2025, 15(1), 98; https://doi.org/10.3390/biom15010098 - 10 Jan 2025
Cited by 1 | Viewed by 1178
Abstract
Liver transplantation is the only curative option for end-stage liver disease and is necessary for an increasing number of patients with advanced primary or secondary liver cancer. Many patient groups can benefit from this treatment; however, the shortage of liver grafts remains an unsolved problem. Liver bioengineering offers a promising method for expanding the donor pool through the production of acellular scaffolds that can be seeded with recipient cells. Decellularization protocols involve the removal of cells using various chemical, physical, and enzymatic steps to create a collagenous network that provides support for introduced cells and future vascular and biliary beds. However, the removal of the cells causes varying degrees of matrix damage that can affect cell seeding and future organ performance. The main objective of this review is to present the existing techniques of producing decellularized livers, with an emphasis on the assessment and definition of acellularity. Decellularization agents are discussed, and the standard process of acellular matrix production is evaluated. We also introduce the concept of the stepwise assessment of the matrix during decellularization through decellularization cycles. This method may lead to shorter detergent exposure times and less scaffold damage. The introduction of apoptosis induction in the field of organ engineering may provide a valuable alternative to existing long perfusion protocols, which lead to significant matrix damage. A thorough understanding of the decellularization process and the action of the various factors influencing the final composition of the scaffold is essential to produce a biocompatible matrix, which can be the basis for further studies regarding recellularization and retransplantation.
(This article belongs to the Section Synthetic Biology and Bioengineering)

6 pages, 200 KiB  
Opinion
In the Twilight of Evidence: Is Bypass Surgery Still on the Table for Cardiac Allograft Vasculopathy?
by Emyal Alyaydin and Andreas J. Flammer
J. Clin. Med. 2025, 14(1), 132; https://doi.org/10.3390/jcm14010132 - 29 Dec 2024
Viewed by 895
Abstract
Background: Cardiac allograft vasculopathy (CAV) is a major prognosis-limiting factor in patients undergoing orthotopic heart transplantation (HT). Due to the diffuse involvement of the coronary tree, CAV lesions are often not amenable to percutaneous coronary intervention (PCI), leaving coronary artery bypass grafting (CABG) and retransplantation as primary revascularization options. Aim and Results: The latest guidelines from the International Society for Heart and Lung Transplantation (ISHLT) recognize CABG as a viable option but with a downgraded strength of recommendation. The 2023 ISHLT guidelines now categorize CABG as a Class IIb recommendation (Level of Evidence: C) for highly selected CAV patients with anatomically suitable lesions, a downgrade from the Class IIa recommendation in the 2010 guidelines. This adjustment underscores the persisting reliance on limited, retrospective studies and the lack of substantial new data supporting CABG in CAV management. Our article examines the evidence collected since 2010 on this topic, highlighting key findings and assessing the role of CABG in contemporary transplant practice. This article calls for targeted investigations to better define the role of CABG as a therapeutic option, addressing the gaps in evidence for surgical revascularization in HT patients.
(This article belongs to the Special Issue Surgery Updates of Heart Transplantation in Children and Adults)
13 pages, 448 KiB  
Article
Role of the Lymphocyte Count-to-C-Reactive Protein Ratio in the Risk Stratification for High EASE Scores After Living Donor Liver Transplantation: A Retrospective Observational Cohort Study
by Jaesik Park, Chul Soo Park, Min Suk Chae, Ho Joong Choi and Sang Hyun Hong
J. Clin. Med. 2024, 13(23), 7344; https://doi.org/10.3390/jcm13237344 - 2 Dec 2024
Viewed by 935
Abstract
Background: Early allograft failure (EAF) significantly contributes to mortality, necessitating re-transplantation following liver transplantation. The EAF simplified estimation (EASE) score has been recently developed to predict EAF. We aimed to assess the predictive capacity of high EASE scores for EAF and postoperative outcomes and to evaluate the association between the lymphocyte count-to-C-reactive protein ratio (LCR) and high EASE scores after living donor liver transplantation (LDLT). Methods: We retrospectively analyzed the data of 808 patients who underwent LDLT. After excluding 16 patients with incomplete laboratory data, the final cohort included 792 patients. Patients with EASE scores ≥−0.74 were categorized into the high EASE group. Multivariate logistic regression was used to examine the association between the LCR and high EASE scores. Results: High EASE scores demonstrated superior predictive accuracy for EAF development relative to the early allograft dysfunction (EAD) model (p = 0.018) and were more closely associated with overall mortality (p = 0.033). A preoperative LCR < 12.7 significantly increased the odds (odds ratio, 3.3; confidence interval, 1.997–5.493) of exhibiting high EASE scores post-LDLT, alongside preoperative hematocrit levels, operative duration, intraoperative continuous renal replacement therapy, administered calcium dose, mean heart rate, and donor age. Conclusions: The EASE score could offer enhanced utility for predicting EAF and overall mortality following LDLT relative to the EAD model. Identifying and managing risk factors, including low LCR values, for elevated EASE scores is essential for improving patient prognoses.
(This article belongs to the Section General Surgery)

15 pages, 2180 KiB  
Review
Time to Rethink Bronchiolitis Obliterans Syndrome Following Lung or Hematopoietic Cell Transplantation in Pediatric Patients
by Tang-Her Jaing, Yi-Lun Wang and Chia-Chi Chiu
Cancers 2024, 16(21), 3715; https://doi.org/10.3390/cancers16213715 - 4 Nov 2024
Cited by 2 | Viewed by 2455
Abstract
Background: Bronchiolitis obliterans syndrome (BOS) can develop following lung transplantation (LTx) or hematopoietic cell transplantation (HCT), with similar histological characteristics and clinical manifestations in both settings. In contrast to lung transplantation, where BOS is restricted to the lung allograft, HCT-related systemic graft-versus-host disease (GVHD) is the root cause of BOS. Because lung function declines following HCT, diagnosis becomes more difficult. Given the lack of proven effective medicines, treatment is based on empirical evidence. Methods: Cross-disciplinary learning is crucial, and novel therapies are under investigation to improve survival and avoid LTx. Recent advances have focused on updating the understanding of the etiology, clinical features, and pathobiology of BOS. This review emphasizes the significance of learning from experts in other transplant modalities, promoting cross-disciplinary knowledge. Results: Our treatment algorithms are derived from extensive research and expert clinical input. It is important to ensure that immunosuppression is optimized and that any other conditions or contributing factors are addressed, if possible. Clear treatment algorithms are provided for each condition, drawing from the published literature and consensus clinical opinion. There are several novel therapies currently being investigated, such as aerosolized liposomal cyclosporine, Janus kinase inhibitors, antifibrotic therapies, and B-cell-directed therapies. Conclusions: We urgently need innovative treatments that can greatly increase survival rates and eliminate the need for LTx or re-transplantation.

12 pages, 940 KiB  
Article
Challenges in Pediatric Liver Retransplantation: A Technical Perspective
by Carlotta Plessi, Roberto Tambucci, Raymond Reding, Xavier Stephenne, Isabelle Scheers, Giulia Jannone and Catherine de Magnée
Children 2024, 11(9), 1079; https://doi.org/10.3390/children11091079 - 3 Sep 2024
Cited by 1 | Viewed by 1137
Abstract
Background/Objectives: Liver retransplantation (reLT) is the only option for pediatric patients experiencing graft loss. Despite recent advancements in surgical techniques and perioperative management, it remains a high-risk procedure. Our aim is to describe our experience in pediatric reLT, focusing on the technical aspects and surgical challenges. Methods: We systematically analyzed surgical reports from pediatric reLT performed at our center between 2006 and 2023 to identify recurrent intraoperative findings and specific surgical techniques. We focused on challenges encountered during different phases of reLT, including hepatectomy, vascular, and biliary reconstruction. Additionally, we compared patient and graft survival rates among different groups. Results: During the study period, 23 children underwent 25 reLT procedures at our center. Major surgical challenges included complex hepatectomy and vascular reconstructions, necessitating tailored approaches. Our analysis shows that patient and graft survival were significantly lower for reLT compared to primary transplantation (p = 0.002). Early reLT had a significantly lower graft survival compared to late reLT (p = 0.002), although patient survival was comparable (p = 0.278). Patient and graft survival rates were comparable between the first and second reLT (p = 0.300, p = 0.597). Patient survival tended to be higher after living-donor liver transplantation (LDLT) compared to deceased-donor liver transplantation (DDLT), although the difference was not statistically significant (p = 0.511). Conclusions: Pediatric reLT involves significant technical challenges and lower survival rates. Advances in perioperative management are crucial for improving outcomes. Further research is needed to optimize surgical strategies and evaluate the long-term benefits of LDLT in pediatric reLT.
(This article belongs to the Special Issue Long-Term Outcomes in Pediatric Liver Transplantation)

19 pages, 1601 KiB  
Review
Advancements in Predictive Tools for Primary Graft Dysfunction in Liver Transplantation: A Comprehensive Review
by Piotr Gierej, Marcin Radziszewski, Wojciech Figiel and Michał Grąt
J. Clin. Med. 2024, 13(13), 3762; https://doi.org/10.3390/jcm13133762 - 27 Jun 2024
Cited by 2 | Viewed by 2851
Abstract
Orthotopic liver transplantation stands as the sole curative solution for end-stage liver disease. Nevertheless, the discrepancy between the demand and supply of grafts in transplant medicine greatly limits the success of this treatment. The increasing global shortage of organs necessitates the utilization of extended criteria donors (ECD) for liver transplantation, thereby increasing the risk of primary graft dysfunction (PGD). PGD encompasses early allograft dysfunction (EAD) and the more severe primary nonfunction (PNF), both of which stem from ischemia–reperfusion injury (IRI) and mitochondrial damage. Currently, the only effective treatment for PNF is secondary transplantation within the initial post-transplant week, and the occurrence of EAD suggests an elevated, albeit still uncertain, likelihood of urgent retransplantation. Nonetheless, the ongoing exploration of novel IRI mitigation strategies offers hope for future improvements in PGD outcomes. Establishing an intuitive and reliable tool to predict upcoming graft dysfunction is vital for early identification of high-risk patients and for making informed retransplantation decisions. Accurate diagnostics for PNF and EAD constitute essential initial steps in implementing future mitigation strategies. Recently, novel methods for PNF prediction have been developed, and several models for EAD assessments have been introduced. Here, we provide an overview of the currently scrutinized predictive tools for PNF and EAD evaluation strategies, accompanied by recommendations for future studies.
(This article belongs to the Section Gastroenterology & Hepatopancreatobiliary Medicine)

10 pages, 3601 KiB  
Article
Comparison of the Effects of Multiple Frailty and Nutritional Indexes on Postoperative Outcomes in Critically Ill Patients Undergoing Lung Transplantation
by Sang-Wook Lee, Donghee Lee and Dae-Kee Choi
Medicina 2024, 60(7), 1018; https://doi.org/10.3390/medicina60071018 - 21 Jun 2024
Viewed by 1199
Abstract
Background and Objective: Lung transplantation is the only life-extending therapy for end-stage pulmonary disease patients, but its risks necessitate an understanding of outcome predictors, with the frailty index and nutritional status being key assessment tools. This study aims to evaluate the relationship between preoperative frailty and nutritional indexes and the postoperative mortality rate in patients receiving lung transplants, and to determine which measure is a more potent predictor of outcomes. Materials and Methods: This study reviewed 185 adults who received lung transplants at a single medical center between January 2013 and May 2023. We primarily focused on postoperative 7-year overall survival. Other outcomes measured were short-term mortalities, acute rejection, kidney complications, infections, and re-transplantation. We compared the predictive abilities of preoperative nutritional and frailty indicators for survival using receiver operating characteristic curve analysis and identified factors affecting survival through regression analyses. Results: There were no significant differences in preoperative nutritional indicators between survivors and non-survivors. However, preoperative frailty indicators did differ significantly between these groups. Multivariate analysis revealed that the American Society of Anesthesiologists Class V, clinical frailty scale, and Charlson Comorbidity Index (CCI) were key predictors of 7-year overall survival. Of these, the CCI had the strongest predictive ability with an area under the curve of 0.755, followed by the modified frailty index at 0.731. Conclusions: Our study indicates that for critically ill patients undergoing lung transplantation, frailty indexes derived from preoperative patient history and functional autonomy are more effective in forecasting postoperative outcomes, including survival, than indexes related to preoperative nutritional status.
(This article belongs to the Section Intensive Care/Anesthesiology)

6 pages, 3102 KiB  
Case Report
Embolization of a Hepatic Arterio-Portal Venous Fistula to Treat Portal Hypertension in a Liver Transplant Recipient
by Ji Ae Yoon, Cherng Chao and Zurabi Lominadze
Transplantology 2024, 5(2), 110-115; https://doi.org/10.3390/transplantology5020011 - 11 Jun 2024
Viewed by 2052
Abstract
Hepatic arterio-portal venous fistula (HAPVF) is a rare, abnormal connection between the hepatic artery and portal vein. HAPVFs are usually caused by trauma or hepatobiliary instrumentation. Fistulas can expand and produce symptoms of severe portal hypertension. The decision to embolize should be based on fistula location, size, and symptoms. We report a case of HAPVF in a liver transplant recipient presenting with worsening ascites and variceal hemorrhage after several prior liver biopsies. Given the extensive nature of the fistula and hepatic decompensation, the HAPVF was successfully embolized, resulting in clinical improvement and obviating the need for re-transplantation.
(This article belongs to the Section Solid Organ Transplantation)

9 pages, 742 KiB  
Article
Pre-Emptive Kidney Retransplantation from Deceased Donors
by Antonio Franco Esteve, Patricio Mas-Serrano, Fransico Manuel Marco, Eduardo Garin Cascales and Francisco Javier Perez Contreras
Transplantology 2024, 5(1), 37-45; https://doi.org/10.3390/transplantology5010004 - 28 Feb 2024
Viewed by 1658
Abstract
There is uncertainty about the best approach to replacement treatment for kidney transplant recipients with chronic terminal graft dysfunction, since a retransplant could be performed before the resumption of dialysis, thus avoiding this treatment and the dilemma of whether or not to suspend immunosuppressive therapy. However, there is limited experience in pre-emptive repeat transplantations, and none from deceased donors. This study aims to assess the results of a pre-emptive retransplantation program with brain-dead deceased donors. We designed a retrospective matched cohort study, including 36 recipients in the pre-dialysis group and 36 controls who were already on dialysis, matched for donor age and transplant date, which could not differ by more than 7 days between pairs. The variables used to standardize the cohorts were donor and recipient age and sex, blood group, duration of the first graft, time on the waitlist to receive the second graft, cold ischemia time, induction and maintenance of immunosuppression, and negative HLA antibodies prior to retransplantation. The efficacy variables were early graft loss, acute rejection, delayed graft function, renal function at the end of follow-up, survival time, and recipient and graft survival at 24 and 48 months' follow-up. The pre-dialysis group presented a significantly shorter waitlist time, lower immunization status, and a significantly longer duration of the first graft than the control group. The percentage of recipients who presented early graft loss, delayed renal function, or acute rejection was similar between groups. No significant differences were observed in kidney function or in the survival of the recipient or graft. Pre-emptive retransplantation yields good outcomes in patients with terminal chronic dysfunction, helping to avoid a return to dialysis, shortening the time spent on the waitlist, reducing the risk of producing antibodies, and resolving the dilemma of whether or not to stop immunosuppression.
(This article belongs to the Section Solid Organ Transplantation)

11 pages, 2773 KiB  
Article
Unique Changes in the Lung Microbiome following the Development of Chronic Lung Allograft Dysfunction
by Yeuni Yu, Yun Hak Kim, Woo Hyun Cho, Dohyung Kim, Min Wook So, Bong Soo Son and Hye Ju Yeo
Microorganisms 2024, 12(2), 287; https://doi.org/10.3390/microorganisms12020287 - 29 Jan 2024
Cited by 1 | Viewed by 1873
Abstract
The importance of lung microbiome changes in developing chronic lung allograft dysfunction (CLAD) after lung transplantation is poorly understood. The lung microbiome–immune interaction may be critical in developing CLAD. In this context, examining alterations in the microbiome and immune cells of the lungs following CLAD, in comparison to the lung condition immediately after transplantation, can offer valuable insights. Four adult patients who underwent lung retransplantation between January 2019 and June 2020 were included in this study. Lung tissues were collected from the same four individuals at two different time points: at the time of the first transplant and at the time of the explantation of CLAD lungs at retransplantation due to CLAD. We analyzed whole-genome sequencing using the Kraken2 algorithm and quantified the cell fractionation from the bulk tissue gene expression profile for each lung tissue. Finally, we compared the differences in lung microbiome and immune cells between the lung tissues of these two time points. The median age of the recipients was 57 years, and most (75%) had undergone lung transplants for idiopathic pulmonary fibrosis. All patients were administered basiliximab for induction therapy and were maintained on three immunosuppressants. The median CLAD-free survival term was 693.5 days, and the median time to redo the lung transplant was 843.5 days. Bacterial diversity was significantly lower in the CLAD lungs than at transplantation. Bacterial diversity tended to decrease according to the severity of the CLAD. Aerococcus, Caldiericum, Croceibacter, Leptolyngbya, and Pulveribacter genera were uniquely identified in CLAD, whereas no taxa were identified in lungs at transplantation. In particular, six taxa, including Croceibacter atlanticus, Caldiserium exile, Dolichospermum compactum, Stappia sp. ES.058, Kinetoplastibacterium sorsogonicusi, and Pulveribacter suum were uniquely detected in CLAD. Among immune cells, CD8+ T cells were significantly increased, while neutrophils were decreased in the CLAD lung. In conclusion, unique changes in lung microbiome and immune cell composition were confirmed in lung tissue after CLAD compared to at transplantation.
(This article belongs to the Section Microbiomes)

18 pages, 1499 KiB  
Review
Heart Transplantation
by Nikolaos Chrysakis, Dimitrios E. Magouliotis, Kyriakos Spiliopoulos, Thanos Athanasiou, Alexandros Briasoulis, Filippos Triposkiadis, John Skoularigis and Andrew Xanthopoulos
J. Clin. Med. 2024, 13(2), 558; https://doi.org/10.3390/jcm13020558 - 18 Jan 2024
Cited by 10 | Viewed by 5779
Abstract
Heart transplantation (HTx) remains the last therapeutic resort for patients with advanced heart failure. The present work is a clinically focused review discussing current issues in heart transplantation. Several factors have been associated with the outcome of HTx, such as ABO and HLA compatibility, graft size, ischemic time, age, infections, and the cause of death, as well as imaging and laboratory tests. In 2018, UNOS changed the organ allocation policy for HTx. The aim of this change was to prioritize patients with a more severe clinical condition, thereby reducing waiting-list mortality. Advanced heart failure and resistant angina are among the main indications for HTx, whereas active infection, peripheral vascular disease, malignancies, and increased body mass index (BMI) are important contraindications. The main complications of HTx include graft rejection, graft angiopathy, primary graft failure, infection, neoplasms, and retransplantation. Recent advances in the field of HTx include the first two porcine-to-human xenotransplantations, the inclusion of hepatitis C donors, donation after circulatory death, novel monitoring for acute cellular rejection and antibody-mediated rejection, and advances in donor heart preservation and transportation. Lastly, novel immunosuppression therapies such as daratumumab, belatacept, IL-6-directed therapy, and IgG endopeptidase have shown promising results.
(This article belongs to the Special Issue Mechanical Circulatory Support in Patients with Heart Failure)

12 pages, 1388 KiB  
Article
Functional Assessment of Long-Term Microvascular Cardiac Allograft Vasculopathy
by Noemi Bora, Orsolya Balogh, Tamás Ferenci and Zsolt Piroth
J. Pers. Med. 2023, 13(12), 1686; https://doi.org/10.3390/jpm13121686 - 5 Dec 2023
Viewed by 1344
Abstract
Background: Cardiac allograft vasculopathy (CAV) is a leading cause of death and retransplantation following heart transplantation (HTX). Surveillance angiography performed yearly is indicated for the early detection of the disease, but it remains of limited sensitivity. Methods: We performed bolus thermodilution-based coronary flow reserve (CFR), index of microcirculatory resistance (IMR), and fractional flow reserve (FFR) measurements in HTX patients undergoing yearly surveillance coronary angiography without overt CAV. Results: In total, 27 HTX patients were included who had 52 CFR, IMR, and FFR measurements at a mean of 43 months after HTX. Only five measurements were performed in the first year. CFR decreased significantly by 0.13 every year (p = 0.04) and IMR tended to increase by 0.98 every year (p = 0.051), whereas FFR did not change (p = 0.161) and remained well above 0.80 over time. After one year, CFR decreased significantly (p = 0.022) and IMR increased significantly (p = 0.015), whereas FFR remained unchanged (p = 0.72). Conclusions: The functional status of the epicardial coronary arteries of transplanted hearts did not deteriorate over time. However, a significant decrease in CFR was noted; in view of the increasing IMR, this is attributable to deterioration of microvascular function. CFR and IMR measurements may provide an early opportunity to diagnose CAV.
