The lack of a consistent definition for hypotension in the pediatric population complicates its study in pediatric kidney transplant recipients. Additionally, there are limited data to guide the management of hypotension in these patients both before and after transplantation. A 10-question survey was distributed to pediatric nephrologists. Survey data were analyzed using descriptive statistical methods without hypothesis testing. A total of 25 respondents completed the survey, revealing significant variability in the definitions of hypotension used and in the management approaches for hypotension during the pre- and posttransplant periods. There is significant variability in practice patterns regarding the management of hypotension in pediatric transplant recipients. This highlights the need for further research to evaluate the prevalence of chronic hypotension in the pediatric ESKD population and to assess its impact on transplant outcomes.
Anticancer therapy for patients with gastric cancer on hemodialysis is challenging owing to varying pharmacokinetics and a lack of clinical trial data. This study aimed to evaluate the efficacy and safety of the capecitabine plus oxaliplatin (CapeOX) regimen in a 73-year-old male Japanese patient with stage IV gastric cancer (human epidermal growth factor receptor 2 negative) undergoing hemodialysis. The selected chemotherapy regimen was an approximately 50% dose of CapeOX (capecitabine 1500 mg/day on days 1-14 and oxaliplatin 100 mg/day as a 2-h infusion on day 1) every 3 weeks. Data on plasma drug concentrations, metabolic enzyme genetic polymorphisms, and clinical outcomes were analyzed. Anticancer therapy initially controlled the tumor; however, disease progression and cumulative peripheral neuropathy led to discontinuation after 17 cycles (approximately 12 months of treatment). Oxaliplatin exhibited a rebound increase after each dialysis session (dialyzer clearance [CLdial]: median, 44.12 [interquartile range {IQR}: 24.89-70.08] mL/min; hemodialysis removal rate: median, 35.98% [IQR: 19.63-54.45]). α-Fluoro-β-alanine, the final metabolite of capecitabine, accumulated substantially, although approximately half of it was removed by hemodialysis (CLdial: median, 61.32 [IQR: 24.89-70.08] mL/min; hemodialysis removal rate: median, 47.98% [IQR: 44.74-50.29]). A UPB1 intronic variant and a DPYD missense mutation (1627 A > G) were detected. The DPYD variant likely influenced 5-fluorouracil metabolism, as its area under the concentration-time curve from 0 to 12 h was comparable to that observed at the standard dosage. These findings suggest that appropriate dose reduction and genetic screening might be considered part of chemotherapy guidance to improve safety and effectiveness for patients with advanced gastric cancer undergoing hemodialysis.
The ambulatory arterial stiffness index (AASI) has emerged as an ambulatory blood pressure monitoring (ABPM) measure of stiffness and is supposedly useful in younger subjects. The objective of our study was to evaluate the relationships between the AASI and indices of arterial stiffness in a pediatric population at risk of hypertension. This was a cross-sectional study of children/adolescents (8-18 years) in whom pulse wave velocity (PWV: carotid-to-femoral cf-PWV and heart-finger hf-PWV), augmentation index (AIx; normalized to 75 bpm: AIx75), systemic arterial stiffness (aortic pulse pressure/stroke volume, measured via pulse contour analysis), and ABPM were measured. At-risk populations comprised conditions with potential vascular remodeling (preterm birth, n = 44; chronic kidney disease, n = 7) and potential hyperkinetic causes (congenital central hypoventilation syndrome, n = 14; psychostimulant treatment, n = 10). The mean age of the 75 participants was 12.3 ± 2.5 years (34 girls), and their mean AASI was 0.33 ± 0.17. AASI did not correlate with cf-PWV, hf-PWV, AIx, or systemic arterial stiffness. In contrast, the AASI correlated with both systolic and diastolic BP dipping at night (R = -0.23, p = 0.048 and R = -0.33, p = 0.004, respectively). Systemic arterial stiffness correlated with hf-PWV and AIx75 (R = 0.35, p = 0.004 and R = -0.34, p = 0.013, respectively). Based on ABPM, 15/75 (20%) participants had hypertension; they had higher cf-PWV than participants without hypertension (5.64 ± 0.70 vs 4.92 ± 0.78 m/s, p = 0.002) but no different AASI values (0.34 ± 0.14 vs 0.32 ± 0.18, p = 0.756). AASI is not a measure of arterial stiffness in children at risk of secondary hypertension. • The ambulatory arterial stiffness index (AASI) has emerged as an ambulatory blood pressure monitoring measure of stiffness and is supposedly useful in younger subjects.
Few adult studies validated the concept of AASI as a marker of arterial stiffness, and recent studies suggested the need for AASI measurement in pediatric subjects. • Our cross-sectional study, which used different methods for the assessment of vascular remodeling, shows that the AASI is not a reliable marker of arterial stiffness in children at risk of secondary hypertension.
Renal involvement is a common condition in rheumatologic diseases. The aim of this study was to evaluate the clinical presentations and histopathological results in patients diagnosed with non-lupus rheumatic diseases who underwent renal biopsy. The records of 900 renal biopsies performed at our hospital between 2004 and 2023 were retrospectively reviewed, and 139 patients diagnosed with non-lupus rheumatic diseases were included. Data on demographic and clinical characteristics, laboratory results, renal biopsy indications, and histopathological findings were obtained from hospital records. Among the patients, 46 had rheumatoid arthritis (RA), 31 had ankylosing spondylitis (AS), 31 had familial Mediterranean fever (FMF), 17 had Behçet's disease (BD), and 14 had psoriasis (PsO). The most common indication for biopsy was hematuria + subnephrotic proteinuria, accounting for 17.3%. The most frequent pathological diagnosis was amyloidosis (36.7%), followed by focal segmental glomerulosclerosis (FSGS) (19.4%), chronic tubulointerstitial nephritis (TIN) (8.6%), immunoglobulin A (IgA) nephropathy (7.9%), and membranous glomerulonephritis (7.2%). A significant difference was noted among the rheumatologic disease groups in the reasons for biopsy (p = 0.031) and pathological diagnoses (p = 0.003). The frequency of amyloidosis was higher in the FMF group (61.3%), FSGS was more frequent in the BD group (47.1%), and IgA nephropathy was more frequent in the PsO group (35.7%). The characteristics of renal involvement in rheumatic diseases differ depending on the underlying disease. Close monitoring of renal function and performing biopsy when necessary are critically important in these patients.
Chronic kidney disease (CKD) is associated with a markedly increased risk of cardiovascular (CV) morbidity and mortality that cannot be fully explained by traditional risk factors. Emerging evidence highlights the central role of the gut-kidney-heart axis, whereby gut microbiota dysbiosis promotes the generation and systemic accumulation of uremic toxins, including indoxyl sulfate (IS), p-cresyl sulfate (PCS), and trimethylamine N-oxide (TMAO). These metabolites contribute to endothelial dysfunction, oxidative stress, inflammation, and vascular remodeling, thereby accelerating CV disease progression in CKD. Dietary patterns represent a key modifiable factor influencing gut microbiota composition and metabolic activity. The Mediterranean diet, characterized by high intake of plant-based foods, dietary fiber, and polyphenols, and low consumption of red and processed meats, has emerged as a promising microbiota-targeted strategy. It promotes saccharolytic fermentation, enhances short-chain fatty acid production, and reduces proteolytic pathways responsible for uremic toxin generation. Accumulating evidence from observational studies, meta-analyses, and dietary intervention trials suggests that adherence to Mediterranean and plant-based dietary patterns is associated with reduced uremic toxin burden, improved renal outcomes, and lower CV risk in CKD populations. However, direct interventional evidence linking Mediterranean diet adherence to changes in specific uremic toxin levels remains limited. This narrative review summarizes current evidence on diet-microbiota interactions in CKD and highlights the Mediterranean diet as a biologically plausible strategy for targeting the gut-kidney-heart axis. Future well-designed randomized controlled trials (RCTs) are needed to confirm causal relationships and support clinical implementation.
Introduction: Recent research underscores the risks of maintaining a positive calcium balance in hemodialysis (HD) patients. This study aims to evaluate outcomes based on the calcium levels of HD patients, specifically those with hypocalcemia. Methods: In this retrospective cohort study, data from 71,101 HD patients were analyzed and classified into six groups based on calcium levels: severe hypocalcemia (<7.5 mg/dL, n = 1078), moderate hypocalcemia (7.5-7.99 mg/dL, n = 4000), mild hypocalcemia (8.0-8.39 mg/dL, n = 9846), lower-normal calcium (8.4-9.29 mg/dL, n = 38,697), upper-normal calcium (9.3-10.19 mg/dL, n = 14,505), and hypercalcemia (≥10.2 mg/dL, n = 1975). Results: The numbers of deaths, CVE, and fracture at the end point of the follow-up were recorded: 401 (37.2%), 189 (23.2%), and 224 (20.8%) in the severe hypocalcemia group, respectively; 1523 (38.1%), 663 (22.8%), and 802 (20.1%) in the moderate hypocalcemia group, respectively; 3985 (40.5%), 1618 (22.9%), and 2054 (20.9%) in the mild hypocalcemia group, respectively; 17,067 (44.1%), 6948 (24.9%), and 8676 (22.4%) in the lower-normal calcium group, respectively; 6904 (47.6%), 2967 (27.3%), and 3471 (23.9%) in the upper-normal calcium group, respectively; and 1074 (54.4%), 457 (30.0%), and 473 (23.9%) in the hypercalcemia group, respectively. The 5-year patient survival rates for the severe hypocalcemia, moderate hypocalcemia, mild hypocalcemia, lower-normal calcium, upper-normal calcium, and hypercalcemia groups were 73.9%, 70.0%, 68.8%, 66.4%, 66.1%, and 62.8%, respectively. The 5-year cardiovascular event-free survival rates for the severe hypocalcemia, moderate hypocalcemia, mild hypocalcemia, lower-normal calcium, upper-normal calcium, and hypercalcemia groups were 78.2%, 79.0%, 78.2%, 76.2%, 75.3%, and 72.6%, respectively. 
The hazard ratios (HRs) for the all-cause mortality (HR: 0.94, 95% CI: 0.84-1.05) and cardiovascular events (HR: 0.98, 95% CI: 0.84-1.15) of the severe hypocalcemia group were consistently not higher than those of the lower-normal calcium group, even after thorough adjustments were made for various clinical variables. Multivariable Cox regression analyses revealed that the HRs for all-cause mortality and cardiovascular events of the mild hypocalcemia group were lower than those of the lower-normal calcium group. Serum calcium levels were not associated with increased risk of fracture. Conclusions: Various degrees of hypocalcemia, including severe hypocalcemia, were not associated with increased mortality or cardiovascular event rates. We suggest that symptoms and clinical presentation should be prioritized rather than simply targeting the normalization of calcium levels in hypocalcemia correction.
Atypical hemolytic uremic syndrome (aHUS) is a complement-associated thrombotic microangiopathy (TMA). Thalassemia is a hemolytic anemia triggered by defects in the beta-globin gene. Mutations in the complement factor H (CFH) gene are associated with the development of aHUS; however, CFH gene mutations coexisting with thalassemia have rarely been reported. CFH gene mutations are susceptibility factors that can trigger complement overactivation during pregnancy. When aHUS is combined with thalassemia, large amounts of hemoglobin are released from fragmented erythrocytes, leading to amplification of the complement cascade and resulting in a "two-hit" mechanism that may affect the efficacy of C5 inhibitors. Here, we report a case of pregnancy-associated aHUS in a patient with thalassemia and a CFH gene mutation. After admission to the hospital, plasmapheresis and hemodialysis were used to treat TMA. Following confirmation of aHUS, eculizumab was initiated immediately, and the erythroid maturation agent luspatercept was added for the treatment of thalassemia. The patient was weaned off hemodialysis, and there was no recurrence of aHUS during the follow-up period. This diagnostic and therapeutic approach may provide a new strategy for the use of C5 inhibitors to treat patients with aHUS and thalassemia.
Proper nutrition is a fundamental determinant of human health and is structured through the intake of various nutritional components: (i) macronutrients, including carbohydrates, lipids, and proteins, which provide energy and essential structural materials for metabolic and physiological processes; (ii) micronutrients, such as vitamins and minerals, which, although required in smaller quantities, play crucial roles as enzymatic cofactors and regulators of numerous biochemical pathways; (iii) natural bioactive compounds (NBCs), substances found in plant-based foods (including polyphenols, carotenoids, phytosterols, and sulfur compounds) that exert protective effects thanks to their antioxidant and anti-inflammatory properties, contributing to the prevention of numerous chronic non-communicable diseases (CNCDs) [...].
No abstract available.
Background: The combined prognostic value of the fibrinogen-to-albumin ratio (FAR) and uric acid-to-albumin ratio (UAR) in patients with acute kidney injury (AKI) undergoing continuous renal replacement therapy (CRRT) remains unclear. Methods: This retrospective cohort study utilized the MIMIC-IV database. Adult patients with AKI receiving CRRT were included and stratified into four groups based on optimal FAR and UAR cut-offs. Multivariable Cox proportional hazards regression and restricted cubic spline analyses were employed to examine associations with 30-, 90-, and 360-day all-cause mortality. Results: Patients with high FAR/high UAR had the poorest survival (log-rank p < 0.001). After multivariable adjustment, high FAR/high UAR was associated with higher 30-day (HR = 2.17, 95%CI: 1.61-2.92) and 360-day mortality (HR = 1.50, 95%CI: 1.18-1.90) vs. low FAR/low UAR. The association was stronger in patients with an SOFA score > 12 or vasopressin use (interaction p < 0.05). Conclusions: In critically ill AKI patients undergoing CRRT, the combined assessment of the FAR and UAR is associated with elevated mortality risk. These readily obtainable composite markers may support risk stratification in clinical practice.
This case report describes a 76-year-old diabetic female patient operated on for ischemic heart disease and aortic valve stenosis. The patient suffered from acute kidney injury related to hypotension secondary to the use of anti-hypertensive drugs. The report also discusses, drawing on the available literature, the need for greater caution in treating hypertension in elderly patients with diabetes and cardiovascular disease. Patients should be encouraged to measure their blood pressure at home, and written blood pressure target values should be provided to both the patient and their caregivers. It is important to note that hypotension, similar to hypertension, may lead to complications such as falls, acute kidney injury, and hospitalization. Furthermore, the adverse events that develop following simultaneous coronary artery bypass graft and surgical aortic valve replacement in the elderly are discussed with the aim of providing further insight on this matter.
Sarcopenia, defined as the progressive loss of skeletal muscle mass and strength, is increasingly recognized as a significant concern in patients with chronic kidney disease (CKD) and particularly in kidney transplant recipients (KTx-ps). This review explores the complex interplay of pathophysiological mechanisms, prevalence, and management strategies of sarcopenia in the context of kidney transplantation. CKD contributes to sarcopenia through systemic inflammation, malnutrition, uremic toxin accumulation, and metabolic imbalances, all of which persist or are exacerbated after transplantation due to immunosuppressive therapies, especially corticosteroids. Notably, the post-transplant period may introduce additional risks, such as altered body composition and reduced physical activity, further aggravating muscle wasting. Sarcopenia affects approximately 26% of KTx-ps, leading to adverse outcomes including decreased quality of life, increased risk of infection, frailty, delayed recovery, and graft loss. The diagnosis remains challenging due to variability in assessment tools and a lack of standardized criteria. Management strategies must be multifactorial, including personalized nutritional support, targeted physical activity, and, where appropriate, pharmacological interventions. Early identification through imaging and functional testing is critical, especially in older patients and those with prolonged dialysis vintage. Emerging therapies, such as myostatin inhibitors, offer promise but require further validation. Additionally, early steroid withdrawal may mitigate muscle loss without compromising graft survival in selected patients. This review underscores the need for heightened awareness and standardized protocols to identify and manage sarcopenia in kidney transplantation, ultimately improving long-term outcomes and patient-centered care.
Giant coronary artery aneurysms (CAAs), defined as dilatations exceeding 20 mm or four times the reference vessel diameter, are rare and risk rupture or thrombosis. Coronary-pulmonary artery fistulas may promote aneurysm formation and growth via chronic high-flow shunting. Their coexistence is rare in elderly patients without prior Kawasaki disease. An 83-year-old woman with hypertension, dyslipidaemia, and type 2 diabetes mellitus was referred for evaluation of an abnormal mediastinal contour. Chest computed tomography showed a 43 × 33 mm left anterior descending artery aneurysm with a fistulous connection to the pulmonary artery, having grown from 30 × 20 mm over 10 years. Right heart catheterization showed no evidence of pulmonary artery hypertension, with a mild shunt flow (Qp/Qs: 1.15). Despite the patient being asymptomatic, the aneurysm's size and growth warranted intervention. Considering her age and comorbidities, transcatheter coil embolization was selected over surgical or stent-based therapies. Coils were deployed using intravascular ultrasound (IVUS) and a balloon-assisted technique, successfully occluding the aneurysm without compromising distal flow. Follow-up angiography at 6 months and echocardiography up to 3 years confirmed sustained occlusion. This case highlights that even in asymptomatic elderly patients, large and expanding CAAs with fistulas warrant careful evaluation due to the risk of rupture. Transcatheter coil embolization provided a safe and effective treatment alternative to surgery, especially in high-risk anatomical and clinical settings. IVUS and balloon assistance were critical to procedural success, and long-term follow-up demonstrated sustained aneurysm exclusion.
IgA-dominant infection-related glomerulonephritis (IgA-IRGN) is a histopathological variant of Staphylococcus-associated glomerulonephritis (SAGN) that occurs as a result of an immune response to S. aureus antigens, with subsequent deposition of IgA in the nephron. This can lead to acute kidney injury, hematuria, and proteinuria. It is important to differentiate IgA-IRGN from primary IgA nephropathy (IgAN) because the treatment strategies differ. IgA-dominant IRGN requires treatment of the infection with antibiotics, whereas treatment of primary IgAN involves immunosuppression. Here, we present a case that highlights the clinical dilemma of distinguishing IgA nephropathy from IRGN. Our patient presented with chronic osteomyelitis secondary to S. aureus. The hospital course was complicated by acute renal failure that required dialysis. Renal biopsy showed IgA deposits, and the patient was initially treated with steroids for IgA nephropathy. The patient did not respond to immunosuppressive treatment and underwent a second biopsy; the biopsy findings, clinical history, and course closely resembled IgA-IRGN. The patient was eventually weaned off dialysis after five months. In patients with documented S. aureus infection who present with acute kidney injury, hematuria, and proteinuria, IgA-IRGN should be considered as the etiology. Source control measures should be attempted when necessary, and patients should be treated with an appropriate course of antibiotics. Proper diagnosis is important to avoid exposure to immunosuppressive medications and potentially worse outcomes in these patients.
Adverse social determinants are known contributors to poorer long-term allograft outcomes, yet optimal measures of social determinants are lacking. We sought to evaluate the association between socioeconomic status (SES) and graft loss in children using several different metrics of social determinants of health (SDOH). Our study included 137 pediatric kidney transplant recipients (< 18 years at kidney failure onset) transplanted between 2010 and 2020. The association between SES measures (individual- and area-level) and graft survival was examined using Cox regression models. Measures were dichotomized: HOUSES Index [low SES (Q1) vs. high (Q2-Q4)], Area Deprivation Index (ADI) [high deprivation (76th-100th percentile) vs. low (1st-75th percentile)], and Childhood Opportunity Index (COI) [low opportunity (1st-25th percentile) vs. high (26th-100th)]. All models were adjusted for transplant age, sex, donor type, pretransplant dialysis, insurance, kidney failure cause, and body mass index z-score. After adjusting for covariates, we observed a significantly increased risk of graft loss in HOUSES Q1 vs. Q2-4 recipients (adjusted hazard ratio [aHR]: 3.25; 95% CI: 1.38, 7.64; p = 0.007) and in nationally standardized low-opportunity COI vs. high-opportunity COI recipients (aHR: 5.01; 95% CI: 2.01, 12.5; p = 0.001). In contrast, we observed no significant association between ADI and graft loss. Using the HOUSES Index and COI, we identified a threefold to fivefold higher risk of pediatric graft loss among recipients with lower SDOH. These tools provide robust non-biological predictors for transplant outcomes; however, larger studies are required to compare their relative impact.
Despite significant advancements in medical and technical aspects in the field of dialytic renal replacement therapies, morbidity and mortality remain extremely high among patients with end-stage renal disease (ESRD), and most interventional studies have yielded unsatisfactory outcomes. Hemodiafiltration with endogenous reinfusion (HFR) emerges as a distinctive blood purification technique characterized by a dual-chamber dialyzer and a resin adsorption filter core, integrating the mechanisms of diffusion, convection, and adsorption within a single therapeutic framework. Current evidence suggests that, in addition to its capacity for the removal of uremic toxins, HFR could also be effective in ameliorating the inflammatory state and oxidative stress in the dialysis population; however, the existing evidence is limited by small sample sizes and short follow-up periods. Nevertheless, several challenges associated with HFR in clinical practice necessitate immediate consideration. These challenges include the technical specifications of dialyzers, the impact on micronutrient levels, and the requirement for a broader scope of clinical indications. The present review elucidates the technical aspects of HFR and summarizes our perspectives on the contentious issues pertaining to its clinical application. Finally, we offer insights into future clinical trials exploring more specific indications for the utilization of HFR. Future studies should be adequately powered, prospective, multi-center, randomized controlled trials with robust clinical endpoints to address the identified challenges associated with HFR, thereby optimizing its therapeutic role in the management of ESRD.
What is the impact of hemodiafiltration with endogenous reinfusion on nutrient retention? Does hemodiafiltration with endogenous reinfusion provide greater clearance efficiency for uremic toxins compared with hemoperfusion? Should hemodiafiltration with endogenous reinfusion be prescribed for patients with specific needs and a broader range of indications?
Background: Sepsis has heterogeneous etiologies. Although bacteria are the most common causative agents, viral and yeast forms of sepsis also occur. Procalcitonin (PCT) is widely used to monitor severe bacterial infections and may support the differential diagnosis of infection etiology. Methods: We evaluated the diagnostic value of PCT in viral sepsis using PCT levels at ICU admission and PCT kinetics during ICU treatment. Results: During the COVID-19 pandemic, 191 adult ICU patients with sepsis and a positive SARS-CoV-2 PCR test at hospital admission were included and classified into two groups according to the presence or absence of bacterial or yeast co-infection. PCT showed a distinct diagnostic pattern influenced by the presence of co-infection. PCT remained low in isolated viral sepsis, whereas elevated concentrations were associated with superimposed bacterial or yeast co-infection and worse clinical outcomes. Overall mortality was significantly lower in patients with isolated viral sepsis compared to those with co-infection (50 vs. 69%, p = 0.009). Conclusions: In viral sepsis, persistently low PCT concentrations argue against bacterial co-infection, whereas elevated or rising values should prompt increased diagnostic evaluation. Although PCT provides clinically relevant diagnostic information, it must be interpreted cautiously and in conjunction with clinical assessment and microbiological data. PCT should serve as an adjunctive, not a standalone, marker of infection etiology in sepsis.
Urinary epidermal growth factor (uEGF) and clusterin (uCLU) are emerging biomarkers in chronic kidney disease (CKD), but rigorous analytical validation is required before clinical implementation. We evaluated intra-individual variability and long-term storage stability of uEGF and uCLU in CKD. In the prospective, multicenter UVALID study, 60 adults with CKD stages 2-4 underwent urine sampling at three visits over 8 weeks. First-morning and 24-h urine samples were collected to assess intra-individual variability over 24 h, 3 days and 8 weeks. Biomarkers were measured in duplicate by ELISA and normalized to urinary creatinine (/Cr). Inter-laboratory performance was assessed using quality control samples. Stability after 12 and 15 months of storage at -20 °C and -80 °C and the influence of pH were evaluated. Over 24 h, 3 days, and 8 weeks, uEGF/Cr demonstrated low variability and remained stable after long-term storage at both temperatures. In contrast, uCLU/Cr showed greater variability and pronounced instability at -20 °C, whereas stability was preserved at -80 °C. Samples with pH > 6 partially preserved uCLU stability at -20 °C. Inter-laboratory reproducibility was acceptable for uEGF but suboptimal for uCLU at low concentrations. Thus, uEGF showed robust analytical performance, supporting its potential clinical applicability in CKD, whereas uCLU exhibited important analytical and pre-analytical limitations, warranting further assay optimization. These findings underscore the need for rigorous validation to facilitate biomarker implementation in clinical practice.
Patients with primary aldosteronism (PA) have an increased risk of developing cardiovascular disease. In patients with bilateral adrenal aldosterone hypersecretion, mineralocorticoid receptor antagonists (MRAs) are recommended; however, the benefit of these agents is observed only in cases that show a sufficient increase in post-treatment plasma renin activity (PRA) (responders). Because renin secretion can be influenced by sodium intake, the increase in post-treatment PRA may be attributable to sodium restriction. A total of 90 patients who received eplerenone or esaxerenone for PA treatment were included in the study. Patients whose PRA was ≥ 1.0 ng/mL/h at one year after the initiation of MRA treatment were defined as responders. The predictors of responder status and the effect of sodium intake were investigated. Both baseline and post-treatment PRA and plasma aldosterone concentration (PAC) levels were higher in responders. In addition, baseline estimated sodium intake tended to be lower, and post-treatment estimated sodium intake was significantly lower, in responders. The post-treatment PRA value was significantly correlated with the post-treatment estimated sodium intake, and the post-treatment estimated sodium intake was an independent predictor of responder status, suggesting that post-treatment PRA elevation may be partially attributable to adequate sodium restriction. However, the change in the estimated glomerular filtration rate over the 3-year follow-up period did not differ between responders and non-responders. Although sodium restriction is suggested to facilitate attainment of post-treatment PRA ≥ 1.0 ng/mL/h, a marker of adequate aldosterone blockade, it remains uncertain whether achieving PRA ≥ 1.0 ng/mL/h is associated with favorable renal outcomes.
Compatible donor-recipient pairs are increasingly participating in kidney paired donation programs despite eligibility for direct living donor kidney transplant. Beyond addressing overt incompatibility, inclusion of compatible donor-recipient pairs is a strategy to optimize donor-recipient matching, mitigate major anatomical or immunologic disadvantages, and enable multiway or extended exchange chains. Evidence from national and regional registries has suggested that compatible donor-recipient pair participation expands the effective donor pool and improves access for highly sensitized and other hard-to-match recipients. We conducted a structured synthesis of the global literature examining compatible pair participation in kidney paired donation programs, focusing on reported clinical outcomes, program-level effects, ethical considerations, and operational models. We comprehensively searched PubMed, Embase, Scopus, and the Cochrane Library and identified studies published between January 2000 and June 2025 reporting specific data on compatible donor-recipient pairs. From our search, we included 24 studies from North America, Europe, Asia, and Australia, which encompassed >2500 compatible donor-recipient pairs. Across programs, compatible donor-recipient pairs comprised a proportion of kidney paired donation transplants and were associated with increased program activity. Where reported, participation was driven by the pursuit of superior outcomes, including improved HLA matching, access to younger or lower-risk donors, or avoidance of specific immunologic mismatches, alongside altruistic considerations. Short-term patient and graft outcomes for recipients of compatible donor-recipient pairs were comparable to conventional living donor kidney transplants in reported cohorts.
Overall, the evidence supported strategic inclusion of compatible pairs to enhance kidney paired donation program capacity and improve access for highly sensitized and hard-to-match recipients without compromising short-term outcomes.