Alport syndrome is a rare genetic disorder characterized by progressive renal impairment, hearing loss, and ocular abnormalities. Our study aimed to elucidate global research trends on Alport syndrome through bibliometric analysis. A comprehensive bibliometric analysis of Alport syndrome literature (1990-2025) was performed using Web of Science (WoS) and Scopus. After merging and de-duplication (n = 2371), CiteSpace, VOSviewer, and Bibliometrix facilitated network visualization, co-occurrence, co-citation, and historiographic evaluations. Research on Alport syndrome has grown exponentially since 1990 (2371 documents; 6.04% annual growth), dominated by original articles and moderate international collaboration. The USA leads in output, collaboration, and citations; China's volume is rising with evolving global impact. A hub-and-cluster author structure (e.g., Savige, Gross, Kashtan) publishes mainly in nephrology journals (Kidney International, JASN, Pediatric Nephrology), with interdisciplinary citation flows. Co-citation and bursts trace a shift from GBM/type IV collagen genetics to precision diagnostics and renoprotective therapy. Landmark studies established COL4 mutations, genotype-phenotype links, and early ACE inhibition, defining a gene-structure-phenotype-therapy axis. Alport syndrome research has progressed from genetic discovery toward more integrative and precision-focused approaches, supported by expanding global collaboration and contributions from key nephrology journals. Emerging priorities, including earlier diagnosis, renoprotective strategies, and advanced molecular techniques, reflect a sustained shift toward more personalized research directions that may inform future clinical management.
Information on childhood cancer burden is crucial for effective cancer policy planning. Unfortunately, observed paediatric cancer data are not available in every country, and previous global burden estimates have not discretely reported several common cancers of childhood. We aimed to inform efforts to address childhood cancer burden globally by analysing results from the Global Burden of Diseases, Injuries, and Risk Factors Study (GBD) 2023, which now include nine additional cancer causes compared with previous GBD analyses. GBD 2023 data sources for cancer estimation included population-based cancer registries, vital registration systems, and verbal autopsies. For childhood cancers (defined as those occurring at ages 0-19 years), mortality was estimated using cancer-specific ensemble models and incidence was estimated using mortality estimates and modelled mortality-to-incidence ratios (MIRs). Years of life lost (YLLs) were estimated by multiplying age-specific cancer deaths by the standard life expectancy at the age of death. Prevalence was estimated using survival estimates modelled from MIRs and multiplied by sequelae-specific disability weights to estimate years lived with disability (YLDs). Disability-adjusted life-years (DALYs) were estimated as the sum of YLLs and YLDs. Estimates are presented globally and by geographical and resource groupings, and all estimates are presented with 95% uncertainty intervals (UIs). Globally, in 2023, there were an estimated 377 000 incident childhood cancer cases (95% UI 288 000-489 000), 144 000 deaths (131 000-162 000), and 11·7 million (10·7-13·2) DALYs due to childhood cancer. Deaths due to childhood cancer decreased by 27·0% (15·5-36·1) globally, from 197 000 (173 000-218 000) in 1990, but increased in the WHO African region by 55·6% (25·5-92·4), from 31 500 (24 900-38 500) to 49 000 (42 600-58 200) between 1990 and 2023. 
In 2023, age-standardised YLLs due to childhood cancer were inversely correlated with country-level Socio-demographic Index. Childhood cancer was the eighth-leading cause of childhood deaths and the ninth-leading cause of DALYs among all cancers in 2023. The percentage of DALYs due to uncategorised childhood cancers was reduced from 26·5% (26·5-26·5) in GBD 2017 to 10·5% (8·1-13·1) with the addition of the nine new cancer causes. Target cancers for the WHO Global Initiative for Childhood Cancer (GICC) comprised 47·3% (42·2-52·0) of global childhood cancer deaths in 2023. Global childhood cancer burden remains a substantial contributor to global childhood disease and cancer burden and is disproportionately weighted towards resource-limited settings. The estimation of additional cancer types relevant in childhood provides a step towards alignment with WHO GICC targets. Efforts to decrease global childhood cancer burden should focus on addressing the inequities in burden worldwide and support comprehensive improvements along the childhood cancer diagnosis and care continuum. St Jude Children's Research Hospital, Gates Foundation, and St Baldrick's Foundation.
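The burden identities described in the GBD methods above reduce to simple arithmetic: YLLs are deaths multiplied by the standard life expectancy at the age of death, YLDs are prevalence multiplied by a disability weight, and DALYs are their sum. A minimal sketch with hypothetical inputs (not GBD 2023 estimates):

```python
# Sketch of the GBD burden identities; all numbers below are hypothetical
# illustrations, not GBD 2023 estimates.

def ylls(deaths: float, life_expectancy_at_death: float) -> float:
    """Years of life lost: deaths x standard life expectancy at age of death."""
    return deaths * life_expectancy_at_death

def ylds(prevalence: float, disability_weight: float) -> float:
    """Years lived with disability: prevalent cases x disability weight."""
    return prevalence * disability_weight

def dalys(yll: float, yld: float) -> float:
    """Disability-adjusted life-years: the sum of YLLs and YLDs."""
    return yll + yld

# Hypothetical example: 1,000 deaths at an age with 70 remaining years of
# standard life expectancy, plus 5,000 prevalent cases with weight 0.2.
yll = ylls(1_000, 70.0)   # 70,000 years of life lost
yld = ylds(5_000, 0.2)    # 1,000 years lived with disability
print(dalys(yll, yld))    # 71000.0
```

Because childhood cancer deaths occur at young ages, the remaining life expectancy term is large, which is why YLLs dominate the childhood cancer DALY totals reported above.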
Urothelial carcinoma is a common malignancy with a substantial burden, including in Egypt. Although Ki-67 and EMT-related markers such as vimentin have been previously studied, region-specific validation and practical cutoff performance remain relevant. This cross-sectional study included 60 archival conventional UC cases diagnosed at university hospitals during 2024. Tumors were re-reviewed to confirm grade and stage and categorized as NMIBC or MIBC. Immunohistochemistry was performed for vimentin and Ki-67 (MIB-1). Vimentin was considered positive when ≥ 10% of tumor cells showed convincing cytoplasmic staining. The Ki-67 labeling index was assessed in hotspots by counting ≥ 500 tumor nuclei and categorized as low (< 30%) versus high (≥ 30%) using a predefined pragmatic cutoff. Multivariable logistic regression was used to identify independent predictors of high-grade disease. Vimentin positivity was detected in 18/60 cases (30.0%), and high Ki-67 index was present in 37/60 (61.7%). High Ki-67 independently predicted high-grade carcinoma (OR = 5.601, p = 0.009) and was significantly associated with high grade (p = 0.011), non-papillary pattern (p < 0.001), and muscle invasion (p = 0.001), but not lymphovascular invasion (p = 0.405). Vimentin showed no significant associations and was not an independent predictor; Ki-67 and vimentin were not significantly correlated.
In this Egyptian cohort, Ki-67 ≥ 30% was strongly associated with adverse pathological features and independently predicted high-grade disease, supporting its use as a practical ancillary marker and providing regional validation of prior reports. Vimentin was not significantly associated with the studied parameters in this dataset. Future studies should assess alternative cutoffs within the same cohort and validate outcome-linked thresholds in longitudinal datasets.
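As a note on the regression output above, an odds ratio from multivariable logistic regression is the exponentiated model coefficient. A minimal sketch of that identity, using a hypothetical coefficient chosen to land near the scale of the reported Ki-67 effect (this is not the study's fitted model):

```python
# Sketch of the coefficient-to-odds-ratio identity from logistic regression.
# The coefficient below is hypothetical, not taken from the study.
import math

def odds_ratio(beta: float) -> float:
    """Odds ratio implied by a logistic regression coefficient: OR = exp(beta)."""
    return math.exp(beta)

# A coefficient of about 1.723 corresponds to an OR of roughly 5.6,
# the order of magnitude of the Ki-67 effect reported above.
print(round(odds_ratio(1.723), 1))  # 5.6
```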
Excessive sugar consumption has been implicated in the development of insulin resistance and diabetic nephropathy (DN). The present study aimed to establish a novel animal model of DN using a high-sugar diet (HSD) and evaluate the renoprotective effects of empagliflozin. Male Wistar rats were divided into four groups: Normal, Normal + Empagliflozin, Diabetic, and Diabetic + Empagliflozin. Diabetes was induced using a 35% sugar-water solution and a low dose of streptozotocin. Empagliflozin (15 mg/kg/day) was administered via gavage. Biochemical parameters, renal function markers, oxidative stress indicators, and histopathological assessments were performed. HSD significantly increased fasting blood glucose (FBS) (242.71 mg/dl in Diabetic vs. 96.6 mg/dl in Normal), insulin levels (0.63 vs. 0.288), and homeostatic model assessment for insulin resistance (HOMA-IR) (0.2716 vs. 0.0687), indicating severe insulin resistance. Empagliflozin treatment significantly reduced FBS (128 mg/dl), improved insulin sensitivity (insulin 0.2725, HOMA-IR 0.0854), and partially restored β-cell function (HOMA-B 1.5779). The diabetic group exhibited impaired renal function, with elevated blood urea nitrogen (BUN 38 mg/dl), creatinine (3.44 mg/dl), and proteinuria (2600 mg/24 h). Empagliflozin reduced these markers to near-normal levels (BUN 17 mg/dl, creatinine 0.44 mg/dl, proteinuria 135 mg/24 h). Oxidative stress parameters showed that empagliflozin increased antioxidant activities (SOD 83.31 U/ml, CAT 0.048 U/ml, GLT 0.341 nmol/ml) and decreased lipid peroxidation (MDA 7.69 nmol/ml). Histological analysis revealed that empagliflozin ameliorated glomerular and tubular damage, reducing necrosis and fibrosis in diabetic kidneys. A high-sugar diet induced insulin resistance and diabetic nephropathy in rats, characterized by metabolic disturbances, oxidative stress, and renal dysfunction.
Empagliflozin demonstrated significant renoprotective effects by enhancing insulin sensitivity, improving kidney function, reducing oxidative stress, and mitigating histopathological damage. These findings highlight HSD's potential as a key driver of insulin resistance and DN and empagliflozin's potential as a therapeutic agent in managing diabetes-induced kidney injury, particularly in the context of excessive dietary sugar intake.
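The HOMA-IR values reported above follow, in the standard Matthews formulation, from fasting glucose and insulin alone. A sketch assuming glucose in mg/dl (converted to mmol/L by dividing by 18) and the abstract's unstated insulin units; the Normal-group means reproduce the reported 0.0687, though other groups may not match exactly, so this is an illustration of the index rather than the authors' exact computation:

```python
# Sketch of the standard HOMA-IR formula (Matthews et al.), assuming glucose
# is given in mg/dl and insulin in the abstract's unstated units; the exact
# formula used by the study is an assumption here.

def homa_ir(glucose_mg_dl: float, insulin: float) -> float:
    """HOMA-IR = (fasting glucose [mmol/L] x fasting insulin) / 22.5."""
    glucose_mmol_l = glucose_mg_dl / 18.0
    return glucose_mmol_l * insulin / 22.5

# Normal-group means from the abstract: FBS 96.6 mg/dl, insulin 0.288.
print(round(homa_ir(96.6, 0.288), 4))  # 0.0687, matching the reported value
```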
Infections are the predominant etiologies of post-transplant mortality in India, yet a structured infectious disease curriculum for nephrology fellows is limited. We evaluated whether large language model (LLM)-augmented clinical reasoning could improve diagnostic accuracy, clinical completeness, and clinical efficiency in managing complex transplant infection scenarios. This was a prospective, single-center, pre (LLM-unaided)-post (LLM-aided)-study (washout period of 7 days). We investigated 1200 responses generated on 30 expert-validated kidney transplant infectious disease vignettes (50 sub-questions). 12 final-year nephrology fellows participated voluntarily in the study. Intervention was Open AI ChatGPT-4o generated exemplar responses. Responses were scored for accuracy (0-100%), completeness (0-3 scale), and time-to-answer by blinded assessors. Inter-rater reliability and paired differences were also analyzed. Median accuracy increased from 72% (65·5-78·5) in the unaided arm to 85·5% (79·0-91·0) in the aided arm with Δ =  + 13·0%, p-value < 0·001. Completeness improved from 1·9 (1·6-2·1) to 2·5 (2·3-2·8) (p-value < 0·001). Median time-to-answer decreased from 7.1 (5.9-8.4) to 5.9 (4.8-7.1) minutes (p = 0.002). The most significant gains occurred in fungal (+ 19·2%) and viral (+ 17·5%) infections, especially in immunosuppressant modulation and antifungal treatment protocols. Minimal or no benefit was observed in dual infection scenarios and niche dose-duration knowledge. Only a single clinically significant hallucination (3·3%) occurred. Our findings primarily support the role of LLM-generated outputs as structured educational tools to enhance clinical reasoning training among nephrology fellows. Real-world clinical deployment requires additional validation, safety safeguards, and prospective outcome studies.
This study aimed to investigate the relaxant effects of doxazosin (an α₁-adrenergic receptor blocker) and vardenafil (a phosphodiesterase type 5 inhibitor) on rabbit ureteral smooth muscle using an in vitro organ-bath model, and to determine whether sequential administration enhances relaxation, particularly in different ureteral segments. Ureteral segments (middle and distal) were obtained from 15 adult male New Zealand White rabbits and mounted in organ baths containing oxygenated Krebs-Henseleit solution at 37 °C. Each ureteral segment (middle and distal) obtained from the animals was treated as an independent experimental unit. Thus, a total of 15 middle and 15 distal ureteral segments were analyzed (n = 15 per group). Contractions were induced by 60 mmol/L KCl, and relaxation responses to doxazosin and vardenafil, applied separately or sequentially, were recorded isometrically. The degree of relaxation was expressed as a percentage of the initial KCl-induced contraction. Data were analyzed using one-way ANOVA followed by Tukey's post hoc test. Both doxazosin and vardenafil produced significant relaxation in KCl-precontracted tissues (p < 0.001). In the middle ureter, relaxation responses were 47.3 ± 3.6% for doxazosin and 33.7 ± 3.5% for vardenafil. In the distal ureter, relaxation increased to 54.6 ± 2.2% and 40.8 ± 2.2%, respectively. Sequential administration of doxazosin followed by vardenafil yielded the greatest relaxation (84.1 ± 3.9% in middle, 88.4 ± 3.1% in distal segments; p < 0.001), whereas the reverse order produced a lower but still significant response (57.4 ± 4.8% and 80.5 ± 4.4%, respectively). Overall, distal segments exhibited greater pharmacologic sensitivity than middle ones. Doxazosin and vardenafil both exert potent relaxant effects on ureteral smooth muscle, with enhanced efficacy when administered sequentially, especially in distal segments.
These findings provide mechanistic insight into the synergistic interaction between α₁-adrenergic blockade and PDE5 inhibition and support the rationale for combination therapy in facilitating distal ureteral stone passage.
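The relaxation endpoint above is simple arithmetic: the tension recovered after drug application, expressed as a percentage of the initial KCl-induced contraction amplitude. A sketch with hypothetical tension values (not study recordings):

```python
# Sketch of the percent-relaxation endpoint used above. Tension values (mN)
# are hypothetical illustrations, not data from the study.

def relaxation_pct(baseline: float, precontracted: float, after_drug: float) -> float:
    """Percent relaxation relative to the KCl-induced contraction amplitude."""
    contraction = precontracted - baseline   # amplitude of the KCl response
    recovered = precontracted - after_drug   # tension lost after the drug
    return 100.0 * recovered / contraction

# Hypothetical trace: resting tension 2 mN, KCl plateau 12 mN,
# tension falls to 7.27 mN after drug application.
print(round(relaxation_pct(2.0, 12.0, 7.27), 1))  # 47.3
```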
Whether racial/ethnic differences exist in time to nephrectomy (TTN) and whether prolonged TTN differentially affects cancer-specific mortality (CSM) remains unclear. We evaluated race/ethnicity as a predictor of prolonged TTN and assessed race/ethnicity-specific associations between TTN and CSM in localized renal cell carcinoma (RCC). Patients were identified within the Surveillance, Epidemiology and End Results database (2010-2021) and stratified according to race/ethnicity and TTN ≤ 3 vs. > 3 months. Multivariable logistic regression models, propensity score matching (PSM) and multivariable competing risks regression models were used. In 11,058 T1b-2 N0 M0 clear-cell RCC patients, TTN > 3 months was recorded in 1168 (15.6%) of 7506 Caucasians, in 505 (23.6%) of 2138 Hispanics, in 180 (26.6%) of 676 African Americans, and in 118 (16.0%) of 738 Asians and Pacific Islanders (API). Hispanic (OR 1.80, p < 0.001) and African American (OR 2.10, p < 0.001) race/ethnicity independently predicted higher proportions of TTN > 3 months, compared to Caucasian. Over the study span, the proportion of patients with TTN > 3 months increased significantly in all four racial/ethnic groups (all p < 0.01). After PSM, TTN > 3 months was associated with higher CSM in Caucasians (sHR 1.57, p < 0.001) and in Hispanics (sHR 1.55, p = 0.046), but not in African Americans or APIs. In localized RCC patients treated with nephrectomy, TTN > 3 months became more prevalent over time in all four examined racial/ethnic groups. In Hispanics and African Americans, TTN > 3 months proportions were higher than in Caucasians and APIs. TTN > 3 months was independently associated with higher CSM in Caucasians and Hispanics, but not in African Americans and APIs.
Kidney transplantation in recipients aged ≥ 70 years is increasing, yet outcome data remain heterogeneous, with current evidence being limited to pooled results from single and multicenter studies. This study aims to investigate the impact of advanced age on patient survival as well as graft-related outcomes from multicenter and registry-based studies and to propose a multicenter study to bridge any identified research gap. The PRISMA guidelines and the Cochrane Handbook were followed. This review included multicenter studies and registry-based comparative studies only, which are less prone to selection bias. Outcomes were analysed using risk ratios (RRs) at fixed time points, when applicable, with a random-effects Mantel-Haenszel model. Five multicenter studies including a total of 272,982 recipients, of whom 16,309 were aged ≥ 70 years, were included. Recipients aged ≥ 70 years had a lower overall and 5-year survival, RR = 0.87 (95% CI 0.77-0.99; p = 0.04, I² = 96%, τ² = 0.00) and RR = 0.86 (95% CI 0.81-0.92; p = 0.02, I² = 0%, τ² = 0.00), respectively, compared to recipients aged < 70 years. No significant differences were observed for 1- and 3-year patient survival between the two age groups. Moreover, no significant differences were observed for overall graft survival, 1-, 3-, or 5-year graft survival, delayed graft function, acute rejection and graft loss between the two age groups. Substantial heterogeneity was observed for long-term outcomes. Kidney transplantation in carefully selected recipients aged ≥ 70 years is associated with preserved graft survival and comparable early graft-related outcomes, but lower overall and 5-year survival, compared to those < 70 years of age. However, due to the limited number of studies included and substantial heterogeneity, future large multicenter studies including geriatric variables are required to determine the role of age on patient survival and graft-related outcomes. PROSPERO: CRD420261293247.
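The Mantel-Haenszel pooling named above combines per-study 2x2 tables into a single risk ratio. The review used a random-effects M-H model; the minimal sketch below shows the simpler fixed-effect M-H weighting for illustration only, with hypothetical counts (not the included studies' data):

```python
# Sketch of fixed-effect Mantel-Haenszel pooling of risk ratios. The review
# above used a random-effects M-H model; this simplified version illustrates
# the M-H weighting scheme with hypothetical 2x2 counts.

def mh_risk_ratio(studies):
    """studies: list of (events_exposed, n_exposed, events_control, n_control).

    Returns the Mantel-Haenszel pooled risk ratio:
    sum(e1 * n2 / N) / sum(e2 * n1 / N) over all studies.
    """
    num = den = 0.0
    for e1, n1, e2, n2 in studies:
        total = n1 + n2
        num += e1 * n2 / total
        den += e2 * n1 / total
    return num / den

# Two hypothetical studies, each with half the control-arm event risk:
print(mh_risk_ratio([(10, 100, 20, 100), (5, 50, 10, 50)]))  # 0.5
```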
Posterior urethral valves (PUV) represent a congenital obstructive uropathy, and despite advances in prenatal diagnosis and postnatal management, chronic kidney disease (CKD) develops in nearly 20% of affected patients. The aim of this study was to determine the clinical characteristics and kidney outcomes in children with PUV. A total of 52 boys followed with a diagnosis of posterior urethral valves were retrospectively evaluated. Data regarding antenatal diagnosis, age at admission, and timing of cystoscopic diagnosis were recorded. Clinical and laboratory data, including serum creatinine levels and glomerular filtration rate, ultrasonographic findings, urodynamic study results, DMSA findings, and voiding cystourethrography results, were collected. In addition, lower urinary tract symptoms and the frequency of lower urinary tract infections were documented. The mean age of the 52 patients was 9.8 ± 4.8 years, and the mean age at diagnosis was 28 ± 41 months. Antenatal diagnosis was present in 32 patients (64%) and CKD developed in 10 patients (19.2%). There was no significant difference in antenatal diagnosis rates between patients with and without CKD. In the non-CKD group, hydronephrosis grades decreased significantly at the last follow-up compared with baseline, whereas no significant improvement was observed in the CKD group. These patients had more severe and persistent hydronephrosis at final evaluation. Patients with CKD had higher ratios of increased bladder wall thickness and renal echogenicity at baseline and ureteral dilatation at final follow-up. In the baseline renal function model, baseline GFR was independently associated with CKD, but proteinuria did not retain statistical significance in multivariable analysis. There was no significant difference between groups regarding severe bladder dysfunction, DMSA renal scarring, vesicoureteral reflux grades, recurrent urinary tract infections, or incontinence.
Our findings suggest that renal damage in children with PUV may begin during the antenatal period. In this cohort, postnatal factors such as bladder dysfunction, vesicoureteral reflux, and recurrent urinary tract infections were not significantly associated with CKD. Persistent hydronephrosis at last follow-up was more commonly observed in children with CKD. In multivariable analysis, baseline renal function was significantly associated with CKD development.
Following implementation of a standardized hydrocelectomy protocol under local anesthesia combined with systemic analgesia at a tertiary referral center, we evaluated its feasibility, safety, and patient acceptance. In this retrospective single-center study, 35 consecutive patients underwent hydrocelectomy between May 2024 and August 2025 via a standardized in situ spermatic cord block. All patients received protocol-based systemic analgesia (1 g metamizole i.v. and 7.5 mg piritramide s.c.). Anticoagulants were paused 48 h preoperatively and resumed postoperatively, while antiplatelet therapy was continued. The primary endpoint was completion without conversion to analgosedation or general anesthesia. Secondary endpoints included procedural pain (VAS), postoperative complications (Clavien-Dindo), and patient satisfaction. Hydrocelectomy was completed under local anesthesia and systemic analgesic support in 33/35 patients (94%). One patient required conversion to analgosedation, one to general anesthesia. Seventeen patients (48.5%) were ASA III-IV. Median peak procedural pain was moderate (VAS 6, IQR 0-8) and inversely correlated with age (ρ = -0.485, p = 0.005). Two patients (5.7%) required surgical revision (Clavien-Dindo IIIb), and four minor complications were managed conservatively. No complications were attributed to the anesthetic technique, and no recurrence occurred within 90 days. Overall, 87.5% of patients were satisfied and would recommend the procedure. Hydrocelectomy under local anesthesia combined with systemic analgesia is feasible, safe, and well tolerated. Despite transient moderate procedural pain, most patients did not require conversion to procedural sedation or general anesthesia, with high satisfaction. These results support a patient-centered, resource-efficient approach, particularly for older and comorbid patients, including those with ASA III-IV classification, and warrant further prospective evaluation.
Prostate abscess is an uncommon but potentially life-threatening condition. In tropical Australia, melioidosis caused by Burkholderia pseudomallei represents a unique and under-recognized etiology. We aimed to evaluate drainage strategies and clinical outcomes of prostate abscess in a tropical referral center. A retrospective cohort study was conducted of all patients undergoing procedural drainage for radiologically confirmed prostate abscess at Cairns Hospital between August 2016 and October 2025. Demographic, clinical, microbiological, radiologic, and operative data were collected. Patients were stratified into melioid and non-melioid cohorts. Outcomes were analyzed descriptively by drainage modality and microbiological profile. A total of 84 drainage procedures were performed. The mean age was 60 years (range 17-95). Burkholderia pseudomallei accounted for 50% of cases. Transurethral deroofing (TUD) was the most common drainage modality (n = 69; 82%), followed by transrectal ultrasound-guided aspiration (n = 9), transperineal aspiration (n = 3), and transrectal incision and drainage (n = 3). Secondary intervention was required in 33% of patients undergoing primary aspiration compared with 1% following TUD. ICU admission occurred in 19 patients overall and was more frequent in those with melioid disease. Long-term urinary and sexual symptoms were more frequently documented following TUD, although interpretation is limited by the retrospective design and lack of validated outcome measures. Prostate abscess in tropical northern Australia is strongly associated with melioidosis and systemic infection. Transurethral deroofing is commonly utilized for multifocal and complex disease, while minimally invasive approaches may be appropriate in selected cases. Observed differences in outcomes likely reflect disease severity and selection bias. Early recognition and tailored drainage strategies are essential to optimize outcomes.
Interstitial cystitis/bladder pain syndrome (IC/BPS) is a chronic inflammatory bladder disorder with a poorly understood etiology and limited therapeutic options. Ferroptosis, an iron-dependent form of regulated cell death driven by lipid peroxidation, has recently been implicated in bladder epithelial damage, suggesting a potential role in IC/BPS pathogenesis. To systematically review available preclinical studies examining the potential role of ferroptosis in experimental models relevant to interstitial cystitis, with a focus on molecular mechanisms and ferroptosis-related biomarkers. This systematic review was conducted in accordance with the PRISMA 2020 guidelines. A structured PICO framework was used to identify eligible preclinical studies that evaluated ferroptosis in animal models or ex vivo systems mimicking interstitial cystitis. Data were extracted regarding ferroptosis markers, signaling pathways, and therapeutic interventions. Four preclinical studies were included, involving rodent models of cyclophosphamide- and LPS-induced cystitis, cystitis glandularis, and one study incorporating human bladder biopsies. Across all studies, ferroptosis was consistently associated with bladder injury, characterized by decreased expression of GPX4 and SLC7A11, elevated malondialdehyde (MDA), and increased lipid ROS. Key regulatory pathways involved included Wnt/β-catenin, NF-κB, and Nrf2. Ferroptosis inhibitors such as dexrazoxane, hydrogen sulfide donors, dietary restriction, and pachymic acid attenuated bladder damage by restoring redox homeostasis. None of the studies evaluated pain-related outcomes or functional bladder parameters, which limits the translational interpretation of findings in the context of IC/BPS symptomatology. This systematic review highlights ferroptosis as a compelling mechanistic link between chronic inflammation and urothelial injury in interstitial cystitis/bladder pain syndrome. 
Targeting ferroptosis may offer a novel disease-modifying strategy in a condition where current treatments remain largely symptomatic. Future translation into clinical settings could involve biomarker-guided phenotyping and intravesical delivery of ferroptosis-modulating agents to achieve localized and pathogenesis-based therapy. However, given the limited number of studies and the lack of human validation, these findings should be considered preliminary and hypothesis-generating.
To evaluate finerenone-associated adverse events (AEs) and to investigate the association between finerenone use and renal injury via data mining of the Food and Drug Administration Adverse Event Reporting System (FAERS). To minimize statistical bias, the data extraction period was set from database inception (2004) to provide a stable background for disproportionality analysis. Four disproportionality algorithms (ROR, PRR, BCPNN, and MGPS) and stricter case-screening methods were employed to improve analytical precision. Additionally, a clinical priority evaluation was conducted to rank clinical risks and surveillance levels for these AEs. Supplementary analysis was performed to assess the relationship between finerenone and renal injury, as well as associated risk factors. A total of 1316 finerenone-related reports were identified. Thirty AEs were detected as significantly positive signals, with most being related to renal function (15 PTs, 50%), blood pressure (5 PTs, 16.67%), and blood potassium (4 PTs, 13.33%). Among them, blood glucose increased, blood creatine increased, and flank pain were new potential AEs. Acute kidney injury, hyperkalemia, renal impairment, glomerular filtration rate decreased, blood creatinine increased, blood potassium increased, and hyponatremia exhibited moderate clinical priority levels and warrant further study. Signals reflecting renal injury were detected in patients regardless of baseline nephropathy. Male sex, taking more than 3 drugs, and using amlodipine may be risk factors for finerenone-related nephrotoxicity. These results highlight new finerenone-related AEs, provide ranked guidance for pharmacovigilance through clinical priority evaluation, and clarify factors that influence renal injury, providing guidance for individualized treatment and improved drug safety.
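Of the four disproportionality algorithms named above, the reporting odds ratio (ROR) is the simplest: it compares the odds of the target AE being reported with the target drug against the odds with all other drugs, using the standard 2x2 contingency table. A sketch with hypothetical counts (not finerenone data), including the usual Wald 95% CI:

```python
# Sketch of the reporting odds ratio (ROR) with a Wald 95% CI, one of the
# four disproportionality algorithms mentioned above. Counts are hypothetical.
import math

def ror_with_ci(a: int, b: int, c: int, d: int):
    """a: target drug + target AE reports; b: target drug, other AEs;
    c: other drugs, target AE; d: other drugs, other AEs."""
    ror = (a * d) / (b * c)
    se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)
    lower = math.exp(math.log(ror) - 1.96 * se)
    upper = math.exp(math.log(ror) + 1.96 * se)
    return ror, lower, upper

ror, lower, upper = ror_with_ci(10, 20, 30, 240)
# A signal is conventionally flagged when the lower CI bound exceeds 1.
print(round(ror, 1), lower > 1)  # 4.0 True
```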
Observational evidence indicates an association between sleep disorders and the risk of nephrolithiasis (kidney stones), but whether this relationship reflects a causal effect remains uncertain. We investigated the causal relationship between genetically determined sleep disorders and the risk of nephrolithiasis using a two-sample Mendelian randomization (MR) approach. This two-sample MR study utilized genome-wide association summary data for both sleep disorders and nephrolithiasis. The main analysis was conducted using the inverse variance weighted method, while confirmatory analyses employed the weighted mode, weighted median, and MR-Egger regression approaches to test for consistency and robustness. Heterogeneity among instrumental variables was evaluated using Cochran's Q-test, and the potential for horizontal pleiotropy was assessed with the MR-Egger regression intercept. To further gauge the robustness and consistency of the findings, a leave-one-out sensitivity analysis was performed, systematically omitting each single-nucleotide polymorphism in turn. Genetically predicted sleep apnea was associated with nephrolithiasis (OR = 1.0032, 95% CI: 1.0011-1.0053, P = 0.0025) and nephrolithiasis intervention (OR = 1.0027, 95% CI: 1.0008-1.0046, P = 0.0057), and combined sleep disorders were associated with nephrolithiasis (OR = 1.0042, 95% CI: 1.0014-1.0070, P = 0.0032) and nephrolithiasis intervention (OR = 1.0027, 95% CI: 1.0004-1.0050, P = 0.0240). The MR-Egger test indicated no pleiotropy, and the Q-test revealed no heterogeneity for the causal associations above. The MR-PRESSO test showed no horizontal pleiotropy among the tested SNPs for the identified causal associations. The leave-one-out analysis revealed similar results. This MR study revealed causal associations between sleep disorders and nephrolithiasis. Further studies are needed to confirm these findings and elucidate the underlying mechanisms.
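The inverse variance weighted (IVW) method used as the main analysis above combines each SNP's Wald ratio (outcome effect divided by exposure effect) with weights proportional to the inverse variance of the outcome effect. A minimal fixed-effect sketch with toy summary statistics (not the study's GWAS data):

```python
# Sketch of the fixed-effect inverse-variance-weighted (IVW) MR estimator.
# Per-SNP summary statistics below are toy values, not study data.

def ivw_estimate(beta_exp, beta_out, se_out):
    """IVW causal estimate: weighted combination of per-SNP Wald ratios,
    with weights beta_exposure^2 / se_outcome^2."""
    num = sum(bx * by / s ** 2 for bx, by, s in zip(beta_exp, beta_out, se_out))
    den = sum(bx ** 2 / s ** 2 for bx, s in zip(beta_exp, se_out))
    return num / den

# Toy example: every SNP has a Wald ratio of 0.5 (beta_out / beta_exp),
# so the IVW estimate is 0.5 regardless of the weights.
print(ivw_estimate([0.1, 0.2, 0.3], [0.05, 0.10, 0.15], [0.01, 0.02, 0.01]))
```

In practice this is run on log-odds scales, which is why the exponentiated ORs reported above sit very close to 1 even when statistically significant.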
Post-transplant hypertension is common in kidney transplant recipients and contributes to cardiovascular risk and allograft dysfunction. Most available data come from studies where the primary indication for SGLT2 inhibitor use was post-transplant diabetes mellitus or cardiorenal protection, with blood pressure assessed as a secondary outcome. To systematically evaluate the impact of SGLT2 inhibitors on blood pressure, metabolic and renal outcomes, and safety in kidney transplant recipients. PubMed, Embase, Scopus, and clinical trial registries were systematically searched from inception to October 20, 2025. Randomized controlled trials and observational studies were included. Primary outcomes were changes in systolic and diastolic blood pressure at 3, 6, and 12 months. Secondary outcomes included body weight, glycated hemoglobin (HbA1c), renal function, and adverse events. Twelve studies comprising 1,292 participants were included. In controlled difference-in-differences analyses (5 studies), SGLT2 inhibitors showed no significant blood pressure reductions versus control at any time point. Exploratory single-arm analyses suggested modest within-group systolic blood pressure reductions at 3 and 6 months, together with favorable metabolic effects and an acceptable safety profile; however, these estimates are at high risk of bias and cannot establish a treatment effect. Most available evidence derives from studies in which SGLT2 inhibitors were not initiated specifically for blood pressure control. Dedicated randomized controlled trials are required to determine their role in the management of post-transplant hypertension.
To compare perioperative outcomes between the 48-h short-stay pathway and traditional inpatient management for patients undergoing robot-assisted partial nephrectomy (RAPN), and to evaluate the feasibility, safety, recovery efficiency, and economic benefits of the 48-h short-stay pathway. This retrospective study included 175 patients who underwent RAPN between February 2022 and June 2024. Patients were assigned to a 48-h short-stay group (n = 60) or a traditional inpatient group (n = 115). A 1:1 propensity score matching (PSM) was conducted to balance baseline characteristics, including age, sex, BMI, comorbidities, tumor features, surgeon identity, and surgical year. Perioperative outcomes, recovery indicators, complications, and medical costs were compared. After PSM, 53 matched pairs were analyzed. The short-stay group showed significantly shorter operative time, less intraoperative blood loss, shorter warm ischemia time, earlier mobilization, earlier oral intake, faster bowel function recovery, and shorter bed rest (all P < 0.05). The short-stay group had 71.7% of patients discharged on postoperative day (POD) 1 and 100% within 48 h, while the traditional group had 22.6% on POD1, 34.0% on POD2, and 43.4% on POD ≥ 3 (P < 0.001). Both total and postoperative hospital stays were significantly shorter in the short-stay group (2.00 vs. 6.00 days, P < 0.001), with lower hospitalization costs (P < 0.001). Postoperative creatinine was lower in the short-stay group (P = 0.023), while creatinine change was comparable (P = 0.063). Complication rates, emergency department visits, and 30-day readmission rates were similar between groups (all P > 0.05). The short-stay group had a significantly lower drain placement rate (P = 0.002) without increased adverse events. The 48-h short-stay pathway for selected patients undergoing RAPN is feasible and safe.
It accelerates postoperative recovery, shortens hospital stay, reduces medical costs, and optimizes healthcare resource utilization, without compromising safety or early oncological outcomes.
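The 1:1 PSM described above pairs each short-stay patient with the most similar traditional-care patient on an estimated propensity score. A minimal sketch of greedy nearest-neighbor matching on pre-computed scores; the scores, IDs, and caliper below are hypothetical, not the study's data:

```python
# Sketch of greedy 1:1 nearest-neighbor matching on pre-computed propensity
# scores. Scores, patient IDs, and the 0.05 caliper are hypothetical.

def match_1to1(treated, controls, caliper=0.05):
    """Pair each treated score with the closest unused control within caliper.
    treated/controls: dicts of id -> propensity score. Returns list of pairs."""
    pairs, used = [], set()
    for t_id, t_ps in sorted(treated.items(), key=lambda kv: kv[1]):
        best, best_d = None, caliper
        for c_id, c_ps in controls.items():
            d = abs(t_ps - c_ps)
            if c_id not in used and d <= best_d:
                best, best_d = c_id, d
        if best is not None:          # treated subjects without a close
            used.add(best)            # control are dropped, which is why
            pairs.append((t_id, best))  # 60 treated can yield 53 pairs
    return pairs

treated = {"T1": 0.62, "T2": 0.35, "T3": 0.80}
controls = {"C1": 0.60, "C2": 0.37, "C3": 0.55, "C4": 0.90}
print(match_1to1(treated, controls))  # [('T2', 'C2'), ('T1', 'C1')]
```

Note that T3 finds no control within the caliper and is excluded, mirroring how matching can reduce the analyzed sample (here, 60 short-stay patients yielding 53 pairs).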
Metastatic urothelial carcinoma has undergone a major therapeutic transition, with first-line immune checkpoint inhibitor (ICI)-based combinations now established as standard care. In this context, the key clinical question is no longer whether early immunotherapy should be used, but how evidence from historical deferred approaches, switch-maintenance, and contemporary upfront regimens can be interpreted through the lens of post-progression survival (PPS), a major determinant of overall survival (OS). A narrative synthesis of randomized phase II-III trials (2016-2025), PubMed-indexed studies, and real-world datasets was performed. Data on efficacy (OS, progression-free survival [PFS], PPS), safety, and rates of access to second-line therapy were extracted from JAVELIN Bladder 100, KEYNOTE-045, EV-302/KEYNOTE-A39, CheckMate-901, KEYNOTE-361, IMvigor130, and DANUBE. Avelumab maintenance improved OS (23.8 vs 15.0 months) despite >50% crossover to ICIs. Pembrolizumab after progression prolonged OS (10.3 vs 7.4 months) without PFS benefit, highlighting PPS as a major survival determinant. Real-world evidence indicates that only a minority of patients initiating first-line therapy ultimately receive second-line treatment, supporting therapeutic strategies that ensure earlier exposure to immunotherapy. Frontline combinations, such as enfortumab vedotin plus pembrolizumab and nivolumab plus cisplatin/gemcitabine, have, therefore, redefined the treatment paradigm and are now preferred standards or validated frontline options, whereas pembrolizumab-chemotherapy, atezolizumab-based combinations, and dual-checkpoint approaches have not demonstrated comparable survival benefit. Early integration of immunotherapy remains the key principle underlying survival improvement in metastatic urothelial carcinoma, but this concept must now be interpreted within a modern therapeutic framework in which enfortumab vedotin plus pembrolizumab is the preferred first-line regimen for most patients.
Avelumab maintenance and post-platinum pembrolizumab remain historically pivotal benchmark strategies and may still retain relevance where newer regimens are unavailable or unsuitable. Across treatment eras, PPS appears to be a major driver of OS and deserves more systematic reporting in future trials.
This study aims to examine the impact of frailty on the survival outcomes of patients undergoing maintenance hemodialysis (HD) and to develop a predictive model for mortality risk. In this prospective cohort study, 400 HD patients were enrolled and followed for 24 months. Frailty was assessed by the Fried phenotype. Depression and anxiety were evaluated using the PHQ-9 and GAD-7 scales, respectively. Patients were randomly split into a model development group (n = 280) and a validation group (n = 120). Kaplan-Meier curves and the log-rank test were used for survival analysis. Independent predictors were identified using LASSO-Cox regression to construct a nomogram. Model performance was evaluated using the C-index, calibration curves, and decision curve analysis (DCA). The prevalence of frailty was 45.75%. Multivariable analysis identified frailty (HR = 1.85, 95% CI 1.03-3.36), age (HR = 1.04, 95% CI 1.01-1.07), depression (HR = 4.91, 95% CI 2.00-12.04), anxiety (HR = 3.49, 95% CI 1.78-6.83), cardiovascular disease (HR = 2.06, 95% CI 1.13-3.78), serum creatinine (HR = 1.004, 95% CI 1.003-1.005), and total cholesterol (HR = 1.50, 95% CI 1.13-2.00) as independent risk factors (all P < 0.05). The model demonstrated a C-index of 0.903. In the validation cohort, the AUCs were 0.889 (6-month), 0.897 (1-year), and 0.941 (2-year). Calibration and DCA confirmed good accuracy and clinical utility. Frailty is prevalent and independently associated with mortality in HD patients. The developed nomogram provides an accurate tool for individualized risk prediction. The particularly strong influence of depression and anxiety on survival underscores the critical need for integrating routine psychological screening into the clinical management of hemodialysis patients.
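The C-index of 0.903 reported above is Harrell's concordance index: across usable patient pairs, the fraction in which the model assigns the higher risk to the patient who fails earlier. A minimal pure-Python sketch with hypothetical data (not the cohort's):

```python
# Minimal sketch of Harrell's concordance index (C-index), the discrimination
# metric reported for the nomogram. Example data are hypothetical.

def c_index(times, events, risks):
    """Fraction of usable pairs where the higher-risk patient fails earlier.
    times: follow-up times; events: 1 = death observed, 0 = censored;
    risks: model risk scores (higher = worse predicted prognosis)."""
    concordant, usable = 0.0, 0
    n = len(times)
    for i in range(n):
        for j in range(n):
            # A pair is usable if subject i had an observed event before time j.
            if events[i] == 1 and times[i] < times[j]:
                usable += 1
                if risks[i] > risks[j]:
                    concordant += 1.0
                elif risks[i] == risks[j]:
                    concordant += 0.5   # ties count as half-concordant
    return concordant / usable

times  = [5, 10, 12, 20]       # months of follow-up (hypothetical)
events = [1, 1, 0, 0]          # 1 = death observed, 0 = censored
risks  = [0.9, 0.6, 0.4, 0.2]  # nomogram risk scores
print(c_index(times, events, risks))  # 1.0: perfectly concordant ranking
```

A value of 0.5 would indicate no discrimination, so the reported 0.903 reflects strong separation of high- and low-risk patients.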
Percutaneous nephrolithotomy (PCNL) access is traditionally obtained via papillary puncture to minimize bleeding, though non-papillary access is frequently utilized in everyday clinical practice due to practical considerations such as anatomical variations and stone location. Because existing comparative studies have yielded inconsistent findings, this systematic review and meta-analysis aimed to synthesize existing evidence evaluating the safety and efficacy of papillary versus non-papillary access during PCNL. A comprehensive systematic search of databases, including PubMed, Embase, the Cochrane Central Register of Controlled Trials, and ClinicalTrials.gov, alongside grey literature, was conducted from inception until December 2025. Randomized controlled trials and observational studies comparing papillary and non-papillary access for renal calculi in patients undergoing PCNL were included. Data were pooled using a random-effects model to calculate mean differences (MD) and odds ratios (OR) alongside their 95% confidence intervals (CIs). A total of six studies comprising 856 patients met the inclusion criteria. Pooled analyses demonstrated no statistically significant differences between the papillary and non-papillary access groups regarding hemoglobin drop (MD 0.09 g/dL, 95% CI -0.19 to 0.37), transfusion rates (OR 1.28, 95% CI 0.59 to 2.74), changes in postoperative serum creatinine levels (MD 0.02 mg/dL, 95% CI -0.03 to 0.07), duration of hospital stay (MD 0.08 days, 95% CI -0.28 to 0.44), stone-free status (OR 1.20, 95% CI 0.76 to 1.89), or operative duration (MD 2.81 min, 95% CI -2.06 to 7.68). Heterogeneity across most outcomes was minimal. Meta-regression identified stone size as the only significant moderator, which negatively influenced operative duration. Papillary and non-papillary access techniques during PCNL appear to provide comparable clinical outcomes with no significant differences in major perioperative parameters. 
The clinical decision-making regarding the selection of access technique should be individualized, considering patient anatomy, stone characteristics, and surgical expertise rather than strict adherence to a single standardized puncture strategy.
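The pooled mean differences above come from a random-effects model, which down-weights each study by its within-study variance plus an estimated between-study variance. A sketch of the classic DerSimonian-Laird estimator; the effect sizes and standard errors are hypothetical, not the review's data:

```python
import math

# Sketch of DerSimonian-Laird random-effects pooling of mean differences,
# the model class used in the meta-analysis above. Inputs are hypothetical.

def dl_pool(effects, ses):
    """Return (pooled effect, 95% CI) under a DL random-effects model."""
    w = [1 / se**2 for se in ses]                        # fixed-effect weights
    fixed = sum(wi * yi for wi, yi in zip(w, effects)) / sum(w)
    q = sum(wi * (yi - fixed)**2 for wi, yi in zip(w, effects))  # Cochran's Q
    df = len(effects) - 1
    c = sum(w) - sum(wi**2 for wi in w) / sum(w)
    tau2 = max(0.0, (q - df) / c)                        # between-study variance
    w_re = [1 / (se**2 + tau2) for se in ses]            # random-effects weights
    pooled = sum(wi * yi for wi, yi in zip(w_re, effects)) / sum(w_re)
    se_pooled = math.sqrt(1 / sum(w_re))
    return pooled, (pooled - 1.96 * se_pooled, pooled + 1.96 * se_pooled)

# Hypothetical hemoglobin-drop differences (g/dL) from three studies:
pooled, ci = dl_pool(effects=[0.10, -0.05, 0.20], ses=[0.15, 0.20, 0.25])
print(round(pooled, 2), [round(x, 2) for x in ci])
```

When the confidence interval spans zero, as in this toy example, the pooled difference is non-significant, which is the pattern the review reports for hemoglobin drop and the other perioperative outcomes.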
Percutaneous nephrolithotomy (PCNL) with nephrostomy drainage remains the standard treatment for large and complex renal calculi; however, the comparative effectiveness of tubeless PCNL (T-PCNL) versus standard PCNL (S-PCNL), particularly across different nephrostomy tube sizes, remains uncertain. This meta-analysis compares T-PCNL versus S-PCNL across nephrostomy bore sizes to provide a comprehensive evaluation of outcomes. We conducted a comprehensive database search until July 2024, focusing on randomized controlled trials (RCTs) that compared T-PCNL and S-PCNL. We included studies that reported outcomes related to operative time, stone-free rate (SFR), post-operative complications, and length of hospital stay. Results were expressed as standardized mean differences (SMD), mean differences (MD), and risk ratios (RR), with 95% confidence intervals (CI). Twenty-eight RCTs comprising 2,171 patients were included. T-PCNL significantly reduced hospital stay (SMD = -1.12; p < 0.00001), analgesic use (SMD = -1.01; p < 0.00001), and postoperative pain (SMD = -1.05; p = 0.01), particularly when compared to medium- and large-bore tubes. No significant differences were found in SFR (RR = 1.03; p = 0.15), transfusion, urosepsis, or Clavien-Dindo grade II-IV events. Operative time and residual stone burden were also lower with T-PCNL. T-PCNL is a safe and effective alternative to S-PCNL, offering faster recovery and fewer postoperative symptoms. Its benefits are most pronounced when compared with larger-bore nephrostomy tubes, supporting its broader adoption in appropriately selected patients.
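The standardized mean differences reported above (e.g., SMD = -1.12 for hospital stay) express group differences in pooled standard-deviation units, making outcomes measured on different scales comparable across trials. A minimal sketch of Hedges' g, one common SMD variant; the group summaries below are hypothetical, not trial data:

```python
import math

# Sketch of a standardized mean difference (Hedges' g), the effect measure
# reported for hospital stay and pain above. Group summaries are hypothetical.

def hedges_g(m1, sd1, n1, m2, sd2, n2):
    """SMD with pooled SD and small-sample (Hedges) correction."""
    sp = math.sqrt(((n1 - 1) * sd1**2 + (n2 - 1) * sd2**2) / (n1 + n2 - 2))
    d = (m1 - m2) / sp                   # Cohen's d
    j = 1 - 3 / (4 * (n1 + n2) - 9)      # small-sample correction factor
    return d * j

# Hypothetical hospital stay (days): tubeless vs standard PCNL arms
g = hedges_g(m1=2.1, sd1=0.8, n1=40, m2=3.3, sd2=1.0, n2=40)
print(round(g, 2))  # -1.31: negative g favors the tubeless arm (shorter stay)
```

A magnitude above 0.8 is conventionally read as a large effect, which is why the pooled SMDs near -1 for hospital stay, analgesic use, and pain represent clinically meaningful differences.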