IgG4-related disease is a rare, immune-mediated, multisystem fibroinflammatory condition. It is characterized by elevated serum IgG4 concentrations and tissue infiltration of IgG4-positive plasma cells with distinctive histopathological features, including storiform fibrosis and obliterative phlebitis. Gastroenterological involvement is diverse and represents a critical diagnostic consideration, encompassing the pancreas, bile ducts, liver, esophagus, stomach, and intestine. In this review, we summarize the current understanding of genetic susceptibility and environmental risk factors underlying this disease, and systematically describe the clinical presentations, histopathological characteristics, imaging findings, and serological profiles of each major gastroenterological manifestation. We further review evidence-based treatment regimens ranging from glucocorticoids and conventional immunosuppressants to emerging biologic therapies, discuss organ-specific therapeutic responses, and identify predictors of disease relapse along with long-term surveillance strategies. From a gastroenterologist's perspective, this review aims to provide a practical and integrated framework to facilitate early recognition, accurate differentiation from mimicking conditions such as pancreaticobiliary malignancies, and optimized long-term management of this complex yet treatable disease.
Autosomal dominant acute porphyrias are rare inherited disorders of haem biosynthesis characterised by accumulation of potentially neurotoxic porphyrin precursors and attacks of severe abdominal pain with autonomic and neuropsychiatric features. Disease severity ranges from asymptomatic individuals to those with recurrent, life-threatening attacks. The International Porphyria Network invited 34 acute porphyria specialists from 17 countries to form an expert panel. The invited group included clinicians from diverse specialities (i.e., internal medicine, haematology, endocrinology, gastroenterology, hepatology, neurology, and biochemistry), together with laboratory scientists and patient representatives. The panel met online (in 2023-25) to develop 15 evidence-based recommendations, using the Grading of Recommendations, Assessment, Development, and Evaluation (GRADE) framework, addressing attack prevention, management of sporadic and recurrent attacks, long-term follow-up, surveillance for primary liver cancer, and family screening. The guidelines support safe, consistent clinical care and improved outcomes, recognising global variation in resources and access to high-cost drugs, and highlighting priorities for future research.
Genetic polymorphisms of alcohol-metabolizing enzymes influence drinking behavior and susceptibility to alcohol-related liver disease (ALD). This study aimed to clarify the distribution of ALDH2 and ADH1B genotypes in liver transplant recipients with ALD, and to explore associations with post-transplant drinking behavior using a recipient-donor paired design. Twenty-four living donor liver transplant (LDLT) recipients with ALD, their corresponding living donors, and 50 healthy controls were analyzed. Single nucleotide polymorphisms of ALDH2 (rs671) and ADH1B (rs1229984) were determined by the Cycleave PCR method. Drinking behavior before and after transplantation was assessed by self-report and electronic health records. Post-transplant alcohol use was classified as habitual (≥ once per week) or occasional (< once per week). ALDH2 genotype distribution differed among the three groups (p = 0.002), with enrichment of the ALDH2*1/*1 genotype in ALD recipients (92%). ADH1B genotype distribution also differed among groups (p = 0.041), although the effect size was modest. During a median follow-up of 5.3 years (range, 0.3-15.9 years), 11 recipients (45%) reported post-transplant alcohol use, including 5 habitual drinkers (20%). In an exploratory analysis, a non-significant trend toward higher rates of habitual drinking was observed among recipients who received grafts from donors with the ALDH2*1/*1 genotype compared with other genotypes (40% vs. 7%, p = 0.12). The ALDH2*1/*1 genotype was more prevalent among liver transplant recipients with alcohol-related cirrhosis than among donors or healthy controls, supporting a role of alcohol tolerance-related genetic background in progression to end-stage liver disease. Exploratory findings further suggest that donor ALDH2 genotype may influence post-transplant drinking behavior.
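As an illustration of the genotype-distribution comparison reported above (not the study's code; the abstract gives percentages, not the raw table, so the counts below are hypothetical), a scipy sketch might look like:

```python
from scipy.stats import chi2_contingency

# Hypothetical ALDH2 genotype counts (*1/*1, *1/*2, *2/*2) per group;
# chosen only to mirror the reported ~92% *1/*1 enrichment in recipients.
table = [
    [22, 2, 0],   # ALD recipients (n = 24)
    [14, 8, 2],   # living donors (n = 24)
    [27, 19, 4],  # healthy controls (n = 50)
]
chi2, p, dof, expected = chi2_contingency(table)
print(f"chi2 = {chi2:.2f}, dof = {dof}, p = {p:.4f}")
```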
Improvement in hepatic reserve after balloon-occluded retrograde transvenous obliteration (BRTO) in patients with gastric varices (GVs) has not been fully established. The relationship between increases in liver volume (LV) or splenic volume (SV) after BRTO and prognosis is still unclear. In this study, we aimed to evaluate the factors associated with improvement in hepatic reserve after BRTO in GV patients and the relationship between changes in LV or SV after BRTO and prognosis. We retrospectively enrolled 258 patients who received their first BRTO for GV treatment at 12 institutions between January 2004 and May 2019. Hepatic reserve, LV, and SV were evaluated before and 6 months after BRTO. Changes in hepatic reserve were evaluated in 160 patients. Albumin levels and prothrombin time-international normalized ratio improved significantly, while platelet counts decreased significantly at 6 months after BRTO. Multivariate logistic regression analysis showed that history of hepatocellular carcinoma and modified albumin-bilirubin (mALBI) grade were independent factors associated with the improvement of albumin-bilirubin (ALBI) score. The ALBI score significantly improved in patients with mALBI grade 2b or 3 (p < 0.001), but not in patients with mALBI grade 1 or 2a. Eighty-three patients who underwent abdominal computed tomography examination 6 months after BRTO had significantly increased LV and SV (LV, p < 0.01; SV, p < 0.01). The patients with a > 10% increase in SV had significantly poorer prognosis than the others (p = 0.03). BRTO for GVs leads to improvement of hepatic reserve. Patients with increased SV after BRTO had poor prognosis. Not applicable.
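For readers unfamiliar with the ALBI score and mALBI grading used above, a minimal sketch of the standard published formula and cutoffs (assuming bilirubin in μmol/L and albumin in g/L; illustrative values, not study data) is:

```python
import math

def albi_score(bilirubin_umol_l: float, albumin_g_l: float) -> float:
    # ALBI = 0.66 * log10(bilirubin [umol/L]) - 0.085 * albumin [g/L]
    return 0.66 * math.log10(bilirubin_umol_l) - 0.085 * albumin_g_l

def malbi_grade(albi: float) -> str:
    # Conventional modified ALBI cutoffs:
    # grade 1 <= -2.60; 2a <= -2.27; 2b <= -1.39; 3 > -1.39
    if albi <= -2.60:
        return "1"
    if albi <= -2.27:
        return "2a"
    if albi <= -1.39:
        return "2b"
    return "3"

# Example: bilirubin 25 umol/L, albumin 32 g/L
s = albi_score(25, 32)
print(f"ALBI = {s:.2f}, mALBI grade {malbi_grade(s)}")  # ALBI = -1.80, grade 2b
```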
Chronic hepatitis B (CHB) is characterized by progressive structural and functional liver impairment involving hepatic dysfunction, fibrosis accumulation, and nutritional-immune imbalance. Although non-invasive indices derived from routine laboratory parameters, such as the albumin-bilirubin (ALBI) score, fibrosis-4 (FIB-4) index, and prognostic nutritional index (PNI), are widely used, their integrated value for stage-based risk stratification in HBV-related liver disease remains unclear. We aimed to develop and internally validate a clinically applicable, non-invasive risk stratification framework based on ALBI, FIB-4, and PNI, and to evaluate its performance across different clinical stages of HBV-related liver disease. We conducted a single-center retrospective cross-sectional study including 842 hospitalized patients with HBV-related liver disease. Patients were categorized as chronic hepatitis B, compensated cirrhosis, or decompensated cirrhosis at admission. ALBI, FIB-4, and PNI were calculated from routine laboratory data. Ordinal logistic regression was used to identify factors associated with advanced disease stage. Discriminative performance was assessed using receiver operating characteristic (ROC) analysis, and calibration was evaluated. Factors associated with hepatic encephalopathy were explored in patients with decompensated cirrhosis. ALBI and FIB-4 increased with higher disease stage, whereas PNI declined significantly (all P < 0.001). Multivariable analysis identified age, ALBI, and FIB-4 as independent factors associated with advanced stage, while PNI and platelet count were inversely associated. The combined framework was associated with improved discrimination for disease stages compared with individual indices, with further enhancement after incorporation of prothrombin time (PT). Higher ALBI values were independently associated with hepatic encephalopathy among patients with decompensated cirrhosis. An integrated framework combining ALBI, FIB-4, PNI, and PT may provide a simple, interpretable, and cost-accessible tool for stage-based risk stratification in HBV-related liver disease within hospitalized populations.
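The other two indices in this framework also have simple closed forms (the ALBI formula is sketched after the preceding abstract); a sketch using their standard published definitions, with units noted in the comments, is:

```python
import math

def fib4(age_years: float, ast_u_l: float, alt_u_l: float,
         platelets_10e9_l: float) -> float:
    # FIB-4 = (age [years] x AST [U/L]) / (platelets [10^9/L] x sqrt(ALT [U/L]))
    return (age_years * ast_u_l) / (platelets_10e9_l * math.sqrt(alt_u_l))

def pni(albumin_g_dl: float, lymphocytes_per_mm3: float) -> float:
    # Onodera PNI = 10 x albumin [g/dL] + 0.005 x total lymphocyte count [/mm^3]
    return 10 * albumin_g_dl + 0.005 * lymphocytes_per_mm3

# Illustrative values only
print(round(fib4(55, 48, 40, 120), 2))  # 3.48
print(round(pni(3.5, 1500), 1))         # 42.5
```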
Professional identity formation in healthcare professions education (HPE) is a complex and often fraught process. Physicians and trainees experience identity struggles as they navigate their personal and professional development, along with the conflicting expectations, cultural norms and systemic pressures within the healthcare environment. Despite growing attention to these challenges, conceptual clarity around identity struggle remains limited. To advance theoretical understanding of identity struggle, we conducted a critical narrative review of identity theories from developmental psychology, social psychology and sociology. We selected three major theoretical traditions-Neo-Eriksonian, symbolic interactionist and social identity theories-and analysed their conceptualizations of identity, identity development and identity struggle. We examined these diverse perspectives to inform research and educational practice on professional identity formation in HPE. Each theoretical tradition offers distinct insights into identity struggle. Neo-Eriksonian theories emphasize exploration and commitment as central processes, framing struggle as developmental and potentially productive. Symbolic interactionist theories highlight the role of socialization and identity dissonance, viewing struggle as emerging from tensions between personal agency and societal norms. Social identity theories focus on group belonging and intergroup dynamics, conceptualizing struggle at both individual and socio-contextual levels. We provide common critiques and limitations of each theoretical tradition. These perspectives illuminate varied mechanisms through which identity struggle manifests and evolves. This review underscores the multifaceted nature of identity struggle and the value of theoretical pluralism in understanding professional identity formation. Struggle is not inherently negative; rather, it can be a catalyst for growth when appropriately framed and supported. We propose how educators and researchers might use these theoretical lenses to design interventions that foster productive identity development and address systemic contributors to identity struggles. We invite scholars drawing on critical perspectives of power and structure to challenge and deepen the conversation on identity struggle in HPE.
High demand for endoscopic procedures contributes to prolonged wait times and limited access to care, particularly in safety-net health systems. Missed appointments, including no-shows and late reschedules, further strain resources and delay diagnosis and treatment. Epic Systems provides a proprietary Risk of Patient No-Show Model, previously validated in primary care settings, but its performance in predicting attendance for endoscopy appointments has not been evaluated. A retrospective cohort study was conducted among adults scheduled for outpatient endoscopy at a single safety-net hospital between January 1, 2023, and August 1, 2024. Patients aged ≥ 18 years with at least one prior encounter in the electronic medical record (EMR) were included; inpatient procedures and early cancellations (> 2 days before the appointment) were excluded. The primary outcome was appointment completion versus no-show. Epic's predicted no-show risk, demographic characteristics, procedure type, and patient portal activation status were extracted. The relationship between predicted and observed no-show rates was assessed using the coefficient of determination (R²) and linear regression. Secondary analyses stratified results by patient portal activation. Among 4,658 unique patients, the median age was 59 years, 52.2% were male, and the cohort was racially and linguistically diverse. Overall, 1,493 patients (32.1%) did not attend their scheduled endoscopy. Epic's predicted no-show risk demonstrated a strong linear correlation with actual no-show rates (R² = 0.87). Observed missed appointment rates followed the equation: Missed Appointment Rate = 1.30 × (Epic Risk) + 0.16, indicating a baseline no-show rate of 16%. Each percentage-point increase in Epic's predicted risk corresponded to a 1.3-point increase in observed no-show rate. Patients without an activated patient portal (MyChart) had approximately 5-percentage-point higher no-show rates across the risk spectrum. Epic's Risk of Patient No-Show model shows strong correlation with real-world endoscopy attendance and may support predictive overbooking and targeted outreach to improve endoscopy unit efficiency. Given its integration within the EMR, this tool offers a practical framework for operational interventions, though further validation across diverse health systems is warranted.
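The reported calibration line translates directly into code; a minimal illustrative sketch (using only the coefficients reported for this cohort, with the portal effect approximated as a constant shift) is:

```python
def observed_noshow_rate(epic_risk: float, portal_active: bool = True) -> float:
    """Observed missed-appointment rate from Epic's predicted no-show risk.

    Fitted line from this cohort: rate = 1.30 * risk + 0.16. Patients without
    an active portal showed roughly a 5-percentage-point higher rate across
    the risk spectrum, modeled here as a constant shift (an approximation).
    Inputs and outputs are proportions in [0, 1].
    """
    rate = 1.30 * epic_risk + 0.16
    if not portal_active:
        rate += 0.05
    return min(rate, 1.0)

print(f"{observed_noshow_rate(0.20):.0%}")                       # 42%
print(f"{observed_noshow_rate(0.20, portal_active=False):.0%}")  # 47%
```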
Quality control can reduce variations in operators' performance conducting magnetically controlled capsule gastroscopy (MCCG). However, an optimal method for quality control in routine MCCG procedures is still lacking. This study aimed to develop an automatic quality control system (AQCS) and assess its effectiveness in a clinical trial. We developed the AQCS using convolutional neural network (CNN) models to monitor inspection completeness, evaluate gastric cleanliness, and identify suspicious lesions. Then, patients were prospectively randomized to undergo routine MCCG with or without AQCS assistance. The primary outcome was the blind spot rate in the AQCS and control groups. The CNN model demonstrated specificity of 98.27-99.30% and sensitivity of 76.33-96.35% in gastric site identification. Between August 27, 2021, and July 28, 2022, a total of 200 patients were randomized, with 98 and 96 patients analyzed in the AQCS and control groups, respectively. Compared to the control group, the AQCS group achieved lower blind spot rates (median: 0.00% vs. 16.67%, P < 0.01), higher lesion detection rates (75.51% vs. 60.42%, P = 0.02), and comparable gastric examination time (28.63 min vs. 27.58 min, P = 0.48). Additionally, AQCS showed high consistency with expert evaluations in cleanliness assessment (Kappa = 0.95, P < 0.01). No serious adverse events occurred in either group. AQCS significantly reduced the blind spot rate during MCCG procedures. It could be a powerful assistant tool to mitigate operator skill variability and enhance the overall quality of routine MCCG examinations. ClinicalTrials.gov Identifier NCT04954677.
Diagnostic delay in Crohn's disease (CD) may increase the risk of poor prognosis. This study developed and validated a machine learning (ML) model for clinicians to conduct preliminary diagnosis and refer suspected patients. This retrospective dual-center cohort study analyzed electronic medical records (EMRs) from two Chinese tertiary hospitals (2020-2024). Patients with a first-time CD diagnosis and disease duration ≤24 months were included, while those aged <18 years, with incomplete medical histories, or with prior use of aminosalicylates, glucocorticoids, or biologics were excluded. Nine algorithms were trained, including logistic regression (LR), support vector machine (SVM), gradient boosting machine (GBM), and neural network (NN). Model performance was evaluated using the area under the receiver operating characteristic curve (AUROC), calibration curves, and decision curve analysis. The interpretability of the optimal model was enhanced by SHapley Additive exPlanations (SHAP) values. The final model was developed using data from 526 patients at Jinling Hospital (276 CD and 250 non-IBD), with 10 predictive features selected from 28 candidate variables through univariate and multivariate logistic regression. The GBM model showed superior performance, achieving an AUROC of 0.998 (95% CI: 0.995-1.000) in the training cohort and 0.954 (95% CI: 0.923-0.985) in the test cohort. External validation in 196 patients at Zhongda Hospital (100 CD and 96 non-IBD) yielded an AUROC of 0.944 and a well-calibrated Brier score of 0.094. Using dual-center data, an interpretable ML model was developed and validated to support early CD diagnosis and referral decisions.
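As a schematic of the modeling pipeline described (not the authors' code; the features and labels below are synthetic stand-ins for the 10 selected EMR variables), a scikit-learn sketch might be:

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

# Synthetic stand-in for the development cohort (CD vs. non-IBD, 10 features).
X, y = make_classification(n_samples=526, n_features=10, n_informative=6,
                           random_state=0)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.3, stratify=y, random_state=0)

gbm = GradientBoostingClassifier(random_state=0).fit(X_train, y_train)
auroc = roc_auc_score(y_test, gbm.predict_proba(X_test)[:, 1])
print(f"test AUROC = {auroc:.3f}")
```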
Magnetic resonance imaging (MRI) is the preferred staging modality in the evaluation of rectal cancer. We aimed to evaluate the accuracy of MRI among large rectal polyps referred for endoscopic resection. We analyzed consecutive patients who underwent MRI prior to endoscopic resection for rectal neoplasia; 44 patients with large rectal polyps (mean size 5.8 cm) were included. MRI categorized 24 (54.5%) lesions as T0/T1, 16 (36.4%) as T2, and 4 (9.1%) as T3. Final pathology demonstrated 5 (11.4%) adenomas, 21 (47.7%) high-grade dysplasia, 5 (11.4%) intramucosal adenocarcinoma, 9 (20.5%) pT1 adenocarcinoma, and 4 (9.1%) pT2 adenocarcinoma. Findings were discordant in 21 (47.7%) patients (p < 0.01): MRI over-staged 18 (40.9%) and under-staged 3 (6.8%) patients. We demonstrate that MRI over-stages over 40% (18/44) of large rectal polyps. MRI staging should be interpreted cautiously when considering endoscopic resection for large rectal polyps.
To identify risk factors associated with poor prognosis in patients with severe fever with thrombocytopenia syndrome (SFTS) and develop a prognostic model based on these factors. A retrospective analysis was conducted on 207 patients with SFTS admitted to Tongji Hospital from April 1, 2023, to July 18, 2024. Patients were categorized into survival (n = 133) and death (n = 74) groups based on their prognosis. Univariate and multivariate logistic regression analyses were performed to identify independent predictors of mortality, incorporating demographic characteristics and inflammatory biomarkers measured within 24 h of hospital admission. A nomogram model was constructed using R software based on the regression coefficients of the identified predictors. The model's discriminative ability was evaluated using the receiver operating characteristic (ROC) curve, with the area under the curve (AUC) and concordance index (C-index) calculated. Internal validation was performed using the Bootstrap resampling method (1,000 iterations). Furthermore, an external validation cohort comprising 55 patients with SFTS admitted to our hospital between August 2024 and February 2025 was retrospectively collected to evaluate the model's generalizability and stability. Age, viral load, procalcitonin (PCT), and interleukin-10 (IL-10) were identified as independent risk factors for poor prognosis. A nomogram model incorporating these four factors demonstrated robust predictive performance, yielding an AUC of 0.905 (95% CI, 0.862-0.949; P < 0.001). Internal and external validations confirmed the model's stability and strong prognostic performance in patients with SFTS. Decision curve analysis (DCA) showed that the nomogram yielded a higher net benefit over a wider threshold probability range than previous models in predicting SFTS mortality. This study provides a novel prognostic model for SFTS patients, which may aid in early risk stratification. However, its clinical utility and generalizability need to be further validated in larger cohorts. Not applicable.
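Bootstrap internal validation of the kind described (1,000 resamples of the model's discrimination) can be sketched as follows, using hypothetical predicted probabilities and outcomes rather than the study's data:

```python
import numpy as np
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(0)
# Hypothetical nomogram-predicted mortality probabilities and observed outcomes.
y_true = rng.integers(0, 2, size=207)
y_prob = np.clip(0.3 * y_true + rng.normal(0.35, 0.2, size=207), 0, 1)

aucs = []
for _ in range(1000):  # bootstrap resampling with replacement
    idx = rng.integers(0, len(y_true), size=len(y_true))
    if len(np.unique(y_true[idx])) < 2:
        continue  # skip degenerate resamples containing a single class
    aucs.append(roc_auc_score(y_true[idx], y_prob[idx]))

lo, hi = np.percentile(aucs, [2.5, 97.5])
print(f"bootstrap AUC = {np.mean(aucs):.3f} (95% CI {lo:.3f}-{hi:.3f})")
```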
Human albumin (HA) infusion is beneficial for the management of decompensated cirrhosis, but its mechanism remains unclear. This study aimed to evaluate the effect of HA infusion on systemic inflammation, oxidative stress, and prognosis in decompensated cirrhosis. In a cohort study, patients with acute decompensated cirrhosis were enrolled and categorized according to HA infusion. Serum interleukin (IL)-6, IL-10, tumor necrosis factor-alpha (TNF-α), superoxide dismutase (SOD), malondialdehyde (MDA), and glutathione (GSH) levels were measured before and after treatment. We compared the one-year cumulative incidence of further decompensation between the two groups. In a meta-analysis, all relevant papers were systematically searched, and their results were pooled using a random-effects model. In the cohort study, 55 patients were included, of whom 23 received HA infusion. HA infusion significantly reduced IL-6 (92.79 ± 144.14 vs. 66.99 ± 89.61, P = 0.031) and MDA (15.19 ± 8.44 vs. 11.05 ± 6.77, P = 0.006) levels, but increased GSH (135.76 ± 14.91 vs. 142.00 ± 19.69, P = 0.033) level. ∆IL-6 (-25.80 ± 64.28 vs. 1.83 ± 25.05, P = 0.013) and ∆TNF-α (-12.95 ± 28.91 vs. 1.82 ± 11.31, P = 0.037) were greater in the HA group than in the control group. One-year cumulative incidence of further decompensation was significantly lower in the HA group than in the control group (P = 0.036). HA infusion was an independent protective factor of further decompensation (sHR = 0.312, P = 0.02). In the meta-analysis, 7 papers with 457 patients were included. ∆IL-6, ∆TNF-α, and ∆MDA were greater in the HA group than in the control group, but the difference was not statistically significant. HA infusion may reduce the risk of further decompensation by improving systemic inflammation and oxidative stress in decompensated cirrhosis.
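Random-effects pooling of the kind used in the meta-analysis is commonly computed with the DerSimonian-Laird estimator; a minimal sketch on hypothetical per-study mean differences and variances (not the seven included papers' data) is:

```python
import numpy as np

def dersimonian_laird(effects, variances):
    """Pool study effects with a DerSimonian-Laird random-effects model."""
    y, v = np.asarray(effects, float), np.asarray(variances, float)
    w = 1.0 / v                              # fixed-effect (inverse-variance) weights
    y_fe = np.sum(w * y) / np.sum(w)
    q = np.sum(w * (y - y_fe) ** 2)          # Cochran's Q
    c = np.sum(w) - np.sum(w ** 2) / np.sum(w)
    tau2 = max(0.0, (q - (len(y) - 1)) / c)  # between-study variance
    w_re = 1.0 / (v + tau2)                  # random-effects weights
    pooled = np.sum(w_re * y) / np.sum(w_re)
    se = np.sqrt(1.0 / np.sum(w_re))
    return pooled, pooled - 1.96 * se, pooled + 1.96 * se

# Hypothetical ΔIL-6 mean differences (HA vs. control) from 7 studies.
print(dersimonian_laird([-20, -5, -12, 3, -8, -15, -2],
                        [36, 25, 49, 30, 42, 55, 28]))
```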
Gastric diseases are a major health burden, especially in East Asia, with high incidence and mortality rates. While Helicobacter pylori infection and poor lifestyle behaviors are established risk factors, the impact of drug-related gastric injury in aging populations remains underexplored. In China, polypharmacy is common among older adults, with nearly 40% using self-purchased medications without professional guidance. Similar trends are observed in other aging societies, highlighting the global challenge of unregulated drug use. Existing studies lack adjustments for comorbidities, prediction tools, and methodological innovation. This study analyzes CHARLS 2018 data to: (1) examine the relationship between self-purchased medication use and gastric disease, and (2) identify strongly associated subgroups, such as patients with chronic kidney disease (CKD) and diabetes. We included 19,752 participants from the CHARLS database. Participants were divided into gastric disease and control groups based on DA007_10 questionnaires. We compared baseline characteristics, including self-purchased medication use, between groups. Multivariate logistic regression models were used to assess the correlation between self-purchased medication use and gastric disease. The receiver operating characteristic (ROC) curve evaluated Model 3's predictive performance. Stratified analysis was conducted to assess the stability of the results. Significant differences were observed between the gastric disease (n = 659) and control (n = 5,428) groups in factors such as ethnicity, self-reported health status, hypertension, dyslipidemia, diabetes, chronic lung diseases, liver disease, kidney disease, smoking history, and medication use. Self-purchased medication use was associated with an increased prevalence of gastric disease in all models, including the fully adjusted Model 3 (adjusted OR = 1.75, 95% CI 1.46-2.09, p < 0.001). The model demonstrated moderate discriminatory ability in distinguishing individuals with and without gastric disease (AUC = 0.71). Stratified analyses revealed that the association between self-purchased medication use and gastric disease risk was consistent across covariates, with stronger effects observed in participants with CKD (adjusted OR = 1.89) and diabetes (adjusted OR = 1.82). In summary, self-purchased medication use was positively correlated with the risk of gastric disease in Chinese older adults (adjusted OR = 1.75), and the association was stronger among those with chronic kidney disease (adjusted OR = 1.89) or diabetes (adjusted OR = 1.82). These findings provide a reference for risk stratification and prevention strategies for gastric disease.
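As an illustration of how such adjusted odds ratios are obtained (entirely synthetic data; not the CHARLS analysis itself, and the coefficient is seeded only to echo the reported OR of about 1.75), a statsmodels sketch is:

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 6087  # hypothetical analytic sample (659 cases, 5,428 controls)
self_purchase = rng.integers(0, 2, n)
age = rng.normal(60, 8, n)
# Simulate outcome with a true log-odds coefficient of ln(1.75) ≈ 0.56.
logit = -2.6 + 0.56 * self_purchase + 0.01 * (age - 60)
gastric = rng.binomial(1, 1 / (1 + np.exp(-logit)))

X = sm.add_constant(np.column_stack([self_purchase, age]))
fit = sm.Logit(gastric, X).fit(disp=0)
or_, lo, hi = np.exp(fit.params[1]), *np.exp(fit.conf_int()[1])
print(f"adjusted OR = {or_:.2f} (95% CI {lo:.2f}-{hi:.2f})")
```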
Background: While food insecurity (FI) assesses household access to sufficient food, increasing attention is being paid to nutrition security (NS), the ability to obtain health-promoting foods. Aim: This cross-sectional analysis examined differences in sociodemographics, diet quality, and health status between food-insecure families with low versus high NS and assessed whether FI and NS severity were associated with food assistance program use. Methods: Data were obtained from the 2023 Massachusetts Statewide Food Access Survey of >3000 adults. FI was assessed using the United States Department of Agriculture 18-item Household Food Security Module, NS via a validated four-item scale, and diet quality via the Prime Diet Quality Score (PDQS). FI and NS were analyzed as binary and continuous variables. Inferential statistics assessed group differences. Logistic regression assessed associations between the severity of FI and NS and food assistance program use. Summary: Households with FI, ≥1 child (≤18 years), and annual income ≤$50,000 (n = 335) were included. Forty-three percent reported low NS. Low (vs. high) household NS was associated with a higher prevalence of child-level FI (79% vs. 43%, P < 0.001), SNAP use (83% vs. 70%, P = 0.009), food pantry use (69% vs. 44%, P < 0.001), adult overweight/obesity (39% vs. 17%, P < 0.001), and lower adult PDQS (43 ± 5.1 vs. 44 ± 4.3, P = 0.023), indicating lower diet quality. More severe household FI and lower NS were associated with higher odds of SNAP and food pantry use (P < 0.001). These findings support expanding food assistance programs to increase nutritious food access for families with children before food and nutrition insecurity become severe.
This study aimed to investigate the relationship between admission serum lipase levels, prognostic clinical scoring systems, and in-hospital mortality in patients presenting to the emergency department with acute pancreatitis. This retrospective cohort study included 389 patients hospitalized with acute pancreatitis at a tertiary care center between 2021 and 2023. Demographic characteristics, clinical findings, laboratory parameters obtained at admission and at 48 h, and imaging findings were collected. Prognostic assessment was performed using the Ranson, APACHE II, BISAP, Glasgow-Imrie, HAPS, and CTSI scoring systems. Patients were categorized according to a predefined serum lipase cutoff of 600 U/L. The associations between serum lipase levels, prognostic score categories, and in-hospital mortality were analyzed, and the diagnostic performance of the lipase cutoff was evaluated. The overall in-hospital mortality rate was 3.9% (n = 15). Admission serum lipase levels were significantly associated with the Ranson score (p < 0.05) and several laboratory parameters. A serum lipase cutoff of 600 U/L identified patients with a Ranson score ≥ 3 with a sensitivity of 70.2% (95% CI: 60.4-78.8) and a specificity of 57.3% (95% CI: 51.2-63.2). Higher prognostic scores, including Ranson, BISAP, and APACHE II, were significantly associated with in-hospital mortality (p < 0.01). However, admission serum lipase levels were not significantly associated with in-hospital mortality. Admission serum lipase showed a limited association with selected prognostic indicators in acute pancreatitis but was not associated with in-hospital mortality. These findings suggest that serum lipase has limited value as a standalone prognostic marker and should be interpreted in conjunction with established clinical scoring systems. Further prospective studies are needed to better clarify its role in risk assessment.
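The reported cutoff performance reduces to a simple 2×2 computation; the counts below are hypothetical, back-calculated only so that the total (389) and the published percentages are reproduced (the study's actual table is not given):

```python
def sens_spec(tp: int, fn: int, tn: int, fp: int) -> tuple[float, float]:
    # Sensitivity = TP / (TP + FN); Specificity = TN / (TN + FP)
    return tp / (tp + fn), tn / (tn + fp)

# Hypothetical counts: lipase >= 600 U/L (test) vs. Ranson score >= 3 (reference)
sens, spec = sens_spec(tp=66, fn=28, tn=169, fp=126)
print(f"sensitivity {sens:.1%}, specificity {spec:.1%}")  # 70.2%, 57.3%
```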
5-Fluorouracil (5-FU) ± targeted therapy is a standard of care for frail/elderly patients with unresectable colorectal adenocarcinoma (CRC) in the first-line setting. The combination of panitumumab plus sotorasib (a KRAS G12C inhibitor) is promising in advanced lines in KRAS G12C-mutated CRC. Here we assess the safety and efficacy of 5-FU combined with panitumumab and sotorasib as first-line treatment in frail/elderly patients with unresectable KRAS G12C-mutated CRC. In this ENGIC 01 - PRODIGE 107 - FFCD 2306 - COLOSOTO multicenter, open-label, prospective, single-arm phase II trial, the main inclusion criteria are adult patients with unresectable locally advanced or metastatic KRAS G12C-mutated CRC who are unfit for doublet/triplet chemotherapy. All patients will receive 5-FU plus panitumumab and sotorasib in 2-week cycles until progression or intolerance. The primary endpoint is 8-month progression-free survival (PFS). The secondary endpoints include median PFS, disease control rate, time to progression, overall survival, best objective response rate, duration of response, safety profile, quality of life, and geriatric assessment. An 8-month PFS of 70% is expected (H0 < 50%), and 37 patients will need to be included. Treatment with 5-FU plus panitumumab and sotorasib could be a promising alternative to 5-FU ± targeted therapy in the first-line setting in frail/elderly patients with unresectable KRAS G12C-mutated CRC.
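The stated sample size is consistent with a single-stage A'Hern-type exact binomial design; a sketch verifying this (assuming one-sided α = 5% and 80% power, which the abstract does not state explicitly) is:

```python
from scipy.stats import binom

def single_stage_design(p0, p1, alpha=0.05, power=0.80, n_max=100):
    """Smallest single-stage design (n, r): reject H0 if >= r successes."""
    for n in range(10, n_max + 1):
        for r in range(n + 1):
            type1 = 1 - binom.cdf(r - 1, n, p0)  # P(X >= r | p0)
            pwr = 1 - binom.cdf(r - 1, n, p1)    # P(X >= r | p1)
            if type1 <= alpha and pwr >= power:
                return n, r
    return None

# H0: 8-month PFS <= 50%; expected: 70%
print(single_stage_design(p0=0.50, p1=0.70))  # (37, 24): 37 patients, as planned
```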
Fontan circulation is associated with progressive multisystem dysfunction, yet its biochemical mechanisms are poorly understood. Gut microbiota-derived metabolites, particularly short-chain fatty acids (SCFAs) and bile acids, shape cardiovascular health. We previously reported elevated secondary bile acids in Fontan patients, but their SCFA profile remains uncharacterized. Fontan patients and matched healthy subjects were evaluated by body composition, frailty, cardiopulmonary exercise testing, hemodynamics, and plasma SCFA quantification. Twenty Fontan patients (25.5 years [IQR: 22.8-30.3]; 35% women) and 20 healthy controls (30.0 years [25.8-34.3]; 30% women) were enrolled. Compared to controls, Fontan patients exhibited elevated plasma levels of several SCFAs: propionic acid (1.84 [1.45-2.68] vs. 1.19 [1.07-1.49] μM; p = 0.002), butyric acid (1.27 [0.90-1.71] vs. 0.75 [0.52-0.94] μM; p = 0.002), valeric acid (0.25 [0.15-0.36] vs. 0.13 [0.11-0.16] μM; p < 0.001), and caproic acid (0.44 [0.35-0.67] vs. 0.25 [0.21-0.39] μM; p < 0.001). Acetic acid levels did not differ significantly between groups. Additionally, branched-chain SCFAs were elevated in Fontan patients: isobutyric acid (0.44 [0.32-0.68] vs. 0.26 [0.23-0.30] μM; p < 0.001) and 2-methylbutyric acid (0.38 [0.27-0.58] vs. 0.19 [0.15-0.25] μM; p < 0.001). Notably, caproic, isobutyric, and 2-methylbutyric acids showed strong correlations with key clinical and hemodynamic parameters. Furthermore, isobutyric and 2-methylbutyric acids were significantly correlated with dehydrolithocholic acid levels (R = 0.67 and 0.54, respectively) and other bile acid components. Fontan patients have elevated plasma straight and branched SCFAs linked to adverse clinical and hemodynamic profiles; further evaluation is warranted.
Human liver tissue-derived organoids recapitulate key hepatic phenotypes but are commonly maintained under static conditions, whereas microfluidic organ-on-chip systems provide controllable perfusion and mass transport. Scalable integration of human liver tissue-derived organoids into a perfused, human-relevant Liver-on-Chip remains limited. We combined healthy human liver tissue-derived organoids with a high-throughput three-lane OrganoPlate microfluidic format to establish a perfused organoid Liver-on-Chip (HepLoC) featuring 3D luminal tubules under continuous flow. After hepatocyte-directed differentiation under perfusion, bioengineered HepLoC formed mature hepatocyte-like architectures with increased mature hepatocyte marker proteins, enrichment of hepatic transcriptomic signatures, and functional bile canaliculi. As a proof-of-concept for drug-induced liver injury, troglitazone induced dose-dependent hepatocyte injury accompanied by tight-junction disruption, MRP2 mislocalization, and impaired bile acid export, recapitulating key features of cholestatic liver injury. To model metabolic liver disease, free fatty acids triggered lipid droplet accumulation, increased triglycerides and reactive oxygen species, and upregulated lipogenic and inflammatory genes while largely preserving viability, consistent with early-stage metabolic dysfunction-associated fatty liver disease. The high-throughput HepLoC format further enabled parallel testing of reference hepatotoxic drugs and of curcumin liposomes, which reduced lipid accumulation in fatty-acid-treated HepLoC with minimal hepatotoxicity. Our perfused, organoid-based microfluidic Liver-on-Chip recapitulates essential human liver structure and function and enables integrated, parallel evaluation of hepatotoxicity and optimization of nanotherapeutic strategies, helping to decipher mechanisms of liver disease and bridge the gap between preclinical research and clinical translation.