In this work, we propose an incremental pulse-width erase (IPWE) scheme for fast and variation-tolerant gate-induced drain leakage (GIDL) erase of 3D NAND flash. The GIDL erase operation requires GIDL-generated hole accumulation to raise the channel potential; this requirement introduces a transient state that degrades erase speed and broadens the distribution of the erased Vth. The degradation becomes more pronounced with critical-dimension (CD) variation and temperature variation. The proposed IPWE scheme progressively increases the erase pulse width, rather than increasing the erase voltage as in the conventional incremental step pulse erase (ISPE) scheme. Sentaurus TCAD simulations of a 3D NAND flash with a surrounded BL PAD structure demonstrate that, for the same total erase time of 6.6 ms, the IPWE scheme achieves a 1.18 V larger Vth shift than the ISPE scheme. The IPWE scheme also effectively narrows the erase Vth shift distribution within the same 6.6 ms total erase time, reducing it by 40 mV under a 55 nm CD variation, by 0.26 V for a 10 nm CD variation between channel strings, and by 2 V across a 50 K temperature variation.
Electronic health records contain years of longitudinal clinical notes rich in evolving patient information, yet their volume, redundancy, and fragmentation limit clinical usability and scalable modeling. We present CLIN-SUMM (Clinical Longitudinal Insight from Notes using Summarization), a framework that restructures summarization as a longitudinal representation problem. Rather than collapsing histories into static summaries, CLIN-SUMM incrementally constructs structured, categorized, date-partitioned patient representations, summarizing only newly documented information at each encounter while preserving temporal fidelity without access to future data. This standardized representation can be computed once and reused across downstream tasks, thereby decoupling narrative processing from prediction. Across 12,356 Massachusetts General Hospital patients, CLIN-SUMM achieved 70% token reduction while maintaining high clinician-rated correctness and completeness. Using dementia as a case study, fine-tuning Clinical ModernBERT on CLIN-SUMM summaries yielded AUROC 0.86 for diagnosis and 0.81 for 3-year risk prediction, with longitudinal analyses demonstrating progressive risk separation years before a formal diagnosis. CLIN-SUMM summaries also enabled efficient extraction of longitudinal medication trajectories, improving medication capture compared to structured EHR data while maintaining high dosage agreement. CLIN-SUMM provides a scalable representation layer for clinical review and longitudinal machine learning, enhancing disease modeling, risk prediction, and other longitudinal reasoning tasks.
Disease-related malnutrition is highly prevalent among hospitalized patients and is associated with increased mortality, complications, and prolonged hospital stays. Early identification of patients at nutritional risk is therefore essential to improve clinical outcomes. The Controlling Nutritional Status (CONUT) score is an objective prognostic immunometabolic marker derived from serum albumin, total cholesterol, and lymphocyte count. This study aimed to evaluate the prognostic value of the CONUT score for in-hospital mortality and length of hospital stay (LOS) in hospitalized patients with moderate-to-severe nutritional risk and to determine whether incorporating CONUT improves the predictive performance of a clinical model based on routine admission variables. A retrospective observational cohort study was conducted, including 671 adult patients admitted to a tertiary university hospital with CONUT ≥ 6. Multivariable logistic regression was used to assess predictors of in-hospital mortality, while LOS was analyzed using multivariable linear regression. Model discrimination was evaluated using receiver operating characteristic (ROC) curve analysis and comparison of the area under the curve (AUC). Higher CONUT scores were independently associated with increased in-hospital mortality. Each one-point increase in CONUT was associated with 28% higher odds of death (OR 1.28; 95% CI 1.13-1.46; p < 0.001). Patients with a severe CONUT score had significantly higher mortality compared with those with a moderate CONUT score (OR 1.77; 95% CI 1.12-2.81; p = 0.004). Incorporating CONUT into the clinical prediction model significantly improved discrimination, increasing the AUC from 0.728 to 0.753 (DeLong p = 0.035). Higher CONUT values were also associated with longer hospital stays: each additional point corresponded to a 5.4% increase in LOS (p = 0.009), and a severe CONUT score was associated with a 17.6% longer stay (p = 0.027). 
The CONUT score is independently associated with in-hospital mortality and prolonged hospitalization. While its incremental discriminative improvement is modest, its automated calculation from routine laboratory data makes it a practical and scalable tool for early risk stratification.
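As a hedged illustration (not part of the study itself), the per-point effects reported above compound multiplicatively if the underlying models are linear in the CONUT score on the logit and log-LOS scales, which the per-point OR of 1.28 and the 5.4% per-point LOS increase imply but the abstract does not state explicitly:

```python
# Hedged sketch: compounding of per-point effects under assumed
# log-linear model forms (an assumption, not stated in the abstract).

def odds_ratio_for_delta(or_per_point: float, delta_points: int) -> float:
    """Under a logistic model linear in CONUT, a k-point difference
    multiplies the odds of in-hospital death by OR**k."""
    return or_per_point ** delta_points

def los_multiplier(pct_per_point: float, delta_points: int) -> float:
    """Under a log-linear LOS model, each additional point multiplies
    expected LOS by (1 + pct_per_point)."""
    return (1 + pct_per_point) ** delta_points

# Reported per-point odds ratio for mortality: 1.28
print(round(odds_ratio_for_delta(1.28, 3), 2))  # 3-point rise -> OR ~2.1
# Reported per-point LOS increase: 5.4%
print(round(los_multiplier(0.054, 3), 3))       # ~1.171, i.e. ~17% longer stay
```

The roughly 17% LOS increase implied for a 3-point difference is broadly consistent with the 17.6% longer stay reported for severe scores, though the study estimated that contrast directly rather than by compounding.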
Coronary flow reserve reflects microvascular function, whereas filling pressure indicates myocardial hemodynamic burden. In angina with nonobstructive coronary arteries, abnormalities in either could contribute to a supply-demand mismatch; however, their combined prognostic significance remains unclear. Patients with angina with nonobstructive coronary arteries who underwent invasive coronary reactivity testing were studied retrospectively. Coronary microvascular dysfunction (CMD) was defined as coronary flow reserve <2.5. An artificial intelligence-enabled ECG marker of elevated filling pressure (AIEFP) was applied as a scalable prognostic marker of myocardial hemodynamic stress. Associations with major adverse cardiovascular events were evaluated using Cox regression and propensity score overlap weighting. Among 1957 patients with angina with nonobstructive coronary arteries (median age, 51.8 years; 67.2% women), CMD+ was identified in 26.4% and AIEFP+ in 6.6%. Over 8 years, 204 patients (10.4%) experienced major adverse cardiovascular events. In nested models, both CMD (hazard ratio [HR]=1.42 [95% CI=1.13-1.78]; P=0.002) and AIEFP (HR=2.28 [95% CI=1.67-3.11]; P<0.001) remained associated with major adverse cardiovascular events after adjustment. Adding AIEFP to the base model improved model fit (Δχ²=23.65, P<0.001; ΔAkaike information criterion=-21.7) and modestly improved discrimination (ΔHarrell concordance index=0.02, P=0.02). In a 4-group model, adjusted risks increased in a graded manner. Overlap-weighted analysis after covariate balancing demonstrated distinct temporal risk trajectories: CMD+/AIEFP+ had the highest 1-year risk (12.7%), and AIEFP+ had the highest cumulative 8-year risk (57% in CMD- and 51.8% in CMD+). CMD and AIEFP provide complementary prognostic information, with AIEFP remaining significantly associated with outcomes after multivariable adjustment.
A dual-axis framework integrating myocardial hemodynamic stress (AIEFP) and microvascular function (coronary flow reserve) may enhance risk stratification and enable identification of clinically distinct angina with nonobstructive coronary arteries phenotypes with divergent trajectories.
Background: We evaluated whether adding S100B to NSE improved discrimination or high-specificity rule-in of poor neurological outcome after out-of-hospital cardiac arrest (OHCA). Methods: In this single-center retrospective cohort study, comatose adult OHCA survivors treated with targeted temperature management had NSE and S100B measured at 0, 24, 48, and 72 h after return of spontaneous circulation. At each time point, we assessed NSE alone, S100B alone, and a logistic model combining both biomarkers in paired complete cases. Discrimination was assessed using the area under the receiver operating characteristic curve (AUC). Rule-in performance was evaluated using a timepoint-specific threshold that achieved 100% specificity in our cohort. Poor neurological outcome was defined as cerebral performance category 3-5 at 6 months. Results: Among 124 patients, 66 (53.2%) had poor outcomes. AUCs were similar between NSE alone and the combination across all time points (all p > 0.3). At 48 h, the combination ruled in 46/65 (70.8%) patients with poor outcome versus 36/65 (55.4%) with NSE alone, identifying 10 additional patients and a 15.4-percentage-point difference (95% confidence interval, -5.6 to 23.6). Conclusions: Adding S100B to NSE did not improve overall discrimination. The higher 48 h rule-in yield was estimated imprecisely and should be interpreted cautiously. Our findings require external validation before they can be translated to clinical settings.
The IMforte trial supported lurbinectedin as a recommended first-line maintenance therapy for patients with extensive-stage small-cell lung cancer (ES-SCLC), but its cost-effectiveness remains uncertain for patients and clinical decision-makers. This study aimed to evaluate the cost-effectiveness of adding lurbinectedin to atezolizumab as maintenance therapy for ES-SCLC and to assess the impact of drug wastage and mitigation strategies from the perspective of the US healthcare system. A partitioned survival model was developed to compare the cost-effectiveness of lurbinectedin plus atezolizumab versus atezolizumab alone. Key outcomes included total costs, effectiveness measured in quality-adjusted life years (QALYs), and the incremental cost-effectiveness ratio (ICER). Drug wastage was incorporated in the base-case analysis, and scenario analyses evaluated mitigation strategies through vial sharing and dose rounding. Model uncertainty was assessed using one-way sensitivity analysis and probabilistic sensitivity analysis. Over a 10-year time horizon, the lurbinectedin plus atezolizumab group accrued 0.3 more QALYs than the atezolizumab group. Under conditions of drug wastage, the addition of lurbinectedin resulted in an incremental cost of $230,000. When drug wastage was either excluded or mitigated through strategies such as vial sharing and dose rounding, the incremental cost was reduced to $180,000. The corresponding ICERs were $770,000/QALY and $610,000/QALY, respectively. To achieve a greater than 50% probability of cost-effectiveness, an 85% price reduction in both high-cost drugs would be required. Sensitivity analyses indicated that the results were robust to variations in model parameters. At current prices, lurbinectedin plus atezolizumab is not cost-effective as ES-SCLC maintenance therapy. Mitigating drug wastage and implementing substantial price reductions are essential to improve its economic value.
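The ICERs reported above follow directly from the incremental cost and QALY point estimates (ICER = ΔC/ΔE); a minimal check, with the small discrepancies attributable to rounding of the figures quoted in the abstract:

```python
# ICER = incremental cost / incremental effectiveness (QALYs).
def icer(delta_cost: float, delta_qalys: float) -> float:
    return delta_cost / delta_qalys

delta_qalys = 0.3  # QALY gain of lurbinectedin + atezolizumab vs atezolizumab alone

print(icer(230_000, delta_qalys))  # with drug wastage: ~$767,000/QALY (reported ~$770,000)
print(icer(180_000, delta_qalys))  # wastage mitigated: $600,000/QALY (reported ~$610,000)
```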
Background: Past workplace exposure to asbestos in combination with tobacco smoking has increased the risk of lung cancer for some residents in an area within the Friuli Venezia Giulia region, Northeast Italy. In light of studies showing that lung cancer screening (LCS) with low-dose computed tomography (LDCT) can reduce mortality, local stakeholders and decision-makers decided to assess the potential benefits, harms and cost-effectiveness of a single round of LCS with LDCT versus standard care among people aged 55 to 80 who were formerly exposed to asbestos and with at least 10 pack-years of smoking. Methods: An economic model was developed using a decision tree connected to a Markov cohort model. The primary outcome was the incremental cost per additional quality-adjusted life year (QALY). Other outcomes included the number of life years saved, the number of deaths averted and overdiagnosis. Results: Per 10,000 people screened, the intervention led to 395 additional QALYs (95% credible interval: 129 to 831) and incremental total costs of EUR 1,086,345 (95% credible interval: -852,607 to 2,155,826). The incremental cost per QALY gained was EUR 2750. There was a probability of cost-effectiveness of 99.5% relative to a threshold of EUR 25,000. Conclusions: The model estimated that the intervention was cost-effective. The model's simplifications and limitations should be considered when interpreting the findings in relation to policy-making decisions. Further research could include the costs and benefits of incidental findings and could assess the cost-effectiveness of repeated rounds of screening for the same population.
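The incremental cost per QALY reported above can be reproduced from the point estimates using the same identity, ICER = ΔC/ΔQALY:

```python
# Point estimates from the abstract, per 10,000 people screened.
delta_cost = 1_086_345   # incremental total costs, EUR
delta_qalys = 395        # additional QALYs

icer = delta_cost / delta_qalys
print(round(icer))  # → 2750 EUR per QALY gained, matching the reported value
```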
Diabetic retinopathy (DR) is a leading cause of preventable blindness among people with type 2 diabetes mellitus (T2DM). Early detection is essential to prevent vision-threatening complications. This study evaluated the long-term cost-effectiveness of CODMAP (Cribado Oftalmológico en Diabetes Mellitus en Atención Primaria; Ophthalmologic Screening for Diabetes Mellitus in Primary Care), an optimised screening programme combining two-field non-mydriatic fundus photography (NMFP) and optical coherence tomography, versus conventional single-field NMFP in a public healthcare setting. A Markov model simulated DR progression over 50 years in a cohort of 7729 patients with T2DM. Eight health states reflected DR severity, with state-specific costs and quality-adjusted life years (QALYs) accrued annually. Analyses took the perspective of the Spanish National Health Service, applying a 3% annual discount rate. Incremental cost-effectiveness ratios, calculated as a ratio of means (ICER-ROM), and net monetary benefits (NMBs) were estimated through 10 000 Monte Carlo simulations at a €30 000 willingness-to-pay threshold. Deterministic and probabilistic sensitivity analyses, scenario analyses and bootstrap validation were performed. CODMAP accrued higher costs (€206.4 million vs €205.1 million) but more QALYs (128 691.7 vs 118 013.8), resulting in an ICER-ROM of €124.25/QALY. The probability of cost-effectiveness at €30 000/QALY was 74.3%, remaining stable in scenario analyses (75.7%-78.8%). Mean incremental NMB in the base case was €319.0 million. Cost and QALY variations were the main drivers of uncertainty. CODMAP offers greater long-term health gains at a favourable incremental cost per QALY in the Spanish context. However, the non-definitive probability of cost-effectiveness warrants cautious policy consideration, with attention to local infrastructure, implementation capacity and cost structures.
Background and Objectives: Ascites is associated with substantial symptom burden and increased healthcare utilization, and it is observed in patients with advanced disease across multiple etiologies. However, because ascites is a clinical sign rather than a diagnosis, it can be challenging to study using routinely collected health data. In hospital administrative data, ascites is commonly captured using International Classification of Diseases, Tenth Revision (ICD-10) code R18, an etiologically non-specific classification whose outcome implications are less well documented. We aimed to evaluate the incremental association of R18-coded ascites with length of stay (LOS), readmission burden, and in-hospital mortality in a Gastroenterology and Internal Medicine inpatient department, beyond comorbidity burden and other coded decompensation proxies. Materials and Methods: We conducted a single-center retrospective study using routinely collected administrative discharge data from adult inpatient admissions (2015-2023) in the Gastroenterology and Internal Medicine department of a Romanian tertiary-care hospital. Admissions were classified by the presence of ICD-10 R18-coded ascites. Outcomes were LOS, readmission burden (count of subsequent admissions), and in-hospital mortality. Multivariable models adjusted for age, sex, and comorbidity burden (Charlson Comorbidity Index [CCI]), with additional models incorporating ICD-10-derived decompensation proxies to assess overlap in administrative severity signal. LOS was further examined within Charlson strata to evaluate incremental stratification. Results: Coded ascites was associated with higher hospital burden, including longer LOS and greater readmission burden, and with higher in-hospital mortality in partially adjusted models. Within each CCI stratum, LOS remained higher among admissions with R18-coded ascites, supporting incremental stratification beyond comorbidity alone.
Furthermore, mobility impairment was an important predictor of LOS. Age-stratified analyses suggested a high-burden phenotype among younger patients and infrequent R18 coding among the very elderly in this cohort. Conclusions: These findings support the potential utility of R18-coded ascites as a pragmatic administrative marker for risk adjustment and service planning.
The randomized phase 3 RACE trial demonstrated that eltrombopag combined with traditional immunosuppressive therapy (IST) is an efficacious treatment for severe aplastic anemia. However, eltrombopag can pose a significant cost burden. Using clinical data from the RACE trial, we performed the first cost-effectiveness analysis of frontline eltrombopag plus IST in adults with newly diagnosed severe aplastic anemia across all accepted willingness-to-pay (WTP) thresholds from the United States modified societal perspective. The primary outcome was the incremental net monetary benefit across WTP thresholds of $50,000-$150,000/quality-adjusted life year (QALY), with a base-case WTP threshold of $120,000/QALY. Frontline eltrombopag plus IST versus frontline IST alone accrued discounted costs of $593,000 and $561,000, and discounted QALYs of 11.53 and 10.73, respectively, with a per-patient incremental net monetary benefit of $64,000 [95% credible interval -$91,000 to $262,000] favoring frontline eltrombopag plus IST. Deterministic sensitivity analyses demonstrated that the 6-month overall response risk ratio (with versus without frontline eltrombopag) had the greatest effect on model results. In probabilistic sensitivity analysis, frontline eltrombopag with IST was favored in 78% of 10,000 second-order Monte Carlo iterations, as well as 58% and 82% of 10,000 iterations at WTPs of $50,000 and $150,000/QALY, respectively. At current pricing, frontline eltrombopag is the cost-effective therapeutic strategy in adults with severe aplastic anemia in the US. Longer-term clinical and patient utility data will likely help further clarify the cost-effectiveness of eltrombopag in this setting.
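The per-patient incremental net monetary benefit reported above follows from the standard identity iNMB = WTP × ΔQALY − ΔCost evaluated at the base-case threshold:

```python
# Point estimates from the abstract (US modified societal perspective).
wtp = 120_000                    # base-case willingness-to-pay, $/QALY
delta_qalys = 11.53 - 10.73      # discounted QALYs: eltrombopag + IST vs IST alone
delta_cost = 593_000 - 561_000   # discounted costs, $

inmb = wtp * delta_qalys - delta_cost
print(round(inmb))  # → 64000, matching the reported $64,000 favoring eltrombopag + IST
```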
Dementia diagnosis in sub-Saharan Africa is constrained by limited access to specialist neuroimaging interpretation and reduced specificity of brief cognitive tools in low-literacy populations. We evaluated the agreement, incremental value, and comparative performance of Mini Mental State Exam (MMSE), visual MRI medial temporal atrophy (MTA), and automated brain morphometry in older Ugandan adults with suspected dementia. In this cross-sectional study, adults aged ≥50 years with suspected dementia were recruited from neurology and psychiatry clinics at two hospitals and from a community cohort. Participants underwent MMSE and standardized 1.5T brain MRI. Visual MRI ratings were performed by radiologists blinded to clinical data, and automated morphometry was generated using NeuroQuant® normative percentiles. Hippocampal occupancy (HOC <5th percentile) was used as a reference MRI biomarker for comparative classification. Agreement between visual and automated measures was assessed using Spearman correlation and intraclass correlation. Incremental value was assessed using regression models, and comparative performance using area under the curve (AUC). Sixty-three participants were included (mean age 75.6 ± 8.7 years; 49 female). Agreement between visual ratings and automated morphometry was poor. MMSE correlated inversely with MTA (ρ = -0.47; p = 0.049) and correlated positively with hippocampal volume percentile (ρ = 0.46; p = 0.056). Adding hippocampal volume to MTA did not improve model fit for MMSE (ΔR2 = 0.028; p = 0.18). For comparative classification, MMSE alone was sensitive but poorly specific, while the combined MMSE-MTA model improved specificity and discrimination (AUC 0.70 vs 0.62 for either measure alone). Visual and automated MRI measures were not interchangeable in this heterogeneous cohort. 
Automated hippocampal volumetry added limited value beyond visual MTA for global cognition, while combining MMSE with visual MTA showed modest improvement in comparative classification and warrants further validation.
The prevalence of metabolic dysfunction-associated steatotic liver disease (MASLD) continues to rise, underscoring the need for tools to stratify individual risk of disease progression. We evaluated whether logistic regression models augmented by deep learning-based predictions (DLPs) can improve the B-mode ultrasound-based identification of at-risk MASLD, defined as patients with increased fibrosis risk. We retrospectively analyzed 205 patients with a total of 636 ultrasound images. Patients were classified by same-day liver stiffness measurement (LSM) (<8 kPa: low fibrosis risk; ≥8 kPa: increased fibrosis risk), and we developed a model that reproduces this LSM-based dichotomous classification from clinical parameters and ultrasound image-derived deep learning pipelines. We assessed the incremental value of DLPs when added to sex, age, BMI, type 2 diabetes mellitus status, and the fibrosis-4 score (FIB-4), based on accuracy, AUROC, and related statistics. The logistic regression model combining the clinical parameters and the DLPs achieved acceptable performance, with an AUROC of 0.73 and a test accuracy of 68%. The same model without DLPs showed an AUROC of 0.72 and a test accuracy of 61%. Including FIB-4 improved performance further (AUROC 0.92, accuracy 88%). Models based solely on image data demonstrated limited diagnostic performance. B-mode ultrasound thus provides only a weak fibrosis-related signal; meaningful discrimination required the incorporation of clinical parameters, with FIB-4 offering the greatest improvement among the parameters assessed. Deep learning predictions added only modest incremental value. Prospective validation is needed to clarify clinical utility.
Intracranial atherosclerotic stenosis (ICAS) is a major cause of stroke and cognitive impairment. We evaluated whether nighttime systolic blood pressure (SBP) provides additional information beyond daytime SBP regarding cognition, ICAS burden, and plasma biomarkers in patients with ICAS. In this multicenter cross-sectional study, patients with ICAS underwent 24-hour ambulatory BP monitoring, brain magnetic resonance imaging, plasma biomarker assays, and neuropsychological testing. Multivariable models including both daytime and nighttime SBP assessed associations with ICAS burden, cerebral small vessel disease burden, plasma biomarkers, and cognition. Incremental analyses evaluated additional information from nighttime SBP beyond daytime SBP. Structural equation modeling examined patterns of associations among SBP, ICAS burden, NfL (neurofilament light chain), and cognition. Among 301 patients, higher nighttime SBP was associated with poorer cognition (per 10 mm Hg, β=-0.139 [95% CI, -0.263 to -0.016]; P=0.028), higher plasma NfL (β=0.162 [95% CI, 0.049-0.275]; P=0.005), and greater ICAS burden (odds ratio [OR], 1.365 [95% CI, 1.090-1.716]; P=0.007) independent of daytime SBP. Among biomarkers, only NfL was consistently associated with nighttime SBP. Incremental analyses supported additional information from nighttime SBP beyond daytime SBP for cognition, ICAS burden, and NfL. In structural equation modeling, the association between nighttime SBP and cognition was more closely aligned with NfL than with ICAS burden. In ICAS, nighttime SBP provides additional information beyond daytime SBP for cognition, vascular burden, and neuroaxonal injury. Nighttime SBP and plasma NfL may represent complementary indicators of vascular stress and cognitive vulnerability.
Vertebral end plates (EPs) are essential for disc homeostasis due to their dual role as structural load distributor and nutritional interface. Focal EP defects are frequently observed clinically, but their true biomechanical significance in vivo is unknown. Finite element (FE) analysis offers a unique opportunity to simulate specific anatomical changes and quantify the consequences of mechanical disruption. We employed a validated FE model of the L4-L5 functional unit and investigated the effects of physiological compression and pure moments on models simulating: (1) healthy discs with intact EPs, (2) four EP defect models, and (3) graded anatomical EP damage based on the Integrated Total Endplate Score (I-TEPS). Stress distribution on the cartilage end plate (CEP), bony end plate (BEP), annulus fibrosus (AF), nucleus pulposus (NP), and subchondral bone was documented. In the intact healthy model, stress was uniformly distributed without focal concentrations. In contrast, all EP defect models demonstrated significant, non-linear stress elevation across the motion segment. Incremental analysis based on I-TEPS revealed a 'mechanical tipping point' phenomenon: stresses remained relatively stable up to a score of three but increased sharply once the score reached four, coinciding with the onset of BEP involvement. Focal EP defects, irrespective of their location, extend stress concentrations beyond the defect zone, causing cascading mechanical disruption of the disc environment. Combined CEP and BEP defects (I-TEPS ≥ 4) represent synergistic mechanical failure, characterized by abnormal NP pressure, annular tension, and subchondral bone stress. These findings demonstrate that EP integrity is a primary determinant of segmental spine stability.
In patients with severe aortic stenosis (AS) undergoing transcatheter aortic valve replacement (TAVR), myocardial dysfunction may extend beyond the left ventricle and remain underrecognized by conventional severity classification paradigms. Strain echocardiography allows early detection of subclinical dysfunction across cardiac chambers, potentially enhancing prognostic stratification. We retrospectively analysed 234 patients with severe AS undergoing TAVR who had echocardiograms suitable for strain analysis. Left ventricular (LV), left atrial (LA), and right ventricular (RV) strain values were quantified pre- and post-TAVR. Chamber dysfunction was defined using consensus strain thresholds, and multichamber impairment (MCI) was defined as dysfunction in two or more chambers. A modified damage staging system incorporating strain data was compared to an established model. The association between chamber impairment and all-cause mortality was explored. Strain-defined chamber impairment was common, with 29% of patients exhibiting MCI. LA dysfunction was the most frequent isolated abnormality, accounting for 88% of single-chamber impairment. Modest improvement was observed post-TAVR, predominantly in LV global longitudinal strain (+1.8 ± 3.5%, p < 0.01). MCI was associated with higher rates of atrial fibrillation, chronic kidney disease, and mitral/tricuspid regurgitation. At 12-month follow-up, patients with three-chamber impairment had significantly increased mortality risk (HR 6.84, 95% CI 1.77-26.4, p = 0.005). The modified staging system reclassified 27% of patients into higher risk categories but did not significantly improve predictive accuracy over the established model. Multichamber dysfunction, particularly involving the LA, is prevalent in patients undergoing TAVR and confers a higher early mortality risk. While the addition of strain data improves damage detection, its incremental prognostic value over conventional models appears modest in this cohort.
Larger studies are required to explore this further. Comprehensive strain imaging may nonetheless identify patients requiring closer surveillance and targeted post-TAVR therapy.
Background: Predicting genomic alterations from routine hematoxylin and eosin (H&E) whole-slide images (WSIs) may help triage molecular testing. Methods: We retrospectively enrolled 437 patients at Osaka Metropolitan University Hospital across 26 cancers, matched with clinical gene-panel data. We curated 1023 binary endpoints across SNV, CNV, and SV categories. We extracted slide embeddings from five pathology foundation models (Prism, GigaPath, Feather, Chief, and Titan) using a unified feature extraction pipeline and benchmarked them using a lightweight downstream multi-layer perceptron (MLP) classifier. Using patch-level features from the best-performing model, we trained a multiple-instance learning model to assess incremental benefit. Results: Titan achieved the highest and most stable transfer performance, with a median endpoint-wise area under the receiver operating characteristic curve (AUROC) of 0.77 in the slide-level benchmark; at the patch level, prediction of APC_SNV reached an AUROC of 0.916, and prediction of KRAS_SNV reached an AUROC of 0.811 on the held-out test set. Conclusions: In a heterogeneous clinical gene-panel setting, pathology foundation models can provide strong baseline genomic-prediction signals without additional fine-tuning. We propose a practical, deployment-oriented two-stage workflow: rapid slide-embedding screening to prioritize robust representations and candidate endpoints, followed by patch-level training for high-value tasks where additional performance gains and interpretable regions are clinically worthwhile.
To explore the potential of PerioAI, an artificial intelligence system integrating intraoral scanning and cone-beam CT, to automatically measure gingival margin-to-bone distance (GBD) and convert it into AI-derived probing depth (AI-PD), and to evaluate whether AI-PD may provide additional information to support periodontal clinical decision making when radiographic imaging is the primary source of available periodontal information. This cross-sectional proof-of-principle study included 53 patients with periodontitis (1298 teeth, 7788 sites). GBD measurements were converted to AI-PD using validated formulas. Clinical decision making (prognosis and treatment planning) was evaluated by one periodontist under three information conditions: (i) orthopantomogram (OPG) + original periodontal chart (used to establish the reference clinical decision); (ii) OPG only; and (iii) OPG + AI-PD. Against the reference clinical decision, agreement rates for decisions made under the two remaining conditions (OPG only and OPG + AI-PD) were calculated and compared, and the risk of overtreatment was also assessed. Compared with OPG only, the OPG + AI-PD condition showed a higher agreement rate with the reference clinical decision, indicating that PerioAI may provide additional information for clinical decision making. Patient-level average agreement rates increased from 77.6% to 84.7% for prognosis (p < 0.05) and from 78.2% to 84.3% for treatment planning (p < 0.05). Tooth-level agreement rates improved from 78.1% to 86.0% for prognosis (p < 0.05) and from 78.8% to 85.4% for treatment planning (p < 0.05). The addition of AI-PD was associated with a 42.3% reduction in overtreatment risk (Steps 1-2 vs. Step 3) and a 98.5% reduction in the risk of tooth extraction (Step 3 vs. extraction). When combined with radiographic information, PerioAI shows potential to provide incremental information for clinical decision making.
Future research should integrate additional periodontal parameters and validate the approach in larger and more diverse populations.