We evaluated the feasibility and long-term outcomes of minimally invasive surgery (MIS) after neoadjuvant immunotherapy in patients with resectable non-small cell lung cancer (NSCLC). Patients with stage IB-IIIB NSCLC who underwent induction immunotherapy followed by surgery (2015-2022) were included. Overall and event-free survival were estimated using the Kaplan-Meier approach. Cox regression was used to quantify associations between clinical and pathologic variables and survival outcomes. In total, 69 patients met the inclusion criteria (MIS, n = 38; open surgery, n = 31). Patients who underwent open surgery had larger tumors (P < .001), more-frequent extended resection (P = .007), more major complications (P = .03), and longer in-hospital stay (P = .002), compared with patients who underwent MIS. The rate of conversion to open surgery in the MIS group was 16% (6/38). Histologic subtype (P = .057), completeness of resection (P > .9), response to treatment (P > .9), and stage (P = .6) were not statistically different between groups. Five-year event-free and overall survival in the MIS patients were 72% and 80%, respectively. Among all patients, pathologic N2 disease (hazard ratio [HR], 3.04; 95% CI, 1.1-8.38; P = .031) and upstaging (HR, 2.57; 95% CI, 1.05-6.34; P = .04) were associated with a greater risk of recurrence. Major pathologic response was associated with longer event-free survival (HR, 0.22; 95% CI, 0.05-0.93; P = .039). Programmed death-ligand 1 level (HR, 1.02; 95% CI, 1.01-1.04; P = .007) and tumor dimensional reduction after treatment (HR, 0.96; 95% CI, 0.93-0.99; P = .016) were associated with major pathologic response. MIS after induction immunotherapy is feasible and does not compromise long-term oncologic outcomes.
Transcatheter aortic valve replacement (TAVR) is an established alternative to surgical aortic valve replacement (SAVR) in intermediate- and low-risk patients with severe aortic stenosis. However, most randomized trials have enrolled patients older than 70 years with limited follow-up, and evidence on long-term outcomes in younger populations remains insufficient. Our objective was to compare long-term clinical outcomes between propensity-matched TAVR and SAVR patients aged <70 years. A total of 959 patients (SAVR: 808, TAVR: 151) were included, resulting in 132 propensity-matched patients per group. The primary outcome was a composite of all-cause death, stroke, and procedure- or valve-related hospitalization at 30 days, 2 years, 5 years, and up to 10 years. A flexible parametric survival model was used to evaluate the hazard ratio (HR) for reintervention and its time dynamics. Baseline characteristics were balanced between matched groups (mean age 64.9 years; Society of Thoracic Surgeons Predicted Risk of Mortality score 2.9%). At 10 years, primary outcome rates were similar (SAVR: 50.8% vs TAVR: 42.9%; HR, 1.01; 95% CI, 0.66-1.53; P = .95). Patients who underwent SAVR had greater rates of acute kidney injury, bleeding, and new-onset atrial fibrillation during hospitalization (P < .01), whereas permanent pacemaker rates were similar (P = .140). Reintervention was more frequent after TAVR (10.6% vs 3.7%; P = .03), and the flexible parametric survival model showed a greater risk of reintervention after TAVR (HR, 3.63; 95% CI, 0.73-1.78; P = .01). At long-term follow-up, all-cause mortality, stroke, and rehospitalization rates were similar between TAVR and SAVR in patients younger than 70 years with aortic stenosis. However, patients who underwent TAVR had greater rates and risk of reintervention, which did not affect mortality.
The optimal treatment strategy for residual Type A aortic arch dissections after ascending aortic replacement remains debated. This study evaluated the frequency and outcomes of aortic reinterventions after reoperative total arch replacement with frozen elephant trunk (FET) implantation for residual Type A aortic dissection. Between April 2015 and March 2025, 116 patients underwent elective redo arch replacement using the FET technique for residual Type A dissection at 6 European aortic centers. Incidence, procedural characteristics, and outcomes of patients requiring downstream aortic reinterventions were retrospectively analyzed. In-hospital mortality following FET implantation was 5% (n = 6). Among 110 hospital survivors, 43 patients (39%) required at least 1 aortic reintervention within 48 months: thoracic or abdominal endovascular aortic repair in 36 (80%), graft replacement in 2 (4.4%), and open thoracoabdominal replacement in 4 (9%). No spinal cord injury or disabling stroke occurred postoperatively; 3 patients (6%) died within 30 days of reintervention. After discharge, a second reintervention was necessary in 7 patients (16%), involving endovascular repair in 4 (57%) and open thoracoabdominal replacement in 3 (43%). The cumulative risk of any reintervention was 31.0% (95% CI, 23.1%-40.6%) at 2 years and 55.4% (95% CI, 42.0%-66.9%) at 4 years, whereas overall survival after FET was 87.8% (95% CI, 81.1%-95.1%) and 82.0% (95% CI, 72.1%-93.3%) at 2 and 4 years, respectively. Reoperative arch replacement with FET yields favorable survival but is accompanied by a substantial need for staged downstream aortic reinterventions, most of which can be managed endovascularly, underscoring the importance of structured long-term imaging surveillance.
Shared decision-making of prosthesis selection for aortic valve replacement (AVR) weighs patient-specific valve durability against oral anticoagulation requirements. Given evolving strategies of lifetime management, we evaluated contemporary longitudinal outcomes of patients aged 65 years or older undergoing bioprosthetic versus mechanical AVR. Patients aged 65 to 85 years who underwent isolated surgical AVR (2018-2022) were identified in the United States Centers for Medicare and Medicaid Services database and stratified by valve type as bioprosthetic (bAVR) or mechanical (mAVR). Doubly robust risk adjustment of variables including frailty was performed using inverse probability weighting of propensity scores, multivariable logistic regression, and time-to-event analysis with competing risks. The primary outcome was the composite of all-cause mortality, valve reintervention, stroke, and bleeding. The study cohort included a total of 69,423 patients (62,925 bAVR and 6498 mAVR). After comprehensive risk adjustment was performed, bAVR versus mAVR was associated with superior freedom from the primary composite outcome over the 5-year study period (hazard ratio [HR], 0.82; P < .001), as well as reduction in longitudinal mortality (HR, 0.78; P < .001), all-cause readmissions (HR, 0.89; P < .001), and readmissions for bleeding (HR, 0.47; P < .001) and heart failure (HR, 0.86; P < .001). No difference in aortic valve reintervention was observed between groups (HR, 0.93; P = .65). Similar trends favoring bAVR were observed in subanalyses of patients aged 65 to 69 years. For patients with preoperative end-stage renal disease, there was no difference in the longitudinal primary outcome between bAVR and mAVR. In Medicare beneficiaries, bAVR was associated with superior risk-adjusted survival, fewer bleeding complications, and fewer readmissions with no difference in valve reintervention compared with mAVR.
To investigate the prognosis of peripheral early-stage lung adenocarcinoma treated by lobectomy or segmentectomy. This retrospective multicenter cohort included patients with cT1a-bN0M0 lung adenocarcinoma who underwent lobectomy or segmentectomy with systematic lymph node dissection in 10 European centers (one per country) from 2015 to 2021. Overall survival (OS), disease-free survival (DFS), and lung cancer-specific death (LCSD) were compared between the 2 groups in the entire dataset and in the subset with histologically aggressive patterns, before and after propensity score matching (PSM). Prognostic risk factors were analyzed using parsimonious Cox regression models. Recurrences were assessed by linearized risks. Lobectomy and segmentectomy were performed in 1029 (73.1%) and 377 (26.8%) patients, respectively. In total, 427 patients (30.3%) had at least 1 histologically aggressive (micropapillary or solid) pattern, of whom 88 (20.7%) underwent segmentectomy. OS, DFS, and LCSD rates were similar between patients who underwent lobectomy or segmentectomy in both datasets, before and after PSM. In the aggressive-pattern dataset after PSM, 5-year OS rates were 88.0% (95% CI, 80.9%-95.7%) for lobectomy and 89.1% (95% CI, 82.2%-96.6%) for segmentectomy (P = .8); 5-year DFS rates were 79.8% (95% CI, 70.8%-89.8%) and 80.6% (95% CI, 71.6%-90.6%), respectively (P = .6); and 5-year LCSD rates were 6.0% and 7.8%, respectively (P = .8). In the unmatched cohorts, locoregional recurrence was not higher after segmentectomy than after lobectomy in either the entire dataset (linearized risks: lobectomy 0.078, segmentectomy 0.073) or the aggressive-pattern dataset (linearized risks: lobectomy 0.036, segmentectomy 0.011). Aggressive histologic patterns affected only LCSD, and only when they were dominant. Segmentectomy seems comparable to lobectomy for patients with peripheral cT1a-bN0M0 lung adenocarcinoma, even in the presence of histologically aggressive patterns.
The extent of arch replacement for aortic dissection remains controversial. The natural history of a residually dissected arch remains unknown, and the morbidity of arch reintervention in these settings is understudied. Therefore, the aim of this study was to investigate outcomes of redo total arch replacement in patients who had previous proximal aortic repair for acute type A dissection. In total, 79 consecutive patients underwent redo total arch replacement with frozen elephant trunk after previous type A dissection repair. Surgical outcomes were obtained from the institutional database, and the electronic medical record was reviewed for follow-up outcomes. The median time between type A repair and redo total arch replacement with frozen elephant trunk was 4.25 [2.05, 7.78] years. The most common comorbidity was hypertension, in 76 of 79 patients (96.2%), and 14 of 79 (17.7%) had previous cerebrovascular disease. The median crossclamp and circulatory arrest times (minutes) were 78 [61, 108] and 18 [14, 25], respectively. The operative mortality rate was 3 of 79 (3.8%), the postoperative stroke rate was 7 of 79 (8.9%), and the paraplegia rate was 4 of 79 (5.1%). The 1-year survival was 93.5% (95% CI, 88.1%-99.2%), and the 3-year cumulative incidence of aortic reintervention was 53.1% (95% CI, 38.3%-65.8%). In total, 35 patients required distal aortic reintervention, including 19 distal thoracic endovascular aortic repair extensions and 5 open thoracoabdominal aneurysm repairs. Redo total arch replacement after previous proximal aortic repair for acute type A dissection can be performed safely at an experienced aortic referral center.
The prognostic significance of atrial fibrillation (AF) in patients with degenerative mitral regurgitation (MR) undergoing mitral valve (MV) repair remains controversial. The association between preexisting AF and long-term clinical and echocardiographic outcomes following mitral repair was analyzed. This retrospective study included 959 patients (mean age, 60 ± 12 years) who underwent MV repair for degenerative MR between 2001 and 2022. Patients were stratified by preoperative rhythm: 670 in sinus rhythm and 289 with AF. Serial echocardiographic changes were assessed using linear mixed models. The mean follow-up was 8.6 ± 5.4 years (8219 patient-years). Compared with patients in sinus rhythm, those with AF were older, more symptomatic, and exhibited greater cardiac remodeling, prompting higher rates of concomitant tricuspid annuloplasty (45% vs 5.7%) and ablation procedures (91% vs 0%) (P < .05 for all). At 10 years, AF patients had lower overall survival (89% vs 96%; P = .003) and lower freedom from composite adverse events (mitral reintervention, heart failure, stroke, catheter ablation, or pacemaker implantation) (56% vs 80%; P < .001). Propensity score matching (n = 239 pairs) for 15 clinical covariates confirmed AF as an independent predictor of adverse events (adjusted hazard ratio, 1.8; 95% confidence interval, 1.3-2.5; P < .001). Additionally, AF patients demonstrated less improvement in left atrial size, tricuspid regurgitation (TR) pressure gradient, and TR severity up to 10 years (group effect P < .001). These findings indicate that AF identifies a higher-risk phenotype characterized by advanced atrial disease and persistent hemodynamic burden following MV repair. Future studies are needed to determine whether intervention before the development of advanced atrial remodeling or AF can improve long-term outcomes.
Staged single-lung transplantation (SSLT) has been proposed as a strategy for high-risk patients amid donor organ shortages. We aimed to compare modern SSLT versus bilateral lung transplantation (BLT) outcomes and identify factors associated with SSLT long-term survival. We retrospectively analyzed the United Network for Organ Sharing registry for all adult lung transplants from 2005 to 2021. Propensity-matched outcomes of SSLT (2 sequential, contralateral single-lung transplants) were compared with BLT. Kaplan-Meier methods were used to assess survival, and multivariable analysis identified independent predictors of mortality after SSLT. Among 188 recipients of SSLT and 2948 recipients of BLT, recipients of SSLT were older and more likely to have interstitial lung disease. Despite greater 1-year survival among recipients of SSLT (92.0% vs 87.5%), median overall survival was significantly shorter compared with the BLT cohort (5.8 vs 7.1 years), and both 5-year (54.8% vs 63.8%) and 10-year (27.4% vs 43.6%) survival estimates were lower (P < .001). Thirty-day mortality after the second transplant was also greater in SSLT (4.7% vs 2.7%, P = .04). On multivariable analysis, SSLT was associated with greater 3-year mortality (hazard ratio, 1.72; 95% CI, 1.03-2.84), although on subgroup analysis, recipients with a diagnosis of interstitial lung disease, age ≥65 years, or body mass index ≥30 at the time of listing, as well as those with an interval of ≥5 years between the staged transplants, showed similar outcomes. In the modern era, BLT is associated with superior long-term survival compared with SSLT. SSLT can achieve acceptable early outcomes in select patients or donor-scarce situations, but BLT should remain the preferred approach when feasible.
Esophageal squamous cell carcinoma (ESCC) poses significant therapeutic challenges, with poor prognosis despite standard treatments including neoadjuvant chemoradiotherapy or chemotherapy. Emerging evidence suggests that adding immune checkpoint blockade to conventional chemotherapy, referred to as neoadjuvant immunochemotherapy (nICT), enhances pathologic tumor regression; however, real-world data regarding long-term survival outcomes remain limited. This single-center retrospective study analyzed data from 204 patients with locally advanced ESCC treated between January 2019 and September 2022. Patients received either neoadjuvant chemotherapy (nCT; n = 95) or nICT (n = 109). To address baseline imbalances, stabilized inverse probability of treatment weighting (sIPTW) based on propensity scores was applied. The primary endpoints were overall survival (OS) and disease-free survival (DFS). The median duration of follow-up was 36.7 months (95% confidence interval [CI], 33.8-39.6 months). After applying sIPTW, compared with the nCT group, the nICT group exhibited significantly improved OS (hazard ratio [HR], 0.49; 95% CI, 0.26-0.91; P = .03) and DFS (HR, 0.58; 95% CI, 0.35-0.95; P = .04). The 3-year OS and DFS rates were 84.3% and 69.8%, respectively, in the nICT group and 68.8% and 58.1%, respectively, in the nCT group. In the nICT group, pathologic complete response (pCR) and major pathologic response (MPR) were associated with reduced risk of recurrence and death. nICT was associated with improved OS and DFS in patients with locally advanced ESCC. Higher rates of pCR and MPR were correlated with better survival outcomes. These findings support the integration of immunotherapy into neoadjuvant treatment strategies, although prospective trials remain necessary for validation.
The cross-sectional area (CSA) of the iliopsoas muscle on computed tomography is associated with sarcopenia, but its perioperative changes and prognostic significance in lung cancer remain unclear. This study aimed to clarify postoperative changes in the iliopsoas CSA and their prognostic significance in patients undergoing surgery for lung cancer. We analyzed 270 patients with lung cancer aged ≥70 years who underwent lobectomy between January 2016 and December 2020. The iliopsoas CSA at the L3 vertebral level was measured preoperatively and at 6 and 12 months postoperatively and analyzed with ImageJ software. Patients were grouped based on whether their CSA decreased or increased between 6 and 12 months postoperatively. Recurrence-free survival (RFS) and overall survival (OS) were compared between groups. Compared with the preoperative value, the iliopsoas CSA was decreased significantly at 6 months (23.2 mm2; P < .01) and 12 months (15.5 mm2; P = .03). Patients in the decreasing-CSA group were older (P = .02), more often male (P = .02), and had poorer lung function (P = .04). Decreased CSA was linked to a trend toward poorer 5-year RFS (P = .07) and to significantly worse OS (P = .04). Multivariate analysis revealed that a decrease in iliopsoas CSA between 6 and 12 months postoperatively was an independent poor prognostic factor for RFS (P < .01) and OS (P = .05). In elderly patients with primary lung cancer, a decreased iliopsoas CSA from 6 to 12 months postoperatively was an independent poor prognostic factor for OS.
In the United States, neoadjuvant chemoradiotherapy with CROSS (41.4 Gy, carboplatin/paclitaxel) has been the standard for patients with operable locally advanced esophageal adenocarcinoma (EAC). Recently, the ESOPEC multicenter phase III trial reported superior overall survival (OS) with perioperative FLOT (5-FU/leucovorin/oxaliplatin/docetaxel) compared with CROSS. We aimed to examine trends in neoadjuvant treatment strategies and compare outcomes in a contemporary real-world cohort. Patients with clinical stage II-IVA (T2-4aN0-3M0 by the American Joint Committee on Cancer 8th Edition staging classification) EAC who underwent neoadjuvant chemoradiation (nCRT; ≥41.4 Gy) or neoadjuvant chemotherapy (nCT) followed by esophagectomy were identified in the National Cancer Database (2015-2022). Postoperative outcomes and overall survival (Kaplan-Meier) were compared in propensity-matched cohorts. A total of 9845 patients were identified, of whom 8955 (91.0%) received nCRT and 890 (9.0%) received nCT before esophagectomy. The median patient age was 65 years (interquartile range [IQR], 58-71 years), and 88.1% (n = 8674) of the patients were male. Over time, there was a trend toward increasing use of nCT, from 6.7% of patients in 2015 to 9.6% in 2022 (P < .001). In well-balanced matched cohorts (1:1, nCT vs nCRT; n = 106), despite an improved pathologic response with nCRT, nCT was associated with improved OS (median, 51.4 months vs 40.5 months; P = .029; hazard ratio, 0.82; 95% confidence interval, 0.68-0.98) and a better 5-year OS rate (49.8% vs 43.3%) at a median follow-up of 28.8 months (IQR, 17.2-42.5 months). In a contemporary cohort of patients with locally advanced EAC, nCRT remained the predominant approach. However, consistent with ESOPEC, nCT alone was associated with improved OS.
To evaluate whether an institutional shift from anticoagulation with heparin to bivalirudin in Impella 5.5 patients altered bleeding or thrombotic rates, transfusion dependency, or device biocompatibility. All patients supported by an Impella 5.5 between August 2022 and December 2024 were reviewed retrospectively. Heparin was used before December 2023, at which point the switch was made to bivalirudin. Groups were defined by anticoagulant initiated by postimplantation day 2, and patients with crossover or anticoagulation pauses for >2 days were excluded from the laboratory analyses. Baseline characteristics, anticoagulation details, complications, and clinical outcomes were compared between the heparin and bivalirudin groups. Biocompatibility was assessed via laboratory values recorded over the first 14 days of Impella support or until device explantation in patients supported for <14 days. Among the 168 patients, 83 (49%) received heparin and 85 (51%) received bivalirudin. Baseline characteristics were similar in the 2 groups (P > .05), as were rates of thrombotic events (13% vs 14%; P = .871), total bleeding events (37% vs 30%; P = .275), and transfusion requirements (57% vs 51%; P = .433). However, bivalirudin patients demonstrated significantly faster rates of platelet recovery and lactate dehydrogenase reduction after Impella implantation (all P < .05), had fewer insertion site bleeds that required reoperation (10% vs 1%; P = .015), and were more often within their partial thromboplastin time goal (on 24% vs 41% of device-supported days; P ≤ .001). Systemic anticoagulation with bivalirudin in Impella 5.5 patients is associated with clinically meaningful improvements in postoperative thrombocytopenia and biocompatibility, with easier management, fewer insertion site bleeds, and more time within the target anticoagulation range.
We have demonstrated cardioprotection by the adenosine triphosphate-sensitive potassium (KATP) channel opener diazoxide in multiple animal models in an effort to reduce myocardial stunning following cardiac surgery. These translational findings supported this first-in-human Phase I safety and feasibility trial evaluating diazoxide as an additive to hypothermic, hyperkalemic cardioplegia. In this Food and Drug Administration-approved safety and feasibility trial, 30 patients undergoing nonemergent cardiac surgery (coronary artery bypass, aortic, or valve) received intracoronary diazoxide in the first dose of hypothermic, hyperkalemic cardioplegia at the time of cross-clamp placement. Safety and clinical endpoints, including a novel definition of myocardial stunning (need for inotropic support for >24 and <72 hours), myocardial enzymes, change in ejection fraction, and hemodynamic parameters, were collected. Thirty patients received intracoronary diazoxide. The Society of Thoracic Surgeons Predicted Risk of Mortality ranged from 0.3% to 8.0%. An increase in mean arterial pressure was noted with diazoxide administration (mean, 59.8 mm Hg before vs 62.0 mm Hg after). The mean time to arrest was 80 ± 43 seconds. Two patients (6.7%) had myocardial stunning. Mean peak lactate was 5.6 ± 5.1, peak creatine kinase was 939 ± 588 U/L, and peak troponin was 12,531 ± 18,615 ng/L. An increase in left ventricular ejection fraction of 2.1 ± 4.9% (prebypass to postbypass by transesophageal echocardiography) was noted. Intracoronary diazoxide is safe and feasible in patients undergoing nonemergent cardiac surgery.
This study was conducted to compare improvements in myocardial perfusion between Y-composite and aortocoronary configurations during 1 year after coronary artery bypass grafting (CABG). Of 311 off-pump CABG patients using the saphenous vein as Y-composite (composite group) or aortocoronary (aorta group) graft, 204 patients who underwent myocardial single-photon emission computed tomography during the preoperative period, at 3 months after surgery, and at 1 year after surgery were enrolled. After 2:1 propensity score matching, 43 matched sets with 86 composite and 43 aortocoronary patients were compared. Based on a 17-segment model, the degree of perfusion impairment for each segment was evaluated semiquantitatively via a reversibility score, which was defined as rest minus stress perfusion values. The improvement in perfusion was evaluated based on the temporal changes in reversibility scores and was compared using a linear mixed model. Of 2193 segments, 795 segments (518 and 277 segments in composite and aorta groups, respectively) demonstrated a significant decrease in perfusion on preoperative myocardial single-photon emission computed tomography and were successfully revascularized by CABG. Median (quartile 1, quartile 3) preoperative reversibility scores in the ischemic segments were 7 (4, 13) and 7 (4, 12) in composite and aorta groups, respectively (P = .575). Median (quartile 1, quartile 3) reversibility scores at 3 months and 1 year postsurgery were 0 (0, 4) and 0 (0, 5) in the composite group and 0 (0, 4) and 0 (0, 5) in the aorta group, and there was no significant difference in the pattern of improvement in myocardial perfusion between the groups (P = .592). Myocardial perfusion improvement during the first year after CABG was not significantly different between Y-composite and aortocoronary configurations.
Randomized data support the mortality benefit of lung cancer screening (LCS); however, completion rates among those meeting eligibility criteria remain concerningly low. The Distressed Communities Index (DCI) is a composite socioeconomic metric assessed by ZIP code. We sought to investigate the association between DCI and completion of scheduled LCS appointments. A retrospective study was conducted in patients scheduled for LCS with low-dose computed tomography at a high-volume center in 2021. Patients were marked as a "no-show" when they missed or canceled their scheduled screening appointment. The association between DCI and screening no-show rates was modeled using a stepwise approach. A total of 1114 patients were scheduled for LCS, of whom 83.5% (n = 928) resided in ZIP codes classified as "at-risk" or "distressed" based on their DCI score. In the entire cohort, 18.4% of patients (n = 205) did not attend any screening appointments scheduled within the year. Patients in the "at-risk" and "distressed" categories were more likely to have a no-show (19.5% vs 12%; P = .015). A higher DCI score was associated with increased probability of screening noncompletion on univariable analysis (odds ratio [OR], 1.02; 95% confidence interval [CI], 1.01-1.03; P = .002). On multivariable analysis, additional factors associated with higher likelihood of a no-show included current smoking status (adjusted OR [aOR], 2.59; 95% CI, 1.70-3.96) and initial screening (aOR, 2.07; 95% CI, 1.43-3.00). Higher DCI scores are associated with increased no-show rates for LCS appointments. This tool can be used to target populations at risk of underscreening and associated lung cancer mortality with navigational support or novel interventions to increase screening.
Liver cirrhosis is a known risk factor for mortality after cardiac surgery. In this study, we investigated the correlation between existing prediction models and actual mortality rates after cardiac surgery. Between July 2014 and December 2022, 1257 patients with a history of chronic liver disease underwent cardiac surgery, of whom 220 had a diagnosis of cirrhosis. Preoperative comorbidities, intraoperative details, and postoperative outcomes were analyzed. The Mayo Clinic and VOCAL-Penn scores were applied to the 220 patients and compared to observed mortality. Among the 220 patients with cirrhosis, the preoperative median MELD score was 10. According to the Mayo Clinic score, the mean predicted mortality rate was 20.9% at 30 days and 29.6% at 90 days. In contrast, according to the VOCAL-Penn score, the mean predicted mortality rate was 9.0% at 30 days and 14.0% at 90 days, whereas the observed postoperative mortality of the 220 patients was 4.1% at 30 days and 12.6% at 90 days. VOCAL-Penn showed good calibration but still overestimated mortality by 3% at 30 days and by 1.3% at 90 days. The Mayo Clinic score demonstrated weak calibration, overestimating mortality by 15% at 30 days and by 17% at 90 days. In this single-center analysis, the Mayo Clinic and VOCAL-Penn scores overestimated the postoperative mortality in patients with cirrhosis undergoing cardiac surgery, although VOCAL-Penn provided a more accurate risk prediction. Ultimately, this heterogeneous group of patients often presents with a wide range of comorbidities, which complicate risk stratification; however, patients should not be denied surgery solely on the basis of current risk prediction models.
Lung cancer presenting as subsolid lesions has been described as less aggressive than solid lung cancers. Thus, lobectomy for subsolid lesions may be an overly extensive resection compared with sublobar techniques. The objective of the current study was to compare oncological differences between patients receiving lobectomy and sublobar resection for lung cancers presenting as subsolid lesions less than 3 cm. A retrospective review of prospectively maintained databases from the International Early Lung Cancer Action Program, Initiative for Early Lung Cancer Research on Treatment, and Weill Cornell Medicine was conducted to identify lung cancers presenting as subsolid lesions treated with surgical resection. Solid lung cancers and nodules greater than 3 cm were excluded. Computed tomography imaging was used to determine the size of nodule and percent solid component. The primary outcome of interest was lung cancer-specific survival. The secondary outcomes of interest included disease-free survival and overall survival. Median duration from surgery to outcome or last follow-up was 57.7 months (interquartile range, 30.1-111.5 months). A total of 624 patients were identified and divided into 2 groups based on extent of resection: Group A underwent lobectomy (320 patients), and group B underwent sublobar resection (304 patients). Of patients undergoing sublobar resection, 61% (185) underwent wedge resection and 39% (119) underwent segmentectomy. Nodules were further stratified by percent solid component (nonsolid, <25% solid, 25%-49% solid, >50% solid). Sixteen patients (2.56%) in the entire study population developed recurrence, and 2 patients (0.3%) had died of lung cancer. At the end of follow-up, lung cancer-specific survival was 100% in group A and 98.2% (95% CI, 95.2-100.0) in group B (log-rank, P = .0709). Disease-free survival was 67.4% (95% CI, 57.8-77.0) in group A and 71.1% (95% CI, 59.9-82.2) in group B (log-rank, P = .8497).
Overall survival was 68.2% in group A (95% CI, 58.5-77.9) and 72.3% in group B (95% CI, 60.9-83.8) (log-rank, P = .9124). Lung cancer-specific survival and disease-free survival were not significantly different when comparing lobectomy directly with wedge resection and segmentectomy, respectively, for the entire cohort. Patients with tumors greater than 2 cm treated with sublobar resection had worse disease-free survival compared with those treated with lobectomy. Percent solid component did not significantly impact lung cancer-specific survival or disease-free survival. In this large cohort of patients with lung cancers presenting as subsolid nodules less than 3 cm, lung cancer-specific survival was excellent and similar between lobectomy and sublobar resection. Disease-free survival was worse for patients with tumors greater than 2 cm treated with sublobar resection.
Thoracic Surgery Oncology Group 103 was a prospective multi-institutional trial that aimed to (1) evaluate the role of perioperative chemotherapy for low-risk patients with metastatic colorectal cancer undergoing pulmonary metastasectomy and (2) characterize the impact of surgery on outcomes for high-risk patients with metastatic colorectal cancer receiving chemotherapy. From July 2018 to September 2023, patients with histologically confirmed primary colorectal cancer and lung metastases amenable to complete margin-negative resection from 3 institutions were enrolled and stratified by low- and high-risk clinical features and randomized accordingly within 2 treatment paradigms. A total of 22 and 26 patients were enrolled in the low- and high-risk cohorts, respectively. Randomization within the low-risk cohort resulted in 8 individuals (40.0%) receiving perioperative chemotherapy and surgery (+Chemo) and 12 patients (60.0%) undergoing surgery alone (-Chemo). Median overall survival could not be calculated for either group, and median recurrence-free survival was 21.8 months for +Chemo and not yet reached for -Chemo (P = .33). Among high-risk patients receiving chemotherapy, 8 (36.4%) remained on chemotherapy only (-SX) and 14 (63.6%) underwent pulmonary metastasectomy (+SX). Partial response to initial chemotherapy was achieved in 4 (50.0%) -SX patients and 6 (42.9%, P = 1.00) +SX patients. No deaths occurred in either group, and median recurrence-free survival was 33.4 months in the -SX group and 55.8 months in the +SX group (P = .95). Patient accrual targets were not reached, leaving this study underpowered; as such, all analyses are descriptive and hypothesis generating. We were unable to determine differences in survival in the high-risk cohort; however, our findings suggest that adequately selected low-risk individuals can be treated with up-front pulmonary metastasectomy without additional lung-directed chemotherapy.
Aortic diameter is the primary metric used to trigger surgical intervention in chronic type B aortic dissection (TBAD). This study investigates the correlation between diameter growth and changes in aortic centerline length (CL), centerline curvature (CC), and aortic volume in chronic TBAD. We retrospectively collected 88 computed tomography (CT) images from 35 patients with chronic TBAD. Eighteen patients had 3 CT images for analysis (CT-1, CT-2, and CT-3), whereas the remaining 17 had 2 scans (CT-1 and CT-2). Three-dimensional geometry and aortic centerlines were reconstructed from each scan, and the mean and maximum TBAD diameters were obtained. The CL and CC from the left subclavian to celiac branch were also obtained. The relationship between the diameter growth rate and the CL/CC growth rate was investigated using linear regression analysis. Between CT-1 and CT-2, the CL growth rate had an inverse correlation with the mean (R = -0.8169; P < .0001) and maximum (R = -0.8043; P < .001) diameter growth rates. The CC growth rate also had an inverse correlation with the mean (R = -0.8460; P < .001) and maximum (R = -0.8588; P < .001) diameter growth rates. Between CT-2 and CT-3, we found similar inverse correlations between the diameter growth rate and the CL/CC growth rates (all |R| > 0.78; P < .001). Moreover, early CL/CC growth (CT-1 to CT-2) was a stronger predictor of late diameter growth (CT-2 to CT-3) (|R| > 0.72) than early diameter growth itself (|R| < 0.61), indicating that centerline length and curvature may be better predictors of diameter growth in chronic TBAD. In chronic TBAD, CL/CC growth has an inverse relationship with aortic diameter growth and may be used to predict diameter growth.