We evaluated the feasibility and long-term outcomes of minimally invasive surgery (MIS) after neoadjuvant immunotherapy in patients with resectable non-small cell lung cancer (NSCLC). Patients with stage IB-IIIB NSCLC who underwent induction immunotherapy followed by surgery (2015-2022) were included. Overall and event-free survival were estimated using the Kaplan-Meier approach. Cox regression was used to quantify associations between clinical and pathologic variables and survival outcomes. In total, 69 patients met the inclusion criteria (MIS, n = 38; open surgery, n = 31). Patients who underwent open surgery had larger tumors (P < .001), more-frequent extended resection (P = .007), more major complications (P = .03), and longer in-hospital stay (P = .002), compared with patients who underwent MIS. The rate of conversion to open surgery in the MIS group was 16% (6/38). Histologic subtype (P = .057), completeness of resection (P > .9), response to treatment (P > .9), and stage (P = .6) were not statistically different between groups. Five-year event-free and overall survival in the MIS patients were 72% and 80%, respectively. Among all patients, pathologic N2 disease (hazard ratio [HR], 3.04; 95% CI, 1.1-8.38; P = .031) and upstaging (HR, 2.57; 95% CI, 1.05-6.34; P = .04) were associated with a greater risk of recurrence. Major pathologic response was associated with longer event-free survival (HR, 0.22; 95% CI, 0.05-0.93; P = .039). Programmed death-ligand 1 level (HR, 1.02; 95% CI, 1.01-1.04; P = .007) and tumor dimensional reduction after treatment (HR, 0.96; 95% CI, 0.93-0.99; P = .016) were associated with major pathologic response. MIS after induction immunotherapy is feasible and does not compromise long-term oncologic outcomes.
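Several of the studies summarized here estimate event-free and overall survival with the Kaplan-Meier approach. As a minimal sketch of how the product-limit estimate is formed (the follow-up times and event indicators below are hypothetical, not data from any study in this collection):

```python
def kaplan_meier(times, events):
    """Product-limit estimate: return [(t, S(t))] at each distinct event time.

    times  -- follow-up time for each patient
    events -- 1 if the event (e.g., death or recurrence) occurred, 0 if censored
    """
    data = sorted(zip(times, events))
    n_at_risk = len(data)
    surv = 1.0
    curve = []
    i = 0
    while i < len(data):
        t = data[i][0]
        d = c = 0  # events and censorings at time t
        while i < len(data) and data[i][0] == t:
            if data[i][1]:
                d += 1
            else:
                c += 1
            i += 1
        if d:
            surv *= 1 - d / n_at_risk  # survival steps down only at event times
            curve.append((t, surv))
        n_at_risk -= d + c  # censored patients leave the risk set after t
    return curve

# Hypothetical follow-up times (months) and event indicators
times = [6, 12, 12, 18, 24, 30, 36, 36, 48, 60]
events = [1, 1, 0, 1, 0, 1, 0, 0, 1, 0]
print(kaplan_meier(times, events))
```

Censored observations reduce the number at risk without stepping the curve down, which is why the estimate differs from a naive fraction of survivors.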
The optimal treatment strategy for residual Type A aortic arch dissections after ascending aortic replacement remains debated. This study evaluated the frequency and outcomes of aortic reinterventions after reoperative total arch replacement with frozen elephant trunk (FET) implantation for residual Type A aortic dissection. Between April 2015 and March 2025, 116 patients underwent elective redo arch replacement using the FET technique for residual Type A dissection at 6 European aortic centers. Incidence, procedural characteristics, and outcomes of patients requiring downstream aortic reinterventions were retrospectively analyzed. In-hospital mortality following FET implantation was 5% (n = 6). Among 110 hospital survivors, 43 patients (39%) required at least 1 aortic reintervention within 48 months: thoracic or abdominal endovascular aortic repair in 36 (80%), graft replacement in 2 (4.4%), and open thoracoabdominal replacement in 4 (9%). No spinal cord injury or disabling stroke occurred postoperatively; 3 patients (6%) died within 30 days of reintervention. After discharge, a second reintervention was necessary in 7 patients (16%), involving endovascular repair in 4 (57%) and open thoracoabdominal replacement in 3 (43%). The cumulative risk of any reintervention was 31.0% (95% CI, 23.1%-40.6%) at 2 years and 55.4% (95% CI, 42.0%-66.9%) at 4 years, whereas overall survival after FET was 87.8% (95% CI, 81.1%-95.1%) and 82.0% (95% CI, 72.1%-93.3%) at 2 and 4 years, respectively. Reoperative arch replacement with FET yields favorable survival but is accompanied by a substantial need for staged downstream aortic reinterventions, most of which can be managed endovascularly, underscoring the importance of structured long-term imaging surveillance.
Shared decision-making of prosthesis selection for aortic valve replacement (AVR) weighs patient-specific valve durability against oral anticoagulation requirements. Given evolving strategies of lifetime management, we evaluated contemporary longitudinal outcomes of patients aged 65 years or older undergoing bioprosthetic versus mechanical AVR. Patients aged 65-85 years who underwent isolated surgical AVR (2018-2022) were identified in the United States Centers for Medicare and Medicaid Services database and stratified by valve type as bioprosthetic (bAVR) or mechanical (mAVR). Doubly robust risk adjustment of variables including frailty was performed using inverse probability weighting of propensity scores, multivariable logistic regression, and time-to-event analysis with competing risks. The primary outcome was the composite of all-cause mortality, valve reintervention, stroke, and bleeding. The study cohort included a total of 69,423 patients (62,925 bAVR and 6498 mAVR). After comprehensive risk adjustment, bAVR versus mAVR was associated with superior freedom from the primary composite outcome over the 5-year study period (hazard ratio [HR], 0.82; P < .001), as well as reductions in longitudinal mortality (HR, 0.78; P < .001), all-cause readmissions (HR, 0.89; P < .001), and readmissions for bleeding (HR, 0.47; P < .001) and heart failure (HR, 0.86; P < .001). No difference in aortic valve reintervention was observed between groups (HR, 0.93; P = .65). Similar trends favoring bAVR were observed in subanalyses of patients aged 65 to 69 years. For patients with preoperative end-stage renal disease, there was no difference in the longitudinal primary outcome between bAVR and mAVR. In Medicare beneficiaries, bAVR was associated with superior risk-adjusted survival, fewer bleeding complications, and fewer readmissions, with no difference in valve reintervention compared with mAVR.
Transcatheter aortic valve replacement (TAVR) is an established alternative to surgical aortic valve replacement (SAVR) in intermediate- and low-risk patients with severe aortic stenosis. However, most randomized trials have enrolled patients older than 70 years with limited follow-up, and evidence on long-term outcomes in younger populations remains insufficient. Our objective was to compare long-term clinical outcomes between propensity-matched TAVR and SAVR patients aged <70 years. A total of 959 patients (SAVR: 808, TAVR: 151) were included, resulting in 132 propensity-matched patients per group. The primary outcome was a composite of all-cause death, stroke, and procedure- or valve-related hospitalization at 30 days, 2 years, 5 years, and up to 10 years. A flexible parametric survival model was used to evaluate the hazard ratio (HR) for reintervention and its time dynamics. Baseline characteristics were balanced between matched groups (mean age, 64.9 years; Society of Thoracic Surgeons Predicted Risk of Mortality score, 2.9%). At 10 years, primary outcome rates were similar (SAVR: 50.8% vs TAVR: 42.9%; HR, 1.01; 95% CI, 0.66-1.53; P = .95). Patients who underwent SAVR had greater rates of acute kidney injury, bleeding, and new-onset atrial fibrillation during hospitalization (P < .01), whereas permanent pacemaker rates were similar (P = .140). Reintervention was more frequent after TAVR (10.6% vs 3.7%; P = .03), and the flexible parametric survival model showed a greater risk of reintervention after TAVR (HR, 3.63; 95% CI, 0.73-1.78; P = .01). At long-term follow-up, all-cause mortality, stroke, and rehospitalization rates were similar between TAVR and SAVR in patients younger than 70 years with aortic stenosis. However, patients who underwent TAVR had greater rates and risk of reintervention, which did not affect mortality.
To investigate the prognosis of peripheral early-stage lung adenocarcinoma, including aggressive histologic patterns, treated by lobectomy or segmentectomy. This was a retrospective multicenter cohort of patients with cT1a-bN0M0 lung adenocarcinoma who underwent lobectomy or segmentectomy with systematic lymph node dissection in 10 European centers (one per country) from 2015 to 2021. Overall survival (OS), disease-free survival (DFS), and lung cancer-specific death (LCSD) were compared between groups in the entire dataset and in the dataset of aggressive histologic patterns, before and after propensity score matching (PSM). Prognostic risk factors were analyzed using parsimonious Cox regression models. Recurrences were assessed by linearized risks. Lobectomy and segmentectomy were performed in 1029 (73.1%) and 377 (26.8%) patients, respectively. In total, 427 (30.3%) patients had at least 1 aggressive histologic (micropapillary or solid) pattern, of whom 88 (20.7%) underwent segmentectomy. OS, DFS, and LCSD rates were similar between patients who underwent lobectomy or segmentectomy in both datasets, before and after PSM. In the aggressive dataset after PSM, 5-year OS rates were 88.0% (95% CI, 80.9%-95.7%) for lobectomy versus 89.1% (95% CI, 82.2%-96.6%) for segmentectomy (P = .8); 5-year DFS rates were 79.8% (95% CI, 70.8%-89.8%) versus 80.6% (95% CI, 71.6%-90.6%) (P = .6); and 5-year LCSD rates were 6.0% versus 7.8% (P = .8). Locoregional recurrence was not higher in patients who underwent segmentectomy in the unmatched cohorts of both the entire dataset (linearized risks: lobectomy 0.078, segmentectomy 0.073) and the aggressive dataset (linearized risks: lobectomy 0.036, segmentectomy 0.011). Aggressive histologic patterns affected only LCSD, and only when they were dominant. Segmentectomy seems comparable to lobectomy for patients with peripheral cT1a-bN0M0 lung adenocarcinoma, even in the case of aggressive histologic patterns.
Staged single-lung transplantation (SSLT) has been proposed as a strategy for high-risk patients amid donor organ shortages. We aimed to compare modern SSLT versus bilateral lung transplantation (BLT) outcomes and identify factors associated with long-term survival after SSLT. We retrospectively analyzed the United Network for Organ Sharing registry for all adult lung transplants from 2005 to 2021. Propensity-matched outcomes of SSLT (2 sequential, contralateral single-lung transplants) were compared with BLT. Kaplan-Meier methods were used to assess survival, and multivariable analysis identified independent predictors of mortality after SSLT. Among 188 recipients of SSLT and 2948 recipients of BLT, recipients of SSLT were older and more likely to have interstitial lung disease. Despite greater 1-year survival among recipients of SSLT (92.0% vs 87.5%), median overall survival was significantly shorter than in the BLT cohort (5.8 vs 7.1 years), and both 5-year (54.8% vs 63.8%) and 10-year (27.4% vs 43.6%) survival estimates were lower (P < .001). Thirty-day mortality after the second transplant was also greater in SSLT (4.7% vs 2.7%, P = .04). On multivariable analysis, SSLT was associated with greater 3-year mortality (hazard ratio, 1.72; 95% CI, 1.03-2.84), although on subgroup analysis, patients with a diagnosis of interstitial lung disease, age ≥65 years, or body mass index ≥30 at the time of listing, as well as those with an interval of ≥5 years between the staged transplants, showed similar outcomes. In the modern era, BLT is associated with superior long-term survival compared with SSLT. SSLT can achieve acceptable early outcomes in select patients or donor-scarce situations, but BLT should remain the preferred approach when feasible.
In the United States, neoadjuvant chemoradiotherapy with CROSS (41.6 Gy, carboplatin/paclitaxel) has been the standard for patients with operable locally advanced esophageal adenocarcinoma (EAC). Recently, the ESOPEC multicenter phase III trial reported superior overall survival (OS) with perioperative FLOT (5-FU/leucovorin/oxaliplatin/docetaxel) compared to CROSS. We aimed to examine trends in neoadjuvant treatment strategies and compare outcomes in a contemporary real-world cohort. Patients with clinical stage II-IVA (T2-4aN0-3M0 by the American Joint Committee on Cancer 8th Edition staging classification) EAC who underwent neoadjuvant chemoradiation (nCRT; ≥41.4 Gy) or neoadjuvant chemotherapy (nCT) followed by esophagectomy were identified in the National Cancer Database (2015-2022). Postoperative outcomes and overall survival (Kaplan-Meier) were compared in propensity-matched cohorts. A total of 9845 patients were identified, of whom 8955 (91.0%) received nCRT and 890 (9.0%) received nCT before esophagectomy. The median patient age was 65 years (interquartile range [IQR], 58-71 years), and 88.1% (n = 8674) of the patients were male. Over time, there was a trend toward increasing use of nCT, from 6.7% of patients in 2015 to 9.6% in 2022 (P < .001). In well-balanced matched cohorts (1:1, nCT vs nCRT; n = 106), despite an improved pathologic response with nCRT, nCT was associated with improved OS (median, 51.4 months vs 40.5 months; P = .029; hazard ratio, 0.82; 95% confidence interval, 0.68-0.98) and a better 5-year OS rate (49.8% vs 43.3%) at a median follow-up of 28.8 months (IQR, 17.2-42.5 months). In a contemporary cohort of patients with locally advanced EAC, nCRT remained the predominant approach. However, consistent with ESOPEC, nCT alone was associated with improved OS.
Esophageal squamous cell carcinoma (ESCC) poses significant therapeutic challenges, with poor prognosis despite standard treatments including neoadjuvant chemoradiotherapy or chemotherapy. Emerging evidence suggests that adding immune checkpoint blockade to conventional chemotherapy-referred to as neoadjuvant immunochemotherapy (nICT)-enhances pathologic tumor regression; however, real-world data regarding long-term survival outcomes remain limited. This single-center retrospective study analyzed data from 204 patients with locally advanced ESCC treated between January 2019 and September 2022. Patients received either neoadjuvant chemotherapy (nCT; n = 95) or nICT (n = 109). To address baseline imbalances, stabilized inverse probability of treatment weighting (sIPTW) based on propensity scores was applied. The primary endpoints were overall survival (OS) and disease-free survival (DFS). The median duration of follow-up was 36.7 months (95% confidence interval [CI], 33.8-39.6 months). After applying sIPTW, compared to the nCT group, the nICT group exhibited significantly improved OS (hazard ratio [HR], 0.49; 95% CI, 0.26-0.91; P = .03) and DFS (HR, 0.58; 95% CI, 0.35-0.95; P = .04). The 3-year OS and DFS rates were 84.3% and 69.8%, respectively, in the nICT group and 68.8% and 58.1%, respectively, in the nCT group. In the nICT group, pathologic complete response (pCR) and major pathologic response (MPR) were associated with reduced risk of recurrence and death. nICT was associated with improved OS and DFS in patients with locally advanced ESCC. Higher rates of pCR and MPR were correlated with better survival outcomes. These findings support the integration of immunotherapy into neoadjuvant treatment strategies, although prospective trials remain necessary for validation.
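The stabilized inverse probability of treatment weighting (sIPTW) described above reweights each patient by the marginal treatment probability divided by that patient's propensity score. A minimal sketch of the weight computation, using hypothetical propensity scores rather than values fitted from any study data:

```python
def stabilized_iptw(treated, propensity):
    """Stabilized weights: P(T=1)/ps for treated, P(T=0)/(1-ps) for controls."""
    p_treat = sum(treated) / len(treated)  # marginal probability of treatment
    weights = []
    for t, ps in zip(treated, propensity):
        if t:
            weights.append(p_treat / ps)
        else:
            weights.append((1 - p_treat) / (1 - ps))
    return weights

# Hypothetical cohort: 1 = nICT, 0 = nCT, with assumed propensity scores
treated = [1, 1, 1, 0, 0, 0]
propensity = [0.8, 0.6, 0.5, 0.4, 0.3, 0.2]
w = stabilized_iptw(treated, propensity)
print([round(x, 3) for x in w])
```

Stabilizing by the marginal treatment probability keeps the weights near 1 and reduces the variance inflation that extreme propensity scores cause with unstabilized weights.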
To evaluate whether an institutional shift from anticoagulation with heparin to bivalirudin in Impella 5.5 patients altered bleeding or thrombotic rates, transfusion dependency, or device biocompatibility. All patients supported by an Impella 5.5 between August 2022 and December 2024 were reviewed retrospectively. Heparin was used before December 2023, at which point the switch was made to bivalirudin. Groups were defined by anticoagulant initiated by postimplantation day 2, and patients with crossover or anticoagulation pauses for >2 days were excluded from the laboratory analyses. Baseline characteristics, anticoagulation details, complications, and clinical outcomes were compared between the heparin and bivalirudin groups. Biocompatibility was assessed via laboratory values recorded over the first 14 days of Impella support or until device explantation in patients supported for <14 days. Among the 168 patients, 83 (49%) received heparin and 85 (51%) received bivalirudin. Baseline characteristics were similar in the 2 groups (P > .05), as were rates of thrombotic events (13% vs 14%; P = .871), total bleeding events (37% vs 30%; P = .275), and transfusion requirements (57% vs 51%; P = .433). However, bivalirudin patients demonstrated significantly faster rates of platelet recovery and lactate dehydrogenase reduction after Impella implantation (all P < .05), had fewer insertion site bleeds that required reoperation (10% vs 1%; P = .015), and were more often within their partial thromboplastin time goal (on 24% vs 41% of device-supported days; P ≤ .001). Systemic anticoagulation with bivalirudin in Impella 5.5 patients is associated with clinically meaningful improvements in postoperative thrombocytopenia and biocompatibility, with easier management, fewer insertion site bleeds, and more time within the target anticoagulation range.
We have demonstrated cardioprotection by the adenosine triphosphate-sensitive potassium (KATP) channel opener diazoxide in multiple animal models in an effort to reduce myocardial stunning following cardiac surgery. These translational findings supported this first-in-human Phase I safety and feasibility trial evaluating diazoxide as an additive to hypothermic, hyperkalemic cardioplegia. In this Food and Drug Administration-approved safety and feasibility trial, 30 patients undergoing nonemergent cardiac surgery (coronary artery bypass, aortic, or valve) received intracoronary diazoxide in the first dose of hypothermic, hyperkalemic cardioplegia at the time of cross-clamp placement. Safety and clinical endpoints, including a novel definition of myocardial stunning (need for inotropic support for >24 and <72 hours), myocardial enzymes, change in ejection fraction, and hemodynamic parameters, were collected. Thirty patients received intracoronary diazoxide. The Society of Thoracic Surgeons Predicted Risk of Mortality ranged from 0.3% to 8.0%. An increase in mean arterial pressure was noted with diazoxide administration (mean, 59.8 mm Hg before vs 62.0 mm Hg after). The mean time to arrest was 80 ± 43 seconds. Two patients (6.7%) had myocardial stunning. Mean peak lactate was 5.6 ± 5.1, peak creatine kinase was 939 ± 588 U/L, and peak troponin was 12,531 ± 18,615 ng/L. An increase in left ventricular ejection fraction of 2.1 ± 4.9% (prebypass to postbypass by transesophageal echocardiography) was noted. Intracoronary diazoxide is safe and feasible in patients undergoing nonemergent cardiac surgery.
The cross-sectional area (CSA) of the iliopsoas muscle on computed tomography is associated with sarcopenia, but its perioperative changes and prognostic significance in lung cancer remain unclear. This study aimed to clarify postoperative changes in the iliopsoas CSA and their prognostic correlation in the perioperative period of lung cancer. We analyzed 270 patients with lung cancer aged ≥70 years who underwent lobectomy between January 2016 and December 2020. Iliopsoas CSA at the L3 vertebral level was measured preoperatively and at 6 and 12 months postoperatively and analyzed with ImageJ software. Patients were grouped based on whether their CSA decreased or increased between 6 and 12 months postoperatively. Recurrence-free survival (RFS) and overall survival (OS) were compared between groups. Compared with the preoperative value, the iliopsoas CSA was significantly decreased at 6 months (23.2 mm2; P < .01) and 12 months (15.5 mm2; P = .03). Patients in the decreasing-CSA group were older (P = .02), more often male (P = .02), and had poorer lung function (P = .04). Decreased CSA was linked to poorer 5-year RFS (P = .07) and significantly worse OS (P = .04). Multivariate analysis revealed that a decrease in iliopsoas CSA between 6 and 12 months postoperatively was an independent poor prognostic factor for RFS (P < .01) and OS (P = .05). In elderly patients with primary lung cancer, decreased iliopsoas CSA from 6 to 12 months postoperatively was an independent poor prognostic factor for OS.
This study was conducted to compare improvements in myocardial perfusion between Y-composite and aortocoronary configurations during 1 year after coronary artery bypass grafting (CABG). Of 311 patients who underwent off-pump CABG using the saphenous vein as a Y-composite (composite group) or aortocoronary (aorta group) graft, 204 patients who underwent myocardial single-photon emission computed tomography during the preoperative period, at 3 months after surgery, and at 1 year after surgery were enrolled. After 2:1 propensity score matching, 43 matched sets with 86 composite and 43 aortocoronary patients were compared. Based on a 17-segment model, the degree of perfusion impairment for each segment was evaluated semiquantitatively via a reversibility score, which was defined as rest minus stress perfusion values. The improvement in perfusion was evaluated based on the temporal changes in reversibility scores and was compared using a linear mixed model. Of 2193 segments, 795 segments (518 and 277 segments in composite and aorta groups, respectively) demonstrated a significant decrease in perfusion on preoperative myocardial single-photon emission computed tomography and were successfully revascularized by CABG. Median (quartile 1, quartile 3) preoperative reversibility scores in the ischemic segments were 7 (4, 13) and 7 (4, 12) in composite and aorta groups, respectively (P = .575). Median (quartile 1, quartile 3) reversibility scores at 3 months and 1 year postsurgery were 0 (0, 4) and 0 (0, 5) in the composite group and 0 (0, 4) and 0 (0, 5) in the aorta group, and there was no significant difference in the pattern of improvement in myocardial perfusion between the groups (P = .592). Myocardial perfusion improvement during the first year after CABG was not significantly different between Y-composite and aortocoronary configurations.
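The reversibility score above is defined as rest minus stress perfusion for each of the 17 segments. A short worked example of that per-segment arithmetic; the perfusion values and the ischemia threshold are illustrative assumptions, not study data:

```python
def reversibility_scores(rest, stress):
    """Per-segment reversibility score = rest perfusion - stress perfusion."""
    return [r - s for r, s in zip(rest, stress)]

# Hypothetical semiquantitative perfusion values for a 17-segment model
rest = [70, 68, 72, 65, 60, 75, 71, 69, 74, 66, 62, 70, 73, 67, 64, 76, 68]
stress = [65, 60, 71, 50, 45, 74, 70, 55, 73, 52, 48, 69, 72, 53, 49, 75, 67]

scores = reversibility_scores(rest, stress)
# Flag segments whose score meets an assumed ischemia threshold of 7
ischemic = [i for i, s in enumerate(scores) if s >= 7]
print(scores)
print(ischemic)
```

A large positive score means perfusion drops under stress but recovers at rest, which is the ischemic (and potentially revascularizable) signature the study tracks over time.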
The prognostic significance of atrial fibrillation (AF) in patients with degenerative mitral regurgitation (MR) undergoing mitral valve (MV) repair remains controversial. The association between preexisting AF and long-term clinical and echocardiographic outcomes following mitral repair was analyzed. This retrospective study included 959 patients (mean age, 60 ± 12 years) who underwent MV repair for degenerative MR between 2001 and 2022. Patients were stratified by preoperative rhythm: 670 in sinus rhythm and 289 with AF. Serial echocardiographic changes were assessed using linear mixed models. The mean follow-up was 8.6 ± 5.4 years (8219 patient-years). Compared to patients with sinus rhythm, those with AF were older, more symptomatic, and exhibited greater cardiac remodeling, prompting higher rates of concomitant tricuspid annuloplasty (45% vs 5.7%) and ablation procedures (91% vs 0%) (P < .05 for all). At 10 years, AF patients had lower overall survival (89% vs 96%; P = .003) and freedom from composite adverse events-mitral reintervention, heart failure, stroke, catheter ablation, or pacemaker implantation (56% vs 80%; P < .001). Propensity score matching (n = 239 pairs) for 15 clinical covariates confirmed AF as an independent predictor of adverse events (adjusted hazard ratio, 1.8; 95% confidence interval, 1.3-2.5; P < .001). Additionally, AF patients demonstrated less improvement in left atrial size, tricuspid regurgitation (TR) pressure gradient, and TR severity up to 10 years (group effect P < .001). These findings indicate that AF identifies a higher-risk phenotype characterized by advanced atrial disease and persistent hemodynamic burden following MV repair. Future studies are needed to determine whether intervention before the development of advanced atrial remodeling or AF can improve long-term outcomes.
The extent of arch replacement for aortic dissection remains controversial. The natural history of a residually dissected arch remains unknown, and the morbidity of arch reintervention in these settings is understudied. Therefore, the aim of this study was to investigate outcomes of redo total arch replacement in patients who had previous proximal aortic repair for acute type A dissection. In total, 79 consecutive patients underwent redo total arch replacement with frozen elephant trunk after previous type A dissection repair. Surgical outcomes were obtained from the institutional database, and the electronic medical record was reviewed for follow-up outcomes. The median time between type A repair and redo total arch replacement with frozen elephant trunk was 4.25 [2.05, 7.78] years. The most common comorbidity was hypertension in 76 of 79 (96.2%), and 14 of 79 (17.7%) had previous cerebrovascular disease. The median crossclamp and circulatory arrest times (minutes) were 78 [61, 108] and 18 [14, 25], respectively. The operative mortality rate was 3 of 79 (3.8%), the postoperative stroke rate was 7 of 79 (8.9%), and the paraplegia rate was 4 of 79 (5.1%). The 1-year survival was 93.5% (88.1%-99.2%), and the 3-year cumulative incidence of aortic reintervention was 53.1% (38.3%-65.8%). In total, 35 patients required distal aortic reintervention, including 19 distal thoracic endovascular aortic repair extensions and 5 open thoracoabdominal aneurysm repairs. Redo total arch replacement after previous proximal aortic repair for acute type A dissection can be performed safely at an experienced aortic referral center.
Latin America faces significant disparities in access to cardiovascular surgery training. We used a data-driven geospatial analysis to map surgical deserts-regions underserved in surgical education-and quantified disparities through population-adjusted density metrics and structural variables. Geospatial analysis mapped all cardiovascular surgery training programs in Latin America. A Composite Access Index (density, travel time, economics, structure) was developed. Hierarchical clustering and regression were used to identify surgical deserts and structural impacts, revealing 3 types: geographic, structural, and economic. A total of 243 cardiovascular surgery programs across 19 Latin American countries provided 454 annual positions. Brazil leads with 170 programs and 280 positions, whereas Guatemala, Honduras, and Nicaragua each have 1 program offering 2 positions. Program density is greatest in Cuba (3.39 positions/million), followed by Uruguay (0.88), Peru (0.73), and Panama (0.67), and lowest in Guatemala (0.11), Argentina (0.13), and Venezuela (0.14). Integrated residency models exist in Cuba, Panama, Peru, and Ecuador; only Brazil, Chile, Mexico, and Venezuela require board examinations. Health spending per capita ranges from $934 in Chile to $35 in Haiti. The Composite Access Index distinguishes high-access countries (Cuba 1.00, Uruguay 0.95, Chile 0.85), moderate-access countries (Peru 0.75, Brazil 0.69, Colombia 0.64), and limited-access "surgical deserts" (Bolivia 0.31, Guatemala 0.27). Regression analyses showed no significant effect of structural requirements on access (P > .77). Funding, decentralization, and geography are the primary determinants of equitable training. This comprehensive mapping study shows that cardiovascular surgery training in Latin America is highly unequal; surgical deserts persist as a result of limited funding, geographic isolation, and program decentralization, whereas structural requirements minimally influence access.
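The abstract names the Composite Access Index's four components (density, travel time, economics, structure) but not how they are combined. One common construction for such indices is equal-weighted min-max normalization; the sketch below uses that assumption, with hypothetical country-level inputs and weights that are not taken from the study:

```python
def minmax(values, invert=False):
    """Scale values to [0, 1]; invert when larger raw values mean worse access."""
    lo, hi = min(values), max(values)
    scaled = [(v - lo) / (hi - lo) for v in values]
    return [1 - s for s in scaled] if invert else scaled

def composite_index(density, travel_time, econ, structure):
    parts = [
        minmax(density),
        minmax(travel_time, invert=True),  # longer travel -> lower access
        minmax(econ),
        minmax(structure),
    ]
    # Equal weights assumed; the study's actual weighting is not specified
    return [sum(col) / len(parts) for col in zip(*parts)]

# Hypothetical inputs for three countries
density = [3.39, 0.88, 0.11]   # training positions per million population
travel_time = [2.0, 4.0, 9.0]  # hours to the nearest program
econ = [900, 500, 100]         # health spending per capita (USD)
structure = [1, 1, 0]          # 1 = board examination required

idx = composite_index(density, travel_time, econ, structure)
print([round(v, 2) for v in idx])
```

With min-max scaling the best-off country on every component scores 1.00 and the worst-off scores 0.00, which matches the anchored 0-1 range the abstract reports.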
To investigate the long-term outcomes of double-outlet right ventricle (DORV), focusing on identifying specific subtypes capable of achieving successful biventricular repair (BVR) with favorable results. A retrospective review was conducted of 274 DORV patients followed at a single center from 1999 to 2023. After exclusions, 238 patients were analyzed, including 97 who underwent single ventricle palliation (SVP) and 141 who underwent BVR. Outcomes were assessed using Kaplan-Meier survival and freedom from reintervention estimates, with competing risk analysis for survival, transplantation, or death. The median follow-up was 11.9 years. Subaortic and subpulmonary ventricular septal defects (VSDs) were more common in the BVR group, while noncommitted VSDs and features suggestive of heterotaxy (eg, persistent left superior vena cava, complete atrioventricular septal defect) were more frequent in SVP. BVR tended toward improved transplantation-free survival (8% vs 16% for transplantation or death; P = .06). At 40 years post-repair, estimated outcomes were 70% survival, 8% transplantation, and 20% mortality for BVR, compared to 45%, 18%, and 37%, respectively, for SVP. Subgroup analysis showed equivalent survival in noncommitted, subpulmonary, and doubly committed types, with superior survival in subaortic DORV after BVR (100% vs 75%; P = .025). Freedom from reintervention was also comparable across subtypes. Our study shows that BVR was associated with improved outcomes compared to SVP, including higher survival rates and lower rates of both mortality and transplantation. Decades of experience at our institution demonstrate the feasibility of BVR in complex DORV subtypes, such as noncommitted and subpulmonary VSD.
To evaluate the relationship between intraoperative 3-dimensional (3D) echocardiographic measures of right ventricular (RV) function and invasively derived RV-pulmonary arterial (RV-PA) coupling metrics obtained from single-beat end-systolic pressure-volume analysis, and to identify echocardiographic surrogates capable of discriminating impaired RV-PA coupling during cardiac surgery. In a prospective observational study of 78 adult patients undergoing cardiac surgery with transesophageal echocardiography (TEE) and pulmonary artery catheter monitoring, RV volumes and functional indices were quantified using 3D TEE, whereas RV-PA coupling (Ees/Ea) was calculated using the V0 single-beat method. Patients were stratified by coupling status using an Ees/Ea cutoff of 0.8. Relationships between echocardiographic variables and coupling metrics were analyzed using Spearman correlation, receiver operating characteristic analysis, and multivariable logistic regression adjusted for Society of Thoracic Surgeons risk score and perfusion time. Patients with impaired RV-PA coupling (28.2%) demonstrated significantly larger 3D end-systolic volume and reduced 3D ejection fraction (EF) and 4-dimensional-derived fractional area change (FAC) compared with those with preserved coupling. 3D EF showed the strongest correlation with coupling (ρ = 0.94, P < .001) and significant discriminative performance (area under the curve = 0.94; optimal cutoff <44.2%). Indexed end-systolic volume and 3D FAC also reliably identified impaired coupling, whereas TAPSE and TAPSE/PASP did not. Exploratory analyses showed that low Ees/Ea and reduced 3D FAC/right ventricular systolic pressure were associated with prolonged postoperative vasopressor/inotrope use. Intraoperative 3D volumetric indices, particularly 3D EF and FAC, may closely reflect invasively measured RV-PA coupling and potentially identify patients with impaired coupling.
These findings may support the use of 3D echocardiography as a practical surrogate for intraoperative RV-PA coupling assessment during cardiac surgery.
Randomized data support the mortality benefit of lung cancer screening (LCS); however, completion rates among those meeting eligibility criteria remain concerningly low. The Distressed Communities Index (DCI) is a composite socioeconomic metric assessed by ZIP code. We sought to investigate the association between DCI and completion of scheduled LCS appointments. A retrospective study was conducted in patients scheduled for LCS with low-dose computed tomography at a high-volume center in 2021. Patients were marked as a "no-show" when they missed or canceled their scheduled screening appointment. The association between DCI and screening no-show rates was modeled using a stepwise approach. A total of 1114 patients were scheduled for LCS, of whom 83.5% (n = 928) resided in ZIP codes considered "at-risk" or "distressed" in terms of socioeconomic distress based on their DCI score. In the entire cohort, 18.4% of patients (n = 205) did not attend any screening appointments scheduled within the year. Patients in the "at-risk" and "distressed" categories were more likely to have a no-show (19.5% vs 12%; P = .015). A higher DCI score was associated with increased probability of screening noncompletion on univariable analysis (odds ratio [OR], 1.02; 95% confidence interval [CI], 1.01-1.03; P = .002). On multivariable analysis, additional factors associated with higher likelihood of a no-show included current smoking status (adjusted OR [aOR], 2.59; 95% CI, 1.70-3.96) and initial screening (aOR, 2.07; 95% CI, 1.43-3.00). Higher DCI scores are associated with increased no-show rates for LCS appointments. This tool can be used to target populations at risk of underscreening and associated lung cancer mortality with navigational support or novel interventions to increase screening.
Complete atrioventricular septal defect (CAVSD) repair is ideally performed within the first 6 months of life; however, in resource-limited settings, access barriers often delay surgery. We compared outcomes between early (<6 months) and late (≥6 months) surgical repair at a single reference center in Colombia. A retrospective cohort of 111 consecutive patients (2012-2022) undergoing primary CAVSD repair was analyzed. Patients were divided into early (n = 22) and late (n = 89) groups. Preoperative, intraoperative, and postoperative variables-including operative mortality, survival, and freedom from left atrioventricular valve (LAVV) reintervention-were compared using χ2, Mann-Whitney U, and Kaplan-Meier analyses. Median age at surgery was 4.9 months (interquartile range, 4.5-5.4) for early versus 8.5 months (interquartile range, 7.2-14.3) for late repair. Trisomy 21 was present in 83.8%. The early group had greater preoperative oxygen saturation (P = .004), less prevalent moderate/severe pulmonary hypertension (P = .013), and less LAVV regurgitation (4.5% vs 43.9%, P = .003). Operative mortality was 4.5% overall (0% early vs 5.6% late, P = .58). The early group had longer intensive care unit stays (median 8 vs 6 days, P = .012) and hospital stays (median 21 vs 13 days, P = .006). Medium-term survival and freedom from LAVV reintervention were comparable (log-rank P > .05). In this cohort, later CAVSD repair yielded operative mortality and medium-term outcomes comparable with early repair despite less-favorable preoperative profiles. These findings may support flexible surgical timing in resource-limited settings with robust perioperative care and follow-up.