Limited data exist regarding the use of clinical assessment alone for neuromuscular blockade (NMB) titration in the setting of acute respiratory distress syndrome (ARDS). To compare the amount of cisatracurium (CIS) consumed when utilizing train-of-four (TOF)-guided NMB titrations versus clinical assessment alone without TOF for continuous infusion (CI) administration. A retrospective analysis was performed evaluating TOF-guided titrations compared with clinical assessment alone (non-TOF) for CI NMB with CIS. Individuals within the TOF group were assessed from January 2013 to December 2018, while those within the clinical assessment alone group were assessed from January 2021 to December 2024. Patients were excluded if they were less than 18 years old, had documentation of COVID-19 infection, were receiving extracorporeal membrane oxygenation, or had NMB initiated at an outside hospital. The primary objective was assessing drug utilization between groups. A total of 1047 and 553 individuals were screened, resulting in 99 and 65 included for analysis in the TOF and non-TOF groups, respectively. The median cumulative CIS dose was 665 (472, 927) mg in the TOF group and 536 (400, 699) mg in the clinical assessment alone group, P = 0.011. The median infusion rate was 32% higher in the TOF group: 2.5 (1.8, 3.6) versus 1.9 (1.8, 2.4) µg/kg/min, P < 0.001, despite similar starting rates. Median drug costs were also significantly lower in the non-TOF group than in the TOF group: $143 (107, 187) versus $178 (126, 247), P = 0.011. This study assesses drug utilization when comparing TOF-guided NMB titration with clinical assessment alone in the modern ARDS era. Utilization of clinical assessment alone without TOF monitoring for CI CIS resulted in significantly reduced drug utilization and costs. Further studies are needed to assess the impact of clinical assessment alone on improvement of oxygenation.
Ketamine has analgesic and sedative properties that are thought to reduce opioid tolerance and opioid-induced hyperalgesia. For this reason, ketamine may be used to manage pain and agitation, although the literature evaluating its use for this purpose is limited. To describe the prescribing practices of ketamine in the medical intensive care unit (MICU) as it pertains to pain and agitation, as well as assess measures of safety and efficacy. This was a single-center, retrospective cohort study of critically ill patients who received a ketamine infusion for pain and/or agitation between June 2021 and August 2024. The primary outcome was opioid consumption, reported in oral morphine milligram equivalents (MME), and sedative use pre- versus post-ketamine initiation. Secondary outcomes included paired mean pain scores, proportion of time within goal Richmond Agitation-Sedation Scale score range, and incidence of adverse events. For the 78 patients included, there was no difference in analgesic use pre- versus post-ketamine initiation (60 vs 45 MME; P = 0.677). No differences were observed in sedative doses required pre- versus post-ketamine initiation (5.3 vs 3.4 mg of benzodiazepines, P = 0.508; 0.9 vs 1.1 μg/kg/h of dexmedetomidine, P = 0.062; and 44 vs 44 μg/kg/min of propofol, P = 0.180). There was a median paired difference of -1 point on the pain scale ([interquartile range -1, 0]; P = 0.030) after ketamine initiation. Ketamine was associated with an increased incidence of hallucinations (1.3% vs 6.4%; P = 0.046). To the authors' knowledge, this is the first study examining the use of ketamine for pain and/or agitation that specifically focuses on MICU patients. The clinical benefit of the routine use of ketamine for these indications in this patient population is unclear, and its use may increase patients' risk of adverse effects such as hallucinations.
The Impella® is a continuous axial flow pump that utilizes a purge solution to create a positive pressure barrier and prevent pump thrombosis. Bicarbonate-based purge solution (BBPS) is currently Food and Drug Administration-approved specifically for patients who are unable to tolerate heparin or have a contraindication to heparin, but solution choice between BBPS and heparin-based purge solution (HBPS) varies in clinical practice. This study aims to compare the efficacy and safety of BBPS versus HBPS when BBPS is used as the standard of care. This study was a retrospective analysis at 2 tertiary academic medical centers within the same health system (IRB Protocol #2023P002761). Adult patients were included if they required an Impella® device and received either BBPS or HBPS for at least 72 hours between July 2022 and October 2023. The major endpoint was a composite outcome of pump thrombosis, stroke, and bleeding events. Pump thrombosis was defined as incidence of 2 or more purge pressures greater than 800 mmHg, the use of thrombolytic purge solution, or thrombus noted on imaging. Minor endpoints included intensive care unit (ICU) mortality, ICU length of stay, and vascular complications. A total of 178 patients were evaluated, of which 66 were included. The composite outcome occurred in 23 of 38 patients (60.5%) in the BBPS group and 17 of 28 patients (60.7%) in the HBPS group (P = 0.99). There was no difference in pump thrombosis, stroke, or bleeding events between cohorts. Vascular events were noted in 5 of 28 patients (17.9%) in the HBPS group and 13 of 38 patients (34.2%) in the BBPS group (P = 0.14). Additional minor endpoints were not significantly different between the groups. This study found no differences in pump thrombosis, stroke, and bleeding events between the BBPS and HBPS groups.
Although these findings support utilizing BBPS as standard of care, larger, multi-center studies are needed to confirm the absence of differences in bleeding or thrombotic outcomes.
The objective of this systematic review and meta-analysis was to compare the efficacy and safety of glucagon-like peptide-1 (GLP-1) receptor agonists for the management of antipsychotic-induced weight gain. A systematic review was conducted following PRISMA methodology through July 2025 that evaluated the efficacy and safety of GLP-1 agonists for the management of antipsychotic-induced weight gain. Efficacy endpoints were change in body weight (kg), change in body mass index (BMI) (kg/m2), and change in HbA1c (%). The safety endpoint was gastrointestinal (GI) adverse effects. A P-value of less than 0.05 was considered statistically significant, and heterogeneity was reported as I2. Six studies were included in this systematic review, of which 4 trials were included in the meta-analysis. The difference found between GLP-1 agonists and placebo was a change in weight of -5.85 kg (P = 0.0622, 95% CI = -9.72 to -1.97), a change in BMI of -2.11 kg/m2 (P = 0.7692, 95% CI = -5.51 to 1.29), and a change in HbA1c of -1.58% (P = 0.6659, 95% CI = -4.75 to 1.58). The overall risk ratio for a patient to experience a GI-related adverse effect when taking a GLP-1 agonist compared to placebo was 1.83 (95% CI = 1.42 to 2.37). While the efficacy endpoints did not reach significance, this meta-analysis shows that select GLP-1 receptor agonists may be used to promote weight loss in patients taking antipsychotics. Controlling antipsychotic-induced weight gain helps patients remain adherent to therapeutic doses of their antipsychotic medications. The use of certain GLP-1 agonists may be considered to help promote weight loss in patients experiencing antipsychotic-induced weight gain.
This review summarizes current evidence on the efficacy and safety of tofersen (Qalsody) in treating amyotrophic lateral sclerosis (ALS). PubMed, MEDLINE, Google Scholar, and ClinicalTrials.gov were searched using the keywords: Qalsody, BIIB067, antisense oligonucleotides, SOD1, and amyotrophic lateral sclerosis. Articles published from inception to November 2025 were included. English-language studies assessing the pharmacokinetics, pharmacology, efficacy, and safety of tofersen were included. Prescribing information and real-world evidence were also reviewed. Tofersen is an intrathecally administered antisense oligonucleotide targeting superoxide dismutase 1 (SOD1) mRNA. Early trials demonstrated dose-dependent reductions in cerebrospinal fluid (CSF) SOD1 protein levels of 33% and slower ALS Functional Rating Scale-Revised (ALSFRS-R) decline compared to placebo (-1.19 vs -5.63 points). In Phase 3 trials, tofersen reduced CSF SOD1 by 29% and plasma neurofilament light chain (NfL) by 60%, while these biomarkers increased in the placebo group. There was no significant difference in ALSFRS-R decline between tofersen and placebo (-6.98 vs -8.14; P = 0.97). Real-world data show favorable patient-related outcomes and improvement in ALSFRS-R. Adverse effects are primarily lumbar puncture related, with serious neurologic events documented in 7% of tofersen recipients. Relevance to Patient Care and Clinical Practice in Comparison to Existing Drugs: As the first Food and Drug Administration (FDA)-approved gene-directed therapy for SOD1 ALS, tofersen directly targets the underlying genetic cause. Barriers include the need for genetic confirmation and intrathecal administration. Tofersen provides a promising targeted treatment option for pathogenic SOD1 ALS. Ongoing studies will clarify its long-term clinical impact.
Patients with traumatic brain injury (TBI) may develop paroxysmal sympathetic hyperactivity (PSH), a syndrome manifesting as cyclic increases in vital signs and motor activity. Previous studies show propranolol mitigating sympathetic symptoms and negative outcomes, but few reports assess its effect on body temperature. The purpose of this study was to determine if propranolol attenuates hyperthermia secondary to PSH in patients with TBI. This study evaluated febrile patients with TBI treated with propranolol. The primary outcome was the difference in maximum (Tmax), minimum (Tmin), and average (Tavg) daily temperatures and temperature variability (Tvar). Secondary outcomes included similar evaluation of heart rate (HR) and mean arterial pressure (MAP), differences in a modified clinical features scale (mCFS), and number of infectious work-ups before and after propranolol initiation. Data were collected one day before, the day of, and 3 days after propranolol initiation. Repeated measures analysis of variance (ANOVA) was used to evaluate vital sign differences among days of therapy. Post-hoc Tukey's tests were performed to identify between-day differences if they existed. Fifty-nine patients were included. The majority were male (76.3%) and had a severe TBI (78.0%). Tmax, Tmin, Tavg, and Tvar did not differ between Day -1 and Day 3. Tmax decreased numerically from 38.6 ± 0.11°C on Day 0 (the day of propranolol initiation) to 38.3 ± 0.14°C on Day 3, but this change was not significant (P = 0.22). Maximum HR was significantly decreased after propranolol initiation on Day 3 (P < 0.001), and average HR was significantly decreased on Days 2 (P = 0.04) and 3 (P = 0.01). Changes in MAP were not statistically significant. The mCFS and number of infectious work-ups were significantly lower after propranolol initiation. Temperature was not significantly decreased after propranolol initiation in patients with TBI and hyperthermia; however, HR and mCFS were significantly lower.
Larger studies are needed to characterize the effect of propranolol on temperature in patients with PSH.
Cellulitis is a common cause of hospitalization imposing significant burden on healthcare systems. Treatment disposition is dependent on extent and severity of infection with a range of possibilities from ambulatory to inpatient management. Limited evidence is available, aside from expert opinion, to guide the use of long-acting lipoglycopeptides (LaLGPs) to reduce hospitalizations. Identify patient-specific risk factors associated with short-stay hospitalizations (<72 hours) for cellulitis to guide development of an emergency department (ED) LaLGP decision algorithm. Adult patients with cellulitis treated with antibiotic monotherapy were screened. Three cohorts were identified based on length of hospitalization (ED discharge, <72 hours, ≥72 hours). The primary outcome was to identify risk factors associated with short-stay hospitalization. Patients were excluded if they required surgical intervention in the operating room, received polymicrobial therapy, required intensive care unit admission, received cellulitis treatment in the past 30 days, or were treated for multiple infections. A total of 161 patients were analyzed (ED discharge, n = 75; Short-stay, n = 46; and Long-stay, n = 40). Need for incision and drainage (odds ratio [OR] 1.8; 95% confidence interval [CI]: 0.76-2.36), advancing age (OR 1.02; 95% CI 1.00-1.05), and presenting with fever (OR 32.7; 95% CI 5.89-616.5) were associated with short-stay hospitalization compared to ED discharge. Diabetes (OR 3.02, 95% CI 1.31-7.11), presence of purulence (OR 3.86, 95% CI 1.09-13.79), and history of methicillin-resistant Staphylococcus aureus (MRSA) (OR 4.31, 95% CI 1.08-17.96) were independently associated with long-stay hospitalizations. Fever was the only factor independently associated with short-stay hospitalization, compared to patients discharged directly from the ED. Diabetes, purulence, and history of MRSA were associated with longer hospitalizations.
Inclusion of these factors in a decision algorithm may help guide use of LaLGPs for cellulitis in the ED, potentially reducing cellulitis hospitalization rates.
Antifungal-resistant Candida species have become increasingly prevalent in recent years, posing significant therapeutic challenges in the inpatient setting. Despite their clinical relevance, comprehensive data evaluating risk factors for the development of invasive infections caused by Candida spp. with antifungal resistance are limited. This study aims to identify independent predictors for the development of invasive clinical infections caused by fluconazole-resistant or echinocandin-resistant Candida spp. in hospitalized veterans across the United States Veterans Health Administration. This retrospective, observational, nationwide analysis included adults ≥18 years admitted to any Veterans Affairs Medical Center between January 1, 2009, and September 30, 2024, with culture-positive or rapid diagnostic test-confirmed invasive Candida spp. infection from otherwise-sterile sites. Data were gathered on baseline demographics, baseline laboratory data, comorbid conditions, and exposure to antibacterial agents, antifungal agents, β-hydroxy β-methylglutaryl-CoA (HMG-CoA) reductase inhibitors, and immunosuppressant drug therapy. Univariate and multivariate logistic regression models were used to assess associations between clinical variables and the development of invasive fluconazole-resistant and echinocandin-resistant Candida spp. Eligible cases were identified from 25 Veterans Affairs Medical Center facilities; 1651 episodes had available fluconazole susceptibility data, and 1117 episodes had echinocandin susceptibility data. Fluconazole non-susceptibility was independently associated with at least 7 days of recent exposure to either fluconazole or an echinocandin, among a few other variables. Echinocandin resistance, while less common, was strongly linked to the receipt of dialysis, prolonged Gram-positive antibacterial exposure, and any prior antifungal use. These findings can inform antifungal stewardship efforts to curb the rise of resistant Candida spp.
Guidelines recommend arginine vasopressin (AVP) at a fixed dose for patients with septic shock who remain hypotensive after fluid resuscitation and norepinephrine, regardless of body weight. This study seeks to evaluate whether body weight affects hemodynamic response (HDR) to AVP in critically ill patients with septic shock. This was a single-center, retrospective cohort study of adult patients with septic shock. Patients received AVP at a fixed rate of 0.03 units/min in addition to catecholamine vasopressors and were grouped into 4 categories according to admission body weight. The primary endpoint was the time from AVP initiation to HDR, defined as a decrease in norepinephrine equivalents (NEEs) by at least 0.03 mcg/kg/min while maintaining a mean arterial pressure of at least 65 mm Hg for 1 hour. Secondary endpoints included time to HDR defined as a reduction in NEE of at least 0.05 mcg/kg/min, intensive care unit (ICU) length of stay (LOS), total duration of vasoactive agents, incidence of renal replacement therapy (RRT), and 28-day mortality. A total of 170 patients were included in the study. No differences were observed between categories in time to HDR of 0.03 NEE (P = 0.854) or 0.05 NEE (P = 0.985). The total duration of vasopressors and ICU LOS were also similar between categories (P > 0.05). Patients weighing <75 kg had a lower incidence of RRT (P < 0.001) and patients weighing <100 kg had a lower 28-day mortality (P = 0.006). These findings remained significant after regression analysis. Although body weight did not have a significant impact on the time to HDR, there was a significant difference in the incidence of RRT and 28-day mortality favoring those who weighed <75 and <100 kg, respectively. These findings contribute to the existing body of evidence, though larger trials are warranted to assess this further.
Inflammation-driven mechanisms play a central role in adverse outcomes after non-ST-elevation myocardial infarction (NSTEMI), yet simple, widely available biomarkers for early risk stratification remain insufficiently defined. Hemogram-derived indices and iron-related inflammatory markers may provide complementary prognostic information. To evaluate the prognostic significance of the mean platelet volume-to-monocyte ratio (MMR) and serum ferritin in predicting major adverse cardiovascular events (MACE) in patients with NSTEMI, and to assess the association of angiotensin-converting enzyme (ACE) inhibitor therapy with clinical outcomes. This prospective cohort study included 170 consecutive NSTEMI patients admitted to the University Clinical Center Tuzla between February 2022 and January 2023. All patients received dual antiplatelet therapy and high-intensity statins. The baseline evaluation included a complete blood count, serum ferritin, and C-reactive protein. MMR was calculated as the ratio of mean platelet volume to absolute monocyte count. Patients were followed for 12 months for the occurrence of MACE, defined as cardiovascular death, non-fatal myocardial infarction, urgent revascularization, stroke, or hospitalization for heart failure. During follow-up, 103 patients (60.6%) experienced MACE. Admission MMR (18.1 ± 11.7 vs 13.2 ± 5.5; P = 0.003) and ferritin levels (284 ± 396 vs 152 ± 109 µg/L; P = 0.001) were significantly higher in patients with events. In multivariable analysis, both MMR (odds ratio [OR] 1.06, 95% confidence interval [CI] 1.02-1.11; P = 0.008) and ferritin (OR 1.28 per 100 µg/L, 95% CI 1.10-1.55; P = 0.003) independently predicted MACE, while ACE inhibitor therapy was associated with a lower risk (OR 0.24, 95% CI 0.08-0.70; P = 0.01). The combined model demonstrated good discriminative performance (AUC 0.72; 95% CI 0.64-0.80). 
Elevated admission MMR and ferritin were independently associated with a higher 1-year risk of MACE in patients with NSTEMI. ACE inhibitor therapy was associated with improved outcomes, although causality cannot be inferred. These findings suggest that readily available inflammatory biomarkers may complement established clinical parameters for early risk stratification and support continued guideline-directed pharmacotherapy in NSTEMI.
Oral step-down therapy for the treatment of gram-negative bloodstream infections (GNBSIs) has been shown to be noninferior to full courses of intravenous (IV) therapy. Studies comparing oral fluoroquinolones, sulfamethoxazole-trimethoprim (SMX-TMP), and beta-lactams have reported similar treatment outcomes, but beta-lactam groups have been small, and efficacy has been questioned given lower oral bioavailability. This study aims to evaluate the safety and efficacy of oral cephalosporins compared with penicillins for step-down therapy for the treatment of GNBSI. This was a retrospective study in adult patients admitted for Enterobacterales GNBSI who were transitioned from IV antibiotics to an oral beta-lactam. The primary outcome was treatment failure. Secondary outcomes included components of the primary outcome, microbiological failure, antibiotic-associated adverse drug events (ADEs), and Clostridioides difficile infection (CDI). Overall, 280 patients with GNBSI with step-down to an oral penicillin (n = 140) or cephalosporin (n = 140) were included. More patients in the cephalosporin group had a urinary source of infection (87.9% vs 62.1%, P < 0.001) and urinary abnormalities (40% vs 28.6%, P = 0.044). Treatment failure with oral cephalosporins was noninferior to penicillins (7.1% vs 7.1%; 95% confidence interval [-6.4, 6.4], P = 0.002). No significant differences were found for microbiological failure, antibiotic-associated ADEs, or CDI. Oral cephalosporins were noninferior to oral penicillins as step-down therapy for Enterobacterales GNBSI. To the authors' knowledge, this represents the first comparison of oral beta-lactams for the treatment of GNBSI.
In the United States, standard enoxaparin treatment dosing in pregnant patients is 1 mg/kg subcutaneously twice daily. For nonpregnant patients with obesity (body mass index [BMI] ≥30 kg/m2), literature supports empiric dose reductions to 0.8 mg/kg subcutaneously twice daily. To investigate whether pregnant patients with obesity require empiric enoxaparin treatment dose adjustments compared to pregnant patients without obesity. This retrospective chart review included pregnant adults who received therapeutic enoxaparin and had appropriately timed low-molecular-weight heparin (LMWH) anti-factor Xa levels. The primary endpoint compared initial LMWH anti-factor Xa levels between patients with and without obesity. Secondary endpoints included the frequency and extent of enoxaparin dose adjustments by BMI, among those with nontherapeutic initial levels. Safety was assessed by number and severity of bleeding events. Statistical analyses included the Kruskal-Wallis test, Student's t-test, and one-way ANOVA. Of 121 patients identified, 74 met inclusion criteria. Participants were classified as nonobese (n = 33) or obese (n = 41). Initial LMWH anti-factor Xa levels differed significantly between groups (P = 0.045). The mean BMI of patients with a subtherapeutic initial level (29.02 kg/m2) differed significantly from those with a supratherapeutic initial level (35.8 kg/m2) (P = 0.030). No significant difference was found in adjusted enoxaparin dose between participants with and without obesity, but a significant difference existed between subtherapeutic and supratherapeutic groups (P < 0.001). Each group had 1 clinically relevant nonmajor bleed. Limitations included single-center design and small sample size. Pregnant patients with obesity receiving treatment enoxaparin are likely to achieve target LMWH anti-factor Xa levels without initial dose adjustments. Patients with morbid obesity, however, may still be at risk of supratherapeutic levels.
Pregnant patients without obesity were prone to subtherapeutic LMWH anti-factor Xa levels necessitating enoxaparin dose adjustments. Routine LMWH anti-factor Xa monitoring may be beneficial for obstetric patients receiving therapeutic enoxaparin, regardless of BMI.
It is challenging for patients with heart failure to obtain all core classes of guideline-directed medical therapy (GDMT), especially because sacubitril-valsartan and sodium-glucose cotransporter 2 inhibitors (SGLT2i) are cost-prohibitive for many. Pharmacy technicians in an outpatient heart failure clinic can provide support for medication access. The objective of this study was to determine the impact of a pharmacy technician on access to high-cost GDMT for patients with heart failure. This retrospective, pre-post cohort study evaluated adults with heart failure who were eligible for treatment with both sacubitril-valsartan and SGLT2i and had a heart failure clinic visit prior to the addition of a pharmacy technician (September 1, 2022, through February 28, 2023) or after the addition (September 1, 2023, through February 29, 2024). Patients were excluded if they were pregnant, incarcerated, or Veterans Affairs patients; had an estimated glomerular filtration rate <20 mL/min/1.73 m2; or had received a heart transplant. The primary composite outcome was a binary variable identifying if patients were prescribed and filled sacubitril-valsartan and/or SGLT2i, evaluated using logistic regression controlling for patient prescription drug coverage. Secondary outcomes included affordable access to ivabradine and vericiguat and 30-day hospitalization rate. Pharmacy technician time spent was evaluated in the post-cohort. There were 192 patients included; 96 in each cohort. Access to sacubitril-valsartan and/or SGLT2i increased from 71.9% pre-cohort to 86.5% post-cohort (OR = 2.46, 95% CI = [1.18, 5.34]; P = 0.02). No differences were observed in secondary outcomes. The pharmacy technician spent an average of 66.4 minutes per patient assisted (n = 21/96). This study demonstrated a significant increase in access to sacubitril-valsartan and/or SGLT2i after the implementation of a pharmacy technician.
Larger, prospective studies are needed to evaluate the impact of a pharmacy technician on clinical outcomes in this population.
Cyclosporine is an immunosuppressant extensively used for the prevention and treatment of graft-vs-host disease (GvHD) in pediatric allogeneic hematopoietic stem cell transplantation (allo-HSCT). Converting the administration route of cyclosporine from intravenous to oral is common in the early period of allo-HSCT. Various factors may have an impact on the conversion ratio of cyclosporine. To evaluate the effect of converting administration route from intravenous to oral on cyclosporine exposure in pediatric allo-HSCT recipients. Children who underwent allo-HSCT and received cyclosporine for the prevention of GvHD were included. The cyclosporine trough concentration (C0), the trough concentration-dose ratio (CDR), and the conversion ratio were evaluated. Meanwhile, factors related to the bioavailability of cyclosporine were also investigated. A total of 67 children with 280 concentrations were included. The conversion ratio used in the study was approximately 1:2, and a significant decrease in cyclosporine CDR (110.5 vs 41.4 μg/L per mg/kg, P < 0.001) was observed. The overall bioavailability of cyclosporine was approximately 35%. Age younger than 3 years old (β = -10.70, 95% CI = -18.45 to -2.96, P = 0.007) and moderately increased transaminases (β = -17.95, 95% CI = -25.42 to -10.48, P < 0.001) had a significant impact on cyclosporine bioavailability. A conversion ratio of 1:3 was found to be more appropriate for pediatric allo-HSCT recipients when switching cyclosporine from intravenous to oral administration. Children younger than 3 years old or with moderately increased transaminases had significantly lower cyclosporine bioavailability. These results can assist in an individualized approach for patients undergoing cyclosporine formulation switching.
Data comparing benzodiazepine (BZD) to nonbenzodiazepine (non-BZD) sedatives in patients on mechanical ventilation (MV) demonstrate non-BZD sedation is associated with reduced duration of MV and lower mortality and delirium rates in medical and cardiac intensive care unit (ICU) populations. Limited data exist in surgical/trauma ICU (STICU) populations. Evaluate BZD versus non-BZD sedation on days alive and free of MV at 28 days in STICU patients. This single-center, institutional review board-approved, retrospective cohort study evaluated adult patients who were admitted to the STICU, were intubated for ≥24 hours, and required continuous non-BZD or BZD sedation in a level I trauma academic hospital. Primary outcome was days alive and free of MV at 28 days. Secondary outcomes included percent of time patients were oversedated and rates of delirium and ventilator-associated pneumonia (VAP). A multivariable model was constructed to determine independent predictors of being alive and free of MV at 28 days. After screening, 209 patients were included (BZD, n = 88; non-BZD, n = 121). BZD patients had higher Charlson comorbidity index (CCI) and APACHE II scores. Non-BZD patients had higher rates of traumatic brain injury. Patients receiving non-BZD sedation experienced more days alive and free of MV at 28 days compared with BZD sedation (24 [interquartile range, IQR = 22-25] vs 24 [IQR = 20-25]; P = 0.002). In a multivariable model, including APACHE II score and CCI, non-BZD sedation remained independently associated with increased odds of being alive and free of MV at 28 days (adjusted odds ratio: 6.98; P = 0.002). In addition, non-BZD sedation was associated with less time being oversedated (P < 0.0001), less delirium (P = 0.003), and lower rates of VAP (P = 0.0007). In STICU patients requiring MV, non-BZD sedation was associated with more ventilator-free days, less time oversedated, reduced delirium, and reduced VAP.
These findings extend prior ICU sedation data to the STICU population, supporting the use of non-BZD strategies when clinically appropriate to optimize patient outcomes.
Intravenous fluids are commonly used in postcardiac surgery to maintain or increase intravascular volume. While volume expansion after cardiac surgery is often necessary, the optimal fluid type to use is not established. The objective of the study is to evaluate cost-savings and clinical outcomes of an albumin minimization protocol for postcardiac surgery fluid resuscitation at a surgical intensive care unit in a community teaching hospital. This was a single-center retrospective cohort study of patients who received fluid resuscitation after open heart coronary artery bypass surgery or valvular surgery while on cardiopulmonary bypass at a community teaching hospital between February 2021 and May 2022. Cohorts were defined by whether patients underwent surgery before or after implementation of an albumin minimization protocol in September 2021. The primary outcomes were the amount of albumin and crystalloid fluids received after surgery and the overall cost of intravenous fluids after surgery. Secondary outcomes included 30-day mortality, acute kidney injury, hours on oxygen support, hours on vasopressors, multiple vasopressors used, perioperative blood product transfusions, and 72-hour surgery take back. Of 434 total patients evaluated, 400 patients met criteria for inclusion. Baseline characteristics were balanced between the 2 groups. Average surgical time was shorter in the postprotocol arm. Per-patient use of albumin decreased by 27.1 g (22.8-31.4), while crystalloid fluid use increased by 1 L (0.9-1.2) after implementation of the albumin minimization protocol. Average cost savings were approximately $178 per surgery. No statistically significant difference was seen in any of the secondary safety and efficacy outcomes. This study adds to the body of literature suggesting that the use of an albumin minimization protocol after open heart cardiac surgery was safe and effective.
A significant reduction in cost and utilization of albumin was seen in the study without affecting patient outcomes.
Atrial fibrillation (AF) is associated with an increased risk of stroke and heart failure (HF). Historically, rate control has been the preferred treatment strategy due to fewer adverse drug events, but emerging evidence suggests early rhythm control may offer mortality benefits. Despite this, a critical gap exists in understanding short-term outcomes of rhythm control. Assess the impact of early rhythm control strategy in patients with new-onset AF on mortality, hospitalizations, or emergency room (ER) visits compared to rate control strategy. Retrospective observational study of patients at least 18 years of age diagnosed with new-onset AF cared for in a large United States healthcare system. Patients with identifiable triggers were excluded. The primary outcome was a composite of all-cause mortality, hospitalization, or ER visit for AF, atrial flutter (AFL), or HF at 30 days. Four hundred sixty-three patients were included, 278 in the rate control group and 185 in the rhythm control group. The median follow-up duration across groups was 654.5 days. At 30 days post-diagnosis, the rate control group experienced a higher rate of the primary outcome (19.8% vs. 11.9%, P = 0.027). A Cox proportional hazards model demonstrated that rhythm control was associated with a reduced risk of experiencing the secondary outcome (HR 0.759, 95% CI [0.582, 0.984], P = 0.042). In patients with new-onset AF, early rhythm control was associated with a lower incidence of all-cause mortality, hospitalization, or ER visits. These findings underscore the importance of rhythm control in optimizing early AF management and improving patient outcomes.
Heparin resistance (HR) poses a risk of significant complications, as subtherapeutic anticoagulation may lead to thrombotic events; however, there remains a lack of guidance on standardized management strategies in cardiac intensive care unit (CICU) patients and those with mechanical circulatory support (MCS) devices. The purpose of this study was to describe current management strategies for patients with suspected HR and provide insights into its definition among critically ill cardiac patients. This retrospective study evaluated intensive care unit (ICU) patients receiving greater than or equal to 25 units/kg/h of unfractionated heparin (UFH) who failed to achieve 2 consecutive therapeutic activated partial thromboplastin time (aPTT) values. The primary outcome was the incidence of patients transitioned to a direct thrombin inhibitor (DTI). Secondary outcomes included major bleeding, thrombosis, and antithrombin III supplementation. A subgroup analysis compared anticoagulation characteristics by agent (UFH vs. DTI), including anticoagulant infusion volume and time to goal aPTT. Of 76 patients receiving titratable UFH, 62 (81.6%) met inclusion criteria. Transition to a DTI occurred in 4 (6.5%) patients, all of whom received bivalirudin. Major bleeding occurred in 4 (6.5%) patients and thrombosis in 2 (3.2%) while receiving UFH. Median time to goal aPTT was 88 (interquartile range [IQR] = 55.3-123.3) hours with UFH vs. 6 (IQR = 3.7-11.3) hours with bivalirudin (P = 0.002). Median daily anticoagulant volume was 578 (IQR = 404.0-770.4) mL with UFH vs. 190 (IQR = 147.3-218.5) mL with bivalirudin (P = 0.001). Our findings describe current management practices for suspected HR among critically ill cardiac patients. Although the small subset of patients transitioned to a DTI limits generalizability, earlier recognition and individualized anticoagulation strategies may be warranted in MCS patients given their inherent thrombotic risk.
Future studies are needed to further define HR and evaluate anticoagulation strategies in this population.
The use of the Calvert formula for carboplatin dosing is well established. However, controversy persists regarding the choice of kidney function estimation equation, with oncology and nephrology guidelines offering differing recommendations. This study aimed to compare the actual carboplatin doses administered to patients with those estimated using various kidney function equations, and to assess whether these differences could influence clinical outcomes. This retrospective chart review included adult patients who received at least 1 dose of carboplatin at our institution between May 2015 and March 2024. Carboplatin doses were estimated using the Calvert formula with different kidney function equations and compared with the actual doses administered. A total of 360 patients met the inclusion criteria, with a median age of 57 years (46-68) and a median weight of 72 kg (61-84); 248 (68.9%) were female. Significant inter-method variability was observed in median estimated kidney function and corresponding median carboplatin dose across equations: Cockcroft-Gault (C-G) using actual body weight (110 mL/min, 677 mg), C-G with adjusted body weight (94 mL/min, 602 mg), the 2021 CKD-EPI (Chronic Kidney Disease Epidemiology Collaboration) equation (100 mL/min/1.73 m², 649 mg), the 2021 CKD-EPIBSAAdj (101 mL/min, 658 mg), and the Janowitz equation (83 mL/min, 577 mg); P < 0.001. Hematologic toxicity, including febrile neutropenia, was significantly more common in patients dosed using the C-G equation with actual body weight compared with adjusted body weight, occurring in 14/20 (70%) versus 5/115 (4.3%), respectively (P < 0.001). Carboplatin dosing varies significantly depending on the kidney function estimation method used. There is an urgent need for standardized, interdisciplinary dosing guidelines.
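The dosing comparison above hinges on two well-established formulas: the Calvert formula, dose (mg) = target AUC × (GFR + 25), and the Cockcroft-Gault estimate of creatinine clearance. A minimal sketch of how the kidney function estimate propagates into the dose is shown below; the patient values and target AUC are hypothetical illustrations, not data from the study.

```python
def cockcroft_gault(age, weight_kg, scr_mg_dl, female):
    """Estimate creatinine clearance (mL/min) via Cockcroft-Gault.

    weight_kg may be actual or adjusted body weight, which is one
    source of the inter-method variability described in the study.
    """
    crcl = ((140 - age) * weight_kg) / (72 * scr_mg_dl)
    return crcl * 0.85 if female else crcl


def calvert_dose(target_auc, gfr_ml_min):
    """Carboplatin dose (mg) = target AUC (mg/mL·min) x (GFR + 25)."""
    return target_auc * (gfr_ml_min + 25)


# Hypothetical patient (values for illustration only):
crcl = cockcroft_gault(age=57, weight_kg=72, scr_mg_dl=0.9, female=True)
dose = calvert_dose(target_auc=5, gfr_ml_min=crcl)
print(round(crcl), round(dose))  # ~78 mL/min -> ~517 mg
```

Because the estimated GFR enters the Calvert formula linearly, any spread between estimation equations (here, roughly 83-110 mL/min across methods in the study) translates directly into the dose differences reported.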
To evaluate the efficacy and safety of mepolizumab, a humanized monoclonal antibody that targets interleukin-5, a key mediator in eosinophilic inflammation, in reducing moderate-to-severe exacerbations among patients with eosinophilic chronic obstructive pulmonary disease (COPD). PubMed, Scopus, Web of Science, and Cochrane databases were systematically searched from inception through July 2025 using the terms "Mepolizumab," "COPD," and "Chronic Obstructive Pulmonary Disease" for randomized controlled trials comparing subcutaneous mepolizumab (100 mg every 4 weeks) with placebo in patients with eosinophilic COPD. Randomized controlled trials comparing subcutaneous mepolizumab with placebo in adults with eosinophilic COPD were included. Two independent reviewers screened studies and extracted data. In total, 4 studies with 1953 patients were included; of these, 978 (50.0%) received mepolizumab. Statistical analysis was performed using R software (version 4.5.0). Mepolizumab significantly prolonged the time to first moderate or severe exacerbation (hazard ratio [HR] = 0.80; 95% confidence interval [CI] 0.69-0.92; P = 0.016) and reduced the rate of moderate-to-severe exacerbations (rate ratio 0.80; 95% CI 0.78-0.83; P < 0.001). The risk of adverse events (AEs) (risk ratio [RR] = 1.00; 95% CI 0.95-1.06; P = 0.962) was similar between groups, while the risk of serious adverse events or death (RR = 0.83; 95% CI 0.72-0.96; P = 0.031) was significantly lower in the mepolizumab group. Relevance to Patient Care and Clinical Practice in Comparison With Existing Drugs: Mepolizumab provides a targeted, biomarker-guided treatment option, potentially reducing exacerbations without the added safety concerns, such as infections and metabolic complications, seen with existing therapies. Mepolizumab prolongs the time to first moderate or severe exacerbation and extends symptom-free periods in patients with eosinophilic COPD, without increasing the risk of AEs.