Background: Human cysticercosis, caused by the larval stage (cysticerci) of the pork tapeworm Taenia solium, is an important zoonotic disease. The disease is prevalent in developing countries where porcine cysticercosis is common and undercooked pork is habitually consumed. Objective: This study aimed to develop an immunochromatography-based test kit for the rapid diagnosis of human cysticercosis, using low-molecular-weight antigens purified from cyst fluid of the T. solium Asian genotype to detect specific IgG antibodies in whole blood. The kit was designated "the cysticercosis whole-blood test kit (iCys WB kit)." Methods: The kit was evaluated under laboratory conditions using 164 whole-blood samples, of which 21 were from confirmed cysticercosis cases. The results of the iCys WB kit, which detects anti-T. solium (cysticercus) antibodies in simulated whole-blood samples, were compared with results from corresponding human serum samples. Results: Across both sample types, the iCys WB kit demonstrated an accuracy of 98.8%, a sensitivity of 91.7%, a specificity of 100% (yielding an effectively infinite positive likelihood ratio), a negative likelihood ratio of 0.083, and an area under the ROC curve of 0.96. The results obtained from simulated whole-blood and serum samples showed perfect concordance. Conclusions: The iCys WB kit is a valuable, easy-to-handle diagnostic tool and may be applicable for supporting clinical diagnosis at the point of care.
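The diagnostic metrics reported above all follow directly from a 2×2 confusion table. A minimal sketch in Python (the counts below are hypothetical, chosen only to reproduce the reported sensitivity and specificity, not taken from the study; note that a specificity of 100% makes the positive likelihood ratio infinite):

```python
import math

def diagnostic_metrics(tp, fp, fn, tn):
    """Standard diagnostic test metrics from a 2x2 confusion table."""
    sens = tp / (tp + fn)                  # sensitivity (true positive rate)
    spec = tn / (tn + fp)                  # specificity (true negative rate)
    acc = (tp + tn) / (tp + fp + fn + tn)  # overall accuracy
    # LR+ = sens / (1 - spec); undefined (infinite) when specificity is 100%
    lr_pos = sens / (1 - spec) if spec < 1 else math.inf
    lr_neg = (1 - sens) / spec             # LR- = (1 - sens) / spec
    return sens, spec, acc, lr_pos, lr_neg

# Hypothetical counts (tp=11, fn=1 gives sensitivity 11/12 ≈ 91.7%):
sens, spec, acc, lr_pos, lr_neg = diagnostic_metrics(tp=11, fp=0, fn=1, tn=150)
```

With zero false positives, LR+ diverges, which is why a perfectly specific test cannot report a finite positive likelihood ratio.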
Sickle cell disease (SCD) is a severe, inherited hemoglobin disorder characterized by chronic hemolysis, vaso-occlusive crises (VOCs), and systemic inflammation. Hydroxyurea is the standard conventional pharmacotherapy for SCD, but it has certain limitations, necessitating the exploration of other safe and effective treatment options. Ayurveda interventions, with their anti-inflammatory, immunomodulatory, and hematopoietic properties, offer a potential therapeutic approach complementary to conventional medicine for SCD management. This randomized controlled trial will evaluate the efficacy and safety of an Ayurvedic therapeutic regimen as an adjunct to hydroxyurea in SCD management, assessing its impact on hematological parameters, inflammatory biomarkers, VOC frequency, and overall quality of life. A PROBE (Prospective, Randomized, Open-Label, Blinded End Point) study will be conducted on individuals of any gender aged 18 years or older and diagnosed with SCD (with hemoglobin S levels above 60% and a history of at least 1 VOC per year over the past 3 years). Individuals with acute VOC or any severe infection requiring hospitalization, a history of significant comorbidities, or hematopoietic stem cell transplantation will be excluded. The study will be conducted at the All India Institute of Medical Sciences, Bhopal, India. A total of 100 participants will undergo random assignment in a 1:1 ratio to receive either an Ayurveda regimen (Dadimadi Ghrita, Punarnavadi Mandura, and Vasaguduchyadi Kwatha) as an add-on to hydroxyurea or hydroxyurea alone for 8 months. The primary outcomes will be the change in hemoglobin electrophoresis parameters (hemoglobin S, fetal hemoglobin, and adult hemoglobin) and the frequency of VOC episodes over 8 months.
The secondary outcome measures include changes in the levels of proinflammatory markers (interleukin-6, interleukin-8, C-reactive protein, and transforming growth factor-β) and lactate dehydrogenase, the frequency of hospitalization for VOCs and of blood transfusions, and health-related quality of life (Short Form-8 Health Survey questionnaire). Safety will be evaluated by recording the incidence of adverse events and changes in liver and kidney function tests from baseline. Recruitment of study participants was initiated on November 1, 2023. By the second week of February 2025, 83 participants had been enrolled in the study. The study is expected to be completed by December 31, 2025. We will start the analysis of the study outcomes in February 2026, and publication of the final results is expected by August 2026. This randomized controlled trial protocol outlines a rigorous study design aimed at exploring the potential benefits of an integrated therapeutic regimen comprising Ayurveda interventions and standard conventional care in the long-term management of SCD through validated clinical and laboratory parameters. The outcomes of this study can address the needs and challenges associated with SCD management and inform future management protocols.
Diabetic foot ulcers (DFU), among the most severe complications of diabetes, are characterized by high mortality and a high incidence of cardiovascular events. Serum uric acid (UA) is well known as a cause of gout, which is closely related to cardiovascular complications, but it is also the main antioxidant in extracellular fluid. In this study, we aimed to investigate whether serum UA was related to the severity and prognosis of patients with DFU. A total of 535 patients with diabetes who were first diagnosed with DFU and hospitalized in Ruijin Hospital were consecutively recruited. Serum UA concentration was measured at admission. Participants were divided into three groups according to tertiles of their serum UA level. These patients were followed up for an average of 48 months to observe the outcomes, including wound healing, non-fatal cerebral and cardiovascular events (NCCE), and all-cause death. The association of serum UA concentration with disease severity (evaluated by Wagner grade and infection degree) and with the risk of outcomes was analyzed through partial correlation and Cox regression analysis, respectively. In bivariate correlation analysis, UA was negatively correlated with Wagner grade (r = -0.159, P < 0.001) and infection degree (r = -0.171, P < 0.001). In partial correlation analysis, UA remained negatively and independently correlated with Wagner grade (r = -0.74, P < 0.001) and infection degree (r = -0.190, P < 0.001) after controlling for age, sex, type of diabetes, duration of diabetes, duration of DFU, PAD, HbA1c, serum creatinine, and 24-hour urine protein excretion.
In Cox regression analysis, UA was positively and independently associated with the risk of NCCE (HR = 1.002, 1.000 to 1.004, P = 0.011), all-cause mortality (HR = 1.003, 1.001 to 1.004, P = 0.006), and the healing rate (HR = 1.001, 1.000 to 1.003, P = 0.042) after adjusting for age, sex, type of diabetes, duration of diabetes, duration of DFU, history of stroke, history of CAD, PAD, HbA1c, serum creatinine, and 24-hour urine protein excretion. UA might exert a beneficial effect on DFU wound repair, but it was an important risk factor for NCCE and all-cause mortality in patients with DFU.
Sickle cell disease (SCD) disproportionately affects racial and ethnic minority groups in the US and is associated with high levels of morbidity and health care utilization. However, population-level geographic differences and temporal variation in SCD hospitalization outcomes remain incompletely characterized. To assess temporal and regional patterns of SCD hospitalizations in New York State from 2009 through 2022. This retrospective cross-sectional study analyzed inpatient SCD hospitalizations recorded in the New York State Statewide Planning and Research Cooperative System deidentified database between January 1, 2009, and December 31, 2022. The analytic sample included 42 271 hospitalizations after exclusion of records with missing demographic, cost, or facility information. Data analysis was conducted from July 17, 2024, to February 14, 2025. Hospitalization with SCD was examined across 8 state-defined health service areas. Outcomes included regional distribution of hospitalizations, mean length of stay, mean total charges, and trends in severity of illness and risk of mortality as defined by the All Patient Refined Diagnosis Related Groups classification system. Demographic and regional distributions were compared across years and regions. Among 42 271 SCD hospitalizations (21 777 female [51.5%]), most occurred among individuals identified as Black (35 318 [83.6%]) compared with White (750 [1.8%]), multiracial (242 [0.6%]), and other race or ethnicity (5956 [14.1%]), and most patients were aged 18 to 29 years (16 794 [39.7%]) or 30 to 49 years (13 480 [31.8%]). New York City accounted for the largest proportion of hospitalizations statewide. There were significant differences in length of stay and total charges across service areas; Central New York had the longest mean (SD) length of stay at 6.3 (7.3) days, followed by the Hudson Valley (6.2 [7.2] days), while Long Island had the highest mean (SD) total charges at $59 476.3 ($63 823.5).
The proportion of hospitalizations classified as major severity increased from 751 of 5897 (12.7%) in 2009 to 1011 of 3709 (27.3%) in 2022, and the proportion classified as major risk of mortality increased from 170 of 5897 (2.9%) to 469 of 3709 (12.6%) during the same period. Long Island had the highest proportion of hospitalizations with major risk of mortality (93 of 970 [9.6%]), whereas New York City exhibited one of the lowest proportions of major risk of mortality (1531 of 27 923 [5.5%]) despite high hospitalization volume. In this cross-sectional study, geographic and temporal differences in SCD hospitalization outcomes were observed across New York State during a 14-year period. These findings suggest the need for region-specific strategies to improve access to specialized care, reduce severe outcomes, and optimize health care resource use for individuals living with SCD.
This study aimed to evaluate the safety, including blood pressure changes, and medication compliance associated with a generic prolonged-release formulation of mirabegron (Selebeta® PR Tab. 50 mg) in Korean adults with overactive bladder (OAB) through a large-scale, real-world, multicenter, retrospective observational study. Patients with OAB who were prescribed Selebeta® PR once daily for at least 3 months between July 2020 and July 2021 at 95 medical institutions in Korea were included. The primary endpoint was the proportion of patients with an increase in systolic blood pressure (SBP) of ≥10 mmHg after 3 months of treatment. Secondary endpoints included additional thresholds of blood pressure elevation, changes in SBP, diastolic blood pressure (DBP), and pulse rate over time, improvement in OAB symptoms, incidence of adverse events, and medication compliance. A total of 2,091 patients were enrolled in the study. After 3 months of treatment, 8.8% of the patients had an SBP increase of ≥10 mmHg; 3.7%, ≥15 mmHg; and 2.3%, ≥20 mmHg. DBP increased by ≥5 mmHg in 16.8% and by ≥10 mmHg in 6.7% of patients. OAB symptoms also improved significantly in the subgroup with available OAB symptom score data. The overall incidence of adverse events was 2.1%, mostly mild, and mean medication coverage was 73.6%. This study demonstrated that Selebeta® PR is an effective and well-tolerated treatment for adults with OAB, with no clinically meaningful increase in blood pressure and a low incidence of adverse events.
To explore the risk factors for and predictive value of laboratory markers in Mycoplasma pneumoniae (MP) infection in children. A total of 2042 children with suspected MP infection who were treated for the first time at Civil Aviation General Hospital from October 2023 to December 2023 were selected as the study subjects. Among them, 1637 cases were confirmed as MP-infected and were included in the MP infection group, while the remaining 405 cases were non-MP-infected and were included in the non-MP infection group. The clinical data of the 2 groups of children (including gender, age, initial symptoms, laboratory indicators, etc) were compared, risk factors for MP infection in children were identified, and receiver operating characteristic (ROC) curve analysis was performed. The percentage of neutrophils in the non-MP infection group was significantly lower than that in the MP infection group (P < .001). Lymphocyte percentage (LY) and hemoglobin were both significantly lower in the MP infection group than in the non-MP infection group (both P < .05). Logistic regression analysis revealed that white blood cell count and the neutrophil-to-lymphocyte ratio (NLR) might be valuable markers for predicting MP infection. Spearman correlation indicated that LY was collinear with the occurrence of MP infection, and Least Absolute Shrinkage and Selection Operator (LASSO) regression analysis demonstrated that both LY and NLR might be valuable markers for predicting MP infection (P < .05). ROC curve analysis revealed that the area under the curve of the NLR for diagnosing MP infection was 0.624, with a cutoff value of 1.36 (sensitivity of 0.798 and specificity of 0.558).
In the diagnosis of MP infection, agreement between the RNA method and the immunogold colloidal method was poor (Kappa = 0.108, P < .05). Both the white blood cell count and the NLR are valuable markers for predicting MP infection.
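Cohen's kappa, used above to quantify agreement between the two diagnostic methods, compares observed agreement with the agreement expected by chance: κ = (p_o − p_e) / (1 − p_e). A minimal sketch (the counts in the example are hypothetical, not the study's data):

```python
def cohens_kappa(table):
    """Cohen's kappa for a 2x2 agreement table [[a, b], [c, d]],
    rows = method 1 result, columns = method 2 result."""
    a, b = table[0]
    c, d = table[1]
    n = a + b + c + d
    po = (a + d) / n  # observed agreement (both positive or both negative)
    # chance agreement from the marginal totals of each method
    pe = ((a + b) * (a + c) + (c + d) * (b + d)) / n**2
    return (po - pe) / (1 - pe)

# Hypothetical example: methods agree on 55/100 cases, barely above chance
kappa = cohens_kappa([[45, 5], [40, 10]])
```

A kappa near 0.1, as reported above, means the two methods agree scarcely more often than random labeling would.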
Background and Objectives: Venous thromboembolic disease (VTE), including deep vein thrombosis (DVT) and pulmonary embolism (PE), is a major cause of morbidity and mortality worldwide and imposes a substantial financial burden on health systems due to both the direct and indirect costs of acute management and long-term complications. This systematic review aimed to assess patient satisfaction with anticoagulation therapy for VTE and to highlight potential differences according to the type of anticoagulant. The review focused on factors influencing the patient experience, such as perceived efficacy, ease of use, adverse effects, and health-related quality of life. Materials and Methods: A systematic review, without quantitative meta-analysis, was conducted in accordance with PRISMA 2020 guidelines. Articles were identified through searches in major databases (PubMed, Scopus, Cochrane Library and others) using keywords including "patient satisfaction", "anticoagulation", "venous thromboembolic disease", and "quality of life". In total, 21 studies published between 2009 and 2025 met the inclusion criteria. The studies assessed patient satisfaction with different types of anticoagulation, including vitamin K antagonists (VKAs), direct oral anticoagulants (DOACs), and low-molecular-weight heparin (LMWH) injections. Results: Across the included studies, patients generally reported higher levels of treatment satisfaction with DOACs compared with VKAs, mainly due to the absence of routine laboratory monitoring and fewer dietary restrictions. However, satisfaction varied according to age, sex, and clinical status. In specific patient populations, such as those with cancer-associated thrombosis, factors including fewer drug-drug interactions and perceptions of safety with LMWH appeared to influence treatment choice and satisfaction. Adverse effects, particularly bleeding, were identified as major drivers of dissatisfaction. 
Several studies suggested that higher treatment satisfaction was associated with better adherence, while quality of life appeared to improve in patients treated with DOACs in comparison with VKAs. Conclusions: Patient satisfaction is a critical component of successful VTE management. Overall, DOACs appear to be associated with higher treatment satisfaction than traditional therapies such as VKAs, although further high-quality research is needed to individualise anticoagulation strategies. Systematic incorporation of patient-reported satisfaction into clinical decision-making and into international guidelines may improve adherence, enhance quality of life, and ultimately increase the effectiveness of anticoagulation therapy.
Blood pressure (BP) management following successful reperfusion after endovascular thrombectomy (EVT) is critical in achieving favorable clinical outcomes. Individualized BP management using machine learning-based predictive modeling may further improve prediction of functional outcomes. This study was a retrospective analysis of data from the Outcome in Patients Treated with Intra-Arterial Thrombectomy-Optimal Blood Pressure Control (OPTIMAL-BP) trial, a randomized controlled trial comparing intensive and conventional BP management during the 24 h after successful recanalization by EVT, conducted from June 18, 2020, to November 28, 2022, across 19 centers in South Korea. Machine learning models were developed to predict functional independence (90-day modified Rankin Scale score of 0 to 2). Model performance was compared between models using clinical variables only and models incorporating systolic blood pressure (SBP) metrics in addition to clinical variables. In addition, Shapley additive explanations (SHAP) analysis was performed to explain the models and assess the importance of the SBP metrics. A total of 288 patients (61.1% men; median age 75 years [interquartile range, 65-81]) were included. Among the six algorithms, the deep neural network model incorporating SBP metrics performed best on validation, achieving an area under the curve of 0.86 (95% confidence interval, 0.76-0.92), which was significantly better than the model using only clinical variables (area under the curve 0.80 [95% confidence interval, 0.69-0.88]; P = .037). Among the SBP metrics, SHAP analysis identified the time rate of SBP and minimum SBP as important features, with the time rate showing greater influence in the intensive group and minimum SBP in the conventional group. Integrating SBP metrics with clinical variables significantly improved machine learning performance in predicting functional outcomes after successful EVT.
Explainable artificial intelligence (AI) identified time rate and minimum SBP as key predictors of outcome. Trial Registration Information: ClinicalTrials.gov (NCT04205305; registered December 17, 2019).
Several biomarkers have been assessed for the diagnosis of chronic kidney disease (CKD). However, limited data are available regarding their ability to predict mortality in CKD. The aim of this study was to assess the ability of lipocalin-2, receptor for advanced glycation end products (RAGE), tissue inhibitor of metalloproteinase 1 (TIMP-1), osteopontin, and trefoil factor 3 (TFF-3) in predicting the prognosis of patients with CKD. We included patients with CKD as defined by an estimated glomerular filtration rate of <60 mL/min/1.73 m2 of either sex who were above the age of 18 years. Patients with a history of acute-on-chronic kidney disease were excluded. The novel markers of interest were estimated from patients' plasma samples using the multiplex enzyme-linked immunosorbent assay. These patients were followed for 1 year to assess mortality. The median (with interquartile range) plasma concentrations of lipocalin-2, RAGE, TIMP-1, osteopontin, and TFF-3 were 56.9 (44.72-64.18) ng/mL, 7.1 (4.92-9.56) ng/mL, 46.44 (33.52-47.56) ng/mL, 60.2 (98.9-167.61) ng/mL, and 4.87 (8.03-15.65) ng/mL, respectively. A combined receiver operating characteristic curve was plotted to determine the ability of the novel renal biomarkers to predict mortality in patients with CKD. The cutoff values for lipocalin-2 and RAGE for predicting mortality were 62.48 ng/mL with sensitivity and specificity of 86% and 78% (area under the curve: 0.814; p = 0.007) and 8.5 ng/mL with sensitivity and specificity of 71% and 72% (area under the curve: 0.738; p = 0.04), respectively. Biomarkers including lipocalin-2 and RAGE were assessed in this study and were found to have good predictive value for mortality outcomes in CKD. Future studies should establish the relationship between these novel renal biomarkers and the progression of CKD.
Comorbidity is prevalent among people with multiple sclerosis (pwMS) and may contribute to disease progression. Physical exercise (PE) reduces symptoms in pwMS and also benefits comorbidities. Digital (e)-based PE has been proposed as a tool to support the integration of PE. To describe a protocol for a randomized controlled trial (RCT), based on a structured approach, along with the results of a controlled feasibility study of an e-based PE intervention in pwMS with and without comorbidities. In an RCT following a feasibility study (n=50), patients will be randomly assigned in a 1:1 ratio to receive either usual care (n=150) or usual care plus an e-based PE program at home (n=150). The exercise program consists of resistance training with resistance bands targeting the lower extremities. The sessions will enable participants to engage in group exercises from their homes, supervised online by physiotherapists, in two 60-minute sessions per week for 6 months (24 weeks, 48 sessions). The primary endpoint is change in walking capacity measured by the 6-minute walk test. Secondary endpoints include the "no evidence of disease activity" (NEDA)-3 scale, measures of quality of life and fatigue, and levels of neurofilament light chain in blood and cerebrospinal fluid. Results of the feasibility study: Fifty individuals were eligible and randomized to an intervention group (n = 23) or a usual care control group (n = 27). A total of 24 sessions were conducted over three months in groups of 6, supervised by a physiotherapist. In the intervention group, two pwMS did not begin the PE program due to occupational constraints, yielding a recruitment rate of 91.3% (21/23). Of the remaining 21 individuals, 16 (76.2%) completed the follow-up assessments, with a mean attendance of 15.2 (range 6-22) sessions per participant, corresponding to a 63% adherence rate. No intervention-related adverse events were reported.
This protocol describes a prospective RCT together with the supporting feasibility data for an e-based PE program performed at home. The effects of e-based PE performed at home will be evaluated, offering a significant contribution to the field of digital healthcare solutions for MS. Trial registration: https://clinicaltrials.gov/, identifier NCT06298201.
Autophagy plays a crucial role in maintaining cellular homeostasis and has been implicated in the pathogenesis of knee osteoarthritis (OA). However, data on radiographic stage-dependent transcriptional variation of autophagy-related genes in patients with knee OA, particularly using peripheral blood samples, remain limited. The aim of this study was to evaluate whether disease severity was associated with stage-dependent changes in the expression of selected autophagy-related genes within a patient cohort. A total of 200 patients diagnosed with knee OA were included in the study. Disease severity was classified according to the Kellgren-Lawrence radiographic grading system. Peripheral blood samples were collected, and the expression levels of selected autophagy-related genes were analyzed using quantitative real-time polymerase chain reaction [autophagy-related 5 (ATG5), ATG7, unc-51-like kinase 1 (ULK1), microtubule-associated protein 1 light chain 3 beta (LC3B), WD repeat domain phosphoinositide-interacting protein 1 (WIPI1), neighbor of BRCA1 gene 1 (NBR1), forkhead box O3 (FOXO3), transcription factor EB (TFEB)]. Relative gene expression was calculated using the ΔCt method, and comparisons were performed across radiographic stages. Associations between gene expression levels and systemic inflammatory markers were also assessed. Significant stage-dependent differences were observed in the expression of ULK1, TFEB, WIPI1, and NBR1 (p<0.05), with higher ΔCt values (reduced relative expression) in advanced radiographic stages compared with early-stage disease. In contrast, ATG5, ATG7, LC3B, and FOXO3 expression remained stable across radiographic stages. Furthermore, no significant associations were observed between expression of autophagy-related genes and systemic inflammatory status, as assessed by C-reactive protein levels. 
In patients with knee OA, regulatory and early autophagy-related genes exhibit radiographic stage-associated transcriptional alterations in peripheral blood, while the expression of core autophagy machinery genes remains relatively stable. These findings suggest that disease severity is associated with selective transcriptional changes in autophagy-related pathways within the OA patient population and support further investigation of stage-dependent molecular patterns in knee OA.
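The ΔCt convention used above (higher ΔCt corresponding to reduced relative expression) follows from the exponential nature of PCR amplification: relative expression scales as 2^(−ΔCt). A minimal sketch with illustrative Ct values (not the study's measurements):

```python
def delta_ct(ct_target, ct_reference):
    """ΔCt = Ct(target gene) − Ct(reference gene).
    Each extra cycle needed to detect the target means half as much
    starting template, so higher ΔCt = lower expression."""
    return ct_target - ct_reference

def relative_expression(d_ct):
    """Relative expression on a linear scale: 2^(−ΔCt)."""
    return 2 ** (-d_ct)

# Illustrative values: target crosses threshold 8 cycles after the reference
d = delta_ct(ct_target=28.0, ct_reference=20.0)
expr = relative_expression(d)
```

Because the scale is logarithmic, a one-cycle increase in ΔCt halves the inferred relative expression, which is why modest Ct shifts across radiographic stages can reflect substantial expression differences.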
Studies have demonstrated lower odds of survival from out-of-hospital cardiac arrest (OHCA) during nighttime hours, but this has not been studied in North America since 2013, and it is unclear what factors might explain this survival difference. To identify whether OHCA survival during nighttime hours remains lower than during daytime hours using contemporary data and whether the difference can be explained by variable patient physiology or emergency care factors. This cohort study included adults (aged ≥18 years) with OHCA in the Cardiac Arrest Registry for Enhanced Survival from 2013 to 2024. Daytime was defined as 7:00 am to 10:59 pm, and nighttime was defined as 11:00 pm to 6:59 am. Primary outcomes were sustained return of spontaneous circulation (ROSC) and neurologically favorable survival (Cerebral Performance Category score of 1 or 2). A multilevel mixed-effects logistic regression model with prehospital agency as a random effect and patient or treatment characteristics as fixed effects was used. A similar analysis of postresuscitation survival was performed among patients with sustained ROSC, adjusting for the time-to-cardiopulmonary-resuscitation interval and defibrillation status. A mediation analysis was performed to identify whether the prehospital response interval mediates the association. Of 1 151 845 patients in the registry, 874 415 were eligible and included in the analysis; the median (IQR) age was 64 (52-75) years, with 557 515 males (63.8%), 181 878 Black or African American patients (20.8%), 146 352 Hispanic or Latino patients (16.7%), and 447 646 White patients (51.2%).
A minority of OHCA responses occurred at nighttime (241 967 [27.7%]), and the odds of sustained ROSC and neurologically favorable survival were lower at nighttime than daytime (sustained ROSC: 62 548 [25.8%] vs 193 486 [30.6%]; adjusted odds ratio [aOR], 0.85; 95% CI, 0.84-0.86; neurologically favorable survival: 16 234 [6.7%] vs 58 542 [9.3%]; aOR, 0.84; 95% CI, 0.82-0.86). Among those with sustained ROSC, the odds of postresuscitation survival at nighttime were also lower than daytime (aOR, 0.93; 95% CI, 0.90-0.95). The prehospital response interval partially mediated the nighttime survival disadvantage, with approximately 12.6% of the total effect mediated by the response interval. In this cohort study of OHCA, nighttime response was associated with lower adjusted odds of sustained ROSC, neurologically favorable survival, and postresuscitation survival. Emergency care factors accounted for only a portion of the decreased odds of survival at nighttime.
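The "proportion mediated" reported above is the indirect effect (the part of the nighttime disadvantage transmitted through the response interval) expressed as a share of the total effect. A minimal sketch, assuming effects expressed on an additive scale (the numbers are illustrative only, chosen to reproduce a 12.6% share):

```python
def proportion_mediated(total_effect, direct_effect):
    """Share of the total exposure effect transmitted through the mediator.
    Assumes effect decomposition on an additive scale:
    total = direct + indirect."""
    indirect = total_effect - direct_effect
    return indirect / total_effect

# Illustrative: if the direct (mediator-held-fixed) effect is 0.874 of a
# total effect normalized to 1.0, the mediator carries the remaining 12.6%
share = proportion_mediated(total_effect=1.0, direct_effect=0.874)
```

On an odds-ratio scale, as in the study's logistic models, the decomposition is more involved (effects are typically log-transformed first), so this sketch shows only the basic arithmetic of the quantity.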
Pelvic fractures are classified as stable or unstable. They correlate with the severity of trauma, and the initial medical treatment is decisive. This study evaluated the transfer of such fractures and described the initial treatment as well as the clinical course. We retrospectively analysed data from a large cohort of the TraumaRegister DGU® (TR-DGU), covering the period from 2014 to 2023 and comprising a total of n = 397,910 patients. All patients aged ≥ 16 years were included. Injury patterns were described according to the Abbreviated Injury Scale (AIS); mechanically unstable fractures were classified as AIS ≥ 3. We considered all participating hospitals within Germany. The patients were subdivided into three groups: group 1 = primarily admitted patients with outcome data, group 2 = pre-treated patients transferred in from other hospitals, and group 3 = primarily admitted and transferred out early (< 48 h). The majority of patients were male, with an average age of about 53 years. Blunt trauma was the leading trauma mechanism. Concomitant injuries (AIS 2+) affected the thorax (56%), spine (41%), lower extremities (38%), head (31%), and abdomen (24%). Among primarily admitted cases with pelvic fractures (n = 36,398), 21,091 cases (57.9%) had an unstable pelvic fracture (AIS pelvis 3-5). Level 1 trauma centers not only treated 12,836 primarily admitted cases with unstable pelvic fractures (83.5%) but also received 2,365 patients (15.4%) transferred from other hospitals, while only 1% of cases were transferred out early (n = 170). Transfusion was administered in 5,984 patients (16.5%) (AIS 2-5). A pelvic binder was applied in 7,096 (36.3%) patients, and surgical stabilisation was performed in 4,075 (14.9%) patients. The length of stay in the intensive care unit was longest for AIS 5 injuries, at 6 days. The mortality rate was 38.5% for AIS 5 and 9.9% for AIS 2. Over the last 10 years, the prevalence of unstable pelvic ring fractures (AIS 3-5) remained constant at around 9%.
Unstable pelvic fractures were increasingly transferred to Level I trauma centers. Unstable pelvic fractures correlated with a high Injury Severity Score (ISS). Early treatment involved the transfusion of packed red blood cells, the application of a pelvic binder, and surgical stabilisation; these measures were utilized increasingly with greater severity of trauma.
Prolonged hospital length of stay (LOS) is an increasingly important quality metric among regulators and payers that has been associated with worse patient outcomes and decreased patient satisfaction. The aim of this study was to identify predictors of prolonged hospital LOS after surgery for Meyerding grade 2 spondylolisthesis using a multicenter prospectively collected registry. The prospectively collected Spine CORe™ Quality Outcomes Database (QOD) study group cohort, which consisted of 328 patients from 14 sites, was used to identify all patients who underwent single-stage lumbar fusion for Meyerding grade 2 lumbar spondylolisthesis. Prolonged LOS was defined as ≥ 4 days (75th percentile). An array of demographic, comorbidity, and perioperative factors known to impact LOS were collected for each patient. Bivariate tests, including the chi-square goodness of fit and independent t-test, were used to identify variables associated with prolonged LOS. Multivariable logistic regression analysis was conducted to determine independent predictors of prolonged LOS. The QOD cohort comprised 328 patients with a follow-up rate of > 80%. After excluding patients with an anterior or lateral surgical approach and missing LOS data, the final cohort included 268 patients, of whom 52 (19.4%) experienced a prolonged LOS. In the univariate analysis, older age, dependent ambulation, insurance status, depression, greater estimated blood loss, longer operative duration, multilevel fusion (2 or more levels), perioperative complications (e.g., incidental durotomy and urinary tract infection), and nonhome discharge were associated with prolonged LOS. In the adjusted model, multilevel arthrodesis independently increased the odds of prolonged LOS (OR 2.11, 95% CI 1.07-4.18; p = 0.03), whereas private insurance (vs Medicare/Medicaid/government) was associated with lower odds (OR 0.42, 95% CI 0.20-0.87; p = 0.02). 
Patient-reported outcomes at 60 months did not differ between the groups with and without prolonged LOS. In this multicenter Spine CORe™ QOD study, multilevel lumbar fusion and noncommercial insurance were the principal independent predictors of prolonged LOS after surgery for grade 2 spondylolisthesis. These findings are valuable for patient informed consent, as well as to identify higher-risk patients who could benefit from earlier inpatient resource allocation (social work and counseling) to facilitate timely discharge.
The association between the serum level of folate and length of hospital stay (LOS) remains unclear. Adult participants were retrospectively recruited from two teaching hospitals in Shanghai, China. Serum levels of folate and vitamin C were measured within three days after hospital admission. LOS was defined as the interval between admission and discharge. Patients were classified into tertile groups based on serum folate level to assess the association with LOS. We further re-analyzed the association between serum folate level and LOS across different serum levels of vitamin C (also divided into tertile groups). A total of 9,645 patients were included. Each standard deviation (SD) increase in serum folate level was associated with a 0.41-day decrease in LOS (95% CI: -0.73 to -0.09 days) in the crude model; however, the association lost significance after full adjustment. Taking the interaction effect of vitamin C into consideration, serum folate level was inversely associated with LOS in those whose serum vitamin C level was higher than 41.6 μmol/L. Serum folate level was inversely associated with LOS, but the association was limited to patients with high vitamin C levels.
To assess the accuracy of oxidative stress markers in predicting delayed graft function (DGF) lasting more than one week after deceased-donor kidney transplantation (KTx), a translational pilot study was carried out, and a five-step hierarchical diagnostic accuracy strategy was applied to evaluate the predictive value of oxidative stress markers for DGF >7 days in adult patients with end-stage kidney disease undergoing deceased-donor KTx. Statistical analyses comprised multiple conventional methods, C-statistics, and diagnostic test performance measures. Cohort analysis of 27 consecutive patients revealed that 33.3% (9/27) developed DGF >7 days. Assessment of the oxidative stress markers selected in the first statistical step comparing donors and controls revealed a significant correlation of DGF duration with the recipient's protein oxidation (ρ=0.466; p=0.022) as well as the donor's hydrogen peroxide (ρ=−0.489; p=0.014). The areas under the curve were 0.683 (95% confidence interval [95%CI], 0.458-0.859; p=0.224) and 0.746 (95%CI, 0.534-0.897; p=0.019), respectively. A cutoff of ≤14 μmol/L H2O2 had the highest discriminating power for predicting DGF >7 days. The sensitivity, specificity, positive predictive value, negative predictive value, and overall diagnostic accuracy for identifying patients at risk of experiencing DGF >7 days were 85.7%, 61.1%, 46.1%, 91.7%, and 68%, respectively. Donor hydrogen peroxide was identified as a potential predictor of DGF >7 days in patients who underwent deceased-donor KTx. A cutoff value of ≤14 μmol/L is suggested to stratify patients at high risk of requiring dialysis beyond the first post-transplant week.
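The diagnostic test performance measures reported above all follow from a 2x2 confusion table. A minimal sketch, pairing a cutoff rule in the same direction as the study's ≤14 μmol/L H2O2 threshold (lower values test-positive) with the standard formulas; the marker values are hypothetical:

```python
def classify_low(values, cutoff=14.0):
    """Test-positive when the marker is at or below the cutoff
    (lower donor H2O2 was associated with DGF in this cohort)."""
    return [v <= cutoff for v in values]

def diagnostic_performance(tp, fp, fn, tn):
    """Standard diagnostic-accuracy measures from confusion-table counts."""
    return {
        "sensitivity": tp / (tp + fn),
        "specificity": tn / (tn + fp),
        "ppv": tp / (tp + fp),
        "npv": tn / (tn + fn),
        "accuracy": (tp + tn) / (tp + fp + fn + tn),
    }

# Hypothetical donor H2O2 values (umol/L) scored against the cutoff
flags = classify_low([9.5, 12.0, 18.3, 25.1])
perf = diagnostic_performance(tp=8, fp=2, fn=2, tn=8)
```

Cross-tabulating `flags` against observed DGF status yields the counts `tp`, `fp`, `fn`, `tn` that feed `diagnostic_performance`.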
Intervertebral disk degeneration (IDD) is a degenerative disease mainly characterized by intervertebral disk tissue senescence and collagen loss. The role of hsa_circ_0036763 in IDD remains incompletely understood. The clinical value of hsa_circ_0036763 was explored by retrospectively analyzing serum samples from 120 IDD patients through ROC curve, Kaplan-Meier curve, and Cox proportional hazards analyses. The severity of IDD was graded with the modified Pfirrmann grading system. The downstream regulatory axis (miR-4741/RGSL1) of hsa_circ_0036763 was predicted using the miRDB and TargetScan databases, and the targeting relationships were confirmed by dual-luciferase reporter and RNA pull-down assays. IL-1β-induced senescent nucleus pulposus cells (NPCs) were used to evaluate the potential role of the hsa_circ_0036763/miR-4741/RGSL1 axis in IDD progression. Serum hsa_circ_0036763 expression was negatively correlated with the modified Pfirrmann grade of IDD patients (r = −0.675). Its low expression predicted a higher risk of IDD recurrence (HR = 0.252, p = 0.002). Hsa_circ_0036763 sponged miR-4741, thereby relieving the latter's inhibition of RGSL1 expression. Exogenous overexpression of hsa_circ_0036763 alleviated senescence, collagen loss, and oxidative stress in IL-1β-induced NPCs, and these effects were reversed by co-transfection of miR-4741 mimics or silencing of RGSL1. Hsa_circ_0036763 may serve as a prognostic marker for IDD and participates in IDD-related NPC degeneration by regulating miR-4741/RGSL1.
Postpartum hemorrhage (PPH) remains a leading cause of maternal morbidity and mortality globally, particularly in women with antepartum hemorrhage (APH). Current risk assessment methods lack standardized predictive tools that are both simple and reliable for clinical application. We conducted a secondary analysis of a prospectively collected cohort of 100 pregnant women presenting with APH at ≥28 weeks' gestation at a tertiary care centre in northern India. Multivariable logistic regression was used to identify significant predictors of PPH. A point-based clinical risk score was then developed based on the multivariable model and internally validated using bootstrap techniques with 1000 replicates. PPH occurred in 30% of patients (n=30). Multivariable analysis identified four independent predictors of PPH: maternal age (adjusted odds ratio [OR] 1.29 per year; 95% confidence interval [CI] 1.10-1.51; p=0.002), gravidity (OR 2.11 per unit; 95% CI 1.00-4.43; p=0.049), gestational age at delivery (OR 0.64 per week; 95% CI 0.44-0.94; p=0.021), and antepartum blood transfusion (OR 2.44; 95% CI 1.02-5.84; p=0.045). The prediction model demonstrated excellent discrimination with an area under the receiver operating characteristic (ROC) curve of 0.86 (95% CI 0.80-0.92) and good calibration (slope 0.95). Bootstrap internal validation yielded an optimism-corrected AUC of 0.84. The resulting four-factor risk score stratified patients into four risk categories with PPH rates ranging from 4% (low risk) to 100% (very high risk). The four-variable score provides an accurate, easily applicable tool with excellent predictive performance. The score is a promising tool that, pending external validation, may facilitate early identification of high-risk patients and improve maternal outcomes. Further research should focus on external validation of this tool in diverse populations and its integration into clinical practice.
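The bootstrap internal validation described above can be sketched in Python. The AUC is computed as the concordance over case-control pairs; the optimism loop below uses a fixed risk score for simplicity, whereas the study refit its logistic model within each resample, so this only illustrates the mechanics (scores and labels are hypothetical):

```python
import random

def auc(scores, labels):
    """Concordance (AUC): fraction of case-control pairs ranked
    correctly, counting ties as half."""
    pos = [s for s, y in zip(scores, labels) if y == 1]
    neg = [s for s, y in zip(scores, labels) if y == 0]
    wins = sum((p > n) + 0.5 * (p == n) for p in pos for n in neg)
    return wins / (len(pos) * len(neg))

def optimism_corrected_auc(scores, labels, n_boot=200, seed=0):
    """Harrell-style optimism correction: apparent AUC minus the mean
    excess of bootstrap-sample AUC over original-sample AUC. NB: with a
    fixed score the 'model' is not refit, unlike the study's procedure."""
    rng = random.Random(seed)
    apparent = auc(scores, labels)
    optimisms = []
    n = len(scores)
    for _ in range(n_boot):
        idx = [rng.randrange(n) for _ in range(n)]
        bl = [labels[i] for i in idx]
        if len(set(bl)) < 2:   # resample must contain both outcomes
            continue
        bs = [scores[i] for i in idx]
        optimisms.append(auc(bs, bl) - apparent)
    return apparent - sum(optimisms) / len(optimisms)
```

With 1000 resamples and per-resample refitting, this loop yields the study's optimism-corrected AUC of 0.84 from an apparent AUC of 0.86.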
Dengue fever is one of the most common mosquito-borne viral infections, with severe cases characterized by plasma leakage, hemorrhage, and multi-organ involvement. Identification of dengue serotypes and reliable biomarkers is essential for predicting disease progression and guiding timely interventions. This prospective cohort study was conducted at a super-speciality tertiary care hospital in southern India from July 2024 to July 2025. A total of 69 patients presenting with dengue warning signs were included and categorized into a severe dengue group (n = 25) and a non-severe dengue group (n = 44). Clinical data, laboratory findings, dengue serotype, and serial serum samples collected on Days 1, 4, and 8 were analyzed to evaluate the predictive and monitoring efficacy of interleukin-6 (IL-6) and interleukin-8 (IL-8), and patients were followed up until discharge. Of the 69 dengue patients with warning signs, 32 dengue-positive patients were serotyped: DENV-1 (31.3%), DENV-2 (31.3%), DENV-3 (15.6%), DENV-4 (18.8%), and mixed DENV-2 + DENV-3 (3.1%). Severe dengue patients exhibited a higher frequency of secondary dengue infection (IgG) than primary infection (88% vs. 12%), with significantly higher packed cell volume, hemoglobin, and AST levels, prolonged activated partial thromboplastin time, and lower platelet counts and albumin levels. Platelet transfusion was given to 35 dengue patients, and transfused patients had a significantly longer hospital stay than non-transfused patients. IL-6 and IL-8 levels were significantly elevated in severe dengue patients compared with non-severe dengue patients on Days 1 and 4, followed by a decline on Day 8, corresponding with clinical recovery. Moreover, elevated IL-8 levels were significantly associated with longer hospital stays, indicating a potential role as an early predictor of disease progression.
The observed co-circulation of multiple serotypes reflects the hyper-endemic pattern reported across India. Early measurement of the cytokines IL-6 and IL-8 helps distinguish severe from non-severe dengue among patients presenting with warning signs, and both may have potential as biomarkers of disease severity. However, their role in guiding platelet transfusion requires further investigation, particularly in non-severe cases, alongside prioritizing timely management for those at higher risk of severe disease.
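Comparing cytokine levels between the severe and non-severe groups, as in the abstract above, is typically done with a rank-based test because cytokine distributions are skewed. A minimal sketch of the Mann-Whitney U statistic, with illustrative IL-6 values rather than the study's data:

```python
def mann_whitney_u(x, y):
    """Mann-Whitney U statistic for group x vs group y: the number of
    (x, y) pairs where x exceeds y, counting ties as half a win."""
    return sum((a > b) + 0.5 * (a == b) for a in x for b in y)

# Hypothetical Day-1 IL-6 values (pg/mL) for each group
severe = [85.0, 120.0, 64.0]
non_severe = [12.0, 20.0]
u = mann_whitney_u(severe, non_severe)   # maximum possible here is 3 * 2 = 6
```

In practice the p-value would come from the exact U distribution for small samples or a normal approximation (e.g., `scipy.stats.mannwhitneyu`).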
Globally, the older population is rapidly increasing, which may challenge healthcare providers in the future. Therefore, new methods of providing home-based care are urgently needed. Telemonitoring (TM) has been proposed to optimize patient care and enhance remote monitoring and management by clinicians. This feasibility study aimed to describe the experiences and perceptions of home-based TM among older adults with chronic conditions and healthcare professionals (HCPs). The study comprised interviews with older adults and diaries written by HCPs during the intervention. Participants were recruited from two municipalities in central Sweden. The intervention involved home-based TM for 4 months. TM systems, equipped with sensors for monitoring blood pressure, body weight, physical activity, and oxygen saturation, were tailored to individual needs and installed in participants' homes by an IT company. Data were collected between 2023 and 2024 and analysed using deductive content analysis regarding demand, acceptability, implementation, and practicality, to examine whether TM is feasible in real-life settings. Twelve older adults and 21 healthcare professionals took part; the analysed data comprised interviews with older adults (n=4) and diaries (n=9) written by HCPs. Older adults measured their blood pressure every morning and entered the data on a tablet. They reported becoming more aware of changes in their health, particularly in body weight and blood pressure. For sustained engagement and motivation, a comprehensive education plan for TM involving patients, relatives, and HCPs is essential. This study indicates that raising health awareness among older adults is beneficial and underscores the importance of person-centered care. However, some aspects were not fully realized in our study, highlighting the need for further research and refinement in TM implementation to better meet the needs of older adults and healthcare professionals.
ClinicalTrials.gov Identifier: NCT04955600 registration date 2021-03-01.