Although the role of CD44 in the genetic predisposition for bronchopulmonary dysplasia (BPD) has been recognized, the immune characteristics of CD44+ monocytes in BPD are unclear. We aimed to (1) compare the expression and function of CD44 on monocytes in BPD and non-BPD infants; and (2) explore the roles of CD44 on monocytes in hyperoxia-induced inflammation. Flow cytometry assessments of the expression pattern and cytokine-secreting response upon LPS stimulation of CD44+ monocytes were conducted using peripheral blood samples from BPD infants (n = 80) and non-BPD infants (n = 106). The role of CD44 on monocytes was validated under hyperoxia exposure. CD44 expression and inflammatory cytokine secretion ability increased with postmenstrual age. The ability of CD44+ monocytes to secrete IL-6 and TNF-α was significantly greater than that of CD44- monocytes. Compared with that of non-BPD infants, the capacity of monocytes from BPD infants to secrete IL-6 and TNF-α upon LPS stimulation, especially TNF-α sourced from CD44+ intermediate monocytes, was greater. CD44 expression significantly increased under hyperoxia, and CD44 knockdown significantly ameliorated hyperoxia-induced inflammation. CD44+ monocytes played important roles in mediating the hyper-inflammatory response in BPD. One specific subpopulation, TNF-α+/CD44+ intermediate monocytes, might be a valuable marker for identifying BPD. 1. This study revealed the key role of an immune cell subtype (CD44+ monocytes) in hyperoxia-induced bronchopulmonary dysplasia (BPD) and characterized the expression and function of CD44+ monocytes in premature infants with and without BPD. 2. CD44 was closely associated with the proinflammatory cytokine secretion capacity of monocytes. A specific subpopulation (TNF-α+/CD44+ intermediate monocytes) might serve as a valuable marker for the identification of BPD in the future. 3. 
Our findings add new evidence for the association between CD44+ monocytes and BPD pathogenesis, providing new insights into the immune mechanisms of BPD.
Hypertension is a major global public health challenge, and effective management of antihypertensive treatment response is crucial. This study aimed to analyze the demographic, clinical, and lifestyle factors influencing treatment response in hypertensive patients receiving Valsartan monotherapy or Valsartan/Hydrochlorothiazide (HCTZ) combination therapy. This prospective cohort study included 354 adult patients with hypertension who were on stable antihypertensive therapy for at least 6 months prior to enrollment and were initiated on or switched to Valsartan monotherapy (80 mg) or Valsartan/HCTZ combination therapy (80/12.5 mg or 160/12.5 mg) based on clinical indication. Baseline blood pressure inclusion criteria were 140-179 mmHg systolic and/or 90-109 mmHg diastolic. Demographic, clinical, and lifestyle data were collected. The primary outcome was achievement of target blood pressure (< 140/90 mmHg) after 3 weeks of treatment; the secondary outcome was reduction in systolic (SBP) and diastolic (DBP) blood pressure. Statistical analysis included repeated measures ANOVA, multivariable logistic regression, and post-hoc comparisons with Bonferroni correction. The sample consisted of 58.0% females and 42.0% males. Baseline mean SBP was 157.50 ± 17.23 mmHg and DBP was 103.23 ± 10.17 mmHg. After three weeks of treatment, mean SBP decreased to 131.44 ± 20.40 mmHg and mean DBP to 85.08 ± 12.54 mmHg (p < 0.001). Target BP achievement at Week 3 was 82.6% for Valsartan 80 mg, 70.5% for Valsartan/HCTZ 80/12.5 mg, and 56.4% for Valsartan/HCTZ 160/12.5 mg. These unadjusted rates reflect baseline differences in hypertension severity and comorbidity burden rather than differential regimen efficacy. Younger age, shorter hypertension duration, and absence of diabetes were independently associated with better treatment response in multivariable analysis. 
Younger age and shorter disease duration are key predictors of successful initial blood pressure control with Valsartan-based therapy in this short-term assessment. Longer-term studies are needed to confirm whether these early predictors translate to sustained blood pressure control and reduced cardiovascular events. Treatment should be individualized based on patient age, comorbidity profile, and duration of hypertension, with dose adjustment as needed to achieve optimal outcomes.
Allogeneic hematopoietic stem cell transplantation is an established curative therapy for many hematological diseases, but graft-versus-host disease remains a major cause of morbidity and mortality. Tacrolimus, a calcineurin inhibitor, is widely used for prophylaxis because it suppresses T-cell activation. However, its clinical use is complicated by a narrow therapeutic window and marked pharmacokinetic variability. Therapeutic drug monitoring based on trough whole-blood concentrations is routinely used to guide dosing, but this approach has limitations, particularly in transplant recipients who experience rapid physiological and hematological changes. This review summarizes recent insights into determinants of tacrolimus pharmacology in hematopoietic stem cell transplantation and discusses emerging perspectives for individualized dosing. Tacrolimus exerts its immunosuppressive effects by forming a complex with FK506-binding proteins that inhibits calcineurin and suppresses activation of nuclear factor of activated T cells. Beyond this canonical mechanism, interactions with FK506-binding proteins influence the distribution of tacrolimus within blood cells. Because tacrolimus strongly partitions into erythrocytes and leukocytes, whole-blood concentrations reflect systemic exposure and drug binding within circulating blood components. In recipients of hematopoietic stem cell transplantation, marked fluctuations in blood cell counts during conditioning therapy and hematopoietic recovery can alter this distribution, potentially causing changes in concentrations without corresponding changes in pharmacologically active exposure. Genetic variation in drug-metabolizing enzymes further contributes to variability in tacrolimus pharmacokinetics. In particular, polymorphisms in the gene encoding cytochrome P450 3A5 influence tacrolimus metabolism and may affect early dose requirements during the post-transplant period. 
Additionally, temporal fluctuations in tacrolimus exposure within individual patients are increasingly recognized as clinically relevant. Measures that capture the proportion of time during which concentrations remain within the therapeutic range provide a useful indicator of exposure stability. Tacrolimus therapy after hematopoietic stem cell transplantation is influenced by molecular pharmacology, blood cell-dependent distribution, genetic determinants of metabolism, and temporal variability in drug exposure. Integrating these factors may improve understanding of therapeutic drug monitoring and promote more individualized strategies to maintain stable immunosuppression and improve transplant outcomes.
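To make the time-in-therapeutic-range idea concrete, the sketch below estimates the fraction of time a linearly interpolated concentration-time profile stays inside a target window. This is a minimal illustration, not a method from the review; the function name, the midpoint-sampling approach, and the 5-15 ng/mL window are illustrative assumptions.

```python
def time_in_range(times_h, concs_ng_ml, lo=5.0, hi=15.0):
    """Estimate the fraction of time a concentration-time profile
    spends within [lo, hi], linearly interpolating between samples.
    The 5-15 ng/mL window is illustrative, not a clinical target."""
    in_range = 0.0
    total = times_h[-1] - times_h[0]
    for (t0, c0), (t1, c1) in zip(zip(times_h, concs_ng_ml),
                                  zip(times_h[1:], concs_ng_ml[1:])):
        dt = t1 - t0
        n = 1000  # midpoint samples per segment
        for i in range(n):
            c = c0 + (c1 - c0) * (i + 0.5) / n
            if lo <= c <= hi:
                in_range += dt / n
    return in_range / total
```

For example, a profile that rises linearly from 0 to 20 ng/mL spends roughly half its time inside a 5-15 ng/mL window, while a profile flat at 10 ng/mL spends all of it there.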
Regular exercise promotes health through multiple mechanisms, with DNA methylation serving as a key epigenetic modification involved in the molecular regulation induced by exercise. Peripheral blood, as an accessible tissue, is commonly used to study exercise-induced DNA methylation changes. This systematic review aims to synthesize evidence from human exercise intervention studies on the effects of exercise on peripheral blood DNA methylation and related epigenetic markers, and to evaluate their potential as biomarkers of exercise efficacy. The study protocol was registered in PROSPERO (CRD420251069398). A systematic search was conducted in the PubMed, Web of Science, Cochrane Library, Embase and Scopus databases from database inception through 29 May 2025 for all English-language studies. Risk of bias in randomized controlled trials was assessed using the Cochrane Collaboration's tool (version 5.1.0), and quasi-experimental studies were assessed using the JBI Critical Appraisal Checklist for Quasi-Experimental Studies. The results were summarized using narrative synthesis. Twenty-nine studies were included, covering both acute and long-term exercise interventions. Acute exercise induced minor and mostly statistically nonsignificant changes in peripheral blood DNA methylation, whereas long-term exercise elicited significant methylation remodeling in genes related to metabolism, inflammation, and immune function. Exercise dose (frequency, intensity, and duration) and population characteristics influenced the magnitude and scope of methylation responses. Peripheral blood DNA methylation provides robust evidence for exercise-mediated systemic epigenetic regulation and shows promise as a biomarker for exercise adaptation and intervention effects. Future research should optimize intervention designs and methodological standards to deepen understanding of exercise epigenetic mechanisms.
Cytomegalovirus (CMV) reactivation is a major cause of morbidity and mortality after haploidentical hematopoietic stem cell transplantation (Haplo-HSCT). While T cells are crucial for controlling CMV, the dynamic characteristics of the T-cell receptor (TCR) repertoire and their predictive value for refractory CMV reactivation remain poorly characterized. This study aimed to longitudinally monitor the TCR repertoire in Haplo-HSCT patients to identify features associated with CMV reactivation and to explore predictive biomarkers for refractory and recurrent CMV disease. We enrolled 164 Haplo-HSCT patients, of which a subset of 28 was prospectively monitored for TCR repertoire dynamics via multiplex PCR and high-throughput sequencing. Patients were stratified into control (no reactivation, n = 10), acute phase (first reactivation, n = 18), and resolution phase (after clearance, n = 18) groups. CMV-specific TCRs were identified by database matching. The incidence of CMV reactivation was 48.17% (79/164). CMV reactivation induced significant perturbations in TCR repertoire architecture, characterized by increased CDR3 clonality, reduced clonotype distribution evenness, and a decreased number of unique clonotypes (all P < 0.05) compared to controls, indicating a state of oligoclonal expansion. This skewed architecture persisted even after viral clearance, suggesting delayed CMV-specific immune reconstitution. Consistent with this, patients with CMV reactivation had significantly lower CD4+ T cell counts compared to those without reactivation (P = 0.003). Longitudinal analysis revealed that impaired CD4+ T cell recovery preceded CMV reactivation. Patients with secondary reactivation (6/18) exhibited reduced diversity of V-J gene pairings (lower Shannon index, P < 0.05) and significant downregulation of specific TRBV28/TRBJ1-1 and TRBV6-1/TRBJ2-7 combinations. 
Crucially, lower abundance and total frequency of CMV-specific TCRs at initial reactivation predicted secondary reactivation (P < 0.05) and were correlated with a longer disease course. Four patients who failed letermovir prophylaxis lacked CMV-specific TCRs in their top clonotypes. The abundance of CMV-specific TCRs is a key predictor of refractory CMV reactivation post-Haplo-HSCT. Longitudinal TCR profiling offers a promising strategy for risk stratification, potentially guiding more personalized preemptive therapies. Trial registration: not applicable.
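The repertoire metrics reported above (Shannon index, evenness, clonality) can be computed directly from clonotype counts. A minimal Python illustration, assuming only a list of per-clonotype read counts (the function names are ours, not the study's sequencing pipeline):

```python
import math

def shannon_index(counts):
    """Shannon diversity H = -sum(p_i * ln p_i) over clonotype frequencies."""
    total = sum(counts)
    return -sum((c / total) * math.log(c / total) for c in counts if c > 0)

def clonality(counts):
    """Normalized clonality: 1 - H / ln(richness).
    0 = perfectly even repertoire, approaching 1 = monoclonal."""
    richness = sum(1 for c in counts if c > 0)
    if richness <= 1:
        return 1.0
    return 1.0 - shannon_index(counts) / math.log(richness)
```

An even repertoire gives clonality near 0, while a repertoire dominated by one expanded clone approaches 1, the oligoclonal-expansion pattern observed during CMV reactivation.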
This study aims to investigate the role of lactylation and m6A modification-related genes in the tumor microenvironment and immunotherapy for hepatocellular carcinoma (HCC) patients. RNA-sequencing data and corresponding clinical information for HCC were obtained from the TCGA and ICGC datasets. LASSO Cox regression analysis was applied to construct a lactylation-m6A related prognostic model. A 7-gene signature was established and effectively stratified patients into high- and low-risk groups. Further analysis revealed significant differences between the two risk groups in terms of tumor microenvironment, expression levels of immune checkpoint genes, and drug responsiveness. Specifically, the high-risk group exhibited increased immune cell infiltration and lower IC50 values for several drugs, including 5-fluorouracil, afatinib, crizotinib, cediranib, taselisib, and staurosporine, whereas the low-risk group displayed reduced stromal component proportions and better responses to entinostat, irinotecan, KRAS inhibitors, cisplatin, axitinib, and topotecan. Functionally, knockdown of TCOF1 and HDAC1 significantly attenuated the migration and invasive capacity of Huh-7 cells. The lactylation-m6A related prognostic model exhibited robust predictive efficiency in HCC. TCOF1 and HDAC1 may be promising tumor biomarkers for HCC, and further studies are needed to validate these results.
Robotic-assisted CABG is a minimally invasive alternative to conventional sternotomy. The purpose of this study was to evaluate the short- and mid-term outcomes of our experience and to enrich the literature with quality comparisons of these procedures. From 1 October 2020 to 1 July 2025, 927 CABG procedures were performed at a single institution. Two groups of patients were analyzed: the robotic-assisted minimally invasive CABG group (RA-CABG) and the conventional sternotomy CABG group (CS-CABG). This was a retrospective comparison of all consecutive patients undergoing conventional CABG and RA-CABG, using propensity score matching with 24 preoperative covariates. Of the 927 cases, 480 patients were matched, with 240 patients each in the RA-CABG and CS-CABG groups. The matching successfully eliminated all preoperative differences between the groups. There were three conversions to median sternotomy in the robotic group. The number of distal anastomoses per patient was 2.84 in the RA-CABG group and 3.05 in the CS-CABG group. The cross-clamp and cardiopulmonary bypass times were significantly shorter in the CS-CABG group (cross-clamp: 59 vs. 47 min; CPB: 95 vs. 67 min; p = 0.001). The RA-CABG group had significantly (p < 0.001) lower 24-h postoperative blood loss (400 ml vs. 800 ml), fewer blood transfusions, shorter mechanical ventilation time (6 vs. 10 h), shorter ICU stay (20 vs. 28 h), and shorter hospital stay (6 vs. 7 days). Early mortality (in-hospital and operative) rates were similar between the groups: 0.8% for RA-CABG and 1.3% for CS-CABG (p = 0.683). Likewise, there was no significant difference in mid-term mortality between the two groups (p = 0.258). RA-CABG is a safe and feasible alternative to conventional sternotomy CABG. 
This method offers advantages such as avoiding sternotomy, providing greater comfort to the surgeon during LIMA harvesting, and enabling longer LIMA harvesting.
Fenofibrate, an active pharmaceutical ingredient (API) of the fibrate class approved by the U.S. Food and Drug Administration (FDA) for the treatment of hypercholesterolemia and hyperlipidemia, has emerged as a promising candidate for repurposing in cancer. Indeed, beyond its lipid-lowering effects, fenofibrate has demonstrated antitumor properties, including antiproliferative and cytotoxic effects in normoxia (21% O2), in preclinical models of glioblastoma (GBM). In addition, fenofibrate may inhibit hypoxia-inducible factor-1 alpha (HIF-1α) and reduce the expression of hypoxia-inducible genes, potentially overcoming hypoxia-induced chemoresistance. These findings, combined with its established clinical safety profile, support further investigation of fenofibrate as a potential therapeutic strategy in GBM. On this basis, physico-chemical characterization of fenofibrate was performed to determine its ability to cross the blood-brain barrier. We determined in vitro, on the GBM cell lines U251-MG, U87-MG, and GL261, the efficacy of fenofibrate in decreasing GBM cell viability and proliferation under normoxia (21% O2) and hypoxia (1% and 0.2% O2). We investigated the kinetics of its internalization by HPLC. In addition, the safety of fenofibrate was evaluated on non-tumoral brain cells. Dunn's post-hoc test was used after a significant Kruskal-Wallis test. Being practically insoluble in water, fenofibrate was dissolved in dimethyl sulfoxide (DMSO) for these studies. Primary astrocytes showed no signs of toxicity following treatment with fenofibrate. The effect of fenofibrate on GBM cell lines was studied at various time points and exhibited cell type-, time- and concentration-dependent cytotoxicity, with LC50 values of 25, 43.7 and 49.6 µM for U251-MG, U87-MG and GL261 cells, respectively. Interestingly, fenofibrate uptake was confirmed: 12.9 ± 5.7% and 14.2 ± 6.6% of fenofibrate was found in U251-MG and U87-MG cells, respectively. 
Intracellular concentrations of fenofibrate increased over time, and no precipitation was observed. In hypoxia (1% and 0.2% O2), the cytotoxic effect of fenofibrate was still present, albeit decreased. Furthermore, fenofibrate induced a cytostatic effect on U251-MG and U87-MG cells under both normoxia (21% O2) and hypoxia (1% and 0.2% O2), reducing their proliferation. Under the experimental conditions tested, namely in vitro studies, fenofibrate appeared more effective than temozolomide, the standard treatment for GBM. This study demonstrated the cytotoxic and cytostatic effects of fenofibrate on GBM cells. These results indicate that fenofibrate may be a therapeutic alternative for GBM treatment.
The pre-weaning period is critical because early-life nutrition and management influence growth, metabolic function and rumen development, thereby affecting subsequent productivity in dairy calves. Zinc (Zn) supplementation plays a key role in supporting these processes through its involvement in enzymatic activity, antioxidant defense systems, and metabolic regulation, but conventional sources often have bioavailability limitations due to the formation of insoluble complexes in the gastrointestinal tract. This study addresses this challenge by evaluating three Zn forms (ZnO, Zn-lysine, and nano-ZnO) to identify the most effective source for enhancing growth rates, nutrient utilization, and metabolic health. Twenty-four newborn Holstein calves, each with an initial body weight of 40.5 ± 4.24 kg, were selected and randomly allocated to receive one of three treatments: ZnO, Zn-lysine, or nano-ZnO supplementation. Each calf received 80 mg of Zn daily. Supplementation with nano-ZnO increased dry matter intake (P < 0.01), average daily gain (P < 0.01), and hip width (P < 0.01) compared to Zn-lysine and ZnO. However, there were no differences in feed conversion ratio. The treatments did not affect apparent digestibility or rumen fermentation, except for a lower rumen ammonia nitrogen concentration in the nano-ZnO group compared to the other two treatments (P < 0.01). Regarding blood parameters, calves receiving nano-ZnO showed higher blood triglyceride concentration (P = 0.04) and superoxide dismutase activity (P < 0.01), while blood D-lactate concentration was lower in the nano-ZnO and Zn-lysine groups than in the ZnO group (P = 0.01). Additionally, both fecal consistency (P = 0.02) and nasal discharge (P < 0.01) scores were significantly reduced in the nano-ZnO group. In summary, the study suggests that nano-ZnO is a more effective Zn source and an efficient additive for improving dairy calf performance.
Chronic ankle conditions often lead to persistent functional limitations. Blood flow restriction (BFR) training is a potential adjunct to rehabilitation, but its specific efficacy for chronic ankle conditions remains to be synthesized. To systematically evaluate the effects of BFR-assisted rehabilitation on primary outcomes (dynamic balance and patient-reported ankle stability) and secondary outcomes (ankle range of motion and muscle strength) in individuals with chronic ankle conditions (including chronic ankle instability, chronic ligamentous injury, and tendinopathy). A systematic search of PubMed, Web of Science, Embase, CNKI, and Wanfang databases was conducted up to April 13, 2025. We included randomized controlled trials (RCTs) involving adults with chronic ankle instability (CAI) defined by a history of sprain and/or Cumberland Ankle Instability Tool (CAIT) score < 24. Grey literature was excluded. The protocol was registered on PROSPERO (CRD420251249207). Methodological quality was assessed using the Cochrane Risk of Bias (RoB) 1.0 tool. Data were pooled using random- or fixed-effects models. Seven RCTs (n = 204) were included. The overall risk of bias across the included studies was generally low to moderate. Meta-analysis of post-intervention values indicated that BFR-assisted rehabilitation significantly improved the primary outcome of dynamic balance (MD = 5.75; 95% CI [2.10, 9.40]; P < 0.01; I2 = 34%, P = 0.22) compared with conventional rehabilitation. Significant improvements in the other primary outcome, CAIT scores, were also observed (MD = 3.68; 95% CI [0.26, 7.11]; P = 0.05). However, secondary outcomes for dorsiflexion and plantarflexion range of motion exhibited high heterogeneity and unstable pooled estimates, showing no significant benefit. Muscle strength data were insufficient for meta-analysis. BFR-assisted rehabilitation appears to enhance dynamic balance and perceived ankle stability in patients with chronic ankle conditions. 
However, evidence regarding its effect on joint range of motion remains inconclusive because of data instability. Current evidence supports BFR as a functional intervention, though standardized protocols are needed to further validate its clinical utility.
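The pooled estimates above come from standard inverse-variance meta-analysis. As a rough illustration (not the review's actual computation), a fixed-effect pooled mean difference with a 95% CI can be computed from per-study (MD, SE) pairs:

```python
import math

def pooled_md(effects):
    """Fixed-effect inverse-variance pooling of mean differences.
    effects: list of (md, se) tuples, one per study.
    Returns (pooled_md, (ci_low, ci_high)) with a 95% CI."""
    weights = [1.0 / se ** 2 for _, se in effects]
    md = sum(w * e for (e, _), w in zip(effects, weights)) / sum(weights)
    se = math.sqrt(1.0 / sum(weights))
    return md, (md - 1.96 * se, md + 1.96 * se)
```

For instance, a single-study input of (5.75, 1.86), where the SE is back-calculated from the dynamic-balance CI reported above, reproduces roughly MD 5.75, 95% CI [2.10, 9.40]. A random-effects model would additionally widen the CI by the estimated between-study variance.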
This article presents the design and numerical analysis of a smart label-free Surface Plasmon Resonance (SPR) sensor to detect the concentration of haemoglobin in blood and the concentration of glucose in urine samples. The proposed sensor uses a thin film of silver (Ag) on the prism's surface to excite the surface plasmons. The finite element method (FEM) was used to perform numerical simulations to optimize the layer thickness and to analyse the sensor's performance in terms of sensitivity and figure of merit (FOM). The simulation results showed a linear correlation between the resonance wavelength shift and the change in analyte refractive index. The optimized design achieved a sensitivity of 288.29 °/RIU, a QF of 780.80 [Formula: see text], an SNR of 15.62, a FOM of 492.51 [Formula: see text] and a CSF of 539.20. The label-free methodology involves no chemical tagging and thus allows biosensing that is quick, real-time and economical. The proposed SPR sensor holds great promise for non-invasive biomedical applications, diagnostics and point-of-care monitoring. In addition, machine learning models were employed to predict sensor sensitivity based on structural and optical parameters, demonstrating the strong capability of data-driven approaches for rapid performance estimation and design optimization.
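Angular sensitivity in °/RIU is, in essence, the slope of resonance angle versus analyte refractive index. A minimal sketch of that estimation via a least-squares line (the function name and sample values are illustrative, not the article's FEM workflow):

```python
def angular_sensitivity(theta_deg, n_ri):
    """Angular sensitivity S = d(theta_res)/dn in degrees per RIU,
    estimated as the least-squares slope of resonance angle vs.
    analyte refractive index."""
    n = len(n_ri)
    mx = sum(n_ri) / n
    my = sum(theta_deg) / n
    num = sum((x - mx) * (y - my) for x, y in zip(n_ri, theta_deg))
    den = sum((x - mx) ** 2 for x in n_ri)
    return num / den
```

Fitting synthetic resonance angles generated with a slope of 288.29 °/RIU recovers the sensitivity value reported above.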
To compare differences between candidemia and non-candidemia over the past 12 years and to construct a predictive model of candidemia, to enrich clinical data and improve the diagnosis and treatment of candidemia. A matched case-control study design was used to collect the clinical data of inpatients in a tertiary hospital in Yunnan from 2013 to 2024. The patients were divided into a candidemia group and a non-candidemia (control) group, matched 1:1. SPSS was used to compare differences in epidemiological characteristics, risk factors, and survival. Drug sensitivity analysis of candidemia isolates was performed by the broth dilution method. Logistic regression analysis was performed using R, and a clinical prediction model was constructed. A total of 134 cases were collected, including 67 with candidemia and 67 without candidemia. Elderly men predominated in both the candidemia and non-candidemia groups. The time to positivity of blood cultures, hospitalization duration, urolithiasis distribution rate, and mortality rate of patients with candidemia were significantly higher than those of patients without candidemia (all P < 0.05). Five Candida species were isolated from patients with candidemia, and the susceptibility rates of all detected Candida strains to amphotericin B, anidulafungin, caspofungin, and micafungin were 100%. Chronic kidney disease, hepatorenal syndrome, tigecycline and amikacin use, abdominal infection, and invasive pulmonary fungal infection may be potential risk factors for candidemia. Logistic regression analysis showed that the time to positivity of blood cultures (≥ 2 days) (odds ratio [OR] = 121.03, P < 0.001), the number of concurrent infections during hospitalization (OR = 3.9, P < 0.01), and blood transfusion treatment (OR = 6.91, P < 0.05) were risk factors, whereas fibrinogen levels (OR = 0.77, P < 0.01) and C-reactive protein levels (OR = 0.98, P < 0.01) were protective factors. 
The receiver operating characteristic curve, calibration curve, and clinical decision curve indicated that the model had useful discrimination, calibration, and clinical applicability. This study found differences in multiple outcomes between candidemia and non-candidemia. The five potential risk factors analyzed can be used as predictors of candidemia and may provide a reference for the diagnosis and treatment of clinical candidemia.
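The odds ratios reported above relate to logistic-regression coefficients via OR = exp(β), and a fitted model scores a patient through the logistic function. A minimal sketch (the coefficient values and variable ordering are illustrative; the study's actual fitted model is not reproduced here):

```python
import math

def odds_ratio(beta):
    """Odds ratio from a logistic-regression coefficient: OR = exp(beta)."""
    return math.exp(beta)

def logistic_prob(intercept, coefs, x):
    """Predicted probability from a logistic model:
    P = 1 / (1 + exp(-(b0 + sum(b_i * x_i)))).
    Coefficients would come from the fitted model; values here are
    placeholders for illustration."""
    z = intercept + sum(b * xi for b, xi in zip(coefs, x))
    return 1.0 / (1.0 + math.exp(-z))
```

A positive coefficient (OR > 1; e.g. the transfusion OR of 6.91 above corresponds to β = ln 6.91 ≈ 1.93) raises the predicted probability, while a negative coefficient (OR < 1, as for fibrinogen) lowers it.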
To evaluate the feasibility and preliminary clinical outcomes of percutaneous endoscopic visual trephine decompression (PEVTD) combined with pedicle screw fixation for the treatment of lumbar burst fractures with secondary spinal stenosis. A retrospective analysis was conducted on 21 patients with lumbar burst fractures and secondary spinal stenosis treated from December 2022 to December 2024. All patients underwent PEVTD combined with percutaneous pedicle screw internal fixation. The duration of endoscopic decompression surgery, blood loss, the incidence of intraoperative dural tearing, the Visual Analogue Scale (VAS) pain score, the American Spinal Injury Association (ASIA) grade, and the MRI dural sac cross-sectional area were recorded to assess the effectiveness of decompression. The mean operative time for endoscopic decompression was 49.3 ± 10.2 min (41-60 min), with an estimated blood loss of approximately 160 ml. No intraoperative dural tearing occurred. Compared with the preoperative values, the VAS score and ASIA grade at 1 week and 1 year postoperatively significantly improved (P < 0.05). The dural sac cross-sectional area increased significantly postoperatively (P < 0.01). The preliminary results suggest that PEVTD is a feasible minimally invasive technique for treating lumbar burst fractures with secondary spinal canal stenosis in selected patients (those without posterior column injury and with ≤ 50% canal compromise), with short-term outcomes that appear acceptable in this cohort, although these findings are exploratory and require confirmation in prospective comparative studies.
African swine fever virus (ASFV) is a large, complex DNA virus that causes a haemorrhagic disease with lethality rates approaching 100%. Mechanisms of virulence are still poorly understood, but loss of members of the multigene families (MGF) is commonly observed in naturally occurring attenuated virus strains, indicating that these proteins play an important role in disease pathogenesis. Here, a suppressor of cytokine signalling (SOCS)-box-like motif was identified in proteins of the MGF505 cluster. SOCS-box motifs typically recruit the Cullin-RING-ligase (CRL) machinery, a superfamily of E3 ubiquitin ligases that target proteins for proteasomal degradation. The data presented show that MGF505-1R inhibits host innate immune responses controlled by activation of the transcription factors IRF3 and NF-κB. MGF505-1R expression was shown to correlate with a reduction in the protein levels of the transcriptional co-activator p300. Targeted mutations of the SOCS-box motif were shown to reverse the observed IRF3 and NF-κB inhibition. The data support a role for MGF505-1R in hijacking the host ubiquitin machinery to evade host immune responses.
Centenarian patients constitute a rapidly growing yet understudied population in emergency medicine. Evidence regarding prognostic factors and one-year outcomes in individuals aged 100 years and older presenting to the emergency department remains limited. This retrospective observational study was conducted in the emergency department of Giresun Training and Research Hospital and included centenarian patients presenting between 2010 and 2023. A total of 160 emergency department visits from 83 unique patients were evaluated. Demographic characteristics, clinical variables, comorbidities, frailty indices, and laboratory parameters obtained at admission were recorded. Frailty was assessed using a modified frailty index excluding functional dependence (mFI-4) and the Clinical Frailty Scale (CFS). The primary outcome was one-year all-cause mortality. Kaplan-Meier survival analysis and Cox proportional hazards regression analysis were performed at the patient level using the index emergency department visit. In the descriptive visit-level analysis, non-survivor visits showed higher hospitalization frequency and less favorable inflammatory and renal function marker profiles than survivor visits, while pulmonary diseases were more frequent among non-survivors and cardiovascular diseases were more common among survivors. Modified frailty index scores did not differ significantly between groups. Higher CFS categories were associated with shorter median survival times, although Kaplan-Meier analysis showed no statistically significant separation between frailty categories. In Cox regression analysis, hospitalization at the index emergency department visit and higher blood urea nitrogen levels remained independently associated with one-year mortality. In centenarian patients presenting to the emergency department, traditional comorbidity-based frailty indices show limited discriminatory value for one-year mortality. 
Acute clinical presentation and laboratory parameters reflecting inflammatory burden, renal function, and physiological reserve appear to be more closely associated with outcomes. The study is not registered in a clinical trial registry.
Eimeria necatrix, a member of the Apicomplexa phylum, is one of the most pathogenic parasites, causing high mortality in chickens. Microneme proteins (MICs) play essential roles in host cell recognition and invasion by apicomplexan parasites and are also attractive candidates for vaccine development. However, comprehensive studies on E. necatrix MICs remain limited. The Eimeria necatrix MIC3 gene (EnMIC3) was amplified and expressed in Escherichia coli. The recombinant protein (rEnMIC3) was characterized via SDS-PAGE and Western blot. The antigenicity of rEnMIC3 and its localization in sporozoites (SZ) and second-generation merozoites (MZ-2) of E. necatrix were determined by Western blot and indirect immunofluorescence analyses (IFAs). The dynamic expression of EnMIC3 across different developmental stages and its impact on sporozoite invasion of host cells were analyzed. The immune protection provided by rEnMIC3 was evaluated in chickens using weight gain, lesion scores, oocyst production, anticoccidial index (ACI), and antibody levels. The open reading frame of EnMIC3 was 798 bp, encoding a 265-amino acid protein with a predicted molecular weight of 28.50 kDa. EnMIC3 contained a signal peptide and a single epidermal growth factor (EGF)-like domain. The rEnMIC3, with an approximate molecular weight of 36 kDa, could be specifically recognized by convalescent sera from chickens infected with E. necatrix. The molecular mass of the native protein was approximately 35 kDa, and it localized to the apical region in SZ but exhibited a cytoplasmic distribution in MZ-2. EnMIC3 mRNA was expressed at significantly higher levels in SZ than in MZ-2, whereas protein expression displayed an inverse pattern. Anti-rEnMIC3 polyclonal antibodies inhibited sporozoite invasion of DF-1 cells in a dose-dependent manner. Vaccination with rEnMIC3 conferred effective protection against E. 
necatrix challenge, with the high-dose group (200 µg) achieving the highest ACI value (171.32) and markedly elevated serum antibody levels. These findings not only offer a foundation for understanding the role of EnMIC3 protein in the host invasion of E. necatrix but also present a potential protective antigen of E. necatrix for the development of a subunit vaccine against avian coccidiosis.
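The anticoccidial index (ACI) cited above (171.32 in the 200 µg group) is conventionally assembled from weight gain, survival, lesion scores, and oocyst output. A minimal sketch of one commonly used formulation follows; the exact oocyst-value scoring varies between studies, and the input values below are illustrative, not data from this study:

```python
def anticoccidial_index(relative_weight_gain, survival_rate,
                        mean_lesion_score, oocyst_value):
    """One commonly used ACI formulation (values in percent / index points).

    ACI = (relative weight gain % + survival %) - (lesion value + oocyst value),
    where lesion value = mean lesion score (0-4 scale) x 10 and the oocyst
    value is an index point assigned from the oocyst-reduction ratio.
    """
    lesion_value = mean_lesion_score * 10
    return (relative_weight_gain + survival_rate) - (lesion_value + oocyst_value)

# Illustrative values only (not from the study):
aci = anticoccidial_index(relative_weight_gain=85.0, survival_rate=100.0,
                          mean_lesion_score=1.2, oocyst_value=5)
print(aci)  # 168.0
```

Under this convention, an ACI above roughly 160 is usually read as moderate-to-good protection, consistent with the 171.32 reported for the high-dose group.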
To investigate the influence of technical and clinical variations in uterine balloon tamponade (UBT) procedures on postpartum blood loss and maternal outcomes in women with postpartum hemorrhage (PPH) due to uterine atony. This retrospective cross-sectional study was conducted on 63 patients who underwent UBT for atonic PPH between January 2020 and March 2025. Data collected included demographic characteristics, uterine balloon inflation volume, timing and duration of insertion, and maternal outcomes. The primary outcomes were success of UBT and occurrence of composite adverse maternal outcomes (CAMO). ROC analysis was used to identify optimal threshold values for procedural variables. UBT successfully controlled hemorrhage in 74.6 % of women. CAMO occurred in 17.5 % of patients. Severe PPH was observed in 60.3 % of women and was significantly associated with higher maternal shock index, increased transfusion requirements, and longer ICU stays. A delay of ≥45 min from delivery to UBT insertion was strongly associated with increased blood loss and higher CAMO incidence (AUC 0.858, p<0.001). Though not independently indicative of adverse outcomes, balloon volumes greater than 232.5 mL were associated with increased estimated blood loss (p=0.021). Duration of balloon retention and deflation timing had no significant impact on clinical outcomes. UBT is an effective intervention for controlling PPH secondary to uterine atony. Early application, within 45 min postpartum, is critical in reducing blood loss and maternal morbidity. Inflation volume may reflect bleeding severity but does not correlate with adverse outcomes. Findings support prompt and standardized application of UBT in PPH protocols.
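The ≥45 min cutoff above comes from ROC threshold analysis. A self-contained sketch of how such a cutoff can be derived (rank-based Mann-Whitney AUC plus Youden's J over candidate thresholds); the delay and outcome values below are invented for illustration, not patient data:

```python
def auc_mann_whitney(scores, labels):
    """AUC as P(score_pos > score_neg), counting ties as 0.5."""
    pos = [s for s, y in zip(scores, labels) if y == 1]
    neg = [s for s, y in zip(scores, labels) if y == 0]
    wins = sum((p > n) + 0.5 * (p == n) for p in pos for n in neg)
    return wins / (len(pos) * len(neg))

def best_threshold_youden(scores, labels):
    """Threshold t maximizing Youden's J = sensitivity + specificity - 1,
    classifying 'positive' when score >= t."""
    pos = [s for s, y in zip(scores, labels) if y == 1]
    neg = [s for s, y in zip(scores, labels) if y == 0]
    best_t, best_j = None, -1.0
    for t in sorted(set(scores)):
        sens = sum(s >= t for s in pos) / len(pos)
        spec = sum(s < t for s in neg) / len(neg)
        j = sens + spec - 1
        if j > best_j:
            best_t, best_j = t, j
    return best_t, best_j

# Hypothetical delivery-to-insertion delays (min) and adverse-outcome flags:
delays  = [20, 25, 30, 35, 40, 50, 55, 60, 70, 80]
outcome = [ 0,  0,  0,  0,  0,  1,  0,  1,  1,  1]
print(auc_mann_whitney(delays, outcome))       # discrimination of delay
print(best_threshold_youden(delays, outcome))  # optimal cutoff and J
```

The Youden approach picks the threshold that jointly maximizes sensitivity and specificity, which is the usual basis for reporting a single "optimal" procedural cutoff alongside an AUC.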
Chronic low back pain (CLBP) is a prevalent condition with unclear pathophysiology and substantial socioeconomic burden. Cerebral blood flow (CBF) alterations have been implicated in CLBP, yet previous arterial spin labeling (ASL) studies using single post-labeling delay (PLD) have yielded inconsistent results. In this study, multi-PLD ASL was combined with machine learning to characterize CBF alterations in CLBP and to explore their classification feasibility. Seventy-eight patients with CLBP and seventy-eight age- and sex-matched healthy controls underwent multi-PLD ASL scanning. Voxel-wise comparisons of normalized CBF were performed, followed by correlation analyses with clinical measures. Radiomics features extracted from brain regions showing significant CBF differences were used to construct machine learning classification models via a rigorous nested cross-validation and LASSO feature selection framework. Compared with healthy controls, patients with CLBP exhibited significant hyperperfusion in the right lingual gyrus and right thalamus. CBF values in the right lingual gyrus were positively correlated with Oswestry Disability Index scores, while thalamic CBF was positively correlated with pain intensity. Among the evaluated models, the XGBoost classifier achieved the best performance, with an area under the curve of 0.842 (95% CI: 0.774-0.901). These findings indicate that region-specific CBF alterations are closely associated with pain severity and functional impairment in CLBP. Machine learning analysis of CBF radiomic features shows potential discriminative performance in identifying patients with CLBP.
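The classification pipeline described above (radiomic features, LASSO selection, classifier evaluated under nested cross-validation) can be sketched as follows. This is an assumption-laden illustration: the data are synthetic, and a gradient-boosting classifier stands in for XGBoost; it is not the authors' exact setup.

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.feature_selection import SelectFromModel
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import GridSearchCV, StratifiedKFold, cross_val_score
from sklearn.pipeline import Pipeline
from sklearn.preprocessing import StandardScaler

# Synthetic stand-in for CBF radiomics features (78 patients + 78 controls).
X, y = make_classification(n_samples=156, n_features=60, n_informative=8,
                           random_state=0)

pipe = Pipeline([
    ("scale", StandardScaler()),
    # L1-penalized logistic regression acts as a LASSO-style feature selector.
    ("select", SelectFromModel(
        LogisticRegression(penalty="l1", solver="liblinear", C=0.5))),
    ("clf", GradientBoostingClassifier(random_state=0)),
])

# Inner loop tunes the selector's sparsity; outer loop estimates AUC.
inner = GridSearchCV(pipe, {"select__estimator__C": [0.1, 0.5, 1.0]},
                     cv=StratifiedKFold(3, shuffle=True, random_state=0),
                     scoring="roc_auc")
outer_auc = cross_val_score(inner, X, y, scoring="roc_auc",
                            cv=StratifiedKFold(5, shuffle=True, random_state=0))
print(outer_auc.mean())
```

Keeping feature selection inside each training fold, as here, avoids the information leakage that otherwise inflates reported AUCs such as the 0.842 quoted above.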
Cross-study inconsistencies in autism spectrum disorder (ASD) blood microRNA biomarker studies suggest that methodological heterogeneity may substantially limit reproducibility. We conducted an exploratory meta-analysis of publicly available ASD blood miRNA datasets from the Gene Expression Omnibus, applying rigorous inclusion criteria and standardized analytical protocols. Three datasets were included (GSE89596, GSE67979, GSE222046) comprising 614 miRNAs across 90 participants (45 ASD, 45 controls). Random-effects meta-analysis was performed using Hedges' g effect sizes, with comprehensive heterogeneity assessment and leave-one-dataset-out cross-validation. No miRNAs survived multiple testing correction (Benjamini-Hochberg FDR < 0.05), though seven candidate signals showed consistent evidence with unadjusted p < 0.01 and large effect sizes. These candidates demonstrated near-zero between-study heterogeneity and consistent directionality across validation analyses. Potential age-related and platform-related differences were observed, with near-zero correlation between adult and pediatric effect sizes (Kendall's τ = -0.022); however, these two sources of variability were fully confounded in the available data and could not be separated. Some miRNAs exhibited extreme between-study variability (I² > 80%), indicating substantial methodological differences. Cross-validation revealed that excluding the single adult dataset reduced sign consistency from 89.9% to 68.9%. Our findings suggest that age-related and methodological factors, including technical platform differences, may contribute to limited reproducibility in ASD blood miRNA research, and that blood-derived signals should be interpreted as potentially reflecting peripheral physiological states rather than central disease mechanisms. 
A supplementary cross-tissue analysis using post-mortem prefrontal cortex data (GSE59286; n = 45) provided direct empirical support for this interpretation: the majority of blood candidate miRNAs showed no corresponding expression in brain tissue, with only hsa-miR-29c-5p demonstrating directional concordance across both tissues. These findings suggest that age stratification, platform harmonization, and cross-tissue validation should be considered essential prerequisites for reliable ASD miRNA biomarker discovery, rather than optional refinements.
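The meta-analytic machinery used above (Hedges' g effect sizes, DerSimonian-Laird random-effects pooling, I² heterogeneity) can be sketched in a few lines. The per-study summary statistics here are invented for illustration and are not values from GSE89596, GSE67979, or GSE222046:

```python
import math

def hedges_g(m1, sd1, n1, m2, sd2, n2):
    """Hedges' g (bias-corrected standardized mean difference) and its variance."""
    sp = math.sqrt(((n1 - 1) * sd1**2 + (n2 - 1) * sd2**2) / (n1 + n2 - 2))
    d = (m1 - m2) / sp
    j = 1 - 3 / (4 * (n1 + n2) - 9)          # small-sample correction factor
    var_d = (n1 + n2) / (n1 * n2) + d**2 / (2 * (n1 + n2))
    return j * d, j**2 * var_d

def dersimonian_laird(gs, vs):
    """Random-effects pooled effect, tau^2, and I^2 (DerSimonian-Laird)."""
    w = [1 / v for v in vs]
    g_fixed = sum(wi * gi for wi, gi in zip(w, gs)) / sum(w)
    q = sum(wi * (gi - g_fixed)**2 for wi, gi in zip(w, gs))
    k = len(gs)
    c = sum(w) - sum(wi**2 for wi in w) / sum(w)
    tau2 = max(0.0, (q - (k - 1)) / c)
    i2 = max(0.0, (q - (k - 1)) / q) * 100 if q > 0 else 0.0
    w_re = [1 / (v + tau2) for v in vs]
    pooled = sum(wi * gi for wi, gi in zip(w_re, gs)) / sum(w_re)
    return pooled, tau2, i2

# Illustrative per-study summaries: (mean_ASD, sd, n_ASD, mean_ctrl, sd, n_ctrl)
studies = [(1.8, 0.9, 15, 1.1, 0.8, 15),
           (2.0, 1.0, 20, 1.2, 1.1, 20),
           (1.6, 0.7, 10, 1.0, 0.9, 10)]
effects = [hedges_g(*s) for s in studies]
pooled, tau2, i2 = dersimonian_laird([g for g, _ in effects],
                                     [v for _, v in effects])
print(pooled, tau2, i2)
```

In this framework, the "near-zero between-study heterogeneity" reported for the seven candidate miRNAs corresponds to tau² ≈ 0 and low I², whereas the unstable miRNAs with I² > 80% would receive heavily down-weighted, wide-interval pooled estimates.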