The use of angiogram-based fractional flow reserve technologies in routine clinical practice is increasing steadily. Available data suggest that they are associated with clinical outcomes comparable to those of wire-based fractional flow reserve (FFR), but evidence is still limited, especially in patients with acute coronary syndromes (ACSs). The aim of this study was to examine the real-world clinical outcomes of an angiogram-based fractional flow reserve technology (FFRangio; CathWorks, Kfar-Saba, Israel) used to guide treatment for coronary artery disease in patients presenting with acute versus chronic coronary syndromes (CCSs). This is a retrospective, observational study. Our cohort included consecutive patients undergoing coronary angiography in 7 centers (Japan, 6; Israel, 1) whose treatment was guided by FFRangio between January 2021 and December 2023. The primary end point was the 2-year cumulative incidence of major adverse cardiac events, stratified by presentation (ACS/CCS) and treatment strategy (revascularization/deferral). Overall, 2245 patients (ACS, 756; CCS, 1489) were included. Patients with ACS were younger and more likely to be diabetic. At 2 years, ACS presentation was associated with a higher risk of major adverse cardiac events (9.8% versus 5.6%; HR, 1.80; P<0.001). Stratified by presentation and treatment, deferred patients had a similar risk of major adverse cardiac events (ACS, 5.4%; CCS, 4.7%; HR, 1.08; P=0.672); among treated patients, ACS presentation was associated with an increased risk of major adverse cardiac events (11.7% versus 7.3%; HR, 1.64; P<0.001). In a real-world setting, FFRangio-guided treatment yields 2-year outcomes comparable with current data for wire-based fractional flow reserve in both patients with ACS and patients with CCS. ACS presentation is associated with an increased risk of major adverse cardiac events in patients undergoing FFRangio-guided revascularization but not deferral.
These results demonstrate the clinical utility of FFRangio-guided treatment across the spectrum of coronary artery disease presentation excluding ST-segment-elevation myocardial infarction. URL: https://clinicaltrials.gov/study/NCT05648396?term=FFRangio%20&viewType=Card&rank=3; Unique Identifier: NCT05648396.
Venetoclax, a potent B-cell leukemia/lymphoma-2 inhibitor, is an antineoplastic agent used in various hematologic malignancies, including acute myeloid leukemia (AML). The aim of this study was to evaluate the incidence, risk factors, and outcomes of major adverse cardiac events (MACEs) among patients with newly diagnosed AML receiving venetoclax-based therapy. We conducted a retrospective, single-center cohort study of 214 patients with newly diagnosed AML treated with venetoclax. MACE was defined as a composite of new-onset heart failure, heart failure exacerbation requiring hospitalization, myocardial infarction/coronary revascularization, or stroke/transient ischemic attack during active venetoclax-based therapy. The Fine-Gray subdistribution hazard model determined the association between baseline clinical characteristics and MACEs, with noncardiovascular death as a competing risk. A Cox proportional hazards model determined the association between time-dependent MACEs and overall survival. Among 214 patients (mean age, 71.5 years; 41% women), MACEs occurred in 14 (6.5%) patients. Patients with non-de novo (secondary/treatment-related) AML had a higher incidence of MACEs compared with de novo AML (85.7% versus 14.3%; subdistribution hazard ratio [HR], 7.1 [95% CI, 1.5-33]; P<0.01). Time-dependent MACE was independently associated with inferior overall survival (adjusted HR, 2.75 [95% CI, 1.41-5.35]). In conclusion, MACE is a clinically significant event that occurs in 6.5% of patients with AML treated with venetoclax, and the risk is higher in patients with secondary or treatment-related AML.
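For readers unfamiliar with the competing-risks framework used in this abstract, the quantity the Fine-Gray subdistribution hazard acts on is the cumulative incidence function, which can be estimated nonparametrically (Aalen-Johansen style). The sketch below is illustrative only; the data, function name, and cause coding are assumptions, not values from the study.

```python
def cumulative_incidence(times, events, cause):
    """Nonparametric (Aalen-Johansen) cumulative incidence of `cause`
    under competing risks. events: 0 = censored, 1..K = event of cause k.
    Returns a list of (event time, cumulative incidence) steps."""
    data = sorted(zip(times, events))
    at_risk = len(data)
    surv = 1.0   # overall event-free survival S(t-)
    cif = 0.0
    steps = []
    i = 0
    while i < len(data):
        t = data[i][0]
        d_all = d_cause = censored = 0
        # collect everyone with this event/censoring time
        while i < len(data) and data[i][0] == t:
            ev = data[i][1]
            if ev == 0:
                censored += 1
            else:
                d_all += 1
                d_cause += (ev == cause)
            i += 1
        if d_cause:
            cif += surv * d_cause / at_risk   # S(t-) * cause-specific hazard
            steps.append((t, cif))
        if d_all:
            surv *= 1 - d_all / at_risk       # update overall survival
        at_risk -= d_all + censored
    return steps
```

Unlike a naive Kaplan-Meier on one event type, this estimator does not treat competing events (here, noncardiovascular death) as ordinary censoring, which is why the abstract uses the subdistribution framework.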
The therapeutic effect of traditional oral ulcer medications is limited by their complex synthetic routes, weak wet-tissue adhesion, and poor bioactivity. To address these issues, a novel sprayable self-gelling powder composed of polyacrylic acid (PAA), quaternized chitosan (QCS), and puerarin (PUE) is developed, which absorbs saliva quickly and forms a hydrogel patch in situ at the ulcer site through hydrogen bonding and ionic interactions. More importantly, this hydrogel patch undergoes phase separation under excessive flowing saliva and spontaneously converts into a hierarchical structure with a non-adhesive top PUE layer and an adhesive bottom PAA/QCS/PUE layer, preventing unwanted adhesion between the hydrogel patch and tissues adjacent to the oral ulcer site. In addition, this hierarchical hydrogel patch exhibits excellent antibacterial and anti-inflammatory properties, as proven by its broad antibacterial performance against multiple bacteria, high reactive oxygen species (ROS) scavenging capacity, and promotion of macrophage polarization from the M1 to the M2 phenotype. Owing to the synergy among good one-sided tissue adhesion and antibacterial and anti-inflammatory properties, in vivo wound healing of rabbit oral mucosa defects can be achieved efficiently using self-gelling PAA/QCS/PUE powders.
Cardiac rehabilitation (CR) is a critical secondary prevention strategy for patients with acute myocardial infarction (AMI). Although CR has been covered by the National Health Insurance in Korea since 2017, real-world participation remains suboptimal. We assessed trends in CR participation over the past 5 years and identified factors associated with non-participation in treatment. This retrospective observational study analyzed data from the National Health Insurance Service claims database between 2018 and 2022. Patients aged ≥ 40 years who were hospitalized for AMI and received acute-phase interventions, such as thrombolysis, percutaneous coronary intervention (PCI), or coronary artery bypass grafting (CABG), were included. CR participation was defined as having at least one claim for education, assessment, or treatment services. Multivariable logistic regression was conducted to identify factors associated with non-participation in CR treatment, focusing on patients who received care at CR-providing institutions, to minimize confounding related to institutional availability. In total, 109,436 patients were included. Overall participation rates in CR education, assessment, and treatment increased from 2018 to 2022. However, sustained participation in outpatient treatment remained low, with only 5.7% of patients in 2022 completing ≥ 11 treatment sessions. Factors associated with lower treatment participation included the absence of prior education and assessment, lower socioeconomic status, undergoing PCI rather than CABG, and receiving care at non-tertiary hospitals. Regional disparities and inadequate institutional infrastructure further contributed to reduced access to CR services. Although CR participation among patients with AMI in Korea has gradually increased, treatment continuity remains suboptimal. 
To enhance CR utilization, policy efforts should prioritize reducing patient burden, addressing provider-level barriers, and promoting equitable access through financial support mechanisms and infrastructure expansion.
Acute ischemic stroke (AIS) triggers a complex systemic immune-inflammatory response. While proprotein convertase subtilisin/kexin type 9 (PCSK9) inhibitors provide significant lipid-lowering and pleiotropic anti-inflammatory effects, their impact on early peripheral inflammatory and immune responses in AIS patients due to large artery atherosclerosis (LAA) remains underexplored. A total of 72 AIS patients attributed to LAA were included in the final analysis of this prospective study (standard treatment group, n = 24; intensive treatment group, n = 48). Fasting blood samples were obtained at admission and at the 2-week follow-up. A comprehensive panel of laboratory parameters was assessed at baseline and at the 2-week follow-up, encompassing lipid profiles, a spectrum of inflammatory biomarkers (including but not limited to high-sensitivity C-reactive protein (hs-CRP), interleukin-6 (IL-6), and tumor necrosis factor-α (TNF-α)), and lymphocyte subsets. The intensive treatment group achieved a significantly greater reduction in low-density lipoprotein cholesterol (LDL-C) compared to the standard treatment group (P < 0.001). Notably, the intensive treatment group also showed significant reductions in the key pro-inflammatory cytokines IL-6 (P = 0.024) and TNF-α (P = 0.041). However, no significant between-group differences were observed in changes in peripheral blood lymphocyte subsets (P > 0.05). In AIS patients, early adjunctive PCSK9 inhibitor therapy provides superior lipid lowering and may modulate specific pro-inflammatory cytokines compared to statin monotherapy, without significantly altering peripheral lymphocyte subset distributions within 2 weeks. Further large-scale studies are warranted to validate its immunomodulatory role and long-term clinical outcomes. ClinicalTrials.gov NCT05410457. Registered May 24, 2022. https://www.clinicaltrials.gov/ct2/show/NCT05410457; ClinicalTrials.gov NCT05397405. Registered May 23, 2022.
https://www.clinicaltrials.gov/ct2/show/NCT05397405. Our study evaluated the early immunomodulatory effects of PCSK9 inhibitor add-on therapy versus statin monotherapy in acute ischemic stroke patients. The combination therapy achieved superior LDL-C reduction and significantly decreased key pro-inflammatory cytokines (IL-6 and TNF-α), while no significant impact on peripheral lymphocyte subset distributions was observed within 2 weeks.
Automatic segmentation of gliomas on amino acid PET is essential for quantitative tumor assessment, a pillar in monitoring gliomas under treatment. This study aimed to develop a deep learning model for the automated extraction of PET RANO criteria from [18F]FDOPA PET, with external validation. A total of 635 static [18F]FDOPA PET scans from three European centers were retrospectively included for glioma diagnosis, recurrence assessment, or treatment monitoring. The training cohort comprised 530 scans from Nancy Hospital, with external validation and test sets from Pitié-Salpêtrière Hospital (n = 66) and Turin Hospital (n = 39). Ground truth segmentations followed international guidelines. A 3D U-Net was trained to segment tumor and healthy brain volumes. Performance was evaluated using the Dice coefficient computed on the whole tumor volume. Quantitative agreement for the PET RANO 1.0 criteria parameters, tumor-to-background ratios (TBRmean, TBRmax) and metabolic tumor volume (MTV), was assessed at the lesion level. Tumor segmentation achieved a Dice of 0.925 [0.841; 0.970] in training, 0.885 [0.829; 0.925] in validation, and 0.851 [0.733; 0.911] in the test set. At the lesion level, agreement with expert quantification was high, with low bias and strong reliability for MTV (2.293 [-4.734; 9.321] mL), TBRmax (0.056 [-0.189; 0.301]), and TBRmean (-0.139 [-0.424; 0.146]), and intraclass correlation coefficients above 0.93. Measurable lesions were correctly identified in more than 97% of cases. Our [18F]FDOPA PET deep learning model (available at https://github.com/IADI-Nancy/FDOPA-PET-GliomaSeg) demonstrates robust multicenter performance and enables fully automated, reproducible quantification, supporting broader clinical adoption of amino-acid PET in neuro-oncology. This study presents an automated method to measure glioma burden on brain imaging routinely used in clinical practice.
Using a large multicenter dataset, we trained and validated a deep learning model that reliably identifies tumor tissue and produces quantitative measurements that closely match expert assessments. These measurements are central to evaluating treatment response and guiding clinical decision making. By reducing the variability and workload associated with contouring, this approach has the potential to streamline follow-up, improve consistency across centers, and support more robust response assessment in both routine care and clinical trials for patients with glioma.
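As a concrete illustration of the evaluation metrics named in the abstract above, the Dice overlap and tumor-to-background ratios reduce to simple voxel-level computations. This is a minimal sketch on hypothetical flattened masks and uptake values, not the study's pipeline.

```python
def dice(mask_a, mask_b):
    """Dice overlap between two flat binary masks (sequences of 0/1)."""
    inter = sum(a and b for a, b in zip(mask_a, mask_b))
    total = sum(mask_a) + sum(mask_b)
    # Convention: two empty masks count as perfect agreement
    return 2 * inter / total if total else 1.0

def tbr(tumor_uptake, background_uptake, mode="max"):
    """Tumor-to-background ratio: TBRmax or TBRmean over voxel uptake values,
    normalized by mean uptake in a healthy background region."""
    bg = sum(background_uptake) / len(background_uptake)
    top = max(tumor_uptake) if mode == "max" else sum(tumor_uptake) / len(tumor_uptake)
    return top / bg
```

In practice these are computed over 3D voxel arrays, and the metabolic tumor volume (MTV) is simply the number of tumor voxels multiplied by the voxel volume.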
Renal denervation is an innovative method of treating hypertension that has been used in clinical practice for about a decade. The pathophysiological basis of this method derives from the role of the efferent and afferent sympathetic fibers entering and leaving the kidneys in the development of hypertension, especially treatment-resistant hypertension. Initial clinical trials suggested high efficacy of denervation in the treatment of patients with resistant hypertension. However, the results of the Symplicity HTN-3 trial, which introduced a sham procedure as a control, undermined these hopes. This trial was criticized in terms of its methodology. Recent research, using latest-generation catheters and more stringent protocols, confirms a significant reduction in blood pressure in patients after the procedure. Renal denervation should be considered in patients with hypertension resistant to pharmacological treatment (after excluding hormonal causes and ischemic etiology). Pseudoresistance, e.g., caused by so-called white-coat hypertension, should be ruled out. Every candidate should undergo 24-hour ambulatory blood pressure monitoring. Poor cooperation between patient and physician, non-systematic intake of antihypertensive medications, and non-compliance with other therapeutic recommendations should also be excluded. However, the success of renal denervation is conditional on precise determination of patient eligibility and performance of the procedure in specialized centers with appropriate experience.
Adolescents with overweight/obesity have an elevated risk of mental health and behavioral difficulties. Exercise has been shown to impart psychological benefits to these individuals; however, evidence on whether the effects of moderate-to-vigorous physical activity (MVPA) differ between weekday and weekend delivery is limited. Therefore, this study aimed to examine the effects of weekday and weekend MVPA at age 14 on internalizing and externalizing problems at age 17 among adolescents with overweight/obesity. We analyzed data from two assessment waves of the UK Millennium Cohort Study (MCS): MCS6 (2015-2016; age ∼14) and MCS7 (2018-2019; age ∼17). Data were restricted to adolescents classified as having overweight/obesity at MCS6 using the UK90 thresholds. Weekday and weekend MVPA were measured at age 14 using the wrist-worn GENEActiv accelerometer, covering one pre-specified weekday and one weekend day. Outcomes at age 17 were parent-reported Strengths and Difficulties Questionnaire (SDQ) internalizing (emotional + peer problems) and externalizing (conduct + hyperactivity/inattention) composites. We estimated average treatment effects (ATEs) and conditional average treatment effects (CATEs, heterogeneous effects) using a causal forest framework (EconML), adjusting for pre-exposure covariates (age, sex, body mass index, ethnicity, cognitive decision-making, household income, parental education, and parental mental health). Missing data were imputed via K-Nearest Neighbors. The analytic sample included 1,238 adolescents (mean age 14.25 years). Mean MVPA was higher on weekdays than weekends (135.74 ± 62.08 vs 113.80 ± 64.37 min/day). Covariate-adjusted average treatment effects (ATEs; per 1 min/day MVPA) were small and not statistically significant for internalizing or externalizing problems.
Weekday MVPA ATEs were -0.0025 (95% CI -0.0062 to 0.0012) for internalizing and 0.0003 (-0.0027 to 0.0033) for externalizing; weekend MVPA ATEs were -0.0008 (-0.0051 to 0.0035) and 0.0005 (-0.0026 to 0.0037), respectively. Heterogeneity was evident only for weekday MVPA effects on internalizing (22.98% with significant individual effects), with height as the strongest moderator (β = 0.0002; p < 0.001; R² = 0.519) and more negative CATEs among shorter adolescents (Q1 -0.004994 vs Q4 -0.001903; ANOVA F = 106.741, p < 0.001). In adolescents with overweight/obesity, estimated average effects of weekday and weekend MVPA at the age of 14 on parent-reported internalizing and externalizing problems at age 17 were close to zero under a selection-on-observables framework. Any potential benefits may be subgroup-specific and context-dependent; the observed weekday-specific heterogeneity warrants replication with more reliable exposure measurement and a richer set of contextual covariates.
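To put the per-minute effect sizes above in context, the reported ATE can be scaled to the observed weekday-weekend MVPA gap. This back-of-envelope calculation uses only numbers quoted in the abstract and rests on the (strong) assumption that the per-minute effect acts linearly over that range.

```python
# Reported weekday ATE on internalizing problems: SDQ points per 1 min/day MVPA
ate_per_min = -0.0025

# Observed mean weekday vs weekend MVPA (min/day), from the cohort above
mvpa_gap = 135.74 - 113.80   # ~21.94 min/day

# Implied shift in the internalizing composite if the gap acted linearly
# (an assumption; the CI for the ATE crosses zero, so this is illustrative)
implied_shift = ate_per_min * mvpa_gap   # ~ -0.055 SDQ points
```

Even under linearity, a shift of roughly 0.05 SDQ points is negligible relative to the scale of the composite, which is consistent with the authors' conclusion that average effects were close to zero.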
Multiple sclerosis (MS) is a chronic inflammatory autoimmune disease of the central nervous system. It is characterized by inflammation, areas of demyelination and axonal loss called plaques, recruitment of lymphocytes and monocytes, and bursts of focal blood-brain barrier leakage. Treatment strategies for MS focus on delaying disease progression and increasing patients' quality of life. However, most therapies have inconsistent efficacies and are associated with various side effects. Recently, long non-coding RNAs have been found to play a major role in the pathogenesis and development of several diseases, and several have been correlated with MS. We focus on the role of AFAP1-AS1 in regulating the function of M2 macrophages, one of the immune cell types believed to attenuate MS. Assessing this long non-coding RNA will improve our understanding of the molecular mechanisms of immune cells in MS. We observed the impact of AFAP1-AS1 silencing in M2 macrophages on essential effector and regulatory proteins, such as MMP9, CCL5, and CXCL10, in MS patients receiving different treatments (Fingolimod, Interferon beta-1a, Interferon beta-1b, Teriflunomide, or Dimethyl fumarate). Our results revealed an upstream regulatory effect of AFAP1-AS1 on MMP9, CCL5, and CXCL10 in differently treated patients. By measuring the levels of these proteins upon silencing of AFAP1-AS1, it was confirmed that this lncRNA has varying effects on their expression depending on the treatment the patient is undergoing. These data shed light on the potential role of manipulating the anti-inflammatory activity of M2 cells, making it a possible therapeutic target for certain MS patients.
Strongyloidiasis is a neglected soil-transmitted helminthiasis caused by Strongyloides stercoralis, affecting an estimated 300-600 million people worldwide. Due to its ability to cause autoinfection, the parasite may persist lifelong and lead to severe complications. While gastrointestinal symptoms are common, presentation as gastric outlet obstruction (GOO) is rare and can mimic malignancy. A 42-year-old male presented with several months of non-projectile vomiting, epigastric pain, indigestion, anorexia, and significant weight loss. Examination revealed dehydration, hypotension, and tachycardia. Laboratory tests showed mild microcytic anemia and mild hyponatremia, with normal eosinophil counts. Abdominal ultrasonography suggested gastric outlet obstruction. Upper gastrointestinal endoscopy demonstrated a circumferential ulcerated duodenal lesion with pyloric deformity, initially suspicious for malignancy. Histopathological examination of duodenal biopsies revealed S. stercoralis. After correction of hypovolemia and electrolyte imbalance, the patient was treated with ivermectin, resulting in complete resolution of symptoms. Strongyloidiasis has a broad clinical spectrum, ranging from asymptomatic infection to disseminated disease. Gastric outlet obstruction is an uncommon manifestation and poses diagnostic challenges, particularly in endemic regions. Endoscopy with histopathological confirmation is essential for diagnosis when clinical and laboratory findings are non-specific. Prompt treatment with ivermectin is highly effective and prevents serious complications. Strongyloidiasis should be considered in the differential diagnosis of gastric outlet obstruction, especially in endemic areas. Early diagnosis and timely antiparasitic treatment are crucial to reduce morbidity and prevent life-threatening complications.
Multimodal imaging has expanded treatment for patients with acute ischemic stroke with large vessel occlusion. Blood-brain barrier (BBB) disruption measured in the ischemic core is associated with hemorrhagic transformation. However, the associations between core BBB disruption (cBBBD) and baseline clinical/imaging variables, as well as 3-month outcome, have not been explored. This is a retrospective multicenter analysis of consecutive patients with anterior circulation acute ischemic stroke with large vessel occlusion, presenting over a 4-year period, who were transferred from a primary to a comprehensive stroke center for possible endovascular therapy, with magnetic resonance imaging that included perfusion-weighted imaging before transfer. Magnetic resonance imaging scans were processed using RAPID software to generate penumbral imaging variables. Perfusion-weighted images were processed to detect and quantify contrast leakage; cBBBD was calculated as the average of all leaky voxels in the ischemic core. Poor functional outcome was defined as a modified Rankin Scale score of >2 at 3 months. Linear regression was used except for the outcome, which used logistic regression, controlling for age, stroke severity, and baseline functional status. Of 411 patients transferred for endovascular therapy, 291 were included in this analysis, with a median age of 74 years; 49% were female patients. The median National Institutes of Health Stroke Scale score was 13, the mean core volume was 32.3 mL, and the mean cBBBD was 2.1%. Endovascular therapy was performed in 71% of patients. Admission National Institutes of Health Stroke Scale score (P<0.001) and glucose level (P=0.033) were independently correlated with cBBBD. All imaging variables correlated strongly with cBBBD (P<0.001). The strongest correlation was 0.50, observed between cBBBD and mismatch ratio (r2=0.254).
Increasing cBBBD was independently associated with poor functional outcome (adjusted odds ratio, 1.16 [CI, 1.03-1.32]; P=0.019; n=279), indicating that for every 1% increase in cBBBD, the odds of having a poor functional outcome increase by 16%. In acute ischemic stroke with large vessel occlusion, disruption of the BBB in the core lesion is independently associated with clinical outcome. cBBBD represents a new imaging profile for acute stroke that may help guide treatments.
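The adjusted odds ratio above is reported per 1% increase in cBBBD; under the log-linearity assumption built into logistic regression, the implied odds ratio for a larger increase is the per-unit ratio raised to that power. A minimal sketch (the value 1.16 comes from the abstract; the function name is ours):

```python
def scaled_odds_ratio(or_per_unit, delta_units):
    """Odds ratio implied for a delta_units increase in the predictor,
    assuming the log-odds are linear in the predictor (logistic regression)."""
    return or_per_unit ** delta_units

# Per the abstract: OR 1.16 per 1% cBBBD, i.e., 16% higher odds per 1% increase
per_one_percent = scaled_odds_ratio(1.16, 1)

# Implied OR for a 5% increase in cBBBD (illustrative; log-linearity assumed)
per_five_percent = scaled_odds_ratio(1.16, 5)
```

This is why a modest per-unit odds ratio can still correspond to a substantial difference across the observed range of an imaging biomarker.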
Chronic spine pain is linked to self-reported cognitive complaints; however, objective markers are lacking. The 90-s Continuous Visual Attention Test (CVAT) quantifies key attention subdomains: reaction time (RT, alertness), RT variability (VRT, sustained attention), omission errors (focused attention), and commission errors (impulsivity). This study aimed to identify which specific attentional subdomains, measured by the CVAT, are impaired in adults with chronic spine pain. This prospective case-control study included 84 adults with chronic lumbar/cervical pain (≥ 3 months) and 118 healthy controls. A MANCOVA tested group differences on CVAT variables, controlling for age and sex, followed by Bonferroni-corrected ANCOVAs. To isolate cognitive variability from general processing speed, the coefficient of variability (VRT/RT) was also analysed. Logistic regression assessed the predictive power of CVAT indices for pain status. The MANCOVA showed a significant group effect (Pillai's Trace = 0.46, F(4,195) = 41.43, p < 0.001). Patients exhibited impairments in all measures (p < 0.001; η2 = 0.093-0.44). The VRT deficit persisted when using VRT/RT. Logistic regression identified VRT as the strongest predictor of chronic spine pain (χ2(1) = 147.53, p < 0.001), correctly classifying 86.6% of participants. This finding remained when using VRT/RT (χ2(1) = 130.55, p < 0.001; 82.7% accuracy). Patients with chronic spine pain demonstrate attentional deficits, with sustained-attention instability (VRT and VRT/RT) as the most robust marker. The CVAT detects this impairment, offering a practical tool for clinical assessment to inform treatment and monitor cognitive function in pain management. Level of evidence: III (prospective case-control). A brief, 90-s computerized attention test provides an objective, clinic-ready screen for sustained-attention instability in spine pain patients.
Identifying cognitive vulnerability at the point of care can inform perioperative counselling, driving/work-safety guidance and rehabilitation planning, and it may help monitor treatment response alongside pain metrics, offering a noninvasive, nonpharmacologic complement to standard pain assessment.
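The CVAT indices described above reduce to simple summary statistics of the reaction-time series: mean RT, its standard deviation (VRT), and their ratio (the coefficient of variability used to separate variability from general processing speed). A minimal sketch with hypothetical reaction times in milliseconds:

```python
from statistics import mean, pstdev

def cvat_indices(reaction_times_ms):
    """Mean RT (alertness), RT variability (sustained attention), and the
    VRT/RT ratio (coefficient of variability) from a series of reaction times."""
    rt = mean(reaction_times_ms)
    vrt = pstdev(reaction_times_ms)   # population SD of the RT series
    return rt, vrt, vrt / rt

# Hypothetical 5-trial RT series (a real CVAT run yields many more trials)
rt, vrt, cv = cvat_indices([400, 420, 380, 410, 390])
```

Dividing VRT by RT makes the index dimensionless, so two patients with the same relative instability but different overall speed receive the same score.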
Background: Dietetic care plays a crucial role in the management of Gestational Diabetes Mellitus (GDM). Little is known about its utilization in the United Arab Emirates (UAE). Objective: This study assessed the prevalence, correlates, and patient experiences of the use of dietetic services among pregnant women with GDM. Design: A cross-sectional study was conducted in six hospitals (public and private) in the Emirates of Dubai, Sharjah, and Ajman, between May and November 2024. Methods: A total of 223 pregnant women, aged 18-45 years, in their third trimester with a confirmed GDM diagnosis, were recruited. Data were collected using a multicomponent questionnaire and medical record reviews. The main outcome in this study was visiting a dietitian following GDM diagnosis (yes/no). Descriptive, bivariate, and multiple logistic regression analyses identified factors associated with dietitian visits. Results: Of the 223 participants, 93 visited a dietitian (41.7%), of whom 92 (99%) were referred by a physician. Examining sociodemographic, lifestyle, and health-related factors showed that daily breakfast consumption (aOR=3.3, p=0.009), initiating exercise post-diagnosis (aOR=3.9, p=0.012), sleeping ≥8 hours/night (aOR=3.2, p=0.004), previous history of GDM (aOR=2.7, p=0.039), and family history of GDM (aOR=3.2, p=0.009) were associated with higher odds of visiting the dietitian. Non-Emirati nationality (aOR=0.3, p=0.004), GDM medication use (aOR=0.3, p=0.007), and perceiving health as good/excellent (aOR=0.3, p=0.030) were inversely associated with visiting the dietitian. Among participants who visited the dietitian, the majority were either very satisfied (16.1%) or satisfied (63.4%), with only 3% reporting dissatisfaction with their visit. Conclusion: A considerable proportion of women with GDM did not visit the dietitian, with physician referral being the most influential factor in dietitian visits.
Dietitian consultations are associated with healthier behaviors and high satisfaction, emphasizing their critical role in GDM care. Promoting access to dietetic care, including strengthening referral systems, is needed to improve pregnancy outcomes and reduce future disease burden in mothers and children. Gestational diabetes mellitus (GDM) is a common condition during pregnancy that can affect both mothers and babies. Diet and lifestyle changes are the main treatment, and dietitians are key in providing this support. Yet, little is known about how often women in the United Arab Emirates (UAE) actually use dietitian services. In this study, we surveyed 223 pregnant women with GDM across six hospitals in Dubai, Sharjah, and Ajman. We examined not only how many women visited a dietitian but also what factors encouraged or discouraged these visits, and how women felt about the care they received. We found that fewer than half of the women (42%) saw a dietitian, and nearly all of them did so only when referred by a doctor. Women who had healthier daily habits or a history of GDM were more likely to visit, while non-Emirati women, those on GDM medication, and those who believed they were already in good health were less likely to go. Importantly, most women who did attend reported being satisfied or very satisfied with their dietitian visits. This is the first study in the UAE to explore both the use of dietitian services and women’s experiences of care in GDM. Our findings highlight the need for stronger referral systems and better access to dietetic care. By addressing these gaps, healthcare providers can improve pregnancy outcomes and reduce future health risks for mothers and children.
Blood pressure (BP) fluctuates with an annual cycle, with BP levels generally lower in summer and higher in winter. In this narrative review with a quasi-systematic literature search, we aim to summarise current evidence from a wide variety of research fields on seasonal BP variability, and to present our findings to stimulate interest in this largely unexplored area. Environmental temperature appears to be the key driver of seasonal BP variation, while other external and internal factors mediate this variation; quantifying these effects is difficult because of the many factors involved and their interactions. Self-measured home BP provides the most suitable data for evaluating long-term seasonal BP changes. Nocturnal home BP reduction is lower in summer than in winter, leading to a higher prevalence of a riser BP pattern in summer; we need to pay attention to the various risk factors associated with higher BP across seasons. Temperature obviously affects BP; therefore, it seems reasonable to assume that seasonal BP changes contribute linearly to seasonal mortality variation. However, our patient database showed that more than 10% of patients have home BP that is higher in summer than in winter, and cardiovascular risk was significantly higher in patients with such inverse seasonal variation, as well as in those with unusually large seasonal variation in home BP (≥9.1 mm Hg systolic or ≥4.5 mm Hg diastolic), than in those with small- to middle-sized seasonal variation. The inverse seasonal BP pattern is not rare in patients under antihypertensive drug treatment, and its mechanisms warrant investigation. Although the importance of BP level as a dominant and modifiable risk factor should be emphasized, assessing seasonal BP variation through long-term BP measurement, particularly home BP monitoring, has clinical benefits and should be widely promoted.
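The patient stratification described in the review above can be sketched as a simple rule using the thresholds it quotes (≥9.1 mm Hg systolic or ≥4.5 mm Hg diastolic winter-summer difference for "large" variation, higher summer BP for "inverse"). How the systolic and diastolic criteria combine for the inverse pattern is our assumption for illustration, not a definition from the review.

```python
def seasonal_bp_pattern(winter_sbp, summer_sbp, winter_dbp, summer_dbp):
    """Classify winter-summer home-BP variation (mm Hg) using the thresholds
    quoted in the review. Treating a negative difference on either component
    as 'inverse' is an illustrative assumption."""
    d_sys = winter_sbp - summer_sbp
    d_dia = winter_dbp - summer_dbp
    if d_sys < 0 or d_dia < 0:
        return "inverse"            # home BP higher in summer than in winter
    if d_sys >= 9.1 or d_dia >= 4.5:
        return "large"              # unusually large seasonal swing
    return "small-to-middle"        # the reference group in the review
```

Per the review, both the "inverse" and "large" groups carried higher cardiovascular risk than the "small-to-middle" reference group.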
Preeclampsia is a pregnancy-specific hypertensive disorder, clinically defined by new-onset hypertension with proteinuria or other maternal organ dysfunction. It is a leading cause of maternal and perinatal mortality worldwide and remains an incurable condition, with delivery as the only definitive treatment. The disorder originates in placental dysfunction, characterized by inadequate trophoblast invasion and impaired spiral uterine artery remodeling, resulting in syncytiotrophoblast stress and the release of placenta-derived factors. Increased placental secretion of antiangiogenic factors, including soluble fms-like tyrosine kinase-1 and soluble endoglin, promotes angiogenic imbalance and drives widespread maternal endothelial dysfunction. This narrative review synthesizes findings from human studies and experimental research models used to investigate angiogenic dysregulation in preeclampsia, with particular focus on in vitro and in vivo models, and emerging 3-dimensional placental platforms. In vitro models, such as trophoblast organoids, placenta-on-a-chip devices, and 3-dimensional bioprinted constructs, recapitulate key aspects of the placental microenvironment and enable detailed study of aberrant angiogenesis and how to target it, while reducing reliance on limited primary tissue and improving experimental control. Complementing in vitro models, in vivo models are essential for linking placenta-derived angiogenic imbalance to maternal disease phenotypes, therapeutic responses, and fetal outcomes. This review provides a comprehensive assessment of current literature on models that recapitulate angiogenic imbalance in preeclampsia. We emphasize the importance of integrating bioengineering, cell biology, and clinical insights to improve our understanding of preeclampsia pathogenesis and to develop predictive tools and therapeutic strategies to manage angiogenic dysregulation in preeclampsia.
To assess current utilization of the Prosthodontic Diagnostic Index (PDI), identify perceived benefits and limitations, and evaluate support for future updates. A survey regarding the use of the PDI was made available through email invitations to 68 US dental school prosthodontic/restorative department chairs (PD), 48 graduate prosthodontic program directors (GP), and 1834 private practice prosthodontists (PP) from the American College of Prosthodontists (ACP) member database. The results of this initial survey were inconclusive due to low response rates. The survey was also administered during the 2024 ACP Annual Session with an improved response rate for educators. Descriptive statistics were used to analyze responses from predoctoral education programs and graduate prosthodontic programs. Responses were received from 43.8% (n = 21) of graduate prosthodontic programs (GP) and 35.3% (n = 24) of predoctoral programs (PD). The PDI was taught to graduate prosthodontic residents at 100% (n = 21) of the responding programs and at 58% (n = 14) of responding predoctoral programs. In contrast, the response rate for private practice prosthodontists, 2% (n = 43), was too low for statistical analysis. The PDI was used for new patient screening in 76.2% (n = 16) of GP and 41.7% (n = 10) of PD programs. The PDI was valued for enhancing diagnostic consistency (81.0%, n = 17 GP and 83.3%, n = 21 PD) and objective patient screening (90.5%, n = 19 GP and 87.5%, n = 21 PD). Common themes were observed in open-ended questions regarding the limitations of the PDI: the system was considered cumbersome, complicated, and time-consuming to use, had calibration issues across all cohorts, and lacked recognition by general dentists and other dental specialists. 
The majority of respondents agreed that the PDI needs an update (76.2%, n = 16 GP and 66.7%, n = 16 PD), including the development of an ACP-endorsed classification system for implant-based treatment (81.0%, n = 17 GP and 91.7%, n = 22 PD). The PDI is viewed as a beneficial diagnostic and educational tool in academic settings. However, it is complex and has limited alignment with contemporary prosthodontic practice. A revision of the classification system could address current limitations and better support clinical decision-making.
Poststroke fatigue (PSF) affects nearly half of all stroke survivors and significantly hinders rehabilitation and daily functioning. There is no established treatment. Low cardiorespiratory fitness may contribute to PSF, suggesting aerobic training as a potential intervention. In this 2-center, randomized, open-label, blinded end point trial, we evaluated a home-based, supervised cardiorespiratory interval training program (HS-CITP) in individuals with PSF (Swedish Fatigue Assessment Scale score ≥28) 1 to 7 months poststroke. Participants were randomized (1:1) to either HS-CITP or usual care with self-directed activity after early supported discharge. The intervention consisted of 35-minute cycling sessions performed 3 times per week at 70% to 80% of maximum heart rate for 8 weeks. The study was powered to detect a 9-point between-group difference on the Swedish Fatigue Assessment Scale. The primary outcome was self-reported fatigue (Swedish Fatigue Assessment Scale score) at 8 weeks (postintervention), and the secondary outcome was peak oxygen uptake (mL/kg per minute) at 8 weeks. Analyses were performed according to the intention-to-treat principle using adjusted between-group comparisons. Forty-five participants were randomized; the mean age was 64 years, and 56% were women. Forty-three participants completed the postintervention assessment (HS-CITP: n=22; control: n=21). Adherence to HS-CITP was 92%, and no adverse events were reported. In adjusted analyses, compared with the control group, HS-CITP significantly reduced fatigue (between-group mean difference -5.35 Swedish Fatigue Assessment Scale score points [95% CI, -9.03 to -3.67]; P<0.001) and improved cardiorespiratory fitness (+4.48 mL/kg per minute [95% CI, 3.41-5.54]; P<0.001). No significant group-by-sex interaction was observed. Supervised home-based interval training significantly reduced PSF and improved cardiorespiratory fitness, with good adherence and no safety concerns. 
These findings support integrating structured aerobic exercise into stroke rehabilitation. Larger, longer-term trials are needed to confirm durability, determine the optimal timing poststroke, and evaluate other exercise modalities. URL: https://www.clinicaltrials.gov; Unique identifier: NCT03458884.
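The training-intensity prescription in the trial above (cycling at 70% to 80% of maximum heart rate) reduces to simple arithmetic. The abstract does not state how maximum heart rate was determined; the age-based estimate (220 minus age) used in this sketch is a common approximation and purely an assumption for illustration.

```python
# Sketch of the 70-80% max-HR intensity arithmetic. The age-predicted
# maximum (220 - age) is an assumed estimator; the trial may have used
# measured maximum heart rate instead.

def target_hr_zone(age, lower=0.70, upper=0.80):
    """Return the (lower, upper) target heart-rate zone in beats/min."""
    hr_max = 220 - age  # assumed age-predicted maximum heart rate
    return round(hr_max * lower), round(hr_max * upper)

# For the trial's mean age of 64 years:
# target_hr_zone(64) -> (109, 125)
```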
Persistent spinal pain syndrome (PSPS) type II is a frequent condition in Pain Units, with limited treatment options such as epidural corticosteroids or epidurolysis. Epidural pulsed radiofrequency (PRF) using a catheter has been reported as more effective than transforaminal PRF. This study evaluates the efficacy of epidural PRF with or without corticosteroids in patients with PSPS type II. A randomized, controlled, multicentre trial included 131 patients. Participants were allocated to either a control group receiving epidural corticosteroids alone or an experimental group receiving epidural PRF plus corticosteroids. Outcomes were assessed with the Visual Analogue Scale (VAS), Oswestry Disability Index (ODI), DN4 questionnaire, and Patient Global Impression of Improvement (PGI-I) at 1, 2, 4, and 6 months. Of the 131 patients, 72 were assigned to PRF + steroids and 59 to control. The PRF + steroids group showed significantly greater improvement in VAS (p ≤ 0.002 across follow-ups), ODI (p ≤ 0.004), DN4 (p ≤ 0.015), and PGI-I (p ≤ 0.004) compared with control. Number needed to treat for a 2-3 point reduction in VAS ranged between 2.7 and 2.9. No major complications were observed, and adverse effects were minor and transient. Epidural PRF combined with corticosteroids provides superior pain relief, functional improvement, and patient satisfaction compared with corticosteroids alone in PSPS type II. This technique appears safe, with no significant complications, and represents a promising minimally invasive option for managing persistent post-surgical lumbar radiculopathy. This randomized controlled trial provides the first multicentre evidence that epidural pulsed radiofrequency combined with corticosteroids is more effective than corticosteroids alone for persistent spinal pain syndrome type II. 
The study shows sustained improvements in pain, disability, and patient-reported outcomes with a favourable safety profile, supporting epidural PRF as a valuable therapeutic option for complex post-surgical lumbar pain.
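The number-needed-to-treat values reported in the trial above (2.7 to 2.9 for a 2-3 point VAS reduction) follow from standard arithmetic: NNT is the reciprocal of the absolute risk reduction, i.e. the difference in responder proportions between groups. The group sizes below (72 and 59) are from the abstract, but the responder counts are hypothetical, chosen only to show how an absolute risk reduction of about 0.35 yields an NNT near 2.8.

```python
# Sketch of number-needed-to-treat (NNT) arithmetic. The responder counts
# used in the example comment are hypothetical; only the group sizes
# (72 PRF + steroids, 59 control) come from the abstract.

def nnt(responders_treat, n_treat, responders_ctrl, n_ctrl):
    """Number needed to treat = 1 / absolute risk reduction."""
    arr = responders_treat / n_treat - responders_ctrl / n_ctrl
    return 1.0 / arr

# Hypothetical example: 45/72 responders with PRF + steroids vs 16/59
# with steroids alone gives ARR of about 0.35 and NNT of about 2.8.
```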
Fine-scale heterogeneity in soil nutrient availability can enhance plant growth, yet its effect across soil depth remains inadequately understood. In communities with higher species diversity, different plant functional groups adopt varying growth strategies to occupy soil niches. However, the interactive effects of soil depth heterogeneity and species diversity on plant functional groups and community productivity remain poorly understood. This study aimed to examine whether plant functional groups respond differently to soil depth heterogeneity, whether such heterogeneity modifies interspecific relationships and community productivity, and whether higher species diversity alters plant growth strategies under vertical resource variation. A greenhouse experiment was conducted using twelve common herbaceous species with diverse functional traits and root strategies, grown under one homogeneous and two heterogeneous treatments that varied in patch size, crossed with two species diversity levels. Plant species differed in their responses to soil depth heterogeneity. Increased soil depth promoted the growth of legumes and forbs and enhanced community productivity. In high-diversity mixtures, forbs accumulated greater aboveground biomass in deeper heterogeneous soils, whereas grasses showed no significant response to soil depth or diversity. Legumes displayed a flexible growth pattern, reducing allocation to deeper layers, which may limit competitive pressure. Higher species diversity strengthened the selection effect in forbs, increasing their proportional contribution and intensifying interspecific competition under soil depth heterogeneity. This study provides experimental evidence for soil-plant interactions driving species coexistence and community productivity. Fine-scale heterogeneity may also function as a small-scale form of habitat fragmentation. 
Long-term field experiments and multi-dimensional niche analyses are needed to better understand community dynamics across gradients of biotic and abiotic factors.