Complement Factor B (CFB) is a component of the alternative complement pathway with emerging roles in cancer biology. This study investigated the role of CFB in lung cancer progression and characterized its downstream molecular and immunological effects. We performed transcriptomic analysis of CFB-silenced lung adenocarcinoma cells and validated findings in clinical specimens. CFB deficiency was assessed using in vitro functional assays, metabolite measurements, and T cell coculture systems. Immunomodulatory effects were evaluated in vivo using tumor models treated with an indoleamine 2,3-dioxygenase (IDO) inhibitor. CFB silencing activates proto-oncogene c-Fos (FOS)-dependent upregulation of tryptophan-metabolizing enzymes. CFB expression inversely correlates with FOS and tryptophan metabolic enzymes in human lung cancer specimens, with stronger alterations in advanced disease. Mechanistically, CFB deficiency enhances tryptophan catabolism to kynurenine, promoting regulatory T cell activation through aryl hydrocarbon receptor (AhR) signaling while suppressing CD8+ T cell function. In vivo, CFB-knockdown tumors show accelerated growth, enhanced immune suppression, and increased tryptophan catabolism. These effects are reversed by IDO inhibitor treatment. We identify a CFB-FOS-IDO axis that regulates tryptophan metabolism and shapes the tumor immune microenvironment in lung cancer. CFB deficiency promotes tumor progression by enhancing tryptophan catabolism, resulting in an immunosuppressive microenvironment. These findings reveal potential therapeutic targets to overcome immune evasion in lung cancer.

Lung cancer is a leading cause of death worldwide, partly because tumors can hide from the body’s immune system. We investigated how a protein called Complement Factor B (CFB), often reduced in lung cancer, contributes to this immune evasion.
Using laboratory experiments, patient tissues, and mouse models, we found that when CFB is lost, cancer cells activate another protein called FOS, which increases the breakdown of a nutrient called tryptophan. This produces a substance called kynurenine that suppresses cancer-fighting immune cells while boosting cells that protect the tumor. Treating mice with a drug blocking tryptophan breakdown reversed these effects and slowed tumor growth. These findings suggest that targeting this pathway could help restore immune responses in lung cancer patients, potentially improving treatment outcomes.
What is this summary about?

This summary describes an original article about a clinical study called HD21. The HD21 study compared two treatment combinations for people with advanced Hodgkin lymphoma. Advanced Hodgkin lymphoma is a type of cancer that starts in white blood cells and eventually spreads to other parts of the body.

In this study, each treatment combination contained multiple different medications. One of the treatment combinations, called eBEACOPP, is extremely effective in treating advanced Hodgkin lymphoma and is a current standard of care treatment. However, there are concerns around whether eBEACOPP causes a lot of serious side effects. The other treatment combination in this study is called BrECADD. BrECADD includes a medication called brentuximab vedotin, which is designed to specifically deliver a chemotherapy drug to Hodgkin lymphoma cancer cells and stop them from dividing. All the medications in the BrECADD treatment combination were selected to be at least as effective as the eBEACOPP treatment combination, but to cause fewer side effects.

The HD21 study included adults aged 18 to 60 years who had not yet received any treatment for their Hodgkin lymphoma. Patients who participated in the study were randomly assigned (by chance) to one of two treatment groups: the ‘BrECADD’ group (742 participants) or the ‘eBEACOPP’ group (740 participants). For participants in both groups, the medications in each treatment combination were given on specific days during a 21-day treatment cycle. Each participant had 4 or 6 treatment cycles, depending on how well the cancer was responding to treatment after the first 2 treatment cycles.

The researchers wanted to find out whether participants who received BrECADD tolerated treatment better than those who received eBEACOPP.
They also wanted to find out whether BrECADD was at least as good at treating advanced Hodgkin lymphoma as eBEACOPP.

What are the key takeaways?

Participants who received BrECADD (compared with those who received eBEACOPP) were:
More likely to tolerate treatment.
More likely to be cancer-free during follow-up.

What were the main conclusions reported by the researchers?

Based on these results, the authors of the original article recommended that BrECADD should be a new, preferred standard treatment option for people with advanced Hodgkin lymphoma who have not had any previous treatment.

Clinical trial number: 2020-000158-88.
Phylogenetic tree shapes capture fundamental signatures of evolution. We consider "ranked" tree shapes, which are equipped with a total order on the internal nodes compatible with the tree graph. Recent work has established an elegant bijection between ranked tree shapes and a class of integer matrices, called F-matrices, defined by simple inequalities. This formulation is for isochronous ranked tree shapes, where all leaves share the same sampling time, such as in the study of ancient human demography from present-day individuals. However, branch lengths of phylogenetic trees can represent units other than calendar time, such as evolutionary distance. A tree equipped with branch lengths quantifying evolutionary distance, called a rooted phylogram, is output by popular maximum-likelihood methods. These trees are broadly relevant, for example in studying the affinity maturation of B cells in the immune system. Discretizing time in a rooted phylogram gives a fully heterochronous ranked tree shape, where leaves are part of the total order. Here we extend the F-matrix framework to such fully heterochronous ranked tree shapes. We establish an explicit bijection between a class of F-matrices and the space of such tree shapes. The matrix representation has the key feature that the value at any entry is highly constrained by four previous entries, enabling straightforward enumeration of all valid tree shapes. We also use this framework to develop probabilistic models on ranked tree shapes. Our work extends understanding of combinatorial objects that have a rich history in the literature.
Due to their ability to kill closely related strains, phage tail-like bacteriocins, also called tailocins, play an important role in shaping bacterial communities. One such tailocin, called carotovoricin, is known to be present in the Pectobacterium genus. However, little is known about its evolutionary dynamics and the scope of its impact on species interactions in this genus. To investigate the diversity and evolution of carotovoricin, we performed a genus-wide, phylogenetically structured pangenome study. This analysis inferred that the gene cluster responsible for carotovoricin biosynthesis is conserved across the genus and is located in the same gene neighborhood in all the species. Within the carotovoricin cluster, the tail fiber genes, which determine host range specificity, exhibit high variability and discordance with the species phylogeny. We show evidence for an evolutionary mechanism involving recombination-mediated exchange of these tail fiber loci across the entire Pectobacterium genus, which complements the previously known mechanism of DNA sequence inversion for maintaining tailocin polymorphism at the population level. In addition, the ability to exchange tail fiber loci in a highly targeted and genus-wide manner could influence community dynamics in nutrient-rich environments such as infected plant tissues. In conclusion, the strong signal for carotovoricin retention and the ability to exchange tail fibers indicate that it significantly contributes to the community interactions of the Pectobacterium phytopathogens.
When in contact with microbes or other pathogens, plants develop an induced defense response. This reaction is triggered by pathogen-derived molecules that provoke the so-called microbe-associated molecular pattern (MAMP)-triggered immunity (MTI) or pathogen-associated molecular pattern (PAMP)-triggered immunity (PTI). Recognition of a MAMP or PAMP by a pattern recognition receptor (PRR) activates rapid downstream signaling, manifested in, e.g., a rise in the cytosolic Ca2+ concentration. As a consequence, defense-related genes are expressed and antimicrobial substances are produced. There is also evidence that Ca2+-induced responses show a refractory behavior in plant cells, as the reaction to an identical stimulus applied shortly after the first one is strongly suppressed, if it can be observed at all. Subsequent elicitations over a longer period of time, on the other hand, can trigger stronger Ca2+ responses, which lead to so-called "defense priming". Although refractory behavior has been documented in various plant cell types, its underlying function and causative mechanisms remain unclear. In this review article, we give an overview of the refractory machinery, including elicitors, receptors, typical Ca2+ responses, and signal transduction pathways. We shed light on possible explanatory scenarios and address open questions.
Diabetic retinopathy is a leading cause of visual loss. Hypothesis-generating data from cardiovascular outcome trials suggest that fenofibrate therapy may reduce the progression of diabetic retinopathy. To determine whether treatment with fenofibrate reduces the progression of diabetic retinopathy. We conducted a parallel-group, double-masked, placebo-controlled clinical trial of fenofibrate. A web-based algorithm allocated participants to treatment arms by minimisation. The trial was positioned within NHS Scotland's Diabetic Eye Screening Programme. Adults with diabetes and non-referable retinopathy or maculopathy (based on Diabetic Eye Screening retinal image grading) were eligible. Study treatment was mailed to participants' homes. Participants who were eligible at the screening assessment entered an active pre-randomisation run-in during which they took 145 mg fenofibrate. After randomisation, participants received 145 mg fenofibrate tablets or placebo. Study treatment was taken daily in those with normal renal function, or on alternate days in those with impaired renal function. The primary outcome was a composite of developing referable diabetic retinopathy or maculopathy, or requiring treatment for diabetic retinopathy or maculopathy. Incremental cost-effectiveness was assessed in terms of the primary outcome and per modelled quality-adjusted life-year gained. Data were obtained from 6-monthly interviews by research nurses and linkage to national healthcare data sets. Selected adverse events were adjudicated by study clinicians masked to treatment allocation. One thousand four hundred and eighty-four participants entered the pre-randomisation run-in, of whom 1151 were randomised. The primary outcome occurred in 131 (22.7%) of 576 participants assigned fenofibrate and 168 (29.2%) of 575 participants assigned placebo (hazard ratio 0.73; 95% confidence interval 0.58 to 0.91; p = 0.006) over a median of 4.0 years. 
Any progression of retinopathy or maculopathy, and development of macular oedema were also reduced. There was no effect on visual function, quality of life, or visual acuity. Fenofibrate use resulted in a non-significant reduction in 6-monthly health service costs (mean difference -£101, 95% confidence interval -£243 to £42), leading to dominance over standard care and a high probability of cost-effectiveness. Based on modelling (assuming no difference in background healthcare costs by treatment allocation), fenofibrate led to a small increase (£6) in cost for a small gain (0.02) in quality-adjusted life-years; incremental cost-effectiveness ratio £406 per quality-adjusted life-year gained. The probability of cost-effectiveness was 79-86% at thresholds of £20,000-30,000 per quality-adjusted life-year gained. Early Treatment Diabetic Retinopathy Study retinopathy grading is considered the gold standard, but it is not used in large-scale retinal screening programmes; Diabetic Eye Screening grading is based on Early Treatment Diabetic Retinopathy Study but is less granular. Fenofibrate was clinically effective and cost-effective for reducing the progression of diabetic retinopathy compared with placebo among participants with early retinal changes. LENS participants will be followed for 10 years to assess the long-term effects of fenofibrate therapy. This synopsis presents independent research funded by the National Institute for Health and Care Research (NIHR) Health Technology Assessment programme as award number 14/49/84. Diabetes can affect the inner layer at the back of the eye, a condition called diabetic retinopathy. Many people need to see a National Health Service eye specialist or need treatment for diabetic eye disease. Each year, diabetic retinopathy leads to 1500 people being certified as blind in the United Kingdom. This makes it a leading cause of blindness in working age adults. Fenofibrate is a drug that is sometimes used to lower cholesterol. 
Two studies from the 2000s suggested that fenofibrate may lower the risk of diabetic eye disease getting worse. However, those results were not convincing enough to change how doctors treat their patients. We ran the Lowering Events in Non-proliferative retinopathy in Scotland study to find out if fenofibrate may be useful for treating people with diabetic eye disease. Lowering Events in Non-proliferative retinopathy in Scotland was a large clinical trial. We studied 1151 adults with early diabetic eye disease from across Scotland. Participants came to a research clinic at the start to check if they were eligible. Study treatment was sent to people’s homes by post. Half the people in the study took fenofibrate tablets. The other half took placebo (i.e. dummy) tablets. Nobody knew which treatment they were getting. Research nurses phoned them every 6 months over the next 4 years. The study team used information from these calls and from National Health Service records to find out what happened to participants. We found that the people taking fenofibrate had a lower chance of their diabetic eye disease getting worse compared to those taking placebo. Fewer people taking fenofibrate needed to see a National Health Service specialist, have treatment for eye disease, or develop swelling at the back of the eyes (called macular oedema) compared to placebo. We now have better evidence about the positive effect of fenofibrate in patients with early diabetic eye disease, and the potential for savings to the National Health Service.
Falls among older adults remain a critical patient safety concern globally. Conventional walkers frequently fail to accommodate end-user needs, particularly in low-to-middle-income country (LMIC) settings. Understanding stakeholder design requirements is essential for developing contextually appropriate, technology-enhanced assistive devices. To explore multidisciplinary stakeholder perspectives on the design requirements for a sensor-integrated smart walker for fall prevention among older adults in Indonesia. A qualitative descriptive approach was employed within the empathize and define phases of a design thinking framework. Purposive sampling recruited 13 participants across three stakeholder groups: older adult walker users (n=5), informal caregivers (n=5), and healthcare professionals (n=3). Semi-structured interviews were conducted between October and December 2025 in Bandung, West Java. Data were analyzed using Braun and Clarke's reflexive thematic analysis. Cognitive screening used a standardized, licenced assessment tool. Twenty themes were identified across stakeholder groups: seven from older adults, eight from caregivers, and five from healthcare professionals. Convergent needs included a pre-impact fall warning system, design simplicity, structural durability, and affordability. Culturally specific findings included the Sundanese preference for human-accompanied walking (papah), rejection of wheeled designs, and narrow-bathroom (jamban) navigation challenges. Healthcare professionals emphasized BPJS affordability constraints and regulatory integration. Nine prioritized design requirements were derived from cross-stakeholder synthesis. These findings establish an empirically grounded, culturally responsive design foundation for the TEMAN JALAN smart walker, demonstrating that effective assistive technology for LMIC settings requires ground-up needs assessment rather than adaptation of high-income-country prototypes.
The identified requirements will directly inform the subsequent ideation, prototyping, and clinical testing phases of this design thinking project, with implications for allied health-led innovation in fall prevention globally. Falls are among the most dangerous and common accidents experienced by older adults. Walking frames (walkers) help prevent falls, but the standard designs currently available in Indonesia are often too bulky for small bathrooms, lack any form of warning system, and do not consider local living conditions. This study asked older adults who use walkers daily, their caregivers, nurses, and a doctor specializing in elderly care about what they need from a better walker. Interviews took place at a residential care home and a hospital clinic in Bandung, Indonesia. Older adults said they feel safer with a walker but struggle in tight bathroom spaces and wish for an alarm that warns them before they fall. Some preferred being physically guided by another person — a cultural practice called papah in the Sundanese community. Caregivers reported that walkers reduced their physical workload but emphasized that proper training was essential during the first days of use. Nurses and the doctor stressed that most patients rely on government health insurance (BPJS) and cannot afford expensive devices, and that any new technology must fit into hospital procedures. These perspectives are now being used to design a smart walker called TEMAN JALAN that uses motion sensors to detect when someone is about to fall and sends alerts to caregivers.
The Veterans Health Administration (VHA) and other health care systems have been moving to patient-centered health care models with a focus on individual goal setting and patient engagement. In VHA, this model of care is called Whole Health, implementation of which began in 2011. To examine whether Whole Health utilization is associated with subsequent improvements in clinical quality measures of chronic disease management and preventive health care services. Controlled pre/post quality improvement evaluation accounting for secular trends. Veterans at 125 VHA medical centers who utilized care between 10/01/2022 and 03/31/2023. Interventions: Veterans with Whole Health utilization that included Whole Health educational classes, coaching activities, and Whole Health clinical care conversations with providers were assessed alongside reference patients who used general VHA care during the same period. Main measures: Change in clinical quality measures from one month before using Whole Health care to 6 months after. The population included 548,968 Whole Health users and 5,428,413 general VHA users. All nine clinical quality measures improved following Whole Health utilization and continued to improve over 6 months. The proportions of Whole Health cohorts with quality improvements were greater than those of the reference cohorts over the same period. For example, among patients with diabetes, 80.1% of the Whole Health group had good HbA1c control (<8%) at 6 months compared to 74.9% before using Whole Health, an increase of 5.2% (95% CI 4.8-5.6%). The adjusted increase, after accounting for demographic and clinical characteristics as well as secular trends in the reference cohort, was 4.4% (95% CI 4.1-4.7%).
These exploratory findings suggest that Whole Health utilization was associated with subsequent improvements in health care quality, indicating that this patient-centered care model can potentially augment primary care, with a ripple effect improving chronic disease management and preventive care adherence through increased patient activation and engagement.
Significant advances have recently been achieved in the field of hydroxyapatite nanoparticles, contributing to their implementation in numerous biomedical applications, including bone tissue reconstruction. Nevertheless, the influence of these materials on the immune system remains an unresolved issue. Here, we reveal a direct in vitro impact of nano-hydroxyapatite particles on immunological cellular and humoral components. Further, we analyzed the effect of nanoparticle size and shape on healthy human immune cells residing in the peripheral blood. Special attention was given to particles produced using the microwave hydrothermal synthesis method with precisely controlled size. Viability, phenotypes, activation status, and cytokine production and release were assessed using flow cytometry and immunoenzymatic techniques. We showed that large-sized nanoparticles caused significant induction of immune responses associated with innate and acquired immunity, in a size-dependent manner. The synthesized nanoparticles avoided immunogenicity, with no changes in inflammatory cytokines, supporting high biocompatibility. These varied effects suggest different potential uses of the examined nanoparticles, depending on whether proinflammatory or neutral conditions are required, as in anticancer therapy or transplantology, respectively. The obtained results indicate that precise selection of nano-hydroxyapatites with specific immunomodulatory properties might be crucial for application in the clinical setting. Subsequent studies should establish the most suitable therapeutic approaches in which selected nano-hydroxyapatites could be implemented.

Nanoparticles are tiny materials with unique properties used in many medical applications, such as bone repair or cancer treatment. One type, called nano-hydroxyapatite (nHAP), is especially promising due to its similarity to human bone.
However, we still do not fully understand how these particles affect the immune system, our natural defense network. In this study, we tested several types of nHAPs to see how they interact with healthy human immune cells. We found that some particles, especially larger ones, can harm immune cells at high concentrations. In contrast, smaller particles had no harmful effects and seemed to be well tolerated. We also discovered that certain nHAPs changed how immune cells behave. For example, Sigma particles caused monocytes, cells that participate in both inflammatory and non-inflammatory immune responses, to shift toward a more inflammatory state. They also increased activity in helper T cells (acquired immunity cells) and led to the production of both protective and inflammatory signaling proteins. These findings suggest that not all nHAPs are equal. Choosing the right type and size of particle is essential depending on the intended medical use. Some nHAPs may help fight cancer by boosting the immune response, while others could be better suited for treatments requiring low immune activation, such as wound healing or organ transplants. Our research highlights the need for careful design of nanoparticle-based therapies to ensure safety and effectiveness in future clinical applications.
The growth of Internet of Things (IoT) applications is driving demand for Low-Power Wide-Area Networks (LPWANs) that support higher data rates at the same energy efficiency. While Long Range (LoRa) provides excellent noise immunity and receiver sensitivity, its data rate can be insufficient for some applications, including real-time applications, because LoRa must keep transmissions infrequent to maintain low power consumption. In this paper, a novel modulation, called the multi-slope chirp signal, is introduced to address these limitations; it represents data symbols by the slopes of narrowband chirps. At the receiver, a new blind non-coherent detection technique is presented to recover the proposed signal. Simulation results confirm that the proposed scheme can transmit 2 to 4 bits per symbol and, compared with LoRa SF 6, halves the Time-on-Air (ToA) while also improving spectral efficiency in the frequency domain.
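The slope-encoding idea can be illustrated with a toy simulation. The bandwidth, symbol duration, and slope alphabet below are illustrative assumptions, not the paper's parameters, and the detector shown is a simple non-coherent matched-slope correlator, not the blind technique the paper proposes:

```python
import numpy as np

FS = 125e3          # assumed sample rate (Hz), complex baseband
T_SYM = 1e-3        # assumed symbol duration (s)
# Hypothetical slope alphabet: 4 slopes -> 2 bits per symbol.
SLOPES = [10e6, 20e6, 30e6, 40e6]  # Hz per second

def chirp(slope):
    """Complex baseband linear chirp with instantaneous frequency slope*t."""
    t = np.arange(int(FS * T_SYM)) / FS
    return np.exp(2j * np.pi * 0.5 * slope * t**2)

def modulate(symbols):
    """Map each symbol index to a chirp of the corresponding slope."""
    return np.concatenate([chirp(SLOPES[s]) for s in symbols])

def detect(rx, n_sym, sps):
    """Non-coherent detection: correlate each symbol window against a
    reference chirp for every candidate slope; pick the largest magnitude."""
    refs = [chirp(k) for k in SLOPES]
    out = []
    for i in range(n_sym):
        seg = rx[i * sps:(i + 1) * sps]
        scores = [abs(np.vdot(r, seg)) for r in refs]
        out.append(int(np.argmax(scores)))
    return out
```

In this noiseless sketch the correlator recovers the transmitted slope sequence exactly; the quadratic phase mismatch between different slopes keeps the cross-correlations well below the matched peak.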
Post-market surveillance (PMS) under the European Union In Vitro Diagnostic Regulation (IVDR) demands proactive, literature-based evidence, but mature assays like QuantiFERON-TB Gold Plus (QFT-Plus) generate volumes of peer-reviewed and other literature that can strain manual workflows. We ran a comparative study of an AI-enabled literature-surveillance platform (jointly developed with Huma.ai, hereafter the Huma.ai Platform) versus manual search for QFT-Plus PMS. PubMed and PubMed Central were queried for publications in 2024; human studies published in English underwent duplicate screening and full-text appraisal. Outcomes were yield, precision, overlap/unique entries, and reviewer time. The Huma.ai Platform retrieved 673 records, with 661 relevant to screening (98.21% precision). Manual searching retrieved 111, with 106 relevant to screening (95.50% precision); there were 103 shared and three manual-only items (metadata gaps). The Huma.ai Platform contributed 561 unique papers, 5 of which were excluded after full-text appraisal. In total, 664 articles were evaluated; no new safety signals were identified. Screening time averaged ∼16 s per article with the Huma.ai Platform versus ∼60 s manually; full-text appraisal time (∼15 min per article) was similar. AI-assisted surveillance substantially increases coverage and reduces screening effort while maintaining high precision, supporting efficient, reproducible PMS for QFT-Plus.
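The yield and precision figures reduce to simple ratios over the reported counts; a back-of-envelope sketch (the time estimate is a rough assumption that the reported per-article screening averages apply to the 664 evaluated articles):

```python
def precision(relevant, retrieved):
    """Screening precision: fraction of retrieved records judged relevant."""
    return relevant / retrieved

ai_precision = precision(661, 673)      # AI-assisted platform, ~0.982
manual_precision = precision(106, 111)  # manual search, ~0.955

# Rough screening-time saving across the 664 evaluated articles,
# using the reported averages of ~16 s (AI-assisted) vs ~60 s (manual).
saved_hours = 664 * (60 - 16) / 3600    # roughly 8 hours of reviewer time
```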
Background/Objective: Patients with peripheral artery disease (PAD) are known to have poor awareness and understanding of the diagnosis. The role of generative AI chatbots in improving PAD patient education is unknown. Our goal is to compare a generative AI chatbot customized for PAD patient education to publicly available AI chatbots. Methods: This is a cross-sectional comparative evaluation of the responses of four AI chatbots to ten prompts that are commonly asked questions about PAD. The three publicly available AI chatbots were ChatGPT-5, Gemini 2.5 Flash, and Claude Sonnet 4.5. We created a customized, voice AI chatbot for PAD education grounded on curated and prompt-injected guidance called Vascular Education and Resources using Artificial Intelligence, or "VERA." De-identified chatbot-generated responses to inputs were assessed for readability (Flesch-Kincaid Grade Level, Flesch Reading Ease, Gunning Fog Index, Simple Measure of Gobbledygook Index, and Average Reading Level Consensus Score), accuracy, comprehensiveness, and patient education quality (Patient Education Materials Assessment Tool; PEMAT) using validated instruments and expert scoring rubrics. Nonparametric statistical testing was used to compare chatbot performance across all evaluation domains. Results: VERA generated the most accessible text, producing responses at a median grade level of 6.6, lower than all of the other chatbots. PAD expert-rated accuracy scores were high across all the chatbots without significant differences between them. Comprehensiveness scores were more varied and demonstrated that VERA was less comprehensive than the other chatbots. PEMAT understandability scores were uniformly high. PEMAT actionability scores were low overall but did not differ significantly across chatbots on post hoc analysis.
Conclusions: A generative AI chatbot research tool customized for PAD patient education generates textual information about PAD that is more accessible (mean grade level 6.6) than publicly available AI chatbots without loss of accuracy, albeit with modestly reduced comprehensiveness that reflects intentional simplification for patient-centered communication. Future research will assess the acceptability and feasibility of this research tool to be adopted as part of PAD patient education.
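Several of the readability indices used here are closed-form functions of word, sentence, and syllable counts. As a generic illustration (not the study's scoring pipeline), the Flesch-Kincaid Grade Level is computed as:

```python
def fk_grade(words, sentences, syllables):
    """Flesch-Kincaid Grade Level from raw counts:
    0.39 * (words/sentences) + 11.8 * (syllables/words) - 15.59.
    Lower values indicate text readable at a lower school grade."""
    return 0.39 * (words / sentences) + 11.8 * (syllables / words) - 15.59

# Example: 100 words in 10 sentences with 150 syllables
# -> 0.39*10 + 11.8*1.5 - 15.59 = 6.01, i.e. about a 6th-grade level,
# comparable to the median grade level reported for VERA above.
```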
Indoor environments often contain areas with sparse structural features, such as long corridors, large atriums, and glass curtain walls. These conditions can hinder loop closure detection and accumulate positioning errors, resulting in localization drift or even mapping failure during map construction. This paper proposes an indoor mapping algorithm called IS-LEGO-LOAM that integrates tightly coupled LiDAR-IMU fusion with Scan Context. A tightly coupled LiDAR-IMU odometry is constructed, and an adaptive covariance matrix is designed to address abnormal LiDAR echoes and insufficient feature extraction caused by sparse indoor feature points. By introducing the Scan Context global descriptor and adopting vector nearest-neighbor search with similarity-score matching, drift in large-scale scenes is alleviated. Finally, the method is validated on the KITTI dataset and in real-world scenarios. Experiments show that the improved IS-LEGO-LOAM achieves superior mapping performance.
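The abstract does not detail the similarity-score matching step; as a rough sketch of how Scan Context descriptor comparison is commonly formulated (the matrix sizes and data here are made up), column-wise cosine similarity is maximized over circular column shifts, which makes the score invariant to the sensor's yaw at revisit:

```python
import numpy as np

def sc_similarity(d1, d2):
    """Similarity between two Scan Context descriptors (rings x sectors):
    mean column-wise cosine similarity, maximized over circular column
    shifts to gain rotation (yaw) invariance."""
    best = -1.0
    for shift in range(d2.shape[1]):
        shifted = np.roll(d2, shift, axis=1)          # rotate by one sector
        num = np.sum(d1 * shifted, axis=0)            # per-column dot products
        den = np.linalg.norm(d1, axis=0) * np.linalg.norm(shifted, axis=0)
        valid = den > 0                               # skip empty columns
        if valid.any():
            best = max(best, float(np.mean(num[valid] / den[valid])))
    return best
```

A loop-closure candidate would be accepted when this score exceeds a threshold; the "vector nearest neighbor search" mentioned above typically prunes candidates first using a compact rotation-invariant key so the shift search runs only on a shortlist.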
We aimed to investigate the relationship between peripheral iron stores and symptom severity in drug-naive patients with idiopathic Restless Legs Syndrome (RLS) whose serum ferritin levels were within the laboratory reference range. Additionally, we sought to determine the independent effects of age, sex, comorbidities, and biochemical parameters in predicting RLS severity. This cross-sectional study was conducted at the neurology outpatient clinic of a single university hospital. A total of 113 drug-naive patients with idiopathic RLS were prospectively enrolled during clinical visits. Patients with ferritin levels outside the laboratory reference range (22-322 μg/L) or with major confounding comorbidities (e.g., peripheral neuropathy, renal failure, pregnancy) were excluded to ensure sample homogeneity. Demographic data and comorbidities were recorded through face-to-face interviews. Disease severity was assessed by a neurologist using the International Restless Legs Syndrome Rating Scale. Ferritin, magnesium, vitamin B12, vitamin D, folate, and hemoglobin were analyzed in fasting venous blood samples. Multiple linear regression analysis was used to identify independent predictors of symptom severity. The mean age of the participants was 52 ± 13.8 years, and their median ferritin level was 51.4 μg/L. Multiple regression analysis identified serum ferritin level (p < 0.001) and age (p = 0.001) as independent and significant predictors of RLS severity. No independent effects of sex, hemoglobin, vitamin D, or other comorbidities on severity were detected. Even when serum ferritin levels are within the so-called "normal" limits, a very strong negative correlation exists between ferritin level and symptom severity. Our findings highlight the importance of evaluating ferritin not merely as a threshold value in RLS management but as a continuous parameter that can be associated with symptom severity even within normal ranges.
Brain blood perfusion is typically characterized through regional cerebral blood flow metrics. However, recent research demonstrates that the distribution of blood transit times through the capillary network also plays a significant role in physiological processes. Here, we introduce a novel kinetic model, called the Outflow model, to quantify this distribution and calculate capillary transit time heterogeneity. The model is tailored to account for transport across the blood-brain barrier (BBB) and tissue binding, and is applicable to both Magnetic Resonance Imaging (MRI) and Positron Emission Tomography (PET) scenarios. The model was employed across three methodologies: (1) contrast-enhanced T1-weighted Magnetic Resonance Imaging; (2) dynamic long axial field-of-view Positron Emission Tomography imaging using the radiotracer O-(2-[18F]fluoroethyl)-L-tyrosine; and (3) the same PET imaging using 2-deoxy-2-[18F]fluoroglucose. The temporal resolution was optimized (1-2 s) to capture the bolus passages. Tissue curves were derived from the putamen, thalamus, frontal white matter, and regions of malignancy for comparative analysis. For the PET data, the Outflow model was compared with a conventional 3-compartment model for validation. For Magnetic Resonance Imaging, the capillary transit time heterogeneity ranged from 0.5 to 1 s, the extraction fraction ranged from 0.03 to 0.07%, and the unidirectional influx constant ranged from 0.01 to 0.2 ml/min/100 ml. For O-(2-[18F]fluoroethyl)-L-tyrosine, the capillary transit time heterogeneity ranged from 1.5 to 3 s, with an extraction fraction of 5.4% to 7.8% and a unidirectional influx constant ranging from 1.6 to 3.2 ml/min/100 ml, aligning closely with the anticipated parameters. For 2-deoxy-2-[18F]fluoroglucose obtained from the basal ganglia, the capillary transit time heterogeneity was around 2 s, the extraction fraction was 40.2%, and the cerebral metabolic rate of glucose consumption was 44 µmol/100 ml/min.
Both the Outflow model and the 3-compartment model gave excellent fits to the data, and for the comparable metrics the two models produced results in reasonable agreement. This proof-of-concept study demonstrated that calculating capillary transit time heterogeneity in the brain via the proposed new tracer kinetic model is feasible for both MRI and PET data. The model adeptly addresses scenarios where tracers or contrast agents undergo bidirectional transport across the blood–brain barrier.
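As a rough illustration of the kind of quantity estimated above, capillary transit times are often summarized by their mean (MTT) and standard deviation (the capillary transit time heterogeneity, CTTH). The sketch below assumes a gamma-shaped transit time distribution and computes the residue function that links these two parameters to a measurable tissue curve; the gamma assumption and all function names are illustrative choices for this sketch, not necessarily the formulation of the Outflow model itself.

```python
import numpy as np
from math import gamma as gamma_fn

def gamma_transport(t, mtt, ctth):
    """PDF h(t) of capillary transit times with mean MTT and SD CTTH,
    assuming (for illustration) a gamma-distributed transit time."""
    alpha = (mtt / ctth) ** 2          # shape: mean^2 / variance
    beta = ctth ** 2 / mtt             # scale: variance / mean
    return t ** (alpha - 1) * np.exp(-t / beta) / (gamma_fn(alpha) * beta ** alpha)

def residue_function(t, mtt, ctth, dt=0.01):
    """R(t) = 1 - integral_0^t h(s) ds: the fraction of tracer still
    inside the capillary bed at time t (midpoint-rule quadrature)."""
    s = np.arange(dt / 2, t, dt)
    return 1.0 - np.sum(gamma_transport(s, mtt, ctth)) * dt

# Example values within the MRI range reported above: MTT 1.4 s, CTTH 0.5 s
r_early = residue_function(0.01, 1.4, 0.5)   # close to 1 at early times
r_late = residue_function(5.0, 1.4, 0.5)     # decays as tracer washes out
```

A tissue concentration curve would then follow by convolving an arterial input function with this residue function, which is how MTT and CTTH become identifiable from dynamic imaging data.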
Numerous cellular functions, such as apoptosis, proliferation, differentiation, survival, transformation and cell migration, are regulated by a crucial transcription factor called activator protein 1 (AP-1). Growing evidence indicates that AP-1 is involved in severe conditions like fibrosis, cancer, and organ damage, as well as inflammatory diseases, including rheumatoid arthritis, psoriasis, and asthma. In recent years, AP-1 has become a significant focus in drug research. The activation of AP-1 by TNF is crucial for essential components of the inflammatory reaction, including the expression of tissue remodelling proteases such as collagenase, as well as pro-inflammatory cell adhesion molecules like E-selectin. This transcription factor is formed by the assembly of Jun-Jun homodimers, Jun-Fos heterodimers, and Jun-ATF (Activating Transcription Factor) heterodimers. As a member of the basic leucine zipper (bZIP) class, AP-1 regulates target genes by binding to their promoters in a sequence-specific way. New research suggests that reducing AP-1 function could improve various disease outcomes and treatments. Transfection of decoy oligonucleotides (ODNs) offers an innovative approach to gene therapy by targeting specific gene regulatory elements. Transcription factor decoys mimic the sites where transcription factors bind, competing with promoter regions within the cell nucleus. These molecules can regulate interactions between DNA sequences and transcription factors, which play a role in altering gene activation during both normal and disease-related cellular processes. This review aims to summarize the effects of AP-1-targeted decoys on various conditions. The reviewed studies demonstrate the promise of AP-1 decoys as therapeutics for various diseases, especially cardiovascular diseases and cancers. Moreover, it was shown that modifications of AP-1 decoys (circular and hairpin structures, as well as phosphorothioate backbones) made them more stable and effective.
Physical inactivity and suboptimal diet in pregnancy are important modifiable risk factors for gestational diabetes, a major contributor to pregnancy complications. We aimed to assess the effects of physical activity and/or diet-based lifestyle interventions during pregnancy on gestational diabetes, and whether these vary by maternal (body mass index, age, parity, ethnicity, education) and intervention characteristics, using individual participant data meta-analysis of randomised trials and a cost-effectiveness analysis. The International Weight Management in Pregnancy Collaborative Network database was updated by searching major databases from February 2017 to March 2022. The main outcomes were gestational diabetes by any criteria and by the National Institute for Health and Care Excellence criteria. Other outcomes were gestational diabetes as per International Association of Diabetes in Pregnancy Study Group criteria and maternal and perinatal outcomes. We performed a two-stage random-effects individual participant data meta-analysis to obtain summary estimates (odds ratios) with 95% confidence intervals. Study quality of included trials was assessed, and heterogeneity was summarised using τ2. Where possible, we added the aggregate data from non-individual participant data trials to the meta-analysis. We ranked interventions by effectiveness using network meta-analysis and undertook model-based economic evaluation to assess cost-effectiveness. The cost-effectiveness analysis took an NHS cost perspective and compared an overall lifestyle intervention versus usual care, with a time horizon covering the beginning of pregnancy until the discharge of the mother and infant from the hospital following delivery. Ninety-two trials (32,284 women) were included; 54 (23,698 women) provided individual participant data.
Lifestyle interventions reduced the odds of gestational diabetes (any criteria) by 10% in individual participant data trials (odds ratio 0.90, 95% confidence interval 0.80 to 1.02, 54 studies, 23,361 women), and the findings reached statistical significance when non-individual participant data were included (odds ratio 0.81, 95% confidence interval 0.73 to 0.89, 92 studies, 31,947 women). Physical activity significantly reduced the odds of gestational diabetes by 36% (odds ratio 0.64; 95% confidence interval 0.48 to 0.84) and diet by 19% (odds ratio 0.81; 0.69 to 0.96), but mixed interventions did not. Women with middle (odds ratio 0.68, 95% confidence interval 0.51 to 0.90) and high educational level (odds ratio 0.71, 95% confidence interval 0.54 to 0.93) benefited more than those with low educational status, with no differences by maternal body mass index, age, parity or ethnicity. There was no significant reduction in gestational diabetes defined by National Institute for Health and Care Excellence criteria (odds ratio 0.98, 95% confidence interval 0.84 to 1.13) in individual participant data trials. For gestational diabetes defined using International Association of Diabetes in Pregnancy Study Group criteria, interventions reduced gestational diabetes by 14% (odds ratio 0.86, 95% confidence interval 0.75 to 0.97, τ2 = 0.00, 16 studies, 6174 women) in individual participant data trials and by 17% (odds ratio 0.83, 95% confidence interval 0.72 to 0.95, τ2 = 0.01, 25 studies, 7883 women) when non-individual participant data trials were added. Overall, physical activity reduced caesarean section (odds ratio 0.83; 0.72 to 0.96), small-for-gestational age (odds ratio 0.72; 0.56 to 0.92) and large-for-gestational age babies (odds ratio 0.81; 0.71 to 0.94); diet-based interventions reduced any preterm birth (odds ratio 0.37; 0.20 to 0.68) compared to controls. No differences were observed for other outcomes.
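The second stage of the two-stage random-effects meta-analysis described above pools per-trial effect estimates into a summary odds ratio with a between-trial variance term (τ2). A standard way to do this is the DerSimonian-Laird estimator, sketched below; the input log odds ratios and variances are invented toy values for illustration, not the trial data from this review.

```python
import numpy as np

def pool_random_effects(log_or, var):
    """DerSimonian-Laird random-effects pooling of per-trial log odds
    ratios. Returns (pooled OR, 95% CI low, 95% CI high, tau^2)."""
    log_or, var = np.asarray(log_or, float), np.asarray(var, float)
    w = 1.0 / var                                  # fixed-effect (inverse-variance) weights
    theta_fe = np.sum(w * log_or) / np.sum(w)      # fixed-effect pooled estimate
    q = np.sum(w * (log_or - theta_fe) ** 2)       # Cochran's Q heterogeneity statistic
    df = len(log_or) - 1
    c = np.sum(w) - np.sum(w ** 2) / np.sum(w)
    tau2 = max(0.0, (q - df) / c)                  # between-trial variance, truncated at 0
    w_re = 1.0 / (var + tau2)                      # random-effects weights
    theta = np.sum(w_re * log_or) / np.sum(w_re)
    se = np.sqrt(1.0 / np.sum(w_re))
    return (np.exp(theta),
            np.exp(theta - 1.96 * se),
            np.exp(theta + 1.96 * se),
            tau2)

# Toy per-trial estimates (illustrative only)
or_pooled, ci_low, ci_high, tau2 = pool_random_effects(
    log_or=np.log([0.7, 0.9, 0.85, 1.1]),
    var=[0.04, 0.02, 0.05, 0.03],
)
```

When τ2 is estimated as 0 (as in homogeneous subsets such as the τ2 = 0.00 result reported above), the random-effects weights reduce to the fixed-effect weights and the two pooled estimates coincide.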
Lifestyle interventions were on average more expensive and more effective, per case of gestational diabetes averted and per major outcome averted, compared to usual care. We could not identify the specific intervention components and delivery methods associated with improved outcomes, due to variations in reporting. Lifestyle interventions in pregnancy prevent gestational diabetes, and the effects vary according to the definition of gestational diabetes. Physical activity-based interventions may be the most effective. Lifestyle interventions should be implemented and evaluated in routine clinical practice to prevent gestational diabetes, with additional support for women with low socioeconomic status. This study is registered as PROSPERO CRD42020212884. www.crd.york.ac.uk/PROSPERO/view/CRD42020212884. This award was funded by the National Institute for Health and Care Research (NIHR) Health Technology Assessment programme (NIHR award ref: NIHR129715) and is published in full in Health Technology Assessment; Vol. 30, No. 39. See the NIHR Funding and Awards website for further award information. During pregnancy, not eating well, not moving enough, and being overweight can lead to a condition called ‘gestational diabetes’. This is when mothers develop high blood sugar levels for the first time. This can cause problems for both the mother and the baby during pregnancy and later in life. Being more active and eating healthily could lower the chances of mothers developing ‘gestational diabetes’. However, these changes might help some mothers more than others. It could depend on things like how much they weigh, their age, how many babies they have had before, their ethnicity, and education level. We wanted to see if improving physical activity and diet – ‘lifestyle interventions’ – could prevent gestational diabetes, and whether all mothers benefit.
We looked at individual information from almost 24,000 women in different studies from all over the world that together recruited a total of about 32,000 women. Some studies looked at changes in physical activity, some at diet changes and some at both. When we put all this information together, we found that lifestyle interventions could reduce the odds of gestational diabetes by about 10% when considering only studies that shared their data, although the uncertainty around this result meant a small increase in risk could not be ruled out. Including information from studies that did not share data showed greater benefit, reducing the odds by about a fifth. Lifestyle interventions seemed to work better in mothers who were more educated, so support is needed to make them work for everyone. Physical activity seemed to be the most effective intervention, and reduced caesarean births, having babies who were either too small or too big for their age, and the need for special care after birth. Eating better also lowered the risk of having a baby too early. Although lifestyle intervention was more expensive to the NHS, it lowered the chances of gestational diabetes.
ERBB2 aberrations are established oncogenic drivers with validated therapeutic relevance in breast cancer and emerging indications across solid tumors. In gastrointestinal malignancies, the prevalence, molecular contexts, and therapeutic implications of ERBB2 alterations remain incompletely defined, particularly for mutation-driven (non-amplified) disease. This study aimed to delineate the genomic landscape of ERBB2 alterations across colorectal cancer (CRC) and gastric cancer (GC), with emphasis on mutation versus amplification subtypes and their associated molecular contexts. This was a retrospective observational genomic cohort study based on next-generation sequencing (NGS) data from patients with gastrointestinal malignancies. Tumors harboring ERBB2 alterations were identified through targeted NGS. Variants were annotated using InterVar and classified according to oncogenic significance. Alterations were mapped to HER2 functional domains and integrated with co-mutation patterns, copy number profiles, tumor mutational burden (TMB), and microsatellite instability (MSI) status to characterize subtype-specific genomic features. Across CRC and GC, ERBB2 mutations predominantly clustered within the HER2 kinase domain, with recurrent hotspots (R678Q, S310F/Y, L755S, V842I) largely classified as oncogenic or likely oncogenic. In CRC, ERBB2-mutant tumors frequently co-harbored alterations in APC, TP53, PIK3CA, ARID1A, and SMAD4, whereas ERBB2-amplified tumors showed co-gains in RARA, TOP2A, and SMARCE1. In GC, mutation-positive cases were enriched for APC, TP53, ARID1A, MUC16, and LRP1B alterations, while amplification was associated with EGFR and cell cycle regulators. Oncogenic ERBB2 mutation subgroups exhibited higher TMB and MSI-H enrichment than amplification-positive counterparts, with no material differences in overall copy-number burden between subtypes. 
These patterns indicate that non-amplified ERBB2-mutant tumors form a genomically distinct subset with potential immunogenic features. ERBB2 alterations in CRC and GC converge on recurrent kinase domain hotspots but arise within tumor type-specific genomic milieus that likely influence therapeutic response. Compared with amplification, oncogenic ERBB2 mutations are preferentially associated with higher TMB/MSI-H and characteristic co-mutation signatures, supporting the clinical evaluation of mutation-selective HER2 inhibitors and rational combinations with immune checkpoint blockade. Our findings expand the molecular epidemiology of ERBB2 in Chinese GI cohorts, suggesting potential implications for resistance to standard chemotherapy or anti-EGFR strategies in select settings. Insights into ERBB2 gene alterations in Chinese GI tumors and their impact on precision treatment: HER2, also called ERBB2, is a gene that can drive the growth of several cancers. While HER2-targeted therapies are well established in breast cancer, the role of HER2 alterations in gastrointestinal cancers such as gastric and colorectal cancer is less clear, particularly when HER2 is altered by mutation rather than amplification. Understanding these differences is important because different types of HER2 alterations may respond differently to treatment. In this study, we analyzed genetic testing data from Chinese patients with gastric and colorectal cancers to better understand the patterns of HER2 mutations and their potential clinical significance. We found that HER2 mutations often occur in key functional regions of the protein and frequently coexist with other genetic changes. Compared with HER2 amplification, tumors with HER2 mutations were more likely to show high tumor mutation burden or MSI-H, features that may increase the likelihood of responding to immunotherapy.
These findings suggest that HER2-mutated tumors may require different treatment strategies and could benefit from mutation-specific HER2-targeted therapies or combination approaches with immunotherapy.
Understanding why patients with the same diagnosis exhibit markedly different disease progression (some rapidly, others slowly, with distinct symptom patterns) remains a major challenge in medicine. Here, we developed a machine learning framework called DiSPAH (Disease-progression Speed and Pathway Analysis based on a Hidden Markov model) to estimate both the pathway and speed of disease progression in individual patients. DiSPAH models disease progression as continuous-time transitions among latent disease states with a patient-specific progression speed. We applied DiSPAH to longitudinal clinical scores from an amyotrophic lateral sclerosis (ALS) cohort and inferred each patient's trajectory of the latent disease states and progression speed. These dynamics were associated with baseline clinical features and enabled prediction of future course from first-visit data. Our results highlight that jointly modeling progression pathway and speed improves prediction of heterogeneous disease courses, offering a powerful tool for personalized care and research in ALS and other chronic conditions.
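The core modeling idea, continuous-time transitions among latent disease states scaled by a patient-specific progression speed, can be sketched with a small continuous-time Markov chain: a speed s rescales the generator matrix Q, so the transition probabilities over an interval t are exp(s·Q·t). The three-state generator, state labels, and function names below are illustrative assumptions for this sketch, not the actual DiSPAH implementation.

```python
import numpy as np

def expm(a, n_terms=30, n_square=10):
    """Matrix exponential via scaling-and-squaring with a Taylor series
    (kept dependency-free; scipy.linalg.expm would also work)."""
    a = a / (2 ** n_square)
    out = np.eye(a.shape[0])
    term = np.eye(a.shape[0])
    for k in range(1, n_terms):
        term = term @ a / k         # accumulate a^k / k!
        out = out + term
    for _ in range(n_square):
        out = out @ out             # undo the scaling
    return out

# Toy 3-state progressive model (illustrative): mild -> moderate -> severe,
# with the final state absorbing. Rows of a CTMC generator sum to zero.
Q = np.array([[-0.5, 0.5, 0.0],
              [0.0, -0.3, 0.3],
              [0.0, 0.0, 0.0]])

def transition_probs(speed, dt):
    """State-transition matrix over an interval dt for a patient whose
    individual progression speed rescales the shared generator Q."""
    return expm(speed * Q * dt)

p_slow = transition_probs(0.5, 1.0)   # slow progressor over 1 time unit
p_fast = transition_probs(2.0, 1.0)   # fast progressor over the same interval
```

In a full hidden Markov model these transition matrices would be combined with an emission model for the observed clinical scores, and s inferred per patient from the longitudinal likelihood; irregular visit spacing is handled naturally because dt can differ between observations.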
Increasingly, artificial intelligence (AI)-enabled clinical decision support has been incorporated into healthcare settings where medical trainees learn. A major pedagogical problem is emerging: at times, when an AI-generated recommendation is presented to learners as part of their training, there is disagreement between the clinical decision support recommendation and the learner's clinical judgment, reflecting the need to interpret probabilistic AI outputs within clinical reasoning. This article suggests that the frequency of this phenomenon represents a growing educational blind spot; if not addressed by educators, it could negatively affect the development of clinical reasoning among learners, especially in situations where AI outputs are perceived as objective or inherently authoritative rather than probabilistic or fallible, and therefore could potentially hinder the development of the learner's professional identity. This article presents illustrative educational scenarios that show how learner-AI conflicts may occur in educational settings. After presenting the examples, it describes the educational risks that exist when there is no structure to supervise responses to these types of learner-AI conflicts. Finally, it suggests an approach using a 4-stage framework called SEED (Surface-Explore-Evaluate-Decide) to transform learner-AI conflict into an intentional opportunity to learn and provides specific ideas on how to use the SEED framework to create structures for teaching, assessment, and faculty development. In light of the growing presence of technology in all areas of education, it is essential for educators to be prepared to respond to these types of tensions to preserve the values of accountability, thoughtfulness, and humanness in education.