Anxiety among adolescents has increased globally during the COVID-19 pandemic. Adolescents in juvenile detention centers (JDCs) may be particularly vulnerable due to restricted liberty and social isolation. Movement-based interventions such as Dance/Movement Therapy (DMT) have been studied as approaches to support emotional regulation. This exploratory study investigated the psychophysiological effects of a structured DMT intervention on anxiety reduction among adolescents in JDCs, focusing on anxiety-related changes in dopamine (DA) levels and body temperature. This quasi-experimental study included 55 female adolescents from a single juvenile detention center. Participants were allocated to either a non-DMT control group (n = 30, 16.17 ± 1.73 years), which maintained their usual institutional routine throughout the 8-week study period, or a DMT group (n = 25, 16.23 ± 1.68 years) that completed 24 DMT sessions over the same period. Anxiety and physiological measures were assessed before and after the intervention. Anxiety was measured using the Beck Anxiety Inventory (BAI). Physiological measures included mean body temperature (mTb), calculated from tympanic (core) and skin temperature measurements, and plasma DA levels measured using high-performance liquid chromatography (HPLC). Following the intervention, the DMT group showed a significant reduction in BAI scores (-16%, p < 0.001), along with significant increases in mTb (0.11 ± 0.07 °C, p < 0.001) and DA levels (+30%, p < 0.001). BAI scores were negatively correlated with mTb and DA levels, whereas mTb was positively correlated with DA levels. This study provides preliminary evidence that DMT may help alleviate anxiety and support psychophysiological regulation among adolescents in JDCs. However, the exploratory design, single-center setting, and all-female sample may limit generalizability.
Future multi-center studies with more diverse samples are needed to confirm these findings and clarify the underlying mechanisms.
Home environments shape children's dietary habits, but which factors are most influential is unclear. The study purpose was to identify factors in the home environment associated with child intake of fruit and vegetables (FV) and sugar-sweetened beverages (SSBs) using a national U.S. dataset. Data from 5,138 school-aged children (4-15 years old) from 130 U.S. communities were collected in 2013-2015. Parents and/or children completed a dietary screener and additional survey questions to assess household socioeconomic status (SES), grocery shopping sources, home food availability, social support for healthy eating, eating out frequency, and other home eating and related behaviors. Other child characteristics included breastfeeding history, intake of school foods, and participation in other nutrition programs. Community variables included predominant race/ethnicity and SES. Classification and regression trees (CART) identified key predictors of intake. The FV and SSB CARTs had 14 and 12 terminal groups, respectively. Children with the highest FV intake (0.54 SD from mean cups/day; 13% of sample) had fruit more often available at home, dark green vegetables more often available at home, ate dinner with family more often, had SSBs less often available at home, and were breastfed longer. Conversely, children in the two groups with the lowest FV intake either had fruit less often available at home, and family never complimented their eating (-0.86; 2%), or they had family that rarely or sometimes complimented their eating, and perceived school lunches as unhealthy (-0.87; 1%). For SSB intake, the lowest consumers (-0.63 SD from mean tsp/day sugar; 17%) never or rarely had SSBs available at home, and lived in higher SES communities.
Children in the two groups with the highest SSB intakes had SSBs available at home more often and lived in SNAP-participating households; one group additionally ate out less often, used a phone/computer for social networking, and had SSBs available at home very often (1.3; 1%), while the other ate out more often and was breastfed for a shorter duration (1.1; 5%). Home availability of FV and of SSBs was the most salient predictor of the corresponding intake, while other predictors differed between FV and SSB intake. Study findings highlight several actionable home-environment strategies to test in future studies to improve school-aged children's diets.
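The CART approach used in this analysis can be illustrated with a minimal regression-tree split: at each node, the algorithm picks the threshold that most reduces within-node variance of the outcome. The Python sketch below shows a single split; the availability scores and intake values are invented for illustration and are not study data.

```python
# Minimal sketch of the CART idea: pick the binary split that most
# reduces within-node variance of the outcome (illustrative data only).

def variance(ys):
    """Mean squared deviation from the group mean."""
    m = sum(ys) / len(ys)
    return sum((y - m) ** 2 for y in ys) / len(ys)

def best_split(xs, ys):
    """Return (threshold, impurity_reduction) for the best split x < t."""
    base = variance(ys) * len(ys)          # total sum of squared deviations
    best = (None, 0.0)
    for t in sorted(set(xs))[1:]:
        left = [y for x, y in zip(xs, ys) if x < t]
        right = [y for x, y in zip(xs, ys) if x >= t]
        cost = variance(left) * len(left) + variance(right) * len(right)
        if base - cost > best[1]:
            best = (t, base - cost)
    return best

# Hypothetical home fruit-availability scores and child FV intake (SD units):
avail = [0, 0, 1, 1, 2, 2, 3, 3]
intake = [-0.9, -0.8, -0.2, -0.1, 0.1, 0.2, 0.5, 0.6]
threshold, gain = best_split(avail, intake)
```

Applying `best_split` recursively to each resulting subgroup, with stopping rules, yields terminal groups analogous to the 14 FV and 12 SSB groups reported above.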
The goal of this study was to identify symptoms that occur in children post-SARS-CoV-2 infection, their trajectory over the first year post-enrollment, and their relationship to age. This was a longitudinal comparison of infected and uninfected cohorts. Participants (0-21 years) with laboratory-confirmed SARS-CoV-2 infection were enrolled as infected; the uninfected cohort comprised individuals without laboratory evidence of SARS-CoV-2 infection. The primary outcome was presence or absence of symptoms. 852 participants (705 infected, 147 uninfected) completed baseline visits. Of those, 558 infected participants completed a 12-month post-enrollment visit. Twenty symptoms were identified as more common in infected participants than in uninfected participants, at either baseline or 12 months, with symptoms varying by age. Some symptoms in the infected were more frequent at baseline (e.g., fever, weight loss), whereas many symptoms persisted through 12 months. Several symptoms were more frequent at 12 months (e.g., dysmenorrhea, persistent headache). Presence of symptoms at 12 months was not significantly associated with the wave of circulating virus at original infection. This interim analysis at one year post-enrollment identifies 20 symptoms that infected participants were more likely to report post SARS-CoV-2 infection compared to uninfected participants, at either visit. The type of symptoms varies by age. Ongoing longitudinal data up to 3 years post-enrollment will increase understanding of long-term symptoms of SARS-CoV-2 infection in children and their trajectory. NCT04830852. Although most children recover fully from SARS-CoV-2 infection, some experience a variety of prolonged symptoms following infection. Many studies attempting to characterize these symptoms and their trajectory are neither prospective nor longitudinal and lack comparison to uninfected controls.
This longitudinal analysis identifies and characterizes post-COVID symptoms in children and adolescents and their trajectory through the first year post-enrollment compared to an uninfected cohort. Twenty post-infection symptoms were identified as occurring more frequently in the infected cohort than in the uninfected cohort. Age played a critical role in the type and frequency of symptoms after SARS-CoV-2 infection. Gastrointestinal symptoms were prominent.
Peritoneal fibrosis, driven by M2 macrophage polarization, limits the long-term application of peritoneal dialysis (PD). Although ADAM19 is known to mediate fibrosis in other organs, its specific role in PD-associated peritoneal fibrosis remains unclear. PD patients were enrolled at a single center and divided into three groups according to PD duration, and demographic and clinical data were collected. We measured the expression of ADAM19, Notch1, fibrosis-associated proteins, chemokines, and inflammatory factors in the peritoneal dialysis effluent by real-time PCR and western blot assays. Macrophages were identified through flow cytometry. We then analyzed the relationship between ADAM19 and clinical data in PD patients. Furthermore, we established mouse models of peritoneal fibrosis to verify the biological function of ADAM19 in regulating macrophage polarization. In the long-term group, fibrotic proteins (Fibronectin, α-SMA), inflammatory factors (IL-6, IL-10), and chemokines (CCL5, CCL2, CXCL16) were higher than in the short-term group, and more macrophages polarized towards M2. ADAM19 expression was linearly correlated with dialysis time and Kt/V. The AUROC of ADAM19 for predicting peritoneal dialysis adequacy was 0.738, with a cut-off ADAM19 RNA level of 7.84. In logistic regression models, higher ADAM19 (≥ 7.84) was also independently associated with lower Kt/V (< 1.67). Additionally, the mouse model showed a moderate increase of M1 macrophages (CD86+) and a marked rise of M2 macrophages (CD206+) with high-glucose dialysis fluid. Furthermore, the 8-week G4.25% group showed significant growth of M2 macrophages compared to the 4-week G4.25% group, indicating that prolonged dialysis duration has a more pronounced effect on promoting M2 polarization of macrophages via the ADAM19/Notch1 signaling pathway.
By stimulating chemokines and inflammatory factors, ADAM19 regulated macrophage polarization and correlated with the progression of peritoneal fibrosis. ADAM19 is expected to be a novel indicator for assessing peritoneal ultrafiltration function in PD patients.
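For readers less familiar with AUROC, the value of 0.738 reported above is the probability that a randomly chosen patient with inadequate dialysis (Kt/V < 1.67) has a higher ADAM19 level than a randomly chosen adequately dialyzed patient. A minimal empirical computation in Python (the expression values below are invented for illustration, not study data):

```python
def auroc(pos, neg):
    """Empirical AUROC: fraction of (positive, negative) pairs in which
    the positive case scores higher (ties count as half)."""
    wins = sum(1.0 if p > n else 0.5 if p == n else 0.0
               for p in pos for n in neg)
    return wins / (len(pos) * len(neg))

# Hypothetical ADAM19 RNA levels (illustrative only):
inadequate = [8.1, 9.0, 7.9, 6.5]   # patients with Kt/V < 1.67
adequate = [6.0, 7.0, 8.2, 5.5]     # patients with Kt/V >= 1.67
auc = auroc(inadequate, adequate)
```

The reported cut-off (7.84) would then be the threshold on this score that best balances sensitivity and specificity, typically chosen by a criterion such as the Youden index.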
The substantial antigenic diversity of Influenza A virus (IAV) presents significant challenges to the development of broadly protective vaccines for swine. Moreover, pigs vaccinated with whole-inactivated virus or hemagglutinin (HA) subunit vaccines may experience more severe lung consolidation than non-vaccinated pigs when exposed to antigenically mismatched IAV strains, a phenomenon known as vaccine-associated enhanced respiratory disease (VAERD). We recently developed a lipid nanoparticle-encapsulated DNA (LNP-DNA) vaccine encoding the HA of IAV, which elicited robust immune responses following a single immunization and protected pigs against homologous IAV challenges. In this study, we compared the immunogenicity and protective efficacy between the HA protein-based vaccine and the HA DNA-based vaccine against an antigenically mismatched IAV strain in pigs. Neither vaccine induced cross-reactive hemagglutination inhibition (HI) antibodies nor prevented viral shedding in nasal secretions following heterologous challenge. However, while the HA protein-based vaccine exacerbated lung lesions compared to non-vaccinated controls, the HA DNA-based vaccine prevented the development of gross lung pathology. Transcriptomic analyses revealed distinct gene expression profiles between the two vaccine groups. These findings suggest that the LNP-DNA vaccine platform may offer a safer and more effective strategy for developing vaccines against IAV in swine.
HIV broadly neutralizing antibodies (bnAbs) have been widely studied, and inducing a robust broadly neutralizing response remains a goal for many human vaccine studies. Much research focuses on creating higher-affinity antigens to target bnAb precursors; however, this overlooks the role of non-neutralizers, which are known to dominate HIV-specific responses. Furthermore, any effective vaccine will likely need to induce multiple bnAbs. Here, we question whether the previously postulated affinity ceiling of 10⁻⁹ M prevents further gains in bnAb induction above that affinity. Using adeno-associated virus (AAV)-mediated CRISPR-Cas9 engineering to express HIV-specific bnAbs and non-bnAbs in Ramos B cells, we investigated antigen-specific activation by BCRs of known binding/neutralization potency, epitope, and affinity. We found that, across different bnAb epitopes, affinity was not predictive of activation. However, within individual epitopes and across non-bnAb epitopes, increased affinity resulted in increased activation. We propose that at activation-resistant bnAb epitopes, affinity is overruled to an extent by the physical BCR-antigen interaction, while more 'simple' and immunogenic non-bnAb epitopes depend more heavily on BCR affinity. These findings may have significant consequences for vaccine development, as a high-affinity BCR-antigen interaction at an epitope with poor activation potential may fail to participate in the humoral response.
Social vulnerability (SV) influences rehabilitation and postoperative care for patients with hip fracture. However, most previous work relies on area-level measures that overlook interindividual variation. The recent adoption of ICD-10 Z-codes allows clinical identification of patient-level SV and may offer a better understanding of its impact. This study aimed to evaluate healthcare utilization, including readmissions, discharge disposition, and length of stay (LOS), in surgically treated hip fracture patients with and without clinically acknowledged SV. Adults surgically treated for hip fracture between 2016 and 2020 were included from the Nationwide Readmissions Database. SV was defined as having at least one documented relevant ICD-10 Z-code. Primary outcome measures included complications, LOS, discharge disposition, and 30- and 90-day readmissions, stratified by SV and evaluated using chi-square analyses. Multivariable logistic regression assessed long LOS (≥ 5 days) and discharge to home, adjusting for age, insurance/income status, and substance use. Patients with SV were younger (35.6% with SV vs. 50.1% without SV were 81+), had a lower median household income (38.8% with SV vs. 25.7% without SV were in the lowest quartile), and were more often insured by Medicaid (19.3% vs. 3.8%). Alcohol/drug use disorders were significantly more prevalent in patients with SV (18.5% vs. 4.5%). SV was associated with 47% higher odds of long LOS (OR 1.47, 95% CI 1.41-1.54) and 23% higher odds of discharge to home (OR 1.23, 95% CI 1.16-1.30) but comparable 90-day readmissions (21.2% vs. 19.8%). Among surgically treated hip fracture patients, SV was associated with higher odds of long LOS and discharge to home but no meaningful difference in readmissions. The small number of patients with clinically documented SV highlights the limited reporting by healthcare workers.
This analysis of a nationwide all-payer database highlights the need to identify these higher risk patients and implement appropriate care pathways to reduce healthcare utilization.
The automatic recognition of artistic styles is an important part of managing, analyzing, and providing access to large digital cultural heritage archives. Nevertheless, the aesthetic complexity of artworks and the subtle distinctions between them pose serious obstacles to this process. To address this, the study introduces a novel multi-stage technique that, in comparison to current techniques, derives and combines local and global features to obtain a more comprehensive style representation. In the proposed approach, the image is first split into visual parts using the K-means algorithm, and the local characteristics of each part are extracted with dedicated CNNs. Simultaneously, a deeper CNN extracts global features from the whole image. The two sets of features are then fused via an attention mechanism. In the third step, because of the large dimensionality of the extracted features, an autoencoder (AE) is employed to reduce the feature dimensions and eliminate noise and redundant data. Lastly, a softmax classifier performs the classification. The proposed method was assessed on the Metropolitan Museum of Art (MET) public archives dataset, where it achieved 89.07% accuracy and an F-measure of 0.8867, outperforming the compared methods.
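The attention-based fusion step can be sketched as a learned weighting of the local and global feature vectors. Below is a minimal Python illustration using softmax weights; the 4-dimensional features and zero scores are toy values, and in the actual method the scores would come from a trained scoring sub-network.

```python
import math

def softmax(xs):
    """Normalize raw scores into weights that sum to 1."""
    exps = [math.exp(x) for x in xs]
    s = sum(exps)
    return [e / s for e in exps]

def attention_fuse(local_feat, global_feat, scores):
    """Weight the local and global feature vectors by softmax attention
    scores and sum them into one fused representation."""
    w_local, w_global = softmax(scores)
    return [w_local * l + w_global * g
            for l, g in zip(local_feat, global_feat)]

# Toy 4-dim features; equal scores give equal 0.5/0.5 weights.
fused = attention_fuse([1.0, 0.0, 2.0, 1.0],
                       [0.0, 1.0, 1.0, 3.0],
                       scores=[0.0, 0.0])
```

In the full pipeline, the fused vector would then pass through the autoencoder for dimensionality reduction before the softmax classifier.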
Work in eating disorder (ED) services presents unique challenges and rewards that may affect clinicians' work-related and personal wellbeing. However, research on ED clinician needs, views, and experiences is still sparse, despite major service changes since the COVID pandemic. This study aims to explore and conceptualise NHS ED clinicians' work-related experiences, challenges, and needs, in order to inform future clinician wellbeing and service improvement strategies. Clinicians working in ED services (N = 19) were interviewed using a semi-structured interview guide that probed their professional experiences, work-related needs, and views. Interviews were analysed using NVivo, following guidance from Braun and Clarke (2006) for reflexive thematic analysis. A holistic ecological systems framework for ED services was created, comprising five levels of influence: intrinsic, intra-personal, departmental, systemic, and societal. These levels contain nine themes: (1) clinician motivation for working in ED services; (2) complexities of ED management; (3) clinician personality and emotional disposition; (4) team dynamics; (5) supervision, management, and organizational support; (6) service-level concerns; (7) macro-level systemic concerns; (8) broader societal challenges in ED care; and (9) COVID-related challenges. Key concerns included the chronic nature and risk of EDs, growing service demands amid limited resources, and regulation through guidelines and commissioning targets. The presented framework illustrates the multifaceted array of complexities faced by ED clinicians. The interplay of personal, inter-personal, and systemic factors is explored, with clinicians' interest in and commitment to ED care at the core of the framework. These areas can be targeted to improve clinician job satisfaction and reduce burnout risk, with the goal to provide optimal patient care.
This study explores the experiences and wellbeing of clinicians working in NHS eating disorder (ED) services. Through interviews with clinicians, the research explored both the positive and difficult parts of their job. While staff felt strongly committed to helping people with EDs, many also described feeling emotionally drained and frustrated. This was often due to high workloads, not enough resources, and long waiting lists. Clinicians found it especially hard when they had to follow strict service rules that didn’t work well for individual patients, and when they had to manage complex medical risks. Supportive teams and good supervision helped some staff cope. Wider problems like staff shortages, poor communication between services, and lack of funding compounded emotional strain. The findings show that ED clinicians urgently need more support, including better resources, more flexible ways of working, and proper training, to give safe, effective care without burning out.
Rheumatoid arthritis (RA) frequently affects the forefoot, causing pain and deformity even in remission. Subclinical inflammation may persist and vary with disease duration. This study aimed to compare clinical, ultrasound, and radiographic features of the forefoot in RA remission with metatarsalgia according to disease duration (< 10 vs. ≥ 10 years). Cross-sectional study including 84 RA patients in remission (DAS28 < 2.6) with metatarsalgia: 33 with < 10 and 51 with ≥ 10 years of disease duration. Clinical, biomechanical, ultrasound (B-mode and Power Doppler (PD)), and radiographic variables were assessed by blinded evaluators. The ≥ 10-year group showed greater structural burden. Total synovitis was similar between groups, more frequent at central metatarsophalangeal joints (MTPs) (particularly MTP2 in < 10 years), and PD was rare (6%). The overall pattern was consistent with early MTP1/5 involvement and progressive structural accumulation at MTP2-4, with higher odds of erosions (OR 3.5-5.0), joint space narrowing (JSN; OR 2.8-4.4), and subluxations (MTP2 OR 5.31; MTP3 OR 2.50) in the ≥ 10-year group. In RA remission with metatarsalgia, our findings are consistent with early structural involvement at MTP1/MTP5 and more frequent B-mode synovitis at central joints, with progressive structural burden at MTP2-4 as disease duration increases. PD positivity was uncommon (6%), whereas ultrasound remained essential to detect synovitis. Our findings support systematic assessment of all MTP joints (MTP1-5), with particular attention to MTP1/MTP5 in earlier disease and careful structural evaluation of central rays in longer disease duration. As a cross-sectional study, temporal or causal sequences cannot be inferred.
Despite modest improvement, the lifespan of a child on dialysis continues to be 40 years shorter than that of healthy children. Cardiovascular disease (CVD) is the leading cause of morbidity and mortality in these patients. Risk factors for CVD are present even in early stages of chronic kidney disease (CKD) and accelerate as the child's renal function deteriorates. As a result, the highest burden of CVD exists in patients on chronic dialysis. Although dialysis is life-sustaining, the dialysis procedure promotes cardiovascular damage. Both traditional and non-traditional CVD risk factors drive this acceleration. More concerning, the dialysis procedure itself is cardiotoxic. Because of coexisting poor cardiac reserve and altered sympathetic tone in this patient population, dialysis induces repetitive contractile, ischemic injury termed myocardial stunning. This ischemia-reperfusion injury is reversible at first. However, with repetitive episodes, this injury will trigger alterations in cardiac function that decrease contractile function in order to preserve viability. Ultimately, these adaptations lead to remodeling and fibrosis. There is no targeted therapy available to reverse cardiovascular damage in these patients. Intensive monitoring and management of modifiable risks such as hypertension and anemia in the early stages of CKD to optimize cardiovascular health is imperative. However, in late CKD, especially in those patients who are not candidates for preemptive renal transplantation, optimization of the dialysis procedure is critical to prevent acceleration of their CVD burden. Improved assessment of dry weight as well as data-driven fluid management programs may decrease some risk. More importantly, standard implementation of intensified dialysis prescriptions by increasing time or through the addition of convective clearance may mitigate progressive cardiovascular damage and enhance survival.
This article reviews the pathophysiology of dialysis-induced cardiovascular damage and discusses management strategies to limit the cardiovascular burden in these patients.
Healthcare-seeking behavior is a key factor in how well a health system performs and how fair it is. In Saudi Arabia, public healthcare services are free, yet many people still choose private healthcare, especially in cities like Riyadh. It is important to understand why people seek care from private clinics to help shape health policies, distribute resources better, and improve services across the healthcare system. This study aimed to examine the frequency of private healthcare use, defined as the reported usual or concurrent use of private healthcare services, and to identify sociodemographic, behavioral, and health-related factors associated with this choice among adults in Riyadh, Saudi Arabia. A cross-sectional study was carried out in Riyadh from March to July 2023 using a multistage cluster sampling method. We randomly selected 48 government primary healthcare centers and invited adults aged 18 and older who visited these centers to participate. We collected data electronically with a validated questionnaire that covered sociodemographic details, patterns of healthcare use, reasons for choosing private healthcare, behavioral risk factors, and existing health conditions. We used multivariable logistic regression analysis to find independent predictors of private healthcare use, reporting adjusted odds ratios (AORs) and 95% confidence intervals (CIs). Of 14,239 participants, 72.4% reported using private healthcare services either as a usual source of care or alongside public services. The multivariable analysis revealed several factors to be positively related to private healthcare utilization. Those who were married were more likely to use private healthcare services (AOR 1.23, 95% CI 1.11-1.36). Those with insurance coverage had threefold higher odds of private healthcare use (AOR 3.51, 95% CI 3.13-3.94).
Smokers were more likely to seek private healthcare (AOR 1.60, 95% CI 1.45-1.77) than non-smokers, and those who exercised reported increased utilization (AOR 1.83, 95% CI 1.67-2.00). Obesity was also positively related to private healthcare utilization (AOR 1.38, 95% CI 1.12-1.71), and those with heart disease had substantially higher odds of using private healthcare services (AOR 2.09, 95% CI 1.59-2.76). Private healthcare use in Riyadh is common and associated with insurance coverage, marital status, behavioral factors, and certain chronic conditions. These findings provide descriptive insights into factors related to private healthcare utilization among public PHC attendees in Riyadh, without implying causal effects or policy recommendations beyond the scope of the data.
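The adjusted odds ratios above come from exponentiating logistic-regression coefficients, with a 95% CI obtained by exponentiating the coefficient ± 1.96 standard errors. A minimal Python sketch (the β and SE below are back-calculated to reproduce the reported insurance-coverage estimate and are illustrative only):

```python
import math

def adjusted_or(beta, se, z=1.96):
    """Convert a logistic-regression coefficient and its standard error
    into an adjusted odds ratio with a 95% confidence interval."""
    return (math.exp(beta),
            math.exp(beta - z * se),
            math.exp(beta + z * se))

# Illustrative values chosen to reproduce the reported insurance-coverage
# estimate (AOR 3.51, 95% CI 3.13-3.94); not the study's actual model output.
aor, lo, hi = adjusted_or(beta=1.2556, se=0.0587)
```

Because the CI is symmetric on the log-odds scale, it appears asymmetric around the AOR itself, as in the intervals reported above.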
Limb salvage centers have increased in number over time, but lack standardized defining criteria. This systematic review aimed to assess organizational features of limb salvage centers and determine whether orthoplastic centers, in comparison to vascular limb salvage centers, represent a distinct care model that may benefit from standardization. We conducted a systematic review of publications related to limb salvage centers by searching MEDLINE, Embase, Web of Science, and Cochrane databases from their inception through 2024. Extracted binary data were quantified as a reporting score covering 26 organizational features across six structural care domains for limb salvage centers, based on a validated quality measurement framework. Organizational features differentiating distinct center types were identified to establish a quality framework for orthoplastic centers. Statistical comparisons between center types were performed using appropriate tests (p < 0.05). Of 118 included studies, orthoplastic (n = 43) and vascular (n = 48) centers represented 77% of all studies. Recent increases in orthoplastic publications show substantial variability in organizational features. Orthoplastic center literature more frequently reported plastic surgery consultation criteria (p < 0.001), surgical outcomes (p < 0.001), and centralized network integration (p ≤ 0.006), highlighting acute reconstructive approaches. Vascular center studies documented significantly more organizational team features (p < 0.001) and quality systems (p = 0.033), reflecting established care frameworks for chronic disease management. Six organizational features characterized orthoplastic centers with > 70% prevalence, providing a benchmark framework with standardization priorities. Orthoplastic limb salvage centers demonstrate distinct care paradigms that would benefit from standardization. Our findings suggest structural benchmarks to support the standardized development of orthoplastic limb salvage centers.
Hidradenitis suppurativa (HS), an inflammatory skin disorder characterized by painful nodules and abscesses, has varying prevalence among different races/ethnicities. This study explored the social drivers of health, burden, and impact of HS among different racial and ethnic groups. An online, cross-sectional survey was conducted among adult patients with HS (September 2023-December 2023) in the USA. Patients were recruited through HS Connect (patient advocacy group) and AmeriSpeak (US national sample panel). Descriptive data were collected using patient-reported outcome measures and de novo questions about patients' disease knowledge and perception, healthcare access and utilization, impact on quality of life (QoL), and social impact. All analyses were descriptive and stratified by racial/ethnic groups. The study included 583 patients (mean age, 34.8 years; 95.5% female) representing a range of racial backgrounds: Black or African American (n = 273; 46.8%), white (n = 236; 40.5%), Two or More Races (n = 47; 8.1%), American Indian or Alaska Native (n = 18; 3.1%), Asian (n = 7; 1.2%), and Native Hawaiian and Other Pacific Islander (n = 2; 0.3%). Ethnic representation also varied (Hispanic/Latino: n = 76; 13.0%). Patients of all races and ethnicities reported considerable QoL impact (Dermatology Life Quality Index, EQ-5D-5L), with results for smaller subgroups (n < 10) included for descriptive completeness only and not intended for comparison with other groups. During flaring, most patients used over-the-counter products/medications (54.2%) or a nonmedical intervention/home remedy (56.9%). Up to 36.5% of patients reported challenges in procuring food, utilities, medicine/healthcare, phone, clothing, or childcare when needed in the past year. Among those who paid out-of-pocket for their HS treatment, 55.6% reported that the cost stopped them from visiting a healthcare provider for treatment.
The findings indicate a high burden and impact of HS across all races and ethnicities. Patients reported social drivers of health and challenges with healthcare utilization, indicating the need for integrating social workers and care management teams in dermatology practice, which could facilitate improved care of patients with HS. Hidradenitis suppurativa is a painful skin condition that causes lumps and abscesses. It affects people of all races and ethnicities but is more common in Black or African American individuals. This study surveyed 583 adults in the USA to understand how hidradenitis suppurativa affects people from different racial and ethnic backgrounds. Our focus was on how the disease impacts their daily lives, their ability to access healthcare, how often they visit doctors, their quality of life, and their mental and emotional well-being. Most people said that hidradenitis suppurativa lowers their quality of life and makes daily activities harder. During flaring, many used home remedies instead of seeing a doctor. People suffering from hidradenitis suppurativa also reported trouble getting basic needs such as food, medicine, and transportation. These challenges occurred among patients from different racial and ethnic groups; results for very small subgroups (Asian, Native Hawaiian/Other Pacific Islander) are reported descriptively only and should not be interpreted as representative of these groups or compared with other groups. The research underscores the importance of improving awareness and tailoring care for people with hidradenitis suppurativa, particularly those facing barriers to healthcare.
To evaluate the prevalence of degenerative bony changes of the mandibular condyle and their associations with age, gender, and joint laterality. CBCT scans of 112 temporomandibular joints of 56 clinically symptomatic patients were included based on predefined inclusion and exclusion criteria. Degenerative changes, including erosion, flattening, osteophytes, subchondral sclerosis, and subcortical pseudocysts, were assessed for their presence, frequency, and demographic associations using the Chi-square test, McNemar test, Spearman's correlation analysis, and Cohen's kappa statistics. Erosion was the most prevalent finding (84.8%) and frequently coexisted with flattening. Subcortical pseudocysts showed a positive association with increasing age (p < 0.05), osteophytes were more commonly observed in males (p < 0.01), and subchondral sclerosis occurred more frequently on the left side (p < 0.05). Symptomatic TMJs demonstrated at least one degenerative change; erosion was the most prevalent and subcortical pseudocysts the least common degenerative change in the mandibular condyle. Age, gender, and joint side showed associations with specific changes.
The deltoid ligament (DL) is the primary stabilizer of the medial ankle, but its injury mechanisms remain poorly understood. This study aimed to investigate the injury risk and mechanisms of individual DL bundles under both acute and chronic conditions to inform prevention and treatment strategies. A validated finite element model of the human foot was used to examine peak stresses in DL bundles under four acute loading scenarios. Chronic loading was simulated by applying gait loads after transecting the lateral ligaments, and the resulting DL stresses were compared with those of the intact model. Additionally, thirty-nine rats were assigned to three groups: a lateral ligament rupture group (LR, n = 13), a tibialis posterior tendon rupture group (TPR, n = 13), and a sham group (n = 13). After 6 weeks of treadmill running, the mechanical properties and histological characteristics of the DL, along with ankle joint morphology and articular stresses, were evaluated to further verify the hypothesized mechanisms of chronic injury. Under acute loadings, the tibiocalcaneal ligament (TCL), anterior tibiotalar ligament (ATTL), and deep posterior tibiotalar ligament (dPTTL) showed the highest stress under pronation-external rotation loading. Lateral ligament rupture increased DL stress during gait. After 6 weeks of treadmill running, the LR and TPR groups showed roughened articular surfaces with osteophyte formation, increased articular stress, decreased talar bone volume fraction, lower failure load and stiffness ratios of the DL (p < 0.01), reduced fluorescence intensity of COL1, and elevated levels of COL3, MMP-2 and IL-1β compared with the sham group (p < 0.01). The TCL, ATTL, and dPTTL bundles are particularly susceptible to acute injury, with pronation-external rotation posing the greatest risk. Chronic degeneration of the DL occurs following rupture of the lateral ligament or tibialis posterior tendon, with a more pronounced effect after lateral ligament rupture.
Plant cells are connected to their neighbors via plasmodesmata, facilitating the exchange of nutrients and signaling molecules. During immune responses, plasmodesmata close, but how this contributes to a full immune response is unknown. To investigate this, we develop two transgenic lines that allow plasmodesmal closure to be induced independently of immune elicitors, using the over-active CALLOSE SYNTHASE3 allele icals3m and the C-terminus of PDLP1 to drive callose deposition at plasmodesmata. Induction of plasmodesmal closure increases the expression of stress-responsive genes, salicylic acid accumulation, and resistance to Pseudomonas syringae DC3000. More homogeneous plasmodesmal closure using icals3m also leads to the accumulation of starch and sugars, decreased leaf growth, and hypersusceptibility to Botrytis cinerea. Based on the profile of responses, we conclude that plasmodesmal closure activates stress signaling, raising questions about the signals mediating this response and whether these responses occur in all circumstances when plasmodesmata close.
The global ocean carbon dioxide flux (air-sea) has shown a slow upward trend. Based on more than 160,000 quality-controlled measurements of surface ocean carbon dioxide fugacity from 2000 to 2020, a satellite-based ocean-atmosphere carbon dioxide fugacity (fCO2) retrieval algorithm was developed using machine learning methods. A comparative analysis was conducted among various machine learning methods, including XGBoost, random forest, light gradient boosting machine, feedforward neural network, convolutional neural network, and backpropagation neural network. Based on the best performance, the random forest algorithm was selected for model construction. Independent in situ validation showed that the model achieved a low root mean square error (RMSE = 14.35 µatm), a low mean absolute percentage error (MAPE = 2.61%), and a high coefficient of determination (R² = 0.86). The distribution of global air-sea carbon dioxide fugacity from 2000 to 2020 was reconstructed at a resolution of 0.25° × 0.25°, and the air-sea carbon dioxide flux (FCO2) of the global ocean over the same period was estimated at the same resolution. From 2000 to 2020, global ocean CO2 uptake increased from 1.443 PgC/year to 1.894 PgC/year, and the air-sea carbon dioxide flux over the entire study area increased by 31.2% over the 20 years. These comprehensive oceanic carbon sink datasets and new insights will support future research on ocean carbon sequestration and its climate regulation potential.
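The independent in situ validation above reports RMSE, MAPE, and R². As a minimal sketch of that evaluation step, the functions below compute the three metrics on synthetic fCO2 values (the measured/retrieved pairs are invented for illustration, not taken from the study):

```python
import math

# Validation metrics for a retrieval model: RMSE, MAPE, and R^2.
# The fCO2 values below are synthetic, purely for illustration.
def rmse(obs, pred):
    return math.sqrt(sum((o - p) ** 2 for o, p in zip(obs, pred)) / len(obs))

def mape(obs, pred):
    # Mean absolute percentage error, in percent.
    return 100.0 * sum(abs((o - p) / o) for o, p in zip(obs, pred)) / len(obs)

def r_squared(obs, pred):
    mean_obs = sum(obs) / len(obs)
    ss_res = sum((o - p) ** 2 for o, p in zip(obs, pred))
    ss_tot = sum((o - mean_obs) ** 2 for o in obs)
    return 1.0 - ss_res / ss_tot

# Synthetic observed vs. retrieved fCO2 (uatm), illustrative only.
obs  = [360.0, 375.0, 390.0, 405.0, 420.0]
pred = [365.0, 370.0, 395.0, 400.0, 425.0]
print(f"RMSE = {rmse(obs, pred):.2f} uatm, "
      f"MAPE = {mape(obs, pred):.2f}%, R2 = {r_squared(obs, pred):.3f}")
```

In the study, these same metrics computed on held-out in situ measurements gave RMSE = 14.35 µatm, MAPE = 2.61%, and R² = 0.86.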
This study aims to develop a precise predictive model for leaching chalcopyrite concentrates. It employs a leaching system comprising 1-hexyl-3-methylimidazolium hydrogen sulfate ([Hmim][HSO4]) and hydrogen peroxide (H2O2), offering a more efficient alternative to conventional hydrometallurgical approaches. Gene expression programming (GEP) was used to develop this model. To construct the GEP models, 120 experimental data points were collected. Input variables included time, acid concentration, temperature, particle size, oxidant concentration, stirring speed, and solid/liquid ratio, while the output variable was the copper extraction percentage. For modeling purposes, the experimental dataset was randomly partitioned into a training set (84 data points) and a testing set (36 data points). A correlation analysis (BCA) revealed weak linear correlations between input variables, justifying the use of advanced methods such as GEP. Using criteria such as the coefficient of determination (R2), mean absolute error (MAE), and root relative squared error (RRSE), we proposed the optimal model (GEP-3). As a new model with simplified mathematical expressions for accurate prediction of copper extraction from chalcopyrite concentrate, this model achieves R2 = 0.976, MAE = 2.80, and RRSE = 0.152 on the training set. Sensitivity analysis revealed that temperature, oxidant concentration, and particle size were the most influential parameters for the copper extraction percentage. By taking practical or economic constraints into account, the proposed model enables optimization of the leaching process to maximize copper extraction and minimize material consumption.
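The model-selection criteria above include MAE and the less common root relative squared error, RRSE = sqrt(Σ(y − ŷ)² / Σ(y − ȳ)²). As a minimal sketch of how these two criteria would score a candidate GEP model, the functions below evaluate them on synthetic copper-extraction values (not the study's 120-point dataset):

```python
import math

# MAE and root relative squared error (RRSE), two of the criteria
# used to select the optimal GEP model. Values below are synthetic.
def mae(obs, pred):
    return sum(abs(o - p) for o, p in zip(obs, pred)) / len(obs)

def rrse(obs, pred):
    # RRSE normalizes residual error by the error of always
    # predicting the mean; values near 0 indicate a good fit.
    mean_obs = sum(obs) / len(obs)
    ss_res = sum((o - p) ** 2 for o, p in zip(obs, pred))
    ss_tot = sum((o - mean_obs) ** 2 for o in obs)
    return math.sqrt(ss_res / ss_tot)

# Synthetic observed vs. predicted copper extraction (%), illustrative only.
obs  = [20.0, 45.0, 60.0, 75.0, 90.0]
pred = [22.0, 43.0, 63.0, 74.0, 88.0]
print(f"MAE = {mae(obs, pred):.2f}, RRSE = {rrse(obs, pred):.3f}")
```

An RRSE of 0.152, as reported for GEP-3 on the training set, means the model's squared error is about 2.3% (0.152²) of what the mean-only baseline would incur.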
This study introduces an improved model for neutron-gamma density (NGD) measurements, specifically designed to reduce the impact of pair production on density estimations. Using the MCNPX simulation software, the model simulates the interaction of the NGD tool with geological formations, focusing on the energy distribution of neutrons and gamma rays, and the effect of mudcake thickness on gamma flux ratios. The study investigates how varying hydrogen weight fractions, particularly those below 10%, influence gamma and neutron flux. The new model effectively mitigates errors caused by pair production, offering more accurate density estimations, especially for formations with low hydrogen content. Experimental benchmarking with a Cf-252 source was conducted to validate the simulation results, providing a comparison with real-world data. While the model demonstrates improved performance for materials with hydrogen content less than 10%, further refinements may be needed for high hydrogen fraction scenarios.