"Keep On Keep Up" (KOKU) is a tablet-based digital program based on the well-validated Otago and Fitness and Mobility Exercise programs for older adults to decrease the risk of falling. This substudy involved a process evaluation in order to analyze the usage patterns of the KOKU digital program, specifically training frequency, volume, and intensity among older adults over a 3-month self-managed training period. Pre-post changes in physical capacity and real-world walking were examined. This study is a nested cohort study within the three-armed randomized controlled SMART-AGE trial conducted in Germany (German Clinical Trials Register ID: DRKS00034316). Participants aged 67 years or older with basic digital literacy were included. KOKU provided guided but unsupervised progressive strength and balance training for 3 months. The data on training adherence, engagement, and progression were collected. Instrumented assessments included the Timed Up and Go Test, the 30-Second Chair Rise Test, and real-world walking monitoring using wearable sensors. A total of 113 participants (n=63, 56% female; mean age 74.02, SD 5.36 y) were included in the analysis. During the 3-month period, participants used KOKU for 24 (SD 15) days, that is, 2 to 3 times per week. Over the entire study period, no falls or other adverse events were reported due to KOKU usage. The number of exercises performed per participant ranged from 2 to 213, with a median value of 70. The instrumented Timed Up and Go Test results revealed a prolonged total duration (d=0.26; P=.009). In the instrumented 30-Second Chair Rise Test, improvements were observed in the number of completed repetitions (d=0.21; P=.04) and frequency of repetitions (d=0.23; P=.03). This was mainly due to a reduction in inactive time (d=-0.60; P<.001). Real-world walking parameters remained unchanged, except for a slower walking speed during walking bouts of less than 30 seconds (d=0.49; P<.001). All changes did not meet the criteria for minimally important differences. KOKU is a novel digital intervention for older adults, promoting balance and strength exercises. Physical capacity improvements were small. However, the use of instrumented assessments provided further insights into participants' capacity and mobility that would not have been identifiable with conventional assessments. Future improvements to the program should focus on incorporating more challenging exercises for individuals with varying levels of physical capacity.
Although many studies document advancing phenology in response to warming, it is challenging to identify which environmental drivers influence phenology and whether phenological shifts suitably track changing environmental conditions. By examining phenological trends on a climate-relevant scale (e.g., heat accumulated at flowering or days between snowmelt and flowering), we can test whether climate change outpaces phenological shifts. If these climate-relevant phenology metrics remain constant over time, that would suggest perfect phenological tracking to match the climate conditions under which plants historically flowered. We analyzed a long-term data set on timing of first flower and climate from 1995 to 2024 in 25 species in a semiarid intermontane grassland in west-central Montana, United States. We used an information theoretic approach to examine how flowering phenology shifted over time and in response to a suite of environmental variables (temperature, growing degree days [GDD], precipitation, snowmelt, drought), whether responses were nonlinear, and whether responses differed between early- vs. late-spring flowering species. We then asked whether climate-relevant measures of phenology remained constant over time, which would suggest tracking of suitable environmental conditions. Flowering phenology advanced nonlinearly over time and under warmer and drier conditions. In particular, early-spring flowering species advanced flowering under drought conditions. Plants flowered with fewer GDD accumulated and sooner post-snowmelt than historically. Phenological shifts keep pace with climate change in our system. However, nonlinear phenological responses to climate indicate that phenological shifts are slowing. Phenological shifts in spring flowering species may be limited as temperatures continue to increase.
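For context, the growing degree day metric used above is conventionally accumulated from daily temperature extremes. A standard formulation is given below; the base temperature T_base (and any upper threshold) is study-specific and not stated in the abstract, so treat it as an assumption of this sketch.

```latex
% Growing degree days accumulated from the start of the season (day 1)
% to the day of first flower t_f. T_base is the assumed temperature
% below which development does not proceed (study-specific).
\mathrm{GDD}(t_f) = \sum_{t=1}^{t_f} \max\!\left(\frac{T_{\max,t} + T_{\min,t}}{2} - T_{\mathrm{base}},\ 0\right)
```

Under this metric, "flowering with fewer GDD accumulated" means plants now initiate flowering at a lower cumulative heat sum than they did historically.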
This study investigated queer therapists' perceptions of police bias and attitudes toward police. We used a mixed-methods explanatory approach, administering the Perceptions of Police Scale (POPS) to mental health professionals and conducting follow-up qualitative interviews with participants who identify with a queer sexuality to explore the factors influencing their perceptions of police and how they weigh reliance on police when a client is experiencing a mental health crisis. Quantitative data were analyzed in SPSS using correlations and t-tests to assess the relationship between sexuality and POPS scores. Sexuality significantly correlated with perceptions of the police: queer therapists viewed the police less favorably than heterosexual participants. A qualitative analysis using reflexive thematic analysis revealed seven themes addressing how queer therapists perceive police and the strategies they use to decide whether calling 911 for a client in crisis would be lifesaving or life-threatening.
Whom we meet shapes how infections spread. Whereas mathematical epidemiology earlier focused on incorporating age structure, more recent work has begun to reveal the importance of socioeconomic factors for understanding and managing future epidemics.
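To make the point concrete, the sketch below (our illustration, not taken from the highlighted work) shows how a contact matrix stratified by socioeconomic group, rather than age, enters the force of infection in a discrete-time SIR model; all parameter values are invented for the example.

```python
# Two-group SIR step in which a socioeconomic contact matrix C drives
# group-specific forces of infection. All numbers are illustrative.
import numpy as np

beta = 0.05                       # transmission probability per contact
C = np.array([[12.0, 3.0],        # C[i, j]: daily contacts a member of
              [ 3.0, 8.0]])       # group i has with members of group j
N = np.array([6e5, 4e5])          # group sizes
I = np.array([10.0, 10.0])        # initial infecteds
S = N - I                         # susceptibles
R = np.zeros(2)                   # recovered
gamma = 0.2                       # recovery rate (per day)

for day in range(100):
    lam = beta * C @ (I / N)      # force of infection felt by each group
    new_inf = lam * S             # Euler step, dt = 1 day
    S, I, R = S - new_inf, I + new_inf - gamma * I, R + gamma * I
```

Replacing an age-based matrix with a socioeconomically stratified one changes which groups sustain transmission, which is the kind of effect this line of work explores.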
Antifungal resistance (AFR), an emerging and significant threat to humanity, receives far less attention than antibacterial resistance. AFR management is plagued by rising incidence in both immunocompromised and immunocompetent populations, zoonotic infections, limited fungal diagnostics, a limited antifungal armamentarium with even fewer agents in the pipeline, cross-resistance between environmental fungicides and clinically used antifungals, a lack of strong antifungal stewardship programs, and limited application of novel strategies to human AFR, to name a few. In this review, we compile available information, current advances, and our perspectives on major challenges in the origin and management of AFR. We conducted a systematic review of all publications available in standard databases, using search terms related to AFR, through January 2026, and considered all major publications on key questions in AFR. We focus on the following aspects of AFR: environmental drivers; AFR and the "One Health" perspective; climate change and AFR; emerging fungal zoonoses; fungal diagnostic challenges and advances; key resistant fungal pathogens; and antifungal stewardship programs. We summarize the use of newer antifungals and their clinical trials, and provide an overview of novel strategies for tackling AFR, such as the potential use of mycoviruses and vaccines. We also briefly present our perspective on the future of AFR progression and management. Fungi are tiny germs that can cause serious infections in humans. There are only a few medicines available to cure these infections. Sometimes, these medicines cannot kill the fungi. This is called antifungal resistance (AFR). When AFR happens, the fungus keeps growing in the body, making the person sicker, and the infection does not heal. This resistance can develop when these medicines are used too much or unnecessarily, either in farming (to protect crops) or in people (when a different medicine was needed because the germ was different). It is important to detect fungal infections and resistance early so that the right treatment can be given. In the future, we need to develop new medicines, improve laboratory testing methods for detecting these germs early in patients, use antifungal medicines carefully, and create rules to prevent their unnecessary use, especially in farming. These steps will help keep the current medicines effective and prevent antifungal resistance, so that patients are cured of such infections.
INTRODUCTION Anthropogenic climate change is reshaping where plants can live. As temperature and precipitation patterns shift, many species are moving to stay within suitable environmental conditions. Predicting how these range shifts will affect future biodiversity requires knowing both where suitable habitats will occur and whether species can reach them. The latter is challenging because dispersal abilities differ widely among species and depend on landscape structure, anthropogenic barriers, and climatic conditions. Large-scale biodiversity forecasts therefore often rely on overly simple assumptions, such as no dispersal, unlimited dispersal, or identical movement rates for all species, adding major uncertainty to projections and conservation planning. RATIONALE We used the largest global database of observed plant range shifts (BioShifts; 14,488 records across 6579 plant species) to build models that predict species-specific range shift velocities. Combining 6.8 million plant occurrence records, an ensemble of two top-performing habitat models, and climate projections from 10 global circulation models, we mapped current and future suitable habitats (areas with favorable climate, soil, and land use) at 8 × 8 km resolution for each species. Our analysis covers 18% of known vascular plant species under four greenhouse-gas emissions scenarios for 2081 to 2100. We then overlaid the projected future suitable habitats with species-specific range shift velocities to determine where each species is likely to persist or expand by the end of this century. From these results, we estimated global extinction risks, changes in local species richness, and temporal species turnover in community composition. RESULTS Overall, 7 to 16% of modeled plant species are projected to lose >90% of their range across emissions scenarios, placing them at high risk of extinction. Most of these losses (70 to 80%) stem from suitable habitats disappearing as a result of climate change, rather than from dispersal limitations, indicating that climate-induced habitat loss, rather than an inability to keep pace with changing climate, is the primary threat. Although range shifts are unlikely to prevent many global extinctions, they will strongly reshape local species composition. Plant movements into newly suitable habitats are expected to increase local species richness across 28% of Earth's land surface, maintain latitudinally averaged species richness in the tropics and subtropics (35°S to 35°N), and generate substantial species turnover in mid-latitudes (30° to 50° in both hemispheres). By contrast, in regions north of 50°N, warming is so rapid that most plants cannot keep pace, leading to widespread local extirpations and sharp declines in species richness. CONCLUSION Range shifts can help sustain local species richness but are unlikely to provide much relief from global extinctions. To reduce extinction risks, identifying and protecting climate change refugia to safeguard biodiversity, and expanding ex situ conservation efforts, such as global seed bank and botanic garden networks, may be more effective than facilitating migrations. At the same time, conservation strategies should anticipate changing community compositions and ecosystem functioning as new species arrive and ecosystems reorganize.
In high-latitude regions where dispersal lags considerably behind the rapid warming, improving habitat connectivity, reducing human-made barriers, and, where appropriate, assisting species movement could help maintain local species richness, ecosystem productivity, carbon sequestration, and ecosystem stability.
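The overlay step described in the RATIONALE can be pictured as a reachability test on the habitat grid. The sketch below is our minimal reconstruction, not the authors' code: a future-suitable cell counts as colonizable if the species' observed range-shift velocity covers the distance from its current range within the projection horizon (the horizon value here is an assumption).

```python
# Minimal sketch of overlaying future suitable habitat with a species-specific
# range-shift velocity to flag reachable cells. Illustrative only.
import numpy as np
from scipy import ndimage

CELL_KM = 8.0         # grid resolution used in the study (8 x 8 km)
HORIZON_YEARS = 75.0  # assumed gap between the present and 2081-2100

def reachable_future_habitat(current_mask, future_mask, velocity_km_per_yr):
    """Future-suitable cells the species can plausibly colonize.

    current_mask, future_mask: 2D boolean arrays of suitable habitat.
    velocity_km_per_yr: species-specific observed range-shift velocity.
    """
    # Distance from every cell to the nearest currently suitable cell.
    dist_km = ndimage.distance_transform_edt(~current_mask) * CELL_KM
    max_km = velocity_km_per_yr * HORIZON_YEARS
    return future_mask & (dist_km <= max_km)

# Range loss then combines climate-driven habitat loss with dispersal limits:
# current = np.random.rand(200, 200) > 0.7
# future = np.roll(current, 25, axis=0)  # habitat belt shifting poleward
# kept = reachable_future_habitat(current, future, velocity_km_per_yr=1.0)
# range_loss = 1 - kept.sum() / current.sum()
```

Cells that are future-suitable but unreachable separate dispersal limitation from outright climate-driven habitat loss, the distinction the RESULTS section quantifies (70 to 80% of losses being climate driven).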
Alzheimer's disease (AD) is the primary cause of dementia, characterized by a progressive decline in mental abilities and the accumulation of amyloid-beta (Aβ) peptides in the brain. Aggregation of these peptides leads to the development of neuritic plaques and neurofibrillary tangles that disrupt neural communication and eventually lead to the loss of neurons. One of the factors involved in the development of AD is mitochondrial dysfunction. Disrupted mitochondrial functioning leads cells to produce less energy, increases oxidative stress, and accelerates the neurodegeneration process. Normally functioning mitochondria are required in neurons to maintain calcium balance, adequate energy production, and cell survival. Mitophagy, which ensures the clearance of damaged mitochondria, is impaired in AD. Cholinesterase inhibitors and NMDA receptor blockers are currently used as treatments, but these are not aimed at the underlying pathophysiology of the condition. In contrast, new treatment approaches aimed at enhancing mitochondrial health offer the potential to decelerate or alter AD progression. The goals of these approaches include enhancement of the mitophagy process, alleviation of oxidative stress, and preservation of mitochondrial health, which may disrupt major pathological events such as Aβ aggregation and tau hyperphosphorylation. By concentrating on the replacement of mitochondria, scientists are moving in the right direction to develop therapies that will not only help control the symptoms but also cure the disease.
Anemia is a frequent complication in cats with chronic kidney disease (CKD), mainly due to decreased erythropoietin (EPO) production resulting from a loss of renal erythropoietin-producing (REP) cells. Anemia affects both survival and quality of life by inducing clinical signs such as lethargy and anorexia. Early detection of anemia by monitoring hematocrit/packed cell volume (Htc/PCV) trends is essential to initiate treatment before progressive renal damage limits therapeutic effectiveness. Molidustat, a hypoxia-inducible factor prolyl hydroxylase inhibitor (HIF-PHI), is a novel oral drug stimulating endogenous erythropoietin synthesis. However, treatment has to be started when REP cells are still functional, with PCV target ranges between 30-40%. Regular monitoring following the International Renal Interest Society (IRIS) guidelines is essential to keep PCV within the target range and to determine when treatment should be started, stopped, or restarted. Moreover, monitoring the iron status is important since both functional and absolute iron deficiency might be seen in cats with CKD and can impair response to treatment.
While this book's theme is proteomics and virology, this chapter on the role of proteomics in clinical virology may seem unsatisfying to the reader. The reason is simple: proteomics has not made a major impact on the diagnosis of viral infections. However, it is still important to evaluate the present role of proteomics in clinical virology, as it provides a vision of what is required if it is ever to have a major impact in diagnosing viral infections. This chapter first provides background on current techniques for diagnosing viral infections. Although the names may not be familiar to the casual reader, the techniques will be, as most individuals have been tested using one of them during their lifetime. The chapter then switches to describing proteomic methods for the direct detection of viruses in clinical samples. While outside the scope of this book, a proteomic method for bacterial identification is included because (i) it describes a proteomic technology that has been approved by the FDA for clinical use and (ii) since viruses and bacteria are both infectious agents, it demonstrates the potential of proteomics for overall pathogen detection. To keep the topic in perspective, however, the advantages and disadvantages of using proteomics in clinical virology are compared with existing methods, particularly those that target the detection of genomic features within these pathogens.
The capacity to represent the mental states of other individuals, known as 'mindreading' or 'theory of mind', is key to successful social prediction. We suggest that cognitive systems for mindreading are resource-rational: they are optimized for generating good predictions about the behaviour of other individuals, while not exceeding the computational capacity of the mindreader. We explore this hypothesis in a simple formal model where we derive cognitive strategies that excel at social prediction while minimizing cognitive effort. We find that it is often optimal for resource-limited mindreaders to keep track of the facts that another agent also knows, instead of explicitly representing the content of the agent's beliefs. When evaluated in mindreading tasks, simulated agents that use this 'factive' strategy tend to make mistakes in the same cases as non-human primates and young human children. Even agents that use more sophisticated strategies avoid representing beliefs unless necessary. Our results elucidate the computational principles underlying efficient social prediction and explain many of the successes and failures of human and non-human mindreading from first principles.
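To illustrate the 'factive' strategy described above, the toy model below (our construction, not the paper's formal model; all names are hypothetical) tracks only which true facts another agent has witnessed, with no separate belief store. It reproduces the signature error: when the agent's knowledge is stale, the mindreader cannot predict a search guided by a false belief.

```python
# Toy 'factive' mindreader: stores what the agent KNOWS (true facts it
# witnessed) rather than representing the agent's beliefs explicitly.
from dataclasses import dataclass, field

@dataclass
class FactiveMindreader:
    world: dict = field(default_factory=dict)      # mindreader's own knowledge
    agent_knows: set = field(default_factory=set)  # facts the agent witnessed

    def observe(self, fact, value, agent_present):
        """Record a change in the world and whether the agent saw it."""
        self.world[fact] = value
        if agent_present:
            self.agent_knows.add(fact)
        else:
            self.agent_knows.discard(fact)  # agent's knowledge is now stale

    def predict_search(self, fact):
        """Predict where the agent will look.

        With no belief representation, the strategy can only distinguish
        'knows the true state' from 'ignorant'; it cannot predict behaviour
        driven by a false belief.
        """
        if fact in self.agent_knows:
            return self.world[fact]
        return None  # ignorant: no stored (false) belief to consult

# Sally-Anne-style scenario:
m = FactiveMindreader()
m.observe("ball", "basket", agent_present=True)   # agent sees ball in basket
m.observe("ball", "box", agent_present=False)     # ball moved in agent's absence
print(m.predict_search("ball"))  # None, not "basket": the false-belief failure
```

The cheap bookkeeping here (a set of witnessed facts) versus a full belief model is exactly the kind of resource trade-off the paper analyzes.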
The forced-attention dichotic listening (FADL) task is used to examine how people control attention during speech processing. Younger adults can flexibly shift attention to instructed ears from the baseline right-ear advantage (REA) for consonants (cued by temporal information). However, it remains underexplored whether tonal language speakers can control their attention to tones, which are cued by spectral information and typically trigger a left-ear advantage (LEA) at baseline. Moreover, older adults sometimes fail to show these patterns compared with younger adults, possibly due to age-related decline. This study investigated whether this attentional control generalizes to Cantonese tones and whether younger and older Cantonese speakers differ in attentional control in an FADL task with Cantonese tones. Sixty native Cantonese-speaking younger adults, aged 18-25 years (Experiment 1), and 64 older adults, aged 59-72 years (Experiment 2), completed tone training followed by an FADL test. In three conditions (nonforced, forced-left [FL], and forced-right [FR]), they identified dichotically presented Cantonese tones according to attentional instructions. Both younger and older adults successfully modulated attention: participants, regardless of age, enhanced the LEA in the FL condition and reversed it to an REA in the FR condition. An exploratory group comparison showed that younger adults might exhibit a significantly larger shift of ear preference in the FL condition than older adults. The findings provide evidence that auditory attentional control is a highly flexible and cue-general cognitive function that generalizes to Cantonese tones. Crucially, the core performance of attentional control in Cantonese processing remains largely preserved with aging. This suggests that decline is not the only consequence of aging; rather, older adults adopt compensatory strategies to cope with possible decline and keep their behavioral performance intact. https://doi.org/10.23641/asha.32177577.
Despite growing interest in the therapeutic potential of 3,4-methylenedioxymethamphetamine (MDMA), no targeted measure to systematically assess side effects of MDMA-assisted psychotherapy (MDMA-AP) exists. Our aim was to develop an MDMA-Assisted Psychotherapy Side Effects Tool (M-SET) to capture side effects over the course of MDMA-AP. Informed by a systematic review and a review of other relevant questionnaires, we drafted a list of potential side effects. Face and content validation were obtained via a modified two-round online Delphi process involving experts in MDMA-AP and the neuropsychopharmacology of MDMA. Twelve experts consented to participate over two rounds of Delphi panel deliberations (response rate: Round 1 = 83-92%, Round 2 = 75%). The Delphi panellists were asked to keep, discard, modify or suggest additional items. The final version of the M-SET consists of 165 items across four questionnaires that collect information at screening, baseline, the day of medication sessions and longer term follow-up. The use of a modified Delphi technique proved a successful method to generate content for the first structured tool designed to evaluate side effects specifically associated with MDMA-AP. The M-SET is recommended for use in both research and clinical settings. Its implementation has the potential to improve the safety of delivering MDMA-AP as well as support the development of a more systematic and robust evidence base on its safety and tolerability.
Adults think that sunk costs are relevant to predicting choices, but young children do not. In this article, we tested between two accounts for why children do not make sunk cost predictions. One account holds that children conceive of sunk costs in a fundamentally different way than adults. The other account denies this but holds that children do not reason about sunk costs spontaneously. We tested between these accounts in three experiments on 4- to 7-year-olds (N = 484). Children saw sunk cost scenarios where an agent collected objects that were easy or difficult to obtain but could only keep one. Before predicting the agent's choice, children were prompted to think about some aspect of the situation theorized to underlie sunk cost predictions in adults (effort, waste, emotion) or instead were asked a control question. The findings supported both accounts. In line with the spontaneity account, children aged 6 and older made sunk cost predictions when prompted to think about effort (all experiments), the agent's emotions (Experiment 1), and one form of waste (Experiment 3). However, in line with the account positing discontinuity between children and adults, prompting did not bring younger children to make sunk cost predictions, and even among older children, predictions were not at ceiling. This suggests that sunk costs are not fundamental to conceptions of choice. The developmental shift we observed likely reflects broadening in the factors that children see as relevant for choices around objects. We discuss potential drivers of this shift, including firsthand experience with costs and enculturation.
Social insect colonies tightly regulate who belongs. A new study shows that ants exposed to foreign individuals become tolerant toward them, but this tolerance fades without contact, and remarkably, only occasional re-exposure is enough to keep it alive.
The full effects of burnout on the nursing profession will take years to realize. Emerging evidence reveals that as many as 18% of nurses leave the bedside within their first year of practice, many citing the occupational phenomenon of burnout as a factor. This migration of nurses from the bedside into advanced practice prompts the need for peer and organizational support to mitigate burnout among nurses. This project aimed to evaluate the three domains of burnout (emotional exhaustion, depersonalization, and decreased personal accomplishment) among advanced practice nursing students in an effort to keep new advanced practice registered nurses (APRNs) in the workforce. Using simple random sampling, a cohort of 103 APRN students from Baylor University was surveyed using the Maslach Burnout Inventory Human Services Survey for Medical Personnel. Scores for each of the three burnout domains were analyzed for mean and standard deviation as well as with linear regression. Varying degrees of burnout were identified across demographics, with the highest scores belonging to inpatient medical nurses and emergency room nurses. Thirty percent of respondents scored poorly in the emotional exhaustion domain, regardless of the type of units in which they worked. These numbers were congruent with recent trends in data, which reveal burnout levels as high as 30%-50% in health care workers worldwide. This project highlights the urgent need for strategic interventions at the organizational level to diminish the migration of APRNs from the profession.
The application of artificial intelligence (AI) is increasingly valuable as a tool and assistant in many areas of clinical and academic medicine. Generative AI (GenAI) creates new content using large language models, which can generate language that strongly resembles or even improves on that of humans. Learners and educators in many areas of education are using GenAI for essays and assessments, raising issues regarding learning and assessment. GenAI also raises new concerns in health professions education (HPE), an area of health professions training that sometimes has different aims and assessment methods compared with its clinical counterparts. HPE needs to assess levels of knowledge and understanding of pedagogy, and the use of GenAI presents challenges to its current assessments, which are predominantly written. The study aimed to investigate educators' and learners' perspectives on the opportunities and challenges presented by GenAI in postgraduate HPE assessments. It particularly focused on perspectives of how GenAI may influence the future of assessment and essay-based assessments in HPE. Informed by a constructivist paradigm, a qualitative approach was adopted, undertaking 8 semistructured interviews conducted via Microsoft Teams. Purposive sampling ensured a mixture of educators and learners in current HPE courses from a range of health care professions. Data were thematically analyzed. There was no difference between educator and learner perspectives. Four themes were identified: AI is here, and students are at a disservice if we do not embrace it; AI as an opportunity to rethink HPE assessments; AI is a "gray area"; and AI is fallible. The findings present AI as an external catalyst, highlighting a current internal desire for assessment change within HPE. It offers opportunities for creative, authentic assessments that reflect real-life academic and clinical practice, aiming to develop competent future HPE educators and keep courses relevant. These findings contribute to the debate around the future potential and development of AI in HPE assessments.
Repeated coronavirus outbreaks show how hard it is to maintain antiviral effectiveness when the virus mutates. Coronaviruses depend on tightly regulated proteolytic processing mediated by the 3-chymotrypsin-like protease (3CLpro), making it a key antiviral target. Most current inhibitors engage the catalytic site, an approach that remains vulnerable to resistance driven by active-site mutations. Here, we investigate an alternative strategy based on allosteric regulation of 3CLpro by targeting a pocket surrounding residue Asn28, previously shown to influence enzymatic activity and dimer stability. Structure-based virtual screening identified novobiocin as a candidate ligand for this region, which lies adjacent to but distinct from the catalytic center. Biophysical experiments showed direct binding of novobiocin to 3CLpro in solution, with sub-micromolar affinity (Kd ∼ 3 × 10⁻⁷ M). Ligand binding lowered protease thermal stability and dimeric assembly. Enzymatic assays revealed a pronounced reduction in catalytic turnover with minimal effects on substrate binding, consistent with an allosteric mechanism of inhibition, and yielded IC50 values of ∼0.5 μM across independent assays. Molecular docking and simulation analyses supported stable binding at the Asn28-associated pocket and revealed localized changes in conformational dynamics. These findings show that novobiocin allosterically inhibits 3CLpro and identify the Asn28-associated pocket as a relevant target for developing inhibitors with improved resistance to viral evolution.
Porcine epidemic diarrhea virus (PEDV) is an enteric alphacoronavirus that has shifted from a sporadic regional pathogen to a worldwide, persistent virus, despite years of vaccination, biosecurity measures, and monitoring. Since the early 1970s, PEDV has shown strong genetic flexibility and an ability to evade immunity, survive in the environment, and spread quickly through connected livestock networks. Major outbreaks in Asia, Europe, and North America, especially the 2013-2014 outbreak in the United States, have revealed key weaknesses in traditional control methods for enteric coronaviruses, especially those that depend on systemic immunization. In this review, we bring together over fifty years of PEDV research to examine how viral evolution, mucosal immune responses, and livestock production systems work together to keep the virus circulating. We point out that changes in the spike gene, including recombination and deletions, along with changes in other viral genes, affect disease severity, immune recognition of the virus, and the ability of vaccines to provide protection. We also stress the key role of lactogenic immunity and the gut-mammary-secretory IgA system in protecting newborn animals, which helps explain why vaccine inoculations have not been effective in stopping the spread of enteric coronaviruses. By combining evidence from molecular, immunological, epidemiological, and systems research, we suggest that PEDV is a strong example for studying how coronaviruses persist under immune pressure in managed animal groups. Using a One Health approach, this review encourages moving away from only reacting to outbreaks and instead focusing on ongoing, genome-based management of coronaviruses in livestock operations.