Found 20 results
Recent papers argued that the domestication of horses can be equated with the appearance of favorable genetic mutations that are first evident in individuals in the DOM2 clade dated to ∼2200-2100 BCE. We challenge the idea that this genetic shift alone defines domestication. Evidence from archaeology, ancient DNA, osteology, and other disciplines shows that horses from multiple genetic backgrounds (DOM1, DOM2, and, as we suggest here, DOM3) were managed, milked, and ridden long before 2200 BCE. Yamnaya groups (∼3200-2600 BCE) rode DOM2 horses, the direct ancestors of modern domestic stock, while incorporating them into diets, rituals, and mobility systems. Selection for traits linked to endurance and temperament began centuries earlier. Rather than a sudden breakthrough, domestication was a protracted, regionally varied process whose transformative effects on human mobility and social organization began as early as the fourth, if not the fifth, millennium BCE, and set the stage for later DOM2 dominance.
Direct oral anticoagulants (DOAC) have been associated with a reduced risk of dementia compared to warfarin in patients with atrial fibrillation (AF) without prior stroke. However, the impact of DOAC on dementia risk in AF-related ischemic stroke survivors is unclear. We conducted a retrospective, nationwide cohort study using the Korean National Health Insurance Service database. We identified patients with newly diagnosed ischemic stroke and concurrent AF who began DOAC or warfarin therapy within one month after stroke. Incidence of all-cause dementia, Alzheimer's dementia (AD), and vascular dementia (VaD) was compared between groups using multivariable Cox models with inverse probability of treatment weighting (IPTW). A total of 3,112 patients (mean age 70.6 ± 9.5 years; 66.6% male) were analyzed, including 2,919 DOAC users and 193 warfarin users. Over a mean follow-up of 3.63 years, 673 all-cause dementia cases (538 AD, 168 VaD) occurred. After IPTW, DOAC use was associated with higher risks of all-cause dementia (HR 1.16, 95% CI 1.04-1.30) and AD (HR 1.85, 95% CI 1.62-2.13) but a lower risk of VaD (HR 0.54, 95% CI 0.45-0.66) compared to warfarin. In this retrospective nationwide cohort of AF-related ischemic stroke survivors, DOAC use was associated with a higher incidence of all-cause dementia and Alzheimer's dementia, but a lower incidence of vascular dementia, compared with warfarin. These observational findings suggest that anticoagulant type may be differentially associated with subsequent dementia subtypes in this high-risk population and should be interpreted with caution.
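IPTW, as used in the Cox models above, reweights each patient by the inverse of the estimated probability of receiving the treatment they actually received, so that measured covariates are balanced between the DOAC and warfarin groups. A minimal sketch of the (stabilized) weight computation; the propensity scores and treatment indicators below are hypothetical illustrations, not study data:

```python
# Minimal IPTW sketch: stabilized inverse-probability-of-treatment weights.
# Propensity scores and treatment assignments are hypothetical, not study data.

def iptw_weights(treated, propensity, stabilized=True):
    """Return one weight per patient.

    treated:    1 if the patient received DOAC, 0 if warfarin
    propensity: estimated P(treated = 1 | covariates) for each patient
    """
    p_treat = sum(treated) / len(treated)  # marginal treatment prevalence
    weights = []
    for t, ps in zip(treated, propensity):
        w = 1 / ps if t == 1 else 1 / (1 - ps)
        if stabilized:
            # Multiplying by the marginal prevalence reduces weight variance
            w *= p_treat if t == 1 else (1 - p_treat)
        weights.append(w)
    return weights

# Example: 3 DOAC users and 1 warfarin user with assumed propensity scores.
w = iptw_weights([1, 1, 1, 0], [0.9, 0.8, 0.75, 0.3])
```

The weighted pseudo-population is then analyzed with a weighted Cox model; patients whose treatment was unlikely given their covariates receive larger weights.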
Neonatal conjunctivitis (NC), historically attributed to Neisseria gonorrhoeae, is now known to be caused by various pathogens, including Staphylococcus, Streptococcus, and Chlamydia trachomatis. Although viridans group streptococci are generally regarded as commensal organisms, they may cause opportunistic conjunctival infections. Here, we present a case of NC caused by Streptococcus mitis/oralis, a member of the Mitis group of viridans group streptococci. A 15-day-old female neonate presented with bilateral severe eyelid edema and moderate hemopurulent discharge that precluded anterior segment evaluation. Symptoms began at nine days of age with serous discharge, which gradually worsened. The severe presentation initially raised concern for orbital cellulitis. Upon hospital admission, a conjunctival swab was obtained, and treatment was initiated with topical moxifloxacin and fusidic acid, along with empirical intravenous penicillin and gentamicin. Eyelid edema and discharge improved significantly within 24-48 h. Culture results confirmed Streptococcus mitis/oralis, which was sensitive to the administered antibiotics. Complete clinical recovery was subsequently achieved without complications. This case highlights the diagnostic challenge of differentiating severe NC from orbital cellulitis and underscores the potential of S. mitis/oralis to cause significant ocular morbidity in neonates. The prompt and dramatic response to antibacterial treatment prevented long-term complications, and this case supports the hypothesis of postnatal transmission from the neonate's own nasopharyngeal flora or that of the caregiver rather than exclusive acquisition from the birth canal.
The residency application process has become increasingly complex, with constant updates to the Electronic Residency Application Service (MyERAS), the introduction of the Residency Common Application Service (RCAS), and changes to specialty-specific processes like the Urology and Plastic Surgery Matches. Even advisors struggle to provide consistent guidance, leaving students overwhelmed by program signaling, application customization, and other challenges. To address this, we developed and integrated into our curriculum a Residency Application Bootcamp, providing students with a standardized, structured approach to confidently navigate this high-stakes and ever-evolving process. The Bootcamp was integrated into a 2-week Intersessions course at the end of core clerkships. The workshop began with a 2-h didactic covering essential application topics, followed by 1 day of interactive breakout sessions led by Student Affairs Deans, focusing on key application components such as personal statements, curricula vitae (CVs), noteworthy characteristics (NWCs), and an overview of residency application platforms. This structured format emphasized personalized guidance and iterative feedback to prepare students for residency applications. Two hundred forty-two students participated in three iterations of the workshop, rotating through the breakout sessions in groups of 21-22. Students felt it was "the best part of Intersessions," "super helpful," and made other positive comments. Averaging the 3 years' scores together, the breakout sessions scored 4.27 out of 5. In addition, Student Affairs Deans who led the sessions felt they were an effective way to gain more personal exposure to students, provide individualized advising, and plant the seeds for upcoming MSPE meetings. Student Affairs Deans also reported that NWCs showed notable improvement after this session compared to the previous year.
The Residency Application Bootcamp was successfully implemented at our institution. The flexible format of this workshop allows for improvement and incorporation of student feedback. Although developed within a single institution, the Bootcamp's structure is readily adaptable to other medical schools. Further innovation and research should be pursued to better support students during this critical phase of medical school.
Recognising the need to address high rates of smoking within priority populations, the Cancer Council New South Wales launched the 'Tackling Tobacco' program in 2006. This program aims to reduce tobacco-related disparities among groups experiencing social and economic disadvantage in the state of New South Wales (NSW). After almost 20 years of implementation, this paper describes the development, implementation, evaluation, and national scale-up of the Tackling Tobacco program. The Tackling Tobacco program uses an organisational change model to embed evidence-based smoking cessation support within community service organisations (CSOs) which provide services to priority populations across Australia. The program comprises six core elements: committed leadership, comprehensive smoking policies, supportive systems, training and follow up, monitoring and data, and consistent quit support. A desktop review of pre- and post-project surveys and annual reports was conducted and used alongside the published evidence from the Tackling Tobacco research team to describe the implementation and impact of the program from 2006 to the present. This paper presents data from the 2015-2024 annual reporting periods, supplemented with data collected as part of annual reports from the program's inception in 2006. Since its 2006 inception, the Tackling Tobacco program has partnered with 252 CSOs across NSW, training a total of 4580 CSO staff, and reaching 21,648 clients who smoke. Since 2021 (when client-level indicators began to be measured as part of routine evaluation), 3001 clients had their smoking status assessed. Of these clients, 661 accessed nicotine replacement therapy, 214 received Quitline referrals, 295 received referrals to a clinician, and 1432 received behavioural strategies and written support information. Trusted partnerships and the active involvement of research end-users from program inception to implementation are essential for change at a practice and sector level.
Community-based health promotion programs should be built on the highest level of research evidence. Achieving change and impact, particularly at a population level aimed at reducing disparities, requires leadership, resourcing and commitment.
Pharmacists and their teams have a long history of involvement with vaccines. This paper describes the contributions of pharmacy-based immunization delivery in the United States from the 1800s to the present. Early activities centered on the storage and distribution of antitoxins and vaccines. From the 1950s through the early 1990s, pharmacists were facilitating administration of vaccines by other health care providers. In the early 1990s, efforts began to allow pharmacists to administer vaccines, with the first pharmacist immunization training in 1994. By 2009, all 50 states allowed pharmacists to administer vaccines. From the 1990s onward, pharmacists increasingly contributed to national vaccine policies and guidelines, and states expanded their authority to a broader range of patient ages and vaccines. The coronavirus disease 2019 (COVID-19) pandemic opened more opportunities for pharmacy-based involvement, including federal expansion of immunizing authority to children aged 3 years and older, and the authority for pharmacy technicians and student pharmacists to administer vaccines. Pharmacists and their teams administered more than 50% of the COVID-19 vaccines. Today, the majority of adult vaccines are administered in pharmacy-based settings. Even with pharmacy's substantial contributions to immunization delivery in the United States, challenges persist for providers and patients, including state-to-state inconsistencies in immunizing authority by age and disease, and in payer payment. Pharmacy-based immunization delivery has become a significant contributor to improving public health by protecting against vaccine-preventable diseases.
Isolated abdominal aortic dissection (IAAD) is a rare clinical entity, particularly when confined to the infrarenal segment. The presentation is often atypical, leading to delayed diagnosis. We present the case of a 60-year-old female with a history of hypertension who presented to the emergency department with a five-day history of right lower limb pain and inability to stand. Her symptoms began as a sudden, severe, electric shock-like pain in the right flank radiating down the right leg, which subsided after 15 minutes and was replaced by persistent numbness and generalized weakness. Physical examination revealed normal lower limb coloration, palpable pulses, and no signs of acute limb ischemia, lacking the classic five Ps. Echocardiography was unremarkable for proximal aortic pathology. Abdominal CT angiography (CTA) revealed an isolated infrarenal aortic dissection extending to the aortic bifurcation and terminating in the right iliac arteries, with a completely thrombosed false lumen. Given the persistent symptoms suggestive of malperfusion and the absence of suitable endovascular landing zones, the patient underwent an open aortobifemoral bypass using a polytetrafluoroethylene graft. Postoperatively, her neurological symptoms resolved completely, and she was discharged on the 10th postoperative day without complications. This case highlights the importance of considering aortic dissection in patients presenting with unexplained neurological symptoms, even in the absence of overt limb ischemia. Early diagnosis with CTA and appropriate surgical intervention can lead to excellent outcomes.
Extracellular vesicles (EVs) secreted by cells and tissues are promising tools for disease biomarkers and regenerative medicine, necessitating the establishment of technologies to efficiently enhance their secretion. Here, we investigated the effects of a temperature-controlled microwave irradiation system (TEMIS) on EV secretion from HEK293T cells. Unlike conventional microwave (MW) irradiation devices, TEMIS incorporates an active cooling mechanism that maintains the culture medium temperature at 37 °C during irradiation, thereby avoiding cell death associated with temperature elevation. Although cell viability remained unchanged, the amount of EV secretion in the MW-irradiated group increased by 2- to 3-fold compared to the non-irradiated control group, peaking approximately 3 h after irradiation began. No significant differences were observed between the irradiated and control groups in particle size distribution, proteome analysis, or miRNA expression analysis of the isolated EVs. These results indicate that MW irradiation can enhance EV secretion under temperature-controlled conditions, while preserving the molecular characteristics of the EVs. TEMIS enables MW irradiation without damaging cells, potentially making it a practical method for enhancing EV production from cells.
Respiratory disease remains an important threat to bighorn sheep (BHS; Ovis canadensis), complicated by fragmented populations and environmental stressors. Significantly, BHS epizootic pneumonia can lead to all-age die-offs and is population limiting. We describe an interagency investigation of a die-off within a BHS herd in the Badlands region of South Dakota, USA, in 2021-22. Minimum survey counts began declining 2 mo following a reported domestic sheep (Ovis aries) and BHS interaction and dropped from 262 to 56 BHS over 9 mo. Postmortem testing included histopathology, bacterial culture, molecular testing, and partial microbial sequencing in 13 affected BHS. Data from neighboring domestic small ruminants and historical BHS data were compared to outbreak observations. Histopathology revealed moderate to severe bronchopneumonia in many cases. Half of the animals examined harbored mild to heavy lungworm burdens (5/10). Pneumonic tissue culture yielded Trueperella pyogenes (10/11) and Bibersteinia trehalosi (5/11), but no live Mycoplasma spp. (0/11). Conversely, real-time PCR detected widespread Mycoplasma ovipneumoniae (12/13) for the first time in this herd, with co-occurring bluetongue virus (BTV, 4/10). Two unique M. ovipneumoniae strains were found in BHS, one directly related to nearby domestic sheep isolates. Severe epizootic pneumonia was preceded by multi-strain M. ovipneumoniae spillover, with T. pyogenes causing suppurative pneumonia. High BHS density, a moderate to severe drought, and BTV transmission compounded disease susceptibility and mortality in this herd. Strengthened collaborations are necessary to manage BHS facing multiple coinciding health threats.
Inflammatory ocular diseases (IODs) are frequently associated with multisystem autoimmune conditions and can lead to substantial visual morbidity, including visual impairment and blindness. Emerging evidence suggests that inflammation contributes to the development of depression and other mental health disorders. Individuals with childhood-onset IOD may be at increased risk of poor mental health outcomes and quality of life. However, the prevalence of mental health conditions in this population remains unclear. This study aims to review the evidence regarding the prevalence of mental health conditions among children and adults with childhood-onset IOD. This systematic review will follow the PRISMA (Preferred Reporting Items for Systematic Reviews and Meta-Analyses) 2020 guidelines. Eligible studies will report mental health disorder prevalence and outcomes in individuals with childhood-onset IOD, regardless of age at assessment of outcome. Studies evaluating interventions or focusing primarily on mental health effects secondary to visual impairment or blindness will be excluded. Searches will be conducted in PubMed, the Cochrane Central Register of Controlled Trials, Embase, Ovid, and PsycArticles databases. Gray literature will be identified through Google searches. Two researchers will independently screen titles, abstracts, and full texts; extract data; and assess risk of bias using the Risk of Bias in Non-randomized Studies of Exposure (ROBINS-E) tool; disagreements will be resolved by a third reviewer. Data will be synthesized descriptively, with attention to study design, outcome measures, co-occurrence of multisystem disease, and methodological quality. A preliminary scoping search has been completed to estimate the volume of relevant literature. Full searches began in November 2025. Data collection and extraction are currently in progress; analysis and synthesis will follow, using a narrative approach to summarize mental health outcomes across studies.
The final review is expected to be completed by August 2026. The findings from this review will help to establish the prevalence of mental health conditions among people with childhood-onset IOD. The results from this review will support recommendations for further research and policies to ensure the best health outcomes for children. DERR1-10.2196/88888.
Beta-agonist medication and onabotulinumtoxinA injections are commonly used for the treatment of urgency urinary incontinence (UUI); however, there are limited comparative efficacy data to inform best practice. We describe the design and rationale for the Beta-Agonist versus ONA Trial for Urgency Urinary Incontinence (BEST), which was developed in partnership with engaged patients, stakeholders, and communities. The primary objective is to compare treatment efficacy, as measured by the coprimary endpoints of symptom severity and treatment satisfaction at 3 months. Secondary objectives include comparisons of quality of life, global improvement, cognitive and sexual function, and complications, as well as a description of barriers to UUI treatment. BEST is a multisite, single-blinded, randomized trial of beta-agonist versus onabotulinumtoxinA in female participants with UUI. Patient eligibility, primary and secondary outcome measures, and substantive changes in trial design through engagement with patients, stakeholders, and communities are described. Four hundred thirty-two participants will provide 80% power to detect a 15% difference in both coprimary outcomes of treatment satisfaction and symptom severity between treatment arms at a split alpha of 0.025, allowing for 20% attrition. Linear mixed-effects models following the intention-to-treat principle are planned. Participant enrollment began in July 2023 across 5 clinical sites. As of January 6, 2026, 396 of 432 participants (92%) have been randomized; 324 (82% of randomized) have reached the primary 3-month outcome, with a 98.7% retention rate for completion of the coprimary outcomes; and 213 (54% of randomized) have reached the 12-month outcome, with a 94% retention rate. The BEST design was substantively informed by patient and stakeholder input and will provide patient-important comparative efficacy data for beta-agonist versus onabotulinumtoxinA for UUI.
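A two-proportion sample-size calculation of the kind behind the BEST target (80% power, split two-sided alpha of 0.025, 20% attrition inflation) can be sketched as follows. The baseline and comparator satisfaction rates below are assumed purely for illustration; the abstract does not report them, so this sketch does not reproduce the trial's exact figure of 432:

```python
# Two-proportion sample-size sketch (arcsine approximation): 80% power,
# split two-sided alpha = 0.025, then inflation for 20% attrition.
# The 50% vs 65% satisfaction rates are assumed for illustration only.
import math
from statistics import NormalDist

def n_per_arm(p1, p2, alpha=0.025, power=0.80):
    """Per-arm n to detect p1 vs p2 via Cohen's h (arcsine transform)."""
    h = 2 * (math.asin(math.sqrt(p2)) - math.asin(math.sqrt(p1)))  # Cohen's h
    z_a = NormalDist().inv_cdf(1 - alpha / 2)  # two-sided critical value
    z_b = NormalDist().inv_cdf(power)
    return math.ceil(((z_a + z_b) / abs(h)) ** 2)

n = n_per_arm(0.50, 0.65)          # assumed 50% vs 65% satisfaction
total = math.ceil(2 * n / 0.80)    # inflate enrollment for 20% attrition
```

The split alpha enters only through the critical value `z_a`; halving alpha across two coprimary endpoints raises the required n relative to a single-endpoint design.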
Acute kidney injury (AKI) is common in the intensive care unit (ICU), and fixed creatinine thresholds may miss clinically relevant dynamics. We tested whether a deep-learning anomaly signal from short creatinine-estimated glomerular filtration rate (eGFR) series complements conventional criteria for risk stratification. Seven-step daily creatinine-eGFR instances were built from the Medical Information Mart for Intensive Care (MIMIC-III/IV; development/internal validation) and the eICU Collaborative Research Database (external validation). Time series began within 48 hours before ICU admission and ended at kidney replacement therapy (KRT), death, or ICU discharge. An unsupervised Anomaly Transformer trained on MIMIC produced final-step anomaly scores; the 95th percentile of training scores defined a fixed threshold. Anomaly-detected AKI required a final-step creatinine rise plus a score ≥ threshold. We compared scores across Kidney Disease: Improving Global Outcomes (KDIGO) stages and evaluated 24-96-hour KRT and mortality using area under the receiver operating characteristic curve (AUROC), accuracy, and F1-score. After exclusions, the internal dataset included 81,876 admissions (381,700 time-series instances) and the external dataset 140,237 admissions (494,684 instances). Anomaly scores increased stepwise across KDIGO categories and were higher in windows followed by KRT or death. For KRT prediction at 24, 48, 72, and 96 hours, AUROCs were 0.83, 0.82, 0.81, and 0.80 internally and 0.74 at all horizons externally. For mortality, AUROCs were 0.64-0.66 internally and 0.62-0.64 externally. In threshold-based classification, F1-scores were generally highest with the anomaly rule alone, whereas accuracy was greatest when requiring both anomaly detection and KDIGO stage ≥2. Event-capture analyses showed that anomaly detection identified more near-term KRT and mortality events than KDIGO stage ≥2, with the clearest separation for KRT. 
A creatinine-eGFR trajectory-based anomaly signal aligned with clinical severity, was associated with near-term outcomes, and appeared to complement KDIGO-based criteria in ICU populations.
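The fixed-threshold anomaly rule described above can be sketched directly: the 95th percentile of the training anomaly scores defines the cutoff, and "anomaly-detected AKI" requires a final-step creatinine rise plus a score at or above that cutoff. The scores below are synthetic, and a nearest-rank percentile estimator is assumed (the exact estimator is not specified in the abstract):

```python
# Sketch of the fixed-threshold anomaly rule: 95th percentile of training
# scores as cutoff, combined with a creatinine-rise requirement.
# Training scores here are synthetic illustrations.
import math

def percentile_95(scores):
    """Nearest-rank 95th percentile of a list of anomaly scores."""
    ordered = sorted(scores)
    rank = max(1, math.ceil(0.95 * len(ordered)))  # 1-based nearest rank
    return ordered[rank - 1]

def anomaly_aki(final_score, creatinine_rose, threshold):
    """Anomaly-detected AKI: final-step creatinine rise AND score >= cutoff."""
    return creatinine_rose and final_score >= threshold

# Hypothetical training scores: 0.01 .. 1.00 in steps of 0.01.
train_scores = [i / 100 for i in range(1, 101)]
thr = percentile_95(train_scores)
```

Keeping the threshold fixed from the development cohort, rather than re-estimating it externally, is what makes the external AUROC figures a genuine transportability test.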
Emerging diseases can have devastating consequences for wild species, with long-term effects depending on the ability of the host to evolve resistance. Here, we show that major histocompatibility complex (MHC) genes provided standing genetic variation that enabled rabbits to mount a rapid evolutionary response to the myxoma virus pandemic that began in the 1950s. Using historical and modern specimens starting in 1865 and spanning the pandemic, we found strong parallel shifts in MHC-I allele frequencies across Australia, Britain, and France, alongside population-specific signals. These evolutionary shifts are predicted to alter the peptides presented to T cells. Our results provide evidence that MHC-I is under strong selection in natural populations during a pandemic and that the high polymorphism of MHC may have contributed to the recovery of these populations.
Western Melanesia sits at the junction of three major tectonic plates, characterised by dynamic plate interactions and geological diversity. Tectonic activity in this region has given rise to modern New Guinea, a relatively young island with extreme topographic heterogeneity and evolutionarily complex biota. New Guinea, particularly the East Papuan Composite Terrane (EPCT), supports exceptional frog diversity; however, the evolutionary histories of many frogs of the EPCT are poorly understood. The microhylid frog genus Mantophryne currently includes five described species (M. axanthogaster, M. insignis, M. louisiadensis, M. menziesi and M. lateralis), with the first four confined to small portions of the EPCT and the last distributed across the eastern half of New Guinea. The biogeographic history of these frogs remained incompletely resolved at the time of our study. We estimate that Mantophryne diverged from other microhylid genera approximately 14.4 million years ago (MYA) and began to speciate ca. 8.5 MYA. We recover eight distinct lineages within frogs assigned to M. lateralis, which started to diverge ca. 5.5 MYA. Dispersal of Mantophryne out of the EPCT occurred in the late Pliocene, around the time the Central Highlands expanded eastward to connect with the EPCT. Six pairs of sister lineages arose during a time of significant climate reorganisation in the Late Pliocene-Early Pleistocene, when areas to the north of the Central Highlands may have experienced increased precipitation while areas to the south experienced lower rainfall, resulting in contraction of rainforests and expansion of savannahs. We did not identify any significant differences in habitus between lineages, other than M. louisiadensis, which is much larger than all other Mantophryne and has exceptionally high sexual size dimorphism. While an interrogation of the putative cryptic species complex within the M. lateralis clade is beyond the scope of our study, we find that a combination of tectonic rearrangement and climatic history is likely responsible for most of the diversity within the genus that we see today.
Molecular classification of endometrial tumors is now incorporated into the 2023 FIGO staging criteria. The POLE-mutated (POLEmut) subtype is reported to have a favorable prognosis despite aggressive morphologic features. Our institution began universal molecular testing for newly diagnosed endometrial cancers in April 2023. In this study, we describe our institutional experience with this testing and specifically highlight POLEmut endometrial cancers. We aimed to compare the molecular and histopathologic profiles of POLEmut endometrial carcinomas with and without aggressive histopathologic features. A total of 198 endometrial cancer cases were analyzed by a targeted next-generation sequencing panel. We reviewed the results for all cases from April 2023 to December 2024 and, for cases with a pathogenic POLE exonuclease domain/proofreading mutation, collected relevant molecular and clinical data, immunohistochemistry, and resection histopathology when available. Aggressive histopathology was defined as having at least one of the following: FIGO grade 3, ≥ stage pT1b, or lymphovascular invasion. Of 198 tested endometrial cancers, 40.9% (n=81) had no specific molecular profile, 33.8% (n=67) were p53 abnormal, 19.7% (n=39) were mismatch repair-deficient (MMRd), and 5.6% (n=11) had POLE exonuclease domain/proofreading mutations associated with the ultramutated phenotype. At least one aggressive histopathologic feature was seen in 5/10 POLEmut cases that had resection specimens. TP53 variants were present in 5/11 POLEmut specimens and were often subclonal. Without sequencing, 4 POLEmut cases would have been misclassified as either p53 abnormal or MMRd. There was no significant difference in biomarker results, molecular variants, or clinical outcomes between cases with aggressive or nonaggressive histopathologic features.
There is currently no known effective long-term treatment for cognitive impairments in bipolar disorder (BD), and understanding of the neurobiological mechanisms underlying these impairments is lacking. Identifying potential biomarkers for cognitive improvement is essential to aid targeted interventions. This study investigates the neural correlates of cognitive improvement versus lack of improvement over time in BD. Forty-six patients newly diagnosed with BD who recently began treatment in a specialized outpatient clinic and 38 healthy control individuals (HC) were assessed with a battery of neuropsychological tests and a verbal n-back working memory test during functional magnetic resonance imaging (fMRI) at baseline and follow-up (mean: 15 months). Patients were divided into two groups: those who improved cognitively over time, defined as a global cognition change of ≥0.2 SD (BD+: n = 27), and those who showed a lack of normative improvement (BD-: n = 19). BD+ showed hypoactivity in the dorsolateral prefrontal cortex (dlPFC) and dorsomedial prefrontal cortex (dmPFC) compared to HC, both at baseline and follow-up. Hypoactivity in these prefrontal regions correlated with poorer out-of-scanner working memory and executive function for BD- and HC, but not for the BD+ group. While hypoactivity did not predict change in cognition for BD+, it did predict improvement in functioning for both patient subgroups. The findings from this exploratory study suggest that dlPFC hypoactivity may identify those who are likely to respond well to treatment and show an improvement in functioning over time.
Checkpoint blockade immunotherapy has changed the landscape of cancer treatment for many types of malignancies, but it still does not work for all individuals and for all tumor types. Much work over the past decade has been dedicated to discerning what underlies the heterogeneity in patient responses. While the underlying mechanism for checkpoint inhibition is mostly thought to be through facilitating better cellular immunity, Ring and colleagues recently began to broadly categorize the autoantibody landscape in immune checkpoint blockade (ICB)-treated individuals. They used the rapid extracellular antigen profiling technique to survey circulating antibody profiles from 374 patients and 131 healthy controls sampled longitudinally. The authors discovered that autoantibodies were more commonly found in patients with cancer treated with ICB compared with healthy controls, and that the types and levels of autoantibodies remained largely stable over time. While the population-level responses were quite diverse, there were some specific autoantigens that were targeted by the immune systems of multiple individuals. Of these, individuals who possessed autoantibodies against one or more proteins in the type-I interferon, interleukin (IL)-6 or IL-17 pathways or to tumor necrosis factor-like ligand 1A (TL1A) tended to have better responses to therapy, especially if the antibodies were neutralizing against their respective targets. Individuals who generated antibodies that blocked proteins within the bone morphogenesis pathway, conversely, exhibited a worse response to checkpoint inhibition. Overall, using this autoantibody-wide association study approach, the authors were able to characterize the extracellular and secreted autoreactome in detail, and to identify promising new therapeutic targets and areas of future investigation for patients taking ICB therapies.
Shortened hospital stays after total hip arthroplasty (THA) have increased the importance of patient self-management and early engagement in rehabilitation. Smartphone-based digital health applications are increasingly used in perioperative care; however, the clinical value of preoperative initiation remains unclear. We aimed to examine whether preoperative initiation of a smartphone-based physiotherapy support application improves postoperative patient satisfaction and behavioral outcomes compared with postoperative initiation in patients undergoing primary THA. This prospective single-center observational study included patients who underwent primary THA between October 2024 and March 2025. Patients were classified into preoperative or postoperative application initiation groups based on the timing of use. The preoperative group started use at a mean of 19.3 ± 6.4 days before surgery (range: 7 to 30 days), while the postoperative group began use on postoperative day 1. Allocation was determined by smartphone ownership, patient preference, and timing of institutional implementation. Those who began application use one day before surgery were excluded. The primary outcome was the dissatisfaction score of the Japanese Orthopaedic Association Hip Disease Evaluation Questionnaire (JHEQ). Secondary outcomes included the Forgotten Joint Score-12 (FJS-12) and stages of behavior change. Outcomes were assessed preoperatively and at 1 week, 1, 3, and 6 months postoperatively. A total of 137 patients were analyzed. The preoperative initiation group demonstrated a lower JHEQ dissatisfaction score at 1 week postoperatively compared with the postoperative group (mean difference: -10.4, 95% confidence interval (CI): -19.62 to -1.20; unadjusted p = 0.03; adjusted p = 0.15 after Bonferroni correction). The effect size at 1 week was Cohen's d = 0.39 (95% CI: 0.04 to 0.74).
No significant between-group differences were observed at later time points or in FJS-12 scores. At 1 month, behavior change stage distributions differed between groups (p = 0.02). Earlier (preoperative) initiation of a smartphone-based physiotherapy support application was associated with a clinically relevant reduction in patient dissatisfaction at one week postoperatively and more favorable early behavioral readiness for rehabilitation. Shifting the timing of digital health support to the preoperative phase, thereby facilitating earlier exposure to education and guidance, may enhance the early postoperative patient experience and facilitate engagement. The clinical magnitude of this effect was supported by the effect size and confidence interval at the initial recovery stage.
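The reported effect size can be cross-checked against the mean difference: Cohen's d is the mean difference divided by a pooled standard deviation, so d = 0.39 with a 10.4-point difference implies a pooled SD of roughly 26.7 JHEQ points. That SD and the group sizes in the sketch below are inferred or assumed for illustration; the study does not report them directly:

```python
# Cohen's d from group summary statistics, and the pooled SD implied by the
# reported values (mean difference -10.4, d = 0.39). Group means, sizes, and
# the equal-SD assumption below are illustrative, not study data.
import math

def cohens_d(mean1, mean2, sd1, sd2, n1, n2):
    """Cohen's d using the pooled standard deviation of two groups."""
    pooled = math.sqrt(((n1 - 1) * sd1**2 + (n2 - 1) * sd2**2) / (n1 + n2 - 2))
    return (mean1 - mean2) / pooled

# Pooled SD implied by the reported numbers: |mean difference| / d
implied_sd = 10.4 / 0.39  # roughly 26.7 JHEQ points

# Sanity check with hypothetical group means sharing that SD
# (higher dissatisfaction score in the postoperative-initiation group)
d = cohens_d(50.4, 40.0, implied_sd, implied_sd, 67, 70)
```

With equal group SDs the pooled SD reduces to that common value, so the check recovers d = 0.39 exactly by construction.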
Objectives: To evaluate the role of treatment timing in the management of Class II malocclusions with mandibular retrusion using the Twin Block (TB) followed by fixed appliances (FAs). Methods: Forty-one Caucasian patients (22 females, 19 males) with Class II malocclusion treated consecutively with TB and FA were selected from the Orthodontic Clinic of the Careggi University Hospital, Florence, Italy, and from a sample treated by a private practitioner in Auckland, New Zealand. According to the Cervical Vertebral Maturation (CVM) method, the subjects were divided into two groups: an early treated group (ETG) of 21 patients (mean age 10.8 ± 2.1 years) who began treatment before the pubertal growth peak (CS1-CS2), and a late treated group (LTG) of 20 patients (mean age 12.4 ± 1.1 years) treated at the growth peak (CS3-CS4). Cephalometric skeletal, dento-alveolar, and soft tissue parameters were evaluated before treatment with TB (T0) and at mid-term observation (postpubertal stage CS4-CS6) after FA (T1). Independent-samples t-tests were used to compare intergroup differences at T0 and T1, and Fisher's exact test was used to assess gender differences. Results: At T0, the groups showed statistically significant differences in mandibular dimensions, consistent with their different age distributions. At T1, significant differences between ETG and LTG were observed in Co-Gn (5.0 mm, p = 0.048, 95% confidence interval (CI): -9.8 to -0.1 mm), Co-Go (3.7 mm, p = 0.009, 95% CI: -6.3 to -0.9 mm), and Pg'-TVL SN10 (2.7 mm, p = 0.039, 95% CI: -5.2 to -0.1 mm). Conclusions: Class II treatment with TB and FA confirmed that including the pubertal phase in functional-orthopedic treatment led to more favorable mandibular growth and chin projection at the postpubertal observation.
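Fisher's exact test, used above for the gender comparison, can be computed directly from the hypergeometric distribution. The following is a minimal stdlib sketch; the 2x2 counts in the example are illustrative, not the study's data:

```python
from math import comb

def fisher_exact_2x2(a, b, c, d):
    """Two-sided Fisher's exact p-value for the 2x2 table [[a, b], [c, d]]:
    sum the hypergeometric probabilities of every table with the same margins
    that is no more likely than the observed one."""
    r1, r2, c1, n = a + b, c + d, a + c, a + b + c + d

    def prob(x):  # P(top-left cell = x) with all margins fixed
        return comb(r1, x) * comb(r2, c1 - x) / comb(n, c1)

    p_obs = prob(a)
    lo, hi = max(0, c1 - r2), min(r1, c1)  # feasible values of the top-left cell
    return sum(prob(x) for x in range(lo, hi + 1) if prob(x) <= p_obs + 1e-12)

# Illustrative gender table: 3 females / 1 male in one group, 1 / 3 in the other.
# fisher_exact_2x2(3, 1, 1, 3) gives 34/70, about 0.486 (no evidence of imbalance).
```

With groups of 21 and 20 patients, expected cell counts are small, which is why an exact test is preferred here over a chi-squared approximation.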
Quality care for pediatric type 1 diabetes (T1D) requires frequent, multidisciplinary visits. Technological and clinical innovations have changed T1D management, increasing the data exchange required during these visits. Capturing comprehensive personal health and diabetes-related information discretely and integrating it into the clinical workflow is critical for optimal T1D care but is time-consuming. Time spent on data transfer often leaves less time for holistic care and can lead to unmet needs for patients, families, and health care providers, as well as increased time pressure in clinic. To address this, the Children's Hospital of Eastern Ontario developed a caregiver proxy-reported questionnaire distributed via the MyChart patient portal, allowing families to enter care information ahead of visits so that more clinic time can be dedicated to personalized care. The launch of this tool, which integrates caregiver-entered information into the physician's documentation workflow, creates an opportunity to systematically evaluate its impact on care quality and efficiency, with potential implications for broader adoption. Our objective is to evaluate the impact of a caregiver proxy-reported, electronic health record-integrated preclinic questionnaire (MyChart questionnaire) on the quality of care in a pediatric diabetes clinic by measuring caregiver-perceived quality of care compared with standard care, using 2 validated measures of care quality. We also aim to explore the impact of the intervention on glycemic control and visit efficiency. We conducted a single-center, parallel-group randomized controlled trial designed to enroll 222 children with T1D. Participants were randomly allocated in a 1:1 ratio to either the intervention (MyChart questionnaire) or standard care.
Our primary outcome is caregiver-perceived quality of care, measured by the Patient's Evaluation of the Quality of Diabetes Care at 8 months, administered with caregivers serving as proxy respondents for patients. Secondary outcomes are the Patient's Evaluation of the Quality of Diabetes Care at 4 months and the Perceived Quality of Medical Care at 4 and 8 months. Tertiary outcomes include glycemic control and physician-reported visit efficiency at 4 and 8 months. Analysis of covariance models will be used to compare postintervention outcomes between treatment groups, adjusting for baseline values. Recruitment began in April 2023 and was completed in February 2024, with a total of 139 participants enrolled. Data collection has concluded, and the first results are expected in the spring of 2026. This study is the first randomized trial to assess the impact of a caregiver proxy-reported, electronic health record-integrated preclinic questionnaire distributed via a patient portal on caregiver-perceived quality of care in a pediatric care setting. The results will guide changes in health service infrastructure and delivery to enhance comprehensive data capture and improve care quality within and beyond pediatric T1D.
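The planned analysis of covariance (a postintervention score regressed on treatment group plus the baseline score) can be sketched with ordinary least squares. This is a stdlib-only toy under our own naming, with exact toy data rather than trial data:

```python
def ancova_group_effect(y, group, baseline):
    """Baseline-adjusted group effect from the model
    y = b0 + b1*group + b2*baseline, fit by ordinary least squares
    via the 3x3 normal equations (Gaussian elimination, no pivoting)."""
    X = [[1.0, g, b] for g, b in zip(group, baseline)]
    xtx = [[sum(row[i] * row[j] for row in X) for j in range(3)] for i in range(3)]
    xty = [sum(row[i] * yi for row, yi in zip(X, y)) for i in range(3)]
    for i in range(3):                      # forward elimination
        for j in range(i + 1, 3):
            f = xtx[j][i] / xtx[i][i]
            xtx[j] = [xtx[j][k] - f * xtx[i][k] for k in range(3)]
            xty[j] -= f * xty[i]
    beta = [0.0, 0.0, 0.0]
    for i in (2, 1, 0):                     # back-substitution
        s = sum(xtx[i][k] * beta[k] for k in range(i + 1, 3))
        beta[i] = (xty[i] - s) / xtx[i][i]
    return beta[1]  # coefficient on the group indicator

# Toy data generated exactly from y = 2 + 3*group + 0.5*baseline:
# ancova_group_effect([2.5, 3.0, 6.5, 7.0], [0, 0, 1, 1], [1, 2, 3, 4]) recovers 3.0
```

Adjusting for the baseline score in this way removes variance explained by pretreatment differences, which is why ANCOVA is the conventional choice for pre/post randomized designs like this one.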