A position paper released by the European Association of Nuclear Medicine emphasised the need for multidisciplinary engagement to establish dosimetry-based personalised treatment in radionuclide therapy (RNT). Uncertainty analyses, often ignored in routine clinical practice, should be incorporated into dose calculations to improve the efficacy and accuracy of treatment. In this study, patients with haematological malignancies undergoing radioimmunotherapy were evaluated. Our aim was to calculate the uncertainties associated with each parameter of the single-time-point (STP) dosimetry chain and to compare the results with multiple-time-point (MTP) dosimetry in the bone marrow and liver. 28 patients received an intravenous injection of 111In-besilesomab (0.17 ± 0.01 GBq) for pre-therapeutic dosimetry and were subsequently treated with 90Y-besilesomab (2.43 ± 0.53 GBq). A dosimetry analysis was performed on bone marrow (BM) and liver with MTP and STP. We investigated the uncertainty in population mean effective half-life, volume, recovery coefficient, counts, measured activity, fitting parameters, time-integrated activity, S-factors, and absorbed dose (AD) for a group of patients. The mean absorbed dose per unit administered activity (DpA) to BM was 5.8 ± 1.7 mGy/MBq with MTP and 5.8 ± 1.6 mGy/MBq with STP, and to the liver was 2.9 ± 1.9 mGy/MBq with MTP and 3.1 ± 2.4 mGy/MBq with STP. The mean fractional uncertainty associated with the total absorbed dose to BM was 13.18 ± 3.46% with MTP and 18.75 ± 3.22% with STP, and to the liver was 5.77 ± 3.13% with MTP and 49.78 ± 25.36% with STP. A moderate positive relationship (R2 = 0.7) was noted between post-injection acquisition time and AD uncertainty with STP for BM, whereas a strong positive relationship (R2 = 1) was noted for the liver. The absorbed dose uncertainty with STP was significantly higher than with MTP.
Incorporating uncertainty analysis for STP dosimetry parameters into routine clinical practice is strongly recommended. Accuracy in the acquisition time, population-based half-life, and fitting function for the time-activity curve is vital for minimising uncertainty in STP dosimetry, which is less time-consuming and easier to implement in clinical practice than MTP.
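The MTP/STP comparison above rests on propagating per-parameter uncertainties through the dosimetry chain. A minimal sketch of one standard step, combining independent fractional uncertainties in quadrature, is shown below; the example values are illustrative, not the study's:

```python
import numpy as np

def combined_fractional_uncertainty(fractional_uncertainties):
    """Combine independent fractional uncertainties in quadrature.

    A generic propagation rule for a multiplicative chain
    (e.g. counts -> activity -> time-integrated activity -> dose);
    assumes the contributing terms are uncorrelated.
    """
    u = np.asarray(fractional_uncertainties, dtype=float)
    return float(np.sqrt(np.sum(u ** 2)))

# Hypothetical per-parameter fractional uncertainties (illustrative only):
# volume, recovery coefficient, counts, half-life, S-factor
u_total = combined_fractional_uncertainty([0.05, 0.08, 0.03, 0.10, 0.04])
```

Because the terms add in quadrature, the largest single contributor (here the population half-life) dominates the total, which is why the abstract stresses its accuracy for STP dosimetry.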
Light pollution has been implicated in adverse liver health outcomes. This study aimed to investigate the association between bedroom nighttime light pollution and the risk of hepatic encephalopathy (HE) in patients with hepatocellular carcinoma (HCC). A total of 454 HCC patients were enrolled from communities. Bedroom nighttime light intensity was measured using an illuminometer (lux) at baseline, 2 months, and 4 months. Sleep quality was assessed at these three time points using the Pittsburgh Sleep Quality Index. These data at different time points were averaged separately for subsequent analyses. All patients were followed up for 12 months from baseline (unless death occurred), and HCC-related adverse outcomes were recorded. Multivariate logistic and Cox regression were adopted for statistical analysis. The results indicated that higher mean bedroom nighttime light intensity (> 50 lx) was significantly associated with an increased risk of both overt HE and minimal HE. Furthermore, it was associated with impaired liver function, esophagogastric variceal bleeding, and elevated HCC-related mortality. Notably, interaction analysis revealed that age and TNM stage may modify the aforementioned associations to some extent. In conclusion, bedroom nighttime light pollution is linked to an elevated risk of HE and may represent a potential risk factor warranting future validation.
Ureteral stents are routinely used following endourological procedures to ensure adequate drainage and prevent obstruction. However, stent-related morbidity remains common, and optimal stent dwell time and removal methods are not well defined. This systematic review aimed to evaluate clinical and procedural factors influencing ureteral stent dwell time and the methods used for stent removal after endourological interventions. A systematic review was conducted in accordance with PRISMA guidelines and registered on PROSPERO. MEDLINE and Embase were searched from inception to October 2025. Randomized controlled trials and comparative observational studies evaluating ureteral stent dwell time and/or removal methods in adults undergoing endourological procedures were included. Risk of bias was assessed using RoB 2 and ROBINS-I tools. Thirty-two studies encompassing 4,373 patients were included. Reported stent dwell times varied widely, most commonly ranging between 10 and 14 days in uncomplicated cases, with longer durations associated with increased rates of encrustation and removal difficulty. Removal techniques included rigid cystoscopy (48.7%), flexible cystoscopy (19.9%), extraction strings (23.5%), and device-assisted methods (7.9%). Less invasive approaches, particularly flexible cystoscopy and extraction-string removal, were consistently associated with reduced pain scores and improved patient comfort, although extraction strings carried a small risk of premature dislodgement. While practice patterns vary, the evidence suggests that a 10-14 day dwell time might be the optimal window to balance healing with the prevention of encrustation. Less invasive removal approaches, particularly flexible cystoscopy and extraction-string techniques, were generally associated with lower pain scores and high procedural success rates in selected patients. 
While these methods are safe and better tolerated, extraction strings carried a small, reproducible risk of premature dislodgement. High-quality prospective studies are needed to define determinant-based, individualized stent management strategies.
The increasing application of time-series analysis in fields such as biomedical engineering and telecommunications emphasizes the need for high-quality data to train and evaluate advanced machine learning models. Acquiring temporal data at suitable resolutions is often limited by ethical, economic, or practical constraints. We introduce CoSiBD (Complex Signal Benchmark Dataset for Super-Resolution), a synthetic dataset designed for reproducible time-series super-resolution research. CoSiBD provides 2,500 high-resolution signals (N = 5,000 samples each over a reference domain τ ∈ [0, 4π]) with aligned low-resolution versions at four levels (150, 250, 500, and 1,000 samples) obtained via uniform decimation. Signals are generated with diverse non-stationary behaviors through piecewise frequency modulation and spline-based amplitude envelopes, and both clean and noisy variants are provided. Signals are distributed as NumPy arrays, plain text, and JSON, with comprehensive metadata describing segment structure, generation parameters, and seeds for full reproducibility. Technical validation analyzes spectral properties and reports baseline SR benchmarking and transfer experiments on EEG and speech data.
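The generation scheme described above (piecewise frequency modulation, smooth amplitude envelope, uniform decimation) can be sketched as follows; the segment count, frequency range, and envelope shape here are illustrative assumptions, not the actual CoSiBD generator parameters:

```python
import numpy as np

def make_signal(n_samples=5000, seed=0):
    """Sketch of a non-stationary test signal: piecewise frequency
    modulation with a smooth amplitude envelope over tau in [0, 4*pi].
    (Illustrative only; not the actual CoSiBD generator.)"""
    rng = np.random.default_rng(seed)
    tau = np.linspace(0.0, 4.0 * np.pi, n_samples)
    # Piecewise-constant instantaneous frequency over four equal segments
    freqs = rng.uniform(0.5, 3.0, size=4)
    seg = np.repeat(freqs, n_samples // 4)
    phase = np.cumsum(seg) * (tau[1] - tau[0])   # integrate frequency
    envelope = 1.0 + 0.5 * np.sin(tau / 4.0)     # smooth amplitude modulation
    return tau, envelope * np.sin(phase)

def decimate_uniform(x, n_low):
    """Uniform decimation to n_low samples (aligned low-resolution view)."""
    idx = np.linspace(0, len(x) - 1, n_low).astype(int)
    return x[idx]

tau, hr = make_signal()
lr = decimate_uniform(hr, 250)   # one of the four low-resolution levels
```

Keeping the decimation indices deterministic is what keeps the low-resolution versions aligned with their high-resolution counterparts, which super-resolution training requires.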
Proximal femoral fractures are highly prevalent in Japan, with over 200,000 cases annually and a rising trend. Fracture liaison service (FLS) interventions improve osteoporosis treatment initiation and reduce refracture rates. The content of FLS interventions varies by institution, and the effectiveness of our intervention remains unclear. The aim of this study was to evaluate the effectiveness of our FLS intervention in preventing fragility fractures within 1 year after proximal femoral fracture surgery. A retrospective case-control study was performed on patients aged ≥ 50 years undergoing surgery for proximal femoral fracture between February 2021 and January 2024. Patients were divided into non-FLS (pre-August 2022) and FLS groups. Data including demographics, comorbidities, fracture type, medication initiation, and refracture occurrence within 1 year were extracted. Statistical analyses involved Mann-Whitney U, χ2 tests, and Cox proportional hazards modeling. Among 521 eligible patients, osteoporosis medication initiation within 3 months improved from 14% in the non-FLS group to 100% in the FLS group (p < 0.05). Time to medication initiation decreased from 20 to 12 days (p < 0.05). The refracture rate was significantly lower in the FLS group (1.8% vs. 5.7%, p < 0.05). Multivariate analysis showed FLS intervention significantly reduced refracture risk (HR 0.32, 95% CI 0.12-0.89, p = 0.03), and the effect remained robust in sensitivity analyses accounting for cognition, walking ability, and discharge destination. FLS intervention effectively reduced fragility fractures within 1 year postoperatively by enhancing early osteoporosis treatment initiation. Continued FLS programs and long-term follow-up are recommended to sustain benefits.
Tuberculosis (TB) remains a significant public health challenge in Northeast Iran, yet longitudinal data evaluating regional transmission patterns following COVID-19 disruptions remain limited. This cross-sectional study, conducted from 2017 to 2023 at Qaem University Hospital, Mashhad, Iran, analyzed 14,572 patients with suspected TB using smear microscopy, culture, and PCR to characterize the test positivity rate (TPR), temporal shifts, and diagnostic challenges. We identified a 10.3% TPR (1,494 out of 14,572), with significant demographic disparities: females accounted for 51.7% of cases (OR = 0.74, 95% CI: 0.665-0.824; p < 0.001), although the association strength was weak (Cramér's V = 0.05). Adults aged 65 and older represented 51.6% of the cases. The COVID-19 pandemic led to a 33.3% decline in diagnoses from 2019 to 2020, with outpatient recovery lagging behind inpatient services. Time-series analysis identified a significant structural break in March 2020 (p < 0.001), statistically confirming the sharp decline in diagnoses due to the pandemic. Bronchoalveolar lavage showed the highest positivity rate at 54.7%, identifying 135 smear-negative/culture-positive cases. Seasonal peaks in spring (27.8%) are hypothesized to result from post-winter Vitamin D troughs and social gatherings during Nowruz. These findings emphasize the importance of geriatric-focused screening, multimodal diagnostic protocols, and pandemic-resilient TB surveillance. Regional policies should focus on integrated respiratory screening and community-based interventions to reduce seasonal transmission.
Depressive symptoms have been on the rise among young adults, with the transition to college, particularly the first year, being a critical period of vulnerability. Despite prior research on depression trajectories in college students, limited longitudinal studies have explored unique depressive symptom trajectory groups among first-year students and their associations with academic achievement (GPA), sleep patterns, and whether sociodemographic factors are associated with certain trajectories. This study analyzed a pre-existing dataset that was collected over two waves from a private university (spring semesters 2017 and 2018). The final pooled sample comprised first-year undergraduate students (N = 271) who reported on their depressive symptoms (CES-D scale) at the start and end of the semester, signed a release record for their fall and spring term GPA, and provided continuous sleep data across the academic spring term with Fitbits. K-means++ clustering was conducted to form depressive symptom trajectory groups. ANOVAs, Watson-Williams, and Dunnett's post hoc comparison tests were employed to examine how the resulting trajectory groups were associated with GPA and sleep outcomes (bedtime, waketime, total sleep time, time in bed). Associations between sociodemographic variables and trajectory groups were investigated using chi-square tests. K-means++ clustering identified four trajectory groups: low-stable (n = 109), increasing (n = 72), decreasing (n = 51), and high-stable depressive symptoms (n = 39). The low-stable and decreasing groups had higher spring term GPAs (M = 3.44 and M = 3.39, respectively) compared to the increasing and high-stable groups (M = 3.22 and M = 3.18, respectively). The low-stable group generally had an earlier wake time and bedtime, and greater total sleep time and time in bed, relative to the decreasing and increasing trajectory groups.
Gender, ethnicity, international student status, and first-generation student status were not associated with trajectory groups. Consistent with prior work, there are unique depression trajectory groups among first-year college students that represent stability and change of depressive symptoms over the course of a spring semester. Favorable trajectories (low-stable and decreasing symptoms) are associated with better academic performance and sleep habits.
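The trajectory-grouping step can be illustrated with a minimal k-means++ implementation applied to toy (baseline, end-of-semester) symptom pairs; the study would typically use an established library implementation, and the data below are invented for illustration only:

```python
import numpy as np

def kmeans_pp(X, k, n_iter=50, seed=0):
    """Minimal k-means with k-means++ seeding: each new seed is drawn
    with probability proportional to its squared distance from the
    nearest existing seed, then standard Lloyd iterations refine."""
    rng = np.random.default_rng(seed)
    centers = [X[rng.integers(len(X))]]
    for _ in range(k - 1):
        d2 = np.min([np.sum((X - c) ** 2, axis=1) for c in centers], axis=0)
        centers.append(X[rng.choice(len(X), p=d2 / d2.sum())])
    centers = np.array(centers)
    for _ in range(n_iter):
        labels = np.argmin(((X[:, None, :] - centers[None]) ** 2).sum(-1), axis=1)
        for j in range(k):
            if np.any(labels == j):
                centers[j] = X[labels == j].mean(axis=0)
    return labels, centers

# Toy trajectories: (baseline, end-of-semester) symptom scores
X = np.array([[5, 5], [6, 4], [20, 20], [21, 19], [5, 20], [19, 4]], float)
labels, centers = kmeans_pp(X, k=4)
```

With two time points per student, each trajectory is a 2-D point, and the four clusters correspond naturally to low-stable, high-stable, increasing, and decreasing patterns.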
This paper proposes an integral sliding mode based adaptive robust backstepping control scheme to improve the trajectory tracking and hovering performance of a quadrotor unmanned aerial vehicle (UAV) under large-scale time-varying disturbances. It also considers the impact of variations in the payload mass of the UAV, such as occur in power line maintenance or rescue operations. The proposed scheme effectively mitigates the influence of disturbances on the flight process when the upper bound of the time-varying disturbance is unknown, and estimates the potentially uncertain parameters of the system in real time. Using Lyapunov stability theory, it was proven that the designed controller ensures asymptotic convergence of the tracking error to zero. Furthermore, this paper integrates adaptive control with the concept of integral sliding mode, combining their respective technical characteristics in a complementary manner. The proposed adaptive law, incorporating a σ-modification term, effectively suppresses the chattering inherent in sliding mode control, ensuring system stability. The integration of the sliding mode surface further accelerates the error convergence to zero. The simulation results validate the performance of the proposed control scheme in various scenarios, including continuous weak disturbances, changes in payload mass, and sudden large-scale time-varying disturbances. The results demonstrate that the proposed control scheme has strong robust stabilization and wide applicability, outperforming traditional adaptive robust control methods and classical PID methods.
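The σ-modification idea can be illustrated on a scalar toy system: the adaptive gain grows with the sliding variable but leaks back through the σ term, keeping it bounded, while a smooth switching function damps chattering. This is a conceptual sketch only; the paper's controller is a full quadrotor backstepping design whose structure and gains differ:

```python
import numpy as np

def simulate(T=10.0, dt=1e-3):
    """Toy 1-D regulation problem dx/dt = u + d(t) with a
    sigma-modification adaptive sliding-mode term (illustrative only)."""
    x, k_hat = 1.0, 0.0          # state, adaptive gain estimate
    gamma, sigma = 5.0, 0.1      # adaptation rate, sigma-modification
    lam = 2.0                    # linear feedback gain
    for i in range(int(T / dt)):
        t = i * dt
        d = 0.5 * np.sin(2 * t)                   # bounded time-varying disturbance
        s = x                                     # sliding variable (reference = 0)
        u = -lam * s - k_hat * np.tanh(s / 0.05)  # smooth switching term
        k_hat += dt * (gamma * abs(s) - sigma * k_hat)  # sigma-mod adaptive law
        x += dt * (u + d)
    return x, k_hat

x_final, k_final = simulate()
```

Without the `- sigma * k_hat` leakage, `k_hat` would drift upward whenever `s` is nonzero; the leakage trades exact convergence for a bounded gain, which is the chattering-suppression mechanism the abstract describes.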
In the context of the aging of the global population, the prevalence of knee joint disorders continues to rise. Concurrently, the integration of robotic systems and intelligent implants represents an inevitable trend in orthopedic surgery. A comprehensive evaluation of the safety and effectiveness of robot-assisted total knee arthroplasty (RA-TKA) is therefore urgently needed to inform clinical decision-making. This study aimed to compare 9 RA-TKA systems across 8 outcomes. A systematic literature search was conducted in the PubMed, Web of Science, Embase, Cochrane Library, CBM, CNKI, Wanfang, and VIP databases from inception to December 1, 2025. The risk of bias and methodological quality were assessed via Review Manager (version 5.4). Network meta-analysis was performed via RStudio (version 4.4.1). A total of 36 studies involving 2841 patients were included. In direct comparisons, conventional TKA (C-TKA) yielded shorter operative times than MAKO, HURWA, SkyWalker, ROSA, and Brainlab Knee. CORI also had a shorter operative time than Brainlab Knee. Compared with the C-TKA, MAKO, HURWA, SkyWalker, and TiRobot groups, the ROSA group presented higher KSS-knee scores. In addition, C-TKA, HURWA, and CORI presented higher KSS-knee scores than SkyWalker. For the KSS-function scores, C-TKA and ROSA scored higher than HURWA. C-TKA demonstrated a greater postoperative ROM than HURWA. For HKA angle deviation, C-TKA resulted in greater deviation than MAKO, HURWA, SkyWalker, TiRobot, and EPMEDBOT. In the comprehensive best probability ranking, C-TKA (93%) ranked highest in terms of operative time. SkyWalker (87%) ranked highest in terms of blood loss. SkyWalker (91%) ranked highest in terms of the KSS-knee scores. HURWA (87%) ranked highest in terms of the KSS-function scores. MAKO (85%) ranked highest for HSS. YUANHUA (76%) ranked highest for the WOMAC. CORI (69%) ranked highest for ROM.
SkyWalker (87%) ranked highest for HKA angle deviation. Overall, RA-TKA demonstrated superior safety and effectiveness compared with C-TKA, with different robotic systems exhibiting distinct advantages across outcome measures. Nevertheless, C-TKA retains a significant advantage in reducing the operative time, highlighting an important area for further optimization of robotic-assisted TKA.
Peritoneal fibrosis, driven by M2 macrophage polarization, limits the long-term application of peritoneal dialysis (PD). Although ADAM19 is known to mediate fibrosis in other organs, its specific role in PD-associated peritoneal fibrosis remains unclear. PD patients were enrolled at a single center and divided into three groups according to PD duration. Demographic and clinical data were collected. We detected the expression of ADAM19, Notch1, fibrosis-associated proteins, chemokines, and inflammatory factors in the peritoneal dialysis effluent by real-time PCR and western blot assays. Macrophages were identified through flow cytometry. We then analyzed the relationship between ADAM19 and clinical data in PD patients. Furthermore, we established mouse models of peritoneal fibrosis to verify the biological function of ADAM19 in regulating macrophage polarization. In the long-term group, the fibrotic proteins (Fibronectin, α-SMA), inflammatory factors (IL-6, IL-10), and chemokines (CCL5, CCL2, CXCL16) were higher than in the short-term group, and more macrophages polarized towards M2. ADAM19 expression was linearly correlated with dialysis time and Kt/v. The AUROC of ADAM19 for predicting peritoneal dialysis adequacy was 0.738, with a cut-off ADAM19 RNA level of 7.84. In logistic regression models, higher ADAM19 (≥ 7.84) was also independently associated with lower Kt/v (< 1.67). Additionally, the results revealed a moderate increase in M1 macrophages (CD86+) and a marked rise in M2 macrophages (CD206+) with high-glucose dialysis fluid in the mouse model. Furthermore, the 8-week G4.25% group showed significant growth of M2 macrophages compared to the 4-week G4.25% group, indicating that prolonged dialysis duration has a more pronounced effect on promoting M2 polarization of macrophages via the ADAM19/Notch1 signaling pathway.
By stimulating chemokines and inflammatory factors, ADAM19 regulated macrophage polarization and was correlated with the progression of peritoneal fibrosis. ADAM19 is expected to be a novel indicator for assessing peritoneal ultrafiltration function in PD patients.
The study presents experience with a wound-protector device, the lap-protector, during costal cartilage harvest in auricular reconstruction, with the aim of improving outcomes, particularly final scar quality and length. This retrospective study comprised fifty-five patients who underwent costal cartilage harvesting for auricular reconstruction and were admitted between June and August 2022. The author divided the patients into 2 groups according to whether the lap-protector was used: group 1 underwent costal cartilage harvest with the minimally invasive conventional technique using the lap-protector, while group 2 underwent the same procedure without it. Patients were followed up for a 6-month period. Operative time, intraoperative blood loss, postoperative pain, and donor-site scar quality and length were recorded and measured for both groups. Twenty-five patients with the lap-protector were compared to thirty without. There were no significant differences between the two groups in demographic characteristics (P > 0.05). Pain scores dropped steadily over the 5 days in both groups, and patients in group 1 reported lower pain levels than group 2 on the following days (P < 0.05), except the fifth day. In those without the lap-protector, the average scar length was 5.43 ± 0.44 cm, longer than the 3.61 ± 0.29 cm average in group 1 (P < 0.05). Analysis of VSS results showed better scar formation in group 1 (P < 0.05). The differences in postoperative pain, scar length, and scar quality between the two groups were statistically significant (P < 0.05). There were no differences between the two groups in operative time or intraoperative bleeding (P > 0.05).
The application of the lap-protector in costal cartilage harvest can optimize scar formation and reduce postoperative pain without prolonging operative time, making it a convenient and effective technique for achieving satisfactory results.
Video microscopy, when combined with machine learning, offers a promising approach for studying the early development of in vitro produced (IVP) embryos. However, manually annotating developmental events, and more specifically cell divisions, is time-consuming for a biologist and cannot scale up for practical applications. We aim to automatically classify the cell stages of embryos from 2D time-lapse microscopy videos with a deep learning approach. We focus on the analysis of bovine embryonic development using video microscopy, as we are primarily interested in the application of cattle breeding, and we have created a Bovine Embryos Cell Stages (ECS) dataset. The challenges are three-fold: (1) low-quality images and bovine dark cells that make the identification of cell stages difficult, (2) class ambiguity at the boundaries of developmental stages, and (3) imbalanced data distribution. To address these challenges, we introduce CLEmbryo, a novel method that leverages supervised contrastive learning combined with focal loss for training, and the lightweight 3D neural network CSN-50 as an encoder. We also show that our method generalizes well. CLEmbryo outperforms state-of-the-art methods on both our Bovine ECS dataset and the publicly available NYU Mouse Embryos dataset.
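The focal-loss component used to counter class imbalance can be sketched for the binary case; CLEmbryo's exact multi-class formulation and weighting are not reproduced here:

```python
import numpy as np

def focal_loss(p, y, gamma=2.0, alpha=0.25):
    """Binary focal loss (Lin et al.): down-weights well-classified
    examples by (1 - p_t)^gamma so training focuses on hard cases.

    p: predicted probability of the positive class; y: 0/1 labels.
    """
    p = np.clip(p, 1e-7, 1 - 1e-7)
    pt = np.where(y == 1, p, 1 - p)          # probability of the true class
    a = np.where(y == 1, alpha, 1 - alpha)   # class weighting
    return float(np.mean(-a * (1 - pt) ** gamma * np.log(pt)))

# Well-classified examples are down-weighted relative to hard ones:
easy = focal_loss(np.array([0.95]), np.array([1]))
hard = focal_loss(np.array([0.30]), np.array([1]))
```

The modulating factor `(1 - pt) ** gamma` is what addresses the imbalanced cell-stage distribution: abundant, easily classified stages contribute little gradient, while rare or boundary-ambiguous stages dominate the loss.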
We study the deterministic Susceptible-Infected-Susceptible (SIS) epidemic model on weighted graphs. van Mieghem et al. have shown that it is possible to learn an estimated network from a finite time sample of the trajectories of the dynamics that in turn can give an accurate prediction beyond the sample time range, even though the estimated network might be qualitatively far from the ground truth. We give a mathematically rigorous derivation for this phenomenon, notably that for large networks, prediction of the epidemic curves is robust, while reconstructing the underlying network is ill-conditioned. Furthermore, we also provide an explicit formula for the underlying network when reconstruction is possible. At the heart of the explanation, we rely on Szemerédi's weak regularity lemma.
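The deterministic SIS dynamics on a weighted graph follow the standard mean-field ODE and can be integrated directly. This sketch shows only the forward dynamics that generate the epidemic curves, not the network-estimation procedure the paper analyzes:

```python
import numpy as np

def sis_trajectory(A, beta, delta, x0, T=10.0, dt=1e-2):
    """Euler integration of the deterministic networked SIS model:
    dx_i/dt = -delta * x_i + beta * (1 - x_i) * sum_j A_ij x_j,
    where x_i is the infection probability of node i and A is the
    weighted adjacency matrix (a standard NIMFA-type formulation)."""
    x = np.array(x0, dtype=float)
    traj = [x.copy()]
    for _ in range(int(T / dt)):
        x = x + dt * (-delta * x + beta * (1 - x) * (A @ x))
        traj.append(x.copy())
    return np.array(traj)

# Toy 3-node weighted graph with a single initially infected node
A = np.array([[0, 1, 0.5], [1, 0, 1], [0.5, 1, 0]])
traj = sis_trajectory(A, beta=0.8, delta=0.3, x0=[0.1, 0.0, 0.0])
```

Many different weighted matrices A can produce nearly identical trajectories `traj`, which is the intuition behind the paper's result that curve prediction is robust while network reconstruction is ill-conditioned.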
To compare perioperative outcomes between the 48-h short-stay pathway and traditional inpatient management for patients undergoing robot-assisted partial nephrectomy (RAPN), and to evaluate the feasibility, safety, recovery efficiency, and economic benefits of the 48-h short-stay pathway. This retrospective study included 175 patients who underwent RAPN between February 2022 and June 2024. Patients were assigned to a 48-h short-stay group (n = 60) or a traditional inpatient group (n = 115). A 1:1 propensity score matching (PSM) was conducted to balance baseline characteristics, including age, sex, BMI, comorbidities, tumor features, surgeon identity, and surgical year. Perioperative outcomes, recovery indicators, complications, and medical costs were compared. After PSM, 53 matched pairs were analyzed. The short-stay group showed significantly shorter operative time, less intraoperative blood loss, shorter warm ischemia time, earlier mobilization, earlier oral intake, faster bowel function recovery, and shorter bed rest (all P < 0.05). The short-stay group had 71.7% of patients discharged on postoperative day (POD) 1 and 100% within 48 h, while the traditional group had 22.6% on POD1, 33.96% on POD2, and 43.4% on POD ≥ 3 (P < 0.001). Both total and postoperative hospital stays were significantly shorter in the short-stay group (2.00 vs. 6.00 days, P < 0.001), with lower hospitalization costs (P < 0.001). Postoperative creatinine was lower in the short-stay group (P = 0.023), while creatinine change was comparable (P = 0.063). Complication rates, emergency department visits, and 30-day readmission rates were similar between groups (all P > 0.05). The short-stay group had a significantly lower drain placement rate (P = 0.002) without increased adverse events. The 48-h short-stay pathway for selected patients undergoing RAPN is feasible and safe. 
It accelerates postoperative recovery, shortens hospital stay, reduces medical costs, and optimizes healthcare resource utilization, without compromising safety or early oncological outcomes.
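The propensity-score-matching step can be illustrated with a minimal greedy 1:1 nearest-neighbour matcher; the caliper and the scores below are hypothetical, not taken from the study:

```python
import numpy as np

def greedy_match(ps_treated, ps_control, caliper=0.05):
    """Greedy 1:1 nearest-neighbour matching on the propensity score,
    without replacement: each treated subject takes the closest still-
    available control, provided the distance is within the caliper.
    (A simplified sketch of the PSM step; the study's covariates and
    matching settings are not reproduced here.)"""
    available = list(range(len(ps_control)))
    pairs = []
    for i, p in enumerate(ps_treated):
        if not available:
            break
        j = min(available, key=lambda c: abs(ps_control[c] - p))
        if abs(ps_control[j] - p) <= caliper:
            pairs.append((i, j))
            available.remove(j)
    return pairs

pairs = greedy_match([0.30, 0.70, 0.52], [0.31, 0.50, 0.90, 0.69])
```

Matching without replacement is why the study reports 53 matched pairs from 60 short-stay and 115 traditional patients: treated subjects without a sufficiently close control are dropped rather than force-matched.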
Chronobiology has advanced scientifically since 2000. Translating this knowledge and approach to medicine can alter diagnosis, treatment, and prevention, and improve health. Adding time-of-day (or time-of-year) information is both a concrete and conceptual change to clinical practice and public health relevant to humans and other animals, with low implementation costs. Successful translation of chronobiology to medicine requires new methods, training, and organizational and regulatory action.
Non-small cell lung cancer (NSCLC) remains one of the leading causes of cancer-related mortality worldwide. However, the diagnostic sensitivity and specificity of commonly used tumor markers, such as carcinoembryonic antigen (CEA) and cytokeratin-19 fragment (CYFRA21-1), remain limited. This study aimed to evaluate the diagnostic value of serum exosomal 3'tiRNA-PheGAA and interleukin-6 (IL-6), alone and in combination with conventional tumor markers, for the detection of NSCLC. Peripheral blood samples were collected from 110 patients with NSCLC and healthy controls. Serum exosomes were isolated, and the expression of 3'tiRNA-PheGAA was measured using quantitative real-time polymerase chain reaction (qRT-PCR). Serum levels of CEA, CYFRA21-1, and IL-6 were determined by electrochemiluminescence immunoassay (ECLIA). The diagnostic performance of individual and combined biomarkers was evaluated using receiver operating characteristic (ROC) curve analysis. Serum levels of 3'tiRNA-PheGAA, IL-6, CEA, and CYFRA21-1 were significantly higher in NSCLC patients than in healthy controls (P < 0.0001). ROC analysis showed that the area under the curve (AUC) values for 3'tiRNA-PheGAA and IL-6 were 0.680 and 0.898, respectively. The combination of 3'tiRNA-PheGAA and IL-6 increased the AUC to 0.926, while the four-marker panel (3'tiRNA-PheGAA, IL-6, CEA, and CYFRA21-1) achieved the highest diagnostic performance with an AUC of 0.971. Serum exosomal 3'tiRNA-PheGAA and IL-6 may serve as promising non-invasive biomarkers for NSCLC diagnosis, and their combination with conventional tumor markers significantly improves diagnostic accuracy.
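The ROC comparison of single versus combined markers reduces to computing the AUC, which equals the Mann-Whitney probability that a randomly chosen case scores above a randomly chosen control. A generic sketch (the scores below are invented for illustration):

```python
import numpy as np

def auc(scores_pos, scores_neg):
    """AUC as the Mann-Whitney probability P(score_pos > score_neg),
    counting ties as one half. This is how the single-marker and
    combined-marker panels are compared on a common scale."""
    sp = np.asarray(scores_pos, float)[:, None]
    sn = np.asarray(scores_neg, float)[None, :]
    return float(((sp > sn).sum() + 0.5 * (sp == sn).sum()) / (sp.size * sn.size))

# Toy single-marker scores for cases vs. controls:
pos_m1, neg_m1 = [2.0, 3.0, 1.5], [1.0, 2.5, 1.2]
single = auc(pos_m1, neg_m1)
```

A combined panel is typically scored by a fitted linear (e.g. logistic-regression) combination of the individual markers, and its AUC is computed the same way, which is how the four-marker panel's 0.971 is compared against the 0.680 and 0.898 of the single markers.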
There have been discussions as to the time of elective induction of labour to curb the continuation of pregnancy that might endanger the lives of both the mother and child. This research was conducted to assess foetal and maternal consequences of planned delivery at 40 and 41weeks in women with low-risk singleton pregnancy. A randomised controlled trial with equal allocation of participants (96 pregnant women in each arm) into 40weeks and 41weeks. Participants were randomised at the antenatal clinic at 39 weeks for induction of labour. The main outcome was the caesarean section rate. Secondary outcomes were maternal (genital tract laceration rate) and foetal (rates of meconium staining of amniotic fluid, SCBU admission, perinatal mortality, birth trauma, birth weight, and neonatal APGAR score at 1 and 5 minutes). Student t-test and chi-square test were used for inter-group comparison. Incidence of caesarean delivery (26.6% vs. 21.3%; p=0.406), and genital laceration (2.1% vs. 5.6%; p=0.268) did not differ between groups. Significantly higher birth weight was noted among women induced at 41weeks (3.41 ± 0.37kg) than 40weeks (3.28 ± 0.46kg) (p=0.043). Also, there was significant variation in meconium staining of amniotic fluid between 40weeks (11.7%) and 41weeks (25.8%) (p=0.014). Other foetal outcomes showed no significant difference. Inducing labour at 40weeks is safe for low-risk women as it does not significantly increase the cesarean delivery rate and adverse perinatal outcomes. Therefore, elective induction of labour at 40weeks should be recommended and introduced into obstetric practice without the fear of adverse outcomes.
Federated learning (FL) has become a highly promising paradigm for privacy-preserving distributed model training by enabling edge devices to train without sharing raw data. In practice, however, edge environments are both non-stationary and asymmetric, with data distributions varying due to shifts in user behaviour, sensing conditions, and overall environmental dynamics. This causes concept drift (sudden, gradual, and recurrent), leading to poor model performance, slower convergence, and predictive bias. Current FL approaches (FedAvg, DP-FedAvg) do not jointly tackle drift adaptation, differential privacy (DP), and resource efficiency. To address these constraints, we present FedDriftGuard, a federated learning layer that unifies client-level drift detection, drift-adaptive aggregation, and adaptable differential privacy into a single, FLE architecture-compatible system. The proposed DP-DriftNet model implements attention-based time encoding to capture changing data patterns and drift-directed feature weighting to allow greater flexibility in the presence of distributional changes. A drift-optimal privacy scheduler allocates noise probabilistically, subject to a limited privacy budget, thereby enforcing an appropriate privacy-utility trade-off without voiding formal DP guarantees. In addition, update sparsification, compression, and periodic transmission techniques are used to reduce communication overhead. Extensive experimentation on real-world and synthetic drift datasets shows that FedDriftGuard outperforms baseline FL techniques, achieving accuracy and F1-score gains of 9-14% and 11-17%, respectively, with 28% shorter adaptation latency and 20-35% lower communication cost. These findings are statistically significant and confirm the soundness of the proposed method. FedDriftGuard offers effective, scalable, privacy-preserving learning in adaptable, drifting edge environments.
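As a baseline for the DP mechanisms discussed, one DP-FedAvg-style aggregation round (clip each client update, average, add Gaussian noise) can be sketched as follows; FedDriftGuard's drift-adaptive weighting and privacy scheduler are described in the abstract but not reproduced here:

```python
import numpy as np

def dp_fedavg_round(client_updates, clip=1.0, noise_std=0.1, seed=0):
    """One DP-FedAvg-style round: clip each client update to an L2
    norm bound, average the clipped updates, then add Gaussian noise.
    Clipping bounds per-client sensitivity, which is what makes the
    Gaussian mechanism yield a formal DP guarantee."""
    rng = np.random.default_rng(seed)
    clipped = []
    for u in client_updates:
        u = np.asarray(u, float)
        norm = np.linalg.norm(u)
        clipped.append(u * min(1.0, clip / max(norm, 1e-12)))
    mean = np.mean(clipped, axis=0)
    return mean + rng.normal(0.0, noise_std, size=mean.shape)

agg = dp_fedavg_round([[0.2, 0.1], [3.0, 4.0], [-0.1, 0.3]])
```

A drift-aware scheduler of the kind the abstract proposes would vary `noise_std` per round under a total privacy budget, spending less noise when drift is detected and accuracy is most at risk.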
The rapid integration of generative artificial intelligence (GenAI) tools, such as ChatGPT, into educational contexts has raised important questions regarding how adolescents conceptualize and make sense of these technologies. Understanding students' perceptions is essential for developing age-appropriate, ethical, and pedagogically sound approaches to AI use in secondary education. This descriptive qualitative study employed a phenomenological approach and metaphor analysis to explore secondary school students' perceptions of generative artificial intelligence. The study sample consisted of 332 students aged 14-18 years from four secondary schools in Türkiye. Data were collected using an open-ended prompt ("Generative artificial intelligence is like … because …") and analyzed through content analysis. Metaphors were categorized based on shared semantic and conceptual features, and inter-rater reliability was established using Cohen's kappa (κ = 0.92). Analysis revealed ten metaphor categories clustered under five overarching themes: generative artificial intelligence as (1) a source of knowledge, (2) a teaching and guiding entity, (3) a supportive and assisting tool, (4) a reflection of human intelligence, and (5) a dual-purpose (beneficial-risky) technology. Students most frequently conceptualized GenAI as a comprehensive knowledge source (e.g., book, encyclopedia) and as a human-like cognitive entity (e.g., brain, wise person). At the same time, metaphors reflecting ethical awareness and potential risks, such as misuse and overreliance, were also identified. The findings indicate that secondary school students hold multifaceted and nuanced perceptions of generative artificial intelligence, encompassing both educational opportunities and ethical concerns. 
These results highlight the importance of integrating AI literacy into secondary education in ways that promote critical thinking, responsible use, and awareness of GenAI's limitations alongside its potential benefits. It was determined that secondary school students perceive generative artificial intelligence ambivalently as both a useful tool and a source of ethical and emotional concern, highlighting the need for developmentally appropriate artificial intelligence literacy approaches. • GenAI tools such as ChatGPT are increasingly integrated into educational contexts and have the potential to support personalized learning, information access, and student engagement. • Existing research has primarily focused on educators' perspectives or higher education settings, while studies examining adolescents' perceptions of GenAI remain limited. • This study provides empirical evidence on secondary school students' metaphorical perceptions of generative artificial intelligence within a K-12 context. • Findings reveal that adolescents conceptualize GenAI in multifaceted ways, including as a knowledge source, teaching and guiding entity, supportive tool, reflection of human intelligence, and a dual-purpose (beneficial-risky) technology.
This study examines the long-term variations in the water-level time series of Lake Uluabat, located in western Türkiye, over the past six decades. Despite the decline in lake water level in recent years, the scarcity of reliable information remains a major obstacle to understanding this phenomenon. To overcome this limitation, monthly water-level observations spanning from October 1960 to September 2019 (708 months) were analyzed to explore temporal dynamics in trend, homogeneity, stationarity, frequency, persistence, entropy, and the reconstructed phase-space geometry. The analyses were conducted for the entire period (1960-2019) and six decadal intervals (i.e., 1960-1969, 1970-1979, 1980-1989, 1990-1999, 2000-2009, and 2010-2019) to identify regime shifts and decade-scale variability. Autocorrelation-function, mutual-information, probability-distribution, return-period, and dimensional analyses were performed. The teleconnections between lake-level fluctuations and 19 large-scale atmospheric-oceanic oscillation indices were also investigated. Results indicated a persistent downward trend, accompanied by a rise in system entropy and in short-term dependencies, indicating increased complexity and dependence on external factors. In short, the recent lake water-level properties indicate a reduced degree of functionality and self-dependency of the hydrological regime. Yet the temporal teleconnection between lake water level and the climatic oscillations remained stable. This indicates that both climatic and anthropogenic factors have a direct effect on lake water-level states, although in this case the latter seems to have the upper hand.
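Nonparametric trend testing of the kind applied to such monthly series can be illustrated with the Mann-Kendall S statistic; this is a generic sketch, and the study's exact test suite is not specified here:

```python
import numpy as np

def mann_kendall_s(x):
    """Mann-Kendall S statistic: the sum of signs of all pairwise
    later-minus-earlier differences. A strongly negative S indicates a
    monotonic downward trend; significance is then assessed against the
    statistic's null variance (not computed in this sketch)."""
    x = np.asarray(x, float)
    n = len(x)
    s = 0
    for i in range(n - 1):
        s += int(np.sign(x[i + 1:] - x[i]).sum())
    return s

# A declining toy series yields a negative S:
s_stat = mann_kendall_s([5.0, 4.5, 4.6, 4.0, 3.8, 3.1])
```

Because the test uses only the signs of differences, it is robust to the non-Gaussian, persistent behaviour that the entropy and phase-space analyses reveal in the lake-level record.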