Rheumatoid arthritis (RA) is a chronic, inflammatory autoimmune disease characterized by progressive joint involvement. The disease is associated with reduced health-related quality of life, functional ability, work productivity, and daily activity, in addition to pain, fatigue, and inflammation-induced structural damage. RA emerges through a complex interplay of multiple factors, including the release of inflammatory cytokines. One such cytokine is tumor necrosis factor (TNF), which is inhibited by golimumab, an approved treatment for the disease. Golimumab is produced using a recombinant cell line that originated from genetically modified mice immunized with human TNF. This narrative review presents the efficacy and safety outcomes from the golimumab pivotal randomized controlled trials (RCTs) that evaluated the subcutaneous and intravenous formulations. Across the RCTs, golimumab treatment was associated with achievement of an American College of Rheumatology 20% improvement (ACR20) response. A majority of participants reported at least one adverse event (AE); the most frequent were upper respiratory infection, nasopharyngitis, and bronchitis. The review also presents data from post hoc analyses and real-world studies, which demonstrated benefits that included improvements in health-related quality of life and daily productivity, prolonged drug survival, and reduced fatigue, pain, disease activity, and structural damage. Lastly, data from meta-analyses, including RCTs and real-world studies, demonstrated that golimumab had an adverse event pattern similar to other TNF inhibitors. Collectively, these studies demonstrated that golimumab is an effective and safe treatment for RA when used according to the approved indication.
To evaluate the effectiveness and safety of mirikizumab in patients with ulcerative colitis in clinical practice. We conducted a retrospective, multicenter cohort study across five Italian inflammatory bowel disease centers. Patients initiating mirikizumab between November 2024 and May 2025 were included. The primary endpoint was steroid-free clinical remission (SFCR) at 12 weeks, defined as the absence of rectal bleeding and near-normal stool frequency without corticosteroids. Secondary outcomes included clinical response (≥30% improvement in stool frequency and rectal bleeding), biochemical response (normalisation of fecal calprotectin (FCP) and C-reactive protein (CRP)), and safety (any adverse events). Follow-up at 24 weeks was assessed when available. Subgroup analyses were performed according to prior exposure to biologics (including ustekinumab) and induction regimen. A total of 95 patients were included (mean age 49.9±16.9 years; 50.5% male); 12.6% were biologic naïve and 17 (17.9%) received extended induction. At week 12, 43.2% (41/95) of patients achieved SFCR, 61.1% (58/95) achieved a clinical response, and 33.7% (32/95) achieved biochemical remission. SFCR rates were similar between biologic-naïve and biologic-experienced patients (42.6% vs 43.8%, p=0.906). SFCR rates were 36.2% and 88.2% in patients receiving extended and standard induction, respectively (p=0.002). At week 24, prior ustekinumab exposure was associated with numerically lower SFCR, although this difference was not statistically significant (38.5% vs 73.7%; OR=0.22, 95% CI 0.05 to 1.01, p=0.052). Changes in biomarkers were not statistically significant at week 12 (CRP Δ-0.08 mg/dL, p=0.464; FCP Δ-249 µg/g, p=0.127). Three patients discontinued treatment due to lack of efficacy. No adverse events were observed. Mirikizumab demonstrated meaningful clinical effectiveness and a favourable safety profile in a real-world setting, including in biologic-experienced patients.
Diabetes mellitus accounts for a significant share of morbidity and mortality in ages 30-70 years worldwide. In sub-Saharan Africa, diabetes care is often suboptimal for reasons ranging from health system weaknesses to patient illiteracy and non-compliance with recommendations. This study explores the potential costs and health benefits of optimising care for uncomplicated type 2 diabetes in Lagos State, Nigeria. Longitudinal data on medical care patterns and resource use (consultations, medications, diagnostics and lifestyle counselling) over a 1-year period were collected retrospectively from 84 health facilities in Lagos. Medical resource prices were obtained from a subsample of 26 facilities. Patient care gaps were assessed by comparing actual patient journeys to official diabetes management guidelines. Mixed-effect regression analyses were employed to explore the impact of care elements on blood glucose control and model the potential complications averted if all patients received recommended care, with extrapolation to the entire Lagos population. Data from 642 patients with uncomplicated type 2 diabetes were analysed. A one-unit increase in consultation score (a measure of the adequacy of consultation visits) and having health insurance coverage were linked to 47-unit and 29-unit lower blood glucose levels, respectively. Optimising diabetes care requires US$3716 per patient annually, totalling US$2.1 billion statewide, with medications comprising 97% of costs. Enhanced care could reduce stroke and myocardial infarction by 2% (12 675 cases) and 4% (22 282 cases) over 7 years, respectively, at a cost of US$61 492 per complication averted. The investment required to optimise diabetes care in Lagos is currently unfeasible under existing approaches.
There is a need to explore innovative financing and delivery options, including digital value-based care interventions and cost-saving care approaches such as pooled medication procurement, while also investing in local medicines production capacity and expanding health insurance coverage.
The availability of real-world object stimuli that meet researchers' requirements is an ongoing challenge in visual cognition research. While numerous manually curated object stimulus sets exist, stimulus features such as size, color, and orientation tend to vary widely within a given set and may not be suitable for studies with specific requirements regarding these parameters. However, recent advances in artificial intelligence (AI) can facilitate the generation of highly realistic, custom-made stimuli. Building on these developments, the present study aimed to share a set of 200 AI-generated images of everyday objects for research use. The objects were oriented as though 'placed' on a flat surface, such that they could be naturally embedded in virtual scenes. Moreover, they were created in greyscale and suitable for rendering in different colors. Here, we report the method used to efficiently generate the stimuli, as well as the results from a validation study in which we assessed the nameability, perceived realism and familiarity of the stimuli in a sample of 45 younger (18-35) and 45 older (65-85) adults. As anticipated, the majority of the stimuli were rated highly across all three measures, and no significant age differences were observed. The results thus validated most of the stimuli for future research. The stimuli, each in seven colors, and the corresponding validation scores are openly available for future use. Low-level image statistics of mean brightness and contrast for each image are also included in the dataset.
Bruton tyrosine kinase inhibitors (BTKis) are foundational therapies for B-cell malignancies but are associated with clinically important cardiovascular toxicities. Ibrutinib has been linked to atrial fibrillation/flutter (AF/AFL), bleeding, and ventricular arrhythmias, whereas second-generation BTKis such as zanubrutinib may confer improved cardiovascular safety. To compare adverse cardiovascular outcomes between zanubrutinib and ibrutinib in a large real-world cohort. We conducted a retrospective, multicenter cohort study using the TriNetX Global Collaborative Network. Adults aged ≥18 years with chronic lymphocytic leukemia/small lymphocytic lymphoma, mantle cell lymphoma, follicular lymphoma, Waldenström macroglobulinemia, or marginal zone lymphoma initiating a BTKi were included. Patients with AF/AFL before starting a BTKi were excluded. Propensity score matching (1:1) was performed across 41 covariates. Outcomes were assessed within 12 months of beginning the BTKi. Cox proportional hazards models were used to calculate hazard ratios (HRs), and a p-value <0.05 was considered significant. Propensity matching yielded 3,447 balanced pairs. The mean age was 71 years, with 40% females in each cohort. Compared with ibrutinib users, zanubrutinib users experienced significantly lower risks of incident AF/AFL (HR=0.47; 95% CI: 0.38-0.58; p<0.001), major bleeding (HR=0.79; 95% CI: 0.68-0.91; p=0.001), and all-cause mortality (HR=0.27; 95% CI: 0.15-0.48; p<0.001). There was no significant difference between the two cohorts in intracranial hemorrhage, ischemic stroke, myocardial infarction, ventricular arrhythmias, incident heart failure, or incident hypertension. Zanubrutinib was associated with a lower risk of AF/AFL, major bleeding, and all-cause mortality compared with ibrutinib. These findings support the preferential use of zanubrutinib when cardiovascular safety is a key consideration in patients with B-cell malignancies.
To describe the frequency of platelet count testing and decreased platelet counts (DPCs), time to onset and resolution of DPCs, and clinical management of DPCs among patients with epithelial ovarian cancer (EOC) who received first-line maintenance (1LM) niraparib monotherapy. This retrospective US-based cohort study included adults with EOC who had a platelet count of ≥100,000/μL at 1LM niraparib monotherapy initiation (April 1, 2020-June 3, 2023). Platelet count testing patterns were described, and patients were grouped by any-grade DPC, grade ≥3 DPC, or no DPC. Timing of DPC onset and resolution was estimated, and niraparib clinical management strategies (ie, dose modifications) and the frequency, timing of, and reasons for niraparib discontinuation were described. Among 543 eligible patients, most received frequent platelet count tests; for those who experienced DPCs (≈37% any-grade, ≈22% grade ≥3), onset was usually early (≤1 month after 1LM niraparib initiation), with ≥98.0% of cases resolving and a median time to resolution of ≈2 weeks. Aligning with management guidelines, most patients received dose modifications during the first DPC, and most continued treatment (≈84%). Most patients who discontinued treatment during the first DPC did not receive a dose modification prior to discontinuation. In this real-world setting of patients with EOC prescribed 1LM niraparib, DPCs usually occurred early, resolved quickly for nearly all patients, and were generally managed with dose modifications, resulting in low discontinuation rates. However, dose modifications were absent in most patients who discontinued, highlighting an opportunity to improve clinical management strategies to support treatment continuation and thereby potentially optimize patient benefit.
Artificial intelligence (AI) systems in healthcare often fail to improve patient outcomes despite high development accuracy. We conducted semi-structured interviews with patients (n = 18), health professionals (n = 8), and AI developers (n = 8), using a postpartum depression risk algorithm as a use case. Through thematic analysis informed by sociotechnical frameworks, we identified six themes: harm mitigation, clinical utility, communication strategies, data quality, privacy/security, and responsible governance. All stakeholders emphasized that patient-centered AI must provide actionable benefits while minimizing bias, stigma, and anxiety. Patients wanted professional interpretation of AI outputs. Participants identified tensions between explainability and accuracy, varying patient preferences for accessing predictions, and unclear accountability when AI recommendations cause adverse outcomes. Our findings support patient-centered implementation through four strategies: providing professionals with competencies and protected time; engaging stakeholders throughout development; offering flexible communication accommodating diverse health literacy; and establishing multi-layered governance with shared accountability across developers, professionals, and institutions.
Evidence-based parenting programmes are widely used to prevent violence against children and improve parenting and mental health. Despite hundreds of randomised trials, little is known about their outcomes when delivered at scale within routine delivery. This study assesses the WHO-endorsed and UNICEF-endorsed Parenting for Lifelong Health programme for caregivers and adolescents, delivered through non-governmental organisations and governments in Botswana, the Democratic Republic of the Congo, Eswatini, South Africa, South Sudan, Tanzania, Zambia and Zimbabwe, with support from the President's Emergency Plan for AIDS Relief (PEPFAR), the United States Agency for International Development (USAID) and the European Union. Pre-post surveys for caregivers and adolescents were integrated into service data collection between 2016 and 2022. Abbreviated standardised measures of physical abuse, emotional abuse, approval of corporal punishment, positive involved parenting, monitoring/supervision, caregiver depressive symptoms, parenting stress and adolescent depressive symptoms and externalising behaviour were used. Individual country scores were analysed separately for caregivers and adolescents using generalised linear mixed-effects models, and cross-country data were combined using a random-effects meta-analytic model. 123 050 participants were included (93% retention; 57 908 adolescents (96% female) and 56 423 caregivers at follow-up). In all-country meta-analyses, estimates showed reduced physical abuse (-65%; 95% CI 51% to 74%), emotional abuse (-59%; 95% CI 48% to 68%) and approval of corporal punishment (-55%; 95% CI 48% to 60%). Positive involved parenting increased (+52%; 95% CI 24% to 87%) and poor supervision/monitoring decreased (-48%; 95% CI 34% to 58%).
Caregiver depressive symptoms (-25%; 95% CI 8% to 48%), parenting stress (-46%; 95% CI 41% to 52%), adolescent depressive symptoms (-22%; 95% CI 1% to 38%) and adolescent externalising behaviour problems (-43%; 95% CI 29% to 54%) all declined. There was heterogeneity in pre-intervention scores and extent of change between humanitarian and development settings, and between different target groups, but strong consistency across caregiver and adolescent reports. In eight African countries, including humanitarian and pandemic-affected contexts, an evidence-based parenting programme showed consistent associations with reduced violence against adolescent girls and improved parenting and mental health.
Infliximab (IFX) for inflammatory bowel disease (IBD) treatment may increase the risk of hepatitis B virus (HBV) reactivation, particularly in areas with high HBV prevalence such as China. This study aimed to evaluate HBV reactivation/infection, liver dysfunction, and vaccination efficacy and strategies in IBD patients undergoing IFX therapy. This retrospective, multicenter study included 4183 IBD patients from 15 hospitals across China, who were divided into six groups according to HBV status. Demographic features, HBV vaccination status, reactivation/infection rates, and liver dysfunction outcomes were collected, with data collection performed from 2009 to 2022. We found that the HBV reactivation rate was notably higher in the HBsAg-positive group than in the other groups (P < 0.05) despite antiviral treatment. Although only 29% of patients were immunized at IFX initiation and almost no patients were vaccinated against HBV during IFX treatment, no patients in the susceptible population group experienced HBV infection. The study underscores a critical need for rigorous HBV screening before IFX initiation. Even with antiviral prophylaxis, continuous monitoring of HBV DNA remains necessary for HBsAg-positive patients. HBsAg-negative patients, including the susceptible population, had a very low risk of new HBV infection, reassuring patients and physicians of the safety of IFX in this cohort.
Severe hypertriglyceridemia (sHTG) has a causal role in acute pancreatitis (AP), whereas the relationship between triglyceride (TG) levels and risk of cardiovascular (CV) events is less well-known. To assess the incidence, risk, and odds of AP and CV events among US adults with sHTG and hypertriglyceridemia (HTG) compared to those with normal TG levels. The Optum Research Database was used to identify 4 cohorts of adults with a TG test between January 1, 2017, and March 31, 2021. sHTG (500 ≤ TG <880 mg/dL) and extreme HTG (eHTG; TG ≥880 mg/dL) were identified first, followed by random identification of normal TG (35 ≤ TG <150 mg/dL) and HTG (150 ≤ TG <500 mg/dL) cohorts. Primary outcomes included incidence, adjusted risk, and adjusted odds of AP and CV events. A total of 134,116 patients were included: 46,676 (34.8%) with normal TG, 54,090 (40.3%) with HTG, 28,556 (21.3%) with sHTG, and 4994 (3.7%) with eHTG. Incidence of both outcomes was significantly higher for HTG, sHTG, and eHTG compared with normal TG (P < .001). Adjusted hazard ratios (HRs) of AP were 1.491, 2.586, and 4.695 for HTG, sHTG, and eHTG, respectively (all P < .001). Adjusted HRs of CV events were 1.163 and 1.206 for sHTG and eHTG, respectively (both P < .001). In this cohort study, patients with sHTG and eHTG had significantly higher incidence of AP and CV events compared with those with normal TG. The adjusted risk of AP and CV events increased stepwise with TG level; the association was stronger for AP.
This study aimed to compare the efficacy and safety of concurrent chemoradiotherapy (CCRT) with or without induction chemoimmunotherapy (CI) in unresectable esophageal squamous cell carcinoma (ESCC). The study included patients with unresectable ESCC who received CCRT with or without induction CI at three cancer centers. Patients receiving concurrent immunotherapy were excluded. Propensity score matching (PSM) balanced baseline characteristics between groups. A total of 519 patients were included; after PSM, 183 patients per group were selected. Induction CI significantly improved OS and PFS compared with CCRT alone. The median OS was 29.9 months (95% CI: 18.8-41.0) in the CCRT group and not reached in the induction CI group (HR: 0.57, 95% CI: 0.42-0.77, p < 0.001). The median PFS was 17.6 months (IQR: 13.7-21.4) versus 30.6 months (IQR: 17.3-43.8) (HR: 0.66, 95% CI: 0.51-0.86, p = 0.002). Responders to induction CI had significantly better OS (HR: 0.22, 95% CI: 0.14-0.36, p < 0.001) than nonresponders. Subgroup analysis showed that neither radiation dose escalation nor consolidation immunotherapy further improved survival in the induction CI group. The addition of induction chemoimmunotherapy to CCRT was associated with improved survival in patients with locally advanced unresectable ESCC, particularly in responders to induction chemoimmunotherapy, with acceptable toxicity. These findings warrant confirmation in prospective randomized trials.
Patients with end-stage kidney disease (ESKD) have been largely excluded from randomized trials of sodium-glucose cotransporter-2 inhibitors (SGLT2is). Despite the lack of guideline recommendations, SGLT2i prescriptions occur in real-world clinical practice. We aimed to describe real-world associations between SGLT2i exposure and clinical outcomes among patients with type 2 diabetes mellitus (T2DM) coded with ESKD. We conducted a target trial emulation using a retrospective, new-user, active-comparator cohort design within the TriNetX US Collaborative Network (2016-2023). Adults with T2DM and ESKD who initiated an SGLT2i or a dipeptidyl peptidase-4 inhibitor (DPP4i) were included. Propensity score matching (1:1) was used to balance baseline characteristics. The primary outcome was all-cause mortality; secondary outcomes included sepsis, pneumonia, major adverse cardiovascular events (MACE), all-cause hospitalization, and emergency department visits. Subgroup analyses were exploratory, and heterogeneity was assessed using Cochran's Q statistic. After matching, 5295 SGLT2i users were compared with 5295 DPP4i users. Over a follow-up of up to 4 years, SGLT2i exposure was associated with lower all-cause mortality (hazard ratio [HR] 0.90, 95% confidence interval [CI] 0.84-0.97), sepsis (HR 0.87, 95% CI 0.79-0.95), and all-cause hospitalization (HR 0.93, 95% CI 0.89-0.97). No significant associations were observed for MACE, pneumonia, or emergency department visits. Subgroup-specific estimates varied in magnitude, with no consistent evidence of heterogeneity. In this large real-world cohort of patients coded with ESKD, SGLT2i exposure was associated with favorable outcome patterns compared with DPP4i. Given the observational design, potential misclassification of kidney disease status, and off-label drug use, these findings should be interpreted as hypothesis-generating and do not establish causality.
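Several of the cohort studies summarized here rely on 1:1 propensity score matching. The sketch below illustrates the generic greedy nearest-neighbor variant of that step; the caliper value and scores are hypothetical, and platforms such as TriNetX use their own built-in matching routines, which may differ.

```python
def greedy_match(treated_ps, control_ps, caliper=0.05):
    """1:1 greedy nearest-neighbor matching on propensity scores.

    Illustrative sketch only; the caliper of 0.05 is a hypothetical choice.
    Returns (treated_index, control_index) pairs, matching without replacement.
    """
    available = dict(enumerate(control_ps))  # control index -> score
    pairs = []
    for i, pt in sorted(enumerate(treated_ps), key=lambda t: t[1]):
        if not available:
            break
        # nearest remaining control by absolute score distance
        j, pc = min(available.items(), key=lambda kv: abs(kv[1] - pt))
        if abs(pc - pt) <= caliper:
            pairs.append((i, j))
            del available[j]  # each control can be used only once
    return pairs

# Hypothetical scores: each treated patient finds a close control.
pairs = greedy_match([0.30, 0.70], [0.31, 0.69])  # [(0, 0), (1, 1)]
```

Unmatched treated patients (no control within the caliper) are simply dropped, which is why matched cohorts are smaller than the source populations.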
Objective: To analyze the incidence of malignant tumors in children and adolescents (aged 0-19 years) in China in 2022. Methods: Data were sourced from GLOBOCAN 2022 and Cancer Incidence in Five Continents (CI5) Volume XII. Incidence data by sex and age group for childhood and adolescent cancers from Chinese registries in CI5 Volume XII were extracted to calculate the proportion of each subtype. These proportions were then applied to the overall cancer incidence in children and adolescents in China from GLOBOCAN 2022 to estimate the number of incident cases for different tumor types. World standardized incidence rates (WSR) were calculated using Segi's world standard population. Results: In 2022, there were 32 792 new cases of malignant tumors in children and adolescents (aged 0-19 years) in China, with a WSR of 105.93 per million. Among them, 23 121 new cases occurred in children aged 0-14 years, accounting for 70.51% of all cases in children and adolescents, with a WSR of 100.30 per million. The most common malignant tumors in Chinese children and adolescents in 2022 were leukemia (11 983 cases, 36.54%), followed by central nervous system tumors (4 485 cases, 13.68%) and lymphoma (2 764 cases, 8.43%), with WSRs of 39.78 per million, 14.15 per million, and 8.35 per million, respectively. Among the age groups, the highest WSR was observed in the 15-19 years group (125.28 per million), followed by the 0-4 years group (120.67 per million). Except for the 15-19 years group, age-specific incidence rates were higher in males than in females. The top three cancer types by incidence in both sexes in the 0-14 years group were consistent with those in the 0-19 years group, namely leukemia, central nervous system tumors, and lymphoma. In the 15-19 years group, the top three cancers in males were leukemia, bone tumors, and lymphoma, while in females they were malignant melanomas and other malignant epithelial tumors, leukemia, and malignant gonadal germ cell tumors.
Conclusions: The incidence rates and cancer type distribution of malignant tumors in children and adolescents in China vary considerably by sex and age group. Targeted prevention and control strategies for childhood and adolescent malignant tumors should be developed accordingly.
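The world standardized rate (WSR) reported above is a direct age standardization: a weighted average of age-specific rates using Segi's world standard population weights. A minimal sketch for the 0-19 year range follows; the age-specific rates in the example are hypothetical, not values from the study.

```python
# Segi's world standard population weights for ages 0-19 (per 100 000).
SEGI_0_19 = {"0-4": 12000, "5-9": 10000, "10-14": 9000, "15-19": 9000}

def world_standardized_rate(age_specific_rates, weights=SEGI_0_19):
    """Direct age standardization: weighted mean of age-specific rates.

    `age_specific_rates` maps age group -> rate (here, per million);
    the result is on the same scale as the input rates.
    """
    total_w = sum(weights.values())
    return sum(age_specific_rates[g] * w for g, w in weights.items()) / total_w

# Hypothetical age-specific incidence rates per million:
rates = {"0-4": 120.0, "5-9": 85.0, "10-14": 90.0, "15-19": 125.0}
wsr = world_standardized_rate(rates)  # 105.625 per million
```

Because the weights are fixed, WSRs computed this way are comparable across populations with different age structures, which is the point of reporting them alongside crude counts.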
Objective: To analyze the mortality trends of liver cancer in the general population of Qidong City, Jiangsu Province from 1972 to 2024, and to predict the mortality burden from 2025 to 2034, providing a basis for liver cancer prevention and control strategies. Methods: Liver cancer mortality data (1972-2024) were extracted from the Qidong Cancer Registry database. Using corresponding population data, we calculated: crude mortality rate (CR), Chinese age-standardized rate (ASRC, standardized using the 1964 Chinese population), world age-standardized rate (ASRW, standardized using Segi's world population), and median age at death. Joinpoint regression (Joinpoint 4.9.1.0) was employed to estimate annual percent change (APC) and average annual percent change (AAPC) in mortality. The ARIMA model in SAS 9.2 was applied to predict mortality trends over the next decade. Results: A total of 34 773 liver cancer deaths were recorded in Qidong from 1972 to 2024. Compared with 1972-1976, the proportion of liver cancer deaths among all cancer deaths in 2022-2024 decreased from 40.02% to 12.83%. The CR, ASRC, and ASRW declined from 49.33/100 000, 45.62/100 000, and 57.23/100 000 in 1972-1976 to 44.09/100 000, 8.54/100 000, and 13.91/100 000 in 2022-2024, respectively. The male-to-female ratio of ASRW was 3.33:1 from 1972 to 2024. For 2022-2024, the ASRW was 21.16/100 000 for males and 7.22/100 000 for females. Age-specific mortality rates showed declining trends in all age groups under 65 years from 1972 to 2024, with greater declines in younger age groups (all P<0.05). In contrast, the mortality rate in the 75+ years age group showed an increasing trend (AAPC=2.15%, P=0.001). The median age at death from liver cancer in Qidong rose from 49 years in 1972 to 72 years in 2024, and the peak mortality age group shifted gradually from 45-54 years to 75+ years across periods.
The time trend analysis revealed that from 1972 to 2024, the AAPCs for ASRW were -2.11%, -2.23%, and -1.89% (all P<0.001) for both sexes combined, males, and females, respectively, all showing statistically significant downward trends. The CR showed a slow but significant increasing trend for females (AAPC=0.88%, P<0.001), while the trends for both sexes combined and males were not statistically significant (all P>0.05). Segmented fitting results showed the most pronounced decline occurred from 2008 to 2024, with an APC of -3.76% for CR and -7.15% for ASRW (both P<0.001). The overall CR is projected to decline to 40.30/100 000 in 2034, and the ASRW to 4.40/100 000. Conclusions: The comprehensive prevention and control efforts implemented over 53 years in the high-incidence area of Qidong have influenced the overall standardized mortality rate and the liver cancer mortality rate among those under 65 years of age, with the most significant decline observed after 2008. Future efforts should focus on strengthening comprehensive prevention and control of liver cancer in the elderly population.
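The APC values above come from log-linear trend fitting: within each segment, ln(rate) is regressed on calendar year and the slope b is converted to APC = (e^b - 1) x 100. The sketch below shows that conversion for a single segment with a hypothetical rate series; the Joinpoint software additionally searches for the breakpoints between segments, which is omitted here.

```python
import math

def annual_percent_change(years, rates):
    """APC from a single-segment log-linear fit ln(rate) = a + b*year.

    Plain least squares on the log scale; APC = (e^b - 1) * 100.
    Joinpoint regression repeats this within each fitted segment.
    """
    n = len(years)
    xbar = sum(years) / n
    ys = [math.log(r) for r in rates]
    ybar = sum(ys) / n
    b = sum((x - xbar) * (y - ybar) for x, y in zip(years, ys)) / \
        sum((x - xbar) ** 2 for x in years)
    return (math.exp(b) - 1) * 100

# Hypothetical series declining exactly 3% per year:
years = list(range(2008, 2025))
rates = [50.0 * (0.97 ** (y - 2008)) for y in years]
apc = annual_percent_change(years, rates)  # -3.0 for this constructed series
```

The AAPC is then a weighted average of segment-specific APCs (weighted by segment length), summarizing the whole period in one number.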
A large body of research has examined the association between the cognitive triad (negative views of the self, the world and the future) and depression; however, findings regarding its strength and direction remain inconsistent. The present meta-analysis systematically synthesized existing empirical evidence to evaluate both cross-sectional and longitudinal associations between the cognitive triad and depression. A three-level random-effects model was used to pool correlation coefficients (r) for cross-sectional associations, and cross-lagged regression models were applied to test prospective relations. Fifty-nine cross-sectional studies (132 effect sizes) were included, demonstrating a strong overall association between the cognitive triad and depressive symptoms (r = 0.596, 95% confidence interval (CI) [0.546, 0.622]). Depressive symptoms were most strongly associated with a negative view of the self (r = 0.593), followed by a negative view of the world (r = 0.558) and the future (r = 0.538). Exploratory cross-lagged analyses were conducted across eight longitudinal studies. Preliminary results suggested that depressive symptoms at Time 1 predicted more negative cognitive triad scores at Time 2 (β = 0.160, 95% CI [0.100, 0.220]). The prospective effect of the cognitive triad on later depressive symptoms appeared to be moderated by age group; this effect was significant among participants aged 15 years or older (β = 0.187, 95% CI [0.112, 0.261]) but not among those below 15 years (β = -0.035, 95% CI [-0.122, 0.051]). The findings indicate a strong association between the cognitive triad and depressive symptoms. Furthermore, exploratory longitudinal analyses provide preliminary evidence for a prospective effect of depressive symptoms on subsequent negative cognitions and suggest a potential developmental shift in cognitive vulnerability to depression from late adolescence into adulthood.
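Pooling correlation coefficients, as in the meta-analysis above, is conventionally done on the Fisher-z scale, where the sampling variance is approximately 1/(n - 3). The sketch below shows that core step as a fixed-effect simplification with hypothetical study values; the review itself used a three-level random-effects model to handle dependent effect sizes, which this does not reproduce.

```python
import math

def pool_correlations(rs, ns):
    """Inverse-variance pooling of correlations on the Fisher-z scale.

    Fixed-effect simplification for illustration; var(z) ≈ 1/(n - 3),
    so weights are n - 3. Returns the pooled r after back-transforming.
    """
    zs = [math.atanh(r) for r in rs]   # Fisher z-transform of each r
    ws = [n - 3 for n in ns]           # inverse-variance weights
    z_bar = sum(w * z for w, z in zip(ws, zs)) / sum(ws)
    return math.tanh(z_bar)            # back-transform to the r scale

# Hypothetical study-level correlations and sample sizes:
r_pooled = pool_correlations([0.55, 0.62, 0.58], [120, 200, 90])
```

Working on the z scale keeps the estimate inside (-1, 1) and makes the weighting statistically well-behaved; random-effects and multilevel extensions add between-study and within-study variance components on top of this.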
Quantifications of personal light exposure (PLE) are important for research on human health and well-being. To capture real-world PLE, dosimeters (light sensors) are often worn on the body, rather than at the eye. Currently, it is largely unknown how such placement influences the dosimeter's performance for measuring actual eye-level light exposure. To overcome limitations of real-world studies, we introduce a novel hybrid measurement-simulation approach to quantify three factors affecting dosimeter performance: translational dosimeter displacement, rotational dosimeter displacement, and self-occlusion by the body. Using 3D body scans of twelve subjects, we illustrate how an individual's body morphology influences the magnitude and distribution of these factors and identify the dosimeter positions where these factors are minimized. Additionally, we defined illustrative limits to identify regions of the chest suitable for dosimeter placement. Depending on the subject's posture, between 0 and 46.4% of the chest area exhibits sufficiently small magnitudes of these factors to be considered suitable. To ensure valid PLE data in light-dosimetry field studies, it is essential to select a dosimeter position that minimizes each of these three factors for typical subject postures.
To address patient demand for rapid access to innovative digital medical devices (DMDs), several health technology assessment (HTA) authorities in European Union countries provide transitional or provisional access and reimbursement pathways. These pathways are available when only incomplete clinical trial data are accessible, and significant uncertainty remains regarding the clinical benefits, even after CE (European conformity) marking has been obtained. Once manufacturers complete the clinical studies, additional real-world data (RWD) may become available as a result of the device's use in the target population. Consequently, regulators can draw on both sources of information to support their final decision-making processes. For a statistically principled evaluation of such settings, we propose a statistical framework suitable for DMD evaluation under European HTA fast-track requirements, integrating both clinical trial data and RWD. The framework consists of three key steps: (1) an interim analysis of clinical trial data, which can support temporary regulatory authorization and enable the collection of RWD; (2) a final analysis of the clinical trial data; and (3) a meta-analysis combining the clinical trial data and RWD, contingent upon obtaining temporary authorization. To optimize the timing of the interim analysis and the application for temporary authorization, we introduce several metrics. The proposed framework was assessed by means of an extensive simulation study. This framework should be complemented by a post-market evaluation of the DMD once it has been widely adopted, aligning with the principles of phase IV studies.
Water scarcity is one of the major challenges to sustainable development. This challenge will increase as climate change continues, the world's population grows, and the demand for food increases. Therefore, it is necessary to look for agricultural solutions and practices that increase food production while using less water. The study was conducted to investigate the effects of partial root-zone irrigation (50% PRI) and paclobutrazol (a growth regulator, PBZ) on the cauliflower crop, with the aim of increasing the crop's resistance to water stress. Paclobutrazol was applied 20 and 40 days after planting at different concentrations (0, 25, 50, and 75 ppm). In addition, traditional drip irrigation (applying 100% of the irrigation requirement) was used for comparison with 50% PRI under sandy loam soil conditions. The results indicated that soil moisture content before irrigation increased with increasing paclobutrazol concentration under both traditional drip irrigation and the 50% PRI method compared with the control (0 ppm paclobutrazol). This conservation of soil water is likely due to paclobutrazol, which may reduce the rate of transpiration from plants. Although some growth characteristics were reduced, plants showed no obvious signs of wilting under 50% PRI with paclobutrazol, as soil moisture content was maintained within available water limits. Moreover, cauliflower yield under 50% PRI with 75 ppm paclobutrazol showed only a slight decrease, 1.06% and 1.75% lower than yields achieved under traditional drip irrigation with 0 ppm paclobutrazol during the two growing seasons, respectively. On the other hand, the highest irrigation water productivity was achieved with 50% PRI and 75 ppm paclobutrazol, which showed an increase of approximately 49% compared with traditional drip irrigation without paclobutrazol. These results indicate that combining 50% partial root-zone irrigation with 75 ppm paclobutrazol may achieve yields comparable to traditional drip irrigation (100% water requirement) while reducing irrigation water use by 50%. Finally, as we address future changes in water availability, it is important to develop a long-term water management strategy.