Uncertainty assessment is critical to inform confidence in risk conclusions and guide risk-based decision making. This case study quantifies the impact of uncertainties in the United States Environmental Protection Agency (US EPA)'s chronic non-cancer risk evaluation of diisononyl phthalate (DINP) under the Toxic Substances Control Act and provides critical information regarding confidence in risk determinations and risk management decisions. Uncertainties associated with key elements in US EPA's risk determination approach (i.e., margin of exposure [MOE]) were qualitatively and quantitatively examined, including point of departure (POD) selection, exposure estimates, and benchmark MOE (bMOE) calculation, to demonstrate their impact on MOEs and implications for risk-based conclusions. There was medium overall confidence in the risk evaluation, with lowest confidence in POD selection. Quantitative uncertainty assessment for each exposure scenario for which US EPA determined unreasonable risk (i.e., MOE < bMOE of 30) considered a range of alternative values per key element. For the lowest MOE calculated by US EPA (2.7, indicating unreasonable risk), alternative MOEs derived herein ranged from 117 to 150 (no unreasonable risk), demonstrating the uncertainty in US EPA's determinations of unreasonable risk. This uncertainty assessment illustrates the importance of applying qualitative and quantitative approaches to assess uncertainty to frame the confidence in assessments that underlie risk-based decisions.
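The MOE screen the abstract describes is a simple ratio test; a minimal sketch follows. Only the benchmark MOE of 30 and the reported MOE values (2.7, 117, 150) come from the abstract; every POD and exposure figure below is a hypothetical placeholder chosen to reproduce those MOEs, not a value from the assessment.

```python
# Sketch of US EPA's margin-of-exposure (MOE) screen, assuming a simple
# POD/exposure ratio compared to a benchmark MOE (bMOE). Inputs are invented.

def margin_of_exposure(pod: float, expos: float) -> float:
    """MOE = point of departure (POD) / exposure estimate (same units)."""
    return pod / expos

def unreasonable_risk(moe: float, benchmark_moe: float = 30.0) -> bool:
    """Risk is flagged as 'unreasonable' when MOE < bMOE."""
    return moe < benchmark_moe

# Hypothetical (POD, exposure) pairs in mg/kg-day, chosen to yield the
# abstract's MOEs; not the assessment's actual values.
scenarios = {
    "EPA-like inputs":      (27.0, 10.0),    # MOE 2.7  -> unreasonable
    "alternative POD":      (1170.0, 10.0),  # MOE 117  -> not unreasonable
    "alternative exposure": (1170.0, 7.8),   # MOE 150  -> not unreasonable
}
for name, (pod, expos) in scenarios.items():
    moe = margin_of_exposure(pod, expos)
    print(f"{name}: MOE={moe:.1f}, unreasonable={unreasonable_risk(moe)}")
```

The sweep illustrates the abstract's central point: plausible alternative inputs move the same scenario across the bMOE threshold, flipping the risk conclusion.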
Paramedics frequently incur reasonable overtime (RO) due to factors such as staffing, patient acuity and transport durations. Considering the risks associated with extended work hours, understanding paramedics' opinions of reasonable and unreasonable overtime is vital for safe workforce management. The current study aimed to gain consensus on operational paramedics' perception of conditions that are reasonable and unreasonable for overtime. The Delphi methodology comprised three iterative survey rounds completed by experienced paramedics working in metropolitan areas of Australia. In Round 1, participants gave a free-text definition of RO and voted on the degree to which a predetermined list of RO-related conditions were reasonable. Free-text responses underwent a content analysis to determine additional RO-related conditions based on the definitions of RO provided by participants. These conditions were then combined with the predetermined list of RO conditions and voted on by participants in Rounds 2 and 3. Out of 189 initial participants, 144 completed Round 2 (76.2% response rate) and 128 completed Round 3 (88.9% response rate). Consensus (≥ 80% agreement) was achieved for 50% (n = 17) of the RO-related conditions. Nearly all paramedics agreed that overtime was reasonable for time-critical emergencies where there is an expected detrimental effect to a patient without a paramedic's attendance (96%), whereas nearly all paramedics agreed that RO to 'clear the board/pending queue' was unreasonable (99%). Paramedics reached consensus on four reasonable and thirteen unreasonable conditions for RO, resulting in the creation of the first set of consensus-based considerations for RO from the perspective of operational paramedics in Australia. These findings provide important insights into how paramedics perceive RO.
Future research examining the proposed considerations for RO is needed to determine their feasibility and acceptability to ambulance services and other paramedics, and their potential effectiveness for workforce planning in this industry. The online version contains supplementary material available at 10.1186/s12913-025-13736-z.
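The Delphi consensus rule above (≥ 80% agreement) is easy to make concrete. The vote tallies below are invented for illustration; only the 128-respondent Round 3 size and the 80% threshold come from the abstract.

```python
# Toy tally of a Delphi round: a condition reaches consensus when >= 80%
# of respondents agree. Tallies are fabricated; n = 128 as in Round 3.

def agreement_rate(votes_agree: int, votes_total: int) -> float:
    return votes_agree / votes_total

def reached_consensus(votes_agree: int, votes_total: int,
                      threshold: float = 0.80) -> bool:
    return agreement_rate(votes_agree, votes_total) >= threshold

# Hypothetical agreement counts out of 128 respondents.
tallies = {
    "time-critical emergency, patient at risk": 123,  # ~96%: consensus
    "'clear the board' is unreasonable":        127,  # ~99%: consensus
    "borderline condition":                      90,  # ~70%: no consensus
}
for condition, agree in tallies.items():
    print(condition, reached_consensus(agree, 128))
```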
Unmanned combat aerial vehicles (UAVs) have evolved into assets of critical tactical importance, so safeguarding the occupational health of their operators is imperative. As the Chinese military's UAV units are predominantly stationed in high-altitude regions, this study aimed to systematically assess the occupational health status and risk factors faced by these operators during their service, and to propose targeted protective measures. A cohort of 62 active-duty UAV operators stationed in high-altitude regions was recruited to complete a self-administered questionnaire. The investigation used a comprehensive framework encompassing occupational health assessment, risk factor evaluation, and protective measures analysis. Statistical methods included descriptive statistical analysis, symptom co-occurrence network analysis, and clustering pattern analysis. UAV operators in high-altitude regions exhibited a significantly higher prevalence of neurological (67.74%), musculoskeletal (64.52%), and psychological symptoms (46.77%) than their counterparts in plain areas, with additional manifestations of otorhinolaryngological (67.74%) and respiratory symptoms (64.52%). Symptom co-occurrence network analysis identified "musculoskeletal pain" and "memory impairment" as central hubs. Clustering analysis revealed distinct risk stratification, with high-, medium-, and low-risk subgroups constituting 20.97%, 40.32%, and 38.71% of the cohort, respectively. Among risk factors, "unreasonable work schedules" and "high-altitude hypoxia" received the two highest risk scores. Regarding protective measures, "health education" (0.66), "scientific training protocols" (0.62), and "rational shift scheduling" (0.60) demonstrated the highest comprehensive effectiveness scores. The UAV operator cohort in high-altitude regions demonstrated a notably poor occupational health status.
"Musculoskeletal pain" and "memory impairment" were identified as critical intervention targets. Given the prominent roles of "high-altitude hypoxia" and "unreasonable work-rest schedules" as dominant risk factors, occupational health strategies should be strategically redirected from over-reliance on personal protective equipment toward prioritized investment in efficient management systems. These include implementing "scientific training protocols," enhancing "health education," and establishing structured "work-rest rotation systems." Simultaneously, essential oxygen supply and noise reduction equipment should be deployed at high-altitude workplaces.
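The symptom co-occurrence network analysis the study reports can be sketched as follows: symptoms are nodes, an edge's weight counts how many respondents reported both symptoms, and the weighted degree flags hub symptoms. The respondent records below are fabricated for illustration; only the two hub symptoms named in the abstract are carried over.

```python
# Minimal co-occurrence network: edge weight = number of respondents
# reporting both symptoms; hubs = nodes with the highest weighted degree.
from itertools import combinations
from collections import Counter

respondents = [  # fabricated symptom sets
    {"musculoskeletal pain", "memory impairment", "insomnia"},
    {"musculoskeletal pain", "memory impairment"},
    {"musculoskeletal pain", "tinnitus"},
    {"memory impairment", "insomnia"},
]

# Count co-occurrences over all unordered symptom pairs per respondent.
edges = Counter()
for symptoms in respondents:
    for pair in combinations(sorted(symptoms), 2):
        edges[pair] += 1

# Weighted degree: the sum of a node's edge weights.
degree = Counter()
for (a, b), w in edges.items():
    degree[a] += w
    degree[b] += w

print(degree.most_common(2))  # the two hub symptoms
```

On this toy data the two hubs are "musculoskeletal pain" and "memory impairment", mirroring the study's finding; real analyses would typically use centrality measures beyond weighted degree.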
Due to their significant economic and ecological value, forest land resources have received extensive attention in recent years. Forest productivity is the primary factor considered when developing and utilizing forest land resources, but the indicators for assessing it are complex and diverse. In practical applications, multi-index assessment methods yield more reasonable assessment results. The Measurement of Alternatives and Ranking according to Compromise Solution (MARCOS) method is a widely used assessment model proposed to rank alternatives in multi-index assessment, and many theoretical and applied results have emerged since its introduction. To address two shortcomings of the MARCOS method, namely its weak discrimination when ranking the non-ideal alternatives and the unreasonable trend of its ideal-alternative utility degree, this study improves the utility formulas for both the positive and negative ideal alternatives and reconstructs the formula for the comprehensive utility degree. The revised MARCOS enhances the applicability and effectiveness of the original MARCOS method. We also compare and analyze the revised method against several multi-index assessment methods. The comparison shows that the revised method exhibits excellent consistency with other methods and has better decision flexibility and resolution. An assessment of forest productivity illustrates the applicability and validity of the revised MARCOS method.
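For readers unfamiliar with the baseline method being revised, here is a minimal sketch of the original MARCOS procedure: normalize against the ideal alternative, form utility degrees relative to the ideal and anti-ideal, and combine them into a comprehensive utility degree. Benefit-type criteria only; the decision matrix and weights are invented placeholders, not forest data from the study.

```python
# Sketch of the original MARCOS ranking (Stevic et al.'s formulation),
# benefit criteria only. Higher comprehensive utility = better alternative.

def marcos(matrix, weights):
    """matrix: rows = alternatives, cols = criteria (higher is better)."""
    n_crit = len(weights)
    ideal = [max(row[j] for row in matrix) for j in range(n_crit)]
    anti  = [min(row[j] for row in matrix) for j in range(n_crit)]

    def weighted_sum(row):
        # Normalize each criterion against the ideal value, then weight.
        return sum(w * (x / ai) for x, ai, w in zip(row, ideal, weights))

    s_ai, s_aai = weighted_sum(ideal), weighted_sum(anti)
    scores = []
    for row in matrix:
        s = weighted_sum(row)
        k_plus, k_minus = s / s_ai, s / s_aai       # utility degrees
        f_plus  = k_minus / (k_plus + k_minus)      # utility functions
        f_minus = k_plus  / (k_plus + k_minus)
        f_k = (k_plus + k_minus) / (
            1 + (1 - f_plus) / f_plus + (1 - f_minus) / f_minus
        )                                           # comprehensive utility degree
        scores.append(f_k)
    return scores

# Invented example: 3 stands scored on two benefit criteria.
matrix  = [[0.9, 0.7], [0.6, 0.8], [0.4, 0.5]]
weights = [0.6, 0.4]
print(marcos(matrix, weights))
```

Note that in this formulation f_plus and f_minus are identical for every alternative (the ratio k_minus/k_plus is constant), which is one of the structural quirks that motivates revising the utility formulas.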
Soil acidification is a critical global issue threatening agricultural productivity and ecosystem health. In recent years, human activities have intensified soil acidification, posing significant challenges to food security and sustainable agriculture. Acidified soils are characterized by a decline in pH, increased activity of toxic metal ions, and nutrient depletion, leading to soil structure degradation and restricted plant growth. This article systematically summarizes the main driving factors of soil acidification in farmland, including non-human factors (weathering and leaching, nutrient absorption, organic matter decomposition, release of root exudates, lightning, volcanic eruptions and deposition, microbial activity, etc.) and human factors (unreasonable agricultural management, climate change, etc.), and further analyzes the main processes by which microbially mediated soil acidification affects nutrient cycling. To address soil acidification, existing mitigation strategies are summarized, such as lime application, balanced fertilization, organic matter management, biochar application, and the promotion of acid-tolerant crops. However, due to the diversity of soil types, the complexity of acidification processes, and variations in crop types and cultivation practices, the effectiveness of these measures varies significantly. By providing a comprehensive synthesis and outlook in the context of climate change, this study explores future research directions, offering both theoretical foundations and practical guidance for the scientific management of soil acidification and the advancement of sustainable agriculture.
This paper outlines a systematic approach to tackle the creation of a safety case for Automated Driving Systems (ADS) that operate without a driver. A safety case is a formal way to explain how an ADS developer determines that its system is safe enough to be deployed on public roads without a human driver, and it includes evidence to support that determination. It involves an explanation of the system, the methodologies used to develop it, the metrics used to validate it and the actual results of validation tests. Yet, in order to develop a worthwhile safety case, it is first important to understand what makes it credible and well crafted, and align on evaluation criteria. This paper helps enable such alignment by providing foundational thinking into not only how a system is determined to be ready for deployment but also into justifying that the set of acceptance criteria employed in such determination is sufficient and that their evaluation (and associated methods) is credible. The presentation is anchored around the acknowledgement that absolute zero risk is unattainable, framing the definition of safety around the notion of "absence of unreasonable risk" in accordance with state-of-the-art safety standards. The publication is structured around three complementary perspectives on safety: a layered approach to safety; a dynamic approach to safety; and a credible approach to safety. Each perspective focuses on principles and methodological approach rather than specific results, which are often proprietary; this paper features neither a full safety case nor the evidence to support one. While centered around the example of an SAE Level 4 ADS, the proposed approach is technology- and methodology-agnostic, making it adaptable for use in whole or in part by any entity in the field.
Knowledge is lacking regarding how the experience of being exposed to violence is affected when the perpetrator suffers from behavioral and personality changes (BPC) due to a brain tumor. This study is part of the Swedish national research project BRAVE - Brain Tumor Related Aggression and Violence Exposure. The aim was to explore experiences of family members exposed to violence by a person suffering from BPC associated with a brain tumor. Individual interviews were conducted with 25 family members who had been exposed to violence by patients with primary brain tumors. The interviews were analyzed using qualitative content analysis. The participants reported various forms of violence and expressed intense suffering, loneliness and social isolation. The homes sometimes shifted from being a safe place to being a place marked by fear and unpredictability. In adapting to violence, what initially seemed unreasonable gradually became "the new normal". Different strategies to minimize risks and damage were described. Self-blame and shame were often associated with an inability to love the patient "in sickness and in health", despite the violent actions by the patient. When the death of the perpetrator was viewed as the only means of escape, participants also expressed feelings of guilt and shame. Our study highlights extensive suffering, vulnerability, loneliness and isolation among family members exposed to violence by brain tumor patients. An intervention that provides appropriate support for brain tumor patients, family members and staff who encounter them is urgently needed.
Withholding or withdrawing life-sustaining therapies (WLST) was introduced in France in 2005 through the Leonetti law to prevent futile treatments and "unreasonable obstinacy." In France, WLST decisions affect 8.5-14% of ICU patients, according to the literature. The 2016 Claeys-Leonetti law updated the previous legislation, but debates surrounding end-of-life care persist. To describe WLST patients and practices under current legislation, we conducted a multicenter, prospective, observational study in ICUs across Eastern France. Eligible adult patients facing WLST decisions were included, requiring written consent from the patient or a trusted person. Patients were followed for 1 month. We described the decision-making process and assessed family satisfaction using the FS-24R-ICU questionnaire. Between May 3rd and October 3rd, 2023, 73 patients were included (mean age 69 years). The majority of admissions were medical (72.7%), and 50.7% of patients had neurological impairments. ICU staff initiated WLST discussions primarily due to poor survival or quality of life prospects. Only 12.5% of patients had written advance directives, and 59.1% had designated a trusted person. External consultation was not involved in 19.1% of decisions. Families were informed in 91.7% of cases. Decisions to withhold therapies occurred in 68.1% of cases, with resuscitation during cardiac arrest being the most commonly withheld intervention (98.0%). Treatment withdrawal occurred in 31.9% of cases. Family satisfaction was generally positive. WLST management in Eastern French ICUs is partially compliant with the Claeys-Leonetti law. Improved law application and public awareness could enhance end-of-life care management in France.
Student pharmacists pursuing residency often report feeling extreme pressure to be licensed by their residency program's deadline, which may be as early as July 1 or the start of residency. American Society of Health-System Pharmacists accreditation standards, however, only require a resident to be licensed for two-thirds of their residency program for successful completion. Although expecting licensure as soon as possible may achieve some residency programmatic fiscal and experiential goals, those benefits to the program must be balanced with what is feasible and appropriate for the resident. This commentary aims to encourage residency programs to select a deadline for licensure that is more reasonable and reflective of the current licensure climate for several reasons: (1) data do not support an increased likelihood of passing the North American Pharmacist Licensure Examination and jurisprudence examination when both are taken prior to the start of the residency; (2) students and schools have limited control over test date availability; (3) no evidence-based licensure examination preparation approach exists; (4) unreasonable licensure deadlines may contribute to premature testing and resident burnout; and (5) examination failures may be mitigated with adequate study time postgraduation.
Failure mode and effects analysis (FMEA) is a widely used proactive tool for risk assessment in radiotherapy. With the introduction of 3D-printed template-assisted intracavitary/interstitial brachytherapy (3DP-IC/IS), the increasing procedural complexity and newly introduced steps highlight the need for systematic risk management. However, the traditional FMEA approach has been criticized for its mathematical and logical limitations. We aimed to develop an enhanced FMEA framework using a cloud model and data envelopment analysis (DEA), and to validate its performance through a comparative risk assessment against the traditional FMEA method in 3DP-IC/IS. The analysis was performed on a dataset from 80 patients who underwent 3DP-IC/IS. The proposed framework incorporates a cloud model to manage the fuzziness and randomness of linguistic risk assessments and a modified DEA model for multi-criteria decision-making. Both the proposed and traditional FMEA methods were used to rank failure modes (FMs) in the 3DP-IC/IS workflow. Risk priority numbers and model-derived efficiency scores were calculated to identify high-priority FMs, and the ranking results from both methods were compared to evaluate performance. A total of 66 FMs were analyzed. The average rank shift between the traditional and introduced FMEA was 4.39 (max: 17, min: 0). Among the top 20 FMs, 18 were consistent between the two methods. The introduced FMEA uniquely identified "Insufficient checking of needle labels" and "Excessive number of needles" as high risk. The proposed framework also increased risk-ranking discrimination, reducing the number of FMs with identical scores from 12 groups to a single group. The two highest-risk FMs, "Unreasonable needle track design" and "Error in identifying non-coplanar needle number," were consistently ranked first by both methods.
Notably, the proposed framework reclassified four FMs as high risk, demonstrating improved sensitivity to risks potentially underestimated by traditional FMEA. We developed a hybrid cloud model-DEA-FMEA framework that mitigates key limitations of traditional risk assessment. When applied to 3DP-IC/IS, this framework demonstrated both feasibility and an enhanced ability to identify and prioritize critical FMs. Our findings highlight its clinical value for improving quality assurance and patient safety, especially in complex, high-risk radiotherapy procedures.
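The traditional FMEA baseline the study compares against scores each failure mode with a risk priority number, RPN = severity × occurrence × detectability, each rated on a 1-10 scale. A brief sketch follows; the failure-mode names echo the abstract but the ratings are invented for illustration.

```python
# Traditional FMEA scoring: RPN = S x O x D, then rank descending.
# Ratings below are illustrative placeholders, not the study's data.

def rpn(severity: int, occurrence: int, detectability: int) -> int:
    return severity * occurrence * detectability

failure_modes = {
    "Unreasonable needle track design":             (9, 4, 6),
    "Error in identifying non-coplanar needle no.": (8, 4, 6),
    "Insufficient checking of needle labels":       (7, 3, 5),
}
ranked = sorted(failure_modes.items(), key=lambda kv: rpn(*kv[1]), reverse=True)
for name, (s, o, d) in ranked:
    print(f"RPN={rpn(s, o, d):3d}  {name}")
```

Because many (S, O, D) triples multiply to the same RPN, ties are common; this is the discrimination problem (12 tied groups reduced to one) that the cloud model-DEA framework addresses.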
Requests by parents or other caregivers for treatment to prolong the lives of minors with profound cognitive disabilities can be ethically challenging. Some patients have very limited capacity for conscious experience, and so it becomes difficult to say that a longer life is truly good for them. For such cases, some commentators have proposed the relational potential standard as an alternative to the best interest standard. Yet, if the relational potential standard holds that requests for care ought to be honored because they respect patients' familial relationships even though they provide no benefit to patients themselves, then the proposal is objectionable. We have good ethical reasons to accept at least one element of the best interest standard: the exclusionary criterion that no one's interests but the patient's should count when making medical decisions on their behalf. This paper defends a pluralistic conception of what can be in a severely cognitively disabled minor patient's interests. This approach can yield the same result that proponents of the relational potential standard want (honoring requests for care even when providers doubt that these requests are in the patient's best interest) while avoiding committing clinicians to honoring unreasonable requests that discount the patient's other interests.
Deep learning has shown promise in diabetes management but faces challenges in real-world application due to its "black-box" nature, characterized by opaque internal decision-making processes. Explainable artificial intelligence (XAI) methods have been proposed to enhance model transparency. However, most current XAI methods applied in the medical field ignore feature interactions in complex environments and deviate from clinical domain knowledge. Our study used two Electronic Health Record (EHR) cohorts of hospitalized patients with type 2 diabetes (T2DM): an internal dataset of 1,275 inpatients (mean age 58.5 ± 14.3 years) and an external dataset of 292 patients (mean age 69.3 ± 14.5 years). We introduce an expert-guided XAI framework to improve the transparency and trustworthiness of deep learning models for insulin titration in diabetes management. The framework utilizes a post-hoc XAI model, the Shapley Taylor Interaction Index (STII), to capture the impact of feature interactions. Additionally, the model is refined iteratively in a doctor-in-the-loop (DIL) process by encoding clinical constraints to align with medical expertise. Here we show that our STII-DIL model can explore interaction factors and reduce unreasonable explanations compared with other explanation models. The final XAI system's explanations demonstrated strong alignment with experts' explanations and increased correctness in expert evaluation. An AI-human collaboration study revealed that insulin titration accuracy significantly improved for junior clinicians with STII-DIL assistance, while senior clinicians showed minimal change. Both junior and senior clinicians reported increased confidence when using the STII-DIL system. We present an explainable deep learning framework that combines post-hoc XAI and expert domain knowledge to provide transparent and expert-aligned explanations for insulin titration in type 2 diabetes management.
This framework enhances decision-making accuracy and confidence, especially for junior clinicians, and may facilitate broader clinical adoption of AI-assisted decision-making tools. This study addresses the need for transparent and reliable artificial intelligence (AI) tools in diabetes care. We developed an explainable deep learning system that helps doctors adjust insulin doses by showing how different clinical factors contribute to its recommendations. The system combines an AI explanation method with guidance from medical experts, allowing the model to be refined to better match real clinical reasoning. We found that this approach produced clearer and more accurate explanations and supported better decision-making, especially for junior clinicians. Both junior and senior clinicians also reported greater confidence when using the system. These findings suggest that explainable AI may improve safety and support wider clinical use of AI-assisted diabetes management.
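To make "feature interaction" concrete, the sketch below computes a pairwise Shapley interaction index in the classic Grabisch-Roubens form, a close relative of the Shapley-Taylor index (STII) the study uses, on a hand-built toy value function. For an additive model with one product term, the index recovers the product term's coefficient; the value function and coefficients are invented for illustration.

```python
# Pairwise Shapley interaction index (Grabisch-Roubens form) by exact
# enumeration over coalitions; tractable only for small feature counts.
from itertools import combinations
from math import factorial

def shapley_interaction(v, n, i, j):
    """Interaction index of feature pair {i, j} under value function v."""
    others = [k for k in range(n) if k not in (i, j)]
    total = 0.0
    for size in range(len(others) + 1):
        for S in combinations(others, size):
            s = set(S)
            weight = factorial(size) * factorial(n - size - 2) / factorial(n - 1)
            # Discrete second derivative of v at S with respect to {i, j}.
            delta = v(s | {i, j}) - v(s | {i}) - v(s | {j}) + v(s)
            total += weight * delta
    return total

# Toy model f(x) = 2*x0 + 3*x1 + 4*x0*x1 + 5*x2; features outside the
# coalition are set to a baseline of 0.
def v(S):
    x = [1.0 if k in S else 0.0 for k in range(3)]
    return 2*x[0] + 3*x[1] + 4*x[0]*x[1] + 5*x[2]

print(shapley_interaction(v, 3, 0, 1))  # -> 4.0, the x0*x1 coefficient
```

The index is 4.0 for the interacting pair (0, 1) and 0 for non-interacting pairs such as (0, 2), which is the kind of signal a clinician-facing explanation can surface alongside per-feature attributions.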
Perioperative prophylactic antimicrobial use for hysteroscopic surgery is suboptimal, even with guidelines and official policies in place. This study aimed to assess the role of clinical pharmacists in the management of perioperative prophylactic antimicrobial therapy in hysteroscopic surgery and to analyze its impact on clinical outcomes and cost-effectiveness. This retrospective study reviewed hysteroscopic surgery cases at a tertiary-level Class A hospital from September to December 2022 and September to December 2023. The cases were divided into routine and intervention groups on the basis of the involvement of clinical pharmacists. The primary outcomes assessed included the rational use of antimicrobial drugs, postoperative infections, and economic benefits. To control for confounding factors, a 1:1 propensity score matching method was applied. A total of 849 patients were included in this study, with 226 patients in each group after propensity score matching. Significant differences were observed between the routine and intervention groups in several areas: unreasonable timing of preoperative drug administration (22.7% vs 0%, p < 0.001), prolonged duration of prophylaxis (32.6% vs 11.5%, p = 0.029), unindicated prophylaxis (69% vs 28.8%, p < 0.001), and inappropriate dosing regimens (61.5% vs 14.6%, p < 0.001). In terms of economic benefits, the intervention group showed a significant reduction in the length of hospitalization (3.15 vs 2.15 days, p < 0.001). Additionally, significant differences were found in the costs of patient medications ($53.17 vs $42.66, p < 0.001) and antimicrobials ($8.16 vs $2.18, p < 0.001). Clinical pharmacist interventions can significantly enhance the rational use of perioperative antimicrobial drugs in hysteroscopic surgery, leading to notable clinical outcomes and economic benefits.
Multimeric enzymes are prevalent in nature and widely applied in industrial applications, so the stabilization of multimeric enzymes is of significance and practical value in enzyme engineering. Here, we proposed an energy-guided molecular dynamics (MD)-aided interface engineering (MDAIE) strategy aiming at enhancing the catalytic performance of a dimeric enzyme, organophosphate hydrolase (OPH), for degrading methyl parathion, one of the widely used organophosphorus pesticides. By identifying the key amino acid residues influencing the dimeric subunit interactions and filtering out unreasonable residues, 13 single-site mutations with potentially stabilizing performance were generated, and eight of them showed significantly improved activity compared to the wild-type enzyme (WT). By further combination of beneficial single-site mutations, four combinatorial mutants (Q155R/E181L, T147V/E181L, T147V/Q155R, and T147V/E159L) with significantly improved performance compared to the WT were achieved from six double-site mutations. Thus, a total of 12 mutants that degraded organophosphorus pesticide more efficiently than the WT were obtained, and the best mutant, Q155R/E181L, exhibited a 3.0-fold increase in catalytic efficiency, a >1.8-fold extension in half-life (t1/2) at 60 °C, and a 2.1 °C rise in melting temperature (Tm). Comprehensive structural analysis and MD simulations unraveled the mechanism underlying the simultaneous gains in activity and stability conferred by the dimer engineering. The results validated the MDAIE strategy as a reliable and effective approach to boosting the performance of multimeric enzymes. In this specific case, the dimer engineering of OPH offered a promising and robust biocatalyst for bioremediation and environmental protection.
Information technology-driven pre-prescription review systems (PPRSs) are critical pillars of medication safety, and balancing the efficiency and accuracy of review has become a core issue. This study retrospectively analyzes the establishment and application effectiveness of a PPRS for rational drug use. It summarizes the system's limitations and the operational challenges encountered, explores refined management pathways for the system, and provides insights and considerations for smart healthcare to assist clinical practice in promoting rational drug use. A single-center real-world retrospective analysis was conducted at a tertiary hospital in Chongqing, China. Using evidence-based methods, a descriptive analysis of the construction and refined management path of the PPRS was conducted, and its effectiveness was evaluated. Prescription and inpatient order data were monitored before and after the PPRS went online. An unpaired-sample t-test and one-way ANOVA were used to analyze the primary outcome, the rationality rate of prescriptions and medical orders, and the secondary outcomes, the types of unreasonable prescriptions and changes in system warning levels. The construction and refined management of the PPRS significantly increased the rationality rate of total prescriptions (92.53% vs. 99.94%, P < 0.0001) and medical orders (97.77% vs. 99.99%, P < 0.0001). The proportion of prescriptions with high-frequency problems decreased significantly after the intervention, such as repeated medication (24.94% vs. 3.85%, P < 0.0001), and the proportion of prescriptions with usage and dosage issues also decreased (34.31% vs. 19.51%). The number of alerts intercepted by the PPRS has increased annually, with the proportion of Level 3 prescription alerts (12.77% vs. 15.71%) and Level 4 medical order alerts (42.40% vs. 55.48%) increasing, while the proportion of Level 2 alerts for prescriptions (2.61% vs.
0.91%) and medical orders (2.12% vs. 1.04%) generally showed a downward trend, reducing the frequency of invalid alerts. The construction and implementation of the PPRS is associated with enhanced rationality of prescriptions and medical orders. Under the guidance of the refined management pathway, a replicable template has been established to support clinical practice in smart healthcare, reduce invalid alerts, and promote personalized medication.
Current conceptions of special responsibilities often adopt a narrow, individualistic lens that fails to consider the broader socio-relational context. In response to this gap, we propose a concept of relational responsibility that emphasises the interconnectedness of individuals and the wider societal context in which they exist. We posit that assigning relational responsibilities should not solely hinge on the voluntary nature of one's relationships, but rather on the intrinsic value of these connections, as determined by individuals who hold pertinent roles within those relationships or who would be impacted by the definition of value. Our account acknowledges that many responsibilities, especially in caregiving contexts, are not chosen freely, and there should be normative limits to protect individuals from unreasonable burdens. Recognising the role of structural conditions in shaping responsibilities, we argue that collectives with the capacity and resources have an obligation to support individuals by mitigating these burdens and creating just conditions for care. This relational and structural reframing offers a more ethically attuned and practically responsive understanding of responsibility.
Emergency cesarean deliveries should meet the internationally recognized standard of ≤ 30 min between the decision and the delivery of the fetus, even though this is often impractical in Ethiopia. Longer decision-to-delivery intervals correlate with higher rates of perinatal morbidity and mortality. In Ethiopia, there is no pooled, nationally representative prevalence of a prolonged decision-to-delivery time interval. This systematic review and meta-analysis, based on the most recent data, presents the pooled prevalence of prolonged decision-to-delivery time intervals in emergency cesarean deliveries and the factors affecting them. The review was conducted using the Preferred Reporting Items for Systematic Reviews and Meta-Analyses (PRISMA) 2020 checklist. We searched the electronic databases PubMed/Medline, Scopus, Web of Science, Hinari, and Google Scholar, and included six articles that reported the decision-to-delivery time interval. The pooled prevalence of prolonged decision-to-delivery intervals and the factors associated with them were calculated using a random-effects model. Egger's test and funnel plots were used to further evaluate publication bias. All statistical analyses were conducted using STATA software, version 17.0. The results showed that the pooled prevalence of prolonged decision-to-delivery intervals (> 30 min) was 79.20% [95% CI = 73.46, 84.93]. Time spent gathering supplies and equipment necessary for emergency cesarean delivery (6.68, 95% CI = 3.92, 11.36), the timing of the decision (working day, weekend or night) (4.28, 95% CI = 1.17, 15.66), and the type of anesthesia (spinal or general) (3.53, 95% CI = 2.43, 5.13) were statistically significantly associated with prolonged decision-to-delivery time intervals.
Childbirth was not completed within the recommended interval (≤ 30 min) in most emergency cesarean deliveries in Ethiopia. Therefore, to address institutional delays in emergency cesarean deliveries, obstetric care providers and health facilities should be better prepared and ready for rapid emergency action. Additionally, it is important to discourage unreasonable gaps between decision-making and the child's birth.
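The random-effects pooling the review describes is commonly done with the DerSimonian-Laird estimator; a minimal sketch follows. The six study prevalences and variances below are invented placeholders, not the review's data, so the printed pooled estimate will not match the reported 79.20%.

```python
# DerSimonian-Laird random-effects pooling of study effects (here,
# proportions) with a 95% Wald confidence interval. Inputs are invented.
from math import sqrt

def dersimonian_laird(effects, variances):
    """Return (pooled effect, (ci_low, ci_high)) under a random-effects model."""
    w = [1 / v for v in variances]                   # fixed-effect weights
    fixed = sum(wi * e for wi, e in zip(w, effects)) / sum(w)
    q = sum(wi * (e - fixed) ** 2 for wi, e in zip(w, effects))  # Cochran's Q
    df = len(effects) - 1
    c = sum(w) - sum(wi ** 2 for wi in w) / sum(w)
    tau2 = max(0.0, (q - df) / c)                    # between-study variance
    w_star = [1 / (v + tau2) for v in variances]     # random-effects weights
    pooled = sum(wi * e for wi, e in zip(w_star, effects)) / sum(w_star)
    se = sqrt(1 / sum(w_star))
    return pooled, (pooled - 1.96 * se, pooled + 1.96 * se)

# Hypothetical prevalences and within-study variances for six studies.
effects   = [0.74, 0.81, 0.79, 0.85, 0.72, 0.83]
variances = [0.0012, 0.0010, 0.0015, 0.0011, 0.0014, 0.0013]
pooled, ci = dersimonian_laird(effects, variances)
print(f"pooled prevalence = {pooled:.3f}, 95% CI = ({ci[0]:.3f}, {ci[1]:.3f})")
```

In practice proportions are usually transformed (e.g., logit or Freeman-Tukey) before pooling and back-transformed afterward; the raw-proportion version above keeps the sketch short.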
Conspectus

In 1981, the year he won the Nobel Prize, Roald Hoffmann, together with Kazuyuki Tatsumi, published two papers entitled "Metalloporphyrins with Unusual Geometries" that strongly influenced the state of the art in porphyrin and tetrapyrrole research at that time. The 1970s and 1980s saw the dramatic expansion of bioinorganic chemistry, which used the tools of molecular coordination chemistry to model complex processes of metalloenzymes and metal cofactors. Synthetic porphyrin ligands emerged as key platforms for high-valent metal-oxo and -nitrido species, metal-metal multiple bonds, and an emerging organometallic chemistry with porphyrins as supporting ligands. These developments were the drivers for the Hoffmann papers, which featured extended Hückel calculations to expand the then-current understanding of metalloporphyrin types for which experimental evidence was just emerging, and which challenged the notion of porphyrin functioning simply as a tetradentate macrocycle with a coordinated metal ion sitting squarely in the middle.

Another unquestioned assumption of that time was that porphyrin and tetrapyrrole coordination chemistry was anchored firmly in the d block of the periodic table, a not unreasonable stance given the origins of the field in heme and vitamin B12, which feature iron and cobalt. Surprisingly, the presence of the group 2 element magnesium in chlorophyll had not tempted chemists to interrogate more deeply the role of main group elements in tetrapyrrole chemistry, and at the time of the Hoffmann papers, examples of s and p block elements in porphyrin complexes could almost be counted on one hand. In the nearly half century since then, as the chemistry of main group tetrapyrrole complexes has unfolded, major new examples of complexes with unusual geometries have emerged.
The chemistry of the d block elements is largely governed by oxidation states and d-electron configurations, while in the s and p blocks the fundamental properties of size and electronegativity dominate. Porphyrins and tetrapyrroles offer four nitrogen donors in a square-planar arrangement of fairly fixed radius, not an obvious fit for main group elements with their widely ranging sizes, electronegativities, coordination geometries, and bonding types. Main group tetrapyrrole complexes are "misfits" in which the poor match between the ligand environment and the requirements of the coordinated elements stimulates unusual chemistry for both partners.

In this Account, I will use the concept of metalloporphyrins with unusual geometries addressed in the Hoffmann papers to examine how tetrapyrroles bearing coordinated main group elements have extended these ideas well beyond those originally envisaged. Main group metals range from lightweight lithium to the p block heavies thallium, lead, and bismuth; all are known to form porphyrin complexes, some with dramatic out-of-plane metal coordination. The classic p block elements carbon, boron, and phosphorus challenge the "metalloporphyrin" paradigm; these small, light nonmetals nevertheless exhibit a rich chemistry in a tetrapyrrole setting. The extensive range of diboron porphyrinoids features tetrapyrroles acting as binucleating ligands, incorporating not one but two elements within the N4 coordination site. Silicon and germanium porphyrins and phthalocyanines demonstrate the interplay between the redox properties of the ligand and the central element. The underlying theme of this discussion will be the new concepts that can be translated into other areas of the chemical sciences.
With the reduction in police involvement in front-line mental health responses in New Zealand (implemented November 2024), this study evaluates the primary legal justification supporting paramedics in using force to prevent suicide: Section 41 of the Crimes Act 1961. We conducted a qualitative analysis of 20 legal cases (1986-2023) identified through a systematic search. The analysis was structured thematically, focussing on the legal criteria for intervention, the role of de-escalation, the threshold for "reasonable force", and the unresolved conflict between intervention and patient autonomy. Cases were included if Section 41 was central to the legal reasoning or invoked as a justification; cases where Section 41 was cited in passing without substantive discussion were excluded. The case review suggests that while Section 41 can provide a legal justification, its application is highly context-dependent. Courts have held that force is justifiable only to prevent immediate and unlawful harm and that it must be proportionate to the threat. Failing to attempt viable, less restrictive alternatives can render even minimal force unreasonable. Cases involving excessive force demonstrate judicial focus on proportionality and the availability of other options. The analysis highlights a legal and ethical "grey area" concerning patient capacity and the limits of intervention. Case law interpreting Section 41 remains limited (20 substantive cases identified between 1986 and 2023). To improve safety for patients and practitioners, we suggest clearer operational guidance, multidisciplinary education, and consideration of legislative options that would better support healthcare-led responses to mental health emergencies.
With the rapid development of intensive aquaculture, unreasonable stocking density has become a major factor restricting the healthy growth of grass carp (Ctenopharyngodon idella). This study evaluated the effects of three stocking densities (0.57, 1.13, and 2.27 kg/m3) on the growth performance, stress response, antioxidant capacity, and immunity of grass carp. Grass carp with an initial body weight of 81.76 ± 17.69 g were randomly assigned to three density groups, each with three replicates. After 75 days of cultivation, fish were randomly sampled and their growth performance was measured. Reagent kits were used to detect serum biochemical indicators, kidney immune enzyme activities, and liver antioxidant indicators in each treatment group. The expression of spleen immune-related genes was detected using real-time quantitative PCR (RT-qPCR). The final body weight, weight gain rate, specific growth rate, and condition factor were significantly higher in the medium-stocking-density group (p < 0.05). High stocking density significantly increased serum cortisol, glucose, transaminase, creatinine, and urea nitrogen levels and decreased cholesterol and triglyceride levels (p < 0.05). For immune parameters, the levels of immunoglobulin M (IgM) and the activities of lysozyme (LZM), antimicrobial peptide (AMP), alkaline phosphatase (AKP), and acid phosphatase (ACP) in the kidneys decreased with increasing density. The mRNA levels of IL-1β, IL-6, TNF-α, and IL-10 in the spleen were significantly upregulated, while IgM was downregulated, in the high-density group (p < 0.05). Regarding antioxidant capacity, hepatic total antioxidant capacity (T-AOC), catalase (CAT), and glutathione (GSH) levels first increased and then decreased with increasing density, while malondialdehyde (MDA) content increased continuously. Collectively, these findings suggest that high stocking density induces growth inhibition, oxidative stress, and immune dysfunction in grass carp.
In this study, the medium stocking density of 1.13 kg/m3 was optimal for the growth and physiological health of grass carp, providing a scientific basis for optimizing intensive farming strategies.
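The growth-performance metrics reported for the grass carp trial follow standard aquaculture definitions. As an illustrative sketch, assuming the conventional formulas (the study does not state its exact equations) and a hypothetical final weight — only the 81.76 g initial weight and the 75-day duration come from the abstract — they can be computed as:

```python
import math

def weight_gain_rate(w_initial: float, w_final: float) -> float:
    """Weight gain rate (%) = (Wf - Wi) / Wi * 100."""
    return (w_final - w_initial) / w_initial * 100

def specific_growth_rate(w_initial: float, w_final: float, days: int) -> float:
    """Specific growth rate (%/day) = (ln Wf - ln Wi) / days * 100."""
    return (math.log(w_final) - math.log(w_initial)) / days * 100

def condition_factor(weight_g: float, length_cm: float) -> float:
    """Fulton's condition factor K = 100 * W / L^3 (W in g, L in cm)."""
    return 100 * weight_g / length_cm ** 3

# Hypothetical example: 81.76 g initial weight (as in the study) and a
# made-up 240 g final weight after the 75-day trial.
print(f"WGR = {weight_gain_rate(81.76, 240.0):.1f} %")
print(f"SGR = {specific_growth_rate(81.76, 240.0, 75):.2f} %/day")
```

The logarithmic form of the specific growth rate assumes exponential growth over the trial, which is why it is preferred to the simple weight gain rate when comparing groups of different durations.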