Canadian Blood Services implemented a re-entry programme in 2014. Donors deferred because of false-reactive or indeterminate screening tests can provide a specimen for re-entry testing 6 months after their initial test, and resume donating if they test negative for all infectious markers. We evaluated the impact of the programme on donor return and retention, and assessed factors associated with reactive results upon re-entry testing. Using extracted data from our donor database, we followed up donors who tested false-reactive for hepatitis B virus (HBV), human immunodeficiency virus (HIV) and hepatitis C virus (HCV) from the implementation of the programme on 3 February 2014 until 30 June 2025 (n = 8094). We characterized the donors reaching each step of the programme by sex, age, donation status and infection marker. We employed logistic regression to identify important predictors of obtaining reactive results upon re-entry testing. Overall, 17.3% of donors eligible for re-entry testing successfully returned and donated. Re-entry testing participation rates were 35.5%, with repeat and older donors more likely to participate. Among re-entry tested donors, 59.2% tested negative. Most donors continued to donate for multiple years after re-entry. Upon return, 13.1% tested false-reactive on a subsequent donation. False-reactive results upon re-entry testing were more likely for first-time donors and less likely when the re-entry test used a different assay from the one that produced the initial false-reactive result. The implementation of a donor re-entry programme has resulted in donor retention and increases in the blood supply. Sending reminder notifications to eligible donors could increase participation.
Systematic evaluations of blood discard patterns remain limited. We analysed decade-long discard trends at a representative blood establishment to identify quality improvement opportunities. Discard data for blood products collected from 1 January 2015 to 31 December 2024 were obtained from the Beijing Red Cross Blood Center, classifying discards as test-related or non-test-related. Pareto charts identified major drivers of discard, and χ² and Cochran-Armitage trend tests were used. The overall discard rate averaged 3.050%, peaking at 4.398% in 2021 before declining to 2.410% in 2024 (Z = -6.111, p < 0.0001). Non-test-related discards increased significantly from 0.720% in 2015 to 2.636% in 2021 (mean: 1.118%/year; Z = 26.478, p < 0.0001), before recovering to below pre-pandemic levels by 2024. Test-related discards decreased significantly from 2.254% in 2015 to 1.663% in 2020 and continued to decrease through 2024 (mean: 1.921%/year; Z = -27.454, p < 0.0001). Major discard drivers included lipaemic blood retest (LBR) in plasma, crossmatch incompatibility (CI) in red blood cell (RBC) products and alanine aminotransferase (ALT) retest failure. Plasma bag leakage (BL), platelet visual inspection abnormality (VIA), expired units and Treponema pallidum antibody (TP-Ab) and hepatitis C virus antibody (HCV-Ab) test-related discards all showed significant downward trends (p < 0.001). Divergent trends between test-related discards (declining from 2.254% to 1.663%) and non-test-related discards (increasing from 0.720% to 2.636%) indicate that although laboratory screening has improved substantially, greater attention is needed to optimize collection, processing and handling practices.
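The Cochran-Armitage statistic behind the reported Z values can be sketched in a few lines. The discard counts and denominators below are hypothetical, since the abstract reports only rates, not unit counts; a negative Z indicates a downward trend in the discard proportion, matching the sign convention of the reported results.

```python
import numpy as np

def cochran_armitage_z(events, totals, scores=None):
    """Cochran-Armitage test for a linear trend in proportions.

    events: discarded units per year (hypothetical counts)
    totals: units handled per year
    scores: ordinal scores for the years (default 0, 1, 2, ...)
    Returns the Z statistic (negative = downward trend).
    """
    events = np.asarray(events, dtype=float)
    totals = np.asarray(totals, dtype=float)
    if scores is None:
        scores = np.arange(len(events), dtype=float)
    scores = np.asarray(scores, dtype=float)
    n = totals.sum()
    p_bar = events.sum() / n  # pooled discard proportion
    # Score-weighted deviation of observed from expected counts
    t = np.sum(scores * (events - totals * p_bar))
    var_t = p_bar * (1 - p_bar) * (
        np.sum(totals * scores**2) - np.sum(totals * scores) ** 2 / n
    )
    return t / np.sqrt(var_t)

# Illustrative: discards falling from 3.0% to 1.5% over four years
z = cochran_armitage_z([300, 250, 200, 150], [10000] * 4)
```

With these made-up counts the statistic is strongly negative, mirroring the direction (though not the magnitude) of the test-related discard trend reported above.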
Iron deficiency (ID) and iron deficiency anaemia (IDA) are prevalent conditions affecting diverse patient populations in both surgical and non-surgical settings. The advent of patient blood management (PBM) has promoted intravenous (IV) iron therapy as an alternative to oral iron and blood transfusions. However, concerns remain regarding IV iron's potential association with infection risk. This narrative review critically examines the relationship between parenteral iron therapy and infection risk across various clinical settings. It evaluates various IV iron formulations, their benefits, safety profiles and potential adverse effects, particularly infection-related complications. A structured literature search was conducted across PubMed, EMBASE, Medline and CINAHL (2014-2024) using pre-defined keywords. Observational studies and clinical trials relevant to IV iron formulations and infection risk were analysed. IV iron therapy effectively improves haemoglobin levels and reduces transfusion dependence. Studies in cardiovascular, renal, antenatal and surgical populations suggest that it is difficult to conclude that IV iron therapy significantly increases the risk of infection. Older formulations, high-dose IV iron therapy and various underlying conditions may elevate infection susceptibility due to increased levels of non-transferrin-bound iron. Emerging formulations, such as ferric carboxymaltose and ferric derisomaltose, appear to have a more favourable safety profile. While IV iron remains a cornerstone in ID management, patient-specific risk factors must be considered. Further research is needed to clarify infection risk variations among different IV iron formulations and patient populations. Optimizing IV iron therapy through individualized approaches may enhance its clinical benefits while minimizing potential adverse effects.
Daratumumab, a therapeutic human anti-CD38 monoclonal antibody, improves multiple myeloma outcomes but interferes with pre-transfusion testing by binding CD38 on reagent red blood cells (RBCs), potentially masking clinically significant alloantibodies. This study aimed to evaluate the effectiveness of a novel high-affinity anti-idiotypic anti-daratumumab reagent in neutralizing daratumumab interference while preserving RBC alloantibody detection and ensuring compatibility with routine transfusion laboratory workflows. A non-interventional, multicentre study across 28 transfusion laboratories in 15 countries evaluated a novel anti-idiotypic anti-daratumumab reagent (DaraClear). In total, 443 daratumumab-containing plasma samples and 197 RBC alloantibody-containing samples were tested. All samples were initially tested at 10% (v/v), with stepwise escalation to 20% and 30% only if neutralization was incomplete. Neutralization efficiency, antibody detection, cross-match resolution and workflow integration were assessed, alongside comparisons with dithiothreitol (DTT), papain, trypsin and DaraEx. Daratumumab interference was neutralized in 99.5% of samples, with 86.2% resolved using the 10% protocol. Detection of 190/197 RBC antibodies (96.4%) was preserved, including weak/low-titre antibodies. Neutralization was superior to DTT (93.3%), papain/trypsin (84.9%) and DaraEx (68.4%; all p < 0.0001). Titration studies confirmed efficacy across a wide range of daratumumab titres. The reagent integrated seamlessly into workflows, remained stable over time and avoided RBC modification. Daratumumab interference was efficiently neutralized while preserving alloantibody detection. The robust performance and workflow compatibility provide a practical solution for pre-transfusion testing in daratumumab-treated patients, supporting safe and timely transfusions with reduced laboratory burden.
The For the Assessment of Individualized Risk (FAIR) framework, introduced by NHS Blood and Transplant (NHSBT) in 2021, aims to reduce stigma and improve equity in blood donor selection, particularly for gay, bisexual and other men who have sex with men (GBMSM). While pre-exposure prophylaxis (PrEP) is highly effective at preventing sexual transmission of human immunodeficiency virus, its declared use excludes individuals from blood donation. This study examined PrEP use among male blood donors with current or past syphilis in England to evaluate guideline compliance and implications for blood safety. Residual plasma samples from syphilis-positive male blood donors collected in 2023 were tested for PrEP. These data were combined with two previous studies of syphilis-positive donors conducted between July 2018 and June 2024, incorporating demographics and reported PrEP use. The rate of syphilis-positive blood donations increased from 4.09 to 10.32 per 100,000 donations between 2018 and 2024 (p = 0.048, Mann-Kendall trend test), with a rising proportion of past syphilis cases attributed to GBMSM (18%-37%; p = 0.004, Fisher's test; p = 0.001, Mann-Kendall test); 7.1% of syphilis-positive blood samples from male blood donors tested positive for PrEP in 2023, indicating frequent non-compliance with donation guidelines. Persistent PrEP use among syphilis-positive donors since 2018 suggests gaps in donor education regarding eligibility. Targeted public health interventions, particularly for younger GBMSM, are needed to strengthen sexual health education, PrEP messaging and awareness of donation criteria. Further research into other infections associated with high-risk sexual behaviour is warranted.
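The Mann-Kendall trend test cited above is straightforward to compute. The sketch below omits the tie correction for brevity, and the intermediate annual rates are illustrative, since the abstract reports only the 2018 and 2024 endpoints.

```python
import numpy as np

def mann_kendall_z(series):
    """Mann-Kendall trend test Z statistic (no tie correction).

    Positive Z indicates an upward trend, e.g. in annual
    syphilis-positive donation rates per 100,000 donations.
    """
    x = np.asarray(series, dtype=float)
    n = len(x)
    # S: net count of upward vs. downward pairwise movements
    s = sum(
        np.sign(x[j] - x[i])
        for i in range(n - 1)
        for j in range(i + 1, n)
    )
    var_s = n * (n - 1) * (2 * n + 5) / 18.0
    # Continuity-corrected normal approximation
    if s > 0:
        return (s - 1) / np.sqrt(var_s)
    if s < 0:
        return (s + 1) / np.sqrt(var_s)
    return 0.0

# Endpoints from the abstract; intermediate values are made up
rates = [4.09, 5.5, 6.8, 8.0, 9.1, 10.32]
z = mann_kendall_z(rates)
```

For a strictly increasing six-point series the statistic exceeds 1.96, consistent with the significant upward trend reported.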
Within the digital transformation of medicine, transfusion medicine has quietly become a big-data discipline. The long-standing tradition of blood product standardization (e.g., ISBT-128) and large donor cohorts followed over many years, some of which are sampled in national biobank projects, create a favourable setting. In parallel, recent advances in artificial intelligence (AI) and data integration facilitate efficient data use for research and clinical care. Consequently, next-generation blood services might monitor donor phenotype data and match this information to AI-predicted recipient demands and their outcomes. Here, we attempt to provide a comprehensive introduction to the possibilities and challenges of big data and AI in transfusion medicine, along with data integration opportunities related to the Fast Healthcare Interoperability Resources standard. We outline the principles of AI and the digital transformation of transfusion medicine and analyse the evidence of blood establishments as digital platforms. We illustrate possible roadmaps for data integration and how federated learning initiatives and national networks may scale value while preserving donor and patient privacy. Finally, we exemplify the ongoing transformation with precision red blood cell (RBC) diagnostics using lab-on-a-chip and the digital crossmatch. The practice of transfusion medicine is undergoing transformation and experimentally appears to profit from synergies in precision diagnostics and AI. Its translation into routine practice remains a challenge for the current decade to leverage the full potential of blood establishments as 'big-data engines'.
The Paris 2024 Olympic Games posed a unique challenge due to their scale, associated risks and the need for robust healthcare preparedness. This review outlines the forecasting and anticipatory measures taken by the Etablissement français du sang (EFS) to ensure a resilient blood supply chain throughout the event. A steering committee was formed 2 years in advance to coordinate risk assessments, operational planning and institutional collaboration. The EFS aimed to maintain daily reserves of 90,000 red blood cell (RBC) units, anticipating a 25% shortfall in the Île-de-France region, which was to be balanced by increased contributions from other regions. To support operations, logistical strategies involved the prepositioning of supplies, securing transport routes and reinforcing both trauma centres and Olympic venues. Staffing was also adapted, with changes to work schedules, an extended summer leave period and provisions for remote work at EFS headquarters. In anticipation of seasonal infectious risks, nucleic acid testing (NAT) was implemented for West Nile virus (WNV) and dengue (DENV) in high-risk areas. Cybersecurity measures were also reinforced through strengthened information technology infrastructure and integration with the national cyber crisis response system. These proactive measures proved effective: blood stocks remained stable, only a few arboviral infections were detected and no major cyber incidents occurred. The Paris 2024 experience emphasizes the importance of early, coordinated and cross-sectoral planning in safeguarding national blood supplies during mass events. The centralized structure of the EFS, along with its integration into public health systems and past experience with major events, enabled uninterrupted and resilient service delivery.
Vasovagal reaction (VVR) is the most frequent adverse reaction in blood donation and usually resolves quickly without complications. However, syncopal VVR can result in severe injury due to a fall. Some reports have shown the possibility of preventing VVR with salt-loading prior to blood donation. This semi-randomized controlled study aims to clarify the preventive effect of salt-loading on VVR among whole blood (WB) donors. During the study period, donors who donated 400 mL of WB on odd days were assigned to the salt-loading group and those who donated on even days to the control group. Participants in the salt-loading group were asked to take three tablets containing 0.3 g salt in total. The incidence of VVRs occurring >20 min after needle removal (delayed VVR [dVVR]) did not significantly differ between the two groups. Secondary analysis suggested a possible dose-dependent preventive effect against dVVR (odds ratio [OR]: 0.70; 95% confidence interval [CI]: 0.52-0.94; p = 0.02). The number of delayed syncopal VVR episodes in summer was lower in the salt-loading group than in the control group (n = 1 vs. 8; OR: 0.13; p = 0.04; pre-specified α = 0.0125). There was no significant preventive effect of pre-donation salt-loading on VVR overall. Further investigations with a sufficient number of participants are needed to confirm the suggested dose dependency and seasonal preventive effect on dVVRs.
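The odds ratios quoted above come from standard 2x2 contingency calculations. A minimal sketch with a Wald confidence interval follows; the group denominators are hypothetical, since the abstract reports only the event counts (1 vs. 8) for the seasonal analysis.

```python
import math

def odds_ratio_ci(a, b, c, d):
    """Odds ratio and 95% Wald confidence interval for a 2x2 table.

    a/b: events/non-events in the exposed (e.g., salt-loading) group
    c/d: events/non-events in the control group
    A 0.5 continuity correction is applied when any cell is zero.
    """
    if 0 in (a, b, c, d):
        a, b, c, d = a + 0.5, b + 0.5, c + 0.5, d + 0.5
    or_ = (a * d) / (b * c)
    se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)  # SE of log(OR)
    z = 1.96  # 95% CI normal quantile
    lo = math.exp(math.log(or_) - z * se)
    hi = math.exp(math.log(or_) + z * se)
    return or_, lo, hi

# Hypothetical equal group sizes of 1000 with the reported 1 vs. 8 events
or_, lo, hi = odds_ratio_ci(1, 999, 8, 992)
```

With these assumed denominators the point estimate lands near the reported OR of 0.13, but the wide Wald interval illustrates why a study with few events cannot settle the question on its own.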
Liquid plasma (LQP) stands out as an alternative to thawed plasma (TP) for emergent transfusions due to its longer shelf-life. We aimed to measure fibrinogen, protein C (PC), protein S (PS), factor V (FV), factor VII (FVII) and factor VIII (FVIII) activity in LQP, quantify how these factors' levels change during storage and characterize how they compare in LQP and TP. Coagulation factor activities were measured on days 15, 26 and 27 for LQP (n = 26) and Day 5 for TP (n = 31). Bayesian statistical methods were used to compare coagulation factor activity and quantify changes in activity during storage. Fibrinogen and PC activity on Day 26 in LQP (LQP26) was comparable to that on Day 5 in TP (TP5), with posterior mean activity of 257 versus 246 mg/dL and 100.4% versus 108.7%, respectively. FV, FVII and FVIII had lower activity in LQP26 compared to TP5, with posterior mean activities of 42.6% versus 72.0%, 55.0% versus 59.7% and 48.8% versus 59.2%, respectively. PS in LQP26 was low, with posterior mean activity of 28.0%, which was less than half that of TP5 at 66.4%. From Day 15 to Day 26, FVII in LQP decreased at a rate of 3.49% per day, whereas fibrinogen, PC, PS, FV and FVIII activity in LQP remained relatively stable. LQP26 has activities of fibrinogen, PC and FVII comparable to TP5, lower activities of FV and PS and slightly lower activity of FVIII. LQP is a viable alternative for use in emergency transfusions and massive transfusion protocols.
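The abstract does not specify which Bayesian model produced the posterior means, so the following is only a minimal conjugate normal-normal sketch of how two group means can be compared in a Bayesian way. The synthetic data are centred on the reported FV posterior means (42.6% vs. 72.0%); the prior and noise parameters are assumptions.

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic FV activity (%) for LQP Day 26 (n=26) and TP Day 5 (n=31),
# centred on the reported posterior means for illustration only
lqp26 = rng.normal(42.6, 10.0, size=26)
tp5 = rng.normal(72.0, 10.0, size=31)

def posterior_mean_sd(x, prior_mean=60.0, prior_sd=50.0, noise_sd=10.0):
    """Conjugate normal-normal posterior for a group mean
    (measurement SD treated as known for simplicity)."""
    n = len(x)
    prec = 1 / prior_sd**2 + n / noise_sd**2  # posterior precision
    mean = (prior_mean / prior_sd**2 + x.sum() / noise_sd**2) / prec
    return mean, np.sqrt(1 / prec)

m_lqp, s_lqp = posterior_mean_sd(lqp26)
m_tp, s_tp = posterior_mean_sd(tp5)

# Posterior probability that TP5 activity exceeds LQP26 activity
draws = rng.normal(m_tp, s_tp, 100_000) - rng.normal(m_lqp, s_lqp, 100_000)
p_tp_higher = (draws > 0).mean()
```

With a 30-point gap between the groups, the posterior probability that TP5 exceeds LQP26 is effectively 1, which is the Bayesian analogue of the clear FV difference reported above.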
To maintain a stable blood supply while protecting donor and recipient health, the World Health Organization (WHO) recommends minimum inter-donation intervals of 12 weeks for men and 16 weeks for women. These recommendations are based merely on observational studies reporting reduced haemoglobin (Hb) and iron levels and increased deferrals in repeat donors. This systematic review aims to examine the best available experimental evidence on how whole-blood donation frequency impacts donor health. PubMed (Medline, PMC and NCBI Bookshelf), Embase and the Cochrane Library were systematically searched by two reviewers independently. Risk of bias and certainty of evidence were evaluated using the Grading of Recommendations, Assessment, Development and Evaluation (GRADE) approach. Four records on three randomized controlled trials (RCTs) were included. The results showed that donating every 12 weeks may reduce Hb and ferritin levels compared to not donating. Similarly, shortening inter-donation intervals (from 12 to 8 or 10 weeks for men and from 16 to 12 or 14 weeks for women) may lead to decreased Hb and ferritin levels, increased low-Hb deferral and increased reporting of tiredness, fainting, dizziness and restless legs. Evidence was of low to very low certainty, and limitations in study design and imprecision preclude strong recommendations on optimal inter-donation intervals. In the absence of more robust evidence and pending future high-quality RCTs, it is prudent to adhere to the current WHO guidelines on minimum inter-donation intervals for whole-blood donation, that is, 12 weeks for men and 16 weeks for women, as a precautionary measure.
The functions of cold-stored platelets (PLTs) have recently been re-evaluated. In Japan, cold storage may be started immediately after blood collection (Day 0, control group), after the period for improving bacterial screening sensitivity has ended (Day 2, DCS2) or after the expiration date of room-temperature storage has passed (Day 5, DCS5). The present study investigated the effects of different cold storage start dates on PLT function. Concentrated PLTs, collected using an apheresis device, were placed in an additive solution (Terumo-platelet additive solution+, Terumo BCT) and then divided into three groups with different cold storage start dates (days 0, 2 or 5). The three groups were stored until Day 21, and various PLT function parameters were measured and compared, including aggregation (adenosine diphosphate [ADP] + collagen-induced and ristocetin-induced aggregation), viscoelasticity (maximum clot firmness [MCF], clotting time and α-angle) and PLT activation (CD62P, PAC-1, CD63 and Annexin V). Comparisons of results until Day 14 revealed that among the PLT function parameters examined, only ADP + collagen-induced aggregation and MCF showed a deterioration in DCS2 and DCS5. However, aggregation and MCF values in DCS2 and DCS5 on Day 10 were not markedly different from those in the control group on Day 14 (US expiration date). PLTs stored at room temperature for 2 or 5 days after collection and then refrigerated may be preserved until Day 10.
China faces a tight balance between clinical blood supply and demand. Vasovagal reactions (VVRs) hinder donor recruitment and retention. This study aimed to develop and validate a predictive model for the early identification of high-risk donors to guide targeted interventions. This unmatched case-control study enrolled whole-blood donors from a blood station in Xinjiang, China. VVR was defined according to the International Society of Blood Transfusion (ISBT) criteria for donation-related complications. Significant predictors identified by multivariable logistic regression were used to develop a prediction model. Model performance and calibration were evaluated using receiver operating characteristic (ROC) analysis and the Hosmer-Lemeshow (H-L) test. A total of 319 donors experienced VVRs, and 1276 served as controls. The final prediction model incorporated nine significant predictors: demographic factors (age, pulse, systolic blood pressure [SBP] and body mass index [BMI]), psychological factors (perceived stress level) and behavioural factors (number of donations, regular exercise, postprandial interval [≥4 h] and average sleep duration). The model demonstrated good discrimination, with areas under the ROC curve of 0.830 and 0.827 in the modelling and validation sets, respectively. Good calibration was indicated by H-L test results (modelling set: χ² = 13.817, p = 0.129; validation set: χ² = 8.698, p = 0.466). The constructed prediction model is effective, providing a reference for blood station staff to assess the risk of VVRs in whole-blood donors and implement early interventions.
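The modelling pipeline described above (logistic regression followed by ROC evaluation) can be sketched end to end. This is a synthetic two-predictor stand-in, not the published nine-predictor model: the coefficients, data and base rate below are invented purely to show the mechanics.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic stand-ins for two of the nine reported predictors
# (standardized age and prior-donation count); coefficients are
# illustrative, not the published estimates
n = 2000
age_z = rng.normal(size=n)
don_z = rng.normal(size=n)
true_logit = -1.0 - 0.8 * age_z - 0.6 * don_z  # younger, first-time donors at higher risk
y = (rng.uniform(size=n) < 1 / (1 + np.exp(-true_logit))).astype(float)

# Maximum-likelihood logistic fit by plain gradient descent
X = np.column_stack([np.ones(n), age_z, don_z])
beta = np.zeros(3)
for _ in range(20_000):
    p = 1 / (1 + np.exp(-X @ beta))
    beta -= 0.5 * X.T @ (p - y) / n  # averaged log-likelihood gradient

# Discrimination: area under the ROC curve via the
# Mann-Whitney rank identity (probability a random case
# scores higher than a random control)
scores = X @ beta
pos, neg = scores[y == 1], scores[y == 0]
auc = (pos[:, None] > neg[None, :]).mean()
```

On this synthetic data the fitted coefficients recover the negative signs of the generating model and the AUC lands well above 0.5, the same qualitative pattern as the reported 0.83 discrimination.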
Accurate detection of transfusion-transmissible infections, such as human immunodeficiency virus (HIV) and hepatitis C and B viruses (HCV/HBV), is critical to ensure blood safety, and screening assays must demonstrate high specificity and sensitivity. In this study, we compared the performances of Elecsys® HIV Duo, HCV Duo, Anti-HCV II and HBsAg II immunoassays with those of comparator assays HIV Ag/Ab Combo, Anti-HCV II and HBsAg on the Alinity® s platform and HIV Ag/Ab Combo, Anti-HCV and HBsAg Qualitative II on the Alinity i platform. Approximately 2050 plasma samples from first-time blood donors and commercial seroconversion panels were used. Specificity was assessed as the proportion of true non-reactive samples identified by each assay. Seroconversion sensitivity was evaluated based on the detection time and average interval from nucleic acid testing (NAT)-positivity to assay reactivity. Overall, all Elecsys assays evaluated had similar specificities to the corresponding Alinity s/i assays, although the absolute difference in specificity between the Alinity i Anti-HCV assay and the other HCV assays was statistically significant. The seroconversion sensitivities of Elecsys HIV Duo and HBsAg II assays were similar to the corresponding Alinity s/i assays. For HCV, the Elecsys HCV Duo assay detected infection earlier than the Alinity s/i assays for most panels (87.0% and 91.8%, respectively), with an average detection time from NAT positivity for HCV RNA of 1.5 days versus 21.4 and 23.8 days, respectively. The robust specificities and early detection capabilities of the evaluated Elecsys assays support their routine use in blood donor screening and diagnosis.
Patients with haematological malignancies are at high risk of anaemia due to disease and treatment factors, making patient blood management (PBM) strategies critical. We aimed to identify the barriers and enablers to implementing the principles of PBM to conserve patients' own blood and reduce the risk of iatrogenic anaemia in inpatients receiving treatment for haematological malignancies. A cross-sectional study of inpatient clinicians was conducted using a survey. Acute care clinicians (nursing and medical) from two wards that treat haematological malignancies at one hospital were invited to complete an online survey between December 2024 and January 2025. The 31 questions (answered on a 5-point Likert scale) were designed to assess awareness, enablers and barriers related to PBM. The survey was conceptually aligned to the 14 domains of the Theoretical Domains Framework and the six domains of the Capability, Opportunity, Motivation-Behaviour Model. Responses were condensed into barrier, neutral and enabler categories, and proportions were calculated. The questionnaire was completed by 80 participants (52% response rate), 82% being nurses. Barriers were found in five domains, namely reinforcement, knowledge, behavioural regulation, training and goals. Enablers were found in 10 domains: beliefs about consequences, intentions, role and identity, emotion, skills, environment, social influence, beliefs about capabilities, memory and equipment. This study highlights the barriers and enablers to implementing PBM strategies into inpatient haematology practice. Future implementation studies should build on these enablers to assist in overcoming the barriers to implementing PBM strategies in this vulnerable patient cohort.
Complexity science investigates how complex systems behave and how we interact with them. Its implementation in clinical transfusion research is limited, even though human cells, organ systems, bodies, hospitals and blood supply systems are all examples of complex systems. Complexity science can help us to better understand the systems we study using novel and existing research methods. It also highlights that there is always unpredictability in complex systems, challenging us to both accept and cope with this uncertainty. This review provides a brief introduction of complex systems, explores why many transfusion-related research questions should be viewed through the lens of complexity and discusses how we can apply this perspective to transfusion research.
Provision of red cells for intrauterine transfusion (IUT) can be challenging when the mother has an antibody to a high-frequency antigen, as compatible allogeneic red cells may be scarce. The aim was to validate a novel component: split, frozen, thawed, washed and irradiated red cells for IUT, which could be used if foetal transfusion with maternal blood is necessary. Six standard red cell concentrates were glycerolized on Day 5 and split into three smaller packs before freezing. A split from each donation was thawed and deglycerolized using the ACP215 and resuspended in saline-adenine-glucose-mannitol. The haematocrit was adjusted to the range specified for IUT (70%-85%). The components, each approximately 80 mL, were immediately irradiated and tested for full blood count, haemolysis and supernatant potassium 8 and 24 h later. At 8 and 24 h post manufacture, haemolysis was 0.20% (0.16%-0.25%) and 0.24% (0.18%-0.29%), respectively, and supernatant potassium was 20.3 mmol/L (15.1-27.1 mmol/L) and 43.1 mmol/L (33.5-58.4 mmol/L), respectively (mean [range]). Both haemolysis and potassium were within the levels accepted for standard irradiated IUT units. Red cells split, frozen, thawed, washed, manufactured into IUT units and immediately irradiated are expected to be a beneficial component for very rare clinical cases.
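The haemolysis percentages reported above are conventionally derived from the haematocrit and the free and total haemoglobin of the unit. A minimal sketch of that commonly used calculation follows; the input values are hypothetical examples, not measurements from this study.

```python
def percent_haemolysis(supernatant_hb_g_l, total_hb_g_l, haematocrit_frac):
    """Percent haemolysis of a red cell component, per the commonly
    used formula: %haemolysis = (1 - Hct) x free Hb / total Hb x 100.

    supernatant_hb_g_l: free (supernatant) haemoglobin, g/L
    total_hb_g_l: total haemoglobin of the unit, g/L
    haematocrit_frac: haematocrit as a fraction (e.g., 0.75)
    """
    return (1 - haematocrit_frac) * supernatant_hb_g_l / total_hb_g_l * 100

# Hypothetical unit at the low end of the IUT haematocrit range
h = percent_haemolysis(supernatant_hb_g_l=2.0,
                       total_hb_g_l=200.0,
                       haematocrit_frac=0.75)
```

With these example inputs the result is 0.25%, the same order as the 0.20%-0.24% mean values reported for the validated component.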