Introduction: Venous thromboembolism (VTE) is a recognized postoperative complication in adults; however, limited data exist regarding VTE after surgery in children. The purpose of this study was to determine whether the incidence and associated clinical characteristics of pediatric VTE differ between orthopedic surgery-related (OSR) and non-orthopedic surgery-related (NSR) procedures. Methods: Patients younger than 19 years with imaging-confirmed VTE (ultrasound [US] or computed tomography [CT]) within 30 days of an inpatient surgical procedure between January 1, 2009, and December 31, 2016, at Boston Children's Hospital, a single tertiary pediatric hospital in Boston, USA, were retrospectively identified. VTE was classified as OSR or NSR based on the index surgical procedure. The primary outcome was the incidence of VTE per 10,000 inpatient surgeries in the OSR and NSR cohorts. Secondary measures included VTE anatomic location and associated patient and procedural characteristics, including age, trauma, ambulatory status, central or peripherally inserted central catheter use, malignancy, infection, and clotting disorder or family history of VTE. Group differences were assessed using chi-square or Fisher's exact tests with a significance level of 0.05. Results: Among 233,339 inpatient surgical procedures, 86 VTE events were identified, corresponding to an overall incidence of 3.69 per 10,000 cases. The incidence was 5.56 per 10,000 in the OSR cohort and 3.22 per 10,000 in the NSR cohort. OSR VTEs occurred predominantly in the lower extremity, whereas NSR VTEs were more evenly distributed between the lower and upper extremities. Patients with OSR VTEs were older than those with NSR VTEs (mean age of 15.1 versus 7.9 years; p < 0.001). Trauma and non-ambulatory status were more common among OSR VTEs, whereas central or peripherally inserted central catheter use and malignancy were more frequent among NSR VTEs (all p < 0.05). 
Postoperative VTE prophylaxis was used more frequently in the NSR cohort than in the OSR cohort (p < 0.05). Conclusion: Pediatric OSR and NSR VTEs differ in incidence, age distribution, anatomic pattern, and associated clinical context. Recognition of these cohort-specific profiles may support more informed perioperative assessment and vigilance in pediatric surgical patients.
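The incidence figures above are straightforward rate arithmetic; as a quick check, a minimal Python sketch using only the overall counts reported in the abstract (the cohort-specific denominators are not given, so only the overall rate is reproduced):

```python
def incidence_per_10k(events: int, procedures: int) -> float:
    """Crude incidence rate per 10,000 procedures."""
    return events / procedures * 10_000

# Overall counts from the abstract: 86 VTE events among 233,339 surgeries
overall = incidence_per_10k(86, 233_339)  # ≈ 3.69 per 10,000
```

The same function applied to each cohort's event count and denominator would yield the reported 5.56 (OSR) and 3.22 (NSR) per 10,000.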
As the aging Indian population expands, anesthesiologists will increasingly encounter frail surgical patients. We aimed to study the prevalence of frailty, its association with sarcopenia, and its impact on postoperative outcomes among older patients (>65 years) undergoing lower limb orthopedic surgeries. Frailty was assessed using the Clinical Frailty Score (CFS), and values were compared with ultrasound-guided quadriceps muscle thickness measurements for sarcopenia, after normalizing for body mass index (BMI) and body surface area (BSA). Perioperative characteristics and postoperative outcomes were compared with frailty using Spearman's correlation and the Chi-square test. An ROC curve was plotted to evaluate the validity of the normalized quadriceps muscle thickness value in diagnosing frailty. Frailty was present in 80.1% of our study population, with significant associations with sarcopenia (Spearman's correlation coefficients -0.182 and -0.237 for quadriceps muscle thickness normalized for BMI and BSA, P < 0.001), age (72.35 (64.2, 80.5) versus 68.76 (64.97, 72.55) years, P = 0.001), ASA classification (P < 0.001), and surgical type (P = 0.006). Compared to nonfrail patients, patients with frailty had higher rates of postoperative ICU admission (P = 0.003), longer ICU (P = 0.005) and hospital stays (P = 0.005), and more postoperative complications (39.4% vs. 12.9%; P = 0.001; OR 3.061 (1.636-5.730)). Frailty occurs in a large percentage of older orthopedic patients and significantly impacts postoperative outcomes.
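The odds ratio and confidence interval reported above follow the standard 2×2 log-odds construction; a sketch with hypothetical cell counts (the abstract does not report the underlying table, so these numbers are illustrative only, not the study's data):

```python
import math

def odds_ratio_ci(a: int, b: int, c: int, d: int, z: float = 1.96):
    """Odds ratio and Wald 95% CI from a 2x2 table:
    a/b = complications yes/no in frail, c/d = yes/no in nonfrail."""
    or_ = (a * d) / (b * c)
    se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)  # SE of log(OR)
    lo = math.exp(math.log(or_) - z * se)
    hi = math.exp(math.log(or_) + z * se)
    return or_, lo, hi

# Hypothetical counts chosen for illustration only
or_, lo, hi = odds_ratio_ci(39, 60, 13, 88)
```

The Wald interval is symmetric on the log scale, which is why published CIs around an OR (such as 1.636-5.730 around 3.061) are asymmetric on the natural scale.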
HIV infection has historically been associated with impaired fracture healing due to immune dysregulation, altered inflammatory responses, and reduced bone mineral density. However, emerging evidence suggests that patients with well-controlled HIV receiving antiretroviral therapy (ART) may achieve surgical outcomes comparable to HIV-negative individuals, even following complex orthopedic trauma. This case report aims to describe the management and outcome of an open distal femur fracture in an HIV-positive patient in a resource-limited setting. We report the case of a 25-year-old male with previously undiagnosed HIV infection who sustained an open, displaced distal femur fracture secondary to a gunshot injury. HIV infection was confirmed using a WHO-recommended sequential rapid testing algorithm. The patient demonstrated moderate immune competence (CD4 count: 400 cells/mm3). Initial management included prompt surgical debridement, copious irrigation, intravenous antibiotics, and temporary immobilization. Definitive internal fixation using a distal femoral locking plate was performed ten days later, alongside initiation of ART (tenofovir, lamivudine, and dolutegravir). Postoperative recovery was uncomplicated. Radiographic follow-up at one and eight months demonstrated complete fracture union and stable fixation without evidence of infection or implant failure. This case supports growing evidence that HIV infection, when appropriately diagnosed and managed, should not be considered a contraindication to internal fixation of open femoral fractures. Timely surgical intervention, adherence to open fracture management principles, and early initiation of ART can result in favorable outcomes, even in resource-limited settings.
Spina bifida is a common neural tube defect and remains a significant public health concern in low- and middle-income countries. Data on neonatal presentation and maternal characteristics in Eastern Morocco are limited. We conducted a retrospective descriptive study over a nine-year period in the neonatal intensive care unit of a tertiary hospital in Eastern Morocco. All neonates diagnosed with spina bifida were included. Data on clinical features, maternal characteristics, management, and outcomes were analyzed. Thirty-three neonates were included. Myelomeningocele was the most frequent type. Common complications included hydrocephalus (45%), motor deficits (30%), and orthopedic abnormalities (39%). Surgical management was performed in more than half of the cases, and some patients required ventriculoperitoneal shunting. Maternal characteristics included inadequate folic acid supplementation, consanguinity, diabetes, and reported consumption of fenugreek during pregnancy. Survival at discharge was generally high, although morbidity remained significant. Neonatal spina bifida remains a significant cause of morbidity in our setting. Maternal characteristics, particularly inadequate folic acid supplementation, underline the importance of preventive strategies. The reported consumption of fenugreek during pregnancy warrants further investigation. These findings highlight the need for improved prenatal care and preventive strategies.
Robotic-assisted total knee arthroplasty (RATKA) has garnered attention in the field of orthopedic surgery. It has been developed to improve surgical precision and prosthesis alignment in comparison to conventional total knee arthroplasty (CTKA), utilizing advanced robotic workflow systems as opposed to manual jig-based techniques. This review evaluated perioperative and radiographic outcomes to assess the overall safety and effectiveness of RATKA compared with CTKA. A comprehensive search for randomized and quasi-randomized controlled trials was conducted across three databases (PubMed/MEDLINE, Cochrane Library, and Embase) from 1st January 2021 to 1st January 2026. Studies were chosen that compared RATKA to CTKA, where the primary indication was knee osteoarthritis. The primary outcomes were operative time, length of stay, adverse events, and blood loss. The secondary outcomes were hip-knee-ankle (HKA) angle and absolute deviation of the HKA angle from 180° (ΔHKA). The review was conducted in accordance with the Preferred Reporting Items for Systematic Reviews and Meta-Analyses (PRISMA) guidelines. Risk of bias was assessed using the Cochrane risk of bias tool (RoB 2). Quantitative analysis was performed using the RevMan 5.4 software package. Results were presented using mean difference (MD), 95% confidence interval (CI), and risk ratio (RR). A total of 12 controlled trials were identified that met the inclusion criteria, with a total of 2269 participants. Meta-analysis revealed a statistically significant difference in operative duration, with RATKA taking longer than CTKA (MD = 23.81, 95% CI = 13.39 to 34.24, P < 0.00001). There were no statistically significant differences in hospital stay (MD = 0.11, 95% CI = -0.19 to 0.42, P = 0.47), intraoperative blood loss (MD = 9.00, 95% CI = -9.46 to 27.46, P = 0.34), or adverse events (RR = 0.80, 95% CI = 0.54 to 1.18, P = 0.28). 
However, statistically significant differences favoring RATKA were identified in postoperative mechanical alignment: postoperative HKA angle (MD = 0.71°, 95% CI = 0.43 to 1.00, P < 0.00001) and absolute deviation of the HKA angle from 180° (ΔHKA; MD = -1.33, 95% CI = -2.12 to -0.55, P = 0.009). RATKA is associated with longer operative times but improved mechanical alignment. Intraoperative blood loss, length of hospitalization, and complications were comparable to CTKA. Considering these findings, further studies are required to assess the long-term implications and clinical benefits of RATKA.
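Pooled mean differences such as the operative-time estimate above are typically obtained by inverse-variance weighting of study-level estimates; a minimal fixed-effect sketch (the study values below are invented for illustration, and the review's actual pooling model may differ, e.g. random-effects):

```python
def pool_mean_differences(mds: list[float], ses: list[float]):
    """Fixed-effect inverse-variance pooled MD and its standard error.

    mds: per-study mean differences; ses: their standard errors.
    """
    weights = [1 / se**2 for se in ses]          # inverse-variance weights
    pooled = sum(w * md for w, md in zip(weights, mds)) / sum(weights)
    se_pooled = (1 / sum(weights)) ** 0.5
    return pooled, se_pooled

# Illustrative operative-time MDs (minutes) from three hypothetical trials
md, se = pool_mean_differences([20.0, 30.0, 25.0], [4.0, 5.0, 6.0])
ci = (md - 1.96 * se, md + 1.96 * se)  # Wald 95% CI around the pooled MD
```

More precise studies (smaller standard errors) dominate the pooled estimate, which is why the result sits closest to the most precise trial.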
Aging is characterized by a decline in functional abilities, strength, balance, flexibility, agility, and coordination due to neurological and muscular changes. Hippotherapy (HPOT) has been recognized for its physical and psychological benefits for older adults. Hence, this study aimed to investigate the effects of hippotherapy on balance in elderly individuals. The novelty lies in using the Biodex Balance System SD for dynamic balance assessment in healthy elderly individuals without neurological disorders. We recruited elderly men aged 60 to 70 who had independent mobility and medical clearance for hippotherapy, ensuring they met criteria such as consistent participation and the absence of pertinent diseases, orthopedic issues, recent surgeries, or medications that could influence outcomes. The dependent variables measured with the Biodex Balance System SD included (I) fall risk, (II) stability index, (III) balance index, (IV) anterior-posterior deviation, and (V) mediolateral deviation. The intervention consisted of 16 sessions of hippotherapy over 8 weeks, with two 30-min sessions each week, during which participants rode a horse. The main findings indicate that the intervention significantly reduced fall risk in the training group compared to the control (p < 0.001, η = 0.77). Hippotherapy also significantly improved the stability index (p = 0.041, η = 0.15), anterior-posterior sway (p = 0.013, η = 0.39), mediolateral sway (p = 0.002, η = 0.54), and balance index (p = 0.008, η = 0.43), indicating improvements across all measured variables. The findings demonstrate that hippotherapy significantly enhances balance in physically active elderly individuals, potentially improving their quality of life. These results will inform professionals working with this population.
Galeazzi fracture-dislocation is a complex injury involving a distal radius fracture and distal radioulnar joint (DRUJ) dislocation, typically treated through surgical intervention. However, persistent DRUJ dislocation remains a common complication following surgery. This report presents two cases of persistent DRUJ dislocation after surgical treatment of Galeazzi fractures. Both patients underwent initial radial fixation with plate and screws but developed limitations in forearm supination and pain due to residual DRUJ dislocation. Radiographs confirmed the dislocation, and subsequent surgeries addressed the causes, including radial malalignment, soft tissue entrapment, and untreated ulnar styloid fractures. Following reoperation, both patients achieved full recovery with pain relief and restored forearm mobility. This case report emphasizes the importance of comprehensive intraoperative evaluation and early detection of DRUJ dislocation to prevent long-term dysfunction and complications.
Adolescent idiopathic scoliosis bracing is best conceptualized as a dose-dependent therapy in which clinical benefit depends on the delivered, rather than prescribed, corrective exposure. This narrative review synthesizes post-2015 evidence to present a clinically oriented framework for "precision bracing" that links brace prescription to objectively measured delivered dose and subsequent clinical response. We summarize evidence that bracing reduces progression risk in skeletally immature patients, while highlighting the efficacy-effectiveness gap driven by incomplete adherence. We review objective adherence monitoring technologies, focusing on temperature-based data loggers and emerging force or pressure sensing approaches, and explain how objective dosing data refine outcome interpretation by distinguishing undertreatment from true treatment failure. We conceptualize brace dose as a multidimensional construct that includes quantity (objectively measured wear time), pattern (regularity), and correction-related indicators (including in-brace and early out-of-brace radiographic metrics and, when available, interface force surrogates). We synthesize key effect modifiers, including skeletal maturity, curve magnitude and phenotype, early radiographic response, and contextual determinants of achieved wear time. We translate this evidence into a practical clinical framework for risk stratification, brace schedule selection, objective monitoring, early reassessment, and escalation when delivered dose is inadequate or response is unacceptable. We conclude by outlining research priorities for standardized reporting of delivered dose, pragmatic trials embedded in routine care, and core outcome sets that integrate radiographic and patient-reported outcomes. Many teenagers develop a sideways curve of the spine called adolescent idiopathic scoliosis. For some, wearing a brace can slow down curve progression and reduce the chance of needing surgery. 
In everyday practice, brace treatment works best when it fits well into daily life and when clinicians, patients, and families can understand how much treatment is actually being delivered. This review looks at bracing as a “dose-dependent” treatment. This means the benefit depends on the dose that is actually delivered, not just the number of hours written on a prescription. We reviewed research published since 2015 on how to measure brace wear objectively (for example, using small temperature sensors), how brace design is becoming more personalised (including computer-designed braces), how full-time and night-time schedules compare, what is known about younger children with scoliosis, and how bracing affects quality of life. Across studies, objective monitoring often showed that real-world brace use differed from self-report. Early wear patterns in the first weeks helped predict later wear. More time in the brace was generally linked to better curve control, and for some outcomes benefits continued above 18 hours per day. At the same time, “hours worn” is not the whole story; how well the brace corrects the spine and how consistently it is worn also matter. These findings support a practical “precision bracing” approach: set clear targets, measure real-world brace use, check early response, address comfort and psychosocial concerns, and adjust the plan promptly to achieve an effective, tolerable dose.
With an annual volume of total knee arthroplasties (TKAs) in the United States projected to exceed 1.5 million by 2050, optimizing perioperative pain control and facilitating early discharge remain clinical priorities. This study evaluated the effectiveness of liposomal bupivacaine (LB) vs conventional bupivacaine/ropivacaine in opioid-naïve patients undergoing primary TKA, comparing (1) pain score improvement, (2) opioid use, (3) functional recovery, and (4) hospital length of stay. Data from the Innovations in Genicular Outcomes Research (iGOR) registry were used to identify patients undergoing unilateral primary TKA between September 2021 and December 2024. Two hundred twenty-five patients were analyzed, 42 receiving LB and 183 receiving conventional local anesthetics. Patients were stratified based on intraoperative analgesia; pain was assessed using the Brief Pain Inventory-Short Form; opioid use was self-reported; function was measured with the Knee Injury and Osteoarthritis Outcome Score for Joint Replacement. Comparative analyses were performed using generalized linear mixed-effects regressions, with significance set at P < .05. The LB group showed lower average Brief Pain Inventory-Short Form scores in the early postoperative period (2.6 vs 2.9; P = .03), reduced opioid use at 3 months (24% vs 41%; P < .001), and improved Knee Injury and Osteoarthritis Outcome Score for Joint Replacement scores (65.2 vs 63.9; P = .004). Patients receiving LB were discharged 5.6 hours earlier than controls (15.0 vs 20.6 hours; P = .007). Use of LB is associated with improved pain control, reduced opioid use, and shorter lengths of stay following TKA. This study highlights LB's utility in facilitating early mobilization and advancing perioperative care.
Reverse total shoulder arthroplasty (rTSA) has proven to be a reliable surgical treatment option for a wide variety of shoulder pathologies. Bony remodeling around the humeral component occurs frequently, but its impact on clinical outcome remains unclear. We retrospectively reviewed 102 patients who underwent primary rTSA between 2008 and 2020. At final follow-up (FU), patients underwent standardized radiographs and clinical examination, including patient-reported outcome measures and the adjusted Constant Murley score (aCMS). Serial radiographs (pre-operative, 6 weeks, and ≥2 years post-operative) were evaluated for bone remodeling (cortical thickness, pedestal formation, spot welds, reactive lines, osteolysis, radiolucent lines) and implant stem parameters (subsidence, size, alignment, and filling ratio). Stress shielding was quantitatively defined as a decrease in combined cortical thickness measured at four sites (2 medial [M1 and M2], 2 lateral [L1 and L2]). Subgroup analysis was conducted to determine correlations between the radiographic findings and clinical outcome. The relationship between stress shielding severity and functional outcome was investigated using percent-based thresholds to explore potential cutoff values. In 102 patients (mean age 70 years, 71% female, mean FU 65 months), the mean aCMS was 80. Stress shielding occurred in 74%, with combined cortical thickness decreasing from 3.0 to 2.2 mm at final FU; proximal lateral thinning (L1) was most pronounced. A ≥10% cortical thinning threshold at L1 was associated with a trend toward lower aCMS (77 vs. 86, P = .051). Stem-related parameters, including filling ratio, did not correlate with cortical thinning. Radiographic bone remodeling such as spot welds (36%), reactive lines (49%), and osteolysis (25%) was frequently observed but showed no correlation with clinical outcome. This study provides a long-term radiographic evaluation of humeral bone remodeling after rTSA. 
The quantitative assessment of cortical thickness showed that the proximal lateral humerus is most affected by stress shielding. A >10% cortical thinning at this site showed a tendency toward inferior clinical outcome, contrasting with prior reports suggesting no functional impact. Other radiographic bone remodeling was common but did not impair function. Our findings support the standardized assessment of cortical resorption, with special attention to the proximal lateral humerus, during patient FU.
Femoral diaphyseal fractures are commonly treated with intramedullary nailing; however, malunion remains a clinically significant complication, particularly following delayed presentation, non-operative management, or treatment in resource-limited settings. Femoral diaphyseal malunion may be angular, rotational, longitudinal, or multiplanar and can result in gait disturbance, limb-length discrepancy, patellofemoral overload, secondary joint degeneration, and chronic pain. Given the substantial functional impact, corrective osteotomy is often indicated. This narrative review presents a structured approach to the assessment and management of femoral diaphyseal malunion. Key elements include comprehensive patient evaluation, detailed clinical and radiological deformity analysis, and contemporary operative strategies aimed at restoring alignment, rotation, and limb length while minimising morbidity. Successful correction requires meticulous preoperative planning and careful intraoperative execution, according to available resources. Treatment options include intramedullary nailing, plate fixation, hybrid constructs, and external fixation techniques, each with specific indications. Although circular external fixators allow accurate multiplanar correction, their prolonged use in the femur is poorly tolerated. Fixator-assisted correction has evolved to improve intraoperative control while avoiding long-term external fixation. Modern techniques such as Computer Hexapod-Assisted Orthopaedic Surgery (CHAOS) utilise a temporary intraoperative hexapod frame to achieve precise multiplanar correction, followed by definitive internal fixation within the same procedure. Adjuncts including poller screws, three-dimensional planning, and patient-specific guides further enhance accuracy and stability. Femoral diaphyseal malunion presents complex diagnostic and technical challenges. 
A systematic, patient-specific approach incorporating modern fixator-assisted and computer-guided techniques enables reliable restoration of alignment and rotation, improving function and quality of life.
Spinal cord injury without radiographic evidence of trauma (SCIWORET) is a rare condition but remains relevant in elderly patients. Herein, we report the rare case of a 78-year-old woman who sustained a spinal cord injury after falling from a bicycle. Computed tomography (CT) at admission revealed an absence of bony abnormalities; however, motor and sensory deficits were observed below the C4 level, corresponding to Frankel grade B. Magnetic resonance imaging (MRI) demonstrated T2-weighted hyperintensity at levels C4-C7, indicative of spinal cord edema. Notably, axial MRI images revealed a loss of flow voids in the right vertebral artery, while contrast-enhanced CT identified a thrombotic occlusion accompanied by subsequent cerebellar infarction. Decompression surgery and coil embolization of the vertebral artery were accordingly performed. Despite early surgical intervention, no change in neurological status was observed, indicating complete spinal cord injury. The patient is consequently undergoing management in an intensive rehabilitation program focusing on optimizing respiratory function and preventing secondary complications. This case underscores the diagnostic value of flow void loss on axial MRI, which serves as a crucial indicator in the evaluation of patients with SCIWORET.
Vascular injury during total knee arthroplasty (TKA) is rare but potentially catastrophic, leading to ischemia, the need for vascular repair, or limb loss. Robotic-assisted TKA (RA-TKA) incorporates computed tomography (CT)-based three-dimensional planning, intraoperative balancing, and haptic boundaries that may reduce intraoperative saw excursion and mitigate neurovascular risk. This study compared the incidence of vascular injuries between RA-TKA and conventional manual TKA (M-TKA). A retrospective query of a nationwide insurance claims database (2010 to 2022) identified 2,522,651 primary TKAs (21,921 RA-TKA; 2,500,730 M-TKA). RA-TKA was defined by robotic-assisted procedure codes and a CT scan within 60 days of surgery. Vascular injuries within 30 days of surgery were identified using International Classification of Diseases, Ninth/Tenth Revision (ICD-9/10) diagnosis and procedure codes. A 1:5 nearest-neighbor matching algorithm controlled for age, sex, and comorbidities. Before matching, RA-TKA patients differed in age, sex, and comorbidity profiles compared with M-TKA; after matching, the cohorts were well-balanced. Multivariable logistic regressions calculated odds ratios (OR) with 95% confidence intervals (CI) for vascular complications, with significance set at P < 0.05. Popliteal artery injuries occurred in 49 M-TKA (0.002%) and zero RA-TKA (0%) cases. After adjustment, RA-TKA was associated with significantly lower odds of vascular injury compared with M-TKA (adjusted OR 0.21, 95% CI 0.05 to 0.86, P = 0.030). The absolute risk difference was 0.002%, corresponding to a number needed to treat of 50,000. In this large national cohort, RA-TKA was associated with a substantially reduced risk of popliteal artery injury compared with M-TKA. While rare overall, the catastrophic consequences of vascular injury underscore the clinical importance of this association. 
As TKA utilization rises, robotic platforms may offer a meaningful clinical benefit by helping surgeons reduce the incidence of these devastating neurovascular complications.
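The absolute risk difference and number-needed-to-treat quoted above can be reproduced from the raw counts; a quick sketch (assuming, as reported, zero robotic-arm events; the unrounded NNT works out to roughly 51,000, which the abstract rounds to 50,000 via the 0.002% risk difference):

```python
events_m, n_m = 49, 2_500_730   # popliteal artery injuries, manual TKA
events_r, n_r = 0, 21_921       # none reported, robotic-assisted TKA

ard = events_m / n_m - events_r / n_r   # absolute risk difference
nnt = 1 / ard                           # number needed to treat
```

With a zero-event arm, the OR confidence interval cannot come from this naive calculation alone; claims analyses typically rely on the adjusted regression reported in the abstract.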
The aim of this study was to determine the influence of center of rotation (CoR) and offset reconstruction in stable and unstable THAs. A prospective matched case-control study included patients who underwent primary THA and had a dislocation treated with closed reduction within two years. Cases were matched 1:2 by age, gender, and ASA classification to stable THA patients with >2 years of follow-up. THAs performed for indications other than hip osteoarthritis or with unknown surgical approach/implants were excluded. Coronal CoR reconstruction (vertical and horizontal shifts) and offset were measured on calibrated pre- and postoperative pelvic radiographs. Logistic regression was used to calculate odds ratios (OR), 95% confidence intervals (CI), and p-values. Confounders altering regression coefficients were adjusted for, yielding adjusted ORs (aOR). Twenty-five unstable THAs were matched to 50 controls. The maximum distance from the preoperative femoral CoR was 6.71 mm in unstable THAs and 7.38 mm in stable THAs; the maximum distance from the preoperative acetabular CoR was 6.85 mm in unstable THAs and 7.22 mm in stable THAs. No statistically significant differences between unstable and stable THAs were found for either horizontal/vertical CoR shift or femoral/total offset. Logistic regression results for CoR shift were not statistically significant, and aORs were likewise non-significant. No differences in CoR or femoral and total offset reconstruction were found between unstable and stable primary THAs.
Neuraxial ultrasound (USG) can increase the success rate of neuraxial blocks, such as combined spinal-epidural (CSE). The two commonly practiced approaches for CSE are midline and paramedian. However, only a few studies have compared the USG-assisted midline and paramedian approaches. Therefore, we planned this study to evaluate and compare the efficacy of the USG-assisted paramedian and midline approaches for CSE anesthesia. Eighty American Society of Anesthesiologists (ASA) grade I/II patients undergoing elective orthopedic surgery under CSE anesthesia were enrolled. Patients were allocated to Group M (midline) or Group P (paramedian) for the USG-assisted CSE block. The primary outcome was the first-pass success rate. The secondary outcomes were the second-pass success rate, the number of needle insertions, the number of needle redirections, image quality score, scan time, and procedure time. The first-pass success rate was statistically comparable for Groups P and M (26.31% vs. 34.21%; P = 0.469). The number of needle insertions in Group P was statistically lower than in Group M (P = 0.021), and the number of redirections was statistically comparable between the two groups (P = 0.607). The image quality score was comparable for both groups. The scan time was significantly shorter in Group P than in Group M (183 vs. 224 sec, P < 0.0001). The procedure time was, however, statistically comparable between the two groups (P = 0.297). USG-assisted midline and paramedian approaches for CSE have comparable first-pass success rates.
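The first-pass comparison above is a 2×2 proportion test; a stdlib-only Pearson chi-square sketch (the cell counts below are hypothetical back-calculations assuming roughly 38 evaluable patients per group, which the abstract does not state):

```python
import math

def chi2_2x2(a: int, b: int, c: int, d: int):
    """Pearson chi-square (no continuity correction) for a 2x2 table,
    with the p-value from the 1-df chi-square survival function."""
    n = a + b + c + d
    chi2 = n * (a * d - b * c) ** 2 / ((a + b) * (c + d) * (a + c) * (b + d))
    p = math.erfc(math.sqrt(chi2 / 2))  # P(X >= chi2) for df = 1
    return chi2, p

# Hypothetical: 10/38 first-pass successes in Group P vs. 13/38 in Group M
chi2, p = chi2_2x2(10, 28, 13, 25)
```

The 1-df survival function `erfc(sqrt(x/2))` follows from the chi-square variable being the square of a standard normal; published values may differ slightly if a continuity correction or Fisher's exact test was used.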
The impact of arthroscopic Bankart repair (ABR) alone vs. ABR with remplissage (ABR + R) on athlete return-to-play rates and number of games played after surgery is poorly understood. The objective of this study was to utilize online sports databases to compare number of games played and return-to-play rates between athletes who underwent ABR vs. ABR + R for "on-track" Hill-Sachs lesions (HSLs). We hypothesized that there would be no difference in relative games played or return-to-play rates post-operatively between patients undergoing ABR vs. ABR + R. Patients aged 14-40 years with "on-track" HSLs who underwent either ABR or ABR + R between 2007 and 2022 for anterior shoulder instability were retrospectively reviewed. Exclusion criteria included revision surgery, <1-year follow-up, "off-track" HSLs, >20% glenoid bone loss, nonathletes, and missing data in online sports databases. Athletes were queried in online sports databases and games played were recorded the season before and 3 seasons after surgery, if available. The primary outcome was the relative change in games played in the seasons after surgery compared to the season before surgery (Relative Games Played = # Games in Season After Surgery / # Games in Season Before Surgery). Secondary outcomes included recurrent anterior shoulder instability, defined as recurrent dislocation and/or subluxation, and return-to-play. Eighty-one patients (ABR: 60 | ABR + R: 21) were included in the analysis, with an average age of 18 ± 2 years and average follow-up of 6.5 ± 3.7 years (range, 1.0-14.4 years). "Near-track" HSLs (distance-to-dislocation <10 mm) were present in 20% of the ABR group vs. 76% of the ABR + R group (P < .01). Return-to-play rates were similar between groups (ABR: 75% vs. ABR + R: 81%, P = .58). 
In the first season after surgery, the ABR group had significantly higher relative games played compared to the ABR + R group (ABR: 1.5 ± 1.5; n = 38 | ABR + R: 0.8 ± 0.5; n = 13, P = .02). This difference did not persist in the second and third seasons after surgery (P > .05). In an adjusted Generalized Estimating Equation model, surgical technique was not significantly associated with relative games played after surgery. Rates of recurrent anterior shoulder instability were not statistically different between groups (ABR: 28% (17/60) vs. ABR + R: 10% (2/21), P = .13). Among athletes with "on-track" HSLs, ABR + R resulted in similar long-term game participation and return-to-play rates compared to isolated ABR. Although patients undergoing ABR + R demonstrated fewer relative games played in the first post-operative season, this difference did not persist in subsequent seasons. Remplissage augmentation may therefore have a temporary impact on early post-operative performance without compromising overall return-to-play outcomes.
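The primary outcome defined above is a simple per-athlete ratio; a minimal sketch:

```python
def relative_games_played(games_after: int, games_before: int) -> float:
    """Relative Games Played = games in a season after surgery
    divided by games in the season before surgery."""
    if games_before == 0:
        raise ValueError("undefined when no games were played before surgery")
    return games_after / games_before

# e.g. an athlete who played 8 games before surgery and 12 the next season
ratio = relative_games_played(12, 8)  # 1.5, i.e. more games than at baseline
```

Values above 1.0 indicate more games played than in the pre-injury season; the zero-denominator guard reflects why athletes with missing baseline data had to be excluded.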
The impact of femoral stem fixation on perioperative blood loss and transfusion in total hip arthroplasty (THA) remains controversial. This study compared cemented and uncemented stem fixation in elective primary THA regarding perioperative blood loss, transfusion requirements, and postoperative complications. We hypothesized that stem fixation would not independently influence transfusion or complication risk after adjustment for patient-related factors. This retrospective cohort study analysed data from an institutional registry including THAs performed between 2016 and 2023. Patients were stratified according to femoral stem fixation (cemented vs. uncemented). Outcomes included intraoperative, total, and hidden blood loss (HBL), allogeneic red blood cell transfusion, and postoperative complications. Multivariable linear and logistic regression analyses assessed the independent effect of stem fixation, adjusting for age, sex, body mass index, American Society of Anesthesiologists (ASA) class, operative time, and preoperative haemoglobin. A total of 648 patients were included (uncemented n = 548; cemented n = 100). Patients in the cemented group were older (79.8 ± 7.4 vs. 72.4 ± 9.1 years) and more frequently female (76.0% vs. 65.8%). Transfusion rates were higher in the cemented group. After multivariable adjustment, cemented fixation was independently associated with lower total blood loss (adjusted mean difference -146 mL; 95% confidence interval [CI] -271 to -21), whereas no independent associations were observed for intraoperative blood loss or HBL. Stem fixation was not independently associated with transfusion requirement (adjusted odds ratio [OR] 1.06; 95% CI 0.56-1.99) or postoperative complications (adjusted OR 0.64; 95% CI 0.21-1.98). In elective primary THA, cemented femoral stem fixation is associated with a modest reduction in total blood loss but does not independently influence transfusion rates or postoperative complication risk. 
Clinically, these findings indicate that fixation should not be selected based on expectations of reduced transfusion risk. Instead, perioperative blood management should focus on patient-related factors, particularly preoperative haemoglobin optimization. Level III, retrospective comparative cohort study.
Infra-isthmal femoral fractures in elderly patients are difficult to stabilize due to the broad metaphyseal anatomy. In the absence of consensus on optimal management, this study assesses a minimally invasive technique combining anterior clamp-assisted reduction with retrograde intramedullary nailing. A retrospective case series was conducted on ten patients treated between January 2023 and April 2024. All underwent percutaneous fracture reduction via a limited anterior approach, followed by retrograde nailing. The mean patient age was 84.9 ± 6.2 years. All fractures were caused by low-energy trauma. Six involved native bone and four were periprosthetic. All cases progressed to union without infection or wound complications. One peri-implant fracture occurred at the proximal tip of the nail and was successfully treated with overlapping plate fixation. This technique was safe and effective for infra-isthmal spiral femoral fractures in elderly, frail patients. It allowed accurate reduction, minimized soft tissue disruption, and resulted in reliable fracture healing.
A 52-year-old man presenting with tachycardia was found to have a large pericardial effusion and cardiac tamponade. He had long-standing myelofibrosis managed with ruxolitinib, which had recently been withheld for an orthopaedic procedure. He was diagnosed with presumed ruxolitinib discontinuation syndrome (RDS), resulting in a large pericardial effusion due to an increased inflammatory response. He was managed with a pericardial window and prompt recommencement of ruxolitinib and steroids. This case highlights the importance and increasing prevalence of RDS, with the serious consequence of cardiac tamponade. Symptoms of RDS should be recognised and managed with recommencement of a JAK inhibitor and steroids.
There is broad consensus that successful repair of severe peripheral nerve injuries requires recreating key structural and cellular features of the natural regenerative process, particularly the action of Bands of Büngner (BoB), longitudinal Schwann cell (SC) structures that guide regenerating axons. Current biomaterial-based strategies have shown limited efficacy, in part because they do not sufficiently reproduce the anisotropic and cellular microenvironment established by BoB, resulting in disorganized axonal growth and reduced regenerative efficiency across long gaps. To address this limitation, a biohybrid scaffold designed to promote Schwann cell self-organization into Büngner-like structures through defined physical cues is developed. Rather than relying solely on biochemical supplementation, this system leverages anisotropic fiber architecture to induce SC alignment and early activation-associated phenotypic modulation. In this study, a self-organizing biohybrid BoB (BBoB) construct formed by Schwann cells within an aligned fiber-based scaffold is presented. It is demonstrated that these engineered structures recapitulate key morphological features of native BoB in vitro and promote enhanced axonal regeneration across an 11 mm sciatic nerve defect in vivo. Together, these findings support the concept that physically programmed Schwann cell organization within biomaterial conduits can enhance peripheral nerve regeneration, using clinically accessible biomaterials and autologous Schwann cells.