Background Aortic stenosis is a progressive valvular disease strongly associated with aging and high mortality once symptoms develop. Surgical aortic valve replacement has traditionally been the standard treatment for symptomatic severe disease; however, transcatheter aortic valve replacement (TAVR) has emerged as an effective, less invasive alternative for patients with increased surgical risk, supported by evidence from major clinical trials. Since its introduction in Nicaragua, procedural volume has steadily increased. This study aimed to describe the initial national experience with TAVR in patients with severe aortic stenosis in Nicaragua. Materials and methods This retrospective descriptive case series included 16 patients with confirmed severe aortic stenosis who underwent TAVR between July 2023 and March 2024 at a single center in Managua, Nicaragua. Baseline clinical and echocardiographic data were obtained from medical records. Outcomes were evaluated using Valve Academic Research Consortium-3 (VARC-3) criteria, including technical success, device success, and early safety at 30 days. Data were analyzed using IBM SPSS Statistics 25 (IBM Corp., Armonk, NY), with categorical variables reported as n (%), continuous variables as mean ± SD, and pre- versus post-procedural mean transaortic pressure gradients compared using the Wilcoxon signed-rank test. Statistical significance was set at p ≤ 0.05, with categorical comparisons performed using Fisher's exact test (dichotomous variables) and the Fisher-Freeman-Halton exact test (polytomous variables). Institutional review board approval was obtained. Results The majority of participants were in the 61-70 year age group, accounting for 7 (43.8%) patients, with a mean age of 68.9 ± 8.7 years; 9 (56.3%) were male, and 9 (56.3%) were overweight. The most prevalent comorbidities were hypertension in 15 (93.8%) and diabetes mellitus in 6 (37.5%).
Dyspnea was the most common symptom, reported in 15 (93.8%), followed by heart failure in 9 (56.3%) and angina in 8 (50%). Baseline hemodynamics showed a mean transaortic gradient of 55.7 ± 21.1 mmHg, with 14 (87.5%) presenting gradients >40 mmHg, and a mean aortic valve area of 0.68 ± 0.3 cm². Self-expanding valves were implanted in 15 (93.8%), predominantly Acurate Neo 2 in 13 (81.3%) (Boston Scientific Corp., Marlborough, MA). Overall, 13 (81.3%) procedures met VARC-3 criteria for technical success. No intraprocedural or 30-day deaths were recorded. At 30 days, device success was achieved in 12 (75%), early safety in 15 (93.8%), and intended valve performance was satisfactory in 14 (87.5%), with a post-procedural mean gradient of 9.08 ± 6.3 mmHg and no moderate-to-severe aortic regurgitation. The median transaortic pressure gradient decreased from 52.5 (IQR 41.5-67.5) mmHg at baseline to 9.0 (IQR 3.47-11.0) mmHg at 30 days after TAVR (p < 0.001), a statistically significant reduction reflecting substantial improvement in patients' hemodynamic status. Permanent pacemaker implantation and paravalvular leak occurred in 1 (6.3%) patient each. Conclusion In this initial single-center experience in Nicaragua, TAVR was feasible and achieved favorable short-term outcomes, with significant improvement in transaortic hemodynamics at 30 days. These findings support the early safety and effectiveness of TAVR in this setting, although larger prospective studies are needed to confirm results and improve generalizability.
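The paired pre- versus post-procedural gradient comparison described above can be sketched with a Wilcoxon signed-rank test. The following is a minimal pure-Python illustration using the large-sample normal approximation; the paired gradient values are invented for demonstration and are not the study's data.

```python
import math

def wilcoxon_signed_rank(pre, post):
    """Two-sided Wilcoxon signed-rank test using the large-sample normal approximation."""
    d = [a - b for a, b in zip(pre, post) if a != b]   # drop zero differences
    n = len(d)
    abs_sorted = sorted(abs(x) for x in d)
    def rank(v):
        # average rank of |difference| v (ties get the midpoint of their positions)
        first = abs_sorted.index(v) + 1
        last = len(abs_sorted) - abs_sorted[::-1].index(v)
        return (first + last) / 2
    w_plus = sum(rank(abs(x)) for x in d if x > 0)     # rank sum of positive differences
    mean = n * (n + 1) / 4
    sd = math.sqrt(n * (n + 1) * (2 * n + 1) / 24)
    z = (w_plus - mean) / sd
    p = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))  # two-sided p-value
    return w_plus, p

pre_mmHg  = [62, 48, 75, 41, 90, 55, 67, 44, 80, 58]  # hypothetical baseline gradients
post_mmHg = [10,  7, 12,  5, 15,  8, 11,  4, 13,  9]  # hypothetical 30-day gradients
w, p = wilcoxon_signed_rank(pre_mmHg, post_mmHg)      # uniform decline -> w = 55, small p
```

For a real dataset of 16 pairs, an exact-distribution implementation such as `scipy.stats.wilcoxon` would be preferable to the normal approximation sketched here.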
Atraumatic splenic rupture (ASR) is a rare but life-threatening surgical emergency that most commonly occurs in pathologically abnormal spleens. We present a unique case of ASR in a middle-aged man with recent use of performance-enhancing compounds, managed with both embolization and surgery. This case highlights an unusual potential contributing risk factor and underscores the importance of prompt diagnosis and definitive management of ASR. A 54-year-old male presented to the emergency department with acute left upper quadrant abdominal pain and vomiting. Notably, he had recently been using MK-677 (ibutamoren), a growth hormone secretagogue, and RAD-140 (testolone), a selective androgen receptor modulator (SARM), for bodybuilding purposes. The patient denied any history of trauma. Computed tomography revealed a large perisplenic hematoma consistent with splenic rupture. Following initial resuscitation, emergent splenic artery embolization was performed to control hemorrhage. This intervention stabilized the patient transiently; however, he ultimately required an emergency laparotomy with total splenectomy. Histopathological examination demonstrated large areas of splenic infarction, with features raising the possibility of an underlying vascular malformation. These findings underscore the importance of recognising ASR in patients with acute abdominal pain and vascular risk factors. We explore the potential, albeit speculative, role of performance-enhancing compounds in precipitating splenic complications, drawing parallels to the established association between traditional anabolic steroids and peliosis, blood-filled vascular cavities that predispose to rupture. While RAD-140 and MK-677 have not been previously linked to splenic pathology, their pharmacological effects on androgen receptor signalling and IGF-1 pathways warrant consideration as possible contributing factors, particularly in the context of a suspected pre-existing vascular malformation.
Early recognition, aggressive resuscitation, and timely surgical intervention remain critical, as delayed diagnosis carries significant mortality.
Background Proton pump inhibitors (PPIs) are generally employed to treat acid-related disorders. However, chronic acid blockade can reduce the absorption of essential micronutrients, such as vitamin B12, magnesium, calcium, and iron, potentially contributing to clinically significant deficiencies as defined by standard laboratory reference ranges. Methods A retrospective observational study was conducted using electronic medical records collected between July and December 2025. The study included adults (18 years and older) who had been taking PPIs continuously for six months or more. Nutrient deficiencies were defined according to institutional laboratory reference ranges, with the following cut-off values: vitamin B12 <200 pg/mL, magnesium <1.7 mg/dL, calcium <8.6 mg/dL, and ferritin <30 ng/mL. Both centers used identical reference ranges and standardized laboratory protocols, and all results were cross-checked to minimize inter-laboratory variability; any inconsistent or repeated values were excluded from analysis. The associations between PPI type/duration and deficiency status were analyzed using the Fisher-Freeman-Halton exact test, the chi-square test, and the Kruskal-Wallis test (p<0.05). Findings The sample consisted of 255 participants: 139 (54.5%) female patients and 116 (45.5%) male patients. The patients were on omeprazole (n=101; 39.6%), pantoprazole (n=64; 25.1%), esomeprazole (n=61; 23.9%), and lansoprazole (n=29; 11.4%). The prevalence of vitamin B12 deficiency varied by PPI type: 39 (38.6%) of omeprazole users and 36 (59.0%) of esomeprazole users were vitamin B12 deficient (p=0.018). Users of esomeprazole had the highest prevalence of magnesium deficiency (n=24; 39.3%) vs users of omeprazole (n=12; 11.9%; p=0.004). 
Calcium deficiency also differed by PPI type: 8 (7.9%) patients on omeprazole and 9 (31.0%) patients on lansoprazole were affected (p=0.021), and median serum calcium across the PPI groups ranged from 8.71 to 9.06 mg/dL (H=11.29, p=0.010). There were no significant differences in iron/ferritin deficiency according to the type of PPI (33/101 (32.7%) in omeprazole users vs 29/61 (47.5%) in esomeprazole users) or PPI duration (p≈0.12). Conclusion Long-term use of PPIs was associated with a high prevalence of micronutrient deficiencies, especially B12, magnesium, and calcium, with substantial differences by PPI type. These results highlight the importance of routine nutritional monitoring in patients receiving long-term PPI therapy.
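For the categorical comparisons of deficiency prevalence by PPI type reported above, the Pearson chi-square test of independence reduces to a short computation on the contingency table. The sketch below uses an invented 2x2 table (deficient vs. not deficient in two hypothetical PPI groups), not the study's counts.

```python
def chi_square_2x2(a, b, c, d):
    """Pearson chi-square statistic (no continuity correction) for the 2x2 table
    [[a, b], [c, d]]; compare against the chi-square distribution with 1 df."""
    n = a + b + c + d
    row1, row2 = a + b, c + d
    col1, col2 = a + c, b + d
    stat = 0.0
    for obs, r, col in ((a, row1, col1), (b, row1, col2), (c, row2, col1), (d, row2, col2)):
        expected = r * col / n          # expected count under independence
        stat += (obs - expected) ** 2 / expected
    return stat

# hypothetical counts: 10/30 deficient in group 1, 20/30 deficient in group 2
stat = chi_square_2x2(10, 20, 20, 10)   # about 6.67, above the 3.84 cutoff at p = 0.05
```

For small or sparse tables like some in this study, exact tests (Fisher's or Fisher-Freeman-Halton, as the authors used) are the safer choice.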
Introduction The development of indices enables the accurate identification of issues by consolidating the key factors associated with an outcome, which in turn allows for better prioritization of risks. Therefore, the purpose of creating indices is to generate information that supports evidence-based analysis, facilitating accurate decision-making and addressing risks through an understanding of the underlying components, thereby allowing decision-makers to develop prevention strategies and programs. The objective of this research was to develop and validate an index using the statistical technique of "conjunctive consolidation" to assess factors related to the quality of work life (QoWL) among nursing staff, providing a simple method that can be applied for specific predictions or decisions. Methods A methodological study was conducted using "conjunctive consolidation" analysis to develop an index of exposure factors associated with unsatisfactory QoWL among nursing staff at a tertiary care hospital. The index was created by consolidating the associated factors and assessing validity through convergence with constructs such as perceived institutional safety, musculoskeletal disorders, and mental health outcomes for each consolidation, as well as by evaluating the index's performance using a receiver operating characteristic (ROC) curve. Results The consolidations used to develop the index were as follows: consolidation of psychosocial perception, comprising work alienation (odds ratio (OR): 3.8, 95% confidence interval (CI): 1.9-7.4) and reports of family aggression (OR: 3.5, 95% CI: 1.6-8.0); consolidation of exposure due to work activity, comprising forced movements (OR: 4.16, 95% CI: 2.13-8.15) and overexertion (OR: 4.43, 95% CI: 2.20-8.92); and consolidation of exposure due to work instruments, comprising exposure to drug preparation dust (OR: 3.61, 95% CI: 1.79-7.27) and exposure to radiation (OR: 3.66, 95% CI: 1.80-7.45). 
The diagnostic capacity of the index score was demonstrated by an area under the curve (AUC) of 0.79 (95% CI: 0.73-0.86, p < 0.001) on the ROC curve. Conclusions The development and validation of the index demonstrate that it can effectively measure QoWL among nursing staff and provide decision-makers with guidance on associated risks in hospital settings.
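The index's reported discrimination (AUC of 0.79) has a direct probabilistic reading: the chance that a randomly chosen case with unsatisfactory QoWL receives a higher index score than a randomly chosen case without. A minimal sketch of that pairwise computation, with invented labels and scores rather than the study's data:

```python
def auc(labels, scores):
    """Pairwise (Mann-Whitney) AUC: fraction of positive-negative pairs in which
    the positive case scores higher; ties count as half a win."""
    pos = [s for l, s in zip(labels, scores) if l == 1]
    neg = [s for l, s in zip(labels, scores) if l == 0]
    wins = sum(1.0 if p > n else 0.5 if p == n else 0.0 for p in pos for n in neg)
    return wins / (len(pos) * len(neg))

# 1 = unsatisfactory QoWL, 0 = satisfactory; the index scores are hypothetical
example = auc([0, 0, 1, 1], [0.1, 0.4, 0.35, 0.8])   # 3 of 4 pairs ordered correctly
```

An AUC of 0.5 means the index is no better than chance; the reported 0.79 means a randomly chosen affected case outscores an unaffected one about 79% of the time.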
Large vessel occlusions associated with ipsilateral high-grade extracranial internal carotid artery stenosis, also called tandem occlusions, remain under investigation for optimal management. The risk of in-stent thrombosis is heightened for those who undergo carotid stenting in conjunction with mechanical thrombectomy (MT). Tirofiban, an intravenous antiplatelet agent that binds the glycoprotein IIb/IIIa (GPIIb/IIIa) receptor, inhibits platelet aggregation and has been used in cases of tandem occlusions to reduce the risk of in-stent thrombosis. Unfortunately, the bleeding risk is elevated in patients who receive intraoperative antithrombotics during stenting procedures, particularly in those who undergo reperfusion therapy after prior fibrinolytic therapy. We present a case of tirofiban used to reduce the risk of in-stent thrombosis during MT and stenting of a carotid terminus tandem occlusion in a 64-year-old man. We aim to compare our case with currently published data on tirofiban use during similar procedures, with regard to indications for treatment, hemorrhagic events, and functional outcomes.
The management of ruptured dissecting cerebral pseudoaneurysms (RDPs) remains a significant neurosurgical challenge. Flow-diversion devices (FDDs) have emerged as an effective therapeutic option; however, their use typically necessitates dual antiplatelet therapy (DAPT), which carries an inherent risk of hemorrhagic complications. This concern has led to growing interest in single antiplatelet therapy (SAPT) with ticagrelor, given its more predictable pharmacodynamic profile. Nevertheless, the evidence supporting the use of ticagrelor-based SAPT in this setting remains limited. This study evaluated the efficacy and safety of ticagrelor-based SAPT in patients following flow-diversion treatment for RDPs. The Preferred Reporting Items for Systematic Reviews and Meta-Analyses (PRISMA) guidelines were followed for screening and selecting research articles. Electronic databases used for data extraction were PubMed, ClinicalTrials.gov, and Cochrane Library. The research timeline was set from January 2015 to September 2024. The Newcastle-Ottawa Scale (NOS) was applied to assess the quality of observational studies. This study used the Review Manager (RevMan) software version 5.4 (London, England: Cochrane Collaboration) for pooled analysis. Through a pooled analysis of six cohort or observational studies involving 1118 patients, the findings indicate that SAPT with ticagrelor may be comparable to standard DAPT in managing thrombosis risk.
The pooled analysis showed that the composite outcome of thromboembolic and major hemorrhagic complications (OR: 0.86, 95% CI: 0.53-1.38) and hemorrhagic complications alone (OR: 0.58, 95% CI: 0.21-1.62) did not differ significantly between patients receiving ticagrelor-based SAPT and those receiving control antiplatelet regimens (including aspirin and clopidogrel), although the point estimates favored ticagrelor-based SAPT. Similarly, no statistically significant differences were observed in thromboembolic events (OR: 1.15, 95% CI: 0.57-2.32) or mortality (RR: 1.17, 95% CI: 0.21-6.39) between the treatment and control groups. Overall, this meta-analysis suggests that SAPT with ticagrelor may be a feasible and potentially safe alternative to DAPT in patients undergoing flow-diverter treatment for RDPs. The available evidence indicates comparable safety profiles, although the certainty of these findings remains limited. Therefore, well-designed randomized controlled trials and prospective comparative studies are warranted to more definitively evaluate efficacy, safety, and optimal patient selection.
Background Tennis is a physically demanding sport characterized by explosive movements, repeated trunk rotations, and asymmetric loading of the dominant upper limb. Overuse injuries are common, with dominant-arm elbow pain (DEP) and low back pain (LBP) representing frequent musculoskeletal complaints among tennis players. Biomechanical differences between the one-handed backhand (1HB) and the two-handed backhand (2HB) may influence the distribution of mechanical loads on the upper limb and spine; however, their relationship with clinically relevant symptoms remains unclear. This cross-sectional observational study aimed to investigate the association between the backhand technique and the prevalence of DEP and LBP in a heterogeneous population of tennis players. Methodology An anonymous online questionnaire was distributed through tennis clubs and social media between February and November 2025. A total of 455 responses were collected, of which 445 complete questionnaires were included in the final analysis. Results The mean age of the participants was 35.1 ± 17.9 years, and 73.9% were male. The median weekly training volume was five hours (interquartile range = 3-11). In crude analyses, elbow pain was more frequent among players using the 1HB (56.5% vs. 32.9%, p < 0.001), while LBP was slightly more common in the same group (59.5% vs 49.5%, p = 0.049). However, multivariable Poisson regression models adjusted for age, sex, training volume, competitive level, and years of practice showed no independent association between the backhand technique and DEP (prevalence ratio (PR) = 1.08; 95% confidence interval (CI) = 0.80-1.46; p = 0.625) or LBP (PR = 1.18; 95% CI = 0.95-1.48; p = 0.142). Stratified analyses revealed that the 1HB was significantly more common among male players (p < 0.001), and elbow pain was also more prevalent in males (p = 0.00076), whereas the prevalence of LBP did not differ significantly between sexes (p = 0.798). 
Among participants reporting elbow symptoms (n = 190), 64.7% localized pain to the lateral epicondylar region and 35.3% to the medial epitrochlear region. Overall, the backhand technique was not independently associated with either elbow or lumbar symptoms. Conclusions These findings support a multifactorial model of injury risk in tennis, suggesting that factors such as age, sex, cumulative load, and player characteristics may play a greater role than the isolated choice of the backhand technique.
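The crude comparisons above (e.g., elbow pain in 56.5% vs. 32.9% of players) correspond to an unadjusted prevalence ratio, which the multivariable Poisson models then adjust for covariates. Below is a sketch of the crude PR and its 95% CI on the log scale; the counts are invented for illustration, since the abstract does not report group sizes.

```python
import math

def prevalence_ratio(cases_exp, n_exp, cases_unexp, n_unexp):
    """Crude prevalence ratio with a 95% CI computed on the log scale
    (delta-method standard error for log PR)."""
    pr = (cases_exp / n_exp) / (cases_unexp / n_unexp)
    se = math.sqrt(1/cases_exp - 1/n_exp + 1/cases_unexp - 1/n_unexp)
    lo = math.exp(math.log(pr) - 1.96 * se)
    hi = math.exp(math.log(pr) + 1.96 * se)
    return pr, lo, hi

# hypothetical: elbow pain in 50/100 one-handed vs 25/100 two-handed players
pr, lo, hi = prevalence_ratio(50, 100, 25, 100)
```

A crude PR like this can look strongly positive yet attenuate toward 1 after adjustment, which is exactly the pattern the study reports for backhand technique.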
Postoperative pain remains a significant clinical challenge, and opioid-based analgesia alone is often insufficient, while also contributing to adverse outcomes and long-term opioid dependence. Osteopathic manipulative treatment (OMT) has been proposed as a nonpharmacologic adjunct within multimodal postoperative pain management frameworks, but the evidence has not been formally synthesized. We searched PubMed, EMBASE, CINAHL, Scopus, and the Cochrane Central Register of Controlled Trials in January 2026, supplemented by a manual journal search of the Journal of the American Osteopathic Association, International Journal of Osteopathic Medicine, Journal of Osteopathic Medicine, and Journal of Manual & Manipulative Therapy. Eligible studies were randomized or matched controlled trials that evaluated OMT as an adjunctive intervention in postoperative patients and reported quantitative pain outcomes. Primary outcomes included pain intensity, analgesic consumption, functional recovery, and length of hospital stay. Secondary outcomes included respiratory function, patient satisfaction, and adverse events. Three studies met the inclusion criteria and were carried forward to the final analysis. Two randomized controlled trials measuring postoperative pain via the Visual Analog Scale (VAS) demonstrated reductions of approximately 1.7 to 2.2 points in OMT-treated patients compared to controls at their respective assessment endpoints, exceeding established thresholds for clinical significance. A third prospective matched controlled study found that OMT patients achieved earlier functional milestones compared to controls, though differences in supplemental analgesic consumption were not statistically significant. 
An exploratory pooled analysis of the two VAS trials, using a DerSimonian-Laird random-effects model, yielded a mean difference of -1.16 (95% CI: -2.16 to -0.15); while this estimate excludes zero, with only two contributing studies and substantial heterogeneity (I² = 86.3%), it remains statistically fragile and should not be interpreted as confirmatory. The narrative synthesis of individual trial results provides a more defensible basis for clinical inference at this stage: across independent controlled trials in cardiac and orthopedic surgical populations, OMT was consistently associated with reduced pain, lower analgesic requirements, and faster functional recovery. Formal meta-analytic conclusions are not warranted given the current evidence base. Larger multicenter trials with standardized protocols are needed to define OMT's role in perioperative care.
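For context, the DerSimonian-Laird random-effects pooling named above can be written in a few lines: fixed-effect weights give Cochran's Q, Q yields the between-study variance tau-squared, and inverse-variance weights inflated by tau-squared give the pooled estimate and I-squared. The (effect, variance) pairs below are placeholders, not the trials' data, so the output does not reproduce the reported -1.16 estimate.

```python
import math

def dersimonian_laird(effects, variances):
    """Pooled effect, 95% CI, and I^2 under a DerSimonian-Laird random-effects model."""
    k = len(effects)
    w = [1 / v for v in variances]                      # fixed-effect (inverse-variance) weights
    ybar = sum(wi * yi for wi, yi in zip(w, effects)) / sum(w)
    q = sum(wi * (yi - ybar) ** 2 for wi, yi in zip(w, effects))   # Cochran's Q
    c = sum(w) - sum(wi ** 2 for wi in w) / sum(w)
    tau2 = max(0.0, (q - (k - 1)) / c)                  # between-study variance estimate
    wr = [1 / (v + tau2) for v in variances]            # random-effects weights
    pooled = sum(wi * yi for wi, yi in zip(wr, effects)) / sum(wr)
    se = math.sqrt(1 / sum(wr))
    i2 = max(0.0, (q - (k - 1)) / q) * 100 if q > 0 else 0.0
    return pooled, pooled - 1.96 * se, pooled + 1.96 * se, i2

# placeholder per-study mean differences and variances (two studies)
pooled, lo, hi, i2 = dersimonian_laird([1.0, 3.0], [1.0, 1.0])
```

With only two studies, tau-squared is estimated from a single degree of freedom, which is the statistical fragility the text flags: the CI can exclude zero while the heterogeneity estimate itself is barely identified.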
Background Helicobacter pylori (Hp) infection remains a major public health challenge in Colombia, a region with a high incidence of gastric cancer. Optimizing amoxicillin dosing and adherence to weight-based regimens are key modifiable factors to improve eradication. Objective This multicenter study aimed to evaluate real-world Hp eradication rates in Colombian children and identify independent predictors of treatment success. Methods This secondary analysis of the multicenter "LatamPed-Hp" REDCap registry included 26 centers in 14 Colombian cities and enrolled treatment-naïve children with confirmed Hp infection. Demographic data, clinical symptoms, and pharmacological regimens were recorded. Eradication was confirmed four weeks post-treatment, primarily via non-invasive methods. Factors associated with success were analyzed using bivariate tests and a multivariable logistic regression model. Results A total of 330 patients were included (mean age 11.0 ± 3.8 years; 52.7% female). The overall eradication rate was 81.8% (270/330). Bivariate analysis showed that success was significantly associated with treatment adherence (p < 0.001), absence of non-gastrointestinal comorbidities (p = 0.005), and adequate amoxicillin dosing (p = 0.017). In the multivariable model, adequate weight-based amoxicillin dosing was a robust independent predictor of success (adjusted odds ratio (aOR) 2.76; 95% confidence interval (CI): 1.11-6.86; p = 0.028). Adherence remained the strongest protective factor (aOR 48.99; p < 0.001), while non-gastrointestinal comorbidities reduced the odds of success (aOR 0.21; p < 0.001). The model demonstrated high accuracy (84.59%) and sensitivity (98.31%). Conclusion Eradication rates in Colombian children remain below international targets. Pharmacological precision, specifically adequate amoxicillin dosing, is a critical modifiable factor for success.
Ensuring strict adherence to weight-based protocols and improving treatment compliance are essential strategies to optimize Hp eradication and reduce long-term gastric cancer risk in this population.
Background Antimicrobial stewardship programs are fundamental strategies for improving prescribing practices. In Colombia, evidence of their impact is limited. Objective The objective of this study was to ascertain the role of antimicrobial therapy change, as an audit-strategy intervention of the antimicrobial stewardship programme, in the period from January to July 2024 in a tertiary care hospital in Colombia, Hospital Universitario San Rafael de Tunja. Methods Adult patients (over 18 years of age) were evaluated through an audit-and-feedback stewardship strategy, with data collected retrospectively for analysis. Patients who died, were discharged, or were referred before the antimicrobial stewardship programme team intervention were excluded. Clinical, microbiological, and therapeutic characteristics, duration of treatment, and adherence to the antimicrobial stewardship programme's recommendations were analysed. Results A total of 487 patients were included in the study, with a median age of 59 years; 53.8% were male and 46.2% were female. The primary diagnosis was urinary tract infection, accounting for 27.7% of cases. Piperacillin/tazobactam was the most commonly used empirical antimicrobial. Following the implementation of the antimicrobial stewardship programme at the hospital in January 2024, treatment was adjusted in 78.2% of cases, with de-escalation (37.8%) and stepwise escalation (37.6%) predominating. Following the interventions by the antimicrobial stewardship group, 92.2% of prescribers demonstrated overall compliance. Targeted treatment was significantly shorter in patients who underwent intervention (median: 5 days) compared to those who did not (7 days; p < 0.001). The 30-day readmission rate was 19.1%, with no significant differences between groups.
Conclusions The implementation of antimicrobial stewardship programmes enabled the optimisation of antimicrobial therapy through individualised adjustments, achieving high adherence and a reduction in treatment duration, supporting the role of antimicrobial stewardship programmes as a key strategy for improving prescription quality. However, prospective studies are needed to evaluate their long-term clinical, microbiological, and economic impact.
Episiotomy is a commonly performed surgical procedure during vaginal delivery and is generally considered safe. However, rare complications may occur during episiotomy repair, including fracture of the suture needle. Localization and retrieval of a broken needle fragment can be challenging, particularly in the presence of active bleeding or limited visualization of the surgical field. We report a case of a fractured suture needle during episiotomy repair in a postpartum patient. Because the needle fragment could not be immediately located during the procedure, portable radiography was used to identify the exact location of the retained fragment. Following radiographic localization, targeted minimal dissection was performed, and the needle fragment was successfully removed within a short period of time. The patient remained hemodynamically stable throughout the procedure. No significant bleeding or postoperative complications were observed. Follow-up examination showed normal healing of the episiotomy site. This case highlights the usefulness of portable radiography as a practical and accessible method for the localization and safe removal of retained surgical needle fragments during episiotomy repair. Prompt identification and minimally invasive retrieval can prevent prolonged surgical exploration and potential complications.
Acute appendicitis remains one of the most common surgical emergencies worldwide and has traditionally been treated with appendectomy. However, growing evidence from randomized trials and international guidelines has challenged this paradigm, suggesting that nonoperative management with antibiotics may be a safe alternative for selected patients with uncomplicated disease. This narrative review aims to examine the current evidence on the management of uncomplicated acute appendicitis, with particular focus on the role of antibiotic therapy compared with surgical appendectomy, patient selection, and clinical outcomes. A narrative review of the literature was conducted using major medical databases, including PubMed, Scopus, and Google Scholar. Randomized clinical trials, systematic reviews, meta-analyses, and international guidelines published in recent years were analyzed to summarize current diagnostic strategies and treatment approaches for uncomplicated acute appendicitis. Recent studies demonstrate that antibiotic therapy can successfully treat a substantial proportion of patients with uncomplicated appendicitis, potentially avoiding immediate surgery. Nevertheless, nonoperative management is associated with higher rates of recurrence and subsequent appendectomy. The presence of an appendicolith has been consistently identified as a significant predictor of treatment failure. Laparoscopic appendectomy remains a safe and definitive treatment with low complication rates and minimal risk of recurrence. The management of uncomplicated acute appendicitis is evolving from a strictly surgical disease to a condition with multiple evidence-based treatment options. While appendectomy remains the definitive therapy, antibiotic treatment represents a feasible alternative in carefully selected patients. A patient-centered approach that incorporates imaging findings, clinical risk factors, and shared decision-making is essential for optimal outcomes.
Tooth loss is often followed by alveolar ridge resorption, which may compromise implant placement and long-term stability. Different implant site preparation techniques, such as conventional drilling, bone expansion, and ridge split procedures, are used to manage varying ridge conditions. These techniques may influence peri-implant bone remodeling in different ways. This study aimed to evaluate and compare crestal bone loss around dental implants placed using conventional drilling, bone expansion, and ridge split techniques. This prospective clinical comparative study included 72 implant sites in patients aged 25-55 years who required implant-supported rehabilitation. The implant sites were divided into three groups based on the implant site preparation technique used: conventional drilling (CD), bone expansion (BE), and ridge split (RS), with 24 implants in each group. Crestal bone levels were evaluated using cone-beam computed tomography immediately after implant placement and at 3, 6, and 12 months of follow-up. Mesial and distal crestal bone loss were measured and analyzed statistically using one-way analysis of variance (ANOVA) with Tukey's post hoc test for intergroup comparisons and repeated-measures ANOVA for intragroup comparisons. Categorical variables were compared with the chi-square test. Statistical significance was set at p < 0.05. Crestal bone loss increased progressively over time in all three groups. Intergroup comparison revealed statistically significant differences in both mesial and distal crestal bone loss at 3, 6, and 12 months (p < 0.001). The ridge split group demonstrated the highest bone loss, followed by the bone expansion group, while the conventional drilling group showed the least bone loss at all time intervals. Pairwise comparisons between all groups were also statistically significant (p < 0.001). 
Intragroup analysis showed a significant increase in crestal bone loss over time within each group (p < 0.001), except for the comparison between 6 and 12 months in the conventional drilling group, which was not statistically significant (p > 0.05). Conclusion Implant site preparation techniques significantly influence peri-implant crestal bone remodeling. Conventional drilling demonstrated the least crestal bone loss, whereas ridge split procedures showed greater bone remodeling during the follow-up period.
Gallstone ileus is a rare mechanical bowel obstruction caused by the migration of a gallstone through a biliary-enteric fistula, most commonly impacting at the ileocecal valve. Proximal obstruction is uncommon and is typically classified as Bouveret syndrome when involving the stomach or proximal duodenum. Obstruction near the ligament of Treitz represents an exceedingly rare variant. A 76-year-old female presented with nausea, vomiting, and abdominal distention. Computed tomography demonstrated gastric and proximal small bowel dilation secondary to an obstructing gallstone in the distal duodenum. After anticoagulation reversal, exploratory laparotomy was performed. The stone was manually advanced into the jejunum, followed by segmental resection and primary anastomosis. The gallstone measured 6 cm. The gallbladder and fistula were not addressed initially. The patient recovered well and was discharged on postoperative day four. She later underwent an elective robotic cholecystectomy with fistula takedown without complication. This case describes an atypical proximal gallstone ileus located near the ligament of Treitz, distinct from both classic gallstone ileus and Bouveret syndrome. A staged surgical approach allowed safe relief of obstruction followed by definitive biliary management. This case highlights the importance of individualized surgical planning and supports staged management as a safe and effective strategy for rare proximal gallstone ileus.
This is a novel case of a rapidly progressive right hemisphere stroke in the setting of multiple infections that highlights the impact of comorbidities on stroke progression. We hope that reporting this case draws attention to the need for a multidisciplinary focus on the progression of stroke symptoms in the face of septic processes. A 60-year-old African American man with a past medical history of hypertension, diabetes, bilateral above-the-knee amputations, and frequent cocaine use presented to the hospital with left-sided deficits and suspected seizures. The patient's presentation was complicated by a urinary tract infection (UTI), bladder outlet obstruction, pneumonia, and medication non-compliance. Computed tomography angiography (CTA) revealed severe stenosis of the cavernous segment of the right internal carotid artery (ICA). Magnetic resonance imaging (MRI) revealed a complete right hemisphere stroke with 19.28 mm midline shift. Criteria for brain death were met on day 6, and the patient died on day 10. This case highlights the rapid progression of infarction in the setting of critical internal carotid artery (ICA) stenosis, sepsis, and delayed urinary decompression. Early recognition and coordinated multidisciplinary care may improve outcomes in similarly complex presentations.
Injury to the marginal mandibular nerve (MMN) as an isolated complication of posterior cervical spine surgery performed in the prone position is exceedingly rare, with only a handful of documented instances in the indexed surgical literature. We present the case of a 60-year-old man who developed unilateral lower facial deviation immediately following a seven-hour posterior cervical decompression and multilevel instrumented stabilization procedure. The deficit was confined entirely to the distribution of the MMN, with no involvement of other facial nerve branches or adjacent peripheral nerves. The clinical findings were most consistent with positional MMN neuropraxia, and the patient was managed conservatively with physiotherapy and ENT guidance, with early evidence of recovery before discharge. This report aims to draw the attention of spine surgeons to a complication that is probably underrecognized and underreported, and to underline the anatomical and technical considerations that can reduce its occurrence.
Cesarean delivery is one of the most common surgeries in the United States and is associated with greater postpartum blood loss than vaginal delivery. Despite increasing emphasis on selective, symptom-driven evaluation, routine postoperative hemoglobin testing following cesarean delivery remains common practice. Identifying patients at greatest risk for severe postpartum anemia may help guide more targeted laboratory assessment while avoiding unnecessary testing in low-risk individuals. This study aimed to develop and evaluate a predictive model for severe postpartum anemia following cesarean delivery. We conducted a retrospective cohort study of patients undergoing cesarean delivery at three hospitals within the WakeMed Health and Hospitals system in North Carolina during 2019. Patients aged ≥18 years with both preoperative and postoperative hemoglobin measurements were included. Demographic, obstetric, and clinical variables were extracted from the electronic health record. Severe postpartum anemia was defined as postoperative hemoglobin <7 g/dL. Elastic net logistic regression with 10-fold cross-validation was used to identify predictors of severe anemia. A total of 2,061 patients met the inclusion criteria. Severe postpartum anemia occurred in 80 patients (3.8%), and 116 patients (5.6%) received blood transfusion. The final predictive model retained two variables: preoperative hemoglobin level and number of uterotonic agents administered. The model demonstrated strong discrimination for predicting severe anemia (area under the curve: 0.89). Applying a prevalence-based probability threshold of 3.5% reduced postoperative hemoglobin testing by 65% while maintaining 90% sensitivity for identifying severe anemia. These findings suggest that a small number of routinely available clinical factors can effectively identify patients at increased risk for severe postpartum anemia following cesarean delivery. 
A risk-based approach to postoperative laboratory evaluation may support current guideline recommendations favoring selective testing and may reduce unnecessary laboratory utilization without compromising patient safety. A predictive model incorporating preoperative hemoglobin level and uterotonic use provides a simple method for identifying patients at the highest risk for severe postpartum anemia after cesarean delivery. Implementing selective postoperative hemoglobin testing based on risk stratification may improve resource utilization and streamline postpartum care.
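The modeling workflow described above (elastic net logistic regression with 10-fold cross-validation, followed by a prevalence-based probability cutoff to triage testing) can be illustrated with a short sketch. This is not the study's actual code or data: the predictor distributions, coefficients, and outcome-generating model below are all hypothetical synthetic values chosen only to mimic the reported structure (two retained predictors, ~4% outcome prevalence, a 3.5% threshold).

```python
import numpy as np
from sklearn.linear_model import LogisticRegressionCV
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(0)
n = 2000

# Hypothetical distributions for the two predictors retained in the final model
preop_hgb = rng.normal(11.5, 1.5, n)     # preoperative hemoglobin, g/dL
uterotonics = rng.integers(1, 4, n)      # number of uterotonic agents administered

# Synthetic outcome: risk rises as hemoglobin falls and uterotonic use rises
logit = -3.8 - 0.9 * (preop_hgb - 11.5) + 0.8 * (uterotonics - 1)
severe_anemia = rng.binomial(1, 1 / (1 + np.exp(-logit)))

X = np.column_stack([preop_hgb, uterotonics])

# Elastic net logistic regression with 10-fold cross-validation,
# mirroring the modeling approach named in the abstract
model = LogisticRegressionCV(
    Cs=5, cv=10, penalty="elasticnet", solver="saga",
    l1_ratios=[0.5], scoring="roc_auc", max_iter=5000,
).fit(X, severe_anemia)

risk = model.predict_proba(X)[:, 1]
auc = roc_auc_score(severe_anemia, risk)

# Prevalence-based threshold: order postoperative hemoglobin testing
# only for patients whose predicted risk exceeds the cutoff
threshold = 0.035
flagged = risk >= threshold
sensitivity = severe_anemia[flagged].sum() / severe_anemia.sum()
tests_avoided = 1 - flagged.mean()
print(f"AUC={auc:.2f}, sensitivity={sensitivity:.2f}, tests avoided={tests_avoided:.0%}")
```

The key design point is the last step: rather than tuning the cutoff for overall accuracy, it is anchored near the outcome prevalence so that nearly all true cases remain above it, trading a modest false-positive rate for high sensitivity while still sparing a large fraction of low-risk patients from routine testing.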
Introduction Participation is a central outcome in community-based rehabilitation, particularly in aging societies where social isolation among older adults is increasing. Community nurses have emerged as key facilitators of social participation in community settings. However, the mechanisms through which community nurses reconstruct participation in everyday practice remain insufficiently understood. This study aimed to clarify how community nurses reconstruct and facilitate participation in community-based rehabilitation. Methods This retrospective qualitative study analyzed activity logs and reflective practice records documented by three community nurses working in a rural Japanese community between January and December 2025. Thematic analysis following Braun and Clarke's framework was conducted. Participation episodes were identified and analyzed to examine how participation emerged and was facilitated in everyday community practice. Identified activities were also interpreted with reference to the participation concept of the International Classification of Functioning, Disability and Health (ICF). Results A total of 134 participation episodes were identified across multiple participation domains, including community life, recreation and leisure, interpersonal relationships, and digital communication. Four themes were identified: (1) creating participation-ready environments, (2) facilitating reciprocal roles and resident empowerment, (3) reconstructing participation through personal narratives and identity, and (4) bridging participation barriers through relational and structural support. Participation emerged as a relational and dynamic process embedded in everyday community interactions. Conclusions Community nurses play a critical role in reconstructing participation in community-based rehabilitation by facilitating relational engagement, enabling meaningful social roles, and reducing participation barriers. 
These findings provide empirical insights into how participation can be operationalized in community practice and highlight the potential of participation-oriented approaches in aging societies.
Colorectal carcinoma (CRC) is one of the most common cancers in various parts of the world. The evolution of CRC follows a stepwise genetic acquisition of mutations and a resultant transition from normal mucosa to adenoma and, ultimately, adenocarcinoma. The goal of the study was to assess the different immunohistomorphologic characteristics of adenomas and adenocarcinomas using the Ki-67 tumor proliferation marker that may have diagnostic or prognostic relevance.  Materials and methods: This descriptive study was conducted on 40 paraffin-embedded tissue blocks collected from colorectal tissues in a tertiary care hospital in Chennai, Southern India. The Ki-67 proliferation index (PI) was evaluated on immunohistochemically stained slides independently by two pathologists to minimize bias.  Results: Tissue samples from 25 (62.5%) male and 15 (37.5%) female patients with a mean age of 56.9 ± 16.09 years (range 6 to 83 years) and a median age of 60 years were included in the study. Adenocarcinomas comprised 24 (60%) cases, whereas 16 (40%) were adenomas. The mean Ki-67 PI differed significantly between adenomas (30.97% ± 11.44%) and adenocarcinomas (46.33% ± 25.16%), with p = 0.02.  Conclusions: Ki-67 immunohistochemistry is a simple and reproducible technique. A gradual increase in Ki-67 expression was observed from low-grade adenomas to high-grade adenomas and adenocarcinomas, suggesting that Ki-67 can be routinely used as a diagnostic or prognostic tumor proliferation marker in CRC.
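The abstract above reports a significant difference in mean Ki-67 PI between the two groups (p = 0.02) without naming the statistical test. A Welch's t-test is one common choice when group variances differ, as they clearly do here. The sketch below uses synthetic values generated only to match the reported group sizes and summary statistics; it is an illustration of the comparison, not the study's data or its actual analysis.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)

# Synthetic Ki-67 PI values matching the reported group sizes and summaries:
# adenomas n=16 (30.97 +/- 11.44), adenocarcinomas n=24 (46.33 +/- 25.16)
adenoma_pi = rng.normal(30.97, 11.44, 16)
adenocarcinoma_pi = rng.normal(46.33, 25.16, 24)

# Welch's t-test: does not assume equal variances, which is appropriate
# given the much wider spread in the adenocarcinoma group
t_stat, p_value = stats.ttest_ind(adenocarcinoma_pi, adenoma_pi, equal_var=False)
print(f"t = {t_stat:.2f}, p = {p_value:.4f}")
```

With the small, unbalanced samples and unequal variances seen here, Welch's correction adjusts the degrees of freedom downward, making the test more conservative than a pooled-variance t-test.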
Introduction Laparoscopic cholecystectomy is the standard treatment for symptomatic cholelithiasis. Although 3D laparoscopy may improve depth perception, its effects on efficiency, ergonomics, and complication rates in comparison to 2D systems remain unclear. Methods In this prospective, single-blinded, randomized comparative study, 100 patients underwent elective laparoscopic cholecystectomy at the Department of General and Minimal Access Surgery in a tertiary care teaching hospital in North India. Patients were randomized equally to 3D (Group A) or 2D laparoscopy (Group B). Exclusion criteria included prior major abdominal surgery, coagulopathy, or pregnancy. Total performance time, surgeon's subjective depth perception (scale 1-5), strain scores (eye, wrist/hand, neck/back, dizziness/headache), intraoperative complications, conversion rates, and hospital stay were documented through surgical records and postoperative questionnaires. Results The total performance time was significantly shorter in the 3D group (46 min) compared to the 2D group (50 min; p=0.0074). The median depth perception score was higher in the 3D group (5.0 vs. 2.0; p<0.001). Wrist/hand strain was lower in the 3D group (median 2.0 vs. 3.0; p=0.0117), while eye strain (4.0 vs. 3.0; p=0.0021) and dizziness/headache (4.0 vs. 2.0; p<0.001) were greater. There were no significant differences in neck/back strain (p=0.81), rates of gallbladder rupture (10% vs. 12%; p=0.76), liver bed bleeding (10% vs. 14%; p=0.76), biliary injury (none), conversion rates (2% in both), or hospital stay (2.06 days each; p=0.78). Conclusions 3D laparoscopy enhances operative efficiency and depth perception while reducing wrist/hand strain. However, increased eye strain and dizziness warrant further technological optimization to improve the overall ergonomic profile for surgeons.