Posterior glenoid dysplasia (PGD) has been described as an adaptive osseous change in young throwing athletes exposed to repetitive mechanical stress. However, prior studies have lacked consistent definitions, reliable classifications, or sport-specific analysis. This study aimed to analyze and classify PGD in young athletes across various sports and compare its proportion and severity between symptomatic baseball players and athletes from other sports. We retrospectively reviewed 10 years of shoulder CT imaging data from 568 young athletes presenting with shoulder pain: 418 baseball players, 50 overhead athletes, and 100 athletes from other sports. Dysplasia of the posterior to posteroinferior glenoid, bone defect size, and additional bone formation were assessed. We established a five-type classification system: Type 0 (normal sharp rim), Type 1 (smooth rounded rim), Type 2 (triangular defect), Type 3 (retreated joint surface with posterior bulging), and Type 4 (new bone formation). PGD was identified in 85.4% of baseball players, 32.0% of overhead athletes, and 9.0% of non-overhead athletes (p<0.001). Type 2 was most common in baseball players (42.6%), and advanced morphological PGD (Types 3-4) was present in 13.4% of baseball players but was absent in other groups. PGD was more commonly observed among overhead athletes, particularly baseball players, within this symptomatic imaging cohort. Advanced morphologic types were identified predominantly in baseball players. These findings provide a structured framework for describing posterior glenoid morphology in throwing athletes and may facilitate future prospective studies investigating clinical relevance and natural history.
Musculoskeletal injuries are common among athletes and can negatively affect both physical and psychological wellbeing. Beyond physical disability, injuries have been linked to poor mental health, loss of athletic identity, and increased substance use. In low- and middle-income countries (LMICs), where awareness and management of mental health remain limited, the psychological burden of sports injuries may be underrecognized. Rugby, a physically demanding contact sport, has one of the highest rates of injury, but little is known about how such injuries influence mental health and related outcomes in African contexts. This study explored the relationship between musculoskeletal injuries and health outcomes, including mental health, physical functioning, identity, and substance use, among rugby players in Kenya. We conducted a cross-sectional survey between October 2024 and November 2024 among rugby players from ten clubs across Kenya. Eligible participants were adults aged 18 years and above, recruited using convenience sampling. Data were collected through electronic questionnaires that captured demographics and included various psychological and health assessment tools. Summary statistics were presented as medians and interquartile ranges for continuous data and frequencies and percentages for categorical data. The non-parametric Kruskal-Wallis test was used to compare continuous variables between groups, and Fisher's exact test was used to compare categorical variables. A total of 400 players were recruited and included in the analysis. The median age was 24.0 years (IQR 21.0-27.0) and the majority were male (89.8%). Over half (52.5%) reported a significant injury in the past two years, most commonly affecting the knee (41.0%), ankle (32.4%), and shoulder (21.0%). The mean GHQ-12 score was 22.4 (SD = 5.6), indicating some burden of mental health symptoms, and injured players had significantly higher GHQ-12 scores (p = 0.001).
On the SF-12, the mean physical and mental component scores were 84.5 (SD = 12.6) and 60.5 (SD = 8.6), respectively, with injured players reporting lower physical scores (p < 0.001). The AIMS showed strong athletic identity overall, with injured players demonstrating significantly higher social identity scores (p = 0.007). The ASSIST revealed that 72.8% had ever used substances, with alcohol (71.8%), cannabis (19.5%), and tobacco (13.3%) being the most common; cannabis use was significantly higher among injured players (p = 0.002). Musculoskeletal injuries among rugby players were associated with poorer mental health, reduced physical functioning, and greater cannabis use. These findings underscore the need for integrated injury management approaches that address both physical rehabilitation and psychological wellbeing, including mental health screening, counselling, and substance use surveillance.
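The group comparisons described in this abstract (Kruskal-Wallis for continuous scores, Fisher's exact test for categorical outcomes) can be sketched with SciPy. The counts and scores below are purely illustrative, not the study's data:

```python
from scipy.stats import kruskal, fisher_exact

# Illustrative GHQ-12-style scores for three hypothetical groups (not study data)
uninjured = [1, 2, 3, 4, 5]
recent = [10, 11, 12, 13, 14]
chronic = [20, 21, 22, 23, 24]

# Kruskal-Wallis: non-parametric comparison of k independent groups
h_stat, p_kw = kruskal(uninjured, recent, chronic)
print(f"Kruskal-Wallis H = {h_stat:.2f}, p = {p_kw:.4f}")

# Fisher's exact test on a hypothetical 2x2 table,
# e.g. injury status (rows) vs. cannabis use (columns)
table = [[8, 2],   # injured: users / non-users
         [1, 5]]   # uninjured: users / non-users
odds_ratio, p_fisher = fisher_exact(table)
print(f"Odds ratio = {odds_ratio:.1f}, p = {p_fisher:.4f}")
```

Kruskal-Wallis compares mean ranks rather than means, so it tolerates the skewed, ordinal-like questionnaire scores reported here; Fisher's exact test avoids the chi-square approximation when cell counts are small.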
This study determines the prevalence of ACL injury history among currently active female football players across age groups and playing levels and examines the association between ACL injury history and activity-related knee pain. A total of 1026 active Danish female youth- and senior-league football players were invited to an online questionnaire (response rate: 751 [73%]) on (1) ACL injury history and (2) present knee pain during physical activity. Prevalence was calculated separately according to age (youth vs senior) and playing level (non-elite vs elite). Logistic regression analyses investigated factors associated with the prevalence of ACL injury history and activity-related knee pain, respectively. The prevalence of ACL injury history was 5.0% (95% confidence interval [CI] 3.3%-7.4%) in youth football and 14.8% (95% CI 10.6%-19.8%) in senior football. The prevalence of activity-related knee pain was comparable between youth (24.6%, 95% CI 20.8%-28.7%) and senior football players (26.0%, 95% CI 20.7%-31.9%). Comparable prevalence of ACL injury history and knee pain was reported at elite and non-elite level. ACL injury history was strongly associated with activity-related knee pain (OR = 5.4-8.7, p < 0.0001). Playing with a previous ACL injury is common in active female football players, particularly at senior levels (nearly 1 in 6 elite players). The strong association between ACL injury history and activity-related knee pain underscores the long-term negative consequences. This study highlights the need for secondary and tertiary prevention strategies in female football.
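Prevalence estimates with 95% confidence intervals, like those reported above, are commonly computed with the Wilson score interval; the abstract does not state which method the authors used, so the pure-Python sketch below, with hypothetical counts, is only an assumption about the calculation:

```python
import math

def wilson_ci(k: int, n: int, z: float = 1.96) -> tuple[float, float]:
    """Wilson score 95% CI for a proportion of k events in n trials."""
    p = k / n
    denom = 1 + z**2 / n
    centre = (p + z**2 / (2 * n)) / denom
    margin = z * math.sqrt(p * (1 - p) / n + z**2 / (4 * n**2)) / denom
    return centre - margin, centre + margin

# Hypothetical example: 15 ACL injury histories among 300 youth players (5.0%)
lo, hi = wilson_ci(15, 300)
print(f"prevalence 5.0%, 95% CI {lo:.1%}-{hi:.1%}")
```

Unlike the simple normal approximation, the Wilson interval stays inside [0, 1] and remains reasonable at low prevalences such as the 5.0% reported for youth players.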
Countermovement jump (CMJ) performance is widely used to assess explosive lower-limb function in football players. Although knee isokinetic strength is frequently measured in elite sport environments, the extent to which it relates to CMJ performance remains unclear, particularly when CMJ is performed with free arm movement. Therefore, the aim of this study was to examine the relationship between knee isokinetic muscle strength characteristics and CMJ performance in elite male football players. Twenty-four elite male football players (age 23.83 ± 5.98 years) participated in this cross-sectional study. CMJ height was assessed using an optical measurement system (Optojump Next). Concentric knee extensor and flexor peak torque was measured using an isokinetic dynamometer at angular velocities of 60°/s and 180°/s and expressed as peak torque/body weight% (PT/BW,%). Pearson correlation and linear regression analyses were used to examine associations between isokinetic strength variables and CMJ performance. Bilateral differences, hamstring-to-quadriceps (H/Q) ratios, and inter-limb asymmetries were also analyzed. Significant positive correlations were observed between CMJ height and knee extensor peak torque expressed as PT/BW (%) at both angular velocities. Stronger relationships were found at 180°/s (r = 0.558-0.642, p ≤ 0.005) compared with 60°/s (r = 0.483-0.500, p < 0.05). Regression analyses showed that knee extensor strength at 180°/s explained up to 41.2% of the variance in CMJ height. Hamstring strength demonstrated weaker and less consistent associations with CMJ performance, while H/Q ratios and inter-limb asymmetries were not significantly related to jump height. Quadriceps isokinetic strength expressed as PT/BW (%) was significantly associated with CMJ performance in elite male football players, with stronger relationships observed at higher angular velocity. 
These findings suggest that knee extensor strength assessed at higher angular velocity is meaningfully associated with explosive lower-limb performance and may provide useful complementary information within routine neuromuscular monitoring in professional football.
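The link between the correlations and the "variance explained" figure above is simply r squared: r = 0.642 corresponds to r² ≈ 0.412, i.e. the 41.2% reported. A minimal sketch, using the abstract's reported r plus illustrative torque/jump pairs (the raw data are not available):

```python
import math

def pearson_r(x, y):
    """Sample Pearson correlation coefficient."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

# Variance in CMJ height explained by extensor torque at 180 deg/s,
# from the abstract's reported correlation coefficient
r = 0.642
print(f"r = {r}, variance explained = {r**2:.1%}")

# Hypothetical peak-torque (PT/BW, %) and jump-height (cm) pairs
torque = [250, 270, 290, 310, 330]
jump = [34.0, 35.5, 35.0, 37.5, 38.0]
r_ill = pearson_r(torque, jump)
print(f"illustrative r = {r_ill:.3f}")
```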
Change of direction and deceleration are crucial for soccer performance. However, these abilities may be influenced by growth and maturation during adolescence. Understanding these effects is essential for optimizing training and potentially mitigating injury risk in young players. This study aimed to evaluate the effects of maturation on change of direction (COD) and deceleration performance in adolescent male soccer players with varying levels of maturity. Ninety-four adolescent male soccer players (age 13.2 ± 1.9 years) participated in this cross-sectional study. The participants were divided into three groups based on their biological maturity levels: pre-PHV, circa-PHV, and post-PHV. Over a ten-day period, familiarization, determination of maturity status, and 15-meter linear sprint and 505 agility test protocols were implemented. COD and deceleration deficits (DD) were calculated and analyzed. Significant differences were detected among the maturity groups for 10m and 15m linear sprints, dominant and non-dominant directions in the 505 agility test, and full approach (p<0.001; η²p: 0.58-0.81). Notably, strong correlations were found between 505 COD performance and linear sprint times in the pre- and post-PHV groups, whereas no such correlation was observed in the circa-PHV group. Furthermore, deceleration ability exhibited a significant correlation with 10m sprint performance; however, no significant relationship was identified with 15m sprint performance. The results suggest that maturity level significantly affects COD and deceleration performance in adolescent male soccer players. Coaches and performance specialists should design individualized training programs tailored to the developmental stages of young athletes to enhance performance and potentially mitigate injury risk.
Athlete burnout is a common negative psychological state of athletes, which includes physical/emotional exhaustion, diminished personal accomplishment, and negative evaluation of sport. The aim of this study was to examine burnout dimensions in amateur football players according to playing position. A total of 113 amateur soccer players (age range: 20-35 years) participated in the study; mean (SD) weight was 74.81 ± 8.41 kg and height was 179 ± 2.56 cm. The Athlete Burnout Questionnaire was used to determine the athlete burnout dimensions of the soccer players. The results showed that defense players had higher values in two subscales: emotional/physical exhaustion (EPE) and sport devaluation (SDeval). On the other hand, the strikers had higher values in the reduced sense of accomplishment (RSA) subscale. Based on the indicators of burnout dimensions, goalkeepers, midfielders, and forwards appear to make adequate psychophysiological adjustments to the demands of competition regarding physical and emotional exhaustion and sport devaluation. In terms of reduced sense of accomplishment, goalkeepers, defenders, and midfielders demonstrate sufficient psychophysiological adaptation to competitive demands. The higher burnout scores of defenders and strikers, whose roles carry direct match impact (preventing and scoring goals), should be investigated further.
Background Tennis is a physically demanding sport characterized by explosive movements, repeated trunk rotations, and asymmetric loading of the dominant upper limb. Overuse injuries are common, with dominant-arm elbow pain (DEP) and low back pain (LBP) representing frequent musculoskeletal complaints among tennis players. Biomechanical differences between the one-handed backhand (1HB) and the two-handed backhand (2HB) may influence the distribution of mechanical loads on the upper limb and spine; however, their relationship with clinically relevant symptoms remains unclear. This cross-sectional observational study aimed to investigate the association between the backhand technique and the prevalence of DEP and LBP in a heterogeneous population of tennis players. Methodology An anonymous online questionnaire was distributed through tennis clubs and social media between February and November 2025. A total of 455 responses were collected, of which 445 complete questionnaires were included in the final analysis. Results The mean age of the participants was 35.1 ± 17.9 years, and 73.9% were male. The median weekly training volume was five hours (interquartile range = 3-11). In crude analyses, elbow pain was more frequent among players using the 1HB (56.5% vs. 32.9%, p < 0.001), while LBP was slightly more common in the same group (59.5% vs. 49.5%, p = 0.049). However, multivariable Poisson regression models adjusted for age, sex, training volume, competitive level, and years of practice showed no independent association between the backhand technique and DEP (prevalence ratio (PR) = 1.08; 95% confidence interval (CI) = 0.80-1.46; p = 0.625) or LBP (PR = 1.18; 95% CI = 0.95-1.48; p = 0.142). Stratified analyses revealed that the 1HB was significantly more common among male players (p < 0.001), and elbow pain was also more prevalent in males (p = 0.00076), whereas the prevalence of LBP did not differ significantly between sexes (p = 0.798).
Among participants reporting elbow symptoms (n = 190), 64.7% localized pain to the lateral epicondylar region and 35.3% to the medial epitrochlear region. Overall, the backhand technique was not independently associated with either elbow or lumbar symptoms. Conclusions These findings support a multifactorial model of injury risk in tennis, suggesting that factors such as age, sex, cumulative load, and player characteristics may play a greater role than the isolated choice of the backhand technique.
Knee articular cartilage lesions are frequent in football players, but evidence for the most suitable surgical treatments is lacking. The aim of this International Cartilage Regeneration & Joint Preservation Society, Fédération Internationale de Football Association and Aspetar (ICRS-FIFA-Aspetar) consensus was to develop expert-based, patient-specific practical recommendations on the appropriateness of surgical treatments for symptomatic knee articular cartilage lesions in competitive football players. The RAND/UCLA Appropriateness Method was used by 17 voting experts to provide recommendations on the suitability of six different surgical procedures (debridement, debridement+orthobiologics, bone marrow stimulation procedures, osteochondral autograft transplantation, allografts, and regenerative procedures) depending on four key clinical considerations: lesion location, defect size, bone involvement, and patient preference towards higher priority for a quick return to play or long-term results. These resulted in 96 scenarios (16 clinical scenarios for six surgical procedures). Altogether, in 94% of clinical scenarios, at least one surgical procedure was considered appropriate. Patient preference had the highest influence on the results. Debridement plus orthobiologics was most often considered appropriate in patients preferring a quick return to play, while regenerative techniques were more often considered appropriate with patients prioritising long-term results. Osteochondral autograft transplantation and allografts were considered appropriate only in selected scenarios, whereas bone marrow stimulation procedures were considered inappropriate or uncertain. The recommendations established by this ICRS-FIFA-Aspetar consensus on the appropriateness of different surgical procedures to treat symptomatic articular cartilage lesions in competitive football players should be used as broad guidelines, but the preferred treatment should be player-specific.
Badminton is a physically demanding sport requiring a combination of aerobic endurance, anaerobic power, agility, and explosive strength. Repeated sprint training (RST) has emerged as a time-efficient strategy for improving physical performance, but its comprehensive effects on badminton players are not fully elucidated. Twenty-eight male collegiate badminton players were randomly assigned to either a repeated sprint training group (RST; n = 14) or a high-intensity interval training group (HIIT; n = 14). In addition to regular skill-based badminton practice, the RST group performed 30-m all-out running sprints twice per week (2-3 sets of 6 × 30 m, 30 s passive recovery between sprints and 2 min active recovery between sets) over 8 weeks. The HIIT group completed standard badminton training combined with running-based high-intensity intervals prescribed around 90% HR_max. Before and after the intervention, all participants were assessed for aerobic capacity (V̇O2max, vV̇O2max, first and second ventilatory thresholds), anaerobic power (Wingate peak power (PP), mean power (MP), fatigue index), repeated sprint ability (6 × 30-m sprints: ideal time (IS), total time (TS), performance decrement), agility (modified T-test), and lower-limb power (countermovement and spike jumps). Results: All participants completed the study, and no significant baseline differences were found between groups (p > 0.121). Significant main effects of time and group × time interactions were observed for V̇O2max, vV̇O2max, VT1, PP, MP, modified agility T-test, and spike jump height. The RST group showed greater post-intervention improvements in these variables than the HIIT group (all p ≤ 0.002), whereas the HIIT group demonstrated significant but smaller gains in V̇O2max, vV̇O2max, VT1, PP, and spike jump height (p < 0.05).
Significant time effects were also found for IS, TS, VT2, and countermovement jump height, with both groups improving after training (all p < 0.05), particularly in the RST group. Conclusion: An 8-week, twice-weekly 30-m RST program added to standard badminton training proved effective for concurrently improving aerobic capacity, anaerobic power, repeated sprint ability, agility, and lower-limb explosive power in collegiate male badminton players. These findings suggest that this specific RST protocol is a potent and time-efficient training modality for enhancing the multifaceted physical fitness required for badminton performance.
Visual attention plays a crucial role in basketball shooting, yet how it adapts under varying exercise intensities and from different shooting positions remains poorly understood, especially in elite female athletes. This study examined the effects of exercise intensity (low, moderate, high) and shooting position (left 45°, 90°, right 45°) on fixation behavior and 2‑point shooting accuracy in elite female basketball players. Twenty‑two players from a championship‑winning university team performed two‑point shots from three positions under three intensity conditions, each defined by heart‑rate zones (%HRmax). Fixation metrics (number of fixations, fixation duration, distribution) were recorded using Tobii Glasses 3. Data were analyzed using two‑way repeated‑measures ANOVA and generalized linear mixed‑effects models (GLMMs) with a binomial distribution for trial‑level accuracy; Pearson correlations are reported descriptively. Exercise intensity significantly influenced all fixation metrics. High intensity led to increased fixations across all areas of interest (hoop, backboard, net) and longer total fixation duration, indicating higher cognitive load. Moderate intensity was associated with the lowest total number of fixations and shortest duration, reflecting efficient visual processing. Shooting position also affected fixation: the 90° position attracted the most fixations and longest duration on the hoop (F = 4.56, p = 0.017, η²p = 0.19, 95% CI [0.02, 0.37]), while the 45° positions shifted attention toward the backboard. Fixation duration on the hoop positively correlated with accuracy under high intensity (r = 0.499, p = 0.021, 95% CI [0.10, 0.76]), whereas number of fixations negatively correlated with accuracy across intensities. Moderate intensity promotes optimal visual‑attentional control and shooting accuracy in elite female athletes, whereas high intensity disrupts fixation stability and increases cognitive load.
Position‑specific adaptations in visual strategy were also observed. These findings support the use of intensity‑ and position‑based visual training to enhance shooting performance under realistic game conditions.
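Confidence intervals for a Pearson correlation, such as the r = 0.499, 95% CI [0.10, 0.76] reported above, follow from the Fisher z-transformation. The sketch below assumes the full sample of 22 players contributed to that correlation (an assumption; the abstract does not state the pairwise n):

```python
import math

def pearson_ci(r: float, n: int, z_crit: float = 1.96) -> tuple[float, float]:
    """95% CI for a Pearson r via the Fisher z-transformation."""
    z = math.atanh(r)               # map r onto an approximately normal scale
    se = 1 / math.sqrt(n - 3)       # standard error of z
    return math.tanh(z - z_crit * se), math.tanh(z + z_crit * se)

# n = 22 is an assumption based on the team sample size
lo, hi = pearson_ci(0.499, 22)
print(f"r = 0.499, 95% CI [{lo:.2f}, {hi:.2f}]")
```

Under that assumption the computation reproduces the interval reported in the abstract to two decimal places.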
This study examines the arcade memory of the first-generation players in China through semi-structured interviews. Using Menke's framework of media nostalgia as an analytical tool, this study found that the narrative of arcade memory is formed through game text, material technology and spatial structure: Players established a visualized memory mode by revisiting the game world and recalling the material sensations generated through human-machine interaction; the coexistence and interactions among multiple actants shape the arcade hall as a landscape of public memory and spatial affect. The memory of arcade games emerges as a geographical and interest-based collective memory centered on weakly connected neighborhood life and generational identity. By analyzing the memory practices surrounding arcade games, this study develops a theoretical framework for understanding game nostalgia, thereby contributing to the broader fields of game studies and memory studies.
Despite regulatory and conceptual progress in the field of inclusion, people with disabilities continue to face significant barriers which limit their full participation in mainstream sports and affect key social factors of quality of life. Within this context, the Mixed Ability (MA) model represents an alternative where people with disabilities can be included into grassroots sports settings without the need to modify rules or implement classification or identification systems. This study aimed to analyse perceived changes in social factors of quality of life among MA basketball players, as well as to identify current needs and challenges related to their participation. A qualitative methodology was employed, using individual semi-structured interviews with a convenience sample. Eleven players from an MA basketball club took part, distributed across two men's teams and one women's team. Three categories were identified: "interpersonal relationships", "social inclusion and rights", and "barriers, future, and proposed changes". Participation in MA basketball teams has fostered social connection, greater group cohesion, a natural support network, a sense of belonging, and a perception of equality between people with and without disabilities. However, relevant obstacles were also identified, such as the short duration and infrequent occurrence of training sessions and competitions, the need to rejuvenate the team, and issues related to communication and lack of awareness. The findings suggest that MA may contribute to the development of interpersonal relationships, social inclusion, and the recognition of the rights of people with disabilities. At the same time, they point to the importance of addressing structural, social, and organizational conditions in order to support more inclusive and sustainable sporting environments.
In this sense, inclusion emerges not as a fixed outcome, but as a dynamic and ongoing process shaped by contextual factors and the quality of relationships within sport.
Flywheel training has been shown to enhance lower-limb power; however, evidence comparing unilateral and bilateral complex training, especially among elite volleyball players, remains limited. Twenty-four male college volleyball players were randomly divided into three groups: unilateral flywheel complex training (UFT, n=8), bilateral flywheel complex training (BFT, n=8), or a control group (CON, n=8) that continued their regular technical training only. The intervention lasted 8 weeks, with 2 sessions each week. Performance measures included linear sprint times (5 m, 10 m, 30 m), change-of-direction (COD) skills (T-test, 5-0-5, volleyball-specific agility), and movement endurance (seven T-tests and repeated 30 m runs). A two-way repeated-measures ANOVA was performed to identify differences between groups and across time points (pre-test vs. post-test). Significant group × time interactions were observed across sprint, COD, and repeated-movement tests (p < 0.05). UFT demonstrated greater improvements than both BFT and CON in short-distance sprint (5-10 m), COD performance, and repeated-movement measures. BFT also improved performance relative to CON in selected outcomes, particularly the 30 m sprint and repeated tests, although improvements were generally smaller than those observed in UFT. Eight weeks of unilateral flywheel-based complex training resulted in the greatest improvements in short sprints, COD, and movement endurance compared to bilateral training and the control group. Although bilateral training also enhanced performance, the gains were comparatively smaller. These findings support the effectiveness of UFT as a strategy for improving short-distance acceleration and multidirectional movement performance relevant to volleyball match play.
This study employs the expert-novice paradigm to examine the visual search characteristics of expert players through basketball-specific decision-making tasks. A total of 48 female college students (24 basketball players and 24 regular college students) were selected as participants. First-person video stimuli and an eye tracker were used to explore the behavioral indicators and eye movement characteristics of the basketball players in sports decision-making tasks. (1) Compared with novice players, expert players demonstrated significantly shorter reaction times, higher decision accuracy, and greater decision-making confidence in the task. (2) Expert players exhibited shorter fixation durations and fewer fixation counts. There was a significant difference in fixation duration and frequency between novice and expert players in the relevant and irrelevant areas of interest, with the expert group showing longer fixation durations and higher fixation frequencies in the relevant areas of interest. The scanning trajectory of the expert group was simpler and more centralized. Expert players demonstrate faster reactions and higher accuracy in sport-related decision-making tasks, exhibiting visual search advantages: shorter fixation durations, fewer fixation counts, and more focused attention allocation in relevant areas of interest.
To evaluate changes in position distribution and capitellar osteochondritis dissecans detected at annual screening before and after the 2019 introduction of pitch limit guidelines, focusing on dual-role players. Retrospective observational study using an interrupted time series design. Annual ultrasonography screening was performed from 2017 to 2023 in 1576 players aged 10-12 years. Players were classified as pitcher, catcher, or dual-role. Position distribution was analyzed using interrupted time series binomial models; chi-square tests were used for supplementary comparisons. Osteochondritis dissecans was analyzed using interrupted time series Poisson regression with an offset for the annual number screened and 95% confidence intervals. Worst-case sensitivity analyses assumed all missing confirmatory examinations were osteochondritis dissecans; analyses were repeated in males only. Catcher proportion decreased from 21.2% to 15.6% in a supplementary pre-post comparison (p = 0.011). Dual-role players accounted for 18.1% pre-guideline and 22.3% post-guideline and increased from 17.4% in 2017 to 26.1% in 2023 (p = 0.040). Overall osteochondritis dissecans was 3.6% pre-guideline and 1.8% post-guideline (p = 0.055); among dual-role players, 6.7% and 1.9% (p = 0.083). In Poisson interrupted time series using confirmed cases only, no significant level change was detected (incidence rate ratio 0.53, 95% confidence interval 0.09-3.00). Under the worst-case assumption, guideline introduction was associated with an immediate decrease in overall osteochondritis dissecans (incidence rate ratio 0.15, 95% confidence interval 0.035-0.64). Findings were consistent in males only. Guideline introduction coincided with fewer catchers and more dual-role players, while osteochondritis dissecans detected at annual screening did not show evidence of an increase among dual-role players.
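An incidence rate ratio with a 95% confidence interval, as reported above, is computed on the log scale. The counts below are hypothetical, and this sketch covers only the simple two-period comparison; the study's full interrupted time series Poisson model additionally includes trend terms and a screening-count offset:

```python
import math

def rate_ratio_ci(a: int, n1: int, b: int, n2: int, z: float = 1.96):
    """IRR of group 1 vs group 2 with a log-scale 95% CI (Wald method)."""
    irr = (a / n1) / (b / n2)
    se = math.sqrt(1 / a + 1 / b)   # SE of log(IRR) for Poisson counts
    lo = math.exp(math.log(irr) - z * se)
    hi = math.exp(math.log(irr) + z * se)
    return irr, lo, hi

# Hypothetical: 4 cases among 500 screened post-guideline
# vs 10 cases among 400 screened pre-guideline
irr, lo, hi = rate_ratio_ci(4, 500, 10, 400)
print(f"IRR = {irr:.2f}, 95% CI {lo:.2f}-{hi:.2f}")
```

As in the abstract's confirmed-cases analysis, a CI that spans 1 (here roughly 0.10 to 1.02) means no significant level change can be claimed despite a point estimate well below 1.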
An increased carrying angle (CA) on the throwing side has been observed in professional pitchers. Elbow alignment is known to be a factor in the strain on the ulnar nerve. Ulnar nerve instability (UNI), in which the nerve snaps forward over the medial epicondyle during elbow flexion, is reportedly more common in pitchers than in other players. However, no studies have examined the association between elbow alignment and UNI severity in baseball players. This study assessed the correlation between the CA and UNI severity in high school baseball pitchers. A total of 106 high school baseball pitchers who underwent medical checkups during the off-season in 2023 were examined. Ultrasound examinations were conducted to assess the presence or absence of UNI and the medial joint space under rest and gravity stress on both sides of the elbow. The participants were divided into the following 3 groups based on the ultrasonographic findings of UNI: no instability (Type N), subluxation (Type S), and dislocation (Type D). Clinical and physical examinations were performed, which included measurement of CA, grip strength, key pinch strength, and a check for ulnar nerve symptoms bilaterally. The distribution of UNI types on the throwing side was 37%, 31%, and 32% for Types N, S, and D, respectively. A similar distribution was found on the nonthrowing side: 36% for Type N, 32% for Type S, and 32% for Type D. Overall, there was no statistically significant difference in the prevalence of these types between the 2 sides. The CA of the throwing arm did not differ significantly between the 3 UNI groups: Type N (13.6° ± 0.4°), Type S (12.7° ± 0.3°), and Type D (12.0° ± 0.3°) (P = .25). On the throwing side, the different UNI types did not show significant differences in the degree of ulnar collateral ligament laxity or in other clinical and physical findings.
This study found no significant difference in bony alignment, as indicated by the CA, between the degrees of UNI among high school baseball pitchers. When categorized by UNI type, no significant differences were observed in the throwing side with respect to ulnar collateral ligament laxity or other clinical and physical findings.
This study investigated longitudinal changes in strength- and power-related performance across an entire competitive season in elite soccer players using a frequent and systematic monitoring approach. Twenty-three professional players competing in the Brazilian first division were monitored over seven consecutive months. Athletes performed two to three resistance training (RT) sessions per week, with one weekly "control" session used to adjust training loads and assess neuromuscular status through bar-derived measures obtained in the half-squat exercise. Relative power (RP), relative strength (RS), and estimated one-repetition maximum (1RM) were recorded across 28 training sessions using the load associated with maximum power output. Weekly variations were analyzed using a 4-week rolling average, and pre-, mid-, and post-season periods were compared using repeated-measures ANOVA. Across the season, RP, RS, and 1RM exhibited a gradual and consistent positive trend. While only isolated meaningful changes were detected when individual sessions were compared with rolling averages, significant improvements were observed at the post-season time point compared with both pre- and mid-season values (p < 0.05). Notably, despite a modest ~5% increase from pre- to post-testing, absolute training loads did not change significantly across the season, indicating that performance gains were achieved without meaningful increases in training load magnitude. These findings demonstrate that a power-oriented RT model, supported by continuous monitoring and load adjustments, can effectively preserve and enhance strength and power throughout a competitive soccer season. This approach appears to mitigate commonly reported in-season declines in neuromuscular performance and offers a practical strategy for managing RT under congested competitive schedules.
Sudden cardiac arrest (SCA) remains the leading cause of sport-related death worldwide, yet survival outcomes in Africa are critically poor. We conducted a formative assessment of the knowledge, attitudes and practices of professional football stakeholders in Cameroon as a model for identifying system-level gaps in SCA preparedness in low-resource sport settings. The cross-sectional survey was conducted during the 2024/2025 season across five pre-competition medical centres in Cameroon. Eligible participants were players, referees, coaches and physiotherapists from Elite 1, Elite 2 and the Female Super League who had two or more seasons of professional experience. A validated questionnaire, adapted from international guidelines, was used to assess sociodemographic data, knowledge, attitudes and practices. Data were analysed descriptively with subgroup comparisons using SPSS V.20. A total of 745 participants were enrolled: 536 players, 90 referees, 84 coaches and 35 physiotherapists. Most (66.4%) demonstrated poor knowledge of SCA, only 12% reported confidence in cardiopulmonary resuscitation (CPR) or automated external defibrillator (AED) use and referees showed the lowest preparedness scores despite their critical first-responder role. While one-third expressed positive attitudes, self-reported barriers to providing CPR included fear of harm (66.3%) and legal concerns (78.3%). Only 33.4% reported good practices, with significant gaps in AED access, CPR training and emergency action plan awareness. This assessment identified critical gaps in SCA preparedness in Cameroonian football that likely extend across Africa and similar low-resource settings. Compulsory CPR/AED training, AED availability and enforceable emergency action plans are urgently needed to align African football with international best practice and strengthen the survival chain.