Hitting a baseball is often described as the most difficult thing to do in sports. A key aptitude of a good hitter is the ability to determine which pitch is coming. This rapid decision requires the batter to make a judgment in a fraction of a second based largely on the trajectory and spin of the ball. When does this decision occur relative to the ball’s trajectory, and is it possible to identify neural correlates that represent how the decision evolves over a split second? Using single-trial analysis of electroencephalography (EEG), we address this question within the context of subjects discriminating three types of pitches (fastball, curveball, slider) based on pitch trajectories. We find clear neural signatures of pitch classification and, using signal detection theory, we identify the times of discrimination on a trial-to-trial basis. Based on these neural signatures, we estimate neural discrimination distributions as a function of the distance of the ball from the plate. We find that all three pitches yield unique distributions, namely in the timing of the discriminating neural signatures relative to the position of the ball in its trajectory. For instance, fastballs are discriminated at the earliest points in their trajectory, relative to the other two pitches, which is consistent with the need for some constant time to generate and execute the motor plan for the swing (or inhibition of the swing). We also find that incorrect discrimination of a pitch (errors) yields neural sources in Brodmann Area 10, which has been implicated in prospective memory, recall, and task difficulty. In summary, we show that single-trial analysis of EEG yields informative distributions of the relative point in a baseball’s trajectory at which the batter decides which pitch is coming.
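The signal-detection step described above can be illustrated with a minimal sketch: computing a receiver operating characteristic (ROC) area, or AUC, over single-trial discriminant amplitudes to test whether a given time window separates one pitch class from the others. The data below are simulated for illustration, not taken from the study, and the AUC is computed via its Mann-Whitney formulation.

```python
import random

def auc(pos, neg):
    """Area under the ROC curve: the probability that a randomly chosen
    positive-class score exceeds a randomly chosen negative-class score
    (Mann-Whitney formulation, ties counted as half)."""
    wins = sum((p > n) + 0.5 * (p == n) for p in pos for n in neg)
    return wins / (len(pos) * len(neg))

random.seed(0)
# Hypothetical single-trial discriminant amplitudes in one time window:
# fastball trials are assumed to separate from the other pitches.
fastball = [random.gauss(1.0, 1.0) for _ in range(50)]
other = [random.gauss(0.0, 1.0) for _ in range(50)]
print(auc(fastball, other))  # should be well above the 0.5 chance level
```

In a sliding-window analysis of this kind, the window at which AUC first exceeds a significance threshold gives a trial-population estimate of discrimination time, which can then be mapped onto the ball's position in its trajectory.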
To determine whether shoulder and elbow kinematics, pitching velocity and accuracy, and pain change during a simulated baseball game in adolescent pitchers.
Baseball pitching imposes a dangerous valgus load on the elbow that puts the joint at severe risk for injury. The goal of this study was to develop a musculoskeletal modeling approach to enable evaluation of muscle-tendon contributions to mitigating elbow injury risk in pitching. We implemented a forward dynamic simulation framework that used a scaled biomechanical model to reproduce a pitching motion recorded from a high school pitcher. The medial elbow muscles generated substantial, protective, varus elbow moments in our simulations. For our subject, the triceps generated large varus moments at the time of peak valgus loading; varus moments generated by the flexor digitorum superficialis were larger, but occurred later in the motion. Increasing muscle-tendon force output, either by augmenting parameters associated with strength and power or by increasing activation levels, decreased the load on the ulnar collateral ligament. Published methods have not previously quantified the biomechanics of elbow muscles during pitching. This simulation study represents a critical advancement in the study of baseball pitching and highlights the utility of simulation techniques in the study of this difficult problem.
Recently, lumbopelvic control has been linked to pitching performance, kinematics, and loading; however, poor lumbopelvic control has not been prospectively investigated as a risk factor for injuries in baseball pitchers.
Effects of Three Recovery Protocols on Range of Motion, Heart Rate, Rating of Perceived Exertion, and Blood Lactate in Baseball Pitchers During a Simulated Game
- Journal of Strength and Conditioning Research / National Strength & Conditioning Association
- Published over 4 years ago
Baseball pitching has been described as an anaerobic activity from a bioenergetics standpoint with short bouts of recovery. Depending on the physical conditioning and muscle fiber composition of the pitcher as well as the number of pitches thrown per inning and per game, there is the possibility of pitchers fatiguing during a game, which could lead to a decrease in pitching performance. Therefore, the purpose of this study was to evaluate the effects of 3 recovery protocols: passive recovery (PR), active recovery (AR), and electrical muscle stimulation (EMS) on range of motion (ROM), heart rate (HR), rating of perceived exertion (RPE), and blood lactate concentration in baseball pitchers during a simulated game. Twenty-one Division I intercollegiate baseball pitchers (age = 20.4 ± 1.4 yr, ht = 185.9 ± 8.4 cm, wt = 86.5 ± 8.9 kg, %BF = 11.2 ± 2.6) volunteered to pitch 3 simulated 5-inning games, with a maximum of 70 fastballs thrown per game while wearing an HR monitor. ROM was measured pre-, post-, and 24 hr post-pitching for shoulder internal and external rotation at 90° and elbow flexion and extension. HR was recorded after each pitch and after every 30 sec of the 6-minute recovery period. RPE was recorded after the last pitch of each inning and after completing each 6-minute recovery period. Immediately after throwing the last pitch of each inning, post-pitching blood lactate concentration (PPLa-) was measured. At the end of the 6-minute recovery period, before the next inning started, post-recovery blood lactate concentration (PRLa-) was measured. Pitchers were instructed to throw each pitch at or above 95% of their best pitched fastball. This was enforced to ensure that each pitcher was throwing close to maximal effort for all 3 simulated games. All data presented represent group means.
Results revealed that the method of recovery protocol did not significantly influence ROM (p > 0.05); however, it did significantly influence blood lactate concentration (p < 0.001), HR (p < 0.001), and RPE (p = 0.01). Blood lactate concentration significantly decreased from post-pitching to post-recovery in the EMS recovery condition (p < 0.001), but did not change for either the active (p = 0.04) or the passive (p = 0.684) recovery conditions. RPE decreased from post-pitching to post-recovery in both the passive and EMS recovery methods (p < 0.001), but did not decrease for active recovery (p = 0.067). HR decreased for all conditions from post-pitching to post-recovery (p < 0.001). The use of EMS was the most effective method at reducing blood lactate concentration after 6 minutes of recovery during a simulated game (controlled setting). Although EMS significantly reduced blood lactate concentrations post-recovery, blood lactate concentrations post-pitching in the simulated games were never high enough to cause skeletal muscle fatigue and decrease pitching velocity. If a pitcher were to throw more than 14 pitches per inning, throw more total pitches than normal per game, and have blood lactate concentrations increase higher than in the simulated games in this study, the EMS recovery protocol may be beneficial to pitching performance by aiding recovery. This could potentially reduce some injuries associated with skeletal muscle fatigue during pitching, may allow a pitcher to throw more pitches per game, and may reduce the number of days between pitching appearances.
Weighted-ball throwing programs are commonly used in training baseball pitchers to increase ball velocity. The purpose of this study was to compare kinematics and kinetics among weighted-ball exercises with values from standard pitching (ie, pitching standard 5-oz baseballs from a mound).
Pitching biomechanics are associated with performance and risk of injury in baseball. Previous studies have identified biomechanical differences between youth and adult pitchers but have not investigated changes within individual young pitchers as they mature.
A glenohumeral internal rotation (IR) deficit or a total rotational motion (IR plus external rotation [ER]) deficit in the throwing shoulder compared with the nonthrowing shoulder has been shown to increase the risk of shoulder and elbow injuries. After a pitching session, both IR and total rotational motion deficits have been shown to occur naturally for an extended period of time in asymptomatic pitchers, but it is unclear how to best control these deficits between pitching sessions.
- Journal of Strength and Conditioning Research / National Strength & Conditioning Association
- Published over 2 years ago
The purpose of this study was to examine the changes in resting heart rate variability (HRV) across a five-day pitching rotation schedule among professional baseball starting pitchers. HRV data were collected daily among eight Single-A level professional baseball starting pitchers (mean ± SD, age = 21.9 ± 1.3 yrs; height = 185.4 ± 3.6 cm; weight = 85.2 ± 7.5 kg) throughout the entire baseball season, with each participant lying quietly supine for 10 minutes. HRV was quantified by calculating the natural log of the square root of the mean sum of the squared differences between successive intervals (lnRMSSD) during the middle five minutes of each R-R series data file. A split-plot repeated measures ANOVA was used to examine the influence of pitching rotation day on resting lnRMSSD. A statistically significant main effect of rotation day was identified (F(4,706) = 3.139, p = 0.029). Follow-up pairwise analyses indicated that resting lnRMSSD on Day 2 was significantly (p ≤ 0.05) lower than on all other rotation days. In addition, a statistically significant main effect of pitcher was also identified (F(7,706) = 83.388, p < 0.001). These results suggest that professional baseball starting pitchers display altered autonomic nervous system function one day after completing a normally scheduled start, as Day 2 resting HRV was significantly lower than on all other rotation days. In addition, the season average resting lnRMSSD varied among participants, implying that single-subject analysis of resting measures of HRV may be more appropriate when monitoring cumulative workload among this population of athletes.
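The lnRMSSD metric defined above (the natural log of the root mean square of successive differences between adjacent R-R intervals) can be sketched in a few lines of Python. The R-R series below is hypothetical, for illustration only; in practice the values would come from the middle five minutes of a recorded R-R data file.

```python
import math

def ln_rmssd(rr_intervals_ms):
    """lnRMSSD: natural log of the root mean square of successive
    differences between adjacent R-R intervals (in milliseconds)."""
    diffs = [b - a for a, b in zip(rr_intervals_ms, rr_intervals_ms[1:])]
    rmssd = math.sqrt(sum(d * d for d in diffs) / len(diffs))
    return math.log(rmssd)

# Hypothetical R-R series (ms), not data from the study
rr = [812, 790, 845, 830, 801, 818, 795, 840]
print(round(ln_rmssd(rr), 3))
```

Because lnRMSSD is computed per athlete per day, the single-subject monitoring the authors recommend amounts to tracking each pitcher's daily value against his own season baseline rather than against a group mean.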