1. Introduction
One focus of sprint research is the interpretation of the force-velocity-power (FVP) profile for guiding individualized program development [1]. The common FVP profile consists of theoretical horizontal force output per body mass (HZT F0; N·kg−1), theoretical maximal running velocity (VMAX; m·s−1), maximal mechanical horizontal power output per body mass (HZT P0; W·kg−1), the slope of the force–velocity curve (SFV), and the ratio of decreasing force with increasing velocity (DRF). Customizing training to the athlete's specific FVP profile may allow coaches to develop optimal individual training interventions [1,2,3]. The benefits of FVP profiling include addressing variability in playing level [4,5], sport position [6], and age [7], as well as the ability to take into account external variables, such as time of season [8]. Team sports, such as rugby [3,9], soccer [4,7], ice hockey [10], and netball [11], as well as individual athletic activities, such as ballet [12] and gymnastics [13], have utilized this approach.
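Because the FVP variables above are tied together by a linear force–velocity relationship, they can be derived from very little fitted data. The sketch below is illustrative only: the function names, the mono-exponential velocity model, and the neglect of air resistance are assumptions for this example, not the computation used by any particular device.

```python
import math

def fvp_profile(vmax, tau):
    """Illustrative sketch: derive per-body-mass FVP variables from a
    mono-exponential sprint velocity model, v(t) = vmax * (1 - exp(-t/tau)).
    Air resistance is ignored for simplicity."""
    f0 = vmax / tau        # HZT F0 (N/kg): modelled acceleration at t = 0
    p0 = f0 * vmax / 4.0   # HZT P0 (W/kg): apex of the power-velocity parabola
    s_fv = -f0 / vmax      # SFV: slope of the linear force-velocity curve
    # DRF additionally requires resultant (not only horizontal) force, so it
    # is omitted from this simplified sketch.
    return {"HZT_F0": f0, "V_MAX": vmax, "HZT_P0": p0, "S_FV": s_fv}

def velocity(t, vmax, tau):
    """Modelled instantaneous velocity (m/s) at time t (s)."""
    return vmax * (1.0 - math.exp(-t / tau))
```

For example, a sprinter fitted with VMAX = 9.0 m·s−1 and a time constant of 1.2 s would have HZT F0 = 7.5 N·kg−1 and HZT P0 = 16.875 W·kg−1 under this model.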
An important concern with FVP profiling is the ability to obtain reliable data for within- and between-day testing. Inconsistencies have been reported in jumping [14] and short (<10 m) sprinting performance [15], which raises concern about the efficacy of FVP profiling [16]. Therefore, equally important to the FVP profile are the training approach and the accuracy of measurement. The technology for monitoring FVP continues to improve in terms of efficiency, allowing coaches to assess and evaluate their players more quickly and effectively [17]. Currently, coaches and sports scientists have access to a variety of kinematic monitoring devices for sprinting performance, including high-speed cameras [18,19], iPhone apps [20], radar guns (i.e., Stalker II) [15], tether motor devices (i.e., 1080 Sprint) [21], inertial measurement units (IMUs) [22], global positioning systems (GPS; i.e., STATSports VIPER) [23,24], and laser displacement measuring devices [25,26,27,28,29,30,31]. Although high-speed cameras and radar technology have been shown to be valid methodologies for sprint analysis, these systems are typically expensive and time consuming, requiring further analysis beyond the raw data captured. Contemporary systems, such as laser displacement measuring devices, provide immediate feedback and are easier to use during regular training sessions [21].
Studies on laser displacement measuring devices report intraclass correlation coefficients (ICCs) ranging from moderate to good (0.64–0.83) [25] and good to excellent (ICC > 0.77) [27]. Recently, a commercially available device from Musclelab, the Musclelab Laser Speed (MLS), was developed as a feasible means of assessing the FVP profile during sprinting. Van den Tillaar et al. [32] measured the validity of the MLS, combined with an IMU (MLS + IMU), against force plates for step length, step velocity, and step frequency in trained sprinters. The authors tested 14 trained sprinters in the 50-m sprint using the MLS to record continuous distance over time. Significant positive correlations between the MLS + IMU and force plates were reported for step velocity (r = 0.69) and step frequency (r = 0.53). In addition to kinematic data, the MLS calculates several of the values for the FVP profile, which have not been studied. Therefore, the first objective of this study is to examine the level of agreement of the MLS-derived FVP profile with a previously validated field device, the MySprint (MySp) app [20], during a maximal 30-m sprint in a sample of Division II Collegiate athletes. Equivalence testing [33], such as the comparison of the MLS and the MySp app, can provide coaches with the information needed to compare instruments. The rationale for comparing commercial devices across the same movement exercise follows that used for other field devices, such as linear transducers [34,35]. This comparison provides coaches and practitioners with a greater understanding of which devices are practical to guide programming and training [34]. The results from this study can be used to help monitor training and provide direct feedback to the athlete.
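To make the notion of equivalence testing concrete, the sketch below applies a two one-sided tests (TOST) procedure to paired device differences. The function name, the large-sample normal approximation, and the equivalence margin are illustrative assumptions, not the analysis reported in this study.

```python
import math
from statistics import mean, stdev

def tost_paired(diffs, margin, _norm_cdf=lambda z: 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))):
    """Illustrative TOST equivalence test on paired device differences.
    Uses a normal approximation in place of the t distribution, so it is
    a sketch rather than a publication-grade analysis.
    Returns (mean difference, p-value); equivalence is claimed if p < alpha."""
    n = len(diffs)
    d_bar = mean(diffs)
    se = stdev(diffs) / math.sqrt(n)          # standard error of the mean difference
    p_lower = 1.0 - _norm_cdf((d_bar + margin) / se)  # H0: mean diff <= -margin
    p_upper = _norm_cdf((d_bar - margin) / se)        # H0: mean diff >= +margin
    return d_bar, max(p_lower, p_upper)
```

With small between-device differences and a margin chosen from a practically meaningful threshold (e.g., the smallest worthwhile change), rejecting both one-sided nulls supports treating the two instruments as interchangeable for that metric.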
The second objective is to report the minimal detectable change (MDC) data for the MLS. Minimal detectable change data are critical for practitioners to confidently identify actual change instead of typical within- and between-day variability [36]. Few studies have reported MDC for the FVP profile [9,36,37,38]; however, established MDC data are lacking, particularly in the case of laser technology. Given that sport lasers report excellent validity and reliability, we hypothesize that the MLS device will have a good level of agreement with the validated MySp app [20] and a high degree of reliability in terms of FVP profiling.
4. Discussion
This study reported the level of agreement, reliability, and MDC of the MLS device during maximal 30-m sprints in a sample of Division II Collegiate athletes. The major finding is that the MLS displayed a moderate to excellent level of agreement and reliability for nine of ten variables, the exception being HZT F0. A second finding is that there was a significant proportional bias for VMAX when comparing the MLS to the MySp app, with the MySp app detecting faster velocities (mean difference = 0.31 m·s−1). Finally, significant differences occurred in four split times (15–30 m), HZT F0, and SFV between the two devices, with the MySp app having faster times, a lower HZT F0, and a more velocity-dominant slope.
The low to moderate ICC and rP for HZT F0 and the moderate ICC and rP for SFV indicate a partial rejection of our primary hypothesis that the MLS would agree with the MySp app for all components of the FVP profile. Our second hypothesis, that the MLS is reliable for within- and between-day testing for split times and the FVP profile, was supported for nine of ten variables. Despite the proportional bias for VMAX, our finding is consistent with a previous study showing the mathematical model for the MySp app to have a 0.32 m·s−1 bias compared to force plate analysis [47].
Our ICCs were both higher than [25,28] and similar to [27,31] those of previous studies on the reliability of sports lasers. Within- and between-day ICCs and CVs across all split times were above 0.83 and less than 5%, respectively. Both HZT P0 and VMAX within- and between-day data showed ICCs above 0.92 and CVs less than 5%. Notably, HZT F0 showed acceptable internal consistency, with CVs less than 5% for both within- (4.7% and 3.8%) and between-day (2.7%) sessions. Pearson product-moment correlations and absolute-agreement ICCs for all split times, HZT P0, VMAX, and SFV ranged from 0.60 to 0.98 and from 0.68 to 0.95, respectively. Similar to our reliability data, the MLS did not agree with the MySp app for HZT F0, with rP and ICC at 0.44 and 0.55, respectively. Bland–Altman plots showed no fixed or proportional bias, except for a proportional bias for VMAX (MySp > MLS). The absence of bias is supported by all split times, HZT F0, HZT P0, and SFV having zero within the 95% CI for both the slope and the intercept. VMAX did not have zero in the 95% CI for the slope (−0.2 to −0.02) but did have zero for the intercept (−0.15 to 1.29), indicating a proportional, but not fixed, bias.
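The fixed- and proportional-bias checks described above can be sketched as follows. `bland_altman_bias` is a hypothetical helper for illustration; the study's actual analysis also computed 95% confidence intervals for the slope and intercept, which are omitted here.

```python
from statistics import mean

def bland_altman_bias(x, y):
    """Bland-Altman bias sketch for two devices measuring the same trials.
    Fixed bias: mean of pairwise differences.
    Proportional bias: OLS slope of differences regressed on pairwise means."""
    diffs = [a - b for a, b in zip(x, y)]
    means = [(a + b) / 2.0 for a, b in zip(x, y)]
    fixed_bias = mean(diffs)
    m_bar, d_bar = mean(means), mean(diffs)
    slope = (sum((m - m_bar) * (d - d_bar) for m, d in zip(means, diffs))
             / sum((m - m_bar) ** 2 for m in means))
    intercept = d_bar - slope * m_bar
    return fixed_bias, slope, intercept
```

A nonzero slope (as found here for VMAX) means the between-device difference grows with the magnitude of the measurement, so a single correction offset cannot reconcile the two instruments.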
MDC is defined as the smallest change in a variable that reflects a true change in performance [48]. MDC is important, particularly when monitoring athletes over several trials, as between-trial variation may suggest a change that has not exceeded the threshold error [36,49]. Studies by Edwards et al. [36] and Ferro et al. [37], using radar and laser technology, respectively, reported MDCs at the 90% confidence interval. Our split time and FVP profile MDC data were similar to those of Edwards et al. [36] when comparing the average of three trials. The smallest worthwhile change (SWC) is defined as the smallest change in a metric that is likely of practical importance [50]. In four splits (15–30 m) and VMAX, the CV% was approximately equal to the SWC%, indicating that the MLS had an "okay" sensitivity rating in terms of detecting real change [51]. In contrast, the CV% for HZT F0, HZT P0, and SFV was greater than the SWC%. For Day 1 and Day 2, respectively, HZT F0 had CV (4.7%) > SWC (1.6%) and 3.8% > 1.2%, HZT P0 had 3.7% > 2.5% and 3.5% > 2.2%, and SFV had 7.0% > 1.5% and 4.0% > 1.1%. Therefore, all three measurements had "marginal to poor" within-day sensitivity [51,52].
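The MDC and SWC values discussed above follow from standard formulas: the standard error of measurement is SEM = SD·√(1 − ICC), MDC = SEM·z·√2, and SWC is commonly taken as 0.2 times the between-subject SD. A minimal sketch, assuming z = 1.645 for the 90% confidence level reported by the studies cited (z = 1.96 would yield MDC95):

```python
import math

def sem(sd, icc):
    """Standard error of measurement from between-subject SD and test-retest ICC."""
    return sd * math.sqrt(1.0 - icc)

def mdc(sd, icc, z=1.645):
    """Minimal detectable change; default z = 1.645 gives the 90% confidence
    level, while z = 1.96 would give MDC95."""
    return sem(sd, icc) * z * math.sqrt(2.0)

def swc(sd, factor=0.2):
    """Smallest worthwhile change as a fraction of between-subject SD
    (0.2 is a common convention)."""
    return factor * sd
```

For example, a split time with SD = 0.1 s and ICC = 0.91 yields SEM = 0.03 s and MDC90 ≈ 0.07 s, so a practitioner should only treat between-session changes larger than roughly 0.07 s as real.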
Coaches find FVP profiling useful, as it allows for a more individualized training approach, and if the correct data collection methodology is used, FVP can provide accurate monitoring of progression [1]. However, the value of FVP profiling is subject to debate [14,16]. Studies report HZT F0 to have moderate reliability [15,17,36,53], specifically for sprints < 10 m [15,36]. Our data are consistent with this finding for HZT F0, which achieved moderate ICCs for within- (0.71 and 0.69) and between-day (0.77) testing. Similar inconsistencies for HZT F0 have been reported when monitoring jumping, with large variations between trials noted [14].
Comparisons between the MLS and MySp app can be made in terms of their practical uses in real-world settings, considering that both are accurate and valid sprint testing devices. The first advantage of the MLS is that it provides immediate feedback to the coach and sprinter, allowing for better coaching instruction during the training session, whereas the MySp app uses video technology requiring further analysis post-sprint. The second is that the MLS accommodates a variety of settings, such as a field or gymnasium, with minimal setup time, while the MySp app requires a 15- to 20-min setup (with two people) to ensure that all of the distances and parallaxes are accurate and that the viewing area is clear. This kind of setup can be challenging for short-staffed strength and conditioning programs that might have only one coach per training group. A final advantage is that the MLS can be combined with other measurement devices (e.g., IMU, contact grid) to access more sophisticated data, such as step velocity, step length, and contact time [32]. However, the MLS costs approximately $6000, whereas the MySp app costs $10. Further, the MLS does not display the entire FVP profile, as the MySp app does. In this regard, the authors of [27] suggest that SFV and DRF are the most important factors in the FVP profile, which the MLS does not record but the MySp app does. Nevertheless, according to [27], the use of laser technology with trained sprinters resulted in poor reliability for both SFV and DRF.
This study has certain limitations. First, the study lacked a research-grade criterion measure, such as a high-speed camera or force plate system, which would provide a better means to validate the MLS device. The rationale for using the MySp app was that the investigators wanted to compare two easily accessible devices for practitioners. A suggestion for future research would be a follow-up validation study that compares the MLS to a criterion measure. Second, the significant differences across the four split times and HZT F0, and the proportional bias for VMAX, between the MLS and MySp app could be due to the sampling rate of the iPad camera (i.e., 120 fps), which is equivalent to a high-speed smartphone [19]. Romero-Franco et al. [20] suggest using an iPhone at a sampling rate of 240 fps when utilizing the MySp app. A camera with a higher sampling rate might have reduced some of the statistical differences across the split times, and our bias, by providing a more accurate identification of the sprinter's start and of the hips as they crossed the specified marking poles. Notably, previous literature has used 120 fps to assess sprint performance [54] and treadmill running [55]; however, a practical suggestion for coaches is to have an iPhone, as opposed to an iPad, readily available when using the MySp app. Third, the inherent limitations of testing a sample of novice sprinters may have caused variation in our HZT F0 data. A contributor to the unwanted variation in HZT F0, particularly in the early phase of the sprint (<10 m), is that the distance from the lumbar point (where the laser is aimed) to the sprinter's center of mass decreases as the sprinter rises to an upright posture [30]. This notion is supported by Talukdar et al. [56], who reported HZT F0 to have the highest CV in their data set of young female team sport athletes (nevertheless, that study reported acceptable overall reliability for HZT F0 [ICC = 0.89, CI 0.77–0.94] using radar). One reason for this variation may be that novice sprinters rise too early and are inconsistent at the start of the sprint and, thus, are not consistent in applying force at the start [56]. The start position (two-point vs. three-point) may be a source of error for HZT F0 due to different athletes being accustomed to different starts. The current study used a two-point stance, which is similar to recent research assessing 30-m sprint performance [57]. Nonetheless, teaching athletes to have more consistency in their start position and sprinting technique (i.e., avoiding rising too fast) in the first 10 m may reduce this variation. Therefore, caution should be used with HZT F0 data when evaluating progression and training prescription.