Review

Knowledge in Motion: A Comprehensive Review of Evidence-Based Human Kinetics

Sport, Health & Exercise Research Unit (SHERU), Polytechnic Institute of Castelo Branco, 6000-266 Castelo Branco, Portugal
*
Author to whom correspondence should be addressed.
Int. J. Environ. Res. Public Health 2023, 20(11), 6020; https://doi.org/10.3390/ijerph20116020
Submission received: 28 February 2023 / Revised: 21 May 2023 / Accepted: 30 May 2023 / Published: 31 May 2023
(This article belongs to the Special Issue Exercise and Physical Activity in Health Promotion)

Abstract
This comprehensive review examines critical aspects of evidence-based human kinetics, focusing on bridging the gap between scientific evidence and practical implementation. To bridge this gap, the development of tailored education and training programs is essential, providing practitioners with the expertise and skills to effectively apply evidence-based programs and interventions. The effectiveness of these programs in improving physical fitness across all age groups has been widely demonstrated. In addition, integrating artificial intelligence and the principles of slow science into evidence-based practice promises to identify gaps in knowledge and stimulate further research in human kinetics. The purpose of this review is to provide researchers and practitioners with comprehensive information on the application of scientific principles in human kinetics. By highlighting the importance of evidence-based practice, this review is intended to promote the adoption of effective interventions to optimize physical health and enhance performance.

1. Evidence-Based Practice: Unfolding the Map

The importance of evidence-based practice (EBP) in advancing public health has been extensively documented in the literature [1,2]. EBP offers multiple benefits [3]: (1) it improves healthcare and increases efficiency; (2) it leads to better outcomes and promotes transparency; (3) it promotes collaboration and knowledge sharing among professionals; (4) it facilitates the effective application of evidence in practice, thereby improving individual health outcomes. Several initiatives, such as the European Union’s Evidence-Based Medicine project [4], as well as educational programs focused on EBP, actively support the promotion of EBP in health education. The integration of EBP into human kinetics is also seen as critical to improving athlete preparation and performance.
EBP has become a crucial decision-making process in various fields, including human kinetics [5]. The EBP process model, which originated in evidence-based medicine (EBM), was defined by Sackett and colleagues [6,7,8] as the “conscientious, explicit, and judicious use of current best evidence in making decisions about the care of individual clients” [6] (p. 71) and as “the integration of best research evidence with clinical expertise and client values” [8] (p. 1). Despite its growing importance, there is still no consensus on the exact definition of EBP, which leads to confusion and hinders its effective application. Misinterpretations often arise from inaccurate representations, limited access to primary sources, and the introduction of new EBP models that share similarities with the original model [9,10]. Establishing a clear and universally accepted definition of EBP across disciplines and settings is critical to improving professional practice and achieving optimal outcomes [9].
The three-part model of EBP assumes that clinical decision making is based on research evidence, clinical expertise, and client values [8]. While this model is widely accepted and successful in many healthcare settings, its application in other contexts has proven difficult. Key problems include overemphasis on research findings (“scientocentrism”), insufficient consideration of client values [11], overreliance on expertise, and limited communication between researchers and practitioners [12]. Practitioners should prioritize all three components of EBP when making decisions for their clients [11,12].
Overall, EBP is a decision-making process that practitioners go through in five steps [13]. The first step is to formulate a relevant practical question that can be answered. The PICO (population, intervention, comparison, and outcome) model is a commonly used framework for structuring clinical questions [14]. Second, practitioners need to search for the most reliable research evidence from sources such as peer-reviewed journals, systematic reviews, and electronic databases. Third, research findings must be critically evaluated to determine their validity and applicability (this includes factors such as study design, sample size, statistical analysis, and other methodological issues). Fourth, practitioners must match the research findings with their expertise and the characteristics of their clients to make a practical decision. Finally, they must monitor their clients’ progress, evaluate the effectiveness of the intervention in achieving the desired outcomes, and review the results when adjustments are needed [10,13,15,16].
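The five-step cycle above can be sketched as a simple data structure. The following is an illustrative sketch only: the `PICOQuestion` class, its field names, and the example question are this review's own invention for demonstration, not a standard API or a published instrument.

```python
from dataclasses import dataclass

@dataclass
class PICOQuestion:
    """A structured practical question following the PICO framework."""
    population: str    # who the question applies to
    intervention: str  # the program or exposure of interest
    comparison: str    # the alternative being compared against
    outcome: str       # the measurable result of interest

    def as_question(self) -> str:
        return (f"In {self.population}, does {self.intervention}, "
                f"compared with {self.comparison}, improve {self.outcome}?")

# The five EBP steps, in order:
EBP_STEPS = (
    "Ask a focused, answerable question (e.g., via PICO)",
    "Search for the best available research evidence",
    "Critically appraise validity and applicability",
    "Integrate evidence with expertise and client values",
    "Monitor outcomes and re-evaluate as needed",
)

q = PICOQuestion(
    population="recreationally active adults",
    intervention="high-intensity interval training",
    comparison="moderate continuous training",
    outcome="VO2max",
)
print(q.as_question())
```

Framing a question in this structured form makes the subsequent search and appraisal steps concrete: each PICO field maps directly onto search terms and eligibility criteria.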
How do practitioners evaluate scientific findings for practical use? Systematic reviews that meticulously summarize research findings to answer specific questions are considered the heart of EBP [17]. Scientific methods have been developed to summarize the results of multiple research studies to provide valuable insights for EBP. In fact, the types of evidence in EBP are hierarchically ordered based on their design quality and reliability. Typically, a pyramid model is used to order the types of evidence, with higher positions indicating stronger evidence. Each level builds on the data and research findings of the previous levels. Systematic reviews and meta-analyses represent the highest level of evidence but are relatively rare. As one moves down the pyramid, there is more evidence, but the quality may decrease [18]. There are different versions of the evidence pyramid, including the 4S [19], 5S [20], 6S [21], or 9S pyramid [18], which can lead to confusion. The multitude of different pyramid models highlights the need for clarity in understanding and navigating these models.
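The ordering of evidence types can be modeled as a ranked scale. The levels below are a simplified generic pyramid for illustration only; as noted above, several competing pyramid variants (4S, 5S, 6S, 9S) exist, and this sketch does not reproduce any one of them.

```python
from enum import IntEnum

class EvidenceLevel(IntEnum):
    """Simplified evidence hierarchy: a higher value means a stronger design."""
    EXPERT_OPINION = 1
    CASE_SERIES = 2
    CASE_CONTROL = 3
    COHORT = 4
    RANDOMIZED_TRIAL = 5
    SYSTEMATIC_REVIEW = 6

def stronger(a: EvidenceLevel, b: EvidenceLevel) -> EvidenceLevel:
    """Return whichever level sits higher on the pyramid."""
    return max(a, b)

print(stronger(EvidenceLevel.COHORT, EvidenceLevel.RANDOMIZED_TRIAL).name)
```

Using an `IntEnum` makes the hierarchy directly comparable (`SYSTEMATIC_REVIEW > COHORT` is `True`), which mirrors how the pyramid is read, though a real appraisal would also weigh study quality within each level, as Murad and colleagues argue below.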
Murad and colleagues [22] argue that the old evidence pyramid was too simplistic and inflexible. They propose restructuring the pyramid to allow more flexibility in evaluating evidence. First, they suggest replacing the straight lines between levels of evidence with wavy lines to illustrate that lower-level evidence can outweigh typical higher-level evidence. Second, the authors recommend removing systematic reviews and meta-analyses from the top of the pyramid because not all are of equal quality, and some may even provide less robust data than the evidence at the previous levels. When systematic reviews and meta-analyses are removed from the traditional evidence pyramid, these tools can be used as lenses to assess the quality of evidence published at each level of the pyramid. This approach allows practitioners to interpret and evaluate evidence as it becomes available, rather than waiting years for a new or updated systematic review.
As mentioned earlier, the main goal of EBP is to provide the most effective interventions based on the best available evidence. Human kinetics serves as a prime example of the practical application of this approach, as sport science research often focuses on the translation of findings into practice. EBP in sport can be defined as the integration of technical expertise, athlete values, and the most reliable evidence to support decision making in the athlete training process [5]. In recent years, EBP has attracted considerable attention in the field of human kinetics, particularly in high-performance sport [5]. Numerous prominent global sport organizations have established research partnerships and innovation centers to further advance EBP [5,23]. Implementing EBP in human kinetics has the potential to improve training and performance outcomes, minimize training-related errors such as injuries, consider known benefits and risks in decision making, challenge subjective beliefs, and incorporate athlete and coach preferences into training and performance strategies [5].
Despite considerable progress, there is still a large gap between scientific knowledge and its practical implementation in human kinetics. This calls for the exploration of potential solutions to ensure the effective implementation of research findings and recommendations in real-world sport contexts. The purpose of this comprehensive review is, therefore, to examine several key aspects of EBP in human kinetics, highlight the existing gap between science and practice, and identify potential opportunities for EBP to improve outcomes in the field. Our goal is to enable researchers and practitioners to critically evaluate the application of EBP in human kinetics by presenting comprehensive information.

2. Paving the Way: Literature Search Strategy

This article presents a qualitative synthesis in the form of a narrative literature review. A literature search strategy was developed using the following electronic databases: PubMed, Web of Science, ScienceDirect, and Scopus. The selection of these databases was based on preliminary and exploratory research that indicated they contained significant and relevant work. Searches were conducted using keywords associated with the following groups of search terms: (a) evidence-based practice (e.g., evidence-based practice education, evidence-based medicine, evidence-based programs, researchers, practitioners, application of science, applied research, knowledge translation, the science–practice gap, artificial intelligence (AI)); (b) sport science (e.g., human kinetics, sport science research, sport scientists, sport science, athletic training, coaches, sport, exercise, physical activity, professional sport, and sport practice). The different search terms within each group were combined using the Boolean operator “OR”. The search terms from both groups were then combined with the Boolean operator “AND”. In addition, the reference lists of retrieved articles were analyzed to identify additional studies that met the defined eligibility criteria. In a further step, the studies were also searched via Google Scholar. An integrative perspective was adopted, including studies of all types to capture the context, processes, and important elements related to the topic under discussion. Therefore, publications that met the following criteria were eligible for this synthesis: books and peer-reviewed articles published in English between 1990 and 2023. Articles published in conference proceedings, abstracts, and unpublished manuscripts were not considered. Duplicate and unlinked articles were excluded before the full-text reading phase.
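The Boolean combination described above (OR within each term group, AND between groups) can be expressed programmatically. The helper below is an illustrative sketch with truncated term lists, not the exact query string submitted to the databases.

```python
def build_query(group_a, group_b):
    """Combine terms within each group with OR, then join the two
    groups with AND, mirroring the search strategy described above."""
    def or_block(terms):
        return "(" + " OR ".join(f'"{t}"' for t in terms) + ")"
    return f"{or_block(group_a)} AND {or_block(group_b)}"

# Truncated, illustrative subsets of the two search-term groups:
ebp_terms = ["evidence-based practice", "evidence-based medicine", "knowledge translation"]
sport_terms = ["human kinetics", "sport science", "athletic training"]

print(build_query(ebp_terms, sport_terms))
```

Quoting each multi-word term keeps it as an exact phrase, and parenthesizing each OR block ensures the AND binds the two groups as wholes rather than individual terms.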

3. Bridging the Gap: Towards an Effective Evidence-Based Human Kinetics

Incorporating EBP into human kinetics has the potential to improve performance, minimize errors, facilitate informed and data-driven decisions, prioritize empirical evidence over faith-based perspectives, and foster collaboration between practitioners and clients [5]. However, despite the widely recognized need to translate human kinetics research into practical applications, barriers to EBP implementation persist [5,12,24]. This underscores the importance of bridging the gap between research and practice to develop effective evidence-based interventions and strategies. The purpose of this section is to identify the barriers and facilitators to this process. As the integration of research and practice becomes increasingly important in the field of human kinetics, the potential benefits of EBP should not be underestimated.
Collaboration between researchers and practitioners is critical to bridging the gap between research and practice. Unfortunately, this cooperation is often insufficient and leads to discrepancies between research results and their practical applications. One study found that high-performing coaches prefer informal conversations with their colleagues to acquire scientific knowledge, highlighting the need for more structured and formal collaboration [25]. In addition, researchers may prioritize topics based on their personal interest rather than their practical relevance to practitioners [26]. Marginalization of practitioners may also hinder collaboration, as some practitioners perceive themselves as less knowledgeable compared to researchers, while some researchers overemphasize the value of scientific contributions to success [27]. To overcome these challenges, sport organizations should proactively initiate collaboration with practitioners to identify and prioritize research questions [12]. Effective integration of research findings into practice requires a symbiotic relationship between practitioner experience and scientific research. Organizations can establish research and development (R&D) departments staffed by people with scientific expertise to improve decision-making processes. Similarly, staff with research experience are essential for organizations to work closely with practitioners [28]. However, the establishment of R&D departments may face obstacles such as organizational constraints, including financial limitations and staff acceptance issues [25,29]. In addition, successful collaboration requires a multidisciplinary approach that brings together key stakeholders from different fields [28].
Another barrier to implementing evidence-based human kinetics is the lack of access to research findings, which are often published in academic journals that require a subscription, making them inaccessible to practitioners [12,29,30]. This lack of access can lead to outdated or incorrect decisions based on intuition or experience rather than evidence-based practices. To address this issue, practitioners prefer more accessible ways of accessing scientific information, such as face-to-face conversations, infographics, podcasts, and social media platforms [12,29,31]. There are also websites dedicated to transforming scientific research into easily consumable formats such as videos and blogs to facilitate knowledge sharing and reuse [32]. Qualitative research and case studies have also been suggested as effective means of linking research and practice and developing hypotheses for future research questions [33,34]. In addition to accessibility, lack of research applicability is also a common barrier to EBP. Many studies rely on theoretical hypotheses without considering practical issues [12,25,28,30]. Experimental control in field-based research in high-performance environments is also a challenge [35]. Bias in training and randomization are other significant barriers to effective research implementation [5]. Finally, the quality of research design and implementation is critical to the strength of evidence, as high-quality research leads to more reliable and robust results necessary for effective decision making [36].
The application of EBP in human kinetics can be challenging for practitioners, who may lack the necessary scientific terminology to accurately communicate research findings. Consequently, research findings may be misinterpreted, misapplied, or overlooked. In addition, practitioners’ attitudes and beliefs may hinder the adoption of EBP, as some people lack confidence in research findings and instead rely solely on their intuition and personal experience, which may prove impractical and time consuming [25,29]. Practitioners’ ability to effectively apply research findings is also limited by their fast-paced work environment and lack of time and expertise to analyze the results. According to a study by Reade and colleagues [25], coaches are least likely to learn from academic journals due to their busy schedules. On the other hand, researchers may need time to address complex challenges, which can lead to a disconnect between the two groups and make it difficult to align their goals [12,29,30,31]. Nonetheless, high-performing organizations can facilitate collaboration between practitioners and researchers to leverage the strengths of both groups. Finally, the application of EBP in the field of human kinetics may be hindered by inadequate education and training. Therefore, it is critical to prioritize the integration of EBP into the academic curricula of human kinetics programs [12].
For those seeking a collaborative approach, Bishop [36] has developed the Applied Research Model for the Sport Sciences (ARMSS), a comprehensive framework for conducting applied research in the sport sciences. The ARMSS model emphasizes that applied research should aim to answer questions that arise in an applied context through description, testing, and implementation. The model includes eight phases that provide a structured approach to conducting research studies to improve athletic performance and enhance athletic training programs. These phases are as follows: (1) problem definition (identifying the research problem and clearly defining the research questions); (2) descriptive research (collecting and analyzing data describing the phenomenon under study); (3) predictors of performance (identifying potential predictors of athletic performance and conducting regression analyses to determine the strength of their relationship to performance outcomes); (4) experimental testing of predictors (conducting experiments to test the identified predictors of athletic performance); (5) determinants of key performance predictors (identifying underlying mechanisms that explain the relationship between predictors and athletic performance and selecting the best interventions to modify performance predictors); (6) intervention studies (evaluating the effectiveness of interventions to improve athletic performance, including efficacy studies); (7) barriers (and motivators) to intervention adoption (identifying factors that discourage stakeholders from adopting interventions and exploring potential motivators to promote adoption); and (8) sport implementation studies (conducting efficacy studies to evaluate the practical implementation of sport interventions). Although sport performance research has often been viewed as underfunded and ineffective, the ARMSS model has gained popularity. 
Ultimately, the model underscores the importance of linking academic research with practical applications in sport and collaborating with practitioners to develop and evaluate innovative solutions that could improve athlete development and performance.
Other models focusing on multidisciplinary approaches to performance optimization have been proposed in recent years [26,37] and involve multiple key stakeholders in the research process. Jones and colleagues [26] emphasize the importance of collaborating with policy makers and practitioners to develop research questions that maximize the utility and adoption of research findings in practice. The ultimate goal of applied research should be to provide useful results, not just interesting ones. The authors propose a model that combines internal research initiatives with input from experts outside the field, which can lead to a competitive advantage. Thus, they offer different perspectives on the roles, challenges, and positions of stakeholders in the research practitioner model, which includes research and performance management, researchers, practitioners, and research practitioners. The latter are involved in both practice (30% of their time) and research (70% of their time). Similarly, Bartlett and Drust [37] propose a framework for effective information transfer in sport that highlights the critical components required for successful knowledge transfer and performance delivery in high-performance sport, with a focus on practitioners. These critical components include: EBP (which requires strong collaborative relationships among stakeholders); philosophy (related to character, leadership, and peer evaluation); receiver (which requires an understanding of stakeholders and what contributes to the knowledge transfer process); facilitation (which is viewed as an enabling process that requires a range of personal attributes, expertise, and interpersonal skills). Incorporating such research approaches that consider multiple stakeholders and the context of sport can advance EBP in human kinetics.

4. The Pursuit of Expertise in Evidence-Based Practice

Expertise is one of the three components of EBP, along with client values/preferences and best available research [6,7,8]. However, what exactly does this term mean? It refers to the knowledge, skills, and experience that a practitioner has acquired over time in a particular field. This expertise is based on years of experience, current research knowledge, and ongoing education. In the context of EBP, expertise includes the ability to critically evaluate and integrate research findings and relate this knowledge to client values/preferences to make professional decisions [38]. Practitioners play a critical role in translating research findings into practice because they can use their knowledge and experience to select the most relevant and reliable information for their practice. They can then use their expertise to tailor research findings to the unique needs and situations of individual clients [19].
Education plays a key role in developing the necessary expertise for EBP. Although EBP has long been used in clinical practice, education and training in EBP is often inadequate [39]. Integrating EBP into education is critical to improve practitioners’ skills and knowledge and enable them to critically evaluate and incorporate research findings into practice to ultimately improve outcomes. Advanced courses that incorporate EBP can promote the development of critical thinking and problem-solving skills that are essential for informed decision making [40]. In addition, EBP education promotes a culture of lifelong learning and professional development that enables practitioners to keep pace with scientific advances. It also supports teamwork and collaboration by encouraging the sharing of expertise and knowledge among practitioners. Finally, EBP education can foster an organizational culture that supports EBP, leading to better outcomes and more effective use of resources [39].
Some authors, such as Straus and colleagues [41], acknowledge the importance of incorporating EBP into high-level courses to provide health professionals with the knowledge and skills they need to deliver high-quality care based on the best available evidence. They argue for the inclusion of EBP at all levels of education, including undergraduate, postgraduate, and continuing education. Other authors [38,42] emphasize the importance of practitioners keeping abreast of the latest research to ensure optimal care. Greenhalgh and colleagues [42] argue that healthcare education should not only cover the concepts of EBP but also provide the practical skills and tools for its application. Therefore, it is critical to integrate EBP into all health professions curricula, including courses in human kinetics, to ensure that future professionals develop a solid foundation in research and acquire the ability to translate evidence into practice.
A variety of teaching methods are used in college courses and training programs to promote the use of EBP among professionals. These methods may include didactic lectures, interactive workshops, online courses, and clinical practices. Didactic lectures and seminars are commonly used in EBP courses to provide students with a comprehensive understanding of EBP principles and the skills necessary to apply them in clinical practice. These lectures typically cover basic aspects of EBP, such as formulating clinical questions, conducting evidence searches, and assessing the quality of evidence [42]. Although didactic lectures have been shown to improve EBP knowledge and skills, they do not always lead to behavior change [43,44].
Incorporating interactive workshops into EBP education and training can be an effective method for developing EBP skills because they use small group activities, case studies, and role-playing to enhance EBP skills. A randomized controlled trial has shown that interactive EBP workshops are more effective than didactic lectures in improving EBP-related knowledge, skills, and attitudes [43]. In this approach, students are presented with clinical scenarios and are tasked with developing treatment plans using EBP concepts. In this and other methods, case-based learning can be used in which students learn to apply EBP concepts in real clinical scenarios. This technique has been shown to be particularly effective for long-term retention and application of knowledge [45]. Interactive and problem-based learning have the potential to improve problem-solving and critical thinking skills, thereby enhancing EBP skills and knowledge [46,47,48].
Online courses and modules are also popular because they offer learners convenience and flexibility. They use a combination of didactic lectures, interactive activities, and self-directed learning to teach EBP skills. For example, the Center for Evidence-Based Medicine in Oxford (https://www.cebm.net/ (accessed on 26 January 2023)) has developed several online EBP learning modules. These online courses/modules are often accessible via in-service learning and have quality criteria to ensure that students and practitioners apply the material in practice [49]. However, a review of online EBP courses found that while they can improve EBP knowledge and skills, they may not be as effective at translating knowledge into behavior change [50].
Incorporating information technology (IT) is another effective approach to EBP education. This involves the use of technology, such as mobile devices, in the classroom or clinical setting to teach EBP search tactics, critical evaluation of clinical guidelines, and task-oriented information for clinical practice [51,52]. However, despite its accessibility, technology is underutilized in teaching and clinical practice [52]. Nonetheless, IT has been shown to be an effective teaching method, and future research should explore the potential of the Internet and smartphone applications to promote interactive online learning and engagement [53,54,55].
In addition, integrating EBP education with clinical practice has proven to be a popular approach to EBP teaching because it provides learners with practical opportunities to apply their EBP skills in the real world [54]. Clinical experiences, such as internships and mentorships, are examples of these integrated approaches. A comprehensive review of EBP education found that clinical experiences contribute to behavior change and improve patient outcomes [44]. When learning is integrated into clinical practice rather than limited to traditional courses, health professionals demonstrate improvement in skills, attitudes, and behaviors, and they are more likely to retain and apply acquired knowledge in their practice. Standalone courses may improve knowledge but not necessarily skills, attitudes, or behaviors. Nevertheless, it remains unclear whether learners retain the acquired knowledge in the long term, apply the learned skills in practice, or use EBP more frequently [56].
On the other hand, informal gatherings such as journal clubs provide professionals with a platform to discuss current research findings and their practical implications. These clubs can promote critical thinking skills and the application of research findings to practice [57,58]. Although journal clubs are not widely used, they have had a positive impact on EBP education by improving students’ ability to read articles, understand EBP, and develop the skills needed to provide evidence-based care [54,59]. Even though journal clubs may complement classroom lectures or clinical practice, further research is needed to determine their effectiveness in teaching EBP [54].
Finally, the use of librarians is a valuable method of teaching EBP. These librarians can help students develop search strategies and understand EBP concepts before they enter clinical practice. They can teach students how to navigate databases, evaluate sources, and synthesize evidence, providing them with important information skills for their future practice. Involving academic librarians in EBP education can be an effective approach to teaching the five steps of EBP described in the Sicily Statement. The first three steps can be taught in a classroom setting, while the last two steps can be applied in a clinical setting, allowing students to apply what they learn in real-life situations [54].
Research courses and workshops are common methods for teaching EBP [53,54]. These approaches involve the steps described in the Sicily Statement and often include collaboration with clinical practice, which has been shown to be effective in improving knowledge, skills, and attitudes about EBP [60]. Kyriakoulis and colleagues [53] suggest a combination of interventions, including lectures, tutorials, journal clubs, and online sessions, as the optimal approach for teaching EBP. However, further research is needed to determine the most effective teaching strategies for learners at different skill levels, ranging from novice to expert [57]. In addition, the frequency of interventions should be examined, as repeated interventions may increase learners’ confidence in using EBP and help maintain their skills over time [53]. Future research should also use more reliable methods to assess long-term retention of EBP skills [57]. While previous research has focused primarily on medical or nursing settings, there has been little research in other areas, including sports science. Therefore, conducting high-quality and reliable research in various fields is essential [53].
In this regard, questionnaires are a valuable tool to better understand EBP because they provide a consistent and structured method to collect information about practitioners’ beliefs, attitudes, and behaviors. They offer several benefits, including identifying EBP barriers, addressing knowledge gaps, and promoting practitioner adoption of EBP. Questionnaires can be used to develop tailored interventions that address knowledge, attitudes, and barriers to EBP, thereby increasing effectiveness and the likelihood of behavior change. In addition, questionnaires can be used to track progress, assess long-term impact, and identify areas for improvement [61]. Finally, practitioners can collaborate and learn from each other by discussing their views, attitudes, and actions on EBP. In recent years, several tools have emerged to assess different aspects of EBP, particularly in the fields of medicine, nursing, and physical therapy [62,63,64,65]. In the field of human kinetics, although few tools exist to evaluate EBP approaches for various populations, considerable progress has been made, particularly in the area of athletic training and elite sports. Several assessment tools have been developed to address this need [25,29,66,67,68,69,70].
In summary, to improve the integration of research and practice in human kinetics, further research is needed that focuses on fostering collaboration between researchers and practitioners. This collaboration can be facilitated through qualitative research methods that can help understand practitioners’ goals and develop co-created objects of study [66]. Gaining insight into practitioners’ preferred feedback mechanisms and the challenges associated with translating research into practice is critical. Such understanding will enable the development of effective strategies for integrating coaches, staff, and players who share common goals. In addition, increasing access to educational and financial resources, actively engaging staff in the coaching environment, and developing a better understanding of player–coach relationships can help overcome barriers to EBP.

5. From Sprint to Marathon: When Artificial Intelligence Meets Slow Science

AI is a new and promising approach to EBP. It involves computer systems that are able to learn and reason similarly to humans, enabling them to perform cognitive tasks that normally depend on human cognitive abilities, such as problem solving, decision making, and perception [71,72]. By improving the accuracy and efficiency of evidence synthesis and decision-making processes, AI has the potential to increase the overall precision and effectiveness of EBP [73]. This section addresses the potential of AI-assisted EBP.
The amount of information available online is increasing exponentially every year. However, analyzing the voluminous data from clinical trials can be challenging with traditional data processing systems. With the continuous increase in information, the use of machine learning (ML) has become crucial for automated knowledge extraction [73]. ML is defined as “a field of artificial intelligence that systematically applies algorithms to identify the underlying relationships between data and information” [74] (p. 1). Thus, AI can help with automated literature searching and screening. AI algorithms can be trained to search and screen vast amounts of literature, which could reduce the time and resources required for systematic reviews and meta-analyses while improving the accuracy of search results [75].
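Screening tools such as ASReview and RobotReviewer rest on this idea: a text classifier is trained on a reviewer’s include/exclude decisions and then applied to the remaining abstracts. The following is a minimal, standard-library-only sketch of that idea using a naive Bayes classifier; the abstracts, labels, and class names are invented for illustration, and production tools use far more sophisticated models combined with active learning.

```python
import math
from collections import Counter


def tokenize(text):
    return text.lower().split()


class NaiveBayesScreener:
    """Tiny multinomial naive Bayes classifier for include/exclude
    screening decisions on abstracts (illustrative sketch only)."""

    def fit(self, abstracts, labels):
        self.classes = sorted(set(labels))
        self.priors = {c: labels.count(c) / len(labels) for c in self.classes}
        self.counts = {c: Counter() for c in self.classes}
        self.totals = {c: 0 for c in self.classes}
        vocab = set()
        for text, label in zip(abstracts, labels):
            tokens = tokenize(text)
            self.counts[label].update(tokens)
            self.totals[label] += len(tokens)
            vocab.update(tokens)
        self.vocab_size = len(vocab)
        return self

    def predict(self, text):
        scores = {}
        for c in self.classes:
            score = math.log(self.priors[c])
            for token in tokenize(text):
                # Laplace smoothing so unseen words do not zero out a class
                p = (self.counts[c][token] + 1) / (self.totals[c] + self.vocab_size)
                score += math.log(p)
            scores[c] = score
        return max(scores, key=scores.get)


# Invented training data: two "include" and two "exclude" abstracts.
screener = NaiveBayesScreener().fit(
    [
        "randomized controlled trial of exercise intervention in older adults",
        "effects of resistance training intervention randomized trial",
        "editorial opinion commentary on policy",
        "letter to the editor commentary opinion",
    ],
    ["include", "include", "exclude", "exclude"],
)
```

In practice such a model is retrained as the reviewer labels more records, and the machine’s role is to prioritize likely-relevant abstracts rather than to make final inclusion decisions.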
ML algorithms have the potential to streamline data collection from multiple studies and enable semi-automated synthesis of results, including systematic reviews and meta-analyses. Several AI-based technologies such as RobotReviewer [76,77], ASReview [78], and the Cochrane Evidence Pipeline and Centralized Search Service [79] have already been used for this purpose. The integration of AI can significantly reduce the time and resources required to conduct systematic reviews [80]. In a study by Wagner and colleagues [81], the use of AI throughout the literature review process, from problem definition to data analysis and interpretation, was shown to improve search accuracy and speed while reducing repetitive tasks. For example, the RobotReviewer tool provides a user-friendly interface that identifies relevant studies, reduces reliance on manual searching, and provides real-time updates with new primary research findings [77]. Researchers also use automation tools such as SAMA [82], MetaCyto [83], and Python-Meta [84] to perform meta-analyses.
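At the core of the meta-analytic computations such tools automate is inverse-variance pooling: each study’s effect estimate is weighted by the inverse of its squared standard error. The following is a minimal fixed-effect sketch; the effect sizes in the usage example are invented, and real pipelines add random-effects models, heterogeneity statistics, and publication-bias diagnostics.

```python
import math


def fixed_effect_pool(effects, standard_errors):
    """Inverse-variance fixed-effect pooling: weight each study's effect
    by 1/SE^2; return the pooled effect, its SE, and a 95% CI."""
    weights = [1.0 / se ** 2 for se in standard_errors]
    pooled = sum(w * e for w, e in zip(weights, effects)) / sum(weights)
    pooled_se = math.sqrt(1.0 / sum(weights))
    ci = (pooled - 1.96 * pooled_se, pooled + 1.96 * pooled_se)
    return pooled, pooled_se, ci


# Two invented studies: effects 0.30 and 0.50, SEs 0.10 and 0.20.
# Weights are 100 and 25, so the pooled effect is (30 + 12.5) / 125 = 0.34.
pooled, se, ci = fixed_effect_pool([0.30, 0.50], [0.10, 0.20])
```

Note how the more precise study (smaller standard error) dominates the pooled estimate; this weighting is what distinguishes meta-analytic pooling from a simple average of study effects.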
The use of AI is not limited to automated literature searches and synthesis of evidence. It can also be used to analyze electronic records and other datasets to uncover patterns and relationships that are not apparent using traditional analytical methods. This can lead to new insights and the development of tailored and effective solutions [85]. By leveraging client data and current research, AI-based decision support systems can help practitioners make more accurate and efficient diagnoses and intervention decisions, which could improve outcomes while reducing diagnostic errors and costs [85,86]. In addition, Topol [85] suggests that AI could accelerate the discovery of new treatments and improve healthcare delivery.
AI has the potential to revolutionize EBP by enabling professionals to access and analyze large amounts of data more efficiently, leading to better outcomes and cost reductions. Still, the application of AI in EBP comes with certain challenges and limitations. For example, if AI algorithms are not properly trained or validated, they may introduce bias or provide unreliable results, and ethical concerns have been raised about the use of AI in healthcare [76]. The notion that AI could improve and partially automate research has sparked lively debates in various scientific disciplines, including the health sciences [87,88]. In particular, the concept of automated science raises crucial questions about the future of research in areas that require “sophisticated abstract thinking, intricate knowledge of methodologies and epistemology, and persuasive writing skills” [89] (p. 292).
The concept of slow science offers an alternative approach to the traditional model of science. It emphasizes a careful, reflective, and collaborative method that values attention and slowness as crucial elements of scientific practice [90]. Although it may seem contradictory, AI and slow science can complement each other to provide a more comprehensive understanding of complex phenomena. AI can rapidly analyze large amounts of data and detect patterns that might be overlooked by human researchers by using ML algorithms to detect correlations, predict outcomes, and discover new associations between variables [75,80]. Meanwhile, slow science emphasizes a critical and thoughtful approach to research and encourages researchers to take the time to think deeply about their findings and engage in discourse with their colleagues. Quality takes precedence over quantity and leads to a more thorough and clear understanding of complex phenomena [90].
The combination of slow science and AI has the potential to revolutionize EBP by pairing careful analysis with cutting-edge technology. AI can analyze massive amounts of data quickly, while slow science offers a more reflective and analytical approach to research, taking the time to consider the implications of the results. Together, they can lead to the identification of new research topics, the refinement of current ideas, and the development of innovative evidence-based interventions [75,85]. There are several ways the two can be combined. Slow science can safeguard the quality, transparency, and ethics of research [90], while AI accelerates data analysis and detects patterns in large datasets that are difficult to see with the naked eye [75,80], helping professionals make informed decisions based on the latest studies and identify areas that need more research. For example, slow science can guide the development of personalized interventions for clients, while AI analyzes large datasets and makes suggestions based on the latest research [85]. Slow science can also foster collaboration between researchers and practitioners [90], while AI integrates data from multiple sources and provides insights that may not be apparent from a single study [80]. By automating tasks such as literature searches and evidence synthesis, AI can give practitioners more time to reflect on their practice and apply research findings.
In summary, slow science and AI can work harmoniously to improve EBP by providing a thorough yet efficient approach to data analysis and research. In their recent study, Marshall and colleagues [77] highlight the importance of AI in the context of living systematic reviews, an innovative approach to updating evidence syntheses that can help reduce the burden and improve the timeliness of systematic reviews. However, they emphasize the importance of combining AI with human expertise. The literature review process involves both creative and mechanical tasks, and advanced AI-based tools offer new opportunities to reduce the time spent on routine tasks while allowing researchers more time for creative activities that require human interpretation, insight, and expertise [81,88].
The integration of AI and slow science holds the potential for significant advances in evidence-based human kinetics. ML techniques, supported by improved computational power and access to new data sources, can provide valuable insights for training, performance improvement, and injury prevention in human kinetics, both on and off the field [91]. Wearable technology is an example of how AI and slow science can be combined in the field of human kinetics. These technologies enable the collection of large amounts of data that can be used to develop evidence-based training plans and mitigate injury risk. However, to ensure accurate and meaningful conclusions, a careful and deliberate approach to data analysis and quality assurance that incorporates the principles of slow science is essential. Researchers must exercise caution in data collection and analysis while fully realizing the potential of AI to maintain the rigor and integrity of research [92,93].
While AI has the potential to automate repetitive tasks and provide support, human interpretation, synthesis, and creativity remain necessary for meaningful contributions and theory development [81]. Practitioners can optimize their solutions by leveraging the strengths of both approaches. Integrating slow science and AI can improve EBP by promoting transparency and ethical research behavior and enabling practitioners to make informed decisions based on the latest evidence. By using AI, professionals can expand their knowledge and optimize the efficiency of EBP. This allows them to apply research findings in thoughtful and reflective ways, ensuring that the specific needs of individual clients and communities are appropriately met. Much work remains before AI can reliably support these tasks, but the outlook is promising.

6. Evidence-Based Programs: A Proven Track Record

Evidence-based programs are comprehensive interventions designed to help clients with complex problems [94]. These programs have been rigorously tested in controlled settings, shown to be effective, and then translated into practical models that community-based organizations can implement. To be considered evidence-based, a program must meet certain criteria that confirm its effectiveness and reliability. These criteria typically include using reliable scientific data, such as peer-reviewed studies or systematic reviews, ensuring replicability across settings, and conducting ongoing evaluations to confirm program effectiveness [95,96]. For a program to be truly effective, there must be solid evidence that its outcomes result directly from the program’s activities. When these criteria are met, programs are considered reliable and useful, enabling individuals and communities to achieve better outcomes [95]. In human kinetics, evidence-based programs provide proven techniques to improve health and prevent disease, ensuring that clients receive the most effective treatment and support to improve performance and reduce the risk of injury [97]. The purpose of this section, therefore, is to present evidence-based programs in the field of exercise science.
In recent decades, the use of research-based programs that rely on credible scientific evidence to improve health outcomes has gained prominence [97]. These programs are grounded in extensively validated research evidence that ensures their effectiveness and safety, and they aim to provide individuals with a structured approach to achieving their health-related goals. Evidence-based human kinetics programs have been used in a variety of settings, including fitness [98,99], falls prevention [100,101], athletic training and injury prevention [102,103], adaptive sports [104], and dance [105]. These programs have been adapted to different populations and address specific conditions such as arthritis [106], diabetes [107], autism [108], and cancer [109]. Although numerous evidence-based programs exist, this discussion focuses on a few representative examples, identified through an extensive literature search, whose outcomes have been studied in depth and which illustrate the effectiveness and applicability of such programs in different contexts.
As people age and look for ways to maintain their health and independence, evidence-based physical activity programs become increasingly valuable. These programs are developed based on sound scientific research and are designed to improve strength, balance, flexibility, and cardiovascular health. Vivifrail (http://www.vivifrail.com/ (accessed on 6 February 2023)) is an example of such a program. It is an individualized and multi-part exercise program for the elderly that includes exercises to improve various aspects of physical fitness, nutritional counseling, and cognitive training to promote a healthy lifestyle [100]. Vivifrail has been scientifically validated to improve physical fitness and reduce the risk of falls in older adults [110,111]. In a randomized controlled trial of Vivifrail [100], study participants showed significant improvement in functional capacity, cognitive function, muscle function, and mood. Another study [110] confirmed the short-term effectiveness of the program and its ability to prevent functional impairment and loss of strength in institutionalized elderly. Evidence-based programs such as Vivifrail can be of great benefit to older people, helping them to maintain physical function and independence.
Evidence-based athletic training and injury prevention programs have been developed to reduce injury risk and improve performance. The FIFA 11+ program, introduced in 2006, is a comprehensive warm-up program for soccer players that includes running, plyometric exercises, and balance/coordination exercises. The program consists of 15 exercises that focus on muscle strength, balance, and coordination [112,113]. Research has shown that training with the FIFA 11+ program at least twice per week can minimize injury risk in male and female soccer players [114,115,116]. Sadigursky and colleagues [113] conducted a systematic review of randomized clinical trials to evaluate the effectiveness of the FIFA 11+ program in preventing injuries in soccer players of both sexes over the age of 13 years. The review found that the program resulted in a 30% decrease in injuries among soccer players. However, it was noted that a period of 10–12 weeks was required to achieve results. In addition, participation in the FIFA 11+ program has been shown to improve the physical performance of soccer players. Those who completed the program exhibited better dynamic balance and agility than those who did not [102]. Asgari and colleagues [117] also conducted a systematic review that demonstrated the effectiveness of medium- to long-term use of FIFA 11+ in improving most biomechanical parameters, core stability, and balance. Nevertheless, the study cautioned against using FIFA 11+ as a warm-up program before competitions, as it could have an immediate negative impact on performance. Overall, the scientific evidence supports the effectiveness of the FIFA 11+ program as a practical and accessible tool for coaches and players to prevent injuries and improve soccer player performance.
To optimize outcomes related to fitness and health, evidence-based fitness programs have also been established. High-intensity interval training (HIIT) is one such program that has gained popularity. HIIT alternates periods of high intensity with periods of active or passive recovery [118]. Studies have shown that HIIT can improve cardiovascular fitness, metabolic health, and body composition [98,119,120,121,122]. A systematic review and meta-analysis by Batacan and colleagues [98] showed that HIIT can effectively improve maximal oxygen uptake and several cardiometabolic risk factors in overweight or obese populations, including waist circumference, body fat percentage, resting heart rate, systolic and diastolic blood pressure, and fasting glucose. The physiological benefits of HIIT may not only improve cardiometabolic well-being, but also help mitigate the development and progression of disease-related risk factors associated with obesity and low aerobic fitness. In addition, there is a growing body of evidence supporting the beneficial effects of HIIT on cognitive performance [123,124] and functional training in older adults [125,126]. Stern and colleagues [118] conducted a systematic review and meta-analysis that found that HIIT interventions improve functional movement measures in older adults, even in those with movement limitations. In summary, evidence-based HIIT programs provide an effective way to improve health outcomes and thus are a valuable addition to any exercise plan.
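The work/recovery structure that defines HIIT can be made concrete with a simple session builder. The sketch below is illustrative only: the default durations and round count are invented placeholders, not a prescribed or validated protocol.

```python
def hiit_session(work_s=30, recovery_s=60, rounds=8,
                 warmup_s=300, cooldown_s=300):
    """Build a HIIT timeline as (phase, seconds) pairs: a warm-up,
    alternating work/recovery rounds, then a cool-down.
    All default durations are illustrative, not a prescribed protocol."""
    timeline = [("warm-up", warmup_s)]
    for _ in range(rounds):
        timeline.append(("work", work_s))
        timeline.append(("recovery", recovery_s))
    timeline.append(("cool-down", cooldown_s))
    return timeline


# A session with the illustrative defaults:
# 5 min warm-up, 8 x (30 s work / 60 s recovery), 5 min cool-down.
session = hiit_session()
```

Parameterizing the session this way mirrors how HIIT prescriptions are individualized in practice: work intensity, interval length, and recovery mode are adjusted to the client rather than fixed.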
In summary, evidence-based human kinetics programs play a critical role in improving athletic performance and promoting overall health and well-being, as shown in Table 1. These programs are based on sound scientific research and have demonstrated effectiveness in improving various health outcomes. However, it is important to note that evidence-based programs are not equally accessible to all people worldwide. This issue needs to be addressed to ensure equitable access for people around the world [97]. Despite these challenges, the development and implementation of evidence-based programs is an important step toward improving health on a global scale. These programs can help people reach their full potential while promoting their overall health and well-being by leveraging the latest research and best practices. As human kinetics continues to evolve, it is critical that evidence-based programs are promoted and made accessible to people from diverse backgrounds. This requires sustained efforts to support research, funding, and education, as well as a commitment to equal access to evidence-based programs for all people.

7. Stepping Stones to the Future: Next-Level Human Kinetics

Evidence-based human kinetics is a key component in promoting optimal health and performance outcomes for diverse groups. Despite a growing body of research on successful fitness programs and therapies, a large gap remains between evidence and practice. To close this gap, education and training programs must be developed to provide practitioners with the expertise and skills they need to effectively deliver evidence-based programs and interventions.
Collaboration between sport and health sciences also plays a critical role in pursuing a comprehensive and integrated approach to human well-being. By bridging the gap between these disciplines, we can improve our understanding of the intricate relationships between physical activity, performance, and health. This interdisciplinary approach sets the stage for EBP to maximize human potential, prevent disease, and promote overall wellness. By fostering collaboration and sharing knowledge, we can leverage the synergies between these fields to drive advances in both sport performance and public health, promote a culture of lifelong physical activity, and enhance the health and well-being of individuals and communities.
In addition, evidence-based programs have been shown to improve health outcomes such as cardiovascular health, body composition, and muscle strength. However, for these programs to be successful, a detailed understanding of relevant research and of practical concerns, such as individual differences in health status and preferences, is required. Finally, integrating AI and the principles of slow science into EBP has the potential to increase the effectiveness of interventions by identifying knowledge gaps and opportunities for additional research, thereby expanding the evidence base of human kinetics.
Following this comprehensive review, see Figure 1 for a diagram summarizing the main concepts and ideas discussed.
In summary, evidence-based human kinetics provides a solid framework for understanding the positive effects of exercise and physical activity on health and performance. We can continue to improve our understanding of human kinetics and help people optimize their physical health and performance by promoting evidence-based sport science and advancing the dissemination of accurate and reliable information. Education, the use of evidence-based programs, and the incorporation of AI into practice are all viable ways to advance the discipline and improve outcomes for people around the world.

Author Contributions

Conceptualization, A.R.; validation, A.R. and J.P.; formal analysis, A.R. and J.P.; investigation, A.R. and J.P.; writing—original draft preparation, A.R.; writing—review and editing, A.R.; visualization, A.R. and J.P. All authors have read and agreed to the published version of the manuscript.

Funding

This research received no external funding.

Institutional Review Board Statement

Not applicable.

Informed Consent Statement

Not applicable.

Data Availability Statement

Not applicable.

Conflicts of Interest

The authors declare no conflict of interest.

References

  1. Armstrong, R.; Doyle, J.; Lamb, C.; Waters, E. Multi-Sectoral Health Promotion and Public Health: The Role of Evidence. J. Public Health 2006, 28, 168–172. [Google Scholar] [CrossRef] [PubMed]
  2. Brownson, R.C.; Fielding, J.E.; Maylahn, C.M. Evidence-Based Public Health: A Fundamental Concept for Public Health Practice. Annu. Rev. Public Health 2009, 30, 175–201. [Google Scholar] [CrossRef] [PubMed]
  3. Kumah, E.A.; McSherry, R.; Bettany-Saltikov, J.; van Schaik, P.; Hamilton, S.; Hogg, J.; Whittaker, V. Evidence-Informed vs Evidence-Based Practice Educational Interventions for Improving Knowledge, Attitudes, Understanding and Behaviour towards the Application of Evidence into Practice: A Comprehensive Systematic Review of Undergraduate Students. Campbell Syst. Rev. 2022, 18, e1233. [Google Scholar] [CrossRef] [PubMed]
  4. Thangaratinam, S.; Barnfield, G.; Weinbrenner, S.; Meyerrose, B.; Arvanitis, T.N.; Horvath, A.R.; Zanrei, G.; Kunz, R.; Suter, K.; Walczak, J.; et al. Teaching Trainers to Incorporate Evidence-Based Medicine (EBM) Teaching in Clinical Practice: The EU-EBM Project. BMC Med. Educ. 2009, 9, 59. [Google Scholar] [CrossRef]
  5. Coutts, A.J. Challenges in Developing Evidence-Based Practice in High-Performance Sport. Int. J. Sports Physiol. Perform. 2017, 12, 717–718. [Google Scholar] [CrossRef]
  6. Sackett, D.L.; Rosenberg, W.M.; Gray, J.A.; Haynes, R.B.; Richardson, W.S. Evidence-Based Medicine: What It Is and What It Isn’t. Br. Med. J. 1996, 312, 71–72. [Google Scholar] [CrossRef]
  7. Sackett, D.L. Evidence-Based Medicine. Semin. Perinatol. 1997, 21, 3–5. [Google Scholar] [CrossRef]
  8. Sackett, D.L.; Straus, S.E.; Richardson, W.S.; Rosenberg, W.M.C.; Haynes, R.B. Evidence-Based Medicine: How to Practice and Teach EBM, 2nd ed.; Churchill Livingstone: Edinburgh, UK, 2000. [Google Scholar]
  9. Parrish, D. Evidence-Based Practice: A Common Definition Matters. J. Soc. Work Educ. 2018, 54, 407–411. [Google Scholar] [CrossRef]
  10. Thyer, B.A. What Is Evidence-Based Practice? In Foundations of Social Work Practice; Roberts, A.R., Yeager, K.R., Eds.; Oxford University Press: Oxford, UK, 2006; pp. 35–46. [Google Scholar]
  11. Berg, H. Evidence-Based Practice in Psychology Fails to Be Tripartite: A Conceptual Critique of the Scientocentrism in Evidence-Based Practice in Psychology. Front. Psychol. 2019, 10, 2253. [Google Scholar] [CrossRef]
  12. Fullagar, H.H.K.; McCall, A.; Impellizzeri, F.M.; Favero, T.; Coutts, A.J. The Translation of Sport Science Research to the Field: A Current Opinion and Overview on the Perceptions of Practitioners, Researchers and Coaches. Sports Med. 2019, 49, 1817–1824. [Google Scholar] [CrossRef]
  13. Dawes, M.; Summerskill, W.; Glasziou, P.; Cartabellotta, A.; Martin, J.; Hopayian, K.; Porzsolt, F.; Burls, A.; Osborne, J.; Second International Conference of Evidence-Based Health Care Teachers and Developers. Sicily Statement on Evidence-Based Practice. BMC Med. Educ. 2005, 5, 1. [Google Scholar] [CrossRef]
  14. Richardson, W.S.; Wilson, M.C.; Nishikawa, J.; Hayward, R.S. The Well-Built Clinical Question: A Key to Evidence-Based Decisions. ACP J. Club 1995, 123, A12–A13. [Google Scholar] [CrossRef] [PubMed]
  15. Amonette, W.E.; English, K.L.; Ottenbacher, K.J. Nullius in Verba: A Call for the Incorporation of Evidence-Based Practice into the Discipline of Exercise Science. Sports Med. 2010, 40, 449–457. [Google Scholar] [CrossRef] [PubMed]
  16. Straus, S.W.; Richardson, W.S.; Glasziou, P.; Haynes, R.B. Evidence-Based Medicine: How to Practice and Teach It, 4th ed.; Churchill Livingstone: Edinburgh, UK, 2010. [Google Scholar]
  17. Stevens, K.R. Systematic Reviews: The Heart of Evidence-Based Practice. AACN Clin. Issues 2001, 12, 529–538. [Google Scholar] [CrossRef]
  18. Alper, B.S.; Haynes, R.B. EBHC Pyramid 5.0 for Accessing Preappraised Evidence and Guidance. Evid. Based Med. 2016, 21, 123–125. [Google Scholar] [CrossRef]
  19. Haynes, R.B.; Devereaux, P.J.; Guyatt, G.H. Clinical Expertise in the Era of Evidence-Based Medicine and Patient Choice. ACP J. Club 2002, 136, A11–A14. [Google Scholar] [CrossRef] [PubMed]
  20. Straus, S.; Haynes, R.B. Managing Evidence-Based Knowledge: The Need for Reliable, Relevant and Readable Resources. CMAJ 2009, 180, 942–945. [Google Scholar] [CrossRef]
  21. DiCenso, A.; Bayley, L.; Haynes, R.B. Accessing Pre-Appraised Evidence: Fine-Tuning the 5S Model into a 6S Model. Evid. Based Nurs. 2009, 12, 99–101. [Google Scholar] [CrossRef]
  22. Murad, M.H.; Asi, N.; Alsawas, M.; Alahdab, F. New Evidence Pyramid. Evid. Based Med. 2016, 21, 125–127. [Google Scholar] [CrossRef]
  23. McCall, A.; Dupont, G.; Ekstrand, J. Injury Prevention Strategies, Coach Compliance and Player Adherence of 33 of the UEFA Elite Club Injury Study Teams: A Survey of Teams’ Head Medical Officers. Br. J. Sports Med. 2016, 50, 725–730. [Google Scholar] [CrossRef]
  24. Faulkner, G.; Taylor, A.H.; Urban, S.; Ferrence, R.; Munreo, S.; Selby, P. Exercise Science and the Development of Evidence-Based Practice: A Better Practices’ Framework. Eur. J. Sport Sci. 2006, 6, 117–126. [Google Scholar] [CrossRef]
  25. Reade, I.; Rodgers, W.; Hall, N. Knowledge Transfer: How Do High Performance Coaches Access the Knowledge of Sport Scientists? Int. J. Sports Sci. Coach. 2008, 3, 319–334. [Google Scholar] [CrossRef]
  26. Jones, B.; Till, K.; Emmonds, S.; Hendricks, S.; Mackreth, P.; Darrall-Jones, J.; Roe, G.; McGeechan, S.I.; Mayhew, R.; Hunwicks, R.; et al. Accessing Off-Field Brains in Sport; an Applied Research Model to Develop Practice. Br. J. Sports Med. 2019, 53, 791–793. [Google Scholar] [CrossRef] [PubMed]
  27. Eisenmann, J.C. Translational Gap between Laboratory and Playing Field: New Era to Solve Old Problems in Sports Science. Transl. J. ACSM 2017, 2, 37–43. [Google Scholar]
  28. Sandbakk, Ø. Let’s Close the Gap between Research and Practice to Discover New Land Together! Int. J. Sports Physiol. Perform. 2018, 13, 961. [Google Scholar] [CrossRef] [PubMed]
  29. Malone, J.J.; Harper, L.D.; Jones, B.; Perry, J.; Barnes, C.; Towlson, C. Perspectives of Applied Collaborative Sport Science Research within Professional Team Sports. Eur. J. Sport Sci. 2019, 19, 147–155. [Google Scholar] [CrossRef]
  30. Schwarz, E.; Harper, L.D.; Duffield, R.; McCunn, R.; Govus, A.; Skorski, S.; Fullagar, H.H.K. Practitioner, Coach, and Athlete Perceptions of Evidence-Based Practice in Professional Sport in Australia. Int. J. Sports Physiol. Perform. 2021, 16, 1728–1735. [Google Scholar] [CrossRef]
  31. Coutts, A.J. Working Fast and Working Slow: The Benefits of Embedding Research in High Performance Sport. Int. J. Sports Physiol. Perform. 2016, 11, 1–2. [Google Scholar] [CrossRef]
  32. Sperlich, B.; Wicker, P. Knowledge Transfer into Sport Practice: An Empirical User Analysis of a Sport Science Website. Eur. J. Sport Sci. 2021, 21, 753–761. [Google Scholar] [CrossRef]
  33. Harper, L.D.; McCunn, R. “Hand in Glove”: Using Qualitative Methods to Connect Research and Practice. Int. J. Sports Physiol. Perform. 2017, 12, 990–993. [Google Scholar] [CrossRef]
  34. Halperin, I. Case Studies in Exercise and Sport Sciences: A Powerful Tool to Bridge the Science-Practice Gap. Int. J. Sports Physiol. Perform. 2018, 13, 824–825. [Google Scholar] [CrossRef] [PubMed]
  35. Buchheit, M. Houston, We Still Have a Problem. Int. J. Sports Physiol. Perform. 2017, 12, 1111–1114. [Google Scholar] [CrossRef] [PubMed]
  36. Bishop, D. An Applied Research Model for the Sport Sciences. Sports Med. 2008, 38, 253–263. [Google Scholar] [CrossRef]
  37. Bartlett, J.D.; Drust, B. A Framework for Effective Knowledge Translation and Performance Delivery of Sport Scientists in Professional Sport. Eur. J. Sport Sci. 2021, 21, 1579–1587. [Google Scholar] [CrossRef]
  38. Higgins, J.P.; Thomas, J.; Chandler, J.; Cumpston, M.; Li, T.; Page, M.J.; Welch, V.A. Cochrane Handbook for Systematic Reviews of Interventions; John Wiley & Sons: Hoboken, NJ, USA, 2019. [Google Scholar]
Figure 1. Summary diagram on evidence-based human kinetics concepts.
Table 1. Key findings of human kinetics programs.
Program: Vivifrail
Description: Individualized, multicomponent exercise program for older adults.
Findings — significant improvements in:
  • Functional capacity (reduced risk of falls).
  • Cognitive function.
  • Muscle function.
  • Mood state.

Program: FIFA 11+
Description: Warm-up program designed specifically for soccer players, including running, plyometric exercises, and balance/coordination exercises.
Findings:
  • Minimizes injury risk in male and female soccer players.
  • Reduces injuries among soccer players by approximately 30%.
  • Improves dynamic balance and agility in program completers.
  • Enhances biomechanical parameters, core stability, and balance with medium- to long-term use.

Program: HIIT
Description: Training method that alternates periods of high-intensity exercise with active or passive recovery.
Findings:
  • Improved cardiovascular fitness.
  • Enhanced metabolic health.
  • Positive changes in body composition.
  • Improved maximal oxygen uptake and cardiometabolic risk factors.
  • Enhanced cognitive performance.
  • Improvements in functional movement measures.
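The defining structure of HIIT described above — alternating fixed work bouts with active or passive recovery — can be sketched programmatically. The following is a minimal illustration only; the round counts and durations (`4 × 30 s work / 90 s recovery`) are assumed example values, not a prescription, and real HIIT programming should follow individualized, evidence-based protocols of the kind reviewed here.

```python
from __future__ import annotations
from dataclasses import dataclass


@dataclass
class Interval:
    phase: str     # "work" or "recovery"
    seconds: int


def hiit_session(rounds: int, work_s: int, recovery_s: int) -> list[Interval]:
    """Build a simple alternating work/recovery interval sequence.

    Illustrative sketch only: it captures the alternation that defines
    HIIT, not the intensity anchors, progression, or screening that an
    evidence-based prescription requires.
    """
    session: list[Interval] = []
    for _ in range(rounds):
        session.append(Interval("work", work_s))
        session.append(Interval("recovery", recovery_s))
    return session


def total_duration(session: list[Interval]) -> int:
    """Total session length in seconds."""
    return sum(i.seconds for i in session)


# Example layout: 4 rounds of 30 s hard effort / 90 s easy recovery
session = hiit_session(rounds=4, work_s=30, recovery_s=90)
print(total_duration(session))  # 480 seconds = 8 minutes
```

Representing the session as data rather than a hard-coded loop makes it easy to vary work:recovery ratios when comparing protocols.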

Share and Cite

Ramalho, A.; Petrica, J. Knowledge in Motion: A Comprehensive Review of Evidence-Based Human Kinetics. Int. J. Environ. Res. Public Health 2023, 20, 6020. https://doi.org/10.3390/ijerph20116020
