Movement Competency Screens Can Be Reliable In Clinical Practice By A Single Rater Using The Composite Score
Main Authors: | Kerry J. Mann, Nicholas O'Dwyer, Michaela R. Bruton, Stephen P. Bird, Suzi Edwards |
---|---|
Format: | Article |
Language: | English |
Published: | North American Sports Medicine Institute, 2022-06-01 |
Series: | International Journal of Sports Physical Therapy |
Online Access: | https://ijspt.scholasticahq.com/article/35666-movement-competency-screens-can-be-reliable-in-clinical-practice-by-a-single-rater-using-the-composite-score |
author | Kerry J. Mann, Nicholas O'Dwyer, Michaela R. Bruton, Stephen P. Bird, Suzi Edwards |
author_sort | Kerry J. Mann |
collection | DOAJ |
description | # Background
Movement competency screens (MCSs) are commonly used by coaches and clinicians to assess injury risk. However, there is conflicting evidence regarding MCS reliability.
# Purpose
This study aimed to: (i) determine the inter- and intra-rater reliability of a sport-specific, field-based MCS in novice and expert raters using different viewing methods (single and multiple views); and (ii) ascertain whether there were familiarization effects from repeated exposure for either raters or participants.
# Study Design
Descriptive laboratory study
# Methods
Pre-elite youth athletes (n=51) were recruited and videotaped while performing an MCS comprising nine dynamic movements, each performed in three separate trials. The videotaped performances were rated three times by 12 raters (3 expert, 9 novice) using a three-point scale, with each rating session presented in randomized order and a minimum four-week washout between sessions. Kappa scores, percentage agreement, and intra-class correlation coefficients were calculated for each movement individually and for the composite score.
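The abstract does not include the authors' analysis code. As a minimal sketch only, the reliability statistics named above (Cohen's kappa, percentage agreement, and intra-class correlation) could be computed for two raters as shown below, using scikit-learn and pingouin; the data, variable names, and two-rater setup are illustrative assumptions rather than the study's dataset or design.

```python
# Illustrative sketch only: not the study's analysis code.
# Computes Cohen's kappa, percentage agreement, and an ICC for two hypothetical
# raters scoring 51 athletes on a three-point scale (0-2), mirroring the Methods.
import numpy as np
import pandas as pd
from sklearn.metrics import cohen_kappa_score
import pingouin as pg

rng = np.random.default_rng(0)
rater_a = rng.integers(0, 3, size=51)   # hypothetical scores from rater A
rater_b = rng.integers(0, 3, size=51)   # hypothetical scores from rater B

kappa = cohen_kappa_score(rater_a, rater_b)      # chance-corrected agreement
agreement = np.mean(rater_a == rater_b) * 100    # raw percentage agreement

# ICC expects long-format data: one row per (target, rater) pair
long = pd.DataFrame({
    "athlete": np.tile(np.arange(51), 2),
    "rater": np.repeat(["A", "B"], 51),
    "score": np.concatenate([rater_a, rater_b]),
})
icc = pg.intraclass_corr(data=long, targets="athlete", raters="rater", ratings="score")

print(f"kappa={kappa:.2f}, agreement={agreement:.1f}%")
print(icc[["Type", "ICC"]])
```

In practice the ICC form (e.g., two-way random, absolute agreement) would need to match the rating design; the abstract does not state which form the authors used.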
# Results
Fifty-one pre-elite youth athletes (15.0±1.6 years; *n*=33 athletics, *n*=10 BMX, and *n*=8 surfing) were included in the study. Based on kappa scores and percentage agreement, both inter- and intra-rater reliability were highly variable for individual movements but consistently high (>0.70) for the MCS composite score. The composite score did not increase with task familiarization by the athletes. Experts detected more movement errors than novices, and both rating groups improved their detection of errors with repeated viewings of the same movement.
# Conclusions
Irrespective of experience, raters demonstrated high variability when rating individual movements, yet preliminary evidence suggests that the MCS composite score could reliably assess movement competency. While athletes did not display a familiarization effect across repeated performances of the novel MCS tasks, raters showed improved error detection with repeated viewing of the same movement.
# Level of Evidence
Cohort study |
format | Article |
id | doaj-art-07d430e092b74a1d9dc85d329280cc97 |
institution | Kabale University |
issn | 2159-2896 |
language | English |
publishDate | 2022-06-01 |
publisher | North American Sports Medicine Institute |
record_format | Article |
series | International Journal of Sports Physical Therapy |
title | Movement Competency Screens Can Be Reliable In Clinical Practice By A Single Rater Using The Composite Score |
url | https://ijspt.scholasticahq.com/article/35666-movement-competency-screens-can-be-reliable-in-clinical-practice-by-a-single-rater-using-the-composite-score |