Monitoring and evaluation in disaster management courses: a scoping review
Format: Article
Language: English
Published: BMC, 2025-02-01
Series: BMC Medical Education
Online Access: https://doi.org/10.1186/s12909-025-06659-0
Summary: Abstract

Background: Owing to the infrequent occurrence of disasters and the challenges associated with their management, responders unquestionably need appropriate training. Ensuring the highest standard of disaster management (DM) training is of paramount importance for high-quality DM. However, the literature concerning DM training monitoring and evaluation (M&E) is scarce. The primary objective of this review was to document the existing M&E strategies for DM training.

Methods: The authors conducted a systematic literature search on June 28, 2023, in the PubMed, Scopus, Embase and Cochrane databases, including studies that described the learning objectives and the M&E strategy of DM training. The authors categorized the learning objectives and the evaluation methodology according to the revised Bloom's Taxonomy and the New World Kirkpatrick model, respectively.

Results: Fifty-seven articles met the inclusion and exclusion criteria; they described DM training targeting healthcare and non-healthcare professionals and employed diverse teaching methods and topics. Five studies reported using monitoring, while all reported an evaluation methodology. The learning objectives focused on students' ability to "Remember" (N = 50) and "Apply" (N = 44). The evaluations centred on the second level of the New World Kirkpatrick model (N = 57), with only 7 articles investigating the third level. Sixteen authors used existing, validated M&E frameworks. When correlating the learning objectives with the evaluation methodology, the authors observed a mismatch, as skills such as the students' ability to "Apply" and "Create" were evaluated using only the second level of the New World Kirkpatrick model.

Conclusions: The great heterogeneity in DM training highlights the particularity of these educational programs. The lack of monitoring and the low usage of existing M&E frameworks point to a lack of awareness and standardization in the field. The mismatch between the learning objectives and the evaluation process led to deceptive evaluations, which may have resulted in graduates being deemed ready to deploy despite facing hardships in real-world settings, potentially leaving responders unprepared.
ISSN: 1472-6920