SYSTEMATIC REVIEW PROTOCOL
Pre-simulation preparation and briefing practices for
healthcare professionals and students: a systematic
review protocol
Jane Tyerman,1 Marian Luctkar-Flude,2 Leslie Graham,3 Sue Coffey,4 Ellen Olsen-Lynch5

1School of Nursing, Fleming College, Trent University, Peterborough, Canada; 2School of Nursing, Queen's University, Kingston, Canada; 3School of Nursing, Durham College, Canada; 4School of Nursing, University of Ontario Institute of Technology, Oshawa, Canada; 5Library Department, Trent University, Peterborough, Ontario, Canada
Review question/objective: The objective of this review is to systematically examine the use and effectiveness of
pre-simulation preparation and briefing practices for healthcare professionals and students. More specifically, the
objectives are to identify characteristics/activities of pre-simulation preparation and briefing and their effects on
knowledge, attitudes, self-confidence, self-efficacy, anxiety and skill performance in healthcare professionals and
students.
The review question is: What are the characteristics/activities of effective pre-simulation preparation and briefing?
Keywords Brief; facilitation; orientation; pre-scenario; simulation

Correspondence: Jane Tyerman, janetyerman@trentu.ca

There is no conflict of interest in this project.

DOI: 10.11124/JBISRIR-2016-003055
Background
Interest in the use of simulation for teaching and
learning in academic and practice settings is
mounting. Simulation is currently used for a wide
range of purposes, including as a replacement for
clinical hours,1,2 as an adjunct to clinical hours,3 for
specialty experiences, such as those with a pediatric
or maternal child focus,4 for competency assessment,5,6 for crisis resource management or team
training,7 and for inter-professional education.8,9
As educational programs integrate simulation
experiences in an effort to meet the healthcare quality and patient safety agenda, simulation design
becomes paramount. Simulation experiences involve
three main dimensions: pre-briefing, unfolding the
scenario and debriefing.10,11 Although concepts,
approaches and techniques associated with scenario
development and debriefing are widely reported in
the literature, there is little discussion of pre-briefing
or briefing practices,12–14 and how these practices
contribute to the simulation experience. It has been
suggested that pre-briefing is a critical element in
scenario design and learner engagement,10,15 yet is
underreported in the literature. The aim of this
systematic review is to address the characteristics/
activities of effective pre-simulation preparation
and briefing.
In defining the term briefing or pre-briefing, two
main themes emerged from the literature. First, pre-briefing is used to describe "information or an
orientation session held prior to the start of a simulation-based learning experience in which instructions or preparatory information is given to the
participants."16(p.S6) Activities supported during
the pre-briefing include review of the learning objectives, orientation to the learning environment, and
overview of learner roles and expectations.14,16
During this phase, clarifying learner expectations
and assisting the learner in the suspension of disbelief
is important in fostering learner engagement during
the simulation experience.14 These activities are
completed just prior to the simulation experience
to facilitate achievement of the learning objectives.16
Clear and explicit instruction provided in advance of
the scenario was found to be beneficial in improving
learner performance during the simulation encounter.15
The term pre-briefing has also been used to
describe additional preparatory activities to augment simulation-based learning. Learner preparation may include completion of a web-based
module or independent reading assigned in advance
of the simulation.17–19 Pre-simulation preparation
may also include a list of essential skills that are to be
mastered by the learner prior to attending the simulation experience.19 Other strategies for pre-scenario
preparatory work include creating cognitive aids
such as cue cards or concept maps to be used during
the simulation scenario as a quick reference.19 Pre-scenario preparation can be guided by learning outcomes and descriptors provided by assessment
rubrics, which may help learners to close the gap
between their actual and desired performance.20
Attendance at additional laboratory practice
sessions designed to refine psychomotor skills could
also be included in pre-simulation preparation.13
Discussion continues regarding the appropriate dose and types of pre-simulation activity.17
Gantt13 suggests that simulation-based learning
experiences are associated with a physiologic stress
response, yet too much stress can inhibit the learning
process, leaving the learner unable to perform during
the simulation. When learners experience a level of
comfort in the environment as a result of orientation
and prior exposure to the simulation scenario, the
anxiety associated with the simulation experience
has been shown to be lessened. Decreasing anxiety
promotes student engagement during simulation
experiences, which in turn supports critical thinking
and reflective practice.13–18
The degree to which learners should optimally be
provided with cues during the simulation remains in
question. Sharoff18 provided the learners with comprehensive preparatory materials, which could be
viewed as providing a high level of cueing. Depending on the formative or summative nature of the
assessment, it may or may not be appropriate to cue
the learner. Gantt13 notes a lack of evidence to
recommend the actual components of preparation
for simulation-based learning. Although lectures,
readings, videos and web-based materials are common methods of preparing learners, there is a lack
of evidence to favor one method over another.
Additionally, evidence points to variations in the
amount of assigned pre-simulation preparation that
is actually completed by learners prior to attending
the simulation experience.17 Further, while learners
may be assigned preparatory materials, there is no
verification of review and understanding of this
information.17
For the purpose of this review, we maintain that
simulation experiences involve three learning stages:
preparation, participation and debriefing,10 and that
the preparation stage includes the pre-briefing or
briefing, which occurs immediately prior to
participation in the scenario. To guide this review,
it is necessary to define the key terms:
Pre-simulation preparation is the content or
material provided at unspecified times in advance
of the simulation experience. Broadly, this includes
course-related content in any format shared with the
learner in advance of the scenario, to optimize the
learning. Traditionally, lectures or other preparatory work such as assigned readings are considered pre-simulation activities. Additionally, methods of self-assessment such as pre-simulation quizzes, self-reflections, assessment rubrics or other evaluations
completed by the learner to identify knowledge or
skill gaps in advance of the simulation experience
may be considered pre-simulation preparation
activities.
Pre-briefing and briefing are used synonymously
to indicate the interaction between the facilitator
and the learner, just prior to the simulation experience. During this interaction, the learner is prepared
to fully engage in the learning environment. As a more active experience, pre-briefing may involve the facilitator
reviewing the learning objectives, providing equipment
orientation and sharing other pertinent information
relevant to the specific simulation experience.
As the science of simulation continues to evolve,
emphasis previously placed largely on scenario
development and debriefing must now expand to
other components of the learning experience. As the
focus shifts to other key elements of simulation
design, greater interest is directed toward pre-simulation preparation and pre-briefing practices and
how these contribute to the simulation experience
and learning. Yet few literature reviews have been
completed on this topic.
A search of the JBI Database of Systematic
Reviews and Implementation Reports, Cochrane
Database of Systematic Reviews, MEDLINE, the
Cumulative Index to Nursing and Allied Health
Literature (CINAHL) and Epistemonikos found no
existing systematic reviews completed or proposed
on pre-simulation preparation. However, Page-Cutrara completed a comprehensive review in
2014 focused specifically on pre-briefing in the nursing literature
only.11 The review included 15 articles and reported
that the role of pre-briefing may be beneficial in
developing critical thinking and clinical judgment.11
Chamberlain12 performed a concept analysis in 2015
of pre-briefing in nursing simulation in which she
reported the results of a review conducted using a
single database, CINAHL. After review of 23
articles, it was recommended that pre-briefing be
conducted by faculty educated in the use of simulation to enhance learner engagement and effectiveness of the simulation experience.12 To expand on
the existing work on this topic, our review will
examine all pre-simulation preparation activities,
including pre-briefing, in all health professionals
and health professional students.
For this review, outcome measures for quality
simulation experiences include knowledge, attitudes,
skill performance, self-efficacy and self-confidence.
A variety of tools exist, such as the Simulation Evaluation Tool, which has been used to examine student
satisfaction and self-confidence while engaging in
simulation.21,22 In many studies, student satisfaction
and self-confidence were measured, revealing most
students enjoyed the active learning that occurs in
simulation.22–26 Confidence, most often assessed as a
single-item measure on a questionnaire, increased
after most simulated learning experiences.27,28
Increased self-confidence can lead to enhanced performance and competency, which in turn results in
increased self-efficacy, enhancing performance
through affective, cognitive and motivational processes.25,29,30
For the purposes of this review, any activity
that is completed in preparation for a simulation
experience will be included. Concepts
related to pre-simulation preparation and pre-briefing practices will be explored within the context of
healthcare professionals' education. As more emphasis is placed on simulation-based learning experiences for healthcare students and professionals, a
closer examination of the role of pre-simulation
preparation and pre-briefing practices is warranted,
specifically including a focus on how they contribute
to the learning process.
Inclusion criteria
Types of participants
This quantitative review will consider studies from
any setting that include any health professionals
and/or health professional students participating
in simulation. Simulation may include medium-fidelity, hybrid or high-fidelity simulations using computerized manikins or standardized patients.
Types of intervention(s)/phenomena of interest
The review will consider studies that evaluate
characteristics/activities of pre-simulation preparation and/or pre-simulation briefing (or pre-briefing). Comparators may include traditional
lecture preparation, alternate preparation or briefing, or no preparation and/or briefing activities.
Outcomes
This review will consider studies that include any
of the following learner outcome measures:
knowledge, attitudes, self-confidence, self-efficacy,
anxiety and skill performance. A variety of measures
are implemented to evaluate simulation-based
learning activities. Competency-based checklists,
rubrics and scales are used in the evaluation of
simulation-based learning outcome measures.
Instruments used in the studies for this review range
from researcher-developed tools to well-validated
and reliable instruments.
Types of studies
This review will consider both experimental and
epidemiological study designs, including randomized controlled trials, non-randomized controlled
trials, quasi-experimental and before-and-after studies,
prospective and retrospective cohort studies, case-control studies and analytical cross-sectional studies.
The review will also consider descriptive epidemiological study designs, including case series, individual case reports and descriptive cross-sectional
studies.
Search strategy
The search strategy aims to find both published and
unpublished studies. A three-step search strategy
will be utilized in this review. An initial limited
search of MEDLINE and CINAHL will be undertaken, followed by analysis of the text words contained in the title and abstract, and of the index terms
used to describe the article. A second search using
all identified keywords and index terms will then be
undertaken across all included databases. Third,
the reference list of all identified reports and
articles will be searched for additional studies.
No language limitation will be placed on the search
strategy. Research studies published in languages
other than English will be tallied and reported. To
be thorough, no date limit will be placed on the
search strategy.
The databases to be searched include: MEDLINE,
CINAHL, PsycINFO, ERIC, Web of Science, and
Cochrane Central Register of Controlled Trials.
The search for unpublished studies will include:
Dissertation Abstracts, Google, and selected grey
literature databases/gateways (e.g. OpenGrey, Grey
Literature Report, Grey Source).
Initial keywords to be used will be: simulation, prebrief$, brief$, prescenario, pre-scenario, pre-simulation, presimulation, pretrain$, pre-train$, preparation, orientation, facilitation.
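As an illustration only, these keywords might be combined into a single Boolean search line along the following lines (a hypothetical Ovid-style string shown for clarity; the $ truncation symbol and exact syntax will be adapted to each database and documented in the final report):

    simulation AND (prebrief$ OR brief$ OR prescenario OR pre-scenario OR
    pre-simulation OR presimulation OR pretrain$ OR pre-train$ OR
    preparation OR orientation OR facilitation)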
Assessment of methodological quality
Quantitative articles selected for retrieval will be
assessed by two independent reviewers for methodological validity prior to inclusion in the review using
standardized critical appraisal instruments from the
Joanna Briggs Institute Meta-Analysis of Statistics
Assessment and Review Instrument (JBI-MAStARI)
(Appendix I). Any disagreements that arise between
the reviewers will be resolved through discussion or
with a third reviewer.
Data extraction
Quantitative data will be extracted from articles
included in the review independently by two
reviewers, using the standardized data extraction
tool from JBI-MAStARI (Appendix II). The data
extracted will include specific details about the
interventions (e.g. duration, content and method
of delivery), populations, study methods and outcomes of significance to the review question and
specific objectives. For missing information or to
clarify unclear data, the authors of primary studies
will be contacted. Any disagreements that arise
between the reviewers will be resolved through discussion or with a third reviewer.
Data synthesis
Quantitative articles will, wherever possible, be
pooled in statistical meta-analysis using JBI-MAStARI.
All results will be subject to double data entry.
Effect sizes expressed as odds ratios (for categorical
data) and weighted mean differences (for continuous
data), and their 95% confidence intervals, will be
calculated for analysis. Heterogeneity will be
assessed statistically using the standard chi-squared (χ²)
test and also explored using subgroup analyses based on the
different quantitative study designs included in this
review. Where statistical pooling is not possible, the
findings will be presented in narrative form, including
tables and figures to aid in data presentation,
wherever appropriate. The MAStARI critical
appraisal tool will be used to evaluate the studies,
with each reviewer blinded to the other's appraisal.
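As background to the planned analysis, the following is a brief sketch of the conventional definitions behind these statistics; these are standard meta-analysis formulas rather than anything specific to JBI-MAStARI, and the symbols (a, b, c, d, k, w_i, theta-hat) are generic notation introduced here for illustration:

\[ \mathrm{OR} = \frac{a/b}{c/d} = \frac{ad}{bc} \]

where a and b are the numbers of participants with and without the event in the intervention group, and c and d are the corresponding numbers in the comparator group;

\[ \mathrm{WMD} = \frac{\sum_{i=1}^{k} w_i\,(\bar{x}_{1i} - \bar{x}_{2i})}{\sum_{i=1}^{k} w_i}, \qquad w_i = \frac{1}{\mathrm{Var}(\bar{x}_{1i} - \bar{x}_{2i})} \]

the inverse-variance weighted mean difference pooled across k studies; and

\[ Q = \sum_{i=1}^{k} w_i\,\bigl(\hat{\theta}_i - \hat{\theta}\bigr)^2 \sim \chi^2_{k-1} \]

Cochran's Q statistic for heterogeneity, where \(\hat{\theta}_i\) is the effect estimate from study i and \(\hat{\theta}\) is the pooled estimate.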
References
1. Arthur C, Kable A, Levett-Jones T. Human patient simulation
manikins and information communication technology use
in Australian schools of nursing: a cross-sectional survey.
Clin Simul Nurs 2011;7(6):e219–27.
2. Hayden J, Smiley R, Alexander M, Kardong-Edgren S, Jeffries
P. The NCSBN National Simulation Study: a longitudinal,
randomized, controlled study replacing clinical hours with
simulation in prelicensure nursing education. J Nurs Regul
2014;5(2 Suppl):S3–40.
3. Wolfgram LJ, Quinn AO. Integrating simulation innovatively:
evidence in teaching in nursing education. Clin Simul Nurs
2012;8(5):e169–75.
4. Cato ML, Lasater K, Peeples AI. Nursing students’ self-assessment of their simulation experiences. Nurs Educ Perspect
2009;30(2):105–8.
5. Benner PE. Educating nurses: a call for radical transformation. 1st ed. San Francisco, CA: Jossey-Bass; 2010.
6. Seropian MA. General concepts in full scale simulation:
getting started. Anesth Analg 2003;97(6):1695–705.
7. Anderson M, LeFlore JL, Anderson JM. Evaluating videotaped role-modeling to teach crisis resource management
principles. Clin Simul Nurs 2013;9(9):e343–54.
8. Alinier G, Harwood C, Harwood P, Montague S, Huish E,
Ruparelia K, et al. Immersive clinical simulation in undergraduate health care interprofessional education: knowledge and perceptions. Clin Simul Nurs 2014;10(4):e205–16.
9. Palaganas J. Using health care simulation for interprofessional education and practice. In: Ulrich BT, Mancini ME,
editors. Mastering simulation: a handbook for success. Indianapolis, IN: Sigma Theta Tau International, 2014;175–201.
10. Husebø SE, Friberg F, Søreide E, Rystedt H. Instructional
problems in briefings: How to prepare nursing students for
simulation-based cardiopulmonary resuscitation training.
Clin Simul Nurs 2012;8(7):e307–18.
11. Page-Cutrara K. Use of prebriefing in nursing simulation: a
literature review. J Nurs Educ 2014;53(3):136–41.
12. Chamberlain J. Prebriefing in nursing simulation: a concept
analysis using Rodger’s methodology. Clin Simul Nurs
2015;11(7):318–22.
13. Gantt LT. The effect of preparation on anxiety and performance in summative simulations. Clin Simul Nurs 2013;9(1):
e25–33.
14. Rudolph JW, Raemer DB, Simon R. Establishing a safe
container for learning in simulation: the role of the presimulation briefing. Simul Healthc 2014;9(6):339–49.
15. Brackney DE, Priode KS. Creating context with prebriefing: a
case example using simulation. J Nurs Educ Pract 2014;5(1):129.
16. Meakim C, Boese T, Decker S, Franklin A, Gloe D, Lioce L, et al.
Standards of best practice: simulation. Standard I: terminology. Clin Simul Nurs 2011;7(4 Suppl):S3–7.
17. Franklin AE, Sideras S, Gubrud-Howe P, Lee CS. Comparison
of expert modeling versus voice-over PowerPoint lecture
and presimulation readings on novice nurses’ competence
of providing care to multiple patients. J Nurs Educ
2014;53(11):615–22.
18. Waxman KT. The development of evidence-based clinical
simulation scenarios: guidelines for nurse educators. J Nurs
Educ 2010;49(1):29–35.
19. Sharoff L. Simulation: pre-briefing preparation, clinical judgment and reflection. What is the connection? J Contemp
Med 2015;5(2):88–101.
20. Ellery K. Assessment for learning: a case study using feedback effectively in an essay-style test. Assess Eval High Educ
2008;33(4):421–9.
21. Howard V, Englert N, Kameg K, Perrozzi K. Integration of
simulation across the curriculum: Student and faculty
perspectives. Clin Simul Nurs 2011;7(1):e1–10.
22. Howard V, Ross C, Mitchell A, Nelson G. Human patient
simulators and interactive case studies: a comparative
analysis of learning outcomes and student perceptions.
CIN: Comput Inform Nurs 2010;28(1):42–8.
23. Bambini D, Washburn J, Perkins R. Outcomes of clinical
simulation for novice nursing students: communication,
confidence, clinical judgment. Nurs Educ Perspect
2009;30(2):79–82.
24. O’Donnell J, Decker S, Howard V, Levett-Jones T, Miller C.
NLN/Jeffries Simulation Framework state of the science
project: simulation learning outcomes. Clin Simul Nurs
2014;10:373–82.
25. Smith S, Roehrs C. High fidelity simulation: Factors correlated with nursing student satisfaction and self-confidence.
Nurs Educ Perspect 2009;30(2):74–8.
26. Zulkosky K. Simulation use in the classroom: impact
on knowledge acquisition, satisfaction, and self-confidence.
Clin Simul Nurs 2012;8(1):25–33.
27. McCaughey C, Traynor M. The role of simulation in nurse
education. Nurse Educ Today 2010;30(8):827–32.
28. Mould J, White H, Gallagher R. Evaluation of a critical
simulation series for undergraduate nursing students. Contemp Nurse 2011;38(1-2):180–90.
29. Arnold J, Johnson L, Tucker S, Malec J, Henrickson S, Dunn
W. Evaluation tools in simulation learning: performance and
self-efficacy in emergency response. Clin Simul Nurs
2009;5(1):e35–43.
30. Bandura A. Regulation of cognitive processes through
perceived self-efficacy. Dev Psychol 1989;25(5):729–35.
Appendix I: MAStARI appraisal instruments
JBI Critical Appraisal Checklist for Experimental Studies
Reviewer _________________________________ Date __________________
Author _______________________________________________Year ______
Each item is rated Yes / No / Unclear:

1. Was the assignment to treatment groups truly random?
2. Were participants blinded to treatment allocation?
3. Was allocation to treatment groups concealed from the allocator?
4. Were the outcomes of people who withdrew described and included in the analysis?
5. Were those assessing the outcomes blind to the treatment allocation?
6. Were control and treatment groups comparable at entry?
7. Were groups treated identically other than for the named interventions?
8. Were outcomes measured in the same way for all groups?
9. Were outcomes measured in a reliable way?
10. Was appropriate statistical analysis used?

Overall appraisal:  Include / Exclude / Seek further info

Comments (including reasons for exclusion):
JBI Critical Appraisal Checklist for Comparable Cohort/Case Control
Reviewer _________________________________ Date __________________
Author _______________________________________________Year ______
Each item is rated Yes / No / Unclear:

1. Is sample representative of patients in the population as a whole?
2. Are the patients at a similar point in the course of their condition/illness?
3. Has bias been minimized in relation to selection of cases and of controls?
4. Are confounding factors identified and strategies to deal with them stated?
5. Are outcomes assessed using objective criteria?
6. Was follow up carried out over a sufficient time period?
7. Were the outcomes of people who withdrew described and included in the analysis?
8. Were outcomes measured in a reliable way?
9. Was appropriate statistical analysis used?

Overall appraisal:  Include / Exclude / Seek further info

Comments (including reasons for exclusion):
JBI Critical Appraisal Checklist for Descriptive/Case Series
Reviewer _________________________________ Date __________________
Author _______________________________________________Year ______
Each item is rated Yes / No / Unclear:

1. Was study based on a random or pseudo-random sample?
2. Were the criteria for inclusion in the sample clearly defined?
3. Were confounding factors identified and strategies to deal with them stated?
4. Were outcomes assessed using objective criteria?
5. If comparisons are being made, was there sufficient description of the groups?
6. Was follow up carried out over a sufficient time period?
7. Were the outcomes of people who withdrew described and included in the analysis?
8. Were outcomes measured in a reliable way?
9. Was appropriate statistical analysis used?

Overall appraisal:  Include / Exclude / Seek further info

Comments (including reasons for exclusion):
Appendix II: Data extraction instruments
MAStARI data extraction instrument
JBI Data Extraction Form for Experimental/Observational Studies

Reviewer: ____________   Date: ____________
Author: ____________   Year: ______   Journal: ____________   Record Number: ______

Study method:   RCT / Quasi-RCT / Retrospective / Observational / Longitudinal / Other

Participants
   Setting:
   Population:
   Sample size:   Intervention 1: ______   Intervention 2: ______   Intervention 3: ______

Interventions
   Intervention 1:
   Intervention 2:
   Intervention 3:

Clinical outcome measures
   Outcome description                     Scale/measure

Study results
   Dichotomous data:
      Outcome  |  Intervention ( ): number/total number  |  Intervention ( ): number/total number
   Continuous data:
      Outcome  |  Intervention ( ): mean & SD (number)  |  Intervention ( ): mean & SD (number)

Authors' conclusions:

Comments: