Oral Presentations

O-12: Assessing performance of the Smart Audit tool for assessing quality of short-term medical missions (STMMs)

Dr. Christopher Dainton

Christopher Dainton*, William Cherniak, Christina Gorman, Charlene Chu

McMaster University
Emergency Medicine


Topic: Global Health

Short-term medical missions (STMMs) are increasingly common among medical professionals and trainees, but little information exists on their quality. We aimed to assess the adherence of a broad sample of STMMs operating in Latin America and the Caribbean (LAC) to key best practices using the Smart Audit assessment tool. This tool was based on a previously developed, literature-based, stakeholder-validated inventory of 18 key best practices for STMMs.

Program administrators from 333 North American STMM organizations operating in LAC were contacted to complete an anonymous SurveyMonkey assessment tool evaluating their adherence to 18 best practice elements. The assessment tool was also publicly available for completion by clinicians and non-clinicians who had recently traveled with each organization. Adherence to each best practice was reported by respondents as Yes, No, or Not Sure, and conflicting data were resolved by investigator consensus. Adherence was calculated as the percentage of Yes responses for each item, and Krippendorff’s alpha was used to assess interrater agreement of the responses.

In total, 178 individuals responded from 93 of 333 organizations (response rate 28%). The reported adherence was greater than 80% for 12/18 surveyed best practices. Reported adherence was lower for “Presence of a formal referral process” (68.5%), “Presence of protocols for clinical scope of practice” (65.2%), “Presence of a formal staffing plan” (75.3%), “Distribution of evidence-based clinical guidelines to providers” (60.7%), “Presence of accessible medical records” (66.3%), and “Assessment of financial cost to the host community” (55.1%). For the 32 organizations with multiple raters, Krippendorff’s alpha ranged between 0.5 and 0.7 (moderate to substantial agreement) for 12/18 best practices, but was lower for “Presence of a local clinician” (0.40; fair agreement), “Presence of a clear pathway for obtaining advanced tests” (0.40; fair agreement), “Distribution of evidence-based clinical guidelines” (0.28; slight agreement), “Protocols for clinical scope of practice” (0.28; slight agreement), “Presence of a formal staffing plan” (0.39; fair agreement), and “Assessment of financial cost to the host community” (0.15; poor agreement).
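The abstract does not describe how Krippendorff’s alpha was computed. As an illustration only, nominal-level alpha for Yes/No/Not Sure ratings grouped by organization can be sketched as below; the function name and example data are hypothetical, not the study’s:

```python
from collections import Counter
from itertools import permutations

def krippendorff_alpha_nominal(units):
    """Nominal Krippendorff's alpha. `units` is a list of units
    (e.g. one organization's answers to a single survey item),
    each a list of the ratings given by its raters."""
    o = Counter()  # coincidence matrix: o[(c, k)] counts c-k rating pairs
    for ratings in units:
        m = len(ratings)
        if m < 2:
            continue  # a single rating contributes no pairable information
        for c, k in permutations(ratings, 2):
            o[(c, k)] += 1 / (m - 1)
    # marginal totals per category, and grand total
    n_c = Counter()
    for (c, _), v in o.items():
        n_c[c] += v
    n = sum(n_c.values())
    d_obs = sum(v for (c, k), v in o.items() if c != k)  # observed disagreement
    d_exp = sum(n_c[c] * n_c[k] for c, k in permutations(n_c, 2)) / (n - 1)
    return 1 - d_obs / d_exp

# Hypothetical data: three organizations' ratings of one best-practice item.
example = [["Yes", "Yes", "No"], ["Yes", "Yes"], ["No", "Not Sure"]]
print(round(krippendorff_alpha_nominal(example), 2))  # → 0.14
```

Perfect agreement across all units yields alpha = 1.0, while values near zero indicate agreement no better than chance, matching the interpretation bands (slight, fair, moderate) used in the results above.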

To our knowledge, this is the largest existing study of STMM best practices, and the first to attempt a data-driven, bottom-up assessment of program structure and adherence to practice standards. Evaluation of core best practice elements may be broadly useful to medical professionals and trainees selecting a high-quality project, and provides a mechanism for assessing and monitoring the landscape of STMMs. More precise language may improve interrater agreement for items in the inventory that performed poorly. Despite multiple attempts to contact all organizations, the low response rate makes it unclear whether the practices of non-respondent organizations differ fundamentally from those of organizations most likely to engage in quality improvement initiatives.