Vol 99, Issue 2

Review

Digital Education for Health Professions in the Field of Dermatology: A Systematic Review by Digital Health Education Collaboration

Xiaomeng Xu1, Pawel Przemyslaw Posadzki1, Grace E. Lee2, Josip Car1,3 and Helen Elizabeth Smith4

1Centre for Population Health Sciences (CePHaS), 4Family Medicine and Primary Care, Lee Kong Chian School of Medicine, Nanyang Technological University, 2School of Computer Science and Engineering, Nanyang Technological University, Singapore, Singapore, and 3Department of Primary Care and Public Health, School of Public Health, Imperial College London, London, UK

ABSTRACT

Digital health education is a new approach that is receiving increasing attention, offering advantages such as scalability and flexibility. This study employed a Cochrane review approach to assess the evidence for the effectiveness of digital education for health professions in dermatology in improving knowledge, skills, attitudes and satisfaction. Twelve trials (n = 955 health professionals) met our eligibility criteria. Nine studies evaluated knowledge; of those, 2 reported that digital education improved the outcome. Five studies evaluated skills; of those, 3 reported that digital education improved this outcome, whereas 2 showed no difference compared with control. Of the 5 studies measuring learners' satisfaction, 3 reported high satisfaction scores, while 2 reported that, compared with traditional education, digital education had little effect on satisfaction. The evidence for the effectiveness of digital health education in dermatology is mixed and the overall findings are inconclusive, mainly because of the predominantly very low quality of the evidence. More methodologically robust research is needed to further inform clinicians and policymakers.

Key words: review; dermatology; education.

Accepted Oct 15, 2018; Epub ahead of print Oct 15, 2018

Acta Derm Venereol

Corr: Josip Car, Centre for Population Health Sciences, 11 Mandalay Road, Level 18 Clinical Sciences Building, Singapore 308232. E-mail: josip.car@ntu.edu.sg

SIGNIFICANCE

Digital education is a promising new approach with advantages such as scalability, flexibility, portability and adaptability. This study synthesized the evidence for the effectiveness of digital education for health professions in dermatology, assessing whether it can improve knowledge, skills, attitudes and satisfaction compared with traditional learning. We found 12 studies involving a total of 955 health professionals. The main learning outcomes were comparable in terms of knowledge improvement, skills enhancement and satisfaction, suggesting the potential of digital health education as a complementary or alternative method to traditional learning in dermatology. It could help address the increased demand for dermatology education, but requires further rigorous research to maximise its potential.

INTRODUCTION

There is a growing burden of skin conditions among the general population (1–3). Several studies have shown that up to 7% of primary care consultations are for skin-related complaints (4–6). At the same time, there is an increasing worldwide shortage of healthcare professionals, including dermatologists and dermatology nurses (7). The world is short of 17.4 million healthcare professionals (8), and this shortage is projected to persist, with a deficit of 14 million in 2030 (9). It may be further accentuated by inadequate dermatology education: formal dermatology education is estimated to represent only 0.24–0.3% of teaching time in the undergraduate medical and nursing curriculum. In response, according to the American Academy of Dermatology's (AAD) 2007 practice profile survey, dermatology practices have increasingly employed dermatology nurse practitioners to augment dermatology care services (10, 11). As for medical students, one survey reported that they received no more than 18 h of dermatological education in medical school (10). This paucity of dermatology training is of concern, as patients with skin problems are encountered in many clinical specialties, including general medicine, paediatrics, venereology and general practice.

To meet the increasing need for dermatology education, it is necessary to provide high-quality teaching to pre- and post-registration health professionals (12). A recent study highlighted the potential of digital health education (DHE) in dermatology (13): digital education was found to significantly increase the effectiveness of dermatology learning, enhance the quality of education and expand teachers' resources. DHE (also known as eLearning) is a broad construct covering approaches to teaching and learning in healthcare that are delivered or enhanced by digital technology. It encompasses many different offline and online modalities, including virtual patients (VP), virtual reality environments (VRE), mobile learning (mLearning or mobile digital education), psychomotor skill trainers (PST), digital game-based learning (DGBL), virtual learning environments (VLEs), learning management systems (LMSs) and massive open online courses (MOOCs) (14).

Compared with traditional education, DHE has advantages such as flexibility, portability and, especially, cost-effectiveness. It offers an efficient and convenient mode of learning for health professionals and students, which may further improve learning outcomes (15). DHE also enables healthcare professionals to learn and update their knowledge remotely, without restrictions of time and location (16). Moreover, DHE could potentially optimize resource utilization and reduce healthcare costs (17). Dermatology is a particularly suitable discipline for digital education because of its strong dependency on visual cues. Although many randomised controlled trials (RCTs) have evaluated the effectiveness of DHE, few have focused on dermatology. Therefore, in this systematic review, we critically evaluate the evidence for the use of digital education in dermatology.

METHODS

The review protocol was registered at PROSPERO (Prospero registration no. 42016051156; http://www.crd.york.ac.uk/prospero/).

Search strategy

We conducted our systematic review following Cochrane methods (18). Our review is part of a larger series of systematic reviews synthesizing the evidence on different types of DHE (19–21). The databases were first searched in 2014 and the searches were updated annually, most recently in August 2017. The following electronic databases were searched: Cochrane Central Register of Controlled Trials (CENTRAL), MEDLINE (via Ovid), Embase (via Ovid), Web of Science, Educational Resource Information Centre (ERIC) (via Ovid), PsycINFO (via Ovid) and CINAHL (via EBSCO). Reference lists of included studies and relevant systematic reviews were also searched as potential sources.

Selection of studies

Two authors (XX and GL) independently screened titles and abstracts to identify studies potentially meeting the eligibility criteria. Randomised controlled trials (RCTs), cluster RCTs (cRCTs) and quasi-randomised trials of all types of DHE or blended education (DHE plus traditional education) were eligible for this systematic review. Cross-over trials, including stepped-wedge designs, were excluded because of the high likelihood of carry-over effects. To be included, a study had to compare the effectiveness of full DHE or blended education with traditional or no education. If an RCT had more than one DHE intervention group, we compared the relevant DHE arm with the least active control arm (22). Based on the health field of education and training of UNESCO's International Standard Classification of Education (ISCED-F), the eligible pre- and post-registration healthcare professionals' fields of study were dental studies, medicine, nursing and midwifery, medical diagnostic and treatment technology, therapy and rehabilitation, and pharmacy (23).

Outcome measures

Primary outcomes included (i) changes in dermatology knowledge (e.g. quantified differences in post-intervention scores), (ii) changes in clinical skills (e.g. pre- and post-test scores, time to perform a procedure, number of errors made while performing a procedure), (iii) changes in attitudes (e.g. pre- and post-intervention scores related to attitudes) and (iv) changes in learners' satisfaction with the DHE intervention (e.g. Likert-like scales related to satisfaction). Secondary outcomes included (i) economic outcomes of the intervention (e.g. cost-effectiveness, implementation cost, return on investment), (ii) outcomes related to patient care, including patients' satisfaction and improvement in clinical signs and symptoms (e.g. remission rate, disease-related clinical and behavioural indices) and (iii) adverse/unintended effects of the interventions.

Data extraction and risk of bias (ROB) assessments

Two authors (XX and GL) independently extracted the data using a custom-made data extraction form. Cochrane's Risk of Bias (ROB) tool was used to assess the following domains: random sequence generation (selection bias); allocation concealment (selection bias); blinding of participants and personnel (performance bias); blinding of outcome assessment (detection bias); completeness of outcome data (attrition bias); selective outcome reporting (relevant outcomes reported); and other sources of bias (baseline imbalances) (18). The ROB judgement for each study in each domain fell into one of 3 categories: high, low or unclear risk of bias. Judgements were made by 2 reviewers independently of each other; any disagreements were resolved through discussion, with a third reviewer (CS) acting as arbiter.

Summary of Findings Tables

The quality of the evidence was evaluated using the Grading of Recommendations, Assessment, Development and Evaluation (GRADE) criteria (24). The quality of the evidence was downgraded by 1 or 2 levels for study limitations (ROB), inconsistency, indirectness, imprecision and publication bias. Any disagreements were resolved through discussion with an arbiter (CS).

RESULTS
Literature search

The initial searches yielded 6,676 potential studies, of which 234 remained after primary screening of titles and abstracts. Screening the full text of these studies identified 12 trials that met our review criteria (22, 25–35) (see Fig. 1). All the included studies were RCTs, conducted between 2002 and 2016 in Brazil (28, 32), China (22), France (34), the Kingdom of Saudi Arabia (26), Norway (27, 31), Sweden (35), Spain (33), the US (29, 30) and the UK (25). Participants ranged from year-one undergraduates (22, 25, 26, 28, 30, 32, 33, 35) to certified physicians (29, 31, 34) and nurses (27). Sample sizes ranged from 46 (30) to 141 (34), and the duration of the intervention varied from 15 min (28) to 6 months (31). The digital education comprised offline computer-based tutorials (22, 26, 29, 32, 35), online computer-based tutorials (30, 31, 34) and computer-based learning software (25, 27, 28, 33). The comparators were traditional education (22, 25, 26, 28, 30, 32, 33, 35) or no education (27, 29, 31, 34). Nine studies (75%) assessed knowledge improvement using multiple-choice questions (MCQs) (27, 30–32, 34), written examinations (22, 33, 35) and questionnaires (28). Five studies (42%) evaluated skills enhancement using grades of student performance subjectively assessed by tutors (22), objective examinations of clinical cases (25, 26, 29), and a 10-item checklist (one score per item) together with a global assessment (9 items, each scored 1–5) (28). Satisfaction was evaluated in 5 studies (42%) using a variety of Likert-type scales and questionnaires (22, 26, 31, 32, 35), none of which were validated. None of the studies evaluated attitudes.


Fig. 1. PRISMA flow diagram.

Risk of bias assessment

Judgements about each ROB item for each study are presented in Figs 2 and 3. Three studies (25%) (25, 26, 29) were judged to have an unclear ROB for random sequence generation. Seven studies (58%) (22, 25, 26, 29, 30, 33, 35) were judged to have an unclear ROB for allocation concealment. Six studies (50%) (25–27, 29, 30, 35) were judged to have an unclear ROB, and one study a high ROB, for blinding of personnel and participants. Seven studies (58%) (25–27, 29, 31, 33, 34) were judged to have an unclear ROB, and one study a high ROB, for blinding of outcome assessors. One study (8%) (27) was judged to have a high ROB for incomplete outcome data. Two studies, Aldridge et al. (25) and Amri et al. (26), were judged to have an unclear ROB for reporting bias, while the others were judged to have a low ROB. Only one study (1/12) (27) was judged to have an unclear ROB in the 'other bias' domain, due to baseline imbalance.


Fig. 2. Risk of bias summary: review authors’ judgements about each risk-of-bias item for each included study.


Fig. 3. Risk-of-bias: review authors’ judgments about each risk-of-bias item presented as percentages of all included studies.

Summary of Findings Tables

None of the studies mentioned a power calculation. For knowledge, the evidence (9 RCTs, 638 participants) was of very low quality because of serious concerns about ROB, inconsistency and indirectness. For skills, the evidence (5 RCTs, 338 participants) was of low quality because of serious concerns about ROB and inconsistency. For satisfaction, the evidence (5 RCTs, 380 participants) was of very low quality because of serious concerns about ROB, inconsistency and indirectness (Table SI).

Description of studies

Because of the high heterogeneity of the included studies, we did not employ meta-analytic techniques; instead, we used a narrative approach to synthesize the data. Detailed information on the included studies is shown in Table SII. Seven studies (22, 27, 30–32, 34, 35) found no difference between digital education and traditional or no intervention in improving post-intervention knowledge scores. Only 2 of the 9 studies (28, 33), both with low to unclear ROB, found that digital education resulted in a significant improvement in knowledge compared with a traditional learning intervention (p < 0.01 (28); p < 0.001 (33)). A subgroup analysis by registration level revealed differences in learning outcomes: among post-registration participants (nurses and physicians), 1 of 4 studies showed a significant difference (29); among pre-registration healthcare professionals (medical students and nursing undergraduates), 2 of 8 studies (25, 28) showed significant differences in knowledge improvement. Three of the 5 studies favoured digital education for skills enhancement compared with traditional or no learning intervention. Skills were significantly better in skin flap surgery (checklist: p < 0.02; global assessment: p < 0.017) (28), skin cancer triage (diagnostic accuracy rate: p < 0.001; evaluation plan: p < 0.001) (29) and dermatological knowledge (diagnostic accuracy rate: p < 0.00001) (25), following 15 min or 10 days of digital education (software) (25, 28, 29). Five studies (22, 26, 31, 32, 35) measured satisfaction. Of these, 2 (22, 31) reported that digital education may have little or no effect on learners' satisfaction (measured with Likert-like scales) when compared with controls (no education and traditional learning). The remaining 3 studies (25, 32, 35) had no comparison group.
None of the included studies reported attitude-related outcomes, nor any of the secondary outcomes: economic outcomes, patient-related outcomes, and adverse or unintended effects of digital education.

DISCUSSION

The overall findings of this review are inconclusive because of the significant methodological, clinical and statistical heterogeneity of the included studies. The uncertainty of the results stems from inconsistency, imprecision and indirectness arising from the diverse participants, educational interventions, measurement instruments and comparison groups. The participants were heterogeneous, ranging from medical and nursing students to registered nurses and healthcare professionals, including specialists and primary care physicians.

The interventions were also heterogeneous, ranging from offline learning (digital photographs) and software-based learning to online learning and blended learning (a combination of conventional and computer-based learning) (22, 27, 32–35). The comparison groups were heterogeneous too: 8 of the 12 studies (22, 25, 26, 28, 30, 32, 33, 35) used traditional learning, while the others had a no-intervention (no-education) group. The traditional learning interventions also differed, including paper-based learning (22, 25, 26, 28) and lectures (30, 32, 33, 35).

The outcome measurement instruments also varied. Knowledge improvement was measured with different instruments, including MCQs, written examinations and questionnaires. Similar variability was found in the skills measurement tools: objective clinical skills examinations (diagnostic accuracy measurements; disease recognition; management plans) were employed in 4 studies (22, 25, 26, 29), while a subjective checklist assessment was used in de Sena's study (28). For satisfaction, 4 studies used Likert-like scales, but the criteria assessed differed; the remaining study (35) used a 4-point rating-scale questionnaire. The variety of assessment tools employed hinders comparison among the studies and weakens the interpretability of the findings (14, 21).

Strength and limitations

This study has several strengths, including comprehensive searches, rigorous adherence to high methodological standards and critical appraisal of the evidence. Our comprehensive search strategy had no language restrictions and included a wide range of databases and peer-reviewed journals. However, several limitations should be kept in mind when interpreting its results. The included studies reported sample sizes without power calculations, leaving uncertainty about their ability to detect an effect. In addition, non-randomised experiments are the most commonly used design for the evaluation of digital education, but we excluded these in order to maintain the quality of the included studies.

It should be highlighted that blinding study participants and personnel to digital education is difficult, and there is always a possibility of unidentified contamination between the study arms because digital education materials are easily accessible. Participants assigned to the control group may easily visit the webpage or download the app without the researchers' awareness, resulting in potential performance bias. Moreover, learning content and learning theories were barely mentioned in the included papers. Unverified and incomplete learning content may mislead learners and even endanger patients' health. Authoritative digital education practices and strategies grounded in different learning theories may contribute to better knowledge improvement, skills enhancement and higher satisfaction with less cognitive overload (36, 37). Theory-informed education allows educators to optimize learners' experience, makes a deeper understanding of the learning materials possible and helps connect the learning content with applications in practice. Finally, no economic outcomes were reported (28).

Our study demonstrated little or no improvement in dermatological knowledge, skills, attitudes and satisfaction among healthcare professionals when DHE was compared with traditional learning or no intervention. These results are not specific to dermatology; they have also been observed in other medical education settings, including nursing, neurosurgery training, palliative care and dental education (14, 21, 38–41). Such comparable results show the potential of DHE as a complementary, or even alternative, intervention to traditional learning. Given its flexibility, portability, adaptability and scalability, DHE could be a promising solution for meeting the increasing demand for dermatology education and addressing the shortage of highly trained dermatologists (42, 43).

Further research is needed to address the methodological limitations mentioned above. There is also a need to adopt learning theories in the design of future DHE interventions and to employ power and sample size calculations. Moreover, to avoid contamination between groups, researchers need to track and follow up control-group participants to make sure they are not exposed to DHE. Future studies should also provide information on the cost, cost-effectiveness and potential unintended effects of digital education.

Conclusion

The evidence for the effectiveness of DHE in dermatology is mixed and the overall findings are inconclusive mainly because of the predominantly very low quality of the evidence. More methodologically robust research is needed to further inform clinicians and policymakers.

ACKNOWLEDGEMENT

We thank Ms. Chen Si for assistance with resolving disagreements when screening the papers for inclusion in this review.

This study was funded by NTU Research Scholarship.

The authors have no conflicts of interest to declare.

REFERENCES
  1. Hay RJ, Johns NE, Williams HC, Bolliger IW, Dellavalle RP, Margolis DJ, et al. The global burden of skin disease in 2010: an analysis of the prevalence and impact of skin conditions. J Invest Dermatol 2014; 134: 1527–1534.
  2. Hay RJ, Fuller LC. Global burden of skin disease in the elderly: a grand challenge to skin health. G Ital Dermatol Venereol 2015; 150: 693–698.
  3. Kalia S, Haiducu ML. The burden of skin disease in the United States and Canada. Dermatol Clin 2012; 30: 5–18, vii.
  4. Duque G, Finkelstein A, Roberts A, Tabatabai D, Gold SL, Winer LR, et al. Learning while evaluating: the use of an electronic evaluation portfolio in a geriatric medicine clerkship. BMC Med Educ 2006; 6: 4.
  5. Alavi A, Moallemi N, Zolfaghari G, Amjadi N. Effect of maintenance therapy with isoxsuprine in prevention of preterm labor. Int J Gynaecol Obstet 2015; 131: E477.
  6. Badiei M, Gharib M, Zolfaghari M, Mojtahedzadeh R. Comparing nurses’ knowledge retention following electronic continuous education and educational booklet: a controlled trial study. Med J Islam Repub Iran 2016; 30: 364.
  7. Kimball AB, Resneck JS Jr. The US dermatology workforce: a specialty remains in shortage. J Am Acad Dermatol 2008; 59: 741–745.
  8. Sinclair JA, Singla A, Lowe R, Jacobson BC. A randomized, controlled trial to improve ascites management by internal medicine residents. Gastroenterology 2012; 142: S62.
  9. Aluttis C, Bishaw T, Frank MW. The workforce for health in a globalized context – global shortages and international migration. Glob Health Action 2014; 7: 23611.
  10. Resneck JS Jr, Kimball AB. Who else is providing care in dermatology practices? Trends in the use of nonphysician clinicians. J Am Acad Dermatol 2008; 58: 211–216.
  11. Tsang MW, Resneck JS Jr. Even patients with changing moles face long dermatology appointment wait-times: a study of simulated patient calls to dermatologists. J Am Acad Dermatol 2006; 55: 54–58.
  12. World Health Organization. A universal truth: no health without a workforce. World Health Organization, 2014. Available from: http://www.who.int/workforcealliance/knowledge/resources/GHWA-a_universal_truth_report.pdf
  13. Boespflug A, Guerra J, Dalle S, Thomas L. Enhancement of customary dermoscopy education with spaced education e-learning: a prospective controlled trial. JAMA Dermatol 2015; 151: 847–853.
  14. George PP, Papachristou N, Belisario JM, Wang W, Wark PA, Cotic Z, et al. Online eLearning for undergraduates in health professions: a systematic review of the impact on knowledge, skills, attitudes and satisfaction. J Glob Health 2014; 4: 010406.
  15. Goh PS. eLearning or technology enhanced learning in medical education – hope, not hype. Med Teach 2016; 38: 957–958.
  16. Reid HJ, Thomson C, McGlade KJ. Content and discontent: a qualitative exploration of obstacles to elearning engagement in medical students. BMC Med Educ 2016; 16: 188.
  17. Sissine M, Segan R, Taylor M, Jefferson B, Borrelli A, Koehler M, et al. Cost comparison model: blended eLearning versus traditional training of community health workers. Online J Public Health Inform 2014; 6: e196.
  18. Higgins JPT, Green S, editors. Cochrane Handbook for Systematic Reviews of Interventions. Version 5.1.0. The Cochrane Collaboration, 2011.
  19. Fontaine G, Cossette S. Effectiveness of adaptive e-learning environments on knowledge, competence, and behavior in health professionals and students: protocol for a systematic review and meta-analysis. JMIR Res Protoc 2017; 6: e128.
  20. Krishnasamy C, Ong SY, Yock Y, Lim I, Rees R, Car J. Factors influencing the implementation, adoption, use, sustainability and scalability of mLearning for medical and nursing education: a systematic review protocol. JMIR Res Protoc 2016; 5: 178.
  21. Rasmussen K, Belisario JM, Wark PA, Molina JA, Loong SL, Cotic Z, et al. Offline eLearning for undergraduates in health professions: a systematic review of the impact on knowledge, skills, attitudes and satisfaction. J Glob Health 2014; 4: 010405.
  22. Li J, Li QL, Li J, Chen ML, Xie HF, Li YP, et al. Comparison of three problem-based learning conditions (real patients, digital and paper) with lecture-based learning in a dermatology course: a prospective randomized study from China. Med Teach 2013; 35: e963–970.
  23. UNESCO Institute for Statistics. International Standard Classification of Education (ISCED). 2018.
  24. Schünemann HJ, Oxman AD, Higgins JPT, Vist GE, Glasziou P, Guyatt GH. Chapter 11: Presenting results and ‘Summary of findings’ tables. In: Cochrane Handbook for Systematic Reviews of Interventions. The Cochrane Collaboration, 2011.
  25. Aldridge B, Glodzik D, Bisset Y, Ballerini L, Robertson K, Xi L, et al. Dermofit: a novel software that improves novices’ diagnostic accuracy to a level above that of trained medical students. J Invest Dermatol 2016; 31: 32.
  26. Amri M, ElHani I, Alkhateeb AA. Digital photographs in clinical teaching of dermatology: what is their proper place? Med Teach 2012; 34: 510–511.
  27. Bredesen IM, Bjoro K, Gunningberg L, Hofoss D. Effect of e-learning program on risk assessment and pressure ulcer classification – a randomized study. Nurse Educ Today 2016; 40: 191–197.
  28. de Sena DP, Fabricio DD, Lopes MH, da Silva VD. Computer-assisted teaching of skin flap surgery: validation of a mobile platform software for medical students. PLoS One 2013; 8: e65833.
  29. Gerbert B, Bronstone A, Maurer T, Berger T, McPhee SJ, Caspers N. The effectiveness of an Internet-based tutorial in improving primary care physicians’ skin cancer triage skills. J Cancer Educ 2002; 17: 7–11.
  30. Jenkins S, Goel R, Morrell DS. Computer-assisted instruction versus traditional lecture for medical student teaching of dermatology morphology: a randomized control trial. J Am Acad Dermatol 2008; 59: 255–259.
  31. Schopf T, Flytkjaer V. Impact of interactive web-based education with mobile and email-based support of general practitioners on treatment and referral patterns of patients with atopic dermatitis: randomized controlled trial. J Med Internet Res 2012; 14: e171.
  32. Soirefmann M, Comparin C, Boza J, Wen CL, Cestari TF. Impact of a cybertutor in dermatological teaching. Int J Dermatol 2013; 52: 722–727.
  33. Veredas FJ, Ruiz-Bandera E, Villa-Estrada F, Rufino-Gonzalez JF, Morente L. A web-based e-learning application for wound diagnosis and treatment. Comput Methods Programs Biomed 2014; 116: 236–248.
  34. Viguier M, Rist S, Aubin F, Leccia MT, Richard MA, Esposito-Farese M, et al. Online training on skin cancer diagnosis in rheumatologists: results from a nationwide randomized web-based survey. PLoS One 2015; 10: e0127564.
  35. Wahlgren CF, Edelbring S, Fors U, Hindbeck H, Stahle M. Evaluation of an interactive case simulation system in dermatology and venereology for medical students. BMC Med Educ 2006; 6: 40.
  36. Goldie JG. Connectivism: a knowledge learning theory for the digital age? Med Teach 2016; 38: 1064–1069.
  37. Lau KH. Computer-based teaching module design: principles derived from learning theories. Med Educ 2014; 48: 247–254.
  38. Stienen MN, Schaller K, Cock H, Lisnic V, Regli L, Thomson S. eLearning resources to supplement postgraduate neurosurgery training. Acta Neurochir 2017; 159: 325–337.
  39. Vaona A, Banzi R, Kwag KH, Rigon G, Cereda D, Pecoraro V, et al. E-learning for health professionals. Cochrane Database Syst Rev 2018; 1: CD011736.
  40. Lahti M, Hatonen H, Valimaki M. Impact of e-learning on nurses’ and student nurses’ knowledge, skills, and satisfaction: a systematic review and meta-analysis. Int J Nurs Stud 2014; 51: 136–149.
  41. Handal B, Groenlund C, Gerzina T. Academic perceptions amongst educators towards eLearning tools in dental education. Int Dent J 2011; 61: 70–75.
  42. Reeves S, Fletcher S, McLoughlin C, Yim A, Patel KD. Interprofessional online learning for primary healthcare: findings from a scoping review. BMJ Open 2017; 7: e016872.
  43. Booth A, Carroll C, Papaioannou D, Sutton A, Wong R. Applying findings from a systematic review of workplace-based e-learning: implications for health information professionals. Health Info Libr J 2009; 26: 4–21.
Supplementary content
Table SI
Table SII