Abstract
This chapter presents an overview of the theoretical and empirical evidence on the effectiveness of simulation-based learning in higher education in the domains of medical and teacher education. First and foremost, it presents a theoretical framework for fostering diagnostic competences in simulation-based environments. This framework was used to develop the simulations described in this book and to contribute to generating further empirical evidence on the effective design of simulation-based learning environments in the context of diagnosis. Moreover, the chapter presents insights from a meta-analytic study supporting the importance of learners’ individual prerequisites as well as the instructional and contextual factors described in the model.
Keywords
- Professional knowledge
- Instructional support
- Diagnostic activities
- Context of simulation
- Diagnostic competences
2.1 Theoretical Overview
2.1.1 Instructional Support in Facilitating Competences
The conceptual framework used in this book is based on theoretical and empirical findings on skill development and theories of expertise development (Anderson, 1983; Jonassen, 1997; Renkl & Atkinson, 2003; VanLehn, 1996), which suggest that learners need sufficient prior knowledge and complex practice opportunities to improve their professional competences. Existing research on complex learning environments supports the claim that learning is more effective when instructional support is included (Lazonder & Harmsen, 2016). One way to avoid ineffective learning when exposing learners to complex, ill-structured problems, particularly at early stages of expertise development, is to accompany challenging tasks with scaffolding, particularly scaffolding that emphasizes metacognition and reflection as the main mechanisms of learning through experience. Therefore, we also include an overview of scaffolding types and measures as part of our theoretical framework.
2.1.2 Simulations in Medical and Teacher Education
A simulation is a model or representation of reality (an object, system, or situation) with certain parameters that can be controlled or manipulated. The aim of a simulation is to arrive at a better understanding of the interconnections between the variables in the system or to put different strategies to the test (Frasson & Blanchard, 2012; Shannon, 1975; Wissenschaftsrat, 2014). Thus, a central goal of simulations for teaching diagnostic competences is to provide training opportunities in which learners can take diagnostic actions on cases with a certain similarity to professional practice (Seidel et al., 2015; Shavelson, 2013). Both digital simulations and face-to-face role-plays have been used as simulation-based learning environments. Numerous primary studies support the effectiveness of simulations in medical and teacher education (e.g., Koparan & Yılmaz, 2015; Liaw et al., 2010; Matsuda et al., 2013). Meta-analytic studies in medical education (e.g., Cook et al., 2012, 2013) provide evidence supporting the generalizability of the high effects of simulations. However, the open question is which features and parameters make simulations most effective in different contexts for learners with certain personal characteristics, such as learning prerequisites, different levels of prior professional knowledge, and levels of expertise.
2.2 Model Description
The conceptual model (Fig. 2.1) consists of five essential blocks of elements:
The “Test performance” block: diagnostic competences are considered the target learning outcome and can be measured by assessing the accuracy and efficiency of the diagnosis, the application of professional knowledge, and the performance of appropriate diagnostic activities.
The “Processes in simulation-based learning environments” block: activities in simulation-based learning environments are hypothesized to directly affect the learning outcomes. This block also includes the diagnostic activities performed to acquire the target knowledge and competences and an intermediate assessment of diagnostic accuracy and efficiency during the learning phase.
The “Individual learning prerequisites” block: the following factors are hypothesized to have (1) a direct effect on the development of diagnostic competences as learning outcomes and (2) an indirect effect, via the processes block, by shaping how learning strategies and instructional support are utilized. This block includes the existing professional knowledge base (learners’ conceptual and strategic knowledge), executive functions and working memory capacity, motivational variables, and interest.
The “Instructional support” block: instructional support includes different types of scaffolding and ways of presenting information to learners. It is hypothesized to influence the improvement of diagnostic competences by supporting learning processes and activities. The availability of appropriate instructional support that matches the learning goals and learners’ individual prerequisites determines the effectiveness of simulation-based learning environments.
The “Context of simulation” block encompasses the construction of learning environments and competence assessments and is hypothesized to affect learning processes, the types of instructional support that can be utilized, and outcomes. This block includes the domain and the nature of the diagnostic situation (the information base and the need to collaborate during the diagnosis).
In the following paragraphs, we will describe the specific variables included in the five blocks of the conceptual model in more detail.
2.2.1 Professional Knowledge Base
The definition and differentiation of the knowledge types constituting the professional knowledge base in the model are adopted from previous research in teacher and medical education (Förtsch et al., 2018). Professional knowledge consists of content knowledge and strategic knowledge. Content knowledge, as defined by Shulman (1987), or conceptual knowledge (Stark et al., 2011), refers to knowledge of subject matter, key terms, and their interrelations. Strategic knowledge, in turn, relates to the application of conceptual knowledge to solve a problem. The distinction between strategic and conceptual knowledge has been validated in empirical studies in medical education and beyond (e.g., Förtsch et al., 2018).
2.2.2 Individual Learners’ Characteristics
Apart from the prior professional knowledge base, a range of other learner-related factors can potentially influence learning processes and outcomes: executive functions, working memory capacity, motivational variables, and interest. The conceptual model refers to individual learner characteristics in order to capture aptitude-treatment interactions (Snow, 1991), the expertise reversal effect (Kalyuga, 2007), and other motivational and affective predictors of learning outcomes with moderate to high effects (see Lazowski & Hulleman, 2016 for an overview). In line with research findings on the role of working memory (e.g., Koopmann-Holm & O’Connor, 2017; Sweller, 2005) and executive functions (Miyake & Friedman, 2012; Schwaighofer et al., 2015), we hypothesize that these factors might moderate both learning processes and outcomes.
2.2.3 Diagnostic Activities
Diagnostic processes require the collection, integration, and generation of case-specific information to reduce uncertainty and make medical or educational decisions. Therefore, we hypothesize that these processes require the same activities that are used across domains to collect and generate knowledge. The taxonomy of eight activities relevant to diagnostic processes was adopted from research on scientific reasoning and argumentation (Fischer et al., 2014). These activities include problem identification, questioning, hypothesis generation, construction and redesign of artifacts, evidence generation, evidence evaluation, drawing conclusions, and communicating the results. Diagnosing may require all or only some of these activities; their order may vary, with some activities repeated and others skipped depending on the particular situation at hand.
2.2.4 Diagnostic Quality: Accuracy and Efficiency
Diagnostic quality comprises two measures: diagnostic accuracy and diagnostic efficiency. Accuracy is a measure of the correspondence between the true state of the person being diagnosed and the diagnosis. In medical education, this refers to correctly identifying the disease; in teacher education, it relates to assessing a student’s knowledge or competence or identifying misconceptions. The second measure, diagnostic efficiency, refers to the time, effort, and costs required to reach an accurate diagnosis and contributes to the quality of the diagnosis alongside diagnostic accuracy.
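To make these two measures concrete, the short sketch below scores a small set of invented cases. It is an illustrative operationalization, not one taken from this chapter: the function names, the example diagnoses, and the decision to express efficiency as accuracy per minute are assumptions made for the example.

```python
# Illustrative sketch (assumed operationalization, not the chapter's instrument):
# accuracy = agreement between diagnoses and the true states of the cases;
# efficiency = accuracy achieved per minute of diagnostic work.

def diagnostic_accuracy(diagnoses, true_states):
    """Proportion of cases in which the diagnosis matches the true state."""
    if len(diagnoses) != len(true_states):
        raise ValueError("each diagnosis needs a corresponding true state")
    hits = sum(d == t for d, t in zip(diagnoses, true_states))
    return hits / len(diagnoses)

def diagnostic_efficiency(accuracy, minutes_spent):
    """One simple way to fold time into quality: accuracy per minute."""
    return accuracy / minutes_spent

# Hypothetical data: two of three cases diagnosed correctly in 10 minutes.
acc = diagnostic_accuracy(["flu", "asthma", "flu"], ["flu", "asthma", "measles"])
eff = diagnostic_efficiency(acc, minutes_spent=10.0)
```

In a real study, accuracy would typically be scored against an expert solution rather than a known ground truth, and effort and costs could enter the efficiency measure alongside time.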
2.2.5 Simulations as Instructional Method
To develop professional competences, learners need sufficient prior knowledge at their disposal and a large amount of practice (e.g., VanLehn, 1996). Simulations allow learners to practice on authentic cases, without compromising patients’ or students’ safety or well-being, and to address rare and complex situations. Simulations also provide sufficient time and opportunity for practice, for understanding underlying principles and concepts, and for developing reasoning and reflection skills (Frasson & Blanchard, 2012).
2.2.6 Explicit Presentation of Information
Presenting information explicitly may play an important role in designing learning environments that facilitate the development of competences. Domain concepts and strategies, the framework of the task, and its requirements need to be communicated to guide students’ attention to the most relevant information and reduce confusion (Kirschner et al., 2006; Sweller, 2005). However, there is no systematic research on how much explicit information needs to be communicated in different domains and learning environments. Moreover, research on the role of and interaction between the explicit presentation of information and other instructional methods is scarce. How the explicit presentation of information can be included in simulations is further described in Chaps. 6 and 7.
2.2.7 Scaffolding
The most prominent definition of scaffolding (Wood et al., 1976) describes it as the process of supporting learners by taking over those elements of the task that are initially beyond their capacity. According to recent literature reviews (Belland, 2014; Reiser & Tabak, 2014), scaffolding is effective in supporting the development of complex cognitive skills. It can facilitate cognitive, metacognitive, motivational, and strategic learning processes and outcomes (Hannafin et al., 1999). Promising forms of support in simulation-based learning that have shown positive effects include providing examples, prompts, role-taking, and introducing reflection phases.
Prompts refer to information or guidance offered to learners during the learning process in order to improve its effectiveness (Berthold et al., 2007). Empirical evidence supports self-explanation prompts (Heitzmann et al., 2015, 2019), metacognitive prompts (Quintana et al., 2004), and collaboration scripts (Fischer et al., 2013; Vogel et al., 2017) as supports for learning. How prompts can be used successfully in simulations is described in Chaps. 5, 6, and 8.
Role-taking can be considered a type of scaffolding when it reduces the full complexity of a situation by assigning learners a specific role with limited tasks or a limited perspective on the full task. A large body of empirical research suggests that complex skills can be acquired effectively in the agent role (i.e., teacher or doctor) (e.g., Cook, 2014). Scaffolding for role-taking is implemented in the simulations described in Chaps. 4, 5, 9, and 10.
The positive effects of reflection on learning were first proposed by Dewey (1933). Reflection can be induced through guided reflection phases and can take place before, during, or after an event. Different types of reflection (e.g., reflecting on reasoning or reflecting on the problem at hand) have been reported to efficiently foster the acquisition of diagnostic competences in medicine (Sandars, 2009) and teacher education (Beauchamp, 2015). Reflection phases were included in the simulations described in Chap. 9.
2.2.8 The Nature of the Diagnostic Situation
The nature of the diagnostic situation is defined by the set of specific features present in the situation in which the diagnosis takes place (Heitzmann et al., 2019). Heitzmann et al. suggest differentiating these features along two dimensions: (1) the source of information for the diagnosis and (2) the necessity to collaborate with other professionals to reach the diagnosis. With regard to the first dimension, a distinction can be made between interaction-based and document-based diagnoses. In an interaction-based diagnosis, the information is gathered through interaction with another person (e.g., a patient, a student, or their family members; see the simulations described in Chaps. 4, 5, 6, and 9); conversely, a document-based diagnosis relies on information obtained in written or recorded form (see the simulations described in Chaps. 5, 7, and 9). This distinction is highly relevant for practice, as different information sources might require different processing times as well as different types and amounts of scaffolding. The second dimension ranges from individual diagnostic actions to the necessity to collaborate and communicate with other professionals during the diagnostic process. The processes involved in such collaboration, and the factors relevant for diagnostic efficiency and accuracy during it, have not been thoroughly researched in either medical or teacher education. Simulations involving a collaborative context are described in Chaps. 7 and 10.
2.2.9 Domain
We focused on medical and teacher education as two domains that require accurate diagnoses before further professional action can be taken. Simulations in medical education are described in Chaps. 9 and 10. Simulations in teacher education are described in Chaps. 3, 4, 5, 6, 7, and 8. There are some similarities in diagnostic processes and thus also in diagnostic competences between these two domains. Therefore, we assume that interdisciplinary research and applications of simulation-based learning can provide insights for both fields.
The diagnostic process in medicine aims to determine the cause of a disease and the appropriate course of action for either further diagnosis or treatment (Charlin et al., 2000). The diagnostic process in teacher education aims to identify the gap between the present and the desired state of learners’ competences and optimize the use of instructional methods to close this gap (Helmke et al., 2012). While the two fields differ, it is also obvious that these diagnostic processes share a key commonality, namely that diagnosing a patient’s health status or a learner’s understanding is a goal-oriented process of collecting and integrating case-specific information to reduce uncertainty in order to make medical or educational decisions (Heitzmann et al., 2019).
2.3 Evidence from a Meta-Analysis
Recently, we conducted a meta-analysis of 35 empirical studies, building on the conceptual framework developed above, to investigate the role of instruction, scaffolding, and contextual factors in facilitating the development of diagnostic competences in learners with different levels (low and high) of professional knowledge. As little empirical research was found on the effects of simulation-based learning on the development of diagnostic competences, a broader search was conducted, and studies of different types of instructional support were included in the analysis. We specifically focused on investigating the role of problem-solving as one of several problem-centered instructional approaches (Belland et al., 2017, p. 311).
The main aim of the meta-analysis was to estimate the overall effect of instructional support on the development of diagnostic competences in the domains of medical and teacher education and, more specifically, provide the missing evidence and synthesized results on the effects of different scaffolding types. We also included learning with examples as a scaffolding type (in addition to prompts, role-taking and reflection phases). Examples allow learners to retrace the steps of a solution (worked example) or observe a model displaying the problem-solving process (modeling example) before they solve problems independently (Renkl, 2014). Instructional support had a moderate positive effect on diagnostic competences, which is in line with previous research findings on fostering complex cognitive skills (Belland et al., 2017; Dochy et al., 2003). Problem-based learning as an instructional support facilitated the improvement of diagnostic competences in all learners, independently of their prior professional knowledge base. However, it is important to note that all interventions that applied a problem-based learning approach also implemented at least one other type of scaffolding or additional instruction.
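The pooling step behind such an overall effect estimate can be sketched with the standard DerSimonian-Laird random-effects procedure, which weights each study by the inverse of its within-study variance plus an estimated between-study variance. This is a generic textbook method shown for illustration only; it is not the authors' actual analysis, and the effect sizes and variances below are invented.

```python
# Generic DerSimonian-Laird random-effects pooling (illustrative; the study
# effects and variances are invented, not taken from the chapter's meta-analysis).
import math

def dersimonian_laird(effects, variances):
    """Pool study effect sizes under a random-effects model."""
    w = [1.0 / v for v in variances]                              # fixed-effect weights
    fixed = sum(wi * e for wi, e in zip(w, effects)) / sum(w)     # fixed-effect mean
    q = sum(wi * (e - fixed) ** 2 for wi, e in zip(w, effects))   # heterogeneity Q
    df = len(effects) - 1
    c = sum(w) - sum(wi ** 2 for wi in w) / sum(w)
    tau2 = max(0.0, (q - df) / c)                                 # between-study variance
    w_star = [1.0 / (v + tau2) for v in variances]                # random-effects weights
    pooled = sum(wi * e for wi, e in zip(w_star, effects)) / sum(w_star)
    se = math.sqrt(1.0 / sum(w_star))                             # standard error of pooled effect
    return pooled, se, tau2

# Three hypothetical studies with effect sizes g and sampling variances.
pooled, se, tau2 = dersimonian_laird([0.2, 0.8, 0.5], [0.02, 0.02, 0.02])
```

A nonzero `tau2` signals heterogeneity beyond sampling error, which is what motivates the moderator analyses (prior knowledge, scaffolding type, context) described in this section.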
One of the research questions in the meta-analysis specifically addressed the interaction between individual learners’ prerequisites (i.e., prior knowledge base) and the effectiveness of a problem-solving approach and scaffolding procedures. The hypothesis behind this research question was that scaffolding measures vary in the degree of self-regulation required from learners. Thus, we assumed that providing example solutions and modeling desired behavior are more strongly guided forms of instruction requiring less self-regulation, as the learners do not face a problem to solve, but rather a solution. In contrast, reflection phases were considered to require high levels of self-regulation. Diagnostic competences were found to be facilitated effectively through problem-solving independent of learners’ knowledge base. Although all types of scaffolding had positive effects on learning, scaffolding types providing high levels of guidance were more effective for less advanced learners, whereas scaffolding types relying on high levels of self-regulation were more effective for more advanced learners.
Moreover, the context was a significant moderator of improved diagnostic competences, with better learning associated with an interactive diagnostic situation. The domains of medical and teacher education were comparable in the effects of instructional support and scaffolding, but differed in terms of the prior professional knowledge base and therefore presumably in the design of effective learning environments to foster diagnostic competences.
2.4 Conclusions
This chapter addressed existing theoretical and empirical research on developing competences in higher education. It aimed to describe state-of-the-art research and to develop a theoretical framework for using problem-solving (with and without simulations) to facilitate the development of diagnostic competences in medical and teacher education. Existing research suggests that instructional support that uses problem-solving to facilitate the development of complex cognitive skills and competences, and in particular diagnostic competences, has a moderate positive effect on learning outcomes (Chernikova et al., 2019). Meta-analytic studies, in turn, provide evidence of positive effects of simulations, as one example of a problem-solving approach, on learning in multiple domains.
The existing research suffers from vast heterogeneity with respect to how researchers define diagnosing and diagnostic competences, which individual learner prerequisites and processes they assume to be relevant for diagnosing and learning to diagnose, which instructional approaches should be used, and how the context (i.e., the nature of the diagnostic situation) can influence the effectiveness of learning. Nevertheless, simulations are a promising means to measure and facilitate diagnostic competences.
Notably, both the literature review and the meta-analysis identified a range of empirical studies that used different simulations to facilitate skills related to diagnostic competences; however, it also became clear that empirical studies rarely provide detailed descriptions of the learning environments and simulations involved or of the measures used to assess improved competences. This makes it difficult to draw conclusions about the effects of specific learning activities and processes.
Moreover, hardly any study reused existing simulation-based learning environments; instead, researchers preferred to design new ones from scratch and match them to the particular needs of their study. Such an approach contributes to high levels of heterogeneity that are difficult to explain and makes the applied methods difficult to summarize. This in turn leads to a lack of standardized instruments and measures that can be used systematically and adjusted if needed. However, such efforts are necessary to create foundations for high-quality, interdisciplinary, replicable empirical research and for better-designed learning environments that effectively facilitate the acquisition of diagnostic competences.
References
Anderson, J. R. (1983). The architecture of cognition. Lawrence Erlbaum Associates.
Beauchamp, C. (2015). Reflection in teacher education: Issues emerging from a review of current literature. Reflective Practice: International and Multidisciplinary Perspectives, 16(1), 123–141. https://doi.org/10.1080/14623943.2014.982525
Belland, B. R. (2014). Scaffolding: Definition, current debates, and future directions. In J. M. Spector, M. D. Merrill, J. Elen, & M. J. Bishop (Eds.), Handbook of research on educational communications and technology (4th ed., pp. 505–518). Springer. https://doi.org/10.1007/978-1-4614-3185-5_39
Belland, B. R., Walker, A. E., Kim, N. J., & Lefler, M. (2017). Synthesizing results from empirical research on computer-based scaffolding in STEM education: A meta-analysis. Review of Educational Research, 87(2), 309–344. https://doi.org/10.3102/0034654316670999
Berthold, K., Nückles, M., & Renkl, A. (2007). Do learning protocols support learning strategies and outcomes? The role of cognitive and metacognitive prompts. Learning and Instruction, 17(5), 564–577. https://doi.org/10.1016/j.learninstruc.2007.09.007
Charlin, B., Tardif, J., & Boshuizen, H. P. A. (2000). Script and medical diagnostic knowledge: Theory and applications for clinical reasoning instruction and research. Academic Medicine: Journal of the Association of American Medical Colleges, 75(2), 182–190. PMID: 10693854
Chernikova, O., Heitzmann, N., Fink, M. C., Timothy, V., Seidel, T., & Fischer, F. (2019). Facilitating diagnostic competences in higher education - a meta-analysis in medical and teacher education. Educational Psychology Review, 1–40. https://doi.org/10.1007/s10648-019-09492-2
Cook, D. A. (2014). How much evidence does it take? A cumulative meta-analysis of outcomes of simulation-based education. Medical Education, 48(8), 750–760. https://doi.org/10.1111/medu.12473
Cook, D. A., Brydges, R., Hamstra, S. J., Zendejas, B., Szostek, J. H., Wang, A. T., … Hatala, R. (2012). Comparative effectiveness of technology-enhanced simulation versus other instructional methods: A systematic review and meta-analysis. Simulation in Healthcare, 7(5), 308–320. https://doi.org/10.1097/SIH.0b013e3182614f95
Cook, D. A., Hamstra, S. J., Brydges, R., Zendejas, B., Szostek, J. H., Wang, A. T., Erwin, P. J., & Hatala, R. (2013). Comparative effectiveness of instructional design features in simulation-based education: Systematic review and meta-analysis. Medical Teacher, 35(1), 867–898. https://doi.org/10.3109/0142159X.2012.714886
Dewey, J. (1933). How we think: A restatement of the relation of reflective thinking to the educative process (Rev. ed.). D. C. Heath.
Dochy, F., Segers, M., van den Bossche, P., & Gijbels, D. (2003). Effects of problem-based learning: A meta-analysis. Learning and Instruction, 13(5), 533–568. https://doi.org/10.1016/S0959-4752(02)00025-7
Fischer, F., Kollar, I., Stegmann, K., & Wecker, C. (2013). Toward a script theory of guidance in computer-supported collaborative learning. Educational Psychologist, 48(1), 56–66. https://doi.org/10.1080/00461520.2012.748005
Fischer, F., Kollar, I., Ufer, S., Sodian, B., Hussmann, H., Pekrun, R., et al. (2014). Scientific reasoning and argumentation: Advancing an interdisciplinary research agenda in education. Frontline Learning Research, 2(2), 28–45. https://doi.org/10.14786/flr.v2i2.96
Förtsch, C., Sommerhoff, D., Fischer, F., Fischer, M. R., Girwidz, R., Obersteiner, A., Reiss, K., Stürmer, K., Siebeck, M., Schmidmaier, R., Seidel, T., Ufer, S., Wecker, C., & Neuhaus, B. J. (2018). Systematizing professional knowledge of medical doctors and teachers: Development of an interdisciplinary framework in the context of diagnostic competences. Education Sciences, 8(4), 207. https://doi.org/10.3390/educsci8040207
Frasson, C., & Blanchard, E. (2012). Simulation-based learning. In I. N. Seel (Ed.), Encyclopedia of the sciences of learning (pp. 3076–3080). Springer.
Hannafin, M., Land, S., & Oliver, K. (1999). Open learning environments: Foundations, methods, and models. In C. M. Reigeluth (Ed.), Instructional design theories and models: A new paradigm of instructional theory (pp. 115–140). Lawrence Erlbaum Associates, Inc.
Heitzmann, N., Fischer, F., Kühne-Eversmann, L., & Fischer, M. R. (2015). Enhancing diagnostic competence with self-explanation prompts and adaptable feedback. Medical Education, 49(10), 993–1003. https://doi.org/10.1111/medu.12778
Heitzmann, N., Seidel, T., Opitz, A., Hetmanek, A., Wecker, C., Fischer, M., Ufer, S., Schmidmaier, R., Neuhaus, B., Siebeck, M., Stürmer, K., Obersteiner, A., Reiss, K., Girwidz, R., & Fischer, F. (2019). Facilitating diagnostic competences in simulations: A conceptual framework and a research agenda for medical and teacher education. Frontline Learning Research, 7(4), 1–24. https://doi.org/10.14786/flr.v7i4.384
Helmke, A., Schrader, F.-W., & Helmke, T. (2012). EMU: Evidenzbasierte Methoden der Unterrichtsdiagnostik und -entwicklung. Unterrichtsdiagnostik – Ein Weg, um Unterrichten sichtbar zu machen. Schulverwaltung Bayern, 35(6), 180–183.
Jonassen, D. H. (1997). Instructional design models for well-structured and ill-structured problem solving learning outcomes. Educational Technology Research & Development, 45(1), 45–94. https://doi.org/10.1007/BF02299613
Kalyuga, S. (2007). Expertise reversal effect and its implications for learner-tailored instruction. Educational Psychology Review, 19(4), 509–539.
Kirschner, P. A., Sweller, J., & Clark, R. E. (2006). Why minimal guidance during instruction does not work: An analysis of the failure of constructivist, discovery, problem-based, experiential, and inquiry-based teaching. Educational Psychologist, 41(2), 75–86. https://doi.org/10.1207/s15326985ep4102_1
Koopmann-Holm, B., & O’Connor, A. (2017). Working memory. CRC Press.
Koparan, T., & Yılmaz, G. (2015). The effect of simulation-based learning on prospective teachers’ inference skills in teaching probability. Universal Journal of Educational Research, 3(11), 775–786. https://doi.org/10.13189/ujer.2015.031101
Lazonder, A. W., & Harmsen, R. (2016). Meta-analysis of inquiry-based learning: Effects of guidance. Review of Educational Research, 86(3), 681–718. https://doi.org/10.3102/0034654315627366
Lazowski, R. A., & Hulleman, C. S. (2016). Motivation interventions in education: A meta-analytic review. Review of Educational Research, 86(2), 602–640. https://doi.org/10.3102/0034654315617832
Liaw, S. Y., Chen, F. G., Klainin, P., Brammer, J., O’Brien, A., & Samarasekera, D. D. (2010). Developing clinical competency in crisis event management: An integrated simulation problem-based learning activity. Advances in Health Sciences Education: Theory and Practice, 15(3), 403–413. https://doi.org/10.1007/s10459-009-9208-9
Matsuda, N., Yarzebinski, E., Keiser, V., Raizada, R., Stylianides, G. J., & Koedinger, K. R. (2013). Studying the effect of a competitive game show in a learning by teaching environment. International Journal of Artificial Intelligence in Education, 23(1–4), 1–21. https://doi.org/10.1007/s40593-013-0009-1
Miyake, A., & Friedman, N. P. (2012). The nature and organization of individual differences in executive functions: Four general conclusions. Current Directions in Psychological Science, 21(1), 8–14. https://doi.org/10.1177/0963721411429458
Quintana, C., Reiser, B. J., Davis, E. A., Krajcik, J., Fretz, E., Duncan, R. G., Kyza, E., Edelson, D., & Soloway, E. (2004). A scaffolding design framework for software to support science inquiry. Journal of the Learning Sciences, 13(3), 337–386. https://doi.org/10.1207/s15327809jls1303_4
Reiser, B. J., & Tabak, I. (2014). Scaffolding. In R. K. Sawyer (Ed.), Cambridge handbooks in psychology. The Cambridge handbook of the learning sciences (pp. 44–62). Cambridge University Press. https://doi.org/10.1017/CBO9781139519526.005
Renkl, A. (2014). Toward an instructionally oriented theory of example-based learning. Cognitive Science, 38(1), 1–37. https://doi.org/10.1111/cogs.12086
Renkl, A., & Atkinson, R. K. (2003). Structuring the transition from example study to problem solving in cognitive skill acquisition: A cognitive load perspective. Educational Psychologist, 38(1), 15–22. https://doi.org/10.1207/S15326985EP3801_3
Sandars, J. (2009). The use of reflection in medical education: AMEE guide no. 44. Medical Teacher, 31(8), 685–695. https://doi.org/10.1080/01421590903050374
Schwaighofer, M., Fischer, F., & Bühner, M. (2015). Does working memory training transfer? A meta-analysis including training conditions as moderators. Educational Psychologist, 50(2), 138–166. https://doi.org/10.1080/00461520.2015.1036274
Seidel, T., Stürmer, K., Schäfer, S., & Jahn, G. (2015). How preservice teachers perform in teaching events regarding generic teaching and learning components. Zeitschrift Für Entwicklungspsychologie Und Pädagogische Psychologie, 47(2), 84–96. https://doi.org/10.1026/0049-8637/a000125
Shannon, R. E. (1975). Systems simulation: The art and science. Prentice-Hall.
Shavelson, R. J. (2013). On an approach to testing and modeling competence. Educational Psychologist, 48(2), 73–86. https://doi.org/10.1080/00461520.2013.779483
Shulman, L. S. (1987). Knowledge and teaching: Foundations of the new reform. Harvard Educational Review, 57(1), 1–23.
Snow, R. E. (1991). Aptitude-treatment interaction as a framework for research on individual differences in psychotherapy. Journal of Consulting and Clinical Psychology, 59(2), 205–210.
Stark, R., Kopp, V., & Fischer, M. R. (2011). Case-based learning with worked examples in complex domains: Two experimental studies in undergraduate medical education. Learning and Instruction, 21(1), 22–33. https://doi.org/10.1016/j.learninstruc.2009.10.001
Sweller, J. (2005). Implications of cognitive load theory for multimedia learning. In R. E. Mayer (Ed.), The Cambridge handbook of multimedia learning (pp. 19–30). Cambridge University Press.
VanLehn, K. (1996). Cognitive skill acquisition. Annual Review of Psychology, 47(1), 513–539. https://doi.org/10.1146/annurev.psych.47.1.513
Vogel, F., Wecker, C., Kollar, I., & Fischer, F. (2017). Socio-cognitive scaffolding with computer-supported collaboration scripts: A meta-analysis. Educational Psychology Review, 29(3), 477–511. https://doi.org/10.1007/s10648-016-9361-7
Wissenschaftsrat. (2014). Bedeutung und Weiterentwicklung von Simulation in der Wissenschaft [Significance and further development of simulation in science].
Wood, D., Bruner, J. S., & Ross, G. (1976). The role of tutoring in problem solving. Journal of Child Psychology and Psychiatry, 17(2), 89–100. https://doi.org/10.1111/j.1469-7610.1976.tb00381.x
Acknowledgments
The research presented in this chapter was funded by a grant from the Deutsche Forschungsgemeinschaft (DFG-FOR 2385) to Frank Fischer and Tina Seidel (FI 792/12-1).
Rights and permissions
Open Access This chapter is licensed under the terms of the Creative Commons Attribution 4.0 International License (http://creativecommons.org/licenses/by/4.0/), which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons license and indicate if changes were made.
The images or other third party material in this chapter are included in the chapter's Creative Commons license, unless indicated otherwise in a credit line to the material. If material is not included in the chapter's Creative Commons license and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder.
Copyright information
© 2022 The Author(s)
Cite this chapter
Chernikova, O., Heitzmann, N., Opitz, A., Seidel, T., Fischer, F. (2022). A Theoretical Framework for Fostering Diagnostic Competences with Simulations in Higher Education. In: Fischer, F., Opitz, A. (eds) Learning to Diagnose with Simulations . Springer, Cham. https://doi.org/10.1007/978-3-030-89147-3_2
Print ISBN: 978-3-030-89146-6
Online ISBN: 978-3-030-89147-3