Abstract
Since the ability to teach, and therefore also to diagnose, not only subject-specific but also cross-domain skills is an important part of every teacher’s day-to-day work, we developed simulations to quantify and support the competence to diagnose secondary school students’ scientific reasoning skills. In addition, the simulations allow physics and biology pre-service teachers to rehearse interdisciplinary collaboration. The simulations are video-based, containing short, scripted videos that show two students working on different inquiry tasks, including a physics and a biology experiment. Participants observe the students and individually decide which pre-formulated questions they want to ask the students before, during, and after the experiments in order to gather relevant information. The corresponding simulated answers are then presented in additional videos. The information gained during the simulations is later used as the basis for a diagnosis of the students’ scientific reasoning skills.
Keywords
- Pre-service teacher education
- Diagnostic competences
- Video-based simulations
- Scientific reasoning
- Interdisciplinary collaboration
This chapter’s simulation at a glance
| Domain | Teacher education |
|---|---|
| Topic | Scientific reasoning in physics and biology |
| Learner’s task | To adopt the role of a physics or biology teacher and diagnose—individually or in interdisciplinary collaboration—a student’s scientific reasoning skills |
| Target group | Pre-service teachers |
| Diagnostic mode | Individual and collaborative diagnosis |
| Sources of information | (Interactive) videos of pairs of students who perform inquiry activities in physics and biology |
| Special features | Standardized and parallelized simulations for two different school subjects (physics and biology); possibility to directly interact with the students by “asking” them questions concerning their inquiry activities |
7.1 Scientific Reasoning as a Cross-Domain Skill
Many educational objectives in schools refer to subject-specific knowledge and skills, but others refer to cross-curricular or cross-domain skills such as learning strategies, media literacy, or scientific reasoning skills. These skills have in common that they typically cannot be developed without being applied to particular subject-specific content—a so-called exemplifying domain (Renkl et al., 2009). For example, a learning strategy such as organizing information by constructing a concept map can only be demonstrated and practiced in the context of a particular topic, such as stem cell research (Hilbert et al., 2008). Fostering scientific reasoning skills requires inquiry tasks concerning phenomena such as factors influencing the image of an object projected through a lens or the growth of plants. Typically, exemplifying domains for the development of cross-domain skills are taken from the body of knowledge contained within school subjects.
Cross-domain skills also have in common that they can be applied to topics from more than one school subject. Learning strategies, media literacy, or—to some degree—scientific reasoning skills can be applied to content from the humanities, the social sciences, or the natural sciences. Therefore, promoting such cross-domain skills can be regarded as a joint task of more than one teacher and more than one school subject (Wecker et al., 2016). Against this backdrop, it may be advisable for teachers of subjects that can serve as exemplifying domains for such cross-domain skills to collaborate in this joint task and share information about individual students’ learning progress.
In our own research, we focus on scientific reasoning as a cross-domain skill. Scientific reasoning can be seen as a rather complex set of cognitive activities (Schunn & Anderson, 1999) and is therefore best explained by looking at its subskills. While there are frameworks that differentiate many subskills (Fischer et al., 2014), most researchers distinguish among three dimensions of scientific reasoning skills: (1) formulating hypotheses, (2) designing and conducting experiments, and (3) drawing conclusions from experiments (e.g., de Jong & van Joolingen, 1998; Klahr & Dunbar, 1988). The formulation of hypotheses may be strongly influenced by a person’s domain knowledge in a certain field and can be assessed by looking at the specificity of a stated hypothesis (Lazonder et al., 2008). After a hypothesis has been formulated, experiments have to be designed and conducted to test it. At this point, the so-called control of variables strategy, i.e., varying one independent variable from the hypothesis while holding all other variables constant, plays a crucial role in obtaining unequivocal results (Chen & Klahr, 1999; Tschirgi, 1980; Schwichow et al., 2016). Observations from well-designed experiments can then be evaluated and used to draw conclusions about the tested hypothesis. Like the initial hypothesis, these conclusions may vary in terms of their specificity. Furthermore, drawing correct inferences about factors that do or do not influence the dependent variable from informative and well-designed comparisons is an important aspect at this point (see Kuhn et al., 1992).
Although some views question the existence of cross-domain skills in general or doubt that scientific reasoning in particular is a cross-domain skill (e.g., Tricot & Sweller, 2014; Osborne, 2018), there is research suggesting that there are in fact scientific reasoning skills that can be applied across content areas, at least in related subjects or different scientific subdisciplines (e.g., Kuhn et al., 1992; Schunn & Anderson, 1999). One reason for this ongoing debate about the existence of domain-general or—as we would prefer to call them—cross-domain skills might be different conceptions of the terms “domain” and “domain-general” (Hetmanek et al., 2018). In light of the strong research tradition on scientific reasoning, however, we consider scientific reasoning skills as both real and applicable to content from different subjects.
Research from developmental psychology shows that early in the development of a specific subskill of scientific reasoning, it is often applied in one narrow context and no others. Only with time and practice do learners begin to apply the new subskill to a broader range of topics (Kuhn et al., 1992; Zimmerman, 2007) within and across subjects. Hence, the breadth of topics to which a subskill of scientific reasoning can be applied constitutes a quality dimension of the subskill itself. These considerations suggest that practicing scientific reasoning skills in the context of different science subjects such as physics and biology may contribute to the development of higher levels of scientific reasoning skills.
7.1.1 The Role of Teachers’ Diagnostic Competences for the Development of Learners’ Scientific Reasoning Skills
Teachers’ diagnostic competences are an important prerequisite for their adaptive and effective support for their students (Schrader, 2009). Therefore, teachers need to be able to diagnose their students’ current skill levels to be able to support them appropriately. The definition by Fischer et al. (2022) is adopted as a basis for the work presented in this chapter.
In order to diagnose correctly, teachers need the cognitive and context-specific performance dispositions to do so (Koeppen et al., 2008). Similar to other cognitive skills, it can be assumed that diagnostic competences are based on teachers’ professional knowledge (e.g., Baumert & Kunter, 2006; Förtsch et al., 2018). Therefore, teachers need different types of knowledge (knowing that, knowing how and knowing when and why) as well as content-related facets of knowledge in order to diagnose their students (see Förtsch et al., 2018). Against the background of research on the acquisition of cognitive skills (see VanLehn, 1996), developing diagnostic competences also requires opportunities to apply such knowledge to authentic cases and practice the application of diagnostic competences.
To arrive at a diagnosis, the diagnostician can employ a set of different types of (epistemic) diagnostic activities, including (1) problem identification, (2) questioning, (3) hypothesis generation, (4) construction and redesign of artifacts, (5) evidence generation, (6) evidence evaluation, (7) drawing conclusions, (8) communication and scrutinizing (see Chernikova et al., 2022; Heitzmann et al., 2019).
While research on diagnostic competences has mainly focused on the accuracy of teachers’ judgments of subject-specific knowledge and skills, research on diagnostic competences concerning cross-domain skills, such as scientific reasoning, is still scarce (Südkamp et al., 2012). Therefore, students’ scientific reasoning skills were selected as the focus of teachers’ diagnostic competences in our present work.
Giving students the chance to conduct scientific experiments in class creates opportunities to diagnose students’ scientific reasoning levels. Two suitable experiments involve optical lenses (physics) and the growth of plants (biology). The goal of the plant experiment is to find out which variables (the amount of water, a fertilizer stick, salt, and an undefined white powder) influence the growth of a plant (e.g., a bean plant). To do so, students have to convert their ideas about what influences the growth of a plant into a scientific hypothesis, for example, that the amount of water influences growth. To test this idea, the students must conduct an experiment. In this case, they would need to vary the quantity of water between two plants to see whether there is a difference in growth. Students also need to draw the right conclusions from the results of the experiment: based on the growth of the plants, they should be able to decide whether to confirm or reject their hypothesis. The optical lens experiment works quite similarly. Students need to find out which variables (lens curvature, lens size, the distance between the lens and the depicted object, and an undefined polarizing filter) influence the point at which an object depicted through the lens appears sharp on an imaging screen.
7.1.2 Collaborative Diagnosis of Scientific Reasoning Skills
In the context of daily school routines, diagnosing a student does not always have to be a one-person job. Since different teachers experience the same learners in different situations, exchanging information about these learners might help teachers support their students. Still, it is unclear whether interdisciplinary teacher collaboration leads to better results in diagnosing students’ scientific reasoning skills. It may be that the information a single teacher can gather in his or her own lessons is already comprehensive enough to arrive at a good diagnosis. However, it is also possible that information from several subjects is needed to serve as the basis for a satisfactory diagnosis. This might be especially true for the question of whether a student can apply scientific reasoning skills across school subjects (e.g., physics and biology) in a given domain (science). Therefore, situations in different thematic fields might be necessary to gain enough insight (see Kuhn et al., 1992; Zimmerman, 2007). In addition, collaborative diagnosis might have an advantage over the individual development of a diagnosis when the collaborating teachers have different, ideally complementary, areas of expertise. If this is the case, teachers could benefit from each other by working together (de Wit & Greer, 2008). This idea is not new and is already very common in other fields of expertise, for example, in medicine. The daily routine in hospitals offers many opportunities, and indeed necessities, for doctors from different fields to work together to improve their chances of arriving at better diagnoses. So-called tumor boards, in which experts from different fields come together to discuss particularly complex malignant diseases, are just one example of such interdisciplinary collaboration.
Even though it is also recommended that teachers collaborate when necessary and seek help with the management of difficult tasks (Helmke, 2010), this kind of exchange is not institutionalized in the same way. Collaboration is often restricted to a group of teachers of the same subject working together to create worksheets or tests. There is thus still considerable potential for interdisciplinary collaboration, especially when it comes to improving the process of diagnosing students. This approach seems especially promising for teachers of related subjects such as English and German or different scientific subjects. Research also shows that medical students who work in groups arrive at better diagnoses than students working on their own (Hautz et al., 2015). Based on these findings, the same may well be true for pre-service teachers. However, such collaborations can only be fruitful if the process of sharing information is implemented successfully (see Radkowitsch et al., 2022).
7.1.3 Simulations as a Learning Opportunity
Since there are not many opportunities in university-based teacher preparation programs for practicing the diagnosis of scientific reasoning skills in real classroom situations, there is a need for additional training opportunities. In this context, video-based simulations constitute a promising setting for both the training and the measurement of diagnostic competences. Overall, simulations are considered representations of reality segments that offer the possibility to control or manipulate certain parameters (see Chernikova et al., 2022). Simulations can, for example, include videos focusing on specific (classroom) situations and thereby control participants’ attention while still creating a realistic scenario. This makes video-based simulations especially interesting for tasks in which learning involves self-regulated exploration—so-called inquiry learning tasks (de Jong, 2006). Another advantage of simulations is that once they are designed and programmed they can be used repeatedly for practice as well as testing.
In contrast to the education of pre-service teachers, learning with simulations is very common in medical education (Peeraer et al., 2007). This is noteworthy because the two professions are quite similar in their need for training situations created for educational purposes: in both, it is difficult to start training immediately in real-life situations. Appropriate alternatives, such as computer-based simulations, can create the opportunity to gain this experience.
7.1.4 Video-Based Simulations for Pre-Service Teachers’ Diagnosis of Students’ Scientific Reasoning Skills
Video-based simulations were developed as an environment to practice and measure pre-service teachers’ diagnostic competences concerning students’ scientific reasoning skills. As the diagnosis of cross-domain skills such as scientific reasoning skills may benefit from interdisciplinary collaboration, the simulations can be used for individual as well as collaborative diagnosing in interdisciplinary teams made up of teachers of different science subjects.
The simulation can best be understood in terms of the segment of reality it simulates. In this segment of reality, teachers of science subjects (physics or biology) have to diagnose the scientific reasoning skills of individual learners from their classes. For this purpose, they can observe these learners while they perform inquiry tasks in small groups during lessons in their respective subject. Teachers can watch and listen to their students while they generate research questions and formulate hypotheses, design and run experiments and document their observations, and draw conclusions from their observations concerning their hypotheses. They may also interrupt their students by asking questions about their research questions, hypotheses, observations, and conclusions in order to collect information about learners’ scientific reasoning that is not directly observable or fully transparent from their activities and dialogue. Based on the information gathered by observing and asking questions of their students during these lessons, they can arrive at a diagnosis of each learner’s scientific reasoning skills. Beyond such individual diagnoses, teachers may exchange their observations and discuss their diagnoses with colleagues who teach a different science subject to the same learners and therefore may have collected complementary information about these learners, which may support, contradict, or extend their own diagnoses. Hence, the teachers may collaborate to arrive at a joint diagnosis of each learner’s scientific reasoning skills.
The simulation tries to mimic this segment of reality. It is therefore introduced as a kind of role play. Pre-service teachers have to picture themselves as a teacher working in their own school subject. Staged videos of learner dyads are used to simulate a small segment of teachers’ experiences during lessons, including the opportunity to observe learners’ activities and dialogue and select questions they would like to ask the learners to gain deeper insights into their scientific reasoning during these inquiry tasks. The pre-service teachers’ task is to diagnose the scientific reasoning skills of one pre-designated learner from the dyad captured in the video. After watching the video, they are asked to individually write down a diagnosis concerning this learner’s scientific reasoning skills. In the collaborative version of the simulation, they then enter a phase of interdisciplinary collaboration with a pre-service teacher for the other science subject (physics or biology) in order to generate a joint diagnosis of the learner’s scientific reasoning skills that integrates the observations and conclusions from both science subjects. To arrive at their joint diagnosis, they can talk to each other and use material from their individual diagnoses. The video simulations were implemented as follows:
Platform
The simulation environment runs in a standard web browser. It is written in PHP, HTML, and JavaScript, and uses a MySQL database to store configuration tables and log files. The platform also provides test and questionnaire functionalities for empirical studies on the instructional design of the video simulations.
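To illustrate the kind of event logging such a platform typically performs, the following JavaScript sketch builds a log record for one participant interaction. All names here (the function, the fields, the event type) are hypothetical assumptions for illustration, not the environment's actual schema.

```javascript
// Hypothetical sketch of a log record as the platform might store it in its
// MySQL log tables; field and event names are assumptions, not the actual schema.
function makeLogEntry(participantId, eventType, payload) {
  return {
    participant_id: participantId,      // anonymized participant code
    event_type: eventType,              // e.g., "video_link_selected"
    payload: JSON.stringify(payload),   // event details, serialized for storage
    timestamp: Date.now()               // client-side time of the event
  };
}

// Example: a participant selects the video link for question 7.
const entry = makeLogEntry("p042", "video_link_selected", { questionId: 7 });
```

Records of this kind would allow the diagnostic activities (e.g., which questions were selected and when) to be reconstructed for the empirical studies mentioned above.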
Interface
During the video simulations with staged videos of learner dyads who collaborate on inquiry tasks, the computer screen is divided into four parts (see Fig. 7.1):
1. The videos are displayed in the top-left area (“video area”).

2. The top-right area (“inquiry table”) displays a worksheet that the learners in the video use to document their experiments in handwriting. It contains a table with one row per experiment and columns for the research questions and/or hypotheses, the settings of the four independent variables, the measured values of the dependent variable, and a conclusion. The inquiry table always displays the worksheet state corresponding to the current state of the video: Each time one of the learners starts to take notes about their current experiment, all the information that is written down at this point is displayed at once so that the pre-service teachers can immediately process this information. This information enables the pre-service teachers to keep track of the experiments the students have already conducted.

3. The bottom-right area (“note pad”) comprises a text box for notes participants can write down while watching the video, just as teachers could take notes during their lessons. In some versions of the simulation environment the note pad contains some text that structures the pre-service teachers’ notes. The notes are saved and displayed again later when participants write their final diagnosis.

4. The bottom-left area (“navigation area”) displays questions (“video links”) that serve as links to short video segments that can be inserted at certain points of the main video and that contain a voice-over of a teacher asking the respective question to the learners in the video along with their responses.
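As a rough illustration of the inquiry table described above, one of its rows could be represented as a plain JavaScript object. The field names and example values below are assumptions made for illustration only, not the interface's actual data model.

```javascript
// One inquiry-table row for the biology task, as the interface might represent
// the learners' worksheet state (field names and values are assumed).
const inquiryRow = {
  experiment: 1,
  hypothesis: "More water makes the plant grow taller",
  independentVariables: { water: "high", salt: "none", fertilizer: "no", powder: "no" },
  dependentVariable: "plant height after two weeks: 12 cm",
  conclusion: "Water seems to increase growth"
};
```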
Video Material
The videos show a classroom situation focused on two students. Several scripted videos were produced that show these students performing two inquiry tasks. The tasks are based on the two scientific experiments described above: the physics experiment concerns lenses, and the biology experiment concerns the growth of plants. Both experiments have exactly the same structure. In both cases, the learners in the video have to find out whether and how the dependent variables—plant growth and optimal distance between lens and imaging screen, respectively—are influenced by four independent variables. In physics, the four independent variables are (1) the curvature of the lens, (2) the size of the lens, (3) the distance between the object and the lens, and (4) a so-called polarizing filter. In biology, the four variables are (1) the amount of water, (2) salt, (3) a fertilizer stick, and (4) an unspecified white powder. The videos are the pre-service teachers’ main source of information, supplemented only by the inquiry table that documents the learners’ experiments.
Developing Video Scripts
As a first step in creating the simulations, we wrote several fictional student profiles specifying appropriate values for all relevant scientific reasoning subskills, with the objective of depicting realistic, average students. We then wrote scripts matching these profiles. These scripts were later handed to the student actors so they could prepare for their roles and learn their dialogue.
Interaction
By default, typical media player control elements (e.g., play, pause, stop, forward, backward, replay, and time bar functionalities as well as a time display) are disabled for the video area. Thus, the simulation platform mimics classroom instruction, during which there is likewise no opportunity to interrupt or revisit parts of the flow of events. That said, video interactivity and reflection phases may be helpful design features of video simulations and can also be investigated in this simulation environment.
The video links in the navigation area constitute the essential feature of the environment that renders it a simulation, because they enable the participants to “interact with the students” in the videos (see Fig. 7.2). During the planning and documentation phases of each experiment in the video, groups of video links with questions that might be appropriate at this point are displayed in the navigation area. When the learners run the experiment or move on to the next experiment, the group of video links disappears and is eventually replaced by a new group of video links.
If a participant decides to ask a certain question (for example: “What do you want to find out now?”), he or she may click on the corresponding link. The video segment containing the teacher question and learner response is then inserted at the next appropriate point in the main video. Until that point, participants can withdraw their selection by clicking on the video link a second time. They may also select more than one video link; in that case, the corresponding video segments are played in a prespecified sequence. After the additional video segment has been played, the main video continues. Only the remaining video links are then displayed; hence, no video segment can be viewed twice.
After the main video has ended, a group of video links is displayed that comprises questions which do not refer to individual experiments, but rather to the sequence of experiments as a whole (see Fig. 7.2). One example of these ending questions is: “Is there one or even more than one experiment that wasn’t completely necessary and therefore could have been left out?” When the participant selects one of these video links, the video segment with the corresponding question is played immediately. After the video segment has ended, again only the remaining video links are displayed, and the next question can be selected.
The participants have only limited time for questions during each simulation. It is therefore impossible to view all additional video segments, and participants have to choose the most relevant and important ones. These interactions should always serve the purpose of gaining additional relevant information about the learner’s scientific reasoning skills that cannot be obtained from the main video. In some cases, it also makes sense to postpone the selection of a specific question, because the corresponding information may appear in the main video later on, and to ask the question only if it turns out that the main video does not contain the information. To help the participants keep track of the available time, both the time remaining for additional questions and the length of the video segments corresponding to the video links are displayed in the navigation area.
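The selection mechanics described above (toggling a question before playback, a limited time budget, a prespecified playback order, and no repeated viewing) can be sketched in JavaScript. The class below is a minimal illustration under these assumptions; its names and bookkeeping are not the platform's actual implementation.

```javascript
// Minimal sketch of the video-link selection logic (names are assumptions):
// selections can be toggled until playback, a time budget is enforced,
// segments play in a prespecified order, and no segment can be viewed twice.
class QuestionQueue {
  constructor(timeBudgetSeconds) {
    this.remaining = timeBudgetSeconds; // time left for additional questions
    this.selected = [];                 // links chosen but not yet played
    this.viewed = new Set();            // links already played
  }
  // Clicking a link selects it; clicking it again withdraws the selection.
  toggle(link) {
    if (this.viewed.has(link.id)) return false;       // cannot be viewed twice
    const i = this.selected.findIndex(l => l.id === link.id);
    if (i >= 0) {                                     // withdraw selection
      this.selected.splice(i, 1);
      this.remaining += link.duration;
      return false;
    }
    if (link.duration > this.remaining) return false; // not enough time left
    this.selected.push(link);
    this.remaining -= link.duration;
    return true;
  }
  // At the next insertion point, segments play in their prespecified order.
  playNext() {
    this.selected.sort((a, b) => a.order - b.order);
    const link = this.selected.shift();
    if (link) this.viewed.add(link.id);
    return link ?? null;
  }
}

// Example: a 60-second budget; two questions fit, a third does not.
const queue = new QuestionQueue(60);
queue.toggle({ id: 1, duration: 20, order: 2 });
queue.toggle({ id: 2, duration: 30, order: 1 });
```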
7.1.5 Measuring Pre-Service Teachers’ Diagnostic Activities and the Quality of Their Diagnoses of Students’ Scientific Reasoning Skills
The participants’ performance in the simulation is later evaluated using accuracy and efficiency measures. Accuracy captures the quality of the participants’ performance in the simulations in terms of choosing the “right” questions, that is, those that can be expected to provide useful information for the diagnostic process. Since some unimportant questions are needed as distractors, there are also questions that are either completely irrelevant or focus on information that can easily be acquired just by watching the main video. Efficiency, in turn, relates accuracy to the time spent. This is important because participants are encouraged to use their limited time for questions wisely.
In addition to the performance evaluation in the simulations, we also evaluate the participants’ written diagnoses using only a measure of accuracy. Both the individual diagnoses and—in the collaborative test condition—the additional collaborative diagnoses are rated by comparing them to a sample solution. This sample solution is based on the student profiles used to create the scripts, which include the envisaged values for all relevant scientific reasoning subskills. The level of congruence between the sample solution and the individual diagnosis is considered as an accuracy measure.
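A minimal sketch of these two performance measures, assuming accuracy is operationalized as the share of selected questions that are "right" and efficiency as accuracy per unit of time; the chapter does not specify the exact operationalization, so the formulas below are illustrative assumptions.

```javascript
// Accuracy: proportion of the participant's selected questions that belong to
// the set of "right" (diagnostically useful) questions. Assumed formula.
function accuracy(selectedIds, relevantIds) {
  const relevant = new Set(relevantIds);
  const hits = selectedIds.filter(id => relevant.has(id)).length;
  return selectedIds.length ? hits / selectedIds.length : 0;
}

// Efficiency: accuracy in proportion to the time used. Assumed formula.
function efficiency(acc, timeUsedSeconds) {
  return timeUsedSeconds > 0 ? acc / timeUsedSeconds : 0;
}

// Example: 2 of 3 selected questions were relevant.
const acc = accuracy([1, 2, 3], [1, 2, 4]);
```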
7.1.6 Research on (Support for) Pre-Service Teachers’ Diagnosis of Students’ Scientific Reasoning Skills in Video-Based Simulations
The simulation environment and the video simulations described in this contribution provide a basis for investigating several important research questions concerning pre-service teachers’ diagnosis of students’ scientific reasoning skills. In our research, we focus on two main areas: the role of different types and content-related facets of professional knowledge for (pre-service) teachers’ diagnostic activities and the quality of diagnoses of students’ scientific reasoning skills on the one hand, and the kinds of scaffolding that foster the development of pre-service teachers’ individual and collaborative diagnostic competences concerning students’ scientific reasoning skills in video-based simulations on the other. Putting our research interests in context, we focus on Research Questions 2 and 4, as mentioned in both the introduction by Fischer et al. (2022) and the concluding chapter by Opitz et al. (2022). In particular, we investigate
1. how conceptual content knowledge, scientific reasoning skills, and conceptual pedagogical content knowledge about scientific reasoning and its diagnosis among pre-service teachers in physics and biology are related to their diagnostic activities and the quality of their diagnoses,

2. how the collaborative vs. individual development of a diagnosis influences diagnostic activities and the quality of the diagnosis, as well as what role the distribution of information (shared vs. separate experiences of learners’ inquiry activities during lessons) plays in this respect, and

3. to what extent a collaboration script for joint diagnosis can enhance diagnostic activities and the quality of the diagnosis as well as the development of individual and collaborative diagnostic competences.
Thus, in the long run, the present research may contribute to the improvement of teacher education at universities.
References
Baumert, J., & Kunter, M. (2006). Stichwort: Professionelle Kompetenz von Lehrkräften. Zeitschrift für Erziehungswissenschaft, 9, 469–520. https://doi.org/10.1007/s11618-006-0165-2
Chen, Z., & Klahr, D. (1999). All other things being equal: Acquisition and transfer of the control of variables strategy. Child Development, 70, 1098–1120. https://doi.org/10.1111/1467-8624.00081
Chernikova, O., Heitzmann, N., Opitz, A., Seidel, T., & Fischer, F. (2022). A theoretical framework for fostering diagnostic competences with simulations. In F. Fischer & A. Opitz (Eds.), Learning to diagnose with simulations—Examples from teacher education and medical education. Springer.
de Jong, T. (2006). Computer simulations. Technological advances in inquiry learning. Science, 312, 532–533. https://doi.org/10.1126/science.1127750
Acknowledgments
The research presented in this chapter was funded by a grant from the Deutsche Forschungsgemeinschaft (DFG-FOR 2385) to Christof Wecker, Birgit Neuhaus, and Raimund Girwidz (WE 5426/2-1).
Rights and permissions
Open Access This chapter is licensed under the terms of the Creative Commons Attribution 4.0 International License (http://creativecommons.org/licenses/by/4.0/), which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons license and indicate if changes were made.
The images or other third party material in this chapter are included in the chapter's Creative Commons license, unless indicated otherwise in a credit line to the material. If material is not included in the chapter's Creative Commons license and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder.
Copyright information
© 2022 The Author(s)
Cite this chapter
Pickal, A. J., Wecker, C., Neuhaus, B. J., & Girwidz, R. (2022). Learning to diagnose secondary school students' scientific reasoning skills in physics and biology: Video-based simulations for pre-service teachers. In F. Fischer & A. Opitz (Eds.), Learning to diagnose with simulations. Springer, Cham. https://doi.org/10.1007/978-3-030-89147-3_7
Publisher Name: Springer, Cham
Print ISBN: 978-3-030-89146-6
Online ISBN: 978-3-030-89147-3
eBook Packages: Behavioral Science and Psychology (R0)