Abstract
This study investigated the relationships of teacher-directed approaches with science achievement in Australian schools. The data for this study were drawn from the Program for International Student Assessment (PISA) 2015 database and analysed using multilevel modelling (MLM). MLMs were estimated to test the contribution of each item to students’ science achievement scores and to estimate the mediation effect of teacher explanations on these relationships. Only explicit, teacher-directed practices demonstrated a significant, positive association with science achievement. The positive, significant nature of the item ‘the teacher explains scientific ideas’ (B = 29.61, p < 0.001) suggested that this practice should take place in all science lessons. In the mediation model, the explicit, teacher-directed approaches in the inquiry scale revealed a significant indirect effect on science achievement, through the process of the teacher explaining scientific ideas. This indicated that effective explanations also underpin other instructional approaches such as contextualised science learning. These findings, accompanied by an analysis of the teacher-directed items and their relationships to science outcomes, give teachers and policymakers clear guidance regarding the effective use of instructional explanations in the science classroom.
Recent studies have reported negative relationships between inquiry-based instruction and science achievement in international large-scale assessments (ILSAs; Areepattamannil et al., 2011; Cairns, 2019; Cairns & Areepattamannil, 2019; Lavonen & Laaksonen, 2009; OECD, 2016a; Oliver et al., 2019). The inquiry-based instruction constructs employed in these assessments are conceptualised in broad terms. For example, the Program for International Student Assessment (PISA) 2015 survey included an inquiry-based instruction scale that encompassed items describing inquiry activities that had been previously categorised at varying levels of teacher guidance (Jiang & McComas, 2015) and four conceptual domains (OECD, 2014). The experimental inquiry-based learning literature reports positive effects on student science performance (e.g. Espinoza, 2015; Furtak et al., 2012; Minner et al., 2010). The largest positive effects were reported for inquiry approaches that (a) focused on specific conceptual domains (epistemic; Furtak et al., 2012; Minner et al., 2010), (b) provided structured inquiry learning experiences (Furtak et al., 2012; Hmelo-Silver, 2011; Hmelo-Silver et al., 2007; Salamon et al., 1991), and (c) involved students with sufficient prior subject knowledge (van Riesen et al., 2018) and process skills knowledge (Bredderman, 1985; Klahr & Nigam, 2004).
There is a limited range of correlational research using ILSA data investigating teacher-directed approaches and their associations with science achievement (Cairns, 2019; Denoël et al., 2017). This study addresses this gap by exploring the effects of teacher-directed practices (measured as individual items in the student questionnaire) included in the PISA 2015 scaled indexes, and the mediating effects of teacher explanations on any significant relationships found. All analyses were carried out using a representative sample of 15-year-old students in Australian schools from the PISA 2015 dataset (OECD, 2017a). The PISA project is an OECD initiative involving a standardised assessment of the capacity of students to apply reading, mathematics, and science knowledge to authentic, workplace-related contexts. The PISA 2015 survey focused on scientific literacy. The use of the term scientific literacy to reflect student performance in science (science achievement) illustrates that the application of scientific knowledge to authentic contexts is central to the OECD’s beliefs relating to the intended outcomes of science education (OECD, 2017b). The construct of scientific literacy requires students to be able to explain phenomena scientifically, evaluate and design scientific inquiry, and interpret data and evidence. These competencies draw on three types of knowledge: (1) content, (2) procedural, and (3) epistemic. Accordingly, the terms scientific literacy and science achievement will be used interchangeably in this study.
Teacher-Directed Instruction
Teacher-directed instruction (TDI) is a construct included in the PISA 2015 student background questionnaire that measures student responses to items relating to teacher-led instruction. The scale measures students’ exposure to teaching practices whereby the teacher is wholly responsible for initiating and structuring student learning (OECD, 2014). An early form of TDI, coined Direct Instruction (DI), was originally developed by Engelmann in the 1960s (Bereiter & Engelmann, 1966) and focuses on conceptual modelling, conceptual reinforcement, and informative feedback targeting progression towards mastery (Joyce et al., 2000). Direct Instructional theory is underpinned by two key concepts: (a) learners can only access new materials and tasks when they have mastered the prerequisite knowledge and skills and (b) the instruction must be explicit and unambiguous. DI theory is not predicated on constructivism and the uniqueness of the learner but rather on the mastery of learning through carefully sequenced and highly structured examples (Stockard et al., 2018). Upon mastery of essential knowledge and skills, DI theory states that this existing “foundation” facilitates the next stage of learning. Thus, learners are required to have achieved mastery before progressing and the curriculum itself must be designed to provide clear, progressive learning pathways (Stockard et al., 2018). For example, Gersten et al. (2001) reported a conceptualisation of DI that enabled a focus on the learning of discrete strategies and skills, educational concepts, and higher-order thinking skills involving the teaching of skills, initially in isolation (e.g. conclusion writing in a range of contexts) followed by the application of these skills (in combination with other skills, as appropriate), in ever more complex scenarios. Due to the selective processes in the DI model, learning must also be rewarding for the learners. 
This is achieved through continuous positive reinforcement (of learning behaviours) and frequent celebrations of student successes (such as progression in the successful acquisition of knowledge and skills; Stockard et al., 2018).
As such, DI encompasses both curriculum content and instructional approaches. Similar teacher-directed methods that are concerned only with teaching approaches (how to teach) include “direct instruction”, “explicit teaching” (e.g. Stanovich, 1980), and “systematic instruction” (Katz, 2007). These approaches conceptually overlap with DI teaching practices. For example, Rosenshine (2012) described essential direct instructional practices for successful teachers from a cognitive science, classroom practice, and cognitive support perspective. The 17 principles outlined included familiar practices such as giving clear and detailed instructions and explanations, providing a high level of active practice for all students, providing guidance for students as they begin to practice, providing model solutions to problems and modelling processes and strategies, providing regular feedback, re-teaching and re-explaining as necessary, ensuring students are well-prepared for independent practice, and supporting them when they start working independently (Rosenshine, 2012).
There is also a growing consensus in the instructional quality literature that the four basic elements of quality teaching (instructional clarity, cognitive activation, discourse features, and a supportive climate) include multiple elements of TDI (Klette et al., 2017). For example, cognitive clarity is considered an essential condition to positively influence students’ achievement. Cognitive clarity promoting approaches include explicit teaching, not only of domain specific concepts, but also how and when to use specific strategies (Afflerbach et al., 2013) supported by the modelling of how strategies can be applied in authentic contexts (Afflerbach et al., 2013; Gersten et al., 2009). Another component of effective instruction is cognitive activation through high-quality, challenging learning tasks (Klette et al., 2017) where connections between concrete and abstract concepts, and new and old content, skills and learning strategies are made clear through explicit instruction (Lipowsky et al., 2009).
Teacher-Directed Instruction in Practice
Implementation Studies
In terms of TDI in science lessons, a study designed to compare TDI, guided instruction, and minimal instruction when children are learning to design investigations (Hushman & Marley, 2015) reported that TDI was more effective in terms of learning process skills, recalling conceptual knowledge, application of process skills to different contexts, and effective evaluation of experimental designs.
A meta-analysis of 50 years of DI effectiveness literature showed that DI approaches were consistently positive across all subject areas (Stockard et al., 2018). More years of intervention and increased frequencies of exposure to DI methods further strengthened the observed effects (Stockard et al., 2018). Also, DI and other TDI approaches are reported as effective in research supporting children with learning difficulties (e.g. Datchuk, 2017; Head et al., 2018). Although there is a body of evidence to suggest that explicit instructional approaches are effective when working with academically at-risk students, there is no inherent reason why these students should learn differently to higher performing students (Martin, 2015). The key for an effective teacher is knowing when (and how often) to use appropriate instructional approaches (Martin, 2013).
In a review of TDI and academic achievement studies, Liem and Martin (2013) concluded that teacher-led approaches (where the teacher’s role is that of an activator of learning) were more effective than open-ended, student-led approaches (where teachers serve as facilitators of learning) due to the lower demands placed on working memory and executive functioning.
Correlational Studies
An analysis by Denoël et al. (2017) using the PISA 2015 dataset found that students who experienced a blend of TDI and inquiry-based instruction had the best science outcomes in European countries. They suggested that content mastery is a pre-requisite for successful inquiry-based learning, evidenced by a positive correlation between science literacy scores and students experiencing teacher-directed approaches in most lessons and inquiry-based approaches in some lessons. Another study demonstrated, through the disaggregation of the inquiry-based instruction scale into its component items, that the majority of teaching strategies included in the scale had a negative correlation with science attainment (Cairns, 2019). However, in a similar vein to the McKinsey report (Denoël et al., 2017), an “optimum” level of inquiry approaches was posited. For example, the frequency of students spending time in the laboratory doing experiments associated with the highest science scores was “some” lessons.
Although the recent addition of the TDI scale to the PISA 2015 background questionnaire enables researchers to test for associations between teacher-led instructional practices and science achievement, the items in the TDI scale do not appear to relate closely to the practices described above (OECD, 2016b) or to other detailed descriptions of teacher-directed instructional approaches (e.g. Hattie, 2012).
In the previously mentioned study of the effects of items from the inquiry-based learning scale, Cairns (2019) reported that two of the items that involved teacher explanations ‘the teacher explains how a science idea can be applied’ and ‘the teacher clearly explains relevance of science concepts to our lives’ exhibited a significant, positive relationship with science achievement. The explanation given for this relationship suggested that this positive association was due to the importance of students’ understandings of how scientific knowledge is applied and how it relates to their everyday lives. However, these statements consist of two parts: (1) the teacher-led explanation of a concept and (2) the concepts themselves (relevance and applications of science). It is possible that the mode of instruction (teacher-led explanations) could also be a factor contributing to the apparent success of these two approaches.
The Present Study
Considering the limited body of research regarding TDI in the correlational literature relating to ILSAs, this study investigated the relationships of teacher-led approaches with science achievement using the PISA 2015 dataset. We investigated these relationships by disaggregating the inquiry-based instruction and TDI scaled indexes, and identifying and analysing the teacher-directed approaches included in these constructs (while controlling for the effects of student-led approaches). The teacher-directed strategies identified from the 13 items (see Appendix for a full list of the items) are “the teacher explains how a science idea can be applied”, “the teacher clearly explains the relevance of science concepts to our lives”, “the teacher explains scientific ideas”, “the teacher discusses our questions”, and “the teacher demonstrates an idea”. This identification is derived directly from the wording of the items and informed by prior research (Hattie, 2012; Lau & Lam, 2017). Lau and Lam (2017) identified the first three items listed above as an “interactive application” construct through exploratory factor analysis of the Hong Kong data. The item “the teacher discusses our questions” was included as a component of the TDI construct conceptualised by Hattie (2012), and “the teacher demonstrates an idea” was included due to the unambiguous wording of the item.
For the next step, we investigated the mediating effect of the teacher explanation item (“the teacher explains scientific ideas”) on the significant, teacher-directed approaches from the inquiry-based instruction scale to explain any associations found between these variables. These approaches were applied to answer the following research questions:
1. What is the relationship between the reported frequency of experiencing teacher-directed instructional approaches in science lessons and science achievement?

2. Is the relationship between the reported frequencies of teacher-led instructional approaches in science lessons and science achievement mediated (or explained) by the frequency of students experiencing teacher-led instructional explanations?
Analysis
Data
The data for this study were drawn from the questionnaires in the PISA 2015 database (http://www.oecd.org/pisa/data/2015database) for Australia only. The sample consisted of 14,530 15-year-old students (male = 7367, 51%; female = 7163, 49%) from 758 schools. The schools were selected to represent the national target population (OECD, 2017a) and the sampling frame included any school that could currently be educating 15-year-old students. Schools were allocated to sampling strata appropriate to the Australian school system such that each type of school was proportionately represented. The stratification variables used in the Australian school sample included: state/territory (NSW, QLD and so on); sector (Catholic, independent or government schools); and modal grade (year 10; OECD, 2017a). For example, there were 23 independent schools from Queensland with year 10 students enrolled included in the PISA 2015 sample.
Measures
The outcome measure for this research was the students’ 10 plausible value scores in the scientific literacy domain. These plausible values are multiply imputed scores drawn for each student from a distribution estimated by combining IRT scaling of the student’s test performance with a regression model based on the student’s answers to the background questionnaire (OECD, 2009). As a result, we ran 10 separate analyses, one per plausible value, and pooled the results to yield the parameter estimates for a given model.
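The pooling step can be sketched as follows. This is a minimal illustration of Rubin's combination rules with hypothetical coefficients; the operational PISA analyses additionally use balanced repeated replication weights when estimating the sampling variance.

```python
import numpy as np

def pool_plausible_values(estimates, variances):
    """Pool a parameter estimated once per plausible value using
    Rubin's combination rules (the standard approach for PISA PVs)."""
    m = len(estimates)
    pooled = float(np.mean(estimates))          # final point estimate
    within = float(np.mean(variances))          # average sampling variance
    between = float(np.var(estimates, ddof=1))  # imputation variance across PVs
    total_var = within + (1 + 1 / m) * between  # total error variance
    return pooled, total_var ** 0.5             # estimate and its standard error

# Hypothetical coefficients from 10 models, one per plausible value
ests = [29.1, 29.8, 29.5, 30.0, 29.3, 29.7, 29.9, 29.4, 29.6, 29.8]
variances = [1.9 ** 2] * 10                     # squared standard errors
b, se = pool_plausible_values(ests, variances)
```

Note that the pooled standard error is always at least as large as the average per-analysis standard error, reflecting the extra uncertainty introduced by imputation.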
For the purposes of this study, individual items from the inquiry-based instruction and TDI constructs (raw scores) were used as predictor measures, as the effects of specific practices (described by individual items on the scales) were our area of interest.
Statistical Analysis
To answer research question 1, a two-level model was fitted that included the inquiry-based instruction items and the TDI items as predictors and the student- and school-level variables as covariates. To address research question 2, we estimated the mediating effect of the TDI scale item ‘the teacher explains scientific ideas’ on the teacher-led inquiry items (‘the teacher explains how a science idea can be applied’ and ‘the teacher clearly explains the relevance of science concepts to our lives’) and their associations with scientific literacy. Two-level models were run in which the mediator was regressed on each explanatory variable, and scientific literacy on both, yielding the direct effect (the relationship between the independent and dependent variable), the indirect effect (the product of the paths linking the independent variable, the mediator, and the dependent variable), and the total effect (the sum of the direct and indirect effects) on the outcome variable.
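The direct/indirect/total decomposition described above can be illustrated with a single-level sketch on simulated data. The study itself used two-level models with covariates; all variable values and effect sizes below are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 1000

x = rng.integers(1, 5, n).astype(float)        # predictor item (1 = never ... 4 = all lessons)
m = 0.5 * x + rng.normal(0.0, 1.0, n)          # mediator with a built-in 'a' path
y = 500 + 8.0 * x + 14.0 * m + rng.normal(0.0, 80.0, n)  # simulated science score

def ols(y, *cols):
    """Least-squares coefficients with an intercept prepended."""
    X = np.column_stack([np.ones(len(y)), *cols])
    return np.linalg.lstsq(X, y, rcond=None)[0]

a = ols(m, x)[1]                   # path a: mediator regressed on the predictor
_, b_path, c_prime = ols(y, m, x)  # path b and the direct effect c'
indirect = a * b_path              # indirect effect: product of paths a and b
total = c_prime + indirect         # total effect: direct plus indirect
```

The decomposition recovers, up to sampling error, the effects built into the simulation (indirect ≈ 0.5 × 14 = 7, direct ≈ 8).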
Further details regarding the individual items and their item codes, the control variables employed, and the specifics of analysis methods used are included in the Appendix.
Results
The means, mean differences (between Australia and the OECD average for each item), standard deviations, and correlations between the predictor variables used in the models are shown in Table 1. Items belonging to the same scale were positively correlated with each other, and items from different scales were negatively correlated. The differences between responses from students in Australia and the mean responses across all regions included in PISA 2015 did not show any notable deviations: students in the Australian PISA sample reported classroom experiences largely similar to the mean reported experiences across all sampled OECD regions. The correlations between the independent variables were sufficiently low (< 0.7), indicating an acceptable level of multicollinearity, which led to the inclusion of all 13 items in the model.
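The multicollinearity screen applied here (all pairwise |r| < 0.7) can be sketched as follows, using a simulated item matrix rather than the actual questionnaire data.

```python
import numpy as np

# Simulate 4 hypothetical questionnaire items for 500 students and induce a
# moderate correlation (~0.45) between the first two items
rng = np.random.default_rng(1)
items = rng.normal(size=(500, 4))
items[:, 1] += 0.5 * items[:, 0]

# All pairwise correlations (upper triangle, excluding the diagonal)
corr = np.corrcoef(items, rowvar=False)
pairwise = np.abs(corr[np.triu_indices_from(corr, k=1)])

# Rule-of-thumb screen: retain all predictors when every |r| < 0.7
acceptable = bool((pairwise < 0.7).all())
```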
The relationships of individual inquiry and TDI items with science literacy were estimated using a two-level MLM approach. The multilevel intercept-only model estimated a grand mean science achievement score of 504.92 (γ00 = 504.92), and statistically significant variation in the random effects for science literacy scores between students (level 1: σ2 = 8294.40) and across schools (level 2: τ00 = 2656.66). The intra-class correlation coefficients (ICCs) indicated that 78% of the science achievement score variance occurred between students and 22% across schools, justifying the use of a multilevel analytical approach.
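The school-level ICC follows from the standard variance-partition formula applied to the variance components quoted above; small differences from the reported percentages may reflect rounding in the published estimates or weighting in the PISA estimation.

```python
# Variance components from the intercept-only model reported in the text
sigma2_within = 8294.40    # level-1 (student) residual variance
tau00_between = 2656.66    # level-2 (school) intercept variance

# Share of total variance attributable to differences between schools
icc = tau00_between / (tau00_between + sigma2_within)
```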
Research Question 1
The relationships between the 13 items from the PISA 2015 inquiry-based and TDI scales as estimated by the full MLM model are shown in Table 2.
The approaches of giving students opportunities to explain their ideas, asking students to draw conclusions from an experiment they have conducted, allowing students to design their own experiments, holding class debates about investigations, the teacher discussing students’ questions, and the teacher demonstrating an idea had no significant linear associations with science achievement when accounting for other teaching approaches and student- and school-level demographics. A range of teaching practices (as described by the items) were found to have a significant, negative relationship with science achievement. Where a science lesson involves a whole-class discussion with the teacher, a one-unit increase in the frequency of experiencing this approach (for example, from “some” lessons to “most” lessons) was associated with a nearly 10-point decrease in science literacy (B = −9.82, p < 0.001). Similar negative relationships were observed for the items “students spend time in the laboratory doing practical experiments” (B = −7.95, p < 0.01), “students are required to argue about science questions” (B = −8.71, p < 0.001), and “students are asked to do an investigation to test ideas” (B = −7.14, p < 0.001).
Classroom practices associated with a significant increase in science achievement were all teacher-directed approaches: the teacher explaining how a science idea can be applied (B = 8.56, p < 0.001), the teacher clearly explaining the relevance of science concepts to students’ lives (B = 6.02, p < 0.01), and the teacher explaining scientific ideas (B = 29.61, p < 0.001). The positive relationship for the teacher explaining scientific ideas was by far the largest of all the teaching approaches in the model. A one-point increase in the frequency of experiencing teacher explanations about scientific ideas corresponds, on average, to a 29.61-point increase in a student’s science score. In other words, the average student experiencing teacher explanations in all science lessons is predicted to score nearly 90 points higher in scientific literacy than a student who never experiences a teacher delivering scientific explanations. This is approximately equivalent to 2.25 years of learning progression (Jerrim et al., 2019). In a previous study, Cairns (2019) demonstrated that the explicit, teacher-led approaches included in the inquiry-based instruction scale were strongly and positively related to science achievement (albeit for 69 countries). The results described above suggest that this relationship is weaker when items from the TDI scale are included in the model, specifically the approach of the teacher explaining scientific ideas. This observation suggests the presence of a mediation effect, which is addressed by research question 2.
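The “nearly 90 points” figure follows arithmetically from the coefficient and the four-point frequency scale. A minimal check (the points-per-year conversion is inferred from the 2.25-year equivalence above, not a figure we take independently from Jerrim et al.):

```python
b_explain = 29.61      # model coefficient: points per one-unit frequency step
scale_span = 3         # 'never' (1) to 'all lessons' (4) spans three steps

predicted_gap = b_explain * scale_span   # predicted score gap, roughly 88.8 points

# The 2.25-year equivalence quoted in the text implies roughly 39-40 PISA
# points per year of learning progression
points_per_year = predicted_gap / 2.25
```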
Research Question 2
The mediation analysis showed a statistically significant indirect effect of “the teacher explains how a science idea can be applied” on science achievement through the mediator “the teacher explains scientific ideas”, estimate = 6.82, 99% CI [5.20, 8.45]. There was also a statistically significant direct effect on science achievement, estimate = 8.62, 99% CI [3.48, 13.78], and the total effect was 15.45, 99% CI [10.28, 20.62]. The direct effect in the mediation model remained significant (see Fig. 1), suggesting a partial mediation effect. Of the total effect of 15.45 points, 6.82 points are related to the teacher explaining how a science idea can be applied operating through the process of explaining scientific ideas, and 8.62 points are related directly to the teacher explaining how a science idea can be applied (without influence from the teacher explaining scientific ideas).
There was also a significant indirect effect of “the teacher clearly explains the relevance of science concepts to our lives” on science achievement through the mediator “the teacher explains scientific ideas”, estimate = 7.02, 99% CI [5.26, 8.79] (see Fig. 1), alongside a significant direct effect, estimate = 6.19, 99% CI [1.36, 11.02], and total effect, estimate = 13.21, 99% CI [8.40, 18.03]. As such, the total effect of the teacher clearly explaining the relevance of science concepts to students’ lives is 13.21 points of science achievement. Of these, 7.02 points are related to the effect of the teacher clearly explaining the relevance of science concepts operating through the process of explaining scientific ideas, and 6.19 points are related directly to the teacher clearly explaining the relevance of science concepts to students’ lives.
Discussion
The approaches found to be associated with increased levels of scientific literacy all involve teacher-led practices. These include, “the teacher explains scientific ideas”, “the teacher explains how a science idea can be applied”, and “the teacher clearly explains the relevance of science concepts to our lives”. However, some likely teacher-led teaching strategies (namely, “the teacher discusses our questions” and “the teacher demonstrates an idea”) did not have a statistically significant relationship.
These results are noteworthy as teacher demonstrations are common practice in science lessons and are used to build conceptual knowledge through methods such as conceptual conflict and anchor-bridge models (Thijs & Bosch, 1995). The pedagogical rationale for a teacher using demonstrations usually revolves around the students’ lack of mastery regarding the manipulative and process skills involved (van den Berg & Giddings, 1992). However, other (non-cognitive) motivations may include time pressures, limited resources, and health and safety concerns. The lack of a relationship between the frequency of teacher demonstrations and science achievement appears counter-intuitive for such a well-used teaching strategy, but there are some studies that question the efficacy of science teacher demonstrations (Roadruck, 1993; Roth et al., 1997). One important aspect of successful demonstrations is the alignment of the content with the cognitive level of the students (Roadruck, 1993). For a student to accurately decode the meaning of a demonstration, they must have sufficient understanding of the underlying principles of the phenomena they are observing (Roth et al., 1997). As the PISA background questionnaires do not provide information regarding the quality of teacher demonstrations, one possible explanation is that the effectiveness of teacher demonstrations varies considerably between schools and teachers. A similar issue may exist with regard to the item “the teacher discusses our questions”, where low-quality classroom discussions (also see the item “a whole class discussion takes place with the teacher”, below) may be taking place. There is evidence that classroom discussions, especially discussions involving students talking about their learning that lead to instructional adaptation, have a positive impact on achievement (Hattie, 2012).
Teaching strategies related to a decreased level of science achievement included the following: “students spend time in the laboratory doing practical experiments”, “students are required to argue about science questions”, “students are asked to do an investigation to test ideas”, and “a whole class discussion takes place with the teacher”. Although students spending time in the laboratory doing experiments showed a negative relationship with science achievement in this model, previous studies have reported a curvilinear relationship whereby there is an optimum frequency (in some lessons), beyond which the relationship is negative (Cairns, 2019; Oliver et al., 2019), possibly due to instructional time constraints. Cairns (2019) suggested that, for laboratory-based work to lead to improved phenomenological understanding, sufficient time should be allocated for reflective opportunities; this would be difficult to provide if experimental work took place in all lessons.
Statistically significant direct, indirect, and total effects were estimated for the mediating effect of “the teacher explains scientific ideas” on the relationship between “the teacher explains how a science idea can be applied” and “the teacher clearly explains the relevance of science concepts to our lives” with science achievement. A previous analysis of the individual items on the inquiry-based instruction scale (Cairns, 2019) suggested that these positive associations were related to the pedagogical effectiveness of applying scientific concepts and relating classroom-based science learning to every-day contexts. However, these findings suggest that teacher explanations themselves play a significant role in the associations of these two items (“the teacher explains how a science idea can be applied” and “the teacher clearly explains the relevance of science concepts to our lives”).
Intuitively, explanations must play a major role in science teaching, and they are certainly ubiquitous in educational environments. Explanations can be delivered in many formats and contexts, and learners generally know when they have received a sufficiently good explanation (Geelan, 2013). Although there is a paucity of research regarding the quality of explanations in educational settings (instructional explanations), some researchers have developed conceptual frameworks for understanding the effectiveness of explanations (Wittwer & Renkl, 2008). Examples of instructional explanations in science include explicit reference to the underlying principles behind natural phenomena (such as motion), allowing learners to generalise to other, more complex, contexts (Chi et al., 1989). Effective instructional explanations can be characterised as being aligned with the learner’s current level of prerequisite knowledge, centred on underpinning concepts and principles, and (perhaps most importantly; Webb & Farivar, 1999; Webb et al., 1995) associated with the current learning in such a way that allows for the practical application of the knowledge shared in the explanation (Wittwer & Renkl, 2008). Although the PISA questionnaires only provide information regarding the perceived frequency of experiencing various approaches, it is reasonable to assume that a learner knows when they have received a high-quality explanation as it will have fulfilled their cognitive needs (Keil et al., 2000). It follows, then, that when a student reports “the teacher explains scientific ideas” in, for example, most lessons, it can be assumed that the teacher’s explanation was sufficiently successful for the learner to have recognised it as an explanation.
Implications for Teacher Development
In terms of classroom practice, instructional explanations should be very carefully constructed to ensure they are effective, as clear, instructional explanations are related to considerably higher science literacy scores. The effectiveness of instructional explanations is dependent on the teacher’s awareness of the learners’ current levels of understanding (namely, the existence of prior and sometimes faulty knowledge). Indeed, types of instructional explanation that are inappropriate for advanced learners can be superfluous (or even detrimental; Wittwer & Renkl, 2008). As such, effective teacher explanations are inextricably linked to high-quality assessment practices.
Once the current level of knowledge and understanding of learners has been ascertained by the science teacher, appropriate scientific instructional explanations should focus on: clarifying the underlying principles of scientific phenomena; clear guidance on when to apply scientific strategies (for example, when to draw on particle theory to solve a scientific problem); modelling of how to apply scientific strategies (for instance, specifically how to draw a conclusion from collected data); demonstrating how to make links in related content (such as abstract and concrete concepts or prior learning with new learning); and explicitly establishing how to reflect on the science learning process.
Limitations
There are three major limitations to the study. First, this study employed a correlational research design. Although correlational research designs, like observational research designs, tend to demonstrate relatively high ecological validity, causation cannot be assumed from correlation. Further research employing longitudinal or experimental research designs is warranted to examine causal relations among the variables of interest in the study. Second, correlational studies of this nature often use indirect measures of instructional practices. Instead of observing teachers’ actual instructional practices in classroom, such studies frequently rely on proxy measures such as students’ perceptions of their teachers’ instructional practices. Future research based on observational studies may better capture teachers’ actual instructional practices in classroom. Third, data for the present study were drawn from only one of the 72 PISA 2015 participating countries and economies, thereby limiting the generalisability of the findings. More cross-national and cross-cultural research is needed to further validate the findings and conclusions of this study.
Conclusions
Despite these limitations, this study provides empirical support for the importance of providing clear instructional explanations for students in all science lessons. Although these explanations are directly related to science achievement, they also underpin other teaching approaches (such as applying conceptual knowledge to practice and relating science concepts to the everyday lives of students). Previous studies have also suggested that teacher explanations (as determined from the TDI scale, a scale that has been shown by this study to be dominated by the positive effect of teacher explanations) are likely a prerequisite for effective inquiry-based instruction (Denoël et al., 2017). Further work in this area should focus on analysing the effectiveness of science teaching methodologies that include explicit, teacher-led instruction (for a given concept) prior to students experiencing open, inquiry-based approaches.
References
Afflerbach, P., Cho, B. Y., Kim, J. Y., Crassas, M. E., & Doyle, B. (2013). Reading: What else matters besides strategies and skills? The Reading Teacher, 66(6), 440–448.
Areepattamannil, S., Freeman, J., & Klinger, D. (2011). Influence of motivation, self-beliefs, and instructional practices on science achievement of adolescents in Canada. Social Psychology of Education, 14(2), 233–259. https://doi.org/10.1007/s11218-010-9144-9.
Bereiter, C., & Engelmann, S. (1966). Teaching disadvantaged children in the preschool. Prentice-Hall.
Bredderman, T. (1985). Laboratory programs for elementary school science: a meta-analysis of effects on learning. Science Education, 69(4), 577–591. https://doi.org/10.1002/sce.3730690413.
Cairns, D. (2019). Investigating the relationship between instructional practices and science achievement in an inquiry-based learning environment. International Journal of Science Education, 41(15), 2113–2135. https://doi.org/10.1080/09500693.2019.1660927.
Cairns, D., & Areepattamannil, S. (2019). Exploring the relations of inquiry-based teaching to science achievement and dispositions in 54 countries. Research in Science Education, 49(1), 1–23. https://doi.org/10.1007/s11165-017-9639-x.
Chi, M. T., Bassok, M., Lewis, M. W., Reimann, P., & Glaser, R. (1989). Self-explanations: How students study and use examples in learning to solve problems. Cognitive Science, 13(2), 145–182. https://doi.org/10.1207/s15516709cog1302_1.
Datchuk, S. M. (2017). A direct instruction and precision teaching intervention to improve the sentence construction of middle school students with writing difficulties. The Journal of Special Education, 51(2), 62–71.
Denoël, E., Dorn, E., Goodman, A., Hiltunen, J., Krawitz, M., & Mourshed, M. (2017). Drivers of student performance: Insights from Europe. Mckinsey & Company. https://www.mckinsey.com/industries/social-sector/our-insights/drivers-of-student-performance-insights-from-europe.
Espinoza, F. (2015). Graphical representations and the perception of motion: Integrating isomorphism through kinesthesia into physics instruction. Journal of Computers in Mathematics and Science Teaching, 34(2), 133–154.
Furtak, E. M., Seidel, T., Iverson, H., & Briggs, D. C. (2012). Experimental and quasi-experimental studies of inquiry-based science teaching: A meta-analysis. Review of Educational Research, 82(3), 300–329.
Geelan, D. (2013). Teacher explanation of physics concepts: A video study. Research in Science Education, 43(5), 1751–1762. https://doi.org/10.1007/s11165-012-9336-8.
Gersten, R., Baker, S., Pugach, M., Scanlon, D., & Chard, D. (2001). Contemporary research on special education teaching. In V. Richardson (Ed.), Handbook of research on teaching (4th ed., pp. 695–722). AERA.
Gersten, R., Chard, D. J., Jayanthi, M., Baker, S. K., Morphy, P., & Flojo, J. (2009). Mathematics instruction for students with learning disabilities: A meta-analysis of instructional components. Review of Educational Research, 79(3), 1202–1242. https://doi.org/10.3102/0034654309334431.
Hattie, J. (2012). Visible learning for teachers: Maximizing impact on learning. Routledge.
Head, C. N., Flores, M. M., & Shippen, M. E. (2018). Effects of direct instruction on reading comprehension for individuals with autism or developmental disabilities. Education and Training in Autism and Developmental Disabilities, 53(2), 176–191.
Hmelo-Silver, C. E. (2011). Design principles for scaffolding technology-based inquiry. In M. O'Donnell, C. E. Hmelo-Silver, & E. Gijsbert (Eds.), Collaborative learning, reasoning, and technology (pp. 147–170). Routledge.
Hmelo-Silver, C. E., Duncan, R. G., & Chinn, C. A. (2007). Scaffolding and achievement in problem-based and inquiry learning: A response to Kirschner, Sweller, and Clark (2006). Educational Psychologist, 42(2), 99–107. https://doi.org/10.1080/00461520701263368.
Hushman, C. J., & Marley, S. C. (2015). Guided instruction improves elementary student learning and self-efficacy in science. The Journal of Educational Research, 108(5), 371–381. https://doi.org/10.1080/00220671.2014.899958.
Jerrim, J., Oliver, M., & Sims, S. (2019). The relationship between inquiry-based teaching and students’ achievement. New evidence from a longitudinal PISA study in England. Learning and Instruction, 61, 35–44. https://doi.org/10.1016/j.learninstruc.2018.12.004.
Jiang, F., & McComas, W. F. (2015). The effects of inquiry teaching on student science achievement and attitudes: evidence from propensity score analysis of PISA data. International Journal of Science Education, 37(3), 554–576. https://doi.org/10.1080/09500693.2014.1000426.
Joyce, B., Weil, M., & Calhoun, E. (2000). Models of teaching (6th ed.). Allyn & Bacon.
Keil, F. C., & Wilson, R. A. (Eds.). (2000). Explanation and cognition. MIT Press.
Klahr, D., & Nigam, M. (2004). The equivalence of learning paths in early science instruction: Effects of direct instruction and discovery learning. Psychological Science, 15(10), 661–667. https://doi.org/10.1111/j.0956-7976.2004.00737.x.
Klette, K., Blikstad-Balas, M., & Roe, A. (2017). Linking instruction and student achievement. A research design for a new generation of classroom studies. Acta Didactica Norge, 11(3), 1–19. https://doi.org/10.5617/adno.4729.
Lau, K.-c., & Lam, T. Y.-p. (2017). Instructional practices and science performance of 10 top-performing regions in PISA 2015. International Journal of Science Education, 39(15), 2128–2149. https://doi.org/10.1080/09500693.2017.1387947.
Lavonen, J., & Laaksonen, S. (2009). Context of teaching and learning school science in Finland: reflections on PISA 2006 results. Journal of Research in Science Teaching, 46(8), 922–944. https://doi.org/10.1002/tea.20339.
Liem, G. A. D., & Martin, A. J. (2013). Direct instruction and academic achievement. In J. Hattie & E. Anderman (Eds.), International guide to student achievement (pp. 366–368). Routledge.
Lipowsky, F., Rakoczy, K., Pauli, C., Drollinger-Vetter, B., Klieme, E., & Reusser, K. (2009). Quality of geometry instruction and its short-term impact on students' understanding of the Pythagorean theorem. Learning and Instruction, 19(6), 527–537. https://doi.org/10.1016/j.learninstruc.2008.11.001.
Martin, A. J. (2013). From will to skill: The psychology of motivation, instruction and learning in today's classroom. InPsych: The Bulletin of the Australian Psychological Society Ltd, 35(6), 10.
Martin, A. J. (2015). Teaching academically at risk students in middle school: The roles of explicit instruction and guided discovery learning. In S. Groundwater-Smith & N. Mockler (Eds.), Big fish, little fish: Teaching and learning in the middle years (pp. 29–39). Cambridge University Press.
Minner, D. D., Levy, A. J., & Century, J. (2010). Inquiry-based science instruction—What is it and does it matter? Results from a research synthesis years 1984 to 2002. Journal of Research in Science Teaching, 47(4), 474–496.
Muthén, L., & Muthén, B. (2020). Mplus HTML user's guide. Retrieved from http://www.statmodel.com/html_ug.shtml.
OECD. (2009). PISA Data Analysis Manual for SPSS Users (2nd ed.). OECD publishing.
OECD. (2014). PISA 2015 draft questionnaire framework. Retrieved from https://www.oecd.org/pisa/pisaproducts/PISA-2015-draft-questionnaire-framework.pdf.
OECD. (2016a). PISA 2015 Results (Volume I): Excellence and Equity in Education. OECD Publishing.
OECD. (2016b). PISA 2015 Results (Volume II): Policies and Practices for Successful Schools: PISA. OECD Publishing.
OECD. (2017a). PISA 2015 technical report. PISA, OECD Publishing.
OECD. (2017b). PISA 2015 assessment and analytical framework: Science, Reading, Mathematic, Financial Literacy and Collaborative Problem Solving. OECD Publishing. https://doi.org/10.1787/9789264281820-en.
Oliver, M., McConney, A., & Woods-McConney, A. (2019). The efficacy of inquiry-based instruction in science: A comparative analysis of six countries using PISA 2015. Research in Science Education, 1–22. https://doi.org/10.1007/s11165-019-09901-0.
Roadruck, M. D. (1993). Chemical demonstrations: Learning theories suggest caution. Journal of Chemical Education, 70(12), 1025. https://doi.org/10.1021/ed070p1025.
Rosenshine, B. (2012). Principles of instruction: Research-based strategies that all teachers should know. American Educator, 36(1), 12.
Roth, W. M., McRobbie, C. J., Lucas, K. B., & Boutonné, S. (1997). Why may students fail to learn from demonstrations? A social practice perspective on learning in physics. Journal of Research in Science Teaching, 34(5), 509–533. https://doi.org/10.1002/(SICI)1098-2736(199705)34:5%3C509::AID-TEA6%3E3.0.CO;2-U.
Salamon, G., Perkins, D. N., & Globerson, T. (1991). Partners in Cognition: Extending human intelligence with intelligent technologies. Educational Researcher, 20(3), 2–9. https://doi.org/10.3102/0013189x020003002.
Stanovich, K. E. (1980). Effects of explicit teaching and peer tutoring on the reading achievement of learning disabled and low-performing students in regular classrooms. Reading Research Quarterly, 16, 32–71. https://doi.org/10.1086/461851.
Stockard, J., Wood, T. W., Coughlin, C., & Khoury, C. R. (2018). The effectiveness of direct instruction curricula: A meta-analysis of a half century of research. Review of Educational Research. Advance online publication. https://doi.org/10.3102/0034654317751919.
Thijs, G. D., & Bosch, G. M. (1995). Cognitive effects of science experiments focusing on students’ preconceptions of force: A comparison of demonstrations and small-group practicals. International Journal of Science Education, 17(3), 311–323. https://doi.org/10.1080/0950069950170304.
van den Berg, E., & Giddings, G. (1992). Laboratory practical work: An alternative view of laboratory teaching (monograph). Western Australia: Curtin University, Science and Mathematics Education Centre.
van Riesen, S. A. N., Gijlers, H., Anjewierden, A., & de Jong, T. (2018). The influence of prior knowledge on experiment design guidance in a science inquiry context. International Journal of Science Education, 40(11), 1327–1344. https://doi.org/10.1080/09500693.2018.1477263.
Webb, N. M., & Farivar, S. (1999). Developing productive group interaction in middle school mathematics. In A. M. O'Donnell & A. King (Eds.), Cognitive perspectives on peer learning (pp. 117–149). Lawrence Erlbaum Associates.
Webb, N. M., Troper, J. D., & Fall, R. (1995). Constructive activity and learning in collaborative small groups. Journal of Educational Psychology, 87(3), 406–423. https://doi.org/10.1037/0022-0663.87.3.406.
Wittwer, J., & Renkl, A. (2008). Why instructional explanations often do not work: A framework for understanding the effectiveness of instructional explanations. Educational Psychologist, 43(1), 49–64. https://doi.org/10.1080/00461520701756420.
Appendix
Variables and Analysis
Individual Item Codes
The PISA 2015 inquiry-based instruction scaled index items:

- ST098Q01TA: Students are given opportunities to explain their ideas.
- ST098Q02TA: Students spend time in the laboratory doing practical experiments.
- ST098Q03NA: Students are required to argue about science questions.
- ST098Q05TA: Students are asked to draw conclusions from an experiment they have conducted.
- ST098Q06TA: The teacher explains how a science idea can be applied.
- ST098Q07TA: Students are allowed to design their own experiments.
- ST098Q08TA: There is a class debate about investigations.
- ST098Q09TA: The teacher clearly explains the relevance of science concepts to our lives.
- ST098Q10NA: Students are asked to do an investigation to test ideas.

The PISA 2015 teacher-directed instruction scaled index items:

- ST103Q01NA: The teacher explains scientific ideas.
- ST103Q03NA: A whole class discussion takes place with the teacher.
- ST103Q08NA: The teacher discusses our questions.
- ST103Q11NA: The teacher demonstrates an idea.
Item Recoding
The raw scores of individual inquiry scale items were reverse-coded [1 = never or hardly ever, 4 = in all lessons] and the items from the teacher-directed instruction scale remained unchanged [again, 1 = never or hardly ever, 4 = in all lessons]. The individual items were used in this study as a means of determining the relationship between science achievement and the frequency of each discrete practice described by the item (whilst controlling for the effects of other practices that students may be experiencing).
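The reverse-coding described above can be sketched as follows (a minimal illustration in Python; in practice this recoding would be applied within the statistical software, and the item values shown are hypothetical):

```python
# Reverse-code a response on the 1-4 frequency scale
# (1 = never or hardly ever, 4 = in all lessons),
# as applied to the inquiry scale items.
def reverse_code(response: int, low: int = 1, high: int = 4) -> int:
    """Map 1 -> 4, 2 -> 3, 3 -> 2, 4 -> 1."""
    return (low + high) - response

# Hypothetical raw responses to an inquiry item (e.g. ST098Q01TA).
raw_inquiry = [1, 2, 3, 4]
recoded = [reverse_code(r) for r in raw_inquiry]
print(recoded)  # [4, 3, 2, 1]

# Teacher-directed items (e.g. ST103Q01NA) are left unchanged.
```

After recoding, a higher score on every item indicates a higher reported frequency of the practice, which makes the coefficients for the two scales directly comparable.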
Control Variables
Further control variables included in this study were student- and school-level demographic and socioeconomic variables such as gender (1 = female, 0 = male), immigration status (1 = non-immigrant, 0 = immigrant), the index of economic, social, and cultural status (ESCS; a composite score of highest level of education of parents, highest parental occupational status, and home possessions; see OECD, 2017b), school ownership type (1 = public, 0 = other), school location (1 = rural, 0 = urban), and the index of schools’ science-specific resources.
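The binary control variables above follow a standard 0/1 dummy-coding scheme, which can be sketched as (a hypothetical illustration; the category labels are assumptions for demonstration only):

```python
# Dummy-code a categorical control variable: 1 for the focal category,
# 0 for the reference category (e.g. gender: 1 = female, 0 = male).
def dummy_code(value: str, focal: str) -> int:
    return 1 if value == focal else 0

# Hypothetical student records.
students = ["female", "male", "female"]
gender = [dummy_code(s, focal="female") for s in students]
print(gender)  # [1, 0, 1]
```

With this coding, each regression coefficient for a binary control is interpreted as the expected difference in science literacy for the focal category (coded 1) relative to the reference category (coded 0), holding the other predictors constant.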
Statistical Analysis
The IEA IDB Analyser: Analysis Module (version 4.0.23; https://www.iea.nl/data), combined with the IBM SPSS version 26 software package, was used to generate descriptive statistics. Multilevel modelling (MLM) analyses were carried out using Mplus version 8.2 (Muthén & Muthén, 2020). The parameter estimation technique used in the two-level MLM analysis was full information maximum likelihood with robust standard errors (MLR; Muthén & Muthén, 2020). To provide a more meaningful interpretation of the results (i.e. meaningful parameters when other parameters are set to zero) and to mitigate the effects of multicollinearity, all continuous variables in the model were grand mean centred.
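Grand mean centring simply subtracts the overall (pooled, across all students) mean from each continuous predictor, so that a coefficient describes the effect at the average value of that predictor. A minimal sketch, using hypothetical ESCS scores:

```python
# Grand mean centring: subtract the overall mean from each value,
# so zero corresponds to the average of the predictor across the sample.
def grand_mean_centre(values):
    mean = sum(values) / len(values)
    return [v - mean for v in values]

escs = [-1.2, 0.3, 0.5, 0.8]  # hypothetical ESCS index scores
centred = grand_mean_centre(escs)
print([round(v, 6) for v in centred])  # [-1.3, 0.2, 0.4, 0.7]
```

Centred values sum to zero by construction, which also reduces the correlation between predictors and any interaction terms built from them, mitigating multicollinearity.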
The 10 multiply imputed plausible values for each student outcome measure (science literacy) were used to generate 10 separate datasets for analysis using the TYPE = IMPUTATION option in the Mplus syntax. This resulted in 10 separate analyses that were pooled to yield the parameter estimates for each model. Normalised student weights were also included in the analysis to account for the two-stage stratified sampling methods employed (OECD, 2017a). Missing data were addressed using full information maximum likelihood (FIML) estimation in Mplus, which generates a likelihood function for each case based on the variables present, allowing all available data to be used when estimating model parameters.
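The pooling of estimates across the 10 plausible-value analyses follows the standard multiple-imputation combining rules (Rubin's rules), which Mplus applies internally. A sketch of the calculation for a single parameter, using hypothetical estimates and sampling variances:

```python
import math

def pool_plausible_values(estimates, sampling_vars):
    """Pool one parameter across M plausible-value analyses (Rubin's rules):
    the pooled estimate is the mean of the M estimates; the total variance
    is the mean within-analysis sampling variance plus the between-analysis
    variance inflated by (1 + 1/M)."""
    m = len(estimates)
    pooled = sum(estimates) / m
    within = sum(sampling_vars) / m
    between = sum((e - pooled) ** 2 for e in estimates) / (m - 1)
    total_var = within + (1 + 1 / m) * between
    return pooled, math.sqrt(total_var)

# Hypothetical regression coefficients and sampling variances
# from 10 plausible-value datasets (illustrative numbers only).
est = [29.0, 30.0, 29.5, 30.5, 29.2, 30.2, 29.8, 29.6, 30.1, 29.9]
var = [4.0] * 10
b, se = pool_plausible_values(est, var)
print(round(b, 2))  # 29.78
```

The between-analysis component captures the measurement uncertainty encoded in the plausible values, so pooled standard errors are larger than those from any single dataset analysed alone.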
Cairns, D., Areepattamannil, S. Teacher-Directed Learning Approaches and Science Achievement: Investigating the Importance of Instructional Explanations in Australian Schools. Res Sci Educ 52, 1171–1185 (2022). https://doi.org/10.1007/s11165-021-10002-0