Abstract
Blended forms of learning have become increasingly popular. Learning activities within these environments are supported by a large variety of online and face-to-face interventions. However, it remains unclear whether these blended environments are successful, and if they are, what makes them successful. Studies suggest that blended learning challenges the self-regulatory abilities of learners, though the literature does little to explain these findings; nor does it provide solutions. In particular, little is known about the attributes that are essential to support learners and how they should guide course design. To identify such attributes and enable a more thoughtful redesign of blended learning environments, this systematic literature review (n = 95) examines evidence published between 1985 and 2015 on attributes of blended learning environments that support self-regulation. The purpose of this review is therefore to identify and define the attributes of blended learning environments that support learners’ self-regulatory abilities. Seven key attributes were found: authenticity, personalization, learner control, scaffolding, interaction, cues for reflection and cues for calibration. This review is the first to identify and define the attributes that support self-regulation in blended learning environments, and may thus facilitate the design of environments that meet learners’ self-regulatory needs. It also raises crucial questions about how blended learning relates to well-established learning theories and provides a basis for future research on self-regulation in blended learning environments.
1 Introduction
During the last two decades we have seen a steep rise in computer- and web-based technologies, which has led to significant changes in education. Blended forms of learning have become increasingly popular (Garrison and Kanuka 2004; Garrison and Vaughan 2008; Graham 2006; Spanjers et al. 2015). Learning activities within these blended environments are supported by a large variety of online and face-to-face instructional interventions. As a result, blended learning environments differ widely in the technologies used, the extent of integration of online and face-to-face instruction and the degree to which online activities are meant to replace face-to-face instruction (Smith and Kurthen 2007). Despite their popularity, it remains unclear whether these environments are successful, and if they are, which attributes make them successful (Oliver and Trigwell 2005). An important observation is that blended learning seems to be especially challenging for learners with lower self-regulatory abilities; the opposite is also true: those who are able to regulate their own learning do well in these environments (Barnard et al. 2009; Lynch and Dembo 2004). However, it remains unclear why this is the case and what can be done to help struggling learners. This is problematic, since educational research shows that the effectiveness of a learning environment depends on its design (Piccoli et al. 2001), e.g. the nature of the tasks given to learners and the information provided to help them perform the learning activities (Smith and Ragan 1999; Sweller et al. 1998). In order to design blended learning environments that support self-regulation and thus make learning more effective, we first need to determine the attributes of such environments. This paper therefore makes a first attempt to identify and define these attributes in the existing literature.
After providing a brief overview of existing theories of self-regulation, we explain why the model we used as a framework to reflect upon the results of this review was most appropriate. Subsequently, we review the relevant literature, identify the attributes of effective blended learning environments and define them. This definition is particularly challenging, firstly because an inductive or bottom-up approach was used in this systematic literature review (see: Hart 2009; Joy 2007); its aim was to identify attributes rather than to validate them. Secondly, numerous studies have noted (e.g., Petticrew and Roberts 2008) that conceptual transparency is often lacking in intervention studies within the learning and educational sciences. It is likely, then, that while the retrieved studies report on common attributes, they approach them from different perspectives. While this complicates the definition process, such definitions are nonetheless likely to make a key contribution when designing interventions aimed at particular attributes.
1.1 Learner variables influencing self-regulation
In this study, learning is seen as an activity performed by learners for themselves in a proactive manner, rather than as something that happens to them as a result of instruction (Bandura 1989; Benson 2013; Knowles et al. 2014). Learning is therefore seen as a self-regulated process (Zimmerman and Schunk 2001). This perception of the abilities of learners to regulate their learning originates from the social cognitive perspective (Bandura 1977). Over the past three decades, various self-regulated learning theories have been grafted onto this perspective. Five main theories can be identified in the leading reviews written to date (e.g., Baumeister and Heatherton 1996; Boekaerts 1999; Boekaerts et al. 2005; Puustinen and Pulkkinen 2001; Zimmerman and Schunk 2001). These theories describe a cyclic process of self-regulatory phases, often consisting of (a) defining the task, (b) goal-setting and planning, (c) performance and (d) evaluation (e.g. Boekaerts’ Model of Adaptable Learning (1992; 1995; 1996a; 1996b; 1997; Boekaerts et al. 2005) and Pintrich’s General Framework for Self-regulation (Pintrich 2000; Pintrich and De Groot 1990; Schunk et al. 2008)). The five main theories also identify three categories of variables: (1) cognition (e.g. Zimmerman’s cyclical Social Cognitive Model of Self-regulation (Zimmerman 1986, 1990, 1998; Zimmerman 2000; Zimmerman and Pons 1986)), (2) metacognition (e.g. Borkowski’s Process-oriented Model of Metacognition (Borkowski et al. 1990; Pressley et al. 1987)) and (3) motivation (e.g., Butler and Winne 1995; Schraw et al. 2006; Schraw and Moshman 1995; Zimmerman 2000).
Although no theory of self-regulation can be considered superior to any other, the Winne and Hadwin (1998) model was selected to facilitate the search for attributes of blended learning environments that support self-regulation, since it has a number of characteristics that make it very suitable for the purpose of this study. These characteristics are outlined in more detail below. As the name suggests, Winne’s Four-stage Model of Self-regulated Learning (Butler and Winne 1995; Winne 1995, 1996; Winne and Hadwin 1998; Winne and Perry 2000) describes four stages: (1) task definition, during which learners develop perceptions of the task concerned; (2) goal-setting and planning; (3) enacting the tactics and strategies chosen during goal-setting and planning; and (4) metacognitively adapting studying techniques, keeping future needs in mind. Each of these phases consists of five elements: Conditions, Operations, Products, Evaluations and Standards (COPES). The theory emphasizes that learners whose teachers prompt more effective processing in stage one (task definition) and stage two (goal-setting and planning) are more likely to have accurate expectations of the task (Winne and Hadwin 1998). At a second level, Winne and Hadwin (1998) describe the conditions that influence each of these phases. First, they provide information about the task conditions (e.g. time constraints, available resources and social context). Secondly, they outline the cognitive conditions (e.g. interest, goal orientation and task knowledge) that influence how the task will be engaged with (Winne and Hadwin 1998). Cognitive conditions are influenced by epistemological beliefs, prior knowledge (all information stored in long-term memory) and motivation (Winne and Hadwin 1998).
As mentioned above, the Four-stage Model of Self-regulated Learning has four key characteristics that suit the purposes of this study very well. Firstly, the model looks beyond the traditional focus on instructional stimuli and their effect on learning, which assumes that all learners process the stimuli as intended (Winne 1982). The authors instead see learners as active agents (Winne 1982, 1985, 2006) or mediating factors in the instructional process, a perspective on instruction which is largely undocumented and needs consideration (Keller 2010b; Winne 1982). The model gives clear indications about which phases should be targeted, namely task definition followed by goal-setting and planning (Winne and Hadwin 1998). A second consideration is that each phase (one to four) incorporates the COPES elements, which together make up the cognitive system (Greene and Azevedo 2007). This cognitive system explicitly models how work is done in each phase and allows for a more detailed look at how the various aspects of the COPES architecture interact (Greene and Azevedo 2007). Thirdly, with monitoring and control functioning as the key drivers of regulation within each phase, Winne and Hadwin’s model can effectively describe how changes in one phase can lead to changes in other phases over the course of learning (Greene and Azevedo 2007). This allows the model to explicitly detail the recursive nature of self-regulation (Greene and Azevedo 2007). A fourth and final reason for this model’s suitability is that it separates task definition and goal-setting and planning into distinct phases, in contrast to the model of Pintrich (2000) for example; this allows more pertinent questions to be asked about these phases than would otherwise be the case when focusing on instructional interventions (Greene and Azevedo 2007; Winne and Marx 1989).
In this respect the systematic literature review presented here will focus on asking such questions and identifying the attributes of blended learning environments that are deliberately integrated into or added to the environment in order to support self-regulated learning (Zumbrunn et al. 2011).
1.2 Support in blended learning environments
This study focuses exclusively on blended learning environments. In their editorial for the Journal of Educational Media, Whitelock and Jelfs (2003) described three definitions of the concept of blended learning. These definitions were also used as a categorization by Graham (2006) in the handbook of blended learning, and by Ifenthaler (2010) in his book on learning and instruction in the digital age. The first definition (based on Harrison (2003)) views blended learning as the integrated combination of traditional learning with web-based online approaches (Bersin and others 2003; Orey 2002a, b; Singh et al. 2001; Thomson 2002). The second one considers it a combination of media and tools employed in an e-learning environment (Reay 2001; Rooney 2003; Sands 2002; Ward and LaBranche 2003; Young 2001) and the third one treats it as a combination of a number of didactic approaches, irrespective of the learning technology used (Driscoll 2002; House 2002; Rossett 2002). Driscoll (2002, p. 1) concludes that “the point is that blended learning means different things to different people, which illustrates its widely untapped potential”. Oliver and Trigwell (2005) add that the term remains unclear and ill-defined. Taking these observations into account, the definition used in this study is as follows: “Blended learning is learning that happens in an instructional context which is characterized by a deliberate combination of online and classroom-based interventions to instigate and support learning. Learning happening in purely online or purely classroom-based instructional settings is excluded” (Boelens et al. 2015).
A formal definition of learner support in blended learning environments does not yet seem to have been provided in the research literature, although a considerable number of researchers (e.g., Kearsley and Moore 1996; Keegan 1996; Robinson 1995; Tait 2000; Thorpe 2002) have made valuable contributions by defining similar concepts. Learner support in blended learning environments often refers to meeting the needs of all learners: choices at course level, preparatory tests, study skills, access to seminars and tutorials, and so on. These are elements in systems of learner support that many practitioners see as essential for the effective provision of blended learning (Kearsley and Moore 1996; Keegan 1996). Nonetheless, Sewart (1993) notes that a review of key areas of the literature dating back to 1978 does not reveal any comprehensive analysis of learner support services (see also Robinson (1995)). It is therefore particularly challenging to address the issue of learner support in blended learning. Tait (2000) describes the central functions of learner support services in settings that are not strictly face-to-face, arguing that such support should be cognitive, affective and systemic. In this study, ‘support’ refers to all measures taken to instigate and/or facilitate learning.
A final remark should be made regarding the term ‘learning outcome’. This term is often used in the same sense as learning objectives (Melton 1997), but in our opinion this understanding is too narrow and too focused on an increase in performance. In this study, learning outcomes are defined as changes (due to support) in cognitive, metacognitive or motivational abilities, which together constitute a learner’s ability to self-regulate (e.g., Allan 1996; Popham et al. 1969).
1.3 Problem statement
There is a growing realization that the precise design of blended learning environments has different impacts on learning for different types of learners. It has been suggested that blended learning makes high demands on learners’ self-regulatory abilities and is therefore a major challenge for those with lower self-regulatory abilities. The opposite is also true: blended learning environments are well suited to learners who thrive in environments offering, for example, a high degree of learner control. We do not yet know why this is the case or what a solution might be for learners who struggle. In particular, little is known about the attributes of blended learning environments that are essential to support learners and how they should guide course design. Winne and Marx (1989) and Keller (2010a) have called for an approach to course design in blended learning that centres more closely on supporting self-regulation. As a consequence, the research question addressed in this systematic literature review is: “What attributes of blended learning environments support learners’ self-regulation?” In answering this research question, we identify the attributes of blended learning environments that support self-regulation and define them. On the one hand, this facilitates the design of blended learning environments that meet learners’ self-regulatory needs. On the other hand, it contributes to research in the field of ICT and education by shifting the focus towards learners’ self-regulation in technology-mediated environments.
2 Methodology
The methodological approach used to answer the research question was based both on research literature on systematic literature reviews (e.g., Hart 2009; Joy 2007) and on the methodologies used in highly valued educational reviews with similar methodological aims (e.g., Bernard et al. 2004; Blok et al. 2002; Butler and Winne 1995; De Jong and Van Joolingen 1998; Greene and Azevedo 2007; Tallent-Runnels et al. 2006; Tinto 1975). The systematic literature review methodology is particularly suited to the aim of this study, because it focuses on the identification, critical evaluation and integration of findings from a considerable number of relevant resources (Baumeister and Leary 1997). Using this methodology allows us to formulate general statements and overarching conceptualizations (Sternberg 1991). Although this methodology is most appropriate for the aim of this study, it also has its limitations. Higgins and Green (2008) described the main issues as follows: they argue, firstly, that because such a methodology allows us to target broader research questions, it inevitably restricts the depth of analysis; and secondly, that categorizing findings across the retrieved articles puts pressure on the replicability and transparency of the methodology. As elaborated on below, we propose a peer-reviewed and double-checked bibliographical approach in order to ensure transparency and replicability. As the focus of this study is to identify and define attributes, rather than exploring each attribute in detail, the depth issue is less of a threat. Nonetheless, we propose further research avenues for elaborating on each of the attributes.
A comparison of studies on the systematic literature review methodology shows that most reviews suggest a design similar to that presented by Hart (2009). His methodological outline and suggestions were therefore used to perform this systematic literature review. First, general searches for background information on the study’s main concepts were performed. This resulted in an initial map of related topics, a vocabulary of concepts and a provisional list of key authors. The findings of this phase are reported in the introduction and function as a theoretical basis for reflecting on the results of this study. Next, the focus of the analysis and the information needs regarding the topic were established, resulting in a clear research question, which is reported in the problem statement. To answer this research question, relevant data were collected and analysed. These procedures are described below.
2.1 Data collection
To establish a collection of publications to be analysed and synthesized, relevant databases for retrieving publications on instruction and information (and communication) technology were identified (n = 5): Web of Science, ProQuest, EBSCOhost, Science Direct and OvidSP. The search terms used to perform the searches derived from a deductive process based on the key concepts of this study as presented in the introduction. The following search string was used: (“blended learning” OR “online learning” OR “hybrid learning” OR “web based learning” OR “distance learning” OR “virtual learning”) AND design AND (low OR poor OR inadequate OR negative) AND self-regulat* AND (“prior knowledge” OR “cognitive strategies” OR “learning strategies” OR “motivation”) AND (problem* OR solution* OR effects OR issues OR explain*) AND (“adult learner” OR “adult learning” OR postgraduate OR post-graduate OR postsecondary OR post-secondary) NOT (kindergarten OR “primary education” OR “secondary education” OR under-graduate OR undergraduate OR “K-12” OR elementary). A number of additional inclusion and exclusion criteria were specified to select appropriate publications for inclusion in the systematic literature review. To be included in the review, publications had to (a) have been published between January 1985 and February 2015, (b) not be duplicates, (c) include full text, and (d) include empirical evidence (research based on, concerned with, or verifiable by observation or experience rather than theory or pure logic; see Barratt (1971) and Mouly (1978)) relating to the impacts and outcomes of blended learning environments; this was to address the perceived lack of empirical evidence concerning blended learning. Finally, publications had to (e) include performance measures that reflected individual courses (micro level) or learning tasks, rather than entire programmes.
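The inclusion criteria above are procedural, so they can be made concrete with a small sketch. The following Python fragment is purely illustrative (the review itself used Endnote and NVivo rather than custom code, and the record fields `year`, `title` and `full_text` are hypothetical names); it applies criteria (a) to (c) to a list of bibliographic records:

```python
# Illustrative sketch: applying inclusion criteria (a)-(c) from the review
# to hypothetical bibliographic records. Not the authors' actual tooling.

def passes_inclusion_criteria(record, seen_titles):
    """Return True if a record meets criteria (a)-(c):
    published 1985-2015, not a duplicate, full text available."""
    if not (1985 <= record["year"] <= 2015):      # criterion (a): date range
        return False
    title = record["title"].strip().lower()
    if title in seen_titles:                      # criterion (b): duplicate
        return False
    seen_titles.add(title)
    return record.get("full_text", False)         # criterion (c): full text

records = [
    {"title": "Study A", "year": 2010, "full_text": True},
    {"title": "Study A", "year": 2010, "full_text": True},   # duplicate
    {"title": "Study B", "year": 1980, "full_text": True},   # too early
    {"title": "Study C", "year": 2014, "full_text": False},  # no full text
    {"title": "Study D", "year": 2003, "full_text": True},
]

seen = set()
included = [r for r in records if passes_inclusion_criteria(r, seen)]
print([r["title"] for r in included])  # → ['Study A', 'Study D']
```

Criteria (d) and (e), empirical evidence and course-level focus, require human judgement and are therefore not modelled in this sketch.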
2.2 Data analysis
Following the suggestion of Hart (2009), the publications were first skimmed for structure, overall topic, style, general reasoning, data and bibliographical references. A second, more detailed survey followed of the sections of each publication (introduction, theoretical foundations, methodology, etc.). The third step was the creation of a summary of each publication retrieved, to preserve the rich data and context of each publication. A minimally condensed version of these summaries can be found in Appendix 1. Each summary includes: (a) the aim of the publication, (b) the dependent and independent variables, (c) the sample (including the characteristics of the participants), (d) the procedure or method used, (e) the measurement instrument(s) used and (f) the results and conclusions. This analysis was performed and managed in QSR NVivo 10 and summarized in MS Word and Excel documents. Based on this third step, the analysis for common attributes was performed by comparing the different variables, results and conclusions with one another. Once the attributes were identified, a twofold double check was performed: the analysis was peer-reviewed by the other author, and a manual check was compared with a bibliometric one (Cheng et al. 2014) to ensure inter-coder reliability. This ensured that the attributes identified when synthesizing the summaries were found by both researchers individually and were explicitly retrieved in the consulted publications. Thus, both researchers synthesized a sample of the summaries and compared their findings. A text search query was also used to check whether the attributes identified by analysing the summaries were also found explicitly in the retrieved publications (for detailed methodology see: Cheng et al. (2014); Graddol et al. (1994); Popping (2000); Romero and Ventura (2007); Wegerif and Mercer (1997)).
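The text search query used for the bibliometric check can be thought of as a keyword count across the retrieved full texts. The sketch below is an assumption about the general form of such a query, not the authors' actual NVivo procedure; the attribute keywords and documents are hypothetical:

```python
# Illustrative sketch of a bibliometric text search query: counting how many
# documents explicitly mention each identified attribute. Keywords and
# documents are hypothetical examples, not the review's actual coding scheme.

ATTRIBUTES = {
    "authenticity": ["authentic", "real-world relevance"],
    "personalization": ["personalization", "personalised", "individualization"],
    "learner control": ["learner control", "learner-control"],
}

def attribute_counts(documents):
    """Count, per attribute, the number of documents containing at
    least one of its keyword variants (case-insensitive)."""
    counts = {name: 0 for name in ATTRIBUTES}
    for text in documents:
        lowered = text.lower()
        for name, keywords in ATTRIBUTES.items():
            if any(kw in lowered for kw in keywords):
                counts[name] += 1
    return counts

docs = [
    "Authentic tasks increased motivation in the blended course.",
    "We varied the degree of learner control and personalization.",
    "A survey on satisfaction with the learning management system.",
]
print(attribute_counts(docs))
# → {'authenticity': 1, 'personalization': 1, 'learner control': 1}
```

A document-level count such as this can then be compared against the attributes each researcher identified manually when synthesizing the summaries.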
Finally, based on the identification of the common attributes and the publications that refer explicitly to these attributes, a detailed analysis of the publications involved was done to determine what decisions and conclusions could be drawn from these publications. The results of this analysis can be found in the results section.
3 Results
Using the search string mentioned above, an initial search was performed per database, on title and abstract. In total, 247 publications were retained and imported into Endnote X7. A search for overlap or duplicates was then performed: the publication retrieved first was retained and its duplicates removed from the database. A total of seventeen publications were deleted and 230 publications retained. The last step was the automatic search, performed in Endnote X7, for the full text of each publication. A total of 88 publications were removed from the database due to a lack of full text. The remaining 142 publications were imported into QSR NVivo 10 for further analysis. All 142 publications were scanned for general relevance and empirical evidence. Reviews (n = 30) and irrelevant publications (n = 17) (see for example: “Community based forest enterprises in Britain: Two organizing typologies” by Ambrose-Oji et al. (2014)) were excluded. This brought the number of publications included to 95. No publications were excluded based on criterion (e), the level of focus (course or curriculum): all the publications retrieved reported on course level.
3.1 Descriptive statistics of the publications included
General descriptive statistics provide insight into the field of blended learning and the inclusion of self-regulation in its discourse. The search included all publications from between January 1985 and February 2015. It is noteworthy that no publications were retrieved from the period 1985 to 2001. Between 2002 and 2009 an annual average of four publications relating to the search results of this systematic literature review were published. Between 2010 and February 2015, an average of eleven publications were published per year. The descriptive results also show which journals the majority of retrieved publications originated from. The largest proportion of publications were retrieved from Computers & Education (n = 19); Computers in Human Behaviour produced thirteen publications, followed by The Internet & Higher Education (n = 10), the International Journal of Human-Computer Studies (n = 4), Nurse Education Today (n = 3), Learning & Instruction (n = 3), Higher Education (n = 2), Journal of Computing in Higher Education (n = 2) and the International Journal of Educational Research (n = 2). These journals accounted for 61 % of all the retrieved publications. In total, 61 of the retrieved publications were quantitative: 33 included experimental interventions with pre- and post-tests in controlled conditions; 23 retrieved information using surveys; and 5 reported on quasi-experiments (i.e. without pre- or post-tests). A further 13 publications were qualitative in nature and used case studies (n = 5), observations (n = 1), document analysis (n = 2) or interviews (n = 5) as their method. Finally, 13 publications used mixed methods, combining quasi-experiments with interviews, observations and document analysis. Table 1 shows the number of publications retrieved by type of research and methodology used. The publications retrieved were also analysed by the learning variables taken into account.
The majority of the publications (n = 57) reported on a mix of learning variables (cognition, metacognition and motivation); 30 publications reported on individual variables. Table 2 shows the number of publications retrieved by learner variable. Both the methodological data and the variables used can be found in the individual summaries presented in Appendix 1.
3.2 Attributes of blended learning for self-regulation
As mentioned above, after analysing the publications’ descriptive features and learner variables (cognitive, metacognitive and motivational), a search was performed to identify common attributes of interest in the retrieved publications. Once the attributes were identified, a twofold double check (peer review combined with a manual versus bibliometric comparison) was performed to ensure that the attributes identified when synthesizing the summaries were found by both researchers individually and explicitly retrieved in the consulted publications.
The systematic literature review presented here suggests that blended learning environments that foster cognition, metacognition and motivation and thus support self-regulation have seven main attributes. These attributes are (1) authenticity, (2) personalization, (3) learner control, (4) scaffolding, (5) interaction, (6) reflection cues and finally (7) calibration cues. Table 3 shows the number of publications retrieved per attribute: 87 reported on at least one attribute (eight were excluded due to a lack of explicit reference to at least one attribute). It is important to note that 59 articles reported on at least two attributes, with a maximum of six attributes per publication. This illustrates the interrelatedness of each attribute with the others. The summaries in Appendix 1 report on the attributes identified in each of the publications. Based on these findings the relevant publications were synthesized in more depth. Each attribute is elaborated on in more detail below.
3.2.1 Authenticity
In total, 29 publications appear to centre around authenticity (e.g., Ai-Lim Lee et al. 2010; Artino 2009b; Chen 2014; Corbalan et al. 2008; Demetriadis et al. 2008; Donnelly 2010; Gulikers et al. 2005; Smith et al. 2008; Ting 2013) and report its influence on cognitive (e.g., Corbalan et al. 2008; Gulikers et al. 2005), metacognitive (e.g., Chen 2014; Kuo et al. 2012) and motivational (e.g., Kovačević et al. 2013; Sansone et al. 2011; Siampou et al. 2014) variables that influence the self-regulatory abilities of learners. The retrieved publications contained several definitions of authenticity, ranging from ‘real-world relevance’ and ‘needed in real-life situations’ to ‘of important interest to the learner for later professional life’. In sum, authenticity was treated as the real-world relevance of the learning experience to the learners’ professional and personal lives. It was described as being manifested in both the learning environment and the task at hand.
The majority of publications retrieved referred to the motivational value of authentic learning tasks. In this respect Ai-Lim Lee et al. (2010) used a survey study and Kovačević et al. (2013) an experimental design to conclude that authentic tasks in an educational context are associated with finding meaning and relevance, and therefore with higher motivation. In their survey study, Sansone et al. (2011) add that when learners have little pre-existing interest or motivation, tasks that practise skills needed in real-life situations are more motivating. An example is provided in the interview study of Smith et al. (2008), who report that learners wanted to be involved in education as long as it proved to have a practical application and relevance to their professional background.
On the metacognitive side, survey studies embedded in the experimental studies of Chen (2014) and Kuo et al. (2012) found that authentic digital learning materials significantly influenced learners’ perceptions of learning outcome expectations, learning gratification and learning climate in web-based learning environments. Wesiak et al. (2014) conducted an experiment and analysed learners’ log files, adding to the previous findings that real-world relevance in an online medical simulation improved metacognitive skills. Taken together, these findings suggest that authentic tasks influence cognitive (e.g. prior knowledge and performance), metacognitive (e.g. learning outcome expectations) and motivational (e.g. enjoyment, intrinsic motivation) learner variables, which in turn influence the self-regulatory abilities of learners. However, Gulikers et al. (2005) conducted an experiment and emphasized that authentic tasks and authentic contexts are two different things with different impacts on learning (no evidence was found for the superiority of authentic environments). Corbalan et al. (2008) analysed log files during an experiment and added that, for novice learners, the acquisition of complex skills by performing authentic tasks is heavily constrained by the limited processing capacity of their working memory; such tasks can cause cognitive overload and should therefore be adapted to the individual needs of learners.
3.2.2 Personalization
We identified 24 publications which address personalization (e.g., Hung and Hyun 2010; Law and Sun 2012; Leen and Lang 2013; Liaw et al. 2010; Ma 2012; Reichelt et al. 2014; Yu et al. 2007). In these publications, personalization is defined as non-homogeneous experiences related directly to the tailoring of the learning environment (both its characteristics and its objects) to the inherent needs of each individual learner (topics of high interest value). Examples include name recognition (the integration of name-specific references to the learner), self-description (tailoring the environment to the learner’s individual preferences regarding content, subject, etc.) and cognitive situationing (adapting the environment to the learner’s performance level).
Some of the retrieved publications report on interventions carried out to identify the effect of personalization on a mix of learner variables. Reichelt et al. (2014), using a quasi-experimental set-up including document analysis, and Leen and Lang (2013), using a survey study, found that personalized learning materials, learning contexts that fit learners’ personal preferences and communicative features expressed in a personalized style enhance motivation, engage learners in the learning process and contribute to learning success. Similarly, Ai-Lim Lee et al. (2010) investigated the influence of a desktop virtual reality application’s constructivist learning characteristics on learning outcomes and found that options regarding individual preferences relate positively to learning effectiveness and satisfaction.
Other publications reported more generally on the nature of blended learning environments and their suitability with regard to a range of learner variables. Liaw et al. (2010), Ma (2012), Mohammadi (2015) and Yu et al. (2007) used survey studies and interviews to evaluate the feasibility of e-learning for continuing education and concluded that diversity, flexibility, adaptability and individualization are catalysts for increasing motivation, user satisfaction, intention to use e-learning and regulating abilities. Law and Sun (2012) did the same with regard to a digital educational game; here, too, adaptability (to personal preferences) was seen as an influencing factor for the user experience. Although the literature retrieved seems to find a positive influence of personalization on metacognitive and motivational learner variables (e.g., Liaw et al. 2010; Mohammadi 2015; Yu et al. 2007), personalization itself had no straightforward effect on learning performance (Ai-Lim Lee et al. 2010; Reichelt et al. 2014).
3.2.3 Learner control
In total, 18 publications refer to the amount of control learners have in blended learning environments (e.g., Artino 2009a, 2009b; Corbalan et al. 2008; Hughes et al. 2013; Hung et al. 2011; Leen and Lang 2013; Lin et al. 2012; Mohammadi 2015; Reychav and Wu 2015; Roca et al. 2006; Ting 2013; Yu et al. 2007). These publications consider learner control to be an inclusive concept describing the degree of control that learners have over the content and activities within the learning environment. Examples include control over the pace of the course, the content used, the learning activities in which the content is presented and content sequencing, which allows the learner to determine the order in which the content is provided.
Corbalan et al. (2008) and Hughes et al. (2013) found in their experimental studies, which included log-file analysis, that shared (learner and instructor) control has positive effects on learner motivation, and that the choice provided positively influenced the amount of effort invested in learning, combined with higher learning outcomes. In his survey study, Artino (2009b) provided evidence for the positive predictive ability of the type of task learners choose (rehearsal vs in-depth) on elaboration, metacognition, satisfaction and continuing motivation. In their survey study, Lin et al. (2012) found that the higher the level of control and learning afforded by a virtual-reality-based learning environment, the better the learning outcomes as measured by performance achievement, perceived learning effectiveness and satisfaction. While learner control seems to influence cognition (Ai-Lim Lee et al. 2010), metacognition (Artino 2009b) and motivation (Lin et al. 2012), this influence is not unfailingly positive, and the publications retrieved add several caveats. Corbalan et al. (2008) found that learners with lower levels of competence in a domain lack the ability to make productive use of learner control; Artino (2009a) observed, in his survey study on how feelings and actions are associated with the nature of an online course, that a lack of control on the part of the learner results in boredom and frustration. Leen and Lang (2013) found that older adults had a strong need for a sense of belonging and personal growth, and thus a heightened interest in learner control, whereas younger adults’ motives for learning were more competition-related. Learners with a high need for control might tend to adopt e-learning quickly, whereas learners with low self-control abilities tend to reject e-learning (Yu et al. 2007).
For individuals with lower self-control abilities, it seems essential to establish user-friendly learning environments in the early stages of development (Yu et al. 2007). Hung and Hyun (2010) conclude from their interview study that learners with low prior knowledge require a learning context provided by the instructors to sustain the learning experience.
3.2.4 Scaffolding
The search produced 24 publications related to scaffolding in blended learning environments (e.g., Aleven and Koedinger 2002; Artino and Jones 2012; Artino and Stephens 2009; Chia-Wen et al. 2011; Davis and Yi 2012; Demetriadis et al. 2008; Govaere et al. 2012; Kim and Ryu 2013; Koh and Chai 2014; Kuo et al. 2012; Niemi et al. 2003; Wesiak et al. 2014). These publications define scaffolding as changes in the task or learning environment that assist learners in accomplishing tasks that would otherwise have been beyond their reach. This could involve ongoing diagnosis of the amount of support learners need and the provision of tailored support based on the results of this ongoing diagnosis, both of which result in a decrease in support over time.
Some of the retrieved publications report on interventions carried out to identify the effect of scaffolding on cognition, metacognition and motivation. Wesiak et al. (2014), for example, found clear indications that the addition of thinking prompts provided by scaffolding services is beneficial to learners, who reported increasing effort in terms of time spent. These findings imply a positive effect of refining thinking prompts and/or adding affective elements, and support the assumption that scaffolding fosters metacognition and reflection. Aleven and Koedinger (2002) conducted an experiment and concluded that scaffolding problem-solving practice through self-explanation, using a computer-based cognitive tutor, is an effective way to support the acquisition of metacognitive problem-solving strategies, and that guided self-explanation adds value to guided problem-solving practice without self-explanation. Demetriadis et al. (2008) and Govaere et al. (2012) found, using experimental set-ups, that learners in a scaffolded group achieved significantly higher scores, which indicates that explicitly asking scaffolding questions to activate learners has positive effects. Accordingly, Kim and Ryu (2013) showed, in their assessment of a web-based formative peer assessment system, that learners using such a system achieved significantly higher scores for metacognitive awareness. Devised questions, prompts and peer interaction as scaffolding strategies are thus shown to facilitate metacognitive skills.
Artino and Stephens (2009), on the other hand, used a survey to investigate potential developmental differences in self-regulated learning and to propose instructional guidelines for overcoming these differences. They suggest that scaffolding for the support of self-regulated learning in online learning environments should ideally be achieved by explicitly providing instructional support, structure and scaffolds for social interaction. Artino and Jones (2012) articulated the benefits of attending to learners’ achievement emotions when structuring online learning environments: in this way, learning and performance are improved by facilitating learners’ use of adaptive self-regulatory learning strategies. Yu et al. (2007) emphasized, in their investigation of the feasibility of adopting e-learning for continuing education, that for learners with lower self-regulatory abilities it is essential to scaffold support around strategies of behaviour modification, in order to increase learners’ confidence and self-regulatory abilities while maintaining their participation and improving the learning effect.
3.2.5 Interaction
We retained 70 publications that appear to centre around interaction (e.g., Alant and Dada 2005; Chen 2014; Clark et al. 2015; DuBois et al. 2008; Gomez et al. 2010; Ho and Dzeng 2010; Liaw et al. 2010; Lin et al. 2012; Ma 2012; Siampou et al. 2014; Ting 2013; Xie et al. 2013). These publications describe interaction as the involvement of learners with elements in the learning environment, including content (learning materials, object, etc.), the instructor (teacher, coach, trainer, etc.), other learners (peers, colleagues, etc.) and the interface (objects in the online or offline learning environment).
Some of the publications retrieved report on the positive influence of social interaction on self-regulation. Ting (2013) and Reichelt et al. (2014) found in their experiments that communicative features, peer interaction and back-feedback gave learners more control over their learning. Kuo et al. (2012) emphasized in this respect that integrating collaborative learning mechanisms within an online inquiry-based learning environment has great potential to promote middle- and low-achieving learners’ problem-solving ability and learning attitudes. Michinov and Michinov (2007) add that paying closer attention to social interaction is particularly useful during transition periods at the midpoint of an online collaborative activity. Liaw et al. (2010) found in a survey study that enriching interaction and communication activities has a significant positive influence on the acceptance of mobile-learning systems. Siampou et al. (2014) investigated whether the type of interaction influences learners’ modelling processes; their results suggest that online dyads focused extensively on analysis and synthesis actions and that their learning gains were higher than those of their offline counterparts. Lin et al. (2012) identified in a correlation study that establishing social interaction to promote intrinsic motivation increased positive affect and fulfilment in web-based environments. Ai-Lim Lee et al. (2010) found that interaction with the desktop virtual reality application impacted only learning effectiveness (positively). Gomez et al. (2010) emphasize the interplay between motivation, social interaction and perceived learning, concluding that when learners value these social interactions, they will enjoy learning more.
Other publications report on the negative influence of a lack of social interaction on a mix of learner variables. Artino (2009a) and DuBois et al. (2008) observed in their experiments that a lack of interaction results in a decrease in engagement and satisfaction and an increase in drop-out risk. In summary, the publications retrieved report positively on the influence of social interaction in increasing cognitive (e.g., Siampou et al. 2014), metacognitive (e.g., Kuo et al. 2012) and motivational (e.g., Lin et al. 2012) learner variables. A negative influence on motivation is seen when social interaction is lacking.
3.2.6 Reflection
In total, 14 publications appear to focus on cues that increase the reflective practice of learners in blended learning environments (e.g., Aleven and Koedinger 2002; Anseel et al. 2009; Ibabe and Jauregizar 2010; Kim and Ryu 2013; Martens et al. 2010; Mauroux et al. 2014). Reflection cues are defined in these publications as prompts that aim to activate learners’ purposeful critical analysis of knowledge and experience, in order to achieve deeper meaning and understanding. The publications describe three main types: first, reflection during action, which takes place while learners are performing a task; second, reflection about action, which is systematic and deliberate consideration of a task that has already been completed; and third, reflection before action, which involves proactive thinking about a task which will soon be performed.
There is some evidence that reflection can be used to increase learner motivation, especially when learners are in a state of low motivation to learn (Ibabe and Jauregizar 2010). The majority of evidence supporting the influence of reflection on self-regulation-influencing variables relates to cognitive learner variables. Anseel et al. (2009) concluded, in their investigation of reflection as a strategy for enhanced task performance, that reflection combined with feedback has a more positive impact on task performance than feedback alone. Ai-Lim Lee et al. (2010) and Aleven and Koedinger (2002), who used experiments, added that engaging learners in reflective thinking is a significant antecedent of learning outcomes and that engaging them in explanation helps learners acquire better-integrated knowledge.
In addition, a substantial number of publications were found that focus on metacognitive variables. Kim and Ryu (2013), for example, found that peer interaction and back-feedback gave learners more control over their learning; these learners scored significantly higher for metacognitive awareness and performance than the traditional peer assessment group, who in turn achieved higher scores for metacognitive awareness than a self-assessment group who received no peer interaction or back-feedback. Based on a survey study, Niemi et al. (2003) suggested that young learners gain new information about their learning strategies and skills through negotiation with peers and that this negotiation also helps more experienced learners strengthen their learning.
In summary, the publications retrieved report positively on the influence of reflection on cognitive (e.g., Anseel et al. 2009), metacognitive (e.g., Kim and Ryu 2013) and motivational (e.g., Ibabe and Jauregizar 2010) learner variables. Anseel et al. (2009) emphasize that learners’ levels of learning goal orientation, need for cognition and personal importance positively affect the extent to which individuals engage in reflection. Ibabe and Jauregizar (2010) and Mauroux et al. (2014) supplement this claim with the finding that when learners have low levels of motivation and acceptance of reflection, the only type of reflection tool they will use is self-assessment tools.
3.2.7 Calibration
The search identified 15 publications which appear to centre around cues for calibration in blended learning environments (e.g., Anseel et al. 2009; Artino 2009a; Artino and Stephens 2009; Brusso and Orvis 2013). These publications describe calibration cues as triggers for learners to test their perceptions of achievement against their actual achievement. They are used both to overcome deviations in learners’ judgements from the facts by introducing notions of bias, and to address metric issues regarding the validity of cues’ contributions to judgements. Two main types of calibration cues were identified in the publications retrieved: first, prompts that aim to trigger metacognitive monitoring, such as reviewing content; and second, checklists and timed alerts to summarize content and practice tests that help learners compare their own perceptions with the facts.
Using an experimental design, Vighnarajah et al. (2009) found that learners reported practising different self-regulated learning strategies (intrinsic and extrinsic goal orientation, control of learning beliefs, rehearsal, elaboration, critical thinking, peer learning and help seeking); the strategies that interested learners the least were task value, effort regulation and metacognitive self-regulation. Artino (2009a) illustrated the importance of learner goal-setting by showing that learners with career aspirations directly related to the course content were more likely to report adaptive motivation and academic success than their peers. Using a survey study, Brusso and Orvis (2013) found that learners who experienced a larger goal-performance discrepancy at the beginning of a course performed worse in subsequent sessions than those whose performance more closely mirrored their goals. The survey studies conducted by Brusso and Orvis (2013) and Anseel et al. (2009) suggest that a combination of reflection interventions and goal-setting instructions (looking back on past behaviour by means of coached reflection and managing future behaviour by setting goals) is a particularly strong intervention. Artino and Stephens (2009) illustrate this by presenting two instructional strategies: helping learners identify and set challenging, proximal goals, and providing them with timely, honest, explicit performance feedback.
Despite the moderate number of publications retrieved, the evidence indicates the importance of helping learners make a reasonable estimation of the instructor’s expectations and their own capabilities. The studies call for appropriate cues for task definition, goal-setting and planning in order to influence the cognitive (e.g., Brusso and Orvis 2013), metacognitive (e.g., Artino and Stephens 2009) and motivational (e.g., Artino 2009a) learner variables that in turn influence self-regulation.
4 Conclusions and discussion
The aim of this systematic literature review was to identify attributes of blended learning environments that support self-regulation. An inductive or bottom-up approach was used. Following the initial literature analysis, seven attributes were identified and defined. First, authenticity was defined as the real-world relevance of the learning experience (both task and learning environment) to learners’ professional and personal lives. Second, personalization was defined as non-homogeneous experiences related directly to the tailoring of the learning environment (name recognition, self-description and cognitive situationing) to the inherent needs of each individual learner. Third, learner control was defined as an inclusive concept which describes the degree to which learners have control over the content and activities (pace, content, learning activities and sequencing) within the learning environment. Fourth, scaffolding was defined as changes in the task or learning environment (support which diminishes over time) which assist learners in accomplishing tasks that would otherwise be beyond their reach. Fifth, interaction was described as learners’ involvement with elements in the learning environment (content, instructor, other learners and interface). Sixth, reflection cues were defined as prompts that aim to activate learners’ purposeful critical analysis of knowledge and experience (before, during and after), in order to achieve deeper meaning and understanding. Finally, calibration cues were described as triggers for learners (forms, timed alerts and practice tests) to test their perceptions of achievement against their actual achievement and their perceived use of study tactics against their actual use of study tactics.
While this systematic literature review has attempted to identify and define the seven attributes as clearly as possible, it remains unclear what the exact relationship is between each attribute and the self-regulatory behaviour exhibited by learners. It is beyond the scope of this review to address this problem directly. In what follows, however, we make a first attempt to explain the relevance of each attribute using the Four-stage Model of Self-regulated Learning developed by Winne and Hadwin (1998). As mentioned earlier, it is the first two phases of this model—task definition and goal-setting and planning—that are most susceptible to instruction, so the main focus will lie on these two phases (Butler and Winne 1995; Winne and Hadwin 1998; Zimmerman 2000).
4.1 Attributes and their relation to the four-stage model of self-regulated learning
In promoting self-regulation, both constructivist and sociocultural theories stress the importance of building on learners’ existing knowledge and skills (Harris and Pressley 1991; Vygotsky 1978). It has been argued that, rather than providing direct instruction about predefined strategies, teachers should provide support that assists learners to self-regulate their own learning effectively (Butler 1998; Palincsar and Brown 1988). Based on this premise, a search for attributes that support self-regulation in blended learning environments was performed. Authenticity and personalization in the environment seem to contextualize and individualize the conditions and standards needed to make appropriate judgements about the task at hand and thus direct goal-setting and planning. Both authenticity and personalization support learners in situating the task in a realistic, familiar context and in tailoring it to the general preferences of the learner. In doing so, the environment takes into account the cognition, metacognition and motivation of the learners and supports the identification of conditions (how the task at hand will be approached) and standards (criteria against which products will be evaluated) (Butler 2002; Reeve and Brown 1985). It is worth bearing in mind, however, that when learners have had negative prior experiences, they will judge the conditions and standards less accurately (Lodewyk et al. 2009). Similarly, learner control and scaffolding seem to help learners maximize their degree of control over their own learning and evaluate their learning (comparing standards) more accurately (Perry 1998; Perry et al. 2004), and thus set more appropriate goals and plan further actions. As learners are allowed to choose more freely how to learn, and as the support provided is tailored and reduced over time, learners experience how products should be evaluated according to the standards they set themselves and thus how to maximize self-regulation.
The relation between learner control and scaffolding is worth mentioning, because when learners have low self-regulatory skills, for example, a high degree of learner control in the environment will leave them wandering aimlessly unless they are supported by scaffolds that gradually disappear over time (Lynch and Dembo 2004). Interaction and cues for reflection expose learners to the various procedures available (e.g. through social interaction, reflection questions, etc.), providing them with self-initiated feedback about their own performance and helping them to select appropriate procedures for tackling the task at hand (Kumar et al. 2010). This supports learners in identifying the procedures needed to define and execute the task, which influences their planning of the actual performance. While reflection and interaction support practice retrospectively, they do not have an impact on faulty calibration mechanisms. Cues for calibration therefore need to be put in place to make learners with low self-regulatory abilities aware of such problems. Cues for calibration help learners assess their performance correctly and compare it to the standards they initially set and act upon any perceived deficit (Hadwin and Winne 2001). Involving learners in processes of external feedback (e.g. by taking tests) will provide them with a realistic framework against which to compare themselves (Winne and Jamieson-Noel 2002).
4.2 The attributes and their relation to current learning theories
To consolidate the relevance of the attributes identified for the design of blended learning environments, they were also tested against other well-established learning theories and instructional design models, with positive results. While conceptual transparency is sometimes lacking within and between these models, our results bear striking similarities to the Four Component Instructional Design model of van Merriënboer (1997), which focuses on task execution support. Van Merriënboer’s model states that learners will be able to complete a task when there is a degree of (1) authenticity (van Merriënboer 1997); (2) personalized task selection (Salden et al. 2006); (3) learner control in selecting their own learning tasks (Corbalan et al. 2009); (4) support for calibrating learners’ goal directedness (van Merriënboer 1997); (5) scaffolding for complex tasks to prevent cognitive overload (van Merriënboer et al. 2002); (6) reflection triggered by cues integrated with feedback (van den Boom et al. 2007; Wouters et al. 2009); and (7) interaction with peers (van Zundert et al. 2010). It can also be observed that the attributes identified by the review presented here are among the basic components of any powerful learning environment (De Corte et al. 1996; De Corte et al. 2003) as well as of a typical constructivist learning environment (Jonassen 1999; Wilson 1996). These conclusions support the view that the attributes of blended learning environments identified as supporting self-regulation can in fact be seen as basic attributes of any effective learning environment; they can therefore be found in learning theories and instructional design models that are not specifically related to blended learning. This finding contributes to the question raised by certain researchers of whether the concept of blended learning should be reconsidered (Oliver and Trigwell 2005).
Our findings do indeed suggest that the concept of blended learning could be simplified both theoretically and conceptually. The principal value of this review, however, lies in its identification of design features that foster learners’ self-regulation. To the best of our knowledge, this is the first study of self-regulation to present such a framework of design attributes.
4.3 Limitations of the study
A number of limitations, both of the publications described and of the systematic literature review itself, should be acknowledged. The publications retrieved for this contribution demonstrate both theoretical and methodological limitations and inconsistencies. With regard to methodology, we often see a lack of awareness about the studies’ reliability issues. In many cases, only the group receiving treatment is described; pre- and post-tests are only administered to the experimental group; and/or no control group is included. Such methodological flaws make it difficult to ascertain the exact design of a study and gain insight into its validity. It also remains unclear in some cases which variables are targeted by the study design. A well-thought-out model of variables and their interactions and mediations would be beneficial for reviewing the literature and reflecting upon interactions and common characteristics in the wide-ranging field that is instruction and support in blended learning environments. Furthermore, the literature often reports on multiple related concepts at the same time (e.g. proactive stickiness, learning gratifications, computer self-efficacy, learning outcome expectations, social environment, interaction, learning climate, system characteristics and digital material features). This makes it difficult to ascribe certain effects to specific interventions or variables.
A number of theoretical limitations were also evident in the publications retrieved. First, conceptual transparency, including situating the concepts within a broader theoretical framework or instructional theory, is problematic. Due to a lack of clarity about other potentially influencing variables in the model used, or the learning environment in which the study was conducted, it is sometimes difficult to determine which variable is responsible for which outcome. Secondly, the studies appear to make minimal use of instructional design approaches. Using such systematic approaches would help give more insight into the interventions and their conditions. Without a detailed description and specific design, however, study replication is impossible. The third and final remark is that the existing literature is often descriptive rather than theoretical or explanatory. Studies frequently reported on observations using surveys, for example, instead of researching the reasons behind these observations by conducting interventions and experiments. This point also influences the nature of the systematic literature review presented in this study. Specifically, the review is unable to describe in great depth which interventions are successful for which variables. In addition, it also describes the attributes that affect cognitive, metacognitive and motivational variables rather than explaining, for example, the precise degree of learner control needed to evoke a change in motivation for learners with low self-regulatory abilities.
As stated above, the systematic review methodology also has its limitations. One limitation is the scope and level of detail provided about each of the attributes identified, which can be seen as a constraint for immediate application in practice (e.g. design of learning environments). The main focus of this review was to identify attributes rather than focus immediately on application; the output therefore remains descriptive. Accordingly, our first suggestion for future research is to undertake a deeper analysis of each of the attributes presented by performing an additional, extended literature review per attribute in order to gain a more profound understanding of the current situation. The second limitation concerns the development of the search string and the validity of the attribute categorization. The approach combined a theory-driven search string with inclusion and exclusion criteria; a twofold (peer-reviewed), double (manual versus bibliometric) check was also performed, resulting in a robust selection of publications. This contributes to the replicability and validity of the study and to the detailed demarcation of attributes. On the other hand, however, a reasonable number of potentially relevant publications (e.g. reviews of different support types, learner variables or attributes) were excluded. Thirdly, while considerable effort was made to interpret the publications correctly and as intended by their authors, other potentially relevant findings may have been overlooked due to the explicit search for concepts relating to self-regulation in blended learning environments.
Despite the limitations mentioned above, this systematic literature review makes a number of useful contributions. It provides a clear overview of the existing literature by identifying and defining seven attributes that appear to be worth taking into account when designing blended learning environments that support self-regulation, namely authenticity, personalization, learner-control, scaffolding, interaction and cues for reflection and calibration. In addition, one key finding will help further the debate on the relevance of models for designing blended learning environments: attributes of blended learning environments that support self-regulation appear to tie in closely with the attributes of any effective learning environment. Finally, this study has the potential to function as a basis for further research on the attributes of blended learning and technology-mediated environments that support self-regulation. It would be useful not only to review existing research further on self-regulation per attribute (as suggested above), but also to obtain more experimental evidence for each attribute. Such studies might involve the following steps: firstly, create a sound basis for comparison using a well-established instructional design model (e.g., Merrill 2002; van Merriënboer 1997) for the experimental and control conditions. Secondly, after administering a pre-test for one of the self-regulatory variables, a treatment can be implemented among an experimental group focusing on the attributes of self-regulation; this will help clarify how certain attributes relate to the variable being investigated. A third and final step would be to compare the post-tests of the experimental and control groups and describe any differences found. Using such an approach would enhance the replicability and validity of the study and help to unravel how and why the attributes identified here impact the variables responsible for learners’ self-regulatory abilities.
References
Ai-Lim Lee, E., Wong, K. W., & Fung, C. C. (2010). How does desktop virtual reality enhance learning outcomes? A structural equation modeling approach. Computers & Education, 55(4), 1424–1442. doi:10.1016/j.compedu.2010.06.006.
Alant, E., & Dada, S. (2005). Group learning on the web. International Journal of Educational Development, 25(3), 305–316. doi:10.1016/j.ijedudev.2004.11.010.
Aleven, V. A. W. M. M., & Koedinger, K. R. (2002). An effective metacognitive strategy: learning by doing and explaining with a computer-based Cognitive Tutor. Cognitive Science, 26(2), 147–179. doi:10.1016/S0364-0213(02)00061-7.
Allan, J. (1996). Learning outcomes in higher education. Studies in Higher Education, 21(1), 93–108. doi:10.1080/03075079612331381487.
Ambrose-Oji, B., Lawrence, A., & Stewart, A. (2014). Community based forest enterprises in Britain: two organising typologies. Forest Policy and Economics. doi:10.1016/j.forpol.2014.11.005.
Anseel, F., Lievens, F., & Schollaert, E. (2009). Reflection as a strategy to enhance task performance after feedback. Organizational Behavior and Human Decision Processes, 110(1), 23–35. doi:10.1016/j.obhdp.2009.05.003.
Artino, A. R. (2009a). Online learning: are subjective perceptions of instructional context related to academic success? Internet & Higher Education, 12(3/4), 117–125. doi:10.1016/j.iheduc.2009.07.003.
Artino, A. R. (2009b). Think, feel, act: motivational and emotional influences on military students’ online academic success. Journal of Computing in Higher Education, 21(2), 146–166. doi:10.1007/s12528-009-9020-9.
Artino, A. R., & Jones, K. D., II. (2012). Exploring the complex relations between achievement emotions and self-regulated learning behaviors in online learning. Internet and Higher Education, 15(3), 170–175. doi:10.1016/j.iheduc.2012.01.006.
Artino, A. R., & Stephens, J. M. (2009). Academic motivation and self-regulation: a comparative analysis of undergraduate and graduate students learning online. Internet and Higher Education, 12(3-4), 146–151. doi:10.1016/j.iheduc.2009.02.001.
Bandura, A. (1977). Self-efficacy: toward a unifying theory of behavioral change. Psychological Review, 84(2), 191.
Bandura, A. (1989). Human agency in social cognitive theory. American Psychologist, 44(9), 1175.
Barnard, L., Lan, W. Y., To, Y. M., Paton, V. O., & Lai, S.-L. (2009). Measuring self-regulation in online and blended learning environments. The Internet and Higher Education, 12(1), 1–6. doi:10.1016/j.iheduc.2008.10.005.
Barratt, P. E. H. (1971). Bases of psychological methods (vol. 7). Australia: Wiley.
Barzilai, S., & Eshet-Alkalai, Y. (2015). The role of epistemic perspectives in comprehension of multiple author viewpoints. Learning and Instruction, 36, 86–103. doi:10.1016/j.learninstruc.2014.12.003.
Baumeister, R. F., & Heatherton, T. F. (1996). Self-regulation failure: an overview. Psychological Inquiry, 7(1), 1–15. doi:10.1207/s15327965pli0701_1.
Baumeister, R. F., & Leary, M. R. (1997). Writing narrative literature reviews. Review of General Psychology, 1(3), 311.
Benson, P. (2013). Teaching and researching: Autonomy in language learning: Routledge.
Bernard, R. M., Abrami, P. C., Lou, Y., Borokhovski, E., Wade, A., Wozney, L., & Huang, B. (2004). How does distance education compare with classroom instruction? a meta-analysis of the empirical literature. Review of Educational Research, 74(3), 379–439. doi:10.3102/00346543074003379.
Bersin, J. (2003). What works in blended learning. Learning Circuits.
Blok, H., Oostdam, R., Otter, M. E., & Overmaat, M. (2002). Computer-assisted instruction in support of beginning reading instruction: a review. Review of Educational Research, 72(1), 101–130. doi:10.3102/00346543072001101.
Boekaerts, M. (1992). The adaptable learning process: initiating and maintaining behavioural change. Applied Psychology, 41(4), 377–397. doi:10.1111/j.1464-0597.1992.tb00713.x.
Boekaerts, M. (1995). The interface between intelligence and personality as determinants of classroom learning. International handbook of personality and intelligence (161–183): Springer.
Boekaerts, M. (1996a). Coping with stress in childhood and adolescence.
Boekaerts, M. (1996b). Self-regulated learning at the junction of cognition and motivation. European Psychologist, 1(2), 100. doi:10.1027/1016-9040.1.2.100.
Boekaerts, M. (1997). Self-regulated learning: a new concept embraced by researchers, policy makers, educators, teachers, and students. Learning and Instruction, 7(2), 161–186. doi:10.1016/S0959-4752(96)00015-1.
Boekaerts, M. (1999). Self-regulated learning: where we are today. International Journal of Educational Research, 31, 445–457. doi:10.1016/S0883-0355(99)00014-2.
Boekaerts, M., Pintrich, P. R., & Zeidner, M. (2005). Handbook of self-regulation: Elsevier.
Boelens, R., Van Laer, S., De Wever, B., & Elen, J. (2015). Blended learning in adult education: towards a definition of blended learning.
Borkowski, J., Carr, M., Rellinger, E., Pressley, M., & others. (1990). Self-regulated cognition: interdependence of metacognition, attributions, and self-esteem. Dimensions of Thinking and Cognitive Instruction, 1, 53–92.
Brusso, R. C., & Orvis, K. A. (2013). The impeding role of initial unrealistic goal-setting on videogame-based training performance: identifying underpinning processes and a solution. Computers in Human Behavior, 29(4), 1686–1694. doi:10.1016/j.chb.2013.01.006.
Butler, D. L. (1998). The strategic content learning approach to promoting self-regulated learning: a report of three studies. Journal of Educational Psychology, 90(4), 682.
Butler, D. L. (2002). Individualizing instruction in self-regulated learning. Theory Into Practice, 41(2), 81–92.
Butler, D. L., & Winne, P. H. (1995). Feedback and self-regulated learning: a theoretical synthesis. Review of Educational Research, 65, 245–281. doi:10.2307/1170684.
Casillas, J. M., & Gremeaux, V. (2012). Evaluation of medical students’ expectations for multimedia teaching materials: illustration by an original method using the evaluation of a web site on cardiovascular rehabilitation. Annals of Physical and Rehabilitation Medicine, 55(1), 25–37. doi:10.1016/j.rehab.2011.12.001.
Chen, Y.-C. (2014). An empirical examination of factors affecting college students’ proactive stickiness with a web-based English learning environment. Computers in Human Behavior, 31, 159–171. doi:10.1016/j.chb.2013.10.040.
Cheng, B., Wang, M., Mørch, A. I., Chen, N.-S., Kinshuk, & Spector, J. M. (2014). Research on e-learning in the workplace 2000–2012: a bibliometric analysis of the literature. Educational Research Review, 11, 56–72. doi:10.1016/j.edurev.2014.01.001.
Chia-Wen, T., Pei-Di, S., & Meng-Chuan, T. (2011). Developing an appropriate design of blended learning with web-enabled self-regulated learning to enhance students’ learning and thoughts regarding online learning. Behaviour & Information Technology, 30(2), 261–271. doi:10.1080/0144929X.2010.514359.
Cholowski, K. M., & Chan, L. K. S. (2004). Cognitive factors in student nurses’ clinical problem solving. Journal of Evaluation in Clinical Practice, 10(1), 85–95.
Clark, E., Draper, J., & Rogers, J. (2015). Illuminating the process: enhancing the impact of continuing professional education on practice. Nurse Education Today, 35(2), 388–394. doi:10.1016/j.nedt.2014.10.014.
Corbalan, G., Kester, L., & van Merriënboer, J. J. G. (2008). Selecting learning tasks: effects of adaptation and shared control on learning efficiency and task involvement. Contemporary Educational Psychology, 33(4), 733–756. doi:10.1016/j.cedpsych.2008.02.003.
Corbalan, G., Kester, L., & van Merriënboer, J. J. G. (2009). Dynamic task selection: effects of feedback and learner control on efficiency and motivation. Learning and Instruction, 19(6), 455–465. doi:10.1016/j.learninstruc.2008.07.002.
Cox, E. D., Koscik, R. L., Olson, C. A., Behrmann, A. T., Hambrecht, M. A., McIntosh, G. C., & Kokotailo, P. K. (2006). Caring for the Underserved: blending service learning and a web-based curriculum. American Journal of Preventive Medicine, 31(4), 342–349. doi:10.1016/j.amepre.2006.06.024.
Cramer, M. E., High, R., Culross, B., Conley, D. M., Nayar, P., Nguyen, A. T., & Ojha, D. (2014). Retooling the RN workforce in long-term care: nursing certification as a pathway to quality improvement. Geriatric Nursing, 35(3), 182–187. doi:10.1016/j.gerinurse.2014.01.001.
Dai, C.-Y., & Huang, D.-H. (2015). Causal complexities to evaluate the effectiveness of remedial instruction. Journal of Business Research, 68(4), 894–899. doi:10.1016/j.jbusres.2014.11.048.
Davis, J. M., & Yi, M. Y. (2012). User disposition and extent of web utilization: a trait hierarchy approach. International Journal of Human-Computer Studies, 70(5), 346–363. doi:10.1016/j.ijhcs.2011.12.003.
De Corte, E., Greer, B., & Verschaffel, L. (1996). Mathematics teaching and learning.
De Corte, E., Verschaffel, L., Entwistle, N., & van Merriënboer, J. J. G. (2003). Unravelling basic components and dimensions of powerful learning environments. Oxford: Elsevier Science.
De Jong, T., & Van Joolingen, W. R. (1998). Scientific discovery learning with computer simulations of conceptual domains. Review of Educational Research, 68(2), 179–201. doi:10.3102/00346543068002179.
Demetriadis, S. N., Papadopoulos, P. M., Stamelos, I. G., & Fischer, F. (2008). The effect of scaffolding students’ context-generating cognitive activity in technology-enhanced case-based learning. Computers & Education, 51(2), 939–954. doi:10.1016/j.compedu.2007.09.012.
Donnelly, R. (2010). Harmonizing technology with interaction in blended problem-based learning. Computers & Education, 54(2), 350–359. doi:10.1016/j.compedu.2009.08.012.
Doo, M. Y. (2006). A problem in online interpersonal skills training: do learners practice skills? Open Learning, 21(3), 263–272. doi:10.1080/02680510600953252.
Driscoll, M. (2002). Blended learning: let’s get beyond the hype. E-learning, 1.
DuBois, J. M., Dueker, J. M., Anderson, E. E., & Campbell, J. (2008). The development and assessment of an NIH-funded research ethics training program. Academic Medicine, 83(6), 596–603. doi:10.1097/ACM.0b013e3181723095.
Garrison, D. R., & Kanuka, H. (2004). Blended learning: uncovering its transformative potential in higher education. The Internet and Higher Education, 7(2), 95–105. doi:10.1016/j.iheduc.2004.02.001.
Garrison, D. R., & Vaughan, N. D. (2008). Blended learning in higher education: Framework, principles, and guidelines: John Wiley & Sons.
Gerhard, M., Moore, D., & Hobbs, D. (2004). Embodiment and copresence in collaborative interfaces. International Journal of Human-Computer Studies, 61(4), 453–480. doi:10.1016/j.ijhcs.2003.12.014.
Giesbers, B., Rienties, B., Tempelaar, D., & Gijselaers, W. (2013). Investigating the relations between motivation, tool use, participation, and performance in an e-learning course using web-videoconferencing. Computers in Human Behavior, 29(1), 285–292. doi:10.1016/j.chb.2012.09.005.
Gomez, E. A., Wu, D., & Passerini, K. (2010). Computer-supported team-based learning: the impact of motivation, enjoyment and team contributions on learning outcomes. Computers & Education, 55(1), 378–390. doi:10.1016/j.compedu.2010.02.003.
Govaere, L. J., de Kruif, A., & Valcke, M. (2012). Differential impact of unguided versus guided use of a multimedia introduction to equine obstetrics in veterinary education. Computers & Education, 58(4), 1076–1084. doi:10.1016/j.compedu.2011.11.006.
Graddol, D., Maybin, J., & Stierer, B. (1994). Researching language and literacy in social context: a reader: Multilingual matters.
Graham, C. R. (2006). Blended learning systems: definition, current trends, and future directions. In C. J. Bonk & C. R. Graham (Eds.), The handbook of blended learning: Global perspectives, local designs. San Francisco: Pfeiffer.
Greene, J. A., & Azevedo, R. (2007). A theoretical review of Winne and Hadwin’s model of self-regulated learning: new perspectives and directions. Review of Educational Research, 77(3), 334–372. doi:10.3102/003465430303953.
Gulikers, J. T. M., Bastiaens, T. J., & Martens, R. L. (2005). The surplus value of an authentic learning environment. Computers in Human Behavior, 21(3), 509–521. doi:10.1016/j.chb.2004.10.028.
Hadwin, A. F., & Winne, P. H. (2001). CoNoteS2: a software tool for promoting self-regulation. Educational Research and Evaluation, 7(2-3), 313–334.
Harris, K. R., & Pressley, M. (1991). The nature of cognitive strategy instruction: interactive strategy construction. Exceptional Children, 57(5), 392–404.
Harrison, M. (2003). Blended learning in practice.
Hart, C. (2009). Doing a literature review: Releasing the social science research imagination: Sage.
Higgins, J. P., & Green, S. (2008). Cochrane handbook for systematic reviews of interventions (vol. 5): Wiley Online Library.
Ho, C., & Dzeng, R.-J. (2010). Construction safety training via e-learning: learning effectiveness and user satisfaction. Computers & Education, 55(2), 858–867. doi:10.1016/j.compedu.2010.03.017.
Ho, C., & Swan, K. (2007). Evaluating online conversation in an asynchronous learning environment: an application of Grice’s cooperative principle. The Internet and Higher Education, 10(1), 3–14. doi:10.1016/j.iheduc.2006.11.002.
Hodges, C. B., & Murphy, P. F. (2009). Sources of self-efficacy beliefs of students in a technology-intensive asynchronous college algebra course. The Internet and Higher Education, 12(2), 93–97. doi:10.1016/j.iheduc.2009.06.005.
House, R. (2002). Clocking in column. The Spokesman-Review.
Hughes, M. G., Day, E. A., Wang, X., Schuelke, M. J., Arsenault, M. L., Harkrider, L. N., & Cooper, O. D. (2013). Learner-controlled practice difficulty in the training of a complex task: cognitive and motivational mechanisms. Journal of Applied Psychology, 98(1), 80–98. doi:10.1037/a0029821.
Hung, H.-L., & Hyun, E. (2010). East Asian international graduate students’ epistemological experiences in an American University. International Journal of Intercultural Relations, 34(4), 340–353. doi:10.1016/j.ijintrel.2009.12.001.
Hung, S.-Y., Huang, K.-L., & Yu, W.-J. (2011). An empirical study of the effectiveness of multimedia disclosure of informed consent: a technology mediated learning perspective. Information & Management, 48(4–5), 135–144. doi:10.1016/j.im.2011.03.002.
Ibabe, I., & Jauregizar, J. (2010). Online self-assessment with feedback and metacognitive knowledge. Higher Education, 59(2), 243–258. doi:10.1007/s10734-009-9245-6.
Ifenthaler, D. (2010). Learning and instruction in the digital age: Springer.
Ioannou, A., Brown, S. W., & Artino, A. R. (2015). Wikis and forums for collaborative problem-based activity: a systematic comparison of learners’ interactions. The Internet and Higher Education, 24, 35–45. doi:10.1016/j.iheduc.2014.09.001.
Jonas, D., & Burns, B. (2010). The transition to blended e-learning. changing the focus of educational delivery in children’s pain management. Nurse Education in Practice, 10(1), 1–7. doi:10.1016/j.nepr.2009.01.015.
Jonassen, D. H. (1999). Designing constructivist learning environments. Instructional Design Theories and Models: A New Paradigm of Instructional Theory, 2, 215–239.
Joy, M. (2007). Research methods in education (6th edition). Bioscience Education, 10. doi:10.3108/beej.10.r1.
Kearsley, G., & Moore, M. (1996). Distance education: A systems view. Washington: Wadsworth Publishing Company.
Keegan, D. (1996). Foundations of distance education: Psychology Press.
Keller, J. M. (2010a). Challenges in learner motivation: A holistic, integrative model for research and design on learner motivation.
Keller, J. M. (2010b). Motivational design for learning and performance. Boston: Springer US.
Kim, M., & Ryu, J. (2013). The development and implementation of a web-based formative peer assessment system for enhancing students’ metacognitive awareness and performance in ill-structured tasks. Educational Technology Research and Development, 61(4), 549–561. doi:10.1007/s11423-012-9266-1.
Knowles, M. S., Holton, E. F., & Swanson, R. A. (2014). The adult learner: The definitive classic in adult education and human resource development: Routledge.
Kobak, K. A., Craske, M. G., Rose, R. D., & Wolitsky-Taylor, K. (2013). Web-based therapist training on cognitive behavior therapy for anxiety disorders: a pilot study. Psychotherapy, 50(2), 235–247. doi:10.1037/a0030568.
Koh, J. H. L., & Chai, C. S. (2014). Teacher clusters and their perceptions of technological pedagogical content knowledge (TPACK) development through ICT lesson design. Computers & Education, 70, 222–232. doi:10.1016/j.compedu.2013.08.017.
Koke, T., & Norvele, I. (2008). Incorporation of learning strategies into adult distance learning. Studies for the Learning Society, 1(1), 39. doi:10.2478/v10240-012-0017-y.
Kovačević, I., Minović, M., Milovanović, M., de Pablos, P. O., & Starčević, D. (2013). Motivational aspects of different learning contexts: “My mom won’t let me play this game…”. Computers in Human Behavior, 29(2), 354–363. doi:10.1016/j.chb.2012.01.023.
Kumar, V. S., Gress, C. L., Hadwin, A. F., & Winne, P. H. (2010). Assessing process in CSCL: an ontological approach. Computers in Human Behavior, 26(5), 825–834.
Kuo, F.-R., Hwang, G.-J., & Lee, C.-C. (2012). A hybrid approach to promoting students’ web-based problem-solving competence and learning attitude. Computers & Education, 58(1), 351–364. doi:10.1016/j.compedu.2011.09.020.
Lafuente Martínez, M., Álvarez Valdivia, I. M., & Remesal Ortiz, A. (2015). Making learning more visible through e-assessment: implications for feedback. Journal of Computing in Higher Education, 27(1), 10–27. doi:10.1007/s12528-015-9091-8.
Law, E. L.-C., & Sun, X. (2012). Evaluating user experience of adaptive digital educational games with activity theory. International Journal of Human-Computer Studies, 70(7), 478–497. doi:10.1016/j.ijhcs.2012.01.007.
Leen, E. A. E., & Lang, F. R. (2013). Motivation of computer based learning across adulthood. Computers in Human Behavior, 29(3), 975–983. doi:10.1016/j.chb.2012.12.025.
Liaw, S.-S., Hatala, M., & Huang, H.-M. (2010). Investigating acceptance toward mobile learning to assist individual knowledge management: based on activity theory approach. Computers & Education, 54(2), 446–454. doi:10.1016/j.compedu.2009.08.029.
Lin, K.-M. (2011). e-Learning continuance intention: moderating effects of user e-learning experience. Computers & Education, 56(2), 515–526. doi:10.1016/j.compedu.2010.09.017.
Lin, A. C. H., Fernandez, W. D., & Gregor, S. (2012). Understanding web enjoyment experiences and informal learning: a study in a museum context. Decision Support Systems, 53(4), 846–858. doi:10.1016/j.dss.2012.05.020.
Lin, S., Zimmer, J. C., & Lee, V. (2013). Podcasting acceptance on campus: the differing perspectives of teachers and students. Computers & Education, 68, 416–428. doi:10.1016/j.compedu.2013.06.003.
Lodewyk, K. R., Winne, P. H., & Jamieson-Noel, D. L. (2009). Implications of task structure on self-regulated learning and achievement. Educational Psychology, 29(1), 1–25.
Lynch, R., & Dembo, M. (2004). The relationship between self-regulation and online learning in a blended learning context. The International Review of Research in Open and Distance Learning, 5.
Ma, Y. (2012). A study on teachers’ online distance learning at Shangluo University.
Makoe, M., Richardson, J. T., & Price, L. (2008). Conceptions of learning in adult students embarking on distance education. Higher Education, 55(3), 303–320. doi:10.1007/s10734-007-9056-6.
Martens, R., de Brabander, C., Rozendaal, J., Boekaerts, M., & van der Leeden, R. (2010). Inducing mind sets in self-regulated learning with motivational information. Educational Studies, 36(3), 311–327. doi:10.1080/03055690903424915.
Mauroux, L., Konings, K. D., Zufferey, J. D., & Gurtner, J.-L. (2014). Mobile and online learning journal: effects on apprentices’ reflection in vocational education and training. Vocations and Learning, 7(2), 215–239. doi:10.1007/s12186-014-9113-0.
Melton, R. (1997). Objectives, competences and learning outcomes. London: Kogan Page.
Merrill, M. D. (2002). First principles of instruction. Educational Technology Research and Development, 50(3), 43–59. doi:10.1007/Bf02505024.
Michalsky, T. (2014). Developing the SRL-PV assessment scheme: preservice teachers’ professional vision for teaching self-regulated learning. Studies in Educational Evaluation, 43, 214–229. doi:10.1016/j.stueduc.2014.05.003.
Michinov, E., & Michinov, N. (2007). Identifying a transition period at the midpoint of an online collaborative activity: a study among adult learners. Computers in Human Behavior, 23(3), 1355–1371. doi:10.1016/j.chb.2004.12.013.
Mohammadi, H. (2015). Investigating users’ perspectives on e-learning: an integration of TAM and IS success model. Computers in Human Behavior, 45, 359–374. doi:10.1016/j.chb.2014.07.044.
Mohammadyari, S., & Singh, H. (2015). Understanding the effect of e-learning on individual performance: the role of digital literacy. Computers & Education, 82, 11–25. doi:10.1016/j.compedu.2014.10.025.
Mouly, G. J. (1978). Educational research: The art and science of investigation. Boston: Allyn and Bacon.
Mulder, Y. G., Lazonder, A. W., & de Jong, T. (2011). Comparing two types of model progression in an inquiry learning environment with modelling facilities. Learning and Instruction, 21(5), 614–624. doi:10.1016/j.learninstruc.2011.01.003.
Niemi, H., Nevgi, A., & Virtanen, P. (2003). Towards self-regulation in web-based learning. Journal of Educational Media, 28(1), 49–71. doi:10.1080/1358165032000156437.
Obura, T., Brant, W. E., Miller, F., & Parboosingh, I. J. (2011). Participating in a community of learners enhances resident perceptions of learning in an e-mentoring program: proof of concept. BMC Medical Education, 11, 3. doi:10.1186/1472-6920-11-3.
Oliver, M., & Trigwell, K. (2005). Can ‘blended learning’ be redeemed? E-learning, 2(1), 17–26. doi:10.2304/elea.2005.2.1.17.
Oosterbaan, A. E., van der Schaaf, M. F., Baartman, L. K. J., & Stokking, K. M. (2010). Reflection during portfolio-based conversations. International Journal of Educational Research, 49(4–5), 151–160. doi:10.1016/j.ijer.2011.02.001.
Orey, M. (2002a). Definition of blended learning. University of Georgia. Retrieved February 21, 2003.
Orey, M. (2002b). One year of online blended learning: Lessons learned.
Palincsar, A. S., & Brown, A. L. (1988). Teaching and practicing thinking skills to promote comprehension in the context of group problem solving. Remedial and Special Education, 9(1), 53–59.
Perry, N. E. (1998). Young children’s self-regulated learning and contexts that support it. Journal of Educational Psychology, 90(4), 715. doi:10.1037/0022-0663.90.4.715.
Perry, N. E., Phillips, L., & Dowler, J. (2004). Examining features of tasks and their potential to promote self-regulated learning. The Teachers College Record, 106(9), 1854–1878. doi:10.1111/j.1467-9620.2004.00408.x.
Petticrew, M., & Roberts, H. (2008). Systematic reviews in the social sciences: A practical guide: John Wiley & Sons.
Piccoli, G., Ahmad, R., & Ives, B. (2001). Web-based virtual learning environments: a research framework and a preliminary assessment of effectiveness in basic IT skills training. MIS Quarterly, 25(4), 401–426. doi:10.2307/3250989.
Pintrich, P. R. (2000). The role of goal orientation in self-regulated learning. In M. Boekaerts, P. R. Pintrich, & M. Zeidner (Eds.), Handbook of self-regulation. San Diego, CA: Academic Press.
Pintrich, P. R., & De Groot, E. V. (1990). Motivational and self-regulated learning components of classroom academic performance. Journal of Educational Psychology, 82(1), 33. doi:10.1037/0022-0663.82.1.33.
Popham, W. J., Eisner, E. W., Sullivan, H. J., & Tyler, L. L. (1969). Instructional objectives (vol. 3): Rand McNally Chicago.
Popping, R. (2000). Computer-assisted text analysis: Sage.
Pressley, M., Levin, J. R., & McDaniel, M. A. (1987). Remembering versus inferring what a word means: Mnemonic and contextual approaches.
Puustinen, M., & Pulkkinen, L. (2001). Models of self-regulated learning: a review. Scandinavian Journal of Educational Research, 45(3), 269–286. doi:10.1080/00313830120074206.
Raupach, T., Munscher, C., Pukrop, T., Anders, S., & Harendza, S. (2010). Significant increase in factual knowledge with web-assisted problem-based learning as part of an undergraduate cardio-respiratory curriculum. Advances in Health Sciences Education, 15(3), 349–356. doi:10.1007/s10459-009-9201-3.
Ream, E., Gargaro, G., Barsevick, A., & Richardson, A. (2015). Management of cancer-related fatigue during chemotherapy through telephone motivational interviewing: modeling and randomized exploratory trial. Patient Education and Counseling, 98(2), 199–206. doi:10.1016/j.pec.2014.10.012.
Reay, J. (2001). Blended learning: a fusion for the future. Knowledge Management Review, 4, 6.
Reeve, R. A., & Brown, A. L. (1985). Metacognition reconsidered: implications for intervention research. Journal of Abnormal Child Psychology, 13(3), 343–356.
Regan, K., Evmenova, A., Baker, P., Jerome, M. K., Spencer, V., Lawson, H., & Werner, T. (2012). Experiences of instructors in online learning environments: identifying and regulating emotions. The Internet and Higher Education, 15(3), 204–212. doi:10.1016/j.iheduc.2011.12.001.
Reichelt, M., Kämmerer, F., Niegemann, H. M., & Zander, S. (2014). Talk to me personally: personalization of language style in computer-based learning. Computers in Human Behavior, 35, 199–210. doi:10.1016/j.chb.2014.03.005.
Reychav, I., & Wu, D. (2015). Are your users actively involved? a cognitive absorption perspective in mobile training. Computers in Human Behavior, 44, 335–346. doi:10.1016/j.chb.2014.09.021.
Robinson, B. (1995). Learner support. Open and distance learning today, 221.
Roca, J. C., Chiu, C.-M., & Martínez, F. J. (2006). Understanding e-learning continuance intention: an extension of the technology acceptance model. International Journal of Human-Computer Studies, 64(8), 683–696. doi:10.1016/j.ijhcs.2006.01.003.
Romero, C., & Ventura, S. (2007). Educational data mining: a survey from 1995 to 2005. Expert Systems with Applications, 33(1), 135–146. doi:10.1016/j.eswa.2006.04.005.
Rooney, J. E. (2003). Knowledge infusion. Association Management, 55, 26–32.
Rossett, A. (2002). The ASTD e-learning handbook: Best practices, strategies, and case studies for an emerging field: McGraw-Hill Trade.
Salden, R. J. C. M., Paas, F., & van Merriënboer, J. J. G. (2006). Personalised adaptive task selection in air traffic control: effects on training efficiency and transfer. Learning and Instruction, 16(4), 350–362. doi:10.1016/j.learninstruc.2006.07.007.
Sands, P. (2002). Inside outside, upside downside: Strategies for connecting online and face-to-face instruction in hybrid courses. Teaching with Technology Today, 8(6).
Sansone, C., Fraughton, T., Zachary, J. L., Butner, J., & Heiner, C. (2011). Self-regulation of motivation when learning online: the importance of who, why and how. Etr&D-Educational Technology Research and Development, 59(2), 199–212. doi:10.1007/s11423-011-9193-6.
Sansone, C., Smith, J. L., Thoman, D. B., & MacNamara, A. (2012). Regulating interest when learning online: potential motivation and performance trade-offs. Internet and Higher Education, 15(3), 141–149. doi:10.1016/j.iheduc.2011.10.004.
Schraw, G., & Moshman, D. (1995). Metacognitive theories. Educational Psychology Review, 7(4), 351–371. doi:10.1007/Bf02212307.
Schraw, G., Crippen, K. J., & Hartley, K. (2006). Promoting self-regulation in science education: metacognition as part of a broader perspective on learning. Research in Science Education, 36(1-2), 111–139. doi:10.1007/s11165-005-3917-8.
Schunk, D. H., Pintrich, P. R., & Meece, J. L. (2008). Motivation in education: theory, research, and applications.
Sewart, D. (1993). Student support systems in distance education. Open Learning, 8(3), 3–12.
Siampou, F., Komis, V., & Tselios, N. (2014). Online versus face-to-face collaboration in the context of a computer-supported modeling task. Computers in Human Behavior, 37, 369–376. doi:10.1016/j.chb.2014.04.032.
Singh, H., & Reed, C. (2001). A white paper: Achieving success with blended learning. Centra Software.
Smith, G. G., & Kurthen, H. (2007). Front-stage and back-stage in hybrid e-learning face-to-face courses. International Journal on E-Learning, 6(3), 455–474.
Smith, P. L., & Ragan, T. J. (1999). Instructional design. New York: Wiley.
Smith, L. N., Craig, L. E., Weir, C. J., & McAlpine, C. H. (2008). The evidence-base for stroke education in care homes. Nurse Education Today, 28(7), 829–840. doi:10.1016/j.nedt.2008.02.002.
Spanjers, I. A., Könings, K. D., Leppink, J., Verstegen, D. M., de Jong, N., Czabanowska, K., & Van Merriënboer, J. J. (2015). The promised land of blended learning: quizzes as a moderator. Educational Research Review. doi:10.1016/j.edurev.2015.05.001.
Sternberg, B. K. (1991). A review of some experience with the induced-polarization/resistivity method for hydrocarbon surveys: Successes and limitations. Geophysics, 56(10), 1522–1532.
Strang, K. D. (2011). Knowledge articulation dialog increases online university science course outcomes. Education and Information Technologies, 16(2), 123–137. doi:10.1007/s10639-010-9130-z.
Sweller, J., Van Merrienboer, J. J., & Paas, F. G. (1998). Cognitive architecture and instructional design. Educational Psychology Review, 10(3), 251–296. doi:10.1023/A:1022193728205.
Tait, A. (2000). Planning student support for open and distance learning. Open Learning, 15(3), 287–299.
Tallent-Runnels, M. K., Thomas, J. A., Lan, W. Y., Cooper, S., Ahern, T. C., Shaw, S. M., & Liu, X. (2006). Teaching courses online: a review of the research. Review of Educational Research, 76(1), 93–135. doi:10.3102/00346543076001093.
Tan, K. E., & Richardson, P. W. (2006). Writing short messages in English: out-of-school practices of Malaysian high school students. International Journal of Educational Research, 45(6), 325–340. doi:10.1016/j.ijer.2006.11.010.
Tao, Y.-H. (2008). Typology of college student perception on institutional e-learning issues—an extension study of a teacher’s typology in Taiwan. Computers & Education, 50(4), 1495–1508. doi:10.1016/j.compedu.2007.02.002.
Taplin, R. H., Kerr, R., & Brown, A. M. (2013). Who pays for blended learning? a cost–benefit analysis. The Internet and Higher Education, 18, 61–68. doi:10.1016/j.iheduc.2012.09.002.
Thomson, I. (2002). Thomson job impact study: The next generation of corporate learning. Retrieved July 7, 2003.
Thorpe, M. (2002). Rethinking learner support: the challenge of collaborative online learning. Open Learning, 17(2), 105–119.
Ting, Y.-L. (2013). Using mobile technologies to create interwoven learning interactions: an intuitive design and its evaluation. Computers & Education, 60(1), 1–13. doi:10.1016/j.compedu.2012.07.004.
Tinto, V. (1975). Dropout from higher education: a theoretical synthesis of recent research. Review of Educational Research, 45, 89–125. doi:10.3102/00346543045001089.
Tseng, F.-C., & Kuo, F.-Y. (2010). The way we share and learn: an exploratory study of the self-regulatory mechanisms in the professional online learning community. Computers in Human Behavior, 26(5), 1043–1053. doi:10.1016/j.chb.2010.03.005.
van den Boom, G., Paas, F., & van Merriënboer, J. J. G. (2007). Effects of elicited reflections combined with tutor or peer feedback on self-regulated learning and learning outcomes. Learning and Instruction, 17(5), 532–548. doi:10.1016/j.learninstruc.2007.09.003.
van Merriënboer, J. J. G. (1997). Training complex cognitive skills: A four-component instructional design model for technical training: Educational Technology.
van Merriënboer, J. J. G., Clark, R. E., & De Croock, M. B. (2002). Blueprints for complex learning: the 4C/ID-model. Educational Technology Research and Development, 50, 39–61. doi:10.1007/BF02504993.
van Zundert, M., Sluijsmans, D., & van Merriënboer, J. (2010). Effective peer assessment processes: research findings and future directions. Learning and Instruction, 20(4), 270–279. doi:10.1016/j.learninstruc.2009.08.004.
Verhagen, T., Feldberg, F., van den Hooff, B., Meents, S., & Merikivi, J. (2012). Understanding users’ motivations to engage in virtual worlds: a multipurpose model and empirical testing. Computers in Human Behavior, 28(2), 484–495. doi:10.1016/j.chb.2011.10.020.
Vighnarajah, W., Luan, W. S., & Abu Bakar, K. (2009). Qualitative findings of students’ perception on practice of self-regulated strategies in online community discussion. Computers & Education, 53(1), 94–103. doi:10.1016/j.compedu.2008.12.021.
von Bastian, C. C., & Oberauer, K. (2013). Distinct transfer effects of training different facets of working memory capacity. Journal of Memory and Language, 69(1), 36–58. doi:10.1016/j.jml.2013.02.002.
Vygotsky, L. (1978). Interaction between learning and development. Readings on the Development of Children, 23(3), 34–41.
Ward, J., & LaBranche, G. A. (2003). Blended learning: the convergence of e-learning and meetings. Franchising World, 35, 22–24.
Weaver, S. B., Oji, V., Ettienne, E., Stolpe, S., & Maneno, M. (2014). Hybrid e-learning approach to health policy. Currents in Pharmacy Teaching and Learning, 6(2), 313–322. doi:10.1016/j.cptl.2013.11.013.
Wegerif, R., & Mercer, N. (1997). Using computer-based text analysis to integrate qualitative and quantitative methods in research on collaborative learning. Language and Education, 11(4), 271–286.
Wesiak, G., Steiner, C. M., Moore, A., Dagger, D., Power, G., Berthold, M., & Conlan, O. (2014). Iterative augmentation of a medical training simulator: effects of affective metacognitive scaffolding. Computers & Education, 76, 13–29. doi:10.1016/j.compedu.2014.03.004.
Whitelock, D., & Jelfs, A. (2003). Editorial for special issue on blended learning: blending the issues and concerns of staff and students. Journal of Educational Media, 28, 99–100.
Wilson, B. G. (1996). Constructivist learning environments: Case studies in instructional design. Educational Technology Publications.
Winne, P. H. (1982). Minimizing the black box problem to enhance the validity of theories about instructional effects. Instructional Science, 11(1), 13–28. doi:10.1007/BF00120978.
Winne, P. H. (1985). Steps toward promoting cognitive achievements. The Elementary School Journal, 85(5), 673–693. doi:10.1086/461429.
Winne, P. H. (1995). Inherent details in self-regulated learning. Educational Psychologist, 30(4), 173–187. doi:10.1207/s15326985ep3004_2.
Winne, P. H. (1996). A metacognitive view of individual differences in self-regulated learning. Learning and Individual Differences, 8(4), 327–353. doi:10.1016/S1041-6080(96)90022-9.
Winne, P. H. (2006). Handbook of educational psychology. Psychology Press.
Winne, P. H., & Hadwin, A. F. (1998). Studying as self-regulated learning. Metacognition in Educational Theory and Practice, 93, 27–30.
Winne, P. H., & Jamieson-Noel, D. (2002). Exploring students’ calibration of self reports about study tactics and achievement. Contemporary Educational Psychology, 27(4), 551–572.
Winne, P. H., & Marx, R. W. (1989). A cognitive-processing analysis of motivation within classroom tasks. Research on Motivation in Education, 3, 223–257.
Winne, P. H., & Perry, N. E. (2000). Measuring self-regulated learning.
Wouters, P., Paas, F., & van Merriënboer, J. J. G. (2009). Observational learning from animated models: effects of modality and reflection on transfer. Contemporary Educational Psychology, 34(1), 1–8. doi:10.1016/j.cedpsych.2008.03.001.
Xie, K., Miller, N. C., & Allison, J. R. (2013). Toward a social conflict evolution model: examining the adverse power of conflictual social interaction in online learning. Computers & Education, 63, 404–415. doi:10.1016/j.compedu.2013.01.003.
Yang, Y.-F., & Tsai, C.-C. (2010). Conceptions of and approaches to learning through online peer assessment. Learning and Instruction, 20(1), 72–83. doi:10.1016/j.learninstruc.2009.01.003.
Young, J. R. (2001). “Hybrid” teaching seeks to end the divide between traditional and online instruction. The Chronicle of Higher Education.
Yu, S., Chen, I. J., Yang, K.-F., Wang, T.-F., & Yen, L.-L. (2007). A feasibility study on the adoption of e-learning for public health nurse continuing education in Taiwan. Nurse Education Today, 27(7), 755–761. doi:10.1016/j.nedt.2006.10.016.
Zimmerman, B. J. (1986). Becoming a self-regulated learner: which are the key subprocesses? Contemporary Educational Psychology, 11, 307–313.
Zimmerman, B. J. (1990). Self-regulated learning and academic achievement: an overview. Educational Psychologist, 25(1), 3–17. doi:10.1207/s15326985ep2501_2.
Zimmerman, B. J. (1998). Academic studying and the development of personal skill: a self-regulatory perspective. Educational Psychologist, 33, 73–86.
Zimmerman, B. J. (2000). Self-efficacy: an essential motive to learn. Contemporary Educational Psychology, 25(1), 82–91. doi:10.1006/ceps.1999.1016.
Zimmerman, B. J., & Pons, M. M. (1986). Development of a structured interview for assessing student use of self-regulated learning strategies. American Educational Research Journal, 23(4), 614–628. doi:10.3102/00028312023004614.
Zimmerman, B. J., & Schunk, D. H. (2001). Self-regulated learning and academic achievement: Theoretical perspectives. Routledge.
Zumbrunn, S., Tadlock, J., & Roberts, E. D. (2011). Encouraging self-regulated learning in the classroom: a review of the literature. Metropolitan Educational Research Consortium (MERC).
Appendix 1
Rights and permissions
Open Access This article is distributed under the terms of the Creative Commons Attribution 4.0 International License (http://creativecommons.org/licenses/by/4.0/), which permits unrestricted use, distribution, and reproduction in any medium, provided you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons license, and indicate if changes were made.
Van Laer, S., Elen, J. In search of attributes that support self-regulation in blended learning environments. Educ Inf Technol 22, 1395–1454 (2017). https://doi.org/10.1007/s10639-016-9505-x