Abstract
This systematic review explores the emerging themes in the design and implementation of student-facing learning analytics dashboards in higher education. Learning Analytics has long been criticised for focusing too much on the analytics and not enough on the learning. The review is therefore guided by an interest in whether these dashboards are still primarily analytics-driven or whether they have become pedagogically informed over time. By mapping the identified themes of technological maturity, informing frameworks, affordances, data sources, and analytical levels over publications per year, the review identifies an emerging trajectory towards student-focused dashboards. These dashboards are informed by theory-oriented frameworks, designed to incorporate affordances that support student learning, and realised through the integration of more than just activity data from learning management systems – allowing the dashboards to better support students' learning processes. Based on this emerging trajectory, the review provides a series of design recommendations for student-focused dashboards that are connected to the learning sciences as well as analytics.
1 Introduction
1.1 Learning analytics is inevitable due to increased digitalisation of education
Learning is increasingly becoming digitally supported, especially throughout Higher Education (HE). One of the key elements in this change is the use of Learning Management Systems (LMS) such as Moodle or Blackboard, allowing students to interact with course material, discussion posts, quizzes, and a variety of other resources. When interacting with these systems, students leave digital traces or footprints that are stored in activity logs (You, 2016). These digital traces have historically been the primary source of data in the field of Learning Analytics (LA) (Schwendimann et al., 2017). LA is an emerging field interested in the “measurement, collection, analysis and reporting of data about learners and their contexts, for purposes of understanding and optimising learning and the environments in which it occurs” (Siemens, 2013, p. 1382). As digital technologies continue to play an increasingly important role in HE, there is a growing interest in using these digital traces to inform decision-making processes, improve teaching, and support student learning. However, as Ifenthaler and Schumacher (2016) put it in their review of LA for supporting student success in HE, “more educational data does not always make better educational decisions” (p. 1982). In order to make effective use of LA to support students, challenges arise, including building a strong connection to the learning sciences, understanding the environment in which learning occurs and the relevant datapoints, and focusing on the perspectives of the learners (Ferguson, 2012).
1.2 Learning analytics dashboards are bringing LA to the learners
One of the most promising sub-fields of LA when it comes to building a link to the learning sciences seems to be Learning Analytics Dashboards (LAD) (Viberg et al., 2018). Analytics dashboards provide a visual representation of the data required to achieve one or more objectives (Teasley, 2017). This marks a shift from using LA to inform adaptive systems to making data directly available to the user through visual analytics, placing the intervention in the hands of the user (Ruiperez-Valiente et al., 2021). While the terminology and boundaries around these tools have been unclear throughout their emergence, an increasingly prevalent definition is proposed by Schwendimann et al. (2017): “A learning dashboard is a single display that aggregates different indicators about learner(s), learning process(es) and/or learning context(s) into one or multiple visualizations” (p. 38). Dashboards have been designed for a variety of groups, including administrators, faculty, study advisers, and teachers. Most current dashboards are, however, aimed at the learners themselves (Matcha et al., 2020). This is a recent change, with dashboards previously being mostly aimed at teachers (Jivet et al., 2017). In this change, it is important to ensure that LADs are not just student-facing, presenting data to students, but instead student-focused, designed to support student learning through feedback, reflection, and relevant next actions. This paper presents a systematic review of student-facing dashboards in HE, exploring how learning and analytics can be connected within LAD design in order to support students’ learning – making dashboards student-focused.
1.3 When learners become the users of LADs theory plays an important role
Looking at the commonly applied research interest outlined by Siemens (2013), the LA field should be inclined to embrace pedagogical assumptions guiding the design of user-facing analytics, such as student-facing LADs. Suthers and Verbert (2013) call LA a ‘middle space’ between the fields of learning and analytics. LA as a field has, however, long been technologically driven, mostly ignoring the social and material context of the learning environment in which analytics has been applied (Fawns, 2019). With pedagogical factors set aside, researchers have primarily looked at data-points in isolation from the rest of the learning process (Gašević et al., 2015). There then seems to be a lack of theory informing LA, a problem that still seems to persist (Guzmán-Valenzuela et al., 2021). In order to put the learning back into LA, there has been a call for using LA to support students directly by creating tools that can scaffold their self-directed learning (Kilińska & Ryberg, 2019). As the primary user of the analytics changes from the institution to the student, as is the case with student-facing dashboards, pedagogy begins to play a central role, and questions arise about how we can design and use LA to support students’ learning.
1.4 LA is increasingly becoming about learning and not just analytics
Multiple existing reviews have identified the theory of self-regulated learning (SRL) as a starting point for working either directly with LADs or conducting systematic reviews (Jivet et al., 2017; Matcha et al., 2020; Valle et al., 2021a). In recent times, different pedagogical assumptions have been proposed for informing dashboards, e.g. collaborative learning analytics – using analytics to support collaborative learning (Wise et al., 2021). With multiple pedagogical assumptions paving the way for dashboards, this could point towards LA becoming increasingly about learning and not just analytics. Even recent reviews, however, keep concluding that learning is missing from LA (Guzmán-Valenzuela et al., 2021). The approaches of these reviews raise a methodological challenge: is the missing learning a continuing problem or a ghost from the past? In concluding these points, the reviews seem to be addressing the entirety of a corpus spanning back to the emergence of the LA field. LADs are often described as being in an exploratory state (Schwendimann et al., 2017). With recent concepts, reviews, and studies emphasising pedagogy, it is necessary to look at the body of research through a time-sensitive lens, outlining the development over time and emerging trajectories for future dashboard design. In order to guide our review, the following research questions have been outlined and will be expanded upon through the methodology.
Research questions

- RQ1) How have the following themes developed over time within the design of student-facing LADs in HE, and what trajectories, if any, appear when mapping the themes over publications per year?

  a) The technological maturity of the dashboard

  b) The frameworks informing the design of the dashboard

  c) The ways that the dashboard presents data to students

  d) The different data sources that feed into the dashboard

  e) The level of analytics used to analyse the data in the dashboard

- RQ2) What trajectories, if any, appear when cross-mapping the themes from RQ1, and how may these inform the future design of student-facing LADs in HE?
The review contributes to the existing knowledge base by providing a systematic overview of student-facing dashboards in higher education, uncovering emerging trajectories in order to guide further design of dashboards.
2 Systematic review methodology
In order to investigate the outlined research questions, a systematic review has been conducted (Gough et al., 2017). For guiding the review, inspiration was taken from the PRISMA 2020 framework in order to ensure the transparency of the review (Page et al., 2021). The PRISMA 2020 framework proposes a model for systematically building a data corpus, which has been adopted here.
2.1 Identification
Due to the interdisciplinary nature of LAD (Matcha et al., 2020), a variety of databases were chosen for performing the search, encompassing both general databases as well as technological and educational databases (Table 1).
In order to begin building the corpus, the following search string was developed using Boolean operators (Gough et al., 2017): (“dashboard*”) AND (“learning analytics” OR “educational data mining”). The search string was intended to be as broad as possible, as the terminology around LADs remains unaligned throughout the field, especially during the early years (Schwendimann et al., 2017). A search block limiting the results to students and learners has therefore not been included. The target user has instead been addressed through the exclusion criteria (Section 2.2). The search string was applied to title, abstract, and keywords. The initial search was performed in all databases on 08-NOV-2022 and resulted in 920 papers. The search was limited to include only peer-reviewed material in databases where this is not the default (ProQuest).
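The Boolean logic of the search string can be illustrated with a minimal sketch. This is not the syntax of any specific database; the record fields and the matching function are illustrative assumptions, showing only how the two search blocks combine over title, abstract, and keywords.

```python
def matches_search(record: dict) -> bool:
    """Illustrative check of the review's search string:
    ("dashboard*") AND ("learning analytics" OR "educational data mining").
    The wildcard is approximated by a substring match, so "dashboards"
    also matches. Field names are hypothetical, not a database API."""
    text = " ".join(
        record.get(field, "") for field in ("title", "abstract", "keywords")
    ).lower()
    has_dashboard = "dashboard" in text  # first search block (wildcard term)
    has_domain = ("learning analytics" in text
                  or "educational data mining" in text)  # second block (OR)
    return has_dashboard and has_domain  # blocks joined with AND

record = {"title": "A learning analytics dashboard for students",
          "abstract": "", "keywords": "higher education"}
print(matches_search(record))  # True
```

A record mentioning only “educational data mining” without any dashboard term would fail the AND condition, which is why the target-user restriction had to be handled later via exclusion criteria rather than in the string itself.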
2.2 Screening, eligibility, and inclusion
Before screening results, duplicates were removed, bringing the total number of papers down to 592. In order to perform an initial screening of the results, a set of exclusion criteria was created and applied by assessing the abstracts of the initial corpus (Table 2).
The criterion “Language” was created to ensure that papers which could not be interpreted were not included in the synthesis. Therefore, only papers written in English were included. “Document type” excluded all documents other than journal articles, conference papers, and PhD theses. The criterion “Dashboard” excluded studies that did not present a dashboard for education, as well as non-empirically tested frameworks for designing and evaluating dashboards. “Target user” excluded dashboards for academic staff, such as teacher-only dashboards. If the dashboard was aimed at both students and teachers, the paper was excluded, as student-facing dashboards differ from teacher-facing dashboards in their representation of data (Matcha et al., 2020). “Multiple dashboards” excluded literature reviews as well as comparisons of different dashboards from different educational settings. Papers comparing different versions of the same dashboard in the same educational setting were still included. Lastly, “Educational setting” excluded studies that did not present their educational setting or presented an educational setting other than higher education. This also excluded MOOCs not explicitly limited to higher education students.
If the exclusion was unclear the paper was marked and discussed with another researcher in order to determine inclusion/exclusion. After screening abstracts, the exclusion criteria presented in Table 2 were applied to the remaining full texts. 115 items were sought for retrieval with five items not being retrieved due to limited access. After the detailed process, 39 papers were included for further analysis (Fig. 1).
Before attending to the synthesis of the corpus, the overall corpus characteristics will first be examined. A relevant metric is the number of publications per year in the corpus (Fig. 2).
Figure 2 shows that student-facing LADs in Higher Education are a growing field, a point which is consistent with other reviews (Guzmán-Valenzuela et al., 2021; Matcha et al., 2020; Valle et al., 2021a). The publication types present in the final corpus have also been analysed (Fig. 3).
Figure 3 shows that most of the publications are conference papers. This is in line with LADs still being in an exploratory state (Schwendimann et al., 2017). There is, however, also a noticeable number of journal publications, leading back to the initial question of whether the technology is maturing – a point which will be discussed throughout the paper.
2.3 Synthesis
In order to make sense of the final corpus, research question 1 was addressed by mapping and coding the corpus with the aim of outlining the technological maturity, informing frameworks, affordances, data sources and analytical levels of the studies.
Informing frameworks, affordances, and data sources were mapped by inductively coding themes, so as not to impose preconceptions of the approaches used in designing and implementing the dashboards. This meant that the codes emerged by reading the corpus and searching for the presented themes, e.g., data sources. The codes were iteratively refined by the researchers as they emerged and were noted down in a shared codebook to ensure transparency (Belur et al., 2021). Initial coding was conducted by researcher A. Unclear codes were noted down and discussed with researcher B before assigning the final code and re-iterating the codebook. The inductively created codes will be further expanded upon as they are introduced in Section 3.
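The shared-codebook workflow described above can be sketched as follows. The codes, definitions, and excerpt here are illustrative stand-ins, not the review's actual codebook; the point is only the mechanism of assigning known codes and flagging unclear cases for discussion between researchers.

```python
# Hypothetical codebook: code -> working definition. Entries here are
# invented for illustration and do not reproduce the review's codebook.
codebook = {
    "LMS": "Activity logs from a learning management system",
    "external_tool": "Data from a tool outside the LMS",
    "self_reported": "Data entered by the student, e.g. time management",
}

def code_excerpt(excerpt: str, code: str, codebook: dict) -> dict:
    """Assign a code if it already exists in the shared codebook;
    otherwise flag the excerpt as unclear so it can be discussed
    and the codebook re-iterated with a new or refined code."""
    if code in codebook:
        return {"excerpt": excerpt, "code": code, "unclear": False}
    return {"excerpt": excerpt, "code": None, "unclear": True}

result = code_excerpt("Dashboard uses Moodle activity logs", "LMS", codebook)
print(result["code"])  # LMS
```

In this sketch, any code not yet in the codebook is routed to the discussion step rather than silently added, mirroring the A-codes-then-B-reviews process described in the text.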
For addressing the technological maturity, technology readiness levels were adapted (Tzinis, 2015). Building upon the tech-readiness scale allows for developing a standardised approach that can be adopted by others in order to evaluate the maturity of LADs. To relate the scale to LADs, a codebook was developed translating the scale to reflect LAD maturity. For analysis the coding was further clustered (Table 3).
In order to code the analytical levels, four types of LA were adopted from Jayashanka et al. (2022) (Table 4). This allowed for going beyond the descriptive / predictive distinction seen in other reviews such as Susnjak et al. (2022).
In addition to the descriptive codes, the different themes were also mapped out over the number of publications per year in a stacked area chart. Doing so allowed for identifying current trends in the field and potential trajectories. In order to address RQ2, these categories were cross-mapped, showing how the different themes relate to each other. The cross-mapping methodology will be further unfolded later in the paper.
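The themes-over-publications-per-year mapping amounts to building one count series per code, aligned on a common year axis, which is the input shape a stacked area chart expects. A minimal sketch, with an invented corpus of (year, code) pairs rather than the review's actual data:

```python
from collections import Counter

# Illustrative corpus entries: (publication year, coded cluster).
# These counts are made up for the example, not taken from the review.
corpus = [(2016, "L1"), (2017, "L2"), (2017, "L2"), (2018, "L3"),
          (2019, "L2"), (2019, "L1"), (2020, "L2")]

def series_per_year(corpus, themes, years):
    """Build one publication-count series per theme, aligned on the
    shared year axis -- the stacked layers of the area chart."""
    counts = Counter(corpus)
    return {t: [counts[(y, t)] for y in years] for t in themes}

years = list(range(2016, 2021))
series = series_per_year(corpus, ["L1", "L2", "L3"], years)
print(series["L2"])  # [0, 2, 0, 1, 1]
```

The resulting series can be passed directly to a plotting routine such as matplotlib's `stackplot`; keeping the aggregation separate from the plotting also makes the per-year counts easy to inspect when reading trajectories off the chart.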
2.4 Limitations
Despite the systematic nature of the review, and the added transparency of the codebook, there are still some limitations which may affect the conclusions that can be drawn from the results. By limiting itself to student-facing dashboards in HE, the identified trajectories represent neither student-facing dashboards in general, nor dashboards as a way of mediating learning analytics, nor the entirety of the learning analytics field. Another limiting factor is the approach used in the synthesis. The trajectories depend on the inductive codes for some categories and are therefore not objective statements about the dashboards. During the coding process it was noted that while many publications cite feedback and reflection as the purpose of the dashboard, these terms are used in very broad ways across the corpus – meaning that while authors may claim that a certain dashboard is built for feedback, it may not be coded as such if the affordance description is not matched. The inductive nature of these codes may then change the way the trajectories appear.
3 Results
In order to address the research questions, the following three steps will be outlined for every theme identified in RQ1. First, the codes will be presented. Secondly, descriptive results for each category will be presented. Finally, the stacked area charts will be presented based on the clusters for the categories in RQ1.
3.1 Technological maturity
This theme outlines the technological maturity of the dashboards, identifying how close they are to being implemented in a generalised setting. In order to address the technological maturity, the four clusters outlined in Section 2.3 have been applied to the final full texts (Table 5).
Over half of the included studies (23/39) are coded under the second maturity level, reporting that the dashboard has been introduced into the wild by being evaluated in one or more courses, but not repeatedly implemented over multiple semesters. Only a single paper has reported a level 4 dashboard, with it being available for others to use (Roa Romero et al., 2021). When mapped over publications per year, it can however be seen that the field is evolving over time (Fig. 4).
Figure 4 shows that while the number of publications is rising overall, most studies still fall under maturity level 1 or 2. This potentially indicates that most studies are still concerned with trying out different configurations, supporting the explorative notion. The mapping over publications per year, however, also shows an influx of level 3 and 4 papers emerging from 2016 onwards.
3.2 Informing frameworks
This theme outlines the frameworks that have informed the design of the dashboard. This entails both design-oriented frameworks, e.g. user-centred design (Duan et al., 2022), as well as theory-oriented frameworks, e.g. Self-Regulated Learning (Lu et al., 2020) (Table 6).
The codes were inductively constructed for this theme, meaning that codes were not pre-determined but iteratively developed through reading the corpus, extracting the frameworks from the individual papers. During this coding process a pattern appeared, with the codes ‘none’ (11/39) and ‘self-regulated learning (SRL)’ (14/39) appearing more often than other codes. The rest of the codes were therefore grouped into two clusters, as can be seen in Table 6. The first cluster comprises the theory-oriented frameworks, including educational theories such as social constructivism (3/39), evaluative frameworks such as actionable feedback (2/39), psychological concepts such as Self-determination theory (3/39), and the sociological concept of use diffusion (1/39). The second cluster comprises the design-oriented frameworks, such as user-centred design (2/39). There is then a dominance of theory-oriented frameworks, primarily SRL. Despite this wide range of informing frameworks, there is still a noticeable number of publications not citing any frameworks for informing their design. In order to see whether this relation between frameworks is stable, the frameworks have been mapped over publications per year (Fig. 5).
Figure 5 shows that LADs are increasingly informed by theory-oriented frameworks, primarily SRL. While there is still a stable number of dashboards present that are not informed by any framework, an increase can be seen in the number of dashboards informed by theory-oriented frameworks. Secondly, it can also be seen that the presence of other theory-oriented frameworks appeared before SRL emerged as a framework for informing dashboard design, with SRL taking the dominant position in recent years.
3.3 Affordances
This theme outlines the affordances that the dashboards entail when looking through a student-focused lens (Table 7). This encompasses different ways of visualising data for students, as well as the ways students can interact with the dashboard. The coded affordances attend to individual parts of the dashboards, e.g., a visualisation showing progress over time (Sedrakyan et al., 2017), which may differ from the research purpose outlined in the papers, e.g., increasing dashboard use.
The codes for this theme were inductively constructed, meaning that codes were not pre-determined, but developed by iteratively defining and coding the descriptions shown in Table 7.
The most common affordances are comparison (33/39), awareness (30/39), and monitoring (24/39). While comparison, awareness, and monitoring are present in most dashboards, the rest of the coded items are more scarcely distributed. Here, it is interesting to note that the three most common affordances are all oriented towards describing practice – this will be further discussed during the fifth theme (Analytical Levels). The remaining codes entail items such as prediction (10/39), showing students a prediction of their final grades (Hellings & Haelermans, 2022); recommendation (8/39), recommending future actions to students (Sansom et al., 2020); feedback (4/39), giving students assessment or evaluation which is not just a grade or a scale, but e.g. written feedback (Tzi-Dong Ng et al., 2022); reflection (3/39), using text prompts to facilitate reflection; and goal setting (2/39), allowing students to set goals that they can follow up on (Winstone, 2019). When mapped out over publications per year, a trajectory appears (Fig. 6).
Figure 6 shows that the three primary affordances (comparison, awareness, and monitoring) were also the ones that were first present in dashboard design, with other affordances appearing from 2016 onwards. Figure 6 shows how the student-focused LAD field is evolving, leading to non-descriptive items beginning to appear. Here, a distinction can also be made between technically informed affordances such as predictive visualisations and theory-oriented affordances such as feedback. It can then be seen that most of the new affordances emerging after 2017 are aimed at supporting students, e.g., feedback and recommendation, creating a clearer link back to the theory-oriented frameworks which are increasingly informing LADs.
3.4 Data sources
This theme outlines the data sources that feed into the dashboards. This means that the focus is on the data generated by, or about the student (Table 8).
This category was also inductively coded by iteratively defining the descriptions shown in the second column, resulting in four different codes. The most common approach is centred around activity logs from LMSs, e.g. Moodle (Sahin & Yurdugül, 2017), being coded 26/39 times. Secondly, 11/39 dashboards collect data from an external tool outside of the LMS. These can be divided into dashboards exclusively focusing on external tools such as e-book readers (Chen et al., 2020) or online forums (Ullmann et al., 2019), and external tools supplementing the activity data from LMSs (Ramaswami et al., 2019). Students self-reporting data, e.g. time management (Broos et al., 2020) or emotional response (Sedrakyan et al., 2017), was coded 9/39 times. This approach is the least technical way of implementing LA, as the technology is only present in the analysis, and not in the data collection. The least common approach is collecting the students’ grades from administrative systems, being coded only 6/39 times. When mapping out the data sources over time, no clear trajectory emerges (Fig. 7).
Figure 7 shows that most data sources have been present throughout most of the time period, meaning that a trajectory cannot be outlined. This shows that while the affordances of the dashboards seem to be evolving over time, the inputs remain relatively stable.
3.5 Analytical levels
This theme outlines the analytical levels of analysing data which are present in the dashboards (Table 9). This means that the focus is on how data is processed before being visualised to the student. Here, the distinction between four different levels of LA, and their descriptions, was adopted from Jayashanka et al. (2022).
The first level of analytics is the descriptive level, which is also the most common, being coded 38/39 times. This is in alignment with the affordances outlined in the third theme, showing what has happened by making students aware, allowing them to monitor over time and compare to others. The second level is the diagnostic level, which is concerned with explaining why something happened. This level is coded 7/39 times, taking shape in the form of e.g. elaborative text explaining the descriptive results (Broos et al., 2020). While this analytical level is coded seven times, it is also here that multiple papers on the same dashboard seem to be most prevalent, with the seven codes being spread out over just three different dashboards (Broos et al., 2020; De Quincey et al., 2019; Sansom et al., 2020).
When moving to the third level, predictive analytics, there is an increase in publications, with it being coded 10/39 times. This may in part be due to the above-mentioned historical perspective, but also due to an increased focus on creating a distinction between descriptive and predictive analytics (Valle et al., 2021c), leaving out the remaining levels. Lastly, prescriptive analytics are found in 6/39 studies, taking shape in the form of recommendations for next course material (Afzaal et al., 2021; Sansom et al., 2020) and changes in activity, e.g. increased reading or engagement (De Quincey et al., 2019; Susnjak et al., 2022). Here, three of the six papers are again reporting on the same dashboard (R. Bodily et al., 2018; R. G. Bodily, 2018; Sansom et al., 2020), showing that the work done with diagnostic and prescriptive analytics also seems to come from the dashboards that are published multiple times. When mapping the four levels out over publications per year, a trajectory appears (Fig. 8).
Figure 8 shows that the descriptive level is the basis of student-facing LADs, being present from the beginning. In recent years the other levels are beginning to appear, albeit with no diagnostic papers being coded in the last two years. Prescriptive and diagnostic codes before 2019 are the result of multiple publications on the same dashboards, which is in line with the trajectories around the affordance of recommendation and the diagnostic analytical level.
4 Cross mapping themes
In RQ1 the five presented themes each show a certain trajectory, or lack thereof, when mapped out over publications per year. Attending to RQ2, the use of heatmaps allows for cross mapping the identified themes, e.g., data sources over affordances. All 20 cross-mappings were examined, and the key cross-mappings from informing frameworks and data sources are presented here, as they provide insights which we deem valuable in relation to the design of future dashboards.
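The cross-mapping behind the heatmaps amounts to counting how often a code from one theme co-occurs with a code from another theme within the same paper. A minimal sketch, assuming a paper can carry one framework code and several affordance codes; the papers and codes below are invented for illustration, not drawn from the corpus:

```python
from collections import Counter

# Hypothetical coded papers: one framework code, several affordance codes.
papers = [
    {"framework": "SRL", "affordances": ["comparison", "awareness"]},
    {"framework": "SRL", "affordances": ["monitoring", "goal setting"]},
    {"framework": "none", "affordances": ["comparison", "prediction"]},
]

def cross_map(papers):
    """Count (framework, affordance) co-occurrences per paper --
    each resulting count is one cell of the heatmap."""
    cells = Counter()
    for paper in papers:
        for affordance in paper["affordances"]:
            cells[(paper["framework"], affordance)] += 1
    return cells

heat = cross_map(papers)
print(heat[("SRL", "comparison")])  # 1
```

The same counting routine serves any pair of themes (e.g. data sources over analytical levels) by swapping which coded fields are crossed, which is how all 20 cross-mappings can be generated from one corpus table.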
4.1 Informing frameworks
When mapping theme 3 (Affordances) over publications per year, it became clear that affordances have moved beyond comparison, awareness, and monitoring. At the same time, theme 2 (Informing Frameworks) showed that LADs are increasingly being informed by theory-oriented frameworks. It is therefore deemed relevant to explore whether the change in affordances is connected to the change in informing frameworks. In an attempt to answer this, informing frameworks can be mapped over affordances (Table 10).
Table 10 shows that predictive affordances mostly arise from dashboards not informed by theory-oriented frameworks. The heatmap then also shows that theory-oriented frameworks inform reflection, goal setting, and recommended next actions. SRL is the dominant theory as an overall approach, but when the other theory-oriented frameworks are clustered, they seem to have slightly different affordances than the SRL-informed dashboard. A similar trajectory appears when mapping out informing frameworks over analytical levels (Table 11).
Table 11 supports the notion that prediction is mostly not theory-driven; it does, however, also show that while SRL is the dominant theory-oriented framework, it is mostly applied in conjunction with descriptive analytics. The use of other theory-oriented frameworks then seems to be what is paving the way for diagnostic and prescriptive analytics.
4.2 Data sources
As seen in theme 4 (Data Sources), the inputs of LADs are relatively stable over time, with most of the data coming from LMS activity logs, and most of the external tools also just integrating activity-related measures. Here, it is deemed relevant to map out data sources over frameworks, and analytical levels (Table 12).
Table 12 shows that the dashboards informed by SRL or no framework mostly use LMS data, while most of the dashboards informed by other theory-oriented frameworks use external tools. It is also interesting to note that the inclusion of students’ final grades is mostly found in combination with SRL. Another relevant cross mapping appears between data sources and analytical levels (Table 13).
Table 13 shows that while dashboards using LMS data are dominantly descriptive, they are also the primary utilisers of predictive analytics. Here, the dashboards using external tools seem to be more varied, with an increase in diagnostic analytics compared to the LMS category.
5 Discussion
Based on the insights derived from the mapping of themes over publications per year, and the presented cross-mappings, we will now discuss the implications of these results for moving student-facing dashboards towards student-focused dashboards. After this, we will present our design recommendations for student-focused dashboard design in HE.
5.1 Implications
5.1.1 LADs are increasingly becoming about learning
The results from this review show an increase in theory-oriented frameworks informing the design of student-facing LADs in HE. This challenges the notion that learning is missing from analytics (Guzmán-Valenzuela et al., 2021; Jivet et al., 2017). The results from this review do not discredit this claim, as new dashboards are still emerging which are not informed by theory-oriented frameworks. The results do, however, imply that dashboards informed by theory-oriented frameworks are emerging more rapidly than those that are not. A core implication of this trajectory is that dashboards informed by pedagogy can provide students with relevant insights and actions for their learning through relevant affordances and analytical levels.
5.1.2 Theory-oriented frameworks are pushing for student-focused affordances
The increase in theory-oriented frameworks also manifests itself in the affordances of the dashboards, with recent affordances appearing, such as recommendation and reflection, supporting students more directly, and not just making them aware of their activity and the activity of their peers. However, with most of the work still only focusing on comparison, awareness, and monitoring, there remains a gap in the use of LADs to not only make students aware of their learning, but also support learning – moving from description of practice to providing actionable insights – a point also concluded by Susnjak et al. (2022), who, in line with this review, show that most learner-facing dashboards only employ descriptive measures. This potentially hinders the ability for students to move beyond monitoring and into meaningful action, a point also concluded by Liu and Nesbit (2020).
5.1.3 Theory-oriented frameworks need to be reflected in dashboard design
The clustering of the theory-oriented frameworks shows that while other theory-oriented frameworks emerged first, SRL seems to have taken over as the primary informing theory-oriented framework for student-facing dashboard design. SRL being the primary theory-oriented framework is in line with the literature (Jivet et al., 2017). The cross mappings performed around the theory-oriented frameworks, however, raise the question of whether SRL is the appropriate theory-oriented framework for supporting students’ learning, at least in the way most dashboards currently apply it. Cross mapping frameworks with affordances (Table 10) and analytical levels (Table 11) shows that the use of other theory-oriented frameworks (not SRL) seems to be what is moving dashboards towards supporting student learning, e.g., through the affordance of recommendation (Afzaal et al., 2021; Bodily et al., 2018; Bodily, 2018; De Quincey et al., 2019; Sansom et al., 2020; Susnjak et al., 2022), and the use of diagnostic analytics (Bodily et al., 2018; Bodily, 2018; Broos et al., 2017, 2018, 2020; De Quincey et al., 2019; Sansom et al., 2020), allowing students to identify what has gone wrong and take relevant action. This is not to invalidate self-regulated learning as a theory-oriented framework, as the SRL phases (Zimmerman, 2002) provide many student-supporting affordances that are in line with this trajectory, while also resulting in affordances such as goal setting, which is also present in dashboards coded as informed by SRL (Tzi-Dong Ng et al., 2022; Winstone, 2019). These results then raise a need for dashboard designers to ensure that the pedagogical concepts embedded in the informing frameworks are also afforded by the final dashboard, the chosen data sources, and the applied analytics.
It is also interesting to note that only 5/39 studies include a design framework. This topic has recently received attention, with some authors calling for more user involvement in the design of learning analytics (Sarmiento & Wise, 2022). There is thus a need to explore how theory-oriented and design-oriented frameworks may complement each other in the design of student-focused dashboards in order to ensure relevance and effectiveness.
5.1.4 Moving beyond LMS-data and descriptive analytics
Another core implication of the results derived from this review is the link between data sources and analytical levels. Throughout the LA field there have been continuous calls for broader and more complex data integration (Samuelsen et al., 2019). The results of this review imply that the data sources feeding into the dashboards have remained mostly stable, with the types of analytics applied to these inputs changing over time.
LMS data is the most common data source feeding into the dashboards (Table 8). In the early days of the LA field this was often the go-to approach, as the goal was to predict student success based on course activity (Hellas et al., 2018). This, however, seems to be changing when the analytics are for the students instead of about the students, creating an increased focus on the process rather than the results. 15/39 studies include external data sources; these studies are primarily informed by theory-oriented frameworks (Table 12) and are the primary drivers behind diagnostic and prescriptive analytics (Table 13). It is interesting to note that the SRL-informed dashboards mostly use LMS data presented through descriptive analytics, while most dashboards informed by other theory-oriented frameworks use external tools aimed at diagnostic and prescriptive analytics. This could indicate that a new standard for dashboard design is emerging. It contrasts, however, with the trajectory outlined in crossing frameworks with affordances and analytical levels, where dashboards with multiple publications tend to build on other theory-oriented frameworks applied through diagnostic and prescriptive analytics, albeit with descriptive analytics still present. There then seems to be a divide between new entries to the field that are based on SRL but apply only descriptive analytics, and repeated entries informed by a broader set of theories and realised through more student-oriented analytics.
Our results imply that external data sources are needed in order to support diagnostic and prescriptive analytics, the types of analytics that we argue are needed to support students’ learning through affordances such as feedback, reflection, and recommendation. Dashboards limited to LMS data are by their nature restricted in what they can present to students and in the degree to which they can understand and support students’ learning processes.
5.1.5 The role of predictive analytics
The outlined discussion of descriptive vs. diagnostic analytics is in misalignment with the literature, which primarily focuses on the distinction between description and prediction rather than on diagnostic or prescriptive analytics (Valle et al., 2021c). When cross mapping informing frameworks and analytical levels (Table 11), the dashboards not informed by theory-oriented frameworks seem to be pushing towards predictive analytics, while dashboards informed by theory-oriented frameworks are pushing towards diagnostic and prescriptive analytics. While 4/10 of the predictive dashboards are informed by a theory-oriented framework, the remaining six are informed by a design framework or no framework at all. It is then vital that predictive analytics are grounded in relevant theory-oriented frameworks in order to ensure that a divide does not emerge between analytics-driven prediction dashboards and theory-oriented dashboards.
5.2 Recommendations for student-focused dashboard design
Based on our results, we outline a series of recommendations informing the design of future dashboards aimed at students in HE. These suggestions tie into previous work by Jivet et al. (2018) and Bodily and Verbert (2017). Our recommendations translate the different mappings into core suggestions, strengthening the link between learning sciences and analytics that Ferguson (2012) identified as one of the core challenges for LA.
- Dashboards should build upon existing literature in order to address the identified surge of recent papers primarily applying SRL with descriptive analytics, which our results have put into question.
- Theory-oriented frameworks should be applied to ground affordances, data sources, and analytical levels in pedagogical concepts relevant to the learning activity/environment.
- Affordances should be in alignment with the chosen framework – for instance, if SRL is selected, the affordances should support the different SRL phases/concepts.
- Affordances should go beyond comparison, awareness, and monitoring – they should encompass tools that facilitate reflection and action, such as feedback, recommendation, and planning.
- Relevant data sources should be identified to provide the necessary measures for the affordances derived from the chosen frameworks – this allows data measures to be linked to learning constructs.
- Dashboards should go beyond descriptive analytics – our findings suggest the need for diagnostic analytics to support reflection, and prescriptive analytics to support action.
- Predictive analytics should be incorporated in accordance with a theory-oriented framework, rather than being based solely on a technical justification.
While learning analytics dashboards still seem to be in an exploratory state – as suggested by the technological maturity of the dashboards spreading rather than maturing – we believe that these recommendations can pave the way for supporting the emerging trajectory towards student-focused dashboard design in HE.
6 Conclusion
This review has highlighted the current themes and emerging trajectories in the design and implementation of student-facing learning analytics dashboards in higher education. The results show an emerging trajectory towards directly supporting students’ learning through dashboards that incorporate multiple data sources and are rooted in diverse theory-oriented frameworks. This trajectory has demonstrated the importance of a pedagogical approach to the design of student-facing learning analytics dashboards in higher education, as well as the need for the integration of multiple data sources and analytical levels to provide a deeper understanding and better facilitation of students’ learning processes. By attending to this trajectory, student-focused learning analytics dashboards have the potential to transform the way that students engage with digitally supported learning in higher education.
Data availability
The datasets used and/or analysed during the current study are available from the corresponding author on reasonable request.
References
Afzaal, M., Nouri, J., Zia, A., Papapetrou, P., Fors, U., Wu, Y., Li, X., & Weegar, R. (2021). Explainable AI for data-driven feedback and intelligent action recommendations to support students self-regulation. Frontiers in Artificial Intelligence, 4, 723447. https://doi.org/10.3389/frai.2021.723447
Aljohani, N. R., & Davis, H. C. (2013). Learning analytics and formative assessment to provide immediate detailed feedback using a student centered mobile dashboard. International Conference on Next Generation Mobile Applications, Services, and Technologies, 262–267. https://doi.org/10.1109/NGMAST.2013.54
Aljohani, N. R., Daud, A., Abbasi, R. A., Alowibdi, J. S., Basheri, M., & Aslam, M. A. (2019). An integrated framework for course adapted student learning analytics dashboard. Computers in Human Behavior, 92, 679–690. https://doi.org/10.1016/j.chb.2018.03.035
Azmi Murad, M. A., Shah Jahan, A. F., Mohd Sharef, N., Ab Jalil, H., Ismail, I. A., & Mohd Noor, M. Z. (2022). An Analytics Dashboard for Personalised E-learning: A Preliminary Study (Vol. 835, p. 866). Springer Science and Business Media Deutschland GmbH. https://doi.org/10.1007/978-981-16-8515-6_65
Belur, J., Tompson, L., Thornton, A., & Simon, M. (2021). Interrater reliability in systematic review methodology: Exploring variation in coder decision-making. Sociological Methods & Research, 50(2), 837–865. https://doi.org/10.1177/0049124118799372
Bodily, R. G. (2018). Designing, Developing, and Implementing Real-time Learning Analytics Student Dashboards [Ph.D., Brigham Young University]. In ProQuest Dissertations and Theses (2057208363). Education Database; ProQuest Dissertations & Theses Global; Social Science Premium Collection.
Bodily, R., Ikahihifo, T. K., Mackley, B., & Graham, C. R. (2018). The design, development, and implementation of student-facing learning analytics dashboards. Journal of Computing in Higher Education, 30(3), 572–598. https://doi.org/10.1007/s12528-018-9186-0
Bodily, R., & Verbert, K. (2017). Review of research on student-facing learning analytics dashboards and educational recommender systems. IEEE Transactions on Learning Technologies, 10(4), 405–418. https://doi.org/10.1109/TLT.2017.2740172
Broos, T., Peeters, L., Verbert, K., Van Soom, C., Langie, G., & De Laet, T. (2017). Dashboard for actionable feedback on learning skills: Scalability and usefulness. LNCS (Vol. 10296, p. 241). Springer Verlag. https://doi.org/10.1007/978-3-319-58515-4_18
Broos, T., Pinxten, M., Delporte, M., Verbert, K., & De Laet, T. (2020). Learning dashboards at scale: Early warning and overall first year experience. Assessment and Evaluation in Higher Education, 45(6), 855–874. https://doi.org/10.1080/02602938.2019.1689546
Broos, T., Verbert, K., Langie, G., Van Soom, C., & De Laet, T. (2018). Multi-Institutional positioning test feedback dashboard for aspiring students lessons learnt from a case study in flanders. ACM International Conference Proceeding Series, 51–55. https://doi.org/10.1145/3170358.3170419
Brouwer, N., Bredeweg, B., Latour, S., Berg, A., & van der Huizen, G. (2016). Learning analytics pilot with coach2—Searching for effective mirroring: Vol. 9891 LNCS (p. 369). Springer Verlag. https://doi.org/10.1007/978-3-319-45153-4_28
Chen, L., Lu, M., Goda, Y., Shimada, A., & Yamada, M. (2020). Factors of the use of learning analytics dashboard that affect metacognition. 17th International Conference on Cognition and Exploratory Learning in Digital Age, CELDA 2020, 295–302.
Corrin, L., & De Barba, P. (2014). Exploring students’ interpretation of feedback delivered through learning analytics dashboards. Proceedings of ASCILITE 2014 - Annual Conference of the Australian Society for Computers in Tertiary Education, 629–633. https://www.ascilite.org/conferences/dunedin2014/files/concisepapers/223-Corrin.pdf. Accessed 08 Nov 2022.
Corrin, L., & De Barba, P. (2015). How do students interpret feedback delivered via dashboards? ACM International Conference Proceeding Series, 16-20-March-2015, 430–431. https://doi.org/10.1145/2723576.2723662
De Quincey, E., Kyriacou, T., Briggs, C., & Waller, R. (2019). Student centred design of a learning analytics system. ACM International Conference Proceeding Series, 353–362. https://doi.org/10.1145/3303772.3303793
Duan, X., Wang, C., & Rouamba, G. (2022). Designing a learning analytics dashboard to provide students with actionable feedback and evaluating its impacts. International Conference on Computer Supported Education, CSEDU - Proceedings, 2, 117–127. https://doi.org/10.5220/0011116400003182
Fawns, T. (2019). Postdigital education in design and practice. Postdigital Science and Education, 1(1), 132–145. https://doi.org/10.1007/s42438-018-0021-8
Ferguson, R. (2012). Learning analytics: Drivers, developments and challenges. International Journal of Technology Enhanced Learning, 4(5/6).
Fleur, D. S., van den Bos, W., & Bredeweg, B. (2020). Learning analytics dashboard for motivation and performance: Vol. 12149 LNCS (p. 419). Springer. https://doi.org/10.1007/978-3-030-49663-0_51
Gašević, D., Dawson, S., & Siemens, G. (2015). Let’s not forget: Learning analytics are about learning. TechTrends, 59(1), 64–71. https://doi.org/10.1007/s11528-014-0822-x
Gough, D., Oliver, S., & Thomas, J. (2017). An introduction to systematic reviews. SAGE.
Guzmán-Valenzuela, C., Gómez-González, C., Rojas-Murphy Tagle, A., & Lorca-Vyhmeister, A. (2021). Learning analytics in higher education: A preponderance of analytics but very little learning? International Journal of Educational Technology in Higher Education, 18(1), 23. https://doi.org/10.1186/s41239-021-00258-x
Hatala, M., Beheshitha, S. S., & Gaševic, D. (2016). Associations between students’ approaches to learning and learning analytics visualizations. CEUR Workshop Proceedings, 1596, 3–10.
Haynes, C. C. (2020). The Role of Self-Regulated Learning in the Design, Implementation, and Evaluation of Learning Analytics Dashboards. L@S 2020 - Proceedings of the 7th ACM Conference on Learning @ Scale, 297–300. https://doi.org/10.1145/3386527.3406732
Hellas, A., Ihantola, P., Petersen, A., Ajanovski, V. V., Gutica, M., Hynninen, T., Knutas, A., Leinonen, J., Messom, C., & Liao, S. N. (2018). Predicting academic performance: A systematic literature review. Proceedings Companion of the 23rd Annual ACM Conference on Innovation and Technology in Computer Science Education, 175–199. https://doi.org/10.1145/3293881.3295783
Hellings, J., & Haelermans, C. (2022). The effect of providing learning analytics on student behaviour and performance in programming: A randomised controlled experiment. Higher Education, 83(1), 1. https://doi.org/10.1007/s10734-020-00560-z
Hill, E. H., III. (2018). The effects of student activity dashboards on student participation, performance, and persistence [Ph.D., Nova Southeastern University]. In ProQuest Dissertations and Theses (2016833524). ProQuest Dissertations & Theses Global.
Ifenthaler, D., & Schumacher, C. (2016). Student perceptions of privacy principles for learning analytics. Educational Technology Research and Development, 64(5), 923–938. https://doi.org/10.1007/s11423-016-9477-y
Jayashanka, R., Hettiarachchi, E., & Hewagamage, K. P. (2022). Technology enhanced learning analytics dashboard in higher education. Electronic Journal of E-Learning, 20(2), 151–170. https://doi.org/10.34190/ejel.20.2.2189
Jivet, I., Scheffel, M., Drachsler, H., & Specht, M. (2017). Awareness is not enough: Pitfalls of learning analytics dashboards in the educational practice. In É. Lavoué, H. Drachsler, K. Verbert, J. Broisin, & M. Pérez-Sanagustín (Eds.), Data driven approaches in digital education (pp. 82–96). Springer International Publishing. https://doi.org/10.1007/978-3-319-66610-5_7
Jivet, I., Scheffel, M., Specht, M., & Drachsler, H. (2018). License to evaluate: Preparing learning analytics dashboards for educational practice. ACM International Conference Proceeding Series, 31–40. https://doi.org/10.1145/3170358.3170421
Khan, I., & Pardo, A. (2016). Data2U: Scalable real time student feedback in active learning environments. ACM International Conference Proceeding Series, 249–253. https://doi.org/10.1145/2883851.2883911
Kia, F. S., Teasley, S. D., Hatala, M., Karabenick, S. A., & Kay, M. (2020). How patterns of students dashboard use are related to their achievement and self-regulatory engagement. Proceedings of the Tenth International Conference on Learning Analytics & Knowledge, 340–349. https://doi.org/10.1145/3375462.3375472
Kilińska, D., & Ryberg, T. (2019). Connecting learning analytics and problem-based learning – potentials and challenges. Journal of Problem Based Learning in Higher Education, 7(1), 1. https://doi.org/10.5278/ojs.jpblhe.v7i1.2545
Liu, A. L., & Nesbit, J. C. (2020). Dashboards for computer-supported collaborative learning. In M. Virvou, E. Alepis, G. A. Tsihrintzis, & L. C. Jain (Eds.), Machine learning paradigms: Advances in learning analytics (pp. 157–182). Springer International Publishing. https://doi.org/10.1007/978-3-030-13743-4_9
Lu, M., Chen, L., Goda, Y., Shimada, A., & Yamada, M. (2020). Visualizing studying activities for a learning dashboard supporting meta-cognition for students: Vol. 12203 LNCS (p. 580). Springer. https://doi.org/10.1007/978-3-030-50344-4_41
Matcha, W., Uzir, N. A., Gasevic, D., & Pardo, A. (2020). A systematic review of empirical studies on learning analytics dashboards: A self-regulated learning perspective. IEEE Transactions on Learning Technologies, 13(2), 226–245. https://doi.org/10.1109/TLT.2019.2916802
Mejia, C., Florian, B., Vatrapu, R., Bull, S., Gomez, S., & Fabregat, R. (2017). A novel web-based approach for visualization and inspection of reading difficulties on university students. IEEE Transactions on Learning Technologies, 10(1), 53–67. https://doi.org/10.1109/TLT.2016.2626292
Page, M. J., McKenzie, J. E., Bossuyt, P. M., Boutron, I., Hoffmann, T. C., Mulrow, C. D., Shamseer, L., Tetzlaff, J. M., Akl, E. A., Brennan, S. E., Chou, R., Glanville, J., Grimshaw, J. M., Hróbjartsson, A., Lalu, M. M., Li, T., Loder, E. W., Mayo-Wilson, E., McDonald, S., & Moher, D. (2021). The PRISMA 2020 statement: An updated guideline for reporting systematic reviews. Systematic Reviews, 10(1), 89. https://doi.org/10.1186/s13643-021-01626-4
Ramaswami, G. S., Susnjak, T., & Mathrani, A. (2019). Capitalizing on learning analytics dashboard for maximizing student outcomes. 2019 IEEE Asia-Pacific Conference on Computer Science and Data Engineering, CSDE 2019. https://doi.org/10.1109/CSDE48274.2019.9162357
Roa Romero, Y., Tame, H., Holzhausen, Y., Petzold, M., Wyszynski, J. V., Peters, H., Alhassan-Altoaama, M., Domanska, M., & Dittmar, M. (2021). Design and usability testing of an in-house developed performance feedback tool for medical students. BMC Medical Education, 21(1), 354. https://doi.org/10.1186/s12909-021-02788-4
Ruiperez-Valiente, J. A., Gomez, M. J., Martinez, P. A., & Kim, Y. J. (2021). Ideating and developing a visualization dashboard to support teachers using educational games in the classroom. Ieee Access : Practical Innovations, Open Solutions, 9, 83467–83481. https://doi.org/10.1109/ACCESS.2021.3086703
Sahin, M., & Yurdugül, H. (2017). The framework of intervention engine based on learning analytics. 14th International Conference on Cognition and Exploratory Learning in the Digital Age, CELDA 2017, 255–258. https://files.eric.ed.gov/fulltext/ED579496.pdf. Accessed 08 Nov 2022.
Şahin, M., & Yurdugül, H. (2019). An intervention engine design and development based on learning analytics: The intelligent intervention system (In2S). Smart Learning Environments, 6(1), 1–18. https://doi.org/10.1186/s40561-019-0100-7
Samuelsen, J., Chen, W., & Wasson, B. (2019). Integrating multiple data sources for learning analytics—review of literature. Research and Practice in Technology Enhanced Learning, 14(1), 11. https://doi.org/10.1186/s41039-019-0105-4
Sansom, R. L., Bodily, R., Bates, C. O., & Leary, H. (2020). Increasing student use of a learner dashboard. Journal of Science Education and Technology, 29(3), 386–398. https://doi.org/10.1007/s10956-020-09824-w
Santos, J. L., Verbert, K., Govaerts, S., & Duval, E. (2013). Addressing learner issues with StepUp! An evaluation. ACM International Conference Proceeding Series, 14–22. https://doi.org/10.1145/2460296.2460301
Sarmiento, J. P., & Wise, A. F. (2022). Participatory and co-design of learning analytics: An initial review of the literature. LAK22: 12th International Learning Analytics and Knowledge Conference, 535–541. https://doi.org/10.1145/3506860.3506910
Schwendimann, B. A., Rodriguez-Triana, M. J., Vozniuk, A., Prieto, L. P., Boroujeni, M. S., Holzer, A., Gillet, D., & Dillenbourg, P. (2017). Perceiving learning at a glance: A systematic literature review of learning dashboard research. IEEE Transactions on Learning Technologies, 10(1), 30–41. https://doi.org/10.1109/TLT.2016.2599522
Sedrakyan, G., Leony, D., Muñoz-Merino, P. J., Kloos, C. D., & Verbert, K. (2017). Evaluating student-facing learning dashboards of affective states. LNCS (Vol. 10474, p. 237). Springer Verlag. https://doi.org/10.1007/978-3-319-66610-5_17
Siemens, G. (2013). Learning analytics: The emergence of a discipline. American Behavioral Scientist, 57(10), 1380–1400. https://doi.org/10.1177/0002764213498851
Susnjak, T., Ramaswami, G. S., & Mathrani, A. (2022). Learning analytics dashboard: A tool for providing actionable insights to learners. International Journal of Educational Technology in Higher Education, 19(1), 12. https://doi.org/10.1186/s41239-021-00313-7
Suthers, D., & Verbert, K. (2013). Learning analytics as a ‘middle space’. Proceedings of the Third International Conference on Learning Analytics and Knowledge, 1–4. https://doi.org/10.1145/2460296.2460298
Taniguchi, Y., Owatari, T., Minematsu, T., Okubo, F., & Shimada, A. (2022). Live sharing of learning activities on E-Books for enhanced learning in online classes. Sustainability (Switzerland), 14(12). https://doi.org/10.3390/su14126946
Teasley, S. D. (2017). Student facing dashboards: One size fits all? Technology Knowledge and Learning, 22(3), 377–384. https://doi.org/10.1007/s10758-017-9314-3
Tzi-Dong Ng, J., Wang, Z., & Hu, X. (2022). Needs analysis and prototype evaluation of student-facing la dashboard for virtual reality content creation. ACM International Conference Proceeding Series, 444–450. https://doi.org/10.1145/3506860.3506880
Tzinis, I. (2015). Technology Readiness Level. NASA; Brian Dunbar. http://www.nasa.gov/directorates/heo/scan/engineering/technology/technology_readiness_level. Accessed 05 May 2023.
Ulfa, S., Fattawi, I., Surahman, E., & Yusuke, H. (2019). Investigating learners’ perception of learning analytics dashboard to improve learning interaction in online learning system. 2019 5th International Conference on Education and Technology, ICET 2019, 49–54. https://doi.org/10.1109/ICET48172.2019.8987229
Ullmann, T. D., De Liddo, A., & Bachler, M. (2019). A visualisation dashboard for contested collective intelligence. Learning analytics to improve sensemaking of group discussion. Revista Iberoamericana de Educación a Distancia, 22(1), 41–80. https://doi.org/10.5944/ried.22.1.22294
Valle, N., Antonenko, P., Dawson, K., & Huggins-Manley, A. C. (2021a). Staying on target: A systematic literature review on learner-facing learning analytics dashboards. British Journal of Educational Technology, 52(4), 1724–1748. https://doi.org/10.1111/bjet.13089
Valle, N., Antonenko, P., Valle, D., Dawson, K., Huggins-Manley, A. C., & Baiser, B. (2021b). The influence of task-value scaffolding in a predictive learning analytics dashboard on learners’ statistics anxiety, motivation, and performance. Computers and Education, 173, 104288. https://doi.org/10.1016/j.compedu.2021.104288
Valle, N., Antonenko, P., Valle, D., Sommer, M., Huggins-Manley, A. C., Dawson, K., Kim, D., & Baiser, B. (2021c). Predict or describe? How learning analytics dashboard design influences motivation and statistics anxiety in an online statistics course. Educational Technology Research and Development, 69(3), 1405–1431. https://doi.org/10.1007/s11423-021-09998-z
Viberg, O., Hatakka, M., Bälter, O., & Mavroudi, A. (2018). The current landscape of learning analytics in higher education. Computers in Human Behavior, 89, 98–110. https://doi.org/10.1016/j.chb.2018.07.027
Villalobos, E., Pérez-Sanagustin, M., Sanza, C., Tricot, A., & Broisin, J. (2022). Supporting self-regulated learning in BL: Exploring learners’ tactics and strategies: Vol. 13450 LNCS (p. 420). Springer Science and Business Media Deutschland GmbH. https://doi.org/10.1007/978-3-031-16290-9_30
Winstone, N. (2019). Facilitating students’ use of feedback: Capturing and tracking impact using digital tools. In The Impact of Feedback in Higher Education: Improving Assessment Outcomes for Learners (pp. 225–242). Palgrave Macmillan. https://doi.org/10.1007/978-3-030-25112-3_13
Wise, A. F., Knight, S., & Shum, S. B. (2021). Collaborative learning analytics. In U. Cress, C. Rosé, A. F. Wise, & J. Oshima (Eds.), International handbook of computer-supported collaborative learning (pp. 425–443). Springer International Publishing. https://doi.org/10.1007/978-3-030-65291-3_23
You, J. W. (2016). Identifying significant indicators using LMS data to predict course achievement in online learning. The Internet and Higher Education, 29, 23–30. https://doi.org/10.1016/j.iheduc.2015.11.003
Zimmerman, B. (2002). Becoming a self-regulated learner: An overview. Theory Into Practice, 41, 64–70. https://doi.org/10.1207/s15430421tip4102_2
Funding
Open access funding provided by Aalborg University. No funding was received for conducting this study.
Contributions
LP conducted the literature review and analysis. EDL participated in the design of the study. LP prepared the manuscript under supervision of EDL. Both authors read and approved the final manuscript.
Ethics declarations
Competing interests
The authors have no relevant financial or non-financial interests to disclose.
Paulsen, L., Lindsay, E. Learning analytics dashboards are increasingly becoming about learning and not just analytics - A systematic review. Educ Inf Technol 29, 14279–14308 (2024). https://doi.org/10.1007/s10639-023-12401-4