Communication

Using Motivation Theory to Design Equity-Focused Learning Analytics Dashboards

by
Stephen J. Aguilar
Rossier School of Education, University of Southern California, Los Angeles, CA 90089, USA
Trends High. Educ. 2023, 2(2), 283-290; https://doi.org/10.3390/higheredu2020015
Submission received: 17 February 2023 / Revised: 14 March 2023 / Accepted: 24 March 2023 / Published: 29 March 2023

Abstract

Learning analytics applications, and their associated dashboards, are frequently used in post-secondary settings; yet, there has been limited work exploring the motivational implications of their deployment, especially for underserved student populations that are more susceptible to (perceived) negative messages about their academic performance. In this paper, I argue that Situated Expectancy-Value Theory (SEVT) is well positioned to serve as a useful lens when developing and evaluating learning analytics dashboard designs and their future development. Used in this way, SEVT can help the learning analytics community ensure that student experiences with learning analytics are adaptively motivating, both in general and for underserved student populations more specifically.

1. Introduction

Close to ten years ago, the 2014 EDUCAUSE Horizon Report stated that learning analytics would provide “ways...to improve student engagement and provide a high-quality, personalized experience for learners” [1]. In the years since, this objective has been pursued through learning analytics applications that have reached far into post-secondary institutions and are approaching ubiquity within them. These applications have ranged from support for student information-seeking behaviors, with a focus on supporting how students learn [2], to large-scale implementations designed to support students seen as being “at-risk” with respect to academic challenges [3]. Meanwhile, there have been calls for learning analytics to focus on “information justice based on an ethics of justice and care” [4], i.e., to ensure that learning analytics applications are designed and implemented in a manner that respects student privacy and supports student agency. Despite the near-decade of learning analytics as a field, a core dimension of the “personalization” argument (i.e., that learning analytics can use student data to provide individualized learning experiences while helping to understand the within-student variance of those experiences) remains largely unchanged: personalization as a means of supporting student learning remains central to the field.
Many contemporary applications of learning analytics research have enacted this core affordance of learning analytics by focusing on helping those students who are most at risk of facing academic challenges [5]. Considering that the concept of “at-risk” students suggests the potential for intervention to prevent undesirable outcomes (e.g., academic failure), there have been calls to understand how academic information is communicated through learning analytics dashboards (LADs). Indeed, the role that LADs play with respect to fostering, limiting, or otherwise affecting students’ academic motivation and self-regulated learning is a growing area of study (e.g., [6,7,8,9,10,11,12]). Yet, while the field has indeed moved towards personalizing the experience of students (e.g., [13,14,15]), predicting which students might need extra support (e.g., [16,17]), and communicating those insights via LADs, recent work has shown that learning analytics has a blind spot when it comes to issues of diversity and equity [18,19,20]. In this paper, I argue that motivation theory (i.e., Situated Expectancy-Value Theory [21]) can be a bridge that links the field of learning analytics, with respect to LAD design, to issues of educational equity. Specifically, I hold that if the role of learning analytics is to support student success through the lever of engagement, then it is important for insights communicated through LADs to be grounded in motivation theory in order to ensure a better understanding of how interventions academically motivate, demotivate, or differently motivate students.

2. Expectancy-Value Theory

Students’ academic motivation has been linked to how they engage with their courses [22]; if, when, and how they seek help [23]; whether or not they choose to cheat [24]; and whether they choose to persist despite struggles [25]. Thus, if LADs have the potential to influence student motivation, and student motivation plays a key role in learning, then the motivational implications of LADs warrant examination. While there has been work in this regard within the learning analytics community (e.g., [5,12,26]), that work has largely focused on the role played by Achievement Goal Theory [27,28,29] in how students respond to different LAD designs. Adding additional theoretical approaches, then, is a critical step toward determining how students make sense of the information they receive via LADs. Otherwise, the field risks using an overly narrow conception of motivation when studying students’ sense-making practices and whether those practices result in adaptive or maladaptive forms of academic motivation.
I extend the discussion of how motivation theory can expand equity-focused discussions of LAD design by using Situated Expectancy-Value Theory (SEVT) as a theoretical frame. As its name implies, SEVT [21,30,31,32,33] seeks to explain student motivation by answering two questions about courses of action: “Can I?” and “Do I want to?” The former is conceptualized as “expectancies,” and the latter is captured by the multidimensional construct of “value”, in turn composed of attainment (i.e., importance), intrinsic, utility, and cost components. Each of these components, moreover, is situated in a particular context that, by necessity, attends to the individual dimensions of learners. Equity plays a central role in this, as SEVT accounts for individual motivational differences that might otherwise be collapsed into larger aggregate constructs. I examine each of the components of SEVT below, and discuss how understanding each construct can be relevant to the design of learning analytics dashboards with equity in mind.

2.1. Expectancy

For students, expectancies serve as de facto self-evaluations of the likelihood of success or failure when a task is attempted. Students can have a range of expectancies for both general domains (e.g., math) and specific tasks (e.g., a math exam). Importantly, expectancies have a temporal dimension that can be proximal (i.e., a task requiring immediate or near-immediate attention) or distal (i.e., a task to be completed at a time in the future). A student, for example, could pose an expectancy question that takes the form of “What is the likelihood that I will successfully accomplish X?”, where X is an academic task that the student is considering undertaking. To help a student answer this question, a given LAD can use known data about the student, e.g., past performance on similar tasks, as well as information from empirically matched peers who have successfully accomplished the same task in the past. The inclusion of both peer information and information that might be gathered through psychometrically validated surveys can enable LADs to attend to individual differences [5] while incorporating information to ensure that students are not treated in a way that is “color-blind” or otherwise ignores their identities [18].
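To make the expectancy question concrete, the following minimal sketch (in Python) illustrates one way a dashboard back end might combine a student’s own history on similar tasks with outcomes from empirically matched peers to produce an expectancy-relevant estimate. The function name, threshold, and blending rule are hypothetical assumptions for illustration, not features of any existing LAD.

```python
# Illustrative sketch only: all names and thresholds are assumptions.
from statistics import mean

PASS_THRESHOLD = 0.70  # assumed cutoff for "success" on a task, on a 0-1 scale


def estimate_expectancy(own_scores: list[float], peer_outcomes: list[bool],
                        min_peers: int = 20) -> dict:
    """Blend a student's own history with matched-peer outcomes.

    Returns a dict rather than a bare number so the dashboard can also report
    how much evidence backs the estimate, supporting cautious display choices.
    """
    own_rate = (
        mean(1.0 if s >= PASS_THRESHOLD else 0.0 for s in own_scores)
        if own_scores else None
    )
    peer_rate = mean(peer_outcomes) if len(peer_outcomes) >= min_peers else None

    # Simple average of whichever signals are available; a real system would
    # weight these and attend to uncertainty.
    signals = [r for r in (own_rate, peer_rate) if r is not None]
    return {
        "estimated_success_likelihood": mean(signals) if signals else None,
        "based_on_own_history": own_rate is not None,
        "based_on_peers": peer_rate is not None,
    }


if __name__ == "__main__":
    print(estimate_expectancy([0.8, 0.65, 0.9], [True] * 15 + [False] * 10))
```

Returning the amount of supporting evidence alongside the estimate leaves room for the cautious, identity-aware display choices discussed above.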
Being able to use data about a student’s performance and that of their contemporary or historical peers can enable learning analytics applications to provide feedback to the student in a manner that is both personal and data-driven. In this way, a student’s expectancies can be grounded in tangible information and historical trends. How this information is displayed and what kind of information is displayed are both pragmatic and ethical concerns, as institutions need to walk the line between heavy-handed paternalism and providing students with information that could encourage positive outcomes [34]. One such concern involves providing students with access to information about their peers. While this information can be aggregated and anonymized to address privacy concerns, there may be effects on student motivation [26]. Such effects should be theorized prior to deploying LADs, as well as empirically examined after the fact to validate a priori hypotheses. The results of such investigations can enable iterative (re)design processes that encourage adaptive motivation among the students they serve.
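As a hedged illustration of the aggregation and anonymization concern raised above, the sketch below withholds any peer comparison when the peer group falls under an assumed privacy floor; the threshold, field names, and fallback behavior are assumptions rather than established practice.

```python
# Illustrative sketch only: threshold and names are assumptions.
from statistics import mean, median
from typing import Optional

MIN_GROUP_SIZE = 15  # assumed privacy floor; below this, no peer statistics are shown


def peer_comparison(student_score: float, peer_scores: list[float]) -> Optional[dict]:
    """Return an aggregated, anonymized peer comparison, or None when the peer
    group is too small to display without risking identification."""
    if len(peer_scores) < MIN_GROUP_SIZE:
        return None  # the dashboard can fall back to self-referenced feedback
    return {
        "student_score": student_score,
        "peer_median": median(peer_scores),
        "peer_mean": round(mean(peer_scores), 2),
        "group_size": len(peer_scores),
    }
```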
Similarly, it is important not to overgeneralize historical data to a particular student or population of students. The promise of learning analytics is to personalize learning experiences and suggest course-corrections during moments of struggle; the possibility of boxing students in should be avoided. The danger of learning analytics doing this is insidious; being data-driven does not necessarily mean that the available data driving learning insights represent all of the data on a student or group of students. If this is assumed, subsequent inferences might become reified in designs or practice. Such concerns have been raised with regard to potential racism being inherent in collected data [35], resulting in its subsequent reification [36]. Instead, expectancies should be tied to domain-specific standards in order to promote equitable outcomes [37].

2.2. Value

2.2.1. Attainment Value

Attainment value is defined as the “personal importance of doing well on a task” [31]. Students learning algebra, for example, might assign a high attainment value to their performance on an upcoming algebra exam because they have developed a strong affinity towards math, such that a dimension of their identity is being “a good math student”. A student who self-identifies as “a good writer” may not assign nearly as much attainment value to the same algebra exam. Attainment values, then, represent what is important and salient to a student during demonstrations of competence via various academic tasks. Questions such as “What does it say of me if X is accomplished, and do I care?” speak to a student’s identity in relation to a specific task, and are well suited to informing the design of LADs. By definition, LADs are designed to use information that is unique to each student; thus, the individuality of a student is already foregrounded, and is potentially made more powerful because feedback is informed by that student’s own data. Course-taking behaviors, for example, can be used to introduce students to other courses of interest (e.g., [38]). While imperfect, such feedback mechanisms have the potential to expose students to courses and resources that are better aligned with their particular attainment values.
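As a toy illustration of how course-taking behaviors might surface courses aligned with a student’s interests (this is not the recommender described in [38]), the sketch below counts courses taken by peers whose histories overlap with the student’s own; all names and data shapes are hypothetical.

```python
# Illustrative sketch only: a naive co-enrollment suggestion, not [38].
from collections import Counter


def suggest_courses(student_courses: set[str],
                    peer_histories: list[set[str]],
                    top_n: int = 3) -> list[str]:
    """Count courses taken by peers who share at least one course with the
    student, excluding courses the student has already taken."""
    counts: Counter[str] = Counter()
    for history in peer_histories:
        if history & student_courses:          # peer overlaps with this student
            counts.update(history - student_courses)
    return [course for course, _ in counts.most_common(top_n)]


if __name__ == "__main__":
    me = {"WRIT-150", "HIST-200"}
    peers = [{"WRIT-150", "PHIL-101", "HIST-200"},
             {"HIST-200", "PHIL-101", "ART-110"},
             {"MATH-125", "PHYS-151"}]
    print(suggest_courses(me, peers))  # ['PHIL-101', 'ART-110'] in this toy data
```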
A potential objection to this approach is the notion that an interest-driven approach may lead students to where they would prefer to go (i.e., it is motivating); however, where a student wishes to go may actually be sub-optimal. While such concerns are valid, it is important to note that a core component of equity is not simply equity with regard to positive outcomes; there should be equity with regard to the processes (e.g., choices) that lead to good outcomes as well. A compromise is to ensure that optimal paths are made clear, rather than encouraging de facto tracking. This middle ground enables insights to not fall into the realm of a paternalism that denies students their agency through coercion.

2.2.2. Intrinsic Value

If attainment value is anchored on conceptions of self, then intrinsic value can be understood as a focus on internal processes. Specifically, a given task might evoke feelings of joy [33] or flow [39] as it is being performed. A student may simply be interested in the prospect of accomplishing a novel task, such as learning another language. Because intrinsic value is largely focused on internal perceptions of a task, the construct has been shown to be predictive of students choosing to master tasks rather than simply choosing easy ones (e.g., [40]). This is supported by other motivation theories, as mastery goals are generally characterized as being intrinsic [41].
To carry the above example forward, a student who assigns low attainment value to doing well on a math exam because they see themselves as a good writer and not as a student who is good at math may nonetheless assign high intrinsic value to the math exam because they simply enjoy algebra. This enjoyment need not be tied to notions of identity, and is usually temporally proximal to the student (i.e., what a student is working on now or will be working on soon). This is not always the case, however. Research has shown that female students may attach higher intrinsic value to obtaining higher education due to a “personal wellness” orientation, which is a move away from a more cautious orientation driven by avoiding the negative consequences associated with the pursuit of an advanced degree (such as limiting family planning decisions) [33,42]. Such students, then, attach intrinsic value to goals that remain somewhat distant. Understood in this way, intrinsic value is both complex in origin and dynamic in practice, and is conditioned on the various possibilities and inclinations a student may possess or wish to possess.
Intrinsic value prompts questions such as “Will I enjoy or feel engaged with the process that leads up to successfully accomplishing X?” This is another place in which LADs can inhabit a mediating role with respect to the antecedent questions a student may pose. This is especially true when considering all of the possible configurations in which information can be presented to students, as well as which data are drawn on to generate a given representation. A graph showing a particular student doing poorly on an academic task in relation to their peers, for example, might have the unintended consequence of diminishing the intrinsic value that the student may have felt while engaging with the task [7,12].
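One hedged way to operationalize this concern is a display policy that defaults to self-referenced (mastery-oriented) framing and surfaces comparative views only when a student opts in and the peer group is large enough to anonymize; the sketch below is an illustrative assumption, not a validated design.

```python
# Illustrative sketch only: the rule and labels are assumptions.
from enum import Enum


class Framing(Enum):
    SELF_REFERENCED = "progress relative to the student's own past work"
    COMPARATIVE = "progress relative to an anonymized peer distribution"


def choose_framing(student_opted_into_comparisons: bool,
                   peer_group_large_enough: bool) -> Framing:
    # Default to mastery-oriented framing; show comparisons only by consent
    # and when the peer group can be safely aggregated.
    if student_opted_into_comparisons and peer_group_large_enough:
        return Framing.COMPARATIVE
    return Framing.SELF_REFERENCED
```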

2.2.3. Utility Value

Utility value relates to a student’s current and future goals. The above student may actually dislike algebra, for example; however, if they assign high utility value to learning it, then they will nonetheless be motivated to pass the algebra exam because of the future purpose it serves. Consequently, utility value can depend either on current goals (such as passing algebra during a current term in order to take calculus the next) or distant goals (such as needing to pass algebra because it falls in a sequence of courses that ultimately leads to getting into medical school) [33]. Whether or not these goals can be characterized as intrinsic or extrinsic, however, is a matter of interpretation; for instance, students’ goals may be internalized and intertwined with their aspirational identities (intrinsic), or may be anchored to the wishes of their parents (extrinsic).
In addition to shaping how students approach an academic task, utility values have been shown to predict farther-reaching decisions, such as the types of courses students choose to enroll in [43]. It is unsurprising, then, that utility values are the focus of intervention designs such as “relevance” interventions. Such interventions explicitly ask students to connect a given task to their own lives through a writing exercise in order to increase perceived utility value; these interventions have proved successful in both laboratory and classroom settings [44]. Interestingly, attainment value and intrinsic value as measured in reference to students accomplishing distant tasks (such as an undergraduate student’s goal of attending graduate school) have been shown to be more difficult constructs to separate from one another empirically [42].
Utility value prompts questions such as “Will accomplishing X be useful to me now or in the future?”; this is where the ability of LADs to leverage institutional data becomes important. There is evidence, for example, showing that certain course-taking pathways are more fruitful for engineering students than others [45]. This information would be invaluable for utility decisions; if certain course “pathways” are shown to impose unneeded burdens on students, then students can avoid them. In addition, institutions could be prompted to evaluate certain sequences of courses to inform potential LAD designs. As learning analytics interventions continue to scale in higher education settings, LADs will be able to feed past information back to students. This can work well for specific study strategies, though it need not be limited in such a way.
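As an illustrative sketch of the pathway idea (assuming hypothetical field names rather than an existing institutional schema), the function below computes how often each historical course sequence has led to a target outcome, the kind of rate a LAD could feed back to students.

```python
# Illustrative sketch only: record structure and names are assumptions.
from collections import defaultdict


def pathway_success_rates(records: list[dict]) -> dict[tuple[str, ...], float]:
    """records: [{"pathway": ("MATH-125", "MATH-126"), "completed": True}, ...]"""
    totals: dict[tuple[str, ...], list[int]] = defaultdict(lambda: [0, 0])
    for r in records:
        key = tuple(r["pathway"])
        totals[key][0] += int(r["completed"])   # successes
        totals[key][1] += 1                     # attempts
    return {k: successes / attempts for k, (successes, attempts) in totals.items()}


if __name__ == "__main__":
    records = [
        {"pathway": ("MATH-125", "MATH-126"), "completed": True},
        {"pathway": ("MATH-125", "MATH-126"), "completed": False},
        {"pathway": ("MATH-118", "MATH-126"), "completed": True},
    ]
    print(pathway_success_rates(records))
    # {('MATH-125', 'MATH-126'): 0.5, ('MATH-118', 'MATH-126'): 1.0}
```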

2.2.4. Cost

Cost refers to an internal calculation of the opportunity costs associated with a given task, e.g., doing well on an exam or choosing to major in a STEM field. Students wishing to do well on an exam, for example, may need to delay their own gratification [46,47] and be attentive during instruction, whereas students pursuing a STEM career may have to factor in other long-term costs. In either case, doing well often comes at the cost of forgoing enticing alternatives, such as talking to friends during instruction or choosing a less rigorous field to major in.
Cost questions such as “What will I lose if I accomplish X, and do the associated benefits of success outweigh those costs?” are well suited for learning analytics dashboard applications, and can inform students’ understanding and deployment of different resources. These resources include time (e.g., the number of credit hours attempted during a term), money (e.g., money spent on additional learning materials), and effort (e.g., time spent studying for a class), among others. Recent work in the self-regulated learning literature has shown the capability to predict and capture granular student behaviors that may make cost calculations easier for researchers to reflect back to students (e.g., [48,49]). This can be helpful to students, as cost management is important; students must manage various demands, as well as both their digital and analog resources.
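A minimal sketch of such a cost estimate appears below: it translates a planned course load and work schedule into an approximate weekly time commitment. The credit-to-hours heuristic and the weekly ceiling are assumptions chosen for illustration, not empirically derived constants.

```python
# Illustrative sketch only: constants and names are assumptions.
HOURS_PER_CREDIT = 3          # assumed in-class + study hours per credit, per week
WEEKLY_BUDGET_HOURS = 60      # assumed ceiling combining school, work, and rest


def weekly_time_cost(planned_credits: int, work_hours: int) -> dict:
    """Estimate the weekly time cost of a planned course load plus employment."""
    study_hours = planned_credits * HOURS_PER_CREDIT
    total = study_hours + work_hours
    return {
        "estimated_school_hours": study_hours,
        "work_hours": work_hours,
        "total_committed_hours": total,
        "over_budget": total > WEEKLY_BUDGET_HOURS,
    }


if __name__ == "__main__":
    print(weekly_time_cost(planned_credits=16, work_hours=20))
    # {'estimated_school_hours': 48, 'work_hours': 20,
    #  'total_committed_hours': 68, 'over_budget': True}
```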
This is especially important for underserved students, who must make such cost calculations more often due to more limited resources. Knowing beforehand, for example, that taking a certain combination of courses will cost time available for extracurricular activities or employment in the short term, even if it leads to graduating sooner, is helpful as students register for courses, and can potentially save them money in the long term. While this information may already exist within peer networks or university handbooks, it may not be as readily accessible to underserved student populations. Moreover, the rate at which a particular combination of courses leads to successful outcomes is typically not available at all; learning analytics can surface it, and can continually update it to ensure that students make decisions based on the most recent information.

3. Discussion

It is obvious that learning analytics dashboards are well positioned to provide students with a platform to engage with their information in a manner that is personal, dynamic, and capable of answering particular questions that can influence both expectancies and task value. These data could consist of relevant information about students or about their learning contexts, e.g., historical trends in a class, their own learning trends, etc. Taking a thorough approach to which data best inform a set of LADs enables dashboards to be designed with equity in mind, as equity is predicated on attending to both the contextual and individual characteristics of learners.
This information is made salient through LADs, and is consequently more relevant to students as individual learners. Such an approach is a departure from providing students with information meant to help the “average” student, one who is often envisioned in a manner that decouples them from their core identity. The potential of learning analytics to influence motivation as conceptualized by EVT, then, is through mediation. Specifically, LA applications have the potential to help students make decisions that they are already contemplating by providing feedback and resources previously unavailable to them, or available only slowly via cumbersome channels. This feedback flows through a chosen representation.
Though their work predates much of the discussion of motivation within the learning analytics community, Makara and Karabenick [50] provide an example of how this process might occur within an EVT framework. They posit that help-seeking outcomes, such as soliciting help from a given source and/or utilizing help from available resources, are a product of student expectations of the source (e.g., expecting that the source will provide help) and the “value of the source,” such as the perceived accuracy and/or quality of the source. LADs have the potential to be valuable “sources” because they are adaptable in such a way that they can incorporate various sources of information. A necessary step for this, however, is for LADs to mindfully frame their designs in a manner that attends to the dimensions outlined above. Indeed, as shown in this paper, EVT is a theory designed to understand the relationships between students’ antecedent beliefs and attributions in and about various academic settings, behaviors, and outcomes. Framed in this way, the joining of these two fields provides a unique opportunity to bridge theory and practice via an instantiated design, namely, the LAD.
It should be noted that learning analytics applications can be understood as the instantiation of the goals of various stakeholders, including the institutions that implement them. Embedded in the design of a given application are various decisions regarding what sort of information may or may not help students, as well as the form that information takes. Rather than being distinct from decisions made in a one-on-one scenario (e.g., when a student meets an advisor), learning analytics applications automate this decision-making process and extend the decision-making powers of those who design them, enabling stakeholders to scale their decisions to thousands of students.
Work on motivation and LADs has shown that underserved students who are members of ethnic and racial minority groups interpret visualizations of academic information differently (e.g., [8,26,51]). Such work suggests that underserved students may have specific needs in this respect; they may be sensitive to comparative information, though they may seek it out as well. This tension between the information underserved students are drawn to and the information that may be most appropriate for them is an important one, and requires more empirical work and theorizing to determine the best approach. This is simply one example of a special population of students that might experience unanticipated effects of LAD design, however. As has been reported elsewhere (e.g., [20]), learning analytics has not focused sufficiently on students with disabilities, and it is important to examine any potential effects empirically. These two examples, taken together, suggest that there may be other blind spots worth investigating. Future work in this area would benefit from more focused investigations of specific populations.

4. Conclusions

As this paper has highlighted, LADs have the potential to help students manage university resources, adopt beneficial self-regulated learning strategies, and influence their academic motivation in adaptive ways. Such LAD designs can be improved by taking into consideration the potential benefits of information utilization in view of contemporary motivation theory and research. In this way, future learning analytics research and application can center the experience of students who are under-represented in their home institutions. Indeed, students’ academic motivation and information-seeking practices matter in all instructional contexts. The learning analytics community would benefit from further study into how LADs can provide adaptive motivational pathways. If LADs are to be designed in a way that informs students, it is essential to understand how they motivate students along many dimensions.

Funding

This research received no external funding.

Institutional Review Board Statement

Not applicable.

Informed Consent Statement

Not applicable.

Data Availability Statement

Not applicable.

Conflicts of Interest

The author declares no conflict of interest.

References

  1. Johnson, L.; Becker, S.A.; Estrada, V.; Freeman, A. NMC Horizon Report: 2014 K; The New Media Consortium: West Lake Hills, TX, USA, 2014. [Google Scholar]
  2. Abdul Jalil, N.; Wong Ei Leen, M. Learning Analytics in Higher Education: The Student Expectations of Learning Analytics. In Proceedings of the 2021 5th International Conference on Education and E-Learning, Tsuru, Japan, 5–7 November 2021; pp. 249–254. [Google Scholar]
  3. Herodotou, C.; Rienties, B.; Boroowa, A.; Zdrahal, Z.; Hlosta, M. A large-scale implementation of predictive learning analytics in higher education: The teachers’ role and perspective. Educ. Technol. Res. Dev. 2019, 67, 1273–1306. [Google Scholar] [CrossRef] [Green Version]
  4. Prinsloo, P.; Slade, S. Big data, higher education and learning analytics: Beyond justice, towards an ethics of care. In Big Data and Learning Analytics in Higher Education: Current Theory and Practice; Springer: Berlin/Heidelberg, Germany, 2017; pp. 109–124. [Google Scholar]
  5. Aguilar, S.J. Learning analytics: At the nexus of big data, digital innovation, and social justice in education. TechTrends 2018, 62, 37–45. [Google Scholar] [CrossRef]
  6. Schumacher, C.; Ifenthaler, D. The importance of students’ motivational dispositions for designing learning analytics. J. Comput. High. Educ. 2018, 30, 599–619. [Google Scholar] [CrossRef] [Green Version]
  7. Aguilar, S.J.; Baek, C. Motivated information seeking and graph comprehension among college students. In Proceedings of the 9th International Conference on Learning Analytics & Knowledge, Tempe, AZ, USA, 4–8 March 2019; pp. 280–289. [Google Scholar]
  8. Aguilar, S.J.; Karabenick, S.A.; Teasley, S.D.; Baek, C. Associations between learning analytics dashboard exposure and motivation and self-regulated learning. Comput. Educ. 2021, 162, 104085. [Google Scholar] [CrossRef]
  9. Günther, S.A. The Impact of Social Norms on Students’ Online Learning Behavior: Insights from Two Randomized Controlled Trials. In Proceedings of the LAK21: 11th International Learning Analytics and Knowledge Conference, Irvine, CA, USA, 12–16 April 2021; Association for Computing Machinery: New York, NY, USA, 2021; pp. 12–21. [Google Scholar] [CrossRef]
  10. Fan, Y.; Saint, J.; Singh, S.; Jovanovic, J.; Gašević, D. A Learning Analytic Approach to Unveiling Self-Regulatory Processes in Learning Tactics. In Proceedings of the LAK21: 11th International Learning Analytics and Knowledge Conference, Irvine, CA, USA, 12–16 April 2021; Association for Computing Machinery: New York, NY, USA, 2021; pp. 184–195. [Google Scholar] [CrossRef]
  11. Salehian Kia, F.; Hatala, M.; Baker, R.S.; Teasley, S.D. Measuring Students’ Self-Regulatory Phases in LMS with Behavior and Real-Time Self Report. In Proceedings of the LAK21: 11th International Learning Analytics and Knowledge Conference, Irvine, CA, USA, 12–16 April 2021; Association for Computing Machinery: New York, NY, USA, 2021; pp. 259–268. [Google Scholar] [CrossRef]
  12. Aguilar, S.J. Experimental Evidence of Performance Feedback vs. Mastery Feedback on Students’ Academic Motivation. In Proceedings of the LAK22: 12th International Learning Analytics and Knowledge Conference, Online, 21–25 March 2022; pp. 556–562. [Google Scholar]
  13. Teplovs, C.; Fujita, N.; Vatrapu, R. Generating predictive models of learner community dynamics. In Proceedings of the 1st International Conference on Learning Analytics and Knowledge, Banff, AB, Canada, 27 February–1 March 2011; pp. 147–152. [Google Scholar]
  14. Fancsali, S.E. Variable construction for predictive and causal modeling of online education data. In Proceedings of the 1st International Conference on Learning Analytics and Knowledge, Banff, AB, Canada, 27 February–1 March 2011; pp. 54–63. [Google Scholar]
  15. Brooks, C.; Epp, C.D.; Logan, G.; Greer, J. The who, what, when, and why of lecture capture. In Proceedings of the 1st International Conference on Learning Analytics and Knowledge, Banff, AB, Canada, 27 February–1 March 2011; pp. 86–92. [Google Scholar]
  16. Suthers, D.; Rosen, D. A unified framework for multi-level analysis of distributed learning. In Proceedings of the 1st International Conference on Learning Analytics and Knowledge, Banff, AB, Canada, 27 February–1 March 2011; pp. 64–74. [Google Scholar]
  17. Atkisson, M.; Wiley, D. Learning analytics as interpretive practice: Applying Westerman to educational intervention. In Proceedings of the 1st International Conference on Learning Analytics and Knowledge, Banff, AB, Canada, 27 February–1 March 2011; pp. 117–121. [Google Scholar]
  18. Williamson, K.; Kizilcec, R.F. Learning Analytics Dashboard Research Has Neglected Diversity, Equity and Inclusion. In Proceedings of the Eighth ACM Conference on Learning @ Scale (L@S’21), Virtual Event, 22–25 June 2021; Association for Computing Machinery: New York, NY, USA, 2021; pp. 287–290. [Google Scholar] [CrossRef]
  19. Williamson, K.; Kizilcec, R. A Review of Learning Analytics Dashboard Research in Higher Education: Implications for Justice, Equity, Diversity, and Inclusion. In Proceedings of the LAK22: 12th International Learning Analytics and Knowledge Conference, Online, 21–25 March 2022; Association for Computing Machinery: New York, NY, USA, 2022; pp. 260–270. [Google Scholar] [CrossRef]
  20. Baek, C.; Aguilar, S.J. Past, present, and future directions of learning analytics research for students with disabilities. J. Res. Technol. Educ. 2022, 1–16. [Google Scholar] [CrossRef]
  21. Eccles, J.S.; Wigfield, A. From expectancy-value theory to situated expectancy-value theory: A developmental, social cognitive, and sociocultural perspective on motivation. Contemp. Educ. Psychol. 2020, 61, 101859. [Google Scholar] [CrossRef]
  22. Perez, T.; Cromley, J.G.; Kaplan, A. The role of identity development, values, and costs in college STEM retention. J. Educ. Psychol. 2014, 106, 315–329. [Google Scholar] [CrossRef]
  23. Karabenick, S.A. Perceived Achievement Goal Structure and College Student Help Seeking. J. Educ. Psychol. 2004, 96, 569–581. [Google Scholar] [CrossRef]
  24. Murdock, T.B.; Anderman, E.M. Motivational Perspectives on Student Cheating: Toward an Integrated Model of Academic Dishonesty. Educ. Psychol. 2006, 41, 129–145. [Google Scholar] [CrossRef]
  25. Eaton, S.B.; Bean, J.P. An Approach/Avoidance Behavioral Model of College Student Attrition. Res. High. Educ. 1995, 36, 617–645. [Google Scholar] [CrossRef]
  26. Lonn, S.; Aguilar, S.J.; Teasley, S.D. Investigating student motivation in the context of a learning analytics intervention during a summer bridge program. Comput. Hum. Behav. 2015, 47, 90–97. [Google Scholar] [CrossRef]
  27. Urdan, T.; Kaplan, A. The origins, evolution, and future directions of achievement goal theory. Contemp. Educ. Psychol. 2020, 61, 101862. [Google Scholar] [CrossRef]
  28. Harackiewicz, J.M.; Barron, K.E.; Pintrich, P.R.; Elliot, A.J.; Thrash, T.M. Revision of achievement goal theory: Necessary and illuminating. J. Educ. Psychol. 2002, 94, 638–645. [Google Scholar] [CrossRef]
  29. Senko, C.; Hulleman, C.S.; Harackiewicz, J.M. Achievement goal theory at the crossroads: Old controversies, current challenges, and new directions. Educ. Psychol. 2011, 46, 26–47. [Google Scholar] [CrossRef]
  30. Wigfield, A.; Eccles, J.S. Expectancy–value theory of achievement motivation. Contemp. Educ. Psychol. 2000, 25, 68–81. [Google Scholar] [CrossRef]
  31. Eccles, J.S.; Wigfield, A. Motivational beliefs, values, and goals. Annu. Rev. Psychol. 2002, 53, 109–132. [Google Scholar] [CrossRef] [Green Version]
  32. Wigfield, A. Expectancy-value theory of achievement motivation: A developmental perspective. Educ. Psychol. Rev. 1994, 6, 49–78. [Google Scholar] [CrossRef]
  33. Wigfield, A.; Cambria, J. Students’ achievement values, goal orientations, and interest: Definitions, development, and relations to achievement outcomes. Dev. Rev. 2010, 30, 1–35. [Google Scholar] [CrossRef]
  34. Khalil, M.; Prinsloo, P.; Slade, S. A Comparison of Learning Analytics Frameworks: A Systematic Review. In Proceedings of the LAK22: 12th International Learning Analytics and Knowledge Conference, Online, 21–25 March 2022; Association for Computing Machinery: New York, NY, USA, 2022; pp. 152–163. [Google Scholar] [CrossRef]
  35. Heath, M.K. Buried treasure or Ill-gotten spoils: The ethics of data mining and learning analytics in online instruction. Educ. Technol. Res. Dev. 2021, 69, 331–334. [Google Scholar] [CrossRef]
  36. Uttamchandani, S.; Quick, J. An introduction to fairness, absence of bias, and equity in learning analytics. Handb. Learn. Anal. 2022, 2022, 205–212. [Google Scholar]
  37. Grimm, A.; Steegh, A.; Kubsch, M.; Neumann, K. Learning Analytics in Physics Education: Equity-Focused Decision-Making Lacks Guidance! J. Learn. Anal. 2023, 10, 1–14. [Google Scholar] [CrossRef]
  38. Pardos, Z.A.; Jiang, W. Designing for serendipity in a university course recommendation system. In Proceedings of the Tenth International Conference on Learning Analytics & Knowledge, Frankfurt, Germany, 23–27 March 2020; pp. 350–359. [Google Scholar]
  39. Csikszentmihalyi, M.; Rathunde, K. The measurement of flow in everyday life: Toward a theory of emergent motivation. In Nebraska Symposium on Motivation, 1992: Developmental Perspectives on Motivation; Jacobs, J.E., Ed.; University of Nebraska Press: Lincoln, NE, USA, 1993; pp. 57–97. [Google Scholar]
  40. Liem, A.D.; Lau, S.; Nie, Y. The role of self-efficacy, task value, and achievement goals in predicting learning strategies, task disengagement, peer relationship, and achievement outcome. Contemp. Educ. Psychol. 2008, 33, 486–512. [Google Scholar] [CrossRef]
  41. Elliot, A.J. A conceptual history of the achievement goal construct. In Handbook of Competence and Motivation; Elliot, A.J., Dweck, C.S., Eds.; Guilford Publications: New York, NY, USA, 2005; pp. 52–72. [Google Scholar]
  42. Battle, A.; Wigfield, A. College women’s value orientations toward family, career, and graduate school. J. Vocat. Behav. 2003, 62, 56–75. [Google Scholar] [CrossRef]
  43. Bong, M. Between-and within-domain relations of academic motivation among middle and high school students: Self-efficacy, task value, and achievement goals. J. Educ. Psychol. 2001, 93, 23. [Google Scholar] [CrossRef]
  44. Hulleman, C.S.; Schrager, S.M.; Bodmann, S.M.; Harackiewicz, J.M. A meta-analytic review of achievement goal measures: Different labels for the same constructs or different constructs with similar labels? Psychol. Bull. 2010, 136, 422. [Google Scholar] [CrossRef]
  45. Nam, S.; Lonn, S.; Brown, T.; Davis, C.S.; Koch, D. Customized course advising: Investigating engineering student success with incoming profiles and patterns of concurrent course enrollment. In Proceedings of the Fourth International Conference on Learning Analytics and Knowledge, Indianapolis, IN, USA, 24–28 March 2014; pp. 16–25. [Google Scholar]
  46. Bembenutty, H.; Karabenick, S.A. Academic delay of gratification. Learn. Individ. Differ. 1998, 10, 329–346. [Google Scholar] [CrossRef]
  47. Bembenutty, H. Academic delay of gratification, self-regulation of learning, gender differences, and expectancy-value. Personal. Individ. Differ. 2009, 46, 347–352. [Google Scholar] [CrossRef]
  48. Arizmendi, C.J.; Bernacki, M.L.; Raković, M.; Plumley, R.D.; Urban, C.J.; Panter, A.; Greene, J.A.; Gates, K.M. Predicting student outcomes using digital logs of learning behaviors: Review, current standards, and suggestions for future work. Behav. Res. Methods 2022, 1–29. [Google Scholar] [CrossRef]
  49. Cogliano, M.; Bernacki, M.L.; Hilpert, J.C.; Strong, C.L. A self-regulated learning analytics prediction-and-intervention design: Detecting and supporting struggling biology students. J. Educ. Psychol. 2022, 114, 1801–1816. [Google Scholar] [CrossRef]
  50. Makara, K.A.; Karabenick, S.A. Characterizing sources of academic help in the age of expanding educational technology: A new conceptual framework. In Advances in Help-Seeking Research and Applications: The Role of Emerging Technologies; IAP: Charlotte, NC, USA, 2013; pp. 37–72. [Google Scholar]
  51. Aguilar, S.J. Examining the relationship between comparative and self-focused academic data visualizations in at-risk college students’ academic motivation. J. Res. Technol. Educ. 2018, 50, 84–103. [Google Scholar] [CrossRef]
