Article

Awareness to Action: Student Knowledge of and Responses to an Early Alert System

1 Learning Engineering Institute, Arizona State University, Tempe, AZ 85281, USA
2 Human Systems Engineering, The Polytechnic School, Arizona State University, Tempe, AZ 85281, USA
3 Department of Psychology, Arizona State University, Tempe, AZ 85281, USA
* Author to whom correspondence should be addressed.
Appl. Sci. 2025, 15(11), 6316; https://doi.org/10.3390/app15116316
Submission received: 22 April 2025 / Revised: 30 May 2025 / Accepted: 30 May 2025 / Published: 4 June 2025
(This article belongs to the Special Issue The Application of Digital Technology in Education)

Abstract

Introduction: Student retention is a critical issue in higher education. Universities have responded by implementing supports like early alert systems. Objective: We investigated students’ knowledge of and experiences with an early alert system designed to enhance academic persistence. Method: We surveyed undergraduates (N = 356) at a large public university in the U.S. The survey was researcher-created and administered online. Participants self-selected into the study and provided informed consent prior to beginning the survey. Data were coded by expert raters, who achieved excellent inter-rater reliability. The analyses were frequency-based to capture the diversity of student knowledge of, experiences with, and responses to early alert systems. Results: Students commonly reported experiencing negative emotions after receiving an alert, but also reported that receiving an alert motivated them to increase their course attendance, improve their study habits, and participate more in class. Finally, students indicated that receiving an early alert facilitated supportive interactions with instructors, though student communication with academic advisors was less frequent. Student recommendations for system improvement included using positive language in alerts and providing actionable guidance. Conclusions: These results provide new insight into student views of early alert systems and suggest that these systems can positively impact students in need of support.

1. Introduction

Many students benefit from support in persisting through college or university [1,2]. Consequently, institutions of higher education have developed resources to give students the type of support they need (e.g., academic, emotional, financial) [3,4,5,6]. According to self-determination theory (SDT), an individual’s motivation to act is informed by the degree to which they perceive that they have ownership of their actions (i.e., autonomy), that they can effectively respond to their context (i.e., competence), and that they have connections with others (i.e., relatedness) [7,8]. Therefore, institutions that satisfy students’ basic needs for autonomy, competence, and relatedness are more efficacious in supporting student success [7,8].
Early alert systems seek to improve relatedness by identifying students who would benefit from intervention in a timely manner. This identification allows the institution to connect that student with resources or student success staff proactively (e.g., by prompting an academic advisor to reach out to the student) [9,10,11,12]. This institutional outreach is essential within the theoretical framework provided by Tinto [13], as faculty interactions advance a student’s social integration into campus and thus decrease the risk of institutional departure.
Early alert systems are prevalent among public and private universities in the United States. A 2014 report estimated that 85–90% of higher education institutions in the United States had an early alert system in use [14]. Early alert systems have also been found to be effective at universities around the world. For example, early alert systems have been found to enhance student success at universities in Europe [15], Africa [16], Asia [17], and Oceania [18], either by accurately detecting at-risk students or by connecting students to tutoring resources.
Early alert systems are particularly relevant to the support ecosystem at colleges and universities because they can serve as an entry point for a student to access relevant resources. For example, an early alert can prompt a student to meet with an academic advisor, who then may suggest making an appointment with the tutoring center. Thus, although early alert systems themselves may not provide help in the form of homework support or financial aid adjustment, they can prompt awareness and action that lead the student to interface with university supports that do offer those services. Though these systems interface with instructors and student success staff, the key user is the student, who decides how to respond to the alert (if at all). Thus, the current study was contextualized within frameworks of student persistence in higher education such as Tinto’s model of institutional departure [13] and self-determination theory [8]. Both theories highlight that students who are connected to the people and resources of an institution are more likely to persist through university.
We sought to understand the student experience of a particular early alert system, called Academic Status Reports (ASRs). Thus, we administered a survey to undergraduate students at a large public university to examine their knowledge of and responses to ASRs. Our primary aim was to gain an improved understanding of how college students think about and react to this type of institutional support. This aim led us to develop three research questions for this study: (1) What do students know and believe about ASRs? (2) How do students respond to receiving an ASR? and (3) What recommendations do students have to improve the system?

1.1. Early Alert Systems in Higher Education

Early alert systems can be used to identify students who would benefit from an influx of support and notify student success staff (e.g., administrators, advisors) of these students [12]. These systems are sometimes referred to as early warning systems when they are used only to identify students experiencing academic or related challenges [15,19]. However, these systems can also be used to provide positive feedback (e.g., kudos to a student for a strong start to the semester). For that reason, we use the neutral term early alert system.
Initial early alert systems relied on reports submitted manually by an instructor detailing student behaviors (e.g., missed classes, unsubmitted assignments) [10]. We refer to these as human-driven early alert systems because they are initiated by a human action (as compared to automated early alert systems that use student data captured by university systems) [20,21,22]. For example, an early alert system implemented at Vincennes University in the early 1990s allowed instructors to send postcards to students when they started to accrue absences [23]. Academic advisors also received a list of students who had been flagged for non-attendance. Data collected following system implementation indicated that the early alert system reduced the number of students who dropped, failed, or withdrew from courses, and significantly increased passing grades in early morning and evening classes.
Many colleges and universities continue to use human-driven early alert systems [24]. Alerts from these systems may be particularly beneficial for students because they signal that an instructor notices and cares about them [25]. Past research on student persistence has found that when students and faculty engage in one-on-one interactions, students are more likely to perceive themselves as members of the college community, and this perception decreases the likelihood of attrition [13,26,27]. Additionally, a survey of undergraduates who considered dropping out revealed that social support was a key factor in their decision to stay enrolled [28]. These findings align with key theories of persistence and motivation. Tinto’s [13] model of institutional departure, for example, suggests that social integration (e.g., connections with peers and faculty) is a key influence on students’ likelihood of degree completion. Likewise, self-determination theory [8] posits that relationship with others in one’s context (e.g., classroom, university) bolsters self-directed motivation. Therefore, implementation of an early alert system, particularly one that is human-driven, has the potential to improve persistence by connecting students to members of the university.

1.2. Design and Implementation of Human-Driven Early Alert Systems

Although there are many potential benefits of human-driven early alert systems, they are limited in scale by the human hours available for use of the system. For example, instructors in large enrollment classes do not always have the time to take on additional instruction tasks [29,30], such as submitting a report for every relevant student. Thus, as with other human-centered systems, human-driven early alert systems should be designed to efficiently meet the needs and capabilities of their users [31]. In Tampke’s [12] study on the design and implementation of an early alert system within an academic institution, the author considered the impact of the system on faculty, students, and advising staff via reported effects on student behaviors, frequency of instructor reporting, and frequency of meetings with advising staff. Similarly, Patterson and colleagues [32] designed an early alert system informed by researchers, administrators, instructors, advisors, and students, recognizing that stakeholders have unique perspectives, goals, and needs.
Once a system is designed and ready to be implemented, instructors, students, and staff should receive training on the system’s goals and how to use it. Implementation and training can occur in writing (e.g., emails, documents), through workshops, or via videos [33]. Differences in training on and use of early alert systems across institutional units can lead to differences in effectiveness [34]. This variation may, in turn, impede the system’s use at a university level, as scalability in part hinges on consistent communication to all stakeholders (e.g., students, instructors, advising) [35].

1.3. How Do Stakeholders View and Use Human-Driven Early Alert Systems?

Prior research on stakeholder views of early alert systems has centered on staff, instructors, and students, with much of these data collected as supplemental to a broader investigation of system efficacy [23]. Academic advisors report that they are excited to have access to early alert system data [23] and use these data when meeting with students [35]. Instructors, on the other hand, have more mixed reactions to early alert systems. Instructors believe they should reach out to struggling students [34] and endorse having an early alert system because they feel ill-prepared to be the sole source of help [12]. However, instructors also indicate that a drawback of early alert systems is the time and effort required to learn a new interface [36] and the time required to submit early alerts [34], though some instructors of large enrollment courses find the additional workload worthwhile because they would not be able to engage with individual students in their courses without such systems [36]. Finally, some instructors criticize early alerts for their lack of initiating change in students [34], whereas others observe that early alerts lead students to modify relevant behaviors such as their study habits [37].
Given how widely early alert systems are used in higher education, research offering student views of this institutional support is surprisingly sparse [34]. Prior work suggests that most college students want to be contacted if their performance in a course is unsatisfactory, would like that contact as soon as possible, and want to receive information about resources to help them get back on track [38]. Students also report various emotions after receiving an early alert, from being happy that someone cared enough to reach out [10,34], to indifference and negative emotions (e.g., fear) [34]. These responses align with instructor concerns that receiving an early alert may harm student confidence [34]. Most (76%) students surveyed by Velasco [34] also reported that they wanted to talk with their instructor after receiving an early alert notification, but only 33% wanted to talk with an advisor. In terms of spurring action, Velasco [34] found that students’ most common response to an early alert was to do nothing, suggesting that some students may need extra guidance or support in reacting to an early alert, but that a substantial percentage of students did shift their academic behaviors through submitting missing assignments and increasing their class attendance. Altogether, prior work suggests that students want to learn in a timely manner if they need to improve their performance, coupled with connection to support, but that receiving an early alert leads to negative emotional reactions for a substantial percentage of students and does not always prompt positive action. Consistent with Velasco’s [34] call for further research into students’ perspectives on early alert systems, we extend research on this topic by surveying undergraduate students at a large public state university whose enrollment size necessitates the use of highly scalable systems for student support. Thus, we examined student knowledge of and experiences with academic early alerts within the context of an institution where these systems are especially well-positioned to impact students.

1.4. Academic Status Reports: A Human-Driven Early Alert System

In this study, we build upon prior work by investigating student experiences with and perceptions of Academic Status Reports (ASRs) at Arizona State University (ASU). The ASR system is an early alert reporting system designed to give students feedback about their progress throughout the semester from their instructor (https://students.asu.edu/academic-status-report, accessed on 12 May 2025). A unique goal of ASRs is to facilitate a “triangle” of connections between students, instructors, and support staff [10]. Within the ASR system, instructors have the option to submit positive or negative ASRs. Positive ASRs act as “kudos” to students for doing well in class; for example, earning a high grade on coursework. Negative ASRs act as a warning to students, signaling that their instructor has concerns about their academic behavior (e.g., not attending class) or performance (e.g., currently holding a failing grade in the course).
The ASR system is not automated (in contrast to [15,20]). Instead, like other human-driven early alert systems, it is set in motion via instructor reports [12,33], and ASRs are intended to go beyond the classroom by facilitating intervention by university staff [10,21,33]. The ASR system is also different from initial early alert systems in higher education that focused solely on attendance [10]. Rather, ASRs can be submitted by instructors for a variety of reasons, and the system allows for the provision of feedback and recommended actions. When submitting an ASR, instructors input a status (either a letter grade or a “satisfactory/unsatisfactory” designation), and then have the option to (but are not required to) enter a reason code (i.e., “low class participation”, “no contact with instructor”, or “poor grades on assignments”), recommend student action (e.g., “contact the instructor or TA”, “meet with advisor”), and/or write an open-ended comment. ASRs are then displayed as an alert on a student’s academic portal. This display is coupled with a centrally managed email and text message informing the student that they have an ASR to view on their academic portal, with some academic units sending additional follow-up emails or texts to their students. Also depending on the instructor’s academic unit, ASRs are sometimes shared with student support staff (e.g., academic advisors) who can then contact the student.
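To make the structure of an individual report concrete, the sketch below models the fields described above as a simple data record. It is an illustrative assumption for exposition only; the field names, value sets, and the rule for classifying a report as positive are ours, not ASU’s actual implementation.

```python
from dataclasses import dataclass, field
from typing import Optional, List

# Illustrative only: field names and value sets are assumptions based on the
# description above, not ASU's actual data model.
@dataclass
class AcademicStatusReport:
    student_id: str
    course_id: str
    submitted_by: str                      # instructor or TA who submitted the report
    status: str                            # letter grade or "satisfactory"/"unsatisfactory"
    reason_code: Optional[str] = None      # e.g., "low class participation"
    recommended_actions: List[str] = field(default_factory=list)  # e.g., "meet with advisor"
    comment: Optional[str] = None          # optional open-ended instructor comment

    @property
    def is_positive(self) -> bool:
        # Simplifying assumption: treat "satisfactory" and high letter grades as positive ("kudos") reports.
        return self.status.lower() in {"satisfactory", "a", "b"}

# Example: a negative ASR flagging low participation
asr = AcademicStatusReport(
    student_id="S123", course_id="PSY101", submitted_by="instructor",
    status="unsatisfactory", reason_code="low class participation",
    recommended_actions=["contact the instructor or TA", "meet with advisor"],
)
print(asr.is_positive)  # False
```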
For ASRs to function properly, there are many stakeholders continuously involved in the system. The Vice Provost’s Office, in collaboration with Enterprise Technology, oversees the maintenance of the system, maintains university websites describing the system and its goals, and sends regular communications to instructors encouraging them to submit ASRs, particularly early in the academic term. It also centrally manages follow-up emails and texts to students informing them of their ASR. Instructors set the system in motion by choosing when, why, and for whom to submit an ASR. Staff continue communication by reaching out to a student who has received an ASR, though not all ASRs are followed by staff contact (follow-up from an academic unit’s staff is guided by each unit’s own processes and procedures). Finally, and perhaps most importantly, students decide how to respond to the ASR. For this reason, the current study focused on students’ knowledge of and reactions to the ASR early alert system.

1.5. Initial Investigation: ASR System Usage

Our initial work investigating this system suggests that many students will receive an ASR at some point in their undergraduate career. Using ASU’s Learning@Scale digital learning network platform (https://learningatscale.asu.edu/, accessed on 12 May 2025), we examined ASR usage in five 100-level courses in biology, communication, English, and psychology from 2018 to 2022. Of the 99,000 enrolled students, 10.8% received at least one ASR. Further, informal conversations with instructors and academic advisors at ASU suggest that ASRs are used by instructors in different ways (e.g., consistently submitted weekly versus submitted for select students) and that instructors and academic advisors have encountered highly variable responses by students to receiving an ASR (e.g., panic, dismissal, motivation to improve) [39]. These variable responses indicate that students have different understandings of the system and the implications of receiving an ASR. Instructors and academic advisors also noted that the ASR system undergoes iteration. For example, the system initially only allowed negative ASRs, but was later expanded to include positive ASRs. This shifted the system from an early warning system to a more neutral early alert system and aligns with research finding that positive feedback can encourage student persistence [40,41].
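As a minimal illustration of how such a usage rate can be computed from enrollment and alert records, the following Python sketch calculates the share of enrolled students with at least one ASR. The table layouts and column names are hypothetical, and the data are toy values rather than the Learning@Scale data reported above.

```python
import pandas as pd

# Hypothetical extracts from a learning data platform; column names are assumptions.
enrollments = pd.DataFrame({"student_id": [1, 2, 3, 4, 5]})  # one row per enrolled student
asr_events = pd.DataFrame({
    "student_id": [2, 2, 5],                                  # a student can receive multiple ASRs
    "valence": ["negative", "positive", "negative"],
})

# Share of enrolled students with at least one ASR
students_with_asr = asr_events["student_id"].nunique()
share = students_with_asr / enrollments["student_id"].nunique()
print(f"{share:.1%} of enrolled students received at least one ASR")  # 40.0% in this toy data
```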

1.6. Survey Study: Students’ Knowledge of and Experiences with the ASR System

Based on our initial investigation of ASR usage at the institution, we devised a survey querying knowledge of and experiences with the ASR system and administered it online to undergraduate students at ASU. Our goal was to gain insight into student knowledge about, perceptions of, and experiences with a human-driven early alert system, supporting our understanding of how students engage with and view this kind of institutional support. By better understanding students’ perceptions, experiences, and responses to ASRs, we aim to gain theoretical understanding of how ASRs improve students’ relatedness within an SDT framework and decrease the likelihood of institutional departure.
Informed by our preliminary data analysis and theoretical background, we constructed this survey to answer three research questions: (1) What do students know and believe about ASRs? (2) How do students respond to receiving an ASR? and (3) What recommendations do students have to improve the system?

2. Materials and Methods

The current study was a mixed-methods survey study utilizing both Likert-scale items and free responses. We used observational, frequency-based analyses of students’ responses to better understand students’ knowledge of, beliefs about, and responses to ASRs. The study was preregistered at https://doi.org/10.17605/OSF.IO/VK6YE. Survey materials and deidentified data are available at https://osf.io/a246c/?view_only=5fff7e024dba46609a57aae9833a5ce6, accessed on 12 May 2025.

2.1. Participants

Participants (N = 356) were recruited from the Psychology subject pool of Arizona State University, a large public research university. Sampling from this subject pool gives us access to a large group of undergraduate students, particularly first-year students, who may benefit the most from interventions like academic early alerts and are therefore a population of interest for this work (a point we expand upon in Section 4). Data from the initial investigation into rates of ASR system usage in introductory courses, described in Section 1.5, led us to expect that this subject pool would yield a sample with a range of experiences with ASRs. An additional 15 participants were recruited, but their data were excluded because they failed an attention check. Ethics approval was obtained from ASU’s Institutional Review Board (IRB). Participants self-selected into the study by signing up via the subject pool portal and provided informed consent by electronically signing a consent form before beginning the survey. Participants were on average 19.1 years old (SD [Standard Deviation] = 1.11). All participants were in-person students and primarily enrolled in degree programs offered by The W. P. Carey School of Business and The College of Liberal Arts and Sciences. Complete demographic information for the sample is available in Table 1.

2.2. Materials

Survey items were informed by conversations with faculty and student success staff; these conversations involved asking faculty and staff about their experiences discussing ASRs with students and their observations of student responses to ASRs. We also conducted a review of peer-reviewed literature on early alert systems. The author team then held a series of discussions to ideate on themes that emerged from these conversations and literature review to write the initial survey items. The survey items were reviewed for clarity and appropriateness by an undergraduate research assistant, resulting in minor changes to the wording of items. None of the survey items were taken from established measures of knowledge and experiences with early alert systems. Indeed, it appears that there is a current lack of validated measures regarding this topic, as prior survey studies of student experiences with early alert systems have also used measures developed specifically for those investigations (e.g., [34]).
The survey included 34 items, excluding demographic information items. These items investigated five key areas regarding student engagement with the ASR system. Pertaining to Research Question 1, What do students know and believe about ASRs?, were (1) knowledge of ASRs and (2) beliefs about ASRs. Pertaining to Research Question 2, How do students respond to receiving an ASR?, were (3) experiences receiving an ASR and (4) emotional and behavioral responses to receiving an ASR. Finally, pertaining to Research Question 3, What recommendations do students have to improve the system?, was (5) recommendations for system improvement. The items were a mixture of open-text and closed-response (e.g., Likert scale) items. Eight of the 11 open-response items were coded by human raters (two additional items did not have sufficient data for analysis, and one additional item, which asked participants to list sources, did not produce responses of the qualitative depth needed for coding). Two raters inductively coded responses by separately reading participant responses, developing a coding schema, and then creating a consensus coding schema through discussion. Then, 80–100% of the responses to each item were coded separately by both raters and checked for reliability. As responses were allowed to be included in multiple categories, 80% inter-rater agreement was set as the threshold for acceptable reliability, as in prior research [42]. If the raters met that threshold, disagreements were resolved through discussion and then one rater coded the remaining responses. In certain cases, disagreement resulted in adjusting the coding schema to better reflect the data, with raters repeating the process until the threshold was met. The final range of percentage agreement among raters across the items was 80.9–95.1%.
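For readers interested in the reliability computation, the sketch below shows one common way to compute percentage agreement for multi-label (multiple-category) coding, where agreement is counted per response and per category. The authors’ exact calculation may differ, and the categories and responses shown are toy examples.

```python
from typing import List, Set

def percent_agreement(rater_a: List[Set[str]], rater_b: List[Set[str]],
                      categories: Set[str]) -> float:
    """Percentage agreement for multi-label coding: for each response and each
    category, check whether both raters made the same include/exclude decision."""
    decisions = agreements = 0
    for codes_a, codes_b in zip(rater_a, rater_b):
        for cat in categories:
            decisions += 1
            agreements += (cat in codes_a) == (cat in codes_b)
    return agreements / decisions if decisions else 0.0

# Toy example with three responses and three (hypothetical) categories
categories = {"what_is_reported", "who_submits", "purpose"}
rater_a = [{"purpose"}, {"what_is_reported", "who_submits"}, set()]
rater_b = [{"purpose"}, {"what_is_reported"}, set()]
print(f"{percent_agreement(rater_a, rater_b, categories):.1%}")  # 88.9%
```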

2.3. Procedure

This online survey was administered via Qualtrics (https://www.qualtrics.com/, accessed on 12 May 2025) during Spring 2024. Participants completed the survey asynchronously on their own device in exchange for partial course credit. Participants were first asked if they had heard of ASRs. If the participant responded that they had not heard of ASRs or were unsure if they had heard of ASRs, they were directed to answer survey items on a different topic, then report their demographic information, and finally read a short debrief statement. Of the 356 participants in the final sample, 107 (30.1%) reported that they had not heard of ASRs, 78 (21.9%) reported that they were unsure if they had heard of ASRs, and 3 (0.8%) chose not to respond to the item.
Participants who reported that they had heard of ASRs (n = 168, 47.2%) were then shown the remaining survey items about ASRs. Median survey completion time for these participants was 8.31 min. Survey items were grouped together in blocks based on theme (e.g., knowledge of ASRs). Participants who reported that they had received more than one ASR during their academic career were instructed to consider the first ASR that they had received when responding to the survey items. Participants’ responses to survey items determined which items they were subsequently shown (e.g., if a participant reported that they had never received an ASR then they were not presented with items about their response to receiving an ASR), and participants were allowed to skip questions that they did not want to answer. For those reasons, not every participant responded to every survey item. After completing the survey items about ASRs, participants reported their demographic information and read a debrief statement explaining the purpose of the study.
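The survey’s branching logic can be summarized schematically as follows. This is a simplified sketch of the routing described above, not the actual Qualtrics survey flow, and the block names are placeholders.

```python
def route_participant(heard_of_asrs: str, received_asr: bool | None = None) -> list[str]:
    """Return the ordered survey blocks a participant sees, per the branching described above.
    Block names are placeholders, not the actual Qualtrics flow."""
    if heard_of_asrs in ("no", "unsure"):
        # Participants unfamiliar with ASRs answered items on a different topic instead.
        return ["other_topic_items", "demographics", "debrief"]
    blocks = ["knowledge_of_asrs", "beliefs_about_asrs"]
    if received_asr:
        # Only participants who had received an ASR saw items about their response to it.
        blocks += ["experience_receiving_asr", "emotional_behavioral_responses"]
    blocks += ["recommendations", "demographics", "debrief"]
    return blocks

print(route_participant("yes", received_asr=True))
print(route_participant("unsure"))
```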

3. Results

Frequencies for categorical items are reported in tables, whereas means and standard deviations of scale items are reported in-text. Sections 3.1.1–3.1.4 answer our first research question: What do students know and believe about ASRs? Sections 3.2.1–3.2.6 answer our second research question: How do students respond to receiving an ASR? Finally, Section 3.3 answers our third research question: What recommendations do students have to improve the system?

3.1. What Do Students Know and Believe About ASRs?

These first sections pertain to our research question: What do students know and believe about ASRs? As detailed in Table 2, 47.2% (n = 168) of respondents had heard of ASRs. Of those who had heard of ASRs, 69.6% (n = 117) reported that ASRs were mentioned by an instructor or teaching assistant during class, 43.5% (n = 73) via email, and 43.5% (n = 73) via a Canvas message. Overall, students were moderately confident in the accuracy (M [Mean] = 4.2, SD = 1.1) and completeness (M = 4.1, SD = 1.2) of their description of ASRs, as assessed on a six-point scale.

3.1.1. Describing ASRs

Participants were asked to describe an ASR in 1–2 sentences. Within responses to this question, we identified six dimensions of the system that participants most often addressed in describing ASRs: what is reported via an ASR, the valence of what is reported, who submits an ASR, the intended audience of an ASR, the scope of an ASR, and the purpose of an ASR (described in Table 3). At least half of respondents mentioned what is reported, intended audience, purpose, and scope, suggesting that these features might be most relevant in differentiating the ASR system from other systems at ASU. Overall, descriptions of the ASR system suggest that students believe it is used by instructors to communicate a student’s general performance in their class.

3.1.2. Why the ASR System Was Created

Most (57.8%; n = 59) responses mentioned that ASRs were created to facilitate reporting of students’ general progress, followed by 20.6% (n = 21) mentioning ASRs as a way to help students improve (full results presented in Table 4). These responses reflect a perception of ASRs as a tool for tracking general progress and giving students specific feedback on areas that require improvement.

3.1.3. Who Uses the Information Contained in ASRs

This item queried participants about who has access to ASR data (Table 5). The most frequent response was the administration/the university (38.7%; n = 65), followed by advisors (21.4%; n = 36). These responses demonstrate a preponderance of accurate knowledge of the system. The next most common response was parents (18.5%; n = 31), which likely reflects that many students give their parents access to the student portal (i.e., MyASU) where ASRs are displayed.

3.1.4. Perceived Implications of an ASR

As shown in Table 6, among students who have heard of ASRs, the majority (70.8%; n = 119) believed that receiving an ASR can indicate something negative or positive about their academic performance, which is consistent with the design of ASRs (i.e., ASRs are designed to offer either kudos or alert a student to an instructor’s concerns). Students generally disagreed that one who received a negative ASR should drop or withdraw from the class (M = 2.6, SD = 1.3) or has little hope of passing the class (M = 2.9, SD = 1.4). Students were neutral (M = 3.4, SD = 1.4) about a negative ASR signaling a failing grade.

3.2. How Do Students Respond to Receiving an ASR?

The following sections address the research question: How do students respond to receiving an ASR? Students reported on their real-life experiences in receiving ASRs (Table 7). Students typically learned about their ASR from their student portal (77.3%, n = 85) (Table 8). Roughly equal percentages of students reported receiving a negative ASR (22.0%; n = 37), receiving a positive ASR (23.8%; n = 40), receiving both a negative and a positive ASR (19.6%; n = 33), or never receiving an ASR (21.4%; n = 36). Among students who received either negative or positive ASRs, 49.4% (n = 77) received more than one ASR. Thus, we asked students to think about the first ASR they received for subsequent items pertaining to their personal experience receiving an ASR.

3.2.1. Initial Emotions

Students most often reported that they felt bad when receiving their ASR (43.6%; n = 48), followed by indifference or apathy (22.7%; n = 25) (Table 9). Participants also reported feeling academic pressure (20.9%; n = 23), whether negative (e.g., feeling concerned that they need to drop the course) or positive (e.g., motivation to improve). The fourth most common emotional response was feeling good (11.8%; n = 13). The fact that ASRs can be either positive or negative can partially explain the range in response valence.

3.2.2. Motivations and Concerns

Students agreed that they understood why they had received an ASR (M = 4.9, SD = 1.2). Moreover, when receiving an ASR, whether positive or negative, they felt relatively supported by the instructor (M = 4.3, SD = 1.4), and they felt motivated to improve their performance (M = 4.5, SD = 1.2) and stay in the course (M = 4.5, SD = 1.3). Likewise, participants reported that they felt like they could improve their performance (M = 4.6, SD = 1.2) and stay in the course (M = 4.8, SD = 1.1). Aligned with these results, participants disagreed that they felt pressured to drop or withdraw from the course after receiving an ASR (M = 2.6, SD = 1.7). Overall, students reported that receiving an ASR was a somewhat positive experience (M = 4.0, SD = 1.5).
Although the above results suggest that participants felt motivated and able to persist and improve in the course after receiving an ASR, they also reported holding many concerns after receiving an ASR (Table 10). Students were especially concerned about maintaining their GPA (68.8%; n = 75), failing the class (59.6%; n = 65), and their permanent academic records (44.0%; n = 48). In addition, a fair number of students were concerned about the instructor’s perception of them as a student (42.2%; n = 46), and their parents learning about the ASR (39.4%; n = 43).

3.2.3. Seeking out Information

Few students (22.7%; n = 25) sought out information about their ASRs. Those students most often went to the instructor or teaching assistant (68.0%; n = 17), followed by academic advising (56.0%; n = 14). Students typically looked for how to respond to the ASR (48.0%; n = 12) (presented in Table 11). Students tended to agree that they were able to find the information that they were searching for (M = 4.8, SD = 1.3).

3.2.4. Connecting to Support

After receiving an ASR, 35.5% (n = 39) of participants reached out to the instructor or teaching assistant, and 22.7% (n = 25) reached out to an academic advisor (Table 12). Likewise, 39.1% (n = 43) reported that the instructor or teaching assistant followed up with them, and 26.4% (n = 29) reported that academic advising contacted them.

3.2.5. Changing Behaviors

When asked whether they changed their behavior after receiving an ASR, 59.1% (n = 65) stated that they changed their academic behavior (e.g., class attendance) and 36.4% (n = 40) stated that they changed their personal behavior (e.g., work schedules) (Table 12). Overall, 63.6% (n = 70) changed their academic behavior, personal behavior, or both. The most common changes were adjusting study behaviors, increasing class attendance, and improving course participation (presented in Table 13 and Table 14).

3.2.6. Advice to Others

Participants were asked what advice they would give to someone who had just received an ASR (Table 15). The most frequent advice (28.0%; n = 47) was to read and seek to understand the ASR, followed closely by to get help (24.4%; n = 41) and to not stress over the ASR (24.4%; n = 41). Other advice offered by at least 5% of respondents included to “do more” (i.e., work harder, study longer; 16.1%; n = 27), to make a specific change (e.g., change a particular behavior; 14.9%; n = 25), and to persist (e.g., don’t give up; 7.7%; n = 13).

3.3. What Recommendations Do Students Have to Improve the System?

This section pertains to our research question: What recommendations do students have to improve the system? Participants were lastly asked how they would improve ASRs (Table 16). The most common response to this question was to not change anything about ASRs (26.8%; n = 45). An additional 10.1% (n = 17) reported that they simply did not know what changes should be made (leaving open the possibility that changes could be made). Recommendations for change made by at least 5% of respondents included better helping students respond to the ASR (11.3%; n = 19), having an instructor talk with the student after submitting an ASR for them (7.1%; n = 12), and making ASRs more positive (6.0%; n = 10) and less negative (5.4%; n = 9).

4. Discussion

This study investigated students’ knowledge of and experiences with a human-driven early alert system designed to facilitate a triangle of connections between students, instructors, and staff at a large public university. Although many stakeholders participate in this system, the decision of how to respond to an academic early alert ultimately rests with the student. This survey offers insight into students’ experiences with an early alert system that complements prior investigations of instructors’ [36] and advisors’ [35] viewpoints.

4.1. What Do Students Know and Believe About ASRs?

Students’ survey responses reveal that only 47.2% of students in our sample had heard of ASRs, despite the system being featured on official university websites and used by instructors university-wide. Among students who reported knowing what an ASR is, 65.4% had received an ASR during their academic career. Students tended to hold accurate knowledge of the system, such as that it is used by instructors to communicate a student’s performance in a class and can be positive or negative. Students were also aware that multiple stakeholders are involved in the system, noting that administrators, advisors, and parents use the information contained in ASRs. Students reported that parents who are given access to their academic portals may read ASRs, which is notable because the ASR system was not specifically designed with parents in mind as an audience. This result echoes a finding from a prior survey of advisors’ use of an early alert system, in which advisors showed the interface to students during meetings even though it was designed to be viewed only by advisors [35]. These findings affirm the importance of investigating intended versus actual use of systems with multiple stakeholders.

4.2. Reactions to Receiving an ASR

One of the most striking findings was students’ feelings after receiving an ASR: the three most common responses were feeling bad, feeling apathetic or indifferent, and experiencing academic pressure. Likewise, when asked about concerns they had when they received their ASR, 40% or more of the sample reported concerns over maintaining their GPA, failing the class, their permanent academic record, and their instructor’s perception of them as a student. Some of these concerns may be barriers to re-engagement in the course. However, these concerns, such as students’ worries about an instructor’s judgment, can be addressed during meetings with advisors [10].
Simultaneously, students agreed that they felt supported and noticed by the instructor after receiving an ASR, and that they felt motivated and able to stay and improve in the course, in line with prior work on student responses to academic early alerts [10,11,43]. Further, a review of posts by student users on social media sites (e.g., Reddit) revealed that even a negative ASR can be perceived positively, as an instructor’s attempt to help [44]. Together, these findings paint a nuanced picture of student responses to ASRs: students experience negative emotions and hold understandable concerns about what an ASR means for them, but they are motivated to persist in the course and feel self-efficacy to do so.

4.3. Behavioral Responses to Receiving an ASR

Despite the preponderance of initial negative reactions, for many students ASRs worked as intended by prompting action that could positively impact their academic performance. First and foremost, only 19.1% of students dropped or withdrew from the course pertaining to their ASR, suggesting that many students were motivated to persist in the course. Further, 63.6% of students changed their behavior in response to receiving an ASR, most often increasing their class attendance, adjusting their study habits, and increasing their course participation. These changes are aligned with good practices for academic performance [45,46]. For example, a meta-analysis of college students’ class attendance suggests that it is strongly and positively associated with GPA [47].
ASRs were moderately successful in facilitating supportive connections for students, with about one-third of students contacting or being contacted by their instructor and about one-quarter contacting or being contacted by academic advising. These findings suggest that early alert systems are in fact a useful tool in higher education, as prior work suggests that student connection with instructors is an important contributor to student retention [26,27]. Further, a unique goal of ASRs is to facilitate a “triangle” of connections between students, instructors, and staff to promote student success, and these responses provide evidence that these connections occur as a result of ASRs. They also align with prior work suggesting that early alert systems facilitate collaboration between instructors, staff, and students [10], and lead students to seek assistance from their instructor [43]. However, we also find room for growth in the frequency of these connections. Communication was more likely to be initiated by an instructor or academic advisor than by students. Indeed, faculty report that a major benefit of early alert systems is that these systems help them fulfill their duty of care to students by facilitating student awareness of their status in the course and, if necessary, how to improve [36]. These data also signal that students may feel that they do not have the knowledge, skills, or relationships to respond appropriately to an early alert, as part of a broader trend of students being hesitant to make use of institutional supports [48,49,50]. Regarding academic help-seeking, a review of existing literature identified several factors that contribute to a student’s choice to seek support, including whether seeking help can lower a student’s self-esteem by revealing academic deficiencies [51]. We discuss potential avenues to improve student rates of help-seeking after receiving an ASR in the next section.
One of the key contributions of this work is that it provides a window into the step-by-step processing that occurs when a student receives an ASR. First, students experience initial emotions, which are often negative (e.g., disappointment, frustration). After reflection, students form impressions as to their current and prospective situation in the course, on average determining that they have the motivation and ability to improve. Finally, students develop an action plan to address their current academic circumstances, involving, for example, connecting with their instructor or academic advisor, increasing their course attendance, or changing their study habits. Assessing student emotions, impressions, or actions alone would have given an incomplete report as to the complex interplay of emotional processing and decision-making that students undergo after receiving an academic early alert. Each of these stages may also be targeted by revisions to the early alert system, as described in the next section.

4.4. Design and Implementation Implications

These findings have design, implementation, and policy implications for early alert systems in higher education. Direct recommendations from students include making the experience of receiving an early alert less focused on negative feedback and more constructive in providing actionable steps. Theoretical models of student persistence [13] and extant research [19,40] have demonstrated that taking action with positive outcomes and feedback can improve students’ self-efficacy, and thereby their overall persistence in college. Therefore, universities should design and implement early alert systems that promote instructors giving positive feedback. Although follow-up email communications provide information to students to help them with next steps, such as information on how they can contact their academic advisor, recommendations from students regarding system improvement included better supporting them in responding to the ASR (e.g., providing course-specific options for next steps) and pairing an ASR with one-on-one communication with the instructor. Doing so may improve the rates of students reaching out to instructors or student success staff after receiving an ASR.
Within the context of self-determination theory (SDT), ASRs may currently improve student persistence by improving relatedness. However, future design features aimed at improving other aspects of students’ motivation within an SDT framework (i.e., autonomy, competence) may further improve persistence [10,52]. SDT is commonly used to inform the design and study of learning activities and technologies in higher education [53], including early alert systems [34]. Within early alert systems, features could be built to support each of these psychological needs and improve motivation to act. For example, students’ perceived autonomy can be improved by offering them choice from a set of possible actions [54], students’ perceived competence can be increased by pairing an early alert with personalized feedback that does not compare them to other students [55] (which aligns with recommendations for useful early alerts) [9], and students’ perceived relatedness can be even further supported by offering avenues to form personal connections, such as to academic advisors [56].
Universities may also use informational interventions to bolster instructors’ effective use of the early alert system and improve students’ understanding of early alert systems. For example, an institution may offer training to instructors about how to use system features to provide actionable feedback to students. Additionally, previous work regarding student use of on-campus support services indicates that lack of knowledge is a barrier to accessing services [57] and direct instruction on resources can increase their use [58]. Thus, disseminating information to build students’ knowledge of the system may increase their responsiveness to early alerts.
Another consideration relates to the best ways to connect students to support. In our sample, students reported that it was relatively uncommon for them to communicate with an academic advisor after receiving an ASR, even though creating a “triangle” connection between students, staff, and instructors is a unique affordance of this system. While on the surface the low rate of advising communication seems like an area for improvement, it is also possible that these numbers reflect targeted reach-outs by staff (i.e., only when students meet certain criteria). Use of human-driven early alert systems is constrained by the number of human hours available for interaction with the system, and academic support staff do not always have the resources to reach out to every student who receives an alert [59]. Institutions can therefore aid student success staff by bringing together institutional data to inform decisions on which students are most in need of contact. ASU’s Learning@Scale (https://learningatscale.asu.edu/) is an example of how institutional data from varied sources can be combined for such analyses (see also [53]).
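As an illustration of how institutional data might be combined to prioritize staff outreach, the sketch below ranks students using assumed indicators (negative alert counts, attendance, prior advisor contact). The columns, weights, and scoring rule are hypothetical and do not describe Learning@Scale or ASU’s actual procedures.

```python
import pandas as pd

# Hypothetical merged extract; all columns and weights are assumptions for illustration.
students = pd.DataFrame({
    "student_id": [1, 2, 3],
    "negative_asrs": [2, 0, 1],          # count of negative alerts this term
    "attendance_rate": [0.55, 0.95, 0.80],
    "advisor_contacted": [False, False, True],
})

# Simple priority score: more negative alerts and lower attendance raise priority;
# students already contacted by an advisor are deprioritized.
students["priority"] = (
    students["negative_asrs"]
    + (1 - students["attendance_rate"]) * 2
    - students["advisor_contacted"].astype(int)
)
outreach_queue = students.sort_values("priority", ascending=False)
print(outreach_queue[["student_id", "priority"]])
```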

4.5. Limitations

Interpretation of these findings should consider two limitations of this work. First, students who participated in this survey study were primarily first-year students. It is possible that the low rates of knowledge about ASRs were due in part to sampling early-career students, albeit at the end of the academic year. Even so, the knowledge of less experienced students is particularly informative, as these students are often the target of research on early alert systems [15,18], and persisting past the first year of college is a key predictor of ultimately achieving a college degree [60,61]. Furthermore, the other demographic characteristics of the sample (presented in Table 1; e.g., the percentage of students identifying as Latino/a/x) are generally aligned with university-wide demographic trends, supporting the generalizability of our findings for first-year students within the institutional context [62]. Second, these results are based on student self-reports. When querying students on topics where their answers may portray them in a negative light, students can be consciously or unconsciously motivated to appear more positive by biasing their responses (i.e., social desirability bias) [61]. We attempted to account for these biases by offering a mixture of items with positive and negative wording and by allowing students to skip any item that they preferred not to answer.

5. Conclusions

This survey study of undergraduates at a large public university revealed a nuanced picture of students’ knowledge of, experiences with, and responses to an early alert system. Students often held positive beliefs about the system (and rated receiving an academic early alert as an overall positive experience) despite frequently reporting that they experienced negative emotions when first receiving an alert. Further, responses suggest that receiving an early alert can facilitate student connections with potential supports (e.g., instructors, academic advisors) and motivate positive behavior change (e.g., improving course attendance), but that many students feel that they need additional guidance on how to best respond to an early alert. Our findings suggest that two major challenges to the success of academic early alerts are (1) student awareness of the early alert system and (2) effectively using the early alert system to deliver actionable feedback to students. Future work may explore whether informational materials about early alerts (offered on a course syllabus or even as part of first-year orientation) are effective in improving awareness, and whether this increased knowledge of the system impacts student reactions to receiving an early alert. Future research could also explore the effects of training faculty on how to use the early alert system to provide personalized feedback grounded in theories of motivation (e.g., [52]) in terms of improving the rate of students taking action in response to an early alert. In sum, our work offers insight into how students view and respond to academic early alerts that can be leveraged to improve these systems in higher education.

Author Contributions

Conceptualization, M.N.I., M.G., M.W., J.G., D.N.C., R.D.R. and T.A.; methodology, M.N.I., M.G., M.W., J.G. and D.N.C.; formal analysis, M.N.I., M.G., M.W. and J.G.; investigation, M.N.I., M.G., M.W., J.G. and D.N.C.; resources, D.S.M.; data curation, M.N.I., M.G., M.W. and J.G.; writing—original draft preparation, M.N.I., M.G., M.W. and J.G.; writing—review and editing, M.N.I., M.G., M.W., J.G., D.N.C., R.D.R., T.A. and D.S.M.; visualization, M.N.I. and D.N.C.; supervision, R.D.R., T.A. and D.S.M.; project administration, M.N.I. and D.N.C.; funding acquisition, D.S.M. All authors have read and agreed to the published version of the manuscript.

Funding

The research reported here was supported by the Institute of Education Sciences, U.S. Department of Education, through Grant R305N210041 to Arizona State University. The opinions expressed are those of the authors and do not represent views of the institute or the U.S. Department of Education.

Institutional Review Board Statement

The study was conducted in accordance with the Declaration of Helsinki and approved by the Institutional Review Board of Arizona State University (STUDY00019876, 29 March 2024).

Informed Consent Statement

Informed consent was obtained from all subjects involved in the study.

Data Availability Statement

The current study was preregistered at https://doi.org/10.17605/OSF.IO/VK6YE. Survey materials and deidentified data are available at https://osf.io/a246c/?view_only=5fff7e024dba46609a57aae9833a5ce6, accessed on 12 May 2025.

Acknowledgments

We thank Lauryn Dunkley-Goetz for her contributions to this work.

Conflicts of Interest

The authors declare no conflicts of interest.

References

  1. Kezar, A.; Kitchen, J.A.; Estes, H.; Hallett, R.; Perez, R. Tailoring programs to best support low-income, first-generation, and racially minoritized college student success. J. Coll. Stud. Retent. Res. Theory Pract. 2023, 25, 126–152. [Google Scholar] [CrossRef]
  2. Wilmer, E. Student support services for the underprepared student. Inq. J. Va. Community Coll. 2008, 13, 4–19. Available online: https://commons.vccs.edu/inquiry/vol13/iss1/2 (accessed on 12 May 2025).
  3. Andrews, R.J.; Imberman, S.A.; Lovenheim, M.F. Recruiting and supporting low-income, high-achieving students at flagship universities. Econ. Educ. Rev. 2020, 74, 101923. [Google Scholar] [CrossRef]
  4. Hoyt, J.E. Student connections: The critical role of student affairs and academic support services in retention efforts. J. Coll. Stud. Retent. Res. Theory Pract. 2023, 25, 480–491. [Google Scholar] [CrossRef]
  5. Kuh, G.D. Student success. In Student Services: A Handbook for the Profession, 5th ed.; Schuh, J.H., Jones, S.R., Harper, S.R., Eds.; Jossey-Bass: Hoboken, NJ, USA, 2011; pp. 257–270. [Google Scholar]
  6. Swecker, H.K.; Fifolt, M.; Searby, L. Academic advising and first-generation college students: A quantitative study on student retention. NACADA J. 2013, 33, 46–53. [Google Scholar] [CrossRef]
  7. Jeno, L.M.; Danielsen, A.G.; Raaheim, A. A prospective investigation of students’ academic achievement and dropout in higher education: A Self-Determination Theory approach. Educ. Psychol. 2018, 38, 1163–1184. [Google Scholar] [CrossRef]
  8. Ryan, R.M.; Deci, E.L. Self-Determination Theory. Basic Psychological Needs in Motivation, Development, and Wellness; The Guilford Press: New York, NY, USA, 2017. [Google Scholar]
  9. Cuseo, J. At-Risk Prediction Instruments, Early-Alert Systems & Exit Interviews: A Proactive-to-Reactive Continuum of Efforts to Promote Student Success. 2006. Available online: https://www.shawnee.edu/sites/default/files/2019-01/at-risk-early-alert-exit-total-10.pdf (accessed on 12 May 2025).
  10. Hudson, W.E. Can an early alert excessive absenteeism warning system be effective in retaining freshman students? J. Coll. Stud. Retent. Res. Theory Pract. 2005, 7, 217–226. [Google Scholar] [CrossRef]
  11. Pistilli, M.D.; Arnold, K.E. Purdue signals: Mining real-time academic data to enhance student success. About Campus Enrich. Stud. Learn. Exp. 2010, 15, 22–24. [Google Scholar] [CrossRef]
  12. Tampke, D.R. Developing, implementing, and assessing an early alert system. J. Coll. Stud. Retent. Res. Theory Pract. 2013, 14, 523–532. [Google Scholar] [CrossRef]
  13. Tinto, V. Dropouts from higher education: A theoretical synthesis of recent research. Rev. Educ. Res. 1975, 45, 89–125. [Google Scholar] [CrossRef]
  14. Hanover Research. Early Alert Systems in Higher Education; Hanover Research: Arlington, VA, USA, 2014. [Google Scholar]
  15. Bañeres, D.; Rodríguez, M.E.; Guerrero-Roldán, A.E.; Karadeniz, A. An early warning system to detect at-risk students in online higher education. Appl. Sci. 2020, 10, 4427. [Google Scholar] [CrossRef]
  16. Cassells, L. The effectiveness of early identification of ‘at risk’ students in higher education institutions. Assess. Eval. High. Educ. 2018, 43, 515–526. [Google Scholar] [CrossRef]
  17. Lee, Y.H.; Chong, S.; Tang, Y.W. Early alert of the academically at-risk adult learner in higher education. In Proceedings of the 11th International Conference on Education and New Learning Technologies, Palma, Spain, 1–3 July 2019; pp. 5101–5109. [Google Scholar] [CrossRef]
  18. Jokhan, A.; Sharma, B.; Singh, S. Early warning system as a predictor for student performance in higher education blended courses. Stud. High. Educ. 2018, 44, 1900–1911. [Google Scholar] [CrossRef]
  19. Raffaghelli, J.E.; Rodríguez, M.E.; Guerrero-Roldán, A.-E.; Bañeres, D. Applying the UTAUT model to explain the students’ acceptance of an early warning system in Higher Education. Comput. Educ. 2022, 182, 104468. [Google Scholar] [CrossRef]
  20. Macfadyen, L.P.; Dawson, S. Mining LMS data to develop an “early warning system” for educators: A proof of concept. Comput. Educ. 2010, 54, 588–599. [Google Scholar] [CrossRef]
  21. Plak, S.; Cornelisz, I.; Meeter, M.; van Klaveren, C. Early warning systems for more effective student counselling in higher education: Evidence from a Dutch field experiment. High. Educ. Q. 2022, 76, 131–152. [Google Scholar] [CrossRef]
  22. Villano, R.; Harrison, S.; Lynch, G.; Chen, G. Linking early alert systems and student retention: A survival analysis approach. High. Educ. 2018, 76, 903–920. [Google Scholar] [CrossRef]
  23. Budig, J.E. Postcards for student success [Paper presentation]. In Proceedings of the INAIR Conference, West Lafayette, IN, USA, 20–21 March 1995. [Google Scholar]
  24. Hanover Research. Early Alert Systems in Higher Education. 2014. Available online: https://www.hanoverresearch.com/wp-content/uploads/2017/08/Early-AlertSystems-in-Higher-Education.pdf (accessed on 12 May 2025).
  25. Kuh, G.D.; Kinzie, J.L.; Buckley, J.A.; Bridges, B.K.; Hayek, J.C. What Matters to Student Success: A Review of the Literature; Postsecondary Education Cooperative: Washington, DC, USA, 2006; Volume 8. Available online: https://nces.ed.gov/npec/pdf/Kuh_Team_Report.pdf (accessed on 12 May 2025).
  26. Hawk, T.F. Getting to know your students and an educational ethic of care. J. Manag. Educ. 2017, 41, 669–686. [Google Scholar] [CrossRef]
  27. Simpson, C. Early Alert in Community College: Student Success, Persistence, and Retention in Developmental Courses (Publication no. 3620377). Doctoral Dissertation, Fordham University, New York, NY, USA, 2014. [Google Scholar]
  28. Nieuwoudt, J.E.; Pedler, M.L. Student retention in higher education: Why students choose to remain at university. J. Coll. Stud. Retent. 2021, 25, 326–349. [Google Scholar] [CrossRef]
  29. Mulryan-Kyne, C. Teaching large classes at college and university level: Challenges and opportunities. Teach. High. Educ. 2010, 15, 175–185. [Google Scholar] [CrossRef]
  30. Trammell, B.A.; LaForge, C. Common challenges for instructors in large online courses: Strategies to mitigate student and instructor frustration. J. Educ. Online 2017, 14, 10–19. [Google Scholar]
  31. Or, C.K.; Holden, R.J.; Valdez, R.S. Human factors engineering and user-centered design for mobile health technology: Enhancing effectiveness, efficiency, and satisfaction. In Human-Automation Interaction. Automation, Collaboration, & e-Services; Duffy, V., Ziefle, G.M., Rau, P.-L.P., Tseng, M.M., Eds.; Springer: Cham, Switzerland, 2022; Volume 12, pp. 97–118. [Google Scholar] [CrossRef]
  32. Patterson, C.; York, E.; Maxham, D.; Molina, R.; Mabrey, P. Applying a responsible innovation framework in developing an equitable early alert system: A case study. J. Learn. Anal. 2023, 10, 24–36. [Google Scholar] [CrossRef]
  33. Delmas, P.M.; Childs, T.N. Increasing faculty engagement in the early alert process. Innov. Educ. Teach. Int. 2021, 58, 283–293. [Google Scholar] [CrossRef]
  34. Velasco, J.M. Early Alert Programs: A Closer Look (Publication No. 27998226). Doctoral Dissertation, Valdosta State University, Valdosta, GA, USA, 2020. [Google Scholar]
  35. Aguilar, S.; Lonn, S.; Teasley, S.D. Perceptions and use of an early warning system during a higher education transition program. In Proceedings of the Fourth International Conference on Learning Analytics and Knowledge; Willis, M.J., Koch, D., Arnold, K., Teasley, S., Pardo, A., Eds.; Association for Computing Machinery: New York, NY, USA, 2014; pp. 113–117. [Google Scholar] [CrossRef]
  36. Atif, A.; Richards, D.; Liu, D.; Bilgin, A.A. Perceived benefits and barriers of a prototype early alert system to detect engagement and support ‘at-risk’ students: The teacher perspective. Comput. Educ. 2020, 156, 103954. [Google Scholar] [CrossRef]
  37. Bentham, C. Faculty Perspectives and Participation in Implementing an Early Alert System and Intervention in a Community College. (Publication No. 5661). Doctoral Dissertation, University of Central Florida, Orlando, FL, USA, 2017. Available online: https://stars.library.ucf.edu/etd/5661/ (accessed on 12 May 2025).
  38. Atif, A.; Richards, D.; Bilgin, A. Student preferences and attitudes to the use of early alerts. In Proceedings of the Twenty-First Americas Conference on Information Systems (AMCIS 2015), Fajardo, Puerto Rico, 13–15 August 2015; Available online: https://aisel.aisnet.org/amcis2015/ISEdu/GeneralPresentations/19 (accessed on 12 May 2025).
  39. Imundo, M.N.; Goldshtein, M.; Watanabe, M.; Gong, J.; Roscoe, R.D.; Arner, T.; McNamara, D.S. A learning engineering approach to the investigation of Academic Status Reports to facilitate student success [Show and Share]. In Proceedings of the 2024 Meeting of the IEEE International Consortium for Innovation and Collaboration in Learning Engineering (ICICLE 2024), Tempe, AZ, USA, 22–24 July 2024. [Google Scholar]
  40. Krumrei-Mancuso, E.J.; Newton, F.B.; Kim, E.; Wilcox, D. Psychosocial factors predicting first-year college student success. J. Coll. Stud. Dev. 2013, 54, 247–266. [Google Scholar] [CrossRef]
  41. Marwan, S.; Gao, G.; Fisk, S.; Price, T.W.; Barnes, T. Adaptive immediate feedback can improve novice programming engagement and intention to persist in computer science. In Proceedings of the 2020 ACM Conference on International Computing Education Research; Robins, A., Moskal, A., Ko, A.J., McCauley, R., Eds.; Association for Computing Machinery: New York, NY, USA, 2020; pp. 194–203. [Google Scholar] [CrossRef]
  42. Zung, I.; Imundo, M.N.; Pan, S.C. How do college students use digital flashcards during self-regulated learning? Memory 2022, 30, 923–941. [Google Scholar] [CrossRef] [PubMed]
  43. Jackson, F.B. First Year Perceptions: How Does Starfish Retention Solutions™ Help with Student Engagement? (Publication No. 28417863). Doctoral Dissertation, Eastern Kentucky University, Richmond, KY, USA, 2021. Available online: https://www.proquest.com/dissertations-theses/first-year-perceptions-how-does-starfish/docview/2561949284/se-2 (accessed on 12 May 2025).
  44. [jdib678]. [Online forum post]. 2023. Available online: https://www.reddit.com/r/ASU/comments/17w1hdt/comment/k9fdoo0/ (accessed on 12 May 2025).
  45. ASU College of Integrative Sciences and Arts. Strategies for Success. Available online: https://cisa.asu.edu/resources/success_strategies (accessed on 12 May 2025).
  46. Michigan State University College of Education. Study Strategies & Tips. Available online: https://education.msu.edu/kin-nections/study-strategies-tips/ (accessed on 12 May 2025).
  47. Credé, M.; Roch, S.G.; Kieszczynka, U.M. Class attendance in college: A meta-analytic review of the relationship of class attendance with grades and student characteristics. Rev. Educ. Res. 2010, 80, 272–295. [Google Scholar] [CrossRef]
  48. Duraku, Z.H.; Davis, H.; Hamiti, E. Mental health, study skills, social support, and barriers to seeking psychological help among university students: A call for mental health support in higher education. Front. Public Health 2023, 11, 1220614. [Google Scholar] [CrossRef]
  49. Harris, K.; English, T.; Harms, P.D.; Gross, J.J.; Jackson, J.J. Why are extraverts more satisfied? Personality, social experiences, and subjective well-being in college. Eur. J. Personal. 2017, 31, 170–186. [Google Scholar] [CrossRef]
  50. Sithaldeen, R.; Phetlhu, O.; Kokolo, B.; August, L. Student sense of belonging and its impacts on help-seeking behaviour. S. Afr. J. High. Educ. 2022, 36, 67–87. [Google Scholar] [CrossRef]
  51. Li, R.; Hassan, N.C.; Saharuddin, N. College student’s academic help-seeking behavior: A systematic literature review. Behav. Sci. 2023, 13, 637. [Google Scholar] [CrossRef]
  52. Deci, E.L.; Ryan, R.M. The “what” and “why” of goal pursuits: Human needs and the self-determination of behavior. Psychol. Inq. 2000, 11, 227–268. [Google Scholar] [CrossRef]
  53. Imundo, M.N.; Li, S.; Gong, J.; Potter, A.H.; Arner, T.; McNamara, D.S. Applying self-determination theory to the effective implementation of personalized learning in online higher education. In Handbook of Personalized Learning; Bernacki, M., Walkington, C., Emery, A., Zhang, L., Eds.; Routledge: London, UK, in press.
  54. Zong, Y.; Patall, E. Personalizing to promote autonomy & choice. In Handbook of Personalized Learning; Bernacki, M., Walkington, C., Emery, A., Zhang, L., Eds.; Routledge: London, UK, in press.
  55. Katz, I.; Assor, A. When choice motivates and when it does not. Educ. Psychol. Rev. 2007, 19, 429–442. [Google Scholar] [CrossRef]
  56. Pace, J.L. The Impact of Academic Advising Practices on Rural Community College Students’ Self-Determination: A Single Case Study (Publication No. 30817051). Doctoral Dissertation, Baylor University, Waco, TX, USA, 2023. [Google Scholar]
  57. Broglia, E.; Millings, A.; Barkham, M. Student mental health profiles and barriers to help seeking: When and why students seek help for a mental health concern. Couns. Psychother. Res. 2021, 21, 816–826. [Google Scholar] [CrossRef]
  58. Beisler, M.; Medaille, A. How do students get help with research assignments? Using drawings to understand students’ help seeking behavior. J. Acad. Librariansh. 2016, 42, 390–400. [Google Scholar] [CrossRef]
  59. Chan, Z.C.Y.; Chan, H.Y.; Chow, H.C.J.; Choy, S.N.; Ng, K.Y.; Wong, K.Y.; Yu, P.K. Academic advising in undergraduate education: A systematic review. Nurse Educ. Today 2019, 75, 58–74. [Google Scholar] [CrossRef] [PubMed]
  60. Burrus, J.; Elliott, D.; Brenneman, M.; Markle, R.; Carney, L.; Moore, G.; Betancourt, A.; Jackson, T.; Robbins, S.; Kyllonen, P.; et al. Putting and keeping students on track: Toward a comprehensive model of college persistence and goal attainment. ETS Res. Rep. Ser. 2014, 2013, i-61. [Google Scholar] [CrossRef]
  61. Wright, S.L.; Jenkins-Guarnieri, M.A.; Murdock, J.L. Career development among first-year college students: College self-efficacy, student persistence, and academic success. J. Career Dev. 2012, 40, 271–367. [Google Scholar] [CrossRef]
  62. Arizona State University. Facts and Figures. 2025. Available online: https://www.asu.edu/about/facts-and-figures (accessed on 12 May 2025).
Table 1. Participant Demographic Information.
Category | Response | n | %
Gender
  Man | 208 | 58.4
  Woman | 142 | 39.9
  Non-binary | 2 | 0.6
Race or Ethnicity
  White | 214 | 60.1
  Asian | 43 | 12.1
  Multiracial | 23 | 6.5
  Black | 22 | 6.2
  Hispanic | 22 | 6.2
  Latino | 9 | 2.5
  American Indian or Native American | 4 | 1.1
  Arab or Middle Eastern | 4 | 1.1
  Pacific Islander | 3 | 0.8
  Hispanic/Latino | 2 | 0.6
  Another race or ethnicity | 4 | 1.1
Hispanic or Latino/a/x
  No | 263 | 73.9
  Yes | 86 | 24.2
  Prefer not to say | 3 | 0.8
Year in Undergraduate Career
  First | 256 | 71.9
  Second | 75 | 21.1
  Third | 14 | 3.9
  Fourth | 5 | 1.4
  Fifth | 2 | 0.6
Transfer Student
  No | 325 | 92.3
  Yes | 27 | 7.6
Major
  Business | 179 | 50.3
  Psychology | 32 | 9.0
  Social and Behavioral Sciences | 29 | 8.1
  Computer Science | 12 | 3.4
  Biology | 11 | 3.1
  Arts (e.g., Film) | 11 | 3.1
  Health-Related (e.g., Nursing) | 10 | 2.8
Four participants declined to provide any demographic information. Items asking participants for their gender, race, and major were open-response.
Table 2. Knowledge of ASRs.
Question | Response | n | %
Have you heard of Academic Status Reports (ASRs)?
  Yes | 168 | 47.2
  No | 107 | 30.1
  Unsure | 78 | 21.9
  Prefer not to answer | 3 | 0.8
From where have you heard about Academic Status Reports (ASRs)? Please check all that apply.
  An instructor or teaching assistant during class | 117 | 69.6
  An instructor or teaching assistant via email | 73 | 43.5
  An instructor or teaching assistant via a Canvas message | 73 | 43.5
  A syllabus | 65 | 38.7
  Academic advising | 61 | 36.3
  ASU website | 60 | 35.7
  An email from ASU other than academic advising (e.g., from the Provost) | 30 | 17.9
  A friend via a face-to-face conversation, text, or instant message | 26 | 15.5
  Social media (e.g., Reddit, Tiktok) | 8 | 4.8
  Other | 4 | 2.4
Table 3. In 1–2 sentences, please describe what is an Academic Status Report (ASR). (n = 166).
Category | Example | n | %
Reported behavior valence
  N/A | “It is a document that shows the credits you have taken and what is left to take”. | 114 | 68.7
  Negative | “It is a warning sign that professors can use to report when a student is struggling academically”. | 32 | 19.3
  Both | “An academic status report is usually an alert given to you as a student to give you a helpful update throughout the school semester. They may tell you you’re doing good if you’re performing well in the class, or they may offer some helpful resources if you’re falling behind or missing class”. | 19 | 11.4
  Positive | “An academic status report is when an instructor reaches out to students based on their current success within the course”. | 1 | 0.6
What is reported
  General performance in the course | “To report how you are currently doing in that class”. | 118 | 71.1
  Grades | “It is a report of the actual position of my grade at a given time for a class” | 27 | 16.3
  Other | “You write what you like and don’t like about your professors and the class work”. | 20 | 12.0
  Not mentioned | “Im pretty sure an Academic Status report is similar to a academic warning”. | 11 | 6.6
  Attendance | “ASR is the report from your professor (for example) when you have troubles with your studies. i.e., skipping the class. this reports will be shown in your asu portal” | 9 | 5.4
  Missing assignments | “It tells us about how we’re doing in a class like if we’re missing assignments or if we satisfy the requirements to be in good standing” | 3 | 1.8
Who submits
  Not mentioned | “An academic status report is a general overview of your academic progress” | 92 | 55.4
  Instructor | “Its basically just a way for professors to communicate with students about their academic standing”. | 67 | 40.4
  Dean/Registrar/Administrators | “An Academic Status Report (ASR) is a document typically issued by educational institutions to provide a summary of a student’s academic performance, including grades, attendance, and any other relevant information, often used for monitoring progress and identifying areas of improvement”. | 3 | 1.8
  Student | “An Academic Status report is a way to provide feedback as well as to share your experience to better the program”. | 3 | 1.8
  N/A or Did not answer the question | “Im freashman” | 2 | 1.2
Intended audience
  Student | “ASR is the report from your professor (for example) when you have troubles with your studies. i.e., skipping the class. this reports will be shown in your asu portal” | 110 | 66.3
  Not mentioned | “I believe it is a review and overview of how the student did academically in the class” | 50 | 30.1
  Instructor | “An ASR is a checkpoint that a teacher gives a student to see how they are doing in class so far”. | 4 | 2.4
  Dean/Registrar/Administrators | “An academic status report is the report an instructor provides to the registrar office and dean providing a student’s course grade”. | 2 | 1.2
  N/A or Did not answer the question | “Im freashman” | 2 | 1.2
  Advisors/Counselors | “Basically a way for teachers to record concerns or comments they have on a students academics. They are not put onto transcripts thought. They are posted to myasu so counselors can see them as well” | 1 | 0.6
Scope
  Class/Course | “It describes and shows how one is doing in the class”. | 110 | 66.3
  Not mentioned | “basically a way for teachers to record concerns or comments they have on a students academics. They are not put onto transcripts thought. They are posted to myasu so counselors can see them as well” | 31 | 18.7
  Progress/Academic progress | “It is a progress check, to make sure you are in good standing for the school, grades wise.” | 18 | 10.8
  Semester | “An Academic Status Report is a report given to students throughout the semester reflecting their current academic status”. | 10 | 6.0
  Degree | “Sars is an academic report that monitors pace rate, gpa, and credit hours”. | 5 | 3.0
  N/A or Did not answer the question | “Im freashman” | 1 | 0.6
Purpose
  Communication | “It is a notice, that when you are on the verge of failing your class, what the reasons for that are, and how to fix it in the future”. | 118 | 71.1
  Not mentioned | “It is the reports that teachers sometimes give students” | 31 | 18.7
  Evaluation | “it is a measure of your academic success. it may be based on your grade point average”. | 10 | 6.0
  Documentation | “An academic status report is a document or summary that provides an overview of a student’s current academic standing and progress within a specific academic program or institution”. | 6 | 3.6
  N/A or Did not answer the question | “Im freashman” | 1 | 0.6
Table 4. Why do you think Academic Status Reports (ASRs) as an official university tool were created? (n = 102).
Reason | Example | n | %
General Progress | “Make it more effective and easier for students to track their progress”. | 59 | 57.8
Help improve | “To help students assess changes they need to make in order to be successful in a course”. | 21 | 20.6
Other | “To make sure that students have transparency on when they’re going to graduate, and what they need to do to get to the finish line”. | 17 | 16.7
Grades | “To make sure students stay on top of their grades”. | 12 | 11.8
Negative feedback | “I think they were created to bring awareness to a student when they are in danger in academics”. | 8 | 7.8
Unclear | “to track teaching levels” | 7 | 6.9
Notify Advising | “To notify students and academic advisor of how they are doing” | 1 | 1.0
Table 5. Other than instructors or students, who do you think uses the information contained in Academic Status Reports (ASRs)? (n = 168).
User | Example | n | %
Administration/The university | “The university to study which classes are the most difficult and which are easier”. | 65 | 38.7
Advisors | “Academic advisors” | 36 | 21.4
Parents | “parents” | 31 | 18.5
Jobs/Employers | “probably jobs that have internships” | 15 | 8.9
Faculty | “I think that professors also can use ASRs”. | 10 | 6.0
Don’t Know/Not sure | “im not sure” | 10 | 6.0
Financial aid/Scholarship organizations | “Financial aid and scholarships” | 9 | 5.4
Other | “The government” | 8 | 4.8
Students | “Anyone who wants to look into the classes they would need if they wanted to pursue a major or career”. | 8 | 4.8
Other colleges/universities | “I think faculty and other colleges potentially”. | 6 | 3.6
N/A/No one | “N/A” | 5 | 3.0
Researchers | “I would maybe guess researchers to read or understand how a class as a whole is doing or the department in which the class is in”. | 4 | 2.4
TAs | “I think that TA’s use it”. | 3 | 1.8
Table 6. What best represents your belief about Academic Status Reports (ASRs)? (n = 168).
Belief | n | %
Receiving an Academic Status Report indicates something negative about your academic performance | 65 | 38.7
Receiving an Academic Status Report indicates something positive about your academic performance | 36 | 21.4
Receiving an Academic Status Report can indicate something negative or something positive about your academic performance | 15 | 8.9
Table 7. Experience Receiving an ASR.
Question | Response | n | %
Please indicate which option is true for you during your time at ASU. (n = 168)
  I have received a positive Academic Status Report (ASR) from an instructor or teaching assistant | 40 | 23.8
  I have received a negative Academic Status Report (ASR) from an instructor or teaching assistant | 37 | 22.0
  I have never received an Academic Status Report (ASR) from an instructor or teaching assistant | 36 | 21.4
  I have received both a negative and a positive Academic Status Report (ASR) from an instructor or teaching assistant | 33 | 19.6
  Prefer not to answer | 22 | 13.1
Please indicate which option is true for you during your time at ASU. (n = 77)
  I have received more than 1 ASR | 38 | 49.4
  I have received 1 ASR | 35 | 45.5
  Prefer not to answer | 4 | 5.2
Table 8. How did you find out that you had received an Academic Status Report (ASR)? (n = 110).
Method | n | %
MyASU | 85 | 77.3
Email | 23 | 20.9
Other | 2 | 1.8
Phone call | 0 | 0
Table 9. How did you feel when you found out that you received an Academic Status Report (ASR)? (n = 110).
Feeling | Example | n | %
I felt bad | “upset because I knew I didnt deserve it” | 48 | 43.6
I didn’t really feel anything/apathy | “Nothing really just saw the notification for the ASR and read the report”. | 25 | 22.7
I felt academic pressure | “Motivated to do better” | 23 | 20.9
I felt good | “my first ASR was congratulating me on my hard work which motivated me to do the same across the board” | 13 | 11.8
Other | “I didn’t know what it was the first time I saw it so I was confused”. | 9 | 8.2
I felt informed | “I knew the current status of my learning” | 8 | 7.3
I felt the ASR was expected | “I understood why. I hadn’t submitted a single assignment for well over a month and was on track to failing”. | 8 | 7.3
I felt noticed | “I liked getting feedback from the instructor, them willing to help me meant they believed in me and helped with what I could improve on”. | 4 | 3.6
I felt the ASR was unexpected | “Suprised” | 3 | 2.7
Table 10. When you found out that you had received an Academic Status Report (ASR), what, if anything, were you concerned about? [Check all that apply] (n = 109).
Concern | “I have received a negative ASR” (n = 37), n (%) | “I have received both a negative and a positive ASR” (n = 33), n (%) | “I have received a positive ASR” (n = 39), n (%)
Your parents/caregivers learning that you had received an Academic Status Report | 21 (56.8) | 14 (42.4) | 8 (20.5)
Your classmates learning that you had received an Academic Status Report | 6 (16.2) | 3 (9.1) | 1 (2.6)
Your instructor’s perception of you as a student | 16 (43.2) | 20 (60.6) | 10 (25.6)
Maintaining your GPA | 30 (81.1) | 28 (84.8) | 17 (43.6)
Failing the class | 29 (78.4) | 24 (72.7) | 12 (30.8)
Your permanent academic record | 23 (62.2) | 16 (48.5) | 9 (23.1)
Your ability to remain in your major | 10 (27.0) | 12 (36.4) | 2 (5.1)
Your ability to finish your degree | 13 (35.1) | 13 (39.4) | 1 (2.6)
Other | – | – | 6 (15.4)
Table 11. What kind of information were you looking for? (n = 25).
Information | Example | n | %
What should I do next | “On what my next steps should be” | 12 | 48.0
Information about ASRs | “what an ASR was and if it was bad” | 7 | 28.0
Why I got it | “Why was I getting dropped from a class, why is the maximum absences the number it is”. | 5 | 20.0
Other | “I was just concerned about the class specifically, so I sought help from those I thought fit best”. | 2 | 8.0
N/A/None | “none” | 1 | 4.0
Table 12. Behavioral Response.
Question | Response | n | %
Did you seek out additional information about your academic status report? (n = 110)
  No | 82 | 74.5
  Yes | 25 | 22.7
  Prefer not to answer | 3 | 2.7
From where did you seek out information? Please check all that apply. (n = 25)
  Instructor or teaching assistant | 17 | 68.0
  Academic advising | 14 | 56.0
  Official ASU website | 8 | 32.0
  A friend via a face-to-face conversation, text, or instant message | 6 | 24.0
  Social media | 1 | 4.0
  ChatGPT or other generative AI tools | 1 | 4.0
  Other: ____ | – | –
Did you reach out to the instructor or the teaching assistant who submitted the Academic Status Report (ASR) on your behalf about your Academic Status Report? (n = 110)
  No | 71 | 64.5
  Yes | 39 | 35.5
Did the instructor or teaching assistant who submitted the Academic Status Report (ASR) on your behalf follow up with you (e.g., via an email, face-to-face conversation) about your Academic Status Report? (n = 110)
  No | 67 | 60.9
  Yes | 43 | 39.1
Did an academic advisor follow up with you about your Academic Status Report (ASR)? (n = 110)
  No | 81 | 73.6
  Yes | 29 | 26.4
Did you follow up with an academic advisor about your Academic Status Report (ASR)? (n = 110)
  No | 85 | 77.3
  Yes | 25 | 22.7
Did you drop or withdraw from the course pertaining to your Academic Status Report (ASR)? (n = 110)
  No | 89 | 80.9
  Yes | 21 | 19.1
Did you change any aspect of your personal behavior (e.g., quit your part-time job, start going to bed earlier) in response to the Academic Status Report (ASR)? (n = 110)
  No | 48 | 43.6
  Yes | 40 | 36.4
  I don’t remember | 22 | 20.0
Did you change any aspect of your academic behavior (e.g., increase your attendance at class, begin studying with friends) in response to the Academic Status Report (ASR)? (n = 110)
  Yes | 65 | 59.1
  No | 35 | 31.8
  I don’t remember | 10 | 9.1
Table 13. What did you change about your academic behavior? (n = 65).
Behavior | Example | n | %
Study behaviors | “Started studying harder and completing my work on time” | 38 | 58.5
Attendance | “Started going to class more” | 13 | 20.0
Course Participation | “I started participating in class discussions more”. | 13 | 20.0
Life behavior changes | “Created more time for that class”. | 3 | 4.6
Nothing | “Created more time for that class”. | 2 | 3.1
Other | “I worked harder to assist other students who may have been struggling in the class whenever I saw an opportunity to do so”. | 2 | 3.1
Unclear/not mentioned | “n/a” | 2 | 3.1
Table 14. What did you change about your personal behavior? (n = 40).
Behavior | Example | n | %
Study behaviors | “Started taking the classes more seriously”. | 19 | 47.5
Attendance | “Went to class more”. | 10 | 25.0
Personal habits | “Created a better schedule to maintain a good standing in class” | 9 | 22.5
Assignments | “Made me put more focus on what I needed to work on, in this case discussion posts”. | 6 | 15.0
Unclear/Not mentioned | “I think i was disapointed in myself a lot i began to seem sad/unmotivated”. | 3 | 7.5
Changes to degree | “I changed my major and realized that major wasn’t for me”. | 2 | 5.0
Other | “nothing” | 2 | 5.0
Table 15. What advice would you give someone who had just received an Academic Status Report (ASR)? (n = 168).
Advice | Example | n | %
Read/listen/understand the ASR | “To listen to the report and work on it”. | 47 | 28.0
Get external help | “To talk to the professor about what they can do to get their grade up”. | 41 | 24.4
Don’t stress | “To not stress about it because it could be good and bad and overall its just to help you” | 41 | 24.4
Do more | “Work hard, change now! sooner rather than later!” | 27 | 16.1
Do something specific to change | “take a step back, reevaluate, and come back stronger than before”. | 25 | 14.9
Persist | “to not give up” | 13 | 7.7
Don’t know | “no clue” | 9 | 5.4
Other | “It’s a way of improvement”. | 6 | 3.6
Do nothing | “If it is positive: Keep doing what you are doing. If it is negative: Read it, and really sit with what your professor is trying to tell you”. | 3 | 1.8
Table 16. How would you change Academic Status Reports (ASRs) to improve the experience of using or receiving them? (n = 168).
Change | Example | n | %
No changes | “I wouldn’t change anything” | 45 | 26.8
Don’t know | “I am not sure”. | 17 | 10.1
Have an instructor talk to you | “I would make sure the instructor or TA has a requirement to reach out or the student should to understand the report’s specificity”. | 12 | 7.1
More positive | “More feedback for positive ASR and not just negative ones” | 10 | 6.0
Less negative | “I would make it sound less threatening as I thought at first I was gonna get dropped from the course”. | 9 | 5.4
Send ASRs via different communication streams | “I would make an announcement on canvaas, as im more likely to look at it”. | 8 | 4.8
No response | “n/a” | 8 | 4.8
Make ASRs more evident | “make it an alert” | 7 | 4.2
Have them sent out earlier or more frequently | “Perhaps make them more common? I honestly can’t remember any other time I have gotten one, aside from once from my professor. More feedback for how a student may be doing in a class could be helpful”. | 7 | 4.2
Visual implementation in MyASU (student portal) | “Don’t make them stay there forever if you recover the grade” | 5 | 3.0
Meet with people/student follow-up | “To always have a follow up with the student (if it was negative)” | 5 | 3.0
Change the implementation of ASRs | “I would make them more accessible to students” | 5 | 3.0
Have an advisor talk with you | “have it set up a mandatory meeting with advisor” | 3 | 1.8
Remind students of an ASR’s purpose when receiving one | “I would exploit more information on what they are used for” | 3 | 1.8
Change their name | “Change the name of it. Comes off as negative just from receiving an alert even if it’s a good thing”. | 2 | 1.2
Get rid of ASRs | “Don’t send them out if your doing well because it causes stress that it unneccessary” | 2 | 1.2
Make ASR experience more constructive | – | – | –