1. Introduction
The ability to read critically and effectively is essential for academic success in higher education. However, recent reports from American College Testing (ACT, Inc., 2024) covering 2019–2024 highlight a concerning decline in students’ reading proficiency: only 40% of students met the ACT College Readiness Benchmark in Reading in 2024, the lowest level observed in the past five years. This trend underscores the urgent need for evidence-based, innovative approaches to improving literacy skills, particularly in digital learning environments, where challenges such as fragmented attention, reduced comprehension, and increased screen exposure demand targeted pedagogical responses.
Reading comprehension is a multifaceted process supported by the strategic application of various cognitive strategies. These strategies are deliberate, goal-oriented actions that readers use to monitor and enhance their comprehension, particularly when reading complex or unfamiliar texts. While the distinction between reading skills and strategies is often blurred in practice, strategies are generally characterized by intentionality and metacognitive awareness (Afflerbach et al., 2008).
Summarizing, for example, involves condensing the main ideas of a text into a concise restatement, which not only supports comprehension but also aids in information retention. This active process, as described by Brown and Day (1983) and Duke and Pearson (2009), requires identifying key points, eliminating irrelevant details, and integrating central ideas in a meaningful way. Questioning is the process of generating queries about the content or language to clarify meaning and stimulate critical thinking.
King (1994) emphasizes the importance of student-generated questions, moving beyond passive reception toward active engagement, particularly through strategies such as reciprocal peer questioning. This process not only helps clarify meaning but also supports understanding the author’s purpose and making predictions.
Evaluating refers to making judgments about the value, accuracy, or relevance of information, often grounded in personal knowledge or external criteria. As Afflerbach et al. (2008) suggest, evaluating is a metacognitive skill that involves assessing the task at hand and monitoring one’s comprehension throughout the reading process. Making judgments based on criteria, being open-minded, and considering multiple perspectives are crucial aspects of this strategy. Making connections entails linking text content to prior knowledge, personal experiences, or other texts to construct meaning.
Keene and Zimmermann (1997) categorize these connections as text-to-self, text-to-text, and text-to-world, emphasizing their role in enhancing comprehension and engagement by relating the unfamiliar text to relevant background knowledge.
Predicting is the act of anticipating upcoming content based on textual cues and prior knowledge, which promotes active engagement. Duke and Pearson (2009) highlight that this strategy allows readers to leverage existing knowledge, use textual clues, and continuously evaluate and revise predictions as they read, thereby deepening comprehension. Inferring requires deducing implicit meaning by integrating textual clues with background knowledge.
McNamara (2007) views inferring as going beyond the literal, integrating information from the text and prior knowledge to build a coherent understanding and fill in gaps left by the author. Finally, for bilingual learners,
Jiménez et al. (1996) define translating and defining as metalinguistic strategies in which learners clarify meaning by cross-linguistic mediation or by providing explicit explanations of vocabulary.
Collectively, these strategies empower readers to actively engage with text, monitor their understanding, and construct meaning at a deeper level. However, even advanced language learners may encounter persistent challenges when working with academic texts. For example,
Velásquez (2022) conducted a study comparing Spanish heritage and second language learners and found that the ability to summarize argumentative texts was generally weak across groups, regardless of linguistic background. These findings highlight the importance of explicit instruction in reading strategies that foster awareness of text structure and support students in distinguishing essential from irrelevant information—skills that are critical for both comprehension and academic writing.
As digital tools reshape literacy practices, digital social reading (DSR) has emerged as a promising pedagogical approach.
Blyth (2014) defines DSR as “the act of sharing one’s thoughts about a text with the help of tools such as social media networks and collaborative annotation.” He explains that DSR can be used publicly through websites like Goodreads or privately in classroom settings using platforms such as Perusall, Hypothes.is, SocialBook, eComma, and others.
In classroom settings, this practice often takes the form of social digital annotation (SDA), which allows learners to comment directly on texts and interact with one another’s ideas in real time or asynchronously.
Egger (2022) defines SDA as a “note added to a text” that may serve several purposes—providing information, sharing commentary, expressing power, sparking conversation, and aiding learning. This approach promotes a participatory and dialogic reading experience, enabling students to construct meaning collaboratively while developing cognitive and metacognitive strategies.
Building on the theoretical and pedagogical promise of SDA, a growing body of empirical research has explored how these tools function in practice. Studies have examined how collaborative annotation technologies impact student engagement, comprehension, and critical thinking across diverse instructional settings. This research supports the idea that social annotation not only enhances text-based interaction but also fosters deeper cognitive processing.
For instance, several studies underscore the value of digital annotation tools in enhancing students’ reading abilities and critical thinking skills in higher education.
Nor et al. (2013), in a study with L2 English learners enrolled in a first-year undergraduate program, found that students engaged in active reading using an online annotation tool by making annotations such as forming questions, linking information, expressing agreement or disagreement, and writing summaries—demonstrating higher-order thinking skills like analyzing and comparing.
The benefits of DSR—and specifically of SDA—are widely accepted among educators as key to promoting active reading practices.
Blyth (2014) identifies numerous pedagogical affordances of DSR, emphasizing its potential to enhance collaborative learning by enabling shared interpretation and collective understanding. He also argues that DSR helps distribute cognitive load through shared annotations, allowing students to scaffold one another’s learning. These dialogic interactions deepen engagement and synthesize multiple literacy practices—reading, annotating, and discussing—into a cohesive process.
Supporting this perspective, Egbert et al. (2022), in a study of graduate students in three asynchronous online courses, found that DSR transformed individual reading into a participatory experience, increasing student engagement with academic texts. The authors stress that effective implementation depends on clear instructional goals, thorough training, meaningful student choice, and sustained instructor involvement.
Beyond individual studies, Novak et al.’s (2012) literature review synthesizes findings from studies involving primarily L1 English learners in postsecondary settings, identifying key affordances of social annotation tools. The review found a positive impact on reading comprehension, critical thinking, and metacognitive awareness, especially when social annotation was embedded within collaborative, task-specific activities. Importantly, it noted that while social annotation fosters student engagement and reflective learning, its effectiveness depends on factors such as tool usability, instructional design, and adequate student training.
Zhu et al. (2020) highlight that SDA fosters collaborative learning environments, aligning with Vygotsky’s (1962) social constructivist theory, which emphasizes the central role of language and social interaction in learning. From this perspective, SDA functions as a space where meaning is co-constructed through peer dialogue, underscoring the importance of social engagement in academic reading.
This social dimension of SDA is further reinforced by Adams and Wilson (2020), who examined peer engagement among graduate students in an asynchronous online literacy course using Perusall. Their design-based study showed that both text-based and peer interactions increased over the semester, with peer interaction growing at a faster rate. This trend suggests that Perusall fosters a sense of community by supporting authentic, student-to-student engagement that mirrors the interactional dynamics of in-person classrooms.
While research on DSR in L2 instruction is expanding, studies focused specifically on L2 Spanish classrooms remain limited. Among the few available, Thoms and Poole (2018) analyzed how L2 Spanish learners in a college-level literature course annotated poems collaboratively using a digital tool, identifying literary, social, and linguistic affordances. Their findings suggest that DSR enhances both interpretive and collaborative skills, offering valuable insights for language instruction.
Similarly, Zapata and Mesa Morales (2018) investigated intermediate L2 Spanish learners at a U.S. university using the eComma platform for collaborative annotation. Their mixed-methods research showed that eComma supported reading comprehension, vocabulary development, and cultural understanding. Despite minor technical issues, students reported that the tool facilitated deeper engagement with both texts and peers—an essential component of 21st-century literacy.
Although recent studies underscore a shift from traditional reading approaches to more participatory digital literacies, there is still a notable lack of research on how digital annotation is used in classrooms that serve both Spanish heritage language (SHL) learners and second language (L2) learners. This gap is especially important given the changing demographics of Spanish programs across the United States, where classrooms increasingly include students with diverse linguistic profiles—some who learned Spanish informally at home, and others who are acquiring it as a foreign or second language.
The urgency of this line of inquiry stems from a growing pedagogical need: instructors are frequently tasked with designing instruction that is simultaneously effective, equitable, and engaging for students whose language development paths differ significantly. SHL and L2 learners often sit side-by-side in the same classrooms, but bring distinct strengths, challenges, and prior experiences to academic reading tasks—differences that annotation tools may either amplify or bridge. Yet, research rarely investigates how these two groups interact with texts through digital annotation, or how their linguistic backgrounds shape cognitive engagement.
Research has shown that SHL and L2 learners exhibit distinct linguistic profiles that influence how they approach language tasks. Heritage learners, for example, tend to acquire Spanish informally through family and community interactions, without formal literacy instruction, which often results in limited awareness of grammar rules and metalinguistic terminology. At the same time, they typically demonstrate stronger oral comprehension, pronunciation, and fluency compared to their L2 peers (Carreira & Kagan, 2011). However, despite these well-documented linguistic contrasts, research comparing how SHL and L2 learners approach reading tasks remains limited. Few studies have examined how these groups apply cognitive and metacognitive strategies to support reading comprehension.
To better understand the distinct experiences of heritage language learners, MacGregor-Mendoza and Moreno (2021) challenge traditional deficit-based views of literacy that frame SHL learners as lacking reading skills in Spanish. This misrepresentation, they argue, stems from early SHL research that emphasized learners’ limited exposure to formal texts while overlooking their rich, existing literacy practices. By reconceptualizing reading as both a cognitive and social activity—shaped by bilingualism and cultural identity—they highlight how conventional cognitive models, such as bottom-up decoding and top-down inferencing, inadequately capture the complexity of SHL literacy development in diverse bilingual contexts.
Similarly, Velázquez (2017) illustrates how college-aged heritage speakers actively use Spanish in informal digital contexts such as texting, music, and social media. These informal practices demonstrate meaningful literacy engagement that often goes unrecognized in academic settings. Together, these findings reinforce the value of integrating culturally relevant, digitally mediated tasks into SHL instruction. In particular, they underscore the potential of digital annotation as a bridge between students’ lived linguistic practices and the academic literacy expectations they encounter in university classrooms.
Informed by these previous studies and the clear need for differentiated instructional approaches in linguistically diverse classrooms, this research explores collaborative digital annotation as a meaningful way to examine both the cognitive and social dimensions of reading among SHL and L2 learners. Through collaborative annotation tasks, students engage in practices such as summarizing, questioning, paraphrasing, evaluating, and making connections. These practices reflect not only cognitive processing but also cultural identity, peer interaction, and prior language experience.
This framework offers a lens to observe how reading behaviors are shaped by linguistic background, revealing both distinct and overlapping patterns across SHL and L2 learners. It facilitates a nuanced understanding of the interplay between strategy use, identity, and engagement during academic reading. Ultimately, this study views literacy as more than just decoding; it is a socially embedded, culturally informed activity that digital annotation can uniquely illuminate.
Based on this conceptual framing, the study addresses the following research questions:
RQ 1—How do SHL and L2 learners cognitively engage with texts—and with each other—through collaborative digital annotations on the Perusall platform? Is there evidence of a developing community of readers?
RQ 2—What cognitive strategies (e.g., making connections, summarizing, questioning, inferring, evaluating, predicting) do SHL and L2 learners demonstrate in their annotations? Are there notable differences or similarities in how each group employs these strategies?
RQ 3—How do students perceive the impact of collaborative digital annotation on their academic language development and reading engagement in an advanced Spanish course?
3. Results
3.1. Annotation Patterns
This section addresses Research Question 1: How do SHL and L2 learners cognitively engage with texts—and with each other—through collaborative digital annotations on the Perusall platform? Thematic patterns in student annotations are examined to understand both cognitive engagement and evidence of a developing community of readers.
After collecting the annotations, a systematic analysis was conducted to identify recurring patterns. This analysis resulted in the classification of comments into two primary categories: Textual Interactions (TI) and Peer Interactions (PI). The coding categories were developed using an inductive approach, grounded in repeated observation of the most common types of student annotations. While informed by existing research on cognitive reading strategies, the final categories emerged directly from the data.
These strategies reflect students’ cognitive engagement with the text, as they involve interpretation, evaluation, and meaning-making. The inclusion of peer interactions further captures how students collaboratively built understanding, providing evidence of dialogic reading and the emergence of a reading community.
The following section presents representative examples of the most frequent interaction types, illustrating the range of underlying cognitive processes they reflect. English translations are provided in square brackets for each example.
3.1.1. Interactions with the Text (TI)
Interactions with the text encompass annotations where students directly engaged with the reading material by evaluating content, making inferences or connections, summarizing, paraphrasing, or posing questions, among other cognitive processes. Below are illustrative examples of these interactions:
Making Connections
“Jorge Ramos es alguien a quien creo que cada hispano debe conocer, no solo por sus reportajes en Univisión, sino por mucho más… La manera en que él habla y se expresa es el nivel académico que yo quiero alcanzar. Un día voy a poder hablar con confianza y fluidez, y muchos estarán muy impresionados.”
[Jorge Ramos is someone every Hispanic should know, not just for his reporting on Univisión, but for much more… The way he speaks and expresses himself is the academic level I aspire to reach. One day, I will be able to speak with confidence and fluency, and many will be very impressed.]
The student’s commentary on Jorge Ramos, the author of the assigned reading, exemplifies a text-to-self connection, highlighting a personal engagement with the content. Such connections occur when readers relate the material directly to their experiences, thereby enhancing comprehension and fostering deeper cognitive engagement. In this instance, the student not only acknowledges Ramos’s significance but explicitly expresses a desire to emulate his communicative competence, underscoring a profound personal connection to the text.
Students also demonstrated text-to-text or text-to-content connections, as illustrated by the following annotation:
“Como el video del ‘hielo’ que vimos la semana pasada. Esto se relaciona en el miedo que causa el servicio de inmigración a la gente.”
[AC-F-02-HL]
[Like the “Ice” video we watched last week. This relates to the fear that immigration enforcement causes people.]
Here, the student connects the current reading with previously viewed class content, reinforcing the cognitive process of linking new information with prior knowledge. Such associations enhance understanding and retention by situating new insights within an existing knowledge framework.
Summarizing
“La idea principal de este párrafo es que el spanglish se trata de una invasión de grave peligro.”
[YO-F-02-HL]
[The main idea of this paragraph is that Spanglish represents a gravely dangerous invasion.]
This annotation exemplifies the cognitive skill of summarization, which involves distilling complex or detailed information into its essential elements. By succinctly capturing the core message, the student demonstrates an ability to identify and express the author’s central argument clearly and efficiently. Summarizing is integral to deep reading comprehension, as it necessitates active engagement with the text, discernment between main ideas and supporting details, and precise articulation of key concepts in the student’s own words.
Questioning
“Me pregunto que podra ser una solucion para estas clinicas. Estoy segura que no tienen los recursos pero que se tiene que hacer para recibirlos.”
[JV-F-03-HL]
[I wonder what a solution for these clinics could be. I am sure they lack the resources, but what needs to be done to obtain them?]
“¿Disparidad es otra manera de decir diferencia?”
[MV-F-03-HL]
[Is ‘disparity’ another way of saying ‘difference’?]
Students employed the cognitive skill of questioning both content and vocabulary, reflecting active engagement with the text. This approach promotes deeper comprehension and critical thinking. It also supports independent learning, fosters peer discussion, and strengthens lexical development. By interacting with the text through questioning, students reinforce their inferential, comparative, and vocabulary application abilities in academic contexts.
Inferring Unknown Vocabulary
“Rotulos significado es como ‘titulo’ eso fue lo que entedi por las claves del texto.”
[MV-F-03-HL]
[The meaning of ‘rótulos’ is similar to ‘title’; that’s what I understood based on context clues from the text.]
This example illustrates the cognitive skill of inferring, where a student used contextual clues to deduce the meaning of an unfamiliar term. Although the term “rótulos” was not explicitly defined, the student successfully connected it with prior knowledge and textual context to formulate a reasonable interpretation. Inferring supports deeper textual comprehension and critical thinking by prompting learners to independently bridge gaps in understanding.
Evaluating
“Esto si es verdad. Mis padres cuando van a algunos lugares batallan porque las barreras en la comunicación. Necesitamos mas gente bilingüe para que ayuden a la comunidad latina.”
[HE-F-03-HL]
[This is true. When my parents go to certain places, they struggle due to communication barriers. We need more bilingual people to assist the Latino community.]
“Si de acuerdo, los indocumentados sí son bien trabajadores y luchan para salir adelante. Nadien le quita nada a nadien solo que hay unas personas que son arraganes y los corren por no querer trabajar. Entonces se quedan con los indocumentados que son bien trabajadores.”
[HE-F-03-HL]
[Yes, I agree that undocumented individuals are indeed hardworking and strive to get ahead. No one takes anything from anyone; it’s just that some people are lazy and get fired for not wanting to work. So, they keep the undocumented workers who are very hardworking.]
These students’ comments exemplify the cognitive skill of evaluating, which involves making judgments or assessments about the content based on personal experiences, values, or beliefs. By critically evaluating statements within the text, students demonstrate their ability to reflect on and interpret information, connecting it meaningfully with their lived experiences. This cognitive skill enhances deeper engagement with the material, encourages thoughtful analysis, and supports learners in forming informed opinions grounded in personal perspectives and critical thinking.
Predicting
“Estoy segura que unos centavos no tendran mucho efecto, pero seguire leyendo para ver el efecto.”
[JV-F-03-HL]
[I’m sure a few cents won’t have much of an effect, but I’ll keep reading to see the impact.]
“Siento que la imagen de la niña y la abuela está ahí para mostrarnos lo cerca que somos de la familia y debemos cuidarnos porque una nueva generación en nuestra familia nos está viendo crecer y seguimos sus pasos.”
[MH-F-02-HL]
[I feel that the image of the girl and the grandmother is there to show us how close we are to family, and that we should take care of ourselves because a new generation in our family is watching us grow, and we follow in their footsteps.]
These annotations demonstrate the cognitive skill of predicting, where students anticipate or hypothesize about forthcoming information or potential outcomes based on textual clues (including images) and their existing knowledge. This skill encourages active reading and engagement, prompting students to remain attentive and curious as they verify or adjust their predictions while reading, thus enhancing comprehension and deeper interaction with the text.
Finally, a significant portion of the annotations focused on vocabulary. Students provided definitions for unfamiliar words and translated new vocabulary into English, demonstrating efforts to build lexical understanding and supporting peers in making meaning from the text. Below are examples of such annotations, with the target word highlighted in bold.
Word Defining
Amnistía: “Proceso legal por el cual se condonan y defienden delitos.”
[SA-F-03-HL]
[Amnesty: a legal process through which crimes are pardoned and forgiven]
Partidarios: “El que defiende una idea, una tendencia, un movimiento o persona.”
[SA-F-03-HL]
[Partidarios: Someone who defends an idea, a trend, a movement, or a person.]
Translating
Clínicas geriátricas: “Old person homes; retirement homes; geriatric clinics. La frase es un cognado.”
[CS-M-02-L2]
Hallazgo: “En inglés la palabra es ‘finding.’ En el contexto, ‘An interesting finding is…”
[CS-M-02-L2]
These annotations reflect foundational cognitive strategies such as clarifying meaning and facilitating comprehension through linguistic mediation.
Next, examples from the second major category—peer interactions—are presented, followed by an analysis of the frequency with which each group employed these cognitive strategies. The most prevalent types of interactions are highlighted.
3.1.2. Interactions with Peers (PI)
These were comments in response to a classmate’s contribution. We categorized these interactions with peers into two different types.
Extensions
Comments where students respond by agreeing, disagreeing, and/or extending a classmate’s contribution. For example, one student agreed with and expanded on a peer’s comment about the reasons why immigrants change their names to assimilate into the receiving culture:
“Estoy completamente de acuerdo contigo, me parece muy triste que muchos inmigrantes prefieran cambiar su nombre. Todos tienen razones diferentes, pero la más común que yo he visto es que a veces nuestros nombres son complicados para pronunciar para los Estadounidenses y entonces nosotros tendemos a cambiar la pronunciación y el nombre completamente.”
[IG-F-03-HL]
[I completely agree with you; I find it very sad that many immigrants prefer to change their names. Everyone has different reasons, but the most common one I’ve seen is that sometimes our names are difficult for Americans to pronounce, so we tend to change the pronunciation or completely alter the name.]
Through extensions, students demonstrate deeper engagement with peer-generated ideas, reinforcing or enriching discussions by adding personal experiences, perspectives, or further analysis.
Content Questions
These are questions about the reading content of the text, posed directly to peers.
“En las noticias, he visto políticos, republicanos y demócratas, usan el término “latino”, (no “hispano”). Estos políticos suelen ser blancos. Mis compañeros de clase, ¿con qué término preferirían que los llamaran?”
[CS-M-02-L2]
[In the news, I have seen politicians, both Republicans and Democrats, use the term ‘Latino’, not ‘hispanic.’ These politicians are usually white. My classmates, which term would you prefer to be called?]
“¿Qué piensas sobre la opinión del autor?”
[MH-F-02-HL]
[What do you think about the author’s opinion?]
Such questions reflect a genuine desire to foster academic dialogue, clarify interpretations, and stimulate critical reflection on the text’s central themes or arguments.
These categories show how students interact with the text and their peers, providing deeper insights into their academic discourse and engagement. The coded annotations and tabulated data provided a structured framework for the subsequent statistical analysis and discussion of the study’s findings.
3.2. Interaction Patterns Across Groups (RQ 1)
To address the first research question, this section analyzes how SHL and L2 learners engaged with texts and peers through collaborative annotations on the Perusall platform.
Table 1 presents descriptive engagement metrics obtained from Perusall for each group. In this study, viewing time refers to the total average duration students had the content open, while active engagement measures real-time interaction (e.g., mouse movement or typing activity at least once every two minutes). HL students spent more time viewing annotations on average (M = 638 min), while L2 students demonstrated slightly higher active engagement (M = 40 min) per reading session. Both groups averaged similar numbers of text interactions (TI) and peer interactions (PI), with a mean of 3 comments in each category.
Given the small sample size of the L2 group (n = 4), this analysis avoids statistical comparisons and instead focuses on descriptive trends to offer a preliminary understanding of engagement patterns. These metrics are presented in Table 1 alongside measures of variance (standard deviation, minimum, and maximum values).
The results illustrate variation in how each group participated in digital annotation. HL learners engaged with content for longer durations, which may indicate more reflective reading patterns. In contrast, L2 learners exhibited more concentrated interaction within shorter sessions. The extended viewing time among HL learners may also reflect a greater need for cognitive processing, as they often have less exposure to academic Spanish. Unlike their L2 peers—who typically develop reading strategies in structured classroom settings—HL learners may lack formal training in navigating academic texts. This instructional gap likely contributes to the longer engagement times observed, as HL students work to make sense of unfamiliar vocabulary, syntax, and discourse styles.
Interestingly, both HL and L2 learners demonstrated the same average number of Text Interactions (TI = 3) and Peer Interactions (PI = 3) per reading. This similarity invites a closer examination of how each group distributed its efforts across interaction types. While the frequency of contributions was comparable, the underlying engagement styles appeared to differ. HL learners may have distributed their annotations across longer periods, reflecting a more contemplative and iterative reading process. In contrast, L2 learners may have concentrated their responses into shorter, more intensive bursts of activity. These divergent engagement patterns provide insight into how each group approaches digital annotation and suggest differences in cognitive processing and reading behavior shaped by their distinct linguistic and educational backgrounds.
We now turn to an analysis of the overall interaction patterns across the full sample, focusing on the distribution of text interactions (TI) and peer interactions (PI) throughout the semester.
Chart 1 illustrates the types of interactions observed across each of the four thematic chapters. As shown, text interactions (depicted in dark gray) were the predominant form of engagement. However, peer interactions (shown in light gray) displayed a steady upward trend over time. In the first chapter, students averaged 2.3 peer interactions (SD = 1.7), increasing to 2.8 (SD = 1.8) in the second chapter, 2.9 (SD = 2.1) in the third, and reaching 3.3 (SD = 2.4) in the final chapter.
This upward trajectory in peer-to-peer engagement provides evidence of the potential formation of an online community of readers, as suggested by Adams and Wilson (2020). Their findings indicate that peer interactions in Perusall tend to increase more rapidly than text interactions, reflecting the emergence of authentic student-to-student engagement in asynchronous settings. In alignment with this pattern, the present study reveals that students progressed from primarily interacting with the text to actively engaging with one another—an evolution that supports collaborative meaning-making and social presence, both of which are essential elements in the development of digital learning communities.
3.3. Cognitive Strategy Use by Group (RQ 2)
To address the second research question, this section examines the cognitive strategies SHL and L2 learners used in their annotations and explores group-specific patterns in strategy use. We focus on descriptive comparisons, including raw counts, percentages, and observed trends.
As illustrated in Chart 2 and detailed in Table 2, Evaluating emerged as the most frequently used strategy in both groups, though it was more common among SHL learners (65.6%) than L2 learners (59.7%). Questioning appeared more often in L2 annotations (15.6%) than in SHL annotations (10.0%). SHL learners also showed a higher use of Connecting strategies (19.9%) compared to L2 learners (12.9%), whereas L2 learners demonstrated slightly greater use of Inferring, Summarizing, and Predicting.
These patterns suggest group-specific tendencies: SHL learners appear to favor strategies that involve personal reflection and evaluative engagement, potentially tied to their cultural familiarity with the topics. In contrast, L2 learners may rely more on questioning and inferential reasoning to aid comprehension, reflecting their developing proficiency and need to clarify meaning.
When examining less frequent but pedagogically significant strategies—Defining, Translating, and Inferring—further contrasts emerge (see Chart 3). L2 learners used Translating at a notably higher rate (8.5%) than SHL learners (0.5%), indicating a greater dependence on cross-linguistic transfer for comprehension. Conversely, Defining occurred only among SHL learners (2.7%), possibly reflecting a more metalinguistic approach to deciphering academic vocabulary.
Though the sample size of L2 learners is small, these descriptive findings suggest relevant pedagogical implications. Instructors might support HL learners in deepening their inferential skills while helping L2 learners move beyond translation toward more reflective, contextually grounded strategies.
In sum, the observed patterns in strategy use offer valuable insights into how SHL and L2 learners engage cognitively during digital annotation tasks.
3.4. Findings from the Perceptions Survey (RQ 3)
To respond to the third research question, this section presents students’ perceptions of the impact of collaborative annotation tasks on their reading and writing development, as reported in end-of-semester surveys. Students’ responses highlighted several perceived benefits of engaging in digital reading and collaborative annotation.
Table 3 summarizes the areas where students reported additional progress, along with the corresponding mean scores.
While the perceptions data are reported in aggregate form, a brief exploratory review of SHL and L2 learners’ responses revealed broadly similar patterns across groups. Given the small number of L2 participants, disaggregating the data was not statistically meaningful; however, both groups generally expressed positive perceptions of collaborative digital annotation. These similarities suggest that the perceived benefits of annotation—such as vocabulary development, motivation, and clarification of doubts—were shared by learners with different linguistic backgrounds.
The area of greatest reported progress was vocabulary acquisition, with a mean score of 4.27 (SD = 1.03). Many students emphasized that collaborative annotation helped them learn new vocabulary in context, enhancing their reading comprehension and confidence. For instance, one participant shared, “I personally found it super helpful to have collaborative annotations because I could expand on ideas that my classmates had. It also helped me comprehend words and phrases that I did not know while reading.” This benefit also appeared to enhance motivation to read in Spanish, which received a mean score of 4.24 (SD = 1.08).
Motivation was often linked to peer interaction and engagement with relevant topics. As one student explained, “I found it very interesting and a fun way to learn about different topics as well as connecting with certain classmates. I don’t do much reading in Spanish, and this helped me understand the importance of reading in Spanish.”
Clarifying doubts and improving grammar (M = 4.22, SD = 1.04) and spelling (M = 4.14, SD = 1.12) were also perceived as valuable outcomes. Students described how the social reading environment supported problem-solving and peer-to-peer clarification: “I believe collaborative annotations are a tremendous idea to be able to analyze a reading and achieve a better comprehension by interacting with classmates.” Another added, “It was helpful to see others’ ideas and be able to create or continue conversations with peers.”
Finally, students reported increased awareness of academic register (M = 4.00, SD = 1.08), noting a decrease in informal language use. One participant noted, “I feel digital annotations have helped me improve my academic voice in Spanish and share my thoughts with others.” This shift toward more formal expression suggests that the collaborative annotation process helped students refine their writing tone in line with academic expectations.
4. Discussion
To address the first research question, this study examined how SHL and L2 learners cognitively engage with texts and peers through collaborative digital annotations on Perusall. Despite differences in session duration, both groups averaged the same number of text and peer interactions (TI = 3; PI = 3) per assigned reading, suggesting a shared willingness to engage with content and one another. SHL learners demonstrated markedly longer viewing times (M = 638 min) than their L2 peers (M = 226 min), while L2 learners showed higher levels of active engagement per session (M = 40 min vs. M = 32 min). These results suggest that although both groups participated meaningfully, they employed distinct engagement patterns aligned with their linguistic profiles.
SHL learners frequently used strategies such as evaluating and making connections, often drawing on lived experience and cultural knowledge. In contrast, L2 learners favored questioning and inferencing strategies, indicating a more analytical orientation. These contrasting styles reinforce the need for differentiated instructional approaches and demonstrate that both groups are capable of deep engagement when supported by collaborative digital tools.
These findings support the pedagogical value of collaborative annotation tasks in fostering cognitive engagement across diverse learner populations. The use of community-relevant texts within a familiar, interactive platform likely contributed to more authentic student interaction. As learners annotated and responded to peers, they transformed reading into a dialogic and participatory act. For SHL learners in particular, the richness of their evaluative and reflective annotations supports MacGregor-Mendoza and Moreno’s (2021) call to move beyond deficit-based views. Rather than viewing SHL students as academically limited, this study shows they actively engage in complex knowledge construction when given access to inclusive, culturally responsive learning environments.
Moreover, a progressive increase in peer interactions over the semester provides compelling evidence of a developing community of readers. As in Adams and Wilson’s (2020) study, our findings suggest a shift from isolated text-based engagement toward sustained peer-to-peer interaction. Though the small size of the L2 group limits generalizability, their active participation and contributions to peer discussion reinforce their role in this emerging community. These results highlight the potential of digital annotation to foster academic community and promote social presence among learners with diverse linguistic trajectories.
The second research question focused on identifying the cognitive strategies used by each group and whether notable differences or similarities emerged. While Evaluating was the most frequently used strategy overall, SHL learners employed it more often than L2 learners (65.6% vs. 59.7%), along with a higher use of Connecting strategies (19.9% vs. 12.9%). These patterns suggest that SHL learners engage reflectively, drawing from personal and cultural knowledge to make meaning. This aligns with MacGregor-Mendoza and Moreno’s (2021) argument that SHL learners’ reading practices emerge from bilingual, culturally situated contexts—contexts that are often overlooked by traditional cognitive frameworks like top-down inferencing or bottom-up decoding.
In contrast, L2 learners showed greater use of Questioning (15.6% vs. 10.0%), Inferring (2.0% vs. 0.8%), and Translating strategies (8.5% vs. 0.5%)—skills that reflect an analytical orientation often developed through formal instruction. Defining, though less frequent, was used exclusively by SHL learners (2.7%), possibly reflecting an effort to clarify academic vocabulary not typically encountered in everyday language. These findings further challenge the assumption that SHL learners lack academic literacy skills; instead, they highlight that SHL and L2 learners draw on different, but equally valid, repertoires for reading in academic contexts.
These strategy preferences also reveal how differently SHL and L2 learners navigate meaning-making in digital environments. While SHL learners often construct understanding through reflection, lived experience, and cultural resonance, L2 learners engage more often through questioning and analytical decoding—approaches aligned with their formal instructional backgrounds. These distinctions emphasize the importance of designing digital reading tasks that are flexible and responsive to students’ varied literacy repertoires.
Importantly, both groups made minimal use of key strategies such as Predicting and Summarizing—skills essential for fluent academic reading and writing. This gap echoes findings by Velásquez (2022), who noted underperformance in summary writing among both SHL and L2 learners. It suggests the need for targeted pedagogical interventions that explicitly teach these strategies and provide opportunities for structured practice.
The third research question explored students’ perceptions of how digital annotation influenced their academic language development and engagement. Survey data showed strong perceived gains, particularly in vocabulary acquisition (M = 4.3), followed by motivation to read (M = 4.2). Students credited the interactive and collaborative nature of annotation with increasing their interest in and commitment to reading in Spanish. These findings align with Velázquez’s (2017) work on the vitality of Spanish in informal and digital spaces and on social media, which remain key domains of literacy for many heritage learners. By integrating academic reading tasks with these digital literacy practices, annotation platforms like Perusall may help reinforce Spanish language use and affirm students’ linguistic identities.
Participants also noted improvements in grammar, spelling, and overall comprehension, as well as a reduction in informal language use (M = 4.0), attributed to a heightened awareness of register and peer accountability. The annotation space functioned as a forum for posing questions, clarifying meaning, and engaging in collaborative learning. These socially embedded interactions supported both metalinguistic awareness and academic identity development—further affirming reading as not only a cognitive act, but also a culturally and socially situated one.
Building on these outcomes, the findings point to the value of pedagogical practices that blend structure with flexibility. Instructors may consider using open-ended annotation prompts, peer modeling, and small-group annotation threads to support multiple forms of meaning-making. Providing explicit instruction in strategies like summarizing and predicting—especially in bilingual and mixed classrooms—can help learners strengthen underused skills while continuing to build on their strengths.
Taken together, these findings challenge traditional assumptions about SHL learners, emphasizing the need to reframe literacy instruction to reflect the realities of bilingual students’ lives. Collaborative annotation, especially when grounded in relevant texts and inclusive tools, offers a powerful vehicle for honoring linguistic diversity while cultivating the cognitive strategies essential for academic success.
Looking ahead, digital annotation tools will continue to evolve alongside shifts in literacy, technology, and pedagogy. In heritage and L2 education, platforms like Perusall offer the potential to support both individual reflection and collective meaning-making. As artificial intelligence becomes increasingly integrated into digital reading platforms, thoughtful design will be needed to ensure that AI-generated annotations complement rather than replace peer dialogue. When guided by inclusive instructional strategies, digital annotation has the potential to foster deeper engagement, support multilingual learners, and promote equity in hybrid and online learning environments.
Limitations and Directions for Future Research
While this study offers valuable insights into the cognitive and social dimensions of collaborative digital annotation among SHL and L2 learners, several limitations must be acknowledged. First, the small sample size, particularly the L2 group, limits the generalizability of the findings. Although the results offer compelling patterns, they should be interpreted with caution and viewed as exploratory. Future studies with larger and more balanced sample sizes are needed to validate these findings and provide more robust statistical comparisons across learner groups.
Additionally, this study was conducted in the context of an advanced online Spanish course, which may not reflect the dynamics of in-person or mixed-level learning environments. Replicating this research across different course modalities, institutional types, and proficiency levels would help determine the broader applicability of social annotation tools like Perusall in language education.
Another limitation lies in the reliance on annotation content and self-reported survey data to assess cognitive engagement and perceived learning outcomes. While these sources offer valuable qualitative and quantitative insights, future research could incorporate complementary methods such as think-aloud protocols, interviews, or longitudinal tracking to deepen understanding of how learners process and apply reading strategies over time. Moreover, to better assess learning outcomes beyond perception, future studies could include measurable indicators such as pre- and post-tests or comprehension tasks to evaluate students’ academic reading gains more objectively.
Finally, this study focused on one annotation platform and a specific set of culturally relevant texts. Further investigation is needed to explore how platform design, task structure, and text genre influence student engagement and learning outcomes, especially for SHL populations. Expanding research in this area will be essential to designing equitable and culturally sustaining digital reading practices in 21st-century classrooms.
Despite its limitations, this study offers a meaningful contribution by highlighting how collaborative digital annotation can serve as a bridge between academic literacy development and students’ lived linguistic experiences. As digital tools continue to reshape how reading is taught and experienced, ongoing research will be key for designing instruction that is both inclusive and effective for diverse learners.
5. Conclusions
This study contributes to a growing body of research that positions digital reading not merely as a shift in format, but as a pedagogical opportunity. Its relevance lies in the urgent need for inclusive, evidence-based literacy practices that address the cognitive and social dimensions of reading for increasingly diverse university classrooms. Findings from this research underscore the value of collaborative annotation tools like Perusall in fostering both cognitive engagement and peer interaction among SHL and L2 learners. The results challenge traditional deficit views of SHL literacy by demonstrating that heritage learners actively engage in meaning-making when given access to culturally relevant materials and digital tools that support social interaction. These findings support calls from MacGregor-Mendoza and Moreno (2021) to reframe literacy as both cognitive and social, and they reinforce Velázquez’s (2017) argument for recognizing the vitality of Spanish in informal and digital spaces.
This study also revealed how reading behaviors are shaped by linguistic background, with SHL and L2 learners demonstrating distinct but equally valid engagement strategies. By analyzing the cognitive strategies employed during annotation, the findings illuminate the interplay between strategy use, academic identity, and sociocultural knowledge.
This research offers several pedagogical implications. Based on the data, instructors designing digital annotation tasks in linguistically diverse classrooms should:
Scaffold strategy instruction using models and examples of targeted annotation types (e.g., evaluating, connecting, questioning);
Pair students from different linguistic backgrounds (SHL–L2) to foster peer-to-peer support and diverse perspectives;
Use analytics to monitor not just frequency but depth of engagement, adjusting tasks accordingly.
These recommendations are directly supported by observed trends in the students’ annotation strategies and interaction patterns across the semester.
In addition, based on my experience implementing these tasks, I offer a few broader considerations:
Begin with ungraded or low-stakes annotation practice to build comfort and reduce performance anxiety;
Create small annotation groups (5–10 students) in large classes to enhance interaction and rotate those groups periodically;
Encourage reflective practices such as annotation review or peer commentary to support metacognitive growth.
Finally, the use of Perusall’s built-in analytics and engagement metrics provides instructors with a valuable feedback loop. These tools can help monitor student participation, assess the depth of engagement, and inform pedagogical adjustments, ultimately supporting a more intentional and inclusive approach to digital reading instruction in the twenty-first-century classroom. Rather than viewing digital reading as a constraint, this study demonstrates its potential to transform how we support student engagement, literacy development, and critical thinking in contemporary learning environments. Thoughtfully integrated digital tools can promote deeper reading, meaningful collaboration, and the development of academic voice—particularly for SHL and L2 learners navigating bilingual academic contexts.