Overall, the literature agrees that recycled aggregates derived from concrete exhibit high absorption due to the presence of residual mortar, directly affecting mixture volumetrics and resulting in higher binder demand compared with mixtures using virgin aggregates [
5]. Even so, several studies have shown that asphalt concretes incorporating recycled coarse aggregates can meet technical specifications and even enhance certain mechanical properties [
9,
10,
11]. Pasandín and Pérez [
10,
11] reported greater stiffness—e.g., increased resilient modulus—at certain incorporation levels and a significant extension of fatigue life. Mixtures with CDW are thus considered viable for low-volume roads, reinforcing their potential in secondary or local infrastructure [
16]. The results from studies evaluating the incorporation of recycled concrete aggregates (RCAs) into asphalt mixtures are often contradictory, making it difficult to establish a general behavioral pattern. This ambiguity arises from the highly heterogeneous nature of RCA, as evidenced by its high coefficients of variation. Since performance largely depends on the specific RCA source (including its origin, the quantity and quality of adhered mortar, and the crushing process), the findings of individual investigations should be regarded only as indicative references. Therefore, quantitative results derived from one RCA source should not be numerically extrapolated to recycled aggregates originating from different sources [
8,
10,
11,
13,
14,
17,
18,
19,
20,
21,
22].
Consequently, incorporating recycled aggregates in asphalt concrete is technically feasible and environmentally favorable, reducing both natural resource consumption and solid waste generation. Implementation, however, requires careful economic assessment, as RCA’s high absorption increases binder demand. Therefore, defining an optimal substitution level is essential to maximize environmental and economic benefits without compromising functional performance or structural integrity over the pavement’s service life.
Moisture Damage and Fatigue Cracking in Hot-Mix Asphalt Pavements
Throughout their service life, asphalt pavements are subjected to combined environmental actions and repeated traffic loading, which progressively reduce structural integrity and commonly manifest as rutting, cracking, and raveling [
23]. Moisture damage is defined herein as the deterioration of asphalt mixture performance due to water ingress, arising from debonding at the binder and aggregate interface (adhesive failure) and softening of the mastic (cohesive failure) [
24]. As illustrated in
Figure 1, prolonged water infiltration through the interconnected air voids further accelerates this damage, especially under repeated loading, ultimately compromising mixture integrity and hastening structural deterioration [
24,
25,
26]. Stripping, a characteristic surface distress involving aggregate loss in the wearing course due to the combined action of moisture- and traffic-induced dynamic loads, illustrates this process; progressive weakening of the binder film encapsulating aggregates can culminate in structural failure of the pavement [
24].
Two principal moisture-related failure modes are identified in asphalt mixtures: adhesive failure, in which the asphalt film separates from the aggregate surface, and cohesive failure, characterized by internal stiffness loss within the asphalt mastic (the binder and fine-aggregate matrix). Both mechanisms markedly affect durability and long-term structural performance (
Figure 2). In both modes, adhesion properties at the binder and aggregate interface are decisive for ensuring durability [
27]. This consideration is particularly relevant for mixtures incorporating recycled aggregates from CDW, whose rough surface morphology and heterogeneous chemistry can significantly modify the quality of interaction with the asphalt binder.
Under extended water exposure, mixtures incorporating higher contents of RCA exhibit progressive reductions in Indirect Tensile Strength (ITS), Tensile Strength Ratio (TSR), and fracture energy, reflecting increased brittleness and reduced mastic cohesion. This tendency, observed in several long-term water-immersion aging studies, confirms the cumulative effect of hydric exposure on RCA-modified mixtures [
28,
29,
30]. The gradual loss of strength parallels the decrease in fracture energy reported for dense-graded mixtures with high RCA content, indicating that the brittle nature of the residual mortar governs post-fracture behavior [
9,
28].
Moisture susceptibility in asphalt mixtures is commonly assessed using quantitative mechanical tests that measure the loss of fundamental properties, such as the ITS, after moisture conditioning. These moisture-induced reductions negatively affect structural and functional performance when pavements are exposed to wet or saturated conditions [
31].
To formally evaluate moisture damage resistance, Superpave-based criteria require the TSR of a new asphalt mixture to be at least 80% in order to reduce or prevent stripping of the asphalt binder from the aggregate. However, TSR captures only the relative post-conditioning loss in strength and may not reflect the absolute structural capacity of the mixture; some asphalt concretes show a high TSR while still exhibiting low absolute tensile strength.
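For clarity, and stated in general terms rather than as any specific agency's wording, TSR is obtained by dividing the average ITS of moisture-conditioned specimens by that of unconditioned specimens, expressed as a percentage:

TSR = (ITS_conditioned / ITS_unconditioned) × 100

Under the 80% criterion, a mixture whose average ITS drops from, say, 900 kPa (unconditioned) to 700 kPa after conditioning would yield TSR ≈ 78% and fail the check, even though its absolute strength remains relatively high; conversely, a weak mixture that loses little strength can pass on TSR alone.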
In light of these limitations, numerous transportation authorities across different countries have incorporated supplementary criteria for dense-graded hot-mix asphalt. Typical provisions require compliance with TSR thresholds of 75–80% together with verification of a minimum ITS. For illustration, the California Department of Transportation (Caltrans) specifies minimum tensile strengths of 483 kPa for conditioned specimens and 690 kPa for unconditioned specimens [
32].
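As a minimal illustration of how such dual provisions interact, the sketch below (Python; the function name and default limits are assumptions based on the TSR and Caltrans values cited above, not an agency tool) screens a mixture against a TSR threshold together with minimum conditioned and unconditioned strengths:

```python
def passes_moisture_criteria(its_dry_kpa: float, its_wet_kpa: float,
                             min_tsr: float = 0.80,           # typical TSR threshold (75-80%)
                             min_its_dry_kpa: float = 690.0,  # unconditioned minimum cited above
                             min_its_wet_kpa: float = 483.0   # conditioned minimum cited above
                             ) -> bool:
    """Illustrative screening of a mixture against combined TSR and minimum-ITS limits."""
    tsr = its_wet_kpa / its_dry_kpa
    return (tsr >= min_tsr
            and its_dry_kpa >= min_its_dry_kpa
            and its_wet_kpa >= min_its_wet_kpa)

# A mixture with a high TSR but low absolute strength still fails the dual criteria:
print(passes_moisture_criteria(its_dry_kpa=650.0, its_wet_kpa=590.0))  # False (TSR ~ 0.91)
```

In practice, conditioning protocols, specimen counts, and rounding rules are governed by the applicable test standards and agency specifications.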
Recent research has examined the mechanical properties and moisture susceptibility of asphalt concrete incorporating recycled aggregates from CDW. Ossa et al. [
33] produced mixtures with varying RCA contents using the Superpave methodology and reported that both average and minimum TSR decreased as RCA content increased. The authors concluded that incorporating up to 20% CDW is suitable for urban wearing courses; for higher contents, anti-stripping additives are recommended to mitigate the heightened moisture susceptibility [
33].
Similarly, Pasandín and Pérez [
10] evaluated Marshall-designed mixtures in which CDW partially replaced natural aggregate. Considering stiffness, permanent deformation, fatigue resistance, and moisture susceptibility, they observed a decrease in ITS with increasing CDW content under both dry and wet conditions. Nevertheless, TSR values for CDW mixtures exceeded those of the control mixture without recycled material, suggesting improved resistance to moisture-induced damage. The authors emphasized that performance depends not only on the CDW characteristics—origin, impurity content, and crushing process—but also on the physicochemical compatibility between the recycled aggregate and the asphalt binder [
10].
Other studies [
16,
28,
34] have reported slight increases in ITS for dense- and semi-dense-graded mixtures containing recycled aggregates. This behavior has been attributed to (i) higher binder contents used to compensate for RCA absorption, (ii) the angularity and rough surface texture of recycled aggregates that enhance particle interlock after compaction, and (iii) potential physicochemical interactions at the binder–aggregate interface [
16].
Furthermore, Zulkati et al. (2013) noted that calcium oxide (CaO), aluminum oxide (Al₂O₃), and silica (SiO₂) present in recycled aggregates can promote pozzolanic activity within the asphalt mixture, potentially improving binder adhesion by fostering interfacial interactions that strengthen the aggregate–binder bond [
35].
On the other hand, fatigue cracking is a consequence of repeated traffic-induced loading (
Figure 3). Although mechanical in origin, it is more prevalent at low temperatures, where increased mixture stiffness and brittleness reduce flexibility, so that fracture under cyclic loading stems from reduced ductility rather than from plastic deformation [
36]. Moisture exposure further accelerates fatigue crack initiation by lowering stiffness and interfacial strength. In RCA-based mixtures, this effect is intensified by the porous mortar phase, which retains internal moisture and promotes microcrack coalescence under cyclic stress [
13,
20,
37]. Consequently, pavements containing high RCA proportions exhibit reduced fatigue life, particularly under high groundwater levels or poor drainage conditions where saturation cycles are frequent [
19,
38]. Initially, fatigue cracks appear as fine, discontinuous surface fissures. With continued traffic, these fissures propagate and interconnect, forming larger cracking patterns. In advanced stages, alligator cracking develops, compromising structural capacity and surface functionality by reducing the integrity of the wearing course, altering profile smoothness, and facilitating the ingress of water and air; over time, these conditions accelerate internal damage, promote pothole formation, and may culminate in structural failure [
36].
Recently, fatigue cracking has gained renewed attention with the incorporation of recycled materials such as RAP (Reclaimed Asphalt Pavement), rejuvenators, and modified binders. Although these practices are intended to improve performance, they can be associated with increased cracking susceptibility, in part because oxidative aging of the binder is linked to reduced ductility and diminished energy dissipation under cyclic loads [
39,
40].
To evaluate this susceptibility to cracking, numerous performance-based tests have been developed, with the IDEAL-CT (Indirect Tensile Asphalt Cracking Test) gaining significant acceptance. Notably, owing to its strong correlation with field-observed cracking, several Departments of Transportation (DOTs) in the United States have promoted research aimed at proposing preliminary criteria within standardized performance-based testing frameworks. These efforts seek to facilitate quality control through the adoption of the IDEAL-CT as a routine quality assurance test. A relevant example is provided by the New Jersey Department of Transportation (NJDOT), which, based on a study correlating different cracking performance tests, proposed minimum values of the Cracking Tolerance Index (CT-index) for mixtures with high RAP content. The specifications require a CT-index greater than 180 for surface courses and greater than 140 for intermediate courses [
41]. Similarly, the New York State Department of Transportation (NYSDOT) establishes a minimum CT-index of 134 for these mixtures, while the Pennsylvania Department of Transportation (PennDOT) specifies values ranging from 70 to 90, depending on the traffic level [
40].
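To make the comparison between agency limits concrete, the following sketch (Python; the dictionary keys and the mapping of the 70 and 90 PennDOT values to traffic levels are assumptions for illustration) encodes the minimum CT-index values cited above:

```python
# Minimum CT-index values reported above; consult each agency's current
# specification for authoritative limits and applicability.
CT_INDEX_MINIMA = {
    ("NJDOT", "surface"): 180,
    ("NJDOT", "intermediate"): 140,
    ("NYSDOT", "any"): 134,
    ("PennDOT", "lower_traffic"): 70,   # assumed mapping of the 70-90 range
    ("PennDOT", "higher_traffic"): 90,  # to traffic level
}

def meets_ct_index(agency: str, course_or_traffic: str, ct_index: float) -> bool:
    """Return True if a measured IDEAL-CT index meets the listed minimum."""
    return ct_index >= CT_INDEX_MINIMA[(agency, course_or_traffic)]

print(meets_ct_index("NJDOT", "surface", 165))          # False: below the 180 minimum
print(meets_ct_index("PennDOT", "higher_traffic", 95))  # True
```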
In line with these agency-based findings, several studies have demonstrated the usefulness of the CT-index as a reliable parameter for assessing the cracking resistance of conventional asphalt mixtures. Yan et al. [
42] compared different IDEAL-CT configurations and reported a strong correlation with the SCB-IFIT fracture index (R² ≈ 0.97), with CT-index values ranging between 70 and 150 for dense-graded and gap-graded mixtures produced with conventional PG binders, highlighting the sensitivity of the parameter to binder stiffness and gradation. Similarly, Saha Chowdhury et al. [
43] confirmed that mixtures exhibiting better fatigue performance also presented higher CT-index values, reinforcing its validity as an indicator of cracking tolerance. More recently, Shaikh and Gupta [
41] employed predictive modeling techniques to define threshold CT-index values associated with satisfactory performance, establishing its relevance as an acceptance criterion in asphalt mixture design. Collectively, these studies suggest that CT-index values above approximately 120 may be associated with good cracking performance, whereas lower values tend to reflect reduced resistance to fatigue and thermal cracking.
Field implementation should therefore account for drainage efficiency and site-specific hydrological conditions, including proximity to groundwater and susceptibility to flooding, as these factors significantly intensify moisture-induced aging in RCA mixtures. To mitigate such effects and ensure long-term durability, the use of anti-stripping agents, polymer-modified binders, and optimized volumetric parameters—particularly voids in mineral aggregate (VMA) and voids filled with asphalt (VFA)—is strongly recommended [
12,
13,
25,
44]. Moreover, durability evaluation should extend beyond conventional TSR analysis to include long-term immersion conditioning and fracture-based performance tests, such as the IDEAL-CT and fracture energy (Gf) assessments. These methodologies provide a more comprehensive understanding of moisture-induced damage mechanisms and the progressive loss of cohesive and adhesive integrity in RCA-modified asphalt mixtures [
17,
18,
28].
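As a closing illustration of the fracture-energy concept mentioned above, the sketch below (Python; a simplified, assumed calculation rather than a standardized procedure, with a synthetic load-displacement curve and hypothetical specimen dimensions) estimates Gf as the work of fracture under an indirect-tension load-displacement curve divided by the nominal fractured area (specimen diameter times thickness):

```python
import math

def fracture_energy(displacement_mm, load_kn, diameter_mm, thickness_mm):
    """Approximate fracture energy Gf (J/m^2) from a load-displacement record.

    Gf is taken here as the area under the curve (trapezoidal rule) divided by
    the nominal fractured area; standardized tests prescribe their own details.
    """
    work_j = 0.0
    for i in range(1, len(load_kn)):
        mean_load_n = 1.0e3 * (load_kn[i] + load_kn[i - 1]) / 2.0        # kN -> N
        step_m = 1.0e-3 * (displacement_mm[i] - displacement_mm[i - 1])  # mm -> m
        work_j += mean_load_n * step_m
    area_m2 = (diameter_mm * 1.0e-3) * (thickness_mm * 1.0e-3)
    return work_j / area_m2

# Synthetic post-peak softening curve for a hypothetical 150 mm x 62 mm specimen
disp = [i * 8.0 / 200.0 for i in range(201)]                       # mm
load = [18.0 * (d / 2.0) * math.exp(1.0 - d / 2.0) for d in disp]  # kN
print(round(fracture_energy(disp, load, 150.0, 62.0)))  # on the order of 9500 J/m^2
```

In test methods such as the IDEAL-CT, cracking indices are then derived from the same load-displacement record by combining Gf with post-peak slope and displacement parameters defined in the corresponding standard.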