Antigen Receptors Gene Analysis for Minimal Residual Disease Detection in Acute Lymphoblastic Leukemia: The Role of High Throughput Sequencing

The prognosis of adult acute lymphoblastic leukemia (ALL) is variable but often dismal. Indeed, its clinical management is challenging: current therapies induce complete remission in 65–90% of cases, but only 30–40% of patients are cured. The major determinant of treatment failure is relapse; consequently, measurement of residual leukemic blasts (minimal residual disease, MRD) has become a powerful independent prognostic indicator in adults. A substantial body of evidence also supports the clinical relevance of MRD assessment for risk class assignment and treatment selection. MRD can be evaluated in virtually all ALL patients using different technologies, such as polymerase chain reaction amplification of fusion transcripts and clonal rearrangements of antigen receptor genes, flow cytometric study of leukemic immunophenotypes and, most recently, high throughput sequencing (HTS). In this review, the authors focus on the latest developments in MRD monitoring, with emphasis on the use of HTS, as well as on the clinical impact of MRD monitoring.


Introduction
Adult acute lymphoblastic leukemia (ALL) represents a heterogeneous group of neoplasms characterized by a malignant proliferation of lymphoid precursor cells in bone marrow, blood and extra-medullary sites [1], with an estimated annual incidence of one in 100,000 and a generally dismal prognosis [2,3]. In fact, the clinical management of ALL is very challenging: current therapy can induce a complete remission (CR) in 65–90% of adults, but only 30–40% of patients survive for five or more years [2,4,5].
Importantly, the outcome depends on many clinical and biological parameters, such as age, white blood cell (WBC) count, time to CR, disease immunophenotype and cytogenetics. Consequently, biological parameters are currently used for both diagnosis and risk class assessment in ALL patients. However, risk classes often fail to provide an accurate prognosis for a single given patient. In particular, in a considerable proportion of "standard-risk" (SR) patients, common therapy fails [6].
The major determinant of treatment failure is relapse [5,7,8], which originates from leukemic cells that are resistant to chemotherapy [9]. Consequently, the measurement of levels of residual leukemic blasts (minimal residual disease, MRD) has become a powerful independent prognostic indicator in adult ALL [4,10,11] and has refined risk stratification schemes.
Traditionally, the quantification of residual leukemic cells was assessed by counting pathological cells in blood or bone marrow under a light microscope. This task is particularly difficult when the tumor burden is reduced (e.g., after therapy) because the morphology of ALL blasts, lymphoid precursors and activated mature lymphocytes is often similar. Therefore, the morphological evaluation of complete remission alone is often inaccurate, especially when non-neoplastic hematopoiesis is undergoing restoration after chemotherapy-induced bone marrow aplasia [8].
This underscores the importance of adopting a more sensitive system to assess treatment response and, hence, MRD. Indeed, different clinical trials, initially in childhood ALL, showed that an efficient MRD analysis provides a more accurate estimate of response to treatment, representing a powerful independent prognostic tool [7,12,13]. Notably, MRD-based consolidation treatment choice provided a significant benefit in terms of clinical outcome [7,12,13].
To date, the most widely used methods for MRD assessment are: (1) multi-parameter flow cytometry, based on leukemia-associated aberrant immunophenotypes; and (2) real-time quantitative polymerase chain reaction (qPCR), based on amplification of antigen receptor genes or fusion transcripts [3,5,6,14].
Recently, massive parallel sequencing (also known as next generation sequencing or high throughput sequencing/HTS) of antigen receptor genes emerged as a leading technology to carefully evaluate the diversity of the immune system. When applied in the clinic, HTS of antigen receptor genes has been able to identify and define lower detection thresholds for MRD than other methodologies [15,16].
In this review, the authors discuss the latest technical developments in MRD monitoring, with emphasis on the role of HTS, as well as the relationship between MRD and the clinical features of the disease and treatment outcome.

Minimal Residual Disease Monitoring by PCR
Currently, different approaches can be used to monitor MRD by PCR: qualitative, semi-quantitative and quantitative analysis. The first two methods provide significant information but are partially limited and do not allow precise analysis of tumor kinetics. On the other hand, quantitative data appear to be crucial for the correct assessment of treatment response; therefore, one of the most useful MRD assays available is real-time quantitative PCR (qPCR). In fact, it permits accurate quantification of the PCR product and monitoring of disease burden over time [17].
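As an illustration of the quantitative read-out, qPCR converts cycle-threshold (Ct) values into a leukemic fraction via a standard curve built from serial dilutions of the diagnostic sample. Below is a minimal sketch of this calculation; the dilution series and Ct values are illustrative, not taken from any cited study:

```python
def fit_standard_curve(log10_dilutions, ct_values):
    """Least-squares fit of Ct = slope * log10(dilution) + intercept."""
    n = len(log10_dilutions)
    mean_x = sum(log10_dilutions) / n
    mean_y = sum(ct_values) / n
    sxx = sum((x - mean_x) ** 2 for x in log10_dilutions)
    sxy = sum((x - mean_x) * (y - mean_y)
              for x, y in zip(log10_dilutions, ct_values))
    slope = sxy / sxx
    intercept = mean_y - slope * mean_x
    return slope, intercept

def quantify(ct, slope, intercept):
    """Convert a follow-up Ct into a leukemic fraction relative to diagnosis."""
    return 10 ** ((ct - intercept) / slope)

# Illustrative 10-fold dilution series of the diagnostic sample (10^0 .. 10^-5);
# ~3.4 Ct per log, close to the theoretical -3.32 at 100% efficiency
dilutions = [0, -1, -2, -3, -4, -5]
cts = [22.0, 25.4, 28.8, 32.2, 35.6, 39.0]

slope, intercept = fit_standard_curve(dilutions, cts)
mrd = quantify(30.5, slope, intercept)  # Ct of a hypothetical follow-up sample
print(f"slope = {slope:.2f} Ct/log, MRD fraction ~ {mrd:.1e}")
```

In a real assay, a slope far from the theoretical −3.32 Ct per 10-fold dilution would indicate suboptimal amplification efficiency and an unreliable quantification.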
Three main types of qPCR are available, using SYBR Green I, hydrolysis (TaqMan) and hybridization (LightCycler) probes, respectively. The SYBR Green I-based approach is relatively simple and allows different targets to be studied. On the other hand, it is not sequence-specific and may detect non-specific PCR products. Conversely, qPCR methods based on either hydrolysis or hybridization probes are potentially more sensitive and analyze sequence-specific targets. In fact, signal generation requires two distinct hybridization events (two primers plus one or two probes), conferring higher specificity [17].
Regardless of the technique used, the main qPCR targets for MRD detection in ALL patients are: (1) fusion gene transcripts; and (2) clonal rearrangements of the antigen receptor genes [3,17,18].
Depending on the specific PCR target, different types of oligonucleotides can then be used for specific detection, such as an allele-specific oligonucleotide (ASO) probe, an ASO forward primer, an ASO reverse primer, or germline probes and primers [17].

Antigen Receptor Gene Rearrangements
During early B- and T-cell development, the germline variable (V), diversity (D) and joining (J) segments of the immunoglobulin heavy chain (IGH) and T-cell receptor (TCR) genes rearrange, with random deletion or insertion of nucleotides in the junctional region, generating a unique sequence specific to each lymphocyte [19–21]. Since ALL cells arise from the alteration of a single lymphoid precursor, they carry a clonal IGH or TCR junctional region that can be considered a 'DNA fingerprint' of the malignant cell and thus used as a tumor-specific PCR target for MRD detection [22]. Importantly, more than 95% of B-lineage ALL (B-ALL) cases have IGH gene rearrangements [21,23]. In addition, TCR (specifically, TRB, TRG and TRD) gene rearrangements are also found in 35%, 60% and 90% of B-ALL cases, respectively [24–26]. Regarding T-lineage ALL (T-ALL), clonal TCR gene rearrangements are present in almost all cases, whereas IGH gene rearrangements occur in only 20% of patients. Thus, this approach is applicable to virtually all ALL patients.
On the other hand, MRD assessment by antigen receptor genes is quite complex, because the junctional region rearrangements of each ALL case have to be identified before the patient-specific qPCR can be performed. In particular, as a first step, DNA sequencing is necessary to identify the clonal rearrangement characterizing the clone in diagnostic samples [27]. Second, patient-specific allele-specific oligonucleotide (ASO) primers are designed. Usually, three to five ASO primers are designed and tested for sensitivity and specificity; the best ones are retained for analysis and then applied to determine MRD in bone marrow (BM) and/or peripheral blood (PB) mononuclear cells obtained at sequential time points after treatment [22].
Despite this laborious process, the approach is widely used because it is reliable, accurate, specific and extremely sensitive [17,27,28]. In fact, it is able to detect one cancer cell among 10,000 or 100,000 normal cells, corresponding to a sensitivity of 10−4–10−5. Sensitivity values change depending on the type of rearrangement and the amount of normal lymphoid cells with identical or similar junctional regions in the analyzed sample [29–31].
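These sensitivity figures follow directly from sampling statistics: to detect one leukemic cell among 10,000–100,000 normal cells, enough cells (and hence enough DNA) must be assayed that at least one leukemic genome is likely present in the reaction. A back-of-the-envelope sketch, where the 95% detection confidence is our illustrative assumption:

```python
import math

def cells_needed(sensitivity, confidence=0.95):
    """Minimum number of cells to sample so that, at a leukemic-cell frequency
    equal to `sensitivity`, at least one leukemic cell is present with
    probability >= `confidence`:
    P(>=1 hit) = 1 - (1 - f)^n >= c  =>  n >= ln(1 - c) / ln(1 - f)."""
    return math.ceil(math.log(1 - confidence) / math.log(1 - sensitivity))

for f in (1e-4, 1e-5):
    print(f"sensitivity {f:.0e}: >= {cells_needed(f):,} cells required")
```

Roughly 3/f cells are needed at 95% confidence, i.e., about 30,000 cells for 10−4 and 300,000 cells for 10−5, which is why input material becomes the limiting factor at the highest sensitivities.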
By contrast, the main pitfalls and drawbacks are the elevated technical costs of performing patient-specific assays and possible unpredictable false-negatives due to the biology of ALL itself [30]. In fact, minor clones can go unnoticed at diagnosis but progressively expand throughout disease progression [19,32,33]. This phenomenon, the presence of a few distinct clones sustaining the leukemic proliferation, is termed "oligoclonality" [34]. Another possible cause of false-negative results is the substitution of the clonal marker by a secondary rearrangement, an event termed "clonal evolution" [35]. Indeed, when clonal markers identified at diagnosis and relapse are compared, modifications in the genetic patterns are frequent, being observed in up to 40% of IGH rearrangements and 7–20% of TRG/TRD rearrangements [36].
In order to reduce the risk of false-negative results, it is reasonable to monitor at least two different rearrangements [27]. If only one suitable gene rearrangement is detected, the use of an additional assay (most often flow cytometry) may help. This requirement, however, is probably too stringent, as it reduces the number of patients that can be correctly monitored.

Fusion-Gene Transcripts
Recurrent chromosome translocations suitable for MRD studies by qPCR analysis are found in 30 to 40% of ALL cases [8,37]. These translocations result in the expression of aberrant mRNA transcripts, the most widely used of which are summarized in Table 1.
A strength of using fusion transcripts to monitor MRD in ALL patients is the stable association between the molecular abnormality and the leukemic cell, irrespective of biological changes due to clonal selection. In addition, this technique does not require the laborious sequencing and patient-specific primer synthesis demanded by antigen receptor gene rearrangement analysis. In fact, fusion gene transcripts can be identified using a limited set of primers and, for qPCR analysis, a single hydrolysis probe or pair of hybridization probes can be used to detect several possible transcript variants [38]. However, the choice of fusion transcripts as the target for MRD assessment presents some possible drawbacks [4]. The main one is the use of RNA as starting material. RNA is more prone to degradation than DNA, causing false-negative results, and it does not allow a precise assessment of the amount of leukemic cells. In fact, in contrast with antigen receptor gene rearrangements, the quantity of transcripts may vary from cell to cell, making it unfeasible to establish a precise relation between amplicon number and the number of leukemic cells. This implies reduced accuracy; despite a potential sensitivity of 10−4, it is sometimes hard to reach a sensitivity of 10−3 in routine practice [3].
In order to obviate this problem, the breakpoint fusion regions of chromosomal aberrations can be used as MRD targets. This method uses DNA as starting material but is feasible only when the breakpoints cluster in relatively small regions (2 kb) [39,40]. An example concerns the sub-microscopic 1p32 (TAL1) deletions, present in 5–15% of T-ALL patients [39]. In other instances, long-range PCR can allow the cloning of larger regions, such as MLL breakpoints [40].
Despite its limited applicability, this technique presents some advantages: (1) DNA is less sensitive to degradation than RNA; (2) in contrast to fusion gene transcripts but comparable to antigen receptor gene rearrangements, only one target is present per cell, allowing accurate quantification; (3) it allows patient-specific assays to evaluate MRD, since the breakpoint sequence differs in each patient (unlike fusion gene transcripts); and (4) breakpoint fusion regions are stable throughout the disease course and directly related to the oncogenic process [8].
Overall, qPCR amplification of fusion transcripts and breakpoint fusion regions provides useful clinical information but, given the frequency of each translocation, it permits the study of only a subset of patients [37].
Importantly, regardless of the approach used (IGH/TCR vs. fusion transcripts), the interpretation of qPCR MRD data requires standardized criteria and international uniformity [27,41]. Several European groups have joined forces and established common guidelines for data analysis and interpretation. These networks work to standardize the technology in order to harmonize qPCR-based MRD detection in a multi-centric clinical setting [17,27], eliminating the confounding effects of divergent evaluation systems. In fact, it is mandatory to define an efficient cut-off for MRD negativity. In this regard, most groups have agreed on a 10−4 threshold, even though no formal evidence is available. On the other hand, it should be kept in mind that sensitivity varies depending on the analyzed sequence.

Minimal Residual Disease Monitoring by Flow Cytometry
Flow cytometric detection of MRD is based on the immunophenotypic analysis of surface antigens uniquely combined on leukemic cells but not on normal hematopoietic cells [5,37]. In fact, leukemia cells often display aberrant or unusual antigen expression when compared to their normal counterparts, which is defined as the leukemia-associated immunophenotype (LAI).
Flow cytometry permits the detection of the tumor-specific phenotypic signature in virtually all patients with ALL (95%) by combining information on cell size with surface membrane, cytoplasmic and nuclear molecules, with a routine sensitivity of 0.01% [41].
In everyday practice, this analysis is conducted by monitoring at most four markers; at present, however, this range can be extended to eight or more markers, significantly raising the capability to discriminate between normal and leukemic cells and increasing the sensitivity of MRD assessment [8]. In fact, multi-parameter flow cytometry enables detection of residual blasts with a sensitivity of up to 10−4, which is only slightly below the sensitivity of molecular methods [18]. Notably, however, sensitivity may vary depending on the type of phenotypic aberration, the monoclonal antibodies used for detection and the type of sample under study [42].
Usually, the LAI is identified at diagnosis by comparing the immunophenotypic profile of leukemic cells to that of normal bone marrow cells. For each case, one or more specific tumor marker combinations are selected and then applied to study MRD during therapy. This is usually performed using monoclonal antibodies and/or heterologous antisera conjugated to fluorescein isothiocyanate, phycoerythrin, peridinin chlorophyll protein and allophycocyanin [41].
The reliability of flow cytometric assays for MRD assessment depends on many aspects. The main one is the uniqueness of the tumor-specific immunophenotypes. In fact, to be absolutely specific, they should never be expressed by normal hematopoietic cells at any stage of development or of the cell cycle. However, immunophenotypes that are apparently absent can become evident when the bone marrow is actively proliferating after chemotherapy [18]. In addition, phenotypic changes during the course of the disease have been reported to occur at various frequencies, ranging from 20 to 70% of cases studied at relapse [42–45]. For this reason, markers need to be stably expressed on leukemic cells [46] and, in order to correctly identify ALL immunophenotypes, it is important to consider the variations in bone marrow cell population composition according to age and exposure to drugs. Therefore, a proper immunophenotypic assessment must be conducted through serial analysis of bone marrow obtained under a multiplicity of conditions. The most widely used markers for MRD studies fall into three main classes: (1) immunophenotypes expressed during normal development but limited to specific stages and anatomical locations (e.g., the immunophenotype of T-ALL, physiologically present only in selected thymocytes but never found outside the thymus) [47]; (2) aberrant markers (i.e., antigens expressed during lymphopoiesis but found on leukemic cells either asynchronously or in atypical/bizarre combinations, often in B-ALL [8,47–51] (Table 2)); and (3) fusion proteins derived from chromosomal breakpoints, such as BCR-ABL1, or ectopic proteins derived from gene translocations [52]. The latter are indeed leukemia-specific but not extensively used because of the lack of antibodies necessary for their consistent recognition at the protein level.
Interestingly, new tumor-specific marker combinations have been identified by recent gene expression profiling (GEP) studies, above all in pediatric ALL [9], improving, sometimes considerably, the sensitivity and feasibility of MRD detection [9,53]. For example, Coustan-Smith et al. found, by GEP, 22 markers (CD44, BCL2, HSPB1, CD73, CD24, CD123, CD72, CD86, CD200, CD79b, CD164, CD304, CD97, CD102, CD99, CD300a, CD130, PBX1, CTNNA1, ITGB7, CD69, CD49f) differentially expressed in up to 81.4% of studied ALL cases. These new markers, used in six different combinations, allowed the detection of MRD by flow cytometry in all B-lineage ALL patients, with a substantially improved sensitivity of 10−5 [9].
Importantly, multi-parameter flow cytometry is an attractive approach for MRD analysis. In fact, unlike PCR-based methods, it is able to study a large number of cells within a short period of time and also to provide information on viable, dead and normal hematopoietic cells [42].
The major limitation to the applicability of flow cytometric MRD studies is the absence of a clearly stable leukemia-associated phenotype. This restriction might be overcome by increasing the number of marker combinations applied to each case, but this would imply additional costs and greater complexity in technique standardization [18].
Multiple studies have shown a correlation between flow cytometric and molecular genetic studies of MRD in ALL. The data demonstrate that the two most widely used methods for MRD detection show remarkably concordant results, both in bone marrow and in peripheral blood samples collected at different time points during treatment [5,14,42]. However, the MRD levels estimated by these two modalities may vary, and, although molecular studies with patient-specific probes are generally considered more sensitive, both techniques can potentially be applied in clinical studies.
Although several schools of thought advocate the preferential use of a single technique (flow cytometry or qPCR), reserving the other for cases where a reliable marker is not identifiable [8,27], the combination of both methods may also be recommended to allow MRD detection in virtually all ALL patients, providing prognostically useful information [37].

The Future: Next Generation Sequencing
Since the first complete human genome sequence was published, technologies have developed dramatically to facilitate HTS analysis of nucleic acids. Of note, the new methodologies allow complete genomic analyses within time frames and at costs that are acceptable for clinical routine. The potential applications of this novel laboratory technology are many, ranging from deep characterization and classification of disease to patient stratification and the monitoring of minimal residual disease [54]. In particular, the capacity to identify and quantify even a few circulating tumor cells is clinically meaningful to more carefully predict patients' outcome and sometimes direct treatment choices [55,56].
Unlike qPCR, which requires a priori knowledge of the target sequence/characteristics, next-generation sequencing (NGS) is an unbiased approach to nucleic acid detection [57]. Coupled with the immense numbers of individual sequence reads produced by NGS instruments (deep sequencing), these techniques offer a novel and powerful approach to detect MRD [57,58].
NGS can potentially improve MRD detection by characterizing specific genomic alterations or detecting small amounts of mutant or clonal DNA without a priori knowledge of the mutant DNA sequence. Moreover, this method has the potential to significantly increase sensitivity, the only limiting factor being the DNA input amount. This is, however, not trivial in daily clinical practice; in fact, based on calculations derived from the theoretical DNA amount of single cells, up to 5 µg of DNA might be necessary to achieve a sensitivity of 10−5 with NGS [59]. On the other hand, NGS might also allow: (1) extension of MRD evaluation to all patients; (2) elimination of the troubles linked to patient-specific assays; (3) reduction of the operator- and laboratory-dependence of data interpretation; and (4) an increase in the information that could affect clinical treatment decisions [59]. Remarkably, the overall costs would be reduced in comparison with patient-specific PCR approaches.
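The ~5 µg figure can be rationalized from the DNA content of a diploid human cell (approximately 6.6 pg). The sketch below assumes, purely for illustration, that several genome-equivalents per target cell are needed for reliable sampling; the redundancy factor is a hypothetical parameter, not a value taken from reference [59]:

```python
PG_PER_DIPLOID_GENOME = 6.6  # approx. DNA mass of one diploid human cell, picograms

def dna_needed_ug(target_sensitivity, redundancy=7):
    """DNA (micrograms) needed to cover 1/target_sensitivity cell equivalents,
    oversampled by `redundancy` genome-equivalents per cell."""
    cells = 1 / target_sensitivity          # cell equivalents to be represented
    picograms = cells * redundancy * PG_PER_DIPLOID_GENOME
    return picograms / 1e6                  # pg -> ug

print(f"10^-5 sensitivity: ~{dna_needed_ug(1e-5):.1f} ug DNA")
print(f"10^-4 sensitivity: ~{dna_needed_ug(1e-4):.2f} ug DNA")
```

With a redundancy of seven genome-equivalents per cell, 10−5 sensitivity requires roughly 4.6 µg of DNA, of the same order as the 5 µg quoted above, while 10−4 sensitivity needs only a few hundred nanograms.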
Based on these premises, many groups are currently interested in developing/implementing this new application. In an elegant demonstration of several benefits of this approach, Wu et al. [15] used, for the first time, NGS of TCR genes to track MRD in T-lineage ALL. They assessed MRD in 43 samples at day 29 after treatment and found that TCR NGS not only identified clonality at diagnosis but also detected subsequent MRD that was missed by flow cytometry in a subset of cases. More recently, Piccaluga et al. used TCR NGS to monitor T-cell clones developing during immune reconstitution after allogeneic stem cell transplantation and to differentiate them from malignant ones [16]. Both studies highlighted the potential of this technology to define lower detection thresholds for MRD and suggested its use in clinical molecular laboratories. It is quite relevant, in fact, that NGS of TCR/IGH allows the detection of any rearrangement. In particular, during immune reconstitution following treatment, small, reactive clones might be erroneously interpreted as leukemic. In this light, NGS and patient-specific qPCR can readily distinguish them, whereas conventional clonality testing would be challenging [16].
Similarly, NGS of IGH gene rearrangements can readily be performed in B-cell malignancies [60–64]. In B-ALL, Shin et al. showed that all initial diagnostic samples were positive for a clonal IGH rearrangement [64]. Interestingly, among follow-up samples, in one Philadelphia-positive case, BCR-ABL1 quantitative PCR was negative but the NGS IGH assay was positive just before the hematological relapse, suggesting the high sensitivity and clinical utility of the NGS-based approach [64].
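Conceptually, MRD estimation from such an NGS clonality assay reduces to counting the reads that carry the clonotype(s) identified at diagnosis. A minimal sketch follows; the clonotype labels and read counts are hypothetical, and real pipelines additionally correct for sequencing errors, primer bias and the number of input genomes:

```python
from collections import Counter

def mrd_fraction(reads, leukemic_clonotypes):
    """Fraction of rearrangement reads matching the clonotype(s) identified
    in the diagnostic sample."""
    counts = Counter(reads)
    clonal = sum(counts[c] for c in leukemic_clonotypes)
    total = sum(counts.values())
    return clonal / total if total else 0.0

# Hypothetical follow-up sample: 3 reads of the leukemic IGH clonotype among
# 100,000 total rearrangement reads from a polyclonal background
reads = (["CDR3_LEUK"] * 3 + ["CDR3_A"] * 40000
         + ["CDR3_B"] * 35000 + ["CDR3_C"] * 24997)
print(f"MRD ~ {mrd_fraction(reads, {'CDR3_LEUK'}):.1e}")
```

Because every rearranged lymphocyte contributes reads, the same data also describe the polyclonal background, which is what allows reactive clones emerging during immune reconstitution to be told apart from the diagnostic leukemic clonotype.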
The main distinguishing features of flow cytometry, qPCR and NGS are summarized in Table 3.
* The costs of NGS may vary significantly according to the number of cases analyzed in a single run.

Clinical Impact of MRD in Adult ALL
MRD evaluation is becoming increasingly important in the clinical management of patients with ALL. As mentioned, in a significant proportion of ALL patients, an apparent clinical CR is accompanied by the presence of resistant cells in the BM and/or PB. Early detection of residual cells, and of their expansion, is particularly useful for the evaluation of early treatment response during the induction phase and, consequently, for optimizing the choice of therapy, with the aim of overcoming the chemo-resistance of leukemic cells. MRD measurement is also clinically useful in patients with relapse who achieve a second remission; in addition, it can help to optimize the timing of hematopoietic stem cell transplantation and to guide decisions about donor lymphocyte infusion post-transplant [12,13]. Notably, a clinical trial evaluating the bispecific monoclonal antibody blinatumomab (Micromet) was designed to treat patients with MRD, with molecular CR as the main clinical endpoint.
At present, although the clinical impact of MRD has been extensively evaluated in children with ALL, several studies also support its value in adult patients [11,27,65–69]. MRD testing has allowed, first of all, a better definition of clinical remission and the stratification of patients according to the risk of treatment failure in a way that is far more accurate and rigorous than conventional clinical-biological parameters [11,27,65–69].
Mortuza et al. [66] analyzed the impact of MRD testing in 85 adult patients with Philadelphia (Ph) chromosome-negative B-lineage ALL using PCR amplification of antigen receptor genes. They found that the presence of MRD at three to five months after induction therapy was associated with an inferior clinical outcome. Holowiecki et al. [65] used flow cytometry to estimate MRD in 116 patients with Ph-negative ALL and found that MRD ≥ 0.1% after induction therapy was an independent predictor of relapse in both standard- and high-risk groups. More recently, Bassan et al. [4] showed that MRD was the most significant risk factor for the prediction of relapse; in fact, five-year disease-free survival estimates were 72% in the MRD-negative group versus 14% in the MRD-positive group, regardless of clinical risk factors. Finally, Krampera et al. found that MRD detection at different follow-up points during the first year of treatment was significantly associated with a high risk of relapse in a series of 53 adult T-ALL patients enrolled in the GIMEMA LAL0496 protocol [10]. Similar observations were made in a study by Piccaluga et al. [70] on Ph+ ALL patients treated with imatinib: MRD assessment by BCR::ABL1 RT-PCR showed that persisting molecular CR was associated with long-lasting CR, whereas a wide and rapid increase in BCR::ABL1 values was predictive of leukemia relapse.
Remarkably, current studies have also incorporated MRD assessment in order to improve risk stratification. The classical prognostic model is usually quite accurate for high-risk (HR) or very high-risk (VHR) patients. On the other hand, adult patients stratified into the standard-risk (SR) group (absence of adverse prognostic factors) relapse in 40% to 55% of cases, and these relapses cannot be predicted by any conventional risk factor [27,71–73].
Within the GMALL 06/99 trial, the predictive value of MRD was analyzed in 196 SR ALL patients and, using the combined information from day 11, day 24 and week 16, it was demonstrated that sequential monitoring of MRD is a powerful indicator of treatment outcome. In fact, MRD assessment divided SR ALL patients into three groups with different risks of relapse: (1) a low-risk group with rapid tumor clearance and low relapse rates; (2) a high-risk group with MRD of 10−4 or higher until week 16 and an extremely poor outcome; and (3) an intermediate-risk group with a relapse rate of 47% [28]. This distinction is crucial and has relevant therapeutic consequences (i.e., intensification in HR patients). This is also confirmed by the studies conducted by Vidriales et al. [69] and Brisco et al. [74], which demonstrated the high discriminative power of MRD.
In the allogeneic transplant setting, different groups showed that MRD monitoring in adult patients with Ph-positive ALL receiving HSCT and/or imatinib therapy had significant predictive power for treatment outcome [75–77]. Similarly, Sanchez et al. demonstrated, by multi-parameter flow cytometry, that in Ph-negative ALL, MRD detected before the initiation of conditioning for hematopoietic stem cell transplantation (HSCT) was a significant predictor of post-transplant failure [78].
This is in keeping with the findings of Spinelli et al. [79], who studied a group of 43 adult patients with ALL undergoing HSCT. In this series, the relapse rate at 36 months was 0% for the 12 patients who were MRD-negative before HSCT, versus 46% for those who were MRD-positive.
Most recently, Short et al. showed that a very sensitive HTS-based protocol could identify with high accuracy patients at lower risk of relapse [80].
Taken together, these results indicate that a correct MRD assessment is an important independent prognostic factor for the duration of CR and for relapse prediction in adult ALL.

Conclusions
MRD is commonly evaluated at fixed time points during induction, consolidation and maintenance therapy, and there is a close positive association between rapid MRD signal reduction and the duration of CR, independent of the applied treatments. In fact, a high and/or increasing number of residual blast cells is associated with impending relapse. Conversely, conversion to MRD negativity during treatment is associated with a favorable prognosis. Thus, monitoring MRD may optimally guide the decision to use HSCT or post-consolidation maintenance. This could prevent the toxicity of HSCT in some HR patients as well as identify SR cases for whom standard chemotherapy is likely to fail.
However, the absence of MRD after treatment is not synonymous with cure. In fact, despite the remarkable sensitivity of molecular techniques as well as of flow cytometry, residual leukemic cells might be present even in the absence of any detectable signal. This may be the case with extra-medullary disease, which is not infrequent in ALL.
In order to avoid false-negative results, it is crucial to evaluate MRD in BM samples, taking into account that MRD assessment in BM is more sensitive than in PB [81]. Even if BM sampling can be problematic for many patients, PB cannot replace BM, and the two should be tested in parallel.
In spite of the overwhelming evidence supporting the prognostic impact of MRD, the type of treatment protocol, the timing of follow-up samples and the applied MRD technique might influence the significance of MRD. Modern first-line therapy consists of standard- and high-dose chemotherapy, HSCT and new targeted therapies, all integrated with the analysis of prognostic factors. Thus, precise MRD threshold levels for risk-group assignment have to be defined carefully for each treatment protocol and assay before clinical application. In this regard, rigorous standardization within cooperative groups is mandatory. In addition, beyond its direct use for patient management, MRD assessment might also reveal new molecular determinants of treatment response, as indicated by gene expression analyses in pediatric ALL [82–84]. Indeed, it is conceivable that the most recent tools for genetic analysis, such as NGS, may improve our knowledge of the molecular basis of chemoresistance. The main limitations, at present, to the routine use of NGS are its costs and the need for specific expertise, including bioinformatics. Although its cost has decreased since its introduction, NGS is still expensive; it should be noted, however, that patient-specific qPCR analysis of IGH/TCR is even more expensive. Ongoing as well as future studies have to demonstrate the routine feasibility and the real advantage of this method, while standardization procedures have to be undertaken within the main collaborative groups.
Author Contributions: P.P.P. conceptualized and wrote the article; S.P. and G.V. collected data and revised the article critically. All authors have read and agreed to the published version of the manuscript.