Reply published on 27 June 2024, see J. Mol. Pathol. 2024, 5(3), 262-263.
Comment

Comment on Bisson et al. Novel Approach to Proficiency Testing Highlights Key Practice Variations in Cancer Biomarker Delivery. J. Mol. Pathol. 2024, 5, 1–10

by Emina Torlakovic 1,2,* and Nicola Normanno 2,3,†
1 Department of Pathology and Laboratory Medicine, College of Medicine, University of Saskatchewan, Saskatoon, SK S7N 5A2, Canada
2 International Quality Network for Pathology, IQN Path ASBL, 61 rue de Rollingergrund, L-2440 Luxembourg, Luxembourg
3 IRCCS Istituto Romagnolo per lo Studio dei Tumori (IRST) “Dino Amadori”, 47014 Meldola, Italy
* Author to whom correspondence should be addressed.
† President, IQN Path.
J. Mol. Pathol. 2024, 5(3), 258-261; https://doi.org/10.3390/jmp5030017
Submission received: 2 March 2024 / Revised: 8 May 2024 / Accepted: 14 June 2024 / Published: 27 June 2024
We have read, with great interest, the recently published article by Bisson et al. entitled “Novel Approach to Proficiency Testing Highlights Key Practice Variations in Cancer Biomarker Delivery” [1]. This paper shows that, even in the small inaugural run, in which only seven laboratories participated and produced 21 reports (three from each laboratory), the external oncologists recruited by the CPQA would have selected a suboptimal treatment for two of the reported assays. These results indicate that improvement in reporting may be needed, which is indeed useful information for laboratories.
In 2013, van Krieken et al. published the “Guideline on the requirements of external quality assessment programs in molecular pathology” [2]. This guideline included recommendations on “reporting with particular attention to the following aspects: identification of the sample(s) analyzed, information on the type of assay used, adequacy of the sample relative to the underlying request and the test used, and accurate assessment of the clinical implications of the result”. In 2021, an updated guideline on behalf of the International Quality Network for Pathology (IQN Path) followed, including specific recommendations on the reporting of results [3]. This update makes clear that PT providers may use different schemes to conduct PT for biomarkers. Two types of PT challenges are distinguished: those that use a holistic approach (the so-called “integrated approach”) and those that focus on one or more test performance characteristics (TPCs; the so-called “TPC-focused approach”). Proficiency testing for biomarkers in oncology approaches the assessment of reporting and interpretation through an integrated approach in which many different TPCs are included: the precision of the pathology review process, assay protocol review, readout precision and/or accuracy, interpretation accuracy, and reporting accuracy. These TPCs could also be assessed separately by a PT program using a TPC-focused approach [2,3,4,5].
Although it is difficult to gain full insight into the current practices of all PT providers globally, the published literature suggests that the assessment of interpretation accuracy and reporting accuracy has long been in place [2,3,6,7,8,9,10,11]. It is therefore not clear how peer review failed to recognize this, allowing the paper to be published with the claim that the CPQA inaugural run, in which reporting and interpretation are assessed by experts, is indeed “novel”.
Proficiency testing for molecular tumor boards is a new initiative of the IQN Path, and it is perhaps in this context that the paper published by Bisson et al. on behalf of the CPQA needs to be positioned. To test the proficiency of institutional molecular tumor boards, those boards would need to participate and issue their opinion using either the report(s) provided by their local laboratory (an integrated approach) or the results provided by the PT program (a TPC-focused approach). The CPQA inaugural run instead recruited external oncologists to serve in this role, in place of local institutional tumor boards or institutional oncologists where molecular tumor boards do not exist. While this approach certainly brings value in showing that results may not be interpreted as intended (especially when the interpretation is performed by external oncologists), the results of this PT challenge tell us nothing about how local molecular tumor boards (or institutional oncologists) would perform with their own laboratory’s results. Perhaps the same issues would arise, or perhaps, being accustomed to their laboratory’s terminology and reporting, they would make no errors in selecting the optimal treatment. We will not know until this has been assessed in a PT challenge or an internal audit. Although Bisson et al. do not claim in their paper that their PT run is a PT for molecular tumor boards, in the light of current global developments, we hope that our letter will help clarify that PT for molecular tumor boards needs to include local institutional tumor boards. The risk that the paper by Bisson et al. may be misinterpreted is increased by the claim that their approach is “novel”, while the published literature, and even more so the unpublished practices of many PT programs, shows otherwise. An integrated approach to molecular biomarker testing has been in place for more than a decade, and interpretation accuracy and reporting accuracy have been particularly important components of the current practices of many PT programs. One could therefore say that an integrated approach that includes the assessment of reporting and interpretation by PT programs is by now “traditional”, since many programs have moved away from assessing only (one or more) analytical components of molecular biomarker assays.
Bisson et al. also claim that, in their inaugural run, “this exercise is designed to capture many of the additional pre- and post-analytic issues that can be most challenging to oncologists when treating their patients”. It is exceedingly difficult for PT programs to capture the performance of laboratories with respect to pre-analytical issues. Samples sent to participating laboratories must be identical when analytical and post-analytical components are being tested. Sending samples with different pre-analytical conditions to laboratories would assess how their assays perform under those conditions, but such conditions may not be relevant to participants, because it is their local pre-analytical conditions that determine the success or failure of their testing. What is considered “pre-analytical” also depends on the testing methodology: if tissue scraping from unstained slides or DNA/RNA extraction is considered pre-analytical, these components may be tested for their quality. Knowing these difficulties and limitations, PT programs rarely assess pre-analytical conditions [12,13]. A comment on this component of PT is therefore relevant: since the inclusion of the whole pre-analytical phase in PT is not realistic, a claim that the pre-analytical phase was also assessed may lead to wrong conclusions about what PT results mean for laboratory practice with patients’ samples.
Finally, regarding turnaround time (TAT), most PT programs take it very seriously and fail laboratories that do not submit the results of the challenge within the prescribed/allowed time. Clearly, TAT is important, and it should continue to be routinely assessed in PT runs [10,14]. However, modern laboratory practice also expects laboratories to internally audit their TAT for any given assay, as this is more relevant to their service to patients. In addition to PT challenges in which laboratories fail if they do not submit results by the expected deadline, another, perhaps more efficient, way to improve and standardize the optimal TAT for biomarkers in oncology would be to include it among the parameters assessed for laboratory accreditation.
The potential confusion for PT programs, PT participants, journals that publish results of PT for biomarkers in oncology, and their reviewers is at least partly compounded by a lack of transparency in PT in general, as well as by the failure to implement a fit-for-purpose approach and to use the current terminology that applies to PT and, more broadly, to the validation of biomarkers in oncology. How is this relevant to the paper by Bisson et al.? We first need to ask: what was the purpose of the published CPQA PT run [1]? Within the holistic approach, we need to differentiate between the following possible scenarios: (1) was the purpose to test whether the appropriate therapy would be selected, based on the laboratory reports, by the oncologists who usually read them, or (2) was the purpose to test whether the appropriate therapy would be selected, based on the laboratory reports, by external experts who do not usually read these laboratories’ reports? Since all laboratories provided correct results for the assays, it is the reporting that became the issue: the external experts misinterpreted some of the correct results. As mentioned above, this provides feedback to laboratories and an incentive to improve their reporting. However, the issue is larger than it superficially appears, and it leads us back to terminology and to transparency about what is being tested. The analytical phase of any assay is finalized when the results are generated, and the generation of the results is the direct consequence of a “readout”. A readout may be generated automatically by an instrument or its associated software (bioinformatics and variant interpretation, which involve an interpretive algorithm but are not to be confused with the “interpretation of the results”), or it may be generated by a human, as is the case with in situ methodologies such as immunohistochemistry (IHC) and fluorescence in situ hybridization (FISH), where pathologists or technologists/scientists provide the readout. The difference between a readout and an interpretation was only recently highlighted for in situ methodologies, because this is where it is most often a source of conceptual confusion [15]. The reporting of the results may or may not include “interpretation”. Distinguishing between readout and interpretation is critical for PT programs.
When the interpretation of results is assessed by PT programs, we need to ask whose job it is to provide the interpretation of a readout; this tells us what needs to be included in the biomarker report. If it is the laboratory’s duty to provide an interpretation, the laboratory could and should be judged on its performance if that interpretation is lacking, unclear, or misleading. If it is the duty of oncologists, then they should take the blame for misinterpretation. Unfortunately, this issue is even more complex, because the original purpose of the biomarker needs to be considered for any interpretation of any given biomarker. Let us use the relatively simple example of a TP53 mutation, which occurs in many different neoplasms. The mutation will be detected in tumors irrespective of whether it is germline or somatic. If a TP53 mutation is detected, the readout will include the mutational burden expressed as the variant allele frequency (VAF), given as a percentage. The interpretation could include whether the VAF is consistent with a somatic or a germline mutation, and also whether that level is significant for a given, specific disease (e.g., if it is somatic, there is a 6% cut-off for myelodysplastic syndromes) [16]. Additional interpretation requires a judgment of whether the patient would be treated differently (e.g., receiving a stem cell transplant or not, which often depends not only on the results of a biomarker assay but also on other relevant clinical parameters). If laboratories perform no interpretation, they will report only the presence or absence of the TP53 mutation and the VAF. Most laboratories today include some interpretation in their report, and we would argue that some interpretation is required (e.g., in this example, providing information about the meaning of the VAF, somatic vs. germline), but additional interpretation regarding how to treat the patient, or whether to request a genetic consultation, could be left in the hands of oncologists.
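To make the boundary between readout and interpretation concrete, the minimal sketch below (not part of the original comment; the field names, the germline heuristic, and its thresholds are illustrative assumptions, not a clinical algorithm) separates what the assay itself reports for a TP53 variant from what a laboratory might add as interpretation, using the 6% VAF cut-off for myelodysplastic syndromes cited in [16].

```python
# Illustrative sketch only: separating the assay "readout" from a possible
# laboratory "interpretation" for a TP53 variant. All names and thresholds
# (other than the 6% MDS cut-off from reference [16]) are assumptions.
from dataclasses import dataclass


@dataclass
class Readout:
    """What the assay itself produces: detection status and VAF (%)."""
    tp53_detected: bool
    vaf_percent: float


@dataclass
class LabInterpretation:
    """What the laboratory may add: likely origin and disease-specific significance."""
    likely_origin: str          # "germline" vs. "somatic" (heuristic only)
    significant_for_mds: bool   # per the 6% VAF cut-off cited in [16]


def interpret(readout: Readout) -> LabInterpretation:
    # Heuristic for illustration: a VAF near 50% may suggest a germline variant;
    # real laboratories rely on confirmatory testing, not a single threshold.
    likely_origin = "germline" if 40.0 <= readout.vaf_percent <= 60.0 else "somatic"
    significant_for_mds = readout.tp53_detected and readout.vaf_percent >= 6.0
    return LabInterpretation(likely_origin, significant_for_mds)


# Example: a detected TP53 variant at 8% VAF would be flagged as likely somatic
# and above the assumed MDS significance cut-off; how (or whether) to act on it
# remains a clinical decision outside the laboratory report.
print(interpret(Readout(tp53_detected=True, vaf_percent=8.0)))
```

A PT program that assesses “interpretation” needs to be explicit about which of these two layers it is grading, since only the first is an unavoidable laboratory responsibility.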
If a patient received the wrong therapy based on their biomarker results, this could be the result of incorrect pre-analytical conditions of the patient’s sample, an incorrect assay protocol, an incorrect readout of the assay, an incorrect interpretation at the level of the laboratory, an incorrect interpretation by the oncologist, or any combination of these. The interpretation of the results of biomarker testing in oncology is critical for patient safety. Since a fit-for-purpose approach is critical and interpretation has two levels (laboratory interpretation and clinical interpretation), the complexity of the PT task (where interpretation is included in the assessment) has to be carefully considered in the planning, design, execution, and interpretation of the results of PT challenges. Precision medicine demands that everything be fit-for-purpose: not only the biomarker laboratory testing, but also the PT for biomarkers.

Conflicts of Interest

Dr. Emina Torlakovic is a Director of the Canadian Biomarker Quality Assurance, an academic proficiency testing program. Dr. Nicola Normanno declares no conflict of interest.

References

  1. Bisson, K.R.; Won, J.R.; Beharry, A.; Carter, M.D.; Dudani, S.; Garratt, J.G.; Loree, J.M.; Snow, S.; Yip, S.; Sheffield, B.S. Novel Approach to Proficiency Testing Highlights Key Practice Variations in Cancer Biomarker Delivery. J. Mol. Pathol. 2024, 5, 1–10. [Google Scholar] [CrossRef]
  2. van Krieken, J.H.; Normanno, N.; Blackhall, F.; Boone, E.; Botti, G.; Carneiro, F.; Celik, I.; Ciardiello, F.; Cree, I.A.; Deans, Z.C.; et al. Guideline on the requirements of external quality assessment programs in molecular pathology. Virchows Arch. 2013, 462, 27–37. [Google Scholar] [CrossRef]
  3. Dufraing, K.; Fenizia, F.; Torlakovic, E.; Wolstenholme, N.; Deans, Z.C.; Rouleau, E.; Vyberg, M.; Parry, S.; Schuuring, E.; Dequeker, E.M.C.; et al. Biomarker testing in oncology—Requirements for organizing external quality assessment programs to improve the performance of laboratory testing: Revision of an expert opinion paper on behalf of IQNPath ABSL. Virchows Arch. 2021, 478, 553–565. [Google Scholar] [CrossRef]
  4. Raggi, C.C.; Pinzani, P.; Paradiso, A.; Pazzagli, M.; Orlando, C. External quality assurance program for PCR amplification of genomic DNA: An Italian experience. Clin. Chem. 2003, 49, 782–791. [Google Scholar] [CrossRef]
  5. Schrijver, I.; Aziz, N.; Jennings, L.J.; Richards, C.S.; Voelkerding, K.V.; Weck, K.E. Methods-based proficiency testing in molecular genetic pathology. J. Mol. Diagn. 2014, 16, 283–287. [Google Scholar] [CrossRef]
  6. Penault-Llorca, F.; Kerr, K.M.; Garrido, P.; Thunnissen, E.; Dequeker, E.; Normanno, N.; Patton, S.J.; Fairley, J.; Kapp, J.; de Ridder, D.; et al. Expert opinion on NSCLC small specimen biomarker testing—Part 2: Analysis, reporting, and quality assessment. Virchows Arch. 2022, 481, 351–366. [Google Scholar] [CrossRef]
  7. Davies, K.D.; Farooqi, M.S.; Gruidl, M.; Hill, C.E.; Woolworth-Hirschhorn, J.; Jones, H.; Jones, K.L.; Magliocco, A.; Mitui, M.; O’Neill, P.H.; et al. Multi-Institutional FASTQ File Exchange as a Means of Proficiency Testing for Next-Generation Sequencing Bioinformatics and Variant Interpretation. J. Mol. Diagn. 2016, 18, 572–579. [Google Scholar] [CrossRef]
  8. Bridge, J.A.; Halling, K.C.; Moncur, J.T.; Souers, R.J.; Hameed, M.R.; Fernandes, H.; Roy, A.; Surrey, L.; Tafe, L.J.; Vasalos, P.; et al. RNA Sequencing for Solid Tumor Fusion Gene Detection: Proficiency Testing Practice and Performance Comparison. Arch. Pathol. Lab. Med. 2023, 148, 538–544. [Google Scholar] [CrossRef]
  9. Segal, J.P. Next-Generation Proficiency Testing. J. Mol. Diagn. 2016, 18, 469–470. [Google Scholar] [CrossRef]
  10. Laudus, N.; Nijs, L.; Nauwelaers, I.; Dequeker, E.M.C. The Significance of External Quality Assessment Schemes for Molecular Testing in Clinical Laboratories. Cancers 2022, 14, 3686. [Google Scholar] [CrossRef]
  11. Bellon, E.; Ligtenberg, M.J.; Tejpar, S.; Cox, K.; de Hertogh, G.; de Stricker, K.; Edsjö, A.; Gorgoulis, V.; Höfler, G.; Jung, A.; et al. External quality assessment for KRAS testing is needed: Setup of a European program and report of the first joined regional quality assessment rounds. Oncologist 2011, 16, 467–478. [Google Scholar] [CrossRef]
  12. Malentacchi, F.; Pazzagli, M.; Simi, L.; Orlando, C.; Wyrich, R.; Hartmann, C.C.; Verderio, P.; Pizzamiglio, S.; Ciniselli, C.M.; Tichopad, A.; et al. SPIDIA-DNA: An External Quality Assessment for the pre-analytical phase of blood samples used for DNA-based analyses. Clin. Chim. Acta 2013, 424, 274–286. [Google Scholar] [CrossRef]
  13. Malentacchi, F.; Pizzamiglio, S.; Ibrahim-Gawel, H.; Pazzagli, M.; Verderio, P.; Ciniselli, C.M.; Wyrich, R.; Gelmini, S. Second SPIDIA-DNA External Quality Assessment (EQA): Influence of pre-analytical phase of blood samples on genomic DNA quality. Clin. Chim. Acta 2016, 454, 10–14. [Google Scholar] [CrossRef]
  14. Sadik, H.; Pritchard, D.; Keeling, D.M.; Policht, F.; Riccelli, P.; Stone, G.; Finkel, K.; Schreier, J.; Munksted, S. Impact of Clinical Practice Gaps on the Implementation of Personalized Medicine in Advanced Non-Small-Cell Lung Cancer. JCO Precis. Oncol. 2022, 6, e2200246. [Google Scholar] [CrossRef]
  15. Cheung, C.C.; D’Arrigo, C.; Dietel, M.; Francis, G.D.; Gilks, C.B.; Hall, J.A.; Hornick, J.L.; Ibrahim, M.; Marchetti, A.; Miller, K.; et al. From the International Society for Immunohistochemistry and Molecular Morphology (ISIMM) and International Quality Network for Pathology (IQN Path). Evolution of Quality Assurance for Clinical Immunohistochemistry in the Era of Precision Medicine: Part 1: Fit-for-Purpose Approach to Classification of Clinical Immunohistochemistry Biomarkers. Appl. Immunohistochem. Mol. Morphol. 2017, 25, 4–11. [Google Scholar]
  16. Belickova, M.; Vesela, J.; Jonasova, A.; Pejsova, B.; Votavova, H.; Merkerova, M.D.; Zemanova, Z.; Brezinova, J.; Mikulenkova, D.; Lauermannova, M.; et al. TP53 mutation variant allele frequency is a potential predictor for clinical outcome of patients with lower-risk myelodysplastic syndromes. Oncotarget 2016, 7, 36266–36279. [Google Scholar] [CrossRef]