1. Introduction
The procedure of dismantling and decommissioning (D&D) a nuclear power plant, and more broadly a nuclear facility, is a complex and highly regulated task [1,2,3,4] that involves various stages. During all of them, a previously defined Radiological Environmental Monitoring Plan (REMP), adapted to the different phases of the D&D, must be followed. A key aspect of this plan is the definition of the radionuclides to be determined, a list which may evolve during the different phases of the D&D process and which may also affect the types of matrices to be radiologically characterised. The results of this characterization must align with regulatory requirements to ensure the protection of the environment and public health.
The basic aspects of this radiological characterisation are the definition of the detection limits that must be achieved for the radionuclides to be determined, the matrices to be measured, and the objectives of the characterisation.
In the context of D&D, detection limits are of major importance in the analysis of radionuclides, since they refer to the lowest concentrations of the radionuclides that can be reliably measured by a given laboratory using its analytical methods and instrumentation. The consequences of establishing appropriate detection limits cannot be overstated, as they bear directly on the accuracy and reliability of the radiological characterization. The detection limits to be reached are therefore a fundamental input that the laboratories in charge of these determinations must know, since this allows them to outline their measurement strategy (amount of sample to be taken, counting time, etc.), including the most appropriate determination technique.
Detection limits depend on the type of sample to be analysed [5]. In this framework, the present paper aims to contribute to the discussion regarding water and the radionuclides that are likely to appear in it during the different phases of D&D processes. We have decided to start with water because it is a highly regulated matrix [6,7,8,9,10,11,12,13] and can have a strong impact on the radiological protection of the population, since it may end up being ingested.
From a radiological protection perspective and applying the ALARA radiation protection philosophy (doses as low as reasonably achievable), it is crucial that the doses to the population stay well below maximum allowable values. This requires that the activity concentrations of radionuclides in environmental samples, as one of the contributors to that dose, are kept as low as reasonably achievable. In this regard, regulatory authorities often set maximum allowable levels of radionuclide concentrations in different environmental sectors to protect public health and the environment.
Furthermore, these maximum levels should be significantly higher than the corresponding detection limits [14]. This would ensure that the concentrations measured are easily distinguishable from the detection limits, allowing for precise and reliable assessments of the radiological impact [15,16].
In the context of D&D, certain environmental compartments studied in the aforementioned REMP are of particular concern due to their potential impact on public health and the environment. Water is of great importance owing to its capacity to transfer the substances it contains to other environmental compartments and to transport them over long distances. In addition, a significant part of rainwater, surface water, and groundwater may end up being exploited for domestic use or irrigation and, hence, human consumption [6,7,17,18,19,20], which calls for thorough radiological monitoring.
The detection limit is defined as the smallest true value of the measurand that ensures a specified probability of being detectable by the measurement procedure, and it depends on many factors [21]: the background of the measurement, the counting time, the volume of the sample being treated, and the overall performance of the radionuclide analysis [22]. Detection limits are therefore intrinsically variable for each type of determination, so for any measurement result to be meaningful, both the uncertainty and the detection limit must be stated.
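As an illustration only (the expression below is the classical Currie approximation, not a formula taken from [21] or [22]): for a measurement with a well-known background of $B$ counts, the detection limit expressed in counts is approximately $L_D \approx 2.71 + 4.65\sqrt{B}$ (for 5% probabilities of errors of the first and second kind), and the corresponding detection limit in activity concentration is $DL \approx L_D/(\varepsilon\, t\, V)$, where $\varepsilon$ is the counting efficiency, $t$ the counting time, and $V$ the sample volume analysed. Lower backgrounds, longer counting times, and larger sample volumes all reduce $DL$, which makes the variability noted above explicit.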
The discussion above raises the issue of establishing the maximum detection limit required for each specific radionuclide determination. In the case of drinking water, the relationship between guidance levels and detection limits has a straightforward solution: for every radionuclide relevant to the particular situation or facility, the required detection limit should be set significantly below the corresponding guidance level. Such guidance levels can be found in different standards and legislation [8,9]. However, the matter is more intricate in the scenario we are addressing: for the analysis of rainwater, surface water, and groundwater in the vicinity of a nuclear facility, no standard or regulation specifies guidance levels or corresponding detection limits.
In summary, while some national and international organizations have established detection limits for specific radionuclides in drinking water, there is a notable absence of regulations specifying the detection limits required for all radionuclides relevant to D&D analysis and for other types of water. This gap underscores the need for a unified approach to setting detection limits that ensures consistency and accuracy in radiological characterization.
In this context, the bodies in charge of the D&D process at nuclear facilities do not have a regulatory framework to define the detection limits to be achieved by the laboratories in the determination of radionuclides within the plan.
Therefore, these are defined on a case-by-case basis. Consider, for example, the radiological characterization of the environment at different nuclear facilities undergoing D&D in Spain. The radiological monitoring plans settled within the scope of those D&D activities required different detection limits for the characterization of groundwater, rainwater, surface water, and drinking water. Specifically, for H-3, Sr-90, and Cs-137 the mandatory detection limits were always the same (6, 0.025, and 0.2 Bq/L, respectively), but for C-14 they ranged between 1.8 and 6 Bq/L, for Fe-55 between 0.05 and 0.5 Bq/L, and for Ni-63 between 0.018 and 0.1 Bq/L.
An examination of the detection limits reveals significant differences, particularly for Fe-55 and Ni-63, across different monitoring plans. This variability suggests a lack of harmonization in the establishment of detection limits for radionuclide analysis in D&D. Such inconsistencies can lead to uncertainty in the radiological characterization process, making it difficult to draw clear conclusions about the radiological impact of a decommissioned nuclear power plant.
This example highlights the need to harmonize these detection limits; contributing to that discussion is the objective of this paper. The following key elements are essential for achieving consistency and reliability when setting detection limits in any radiation protection framework:
Clear and Comprehensive Regulations: Regulatory authorities should develop clear and comprehensive regulations that specify detection limits for a wide range of radionuclides. These limits should consider the potential pathways of exposure, such as ingestion, inhalation, and external exposure, to ensure a comprehensive radiological assessment.
Scientific Consensus: Establishing detection limits should involve scientific consensus. This process should consider the specific characteristics of each radionuclide, including its half-life, decay scheme, and radiological impact. Additionally, it should account for different environmental matrices, such as soil, water, and air.
Risk Assessment: Detection limits should be set with the aim of minimizing risks to the environment and public health. The limits should be based on the principle of ALARA (As Low As Reasonably Achievable) to ensure that the radiological impact is kept well below acceptable levels.
Method Validation: Laboratories should validate their analytical methods to ensure that the established detection limits can be reliably achieved. This involves rigorous testing and quality control procedures to verify the accuracy and precision of measurements.
To address the challenges that laboratories in charge of radiological determinations face when establishing detection limits for radionuclide analysis in the radiological characterization of D&D activities, we propose a standardized approach within the scope of this paper. This approach aims to provide clear guidance for laboratories responsible for the radiological determination of rainwater, surface water, and groundwater in the D&D radiological framework.
To that end, we start from the assumption that all these types of water may eventually be used for public consumption; therefore, the annual effective dose limits for drinking water apply. We then follow the methodology described in the Guidelines for Drinking-water Quality of the World Health Organization (WHO) [8] to set maximum activity concentration levels for radionuclides, and we propose setting the corresponding detection limits at 1% of those maximum activity concentration levels.
2. Methods
In the context of radiation protection, our approach is based on two central assumptions.
First, we maintain a conservative standpoint in terms of radiological protection. This involves adopting protective measures and standards that, in the event of instability or unexpected situations, are more than adequate to mitigate any potential risks.
Second, the assessment of the potential impacts of surface water and groundwater contamination must address all the associated receptors of those types of water (drinking water sources, wetlands, storage, geothermal uses, etc.). We assume that surface water and groundwater could be consumed directly as drinking water, placing ourselves in a more protective and conservative position in the risk assessment, given the potential severity of the consequences of contaminating drinking water supplies.
Therefore, our radiological protection framework takes into account the possibility of radionuclide contamination in such water sources and endeavours to set appropriate limits to ensure that the potential radiological risks are within acceptable levels.
With these assumptions in mind, we propose the establishment of detection limits that are set at a level corresponding to one percent of the derived maximum activity concentration levels.
These maximum activity concentration levels are determined using the methodology followed by the WHO for defining the guidance levels in drinking water [8]. The WHO provides a systematic approach for determining the radionuclide activity concentration levels that would be safe for consumption. These levels are calculated considering factors such as the intake rate (2 L per day for one year for adults) and the "individual dose criterion", set by the WHO as an annual effective dose of 0.1 mSv/year.
Essentially, the maximum activity concentration level for each specific radionuclide is defined as the concentration at which said radionuclide, when consumed at a rate of 2 L per day for one year by an adult, results in an effective dose of 0.1 mSv per year. The WHO [8] considered that there is insufficient evidence to introduce separate levels for different age groups: although infants and children consume lower volumes of drinking water, their age-dependent dose coefficients are higher than those for adults, reflecting higher uptake or metabolic rates, so the two effects balance out.
Each activity concentration level, $ACL_j$, for radionuclide $j$ is obtained by using the expression of Equation (1):

$$ACL_j = \frac{IDC}{e_{\mathrm{ing},j} \cdot q \cdot 365} \quad (1)$$

where:
$IDC$ is the individual dose criterion, i.e., an annual effective dose of 0.1 mSv/year [8];
$e_{\mathrm{ing},j}$ is the effective dose coefficient for ingestion of radionuclide $j$ for members of the public (mSv/Bq), provided by the International Commission on Radiological Protection (ICRP) [23];
$q$ is the consumption rate of drinking water, assumed to be 2 L/day for adults [8].
With these units, $ACL_j$ is obtained in Bq/L.
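For illustration only (this sketch is ours, not part of the WHO methodology; the function name is hypothetical), Equation (1) translates directly into code:

# Minimal sketch of Equation (1), assuming IDC = 0.1 mSv/year and q = 2 L/day.
IDC = 0.1            # individual dose criterion (mSv/year)
Q = 2.0              # adult drinking-water consumption rate (L/day)
DAYS_PER_YEAR = 365

def activity_concentration_level(e_ing: float) -> float:
    """Maximum activity concentration level ACL_j (Bq/L) for a radionuclide
    whose ingestion dose coefficient e_ing is given in mSv/Bq."""
    return IDC / (e_ing * Q * DAYS_PER_YEAR)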
For the sake of conservativeness in terms of radiation protection, when we look at the annual effective dose limits set by various organizations for radionuclides in drinking water, we observe a spectrum of values (Table 1).
As can be seen in Table 1, these values range from the smallest limit of 0.04 millisieverts (mSv), established by the U.S. Environmental Protection Agency (EPA), to a limit of 1 mSv defined by the International Atomic Energy Agency (IAEA). The WHO, Canada, and the European Commission advocate a limit of 0.1 mSv.
In our commitment to maintaining a conservative stance in radiological protection, we have chosen to align with the WHO and the European Commission by adopting the 0.1 mSv limit as our reference. This is the most restrictive internationally accepted effective dose once the EPA limit of 0.04 mSv, which excludes naturally occurring radionuclides, is set aside.
Having established the 0.1 mSv limit as our reference, the next step involves calculating the corresponding activity concentration levels for each radionuclide. The results obtained with this approach are shown in Table 2 for some radionuclides.
For instance, one can consider the specific values for tritium, Fe-55, and Ni-63 (see Table 2). The calculated activity concentration levels amount to 7610 Bq/L for tritium, 415 Bq/L for Fe-55, and 913 Bq/L for Ni-63. These values indicate the concentration of each radionuclide in drinking water that would result in an annual effective dose of 0.1 mSv, taking into account an intake of 2 L per day for one year by an adult.
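These figures follow from the sketch above when it is supplied with the corresponding ICRP ingestion dose coefficients, assumed here to be $1.8\times10^{-8}$ mSv/Bq for tritium, $3.3\times10^{-7}$ mSv/Bq for Fe-55, and $1.5\times10^{-7}$ mSv/Bq for Ni-63:

for nuclide, e_ing in [("H-3", 1.8e-8), ("Fe-55", 3.3e-7), ("Ni-63", 1.5e-7)]:
    print(nuclide, round(activity_concentration_level(e_ing)), "Bq/L")
# -> H-3 7610 Bq/L, Fe-55 415 Bq/L, Ni-63 913 Bq/L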
This calculation could be performed for any artificial radionuclide that might ultimately be found in rainwater, surface water, and groundwater around any NPP undergoing D&D. Reference [24] proposes a list of such artificial radionuclides in the framework of specific NPP radiological monitoring plans.
As established before, our radiological protection proposal takes this a step further by aiming to establish sufficiently stringent detection limits. The IAEA [25,26] also regards the setting of good detection limits as a challenge and considers that monitoring techniques used to verify the exemption of practices or sources, or routine releases, should have detection limits well below the corresponding exemption levels or discharge limits, respectively. As an example, although from another field, the IAEA [26] proposes that the detection limits for on-line source monitoring programs could be less than 1% of the discharge limits, in order to adequately cover the releases of each radionuclide group and to safely demonstrate compliance with the authorized limits.
We propose detection limits that are set at 1% of the corresponding activity concentration levels. Some of these detection limits appear in the last column of Table 2. By setting detection limits at this level, we ensure that any trace presence of these radionuclides in drinking water can be identified and managed swiftly, well before they reach concentrations that could pose a risk to public health. Furthermore, the World Health Organization (WHO) [8] also sets the detection limits for chemicals below 1/100th of the guideline value.
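In terms of the sketch above, the proposal is a one-line extension (the function name is again hypothetical):

def proposed_detection_limit(e_ing: float) -> float:
    """Proposed detection limit (Bq/L): 1% of the activity concentration level."""
    return 0.01 * activity_concentration_level(e_ing)

For tritium, for example, this yields 76 Bq/L, matching the value reported below.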
From a qualitative rather than quantitative point of view, it should be possible to accurately assess the presence of an individual radionuclide with detection limits much lower than its maximum activity concentration level, since water may contain more than one radiological contaminant. Such is the case for contamination from a D&D process, where the doses due to the different radionuclides must be summed and the total must remain below the dose limits.
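When several radionuclides are present, the WHO guidelines [8] express this additivity through a simple screening condition, written here in terms of the activity concentration levels used in this paper:

$$\sum_j \frac{C_j}{ACL_j} \le 1$$

where $C_j$ is the measured activity concentration of radionuclide $j$. Keeping each detection limit at 1% of its $ACL_j$ leaves ample margin for the summed contributions of all radionuclides present.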
Accordingly, the detection limits established in this way for the different radionuclides should be fit for purpose, as their achievement would ensure that the objective of the Radiological Environmental Monitoring Plan is met: any concentration value measured would be clearly distinguishable from the corresponding radionuclide activity concentration level, taking into account the uncertainties associated with the measurement process. In this way, any laboratory under routine operating conditions would achieve detection limits far below the activity concentration levels.
Thus, the corresponding detection limits for the radionuclides studied as examples would be 0.05 Bq/L for Sr-90, 0.11 Bq/L for Cs-137, 76 Bq/L for tritium, 2.36 Bq/L for carbon-14, 4.15 Bq/L for Fe-55, and 9.13 Bq/L for Ni-63. These detection limits represent a vigilant and precautionary approach to radiological protection, enabling the timely identification and mitigation of potential risks, under the assumption that all these waters could ultimately serve as public drinking water.
3. Results and Discussion
In order to plan and conduct monitoring for compliance with the levels for exemption and clearance in D&D, the responsible organization needs to establish a monitoring program in a responsible and effective manner. Among the management issues, the data quality objectives of the monitoring program must be considered, including, of course, the specification of detection limits for all measurement techniques and radionuclides. This is the case for the monitoring of installations in the D&D stage.
The focus is now on the debate as to what the required detection limits should be, since these do not depend on the laboratory itself, but on the body responsible for the process and, ultimately, on the regulatory authorities.
Much effort has been expended in recent years on harmonizing the concepts of decision thresholds and detection limits through the corresponding ISO standard [21]. However, nothing is established about the values to be attained in the different environmental radiation monitoring plans. This remains an open question for regulatory bodies, governments, and other relevant authorities, one that the private or public entities in charge of carrying out the radiation monitoring plans must ultimately cope with. In this situation, there is a clear lack of harmonisation in the establishment of the detection limits to be reached for the different matrices to be analysed.
This work makes a technical proposal for the radiological analysis of water based on ensuring an adequate level of radiological protection. Having established the methodology for deriving detection limits for the characterization of radionuclides in water in D&D scenarios, it becomes feasible to compare these detection limits with the values currently adopted.
As an illustrative example, we can compare the results for Sr-90, Cs-137, tritium, C-14, Fe-55, and Ni-63 from Table 2 with the detection limits required by some REMPs for different installations in the D&D stage in Spain, shown in the Introduction. For Sr-90 the required detection limit is 0.025 Bq/L, for Cs-137 it is 0.2 Bq/L, for tritium (H-3) it is always 6 Bq/L, for carbon-14 (C-14) it is between 1.8 and 6 Bq/L, for iron-55 (Fe-55) it is between 0.05 and 0.5 Bq/L, and for nickel-63 (Ni-63) it is between 0.018 and 0.1 Bq/L.
Notably, the detection limits proposed in Table 2 are at least an order of magnitude greater than those presently in use, except for Sr-90, Cs-137, and C-14, which are of similar magnitude.
On the one hand, the attainment of the proposed detection limits in this study would enable the unequivocal identification of any activity concentration in water capable of resulting in an annual effective dose through ingestion of 0.1 mSv, which stands as the most stringent internationally accepted effective dose criterion.
On the other hand, achieving even lower detection limits would facilitate a more comprehensive radiological characterization, thereby enabling the identification of exceedingly small radionuclide activity concentrations in water. This heightened sensitivity would serve as a valuable tool for diagnosing faults within the nuclear power facility and for checking the quality and progress of certain steps of the D&D process. For instance, in the event of a minor water leakage, early detection and response could be initiated.
In summary, the establishment of lower detection limits would yield a more exhaustive and stringent quality control framework for D&D tasks, ultimately enhancing the safety and efficiency of nuclear facility decommissioning procedures.
Clear examples of this conclusion are two different situations detected by our laboratory around a facility under decommissioning. Groundwater was sampled at different points around the facility over several years, and all samples were analysed for different radionuclides; Ni-63 and H-3 were determined using liquid scintillation spectrometry techniques. A full explanation of the results and the monitoring program can be found in [27].
In the first case, the activity levels of Ni-63 detected, in Bq/L, can be seen in Figure 1.
The activity concentration of Ni-63 exhibited temporal variability, characterized by two distinct sharp increases since the onset of the dismantling process. Upon applying the proposed detection limit, only the latter surge in activity would have been discernible. Nevertheless, it is crucial to bear in mind that, even at the highest recorded activity concentration level, the projected annual radiation dose to the public, assuming year-round consumption of this water, would amount to a negligible 2.2 μSv.
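As a back-of-the-envelope check (the Ni-63 dose coefficient below is the value assumed earlier, and the variable names are ours), inverting Equation (1) relates the quoted dose to a concentration:

# Concentration corresponding to a given annual dose, inverting Equation (1):
# C (Bq/L) = D (mSv/year) / (e_ing (mSv/Bq) * 2 L/day * 365 days)
E_ING_NI63 = 1.5e-7    # assumed ICRP ingestion dose coefficient for Ni-63 (mSv/Bq)
dose = 0.0022          # the 2.2 uSv annual dose quoted above, in mSv
print(dose / (E_ING_NI63 * 2.0 * 365))   # ~20 Bq/L of Ni-63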
However, because the laboratory worked with much lower detection limits than those now proposed, it was possible to analyse the causes of this slight increase in the water and, if needed, to take corrective measures.
In the second scenario, the activity concentration of tritium, measured in Bq/L, at three distinct sampling locations over the course of four years is depicted in Figure 2. It is evident that the concentration of tritium exhibited temporal variations, at times displaying pronounced fluctuations. With the adoption of the proposed detection limit, a significant portion of the tritium leakages during this period would have been readily identified. However, it is noteworthy that even at the highest observed activity concentration level, the resulting annual effective dose to the public, assuming year-round consumption of this water, would amount to a mere 13 μSv, a negligible level of exposure.
Setting a universal detection limit for each radionuclide at 1% of the maximum permissible activity concentration in water is reliable in terms of radiation protection, of practical interest for the laboratories involved in radionuclide analysis, and of maximum interest for regulatory authorities, as it would allow them to compare results from those analyses independently of the monitoring plan they come from. Moreover, since it is accepted practice to derive activities from a fraction of the detection limit in cases where activities fall below those limits, it would make more sense to derive the result knowing that those limits are 1% of the maximum allowable activity concentration.
4. Conclusions
The radiological characterization of areas surrounding decommissioned nuclear facilities is a critical step in the D&D process. To guarantee the accuracy and reliability of this characterization, it is essential to establish appropriate detection limits for radionuclide analysis. Detection limits should be harmonized, well-defined, and in accordance with regulatory requirements to ensure that the radiological impact remains within safe limits.
To address these challenges and to provide clear guidance to the laboratories in charge of radiological determinations regarding the detection limits for radionuclide analysis within the radiological characterization framework of D&D activities, we have proposed a standardized approach.
We take radiation protection to be the main purpose of sample analysis and of radiological characterization and monitoring. The radiological protection framework assumed is firmly based on the principle of conservatism, adopting the WHO methodology for guidance levels in drinking water to establish activity concentration levels, together with the most restrictive internationally accepted annual effective dose limit of 0.1 mSv. Furthermore, we propose detection limits set at 1% of these activity concentration levels to ensure early identification and management of potential radiological risks. This approach embodies a commitment to robust radiological protection and to the well-being of individuals who rely on surface water and groundwater for their drinking water needs.
The proposed standardized approach provides a framework for laboratories to set and achieve these limits, ultimately contributing to the safe and effective decommissioning of nuclear facilities and to the protection of the environment and public health. The detection limits proposed here fill the existing gap in the harmonization of radiological characterization.
Moreover, the detection limits proposed here are easily achieved in radiological laboratories equipped with standard radiometric equipment using conventional methods, with average sample volumes and short counting times.
In any case, if lower detection limits were required by regulatory authorities, a more complete radiological characterization could be achieved; although negligible from the radiation protection point of view, this would make it possible to uncover certain technical problems in D&D. It is ultimately a question of cost–benefit analysis: lower detection limits entail higher costs in terms of radionuclide analysis, but at the same time they can reduce technical costs if the risks of possible technical problems are assessed beforehand.