Use of Biological Dosimetry for Monitoring Medical Workers Occupationally Exposed to Ionizing Radiation

Simple Summary: The use of medical procedures utilizing ionizing radiation has grown significantly over the last several decades. Although radiation protection recommendations limit the exposures, the personnel administering the radiation can still be exposed to radiation doses capable of inducing adverse biological effects. The routine assessment of exposures is based on readings from personal dosimeters worn during work. Studies show that a significant fraction of medical workers use radiation protection measures, including personal dosimeters, inconsistently, which might lead to underestimation of occupational exposures. Biological methods measuring the cellular effects of radiation could provide an additional tool for a more accurate assessment of the genotoxic effects of radiation in medical workers. The aim of this paper is to summarize the biological methods used for dose estimation in occupationally exposed medical workers, and to assess the practical use of such methods in occupational dosimetry.

Abstract: Medical workers are the largest group exposed to man-made sources of ionizing radiation. The annual doses received by medical workers have decreased over the last several decades; however, for some applications, like fluoroscopically guided procedures, occupational doses still remain relatively high. Studies show that for some procedures the operator and staff still use insufficient protective and dosimetric equipment, which might cause an underestimation of occupational exposures. Physical dosimetry methods are a staple for estimating occupational exposures, although due to the inconsistent use of protection measures, an alternative method such as biological dosimetry might complement the physical methods to achieve a more complete picture. Such methods have been used to detect exposures to doses as low as 0.1 mSv/year, and could be useful for a more accurate assessment of the genotoxic effects of ionizing radiation in medical workers.
Biological dosimetry is usually based on the measurement of effects present in peripheral blood lymphocytes. Although some methods, such as chromosome aberration scoring or the micronucleus assay, show promising results, currently no single method is recognized as the most suitable for dosimetric application in the case of chronic, low-dose exposures. In this review we evaluate the different methods used for biological dosimetry in the assessment of occupational exposures of medical workers.

Contributions: Conceptualization, A.D., K.K. and W.M.S.; literature review, I.P. and A.D.; investigation, I.P., A.D. and K.K.; writing—original draft preparation, I.P. and A.D.; writing—review and editing, I.P., A.D., K.K. and W.M.S.; supervision, K.K. and W.M.S.


Introduction
Medical workers are the largest group exposed to man-made sources of ionizing radiation [1]. Over the last century the doses recorded in occupationally exposed medical workers have decreased significantly [2]. However, for some applications, such as fluoroscopically guided (FG) procedures or nuclear medicine procedures, the doses received by personnel can be significantly higher. Physical dosimetry methods are routinely used to estimate the annual doses received by workers. Although personal dosimeters are a standard in most cases of occupational exposure, insufficient compliance with radiation protection recommendations can result in significantly inaccurate measurements. Additionally, personal detectors have limits of detection, which can result in under-recorded doses for some exposed groups [3]. An increasing number of physicians without sufficient radiation safety training are using radiation techniques such as fluoroscopic procedures, which increases both the risk of exposure and the risk of dose underestimation [4,5]. Studies show that a significant fraction of fluoroscopy operators do not use appropriate radiation protection measures, such as wearing dosimeters regularly or using lead shielding and glasses [6,7]. Biological dosimetry methods could help with a more accurate estimation of ionizing radiation effects in these groups.
Biological dosimetry methods have been used to measure the effects of radiation since as early as the 1960s. Although the acute and chronic effects of high doses of ionizing radiation have been well described, there is still some debate on the effects of exposure to low doses of radiation over long periods of time [8]. Nevertheless, many studies have been able to measure genotoxic effects of ionizing radiation in workers chronically exposed to low doses. Several of these methods have been successfully used to estimate the effects of radiation incidents, where dose assessment based on personal dosimeters is not possible. Methods such as scoring of chromosomal translocations by fluorescence in situ hybridization (FISH) were used during the response to the caesium-137 (137Cs) accident in Goiânia, Brazil in 1987 [9]. Biological dosimetry methods proved useful not only for the estimation of absorbed dose immediately after the accident, but also up to 10 years after the exposure. Several cytogenetic methods have been successfully used for estimation of high- and low-dose radiation exposures [10], and the dicentric chromosome assay (DCA) is currently recognized as a standard for radiation biodosimetry [11,12]. Although the DCA has been used in numerous studies for dose estimation after accidental exposures, there is no consensus on whether it can reliably reflect the doses to which radiation workers are exposed chronically. The present study reviews whether additional methods of dose estimation might be beneficial for more accurate risk assessment in medical personnel exposed to ionizing radiation, and provides data on the reliability of the most widely used biodosimetry methods.
For this study, an electronic literature search was conducted between October and December 2020 using the PubMed database for articles published in English investigating the use of biological dosimetry methods for estimation of occupational exposures. The search keywords were "ionizing radiation", "biological dosimetry", "occupational exposure", "medical staff", "comet assay", "micronuclei", "chromosomal aberrations", and "DNA methylation". After the database search, the references of relevant studies were screened to obtain papers fitting the topic of the review. We included experimental studies which used biological dosimetry methods to detect occupational exposures of medical workers. We excluded studies in which biological dosimetry methods were used to estimate radiation exposure after accidents, and studies in which these methods were used to estimate occupational doses to non-medical staff.

Radiation Protection Recommendations
The decrease in annual occupational doses to medical workers since the implementation of radiation procedures was possible thanks to the evolution of radiation protection measures. Current recommendations from European Council Directive 2013/59/Euratom set the basic safety standards regarding ionizing radiation exposure for radiation workers in the European Union [13]. This document takes into account the recommendations made by the International Commission on Radiological Protection (ICRP) [14,15]. The Directive sets out several occupational exposure limits, inter alia: the single-year effective dose should not exceed 20 mSv (50 mSv in special circumstances); the equivalent dose to the eye lens should not exceed 20 mSv in a single year, or 100 mSv in any five consecutive years subject to a maximum dose of 50 mSv in a single year; and the single-year dose to the skin shall not exceed 500 mSv. Additionally, the Directive suggests that member states should classify exposed workers into two categories: category A, workers "liable to receive an effective dose greater than 6 mSv per year or an equivalent dose greater than 15 mSv per year for the lens of the eye or greater than 150 mSv per year for skin and extremities"; and category B, "exposed workers who are not classified as category A workers". This categorisation was introduced for the purposes of monitoring and surveillance of radiation exposures, and the recommendations for individual monitoring as well as for medical surveillance differ between the two groups. Although recommendations on dose limits are very important for radiation protection, the ICRP also emphasized the importance of introducing a framework of radiological protection elements regarding dose measurement techniques, estimation of exposures to different tissues, and auditing potentially dangerous procedures [16].
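To make the categorisation rule concrete, the Directive's annual-dose criteria for category A can be expressed as a simple threshold check. The sketch below is illustrative only: the function name and structure are our own, and actual classification is a prospective assessment of the doses a worker is liable to receive, not a calculation over recorded doses.

```python
def classify_worker(effective_mSv, lens_mSv, skin_extremity_mSv):
    """Illustrative category A/B check based on the annual-dose criteria
    quoted above from Directive 2013/59/Euratom.
    Category A: liable to receive >6 mSv effective dose, >15 mSv to the
    eye lens, or >150 mSv to skin and extremities per year."""
    if effective_mSv > 6 or lens_mSv > 15 or skin_extremity_mSv > 150:
        return "A"
    return "B"

# Example: a fluoroscopy operator liable to receive 8 mSv/year falls into
# category A, while a dental hygienist at ~0.1 mSv/year is category B.
```

Note that a worker can be category A on a single organ criterion (e.g. the eye lens) even when the effective dose is low, which is why the Directive lists the three thresholds separately.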
The authors point out the importance of measuring not only the effective dose, but also the doses that could be received by organs such as the eye lens and the extremities, which are not protected by the apron. For dose measurements, the authors recommend the use of two dosimeters: one worn over the apron (at the collar) and one worn under the apron, allowing a more accurate estimation of the effective dose and an indication of the dose received by the eye lens and the head. Exposure of the eyes should be considered an important aspect of medical staff protection, since reports indicate that excessive radiation doses to the lens can result in the development of lens opacities over time [7,17]. The proper use of lead glasses and ceiling-suspended shields can significantly reduce the dose to the lens. The authors also emphasize the importance of initial and periodic education and training of the staff involved in interventional procedures. Such training should cover exposure monitoring and dose assessment, as well as protection methods and garments. The International Atomic Energy Agency (IAEA) provides recommendations for medical staff in interventional fluoroscopy which, when followed, should reduce annual radiation doses to the range of 0 to 5 mSv [18]. The IAEA recommends wearing a lead apron and lead glass eyewear, using protective shields, always wearing personal radiation monitoring badges, standing at the correct distance and position relative to the patient, and using fluoroscopy methods that reduce the patient dose. The implementation of proper radiation protection measures and devices, as well as training of the personnel, allowed for the decrease in doses received by medical workers over the years, despite the growing number of performed procedures.
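The two-dosimeter recommendation above can be illustrated with one commonly cited form of the double-dosimeter (Niklason-type) effective dose estimate, which combines the under-apron reading with a small weight on the excess of the unshielded over-apron reading. The 0.06 weighting below is quoted from secondary literature and should be treated as an assumption for illustration, not as an operational dosimetry formula.

```python
def effective_dose_estimate(h_under_mSv, h_over_mSv, over_weight=0.06):
    """Illustrative double-dosimeter estimate of effective dose.
    h_under_mSv: reading of the dosimeter worn under the lead apron.
    h_over_mSv:  reading of the collar dosimeter worn over the apron.
    The over-apron excess is down-weighted because most of the body is
    shielded by the apron; the 0.06 weight is an assumed, commonly
    cited value, not a verified constant."""
    return h_under_mSv + over_weight * (h_over_mSv - h_under_mSv)

# Example: 0.1 mSv under the apron and 2.1 mSv at the collar give an
# estimated effective dose of about 0.22 mSv for the monitoring period.
```

The general design point stands regardless of the exact weight: the collar dosimeter alone would grossly overestimate the effective dose of an apron-wearing worker, while the under-apron dosimeter alone would ignore the dose to the unshielded head and eyes.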

Occupational Radiation Doses over the Years
According to the 2008 UNSCEAR (United Nations Scientific Committee on the Effects of Atomic Radiation) report, three categories of medical practice involving ionizing radiation exposure can be identified: diagnostic radiology, nuclear medicine, and radiation therapy [19]. Medical workers performing these diagnostic and therapeutic procedures constitute the largest group of occupationally exposed workers. In the last several decades, the number of performed procedures involving ionizing radiation exposure has greatly increased, causing a surge in the exposure of radiation workers. Mettler et al. estimated that from 1980 to 2006 the number of radiologic and nuclear medicine procedures performed in the United States increased about 10-fold and 2.5-fold, respectively, with an estimated 600% increase in the per-capita effective dose from these procedures [20]. Researchers have also pointed out that the frequency of CT scanning increased threefold between the periods 1977-1980 and 1997-2007, and that in 2006 in the United States, CT scanning constituted the largest contributor to the annual per-capita effective radiation dose from man-made sources [21].
Although an increase in the number of performed radiologic and nuclear medicine procedures was observed in several studies, it did not necessarily correlate with a similar increase in the average annual dose of ionizing radiation to which medical workers are exposed. On the contrary, multiple studies report that the registered average annual doses decreased significantly for personnel in the majority of medical occupations exposed to ionizing radiation. This downward trend started as early as the 1950s, due to the implementation of improved protection devices and stricter radiation protection guidelines. Additionally, technological developments and advances implemented by equipment manufacturers allowed for further reduction of occupational doses. Data published in NCRP (National Council on Radiation Protection and Measurements) Report No. 178 show that a steady decline in occupational doses for medical radiation workers has been observed since as early as the 1930s, and continued into the 2000s [2]. Before 1939, the average annual dose received by medical radiation workers was estimated at 70 mSv; by the 1970s the average had decreased to around 2 mSv/year, and it is currently below 1 mSv/year. Several large cohort studies confirm this trend. Zielinski et al. published an analysis of the exposures of 67,562 medical workers (physicians, nurses, nuclear medicine technicians, radiation technologists, physicists and others occupationally exposed to medical radiation) selected from the National Dose Registry of Canada [22]. The mean annual dose for this group reached its maximum in the mid-1950s, after which it steadily declined, averaging 0.36 mSv/year over the period 1971-1987. It is worth noting that the exposures measured in this cohort might be underestimated, as doses below the detection limit of the radiation dosimeter, i.e., <0.2 mSv, were recorded as zero.
It is important to recognize that doses can vary significantly between occupations. Linet et al. collected exposure data for medical radiation workers from mostly smaller studies dating as early as the 1920s [23]. The available data on workers' exposure before the 1940s are limited; however, estimates based on data published in the 1920s show greatly varying doses for radiologists, ranging from 900 mSv/year up to 7000 mSv/year. The registered doses decreased significantly in later years, averaging around 50 mSv annually in the 1950s, between 0.34 and 0.75 mSv in the 1980s, and between 0.08 and 0.23 mSv in the 2000s. It is worth noting that the estimated mean annual doses did not exceed the recommended dose limits of the corresponding time. Simon et al. investigated radiation doses for a cohort of 90,305 radiologic technologists in the U.S., which included persons who started work as early as 1916 [24]. This study also showed an overall decrease in annual doses between 1916 and 1984, from an average of 100 mSv/year to 2.3 mSv/year. The reported means were based on model predictions alone for doses before 1977; after 1977, the means included film-badge measurements as well. A study conducted by Zhang et al. on a Chinese medical diagnostic X-ray worker cohort showed results in accordance with those observed in the U.S. study [25]. Results published from this study of 27,000 workers showed that workers were exposed to an average annual dose of 182 mGy before 1949, which significantly decreased to 2.3 mGy/year between 1990 and 1994. Although the doses reported in this study differed from the U.S. cohort for the readings before the 1970s, the estimates for this period are often highly variable. For the measurements after the 1970s, the data correlated more closely with the previously mentioned U.S. study.
Apart from significant differences between the radiation doses registered for different occupations, exposures are often not homogeneous across the body, and significant variability has been observed between the exposures of different organs. This inhomogeneity of dose distribution might contribute to less reliable dose readings from personal dosimeters. A large cohort study conducted by Choi et al. estimated occupational exposures based on the dosimetry data from 94,396 Korean medical radiation workers, deposited between 1996 and 2011 in the National Dosimetry Registry [26]. The data came from recorded badge doses; additionally, the authors performed a reconstruction of historical badge doses before 1996. The authors observed significant differences between the exposures of medical radiation workers depending on sex, job title, and year of first exposure. The highest mean cumulative badge doses for the whole period were observed for radiologists (26.87 mSv) and radiologic technologists (15.96 mSv), and the lowest for dentists (1.53 mSv) and dental hygienists (0.61 mSv). Additionally, the authors observed significant differences in the exposures of different organs, with the highest cumulative doses observed in the thyroid gland (10.23 mGy) and the breast (5.03 mGy), and the lowest in the brain (1.17 mGy). Significant differences in doses received by different organs are caused by several factors. Firstly, carrying out medical procedures often exposes staff to doses that are inhomogeneous [27]. Additionally, radiation protection measures can differ in degree, and personnel often do not use dosimeters and protective equipment consistently. Changing protocols for medical procedures can also introduce uncertainties into organ dose estimations. The NCRP published Report No. 178, which provides guidelines for the estimation of organ doses from external radiation and internal irradiation, and for the assessment of related uncertainties [2].
Both the variability between organ doses and the overall annual doses registered for staff performing fluoroscopically guided (FG) procedures are among the highest of all medical occupations. Whereas for other radiological modalities, such as computed tomography (CT) scanning, staff can stay in a separate room during the procedure, for FG procedures the staff must work in close proximity to the patient (and radiation source) and remain in the room for most of the procedure. This results in physicians being exposed to considerable doses of radiation scattered from the patient during each performed procedure. Kim et al. compiled reported radiation doses for staff performing cardiac catheterization and showed that effective doses differed significantly depending on the procedure used [28]. Effective doses were estimated using the Niklason algorithm [29]. Doses per procedure ranged from 0.02 to 38 µSv for diagnostic catheterization (DC), 0.17 to 31.2 µSv for percutaneous coronary interventions (PCI), 0.24 to 9.6 µSv for ablations, and 0.29 to 17.4 µSv for pacemaker/intracardiac defibrillator implantations. Additionally, the authors observed significant differences in doses received by different anatomic sites, with eye level being exposed to mean doses ranging from 0.4 to 1100 µSv per procedure, 1.2 to 580 µSv at thyroid level, 3.5 to 750 µSv at trunk level, and between 0.4 and 790 µSv at hand level. It is worth noting that registered doses were significantly lower when measured under the protective equipment. The analysis of registered doses over time revealed that between the 1970s and 2010s the average dose received by staff during DC and ablation decreased, whereas the average dose slightly increased for PCI. Although the number of procedures over that period increased, the slight decrease in doses received during DC was attributed to shorter procedure times and decreased dose rates.
The increase in operator doses for PCI was caused by the increasing duration and complexity of cineradiography. The same group also reviewed doses received by staff during fluoroscopy-guided non-cardiac procedures [30]. The authors calculated effective doses per case for the following procedures: percutaneous nephrolithotomy (PCNL) (dose range from 1.7 to 56 µSv, median = 6.2 µSv), vertebroplasty (VP) (0.1 to 101 µSv, median = 14.3 µSv), orthopaedic extremity nailing (2.5 to 88 µSv, median = 9.8 µSv), biliary tract procedures (2 to 46 µSv, median = 5 µSv), transjugular intrahepatic portosystemic shunt creation (TIPS) (2.5 to 74 µSv, median = 17 µSv), head/neck endovascular therapeutic procedures (HN) (1.8 to 53 µSv, median = 5.2 µSv) and endoscopic retrograde cholangiopancreatography (ERCP) (0.2 to 49 µSv, median = 1.1 µSv). The registered doses were highest at the level of the hands (compared to the eyes, neck, and trunk). The authors observed that operator doses varied much more widely than patient doses, and depended on several factors, such as patient and lesion characteristics, experience of the operator (which influenced procedure time), use of protective equipment, position of the operator, and characteristics of the equipment used.

Compliance with Radiation Protection Measures
Although the radiation protection recommendations have become stricter over the last several decades, which is reflected in decreased medical exposures, insufficient compliance with the limits can result in inaccurate dose estimations, which can in turn increase the risk to the staff. Recent studies investigating compliance with the recommended radiation protection measures show insufficient use of these measures by medical workers. A survey of 159 US therapeutic endoscopists performing endoscopic retrograde cholangiopancreatography (ERCP) showed several shortcomings regarding personal protection measures [31]. The survey showed that the majority of respondents had not received formal training in operating their fluoroscopy system, and around 20% did not know whether their system allowed for modification of parameters like collimation, frame-rate modification, and pulsed fluoroscopy, which allow the user to minimize the patient dose. Although the vast majority of respondents always wore a lead apron and a thyroid shield while performing ERCP, only 20% declared that they always used lead glasses, and 40.7% consistently used a shielding lead curtain, which indicates an insufficient level of eye protection. Additionally, over half of the attending respondents did not consistently wear a dosimeter. Since radiation scattered from the patient is the main source of radiation to which the operator and staff are exposed, the underuse of dose-minimizing modifications could contribute to increased exposures of medical workers performing fluoroscopy-guided procedures. Combined with the inconsistent use of radiation protection measures, these results indicate that therapeutic endoscopists performing ERCP could be exposed to doses higher than necessary, which could also be underreported because of the lacking protection measures. Other studies on radiation safety practices among staff using fluoroscopic imaging methods have arrived at similar conclusions.
Van Papendorp et al. conducted a survey amongst orthopaedic surgeons and concluded that the majority of respondents use insufficient radiation protection measures (personal dosimeters, basic protection devices) and show insufficient knowledge of radiation safety [32]. Respondents cited unavailability as the main reason for the insufficient use of protective measures. Similar results were observed in a radiation safety survey conducted among urology residents and fellows in the United States [33]. The majority of respondents complied with lead body shield and thyroid shield usage (99% and 73%, respectively); however, almost none used lead glasses or gloves. Additionally, 70% of the trainees never wore dosimeters. A survey conducted by Soylemez et al. among European urology residents also showed that in this group the use of protective measures other than the lead apron was insufficient [34]. These results indicate that the occupational doses to which medical workers are exposed might be underreported because of the insufficient use of radiation protection measures. This could mean that otherwise reliable physical dosimetry methods might benefit from being paired with other dosimetry methods, which could help with a more accurate estimation of radiation risks to medical personnel.

Biological Methods of Dose Estimation
Since breaches of radiation safety protocols, like the inconsistent use of dosimeters or protective glasses, are still too common, other approaches, not dependent on compliance with safety protocols, have been considered for exposure estimation. Several dosimetry methods based on the measurement of biological effects of radiation have been investigated for their potential use in dosimetry. Such biological dosimetry methods relate more closely to the genotoxic effect of radiation than to the dose absorbed by a dosimeter. We decided to analyze the results of studies which used biological dosimetry methods to investigate chronic, occupational radiation exposures. Several such methods have been used for estimation of radiation effects in the body, most of which are based on the detection of genotoxic effects of radiation. The damage to DNA induced by ionizing radiation depends on the absorbed dose, the time over which the dose was absorbed, and the type of radiation. Radiation types characterized by a low linear energy transfer (LET), such as X-rays and the secondary electrons produced by gamma rays, induce primarily sparsely distributed DNA breaks and base damage through the induction of oxidative stress [35]. High-LET radiation, like alpha particles or carbon ions, induces more dangerous clustered, complex damage along the particle tracks [36]. Such clustered DNA lesions are more likely to result in chromosome fragmentation, and serious aberrations later on. Persistent chromosomal damage is more often used in biological dosimetry for chronic exposures; however, methods measuring DNA breaks have also been used. The methods used in the reviewed papers included the comet assay, detection of chromosomal aberrations (CA) and rearrangements, detection of micronuclei (MN), sister chromatid exchange (SCE), changes in DNA methylation, measurement of phosphorylated histone H2AX (γH2AX), and the glycophorin A mutation assay.
It is important to note that the results from the abovementioned techniques might be influenced by variables other than the received dose, such as smoking status, age, or sex. The majority of the published papers evaluating biological methods of dosimetry used peripheral blood lymphocytes as the source of cells for analysis, which allows estimation of exposures to the whole body. In one of the studies, the authors used buccal epithelial cells as the source of cells for the MN assay [37]. Performing the assay on buccal cells is less invasive than the standard procedure of evaluating damage to peripheral blood lymphocytes. Additionally, in a meta-analysis of 63 human population studies conducted by Ceppi et al. [38], the authors showed a linear correlation between the MN frequency in blood lymphocytes and the MN frequency in buccal mucosa, although the analysis included only one paper which assessed exposure to ionizing radiation.

Alkaline Comet Assay
The effects of ionizing radiation on living cells are based on several mechanisms, such as induction of DNA single-strand breaks (SSB) and double-strand breaks (DSB), damage to nucleotides, loss of bases, and denaturation of protein-DNA and DNA-DNA bridges [39]. Biological dosimetry studies usually focus on the detection of these DNA lesions caused by radiation, and the chromosomal damage that follows. Several of the papers we reviewed used the comet assay to measure DNA damage induced by low-dose ionizing radiation in peripheral blood lymphocytes. The comet assay allows for the detection of DNA damage within single cells, and has been widely used in genotoxicity studies in vitro, especially those investigating the effects of ionizing radiation [40]. The method relies on single-cell gel electrophoresis, which causes DNA fragments to migrate out of the cell, with more fragmented material (more DNA breaks) migrating faster than intact DNA. A few years after the introduction of this method, an alkaline version was developed, in which the higher pH allowed for the detection not only of frank strand breaks, but also of alkali-labile sites such as apurinic/apyrimidinic regions and baseless sugars [41]. Although the alkaline comet assay is widely used to measure the level of DNA damage, the assay can also be further modified to detect other DNA lesions. The addition of bacterial or human enzymes with lesion-specific endonucleolytic activity allows for higher sensitivity and selectivity of the assay. Such enzymes induce additional breaks at the locations of DNA lesions, which increases the migration of genetic material during the electrophoresis step.
Several different enzymes have been used with the comet assay for detection of specific DNA lesions: Endonuclease III (oxidized pyrimidines), Formamidopyrimidine DNA glycosylase (oxidized purines, formamidopyrimidines), 8-hydroxyguanine DNA glycosylase (oxidized purines, formamidopyrimidines), T4 endonuclease V (cyclobutane pyrimidine dimers), 3-methyladenine DNA glycosylase II (alkylated bases, hypoxanthine), and uracil-DNA glycosylase (uracil residues, deamination products) [42]. Detection of these types of damage is used predominantly to measure the genotoxic effects of chemicals; however, DNA lesions such as complex oxidative damage have been measured using the comet assay with endonucleolytic enzymes in cells exposed to low doses of gamma rays [43]. The enzymatic modification of the comet assay was also shown to be more sensitive for the detection of exposure in lymphocytes compared with the standard assay [44]. Higher sensitivity in the detection of exposure could find use in biological dosimetry; however, the modified comet assay has not yet been widely used for detection of low-dose occupational exposures.
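In practice, the lesion-specific signal from an enzyme-modified comet assay is conventionally reported as net enzyme-sensitive sites: the score of the enzyme-treated aliquot minus the score of a buffer-only aliquot of the same sample. A minimal sketch of this subtraction (the function name and the % tail DNA values are illustrative, not taken from any of the reviewed studies):

```python
def net_enzyme_sensitive_sites(pct_tail_enzyme, pct_tail_buffer):
    """Net enzyme-sensitive sites, expressed in the same units as the
    underlying comet descriptor (here % tail DNA). Negative differences
    are clipped to zero, since a lesion count cannot be negative."""
    return max(pct_tail_enzyme - pct_tail_buffer, 0.0)

# Example: 18.5% tail DNA after FPG treatment vs. 12.0% with buffer only
# corresponds to 6.5 percentage points of FPG-sensitive (oxidized purine)
# damage over and above the frank strand breaks.
```

The buffer-only control is essential here: it isolates the enzyme-induced breaks from the strand breaks and alkali-labile sites already detected by the standard alkaline assay.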
For the analysis, several descriptors can be used to quantify the intensity of DNA damage, such as the length and intensity of the comet "tail" (migrated DNA), the tail moment, and the Olive tail moment. Although the alkaline comet assay was used for occupational dosimetry in several papers with promising results, the method has some shortcomings that have to be overcome before its wider implementation. The choice of the descriptor of DNA migration might affect the outcome. Based on published results, the tail length seems to correlate best with the received occupational dose, and although many researchers have discarded this descriptor for general use because maximal migration is reached at relatively low doses [45], this limitation should not have a large impact in low-dose dosimetry. The tail moment, although preferred by some researchers, does not have standard units, which makes the described level of damage difficult to visualize. The fact that comet assay parameters can be evaluated subjectively could also introduce errors dependent on the researcher's experience with the technique. This potential problem seems to be mitigated by the use of image-analysis software, which all of the papers described below used. In 2020, Møller et al. published detailed recommendations for reporting comet assay results [46]. For the most objective analysis of the data, the authors suggest the use of calibration curves and conversion of results to a relative lesion frequency compared to unaltered nucleotides. This method of reporting comet assay results would allow researchers not familiar with the method to understand the results more easily.
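The migration descriptors named above are related by simple formulas: the tail moment is the tail length weighted by the fraction of DNA in the tail, and the Olive tail moment is the distance between the head and tail intensity centroids weighted by the fraction of DNA in the tail. The didactic sketch below computes them on a simplified one-dimensional intensity profile; real scoring software works on 2-D images with its own segmentation, and the profile and head/tail boundary here are assumptions for illustration.

```python
def comet_descriptors(profile, tail_start):
    """Compute comet descriptors from a 1-D intensity profile along the
    electrophoresis direction. `profile` is a list of pixel intensities;
    pixels at index >= `tail_start` belong to the tail."""
    head, tail = profile[:tail_start], profile[tail_start:]
    total = sum(profile)
    pct_tail_dna = 100.0 * sum(tail) / total
    tail_length = len(tail)  # in pixels
    # Tail moment: tail length weighted by the fraction of DNA in the tail.
    tail_moment = tail_length * pct_tail_dna / 100.0

    def centroid(segment, offset=0):
        # Intensity-weighted centre of mass of a profile segment.
        return sum((offset + i) * v for i, v in enumerate(segment)) / sum(segment)

    # Olive tail moment: head-to-tail centroid distance weighted by the
    # fraction of DNA in the tail.
    olive_tail_moment = (centroid(tail, tail_start) - centroid(head)) * pct_tail_dna / 100.0
    return pct_tail_dna, tail_length, tail_moment, olive_tail_moment
```

Because the tail moment mixes a distance with a dimensionless fraction, its value has no standard units, which is exactly the interpretability problem noted above; the Olive tail moment shares this limitation.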
Several research groups have investigated the use of the comet assay in dosimetry of occupationally exposed medical workers. Gerić et al. [47] compared doses measured in radiology unit workers and matched volunteers (matched by age, sex, height and smoker fraction) using physical dosimetry (thermoluminescent detectors (TLD); LiF:Mg,Ti detectors (TLD-100)) with damage to the genome measured using the comet assay conducted on peripheral blood lymphocytes. The occupational effective dose measured by TLD-100 dosimeters averaged 1.82 ± 3.6 (0.00-13.87) mSv for exposure periods ranging from 1 to 35 years. The authors observed a significantly higher comet tail length (TL) in the exposed group compared to the control group, and no difference in tail intensity between the two groups. Fang et al. [48] measured genomic damage in peripheral blood lymphocytes using the comet assay in a group of individuals occupationally exposed to low levels of X-ray radiation, and in matched non-exposed workers. The exposures for the study group ranged from 1 to 31 years, and the cumulative effective dose averaged 38.41 ± 27.36 (2.81-416.43) mSv. The authors observed a significant increase in the percentage of tailed DNA, tail moment and Olive tail moment in the exposed group compared with the matched reference group. Additionally, the authors observed that a longer exposure duration (years worked) correlated with a higher level of DNA damage, which suggests a possible correlation with the received dose. A similar dose-response correlation was not observed in the study conducted by Dobrzyńska et al. [49]. In this study the authors used the comet assay to assess DNA damage in lymphocytes from a group of medical workers (doctors, nurses, technicians, radiochemists, and administrative staff) who had contact with various radioisotopes, and/or were involved in scintigraphy and PET/CT. The mean effective dose for this group averaged 0.3 ± 0.23 mSv.
The authors observed no correlation between the tail moment or the percentage of tailed DNA and the mean yearly dose. Overall, the tail moment and % DNA were both significantly higher for the exposed group compared with the matched control group. Technicians working with scintigraphy and PET/CT showed significantly higher values of tail moment and % DNA than the control group; for the other subgroups the differences were not significant. Surprisingly, the authors observed that staff in work category B (effective dose might exceed 1 mSv in one year), but not in category A (effective dose might exceed 6 mSv in one year), had increased levels of the two measured comet assay parameters. The authors speculate that an adaptive response of lymphocytes to ionizing radiation might be responsible for this effect. Sakly et al. conducted a study measuring DNA damage in peripheral blood lymphocytes collected from health care workers from a radiology department and from administrative staff (control group) [50]. The authors observed that the mean TL was significantly higher for the exposed group compared with the control group. The mean TL was higher in exposed female workers compared with control female workers, but not in exposed male workers, which was also observed in the previously mentioned study by Dobrzyńska et al. Martinez et al. measured DNA damage in blood lymphocytes collected from a group of 6 subjects working in a nuclear medicine department, 4 subjects working in a radiotherapy department, and 31 subjects working in a radiology department [39]. Venous blood was collected before and after the work shift. The authors observed a significant increase in tail length in samples collected after the work shift compared to samples collected before the shift in the exposed group. No such increase was observed for the control group.
Additionally, the results showed a positive correlation between the monthly exposure dose and the tail length both before and after the workday in the exposed group. Overall, the comet assay seems able to detect the DNA damage induced in radiation workers compared with non-exposed workers; however, only a few studies show a correlation between the received dose and the effect measured by the assay [48]. Additionally, since this assay measures the extent of DNA damage present in the cell, it is worth noting that those lesions might later be repaired and not contribute to further biological effects. The method in its unmodified version is also not capable of differentiating between types of damage: less dangerous sparse breaks (induced more often by low-LET radiation) and more dangerous complex, clustered damage (induced primarily by high-LET radiation). According to the results presented by Dobrzyńska et al., this method might be able to detect exposures as low as 0.3 mSv/year [49]. The comet assay might be more useful for damage measurement immediately after an exposure rather than for the estimation of chronic occupational exposures. The collected results from studies using the comet assay in dosimetry are shown in Table 1. Table 1. Studies using the alkaline comet assay in medical workers exposed to ionizing radiation.

(Table 1 columns: Author; Population; Parameter Measured; Mean ± S.D. for study and control groups. Table body not reproduced here.)

Scoring of Chromosomal Aberrations
An increase in chromosomal aberrations (CA) is another effect of ionizing radiation often measured in biodosimetry studies. CA result from unrepaired DNA lesions and can be assessed in the metaphase of the cell cycle. Similarly to assessing DNA damage with the comet assay, CA can be counted in peripheral blood lymphocytes of exposed persons and serve as a sensitive marker of genetic damage. In order to perform the assay, the lymphocytes need to be cultured with phytohaemagglutinin and later treated with colcemid (which stops the cell cycle in metaphase) in order to assess the chromosomes, which takes up to 48 h, with an additional 5-25 h needed to analyze 500 cells [51]. Additionally, the increase in the frequency of dicentrics seems to be specific to ionizing radiation effects and shows a linear-quadratic dose-effect relationship. Although the assessment of chromosomal aberrations, and especially dicentric chromosomes, has been successfully used for the estimation of radiation exposures following radiation accidents, few studies have used it to assess long-term occupational exposures.
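The linear-quadratic dose-effect relationship for dicentric yields, Y(D) = c + αD + βD², can be inverted to estimate a dose from an observed yield once a laboratory calibration curve is available; this is the principle behind the calibration curves discussed below. The sketch uses arbitrary illustrative coefficients, not laboratory-calibrated values.

```python
import numpy as np

def dicentric_yield(dose_gy, c=0.001, alpha=0.02, beta=0.06):
    """Linear-quadratic dicentric yield per cell: Y = c + alpha*D + beta*D^2.
    The coefficients here are illustrative placeholders, not real
    calibration values, which must be fitted by each laboratory."""
    return c + alpha * dose_gy + beta * dose_gy ** 2

def estimate_dose(yield_per_cell, c=0.001, alpha=0.02, beta=0.06):
    """Invert the calibration curve: solve beta*D^2 + alpha*D + (c - Y) = 0
    for the non-negative root D."""
    disc = alpha ** 2 - 4.0 * beta * (c - yield_per_cell)
    if disc < 0:
        return 0.0  # observed yield below the background level c
    return max(0.0, (-alpha + np.sqrt(disc)) / (2.0 * beta))
```

Round-tripping a dose through the curve recovers it, e.g. `estimate_dose(dicentric_yield(1.0))` returns 1.0 Gy; in practice, confidence limits on the observed yield (dicentrics follow Poisson statistics) are propagated through the same inversion.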
Recently, Shafiee et al. investigated the level of CA in blood lymphocytes of personnel working with C-arm fluoroscopy, multi-slice CT, lithotripsy and digital radiology procedures [52]. The doses to which the workers were exposed ranged from 0 to 2.99 mSv.
The authors observed a significantly higher frequency of CA in exposed workers compared to the matched controls; however, no difference was observed between the exposed groups. In the previously mentioned study, Fang et al. used the measurement of CA as one of the methods to assess the severity of genetic damage in lymphocytes of radiation workers [48]. As a measure of aberrations, the authors examined the number of dicentric chromosomes, ring chromosomes and acentric fragments. The CA rate was significantly higher in the smoking X-ray-exposed workers compared to the smoking controls; such a difference was not observed for non-smokers. Since age, as well as smoking status, can influence the genetic damage observed in lymphocytes, the authors stratified the groups by age (20-29, 30-39 and ≥40 years old). For each of the age groups, a significant increase in CA was observed in the study group compared to the control group. No significant differences were observed between the age subgroups for either the exposed or the control group. Zakeri et al. [53] also used the frequency of dicentric and acentric chromosomes in blood lymphocytes as a measure of DNA damage. The authors assessed the DNA damage in a group of radiation workers, which included 32 interventional cardiologists, 36 nuclear medicine physicians and 33 conventional radiologists, with a group of 35 age- and sex-matched subjects used as a control. For the three study groups, the exposures ranged from 0.25 to 48 mSv for the previous year, and from 1.5 to 147 mSv for lifetime exposure. The authors showed that the frequency of acentric and dicentric chromosomes was significantly higher among interventional cardiologists, nuclear medicine physicians and conventional radiologists compared to the control group (acentrics/100 cells: 3.23 ± 2.6, 2.87 ± 1.4 and 2.18 ± 0.9 vs. 1.28 ± 0.5; dicentric %: 0.21, 0.14 and 0.13 vs. 0.04, respectively).
Interventional cardiologists showed the highest frequency of CA among the study groups, although the differences were not statistically significant. No strong correlation of the frequency of CA with duration of employment, age or physical dose was observed. Zakeri et al. also analyzed the level of CA in lymphocytes of personnel working in angiocardiography laboratories [54]. The doses registered in the exposed group averaged 3 mSv/year (range: 0.25-15 mSv). Both cardiologists and nurses/technicians showed a significantly higher frequency of acentrics compared with matched controls. Although the frequency of dicentrics was higher in the two exposed groups than in the control group, the difference was not statistically significant. The authors observed no correlation of chromosomal aberrations with physical dose or with age in the exposed groups. Interestingly, the authors also prepared calibration curves in order to estimate the doses received by subjects who showed dicentrics in blood cells. The estimated doses ranged between 0.05 and 0.10 Gy, which did not agree with the doses read from personal dosimeters. The authors speculate that this discrepancy could stem from inconsistent use of personal dosimeters. Maffei et al. investigated the level of CA in peripheral lymphocytes of physicians and technicians occupationally exposed to ionizing radiation [55]. The average frequencies of CA and of cells with CA were significantly increased in the exposed group compared with the matched control group. When the exposed workers were classified based on the whole-body dose equivalent (Hwb) of ionizing radiation, the group with Hwb > 50 mSv showed a significantly higher frequency of aberrant cells than the control group, which was not observed for the Hwb ≤ 50 mSv group. Additionally, the frequency of chromatid breaks was significantly higher in the Hwb > 50 mSv group compared to either the control or the Hwb ≤ 50 mSv group.
The authors also noted that the very low frequency of dicentrics prevented them from using this variable in statistical analyses. Cardoso et al. investigated the chromosomal damage in blood lymphocytes of occupationally exposed hospital workers (X-ray, radiotherapy and nuclear medicine sectors) [56]. The mean employment time for the study group was 19.8 years (8-26 years), and the workers were exposed to an average accumulated dose of 63.2 mSv (9.5-209.4 mSv). Although the study groups were relatively small (eight subjects per group), the authors observed a significantly higher frequency of CA in the exposed group compared with matched controls. The authors also noted that no dicentrics were found in either group, even in the two workers exposed to the highest radiation doses (145.1 mSv and 209.4 mSv). The assessment of CA frequency, and especially of dicentric chromosomes, is often used to measure exposure to radiation; however, this method is used mostly to measure one-time exposures in radiation accidents and much more rarely for long-term occupational exposures. Unlike the comet assay, this method measures the presence of chromosomal changes, which stem from unrepaired DNA breaks. This kind of damage is produced more easily by the complex lesions induced by high-LET radiation; however, low-LET radiation is also capable of inducing such changes. Similarly to the comet assay, for occupational exposures the CA assay might be useful to detect increased exposure to ionizing radiation, but large-scale studies should be conducted to verify whether a dose-effect correlation exists for this kind of exposure. The collected results from studies using CA scoring in dosimetry are shown in Table 2. Table 2. Studies scoring the chromosomal aberrations in medical workers exposed to ionizing radiation.

(Table 2 columns: Author; Population; Parameter Measured; Mean ± S.D. for study and control groups. Table body not reproduced here.)

Scoring of Sister Chromatid Exchange
Sister chromatid exchange (SCE) can be caused by genotoxic agents, including ionizing radiation [57]. During the process, the two sister chromatids break and re-join in exchanged regions. SCEs are not specific to ionizing radiation; however, their increased frequency has been found in persons exposed to occupational levels of radiation [56]. In a study conducted by Eken et al., the authors measured the frequency of SCEs in physicians and technicians working in radiology units [58]. During the last six months of work prior to the analysis, the group was exposed to doses ranging from 0.10 to 3.86 mSv (median = 0.17 mSv). The frequency of SCE did not differ significantly between the exposed group and the control group, or between smokers and non-smokers. Tug et al. conducted a study in which the authors evaluated the frequency of SCE in peripheral blood lymphocytes of 39 radiology technologists [59]. The authors noted that the doses to which the workers were exposed within six months prior to the analysis did not exceed 20 mSv; however, the exact doses were not reported in this study. Nevertheless, the authors observed a significant increase in the mean frequency of SCE in the exposed group compared with the control group. There was no difference between male and female subjects within either the exposed or the control group. Sahin et al. investigated the genotoxic damage in lymphocytes from nuclear medicine workers during normal working conditions and after a one-month vacation period [60]. The dose accumulated by workers in the period between vacations averaged 3.97 mSv (1.20 to 48.56 mSv). The authors observed a significantly higher frequency of SCE per cell after the exposure period compared to the post-vacation period. Mrdjanović et al. measured the frequency of SCE in workers in radiotherapy and cardiology units occupationally exposed to ionizing radiation [61].
The study group workers were exposed to radiation for an average of 11.9 years; however, the authors did not estimate the doses to which the workers were exposed using personal dosimetry. The frequency of SCE in peripheral blood cells of workers in the exposed group was not significantly different compared with the control group subjects. Interestingly, the SCE frequency was significantly higher in exposed smokers compared to unexposed workers, which suggests that smoking might influence the result of the assay. Engin and colleagues measured the level of SCEs in γ-radiation- and X-ray-exposed technicians. Both γ-ray- and X-ray-exposed workers showed a significantly higher SCE frequency compared with the matched control group [62]. In the previously mentioned study by Cardoso et al., the authors also investigated the frequencies of SCE in medical workers [56]. Although no difference in the proliferation index was observed, the exposed group showed a significantly higher frequency of SCE compared to the non-exposed subjects. SCE scoring is a method similar to the measurement of CA frequency, and it similarly finds use in biological dosimetry. Although some studies showed an increased SCE frequency in occupationally exposed medical workers, the results from these studies often lack complete dose descriptions for the group. Additionally, Chauduri et al. showed that X-rays poorly induce SCEs in human lymphocytes, possibly due to the short-lived nature of this lesion [63]. The studies using SCE scoring to evaluate occupational exposures did not report annual doses for exposed workers, so it is not possible to estimate how low the detection threshold might be. Although SCE scoring might be useful for the assessment of accidental exposures, the currently available reports do not present this method as useful in the dosimetry of occupationally exposed groups. The collected results from studies using SCE scoring in dosimetry are shown in Table 3. Table 3.
Studies scoring sister chromatid exchange in medical workers exposed to ionizing radiation.

(Table 3 columns: Author; Population; Parameter Measured; Mean ± S.D. Table body not reproduced here.)

Cytokinesis-Block Micronucleus Assay
Micronucleus (MN) frequency is one of the most commonly used assays to assess DNA damage from ionizing radiation exposure. Micronuclei are small intracellular bodies containing chromatin, which are formed in cells with DNA damage when a chromosome fragment (or fragments) is not separated into the new nuclei [64]. Such fragments form micronuclei outside of the nucleus of the daughter cells. For genotoxicity testing, the cytokinesis-block micronucleus assay (CBMN) is often used, in which an inhibitor of cytokinesis is added to the cell culture; this allows the number of MN in binucleated cells to be assessed, making the DNA damage assessment more reliable. Although widely used for radiation-induced genotoxicity, the MN assay is not specific to radiation: MNs can arise from exposure to other clastogenic or aneugenic factors, and their frequency additionally increases with age [51]. The CBMN assay is relatively simple and quick to perform; however, the lymphocytes require three days of culture to enter cytokinesis. The CBMN assay also allows for the assessment of additional genotoxicity-related endpoints, such as the nuclear division index (NDI), the cytokinesis-block proliferation index (CBPI) and the frequency of nuclear alterations such as nuclear buds (NB) and nucleoplasmic bridges (NPB).
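The proliferation indices mentioned above follow simple weighted-count formulas over the scored mono-, bi-, tri- and tetranucleated cells. The sketch below applies the commonly used definitions of NDI and CBPI to hypothetical scoring counts; laboratories may score additional cell classes (e.g., apoptotic or necrotic cells), which are omitted here for simplicity.

```python
def cbmn_indices(mono, bi, tri, quad, mn_in_binucleated):
    """Proliferation indices from a CBMN scoring.

    mono/bi/tri/quad: numbers of mono-, bi-, tri- and tetranucleated cells
    scored (hypothetical example counts); mn_in_binucleated: total MN
    counted in the binucleated cells.
    """
    n = mono + bi + tri + quad
    # Nuclear division index: NDI = (M1 + 2*M2 + 3*M3 + 4*M4) / N
    ndi = (mono + 2 * bi + 3 * tri + 4 * quad) / n
    # Cytokinesis-block proliferation index: tri- and tetranucleated
    # cells are both weighted by 3 (three or more division cycles)
    cbpi = (mono + 2 * bi + 3 * (tri + quad)) / n
    # MN frequency is conventionally reported per 1000 binucleated cells
    mn_per_1000_bn = 1000.0 * mn_in_binucleated / bi
    return ndi, cbpi, mn_per_1000_bn
```

For example, scoring 500 mono-, 400 bi-, 60 tri- and 40 tetranucleated cells with 8 MN in the binucleated fraction gives NDI = 1.64, CBPI = 1.6 and 20 MN per 1000 binucleated cells.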
One of the largest studies conducting DNA damage analysis with CBMN was performed by Sari-Minodier et al. [65]. The authors measured chromosomal damage in blood lymphocytes of 132 healthcare workers in six hospital units: 27 in radiotherapy, 43 in nuclear medicine, 25 in cardiology, 17 in radiology and 20 in the paediatric operating room. The mean dose to the exposed group in the last year was 0.97 ± 2.46 (0-16.42) mSv, and for the reference group the dose averaged 0.65 ± 1.90 (0-12.65) mSv. The mean binucleated MN cell rate was significantly higher in the exposed group compared with the control group. No correlation with smoking status was observed; however, female workers showed a higher frequency of binucleated MN cells than male workers. One of the most recent studies, conducted by Shafiee et al., evaluated the frequency of MNs in exposed radiation workers and matched controls [52]. The frequency of MN was significantly higher in the study group compared to the control group. Although the MN frequency did not differ between subjects in different wards, the authors observed a significant correlation between the MN frequency and age, as well as between the MN frequency and length of work experience. Moreover, the MN frequency was higher in smokers than in non-smokers. In the previously described study, Gerić et al. performed a CBMN assay on blood lymphocytes of radiology unit workers [47]. The authors did not observe a significant difference between the study group and the control group in the total number of MNs or NPBs; however, the number of NBs was significantly higher in the radiology unit group than in the control group. CBPI analysis did not show significant differences in cell proliferation between the two groups. Fang et al. showed that the frequency of MN was significantly higher in radiation workers compared with the reference group [48].
Additionally, the frequency of MNs was higher in smokers compared with non-smokers in both the study and the control group, and a correlation between time of exposure and MN frequency was observed. In another study, Bouraoui et al. used the CBMN assay to assess DNA damage in peripheral blood lymphocytes in a group of 67 healthcare workers exposed to ionizing radiation [66]. The frequency of MNs was significantly higher in the study group compared with the matched control group, and the highest frequency of MN was observed in nuclear medicine and radiology department workers. Further analysis showed that the numbers of acentric MN and centromeric MN were higher in the study group. Using the CBMN assay, Eken et al. also showed that the frequency of MN in blood lymphocytes was higher in samples from medical workers exposed to radiation than in the reference group [58]. Similarly to the previous studies, the frequency of MNs was also higher in smoking exposed workers than in smoking non-exposed workers, but the difference was not statistically significant. No correlation between the frequency of MNs and the exposure dose was observed. Sakly et al., in the previously described study, also showed a significant increase in MN frequency in the radiation-exposed group compared with the reference group [50]. Similar results were obtained in the study conducted by Zakeri et al., where the mean frequency of MNs was significantly higher in interventional cardiologists, nuclear medicine physicians and conventional radiologists compared with the control group [53]. Ropolo et al. estimated DNA damage in 30 workers exposed to X-ray and gamma radiation and 30 matched healthy subjects not exposed to ionizing radiation [67]. For the exposed group, the cumulative effective dose averaged 19.49 ± 37.59 mSv for an average employment time of 12 ± 9.5 years.
The results of the MN assay performed on blood lymphocytes showed a significant increase in the frequency of MNs present in mononuclear cells in the study group compared with the control group. No such difference occurred when the frequency of MN in binucleated cells, NB or NPB was measured. Stratification of all subjects (exposed and non-exposed) based on the accumulated dose revealed that in subjects who received cumulative doses above 10 mSv, the frequency of MN in mononuclear cells was significantly increased. Although the majority of studies use peripheral blood lymphocytes as the source of cells for cytogenetic analysis of radiation effects, some studies have used less invasive alternatives. Aguiar Torres et al. used buccal epithelial cells to assess DNA damage in 42 medical workers exposed to radiation and 39 workers not exposed to radiation [37]. The authors showed that the number of cells with MN per 2000 cells counted was significantly higher in the exposed group compared with the reference group, and that this number correlated with the mean annual deep dose (the dose received by tissues 10 mm beneath the skin). Although buccal epithelial cells might be useful for some types of exposure, the doses received by different anatomic sites can vary significantly, as was shown in the "Radiation protection recommendations" section. The MN assay has been used as a biological dosimetry method in many studies, with some success. The results presented by Shafiee et al. showed that this method might be able to detect occupational exposures as low as 0.1 mSv/year [52]. Some researchers show a dose-response correlation for occupational exposures; however, more thorough studies on larger groups should be conducted to verify this relationship. Overall, the MN assay is a relatively easy and inexpensive method, which shows promising results for the detection of occupational exposures.
The collected results from studies using MN assay in dosimetry are shown in Table 4. Table 4. Studies using the micronucleus assay in medical workers exposed to ionizing radiation.

(Table 4 columns: Author; Population; Parameter Measured; Mean ± S.D. for study and control groups. Table body not reproduced here.)

Other Candidate Methods for Biological Dosimetry
Along with the more common biodosimetry assays listed above, other methods measuring the effects of ionizing radiation have been used for the estimation of DNA damage resulting from radiation exposure. Chen et al. [68] investigated whether occupational exposure to ionizing radiation correlates with the DNA methylation rate and oxidative damage. The study included 117 physicians performing interventional work, whose cumulative dose equivalent between 2016 and 2017 was 2.333 ± 1.052 mSv, and a matched reference group. To assess the methylation level in subjects, the authors measured the expression levels of DNA methyltransferases (Dnmts) and homocysteine (Hcy) in serum, and also used high-performance liquid chromatography to determine the total DNA methylation rate. For oxidative damage assessment, the levels of 8-hydroxy-2′-deoxyguanosine (8-OHdG) and 4-hydroxynonenal (4-HNE) were measured in serum. The authors showed that the levels of Dnmts, 4-HNE and 8-OHdG were significantly higher in the exposed group compared with the control group. Additionally, when the annual effective dose in the group of interventional physicians was taken into account, the authors observed that the levels of Dnmts and 4-HNE positively correlated with the increase in dose. No correlation between dose and the Hcy level or the methylation rate was observed. Although the method gives some promising results, the analysis of the methylation level requires experience to perform and is time-consuming, which is why it is not a good candidate for routine use.
Measurement of the phosphorylated histone H2AX (γ-H2AX) level is among the most commonly used methods of radiation exposure estimation. Phosphorylation of histone H2AX is one of the first events following the induction of DSBs by ionizing radiation. γ-H2AX reaches its maximum level around 30 min after exposure, and afterwards steadily declines [69]. Since after several hours the γ-H2AX level returns close to baseline, this method is most often used in basic research of radiation effects and potentially for an early estimation of radiation exposure, e.g., after accidental radiation exposures [70]. Similarly to the comet assay, the γ-H2AX level can be measured in cells directly after exposure, without the need for lengthy culture. Raavi et al. showed that this method might also find some use in occupational exposure estimation [71]. The authors observed a significantly increased number of γ-H2AX foci in lymphocytes from occupationally exposed health workers (exposed to X-rays) compared to matched controls. El-Sayed et al. measured the level of γ-H2AX in blood lymphocytes of operators performing fluoroscopically guided endovascular aortic repair [72]. The γ-H2AX level increased significantly immediately after the procedure; however, 24 h later the level was back to baseline. Additionally, the authors observed that the use of lower-leg shielding significantly reduced the γ-H2AX level. Although the γ-H2AX assay shows an established linear dose-response correlation [70], this method is likely more suitable for immediate exposure estimation than for the measurement of chronic occupational exposures.
The glycophorin A (GPA) mutation assay has been used to estimate accidental exposures of, among others, Chernobyl clean-up workers and atomic bomb survivors [73,74]. GPA is a surface glycoprotein expressed on red blood cells, and exposure to ionizing radiation induces the loss of one of the alleles coding this protein. The frequency of allele-loss variants of red blood cells correlates with exposure to ionizing radiation, and since the mutation is induced in bone marrow stem cells, it can be measured even months and years after the exposure. The GPA gene occurs in two allelic forms (M and N), and the assay can only be performed on individuals who are MN heterozygotes, which constitutes approximately half of the human population. Ha et al. used the GPA assay to estimate the exposure of 32 hospital workers exposed to ionizing radiation [75]. The frequencies of both the NO variant and the NN variant correlated with the exposure doses in this group. Straume et al. conducted a case study on a radiation worker who believed that his dosimetry records (0.56 Sv over 36 years of work) were severely underestimated [76]. The authors compared their results with published data from studies using the GPA assay (the Hiroshima atomic bomb study and the Goiânia radiation accident study) to estimate the dose received by the worker. The results showed that the worker could have received a dose in the 0.4-2 Sv range, so the official dosimetry records could be valid. The GPA assay is a relatively quick and easy-to-perform method that has already been used as a dosimetry tool to estimate exposures of large populations (atomic bomb survivors, Chernobyl clean-up workers). Furthermore, the mutagenic effect on bone marrow stem cells persists for years to decades, which means that the method could detect radiation doses accumulated over long periods of time. However, this assay can only be used in half of the general population (MN heterozygotes).
Additionally, the large interindividual variability of the allele variant frequencies for persons exposed to similar radiation doses means that this assay can be useful for estimating mean doses received by a population, but not for individual occupational dosimetry [77].

Conclusions
In this review, we assessed studies that used several of the most promising biological dosimetry methods for the estimation of occupational exposures of medical workers. Although such methods have previously been used for the estimation of exposures during radiation incidents, the doses and dose rates recorded during such incidents are usually much higher than for occupational exposures. Since the biological effects of radiation are dose-dependent, a reliable measurement of such effects in medical workers exposed to very low doses over long periods of time is more difficult than after radiation incidents. Here we show that several methods give promising results for the assessment of genotoxic effects of radiation. For this study, we investigated the use of methods such as the comet assay, the frequency of chromosomal aberrations (dicentric chromosomes, acentric chromosomes, sister chromatid exchange), the cytokinesis-block micronucleus assay (CBMN), the level of DNA methylation, the measurement of histone H2AX phosphorylation and the glycophorin A mutation assay. In all papers using biological dosimetry methods except two, peripheral blood lymphocytes were used for the assessment of ionizing radiation effects. This source of cells is beneficial for dosimetry use, since lymphocytes constantly circulate throughout the body; hence, the effects of irradiation of different parts of the body can be detected in lymphocytes isolated from venous blood. This is a significant benefit of methods measuring damage to blood lymphocytes, as considerable variability has been observed in the doses received by different organs of occupationally exposed personnel [78].
Although all of the presented methods allowed for the detection of genotoxic effects of radiation in exposed medical workers, significant differences between the assays should be addressed. Like the other methods, the comet assay is widely used for the assessment of DNA damage induced by ionizing radiation; however, it is important to note that this assay and its modifications measure the presence of DNA breaks and other lesions. Such damage can either be too extensive for the cell to repair, leading to cell death, or be processed by DNA repair pathways. Initiation of these processes may result in correct repair and no further consequences for the cell, or in erroneous repair, leading to chromosomal aberrations and mutations [79]. Both cell death and genetic damage can have consequences for the exposed tissue, such as the induction of lens opacity and carcinogenesis, which are observed consequences of long-term ionizing radiation exposure [15]. Since the comet assay detects only the DNA lesions induced by radiation (such as DNA breaks and base lesions) and not the effects that stem from them (such as mutations, chromosomal aberrations and cell death), the extent of lymphocyte damage measured this way might not accurately reflect the potential biological consequences of exposure. The other biodosimetry methods reviewed in this paper measure the genotoxic effects of ionizing radiation observed in cells after progression through the cell cycle, meaning that this damage was not repaired and might actually contribute to the development of ionizing radiation-related diseases.
Although the reviewed assays measure different genotoxic effects of ionizing radiation, all of them showed some ability to detect exposure in medical workers when compared with non-exposed reference groups. However, the studies using the CBMN assay seem to show a somewhat reliable correlation between dose and effect. Additionally, the CBMN assay is a relatively low-cost method and does not require extensive training of the personnel. Furthermore, the scoring of CA shows promising results as a method for measuring occupational exposures; however, it requires trained personnel to perform. Although the MN assay and CA scoring seem to produce good results among the assessed methods, the limitations of all the studies should also be considered in the assessment of the methods. Many of the reviewed studies did not accurately report the doses to personnel or did not measure the correlation between the effect and the annual/cumulative dose, which limits the assessment of the method. Only some papers reported results separately for specific groups of radiation workers, such as nuclear medicine physicians and conventional radiologists. Since the type of work, as well as the type of procedure performed by the personnel, can significantly influence the received dose [26,28,30], it is reasonable to assume that the biological effects in those groups would also not be the same. These differences were shown in the study conducted by Zakeri et al., where the mean frequency of MNs in lymphocytes of subjects differed depending on the profession [53]. Additionally, not all studies used the same measurements to assess the effects of each assay. For example, for the MN assay, some studies used the frequency of binucleated MN cells, and some used the frequency of MN. For cytogenetic assays, not all studies counted events in the same number of cells; for example, for the comet assay, Martinez et al. analyzed 50 cells per sample, whereas Fang et al. analyzed 200 cells per sample [38,47].
Several of the papers we referenced used more than one method to estimate occupational exposure to radiation [48,53,76]. The use of several assays measuring different exposure effects (extent of DNA damage, frequency of stable and unstable chromosomal aberrations, presence of mutations, etc.) could result in a more comprehensive understanding of the dose to which an individual was exposed, as well as of the biological effects of the chronic exposure. Although some studies measured radiation effects using more than one method, the analysis was performed on the whole groups (exposed and unexposed). Only one study used more than one method to estimate the dose received by an individual worker [76], which allowed the authors to establish a range of doses to which the person was exposed. That study was published in 1992, and currently available methods could allow for a more precise dose estimation; however, using a combination of methods is both time-consuming and expensive, so it could probably be used for individual cases, but not for routine application alongside physical dosimeters.
In conclusion, our review shows that biological assays such as the comet assay, the CBMN assay and the assessment of CA and SCE can be used to differentiate between groups occupationally exposed to different doses of radiation. Moreover, the measurement of MN frequency using the CBMN assay and CA scoring show promise for more precise dose estimations compared with the other methods. Nevertheless, the review of the available literature also shows that more thorough studies on larger cohorts are needed to accurately evaluate the usefulness of biological dosimetry methods.

Conflicts of Interest:
The authors declare no conflict of interest.