Proceeding Paper

Intracellular Background Estimation for Quantitative Fluorescence Microscopy †

by Yannis Kalaidzidis 1,2,*, Hernán Morales-Navarrete 1, Inna Kalaidzidis 1 and Marino Zerial 1

1 Max Planck Institute of Molecular Cell Biology and Genetics, Pfotenhauerstr. 108, 01307 Dresden, Germany
2 Faculty of Bioengineering and Bioinformatics, Moscow State University, 119991 Moscow, Russia
* Author to whom correspondence should be addressed.
Presented at the 39th International Workshop on Bayesian Inference and Maximum Entropy Methods in Science and Engineering, Garching, Germany, 30 June–5 July 2019.
Proceedings 2019, 33(1), 22; https://doi.org/10.3390/proceedings2019033022
Published: 6 December 2019

Abstract

Fluorescently tagged proteins are widely used to study the dynamics of intracellular organelles. Peripheral proteins are transiently associated with organelles, and a significant fraction of them is located in the cytosol. Image analysis of peripheral proteins therefore poses the problem of properly discriminating the membrane-associated signal from the cytosolic one. In most cases, signals from organelles are compact in comparison with the diffuse signal from the cytosol. Commonly used methods for background estimation depend on the assumption that background and foreground signals are separable by spatial frequency filters. However, large non-stained organelles (e.g., nuclei) cause abrupt changes in the cytosolic intensity and lead to errors in the background estimation. Such errors result in artifacts in the reconstructed foreground signal. We developed a new algorithm that estimates the background intensity in fluorescence microscopy images and does not produce artifacts at the borders of nuclei.

1. Introduction

The development of technologies for creating genetically encoded chimeric conjugates of proteins of interest with fluorescent proteins opened a new era in the study of intracellular processes by means of quantitative fluorescence microscopy [1], and such conjugates are widely used to study the spatio-temporal dynamics of intracellular organelles in live and fixed cells [2,3,4]. Several approaches for the quantification of cytosolic and membrane-bound proteins have been developed. However, most of them fail when large non-stained organelles (e.g., nuclei) produce abrupt changes in the cytosolic fluorescence intensity. Many peripheral membrane proteins dynamically switch between a cytosolic and a membrane-bound state. Whereas they generate compact fluorescent images of intracellular organelles in the membrane-bound state, they generate a fuzzy fluorescent background in the cytosolic state. As an example of such a peripheral membrane protein, we analyzed images of the small GTPase Rab5 conjugated with Green Fluorescent Protein (GFP), whose dynamics orchestrate intracellular endocytic transport [5]. Quantitative analysis of endosome-associated proteins requires discrimination of fluorescent endosomes from the fluorescent cytosolic background (Figure 1a). The problem of discriminating high spatial frequency (bright, compact) structures, i.e., endosomes, from the low-frequency (cytoplasmic) background has been extensively studied, and many solutions (heuristic as well as Bayesian) have been developed. However, they mostly rely on the assumption that the background corresponds to a low-frequency signal, which is not the case for peripheral proteins. Images of peripheral proteins (Figure 1a) typically show large dark areas with sharp boundaries, which are imprints of nuclei in the fluorescent cytoplasm. Multiple unlabeled organelles are observed as dark areas in the cytoplasm with spatial frequencies similar to those of endosomes. This spatial frequency similarity is exemplified by the intensity profile along the yellow line in Figure 1a, which is presented in Figure 1b (black line). In a recent study [6], the problem of non-smooth background was addressed for time series (live-cell imaging) by using conditional random fields to estimate the background as well as to segment the motile organelles. However, this algorithm is not applicable to single images. Unfortunately, state-of-the-art algorithms for background estimation on single fluorescence microscopy images [7,8,9] explicitly rely on the smoothness of the background signal and in this respect are no better than the textbook "rolling ball" algorithm [10]. All of them produce artifacts (a false-positive foreground rim) at the borders of cell nuclei (see Figure 1b,c). In the present work we propose a new algorithm (TBL) based on a probability distribution with two background levels: one in the cytosol and another in a possibly present "dark" nucleus; each background level is smooth, but the transition between them can be abrupt.
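The failure mode can be reproduced on a one-dimensional toy profile. The following sketch (all numbers are hypothetical and only mimic the situation in Figure 1) shows how a median filter, used as a smooth background estimate, produces a false-positive rim at the border of a dark "nucleus":

```python
# Toy 1-D illustration (not from the paper) of why smooth-background filters
# fail at nucleus borders; profile shape, window size and amplitudes are assumptions.
import numpy as np
from scipy.ndimage import median_filter

rng = np.random.default_rng(0)

x = np.arange(512)
cytosol = 100.0 * np.ones_like(x, dtype=float)
cytosol[200:320] = 10.0                       # dark "nucleus" imprint with sharp borders
endosomes = np.zeros_like(cytosol)
for c in (50, 120, 400, 470):                 # a few bright compact spots
    endosomes += 150.0 * np.exp(-0.5 * ((x - c) / 2.0) ** 2)
profile = rng.poisson(cytosol + endosomes).astype(float)

background = median_filter(profile, size=61)  # "smooth background" assumption
foreground = profile - background

# Around x = 200 and x = 320 the median lags the abrupt intensity step, so
# `foreground` shows a spurious bright rim at the nucleus border.
```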

2. Two Background Level Estimation (TBL) Algorithm

2.1. Probabilistic Model of Intensity

Fluorescence microscopy images are dominated by the Poisson noise of the photo-electron flux in the photomultiplier tube (PMT) or CMOS/CCD camera. The probability to detect $n$ photo-electrons is $P(n) = \frac{\lambda^n e^{-\lambda}}{\Gamma(n+1)}$. Assuming that the intensity $I$ depends linearly on the number of photo-electrons, $I = \alpha \cdot n + I_0$, where the offset $I_0$ can be either positive or negative, depending on the microscope settings. Therefore, the variance of the intensity is $\sigma^2 = \alpha \cdot I + \zeta$, where $\zeta = \varepsilon^2 - \alpha \cdot I_0$ and $\varepsilon^2$ is the variance of the zero-mean Gaussian noise of the electronic circuits.
First, we found the parameters $\alpha$ and $\zeta$ for a single image as described in [11]. In most practical cases $\varepsilon^2$ is small; therefore, we approximated $I_0 \approx -\zeta/\alpha$ and subtracted it from the image: $J_i = \max\left(0,\, I_i + \zeta/\alpha\right)$. Second, we calculated an estimate of the variance $\sigma_i^2$ for each pixel. Third, we approximated the Poisson noise distribution by a truncated Gaussian distribution, since this allows the integrals below to be evaluated analytically.
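The first two steps might look like the following minimal sketch, assuming $\alpha$ and $\zeta$ have already been fitted to the image as in [11] (the fitting itself is not reproduced here):

```python
# Offset subtraction and per-pixel variance from the fitted noise model.
import numpy as np

def offset_and_variance(I, alpha, zeta):
    """Offset-corrected intensities J_i = max(0, I_i + zeta/alpha) and
    per-pixel variance estimates sigma_i^2 = alpha*I_i + zeta."""
    J = np.maximum(0.0, I + zeta / alpha)
    var = np.maximum(alpha * I + zeta, 1e-12)   # guard against non-positive values
    return J, var
```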
Therefore, the probability of the intensity in the absence of foreground, given the background intensity $B_i$, was approximated as:
$$P\left(J_i \mid B_i, \sigma_i\right) = \sqrt{\frac{2}{\pi}}\;\frac{e^{-\frac{\left(J_i - B_i\right)^2}{2\sigma_i^2}}}{\sigma_i\,\operatorname{erfc}\!\left(-\frac{B_i}{\sqrt{2}\,\sigma_i}\right)}, \qquad B_i \ge 0 \tag{1}$$
In the presence of a foreground signal $F_i$, it was approximated as:
$$P\left(J_i, \sigma_i \mid B_i\right) = \int_0^{\infty} \sqrt{\frac{2}{\pi}}\;\frac{e^{-\frac{\left(J_i - F_i - B_i\right)^2}{2\sigma_i^2}}}{\sigma_i\,\operatorname{erfc}\!\left(-\frac{B_i}{\sqrt{2}\,\sigma_i}\right)}\,P\left(F_i\right)\,dF_i, \tag{2}$$
where $P(F_i)$ is the prior of the foreground. Following the maximum entropy principle, we chose the prior distribution of the foreground to be
$$P\left(F_i \mid \mu\right) = \frac{1}{\mu}\,e^{-\frac{F_i}{\mu}} \tag{3}$$
The parameter of the prior (the mean expected foreground amplitude $\mu$) was estimated independently of the analyzed pixel (see Appendix A).
After substituting (3) into (2), we obtained:
$$P\left(J_i, \sigma_i \mid B_i\right) = \frac{1}{\mu}\,\frac{e^{-\frac{\left(J_i - B_i\right)^2}{2\sigma_i^2}}}{\Psi\!\left(\frac{\sigma_i}{\mu} - \frac{J_i - B_i}{\sigma_i}\right)}\;\frac{e^{\frac{B_i^2}{2\sigma_i^2}}\,\Psi\!\left(\frac{\sigma_i}{\mu} + \frac{B_i}{\sigma_i}\right)\Psi\!\left(-\frac{B_i}{\sigma_i}\right)}{\Psi\!\left(\frac{\sigma_i}{\mu} + \frac{B_i}{\sigma_i}\right) + \Psi\!\left(-\frac{B_i}{\sigma_i}\right)}, \tag{4}$$
where
$$\Psi(x) = \sqrt{\frac{2}{\pi}}\;\frac{e^{-\frac{x^2}{2}}}{\operatorname{erfc}\!\left(\frac{x}{\sqrt{2}}\right)} \tag{5}$$
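For numerical work, $\Psi$ can be evaluated with the scaled complementary error function to avoid overflow for large positive arguments; a small helper (ours, not part of the paper) is:

```python
# Psi(x) = sqrt(2/pi) * exp(-x**2/2) / erfc(x/sqrt(2))
#        = sqrt(2/pi) / erfcx(x/sqrt(2)),  using erfcx(y) = exp(y**2) * erfc(y).
import numpy as np
from scipy.special import erfcx

def psi(x):
    x = np.asarray(x, dtype=float)
    return np.sqrt(2.0 / np.pi) / erfcx(x / np.sqrt(2.0))
```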

2.2. Probabilistic Model of Two Background Levels

Assuming that the background is a slowly varying signal, we approximate it by a constant in the vicinity of pixel i. The vicinity window is defined by the characteristic scale that discriminates background from foreground. We introduced latent variables $z_{i,k}$, $k = 1, 2, 3, 4$, to define four possible states of the pixel intensity: background $B_1$, background $B_2$, background $B_1$ with foreground, and background $B_2$ with foreground. For a vicinity window $\Omega$ containing two background levels, the probability of the observed intensities and latent states is:
$$
\begin{split}
P\left(J_i, z_i \mid B_1, B_2, \sigma_i\right) = \prod_{i \in \Omega} e^{-\frac{J_i^2}{2\sigma_i^2}} \Biggl[\;
& e^{\frac{J_i B_{i,1}}{\sigma_i^2}}\, \Psi\!\left(-\tfrac{B_{i,1}}{\sigma_i}\right)
\left( \frac{\delta_{z_i,1}}{\sigma_i}\,(1-\beta_0)
+ \frac{\delta_{z_i,2}}{\mu}\,\beta_0\,
\frac{\Psi\!\left(\tfrac{\sigma_i}{\mu} + \tfrac{B_{i,1}}{\sigma_i}\right)}
{\Psi\!\left(\tfrac{\sigma_i}{\mu} - \tfrac{J_i - B_{i,1}}{\sigma_i}\right)
\left[\Psi\!\left(\tfrac{\sigma_i}{\mu} + \tfrac{B_{i,1}}{\sigma_i}\right) + \Psi\!\left(-\tfrac{B_{i,1}}{\sigma_i}\right)\right]} \right) \\
{}+{} & e^{\frac{J_i B_{i,2}}{\sigma_i^2}}\, \Psi\!\left(-\tfrac{B_{i,2}}{\sigma_i}\right)
\left( \frac{\delta_{z_i,3}}{\sigma_i}\,(1-\beta_0)
+ \frac{\delta_{z_i,4}}{\mu}\,\beta_0\,
\frac{\Psi\!\left(\tfrac{\sigma_i}{\mu} + \tfrac{B_{i,2}}{\sigma_i}\right)}
{\Psi\!\left(\tfrac{\sigma_i}{\mu} - \tfrac{J_i - B_{i,2}}{\sigma_i}\right)
\left[\Psi\!\left(\tfrac{\sigma_i}{\mu} + \tfrac{B_{i,2}}{\sigma_i}\right) + \Psi\!\left(-\tfrac{B_{i,2}}{\sigma_i}\right)\right]} \right)
\Biggr]
\end{split}
\tag{6}
$$
where $\beta_0$ is the prior probability of the presence of foreground in the pixel (see Appendix A). If $B/\sigma \gg 1$, the expression can be simplified as:
$$
\begin{split}
P\left(J_i, z_i \mid B_1, B_2, \sigma_i\right) = \prod_{i \in \Omega} e^{-\frac{J_i^2}{2\sigma_i^2}} \Biggl[\;
& e^{\frac{J_i B_{i,1}}{\sigma_i^2}}\, \Psi\!\left(-\tfrac{B_{i,1}}{\sigma_i}\right)
\left( \frac{\delta_{z_i,1}}{\sigma_i}\,(1-\beta_0)
+ \frac{\delta_{z_i,2}}{\mu}\,\frac{\beta_0}{\Psi\!\left(\tfrac{\sigma_i}{\mu} - \tfrac{J_i - B_{i,1}}{\sigma_i}\right)} \right) \\
{}+{} & e^{\frac{J_i B_{i,2}}{\sigma_i^2}}\, \Psi\!\left(-\tfrac{B_{i,2}}{\sigma_i}\right)
\left( \frac{\delta_{z_i,3}}{\sigma_i}\,(1-\beta_0)
+ \frac{\delta_{z_i,4}}{\mu}\,\frac{\beta_0}{\Psi\!\left(\tfrac{\sigma_i}{\mu} - \tfrac{J_i - B_{i,2}}{\sigma_i}\right)} \right)
\Biggr]
\end{split}
\tag{7}
$$
We used the EM algorithm to maximize the likelihood $L\left(B_1, B_2 \mid J_i, z_i, \sigma_i\right)$ over the backgrounds $B_1, B_2$. The resulting backgrounds are presented in Figure 2a.
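A schematic sketch of such an EM iteration for one vicinity window is shown below. It is a simplification, not the authors' implementation: $\mu$ and $\beta_0$ are assumed to be already known (Appendix A), the initial levels are crude percentiles, and the M-step updates $B_1$, $B_2$ as responsibility-weighted means of the background-classified pixels rather than maximizing (6)–(7) exactly.

```python
# Simplified EM sketch for the two background levels in one vicinity window.
import numpy as np
from scipy.special import erfc, erfcx

def log_psi(x):
    """log Psi(x) = 0.5*log(2/pi) - x**2/2 - log erfc(x/sqrt(2)), numerically stable."""
    x = np.atleast_1d(np.asarray(x, dtype=float))
    pos = x > 0
    out = np.empty_like(x)
    out[~pos] = -0.5 * x[~pos] ** 2 - np.log(erfc(x[~pos] / np.sqrt(2.0)))
    out[pos] = -np.log(erfcx(x[pos] / np.sqrt(2.0)))   # erfcx(y) = exp(y**2) * erfc(y)
    return 0.5 * np.log(2.0 / np.pi) + out

def log_state_weights(J, sigma, B, mu, beta0):
    """Log weights of the 'background only' and 'background + foreground' states for
    one background level B, following the bracketed terms of Eq. (6); the common
    factor exp(-J**2 / (2*sigma**2)) is dropped, it cancels in the responsibilities."""
    log_common = J * B / sigma ** 2 + log_psi(-B / sigma)
    log_bg = np.log(1.0 - beta0) - np.log(sigma) + log_common
    log_fg = (np.log(beta0) - np.log(mu) + log_common
              + log_psi(sigma / mu + B / sigma)
              - log_psi(sigma / mu - (J - B) / sigma)
              - np.log(np.exp(log_psi(sigma / mu + B / sigma)) + np.exp(log_psi(-B / sigma))))
    return log_bg, log_fg

def em_two_backgrounds(J, sigma, mu, beta0, n_iter=50):
    B1, B2 = np.percentile(J, 20), np.percentile(J, 60)          # crude initial levels
    for _ in range(n_iter):
        # E-step: responsibilities of the four latent states z_i = 1..4
        logs = np.stack(log_state_weights(J, sigma, B1, mu, beta0)
                        + log_state_weights(J, sigma, B2, mu, beta0))
        logs -= logs.max(axis=0)                                  # stabilize the softmax
        r = np.exp(logs)
        r /= r.sum(axis=0)
        # M-step (simplified): weighted means over background-classified pixels
        B1 = float(np.sum(r[0] * J) / np.sum(r[0]))
        B2 = float(np.sum(r[2] * J) / np.sum(r[2]))
    return B1, B2, r
```

In the full algorithm this estimation is repeated for the vicinity window of every pixel of interest, so that the background levels vary slowly over the image.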
We then calculated the probability of the presence of foreground in pixel i (assuming $B_2 > B_1$ without loss of generality):
$$p_i^{\mathrm{signal}} = \begin{cases} \dfrac{p_{i,z_i=4}}{p_{i,z_i=4} + p_{i,z_i=3}}, & I_i > B_2 \\[2ex] \dfrac{p_{i,z_i=2}}{p_{i,z_i=2} + p_{i,z_i=1}}, & \text{otherwise} \end{cases} \tag{8}$$
where
$$p_{i,z_i=1} = \frac{1-\beta_0}{\sigma_i}\, e^{\frac{J_i B_{i,1}}{\sigma_i^2}}\,\Psi\!\left(-\frac{B_{i,1}}{\sigma_i}\right); \qquad p_{i,z_i=3} = \frac{1-\beta_0}{\sigma_i}\, e^{\frac{J_i B_{i,2}}{\sigma_i^2}}\,\Psi\!\left(-\frac{B_{i,2}}{\sigma_i}\right) \tag{9}$$
$$p_{i,z_i=2} = \frac{\beta_0}{\mu}\, e^{\frac{J_i B_{i,1}}{\sigma_i^2}}\,\Psi\!\left(-\frac{B_{i,1}}{\sigma_i}\right) \frac{\Psi\!\left(\frac{\sigma_i}{\mu} + \frac{B_{i,1}}{\sigma_i}\right)}{\Psi\!\left(\frac{\sigma_i}{\mu} - \frac{J_i - B_{i,1}}{\sigma_i}\right)\left[\Psi\!\left(\frac{\sigma_i}{\mu} + \frac{B_{i,1}}{\sigma_i}\right) + \Psi\!\left(-\frac{B_{i,1}}{\sigma_i}\right)\right]} \tag{10}$$
$$p_{i,z_i=4} = \frac{\beta_0}{\mu}\, e^{\frac{J_i B_{i,2}}{\sigma_i^2}}\,\Psi\!\left(-\frac{B_{i,2}}{\sigma_i}\right) \frac{\Psi\!\left(\frac{\sigma_i}{\mu} + \frac{B_{i,2}}{\sigma_i}\right)}{\Psi\!\left(\frac{\sigma_i}{\mu} - \frac{J_i - B_{i,2}}{\sigma_i}\right)\left[\Psi\!\left(\frac{\sigma_i}{\mu} + \frac{B_{i,2}}{\sigma_i}\right) + \Psi\!\left(-\frac{B_{i,2}}{\sigma_i}\right)\right]} \tag{11}$$
Assuming that each pixel in the vicinity window belongs exclusively to one background level, we obtained the probabilities of the background levels:
$$P_{j,1} = \sum_{i=1}^{N} \left[ \left(p_{i,1} + p_{i,2}\right) p_i^{\mathrm{signal}}\, p_j^{\mathrm{signal}} + p_{i,1}\left(1 - p_i^{\mathrm{signal}}\, p_j^{\mathrm{signal}}\right) \right] \exp\!\left(-\frac{\left(I_i - I_j\right)^2}{2\left(\sigma_i^2 + \sigma_j^2\right)}\right) \tag{12}$$
$$P_{j,2} = \sum_{i=1}^{N} \left[ \left(p_{i,3} + p_{i,4}\right) p_i^{\mathrm{signal}}\, p_j^{\mathrm{signal}} + p_{i,3}\left(1 - p_i^{\mathrm{signal}}\, p_j^{\mathrm{signal}}\right) \right] \exp\!\left(-\frac{\left(I_i - I_j\right)^2}{2\left(\sigma_i^2 + \sigma_j^2\right)}\right) \tag{13}$$
Finally, the consensus background intensity in the pixel of interest was calculated as the weighted mean:
$$B = \frac{B_1 P_1 + B_2 P_2}{P_1 + P_2} \tag{14}$$
The variance of the background intensity was estimated as:
$$\sigma_b^2 = \mathrm{Var}\left(B_1\right)\left(\frac{P_1}{P_1 + P_2}\right)^{2} + \mathrm{Var}\left(B_2\right)\left(\frac{P_2}{P_1 + P_2}\right)^{2} + \left(B_1 - B_2\right)^{2}\,\frac{P_1 P_2}{\left(P_1 + P_2\right)^{2}}\,\frac{1}{N} \tag{15}$$
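For illustration, a minimal sketch of Equations (14) and (15); the variable names are ours, and $P_1$, $P_2$, $\mathrm{Var}(B_1)$, $\mathrm{Var}(B_2)$ are assumed to be already computed:

```python
# Consensus background (Eq. 14) and its variance estimate (Eq. 15).
import numpy as np

def consensus_background(B1, B2, P1, P2, var_B1, var_B2, N):
    w1, w2 = P1 / (P1 + P2), P2 / (P1 + P2)
    B = B1 * w1 + B2 * w2                                   # Eq. (14)
    var_B = (var_B1 * w1 ** 2 + var_B2 * w2 ** 2
             + (B1 - B2) ** 2 * w1 * w2 / N)                # Eq. (15)
    return B, var_B
```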
The final background was calculated by formula (14) (Figure 2b). The difference between the original intensities and the estimated background has a residual "nucleus border artifact" that lies within the estimated uncertainty (Figure 2c).
The result of the TBL algorithm is presented in Figure 3a,b.

3. Conclusions

We developed a new algorithm (TBL) to estimate the cytoplasmic fluorescence (background) in conditions where high spatial frequencies are present in both the background and the foreground signal. TBL avoids the artifacts that are characteristic of state-of-the-art background subtraction algorithms in the presence of an inhomogeneous background with sharp transitions between levels.

Author Contributions

Methodology, Y.K.; software, H.M.-N.; investigation, I.K.; supervision, M.Z.

Funding

This work was financially supported by the German Federal Ministry of Education and Research (BMBF) (LiSyM: grant #031L0038), European Research Council (ERC) (grant #695646) and the Max Planck Society (MPG).

Acknowledgments

We thank the Center for Information Services and High Performance Computing (ZIH) of the TU Dresden and the Light Microscopy Facility of the MPI-CBG.

Conflicts of Interest

The authors declare no conflict of interest. The funders had no role in the design of the study; in the collection, analyses, or interpretation of data; in the writing of the manuscript, or in the decision to publish the results.

Appendix A. Estimation of the Prior Parameters μ and β

It is reasonable to assume that most pixels have only one background level in their close vicinity. In fluorescence microscopy images of intracellular organelles, the majority of pixels belong to the background. Therefore, a median filter, which is insensitive to outliers (foreground), can be used as a crude estimate of the background. However, given that the intensity is not normally distributed, the median is shifted relative to the mode. In our truncated Gaussian approximation, the median $M_i$ is the solution of the equation:
$$\operatorname{erf}\!\left(\frac{M_i - B_i}{\sqrt{2}\,\sigma_i}\right) = \frac{1}{2}\operatorname{erfc}\!\left(\frac{B_i}{\sqrt{2}\,\sigma_i}\right), \qquad B_i \ge 0 \tag{A1}$$
Therefore, we first calculated the median $M_i$ over the vicinity window of the image $\left(J_i, \sigma_i\right)$, where $J_i = I_i + \zeta/\alpha$ denotes the intensity after offset subtraction, then calculated $B_i$ by numerically solving Equation (A1) and, finally, constructed the image $u_i = \left(J_i - B_i\right)/\sigma_i$.
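A per-pixel sketch of this step is given below; the window size, the root bracketing and the fallback to $B = 0$ are our assumptions:

```python
# Crude background from the local median via Eq. (A1), solved numerically per pixel.
import numpy as np
from scipy.optimize import brentq
from scipy.special import erf, erfc

def background_from_median(M, sigma):
    """Solve erf((M - B)/(sqrt(2)*sigma)) = 0.5*erfc(B/(sqrt(2)*sigma)) for B >= 0;
    falls back to B = 0 when the median is too small for a non-negative solution."""
    def g(B):
        return erf((M - B) / (np.sqrt(2.0) * sigma)) - 0.5 * erfc(B / (np.sqrt(2.0) * sigma))
    if g(0.0) <= 0.0:
        return 0.0
    return brentq(g, 0.0, M + 10.0 * sigma)

# hypothetical usage on an offset-corrected image J with per-pixel sigma:
# M = scipy.ndimage.median_filter(J, size=31)
# B = np.vectorize(background_from_median)(M, sigma)
# u = (J - B) / sigma
```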
In the absence of foreground, the image $u$ has the intensity distribution
$$p_b\left(u \mid b, \mu, \sigma\right) = \sqrt{\frac{2}{\pi}}\;\frac{e^{-\frac{u^2}{2}}}{\operatorname{erfc}\!\left(-\frac{b}{\sqrt{2}\,\sigma}\right)} \tag{A2}$$
In the presence of foreground, the image has the intensity distribution
$$p_f\left(u \mid \mu, \sigma, b\right) = \frac{\frac{\sigma}{\mu}\,e^{-\frac{\sigma}{\mu}u}\,\operatorname{erfc}\!\left(\frac{1}{\sqrt{2}}\left(\frac{\sigma}{\mu} - u\right)\right)}{e^{\frac{b}{\mu}}\operatorname{erfc}\!\left(\frac{1}{\sqrt{2}}\left(\frac{\sigma}{\mu} + \frac{b}{\sigma}\right)\right) + e^{-\frac{1}{2}\left(\frac{\sigma}{\mu}\right)^2}\operatorname{erfc}\!\left(-\frac{b}{\sqrt{2}\,\sigma}\right)} \tag{A3}$$
Therefore, the ratio of the fraction of pixels with intensity in the interval $(t, \infty)$ to the fraction of pixels with intensity in the interval $(0, 1]$ is, for distribution (A2):
$$b = \frac{\int_t^{\infty} p_b\left(u \mid b, \mu, \sigma\right)du}{\int_0^{1} p_b\left(u \mid b, \mu, \sigma\right)du} = \frac{1 - \operatorname{erf}\!\left(\frac{t}{\sqrt{2}}\right)}{\operatorname{erf}\!\left(\frac{1}{\sqrt{2}}\right)} \tag{A4}$$
For distribution (A3) the ratio is:
$$f = \frac{\int_t^{\infty} p_f\left(u \mid b, \mu, \sigma\right)du}{\int_0^{1} p_f\left(u \mid b, \mu, \sigma\right)du} = \frac{e^{-\frac{\sigma}{\mu}t}\operatorname{erfc}\!\left(\frac{1}{\sqrt{2}}\left(\frac{\sigma}{\mu} - t\right)\right) + e^{-\frac{1}{2}\left(\frac{\sigma}{\mu}\right)^2}\operatorname{erfc}\!\left(\frac{t}{\sqrt{2}}\right)}{\operatorname{erfc}\!\left(\frac{1}{\sqrt{2}}\,\frac{\sigma}{\mu}\right) - e^{-\frac{\sigma}{\mu}}\operatorname{erfc}\!\left(\frac{1}{\sqrt{2}}\left(\frac{\sigma}{\mu} - 1\right)\right) + e^{-\frac{1}{2}\left(\frac{\sigma}{\mu}\right)^2}\operatorname{erf}\!\left(\frac{1}{\sqrt{2}}\right)} \tag{A5}$$
Integrating $f$ over $\mu$ with the Jeffreys prior $1/\mu$ gives:
$$\langle f \rangle = \lim_{A \to \infty} \frac{1}{A}\int_0^{A} f\,\frac{1}{\mu}\,d\mu \approx 1.47 - 0.009\,t \tag{A6}$$
We then calculated the ratio $R$ of the number of pixels with $u > t$ to the number of pixels with $0 < u \le 1$ in the image:
$$R = \frac{b\left(1 + \langle f \rangle\right) + \beta\left(\langle f \rangle - b\right)}{1 + \langle f \rangle - \beta\left(\langle f \rangle - b\right)} \tag{A7}$$
The ratio (A7) was calculated for a set of thresholds $t$ ($t = 1, 2, 3$). The resulting overdetermined system was solved in the least-squares sense with respect to $\beta$, the probability of the presence of a foreground signal in a pixel.
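A sketch of this estimate is given below, under the assumption that the reference count is the number of pixels with $0 < u \le 1$ and with Equation (A7) rearranged into a relation that is linear in $\beta$:

```python
# Least-squares estimate of beta from Eq. (A7) over t = 1, 2, 3;
# u is the normalized image (J - B)/sigma, flattened to 1-D.
import numpy as np
from scipy.special import erf

def estimate_beta(u, thresholds=(1.0, 2.0, 3.0)):
    n_ref = np.count_nonzero((u > 0.0) & (u <= 1.0))     # pixels with 0 < u <= 1
    A, y = [], []
    for t in thresholds:
        R = np.count_nonzero(u > t) / n_ref              # measured ratio
        b = (1.0 - erf(t / np.sqrt(2.0))) / erf(1.0 / np.sqrt(2.0))   # Eq. (A4)
        f_mean = 1.47 - 0.009 * t                        # Eq. (A6)
        # Eq. (A7) rearranged: beta * (f_mean - b) * (1 + R) = (R - b) * (1 + f_mean)
        A.append((f_mean - b) * (1.0 + R))
        y.append((R - b) * (1.0 + f_mean))
    beta, *_ = np.linalg.lstsq(np.asarray(A)[:, None], np.asarray(y), rcond=None)
    return float(np.clip(beta[0], 0.0, 1.0))
```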
The expected mean values of pixels with intensities $u > t$ are:
$$m_{b,u>t} = \frac{\int_t^{\infty} u\, p_b\left(u \mid \mu, B, \sigma\right)du}{\int_t^{\infty} p_b\left(u \mid \mu, B, \sigma\right)du} = \sqrt{\frac{2}{\pi}}\;\frac{e^{-\frac{t^2}{2}}}{\operatorname{erfc}\!\left(\frac{t}{\sqrt{2}}\right)} \tag{A8}$$
$$m_{f,u>t} = \frac{\int_t^{\infty} u\, p_f\left(u \mid \mu, B, \sigma\right)du}{\int_t^{\infty} p_f\left(u \mid \mu, B, \sigma\right)du} = \frac{\left(\frac{\mu}{\sigma} + t\right) e^{\frac{1}{2}\left(\frac{\sigma}{\mu} - t\right)^2 - \frac{1}{2}t^2}\operatorname{erfc}\!\left(\frac{1}{\sqrt{2}}\left(\frac{\sigma}{\mu} - t\right)\right) + \sqrt{\frac{2}{\pi}}\,e^{-\frac{1}{2}t^2} + \frac{\mu}{\sigma}\operatorname{erfc}\!\left(\frac{t}{\sqrt{2}}\right)}{e^{\frac{1}{2}\left(\frac{\sigma}{\mu} - t\right)^2 - \frac{1}{2}t^2}\operatorname{erfc}\!\left(\frac{1}{\sqrt{2}}\left(\frac{\sigma}{\mu} - t\right)\right) + \operatorname{erfc}\!\left(\frac{t}{\sqrt{2}}\right)} \tag{A9}$$
Therefore, the expected mean value of pixels with intensities $u > t$ is $m_{u>t} = \beta\, m_{f,u>t} + \left(1-\beta\right) m_{b,u>t}$.
We calculated the experimental mean value $M$ of pixels with intensities $u > t$ for the same set of thresholds ($t = 1, 2, 3$):
$$M = \beta\,\frac{\left(\eta + t\right) e^{\frac{1}{2}\left(\frac{1}{\eta} - t\right)^2 - \frac{1}{2}t^2}\operatorname{erfc}\!\left(\frac{1}{\sqrt{2}}\left(\frac{1}{\eta} - t\right)\right) + \sqrt{\frac{2}{\pi}}\,e^{-\frac{1}{2}t^2} + \eta\,\operatorname{erfc}\!\left(\frac{t}{\sqrt{2}}\right)}{e^{\frac{1}{2}\left(\frac{1}{\eta} - t\right)^2 - \frac{1}{2}t^2}\operatorname{erfc}\!\left(\frac{1}{\sqrt{2}}\left(\frac{1}{\eta} - t\right)\right) + \operatorname{erfc}\!\left(\frac{t}{\sqrt{2}}\right)} + \left(1 - \beta\right)\sqrt{\frac{2}{\pi}}\;\frac{e^{-\frac{1}{2}t^2}}{\operatorname{erfc}\!\left(\frac{t}{\sqrt{2}}\right)}, \qquad \eta = \frac{\mu}{\sigma} \tag{A10}$$
and solved the overdetermined system in the least-squares sense with respect to $\eta$. The parameter $\mu$ was then estimated as $\mu = \eta \cdot \sigma$.
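A sketch of the corresponding fit of $\eta$ is given below; the initial value and the bounds are our choices, and the per-pixel prior parameter is then $\mu_i = \eta \cdot \sigma_i$:

```python
# Nonlinear least-squares estimate of eta from Eq. (A10) over t = 1, 2, 3,
# assuming beta is already known from Eq. (A7).
import numpy as np
from scipy.optimize import least_squares
from scipy.special import erfc

def m_bg(t):
    # Eq. (A8): expected mean of background pixels above threshold t
    return np.sqrt(2.0 / np.pi) * np.exp(-0.5 * t ** 2) / erfc(t / np.sqrt(2.0))

def m_fg(t, eta):
    # Eq. (A9) with mu/sigma = eta (so sigma/mu = 1/eta)
    s = 1.0 / eta
    core = np.exp(0.5 * (s - t) ** 2 - 0.5 * t ** 2) * erfc((s - t) / np.sqrt(2.0))
    num = ((eta + t) * core + np.sqrt(2.0 / np.pi) * np.exp(-0.5 * t ** 2)
           + eta * erfc(t / np.sqrt(2.0)))
    return num / (core + erfc(t / np.sqrt(2.0)))

def estimate_eta(u, beta, thresholds=(1.0, 2.0, 3.0)):
    M_obs = np.array([u[u > t].mean() for t in thresholds])       # measured means
    def residuals(p):
        eta = p[0]
        model = np.array([beta * m_fg(t, eta) + (1.0 - beta) * m_bg(t) for t in thresholds])
        return model - M_obs
    eta = least_squares(residuals, x0=[1.0], bounds=(0.1, np.inf)).x[0]
    return eta           # the foreground prior parameter is then mu_i = eta * sigma_i
```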

References

1. Zimmer, M. Green Fluorescent Protein (GFP): Applications, Structure, and Related Photophysical Behavior. Chem. Rev. 2002, 102, 759–782.
2. Rink, J.C.; Ghigo, E.; Kalaidzidis, Y.L.; Zerial, M. Rab Conversion as a Mechanism of Progression from Early to Late Endosomes. Cell 2005, 122, 735–749.
3. Meijering, E.; Smal, I.; Danuser, G. Tracking in Molecular Bioimaging. IEEE Signal Process. Mag. 2006, 23, 46–53.
4. Sbalzarini, I.F.; Koumoutsakos, P. Feature point tracking and trajectory analysis for video imaging in cell biology. J. Struct. Biol. 2006, 151, 182–195.
5. Pfeffer, S.R. Rab GTPases: master regulators that establish the secretory and endocytic pathways. Mol. Biol. Cell 2017, 28, 712–715.
6. Pécot, T.; Bouthemy, P.; Boulanger, J.; Chessel, A.; Bardin, S.; Salamero, J.; Kervrann, C. Background Fluorescence Estimation and Vesicle Segmentation in Live Cell Imaging with Conditional Random Fields. IEEE Trans. Image Process. 2015, 24, 667–680.
7. Kalaidzidis, Y. (Version 8.93.00, 27 June 2019). Available online: http://motiontracking.mpi-cbg.de.
8. Lee, H.-C.; Yang, G. Computational Removal of Background Fluorescence for Biological Fluorescence Microscopy. In Proceedings of the IEEE 11th International Symposium on Biomedical Imaging (ISBI), Beijing, China, 29 April–2 May 2014.
9. Yang, L.; Zhang, Y.; Guldner, I.H.; Zhang, S.; Chen, D.Z. Fast Background Removal in 3D Fluorescence Microscopy Images Using One-Class Learning. In International Conference on Medical Image Computing and Computer-Assisted Intervention—MICCAI 2015, Part III; Springer: Cham, Switzerland, 2015; pp. 292–299.
10. Sternberg, S.R. Biomedical Image Processing. Computer (IEEE) 1983, 16, 22–34.
11. Kalaidzidis, Y. Fluorescence Microscopy Noise Model: Estimation of Poisson Noise Parameters from Snap-Shot Image. In Proceedings of the International Conference on Bioinformatics and Computational Biology (BIOCOMP'17), Las Vegas, NV, USA, 17–20 July 2017; pp. 63–66.
Figure 1. (a) A431 cells with GFP-tagged Rab5a. The cytosol is labelled by the soluble fraction of GFP-Rab5a. Bright structures of different sizes and shapes are endosomes, labelled by membrane-bound GFP-Rab5a. Images were obtained with a spinning disk microscope (Andor-Olympus-IX71 inverted stand; CSU-X1 Yokogawa scan head; Olympus UPlanSApo 63× 1.35 oil objective; Optovar 1.6). The letters N denote nuclei. The yellow line marks the intensity profile presented in panel (b). (b) Intensity profiles along the yellow line in (a). The black curve is the intensity of the original image. Red, green, blue and cyan curves are the background estimates of the "rolling ball" [10], Gaussian, median and FMOR [7] filters, respectively. (c) Difference between the original intensities and the estimated background. Green, blue and red curves correspond to the background estimation by the Gaussian, median and FMOR filters, respectively. The filter window was 4 μm for all filters.
Figure 2. (a) Original intensity curve (black, as in Figure 1) and the two estimated background levels (red and green curves). (b) Original intensity curve (black) and the background estimated by TBL (blue curve). (c) Difference between the original intensities and the TBL background.
Figure 3. (a) Original image. (b) Image after background subtraction by TBL.
