# Segmentation and Shape Analysis of Macrophages Using Anglegram Analysis


## Abstract


## 1. Introduction

Two types of junctions are of interest: (i) **bends** in the boundaries of **overlapping objects**, whose intersections correspond to the junctions acquired; and (ii) **corners**, which correspond to the pointy edges of **single** objects. The **corners** detected, and the methodology to do so, provide a tool for the analysis of shape. On the other hand, the **bends** are used as the basis for completing a segmentation of the overlapping cells; four methods are presented, of which three use the information from the bends detected.

## 2. Materials and Methods

#### 2.1. Macrophages in Embryos

The cells in the data present four basic shapes: **circles or ellipses**; **drops**, which have one sharp edge like a water drop; **bi-drops or croissants**, which have two edges; and **tri-drops**, which are similar to a drop but with three pointy edges. A close view of the shapes can be seen in Figure 1b. The data does not show instances of cells with more pointy edges. To illustrate the overlapping of cells (clumps) found in the data, Figure 1c shows one representative time frame where clumps are present. The green channel illustrates overlap that makes an accurate segmentation of the cells complicated. Figure 1d shows the detail of one of the clumps.

#### Ground Truth of the Macrophage Data

The ground truth (GT) was generated with Matlab^{®} software developed by the authors. The GT software, which is based on Matlab^{®}'s `imfreehand` function, allows the user to manually label images of cells, accounting for the overlap. The user labels all cells of interest in both red and green channels of the data. In this work, a **subset of ten frames** from the original 541 images was manually segmented by the authors; the frames selected present examples of overlapping that can be recognised and studied, namely the four CLUMPS depicted in Figure 1c. An example of both manually segmented channels can be seen in Figure 2.
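Computationally, this kind of free-hand labelling reduces to rasterising a hand-drawn closed polygon into a binary label mask. A minimal Python/NumPy sketch of that step is below; it is not the authors' Matlab tool, and the triangle vertices and even-odd ray-casting rule are assumptions made for the example.

```python
import numpy as np

def polygon_mask(vertices, shape):
    """Rasterise a closed polygon (as drawn with a free-hand tool) into a
    boolean label mask, via even-odd ray casting. For brevity this sketch
    assumes no edge is perfectly horizontal."""
    h, w = shape
    yy, xx = np.mgrid[0:h, 0:w]
    px, py = xx.ravel() + 0.0, yy.ravel() + 0.0
    inside = np.zeros(px.shape, bool)
    v = np.asarray(vertices, float)
    n = len(v)
    for i in range(n):
        x0, y0 = v[i]
        x1, y1 = v[(i + 1) % n]
        # a horizontal ray from each pixel crosses this edge when the edge
        # straddles the pixel's y and the intersection lies to its right
        crosses = ((y0 > py) != (y1 > py)) & \
                  (px < (x1 - x0) * (py - y0) / (y1 - y0) + x0)
        inside ^= crosses
    return inside.reshape(h, w)

# Hypothetical triangular label on a 10x10 frame.
mask = polygon_mask([(1, 1), (8, 2), (2, 8)], (10, 10))
print(mask.sum())
```

In practice one would use a library rasteriser, but the even-odd test above is the underlying operation.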

#### 2.2. Synthetic Data

#### 2.2.1. Synthetic Data of Overlapping Ellipses

Synthetic images of overlapping ellipse pairs were generated in Matlab^{®} to test the method at different values of $(\varphi ,\mathrm{\Delta})$, with $\varphi $ ranging from 0 to 90 degrees and $\mathrm{\Delta}$ from 0 to 160 pixels, both in increments of 10. Images of size $({n}_{h},{n}_{w})=(256,512)$ were generated with ${\mathbf{x}}_{0}={(128,128)}^{T}$ and axes $(a,b)=(120,53)$ that contained an overlap of ${\mathcal{E}}_{0}$ and ${\mathcal{E}}_{\varphi ,\mathrm{\Delta}}$. Disregarding the cases where no overlap was present in the generated ellipses, a total of 142 images was produced. Figure 3 contains a subset of the ellipses tested.
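The $(\varphi ,\mathrm{\Delta})$ sweep can be sketched as follows. This is a Python/NumPy approximation of the Matlab generation: rasterising each ellipse from its implicit equation is an assumption, and the count of kept images is not asserted to match the reported 142.

```python
import numpy as np

def ellipse_mask(shape, center, axes, angle_deg):
    """Boolean mask of a rotated ellipse with given (x, y) center and semi-axes."""
    h, w = shape
    yy, xx = np.mgrid[0:h, 0:w]
    t = np.deg2rad(angle_deg)
    # rotate image coordinates into the ellipse's own frame
    x = (xx - center[0]) * np.cos(t) + (yy - center[1]) * np.sin(t)
    y = -(xx - center[0]) * np.sin(t) + (yy - center[1]) * np.cos(t)
    return (x / axes[0]) ** 2 + (y / axes[1]) ** 2 <= 1.0

# Fixed ellipse E0 plus a second ellipse rotated by phi and shifted by delta,
# mirroring the grid of (phi, delta) values described above.
shape, center, axes = (256, 512), (128, 128), (120, 53)
e0 = ellipse_mask(shape, center, axes, 0)
kept = 0
for phi in range(0, 100, 10):          # 0..90 degrees
    for delta in range(0, 170, 10):    # 0..160 pixels
        e1 = ellipse_mask(shape, (center[0] + delta, center[1]), axes, phi)
        if np.any(e0 & e1):            # keep only genuinely overlapping pairs
            kept += 1
print(kept)
```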

#### 2.2.2. Object Detection: Single Objects and Clumps

#### 2.2.3. Basic Shapes Synthetic Data

Each basic shape is defined by a set of control points: the **circle** has 4, the drop has 7, the bi-drop has 8 and the tri-drop has 10. To model the variations in the cells' shapes within their basic categories, the control points are Normally distributed. Each shape has a specific number of corners that classifies it, i.e., the drop has one pointy edge or corner, while the bi-drop and tri-drop have two and three, respectively. The control points are joined with splines that produce the **boundary** of the shape, $\mathcal{B}$, which then models that of a segmented cell (Figure 4). In this work, the shapes explored in detail were the drop, bi-drop and tri-drop, where only the corners were moved to modify the pointiness of each shape and the rest of the control points were kept stationary. The pointiness value of each shape was chosen empirically between a rounded version of each shape and a pointy one. These values were allocated on a scale that ranges from 0 to 1.
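The generation step (Normally jittered control points joined by a closed spline) can be sketched as below, assuming SciPy is available. The 7-point "drop" layout and the noise level are hypothetical values for illustration, not the paper's.

```python
import numpy as np
from scipy.interpolate import CubicSpline

def random_shape(control_points, sigma=2.0, n_boundary=200, seed=None):
    """Jitter control points with Normal noise and join them with a
    periodic cubic spline to obtain a closed boundary B of n_boundary points."""
    rng = np.random.default_rng(seed)
    pts = np.asarray(control_points, float)
    pts = pts + rng.normal(0.0, sigma, pts.shape)
    # close the curve: repeat the first point so the spline is periodic
    closed = np.vstack([pts, pts[:1]])
    t = np.linspace(0, 1, len(closed))
    cs = CubicSpline(t, closed, bc_type='periodic')
    return cs(np.linspace(0, 1, n_boundary))

# Hypothetical 7-point drop layout: six rounded points and one pointy tip.
drop = [(0, 40), (30, 25), (35, 0), (30, -25), (0, -40), (-25, -15), (-25, 15)]
boundary = random_shape(drop, seed=0)
print(boundary.shape)  # → (200, 2)
```

Moving only the tip control point while keeping the others fixed, as described above, would then vary the pointiness of the generated shape.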

#### 2.3. Junction Detection through Angle Variations

**Definition 1** (Inner angle of a point)**.** The inner angle ${\theta}_{i,j}$ of a boundary point ${\mathbf{p}}_{i}$ at separation j is the angle $\measuredangle {\mathbf{p}}_{i+j}{\mathbf{p}}_{i}{\mathbf{p}}_{i-j}$, measured towards the inside of the boundary (Figure 5).

**Definition 2** (Bend and corner junctions)**.** A **bend** is a junction in which the inner point angle for most separation distances is greater than 180 degrees. Conversely, a **corner** is a junction in which most of its inner point angles are acute, i.e., less than 180 degrees.

**Definition 3** (Anglegram matrix)**.** The **anglegram matrix** $\mathrm{\Theta}(i,j)={\theta}_{i,j}$ is defined as the values of the inner angles of each point i per separation j, that is ${\theta}_{i,j}=\measuredangle {\mathbf{p}}_{i+j}{\mathbf{p}}_{i}{\mathbf{p}}_{i-j}$.
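A direct transcription of Definition 3 in Python/NumPy might look like this. It assumes the boundary is a closed curve sampled as an (n, 2) array; the anticlockwise orientation of the test square is an assumption, chosen so that interior angles of a convex shape come out below 180 degrees, consistent with Definition 2.

```python
import numpy as np

def anglegram(boundary):
    """Anglegram matrix Theta[i, j]: inner angle (degrees) at boundary
    point p_i between p_(i+j) and p_(i-j), for separations j = 1..n/2-1."""
    p = np.asarray(boundary, float)
    n = len(p)
    idx = np.arange(n)
    theta = np.zeros((n, n // 2))
    for j in range(1, n // 2):
        v_plus = p[(idx + j) % n] - p       # vector towards p_(i+j)
        v_minus = p[(idx - j) % n] - p      # vector towards p_(i-j)
        # angle swept from v_plus to v_minus, wrapped to [0, 360):
        # values above 180 degrees flag bends, below 180 corners
        ang = (np.arctan2(v_minus[:, 1], v_minus[:, 0])
               - np.arctan2(v_plus[:, 1], v_plus[:, 0]))
        theta[:, j] = np.rad2deg(ang) % 360
    return theta

# Sanity check on a 40-point square traversed anticlockwise: corners
# should read ~90 degrees and straight edges ~180 at small separations.
s = np.linspace(0, 1, 10, endpoint=False)
square = np.concatenate([
    np.stack([s, np.zeros_like(s)], 1),      # bottom edge
    np.stack([np.ones_like(s), s], 1),       # right edge
    np.stack([1 - s, np.ones_like(s)], 1),   # top edge
    np.stack([np.zeros_like(s), 1 - s], 1),  # left edge
])
theta = anglegram(square)
print(theta.shape)  # → (40, 20)
```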

#### Criteria to Detect Junctions from the Anglegram

The junctions were detected with the function `findpeaks` from Matlab^{®}, which identifies local maxima of the input vector by choosing points whose two neighbours have a lower value. Due to quantisation noise in ${\widehat{\theta}}_{max}$, the parameters `MinPeakDistance` and `MinPeakHeight` were set to empirically consistent values. First, `MinPeakDistance`, which restricts the function to find local maxima with a minimum separation, was set to 25. Furthermore, the parameter `MinPeakHeight` was set to $mean({\widehat{\theta}}_{max})+0.75\times std({\widehat{\theta}}_{max})$.
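Outside Matlab, an equivalent of this peak-selection criterion can be sketched with `scipy.signal.find_peaks`, whose `distance` and `height` arguments play the roles of `MinPeakDistance` and `MinPeakHeight`. The toy profile below is illustrative, not data from the paper.

```python
import numpy as np
from scipy.signal import find_peaks

def detect_junctions(theta_max, min_dist=25, k=0.75):
    """Peaks of the anglegram's per-point maximum projection, selected
    with a minimum separation (cf. MinPeakDistance) and a height
    threshold of mean + k*std (cf. MinPeakHeight)."""
    height = theta_max.mean() + k * theta_max.std()
    peaks, _ = find_peaks(theta_max, distance=min_dist, height=height)
    return peaks

# Toy profile: flat around 180 degrees with two pronounced bumps.
x = np.full(300, 180.0)
x[74:77] = [250.0, 260.0, 250.0]
x[199:202] = [245.0, 255.0, 245.0]
print(detect_junctions(x))  # → [ 75 200]
```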

#### 2.4. Segmentation of Overlapping Regions

#### 2.4.1. Voronoi Partition

#### 2.4.2. Junction Slicing (JS)

#### 2.4.3. Edge Following (EF)

#### 2.4.4. Self-organising Maps (SOM) Fitting

## 3. Results

The corner detection via the anglegram was compared against the Harris corner detector, implemented as `detectHarrisFeatures` in Matlab^{®}. Finally, the results of the proposed segmentation techniques for the overlapping cells (Section 2.4) are shown, starting with the detection of objects explained in Section 2.2.2.

#### 3.1. Bend Detection in Overlapping Objects

#### 3.2. Corner Detection in Single Objects

#### Detection of Objects and Segmentation of Clumps

In the comparison images, **black** shows the background, **yellow** represents the false negatives, **blue** the false positives and **red** the true positives. The Jaccard Index per detected object was computed for all objects in the available frames. The results are shown in Figure 13, top, where the frame numbers are outlined along the axis. Two arrows were added to the figure to point to a high Jaccard value (red), shown in the middle row of the figure, and a notably low one (black - -), which corresponds to the bottom row.
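The per-object comparison above (the TP/FP/FN colour code and the Jaccard Index) amounts to simple set operations on binary masks, for example as below; the masks are illustrative, not the paper's frames.

```python
import numpy as np

def overlap_scores(gt, seg):
    """Precision, recall and Jaccard index between two binary masks,
    matching the colour code: red = TP (gt & seg), yellow = FN (gt only),
    blue = FP (seg only)."""
    gt, seg = np.asarray(gt, bool), np.asarray(seg, bool)
    tp = np.sum(gt & seg)
    fp = np.sum(~gt & seg)
    fn = np.sum(gt & ~seg)
    precision = tp / (tp + fp)
    recall = tp / (tp + fn)
    jaccard = tp / (tp + fp + fn)
    return precision, recall, jaccard

# Toy example: a segmentation shifted one pixel against the ground truth.
gt = np.zeros((10, 10), bool); gt[2:8, 2:8] = True      # 36 pixels
seg = np.zeros((10, 10), bool); seg[3:9, 3:9] = True    # 36 pixels
p, r, j = overlap_scores(gt, seg)
print(round(p, 3), round(r, 3), round(j, 3))
```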

## 4. Discussion

The anglegram allows junctions to be classified according to whether their inner angles are obtuse (**bends**) or acute (**corners**).

## 5. Conclusions

## Acknowledgments

## Author Contributions

## Conflicts of Interest

## Abbreviations

| Abbreviation | Meaning |
| --- | --- |
| EF | Edge Following |
| JS | Junction Slicing |
| MIP | Maximum Intensity Projection |
| mIP | Minimum Intensity Projection |
| ROI | Region of interest |
| SOM | Self-organising maps |

## References

1. Martinez, F.O.; Sica, A.; Mantovani, A.; Locati, M. Macrophage activation and polarization. Front. Biosci. **2008**, 13, 453–461.
2. Wood, W.; Martin, P. Macrophage Functions in Tissue Patterning and Disease: New Insights from the Fly. Dev. Cell **2017**, 40, 221–233.
3. Pocha, S.M.; Montell, D.J. Cellular and molecular mechanisms of single and collective cell migrations in Drosophila: Themes and variations. Ann. Rev. Genet. **2014**, 48, 295–318.
4. Stramer, B.; Moreira, S.; Millard, T.; Evans, I.; Huang, C.Y.; Sabet, O.; Milner, M.; Dunn, G.; Martin, P.; Wood, W. Clasp-mediated microtubule bundling regulates persistent motility and contact repulsion in Drosophila macrophages in vivo. J. Cell Biol. **2010**, 189, 681–689.
5. Maška, M.; Ulman, V.; Svoboda, D.; Matula, P.; Matula, P.; Ederra, C.; Urbiola, A.; España, T.; Venkatesan, S.; Balak, D.M.W.; et al. A benchmark for comparison of cell tracking algorithms. Bioinformatics **2014**, 30, 1609–1617.
6. Ulman, V.; Maška, M.; Magnusson, K.E.G.; Ronneberger, O.; Haubold, C.; Harder, N.; Matula, P.; Matula, P.; Svoboda, D.; Radojevic, M.; et al. An objective comparison of cell-tracking algorithms. Nat. Methods **2017**, 14, 1141–1152.
7. Henry, K.M.; Pase, L.; Ramos-Lopez, C.F.; Lieschke, G.J.; Renshaw, S.A.; Reyes-Aldasoro, C.C. PhagoSight: An Open-Source MATLAB^{®} Package for the Analysis of Fluorescent Neutrophil and Macrophage Migration in a Zebrafish Model. PLoS ONE **2013**, 8, e72636.
8. Dufour, A.; Shinin, V.; Tajbakhsh, S.; Guillén-Aghion, N.; Olivo-Marin, J.C.; Zimmer, C. Segmenting and tracking fluorescent cells in dynamic 3-D microscopy with coupled active surfaces. IEEE Trans. Image Process. **2005**, 14, 1396–1410.
9. Chan, T.F.; Vese, L.A. Active contours without edges. IEEE Trans. Image Process. **2001**, 10, 266–277.
10. Plissiti, M.E.; Nikou, C. Overlapping Cell Nuclei Segmentation Using a Spatially Adaptive Active Physical Model. IEEE Trans. Image Process. **2012**, 21, 4568–4580.
11. Lu, Z.; Carneiro, G.; Bradley, A.P. An Improved Joint Optimization of Multiple Level Set Functions for the Segmentation of Overlapping Cervical Cells. IEEE Trans. Image Process. **2015**, 24, 1261–1272.
12. Reyes-Aldasoro, C.C.; Aldeco, A.L. Image segmentation and compression using neural networks. In Advances in Artificial Perception and Robotics; CIMAT: Guanajuato, Mexico, 2000; pp. 23–25.
13. Hannah, I.; Patel, D.; Davies, R. The use of variance and entropic thresholding methods for image segmentation. Pattern Recognit. **1995**, 28, 1135–1143.
14. Caselles, V.; Kimmel, R.; Sapiro, G. Geodesic Active Contours. Int. J. Comput. Vis. **1997**, 22, 61–79.
15. Harris, C.; Stephens, M. A Combined Corner and Edge Detector. In Proceedings of the 4th Alvey Vision Conference, University of Manchester, Manchester, UK, 31 August–2 September 1988; pp. 147–151.
16. Lindeberg, T. Junction detection with automatic selection of detection scales and localization scales. In Proceedings of the 1st International Conference on Image Processing, Austin, TX, USA, 13–16 November 1994; Volume 1, pp. 924–928.
17. Solís-Lemus, J.A.; Stramer, B.; Slabaugh, G.; Reyes-Aldasoro, C.C. Segmentation of Overlapping Macrophages Using Anglegram Analysis. In Communications in Computer and Information Science, Proceedings of the Medical Image Understanding and Analysis, Edinburgh, UK, 11–13 July 2017; Springer: Cham, Switzerland, 2017; pp. 792–803.
18. Okabe, A.; Boots, B.; Sugihara, K.; Chiu, S.N.; Kendall, D.G. Algorithms for Computing Voronoi Diagrams. In Spatial Tessellations; John Wiley & Sons, Inc.: Hoboken, NJ, USA, 2000; pp. 229–290.
19. Canny, J. A Computational Approach to Edge Detection. IEEE Trans. Pattern Anal. Mach. Intell. **1986**, 8, 679–698.
20. Kohonen, T. The self-organizing map. Neurocomputing **1998**, 21, 1–6.
21. Jaccard, P. Étude comparative de la distribution florale dans une portion des Alpes et des Jura. Bull. Soc. Vaud. Sci. Nat. **1901**, 37, 547–579.
22. Fawcett, T. An Introduction to ROC Analysis. Pattern Recognit. Lett. **2006**, 27, 861–874.
23. Hollander, M.; Wolfe, D.A.; Chicken, E. The One-Sample Location Problem. In Nonparametric Statistical Methods; John Wiley & Sons, Inc.: Hoboken, NJ, USA, 2015; pp. 39–114.
24. Cootes, T.F.; Taylor, C.J.; Cooper, D.H.; Graham, J. Active Shape Models-Their Training and Application. Comput. Vis. Image Underst. **1995**, 61, 38–59.
25. Gooya, A.; Lekadir, K.; Castro-Mateos, I.; Pozo, J.M.; Frangi, A.F. Mixture of Probabilistic Principal Component Analyzers for Shapes from Point Sets. IEEE Trans. Pattern Anal. Mach. Intell. **2017**.

Sample Availability: Samples of the data and the code used to compute the anglegram algorithm are available at https://github.com/alonsoJASL/matlab.anglegram, or upon request to the corresponding author. The code to generate ground truth is available at https://github.com/alonsoJASL/matlab.manualSegmentation.

**Figure 1.** Two representative time frames displaying examples of cell shapes and overlapping. (**a**) Full frame with (red) squares highlighting the cells that display the aforementioned shapes; (**b**) Detail of each cell; (**c**) Full frame with (red) squares highlighting all regions where instances of overlapping cells (clumps) are shown and labelled for easy reference; (**d**) Detail of CLUMP 2, present in (**c**). Bars: 10 μm. (**c**,**d**) are reproduced with permission from [17].

**Figure 2.**Example of the ground truth at a representative time frame. The ground truth for both red (nuclei) and green (microtubules) channels is shown in coloured lines. The frame is shown in grey scale to allow for a better visualisation of the lines in the ground truth.

**Figure 3.**Overview of the range of paired ellipses investigated. The pairs presented on this image represent a small sample of the ellipses that were tested by the method presented. The overlapped region can be seen in white and the areas that are not overlapping are shown in grey. The boundary of the central ellipse ${\mathcal{E}}_{0}$ is highlighted in cyan while the second ellipse’s boundary is presented in red.

**Figure 4.**Synthetic generation of random basic shapes. Per shape, 200 cases were generated. The control points are shown in blue (·). The mean shapes are presented in magenta (−); and the mean control points are represented in black (⋄).

**Figure 5.** Graphical representation of the calculation of the inner point angle of point ${\mathbf{p}}_{i}$ at separation j. (**a**) Representation of the inner angle of point ${\mathbf{p}}_{i}$ at separation j. Notice that the points in the boundary are taken in clockwise order; (**b**) Representation of the translation vectors ${\mathbf{v}}_{+},{\mathbf{v}}_{-}$. Full explanation in text.

**Figure 6.** Explanation of the inner angle of a point in the construction of the anglegram. The diagram shows a representation of nine arbitrary entries ${\theta}_{i,j}$ of the anglegram matrix. Each entry corresponds to an inner point angle at a specific separation. In the diagram, as in the matrix, the rows (i) correspond to a single point alongside the boundary (red ⋄) that start at a specific point (marked ○); the columns (j) correspond to the separation from the point i⋄, from which the angle is taken. Each corresponding entry, ${\theta}_{i,j}$, of the anglegram matrix $\mathrm{\Theta}$ is marked with a green arrow; furthermore, each angle is shaded to match the colour map used in the anglegram in Figure 7.

**Figure 7.** Junction detection on overlapping objects through the maximum intensity projection of the anglegram matrix. (**a**–**c**) Representation of the inner point angle calculation and generation of the anglegram matrix. (**a**) A synthetic clump with its boundary outlined (blue - -), where a point (blue ○) in the boundary has various inner point angles per separation j; all the inner point angles for the highlighted point are displayed in (**b**); (**c**) shows the anglegram matrix, and the definition of ${\widehat{\theta}}_{max}$ is represented in (**d**) along the boundary points. Detected junctions are shown with ⋄ markers (magenta). Notice the two horizontal lines representing $mean({\widehat{\theta}}_{max})$ and $mean({\widehat{\theta}}_{max})+0.75\,std({\widehat{\theta}}_{max})$. Reproduced with permission from [17].

**Figure 8.** Illustration of all the methods developed and the workflow to obtain results. (**a**) Detail of CLUMP 2 in the original frame. Clumps are detected and the boundary is extracted. With the boundary information, the anglegram is calculated and the junctions are detected (**b**). On the second row, a diagram of the methods is presented. From left to right: (**c**) Voronoi partition, (**d**) Junction Slicing (JS), (**e**) Edge Following (EF) and (**f**) SOM fitting. In (**g**), the outputs from each method for both cells within the detected clump are shown. Reproduced with permission from [17].

**Figure 9.** Junctions (bends) detected for varying angles and separation distances. Five **rows** show angles ranging from 10 to 90 degrees and eight **columns** show different separation distances from 0 to 160 pixels. Images in grey are cases where there is no overlap. The boundary of the clumps is shown in cyan (- -), with the first point in the boundary marked (⋄). The junctions detected by the method are marked in magenta ($\ast $).

**Figure 10.**Pointiness assessment of the drop, bidrop and tridrop shapes, see full explanation in text. For each shape presented, three instances of the shape are shown with varying pointiness values; the relationship of the anglegram and the pointiness is shown. The columns are arranged in triads corresponding to each of the basic shapes. Notice that, in all cases, the difference between the maximum and minimum values of the mIP (solid lines in third row) grows proportionally to the pointiness level.

**Figure 11.** Qualitative comparison of junction detection via the anglegram (magenta ⋄) versus the Harris corner detector (green +). The strongest corners from the Harris detector per clump are displayed. (**a**) Only CLUMP 1 has a missing junction (cyan ○); note how difficult the detection of that junction would be. Reproduced with permission from [17]. (**b**) Each basic shape in the data is represented from a segmented frame. Refer to Section 2.3 for a full explanation of the corner detection algorithm.

**Figure 12.** Qualitative comparison of the segmentation technique against the ground truth on three different time frames in the dataset. The columns from left to right present the original image, the manual segmentation (GT), the result of the segmentation described in Section 2.4 and finally the comparison between both binary images. Regarding the colours in the final column: **black** is the background, **yellow** represents the false negatives, **blue** the false positives and **red** the true positives.

**Figure 13.** (**a**) Comparison of the Jaccard index for each object detected, whether it is a clump or not, at each of the ten frames with available ground truth. Two frames are highlighted (45 and 54) and arrows point at values in the ribbon. (**b**) Depiction of the cell in frame 45 that achieved a high Jaccard index in the top row (red arrow). (**c**) Depiction of the clump detected in frame 54 and its comparison with the ground truth. The black dotted arrow in the top row shows the Jaccard index value of the clump shown. Regarding the colours in the segmentation comparison: **black** is the background, **yellow** represents the false negatives, **blue** the false positives and **red** the true positives.

**Figure 14.** Qualitative comparison of different segmentation methods in one frame. The segmentation results for (**a**) the Voronoi method, (**b**) Junction Slicing (JS), (**c**) Edge Following (EF) and (**d**) SOM fitting are shown. Top and bottom rows represent the results for CLUMP 2 and CLUMP 3, respectively. Reproduced with permission from [17].

**Figure 15.** Comparison of Precision, Recall and Jaccard Index for all methods of segmentation of overlapping in clumps 2 and 3. The horizontal axis corresponds to the box plots from the different methods and their summarised performance in the metrics computed. Three groups corresponding to Precision, Recall and Jaccard Index contain four box plots each, which, from left to right, correspond to the Voronoi, JS, EF and SOM methods. Table 1 summarises the information in this image. Reproduced with permission from [17]. (**a**) CLUMP 2, y-axis ranges over $(0.82,1)$; (**b**) CLUMP 3, y-axis ranges over $(0.7,1)$.

**Table 1.** Comparison of mean values of Precision, Recall and Jaccard Index for clumps 2 and 3 over 10 frames. This table summarises the results in Figure 15. Highest results per column are highlighted in bold.

| Method | CLUMP 2 Precision | Recall | Jaccard Index | CLUMP 3 Precision | Recall | Jaccard Index |
| --- | --- | --- | --- | --- | --- | --- |
| Voronoi | 0.906 | 0.925 | 0.843 | 0.872 | 0.868 | 0.771 |
| JS | **0.970** | 0.953 | 0.926 | **0.974** | 0.948 | **0.925** |
| EF | 0.964 | **0.983** | **0.948** | 0.938 | **0.950** | 0.896 |
| SOM | 0.965 | 0.951 | 0.919 | 0.973 | 0.948 | 0.923 |

**Table 2.** Statistical analysis of the results presented in Table 1. The Wilcoxon Signed Rank test [23] was implemented to compare the methods (Voronoi, JS, EF and SOM) pairwise on each measurement. The table presents the p-values of the paired test for each of the pairs (first column). Tests where the null hypothesis could not be rejected are highlighted in bold.

CLUMP 2 | CLUMP 3 | ||||||
---|---|---|---|---|---|---|---|

Precision | Recall | Jaccard Index | Precision | Recall | Jaccard Index | ||

Voronoi vs. | JS | $p=0.002$ | $p=0.002$ | $p=0.002$ | $p=0.004$ | $p=0.004$ | $p=0.004$ |

EF | $p=0.004$ | $p=0.004$ | $p=0.004$ | $p=0.008$ | $p=0.008$ | $p=0.008$ | |

SOM | $p=0.002$ | $p=0.037$ | $p=0.002$ | $p=0.004$ | $p=0.004$ | $p=0.004$ | |

JS vs. | EF | $p=0.004$ | $p=0.004$ | $p=0.004$ | $p=0.008$ | $p>0.05$ | $p>0.05$ |

SOM | $p>0.05$ | $p>0.05$ | $p>0.05$ | $p>0.05$ | $p>0.05$ | $p>0.05$ | |

EF vs. | SOM | $p>0.05$ | $p=0.004$ | $p=0.008$ | $p=0.016$ | $p>0.05$ | $p>0.05$ |
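The paired tests in Table 2 can be reproduced in spirit with `scipy.stats.wilcoxon` applied to per-frame scores of two methods. The scores below are synthetic placeholders, not the paper's measurements.

```python
import numpy as np
from scipy.stats import wilcoxon

# Synthetic paired per-frame scores for two hypothetical methods over
# ten frames (placeholder numbers, not the paper's data).
rng = np.random.default_rng(1)
method_a = 0.92 + 0.01 * rng.standard_normal(10)
method_b = method_a + rng.uniform(0.01, 0.03, 10)   # consistently higher

# Two-sided Wilcoxon Signed Rank test on the paired differences;
# a consistent shift across all ten frames rejects the null hypothesis.
stat, p = wilcoxon(method_a, method_b)
print(p < 0.05)  # → True
```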

© 2017 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (http://creativecommons.org/licenses/by/4.0/).

## Share and Cite

**MDPI and ACS Style**

Solís-Lemus, J.A.; Stramer, B.; Slabaugh, G.; Reyes-Aldasoro, C.C.
Segmentation and Shape Analysis of Macrophages Using *Anglegram* Analysis. *J. Imaging* **2018**, *4*, 2.
https://doi.org/10.3390/jimaging4010002
