A Mixed Method with Effective Color Reduction

This article presents a color quantization technique that combines two previously proposed approaches: the binary splitting method and the iterative ant-tree for color quantization method. The resulting algorithm can obtain good quality images with low time consumption. In addition, the iterative nature of the proposed method allows the quality of the quantized image to improve as the iterations progress, while also allowing a good initial image to be obtained quickly. The proposed method was compared to 13 other color quantization techniques, and the results showed that it could generate better quantized images than most of the techniques assessed. The statistical significance of the improvement obtained using the new method is confirmed by applying a statistical test to the results of all the methods compared.


Introduction
Color quantization is a process that attempts to reduce the number of colors of an image. Let us consider a color image including n pixels. When the image is represented using the RGB color space, each pixel p i of this image, with 1 ≤ i ≤ n, is represented by three integer values between 0 and 255 that indicate the amount of red, green and blue that define the pixel, p i = (R i , G i , B i ). Therefore, the palette used to represent this image can include 256^3 different colors. The color quantization problem consists of defining a new palette which includes fewer colors than the palette used to represent the original image. Once this palette containing q colors has been defined, it is used to represent the new image, called the quantized image. The quantized palette should be defined in such a way that it allows a new image, as similar to the original as possible, to be obtained.
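As an illustration of this definition, once a reduced palette is available, each pixel of the original image is represented by its most similar palette color. The sketch below (the function name and the use of squared Euclidean distance are assumptions for illustration, not part of the article) shows this mapping step:

```python
import numpy as np

def quantize(image, palette):
    """Map each RGB pixel to the nearest color of a reduced palette.

    image:   (n, 3) array of RGB pixels p_i
    palette: (q, 3) array of palette colors
    Returns the quantized pixels and the palette index of each pixel.
    """
    # Squared Euclidean distance from every pixel to every palette color.
    diff = image[:, None, :].astype(float) - palette[None, :, :].astype(float)
    dist = (diff ** 2).sum(axis=2)          # shape (n, q)
    idx = dist.argmin(axis=1)               # nearest palette entry per pixel
    return palette[idx], idx

pixels = np.array([[250, 0, 0], [5, 5, 250], [200, 30, 30]])
pal = np.array([[255, 0, 0], [0, 0, 255]])  # a q = 2 palette
quantized, idx = quantize(pixels, pal)
# The three original colors collapse onto the two palette colors.
```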
Color quantization is not only a problem in itself, but it is also related to other operations applied to images, such as image segmentation [1][2][3][4][5], content-based image retrieval [6][7][8], texture analysis [9][10][11] and image watermarking [12,13]. When the number of different colors of an image is reduced, the size of the file that contains the image is also reduced. This feature not only reduces the storage space required for the image but also accelerates the speed of transmission of the image. Therefore, although current devices can display images with many colors, reducing the colors of an image is still useful for solving other problems. Several practical scientific and industrial applications that involve color quantization are medical image analysis [14][15][16], nondestructive inspection of food quality [17,18], biometric recognition [19,20] or surveillance systems [21,22].
The color quantization problem is complex, since the selection of the best colors to define the quantized palette is an NP-complete problem [23]. For this reason, several methods have been proposed to solve it. With respect to the strategy applied for finding a solution, these methods can be classified into two groups: splitting methods and clustering-based methods.
The splitting methods operate on the color cube to define the quantized palette. This cube is progressively divided into smaller boxes until q boxes are obtained. At the end of this process, each box defines one color in the quantized palette, and the quantized image is generated by representing the pixels of each box with the color that the box defines.


The Binary Splitting Method
This method builds a binary tree to reduce the colors of an image [27]. A binary tree is a data structure where each node can include a maximum of two children. The leaves of the tree created by the algorithm define the quantized palette and identify the subset of pixels of the original image that are represented by the same color in the quantized image.
Each node of this tree represents a subset of pixels of the original image and stores the information used to perform the successive operations. The node j represents the subset of pixels C j . The color of this node is computed as color j = m j /N j , where m j is the sum of the colors of the pixels in C j (Equation (1)) and N j is the number of elements of said set (Equation (2)). In addition, there is a 3 × 3 matrix associated with the node, denoted as R j , which stores the sum of the products of each pixel p i ∈ C j and its transpose (Equation (3)).
These three elements allow the covariance of C j to be computed using Equation (4).
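A sketch of these per-node statistics follows. Since Equation (4) is not reproduced above, the covariance expression R_j − m_j m_j^T / N_j used below is an assumption based on the standard binary-splitting formulation:

```python
import numpy as np

def node_statistics(pixels):
    """Statistics stored in a BS tree node for a pixel set C_j."""
    P = np.asarray(pixels, dtype=float)     # shape (N_j, 3)
    m = P.sum(axis=0)                       # Equation (1): sum of colors
    N = P.shape[0]                          # Equation (2): number of pixels
    R = P.T @ P                             # Equation (3): sum of p_i p_i^T
    # Covariance-like matrix of C_j (assumed form of Equation (4)).
    cov = R - np.outer(m, m) / N
    return m, N, R, cov
```

These three quantities are kept per node precisely so that the covariance can be obtained without revisiting the pixels.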
Algorithm 1 shows the main steps of BS. At the beginning of the operations, the tree only includes the root node, denoted C 1 , which represents the entire set of pixels of the original image. Once the variables R 1 , m 1 and N 1 , which are associated with this node, have been computed, the algorithm applies q − 1 iterations to define the q colors of the quantized palette.
Each iteration selects a leaf j, splits the set of pixels C j into two subsets and then creates two children for the node j, which represent each of the two subsets.
To determine the leaf that will be processed during the current iteration, the eigenvalues of the covariance matrix of the current leaves are considered. Based on this information, the algorithm selects the leaf j with the largest eigenvalue, denoted λ j .
Once the leaf j has been selected, the two children of this node are created and denoted as jA and jB. Then, the set C j is divided into two subsets, C jA and C jB , with a plane that passes through the mean point and is perpendicular to the direction in which the variation of C j is the greatest. Such a plane is perpendicular to the eigenvector E j corresponding to the principal eigenvalue λ j of the covariance matrix of C j ; each pixel of C j is assigned to C jA or C jB according to the side of this plane on which it lies. To complete the definition of the new nodes, the three values associated with each one are computed. After applying Equations (1)-(3) to calculate the values for node jA, the values for node jB can be easily calculated using Equations (5)-(7).
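A minimal sketch of this splitting step, assuming the covariance is computed around the mean and using numpy's symmetric eigendecomposition (the function name is illustrative):

```python
import numpy as np

def split_node(pixels):
    """Split a pixel set with a plane through its mean, perpendicular
    to the principal eigenvector of its covariance matrix."""
    P = np.asarray(pixels, dtype=float)
    mean = P.mean(axis=0)
    cov = (P - mean).T @ (P - mean)
    eigvals, eigvecs = np.linalg.eigh(cov)  # eigenvalues in ascending order
    e = eigvecs[:, -1]                      # principal eigenvector E_j
    side = (P - mean) @ e                   # signed distance to the plane
    return P[side <= 0], P[side > 0]
```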
When the iterations conclude, the tree has q leaves that define the quantized palette. Then, the quantized image is obtained by representing the pixels associated with each leaf using the color of such leaf.

Algorithm 1 The BS algorithm.
1: Compute m_1, N_1 and R_1 for the root node
2: for t = 1 to q − 1 do
3: Compute the covariance matrix and its largest eigenvalue λ_j for each leaf j
4: Find the leaf j with the largest λ_j
5: Create the two children of node j
6: Split the pixels of C_j into subsets C_jA and C_jB
7: Compute m_jA, N_jA and R_jA by Equations (1)-(3)
8: Compute m_jB, N_jB and R_jB by Equations (5)-(7)
9: end for


The Ant-Tree for Color Quantization Method
ATCQ is a clustering-based method that is used to reduce the colors of an image [39]. This method represents the pixels of the original image by means of ants and subsequently builds a tree using them based on the similarity among the pixels represented by the ants. The ant h i represents the pixel p i of the image, with 1 ≤ i ≤ n. At the beginning of the algorithm, all the ants are on the root of the tree, denoted a 0 . The operations of the algorithm allow all the ants to connect to the structure, becoming new nodes of the tree. The number of children of a 0 , denoted as q, is equal to 0 before starting the operations and can increase up to a limit defined by Q max . These children, denoted {S 1 , ..., S q }, define the root nodes of q subtrees of ants. The node S j defines a color of the quantized palette, computed as color j = sum j /nc j , where sum j is the sum of the colors of the ants connected to the subtree j (Equation (8)) and nc j is the number of said ants. When the algorithm concludes, the color of the subtree j is used to represent in the quantized image the pixels of the original image associated with the ants of this subtree.
The ATCQ operations define an iterative process to connect all the ants to the tree (Algorithm 2). Each iteration takes an ant h i that is not connected to the structure and operates according to the current position of this ant, denoted a pos . If the ant is on the node a 0 , the number of children of this node determines the next operation (Algorithm 3). If a 0 has no children (q = 0), the first one is created and the ant is connected to it. Otherwise, the selected ant can move to an existing subtree or to a new subtree. To determine the destination of the ant, the node S c with the color most similar to h i is identified. Let d ic denote this similarity (d ic = Sim(p i , color c )). In order to make a decision, this value is compared to a threshold computed by Equation (9), where α is a parameter in (0, 1] and e c is the sum of the similarities between each ant connected to the subtree c and the value of color c when that ant was included in the subtree (Equation (10)). If d ic < T c , a new child of a 0 is created and h i is connected to it. If d ic ≥ T c , the ant is included in the subtree c. This last operation is also performed when a new child of a 0 should be created but this is impossible because the node already has Q max children.
Algorithm 3 Root case operations (h_i)-ATCQ.
1: if q = 0 then
2: Create a new child of a_0 and connect h_i
3: else
4: Select the node S_c with the color most similar to h_i
5: if (d_ic < T_c) and (q < Q_max) then
6: Create a new child of a_0 and connect h_i
7: else
8: Include h_i in the subtree c
9: end if
10: end if

The similarity between two colors x and y used in [39] is computed as a function of the Euclidean distance, E(x, y), between both colors: Sim(x, y) = 1/(1 + E(x, y)).
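The similarity function Sim(x, y) = 1/(1 + E(x, y)) used in [39] is straightforward to implement (the function name is illustrative):

```python
import math

def sim(x, y):
    """Similarity used in [39]: Sim(x, y) = 1 / (1 + E(x, y)),
    where E is the Euclidean distance between colors x and y."""
    return 1.0 / (1.0 + math.dist(x, y))

# Identical colors have the maximal similarity, 1; the similarity
# decays toward 0 as the Euclidean distance grows.
```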
Before creating a new child of a 0 , q is increased by one. Then, the three variables associated with the new node S q are initialized: nc q = 1, e q = 0 and sum q = p i .
When it is necessary to update an existing subtree c, first the ant h i is placed on S c and then the values associated with S c are updated: nc c = nc c + 1, e c = e c + d ic and sum c = sum c + p i .
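The bookkeeping of these two operations can be sketched as follows. The dictionary representation and function names are assumptions of the sketch, and the q < Q_max check is omitted for brevity:

```python
def create_subtree(tree, pixel):
    """Create a new child of a_0 for ant h_i: q increases by one and
    the variables of the new node S_q are initialized."""
    tree["q"] += 1
    tree["nodes"].append({"nc": 1, "e": 0.0, "sum": list(pixel)})

def update_subtree(node, pixel, d_ic):
    """Include ant h_i in an existing subtree c: update nc_c, e_c and
    sum_c with the new ant's data."""
    node["nc"] += 1
    node["e"] += d_ic
    node["sum"] = [s + p for s, p in zip(node["sum"], pixel)]
    # The subtree color is always sum_c / nc_c, so it shifts toward
    # the newly connected ant.
```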
When the selected ant is not on a 0 , the operations to be performed depend on the number of children of a pos (Algorithm 4). There is a limit to the number of children of the nodes of the tree: Q max for the root node and L max for the others. Both values can be different because Q max limits the size of the quantized palette. If a pos has no children, h i becomes the first one. If a pos has two children and the second one has never been disconnected from a pos , first this child and all its descendants are disconnected from the tree and moved to a 0 ; then, h i connects as the new second child. In other cases, a child of a pos is selected, denoted a + . If a + is not sufficiently similar to h i (Sim(p i , a + ) < T c , where c is the subtree that includes the node a + ) and a pos admits at least one more child, h i connects to a pos ; otherwise, h i moves to a + .
When the iterations of the algorithm conclude, the nodes of the second level of the tree define a quantized palette that includes q colors, {color 1 , ..., color q }. Then, the quantized image is generated by replacing the color of all the ants included in the subtree c with the color of the node S c .

Algorithm 4 Ant case operations (h_i)-ATCQ.
1: if no children connected to a_pos then
2: Connect h_i to a_pos
3: else if two ants connected to a_pos and the second one never disconnected then
4: Move to a_0 the subtree with root in the second child of a_pos
5: Connect h_i to a_pos
6: else
7: Select a node connected to a_pos, a_+
8: if (Sim(p_i, a_+) < T_c) and (a_pos has less than L_max children) then
9: Connect h_i to a_pos
10: else
11: Move h_i to a_+
12: end if
13: end if


The BS+ATCQ Method
The solution described in [46] applies the ATCQ method to the solution obtained by BS (Algorithm 5). BS defines an initial quantized palette and a partition of the pixels of the original image. This information is used to define an initial tree of ants, and the ATCQ operations improve the quantized palette.
First, BS creates a binary tree with q leaves, denoted {l 1 , ..., l q }. The leaf l k represents the subset C l k of pixels of the original image and defines a color of the quantized palette, color l k = m l k /N l k , where m l k and N l k are computed by Equations (1) and (2), respectively.
Before applying ATCQ, the leaves of the binary tree are used to define q nodes in the second level of the tree of ants, {S 1 , ..., S q }. The leaf l k is used to initialize the three variables associated with the node S k : nc k and sum k take the value of N l k and m l k , respectively, while e k is computed by Equation (11), which computes the similarity between the color of the leaf l k and each pixel p i in the subset C l k .
At this point, the tree of ants includes the root node and the maximum number of children allowed for that node (q = Q max ). Therefore, this tree represents an initial quantized palette with q colors. Then, the ATCQ operations are applied in order to connect all the ants to the tree. When these operations conclude, the quantized image is generated in the same way described for ATCQ.

Algorithm 5 Binary splitting + ATCQ algorithm.
1: Apply BS
2: for each leaf l_k of the binary tree do
3: Create node S_k in the second level of the ATCQ tree
4: Set nc_k = N_lk and sum_k = m_lk
5: Compute e_k by Equation (11)
6: end for
7: Apply ATCQ
8: Generate the quantized image

The computational results reported in [46] show that the image generated by BS+ATCQ is always better than the result of BS or ATCQ applied independently. In addition, this method consumes less time than other methods applied to the same problem.
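The initialization of the ant tree from the BS leaves might look like the sketch below. The leaf representation is an assumption, and since Equation (11) is not reproduced here, e_k is taken as the sum of similarities between the leaf color and the pixels of C_lk, as described in the text:

```python
def init_from_bs(leaves, sim):
    """Initialize the second level of the ant tree from the BS leaves.
    Each leaf l_k is assumed to carry its pixel subset, m_lk and N_lk."""
    nodes = []
    for leaf in leaves:
        color = [m / leaf["N"] for m in leaf["m"]]   # color_lk = m_lk / N_lk
        # e_k: similarity between the leaf color and each pixel of C_lk
        # (assumed form of Equation (11)).
        e = sum(sim(p, color) for p in leaf["pixels"])
        nodes.append({"nc": leaf["N"], "sum": list(leaf["m"]), "e": e})
    return nodes
```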

The ITATCQ Method
The ITATCQ algorithm is based on the ATCQ algorithm [40]. This method performs the operations of ATCQ during a predefined number of iterations IT (Algorithm 6).
Initially, ATCQ is applied and the tree of ants is defined. After this, all the ants are disconnected from the tree and moved back to the support. Then, the operations of ATCQ are applied again to reconnect the ants to the structure. This process is repeated for a certain number of iterations and allows a palette to be defined that is updated as the iterations progress. As a result, the quality of the quantized image increases with each iteration.
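A minimal sketch of this iterative scheme, where atcq_pass stands in for one full connection pass of Algorithm 2 (both names are placeholders, not part of the original method's interface):

```python
def itatcq(pixels, iterations, atcq_pass):
    """Iterated ATCQ: run a connection pass, then disconnect all ants
    and reconnect them against the tree kept from the previous pass."""
    tree = None
    for t in range(iterations):
        # Reconnect all ants; the subtree colors survive between
        # iterations, so the palette is refined as iterations progress.
        tree = atcq_pass(pixels, tree)
    return tree
```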
When the iterative process ends, the quantized image and the quantized palette are defined in the same way described for the ATCQ algorithm.
When ATCQ was proposed, a variant was defined that eliminates the disconnection process in order to accelerate the method [39]. In this case, the method can be further accelerated by simplifying the structure of the tree, including all the ants in the third level. When a 3-level tree is used, the operations of Algorithm 4 reduce to line 2. The computational results reported in [39] showed that the images obtained by this variant are slightly worse than those generated when a multilevel tree is created.
ITATCQ applies the variant of ATCQ that builds a 3-level tree. On the other hand, ITATCQ does not use pixels sorted by average similarity, as was proposed when ATCQ was defined [39]. In this case, the pixels of the image are processed as they are read from the original image (from left to right and from top to bottom).
The results reported in [40] show that ITATCQ generates better images than ATCQ.

Algorithm 6 ITATCQ algorithm.
…
9: end while
10: if t < IT then
11: Move all the ants back to the node a_0
12: end if
13: end for

The BS+ITATCQ Method
This article proposes a modification of the BS+ATCQ method that consists of replacing ATCQ with ITATCQ. As a result, the new method can obtain a quantized image that improves as the iterations progress. In this way, the user can decide which feature of the algorithm is more important: the quality of the solution or the speed with which a result is obtained.
Therefore, the only step that is modified with respect to BS+ATCQ is the one that applies ATCQ (line 7 of Algorithm 5).
Although ITATCQ generates a 3-level tree to accelerate the process, the creation of a multilevel tree to define BS+ITATCQ has also been considered. Indeed, the results published in [40] show that ITATCQ generates better images if a multilevel tree is considered. This possibility is easy to implement, since it only requires replacing the operation in line 7 of Algorithm 6 with the operations corresponding to Algorithm 4.
It should be noted that when the original ITATCQ that creates a 3-level tree is combined with BS, the resulting method only depends on the Q max parameter. On the contrary, if the version of ITATCQ that generates a multilevel tree is considered, the resulting method also depends on the parameters α and L max .
Obviously, BS+ITATCQ will consume more time when a multilevel tree is considered. The objective of considering this tree is to analyze whether the improvement in the quality of the image compensates for a longer execution time.
There are many color quantization techniques that combine other existing methods [29,31,32,34,35,37,[41][42][43]46]. Two main characteristics are considered when defining a color quantization method: the quality of the quantized image and the time required to obtain such an image. However, it is difficult to achieve both objectives. When several methods are combined to define a new color quantization technique, the selected methods are intended to achieve these objectives, or at least one of them.
The solution proposed in this article combines a fast method with another method that can generate a quantized image that improves with the iterations. In this way, the resulting method can generate an image quickly, but such an image can be improved if the method is allowed to consume a little more time. Computational results show that the new method can obtain a quantized image of better quality than BS even when few ITATCQ iterations are executed. Therefore, a small increase in the execution time is required to obtain a better quality image than that generated by BS. On the other hand, if the quality of the final image is more important than the execution time, this method can also achieve this goal, since the quality of the final image improves with the number of iterations of ITATCQ. Computational results also show that the image generated by the new method is better than that generated by ITATCQ applied independently.
Although BS+ATCQ can generate good quantized images quickly, BS+ITATCQ can generate even better images. In addition, the second method allows choosing between speed and quality of the result.

Results and Discussion
This section includes the results of tests performed with the proposed method. It also includes the results of other methods described in the literature, in order to compare them to those obtained using the method proposed here. The execution time and the mean squared error (MSE) of the methods are compared. MSE is computed by Equation (12), where p i represents a pixel of the original image and p′ i represents the pixel in the same position in the quantized image.
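Since Equation (12) is not reproduced here, the sketch below assumes the usual convention of averaging the squared Euclidean distances between corresponding pixels over the n pixels of the image:

```python
def mse(original, quantized):
    """Mean squared error between the original pixels p_i and the
    pixels p'_i in the same positions of the quantized image
    (assumed form of Equation (12))."""
    n = len(original)
    return sum(
        sum((a - b) ** 2 for a, b in zip(p, p2))
        for p, p2 in zip(original, quantized)
    ) / n
```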
The test set includes 24 images used in several research articles, which can be downloaded from [48][49][50][51]. Figure 1 shows these images.
Four palette sizes were considered to reduce the colors of the images: {32, 64, 128, 256}. In addition, the six α values proposed in [39] were used to apply ITATCQ. The tests were executed on a PC running the Linux operating system, with 8 GB of RAM and an AMD Ryzen 5 1600 processor (3.2 GHz).
The BS+ITATCQ method was executed for 25 iterations. Table A1 shows the results for the variant that considers a 3-level tree (labeled BS+ITATCQ3), while Table A2 includes the results for the variant that considers a multilevel tree (labeled BS+ITATCQm). In order to evaluate the quality of this method compared to other faster methods, both tables show the results at iterations 5 and 25. The results of the tables correspond to 20 independent tests executed for each image, palette size and α value.

Table A2 clearly shows that the error of the image obtained for a given palette size is very similar for all the executions. It is also observed that the average error of BS+ITATCQm is smaller than the error obtained by BS+ITATCQ3 for each image and palette size in almost all the cases. The reduction in the error obtained by the multilevel variant at iteration 25 ranges in [0.01%, 5.87%], and only 8 cases exceed 2%. On the other hand, the execution time is longer for BS+ITATCQm, as can be clearly seen in Figure 2, which compares the average execution time for each palette size.

Therefore, BS+ITATCQm can generate better images than BS+ITATCQ3. Nevertheless, since the 3-level variant consumes less time, the rest of the section will consider the results of BS+ITATCQ3 to compare the proposed method to other color quantization methods.

Results of BS and ITATCQ Applied Independently
The results of BS+ITATCQ3 were compared to those of BS (Table A3) and ITATCQ (Table A4) applied independently. ITATCQ was executed for 25 iterations, and the same α values used to test BS+ITATCQ were considered.
It can be clearly observed that the quantized image is of a better quality for the combined method in all cases. Obviously, the combined method consumes more time than each method applied independently.
The error obtained by BS+ITATCQ3 is smaller than that of BS from iteration 5 onwards for all images and palette sizes. The percentage of error reduction obtained by the new method ranges in [9.9%, 30.1%] at iteration 5 and in [10.9%, 32.3%] at iteration 25. The largest differences obtained when comparing both methods correspond to the Caps image.
The average MSE obtained by ITATCQ at iteration 25 is also worse than the result obtained using BS+ITATCQ3 at iteration 5 for all images and palette sizes. In addition, when comparing the results of iteration 25, the percentage of error reduction obtained by BS+ITATCQ3 ranges between 17.8% and 93.9%. In this case, the most significant differences correspond to the Mandrill image.

Table A4 shows that the average error and the standard deviation of ITATCQ are very large for several images (especially for the Mandrill image). This is because the smallest α values considered do not generate palettes with Q max colors. The most extreme case occurs for the Mandrill image when α = 0.25, in which the quantized image only includes one color and the associated error is very large.
The study proposing the ITATCQ method reported that better results are obtained if the range of α values considered for the tests is adjusted according to the size of the quantized palette [40]. Since this was not applied here, a second comparison was performed that considers the minimum MSE obtained by ITATCQ. In this case, the results of BS+ITATCQ3 at iteration 5 are better than those of ITATCQ at iteration 25 in all cases except one, corresponding to the Snowman image with 64 colors, and in that case the difference is less than 0.3%. Therefore, ITATCQ errors are worse than those of the new method, even if the comparison considers the minimum error obtained by ITATCQ for each image and palette size.
The execution time of BS+ITATCQ3 is not the sum of the execution time of BS and ITATCQ applied independently, as might be expected. This is because the results reported for ITATCQ are the averages for the set of α values tested. The smallest α values generate palettes with less than Q max colors in some cases, so the execution time is reduced. Conversely, when ITATCQ is applied to the result of BS, the initial palette always includes the maximum number of colors allowed.

Results of BS+ATCQ
The next comparison performed considers the results of BS+ITATCQ3 and BS+ATCQ. The results included in Table A5 correspond to the execution of BS+ATCQ when a 3-level tree is considered, since this case is faster.
The image obtained using the new method is always better. When considering the results of BS+ITATCQ3 at iteration 5, the percentage of error reduction obtained by this method ranges in [0.7%, 27.6%]. On the other hand, this percentage ranges in [2.4%, 28.8%] when the results of iteration 25 are compared.

Comparison of Results
To complete the previous analysis, Figure 3 shows the average percentage of error reduction obtained by the new method at iteration 25 compared to BS, ITATCQ and BS+ATCQ. In this case, the average percentage is shown for the results grouped by palette size. This figure also shows two results for ITATCQ, labeled ITATCQa and ITATCQm, which use the average and the minimum MSE, respectively. It is clearly observed that all the methods are improved upon by BS+ITATCQ. The smallest improvement is obtained when BS+ATCQ is compared. On the other hand, the largest improvement is obtained when BS+ITATCQ is compared to the average error of ITATCQ. It is also observed that the difference with respect to ITATCQ is reduced when the error of the best image found by ITATCQ is compared instead of the average error.

Results of Other Methods
Tables A6-A9 show the MSE values obtained using other quantization methods. Four splitting methods have been applied: Wu's method (WU) [26], Octree (OC) [28], Median-cut (MC) [24] and the Variance-based method (VB) [25]. In addition, five clustering-based methods were also used: Neuquant (NQ) [33], K-means (KM), the Shuffled-frog leaping algorithm (SFLA) [44], the Artificial bee colony algorithm combined with ATCQ (ABC+ATCQ) [42] and the Firefly algorithm combined with ATCQ (ATCQ+FA) [43]. To complete the comparison, the WATCQ method described in [52] was considered, which applies ATCQ to the partition defined by Wu's method.

The results reported for each method correspond to 20 independent tests for each image and palette size. The results of the iterative methods (KM, ABC+ATCQ, ATCQ+FA and SFLA) correspond to iteration 25 and include the average MSE and the standard deviation. The best solution for each image appears in bold and the worst solution is underlined.

To reduce the length of the article, the execution time of each method for each image and palette size is included as Supplementary Materials. However, the article does include a figure that allows the execution time of the methods to be compared (Figure 4). This figure represents the average execution time of each method for each palette size. Two subfigures have been included to show more clearly the time consumed by the fastest methods.
The parameters used for the tests are the following: six food sources have been considered for ABC+ATCQ and six individuals (frogs, fireflies) for SFLA and ATCQ+FA. SFLA was executed considering 2 memeplexes, 4 improvement iterations of each memeplex and a sampling step equal to 10. The ATCQ algorithm combined with other algorithms was executed considering the variant that generates a 3-level tree, since it is faster. In addition, the α parameter of ATCQ was set to the same values previously described to execute ITATCQ. Finally, the sampling factor of NQ was set to 1, since this is the value that generates the best results for this method.
It should be noted that the results obtained using the ATCQ method are not included because the tests published in [46] have already shown that the BS+ATCQ method improves the images obtained by ATCQ.
The results included in the tables indicate that BS+ITATCQ3 at iteration 5 obtains a better MSE than WU, OC, MC, VB and NQ for all the images and palette sizes. In addition, Figure 5 shows the average percentage of error reduction obtained by BS+ITATCQ3 at iterations 5 and 25 compared to those methods. The summary shown in this figure confirms the superiority of BS+ITATCQ3.

When the results of BS+ITATCQ3 at iteration 25 are compared to those of the other iterative methods, the proposed solution obtains a worse error in some cases. The best result of the comparison is obtained when ATCQ+FA is considered, since the proposed method obtains a better error in all the cases. On the other hand, KM, SFLA and ABC+ATCQ are improved in 85, 77 and 31 cases out of 96, respectively. The percentage of error reduction obtained by BS+ITATCQ3 compared to these methods ranges in [1.15%, 51.29%], [0.01%, 42.56%] and [0.04%, 20.87%], respectively. On the contrary, when these three methods obtain better results than the proposed solution, the percentage of error reduction ranges in [0.11%, 5.49%] for KM, [0.08%, 6.23%] for SFLA and [0.13%, 9.07%] for ABC+ATCQ. It can be clearly observed that the upper limit of the intervals is always considerably greater for the cases improved by BS+ITATCQ3.
When the results of BS+ITATCQ3 at iteration 25 are compared to those of WATCQ, the first method obtains better results in 89 out of 96 cases. This number of cases is similar to that obtained when KM is compared to BS+ITATCQ, although the range is different: the improvement obtained by BS+ITATCQ compared to WATCQ is 10.42% at most. ABC+ATCQ is the method that obtains the best global errors when compared to the proposed method. Nevertheless, it is also the most time-consuming method among those considered. It should be noted that the improvement obtained by ABC+ATCQ is less than 5% in the majority of the cases (59 out of the 65 cases improved by this method).
The results reported for KM correspond to tests that take random initial centroids. However, a second group of tests was performed using the quantized palette generated by BS to define the initial centroids of K-means. The results of these tests appear in Tables A6-A9 labeled as BSKM. Although BSKM and BS+ITATCQ3 have some similarities, there are several differences between both solutions:
• Each iteration of K-means considers an initial set of centroids and uses these centroids to associate each pixel with a cluster. The values of the centroids do not change until the end of the iteration. On the other hand, the colors used by ITATCQ to associate an ant with a subtree are updated during each iteration of the algorithm. When a new ant is associated with a subtree, the color of that subtree is updated to include the color of the ant. Therefore, when the next ant is processed, the set of colors used to decide its destination is different from the set considered for the previous ant.

It can be observed that BSKM obtains better error values than BS+ITATCQ3 in many cases, although it requires much more time. BS+ITATCQ3 obtains a better error than BSKM in only 29 out of the 96 cases. For the remaining 67 cases, the reduction in the error obtained by BSKM is less than 4% for 56 cases, ranges between 4% and 6% for six cases and exceeds 7% for only two cases.

Figure 6 shows the average percentage of error reduction obtained by BS+ITATCQ3 when compared to WATCQ, ATCQ+FA, SFLA, KM, BSKM and ABC+ATCQ. The figure confirms that ABC+ATCQ and BSKM can obtain images with less error than BS+ITATCQ3, although the average difference is less than 4% for all the palettes.

Figure 7 compares the average MSE obtained by each method for the entire test set. The results are represented considering each palette independently and also considering the results for all the palettes as a whole.
The methods were ordered by decreasing error for the case that considers all the palettes as a whole, and this order was used in the five subfigures. The results considered for the proposed method are those of BS+ITATCQ3 at iteration 5. This figure confirms that the proposed method is among those that generate the best quality images of all the compared methods.

To complete the comparison, the execution time of the methods must be taken into consideration. The information shown in Figure 4 allows for this comparison. Figure 4a shows the values for the noniterative methods and also includes the results of BS+ITATCQ3 at iteration 5, in order to better interpret the values of this figure. On the other hand, Figure 4b shows the results of the iterative methods. In this second case, the results of BS+ITATCQ3 are those corresponding to iteration 25. To complete the information, Figure 4 also shows the average execution time of BS+ATCQ, BS and ITATCQ, discussed in the previous subsections.
It can be clearly seen that the methods represented in Figure 4a are faster than the method described in this article, although they always generate worse images than the proposed method, except WATCQ. As far as WATCQ is concerned, it only generates better quality images than BS+ITATCQ3 at iteration 5 for 15 cases out of 96, and this value reduces to 7 cases when comparing the results of BS+ITATCQ3 at iteration 25.
Regarding the methods represented in Figure 4b, ABC+ATCQ, ATCQ+FA, KM and BSKM clearly consume more time than BS+ITATCQ3. SFLA is also slower than BS+ITATCQ3, but by a smaller margin.
The two desirable features of a color quantization method are its speed and the quality of the images it generates. To compare both features at the same time, a scoring scheme was defined. First, the average MSE and execution time of each method were computed, considering all the images of the test set and all the palettes. Then, these average values were sorted from worst to best. Score 1 was assigned to the method with the worst value, and increasing integer scores were assigned to the remaining methods in sorted order. As a result, the methods were assigned a score between 1 and 15 for the average MSE, while the range was reduced to 1-14 for the average execution time, since the MC and WU methods have the same execution time. Once these two scores were defined, the average of both values was computed. Figure 8 shows the scores assigned for both criteria and their average. The methods are sorted by increasing average score. It can be clearly observed that BSKM and ABC+ATCQ obtain the best scores when MSE is considered, but the worst scores when the execution time is considered. Conversely, MC obtains the best score for execution time but the worst for MSE. When the average of both scores is considered, WATCQ obtains the best result and the proposed method the second best.
To sum up the results previously presented, the WU, OC, NQ, VB and MC methods are faster than BS+ITATCQ3, but they also generate worse images than BS+ITATCQ3. The WATCQ method is also faster than BS+ITATCQ3, but it generates worse images in most cases. In contrast, the iterative methods are slower than BS+ITATCQ3, but some of them can obtain better images. Among these methods, ABC+ATCQ and BSKM are the ones that obtain better quality images than the proposed method, but they are also the slowest.
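The scoring scheme described above can be sketched as follows. The method names and average values below are hypothetical placeholders; tied values (as with the execution times of MC and WU) share a single score, which is what shrinks the time range to 1-14 in the article.

```python
def rank_scores(values, higher_is_worse=True):
    """Assign score 1 to the worst value and increasing integers to better
    ones. Methods with equal values (ties) receive the same score."""
    # Sort the distinct values from worst to best.
    distinct = sorted(set(values), reverse=higher_is_worse)
    score_of = {v: i + 1 for i, v in enumerate(distinct)}
    return [score_of[v] for v in values]

# Hypothetical average MSE and execution times for three methods.
methods = ["A", "B", "C"]
avg_mse = [120.0, 95.0, 95.0]   # lower is better; B and C tie
avg_time = [2.0, 40.0, 15.0]    # lower is better

mse_scores = rank_scores(avg_mse)    # [1, 2, 2]
time_scores = rank_scores(avg_time)  # [3, 1, 2]
combined = [(m + t) / 2 for m, t in zip(mse_scores, time_scores)]
```

With these placeholder values, method A gets the best time score but the worst MSE score, mirroring the trade-off the article reports between MC and the clustering-based methods.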
For example, 25 iterations of BS+ITATCQ3 consume 15 s on average to generate an image with 256 colors. For this same case, BSKM and ABC+ATCQ consume 74 and 162 s, respectively. Therefore, although both methods can obtain better images in many cases, a lot of time is required to obtain a small reduction in the error of the quantized image. Alternatively, more iterations of BS+ITATCQ3 could be executed, generating a better quality image in less time than either method.

Statistical Significance of the Results
To conclude the comparison of the methods, the Wilcoxon test [53] was applied in order to determine the statistical significance of the improvement obtained by the proposed method. This test compares the paired results of two methods; its null hypothesis is that there is no significant difference between them.
To compare BS+ITATCQ3 to the iterative methods, the results of iteration 25 of all the methods were used. Nevertheless, when the noniterative methods were compared, the results of BS+ITATCQ3 used were those corresponding to iteration 5. The Wilcoxon test, with a significance level of 0.05, was applied to compare both the MSE values and the execution time. The results of the tests appear in Table 1. The p-value (probability value corresponding to the Z-value) is not included in the table because it is the same in all cases (<0.00001). This value indicates that the differences between each pair of methods compared are significant in all cases. In addition, the sums of the ranks included in the table allow us to determine which of the two methods compared is better. If the results corresponding to the MSE values are analyzed, it is observed that the proposed method is significantly better than all the other methods, except ABC+ATCQ and BSKM. On the other hand, when analyzing the results for the execution time, the noniterative methods are significantly faster than BS+ITATCQ3. As far as the iterative methods are concerned, the proposed method is significantly faster than all of them, except ITATCQ.
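The rank sums reported in Table 1 can be computed as in the following stdlib-only sketch. The sample values are hypothetical; a real analysis would also derive the Z-value and p-value from these sums (e.g., via `scipy.stats.wilcoxon`), which is omitted here.

```python
def wilcoxon_rank_sums(a, b):
    """Compute the Wilcoxon signed-rank sums W+ and W- for paired samples.
    Zero differences are dropped, and tied absolute differences receive
    the average of the ranks they span. The smaller of the two sums
    indicates which method tends to obtain the lower values."""
    diffs = [x - y for x, y in zip(a, b) if x != y]
    ordered = sorted(range(len(diffs)), key=lambda i: abs(diffs[i]))
    ranks = [0.0] * len(diffs)
    i = 0
    while i < len(ordered):
        # Find the group of entries tied on |difference|.
        j = i
        while j + 1 < len(ordered) and abs(diffs[ordered[j + 1]]) == abs(diffs[ordered[i]]):
            j += 1
        avg = (i + j) / 2 + 1  # average 1-based rank of the tied group
        for k in range(i, j + 1):
            ranks[ordered[k]] = avg
        i = j + 1
    w_plus = sum(r for d, r in zip(diffs, ranks) if d > 0)
    w_minus = sum(r for d, r in zip(diffs, ranks) if d < 0)
    return w_plus, w_minus
```

Applied to the MSE values of two methods over the test set, a markedly smaller rank sum on one side is what allows the table to show which of the two compared methods is significantly better.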
In summary, the results obtained from the Wilcoxon test confirm the conclusions previously presented in this section.

Conclusions
BS+ATCQ is a recently proposed color quantization method which combines two previous methods that solve the same problem: BS and ATCQ. BS is a well-known color quantization technique, while ATCQ is a more recent method that represents the pixels of the image as ants that build a tree structure.
This article proposes replacing ATCQ with ITATCQ, generating another color quantization method called BS+ITATCQ. In essence, ITATCQ applies the ATCQ operations iteratively, improving the resulting image as the iterations progress. When ITATCQ was proposed, it was shown that this algorithm could obtain better images than ATCQ. The present work has now shown that the method combining BS with ITATCQ also obtains better results than BS+ATCQ.
Computational results show that the new method generates better images than BS and ITATCQ applied independently, and also improves on the images generated by BS+ATCQ. In addition, the method was compared to 11 other color quantization techniques and obtained good results. BS+ITATCQ obtains better images than the splitting methods compared, although it consumes more time, as is usually the case when comparing splitting methods to clustering-based methods. On the other hand, it can obtain better images than most of the clustering-based methods analyzed, and with less time consumption.