Blind Source Separation Based on Double-Mutant Butterfly Optimization Algorithm

The conventional independent component analysis method for blind source separation suffers from low separation performance, and the basic butterfly optimization algorithm suffers from insufficient search capability. To solve these problems, an independent component analysis method based on the double-mutant butterfly optimization algorithm (DMBOA) is proposed in this paper. The proposed method employs the kurtosis of the signal as the objective function; by optimizing this objective function, blind source separation of the signals is realized. Building on the original butterfly optimization algorithm, DMBOA introduces a dynamic transformation probability and a population reconstruction mechanism to coordinate global and local search; when optimization stagnates, the population is reconstructed to increase diversity and avoid falling into local optima. A differential evolution operator is introduced to mutate the global position update, and a sine cosine operator is introduced to mutate the local position update, enhancing the local search capability of the algorithm. First, 12 classical benchmark test problems were selected to evaluate the effectiveness of DMBOA. The results reveal that DMBOA outperformed the other benchmark algorithms. Then, DMBOA was applied to the blind source separation of mixed image and speech signals. The simulation results show that DMBOA can successfully realize blind source separation of an observed signal and achieves higher separation performance than the compared algorithms.


Introduction
Blind source separation (BSS), sometimes referred to as blind signal processing, is capable of recovering a source signal from an observed signal in the absence of critical information, such as source and channel [1][2][3]. Due to its high adaptability and other advantages, BSS has been employed in a variety of research fields in recent years, such as image processing, medical evaluation, radar analysis, speech recognition, and machinery [4][5][6][7][8].
Independent component analysis (ICA) is an important BSS method [9]. However, the conventional natural gradient algorithm (NGA) is too reliant on gradient information [10], whereas the fast fixed-point algorithm for ICA (FastICA) is sensitive to the initial solution [11]. Thus, improving the speed and precision with which the separation matrix is solved and obtaining higher-quality separated signals have significant practical implications.
To address the aforementioned issues, swarm intelligence algorithms with solid coevolution mechanisms have gradually been applied to ICA. Preliminary research indicates that BSS based on a swarm intelligence algorithm outperforms traditional BSS methods in terms of separation performance [12]. Li et al. [13] utilized improved particle swarm optimization (PSO) for ICA; the disadvantage is the poor search capability of PSO in the later stages of iteration. Wang et al. [14] employed improved artificial bee colony (ABC) optimization as the optimization algorithm for ICA, although this algorithm is highly parameter dependent. Luo et al. [15,16] applied the improved fireworks algorithm (FWA) to radar signal processing, although the fireworks algorithm is prone to local extrema. Wen et al. [17] applied a genetic algorithm (GA) to ICA, although the local search capability of GA is limited.
The butterfly optimization algorithm (BOA), developed in 2018, was inspired by the behavior of butterflies looking for food and has demonstrated high robustness and global convergence on complex optimization problems [18]. Preliminary studies show that BOA is very competitive in function optimization when compared to other metaheuristic algorithms, such as ABC, the cuckoo search algorithm (CSA), the firefly algorithm, GA, and PSO [19]. It does, however, face several difficulties. For instance, it is prone to falling into local optima when dealing with high-dimensional, complex optimization problems. Additionally, inappropriate parameters result in a slow convergence speed. Scholars have therefore proposed a series of improved algorithms to enhance the performance of BOA. Arora et al. [20] combined BOA and ABC, enhancing the algorithm's exploitation capacity. Long et al. [21] provided a pinhole-imaging learning strategy based on optical principles that helps the algorithm avoid premature convergence. Fan et al. [22] introduced a new fragrance coefficient and a different iteration and update strategy. Mortazavi et al. [23] proposed a novel fuzzy decision strategy and introduced the notion of a "virtual butterfly" to enhance the search capability of BOA. Zhang et al. [24] proposed a heuristic initialization strategy combined with a greedy strategy, which improved the diversity of the initial population. Li et al. [25] introduced a weight factor and Cauchy mutation into BOA, enhancing the algorithm's ability to jump out of local optima. Although these improvements can raise the search performance of the algorithm to some extent and reduce premature convergence, most of them focus on improving a single aspect of search performance and ignore the balance between global and local search ability.
Based on the foregoing research, and in response to the low separation performance of conventional ICA methods and the insufficient search ability of the basic BOA, this paper presents an ICA method based on the double-mutant butterfly optimization algorithm (DMBOA). Firstly, dynamic transformation probability and population reconstruction mechanisms are introduced to help the algorithm maintain its search balance and increase its capacity to avoid local optima. The differential evolution operator is then introduced into the global position update, and the sine cosine operator into the local position update, to allow for mutation, hence enhancing the algorithm's exploitation capacity. Finally, the superiority of DMBOA is verified on benchmark functions and the BSS problem.
To summarize, the major contributions of this paper are as follows: (1) An ICA method based on DMBOA is designed to address the low separation performance of conventional ICA. DMBOA is used to optimize the separation matrix, W, maximize the kurtosis, and, finally, complete the separation of the observed signals. (2) Three improvement strategies are designed to address the insufficient search capability of the basic BOA; they coordinate the global and local search of the algorithm while improving its overall searching ability. (3) Simulation results show that DMBOA outperforms the other nine algorithms when optimizing 12 benchmark functions. In the BSS problem, DMBOA is capable of successfully separating mixed signals and achieves higher separation performance than the compared algorithms.
The remainder of this paper is organized as follows: Section 2 introduces the basic theory of BSS. Section 3 discusses the details of the BOA. Section 4 addresses the DMBOA implementation. Section 5 provides simulation analysis, which verifies the effectiveness of the proposed algorithm. Section 6 concludes the paper and summarizes the major contributions.
The main contributions of the literature discussed in the introduction are summarized in Table 1.

Linear Mixed Blind Source Separation Model
The linear mixed BSS model is described as follows:
X(t) = AS(t) + N(t), (1)
where t is the sampling moment, A is a mixed matrix of order m × n (m ≥ n), X(t) = [X1(t), X2(t), . . . , Xm(t)] is a vector of the m-dimensional observed signals, S(t) = [S1(t), S2(t), . . . , Sn(t)] is a vector of the n-dimensional source signals, and N(t) is a vector of the m-dimensional noise signals. BSS addresses the case in which an optimization algorithm determines the separation matrix, W, when only the observed signals, X(t), are known. In such instances, the separated signals, Y(t), are obtained using Equation (2):
Y(t) = WX(t). (2)
To ensure the feasibility of BSS, the following assumptions are required: (1) The mixing matrix, A, should be reversible or full rank, and the number of observed signals should be larger than or equal to the number of source signals (i.e., m ≥ n). (2) From a statistical standpoint, each source signal is independent of the others, and at most one signal follows a Gaussian distribution, because a mixture of multiple Gaussian processes remains a Gaussian process and, hence, cannot be separated.
Due to the lack of source signal and channel information, it is difficult to discern the signal's amplitude and order following BSS, a phenomenon known as fuzziness. Although BSS is fuzzy, its fuzziness has a negligible effect on the results in the majority of scientific research and production practices. Figure 1 shows the linear mixed blind source separation model.

Figure 1. Linear mixed blind source separation model.
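As a concrete illustration of the model above, the mixing and separation steps can be sketched in a few lines of NumPy (a minimal example with a hypothetical 2 × 2 mixing matrix; all signal choices and names here are illustrative, and the ideal W = A^{-1} is used only to show the roles of the matrices):

```python
import numpy as np

rng = np.random.default_rng(0)

# Two hypothetical source signals S(t): a sine wave and a square wave.
t = np.linspace(0, 1, 2000)
S = np.vstack([np.sin(2 * np.pi * 5 * t),
               np.sign(np.sin(2 * np.pi * 9 * t))])

A = np.array([[1.0, 0.6],                  # unknown full-rank mixing matrix A (m = n = 2)
              [0.4, 1.0]])
N = 0.01 * rng.standard_normal(S.shape)    # sensor noise N(t)

X = A @ S + N                              # observed signals: X(t) = A S(t) + N(t)

# BSS seeks a separation matrix W from X alone; with the ideal W = A^{-1}
# (used here purely for illustration), the separated signals are Y(t) = W X(t).
W = np.linalg.inv(A)
Y = W @ X
```

In practice W is unknown and must be estimated, which is exactly the optimization problem the rest of the paper addresses.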

Signal Preprocessing
Prior to performing BSS on observed signals, it is usually essential to preprocess the signals in order to simplify the separation process. De-averaging and whitening are two widely used preprocessing techniques.
The de-averaging processing method is shown in Equation (3):
X(t) ← X(t) − E[X(t)]. (3)
The purpose of whitening is to eliminate the correlation between the signals. The whitening operation in BSS removes the second-order correlations between signals in space, ensuring that the observed signals received by the sensor are spatially uncorrelated and simplifying the algorithm. The whitened signal, V, is expressed as follows:
V = QX = E^(−1/2) U^T X, (4)
where Q is the whitening matrix, U is a characteristic matrix composed of the eigenvectors corresponding to the n maximum eigenvalues of the autocorrelation matrix, R_XX = E[XX^H], of the observed matrix, X, and E = diag(d1, d2, . . ., dn) is a diagonal matrix composed of these eigenvalues. The separation matrix, W, is an orthogonal matrix, which can be expressed as the product of a series of rotation matrices [26]. Taking three source signals as an example, the separation matrix, W, is defined in Equation (5) as the product of three rotation matrices.
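The two preprocessing steps above (de-averaging followed by eigendecomposition-based whitening) can be sketched as follows (the function name is illustrative):

```python
import numpy as np

def preprocess(X):
    """De-average, then whiten the observed signals X (rows are signals).

    Whitening uses the eigendecomposition of the sample autocorrelation
    matrix R_XX = E[X X^T]: V = Q X with Q = E^{-1/2} U^T.
    """
    X = X - X.mean(axis=1, keepdims=True)   # de-averaging, Equation (3)
    R = (X @ X.T) / X.shape[1]              # sample autocorrelation matrix
    d, U = np.linalg.eigh(R)                # eigenvalues d, eigenvectors U
    Q = np.diag(1.0 / np.sqrt(d)) @ U.T     # whitening matrix Q
    return Q @ X, Q

# After whitening, the sample covariance of V is the identity matrix.
rng = np.random.default_rng(1)
X = np.array([[1.0, 0.5], [0.3, 1.0]]) @ rng.standard_normal((2, 5000))
V, Q = preprocess(X)
```

The identity covariance of V is what lets the subsequent search restrict W to an orthogonal (rotation) matrix.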

Separation Principle
When performing BSS on mixed signals using ICA, it is necessary to first select an appropriate criterion for determining the statistical independence of the separated signals. Afterwards, the objective function is established and optimized using the appropriate algorithm. This leads to the separation matrix with the strongest independence of the separated signals.
The commonly used independence criteria for signals include mutual information, kurtosis, and negative entropy. Kurtosis is calculated using Equation (6) as follows:
kurt(y_i) = E[y_i^4] − 3(E[y_i^2])^2, (6)
where y_i is the i-th separated signal, treated as a zero-mean random variable. The sum of the absolute values of the kurtoses is used as the criterion of signal independence in this paper, and the objective function is specified in Equation (7), where ε is an extremely small amount that prevents division by zero. According to information theory, for a random vector y with E[yy^T] = I, the larger the absolute kurtosis of the signals, the greater their independence. DMBOA, described below, is used to optimize the separation matrix, W, to maximize the kurtosis and, finally, complete the separation of the observed signals.
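The kurtosis criterion can be sketched as follows (since the exact normalization of Equation (7), including ε, is not reproduced above, the plain sum of absolute kurtoses is used here as an illustrative objective):

```python
import numpy as np

def kurtosis(y):
    """Kurtosis of a zero-mean signal: kurt(y) = E[y^4] - 3 * (E[y^2])^2."""
    return np.mean(y**4) - 3.0 * np.mean(y**2) ** 2

def objective(W, V):
    """Sum of the absolute kurtoses of the separated signals Y = W V
    (illustrative form of the independence criterion described in the text)."""
    Y = W @ V
    return sum(abs(kurtosis(y)) for y in Y)

rng = np.random.default_rng(2)
gaussian = rng.standard_normal(100_000)                         # kurtosis ~ 0
square = np.sign(np.sin(np.linspace(0.1, 40 * np.pi, 100_000)))  # kurtosis ~ -2
```

A Gaussian signal has kurtosis near zero, while a binary square wave has kurtosis near −2, which is why maximizing absolute kurtosis drives the outputs away from Gaussianity and toward independent sources.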

Butterfly Optimization Algorithm (BOA)
BOA is an optimization technique inspired by the foraging behavior of butterflies. Each butterfly in BOA serves as a search agent and performs the optimization process in the search space. Butterflies are capable of perceiving and distinguishing between different fragrance intensities, and each butterfly emits a fragrance of a certain intensity. Assume that the intensity of the fragrance produced by a butterfly is related to its fitness; that is, as a butterfly moves from one location to another, its fitness changes accordingly. When a butterfly detects the fragrance of another, it moves toward the butterfly with the strongest fragrance; this stage is referred to as "global search." On the contrary, if the butterfly is unable to perceive the fragrance of other butterflies, it moves randomly; this stage is referred to as "local search." The algorithm alternates between global and local search during the search process according to the switch probability, p.
The fragrance can be formulated as follows:
f = cI^a, (8)
where f is the perceived intensity of the fragrance, i.e., the fragrance's intensity as perceived by other butterflies, c is the sensory modality, I is the stimulus intensity, which depends on fitness, and a is the mode-dependent power exponent, which accounts for the varying degrees of absorption, a ∈ [0, 1]. The value of c is updated by Equation (9), where t and T represent the current and maximum number of iterations, respectively. When butterflies sense a stronger fragrance in the area, they move towards the strongest one. This stage is calculated as follows:
x_i^(t+1) = x_i^t + (r^2 × g − x_i^t) × f_i, (10)
When a butterfly is unable to perceive the surrounding fragrance, it moves randomly. This stage is calculated as follows:
x_i^(t+1) = x_i^t + (r^2 × x_j^t − x_k^t) × f_i, (11)
where x_i^t represents the position of butterfly i in generation t, x_j^t and x_k^t denote the positions of butterflies j and k in generation t, r is a random number between 0 and 1, f_i is the fragrance of butterfly i, and g stands for the global optimal position.
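The search loop described above can be sketched compactly as follows (assuming the standard forms of the fragrance and position updates; the c-update of Equation (9) is replaced by a simple placeholder, and the mapping from fitness to stimulus intensity I is an assumption, since neither is reproduced in the text):

```python
import numpy as np

def boa(fitness, dim, n=30, T=200, p=0.8, a=0.1, c=0.01, bounds=(-10.0, 10.0)):
    """Minimal sketch of the basic BOA for minimization (illustrative only)."""
    rng = np.random.default_rng(0)
    lo, hi = bounds
    x = rng.uniform(lo, hi, (n, dim))
    fit = np.array([fitness(xi) for xi in x])
    g = x[fit.argmin()].copy()               # global best butterfly
    g_fit = fit.min()
    for _ in range(T):
        I = 1.0 / (1.0 + fit)                # stimulus intensity from fitness (assumed mapping)
        f = c * I**a                          # fragrance f = c * I^a
        for i in range(n):
            r = rng.random()
            if rng.random() < p:              # global search toward g
                x[i] = x[i] + (r**2 * g - x[i]) * f[i]
            else:                             # local random move among neighbors
                j, k = rng.integers(0, n, size=2)
                x[i] = x[i] + (r**2 * x[j] - x[k]) * f[i]
        np.clip(x, lo, hi, out=x)
        fit = np.array([fitness(xi) for xi in x])
        if fit.min() < g_fit:
            g_fit = fit.min()
            g = x[fit.argmin()].copy()
        c = c + 0.001                         # placeholder for the Equation (9) c-update
    return g, g_fit

best, val = boa(lambda z: float(np.sum(z**2)), dim=2)
```

The fixed p = 0.8 in this loop is exactly the constant switch probability that the dynamic transition probability of the next section replaces.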
The pseudo code of BOA is provided in Algorithm 1.

Dynamic Transition Probability
In the basic BOA, local and global search are controlled by the constant switch probability p = 0.8, which implies that, during the iterative process, BOA allocates 80% of its search capability to global search and 20% to local search. In this search mode, about 80% of the butterflies in the population are attracted to the best butterfly, g. Therefore, if the best butterfly, g, falls into a local optimum, it strongly guides the other butterflies to this unpromising position in the search space, making it more difficult for the algorithm to escape the local extreme value, so that it converges prematurely.
A reasonable search process should begin with a strong global search in the early stages of the algorithm, quickly locating the region of the global optimal solution in the search space, and appropriately enhance the local exploitation capability in the later stages, all of which contribute to the optimization accuracy of the algorithm. The dynamic transition probability, p2, is proposed in this paper to balance the proportions of local and global search and achieve a more effective optimization strategy. The dynamic transition probability, p2, is given in Equation (12), where µ is a constant equal to 2. As seen in Figure 2, p2 gradually converges to 0.5 as the iteration progresses, striking a balance between global search in the early stages and local exploitation in the later stages.
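Equation (12) itself is not reproduced in the text above; the following is a hedged, illustrative form consistent with its description (it starts at the basic BOA value of 0.8 and decays monotonically to 0.5 as t approaches T, with µ = 2 as stated):

```python
def p2(t, T, mu=2):
    # Illustrative dynamic transition probability (an assumption, not the
    # paper's exact Equation (12)): decays from 0.8 at t = 0 to 0.5 at t = T.
    return 0.5 + 0.3 * (1 - t / T) ** mu
```

Any curve with this shape shifts effort from global search early on toward an even global/local split late in the run.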

Improvement in Update Function
When some butterflies move completely at random, or when a large number of butterflies congregate at non-global extreme points, the convergence of BOA slows significantly and the algorithm falls into local extreme values. Two mutation operators, the differential evolution operator [27,28] and the sine cosine operator [29], are used in this paper to improve BOA.
The differential evolution operator utilizes three parameter variables for global search, which results in a faster convergence rate and simplifies the process of obtaining the global optimal value, which is why it is used for the global search. The sine cosine operator possesses the periodicity and oscillation of the sine and cosine functions, which enable it to avoid falling into local extrema and accelerate the convergence speed of the algorithm, so it is applied to the local search.

The global search variation is expressed in Equation (13), and the local search variation in Equation (14), where the mutation operator, F ∈ [0, 2], is a real constant factor, r2 is a random number with a value range between 0 and 2π, and λ and r3 are random numbers with a value range between 0 and 1. The parameter r1 is calculated by Equation (15), where δ is a constant equal to 2.
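Since Equations (13) to (15) are not reproduced above, the following sketch uses the standard forms of the two operators named in the text (DE/rand/1 for the global variation and a sine cosine update for the local variation); the paper's exact expressions may differ:

```python
import numpy as np

rng = np.random.default_rng(3)

def de_mutation(pop, i, F=0.5):
    """DE/rand/1 mutation (standard form, assumed here):
    v = x_a + F * (x_b - x_c), F in [0, 2], with a, b, c distinct and != i."""
    a, b, c = rng.choice([j for j in range(len(pop)) if j != i], size=3, replace=False)
    return pop[a] + F * (pop[b] - pop[c])

def sc_mutation(x, g, t, T, delta=2.0):
    """Sine cosine style local mutation (standard form, assumed here):
    r1 = delta * (1 - t / T) shrinks over time, r2 is uniform in [0, 2*pi],
    lambda in [0, 1] selects the sine or cosine branch, and r3 is in [0, 1]."""
    r1 = delta * (1 - t / T)
    r2 = rng.uniform(0.0, 2.0 * np.pi)
    lam, r3 = rng.random(2)
    step = np.sin(r2) if lam < 0.5 else np.cos(r2)
    return x + r1 * step * np.abs(r3 * g - x)
```

Note how r1 shrinks to zero as t approaches T, so the local mutation naturally transitions from exploration to fine-grained exploitation.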

Population Reconstruction Mechanism
A counter, count, is introduced with an initial value of 0. If the global optimal solution, g, remains unchanged in an iteration, count increases by 1; if g changes, the counter is reset to 0. When count is greater than or equal to 0.1 × T, optimization is considered to have stagnated. To preserve previous optimization results while increasing population diversity and avoiding local optima, 20% of the individuals, always including the optimal solution, are randomly retained from the original population, while the remaining 80% are discarded and replaced with newly generated random individuals.
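The stagnation counter and the 20%/80% reconstruction step can be sketched as follows (function and variable names are illustrative):

```python
import numpy as np

def reconstruct(pop, fit, rng, bounds=(-10.0, 10.0)):
    """Keep a random 20% of individuals (always including the current best)
    and replace the remaining 80% with new random individuals."""
    n, dim = pop.shape
    n_keep = max(1, int(0.2 * n))
    best = int(fit.argmin())
    others = [i for i in rng.permutation(n) if i != best]
    kept = [best] + others[:n_keep - 1]
    lo, hi = bounds
    fresh = rng.uniform(lo, hi, (n - len(kept), dim))
    return np.vstack([pop[kept], fresh])

# Stagnation logic (inside the optimizer's main loop):
#   count = 0 if the global best g improved this iteration else count + 1
#   if count >= 0.1 * T:  pop = reconstruct(pop, fit, rng); count = 0
rng = np.random.default_rng(4)
pop = rng.uniform(-10, 10, (30, 5))
fit = (pop**2).sum(axis=1)
new_pop = reconstruct(pop, fit, rng)
```

Keeping the best individual guarantees the reconstruction never loses the progress made so far, while the fresh 80% restores diversity.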
Algorithm 2 gives the pseudo code of DMBOA, and Figure 3 shows the flow chart of DMBOA-ICA.

The DMBOA proposed in this paper enhances the basic BOA in three respects. Firstly, the dynamic transition probability coordinates the local and global search of the algorithm. Secondly, the double-mutant operator is incorporated into the position update functions to enhance the local search capability. Finally, a population reconstruction mechanism is introduced to avoid falling into local optima in the event of optimization stagnation. Through these three improvements, DMBOA effectively overcomes the poor search capability of the basic BOA, which makes the latter prone to falling into local optima. However, compared to the basic BOA, DMBOA has a higher computational complexity, as each iteration requires updating the counter, count, and reconstructing the population when optimization stagnates, which increases the calculations required by the algorithm.

Evaluation of DMBOA on Benchmark Functions
To verify the efficacy of DMBOA more accurately and comprehensively, 12 test functions with varying characteristics were used in the experiments. The detailed characteristics of each test function are listed in Table 2, which features four unimodal test functions (F1-F4) and eight multimodal test functions (F5-F12). In Table 2, Dim denotes the function dimension, Scope represents the value range of x, and fmin indicates the ideal value of each function. A unimodal test function has only one global optimal solution and no local optimal solutions, making it suitable for evaluating the local exploitation capability of an algorithm. In contrast, multimodal test functions have many local optimal solutions. Numerous algorithms that perform well on unimodal functions perform poorly on multimodal functions and are prone to premature convergence or oscillation between local extrema. Multimodal test functions are therefore usually used to evaluate the global search capability of an algorithm [30].
DMBOA is compared against nine algorithms in the experiments, namely GWO [31], WOA [32], CF-AW-PSO [33], HPSOBOA [34], FPSBOA [35], BOA [18], BOA_1 (dynamic transition probability only), BOA_2 (double-mutant operator only), and BOA_3 (population reconstruction mechanism only). For all ten algorithms, the population size is N = 30 and the total number of iterations is T = 500. The parameters of DMBOA are shown in Algorithm 2, while the parameters of the other algorithms are given in references [31][32][33][34][35]. Table 3 shows the optimal fitness value (BEST), the average fitness value (MEAN), the standard deviation (STD), and the running time (TIME), in seconds, obtained by the ten algorithms on the 12 test functions in Table 2; the results of DMBOA are shown in bold. Each algorithm was run 30 times independently to minimize error, and all experiments were conducted on a laptop equipped with an Intel (R) Core (TM) i7-6500 CPU at 2.50 GHz and 8 GB of RAM.
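The per-function statistics reported in Table 3 can be reproduced for any optimizer with a small helper (a sketch; run_fn is any hypothetical function mapping a random seed to the final fitness value of one run):

```python
import numpy as np

def summarize(run_fn, n_runs=30):
    """BEST, MEAN, and STD of the final fitness over n_runs independent runs,
    mirroring the Table 3 columns."""
    vals = np.array([run_fn(seed) for seed in range(n_runs)])
    return vals.min(), vals.mean(), vals.std()

# Hypothetical stand-in "optimizer" for illustration only.
best, mean, std = summarize(lambda seed: np.random.default_rng(seed).random())
```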
As shown in Table 3, DMBOA is capable of obtaining the optimal values for these 12 test functions, and the optimal value for each function is closest to fmin in Table 2. The search accuracy of BOA_1, BOA_2, and BOA_3 is also better than that of the original BOA, demonstrating the efficacy of the three improvement strategies proposed in this paper. DMBOA has a higher search accuracy than the single-strategy variants, indicating that, under the joint influence of the different strategies, the optimization ability and stability of the algorithm are improved to the greatest extent. Overall, the results of BOA_2 are closest to those of DMBOA. The STD of the data reflects their degree of dispersion; according to Table 3, DMBOA has the smallest STD for each test function, indicating that it is more robust and stable than the compared algorithms on both unimodal and multimodal problems. As for the computation time in Table 3, DMBOA has a medium execution time. The test time for DMBOA on the five test functions F2, F4, F5, F11, and F12 is less than that of the original BOA. This indicates that, although the time complexity of DMBOA is theoretically higher than that of the original BOA, its high convergence accuracy enables it to find the global optimal solution more quickly, particularly for the two test functions F11 and F12. Figure 4 depicts the iteration history of the ten algorithms on the 12 test functions in Table 2. As seen in Figure 4, DMBOA has the fastest convergence speed and the highest convergence accuracy among all the convergence history graphs. This demonstrates that, compared to the other algorithms, DMBOA is capable of obtaining the optimal solution in the shortest amount of time.
BOA_1, BOA_2, and BOA_3, each improved by a single strategy, improved the convergence speed and optimization accuracy to a certain extent compared to the basic BOA, indicating that each strategy performs satisfactorily and effectively, though not as well as DMBOA, which combines all three; this further verifies the feasibility of the three improved strategies. GWO can iterate to the theoretical optimal value on F5 and F7. The overall convergence performance of WOA is average. The convergence speed of CF-AW-PSO is slow in the early stages. The iteration results of HPSOBOA on F1, F2, F6, and F7 are poor. FPSBOA performs well on F5, F6, and F7 in terms of convergence curve and search performance.

Table 2. Basic information of benchmark functions.

Speech Signal Separation
Three speech signals are used as the source signals, which are then mixed to obtain the observed signals. To acquire the separated signals, DMBOA, BOA, HPSOBOA, and FPSBOA are used to blindly separate the observed signals. The simulation results are depicted in Figure 5. The sampling frequency and the number of sampling points of the speech signals are 40,964 Hz and 1000, respectively.
In order to quantitatively analyze and compare the separation performance of the four algorithms, the running time, similarity coefficient, performance index (PI), and PESQ [36] are employed in this study. The results are shown in Table 4, with time in seconds.

The PESQ metric is based on the wide-band version recommended in ITU-T [37], and its range extends from −0.5 to 4.5; the higher its value, the better the quality of the speech signal. The similarity coefficient and PI are expressed in Equations (16) and (17), respectively. In Equation (16), ρij is a similarity index used to compare the source signals with the separated signals; the greater ρij is, the more effective the separation. In this section, ρij is a 3 × 3 matrix, the maximum value of each channel is taken as the experimental data, and N is set to 3. For the PI in Equation (17), the closer the PI is to 0, the more similar the separated signal is to the source signal. Compared with the source signals in Figure 5, the separated signals have a different amplitude and order, illustrating the fuzziness of BSS. The signals separated by BOA are partially distorted, and the signals separated by HPSOBOA and FPSBOA are partially deformed. The signals separated by DMBOA are highly consistent with the waveforms of the source signals, demonstrating a strong separation effect.
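The two metrics can be sketched as follows (Equation (16) in its standard normalized-correlation form; for Equation (17), the widely used Amari index is assumed, since the exact expression is not reproduced above):

```python
import numpy as np

def similarity_matrix(S, Y):
    """Similarity coefficients rho_ij between separated signals Y and source
    signals S (rows are signals), in the standard form:
    rho_ij = |sum_t y_i(t) s_j(t)| / sqrt(sum_t y_i(t)^2 * sum_t s_j(t)^2)."""
    num = np.abs(Y @ S.T)
    den = np.sqrt(np.sum(Y**2, axis=1)[:, None] * np.sum(S**2, axis=1)[None, :])
    return num / den

def performance_index(P):
    """Amari performance index of the global matrix P = W A (assumed form):
    PI -> 0 as P approaches a scaled permutation matrix."""
    P = np.abs(P)
    n = P.shape[0]
    rows = (P / P.max(axis=1, keepdims=True)).sum(axis=1) - 1
    cols = (P / P.max(axis=0, keepdims=True)).sum(axis=0) - 1
    return (rows.sum() + cols.sum()) / (2 * n * (n - 1))
```

A perfect separation gives a similarity matrix with one entry near 1 per row and a PI of exactly 0, since P = WA is then a scaled permutation.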
As shown in Table 4, DMBOA produces not only the highest similarity coefficient and PESQ but also the smallest PI of the separated signal, allowing for a more accurate restoration of the source signal. Moreover, the operation time of DMBOA is shorter than that of the examined algorithms.

Image Signal Separation
Three gray-scale images and one random noise image are used as the source signals, and they are combined to produce the observed signals. To acquire the separated signals, DMBOA, BOA, HPSOBOA, and FPSBOA are used to blindly separate the observed signals. In this section, N is set to 4, the images have 256 × 256 pixels, and ρij is a 4 × 4 matrix. Figure 6 illustrates the simulation results, and Table 5 compares the similarity coefficient, PI, and running time of the separated signals, as well as the SSIM [38] of the output images. The SSIM proves to be a better error metric for comparing image quality, with better structure preservation. Its values lie in the range [0, 1], with values closer to one indicating better structure preservation. In the SSIM equation, C1 and C2 are constants, σ_xx̂ represents the covariance between the two images, µ_x and µ_x̂ represent their respective mean values, and σ_x and σ_x̂ represent their respective variances.
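A global (single-window) form of SSIM can be sketched as follows, following the standard definition with C1 = (0.01L)^2 and C2 = (0.03L)^2 for dynamic range L (the paper's exact SSIM equation is not reproduced above, so this common formulation is assumed):

```python
import numpy as np

def ssim_global(x, y, L=255):
    """Global SSIM between two images x and y:
    ((2*mu_x*mu_y + C1) * (2*sigma_xy + C2)) /
    ((mu_x^2 + mu_y^2 + C1) * (sigma_x^2 + sigma_y^2 + C2))."""
    x = x.astype(float).ravel()
    y = y.astype(float).ravel()
    C1, C2 = (0.01 * L) ** 2, (0.03 * L) ** 2
    mx, my = x.mean(), y.mean()
    vx, vy = x.var(), y.var()
    cov = ((x - mx) * (y - my)).mean()
    return ((2 * mx * my + C1) * (2 * cov + C2)) / ((mx**2 + my**2 + C1) * (vx + vy + C2))
```

An image compared against itself scores 1, and any luminance or structure mismatch pushes the score below 1, which is what makes SSIM a useful complement to the similarity coefficient here.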
As seen in Figure 6, the images separated by DMBOA are similar to the source images, while the images separated by the other algorithms exhibit varying degrees of ambiguity. Additionally, as demonstrated by the data in Table 5, the separation performance of DMBOA is superior to that of the examined algorithms.


Conclusions
This paper proposed a novel double-mutant butterfly optimization algorithm (DMBOA), a substantial improvement on the butterfly optimization algorithm (BOA), and applied it to blind source separation (BSS). The algorithm incorporates a double-mutant operator and a population reconstruction mechanism, which enhance its local exploitation capability and help it avoid local optima, and it balances exploration and exploitation through a dynamic transition probability. The following conclusions are drawn from the simulation results: (1) When optimizing 12 benchmark functions (four unimodal and eight multimodal), DMBOA outperforms the other nine algorithms. The three improvement methods proposed in this study increased the performance of BOA to varying degrees in the ablation experiments. All of this demonstrates that DMBOA has a high level of search performance and strong robustness. (2) DMBOA outperforms the other algorithms in BSS and is capable of successfully separating mixed speech and image signals.