An Improved Background Normalization Algorithm for Noise Resilience in Low Frequency

Abstract: Background normalization algorithms attempt to suppress the ambient and self-noise in sonar measurements, which enhances the detection performance and the display effect of weak signals. Conventional background normalization methods are usually sensitive to the accuracy of a preset filtering interval and threshold, and significant noise is still present in low frequency after processing. In this paper, an improved background normalization algorithm is proposed by thresholding the processing interval between several local peak values and local valley values. Compared to the existing schemes, the proposed approach automatically calculates the filtering interval and threshold, with substantial resilience to the noise level in low frequency. Experimental results illustrate the effectiveness of our algorithm.


Introduction
In sonar detection, background normalization is a kind of constant false-alarm rate (CFAR) processing that estimates the magnitude of the background ambient and self-noise as a threshold, thresholds the weak signals of interest, and provides dynamic magnitude compression for data visualization displays [1]. This technique has been widely used in searching for low-level signals and detecting line spectrums, especially those in very low frequency (usually lower than 100 Hz) that are submerged in background environmental noise and receiver self-noise in most situations, since they possibly contain the power and characteristic frequencies of targets such as autonomous underwater vehicles (AUVs) and submersibles.
Over three decades ago, several classical background normalizers, including two pass mean (TPM) algorithms, split three-pass mean (S3PM) algorithms (also called two-pass split-window (TPSW) algorithms in subsequent literature [1][2][3]), order truncate average (OTA) algorithms, and split average exclude average (SAXA) algorithms, were introduced by Struzinski and Lowe [4]. Stergiopoulos then proposed a normalization algorithm that estimates the mean of noise in the beam and frequency domain [1]. An improved OTA algorithm combined with a median filter was next studied by Li et al. in 2000, which mainly focuses on the inhomogeneous and non-stationary background [5]. In 2006, Joo and Jum further evaluated the previous works and specially compared the performances of TPSW and OTA methods in terms of the window length as well as threshold [6]. Wang and Zheng summarized the above approaches and employed another background normalization method in the later research, called the beam characteristics scan algorithm [3,7]. In addition, the background normalizers had been practically applied in shallow water with a multipath [8] and high-clutter environment [9], for better signal detection performances in broadband interferences.
Nevertheless, most of the aforementioned algorithms (except for the beam characteristics scan algorithm) need several manually set parameters, such as the normalization interval and the thresholding factor, which may seriously influence the normalization effect. On the other hand, all these normalizers still perform unsatisfactorily in terms of low-frequency noise resilience. Therefore, we consider an improved background normalization algorithm that focuses on noise suppression in low frequency, without the prior selection of empirical parameters. We illustrate that the proposed scheme can decrease the background noise level in low frequency more significantly than the conventional normalization methods.
The remainder of this paper is organized as follows. The three commonly used background normalization algorithms mentioned above (the TPSW algorithm, the OTA algorithm, and the beam characteristics scan algorithm) are briefly reviewed in Section 2. Section 3 demonstrates the problem and the proposed normalizer in detail. In Section 4, experimental data are employed to validate the algorithm put forward in comparison with the previous methods, particularly for normalization in low frequency, with the calculation burden and the influence of the processing range also analyzed. Finally, conclusions are drawn in Section 5.

TPSW Algorithm
The TPSW algorithm, or S3PM algorithm, is proved to be a very robust and system-implementation-friendly normalizer [1]. For a string of data X = [X(1), X(2), . . . , X(N)], where N is the length of the data, it gives the noise level estimation of the element of interest X(k) (k = 1, 2, . . . , N) from two neighborhood windows, as shown in Figure 1. Thus, the interval index set of the two windows is expressed as

R(k) = {k' : E ≤ |k' − k| ≤ G},

where E and G are the inner and outer bounds of each side window. For the situation in which k is close to the head or the end of the data string, the corresponding window is truncated (or even abandoned) at the boundaries of the string. Then, the local mean value of the elements in the two windows is given by

X̃(k) = mean_{k'∈R(k)} X(k').

Next, a new sequence that helps eliminate the magnitude of possible signal and estimate the noise is formed as

Φ(k) = X(k), if X(k) ≤ r X̃(k);  Φ(k) = X̃(k), otherwise,

where r = 1 + C[(4/π − 1)/A]^{1/2} is a threshold regulator, C is a constant not less than 1, and A = G − E + 1 is the length of each window [4]. After all the Φ(k) (k = 1, 2, . . . , N) are obtained, the normalized values of TPSW are deduced as [Z_TPSW(k)], k = 1, . . . , N, in which

Z_TPSW(k) = X(k) / u(k),

where u(k) = mean_{k'∈R(k)} Φ(k') is the local mean value of the estimated noise.
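The two passes above can be sketched as follows. This is a minimal illustration, not the authors' implementation: the edge-truncation behavior and the small floor added to the denominator are assumptions for numerical robustness.

```python
import numpy as np

def tpsw(x, E=3, G=16, C=1.0):
    """Two-pass split-window (TPSW/S3PM) normalizer sketch.
    x: 1-D array of spectral magnitudes; E, G: inner/outer window bounds;
    C: constant from which the threshold regulator r is derived."""
    N = len(x)
    A = G - E + 1                                    # length of each side window
    r = 1.0 + C * np.sqrt((4.0 / np.pi - 1.0) / A)   # threshold regulator

    def local_mean(s, k):
        # indices E..G away from k on both sides, truncated at the string edges
        idx = [k + d for d in range(-G, -E + 1)] + [k + d for d in range(E, G + 1)]
        idx = [i for i in idx if 0 <= i < len(s)]
        return np.mean(s[idx])

    # pass 1: clip probable signal samples before estimating the noise
    xm = np.array([local_mean(x, k) for k in range(N)])
    phi = np.where(x > r * xm, xm, x)

    # pass 2: re-estimate the local mean on the clipped sequence
    u = np.array([local_mean(phi, k) for k in range(N)])
    return x / np.maximum(u, 1e-12)                  # normalized output Z_TPSW
```

With C = 1 and the window bounds E = 3, G = 16 used later in Section 4, r evaluates to roughly 1.14, matching the empirical value quoted there.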

OTA Algorithm
Compared to the TPSW algorithm, the OTA algorithm is more efficient and effective [1], which makes it one of the most popular background normalization schemes [10].
The algorithm achieves normalization through thresholding and median filtering. In detail, the data string X = [X(1), X(2), . . . , X(N)] is first extended at both ends with K samples (mirrored about the boundaries),

X_e = [X(K + 1), . . . , X(2), X(1), X(2), . . . , X(N), X(N − 1), . . . , X(N − K)],

so that a full window is available for every element; K is the length of the extension window, introduced to avoid the edge effect. The operation of the OTA method is illustrated in Figure 2. The window centered on X(k) and containing 2K + 1 samples, k = 1, 2, . . . , N, is next sorted in ascending order as y(1) ≤ y(2) ≤ . . . ≤ y(2K + 1), and the truncated mean value ȳ(k), the average of the ordered samples after truncating those that exceed a multiple of the median y(K + 1), is calculated as the estimation of the noise level.
Then, the outputs of the OTA algorithm are

Z_OTA(k) = X(k) / [α ȳ(k)], k = 1, 2, . . . , N,

where α is an empirical threshold regulator.
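A compact sketch of the procedure is given below. The mirror extension and the truncation rule (discard ordered samples above `trunc` times the window median) are assumptions, since the exact rule is not spelled out here; the paper's parameter α appears only in the output division.

```python
import numpy as np

def ota(x, K=25, alpha=1.05, trunc=2.0):
    """Order-truncate-average (OTA) normalizer sketch.
    K: extension/half-window length; alpha: empirical threshold regulator;
    trunc: assumed truncation multiple of the window median."""
    N = len(x)
    # mirror K samples about each boundary to avoid the edge effect
    xe = np.concatenate([x[K:0:-1], x, x[-2:-K - 2:-1]])
    z = np.empty(N)
    for k in range(N):
        w = np.sort(xe[k:k + 2 * K + 1])   # ordered window y(1)..y(2K+1)
        med = w[K]                         # median y(K+1)
        kept = w[w <= trunc * med]         # truncate large (signal-like) samples
        ybar = kept.mean()                 # truncated mean = noise estimate
        z[k] = x[k] / (alpha * ybar)
    return z
```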

Beam Characteristics Scan Algorithm
Actually, the beam characteristics scan algorithm is an improved OTA approach that automatically defines the length of the processing window as well as the threshold [11]. The peaks of the data string X = [X(1), X(2), . . . , X(N)] are recorded at the beginning of this algorithm. For an arbitrary peak p, denote the left and right neighbor valleys as v(1) = v_L and v(M) = v_R; the data vector (with length M) between the two valleys is then expressed as

W = [X(v_L), X(v_L + 1), . . . , X(v_R)].

Similar to the OTA algorithm, the vector is sorted in ascending order as w(1) ≤ w(2) ≤ . . . ≤ w(M), and a threshold w̄ is deduced from the ordered samples together with a thresholding regulator q that is calculated from the data itself rather than preset. Thus, the normalization results are

Z_BCS(n) = X(n) / w̄, v_L ≤ n ≤ v_R,

and the whole string is denoted as [Z_BCS(n)], n = 1, . . . , N. The beam characteristics scan algorithm is schematically demonstrated in Figure 3.
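The per-segment logic can be sketched as below. The strict-extrema detector, the fixed regulator `q`, and the lower-half-mean form of the threshold w̄ are illustrative assumptions; the original algorithm derives q from the data itself.

```python
import numpy as np

def peaks_valleys(x):
    """Indices of strict local peaks and valleys of a 1-D sequence."""
    pk, vl = [], []
    for i in range(1, len(x) - 1):
        if x[i] > x[i - 1] and x[i] > x[i + 1]:
            pk.append(i)
        elif x[i] < x[i - 1] and x[i] < x[i + 1]:
            vl.append(i)
    return pk, vl

def bcs(x, q=1.2):
    """Beam characteristics scan sketch: each peak is normalized by a
    threshold computed from the samples between its neighbor valleys."""
    pk, vl = peaks_valleys(x)
    z = x.astype(float).copy()
    for p in pk:
        left = max((v for v in vl if v < p), default=0)
        right = min((v for v in vl if v > p), default=len(x) - 1)
        w = np.sort(x[left:right + 1])                # ascending order
        wbar = q * w[: max(1, len(w) // 2)].mean()    # threshold w̄ (assumed form)
        z[left:right + 1] = x[left:right + 1] / wbar
    return z
```

Note that samples outside every peak-to-valley segment are left unchanged, which mirrors the blind edge areas discussed in the problem formulation.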

Problem Formulation
According to the previous discussion, the performances of the TPSW and OTA algorithms are significantly dependent on the proper choice of window length and threshold [12]. The beam characteristics scan scheme is free of empirical parameters, but it probably fails to process the edges of the data string (corresponding to the low- and high-frequency bands in the delay-frequency map when normalizing along the frequency domain), since peaks may not exist in these areas (as illustrated in the shaded parts of Figure 3). In addition, the low-frequency ambient and receiver self-noise cannot be well suppressed by any of the current approaches, which will be further validated in Section 4.
Therefore, the problems of traditional background normalization methods that worsen the detection of target line spectrums are summarized below:
• Unstable performance due to experientially set parameters (TPSW and OTA);
• High-level environmental and self-noise in low frequency cannot be fully suppressed (beam characteristics scan algorithm).

Proposed Approach
To combine the advantages of those three algorithms, we design an improved approach based on the beam characteristics scan algorithm, since it is the only one of the three with adaptively set parameters. Note that we decided not to apply an extension window like that of the OTA algorithm to cover the blind areas of the beam characteristics scan (although it may seem an obvious and simple idea), because doing so would introduce empirical values again.
The approach we propose is an inverse beam characteristics scan (IBCS) algorithm that normalizes the data in an interval between two peaks instead of two valleys, followed by a pointwise nonlinear combination with a beam characteristics scan method.
As a complementary process, the inverse beam characteristics scan algorithm achieves normalization at the edges of the data string (the shaded parts in Figure 3) that are neglected by the traditional beam characteristics scan algorithm. Typically, this scheme contains the following steps:
• Step 1. Record the peaks and valleys of the data string X = [X(1), X(2), . . . , X(N)].
• Step 2. For an arbitrary valley, form the data vector between its third closest left and right peaks.
Remark. Here, we extended the processing boundaries to the third closest left and right peaks, to keep the whole procedure of the proposed approach comparable in complexity to the existing beam characteristics scan algorithm. This compromise is considered in detail in Section 4 with respect to the normalized noise level and computing burden.

• Step 3. Sort the data vector between the two peaks in ascending order, and derive the threshold w̄ and regulator q in the same manner as the beam characteristics scan algorithm.
• Step 4. Export the normalized data vector; the whole normalized data string is represented as [Z_IBCS(n)], n = 1, . . . , N.
On the other hand, the normal beam characteristics scan algorithm is simultaneously employed, in which v_L and v_R are correspondingly replaced by the third closest left and right valleys, with the whole processed data string written as [Z_BCS(n)], n = 1, . . . , N. Next, a pointwise minimization operator [13] is applied to combine the results of the beam characteristics scan and inverse beam characteristics scan algorithms, whose final output is expressed as [Z(n)], n = 1, . . . , N, where

Z(n) = min{Z_BCS(n), Z_IBCS(n)}.

This operation selects the lower level of background noise, while retaining the energy of target line spectrums with acceptable magnitudes to form the final data string.
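The whole pipeline, peak-centered and valley-centered passes combined by the pointwise minimum, can be sketched as follows. As before, the strict-extrema detector, the fixed regulator `q`, and the lower-half-mean threshold are illustrative assumptions standing in for the self-calculated quantities of the actual algorithm.

```python
import numpy as np

def _extrema(x):
    """Strict local peaks and valleys of a 1-D sequence."""
    pk, vl = [], []
    for i in range(1, len(x) - 1):
        if x[i] > x[i - 1] and x[i] > x[i + 1]:
            pk.append(i)
        elif x[i] < x[i - 1] and x[i] < x[i + 1]:
            vl.append(i)
    return pk, vl

def _segment_normalize(x, centers, bounds, d=3, q=1.2):
    """Normalize each segment around `centers` (peaks or valleys) using the
    d-th closest `bounds` on either side; uncovered samples stay at +inf."""
    z = np.full(len(x), np.inf)
    for c in centers:
        lefts = [b for b in bounds if b < c]
        rights = [b for b in bounds if b > c]
        lo = lefts[-d] if len(lefts) >= d else 0
        hi = rights[d - 1] if len(rights) >= d else len(x) - 1
        w = np.sort(x[lo:hi + 1])
        wbar = q * w[: max(1, len(w) // 2)].mean()   # threshold w̄ (assumed form)
        z[lo:hi + 1] = np.minimum(z[lo:hi + 1], x[lo:hi + 1] / wbar)
    return z

def ibcs(x, d=3, q=1.2):
    """Pointwise minimum of the BCS (peak-centered) and inverse BCS
    (valley-centered) outputs: Z(n) = min{Z_BCS(n), Z_IBCS(n)}."""
    pk, vl = _extrema(x)
    z = np.minimum(_segment_normalize(x, pk, vl, d, q),
                   _segment_normalize(x, vl, pk, d, q))
    return np.where(np.isfinite(z), z, x)  # any still-uncovered sample stays as-is
```

Because the valley-centered pass reaches segments that contain no peak, the combined output also covers the edge regions that the plain beam characteristics scan leaves blind.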

Analysis of Experimental Results
In this section, a group of underwater acoustic experimental data was collected in a lake through a passive 12-element uniform circular array (whose radius is 1 m). After some conventional signal processing, the received time-domain signal is demonstrated as a delay-frequency map in Figure 4 and employed for validating the performance of the proposed approach. Target line spectrums are observed at [120 Hz, 399∼522 s] and [190 Hz, 420∼522 s], respectively, surrounded by a large amount of ambient noise. Furthermore, receiver self-noise with a magnitude much higher than −10 dB is mainly distributed in the frequency band 0∼50 Hz, and interferences over the full frequency range are also observed at the 54th, 134th and 189th seconds. Figure 5 illustrates the background normalization effects (along the frequency axis) of the TPSW, OTA, beam characteristics scan, and proposed approaches, where the empirical parameters of the TPSW and OTA methods are set according to previous research: G = 16, E = 3, r = 1.14 for TPSW [1], and K = 25, α = 1.05 for OTA [3]. All four methods indeed suppress the background noise globally to a significant extent, and all the full-frequency-band interferences are removed.
As to the performances, our proposed scheme shows a visual display effect comparable to the OTA algorithm, and a much clearer noise background than the other two methods. Specifically, the approach we propose outperforms OTA in the preservation of target line spectrums. For instance, our work highlights the originally discrete target line spectrums at [120 Hz, 399∼522 s] and [190 Hz, 420∼522 s], making them more continuous, while the normalization of OTA leaves them further interrupted and even harder to recognize as "line spectrums". The normalized noise level of the four schemes at each frequency bin is displayed in Figure 6. The comparison exhibits the lowest noise level for the proposed method, which is over 12 dB and 24 dB lower than OTA and the beam characteristics scan, respectively, at 30 Hz, while TPSW achieves only about 2 dB of suppression between 0∼500 Hz at the cost of an increased noise level at higher frequencies.
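A per-bin noise-level curve like the one in Figure 6 can be extracted from a normalized delay-frequency map with a simple estimator. The median-over-time statistic and the dB conversion below are illustrative assumptions, not the paper's stated metric; linear magnitudes are assumed on input.

```python
import numpy as np

def noise_level_db(z_map):
    """Per-frequency-bin noise-level estimate of a normalized
    delay-frequency map (rows = time, columns = frequency). The median
    over time is robust to the sparse line spectrums and interferences."""
    return 20.0 * np.log10(np.median(np.abs(z_map), axis=0) + 1e-12)
```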

Further Discussion
Firstly, the computational complexity of the algorithms is evaluated and listed in Table 1. Notice that these results only reflect the relative complexity of the approaches, which varies with the size of the data. The operations were run on a ThinkPad laptop with a Core i7-8565U CPU and 16 GB of RAM. The comparison indicates that OTA is the most efficient algorithm, costing only about one third of the time of the other three schemes. The proposed approach consumes almost the same time as the beam characteristics scan algorithm, and both are faster than TPSW.
On the other hand, the Remark in Section 3 points out that the proposed approach includes d = 3 peaks (valleys) to the left and right of a given valley (peak) to form a data vector for processing, in order to balance the normalized noise level against the computing time. Hence, we further consider the performance of these two indicators under different values of d, as shown in Figures 7 and 8. As explained before, d refers to the counting number of peaks/valleys from a given valley/peak in the proposed approach, where the first and last peaks/valleys determine a data vector for calculation. It is observed that the normalized noise achieves its overall lowest level at d = 3 and then rises again. The reason is that a narrow processing band may be unable to separate the target line spectrums from the background noise, while an overly wide processing band may include strong receiver self-noise and relatively weak target line spectrums at the same time, so the targets would be falsely suppressed, leading to erroneous normalization output (thus situations with d > 6 are not considered in this paper). However, the reduction in processing time slows down as d increases. Though the time consumption drops below that of the beam characteristics scan algorithm at d = 3, it is still more than double that of the OTA algorithm as d increases from 4 to 6. Therefore, in summary, d = 3 is adopted for the proposed approach.

Conclusions
This paper proposes an improved background normalization scheme, comprising an inverse beam characteristics scan algorithm built on the conventional beam characteristics scan method and a pointwise minimization operator that combines the two approaches. The proposed scheme achieves over 40 dB, 10 dB and 20 dB lower normalized noise levels (especially in the low frequency band 0∼100 Hz) than the existing TPSW, OTA, and beam characteristics scan algorithms, respectively, with computational complexity almost the same as (in fact slightly better than) the beam characteristics scan.
Future work may be directed toward evaluation on other kinds of practical data, such as sea trial data, and toward possible modifications of the presented approach.