Proceeding Paper

Forecasting of Signals by Forecasting Linear Recurrence Relations †

Faculty of Mathematics and Mechanics, St. Petersburg State University, 199034 St. Petersburg, Russia
* Author to whom correspondence should be addressed.
Presented at the 9th International Conference on Time Series and Forecasting, Gran Canaria, Spain, 12–14 July 2023.
Eng. Proc. 2023, 39(1), 12; https://doi.org/10.3390/engproc2023039012
Published: 28 June 2023
(This article belongs to the Proceedings of The 9th International Conference on Time Series and Forecasting)

Abstract

We consider the forecasting of a signal that locally satisfies linear recurrence relations (LRRs) with slowly changing coefficients. We propose a method that estimates the local LRRs by a subspace-based technique, forecasts their coefficients, and constructs the signal forecast using the LRR with the predicted coefficients. The method is implemented for time series in the form of a noisy sum of sine waves with modulated frequencies; linear and sinusoidal frequency modulations are considered. The application of the algorithm is demonstrated with numerical examples.

1. Introduction

Let us consider the problem of forecasting time series using singular spectrum analysis (SSA) [1,2,3,4,5,6]. The theory of SSA is quite well-developed; there are many papers with applications of SSA to real-life time series (see, for example, [6] and [7] (Section 1.7) with short reviews). SSA does not require a time-series model to construct the decomposition into interpretable components such as the trend, periodicities, and noise. However, for prediction in SSA, it is assumed that the signal (the deterministic component of the time series) $\mathsf{S}_N = (s_1, \ldots, s_N)$ satisfies some model, in particular, a linear recurrence relation (LRR) with constant coefficients (possibly only approximately):
$$s_{i+d} = \sum_{k=1}^{d} a_k\, s_{i+d-k} \quad \text{for } i = 1, \ldots, N-d. \tag{1}$$
This assumption is valid for signals in the form of a sum of products of polynomial, exponential, and sinusoidal time series, in particular, a sum of exponentially modulated periodic components. SSA also works for the case of trend extraction and the general case of amplitude-modulated harmonics, where the model is satisfied approximately. However, SSA is not applicable if the signal locally satisfies a changing LRR. An example of such a signal is sinusoidal frequency-modulated time series. This paper aims to construct a method for the prediction of time series locally governed by changing LRRs, staying within the framework of SSA.
Let us consider the model of time series in the form of a noisy signal, where the signal is locally governed by LRRs with slowly time-varying coefficients. A local version of SSA has already been considered earlier for signal estimation [8]. However, it results in different approximations of the segments of the time series and the prediction can be performed based on the last segment only. In this paper, a local modification of the recurrent SSA prediction, based on the construction of a prediction of the coefficients of the local LRRs, is proposed. This modification was applied to time series in which the signal is a sum of sinusoids with time-varying frequencies having non-intersecting frequency ranges, where the instantaneous frequency of each summand is slowly varying.

2. Basic Notions

2.1. Linear Recurrence Relations

Let us introduce several definitions. By a time series of length $N$ we mean a sequence of real numbers $\mathsf{X}_N = (x_1, \ldots, x_N) \in \mathbb{R}^N$. Consider a time series in the form of the sum of a signal and noise, $\mathsf{X}_N = \mathsf{S}_N + \mathsf{R}_N$, and state the problem of signal forecasting.
A time series $\mathsf{S}_N$ satisfies a linear recurrence relation (LRR) of order $d$ if there exists a sequence $\{a_i\}_{i=1}^{d}$ such that $a_d \neq 0$ and (1) takes place.
A time series governed by an LRR satisfies a whole set of LRRs, one of which has minimal order; we will call it the minimal LRR. Among the set of governing LRRs, there is the so-called min-norm LRR, whose coefficient vector has minimum norm; it suppresses noise in the best way [5] and is used for recurrent SSA forecasting.
The characteristic polynomial of LRR (1) is defined as
$$P_d(\mu) = \mu^d - \sum_{k=1}^{d} a_k\, \mu^{d-k}. \tag{2}$$
The roots of the characteristic polynomial of the minimal LRR are called signal roots. The characteristic polynomial corresponding to an LRR governing the signal includes the signal roots, among others.
Remark 1.
Since $P_m(\mu) = \prod_{k=1}^{m} (\mu - \mu_k) = \mu^m - \sum_{k=1}^{m} a_k \mu^{m-k}$, the roots of the characteristic polynomial provide the LRR coefficients, and vice versa.
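Remark 1 is easy to check numerically. Below is a minimal NumPy sketch (not part of the paper; the helper names are illustrative) that converts LRR coefficients to characteristic-polynomial roots and back:

```python
import numpy as np

# numpy stores P_d(mu) = mu^d - a_1 mu^{d-1} - ... - a_d as [1, -a_1, ..., -a_d]

def lrr_to_roots(a):
    """Roots of the characteristic polynomial of the LRR with coefficients a_1, ..., a_d."""
    return np.roots(np.concatenate(([1.0], -np.asarray(a, dtype=float))))

def roots_to_lrr(mu):
    """LRR coefficients a_1, ..., a_d recovered from the roots mu_1, ..., mu_d."""
    return -np.poly(mu)[1:]

# Round trip for the sinusoid LRR of order 2 with omega = 0.1 (see Example 1):
a = np.array([2 * np.cos(2 * np.pi * 0.1), -1.0])
mu = lrr_to_roots(a)                      # the conjugate pair e^{+-i 2 pi 0.1}
assert np.allclose(np.abs(mu), 1.0)
assert np.allclose(np.sort(roots_to_lrr(mu).real), np.sort(a))
```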
The following result [9], together with Remark 1, shows how to find the roots of the governing minimal LRR $s_{i+r} = \sum_{k=1}^{r} a_k s_{i+r-k}$ using the common term of the time series $\mathsf{S}_N$. Let $\mu_1, \ldots, \mu_p$ be the roots of the LRR characteristic polynomial with multiplicities $k_1, \ldots, k_p$. The time series $\mathsf{S}_N$ satisfies the LRR if and only if $s_n = \sum_{m=1}^{p} \sum_{j=0}^{k_m - 1} c_{mj}\, n^j \mu_m^n$, where the $c_{mj} \in \mathbb{C}$ depend on $s_1, \ldots, s_r$.
Example 1.
Consider the time series $\mathsf{S}_N = (s_1, \ldots, s_N)$ with $s_n = A \cos(2\pi\omega n + \phi)$, $\omega \in (0, \tfrac{1}{2})$. Since $A \cos(2\pi\omega n + \phi) = \tfrac{A}{2} e^{i\phi} e^{i 2\pi\omega n} + \tfrac{A}{2} e^{-i\phi} e^{-i 2\pi\omega n}$, we have $\mu_1 = e^{i 2\pi\omega}$, $\mu_2 = e^{-i 2\pi\omega}$. Therefore, the characteristic polynomial is $P_2(\mu) = (\mu - \mu_1)(\mu - \mu_2) = (\mu - e^{i 2\pi\omega})(\mu - e^{-i 2\pi\omega}) = \mu^2 - 2\mu \cos 2\pi\omega + 1$. Thus, $a_1 = 2\cos 2\pi\omega$, $a_2 = -1$ and $s_{i+2} = 2 s_{i+1} \cos 2\pi\omega - s_i$.
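The LRR from Example 1 can be verified directly; a small sanity check (illustrative, not from the paper):

```python
import numpy as np

# s_n = A cos(2 pi omega n + phi) must satisfy s_{i+2} = 2 cos(2 pi omega) s_{i+1} - s_i
A, omega, phi = 1.5, 0.07, 0.3
n = np.arange(1, 101)
s = A * np.cos(2 * np.pi * omega * n + phi)
lhs = s[2:]                                            # s_{i+2}
rhs = 2 * np.cos(2 * np.pi * omega) * s[1:-1] - s[:-2]
assert np.allclose(lhs, rhs)
```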
Remark 2.
The method for constructing the min-norm LRR of a given order with the given signal roots is described in [10]. This method will be used in the algorithm proposed in Section 3.2.

2.2. Harmonic Signal with Time-Varying Frequency: Instantaneous Frequency

In this paper, the basic form of signals will be the discrete-time version of
$$s(t) = \sum_{i=1}^{p} \cos\bigl(2\pi\,\omega_i(t) + \phi_i\bigr),$$
where $\omega_i(t)$, $i = 1, \ldots, p$, are slowly changing functions. Note that if the $\omega_i(t)$ are linear functions, the signal satisfies an LRR; see [4] (Section 2.2) and Remark 1.
Let $s(t) = \cos\bigl(2\pi\psi(t) + \phi\bigr)$. The instantaneous frequency of the signal $s(t)$ is defined as $\omega_{\mathrm{ins}}(t) = \psi'(t)$. The instantaneous period is the function $T_{\mathrm{ins}}(t) = 1/\omega_{\mathrm{ins}}(t)$. If $\omega_{\mathrm{ins}}(a) = 0$ for some $a$, we put $T_{\mathrm{ins}}(a) = +\infty$. The frequency range of the signal $\mathsf{S}_N$ is the range $\omega_{\mathrm{ins}}([1, N])$, that is, the image of $[1, N]$.
Example 2.
For the signal $s(t) = \cos\bigl(2\pi (t/100)^2\bigr)$, the instantaneous frequency $\omega_{\mathrm{ins}}(t) = t/5000$ is a linear function; the frequency range equals $\omega_{\mathrm{ins}}([1, N]) = \bigl[\tfrac{1}{5000}, \tfrac{N}{5000}\bigr]$. For the signal $s(t) = \cos\bigl(\tfrac{2\pi}{20}\bigl(t + 5 \sin\tfrac{2\pi t}{100}\bigr)\bigr)$, the instantaneous frequency equals $\omega_{\mathrm{ins}}(t) = \tfrac{1}{20}\bigl(1 + 5 \cdot \tfrac{2\pi}{100} \cos\tfrac{2\pi t}{100}\bigr)$ and is a periodic function with period $T = 100$; the frequency range is
$$\omega_{\mathrm{ins}}([1, N]) \subseteq \Bigl[\tfrac{1}{20}\Bigl(1 - \tfrac{\pi}{10}\Bigr),\ \tfrac{1}{20}\Bigl(1 + \tfrac{\pi}{10}\Bigr)\Bigr].$$
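The instantaneous frequencies in Example 2 can be checked numerically by differentiating the phase; a short sketch (illustrative, not from the paper):

```python
import numpy as np

t = np.arange(1.0, 301.0)

# Chirp cos(2 pi (t/100)^2): phase psi(t) = (t/100)^2, so omega_ins(t) = t/5000
w1 = np.gradient((t / 100) ** 2, t)
assert np.allclose(w1[1:-1], t[1:-1] / 5000)   # central differences are exact for quadratics

# cos((2 pi / 20)(t + 5 sin(2 pi t / 100))): psi(t) = (t + 5 sin(2 pi t / 100)) / 20
w2 = np.gradient((t + 5 * np.sin(2 * np.pi * t / 100)) / 20, t)
# the frequency range stays inside (1/20)(1 +- pi/10)
assert w2.max() <= (1 / 20) * (1 + np.pi / 10)
assert w2.min() >= (1 / 20) * (1 - np.pi / 10)
```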
We assume that on short segments of time, the signals considered in Example 2 are well-enough approximated by a sinusoid with a frequency equal to the instantaneous frequency in the middle of the segment. This assumption will be used for the construction of the signal-forecasting algorithm.

2.3. SSA, Signal Subspace and Recurrent SSA Forecasting

In the version for signal extraction, SSA has two parameters, the window length L, 1 < L < N , where N is the time-series length, and the number of elementary components r. At the first step of SSA, the trajectory matrix of size L × K , where K = N L + 1 , is constructed and then decomposed into elementary components using the SVD. The leading r SVD components are used for the estimation of the signal and the signal subspace basis, which is used for constructing the forecasting LRR. Here is the scheme of the method given in [6]:
$$\mathsf{X} \xrightarrow{\ \mathcal{T}\ } \mathbf{X} = \begin{pmatrix} x_1 & x_2 & \cdots & x_K \\ x_2 & x_3 & \cdots & x_{K+1} \\ \vdots & \vdots & \ddots & \vdots \\ x_L & x_{L+1} & \cdots & x_N \end{pmatrix} \xrightarrow{\ \mathrm{SVD}\ } (\sqrt{\lambda_m}, U_m, V_m),\ m = 1, \ldots, r;$$
$$\mathcal{L}_r = \operatorname{span}(U_1, \ldots, U_r) \text{ is the signal subspace;}\quad \Pi_r \text{ is the projector on } \mathcal{L}_r;\quad \widehat{\mathbf{S}} = \sum_{m=1}^{r} U_m (\mathbf{X}^{\mathrm{T}} U_m)^{\mathrm{T}} = \Pi_r \mathbf{X};$$
$$\widehat{\mathbf{S}} \xrightarrow{\ \Pi_{\mathcal{H}}\ } \widetilde{\mathbf{S}} = \begin{pmatrix} \tilde{s}_1 & \tilde{s}_2 & \cdots & \tilde{s}_K \\ \tilde{s}_2 & \tilde{s}_3 & \cdots & \tilde{s}_{K+1} \\ \vdots & \vdots & \ddots & \vdots \\ \tilde{s}_L & \tilde{s}_{L+1} & \cdots & \tilde{s}_N \end{pmatrix} \xrightarrow{\ \mathcal{T}^{-1}\ } \widetilde{\mathsf{S}}.$$
Thus, a concise form of the SSA algorithm for signal extraction is
$$\widetilde{\mathsf{S}} = \mathcal{T}^{-1} \circ \Pi_{\mathcal{H}} \circ \Pi_r \circ \mathcal{T}(\mathsf{X}),$$
where $\Pi_{\mathcal{H}}$ is the projector onto the set of Hankel matrices, that is, the set of trajectory matrices.
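The concise form above can be sketched in a few lines of NumPy (an illustrative reimplementation, not the authors' R code):

```python
import numpy as np

def ssa_signal(x, L, r):
    """Basic SSA signal estimate: apply T, project onto the r leading
    left singular vectors (Pi_r), hankelize (Pi_H) and apply T^{-1}."""
    x = np.asarray(x, dtype=float)
    N = len(x)
    K = N - L + 1
    X = np.column_stack([x[i:i + L] for i in range(K)])   # trajectory matrix, L x K
    U, sv, Vt = np.linalg.svd(X, full_matrices=False)
    S = U[:, :r] @ np.diag(sv[:r]) @ Vt[:r]               # rank-r approximation Pi_r X
    out = np.zeros(N)
    cnt = np.zeros(N)
    for j in range(K):                                    # diagonal averaging = Pi_H, then T^{-1}
        out[j:j + L] += S[:, j]
        cnt[j:j + L] += 1
    return out / cnt

rng = np.random.default_rng(0)
n = np.arange(1, 301)
signal = np.cos(2 * np.pi * 0.1 * n)
est = ssa_signal(signal + 0.25 * rng.standard_normal(300), L=150, r=2)
assert np.sqrt(np.mean((est - signal) ** 2)) < 0.1        # noise is strongly suppressed
```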
The $L$-rank of a time series is the rank of its trajectory matrix $\mathcal{T}(\mathsf{X})$ or, equivalently, the dimension of the column space of $\mathcal{T}(\mathsf{X})$. For infinite time series and $L > r$, the rank is equal to the order of the minimal LRR governing the time series. For example, the rank of the signal $\mathsf{S} = (s_1, s_2, \ldots)$ with the common term $s_n = A \cos(2\pi\omega n + \phi)$ and $\omega \in (0, \tfrac{1}{2})$ equals two.
The construction of the forecasting min-norm LRR of order $L - 1$ based on the SSA decomposition follows the formula for the LRR coefficients $\mathcal{R}$ [4] (Equation (2.1)):
$$\mathcal{R} = \frac{1}{1 - \nu^2} \sum_{i=1}^{r} \pi_i \underline{U_i} =: (a_{L-1}, \ldots, a_1)^{\mathrm{T}},$$
where $\underline{U_i} \in \mathbb{R}^{L-1}$ is the vector $U_i$ without its last coordinate, the last coordinate is denoted by $\pi_i$, and $\nu^2 = \sum_{i=1}^{r} \pi_i^2$. The forecast is constructed as $\tilde{s}_{N+1} = \sum_{i=1}^{L-1} a_i \tilde{s}_{N+1-i}$.
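The min-norm LRR formula and the recurrent forecast can be sketched as follows (illustrative NumPy, not from the paper); for a noiseless finite-rank series the continuation is exact:

```python
import numpy as np

def minnorm_lrr(U):
    """Min-norm LRR coefficients R = (a_{L-1}, ..., a_1) from an orthonormal
    basis U (L x r) of the signal subspace."""
    pi = U[-1, :]                        # last coordinates pi_i of the basis vectors
    nu2 = float(pi @ pi)                 # verticality coefficient nu^2
    return (U[:-1, :] @ pi) / (1.0 - nu2)

def rec_forecast(s, R, M):
    """Append M values by s_{N+1} = sum_{i=1}^{L-1} a_i s_{N+1-i}."""
    y = list(map(float, s))
    a = R[::-1]                          # reorder to (a_1, ..., a_{L-1})
    for _ in range(M):
        y.append(float(np.dot(a, y[-1:-len(a) - 1:-1])))
    return np.array(y)

# Exact continuation of a noiseless sinusoid (rank 2):
n = np.arange(1, 51)
s = np.cos(2 * np.pi * 0.1 * n)
L = 10
X = np.column_stack([s[i:i + L] for i in range(len(s) - L + 1)])
U = np.linalg.svd(X, full_matrices=False)[0][:, :2]
y = rec_forecast(s, minnorm_lrr(U), 5)
assert np.allclose(y, np.cos(2 * np.pi * 0.1 * np.arange(1, 56)), atol=1e-8)
```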
In addition to forecasting, the SSA decomposition allows one to estimate the signal roots. Let us describe ESPRIT (see [7] (Algorithm 3.3) and [11]) for signal-root estimation. Define $\mathbf{U} := [U_1 : \cdots : U_r]$, let $\overline{\mathbf{U}}$ be the matrix $\mathbf{U}$ without the first row and $\underline{\mathbf{U}}$ the matrix $\mathbf{U}$ without the last row. The ESPRIT estimates of the signal roots $\{\mu_k\}_{k=1}^{r}$ are the eigenvalues of a matrix $\mathbf{D}$ that is an approximate solution of the equation $\underline{\mathbf{U}} \mathbf{D} = \overline{\mathbf{U}}$; e.g., $\mathbf{D} = (\underline{\mathbf{U}}^{\mathrm{T}} \underline{\mathbf{U}})^{-1} \underline{\mathbf{U}}^{\mathrm{T}} \overline{\mathbf{U}}$ for the LS-ESPRIT version.
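LS-ESPRIT admits a compact implementation; a sketch for a real-valued series (illustrative, not the authors' code):

```python
import numpy as np

def ls_esprit(x, L, r):
    """LS-ESPRIT estimates of the r signal roots of the series x."""
    x = np.asarray(x, dtype=float)
    K = len(x) - L + 1
    X = np.column_stack([x[i:i + L] for i in range(K)])
    U = np.linalg.svd(X, full_matrices=False)[0][:, :r]   # signal-subspace basis
    # approximate solution D of U_underline D = U_overline (least squares)
    D = np.linalg.lstsq(U[:-1, :], U[1:, :], rcond=None)[0]
    return np.linalg.eigvals(D)

# For a pure sinusoid the roots are e^{+-i 2 pi omega} up to numerical error:
n = np.arange(1, 101)
mu = ls_esprit(np.cos(2 * np.pi * 0.1 * n), L=20, r=2)
assert np.allclose(np.abs(mu), 1.0, atol=1e-6)
assert np.allclose(np.sort(np.angle(mu)) / (2 * np.pi), [-0.1, 0.1], atol=1e-6)
```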

3. Signal Forecasting by Forecasting of Local LRRs

3.1. General Model of Signals

Let us describe a general model of time series to which the developed forecasting approach will be applied. Consider the signal $\mathsf{S}_N$ and take some natural $Z$, $1 < Z < N$. For a time series $\mathsf{Y}$, we denote by $\mathsf{Y}_{A,B}$ the series $(y_A, \ldots, y_B)$.
The signal model is
$$s_n = \sum_{j=1}^{p} \rho_j(n) \cos\bigl(2\pi \psi_j(n) + \varphi_j\bigr), \tag{4}$$
where we assume that
1. For the time series $\mathsf{S}$, on its sequential segments $\mathsf{S}_{i, i+Z-1}$ of length $Z$, $i = 1, \ldots, N - Z + 1$, every summand in (4) is well-approximated by a series of the form $s_n^{\mathrm{approx}} = \rho_j(n_0) \cos(2\pi \omega_{j,0}\, n + \phi)$, where $\omega_{j,0} = \psi_j'(n_0)$ and $n_0 = n_0(i)$ is the middle point of the segment.
2. The series $\rho_j(n)$ and $\psi_j(n)$ behave regularly in $n$, $0 < \psi_j'(n) < 0.5$, and there exist methods that can forecast series of this kind.
To construct a forecast $\tilde{s}_{N+1}$, one needs to find the instantaneous frequencies at the point $N + 1$.

3.2. Algorithm LocLRR SSA Forecast

Hereinafter, we will consider the model $\mathsf{X}_N = \mathsf{S}_N + \mathsf{R}_N$, where $\mathsf{S}_N = (s_1, \ldots, s_N)$ satisfies the conditions described above and $\mathsf{R}_N$ is white Gaussian noise with zero mean and standard deviation $\sigma$. Let $\mathsf{Z}^{(i)} = \mathsf{X}_{i, i+Z-1}$, $i = 1, \ldots, W$, where $W = N - Z + 1$. The estimates of instantaneous roots, frequencies, moduli and LRR coefficients will be enumerated according to the middle of the local segment. In particular, denote by $\{a_k^{(i)}\}_{k=1}^{r}$ the coefficients of the minimal LRR that approximates the local segment $\mathsf{S}_{i - Z/2,\, i + Z/2 - 1}$ of the series with the center at $s_i$.
The local segment of length $Z$ has the structure depicted in Figure 1, where the middle of $\mathsf{Z}^{(i_0)}$ is $N + 1$.

3.2.1. Scheme

In the scheme below, we consider $j = 1, \ldots, p$, where $p$ is the number of signal roots with nonnegative imaginary parts (note that signal roots with negative imaginary parts are conjugate to signal roots with positive imaginary parts):
$$\mathsf{X}_N \longrightarrow \{\mathsf{Z}^{(i)}\}_{i=1}^{W} \xrightarrow{\ \mathrm{ESPRIT};\ L,\, r\ } \{\mu_j(i + Z/2)\}_{i=1}^{W} \longrightarrow \begin{cases} \{\rho_j(i + Z/2)\}_{i=1}^{W} \xrightarrow{\ \mathrm{FOR\ MODs}\ } \{\tilde{\rho}_j(N + l)\}_{l=1}^{M} \\[4pt] \{\omega_j(i + Z/2)\}_{i=1}^{W} \xrightarrow{\ \mathrm{FOR\ ARGs}\ } \{\tilde{\omega}_j(N + l)\}_{l=1}^{M} \end{cases} \longrightarrow \{\tilde{\mu}_j(N + l)\}_{l=1}^{M} \longrightarrow \{\widetilde{\mathcal{R}}_{N+l}\}_{l=1}^{M}.$$
Here $\{\widetilde{\mathcal{R}}_{N+l}\}_{l=1}^{M}$, where $\widetilde{\mathcal{R}}_k = (\tilde{a}_r, \ldots, \tilde{a}_1)$, is the sequence of coefficients of the forecasting minimal LRRs. If the required length $m$ of the forecasting LRRs is larger than $r$, then we lengthen each $\widetilde{\mathcal{R}}_{N+l}$, $l = 1, \ldots, M$, to the coefficients $\widetilde{\mathcal{R}}_{N+l}^{(m)}$ of the min-norm LRR of order $m$; see Remark 2.
The result is a sequence of coefficients of the forecasting min-norm LRRs { R ˜ N + l ( m ) } l = 1 M of order m. To obtain the results, the series of moduli and frequencies should be forecasted using some algorithms FOR MODs and FOR ARGs for each j.
After the sequence of coefficients is constructed, the forecast values $\widetilde{\mathsf{S}}_{N+1, N+M} = (y_{N+1}, \ldots, y_{N+M})$ are taken from the time series
$$y_n = \begin{cases} \tilde{s}_n, & n = 1, \ldots, N, \\ \sum_{k=1}^{m} \tilde{a}_k^{(n)}\, y_{n-k}, & n = N + 1, \ldots, N + M, \end{cases}$$
where $\tilde{s}_n$, $n = 1, \ldots, N$, is the signal estimate obtained using SSA.
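The continuation defined above, with a possibly different LRR at each future step, can be sketched as follows (illustrative; the convention that each coefficient vector is ordered $(\tilde{a}_1, \ldots, \tilde{a}_m)$ is an assumption):

```python
import numpy as np

def forecast_varying_lrr(s_est, coeff_seq):
    """Continue s_est by y_n = sum_k a_k^{(n)} y_{n-k}, applying its own
    coefficient vector (a_1, ..., a_m) at every forecast step."""
    y = list(map(float, s_est))
    for a in coeff_seq:
        y.append(float(np.dot(a, y[-1:-len(a) - 1:-1])))
    return np.array(y)

# Sanity check: with constant coefficients this reduces to ordinary LRR forecasting.
omega = 0.1
n = np.arange(1, 51)
s = np.cos(2 * np.pi * omega * n)
a = [2 * np.cos(2 * np.pi * omega), -1.0]          # the LRR of Example 1
y = forecast_varying_lrr(s, [a] * 10)
assert np.allclose(y, np.cos(2 * np.pi * omega * np.arange(1, 61)))
```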

3.2.2. Algorithm in Detail

Let us formally describe the algorithm for forecasting the minimal LRRs (Algorithm 1).
Algorithm 1: LocLRR SSA Forecast
Input:
  • time series $\mathsf{X}_N = (x_1, \ldots, x_N)$,
  • forecast length M,
  • local segment length Z,
  • window length L,
  • number r of leading eigentriples of the trajectory matrices of the local segments that are used to find the estimates of the signal roots,
  • length m of the forecasting LRRs,
  • algorithm for forecasting the root moduli FOR MODs ,
  • algorithm for forecasting the instantaneous frequencies FOR ARGs .
Steps:
  • For each segment $\mathsf{Z}^{(i)}$, $i = 1, \ldots, W$, where $W = N - Z + 1$, estimate the signal roots $\{\mu_j(i + Z/2)\}_{j=1}^{r}$ of the approximating series using ESPRIT with window length $L$ and signal rank $r$. Suppose that the estimates of the roots form complex-conjugate pairs. Choose the roots with positive argument $\{\mu_j(i + Z/2)\}_{j=1}^{p}$, $i = 1, \ldots, W$, $p = r/2$.
  • Arrange the first set of roots $\{\mu_j(1 + Z/2)\}_{j=1}^{p}$ in descending argument order.
  • Order the sets of roots $\{\mu_j(i + Z/2)\}_{j=1}^{p}$, $i = 2, \ldots, W$, so that the sum
    $$\sum_{j=1}^{p} \bigl| \mu_j(i + Z/2) - \mu_j(i + Z/2 - 1) \bigr|$$
    is minimal among all possible permutations of $\{\mu_j(i + Z/2)\}_{j=1}^{p}$.
  • For each $i = 1, \ldots, W$ and $j = 1, \ldots, p$, calculate the moduli $\rho_j(i + Z/2) = |\mu_j(i + Z/2)|$ and the frequencies $\omega_j(i + Z/2) = \operatorname{Arg}\bigl(\mu_j(i + Z/2)\bigr)/(2\pi)$. For $j = 1, \ldots, p$, set the series $\mathsf{P}^{(j)} = \bigl(\rho_j(1 + Z/2), \ldots, \rho_j(N - Z/2 + 1)\bigr)$ and $\Omega^{(j)} = \bigl(\omega_j(1 + Z/2), \ldots, \omega_j(N - Z/2 + 1)\bigr)$.
  • Using the algorithms FOR MODs and FOR ARGs, for each $j = 1, \ldots, p$, construct the forecast $\tilde{\rho}_j(N+1), \ldots, \tilde{\rho}_j(N+M)$ of the series $\mathsf{P}^{(j)}$ and the forecast $\tilde{\omega}_j(N+1), \ldots, \tilde{\omega}_j(N+M)$ of $\Omega^{(j)}$, respectively.
  • Using the obtained forecasts of frequencies and moduli, calculate the roots $\tilde{\mu}_j(n) = \tilde{\rho}_j(n) \exp\bigl(\mathrm{i}\, 2\pi \tilde{\omega}_j(n)\bigr)$, $n = N+1, \ldots, N+M$, $j = 1, \ldots, p$, supplement them by their complex conjugates and then, using the relation between characteristic polynomials and LRRs (see Remark 1), find the sequence $\{\widetilde{\mathcal{R}}_{N+j}\}_{j=1}^{M}$ of the LRR coefficients, $\widetilde{\mathcal{R}}_{N+j} = \bigl(\tilde{a}_r(N+j), \ldots, \tilde{a}_1(N+j)\bigr)$.
Output: The sequence $\{\widetilde{\mathcal{R}}_{N+l}\}_{l=1}^{M}$ of coefficients of minimal LRRs of order $r$ approximately governing the future signal segments $\mathsf{S}_{l + N - Z/2 - 1,\, l + N + Z/2}$.
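The root-matching step of Algorithm 1 (minimizing the sum of distances over permutations) can be sketched by brute force, which is adequate for small $p$ (illustrative, not the authors' implementation):

```python
import numpy as np
from itertools import permutations

def match_roots(prev, cur):
    """Reorder cur so that sum_j |cur[j] - prev[j]| is minimal over all
    permutations, keeping root trajectories consistent across segments."""
    idx = min(permutations(range(len(cur))),
              key=lambda p: sum(abs(cur[j] - prev[k]) for k, j in enumerate(p)))
    return np.array([cur[j] for j in idx])

prev = np.exp(2j * np.pi * np.array([0.30, 0.10]))
cur = np.exp(2j * np.pi * np.array([0.11, 0.31]))   # arrived in swapped order
matched = match_roots(prev, cur)
assert np.allclose(matched, np.exp(2j * np.pi * np.array([0.31, 0.11])))
```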
Remark 3.
If real-valued roots are obtained on some segments $\mathsf{Z}^{(i)}$, $i = 1, \ldots, W$, we replace the values of these roots with missing values. That is, for $i = 1, \ldots, W$ and $j = 1, \ldots, r$ such that $\mu_j(i) \in \mathbb{R}$, we put $\mu_j(i) := \mathrm{NA}$ (not available). Possible gaps in the series of frequency estimates can then be filled in; e.g., one can fill them with the iterative gap-filling method [12]; see also the description of the igapfill algorithm in [7] (Algorithm 3.7).
An appropriate choice of the algorithms FOR MODs and FOR ARGs depends on the form of the frequency modulation.
Finally, each LRR of $\{\widetilde{\mathcal{R}}_{N+j}\}_{j=1}^{M}$ is enlarged to a min-norm LRR of length $m$ with coefficients $\{\widetilde{\mathcal{R}}_{N+l}^{(m)}\}_{l=1}^{M}$ (see Remark 2), and this enlarged LRR is used for the prediction of $s_{N+l}$.

4. Examples

4.1. Description

In this section, we demonstrate the forecasting using the LocLRR SSA Forecast algorithm. The following types of time series were considered.

4.1.1. Sinusoid with Linearly Modulated Frequency

The signal has the form
$$s(t) = \cos\bigl(2\pi (a t)^2\bigr), \quad a \neq 0.$$
The instantaneous frequency is $\omega_{\mathrm{ins}}(t) = 2 a^2 t$; the frequency range is $\omega_{\mathrm{ins}}([1, N]) = [2a^2,\ 2a^2 N]$. We will consider values of the parameter $a$ and series lengths $N$ at which the frequency range satisfies $\omega_{\mathrm{ins}}([1, N]) \subset (0, 0.5)$. Since the instantaneous frequency is a linear function, we take the linear-regression prediction algorithm as FOR ARGs.

4.1.2. Sinusoid with Sinusoidal Frequency

The signal is
$$s(t) = \cos\bigl(2\pi \omega_{\mathrm{ext}} (t + b \sin 2\pi \omega_{\mathrm{int}} t)\bigr), \quad b > 0,$$
$\omega_{\mathrm{ext}}, \omega_{\mathrm{int}} \in (0, 0.5)$, where $\omega_{\mathrm{int}}$ is much smaller than $\omega_{\mathrm{ext}}$. The instantaneous frequency equals $\omega_{\mathrm{ins}}(t) = \omega_{\mathrm{ext}} (1 + 2\pi \omega_{\mathrm{int}} b \cos 2\pi \omega_{\mathrm{int}} t)$; the frequency range is $\omega_{\mathrm{ins}}([1, N]) = \bigl[\omega_{\mathrm{ext}}(1 - 2\pi \omega_{\mathrm{int}} b),\ \omega_{\mathrm{ext}}(1 + 2\pi \omega_{\mathrm{int}} b)\bigr]$. We will consider values of the signal parameters and series lengths such that $\omega_{\mathrm{ins}}([1, N]) \subset (0, 0.5)$. The rank of the series $\Omega_N$ of instantaneous frequencies with terms $\omega_n = \omega_{\mathrm{ins}}(n)$, $n = 1, \ldots, N$, is equal to 3 (a constant plus a sinusoid). Therefore, we take the recurrent SSA forecasting algorithm with $r = 3$ as FOR ARGs; the window length $L$ is chosen to be half the length of the frequency-estimation series, following the general recommendations.
Since we do not consider time-varying amplitude modulation in the examples, we take the forecast of moduli series using the average value over local intervals as algorithm FOR MODs .

4.1.3. Sum of Sinusoids

The signal is
$$s(t) = \sum_{j=1}^{p_1} \cos\bigl(2\pi (a_j t)^2\bigr) + \sum_{k=1}^{p_2} \cos\bigl(2\pi \omega_k^{\mathrm{ext}} (t + b_k \sin 2\pi \omega_k^{\mathrm{int}} t)\bigr),$$
$a_j \neq 0$, $j = 1, \ldots, p_1$; $\omega_k^{\mathrm{ext}}, \omega_k^{\mathrm{int}} \in (0, 0.5)$, $b_k > 0$, $k = 1, \ldots, p_2$. We will consider signal parameters and time-series lengths $N$ such that the frequency ranges of the summands do not intersect and the frequency range of each summand belongs to the interval $(0, 0.5)$.
The frequencies corresponding to the signal summands of s ( t ) will be forecasted either with linear regression or with SSA, based on the type of obtained estimates of the instantaneous frequencies on the local segments. The moduli will be predicted using the average of the estimates over the local segments.
In the numerical examples, we will consider signal parameters such that the instantaneous frequencies are slowly varying functions. Since the examples under consideration satisfy the general time-series model (Section 3.1), the LocLRR SSA Forecast algorithm can be used for forecasting.
In real-life problems, the value of the optimal LRR order m can be chosen based on the training data. In the model examples, the value of the optimal LRR order m will be chosen by trying all possible values of m in the range from r to Z and comparing the mean squared errors (MSE) of the predictions.
The following approach was used for choosing the length $Z$ of the local segments. We take a value of $Z$ such that most of the local segments $\mathsf{Z}^{(i)}$, $i = 1, \ldots, W$, contain at least 2–3 instantaneous periods of each summand of the signal satisfying model (5). For small $Z$, we obtain estimates of the instantaneous frequency with large variability, whereas for large $Z$, we have a considerable bias. A necessary condition for appropriate values of $Z$ is that there are no (or few) segments producing real-valued roots.

4.2. Numerical Experiments

Consider the following numerical examples:
  • Sinusoid with linearly modulated frequency,
    $$s_n = \cos\Bigl(2\pi \Bigl(\frac{n}{100}\Bigr)^2\Bigr)$$
    (denoted by $\cos(n^2)$);
  • Sinusoid with sinusoidal frequency modulation,
    $$s_n = \cos\Bigl(\frac{2\pi}{20}\Bigl(n + 5 \sin\frac{2\pi n}{100}\Bigr)\Bigr)$$
    (denoted by $\cos\sin(n)$);
  • Sum of sinusoids with linear and sinusoidal frequency modulations,
    $$s_n = \cos\Bigl(2\pi \Bigl(\frac{n}{100}\Bigr)^2\Bigr) + \cos\Bigl(\frac{2\pi}{10}\Bigl(n + \sin\frac{2\pi n}{100}\Bigr)\Bigr)$$
    (denoted by $\cos(n^2) + \cos\sin(n)$);
  • Sum of two sinusoids with sinusoidal frequency modulation,
    $$s_n = \cos\Bigl(\frac{2\pi}{20}\Bigl(n + \sin\frac{2\pi n}{100}\Bigr)\Bigr) + \cos\Bigl(\frac{2\pi}{10}\Bigl(n + 2 \sin\frac{2\pi n}{140}\Bigr)\Bigr)$$
    (denoted by $\cos\sin(n) + \cos\sin(n)$).
We compare the proposed LocLRR SSA-forecast algorithm (denoted by “alg”) with two simple methods:
  • Forecasting by constant, which forecasts by zero, since we consider time series with zero average (denoted by ‘by 0’).
  • Forecasting using the last local segment, which is performed with the min-norm LRR computed using the roots of the last local segment Z ( N Z + 1 ) = ( x N Z + 1 , , x N ) (denoted by ‘last’).
Let us consider the accuracy of $M = 30$-step-ahead forecasts for time series of length $N = 300$. For each example, the prediction is performed for the pure signal and for the noisy signal, where the noise is white Gaussian with standard deviation $\sigma = 0.25$. In the noisy case, a sample of size $P = 100$ is used for the estimation of accuracy.
Let $\{\mathsf{X}_N^{(i)}\}_{i=1}^{P}$ be the time-series sample. The prediction error is estimated as
$$\mathrm{RMSE} = \sqrt{\frac{1}{P} \sum_{i=1}^{P} \mathrm{MSE}\bigl(\widetilde{\mathsf{S}}_{N+1, N+M}^{(i)},\ \mathsf{S}_{N+1, N+M}\bigr)}.$$
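The accuracy measure is straightforward to compute; a direct transcription of the formula (the array shapes are assumptions):

```python
import numpy as np

def rmse_over_sample(forecasts, truth):
    """sqrt of the sample average of MSE(forecast_i, truth).
    forecasts: array of shape (P, M); truth: array of shape (M,)."""
    forecasts = np.asarray(forecasts, dtype=float)
    truth = np.asarray(truth, dtype=float)
    mse = np.mean((forecasts - truth) ** 2, axis=1)   # MSE of each sample member
    return float(np.sqrt(np.mean(mse)))

assert rmse_over_sample([[1.0, 1.0], [3.0, 3.0]], [2.0, 2.0]) == 1.0
```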
The results are shown in Table 1, where the best results are highlighted in bold. They confirm the advantage of the proposed method over the simple methods under consideration.

4.3. Detailed Example

To demonstrate the approach more clearly, let us consider the example (6) without noise; see Figure 2. Take $Z = 61$, $L = 30$, $r = 4$.
The results of forecasting the signal root corresponding to the first summand are shown in Figure 3, and the results of forecasting the signal root corresponding to the second summand can be seen in Figure 4. The frequency ranges of modulations in the summands are approximately [ 0 , 0.06 ] and [ 0.094 , 0.106 ] , respectively. The forecasts are depicted in Figure 5 (forecasting with the last segment) and Figure 6 (forecasting with the proposed algorithm). Since there is no noise, the optimal LRR length m is small; here, m = 4 .

5. Conclusions

In this paper, we proposed a method for forecasting time series that extends the capabilities of SSA and allows one to predict time series in which the signal only locally satisfies LRRs. Regular behaviour of the coefficients of the governing LRRs was assumed. In [6] (page 9), it was stated for the considered type of time series that "[t]he problem is how to forecast the extracted signal, since its local estimates may have different structures on different time intervals. Indeed, by using local versions of SSA, we do not obtain a common nonlinear model but instead we have a set of local linear models". In this paper, we proposed an answer to the problem of prediction of local structures of time series for some class of signals.
We constructed an algorithm for predicting local structures of time series that are the sum of frequency-modulated sinusoids and showed that the proposed forecasting method gives reasonable results for the cases of linear and sinusoidal frequency modulations.
Certainly, the considered comparison with a couple of simple methods is not enough; a more extensive comparison should be performed in the future. However, the results of this work show that the proposed approach based on the prediction of the coefficients of LRRs is promising.

Author Contributions

Conceptualization, N.G.; methodology, N.G. and E.S.; software, E.S.; validation, N.G. and E.S.; formal analysis, N.G. and E.S.; investigation, N.G. and E.S.; resources, N.G.; data curation, N.G. and E.S.; writing—original draft preparation, N.G. and E.S.; writing—review and editing, N.G. and E.S.; visualization, N.G. and E.S.; supervision, N.G.; project administration, N.G.; funding acquisition, N.G. All authors have read and agreed to the published version of the manuscript.

Funding

The work is supported by the Russian Science Foundation (project No. 23-21-00222).

Institutional Review Board Statement

Not applicable.

Informed Consent Statement

Not applicable.

Data Availability Statement

The data presented in this study are openly available. The R-code for data generation and result replication can be found at https://zenodo.org/record/8087608, doi 10.5281/zenodo.8087608.

Conflicts of Interest

The authors declare no conflict of interest.

References

  1. Broomhead, D.; King, G. Extracting qualitative dynamics from experimental data. Physica D 1986, 20, 217–236.
  2. Vautard, R.; Ghil, M. Singular spectrum analysis in nonlinear dynamics, with applications to paleoclimatic time series. Physica D 1989, 35, 395–424.
  3. Elsner, J.B.; Tsonis, A.A. Singular Spectrum Analysis: A New Tool in Time Series Analysis; Plenum Press: New York, NY, USA, 1996.
  4. Golyandina, N.; Nekrutkin, V.; Zhigljavsky, A. Analysis of Time Series Structure: SSA and Related Techniques; Chapman & Hall/CRC: Boca Raton, FL, USA, 2001.
  5. Golyandina, N.; Zhigljavsky, A. Singular Spectrum Analysis for Time Series, 2nd ed.; Springer Briefs in Statistics; Springer: Berlin/Heidelberg, Germany, 2020.
  6. Golyandina, N. Particularities and commonalities of singular spectrum analysis as a method of time series analysis and signal processing. Wiley Interdiscip. Rev. Comput. Stat. 2020, 12, e1487.
  7. Golyandina, N.; Korobeynikov, A.; Zhigljavsky, A. Singular Spectrum Analysis with R; Springer: Berlin/Heidelberg, Germany, 2018.
  8. Leles, M.; Sansão, J.; Mozelli, L.; Guimarães, H. Improving reconstruction of time-series based in Singular Spectrum Analysis: A segmentation approach. Digit. Signal Process. 2018, 77, 63–76.
  9. Hall, M.J. Combinatorial Theory; Wiley: New York, NY, USA, 1998.
  10. Usevich, K. On signal and extraneous roots in Singular Spectrum Analysis. Stat. Interface 2010, 3, 281–295.
  11. Roy, R.; Kailath, T. ESPRIT-estimation of signal parameters via rotational invariance techniques. IEEE Trans. Acoust. Speech Signal Process. 1989, 37, 984–995.
  12. Kondrashov, D.; Ghil, M. Spatio-temporal filling of missing points in geophysical data sets. Nonlinear Process. Geophys. 2006, 13, 151–159.
Figure 1. Scheme of moving segments.
Figure 2. Initial signal $s_n = \cos\bigl(2\pi (n/100)^2\bigr) + \cos\bigl(\frac{2\pi}{10}\bigl(n + \sin\frac{2\pi n}{100}\bigr)\bigr)$.
Figure 3. Forecasting the series of linear instantaneous frequencies for the summand $\cos\bigl(2\pi (n/100)^2\bigr)$.
Figure 4. Forecasting the series of sinusoidal instantaneous frequencies for the summand $\cos\bigl(\frac{2\pi}{10}\bigl(n + \sin\frac{2\pi n}{100}\bigr)\bigr)$.
Figure 5. Forecasting using the last local segment, RMSE = 0.88. A shift is clearly seen.
Figure 6. Forecasting using LocLRR SSA Forecast, RMSE = 0.2. There is no shift.
Table 1. RMSE of forecasts; 'alg' is the proposed algorithm; $m$ is the optimal length of the forecasting LRRs.

| Signal $\mathsf{S}_N$ | $\sigma$ | by 0 (RMSE) | last (RMSE) | last ($m$) | alg (RMSE) | alg ($m$) |
|---|---|---|---|---|---|---|
| $\cos(n^2)$ | 0 | 0.689 | 0.717 | 2 | 0.014 | 3 |
| | 0.25 | 0.733 | 0.754 | 5 | 0.135 | 11 |
| $\cos\sin(n)$ | 0 | 0.698 | 0.309 | 2 | 0.097 | 2 |
| | 0.25 | 0.741 | 0.438 | 5 | 0.232 | 6 |
| $\cos(n^2) + \cos\sin(n)$ | 0 | 1.060 | 0.880 | 6 | 0.184 | 4 |
| | 0.25 | 1.089 | 0.958 | 10 | 0.295 | 12 |
| $\cos\sin(n) + \cos\sin(n)$ | 0 | 0.873 | 0.587 | 4 | 0.191 | 5 |
| | 0.25 | 0.908 | 0.656 | 28 | 0.291 | 15 |

