# Boscovich Fuzzy Regression Line

by Pavel Škrabánek 1,*, Jaroslav Marek 2 and Alena Pozdílková 2

1 Institute of Automation and Computer Science, Brno University of Technology, Technická 2896/2, 616 69 Brno, Czech Republic
2 Department of Mathematics and Physics, University of Pardubice, Studentská 95, 532 10 Pardubice, Czech Republic
* Author to whom correspondence should be addressed.
Mathematics 2021, 9(6), 685; https://doi.org/10.3390/math9060685
Submission received: 27 February 2021 / Revised: 14 March 2021 / Accepted: 19 March 2021 / Published: 23 March 2021
(This article belongs to the Special Issue Recent Advances in Applications of Fuzzy Logic and Soft Computing)

## Abstract

We introduce a new fuzzy linear regression method. The method approximates a fuzzy relationship between an independent and a dependent variable, where the independent variable is a real number and the dependent variable is a triangular fuzzy number. We demonstrate on twenty datasets that the method is reliable and less sensitive to outliers than possibilistic-based fuzzy regression methods. Unlike other commonly used fuzzy regression methods, the presented method is simple to implement and has linear time complexity. The method also guarantees non-negativity of the model parameter spreads.

## 1. Introduction

A regression model approximates the functional relationship between a dependent variable y and independent variables x. The parameters of a regression model are estimated using a set of observations of x and y. The model with the estimated parameters can then be used to predict the dependent variable value for a specific combination of the independent variables. In practice, statistical regression models are most often used for this purpose, but their usage is limited by the assumption that any deviation of a prediction from a corresponding observation is due to a random error.
In many practical applications, the deviations are a result of imprecise observations, an indefiniteness of the system structure and parameters [1,2], or the vagueness of human perception of the model (in contrast with statistical regression, where the errors are associated with observations) [3]. There are also cases where the observations are inherently fuzzy, e.g., if the observations are described by linguistic terms [4,5,6]. In such cases, the deviations are not due to randomness but due to fuzziness, and fuzzy regression should be used. Fuzzy regression can also be used when statistical distributional assumptions cannot be justified, or if the representation of the regression model is poor [3]. Fuzzy regression is an efficient alternative to statistical regression whenever a dataset is insufficient to support statistical regression analysis [7].
The fuzzy regression approximates the relationship between the dependent y and the independent variables x using a fuzzy regression model. Once the underlying regression relationship is known, an appropriate parametric fuzzy regression model (e.g., linear [8], polynomial [9], or logistic [10]) can be used for the approximation of the relationship. If the relationship is unknown, a nonparametric fuzzy regression model can be utilized, such as models based on k-nearest neighbour smoothing [11], kernel smoothing [11], local linear smoothing [12], or adaptive neuro-fuzzy inference systems [13].
In fuzzy regression analysis, the relationship is typically approximated as a linear dependence by a model with all fuzzy parameters [2,14,15,16,17,18,19,20,21,22,23,24], but other fuzzy linear models are used as well [8]. For example, the model with all fuzzy parameters can be extended with a fuzzy error term [22], or some of the parameters can be real numbers [8]. Models with all fuzzy parameters are known for a dependence of the model prediction spreads on the absolute values of x [8,23]. In the case that all parameters, except the y-intercept or the fuzzy error term, are real numbers, the model prediction spreads are constant over the whole range of x values [8].
For estimation of unknown parameters of a fuzzy model, possibilistic and statistics-based approaches are frequently used. Possibilistic fuzzy estimators minimize a total spread of the fuzzy model predictions subject to constraints that arise from observations [1,3,15,16,19,21,25,26]. The statistics-based solutions adopt concepts that are used in statistics such as the ordinary least squares method [17,18,20,27,28,29,30,31,32,33], the least absolute deviations method [22,23,24,34,35,36,37], and adaptive smoothing methods [11,12]. Some fuzzy estimators combine the possibilistic and the statistics-based approaches [38].
Several limitations have been observed for some fuzzy estimators. As pointed out by many authors [22,24,39,40,41], numerous fuzzy estimators are sensitive to outliers. This criticism resulted in a number of fuzzy estimators that are more or less resilient to outliers [3,11,12,24,26,41,42,43,44]. A serious issue of some estimators is that they do not guarantee non-negativity of spreads [35,41,45,46]. A certain disadvantage is also the computational complexity of the estimators. Depending on the fuzzy estimator used, matrix operations [18], linear programming [1,3,15,16,19,21,24,44], quadratic programming [26,27,32,38], or a general constrained optimization problem [25,29,30,31,33] must be solved to obtain estimates of the unknown parameters. Some methods employ customized iterative optimization algorithms [20,41].
Herein, we propose a new fuzzy estimator intended for a simple fuzzy regression model with a real-valued independent and a fuzzy dependent variable. We based the estimator on the Boscovich regression line [47,48]; hence, we named it the Boscovich fuzzy regression line. The original Boscovich regression line was a pioneering regression method based on criterion minimization, i.e., it is a predecessor of today's statistical regression methods. The method was characterized by extremely low computing demands, and the presented Boscovich fuzzy regression line inherits this property. Moreover, the used fuzzy linear model guarantees non-negativity of the model parameter spreads by its nature, and the spreads of model predictions are not influenced by the independent variable values. Despite its simplicity, the presented fuzzy estimator provides regression models of a quality comparable to other fuzzy regression methods.

## 2. Materials and Methods

The Boscovich regression method [48] was designed for the simple theoretical linear regression model
$y = \beta_0 + \beta_1 x, \qquad (1)$
where $\beta_0$ and $\beta_1$ are unknown parameters of the model. The estimation of $\beta_0, \beta_1$ is based on n imprecise observations $(x_i, y_i)$, where $i \in I$ and $I = \{1, 2, \dots, n\}$. The estimator introduced by Boscovich was based on two constraints [49]:
(1)
the sum of the positive and negative residuals (in the sense of y axis) shall be equal,
(2)
the sum of the absolute values of all the residuals shall be as small as possible.
The constraint (1) implies that the regression line should pass through the centroid $(\bar{x}, \bar{y})$ formed by the observations, i.e.,
$\bar{y} = \hat{\beta}_0^* + \hat{\beta}_1^* \bar{x}, \qquad (2)$
where $\hat{\beta}_0^*, \hat{\beta}_1^*$ are the best estimates of the unknown regression coefficients $\beta_0, \beta_1$ with respect to the constraints (1) and (2). As follows from the constraint (1), the coordinates of the centroid can be expressed as
$\bar{x} = \frac{1}{n} \sum_{i=1}^{n} x_i, \qquad \bar{y} = \frac{1}{n} \sum_{i=1}^{n} y_i. \qquad (3)$
Thus, the best estimate of the unknown regression coefficient $\beta_0$ can be expressed on the basis of (2) as
$\hat{\beta}_0^* = \bar{y} - \hat{\beta}_1^* \bar{x}. \qquad (4)$
Naturally, one point, $(\bar{x}, \bar{y})$, is insufficient to form a line; at least one other point is needed. Boscovich suggested using one of the observations as the second point of the regression line. Since n observations are available, n regression lines can be constructed in this way. The k-th regression line is determined by
$L_k: \; y_k = \bar{y} + \hat{\beta}_1^{(k)} \Delta x_k, \qquad (5)$
where $\hat{\beta}_1^{(k)}$ is the k-th estimate of $\beta_1$ based on the k-th observation $(x_k, y_k)$, and $\Delta x_k = x_k - \bar{x}$. On the basis of (5), the k-th estimate of $\beta_1$ can be expressed as
$\hat{\beta}_1^{(k)} = \Delta x_k^{-1} (y_k - \bar{y}). \qquad (6)$
Altogether, n estimates of $\beta_1$ are obtained in this way. The selection of the best one can be carried out using an evaluation function J based on the constraint (2), i.e., for the k-th regression line, it holds that
$J(k) = \sum_{i=1}^{n} \left| y_i - \hat{y}_i^{(k)} \right| = \sum_{i=1}^{n} \left| y_i - \bar{y} - \hat{\beta}_1^{(k)} \Delta x_i \right|, \qquad (7)$
where $\hat{y}_i^{(k)}$ is the i-th prediction of the dependent variable y using the k-th regression model (5).
Boscovich expressed a premise that one of the n regression lines is the best approximation of the model (1). Considering constraint (2), the best estimate of the unknown regression coefficient $\beta_1$ is given by
$\hat{\beta}_1^* = \underset{\hat{\beta}_1^{(k)}, \, \forall k \in I}{\arg\min} \sum_{i=1}^{n} \left| y_i - \bar{y} - \hat{\beta}_1^{(k)} \Delta x_i \right|. \qquad (8)$
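The classical Boscovich estimator described above can be sketched in a few lines of Python (a minimal sketch assuming NumPy; the function name `boscovich_line` is ours, introduced only for illustration):

```python
import numpy as np

def boscovich_line(x, y):
    """Estimate the Boscovich regression line y = b0 + b1*x.

    Each observation (x_k, y_k) together with the centroid defines a
    candidate slope; the slope minimizing the sum of absolute residuals
    (constraint (2)) is selected, and the intercept follows from the
    centroid condition (constraint (1)).
    """
    x, y = np.asarray(x, float), np.asarray(y, float)
    x_bar, y_bar = x.mean(), y.mean()          # centroid of the observations
    dx, dy = x - x_bar, y - y_bar
    mask = dx != 0                             # points at x_bar define no slope
    slopes = dy[mask] / dx[mask]               # candidate slopes
    costs = [np.abs(dy - b1 * dx).sum() for b1 in slopes]  # sum of |residuals|
    b1 = float(slopes[int(np.argmin(costs))])  # best slope
    b0 = y_bar - b1 * x_bar                    # intercept from the centroid
    return b0, b1
```

For noiseless collinear data, e.g. `boscovich_line([0, 1, 2, 3], [1, 3, 5, 7])`, the estimator recovers the exact line, here intercept 1 and slope 2.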

## 3. Fuzzy Set Preliminaries

Definition 1.
A fuzzy subset $\tilde{A}$ defined on $\mathbb{R}$ with membership function $\mu_{\tilde{A}}: \mathbb{R} \to [0, 1]$ is called a fuzzy number if
(a) $\mu_{\tilde{A}}(x)$ is normal, i.e., $\exists x_0 \in \mathbb{R}$ with $\mu_{\tilde{A}}(x_0) = 1$,
(b) $\mu_{\tilde{A}}(x)$ is fuzzy convex,
(c) $\mu_{\tilde{A}}(x)$ is upper semi-continuous on $\mathbb{R}$,
(d) $\mu_{\tilde{A}}(x)$ is compactly supported, i.e., $\mathrm{cl}\{x \in \mathbb{R} ; \mu_{\tilde{A}}(x) > 0\}$ is compact, where $\mathrm{cl}\,A$ denotes the closure of the set A.
Definition 2.
Let L and R be continuous decreasing functions $L, R: [0, +\infty) \to [0, 1]$ fulfilling $L(0) = R(0) = 1$ and $L(1) = R(1) = 0$, invertible on $[0, 1]$. Moreover, let $m_{\tilde{A}}^L, m_{\tilde{A}}^R \in \mathbb{R}$, where $m_{\tilde{A}}^L \le m_{\tilde{A}}^R$, and $\alpha_{\tilde{A}}, \beta_{\tilde{A}} \in \mathbb{R}^+$; then a fuzzy set $\tilde{A}$ is said to be an L-R type fuzzy number if its membership function is
$\mu_{\tilde{A}}(x) = \begin{cases} L\!\left(\frac{m_{\tilde{A}}^L - x}{\alpha_{\tilde{A}}}\right), & \text{for } m_{\tilde{A}}^L - \alpha_{\tilde{A}} < x < m_{\tilde{A}}^L, \\ 1, & \text{for } m_{\tilde{A}}^L \le x \le m_{\tilde{A}}^R, \\ R\!\left(\frac{x - m_{\tilde{A}}^R}{\beta_{\tilde{A}}}\right), & \text{for } m_{\tilde{A}}^R < x < m_{\tilde{A}}^R + \beta_{\tilde{A}}, \\ 0, & \text{otherwise}, \end{cases} \qquad (9)$
where $m_{\tilde{A}}^L$ and $m_{\tilde{A}}^R$ are the left and the right points of the core and $\alpha_{\tilde{A}}$ and $\beta_{\tilde{A}}$ are the left and the right spreads of the fuzzy number $\tilde{A}$. Symbolically,
$\tilde{A} = \left( m_{\tilde{A}}^L, m_{\tilde{A}}^R, \alpha_{\tilde{A}}, \beta_{\tilde{A}} \right)_{LR}. \qquad (10)$
In the case that the functions L and R are linear and $m_{\tilde{A}}^L = m_{\tilde{A}}^R = m_{\tilde{A}}$, the L-R fuzzy number is called a triangular fuzzy number and its membership function is
$\mu_{\tilde{A}}(x) = \begin{cases} 1 - \frac{m_{\tilde{A}} - x}{\alpha_{\tilde{A}}}, & m_{\tilde{A}} - \alpha_{\tilde{A}} < x \le m_{\tilde{A}}, \\ 1 - \frac{x - m_{\tilde{A}}}{\beta_{\tilde{A}}}, & m_{\tilde{A}} < x < m_{\tilde{A}} + \beta_{\tilde{A}}, \\ 0, & \text{otherwise}, \end{cases} \qquad (11)$
where $m_{\tilde{A}}$ is the mean of the triangular fuzzy number $\tilde{A}$. The triangular fuzzy number $\tilde{A}$ is sufficiently represented by the triplet
$\tilde{A} = \left( m_{\tilde{A}}, \alpha_{\tilde{A}}, \beta_{\tilde{A}} \right)_T. \qquad (12)$
Let the set of all triangular fuzzy numbers be denoted as $\mathbb{R}_T$, where T is a symbol for a triangular fuzzy number.
Definition 3.
Let $\tilde{A} = \left( m_{\tilde{A}}, \alpha_{\tilde{A}}, \beta_{\tilde{A}} \right)_T$ be a triangular fuzzy number, $\tilde{A} \in \mathbb{R}_T$, and let $\lambda$ be a real number. Then their scalar multiplication can be, on the basis of the extension principle, defined as
$\lambda \cdot \tilde{A} = \begin{cases} \left( \lambda m_{\tilde{A}}, \lambda \alpha_{\tilde{A}}, \lambda \beta_{\tilde{A}} \right)_T, & \lambda \ge 0, \\ \left( \lambda m_{\tilde{A}}, -\lambda \beta_{\tilde{A}}, -\lambda \alpha_{\tilde{A}} \right)_T, & \lambda < 0. \end{cases} \qquad (13)$
For any $\lambda, \nu \in \mathbb{R}$ and any $\tilde{A} \in \mathbb{R}_T$, it holds that
$(\lambda \nu) \cdot \tilde{A} = \lambda \cdot (\nu \cdot \tilde{A}). \qquad (14)$
Definition 4.
The opposite of a triangular fuzzy number $\tilde{A} = \left( m_{\tilde{A}}, \alpha_{\tilde{A}}, \beta_{\tilde{A}} \right)_T$, where $\tilde{A} \in \mathbb{R}_T$, is the triangular fuzzy number
$-\tilde{A} = -1 \cdot \tilde{A} = \left( -m_{\tilde{A}}, \beta_{\tilde{A}}, \alpha_{\tilde{A}} \right)_T. \qquad (15)$
The fuzzy numbers $\tilde{A}$ and $-\tilde{A}$ are symmetrical with respect to the axis $x = 0$.
Definition 5.
For the triangular fuzzy numbers $\tilde{A} = \left( m_{\tilde{A}}, \alpha_{\tilde{A}}, \beta_{\tilde{A}} \right)_T$, $\tilde{B} = \left( m_{\tilde{B}}, \alpha_{\tilde{B}}, \beta_{\tilde{B}} \right)_T$, and $\tilde{C} = \left( m_{\tilde{C}}, \alpha_{\tilde{C}}, \beta_{\tilde{C}} \right)_T$, where $\tilde{A}, \tilde{B}, \tilde{C} \in \mathbb{R}_T$, the sum of $\tilde{A}$ and $\tilde{B}$ can be, on the basis of the extension principle, defined as
$\tilde{A} \oplus \tilde{B} = \left( m_{\tilde{A}} + m_{\tilde{B}}, \alpha_{\tilde{A}} + \alpha_{\tilde{B}}, \beta_{\tilde{A}} + \beta_{\tilde{B}} \right)_T. \qquad (16)$
The operation $\oplus$ for $\tilde{A}, \tilde{B}, \tilde{C} \in \mathbb{R}_T$ is:
(a) commutative, i.e., $\tilde{A} \oplus \tilde{B} = \tilde{B} \oplus \tilde{A}$,
(b) associative, i.e., $\tilde{A} \oplus (\tilde{B} \oplus \tilde{C}) = (\tilde{A} \oplus \tilde{B}) \oplus \tilde{C}$.
Furthermore, it holds:
(a) for any $\lambda, \nu \in \mathbb{R}$ with $\lambda \nu \ge 0$ and any $\tilde{A} \in \mathbb{R}_T$ that
$(\lambda + \nu) \cdot \tilde{A} = \lambda \cdot \tilde{A} \oplus \nu \cdot \tilde{A}, \qquad (17)$
(b) for any $\lambda \in \mathbb{R}$ and any $\tilde{A}, \tilde{B} \in \mathbb{R}_T$ that
$\lambda \cdot (\tilde{A} \oplus \tilde{B}) = \lambda \cdot \tilde{A} \oplus \lambda \cdot \tilde{B}, \qquad (18)$
(c) for any $\tilde{A} \in \mathbb{R}_T$ and $\lambda \in \mathbb{R}$ that
$\tilde{A} - \lambda = \left( m_{\tilde{A}} - \lambda, \alpha_{\tilde{A}}, \beta_{\tilde{A}} \right)_T, \qquad (19)$
and
$\tilde{A} + \lambda = \left( m_{\tilde{A}} + \lambda, \alpha_{\tilde{A}}, \beta_{\tilde{A}} \right)_T. \qquad (20)$
Definition 6.
The average value of n triangular fuzzy numbers $\tilde{A}_1, \tilde{A}_2, \dots, \tilde{A}_n \in \mathbb{R}_T$ is
$\bar{\tilde{A}} = \frac{1}{n} \sum_{i=1}^{n} \tilde{A}_i = \frac{1}{n} \cdot \left( \tilde{A}_1 \oplus \tilde{A}_2 \oplus \dots \oplus \tilde{A}_n \right), \qquad (21)$
where $\tilde{A}_i = \left( m_{\tilde{A}_i}, \alpha_{\tilde{A}_i}, \beta_{\tilde{A}_i} \right)_T$ for all $i \in \{1, \dots, n\}$, and $\bar{\tilde{A}} = \left( m_{\bar{\tilde{A}}}, \alpha_{\bar{\tilde{A}}}, \beta_{\bar{\tilde{A}}} \right)_T \in \mathbb{R}_T$.
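The triangular fuzzy arithmetic of Definitions 3-6 is straightforward to implement. The following Python sketch (the class name `TFN` and the function names are ours, introduced only for illustration) encodes scalar multiplication, the sum, and the average:

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class TFN:
    """Triangular fuzzy number (m, alpha, beta)_T: mean, left and right spread."""
    m: float
    alpha: float
    beta: float

def scale(lam, a):
    """Scalar multiplication lambda * A (Definition 3).

    For lam < 0 the spreads swap and change sign, so they stay non-negative.
    """
    if lam >= 0:
        return TFN(lam * a.m, lam * a.alpha, lam * a.beta)
    return TFN(lam * a.m, -lam * a.beta, -lam * a.alpha)

def add(a, b):
    """Sum of two triangular fuzzy numbers (Definition 5)."""
    return TFN(a.m + b.m, a.alpha + b.alpha, a.beta + b.beta)

def average(numbers):
    """Average of n triangular fuzzy numbers (Definition 6)."""
    total = numbers[0]
    for a in numbers[1:]:
        total = add(total, a)
    return scale(1.0 / len(numbers), total)
```

For example, `scale(-2.0, TFN(1.0, 2.0, 3.0))` yields `TFN(-2.0, 6.0, 4.0)`: the spreads swap, which is why non-negativity of spreads is preserved under this arithmetic.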

## 4. Boscovich Fuzzy Regression Line

The presented fuzzy regression method is intended for datasets where observations of the independent variable x are real numbers and observations of the dependent variable y are triangular fuzzy numbers $\tilde{Y}$, i.e., a set of n observations is given as
$O = \left\{ (x_i, \tilde{Y}_i) \mid \forall i \in I \right\}, \qquad (22)$
where $x_i$ is the i-th real-valued observation of the independent variable x, $\tilde{Y}_i$ is the i-th fuzzy observation of the dependent variable y, $\tilde{Y}_i = \left( y_i, \underline{\upsilon}_i, \bar{\upsilon}_i \right)_T$, $y_i$ is the mean, $\underline{\upsilon}_i$ is the left, and $\bar{\upsilon}_i$ is the right spread of $\tilde{Y}_i$.
For the approximation of a linear dependence between x and $\tilde{Y}$, we use a simple fuzzy linear model
$\tilde{Y} = \tilde{A}_0 + a_1 x, \qquad (23)$
where $\tilde{A}_0$ and $a_1$ are unknown model parameters, $\tilde{A}_0 \in \mathbb{R}_T$, $a_1 \in \mathbb{R}$, $\tilde{A}_0 = \left( a_0, \underline{\alpha}_0, \bar{\alpha}_0 \right)_T$, $a_0$ is the mean, $\underline{\alpha}_0$ is the left, and $\bar{\alpha}_0$ is the right spread of $\tilde{A}_0$. As the y-intercept $\tilde{A}_0$ is the only fuzzy parameter of the model (23), the fuzziness of model predictions is independent of the model input x [8].
Following the Boscovich idea (see Section 2, constraint (1)), the best estimate of the fuzzy regression line (23) shall pass through a centroid formed by the observations O, i.e.,
$\bar{\tilde{Y}} = \hat{\tilde{A}}_0^* + \hat{a}_1^* \bar{x}, \qquad (24)$
where $\hat{\tilde{A}}_0^*, \hat{a}_1^*$ are the best estimates of the unknown regression coefficients $\tilde{A}_0$ and $a_1$ according to the constraints (1) and (2); $\bar{\tilde{Y}}$ is the y coordinate of the centroid, and $\bar{x}$ is its x coordinate. The coordinates of the centroid formed by the observations (22) are given as
$\bar{x} = \frac{1}{n} \sum_{i=1}^{n} x_i, \qquad \bar{\tilde{Y}} = \frac{1}{n} \sum_{i=1}^{n} \tilde{Y}_i, \qquad (25)$
therefore $\bar{x} \in \mathbb{R}$ and $\bar{\tilde{Y}} \in \mathbb{R}_T$.
Let us express the estimate of the unknown fuzzy regression coefficient $\tilde{A}_0$, using Formula (24), as
$\hat{\tilde{A}}_0^* = \bar{\tilde{Y}} - \hat{a}_1^* \bar{x}. \qquad (26)$
As in the case of the Boscovich regression line, n fuzzy regression lines can be constructed using the observations (22). The k-th fuzzy regression line, based on the k-th observation $(x_k, \tilde{Y}_k)$, is given as
$\tilde{L}_k: \; \tilde{Y}_k = \bar{\tilde{Y}} + \hat{a}_1^{(k)} \Delta x_k, \qquad (27)$
where $\hat{a}_1^{(k)}$ is the slope of the k-th fuzzy regression line (i.e., the k-th estimate of $a_1$), and $\Delta x_k = x_k - \bar{x}$.
The model (27) uses the same trick as the ordinary least squares method: we relate the explanatory variable to the centre of gravity via $\Delta x_k = x_k - \bar{x}$. The y coordinate of the centroid, $\bar{\tilde{Y}}$, incorporates the fuzziness of the underlying relationship (25), which is reflected in the intercept $\hat{\tilde{A}}_0^*$ of the regression line (26). The intercept is the only fuzzy coefficient of the linear function in our model. A line constructed in this way, fulfilling the first Boscovich assumption (Section 2, constraint (1)), passes through the centroid, which is a necessary condition for an unbiased estimate of the regression line. Considering this, we can construct an estimate of the slope of the k-th regression line on the basis of the mean values of the fuzzy numbers $\tilde{Y}_k$ and $\bar{\tilde{Y}}$:
$L_k': \; m_{\tilde{Y}_k} = m_{\bar{\tilde{Y}}} + \hat{a}_1^{(k)} \Delta x_k, \qquad (28)$
where $m_{\tilde{Y}_k}$ is the mean of the k-th observation of $\tilde{Y}$, and $m_{\bar{\tilde{Y}}}$ is the mean of the y coordinate of the centroid $\bar{\tilde{Y}}$.
The k-th estimate of the slope $a_1$ is then given as
$\hat{a}_1^{(k)} = \Delta x_k^{-1} \left( m_{\tilde{Y}_k} - m_{\bar{\tilde{Y}}} \right). \qquad (29)$
For the selection of the best estimate of the slope $a_1$, an appropriate evaluation function has to be formulated (see Section 2, constraint (2)). As follows from the relaxed Equation (28), the criterion is given as
$J_F(k) = \sum_{i=1}^{n} \left| m_{\tilde{Y}_i} - m_{\bar{\tilde{Y}}} - \hat{a}_1^{(k)} \Delta x_i \right|, \qquad (30)$
and the best estimate of $a_1$ is given as
$\hat{a}_1^* = \underset{\hat{a}_1^{(k)}, \, \forall k \in I}{\arg\min} \sum_{i=1}^{n} \left| m_{\tilde{Y}_i} - m_{\bar{\tilde{Y}}} - \hat{a}_1^{(k)} \Delta x_i \right|. \qquad (31)$
The proposed fuzzy estimator can be written in pseudocode as Algorithm 1.
Algorithm 1 Boscovich fuzzy regression line
Require: The set of n observations $O = \left\{ (x_i, \tilde{Y}_i) \mid \forall i \in I \right\}$
Ensure: The best estimates of $\tilde{A}_0$ and $a_1$
1: function BFRL(O)
2:   $\bar{x} \leftarrow \frac{1}{n} \sum_{i=1}^{n} x_i$
3:   $\bar{\tilde{Y}} \leftarrow \frac{1}{n} \sum_{i=1}^{n} \tilde{Y}_i$
4:   $\Delta x_i \leftarrow x_i - \bar{x}, \; \forall i \in I$
5:   $\hat{a}_1^{(i)} \leftarrow \Delta x_i^{-1} \left( m_{\tilde{Y}_i} - m_{\bar{\tilde{Y}}} \right), \; \forall i \in I$
6:   $J_F(k) \leftarrow \sum_{i=1}^{n} \left| m_{\tilde{Y}_i} - m_{\bar{\tilde{Y}}} - \hat{a}_1^{(k)} \Delta x_i \right|, \; \forall k \in I$
7:   $\hat{a}_1^* \leftarrow \arg\min_{\hat{a}_1^{(k)}, \forall k \in I} J_F(k)$   ▹ Best estimate of $a_1$
8:   $\hat{\tilde{A}}_0^* \leftarrow \bar{\tilde{Y}} - \hat{a}_1^* \bar{x}$   ▹ Best estimate of $\tilde{A}_0$
9:   return $\hat{\tilde{A}}_0^*, \hat{a}_1^*$
10: end function
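A direct transcription of Algorithm 1 into Python might look as follows (a minimal sketch assuming NumPy; observations with $\Delta x_i = 0$ are skipped as slope candidates, since they define no line, which the pseudocode leaves implicit):

```python
import numpy as np

def bfrl(x, y_mean, y_left, y_right):
    """Boscovich fuzzy regression line (Algorithm 1).

    x: real-valued inputs; y_mean, y_left, y_right: means and left/right
    spreads of the triangular fuzzy observations Y_i.  Returns the fuzzy
    intercept estimate (mean, left spread, right spread) and the crisp slope.
    """
    x = np.asarray(x, float)
    m = np.asarray(y_mean, float)
    x_bar = x.mean()                                            # line 2
    # Centroid of the fuzzy observations (Definition 6): component-wise means.
    m_bar, left_bar, right_bar = m.mean(), np.mean(y_left), np.mean(y_right)
    dx = x - x_bar                                              # line 4
    mask = dx != 0                     # points at x_bar give no slope candidate
    slopes = (m[mask] - m_bar) / dx[mask]                       # line 5
    costs = [np.abs(m - m_bar - a1 * dx).sum() for a1 in slopes]  # line 6
    a1 = float(slopes[int(np.argmin(costs))])                   # line 7
    # Line 8: A0 = Y_bar - a1 * x_bar; subtracting the real number a1 * x_bar
    # shifts the mean only and leaves the spreads unchanged (Definition 5(c)).
    return (m_bar - a1 * x_bar, left_bar, right_bar), a1
```

Because the intercept inherits the spreads of the centroid, which are averages of the observed spreads, the returned spreads are non-negative by construction.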

## 5. Numerical Examples

We compared the presented Boscovich fuzzy regression line (BFRL) with several possibilistic and statistics-based fuzzy regression methods. As representatives of the possibilistic family, we selected a possibilistic linear regression (PLR) analysis [16], a PLR combined with an omission approach (OPRL) [44], and a multi-objective fuzzy linear regression (MOFLR) [26]. The OPRL and MOFLR were designed to be less sensitive to outliers. We implemented the OPRL for detection of one outlier. Each of these three methods requires the setting of one parameter by a decision maker. The decision maker must set a threshold value h in the cases of PLR and OPRL, where $h \in [0, 1)$. The threshold value indicates a degree of fitness of an estimated fuzzy regression model [16]. MOFLR requires a weighting coefficient $\omega$, where $\omega \in (0, 1)$. The weighting coefficient determines a trade-off between outlier penalization ($\omega \to 1$) and data fitting ($\omega \to 0$) [26]. From the statistics-based methods, we considered a fuzzy least squares (FLS) [18], a fuzzy least absolute linear regression (FLAR) [24], and a robust fuzzy regression (RFR) analysis [41]. Note that the RFR requires at least six observations to estimate the parameters of a simple fuzzy linear model, and it employs a customized iterative optimization method.
BFRL was designed as a parameter estimator of the fuzzy linear model (23). PLR, OPRL, MOFLR, FLS and FLAR expect a fuzzy linear model with all fuzzy parameters. For one real-valued independent variable x, the model is given as
$\tilde{Y} = \tilde{A}_0 + \tilde{A}_1 \cdot x, \qquad (32)$
where $\tilde{A}_0$ and $\tilde{A}_1$ are unknown model parameters, $\tilde{A}_0, \tilde{A}_1 \in \mathbb{R}_T$, $\tilde{A}_1 = \left( a_1, \underline{\alpha}_1, \bar{\alpha}_1 \right)_T$, $a_1$ is the mean, $\underline{\alpha}_1$ is the left, and $\bar{\alpha}_1$ is the right spread of $\tilde{A}_1$.
RFR uses a different approach to model prediction calculation. The i-th prediction of y, $\tilde{Y}_i = \left( y_i, \underline{\upsilon}_i, \bar{\upsilon}_i \right)_T$, is given as
$y_i = \mathbf{f}^{\top} \mathbf{a}, \qquad \underline{\upsilon}_i = b y_i + d, \qquad \bar{\upsilon}_i = g y_i + h, \qquad (33)$
where $\mathbf{a}, b, d, g$ and h are unknown model parameters, and $\mathbf{f} = (1, x_i)^{\top}$.
We evaluated the performance of the aforementioned methods on various datasets. The possibilistic approaches (PLR, OPRL, MOFLR) were designed for symmetric fuzzy numbers. To allow a comparison of all the methods, we included datasets with symmetric fuzzy observations in the evaluation process. We used data from example 1 published in [16] and from example 2 published in [18], and labelled the datasets SFN-1 and SFN-2, respectively. The presented BFRL, as well as the statistics-based methods (FLS, FLAR, RFR), are also capable of processing non-symmetric fuzzy numbers. To fully examine the performance of these methods, we included three datasets with non-symmetric fuzzy observations of the dependent variable in the evaluation process. Specifically, we used data from example 1 published in [45], from example 2 published in [50], and from example 3 published in [51], and labelled the datasets NFN-1, NFN-2, and NFN-3, respectively. The datasets SFN-1, SFN-2, NFN-1, NFN-2, and NFN-3 consist of 5, 8, 16, 8 and 8 observations, respectively.
We evaluated the performance of the methods using a total error E, which we defined as a sum of absolute errors, i.e.,
$E = \sum_{i=1}^{n} D_i, \qquad (34)$
where D is the difference between the membership functions of the observed and estimated fuzzy responses. For the i-th observation, the difference is given as
$D_i = \int_{S_{\tilde{Y}_i} \cup S_{\hat{\tilde{Y}}_i}} \left| \mu_{\tilde{Y}_i}(y) - \mu_{\hat{\tilde{Y}}_i}(y) \right| \mathrm{d}y, \qquad (35)$
where $\mu_{\tilde{Y}_i}(y)$ and $\mu_{\hat{\tilde{Y}}_i}(y)$ are the membership functions of the i-th observed response $\tilde{Y}_i$ and the i-th estimated response $\hat{\tilde{Y}}_i$, respectively; and $S_{\tilde{Y}_i}$ and $S_{\hat{\tilde{Y}}_i}$ represent the supports of $\mu_{\tilde{Y}_i}(y)$ and $\mu_{\hat{\tilde{Y}}_i}(y)$, respectively [52]. Total errors (34) of the examined fuzzy regression methods are summarized in Table 1. The errors are organized with respect to fuzzy regression methods (columns) and datasets (rows). For PLR and OPRL, we used the threshold values $h = 0$ and $h = 0.5$. For MOFLR, we considered $\omega \in \{0.1, 0.5, 0.99\}$. Note that RFR cannot be applied to SFN-1 since the dataset consists of only 5 observations. Since PLR, OPRL and MOFLR return biased results on datasets with non-symmetric fuzzy observations, these results were not included in Table 1.
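The error measure above can be approximated numerically. The sketch below (our own helper names, assuming NumPy and the triangular membership function of Definition 2) integrates the absolute difference of two triangular membership functions over a grid covering the union of their supports:

```python
import numpy as np

def tfn_membership(y, m, alpha, beta):
    """Membership function of a triangular fuzzy number (m, alpha, beta)_T."""
    y = np.asarray(y, float)
    left = 1.0 - (m - y) / alpha      # rising branch on (m - alpha, m]
    right = 1.0 - (y - m) / beta      # falling branch on (m, m + beta)
    return np.clip(np.minimum(left, right), 0.0, 1.0)

def membership_difference(observed, estimated, grid_size=10_001):
    """Approximate D_i: the integral of |mu_obs - mu_est| over the union of
    the supports, computed by the trapezoidal rule on a uniform grid."""
    (m1, a1, b1), (m2, a2, b2) = observed, estimated
    lo = min(m1 - a1, m2 - a2)        # left end of the union of supports
    hi = max(m1 + b1, m2 + b2)        # right end of the union of supports
    y = np.linspace(lo, hi, grid_size)
    diff = np.abs(tfn_membership(y, m1, a1, b1) - tfn_membership(y, m2, a2, b2))
    return float(np.sum((diff[:-1] + diff[1:]) * np.diff(y) / 2.0))
```

For identical observed and estimated fuzzy numbers the measure vanishes; for two disjoint unit triangles it approaches the sum of their areas.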
We also investigated the sensitivity of the methods to outliers. We considered three types of outliers: (a) outliers with respect to the centres of the fuzzy dependent variable $\tilde{Y}$ (o1), (b) outliers with respect to the spreads of $\tilde{Y}$ (o2), and (c) outliers with respect to both the centres and the spreads (o3). To examine the sensitivity, we created three new sets from each of the above-stated datasets, where each new set contained one outlier. The outliers in the datasets are specified in Table 2 by their serial numbers i, means $y_i$, left spreads $\underline{\upsilon}_i$ and right spreads $\bar{\upsilon}_i$. Their original values are written in normal text while the changed ones are in bold.
Total errors of the acquired fuzzy regression models on the modified datasets with symmetric and non-symmetric fuzzy observations are summarized in Table 3 and Table 4, respectively. The results are organized with respect to fuzzy regression methods (columns) and datasets (rows). Settings of the adjustable parameters are given under the method abbreviations. Results marked with an asterisk indicate situations where an outlier was correctly recognized by OPRL (Table 3).
The parameter estimates obtained by PLR and OPRL on the datasets with symmetric fuzzy numbers are summarized in Table 5. Estimates provided by MOFLR on the same datasets are given in Table 6. Settings of the adjustable parameters h and $\omega$ are stated under the method abbreviations. The estimates of model parameters generated by FLS, FLAR, RFR and BFRL on the datasets with symmetric and non-symmetric fuzzy observations are summarized in Table 7 and Table 8, respectively.
We implemented all the fuzzy regression methods in MATLAB R2016a. We used the default settings of the optimization functions employed within the calculations. This means that interior point methods were utilized to solve both the linear and quadratic programming problems.

## 6. Discussion

We demonstrated in the numerical examples that the proposed Boscovich fuzzy regression line (BFRL) is capable of approximating a fuzzy linear relationship between the dependent variable y and one independent variable x, where the independent variable is a real number and the dependent variable is a triangular fuzzy number $\tilde{Y}$. We compared BFRL with several other fuzzy linear regression methods. Most of the reference methods (PLR, OPRL, MOFLR, FLS and FLAR) approximate the relationship using the fuzzy linear model (32), while RFR uses the model (33). The prediction spreads of both models are dependent on x. BFRL estimates the parameters of the fuzzy linear model (23), whose prediction spreads are independent of x. This fact predetermines BFRL for applications where the fuzziness of model predictions is independent of the model inputs. An example of such an application is the approximation of the dependence of the water level in an uncovered channel on the opening of a floodgate, where the level is measured using a perpendicularly positioned scale.
We studied the performance of the fuzzy regression methods on twenty datasets, fifteen of which were affected by outliers. We measured the prediction errors of the acquired fuzzy regression models using the total error (34). We found that, in most cases, BFRL-based models show lower total errors compared to PLR, OPRL, MOFLR and RFR models. Total errors of BFRL and FLS models are comparable, but FLAR models always show lower errors (Table 1, Table 3 and Table 4). The sensitivity of BFRL to all types of outliers is comparable with the other statistics-based methods (FLS, FLAR, RFR) but considerably lower compared to PLR and MOFLR (Table 3 and Table 4). If an outlier is correctly recognized by OPRL, the total error of the model produced by OPRL on the outlier-affected dataset is comparable with the total error of a BFRL-based model (Table 3).
Unlike some reference methods, BFRL proved to be generally applicable and reliable. BFRL provides sensible parameter estimates for datasets with symmetric as well as non-symmetric triangular fuzzy numbers (in contrast to PLR, OPRL and MOFLR). BFRL is capable of operating on datasets with as few as two observations, whereas, for example, RFR requires at least six observations. In contrast to OPRL (SFN-2-o2 for $h = 0$, Table 5), BFRL always provided estimates of the parameters (Table 5 and Table 6). In contrast to RFR, BFRL always returned the same estimates on the same dataset; the iterative manner of optimization in RFR generally leads to varying parameter estimates and varying total errors. BFRL also guaranteed non-negativity of the parameter spreads (Table 7 and Table 8). This basic requirement is not guaranteed by MOFLR and FLS.
Most fuzzy regression methods have a large computational complexity, which makes them difficult to use, especially because they are not easy to implement. For example, PLR, OPRL, and FLAR result in a linear programming problem, and MOFLR results in a quadratic one. An analytical solution must be expressed in the case of FLS, and a customized iterative optimization method must be implemented in the case of RFR. Moreover, the time complexity of these methods is, in the worst case, exponential. A pleasant feature of BFRL (Algorithm 1) is its straightforward implementation and linear time complexity.

## 7. Conclusions

Our approach is based on a fuzzy modification of the first statistical method for data fitting based on criterion minimization. The generalization of the Boscovich line to fuzzy regression does not allow an approximation of data by a model other than the regression line. Other predecessors of the ordinary least squares method (Mayer's method of averages, Lambert's line, and Laplace's method; see [53]) could enhance the family of fuzzy linear regression methods in future research. For example, a similarity between Lambert's ideas and robust non-parametric methods such as repeated median regression and the Theil–Sen estimate [54] might provide directions for further research.
The proposed Boscovich fuzzy regression line is a simple fuzzy regression method. We demonstrated on the numerical examples that BFRL is a reliable method which provides good parameter estimates, including those estimated from datasets affected by outliers. Prediction errors of BFRL models are smaller than those of models produced by the possibilistic-based methods, and favourably comparable with prediction errors of models produced by the statistics-based methods. In comparison with the other methods, BFRL has two major advantages: low time complexity and straightforward implementation. Moreover, the method guarantees non-negativity of the model parameter spreads. The robustness and user-friendliness of the method make it relevant in non-mathematical research fields.

## Author Contributions

Conceptualization, J.M.; methodology, J.M. and P.Š.; software, P.Š.; validation, J.M., P.Š. and A.P.; formal analysis, J.M., P.Š. and A.P.; investigation, P.Š. and A.P.; data curation, P.Š.; writing—original draft preparation, P.Š.; writing—review and editing, J.M., P.Š. and A.P.; supervision, J.M. and P.Š. All authors have read and agreed to the published version of the manuscript.

## Funding

This research was funded by Department of Mathematics and Physics, Faculty of Electrical Engineering and Informatics, University of Pardubice, Czech Republic.

## Institutional Review Board Statement

Not applicable.

## Informed Consent Statement

Not applicable.

## Data Availability Statement

This study is based on previously published data to which the authors continuously refer in the text.

## Conflicts of Interest

The authors declare no conflict of interest.

## References

1. Bisserier, A.; Boukezzoula, R.; Galichet, S. A revisited approach to linear fuzzy regression using trapezoidal fuzzy intervals. Inf. Sci. 2010, 180, 3653–3673.
2. Škrabánek, P.; Martínková, N. Algorithm xxxx: Fuzzyreg: An R package for fitting fuzzy regression models. ACM Trans. Math. Softw. 2021, in press.
3. Özelkan, E.C.; Duckstein, L. Multi-objective fuzzy regression: A general framework. Comput. Oper. Res. 2000, 27, 635–652.
4. Yager, R.R. Fuzzy prediction based on regression models. Inf. Sci. 1982, 26, 45–63.
5. Toyoura, Y.; Watada, J.; Khalid, M.; Yusof, R. Formulation of linguistic regression model based on natural words. Soft Comput. 2004, 8, 681–688.
6. Pan, N.F.; Lin, T.C.; Pan, N.H. Estimating bridge performance based on a matrix-driven fuzzy linear regression model. Autom. Constr. 2009, 18, 578–586.
7. Kim, K.J.; Moskowitz, H.; Koksalan, M. Fuzzy versus statistical linear regression. Eur. J. Oper. Res. 1996, 92, 417–434.
8. Škrabánek, P.; Marek, J. Models used in fuzzy linear regression. In Proceedings of the 17th Conference on Applied Mathematics—APLIMAT 2018, Bratislava, Slovak Republic, 6–8 February 2018; pp. 955–964.
9. Mosleh, M.; Otadi, M.; Abbasbandy, S. Fuzzy polynomial regression with fuzzy neural networks. Appl. Math. Model. 2011, 35, 5400–5412.
10. Pourahmad, S.; Ayatollahi, S.M.T.; Taheri, S.M.; Agahi, Z.H. Fuzzy logistic regression based on the least squares approach with application in clinical studies. Comput. Math. Appl. 2011, 62, 3353–3365.
11. Cheng, C.B.; Lee, E. Nonparametric fuzzy regression—k-NN and kernel smoothing techniques. Comput. Math. Appl. 1999, 38, 239–251.
12. Wang, N.; Zhang, W.X.; Mei, C.L. Fuzzy nonparametric regression based on local linear smoothing technique. Inf. Sci. 2007, 177, 3882–3900.
13. Danesh, S.; Farnoosh, R.; Razzaghnia, T. Fuzzy nonparametric regression based on an adaptive neuro-fuzzy inference system. Neurocomputing 2016, 173, 1450–1460.
14. Maturo, F.; Hošková-Mayerová, Š. Fuzzy Regression Models and Alternative Operations for Economic and Social Sciences. In Recent Trends in Social Systems: Quantitative Theories and Quantitative Models, 1st ed.; Springer International Publishing: Cham, Switzerland, 2017; pp. 235–247.
15. Tanaka, H.; Uejima, S.; Asai, K. Linear Regression Analysis with Fuzzy Model. IEEE Trans. Syst. Man Cybern. 1982, 12, 903–907.
16. Tanaka, H.; Hayashi, I.; Watada, J. Possibilistic linear regression analysis for fuzzy data. Eur. J. Oper. Res. 1989, 40, 389–396.
17. Chang, P.T.; Lee, E.S. A generalized fuzzy weighted least-squares regression. Fuzzy Sets Syst. 1996, 82, 289–298.
18. Diamond, P. Fuzzy Least Squares. Inf. Sci. 1988, 46, 141–157.
19. Yen, K.; Ghoshray, S.; Roig, G. A linear regression model using triangular fuzzy number coefficients. Fuzzy Sets Syst. 1999, 106, 167–177.
20. D'Urso, P.; Gastaldi, T. A least-squares approach to fuzzy linear regression analysis. Comput. Stat. Data Anal. 2000, 34, 427–440.
21. Hojati, M.; Bector, C.; Smimou, K. A simple method for computation of fuzzy linear regression. Eur. J. Oper. Res. 2005, 166, 172–184.
22. Choi, S.H.; Buckley, J.J. Fuzzy regression using least absolute deviation estimators. Soft Comput. 2008, 12, 257–263.
23. Kelkinnama, M.; Taheri, S. Fuzzy least-absolutes regression using shape preserving operations. Inf. Sci. 2012, 214, 105–120.
24. Zeng, W.; Feng, Q.; Li, J. Fuzzy least absolute linear regression. Appl. Soft Comput. 2017, 52, 1009–1019.
25. Nasrabadi, M.M.; Nasrabadi, E. A mathematical-programming approach to fuzzy linear regression analysis. Appl. Math. Comput. 2004, 155, 873–881.
26. Nasrabadi, M.M.; Nasrabadi, E.; Nasrabady, A.R. Fuzzy linear regression analysis: A multi-objective programming approach. Appl. Math. Comput. 2005, 163, 245–251.
27. Celmiņš, A. Least squares model fitting to fuzzy vector data. Fuzzy Sets Syst. 1987, 22, 245–269.
28. Diamond, P.; Körner, R. Extended fuzzy linear models and least squares estimates. Comput. Math. Appl. 1997, 33, 15–32. [Google Scholar] [CrossRef] [Green Version]
29. Kao, C.; Chyu, C.L. A fuzzy linear regression model with better explanatory power. Fuzzy Sets Syst. 2002, 126, 401–409. [Google Scholar] [CrossRef]
30. Kao, C.; Chyu, C.L. Least-squares estimates in fuzzy regression analysis. Eur. J. Oper. Res. 2003, 148, 426–435. [Google Scholar] [CrossRef]
31. Wu, H.C. Fuzzy estimates of regression parameters in linear regression models for imprecise input and output data. Comput. Stat. Data Anal. 2003, 42, 203–217. [Google Scholar] [CrossRef]
32. Hassanpour, H.; Maleki, H.; Yaghoobi, M. A note on evaluation of fuzzy linear regression models by comparing membership functions. Iran. J. Fuzzy Syst. 2009, 6, 1–6. [Google Scholar] [CrossRef]
33. Wu, H.C. The construction of fuzzy least squares estimators in fuzzy linear regression models. Expert Syst. Appl. 2011, 38, 13632–13640. [Google Scholar] [CrossRef]
34. Chen, L.H.; Hsueh, C.C. A mathematical programming method for formulating a fuzzy regression model based on distance criterion. IEEE Trans. Syst. Man, Cybern. Part B (Cybern.) 2007, 37, 705–712. [Google Scholar] [CrossRef]
35. Hassanpour, H.; Maleki, H.; Yaghoobi, M. Fuzzy linear regression model with crisp coefficients: A goal programming approach. Iran. J. Fuzzy Syst. 2010, 7, 1–153. [Google Scholar] [CrossRef]
36. Li, J.; Zeng, W.; Xie, J.; Yin, Q. A new fuzzy regression model based on least absolute deviation. Eng. Appl. Artif. Intell. 2016, 52, 54–64. [Google Scholar] [CrossRef]
37. Al-Qudaimi, A.; Kumar, A. A note on “A new fuzzy regression model based on absolute deviation”. Eng. Appl. Artif. Intell. 2017, 66, 30–32. [Google Scholar] [CrossRef]
38. Lee, H.; Tanaka, H. Fuzzy approximations with non-symmetric fuzzy parameters in fuzzy regression analysis. J. Oper. Res. Soc. Jpn. 1999, 42, 98–112. [Google Scholar] [CrossRef] [Green Version]
39. Redden, D.T.; Woodall, W.H. Properties of certain fuzzy linear regression methods. Fuzzy Sets Syst. 1994, 64, 361–375. [Google Scholar] [CrossRef]
40. Redden, D.T.; Woodall, W.H. Further examination of fuzzy linear regression. Fuzzy Sets Syst. 1996, 79, 203–211. [Google Scholar] [CrossRef]
41. D’Urso, P.; Massari, R.; Santoro, A. Robust fuzzy regression analysis. Inf. Sci. 2011, 181, 4154–4174. [Google Scholar] [CrossRef]
42. Peters, G. Fuzzy linear regression with fuzzy intervals. Fuzzy Sets Syst. 1994, 63, 45–55. [Google Scholar] [CrossRef]
43. Kim, K.J.; Chen, H.R. A comparison of fuzzy and nonparametric linear regression. Comput. Oper. Res. 1997, 24, 505–519. [Google Scholar] [CrossRef]
44. Hung, W.L.; Yang, M.S. An omission approach for detecting outliers in fuzzy regression models. Fuzzy Sets Syst. 2006, 157, 3109–3122. [Google Scholar] [CrossRef]
45. Chang, P.T.; Lee, E. Fuzzy least absolute deviations regression and the conflicting trends in fuzzy parameters. Comput. Math. Appl. 1994, 28, 89–101. [Google Scholar] [CrossRef] [Green Version]
46. Lu, J.; Wang, R. An enhanced fuzzy linear regression model with more flexible spreads. Fuzzy Sets Syst. 2009, 160, 2505–2523. [Google Scholar] [CrossRef]
47. Boscovich, R.J. De litteraria expeditione per pontificiam ditioned, et synopsis amplioris operis, ac habentur plura ejus ex exemplaria etiam sensorum impressa. Bononiensi Sci. Artium Inst. Atque Acad. Comment. 1757, 4, 353–396. [Google Scholar]
48. Howarth, R.J. A History of Regression and Related Model-Fitting in the Earth Sciences (1636–2000). Nat. Resour. Res. 2001, 10, 241–286. [Google Scholar] [CrossRef]
49. Ku, H. Precision Measurement and Calibration: Selected NBS Papers on Statistical Concepts and Procedures; NBS Special Publication 300; U.S. Government Printing Office: Washington, DC, USA, 1969; Volume 1.
50. Chang, Y.H.O. Hybrid fuzzy least-squares regression analysis and its reliability measures. Fuzzy Sets Syst. 2001, 119, 225–246. [Google Scholar] [CrossRef]
51. Körner, R.; Näther, W. Linear regression with random fuzzy variables: Extended classical estimates, best linear estimates, least squares estimates. Inf. Sci. 1998, 109, 95–118. [Google Scholar] [CrossRef]
52. Kim, B.; Bishu, R.R. Evaluation of fuzzy linear regression models by comparing membership functions. Fuzzy Sets Syst. 1998, 100, 343–352. [Google Scholar] [CrossRef]
53. Hald, A. A History of Probability and Statistics and Their Applications before 1750, 1st ed.; Wiley series in probability and statistics; John Wiley & Sons: Hoboken, NJ, USA, 2003. [Google Scholar]
54. Wilcox, R. Introduction to Robust Estimation and Hypothesis Testing, 4th ed.; Elsevier/Academic Press: London, UK, 2016. [Google Scholar]
Table 1. Total errors of the possibilistic linear regression (PLR), the PLR combined with the omission approach (OPRL), the multi-objective fuzzy linear regression (MOFLR), the fuzzy least squares (FLS), the fuzzy least absolute linear regression (FLAR), the robust fuzzy regression (RFR) analysis, and the Boscovich fuzzy regression line (BFRL). Testing was done on datasets with symmetric fuzzy numbers (SFN-1, $n = 5$, and SFN-2, $n = 8$) and non-symmetric fuzzy numbers (NFN-1, $n = 16$; NFN-2, $n = 8$; and NFN-3, $n = 8$). Settings of the adjustable parameters h and $ω$ are stated under the method abbreviations.
| Dataset | PLR $h = 0$ | PLR $h = 0.5$ | OPRL $h = 0$ | OPRL $h = 0.5$ | MOFLR $ω = 0.1$ | MOFLR $ω = 0.5$ | MOFLR $ω = 0.99$ | FLS | FLAR | RFR | BFRL |
|---|---|---|---|---|---|---|---|---|---|---|---|
| SFN-1 | 12.40 | 17.89 | 12.40 | 17.89 | 220.42 | 35.35 | 14.53 | 10.14 | 9.50 | - | 9.17 |
| SFN-2 | 3.86 | 6.45 | 3.51 | 4.78 | 73.73 | 11.89 | 5.09 | 3.23 | 3.24 | 3.23 | 3.61 |
| NFN-1 | - | - | - | - | - | - | - | 144.13 | 133.81 | 151.98 | 161.09 |
| NFN-2 | - | - | - | - | - | - | - | 15.21 | 14.09 | 14.57 | 15.30 |
| NFN-3 | - | - | - | - | - | - | - | 2.98 | 2.29 | 2.90 | 3.03 |
Table 2. Outliers of type o1 (columns 3–5), o2 (columns 6–8) and o3 (columns 9–11) in datasets (column 1) are identified by their serial numbers i (column 2). Original values of the means $m_{\tilde{Y}}$, right spreads $α_{\tilde{Y}}$ and left spreads $β_{\tilde{Y}}$ of the observations $\tilde{Y}$ are in normal text; the changed values are in bold.
| Dataset | i | o1: $y_i$ | o1: $\underline{υ}_i$ | o1: $\overline{υ}_i$ | o2: $y_i$ | o2: $\underline{υ}_i$ | o2: $\overline{υ}_i$ | o3: $y_i$ | o3: $\underline{υ}_i$ | o3: $\overline{υ}_i$ |
|---|---|---|---|---|---|---|---|---|---|---|
| SFN-1 | 3 | **1** | 1.8 | 1.8 | 8 | **35** | **35** | **1** | **35** | **35** |
| SFN-2 | 4 | **4** | 0.4 | 0.4 | 2 | **1.6** | **1.6** | **4** | **1.6** | **1.6** |
| NFN-1 | 7 | **170** | 12 | 12 | 70.9 | **65** | **77** | **170** | **65** | **77** |
| NFN-2 | 6 | **3** | 1.5 | 1.7 | 22 | **10** | **0.01** | **3** | **10** | **0.01** |
| NFN-3 | 1 | **9.5** | 0.17 | 0.4 | 2.5 | **3** | **4** | **9.5** | **3** | **4** |
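The outlier-injection scheme described in Table 2 can be sketched in a few lines of code. This is only an illustration of the scheme, not the authors' implementation; the function and argument names are ours. An observation is represented as a triangular fuzzy number, a triple of its mean and two spreads.

```python
def inject_outlier(obs, kind, new_mean=None, new_spreads=None):
    """Return a copy of obs = (mean, spread_1, spread_2) modified as an outlier.

    Following Table 2:
      o1 replaces only the mean,
      o2 replaces only the spreads,
      o3 replaces both the mean and the spreads.
    """
    m, s1, s2 = obs
    if kind in ("o1", "o3") and new_mean is not None:
        m = new_mean
    if kind in ("o2", "o3") and new_spreads is not None:
        s1, s2 = new_spreads
    return (m, s1, s2)


# Observation i = 3 of SFN-1: original mean 8 with spreads 1.8 / 1.8.
original = (8.0, 1.8, 1.8)
o1 = inject_outlier(original, "o1", new_mean=1.0)              # (1.0, 1.8, 1.8)
o2 = inject_outlier(original, "o2", new_spreads=(35.0, 35.0))  # (8.0, 35.0, 35.0)
o3 = inject_outlier(original, "o3", new_mean=1.0, new_spreads=(35.0, 35.0))
```

This reproduces the three rows of Table 2 for SFN-1: each outlier type perturbs a different part of the same fuzzy observation.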
Table 3. Total errors of the possibilistic linear regression (PLR), the PLR combined with the omission approach (OPRL), the multi-objective fuzzy linear regression (MOFLR), the fuzzy least squares (FLS), the fuzzy least absolute linear regression (FLAR), the robust fuzzy regression (RFR) analysis, and the Boscovich fuzzy regression line (BFRL). The methods were tested on datasets with symmetric fuzzy numbers SFN-1 and SFN-2 affected by outliers of type o1, o2 and o3 (first column). Settings of the adjustable parameters h and $ω$ are stated under the method abbreviations. An asterisk indicates an outlier correctly recognized by OPRL.
| Dataset | PLR $h = 0$ | PLR $h = 0.5$ | OPRL $h = 0$ | OPRL $h = 0.5$ | MOFLR $ω = 0.1$ | MOFLR $ω = 0.5$ | MOFLR $ω = 0.99$ | FLS | FLAR | RFR | BFRL |
|---|---|---|---|---|---|---|---|---|---|---|---|
| SFN-1-o1 | 35.99 | 56.88 | 35.99 | 23.03 * | 220.58 | 39.57 | 22.50 | 15.74 | 14.17 | - | 14.59 |
| SFN-2-o1 | 7.58 | 12.70 | 4.33 * | 6.29 * | 73.74 | 12.29 | 5.82 | 3.76 | 3.01 | 3.76 | 3.49 |
| SFN-1-o2 | 131.05 | 131.05 | 42.03 * | 44.30 * | 836.00 | 132.99 | 79.48 | 52.97 | 41.08 | - | 53.02 |
| SFN-2-o2 | 8.21 | 10.17 | - | 5.62 * | 96.52 | 15.44 | 6.51 | 3.90 | 3.82 | 4.60 | 3.97 |
| SFN-1-o3 | 132.57 | 132.38 | 42.56 * | 45.06 * | 836.13 | 145.87 | 81.79 | 54.64 | 42.06 | - | 54.61 |
| SFN-2-o3 | 10.70 | 15.72 | 5.23 * | 6.96 * | 96.54 | 16.10 | 7.69 | 4.80 | 3.93 | 5.64 | 4.60 |
Table 4. Total errors of the fuzzy least squares (FLS), the fuzzy least absolute linear regression (FLAR), the robust fuzzy regression (RFR) analysis, and the Boscovich fuzzy regression line (BFRL) (first row) tested on datasets with non-symmetric fuzzy numbers, NFN-1, NFN-2, and NFN-3, affected by outliers of type o1, o2 and o3 (first column).
| Dataset | FLS | FLAR | RFR | BFRL |
|---|---|---|---|---|
| NFN-1-o1 | 180.45 | 146.88 | 166.43 | 189.74 |
| NFN-2-o1 | 19.23 | 15.34 | 20.14 | 19.14 |
| NFN-3-o1 | 5.30 | 2.29 | 5.32 | 5.03 |
| NFN-1-o2 | 207.69 | 183.84 | 192.53 | 214.43 |
| NFN-2-o2 | 15.74 | 14.89 | 17.04 | 16.54 |
| NFN-3-o2 | 5.86 | 4.94 | 6.57 | 6.15 |
| NFN-1-o3 | 259.93 | 205.94 | 393.57 | 259.24 |
| NFN-2-o3 | 25.98 | 19.47 | 28.62 | 25.64 |
| NFN-3-o3 | 11.63 | 5.58 | 11.69 | 10.60 |
Table 5. Parameters of fuzzy regression models $A ˜ 0$ and $A ˜ 1$ estimated by the possibilistic linear regression (PLR) and the PLR combined with the omission approach (OPRL) on datasets (first column) with symmetric fuzzy observations. Settings of the adjustable parameter h are stated in the second line.
| Dataset | PLR $h = 0$: $\tilde{A}_0$ | $\tilde{A}_1$ | PLR $h = 0.5$: $\tilde{A}_0$ | $\tilde{A}_1$ | OPRL $h = 0$: $\tilde{A}_0$ | $\tilde{A}_1$ | OPRL $h = 0.5$: $\tilde{A}_0$ | $\tilde{A}_1$ |
|---|---|---|---|---|---|---|---|---|
| SFN-1 | (3.85, 3.85, 3.85)_T | (2.10, 0.00, 0.00)_T | (4.15, 5.57, 5.57)_T | (1.97, 0.00, 0.00)_T | (3.85, 3.85, 3.85)_T | (2.10, 0.00, 0.00)_T | (4.15, 5.57, 5.57)_T | (1.97, 0.00, 0.00)_T |
| SFN-2 | (1.28, 0.83, 0.83)_T | (0.13, 0.00, 0.00)_T | (1.39, 1.23, 1.23)_T | (0.11, 0.00, 0.00)_T | (1.03, 0.63, 0.63)_T | (0.14, 0.01, 0.01)_T | (0.94, 0.98, 0.98)_T | (0.15, 0.00, 0.00)_T |
| SFN-1-o1 | (1.68, 6.02, 6.02)_T | (1.51, 0.59, 0.59)_T | (2.27, 9.32, 9.32)_T | (1.33, 1.27, 1.27)_T | (1.68, 6.02, 6.02)_T | (1.51, 0.59, 0.59)_T | (4.15, 5.57, 5.57)_T | (1.20, 0.00, 0.00)_T |
| SFN-2-o1 | (2.85, 1.25, 1.25)_T | (0.03, 0.00, 0.00)_T | (3.05, 2.00, 2.00)_T | (0.02, 0.00, 0.00)_T | (1.29, 0.82, 0.82)_T | (0.13, 0.00, 0.00)_T | (1.44, 1.12, 1.12)_T | (0.11, 0.00, 0.00)_T |
| SFN-1-o2 | (5.53, 26.47, 26.47)_T | (1.32, 2.84, 2.84)_T | (5.52, 25.25, 25.25)_T | (1.33, 3.25, 3.25)_T | (3.85, 3.85, 3.85)_T | (2.10, 0.00, 0.00)_T | (4.15, 5.57, 5.57)_T | (1.98, 0.00, 0.00)_T |
| SFN-2-o2 | (0.53, 1.60, 1.60)_T | (0.16, 0.00, 0.00)_T | (1.09, 1.83, 1.83)_T | (0.11, 0.00, 0.00)_T | - | - | (1.44, 1.12, 1.12)_T | (0.11, 0.00, 0.00)_T |
| SFN-1-o3 | (6.68, 26.19, 26.19)_T | (−1.89, 2.94, 2.94)_T | (4.13, 24.07, 24.07)_T | (−1.04, 3.64, 3.64)_T | (3.85, 3.85, 3.85)_T | (2.10, 0.00, 0.00)_T | (4.15, 5.57, 5.57)_T | (1.97, 0.00, 0.00)_T |
| SFN-2-o3 | (4.01, 1.77, 1.77)_T | (−0.02, 0.00, 0.00)_T | (3.63, 2.52, 2.52)_T | (−0.01, 0.00, 0.00)_T | (1.29, 0.82, 0.82)_T | (0.13, 0.00, 0.00)_T | (1.44, 1.12, 1.12)_T | (0.11, 0.00, 0.00)_T |
Table 6. Parameters of fuzzy regression models $A ˜ 0$ and $A ˜ 1$ estimated by the multi-objective fuzzy linear regression (MOFLR) on datasets (first column) with symmetric fuzzy observations. Settings of the adjustable parameter $ω$ are stated in the second line. Negative spreads are in bold.
| Dataset | $ω = 0.1$: $\tilde{A}_0$ | $\tilde{A}_1$ | $ω = 0.5$: $\tilde{A}_0$ | $\tilde{A}_1$ | $ω = 0.99$: $\tilde{A}_0$ | $\tilde{A}_1$ |
|---|---|---|---|---|---|---|
| SFN-1 | (4.95, 36.80, 36.80)_T | (1.71, 3.20, 3.20)_T | (4.95, 7.36, 7.36)_T | (1.71, 0.64, 0.64)_T | (4.95, 1.84, 1.84)_T | (1.71, 0.16, 0.16)_T |
| SFN-2 | (1.38, 2.95, 2.95)_T | (0.12, 0.50, 0.50)_T | (1.38, 0.59, 0.59)_T | (0.12, 0.10, 0.10)_T | (1.37, 0.30, 0.30)_T | (0.12, 0.05, 0.05)_T |
| SFN-1-o1 | (3.25, 36.80, 36.80)_T | (1.71, 3.20, 3.20)_T | (3.25, 7.36, 7.36)_T | (1.71, 0.64, 0.64)_T | (3.25, 3.72, 3.72)_T | (1.71, 0.32, 0.32)_T |
| SFN-2-o1 | (2.38, 2.95, 2.95)_T | (0.06, 0.50, 0.50)_T | (2.38, 0.59, 0.59)_T | (0.06, 0.10, 0.10)_T | (2.38, 0.30, 0.30)_T | (0.07, 0.05, 0.05)_T |
| SFN-1-o2 | (4.95, 166.40, 166.40)_T | (1.71, 3.20, 3.20)_T | (4.95, 33.28, 33.28)_T | (1.71, 0.64, 0.64)_T | (4.94, 16.81, 16.81)_T | (1.71, 0.32, 0.32)_T |
| SFN-2-o2 | (1.38, 14.95, 14.95)_T | (0.12, **−0.17**, **−0.17**)_T | (1.38, 2.99, 2.99)_T | (0.12, **−0.03**, **−0.03**)_T | (1.35, 1.51, 1.51)_T | (0.12, **−0.02**, **−0.02**)_T |
| SFN-1-o3 | (3.25, 166.40, 166.40)_T | (1.71, 3.20, 3.20)_T | (3.25, 33.28, 33.28)_T | (1.71, 0.64, 0.64)_T | (3.26, 16.81, 16.81)_T | (1.71, 0.32, 0.32)_T |
| SFN-2-o3 | (2.38, 14.95, 14.95)_T | (0.06, **−0.17**, **−0.17**)_T | (2.38, 2.99, 2.99)_T | (0.06, **−0.03**, **−0.03**)_T | (2.38, 1.51, 1.51)_T | (0.07, **−0.02**, **−0.02**)_T |
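The point of highlighting negative spreads in Table 6 can be made concrete: a triangular fuzzy number $(m, α, β)$ is only well formed when both spreads are non-negative, so the bold MOFLR estimates describe no valid fuzzy set. A minimal validity check (the function name is ours, for illustration only):

```python
def has_valid_spreads(tfn):
    """A triangular fuzzy number (m, alpha, beta) requires alpha >= 0 and beta >= 0."""
    m, alpha, beta = tfn
    return alpha >= 0 and beta >= 0


# MOFLR estimate of A1 for SFN-2-o2 at omega = 0.1 has negative spreads:
print(has_valid_spreads((0.12, -0.17, -0.17)))  # False
# A well-formed estimate passes the check:
print(has_valid_spreads((1.38, 2.99, 2.99)))    # True
```

Such a check makes the contrast with BFRL explicit, since BFRL guarantees non-negative spreads by construction.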
Table 7. Parameters of fuzzy regression models $A ˜ 0 , A ˜ 1 , a , b , g , d , h$ and $a 1$ estimated by the fuzzy least squares (FLS), the fuzzy least absolute linear regression (FLAR), the robust fuzzy regression (RFR) analysis, and the Boscovich fuzzy regression line (BFRL) on datasets (first column) with symmetric fuzzy observations. Negative spreads are in bold.
| Dataset | FLS $\tilde{A}_0$ | FLS $\tilde{A}_1$ | FLAR $\tilde{A}_0$ | FLAR $\tilde{A}_1$ | RFR $a^T$ | $b$ | $g$ | $d$ | $h$ | BFRL $\tilde{A}_0$ | BFRL $a_1$ |
|---|---|---|---|---|---|---|---|---|---|---|---|
| SFN-1 | (4.95, 1.84, 1.84)_T | (1.71, 0.16, 0.16)_T | (5.61, 1.80, 1.80)_T | (1.48, 0.20, 0.20)_T | - | - | - | - | - | (5.70, 2.32, 2.32)_T | 1.46 |
| SFN-2 | (1.38, 1.48, 1.48)_T | (0.12, 0.03, 0.03)_T | (1.41, 0.10, 0.10)_T | (0.12, 0.03, 0.03)_T | (0.81, 0.16) | 0.18 | 0.18 | −0.03 | −0.02 | (1.20, 0.49, 0.49)_T | 0.13 |
| SFN-1-o1 | (3.25, 1.84, 1.84)_T | (1.71, 0.16, 0.16)_T | (4.14, 1.80, 1.80)_T | (1.77, 0.20, 0.20)_T | - | - | - | - | - | (2.44, 2.32, 2.32)_T | 1.98 |
| SFN-2-o1 | (2.38, 0.15, 0.15)_T | (0.06, 0.03, 0.03)_T | (2.00, 0.10, 0.10)_T | (0.08, 0.03, 0.03)_T | (2.37, 0.06) | 0.39 | 0.39 | −0.77 | −0.77 | (1.90, 0.49, 0.49)_T | 0.10 |
| SFN-1-o2 | (4.95, 8.32, 8.32)_T | (1.71, 0.16, 0.16)_T | (5.60, 1.80, 1.80)_T | (1.48, 0.20, 0.20)_T | - | - | - | - | - | (5.70, 8.80, 8.80)_T | 1.46 |
| SFN-2-o2 | (1.38, 0.75, 0.75)_T | (0.12, **−0.01**, **−0.01**)_T | (1.42, 0.21, 0.21)_T | (0.12, 0.02, 0.02)_T | (1.57, 0.10) | 0.18 | 0.19 | 0.14 | 0.12 | (1.20, 0.64, 0.64)_T | 0.13 |
| SFN-1-o3 | (3.25, 8.32, 8.32)_T | (1.71, 0.16, 0.16)_T | (4.05, 1.80, 1.80)_T | (1.79, 0.20, 0.20)_T | - | - | - | - | - | (2.44, 8.80, 8.80)_T | 1.98 |
| SFN-2-o3 | (2.38, 0.75, 0.75)_T | (0.06, **−0.01**, **−0.01**)_T | (2.00, 0.21, 0.21)_T | (0.08, 0.02, 0.02)_T | (2.78, 0.03) | 0.18 | 0.18 | 0.03 | 0.03 | (1.90, 0.64, 0.64)_T | 0.10 |
Table 8. Parameters of fuzzy regression models $A ˜ 0 , A ˜ 1 , a , b , g , d , h$ and $a 1$ estimated by the fuzzy least squares (FLS), the fuzzy least absolute linear regression (FLAR), the robust fuzzy regression (RFR) analysis, and the Boscovich fuzzy regression line (BFRL) on datasets (first column) with non-symmetric fuzzy observations. Negative spreads are in bold.
| Dataset | FLS $\tilde{A}_0$ | FLS $\tilde{A}_1$ | FLAR $\tilde{A}_0$ | FLAR $\tilde{A}_1$ | RFR $a^T$ | $b$ | $g$ | $d$ | $h$ | BFRL $\tilde{A}_0$ | BFRL $a_1$ |
|---|---|---|---|---|---|---|---|---|---|---|---|
| NFN-1 | (24.47, 4.85, 4.46)_T | (34.05, 4.95, 5.80)_T | (25.46, 4.68, 4.82)_T | (32.90, 4.90, 5.45)_T | (21.66, 35.16) | 0.16 | 0.17 | 0.67 | 0.39 | (27.07, 12.28, 13.16)_T | 32.32 |
| NFN-2 | (12.93, 1.29, 1.70)_T | (0.54, 0.04, 0.01)_T | (12.86, 1.30, 1.58)_T | (0.57, 0.02, 0.02)_T | (12.65, 0.55) | 0.09 | 0.09 | 0.13 | 0.13 | (12.93, 1.63, 1.79)_T | 0.54 |
| NFN-3 | (1.31, 0.16, 0.29)_T | (0.13, 0.01, 0.02)_T | (0.51, 0.15, 0.29)_T | (0.17, 0.01, 0.02)_T | (1.31, 0.13) | 0.06 | 0.15 | 0.08 | 0.10 | (1.09, 0.26, 0.54)_T | 0.14 |
| NFN-1-o1 | (33.94, 4.85, 4.46)_T | (31.87, 4.95, 5.80)_T | (25.46, 4.70, 4.82)_T | (32.90, 4.94, 5.45)_T | (35.02, 27.99) | 0.16 | 0.18 | 0.17 | 0.02 | (37.05, 12.28, 13.16)_T | 29.79 |
| NFN-2-o1 | (13.61, 1.29, 1.70)_T | (0.20, 0.04, 0.01)_T | (13.00, 1.30, 1.58)_T | (0.50, 0.02, 0.02)_T | (13.25, 0.19) | 0.10 | 0.12 | 0.06 | 0.02 | (13.61, 1.63, 1.79)_T | 0.20 |
| NFN-3-o1 | (6.80, 0.16, 0.29)_T | (−0.22, 0.01, 0.02)_T | (0.51, 0.15, 0.29)_T | (0.17, 0.01, 0.02)_T | (8.62, −0.36) | 0.01 | 0.05 | 0.18 | 0.26 | (5.10, 0.26, 0.54)_T | −0.09 |
| NFN-1-o2 | (24.47, 9.92, 10.67)_T | (34.05, 3.78, 4.36)_T | (25.46, 4.55, 4.82)_T | (32.90, 5.03, 5.45)_T | (34.14, 29.79) | 0.15 | 0.17 | 0.73 | 0.49 | (27.07, 15.59, 17.22)_T | 32.32 |
| NFN-2-o2 | (12.93, 0.99, 1.76)_T | (0.54, 0.19, **−0.02**)_T | (12.86, 0.95, 1.58)_T | (0.57, 0.13, 0.02)_T | (12.86, 0.47) | 0.10 | 0.10 | 0.07 | 0.11 | (12.93, 2.69, 1.58)_T | 0.54 |
| NFN-3-o2 | (1.31, 2.38, 3.11)_T | (0.13, **−0.13**, **−0.16**)_T | (0.51, 0.20, 0.41)_T | (0.17, 0.01, 0.01)_T | (1.50, 0.11) | −0.01 | 0.11 | 0.68 | 0.71 | (1.09, 0.61, 0.99)_T | 0.14 |
| NFN-1-o3 | (33.94, 9.92, 10.67)_T | (31.87, 3.78, 4.36)_T | (25.46, 4.71, 4.82)_T | (32.90, 4.97, 5.45)_T | (53.26, 25.81) | 0.19 | 0.23 | 3.44 | 3.25 | (37.05, 15.59, 17.22)_T | 29.79 |
| NFN-2-o3 | (13.61, 0.99, 1.76)_T | (0.20, 0.19, **−0.02**)_T | (13.00, 0.95, 1.58)_T | (0.50, 0.13, 0.02)_T | (13.69, 0.10) | 0.21 | 0.11 | −0.12 | −0.03 | (13.61, 2.69, 1.58)_T | 0.20 |
| NFN-3-o3 | (6.80, 2.38, 3.11)_T | (−0.22, **−0.13**, **−0.16**)_T | (0.51, 0.20, 0.41)_T | (0.17, 0.01, 0.01)_T | (8.44, −0.37) | 0.34 | 0.42 | −0.55 | −0.44 | (5.10, 0.61, 0.99)_T | −0.09 |
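Tables 7 and 8 describe the BFRL model by a triangular intercept $\tilde{A}_0 = (m_0, α_0, β_0)_T$ and a crisp slope $a_1$. Assuming the fitted line is evaluated as $\tilde{Y}(x) = \tilde{A}_0 + a_1 x$ with standard fuzzy arithmetic, adding the crisp term $a_1 x$ only shifts the mean, so a prediction inherits the intercept's spreads. A minimal sketch (the function name is ours, for illustration):

```python
def bfrl_predict(A0, a1, x):
    """Evaluate Y(x) = A0 + a1*x for triangular A0 = (m0, alpha0, beta0),
    crisp slope a1 and crisp input x. The crisp term a1*x shifts the mean;
    the spreads of A0 carry over unchanged."""
    m0, alpha0, beta0 = A0
    return (m0 + a1 * x, alpha0, beta0)


# SFN-1 row of Table 7: A0 = (5.70, 2.32, 2.32), a1 = 1.46
pred = bfrl_predict((5.70, 2.32, 2.32), 1.46, 10.0)
# mean is 5.70 + 14.6 = 20.30 (up to float rounding); spreads stay 2.32 / 2.32
```

This also illustrates why BFRL predictions always have non-negative spreads: they are exactly the spreads of the estimated intercept.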
## Share and Cite

MDPI and ACS Style

Škrabánek, P.; Marek, J.; Pozdílková, A. Boscovich Fuzzy Regression Line. Mathematics 2021, 9, 685. https://doi.org/10.3390/math9060685
