# Lower Approximation Reduction Based on Discernibility Information Tree in Inconsistent Ordered Decision Information Systems


## Abstract


## 1. Introduction

## 2. Preliminary

#### 2.1. Inconsistent Ordered Decision Information System (IODIS)

**Definition 1.**

**Definition 2.**

**Proposition 1.**

- (1) ${R}_{A}^{\succeq}\supseteq {R}_{AT}^{\succeq}$, ${\left[{x}_{i}\right]}_{A}^{\succeq}\supseteq {\left[{x}_{i}\right]}_{AT}^{\succeq}$;
- (2) ${x}_{j}\in {\left[{x}_{i}\right]}_{A}^{\succeq}$, ${x}_{i}\in {\left[{x}_{j}\right]}_{A}^{\succeq}\iff {\left[{x}_{j}\right]}_{A}^{\succeq}={\left[{x}_{i}\right]}_{A}^{\succeq}$;
- (3) ${\left[{x}_{i}\right]}_{A}^{\succeq}=\bigcup \{{\left[{x}_{j}\right]}_{A}^{\succeq}\mid {x}_{j}\in {\left[{x}_{i}\right]}_{A}^{\succeq}\}$;
- (4) ${\left[{x}_{i}\right]}_{A}^{\succeq}={\left[{x}_{j}\right]}_{A}^{\succeq}\iff f({x}_{i},a)=f({x}_{j},a)\phantom{\rule{4pt}{0ex}}(\forall a\in A)$.
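Although the formal statements are omitted here, the dominance classes behind Proposition 1 are straightforward to compute. The following is a minimal sketch (the function name, the toy table, and the criteria p and q are our illustrations, not the paper's example): $[{x}_{i}]_{A}^{\succeq}$ collects every object that is at least as good as ${x}_{i}$ on each attribute of $A$.

```python
def dominance_class(table, i, attrs):
    """Objects x_j with f(x_j, a) >= f(x_i, a) for every a in attrs."""
    return {j for j in table
            if all(table[j][a] >= table[i][a] for a in attrs)}

# A hypothetical three-object table with two criteria p and q.
toy = {
    1: {"p": 1, "q": 2},
    2: {"p": 2, "q": 2},
    3: {"p": 0, "q": 1},
}

worst = dominance_class(toy, 3, ("p", "q"))  # every object dominates x_3
```

Property (1) of Proposition 1 is visible here: checking fewer attributes can only admit more objects, so the class grows as attributes are dropped.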

#### 2.2. Lower Approximation Reduction in an IODIS

**Definition 3.**

**Theorem 1.**

**Definition 4.**

**Theorem 2.**

**Definition 5.**

**Theorem 3.**

#### 2.3. Discernibility Information Tree

- (1) Every subtree of the discernibility information tree is also an ordered tree, arranged from left to right in the order of the condition attribute set.
- (2) Each node of the discernibility information tree consists of four parts: a prefix pointer, a successor pointer, a node name, and a same-name pointer. The prefix pointer points to the previous-level node (i.e., the parent) of this node, and the successor pointer points to the succeeding node (i.e., a child) of this node. The node name records the condition attribute corresponding to the node, and the same-name pointer points to a node of the discernibility information tree that has the same node name on another path.
- (3) Each node of the discernibility information tree has at most $\left|AT\right|$ child nodes, where $\left|AT\right|$ is the number of condition attributes in the ordered decision information system.
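Concretely, the four-part node described in (2) might be laid out as follows (a sketch; the class and field names are ours, not the paper's):

```python
class DITNode:
    """One node of a discernibility information tree, per the description above."""

    def __init__(self, name, parent=None):
        self.name = name        # node name: the condition attribute it stands for
        self.parent = parent    # prefix pointer: the previous-level (parent) node
        self.children = []      # successor pointers: at most |AT| child nodes
        self.same_name = None   # same-name pointer: a like-named node on another path
```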

## 3. The Method of the Lower Approximation Reduction Based on Discernibility Information Tree in an IODIS

Algorithm 1: The algorithm for constructing the discernibility information tree in an inconsistent ordered decision information system.
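A simplified way to picture Algorithm 1's insertion step is a trie over attribute names: each discernibility set is sorted by a fixed attribute order and inserted as a path, so sets sharing a prefix share nodes. The dict-based sketch below is our simplification (it models neither the paper's pointers nor any absorption of supersets); it reproduces the four paths $<a>$, $<b,c,d,e>$, $<c,d>$, $<b,e>$ used in the walkthrough of Section 4.

```python
ORDER = "abcde"  # assumed fixed order of the condition attributes

def build_tree(disc_sets):
    """Insert each discernibility set as an ordered path; shared prefixes merge."""
    root = {}
    for s in disc_sets:
        node = root
        for attr in sorted(s, key=ORDER.index):
            node = node.setdefault(attr, {})  # reuse an existing child or create one
    return root

# The four path sets that appear in the Section 4 walkthrough.
tree = build_tree([{"a"}, {"b", "c", "d", "e"}, {"c", "d"}, {"b", "e"}])
```

Because $\{b,c,d,e\}$ and $\{b,e\}$ start with the same attribute, they share the $b$ node, which is exactly the compression the discernibility information tree is designed to achieve.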

**Theorem 4.**

**Proof.**

**Theorem 5.**

**Proof.**

**Theorem 6.**

**Proof.**

Algorithm 2: The algorithm of lower approximation reduction based on the discernibility information tree in an IODIS.

- ${\underline{\eta}}_{A}^{\succeq}={\underline{\eta}}_{AT}^{\succeq}$;
- $\forall a\in A$, ${\underline{\eta}}_{A-\left\{a\right\}}^{\succeq}\ne {\underline{\eta}}_{AT}^{\succeq}$.

- $\forall ({x}_{i},{x}_{j})\in {D}_{\succeq AT}^{\underline{\eta}}$, $A\cap {\mathcal{D}}_{\succeq AT}^{\underline{\eta}}({x}_{i},{x}_{j})\ne \varnothing $;
- $\forall a\in A$, $\exists ({x}_{i},{x}_{j})\in {D}_{\succeq AT}^{\underline{\eta}}$ such that $(A-\{a\})\cap {\mathcal{D}}_{\succeq AT}^{\underline{\eta}}({x}_{i},{x}_{j})=\varnothing $.

- $\forall M\in DS$, $M\cap A\ne \varnothing $;
- $\forall a\in A$, $\exists M\in DS$ such that $M\cap (A-\{a\})=\varnothing $.
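These two conditions translate directly into a membership test: $A$ is a lower approximation reduction exactly when it intersects every non-empty discernibility set and no single attribute can be dropped without missing some set. Below is a set-based sketch (the function name is ours), using the distinct non-empty entries of the Section 4 discernibility matrix:

```python
def is_lower_reduct(A, disc_sets):
    """A hits every discernibility set, and no A - {a} still hits them all."""
    def hits_all(subset):
        return all(subset & d for d in disc_sets)
    return hits_all(A) and all(not hits_all(A - {a}) for a in A)

# Distinct non-empty sets from the Section 4 discernibility matrix.
sets = [{"a"}, {"b", "c", "d", "e"}, {"c", "d"},
        {"a", "b", "c", "d", "e"}, {"b", "e"}]
```

For instance, $\{a,b,c,d\}$ fails the second condition because $d$ is redundant, while $\{a,b\}$ fails the first because it misses $\{c,d\}$.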

## 4. An Illustrative Example

- ${\left[{x}_{1}\right]}_{AT}^{\succeq}=\{{x}_{1},{x}_{5},{x}_{6},{x}_{7},{x}_{8},{x}_{9}\}$,
- ${\left[{x}_{2}\right]}_{AT}^{\succeq}=\{{x}_{2},{x}_{8},{x}_{9}\}$,
- ${\left[{x}_{3}\right]}_{AT}^{\succeq}=\{{x}_{1},{x}_{2},{x}_{3},{x}_{5},{x}_{6},{x}_{7},{x}_{8},{x}_{9}\}$,
- ${\left[{x}_{4}\right]}_{AT}^{\succeq}=\{{x}_{2},{x}_{4},{x}_{8},{x}_{9}\}$,
- ${\left[{x}_{5}\right]}_{AT}^{\succeq}=\{{x}_{5},{x}_{6}\}$,
- ${\left[{x}_{6}\right]}_{AT}^{\succeq}=\{{x}_{5},{x}_{6}\}$,
- ${\left[{x}_{7}\right]}_{AT}^{\succeq}=\{{x}_{5},{x}_{6},{x}_{7},{x}_{8},{x}_{9}\}$,
- ${\left[{x}_{8}\right]}_{AT}^{\succeq}=\{{x}_{8},{x}_{9}\}$,
- ${\left[{x}_{9}\right]}_{AT}^{\succeq}=\left\{{x}_{9}\right\}$,
- ${\left[{x}_{10}\right]}_{AT}^{\succeq}=\left\{{x}_{10}\right\}$.

- ${D}_{1}={\left[{x}_{3}\right]}_{D}^{\succeq}={\left[{x}_{5}\right]}_{D}^{\succeq}={\left[{x}_{6}\right]}_{D}^{\succeq}={\left[{x}_{9}\right]}_{D}^{\succeq}=\{{x}_{3},{x}_{5},{x}_{6},{x}_{9}\}$,
- ${D}_{2}={\left[{x}_{2}\right]}_{D}^{\succeq}={\left[{x}_{8}\right]}_{D}^{\succeq}=\{{x}_{2},{x}_{3},{x}_{5},{x}_{6},{x}_{8},{x}_{9}\}$,
- ${D}_{3}={\left[{x}_{1}\right]}_{D}^{\succeq}={\left[{x}_{7}\right]}_{D}^{\succeq}=\{{x}_{1},{x}_{2},{x}_{3},{x}_{5},{x}_{6},{x}_{7},{x}_{8},{x}_{9}\}$,
- ${D}_{4}={\left[{x}_{4}\right]}_{D}^{\succeq}={\left[{x}_{10}\right]}_{D}^{\succeq}=\{{x}_{1},{x}_{2},{x}_{3},{x}_{4},{x}_{5},{x}_{6},{x}_{7},{x}_{8},{x}_{9},{x}_{10}\}$.

- $\underline{{R}_{AT}^{\succeq}}\left({D}_{1}\right)=\{{x}_{5},{x}_{6},{x}_{9}\}$
- $\underline{{R}_{AT}^{\succeq}}\left({D}_{2}\right)=\{{x}_{2},{x}_{5},{x}_{6},{x}_{8},{x}_{9}\}$
- $\underline{{R}_{AT}^{\succeq}}\left({D}_{3}\right)=\{{x}_{1},{x}_{2},{x}_{3},{x}_{5},{x}_{6},{x}_{7},{x}_{8},{x}_{9}\}$
- $\underline{{R}_{AT}^{\succeq}}\left({D}_{4}\right)=\{{x}_{1},{x}_{2},{x}_{3},{x}_{4},{x}_{5},{x}_{6},{x}_{7},{x}_{8},{x}_{9},{x}_{10}\}$
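These lower approximations can be re-derived mechanically from the decision table. The sketch below (illustrative code, not the paper's algorithm) computes each dominance class over all condition attributes and then takes $\underline{{R}_{AT}^{\succeq}}\left({D}_{k}\right)=\{x \mid {\left[x\right]}_{AT}^{\succeq}\subseteq {D}_{k}\}$:

```python
# Decision table of this example: object -> values on (a, b, c, d, e).
TABLE = {
    1: (2, 1, 3, 2, 3), 2: (4, 0, 2, 1, 2), 3: (2, 0, 2, 1, 2),
    4: (4, 0, 1, 0, 2), 5: (3, 3, 5, 4, 5), 6: (3, 3, 5, 4, 5),
    7: (2, 2, 3, 2, 4), 8: (4, 3, 4, 3, 5), 9: (4, 4, 4, 3, 6),
    10: (1, 4, 5, 4, 6),
}

def dom_class(i):
    """[x_i]^>= : objects at least as good as x_i on every condition attribute."""
    return {j for j, row in TABLE.items()
            if all(v >= w for v, w in zip(row, TABLE[i]))}

def lower_approx(D):
    """Objects whose whole dominance class lies inside D."""
    return {i for i in TABLE if dom_class(i) <= D}

D1 = {3, 5, 6, 9}
approx_D1 = lower_approx(D1)  # the lower approximation of D_1
```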

- (1) According to the first step of Algorithm 2, we first initialize an empty set A.
- (2) Based on the lower approximation discernibility matrix, a path consisting of the single node $\left\{a\right\}$ is selected in the discernibility information tree, and all paths that contain only the node $\left\{a\right\}$ are deleted; that is, the path $<a>$ is removed.
- (3) $A=A\cup \left\{a\right\}$.
- (4) Choose the right child node c of the root node of the discernibility information tree and set $A=A\cup \left\{c\right\}$.
- (5) Delete the paths $<b,c,d,e>$ and $<c,d>$, which contain the node c.
- (6) At this point, only one path, $<b,e>$, remains in the discernibility information tree. Select the node b, delete the path $<b,e>$, and set $A=A\cup \left\{b\right\}$.
- (7) Now the root node of the discernibility information tree has no child nodes, so the algorithm terminates. A lower approximation reduction based on the discernibility information tree is $A=\{a,b,c\}$.
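Steps (1)–(7) can be condensed into a set-based greedy sketch. This is our simplification of Algorithm 2: it acts on the path sets rather than on the tree itself, and the "right child of the root" choice is modeled as taking the latest first attribute (in the order a < b < c < d < e) among the remaining paths.

```python
ORDER = "abcde"  # assumed attribute order

def greedy_reduct(paths):
    """Greedy selection mirroring the walkthrough: singleton paths force
    their attribute into A; otherwise pick the right-most root child;
    every path covered by the chosen attribute is deleted."""
    remaining = [set(p) for p in paths]
    A = set()
    while remaining:
        singles = [p for p in remaining if len(p) == 1]
        if singles:
            pick = next(iter(singles[0]))                 # steps (2)-(3)
        else:
            pick = max((min(p, key=ORDER.index) for p in remaining),
                       key=ORDER.index)                   # steps (4), (6)
        A.add(pick)
        remaining = [p for p in remaining if pick not in p]  # steps (2), (5), (6)
    return A

reduct = greedy_reduct([{"a"}, {"b", "c", "d", "e"}, {"c", "d"}, {"b", "e"}])
```

On the four walkthrough paths this reproduces the reduct $\{a,b,c\}$: $a$ is forced by the singleton path, $c$ is the right-most root child covering $<b,c,d,e>$ and $<c,d>$, and $b$ covers the remaining path $<b,e>$.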

## 5. Conclusions

## Author Contributions

## Funding

## Acknowledgments

## Conflicts of Interest

## References


**Figure 2.** The flow chart of lower approximation reduction based on the discernibility information tree in an IODIS.

U | a | b | c | d | e | $\mathit{Decision}$
---|---|---|---|---|---|---
${x}_{1}$ | 2 | 1 | 3 | 2 | 3 | 2
${x}_{2}$ | 4 | 0 | 2 | 1 | 2 | 3
${x}_{3}$ | 2 | 0 | 2 | 1 | 2 | 4
${x}_{4}$ | 4 | 0 | 1 | 0 | 2 | 1
${x}_{5}$ | 3 | 3 | 5 | 4 | 5 | 4
${x}_{6}$ | 3 | 3 | 5 | 4 | 5 | 4
${x}_{7}$ | 2 | 2 | 3 | 2 | 4 | 2
${x}_{8}$ | 4 | 3 | 4 | 3 | 5 | 3
${x}_{9}$ | 4 | 4 | 4 | 3 | 6 | 4
${x}_{10}$ | 1 | 4 | 5 | 4 | 6 | 1

U | ${\mathit{x}}_{1}$ | ${\mathit{x}}_{2}$ | ${\mathit{x}}_{3}$ | ${\mathit{x}}_{4}$ | ${\mathit{x}}_{5}$ | ${\mathit{x}}_{6}$ | ${\mathit{x}}_{7}$ | ${\mathit{x}}_{8}$ | ${\mathit{x}}_{9}$ | ${\mathit{x}}_{10}$
---|---|---|---|---|---|---|---|---|---|---
${x}_{1}$ | ⌀ | ⌀ | ⌀ | $\{b,c,d,e\}$ | ⌀ | ⌀ | ⌀ | ⌀ | ⌀ | $\left\{a\right\}$
${x}_{2}$ | $\left\{a\right\}$ | ⌀ | $\left\{a\right\}$ | $\{c,d\}$ | ⌀ | ⌀ | $\left\{a\right\}$ | ⌀ | ⌀ | $\left\{a\right\}$
${x}_{3}$ | ⌀ | ⌀ | ⌀ | $\{c,d\}$ | ⌀ | ⌀ | ⌀ | ⌀ | ⌀ | $\left\{a\right\}$
${x}_{4}$ | ⌀ | ⌀ | ⌀ | ⌀ | ⌀ | ⌀ | ⌀ | ⌀ | ⌀ | ⌀
${x}_{5}$ | $\{a,b,c,d,e\}$ | $\{b,c,d,e\}$ | $\{a,b,c,d,e\}$ | $\{b,c,d,e\}$ | ⌀ | ⌀ | $\{a,b,c,d,e\}$ | $\{c,d\}$ | ⌀ | $\left\{a\right\}$
${x}_{6}$ | $\{a,b,c,d,e\}$ | $\{b,c,d,e\}$ | $\{a,b,c,d,e\}$ | $\{b,c,d,e\}$ | ⌀ | ⌀ | $\{a,b,c,d,e\}$ | $\{c,d\}$ | ⌀ | $\left\{a\right\}$
${x}_{7}$ | ⌀ | ⌀ | ⌀ | $\{b,c,d,e\}$ | ⌀ | ⌀ | ⌀ | ⌀ | ⌀ | $\left\{a\right\}$
${x}_{8}$ | $\{a,b,c,d,e\}$ | ⌀ | $\{b,c,d,e\}$ | $\{b,c,d,e\}$ | ⌀ | ⌀ | $\{a,b,c,d,e\}$ | ⌀ | ⌀ | $\left\{a\right\}$
${x}_{9}$ | $\{a,b,c,d,e\}$ | $\{b,c,d,e\}$ | $\{a,b,c,d,e\}$ | $\{b,c,d,e\}$ | ⌀ | ⌀ | $\{a,b,c,d,e\}$ | $\{b,e\}$ | ⌀ | $\left\{a\right\}$
${x}_{10}$ | ⌀ | ⌀ | ⌀ | ⌀ | ⌀ | ⌀ | ⌀ | ⌀ | ⌀ | ⌀

© 2018 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (http://creativecommons.org/licenses/by/4.0/).

## Share and Cite

**MDPI and ACS Style**

Zhang, J.; Zhang, X.; Xu, W.
Lower Approximation Reduction Based on Discernibility Information Tree in Inconsistent Ordered Decision Information Systems. *Symmetry* **2018**, *10*, 696.
https://doi.org/10.3390/sym10120696
