Open Access: this article is freely available and re-usable.

*Computers* **2020**, *9*(1), 3; https://doi.org/10.3390/computers9010003

Article

Dynamic Boundary of P-Set and Intelligent Acquisition for Two Types of Information Fusion

^{1} School of Business, Shandong Normal University, Jinan 250014, China

^{2} School of Mathematics, Shandong University, Jinan 250100, China

^{*} Author to whom correspondence should be addressed.

Received: 3 December 2019 / Accepted: 13 January 2020 / Published: 16 January 2020

## Abstract

The development of information technology brings the challenges of data redundancy and data shortage to information fusion. Based on the dynamic boundary characteristics of the p-set, this paper analyzes the structure and generation of the p-augmented matrix and the dynamic generation of information equivalence classes, and then proposes an intelligent acquisition algorithm for information equivalence classes based on matrix reasoning. In addition, this paper analyzes two types of information fusion, namely information redundancy fusion and information supplement fusion, and the relationships among redundant information fusion, supplementary information fusion, and information equivalence classes. Finally, this paper presents an application of the intelligent acquisition of information equivalence classes to information retrieval.

Keywords:

p-sets; information equivalence class; intelligent acquisition algorithm

## 1. Introduction

Information fusion exists widely in the biological world and has been an intrinsic feature of organisms from ancient times to the present [1]. As a hot field of information science, information fusion technology originated from military applications in the 1970s [2]. Through continuous waves of research from the early 1980s to the present, the theory and technology of information fusion have developed rapidly [3,4]. As an independent discipline, information fusion has been successfully applied to military fields such as military command automation, strategic early warning and defense, and multi-target tracking, and has gradually spread to many civil fields such as intelligent transportation, remote sensing monitoring, e-commerce, artificial intelligence, wireless communication, industrial process monitoring, and fault diagnosis.

Information fusion is a formal framework that uses mathematical methods and technical tools to synthesize different information in order to obtain high-quality, useful information [5,6,7,8]. Compared with single-source independent processing, the advantages of information fusion include improved detectability and credibility, an expanded space-time sensing range, reduced reasoning ambiguity, improved detection accuracy, an increased target feature dimension, improved spatial resolution, and enhanced system fault tolerance and adaptability, all of which improve overall system performance.

In the past 20 years, scholars have put forward a variety of methods for information fusion and achieved rich research results [9,10,11,12]. Among them, the p-set theory and method is a unique approach. P-sets (P = packet) are a mathematical model with dynamic boundary features [13,14,15], obtained by introducing dynamic features into a finite ordinary element set X and improving it. The dynamic boundary features of the p-set are as follows: for a given finite set of ordinary elements X and the attribute set $\alpha $ of X, (a) if an attribute ${\alpha}_{i}$ is added to $\alpha $, $\alpha $ generates ${\alpha}^{F}$, $\alpha \subseteq {\alpha}^{F}$; some elements are then removed from X, and the boundary of X contracts inward. We say that X generates the internal p-set ${X}^{\overline{F}}$, ${X}^{\overline{F}}\subseteq X$. (b) If an attribute ${\alpha}_{i}$ is deleted from $\alpha $, $\alpha $ generates ${\alpha}^{\overline{F}}$, ${\alpha}^{\overline{F}}\subseteq \alpha $; X is then supplemented with some elements, and the boundary of X expands outward. We say that X generates the outer p-set ${X}^{F}$, $X\subseteq {X}^{F}$. (c) If some attributes are added to $\alpha $ while some other attributes are deleted from $\alpha $ at the same time, some elements are deleted from X and some other elements are added to X. We say that X generates a set pair $({X}^{\overline{F}},{X}^{F})$, which is named the p-set. (d) If this process continues, X generates multiple set pairs $({X}_{1}^{\overline{F}},{X}_{1}^{F})$, $({X}_{2}^{\overline{F}},{X}_{2}^{F})$, ⋯, $({X}_{n}^{\overline{F}},{X}_{n}^{F})$, and we obtain the dynamic boundary of the p-set: ${X}_{n}^{\overline{F}}\subseteq {X}_{n-1}^{\overline{F}}\subseteq \cdots \subseteq {X}_{2}^{\overline{F}}\subseteq {X}_{1}^{\overline{F}}$, ${X}_{1}^{F}\subseteq {X}_{2}^{F}\subseteq \cdots \subseteq {X}_{n}^{F}$.
In the p-set, the attribute ${\alpha}_{i}$ of the element ${x}_{i}$ satisfies the expansion or contraction of the “conjunctive normal form” in mathematical logic. For given information $\left(x\right)$ defined by X, the inner p-information ${\left(x\right)}^{\overline{F}}$, the outer p-information ${\left(x\right)}^{F}$, and the p-information $({\left(x\right)}^{\overline{F}},{\left(x\right)}^{F})$ are defined by ${X}^{\overline{F}}$, ${X}^{F}$, and $({X}^{\overline{F}},{X}^{F})$ respectively, i.e., $\left(x\right)=X$, ${\left(x\right)}^{\overline{F}}={X}^{\overline{F}}$, ${\left(x\right)}^{F}={X}^{F}$, $({\left(x\right)}^{\overline{F}},{\left(x\right)}^{F})=({X}^{\overline{F}},{X}^{F})$. We can thus expect p-sets to be useful for analyzing dynamic information recognition and information fusion. In fact, p-sets are a new mathematical method and model for researching dynamic information recognition and fusion, because each piece of information $\left(x\right)$ has an attribute set $\alpha $; that is, the information $\left(x\right)$ is associated with its attribute set $\alpha $. Existing research shows that the p-set and the p-augmented matrix have many applications in China [16,17,18,19,20,21,22,23,24,25,26,27,28,29,30,31,32,33,34,35,36,37,38], and applications of function p-sets, inverse p-sets, and function inverse p-sets have been made by many researchers [39,40,41].

In real data sets, redundant information inevitably appears. For example, data collected by a sensor at a high frequency are redundant for analysis over a longer time span. Similarly, in information fusion we sometimes need to add information to improve the accuracy of the analysis. Therefore, we need to pay attention to both redundant information fusion and supplementary information fusion; these two kinds of information fusion are all the more important in the era of big data. In this paper, two kinds of information fusion algorithms are proposed by analyzing p-augmented matrix reasoning from the dynamic boundary of the p-set. The purpose of this paper is to improve the dynamic boundary of the p-set and the p-augmented matrix it generates for information fusion, building on the function p-sets, the inverse p-sets, and the function inverse p-sets. Compared with traditional methods, the p-set theory and method starts from the attributes of the data and, through set operations, matrix reasoning, etc., obtains information equivalence classes and mines unknown information.

The research presented in this paper is as follows: (a) we state the existing facts about the structure and logical features of p-sets, and then give the structure and generation method of the p-augmented matrix; these concepts are preparation for reading this paper. (b) We analyze the dynamic boundary features of p-sets and the generation of information equivalence classes. (c) We give the matrix-reasoning intelligent acquisition and the intelligent acquisition algorithm for information equivalence classes generated by the p-augmented matrix. (d) We analyze the relationships between the concepts of information equivalence class and information fusion, and find that information equivalence class and information fusion are equivalent. (e) We give an application of the intelligent acquisition of information equivalence classes to information fusion, which can be used in unknown information discovery.

## 2. Preparatory Concepts

Some preparatory concepts are given in the literature [13,14,15,16,17,18,19,20,21,22,23,24,25,26,27,28,29,30,31,32,33,34,35,36,37,38,39,40,41].

#### 2.1. The Structure of P-Sets and Their Logical Characteristics

Given a finite set of ordinary elements $X=\{{x}_{1}$, ${x}_{2}$, ⋯, ${x}_{q}\}$ ⊂ U, let $\alpha =\{{\alpha}_{1},{\alpha}_{2},\cdots ,{\alpha}_{k}\}\subset V$ be an attribute set of X. ${X}^{\overline{F}}$ is called the internal p-set generated by X,

$${X}^{\overline{F}}=X-{X}^{-},$$

where ${X}^{-}$ is called the $\overline{F}$-deleted set of X,

$${X}^{-}=\left\{{x}_{i}|{x}_{i}\in X,\overline{f}\left({x}_{i}\right)={u}_{i}\overline{\in}X,\overline{f}\in \overline{F}\right\}.$$

If the attribute set ${\alpha}^{F}$ of ${X}^{\overline{F}}$ satisfies

$${\alpha}^{F}=\alpha \cup \left\{{\alpha}_{i}^{\prime}|f\left({\beta}_{i}\right)={\alpha}_{i}^{\prime}\in \alpha ,f\in F\right\},$$

where, in (3), ${\beta}_{i}\in V$, ${\beta}_{i}\overline{\in}\alpha $, and $f\in F$ turns ${\beta}_{i}$ into $f\left({\beta}_{i}\right)={\alpha}_{i}^{\prime}\in \alpha $; in (1), ${X}^{\overline{F}}\ne \varphi $, ${X}^{\overline{F}}=\left\{{x}_{1},{x}_{2},\cdots ,{x}_{p}\right\}$, $p<q$, $p,q\in {N}^{+}$.

Given a finite set of ordinary elements $X=\{{x}_{1}$, ${x}_{2}$, ⋯, ${x}_{q}\}$ ⊂ U, let $\alpha =\{{\alpha}_{1},{\alpha}_{2},\cdots ,{\alpha}_{k}\}\subset V$ be the attribute set of X. ${X}^{F}$ is called the outer p-set generated by X,

$${X}^{F}=X\cup {X}^{+},$$

where ${X}^{+}$ is called the F-supplemented set of X,

$${X}^{+}=\left\{{u}_{i}|{u}_{i}\in U,{u}_{i}\overline{\in}X,f\left({u}_{i}\right)={x}_{i}^{\prime}\in X,f\in F\right\}.$$

If the attribute set ${\alpha}^{\overline{F}}$ of ${X}^{F}$ satisfies

$${\alpha}^{\overline{F}}=\alpha -\left\{{\beta}_{i}|\overline{f}\left({\alpha}_{i}\right)={\beta}_{i}\overline{\in}\alpha ,\overline{f}\in \overline{F}\right\},$$

where, in (6), ${\alpha}_{i}\in \alpha $, $\overline{f}\in \overline{F}$ turns ${\alpha}_{i}$ into $\overline{f}\left({\alpha}_{i}\right)={\beta}_{i}\overline{\in}\alpha $, and ${\alpha}^{\overline{F}}\ne \varphi $; in (4), ${X}^{F}=\left\{{x}_{1},{x}_{2},\cdots ,{x}_{r}\right\}$, $q<r$, $q,r\in {N}^{+}$.

The finite ordinary element set pair composed of the internal p-set ${X}^{\overline{F}}$ and the outer p-set ${X}^{F}$ is called the p-set generated by X, namely

$$({X}^{\overline{F}},{X}^{F}).$$

The finite ordinary element set X is called the base set of p-set $({X}^{\overline{F}},{X}^{F})$.
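The construction in Equations (1)–(7) can be sketched in a few lines of Python, assuming each element of a universe U carries a known attribute set; the names `holds`, `p_set`, and `attrs_of` are illustrative, not from the paper. Each element set is selected by the conjunctive normal form "x has every attribute in α".

```python
# Minimal sketch of the p-set construction (Eqs. (1)-(7)); illustrative names.

def holds(x, alpha, attrs_of):
    """x satisfies the conjunctive normal form: x has every attribute in alpha."""
    return alpha <= attrs_of[x]

def p_set(U, alpha_F_bar, alpha, alpha_F, attrs_of):
    """Given nested attribute sets alpha_F_bar <= alpha <= alpha_F,
    return the base set X and the p-set (X^F-bar, X^F)."""
    X     = {x for x in U if holds(x, alpha, attrs_of)}        # base set
    X_in  = {x for x in U if holds(x, alpha_F, attrs_of)}      # inner p-set, Eq. (1)
    X_out = {x for x in U if holds(x, alpha_F_bar, attrs_of)}  # outer p-set, Eq. (4)
    return X, X_in, X_out

attrs_of = {1: {"a", "b", "c", "d"}, 2: {"a", "b", "c"}, 3: {"a", "b"}, 4: {"a"}}
U = set(attrs_of)
X, X_in, X_out = p_set(U, {"a"}, {"a", "b"}, {"a", "b", "c"}, attrs_of)
assert X_in <= X <= X_out    # dynamic boundary: X^F-bar <= X <= X^F

# Theorems 1-2: with no attribute transfer at all, the p-set is restored to X.
X2, I2, O2 = p_set(U, {"a", "b"}, {"a", "b"}, {"a", "b"}, attrs_of)
assert I2 == X2 == O2
```

Adding attributes to $\alpha $ can only shrink the selected set and deleting attributes can only grow it, which is exactly the boundary contraction/expansion described above.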

It is obtained from (3) that

$${\alpha}_{1}^{F}\subseteq {\alpha}_{2}^{F}\subseteq \cdots \subseteq {\alpha}_{n-1}^{F}\subseteq {\alpha}_{n}^{F}.$$

Internal p-sets can be obtained accordingly from (1), (8) as following:

$${X}_{n}^{\overline{F}}\subseteq {X}_{n-1}^{\overline{F}}\subseteq \cdots \subseteq {X}_{2}^{\overline{F}}\subseteq {X}_{1}^{\overline{F}}.$$

It is obtained from (6) that

$${\alpha}_{n}^{\overline{F}}\subseteq {\alpha}_{n-1}^{\overline{F}}\subseteq \cdots \subseteq {\alpha}_{2}^{\overline{F}}\subseteq {\alpha}_{1}^{\overline{F}}.$$

Outer p-sets can be obtained accordingly from (4) and (10) as follows:

$${X}_{1}^{F}\subseteq {X}_{2}^{F}\subseteq \cdots \subseteq {X}_{n-1}^{F}\subseteq {X}_{n}^{F}.$$

By using (9) and (11), the following set family is obtained:

$$\left\{({X}_{i}^{\overline{F}},{X}_{j}^{F})|i\in \mathrm{I},j\in \mathrm{J}\right\},$$

which is called the p-set family generated by X; (12) is the general form of the p-set.

**Theorem 1.**

If $F=\overline{F}=\varphi $, then the p-set $({X}^{\overline{F}},{X}^{F})$ is restored to the finite ordinary element set X, namely

$${({X}^{\overline{F}},{X}^{F})}_{F=\overline{F}=\varphi}=X.$$

**Theorem 2.**

If $F=\overline{F}=\varphi $, then the p-set family $\left\{({X}_{i}^{\overline{F}},{X}_{j}^{F})|i\in \mathrm{I},j\in \mathrm{J}\right\}$ is restored to the finite ordinary element set X, namely

$${\left\{({X}_{i}^{\overline{F}},{X}_{j}^{F})|i\in \mathrm{I},j\in \mathrm{J}\right\}}_{F=\overline{F}=\varphi}=X.$$

Special notes:

1. U is the finite element universe, and V is the finite attribute universe.
2. $F=\left\{{f}_{1},{f}_{2},\cdots ,{f}_{n}\right\}$ and $\overline{F}=\left\{{\overline{f}}_{1},{\overline{f}}_{2},\cdots ,{\overline{f}}_{n}\right\}$ are element (or attribute) transfer families; $f\in F$ and $\overline{f}\in \overline{F}$ are element (or attribute) transfers; an element (or attribute) transfer is a functional concept of transformation.
3. The characteristic of $f\in F$ is that, for an element ${u}_{i}\in U$, ${u}_{i}\overline{\in}X$, $f\in F$ turns ${u}_{i}$ into $f\left({u}_{i}\right)={x}_{i}^{\prime}\in X$; for an attribute ${\beta}_{i}\in V$, ${\beta}_{i}\overline{\in}\alpha $, $f\in F$ turns ${\beta}_{i}$ into $f\left({\beta}_{i}\right)={\alpha}_{i}^{\prime}\in \alpha $.
4. The characteristic of $\overline{f}\in \overline{F}$ is that, for an element ${x}_{i}\in X$, $\overline{f}\in \overline{F}$ turns ${x}_{i}$ into $\overline{f}\left({x}_{i}\right)={u}_{i}\overline{\in}X$; for an attribute ${\alpha}_{i}\in \alpha $, $\overline{f}\in \overline{F}$ turns ${\alpha}_{i}$ into $\overline{f}\left({\alpha}_{i}\right)={\beta}_{i}\overline{\in}\alpha $.
5. The dynamic feature of Equation (1) is the same as the dynamic feature of the inverse accumulator $T=T-1$.

#### 2.2. Existence Facts of P-Sets and Their Logical Characteristics

Suppose that $X=\{{x}_{1}$, ${x}_{2}$, ${x}_{3}$, ${x}_{4}$, ${x}_{5}\}$ is a finite set of ordinary elements consisting of 5 apples, and $\alpha =\{{\alpha}_{1}$, ${\alpha}_{2}$, ${\alpha}_{3}\}$ is an attribute set of X, with ${\alpha}_{1}$ = Red, ${\alpha}_{2}$ = Sweet, ${\alpha}_{3}$ = Red Fuji; $\forall {x}_{i}\in X$, ${x}_{i}$ has the attributes ${\alpha}_{1}$, ${\alpha}_{2}$, and ${\alpha}_{3}$. By using the “conjunctive normal form” of mathematical logic, we can obtain the following facts:

Given the attribute ${\alpha}_{i}$ for $\forall {x}_{i}\in X$, ${\alpha}_{i}={\alpha}_{1}\wedge {\alpha}_{2}\wedge {\alpha}_{3},i=1,2,3,4,5$,

1. If ${\alpha}_{4}$ = produced in Yantai, China is added to $\alpha $, $\alpha $ generates ${\alpha}^{F}$, $\alpha \subseteq {\alpha}^{F}$, ${\alpha}^{F}=\alpha \cup \left\{{\alpha}_{4}\right\}=\left\{{\alpha}_{1},{\alpha}_{2},{\alpha}_{3},{\alpha}_{4}\right\}$; then ${x}_{4}$, ${x}_{5}$ are deleted from X, and X generates the internal p-set ${X}^{\overline{F}}$, ${X}^{\overline{F}}\subseteq X$, ${X}^{\overline{F}}=X-\left\{{x}_{4},{x}_{5}\right\}=\left\{{x}_{1},{x}_{2},{x}_{3}\right\}$. The attribute ${\alpha}_{i}$ for $\forall {x}_{i}\in {X}^{\overline{F}}$ satisfies ${\alpha}_{i}=({\alpha}_{1}\wedge {\alpha}_{2}\wedge {\alpha}_{3})\wedge {\alpha}_{4}={\alpha}_{1}\wedge {\alpha}_{2}\wedge {\alpha}_{3}\wedge {\alpha}_{4}$, $i=1,2,3$.
2. If the attribute ${\alpha}_{3}$ is deleted from $\alpha $, $\alpha $ generates ${\alpha}^{\overline{F}}$, ${\alpha}^{\overline{F}}\subseteq \alpha $, ${\alpha}^{\overline{F}}=\alpha -\left\{{\alpha}_{3}\right\}=\left\{{\alpha}_{1},{\alpha}_{2}\right\}$; then ${x}_{6}$, ${x}_{7}$ are supplemented to X, and X generates the outer p-set ${X}^{F}$, $X\subseteq {X}^{F}$, ${X}^{F}=X\cup \left\{{x}_{6},{x}_{7}\right\}=\left\{{x}_{1},{x}_{2},{x}_{3},{x}_{4},{x}_{5},{x}_{6},{x}_{7}\right\}$. The attribute ${\alpha}_{i}$ for $\forall {x}_{i}\in {X}^{F}$ satisfies ${\alpha}_{i}=({\alpha}_{1}\wedge {\alpha}_{2}\wedge {\alpha}_{3})-\wedge {\alpha}_{3}={\alpha}_{1}\wedge {\alpha}_{2}$, $i=1,2,3,4,5,6,7$.
3. If some attributes are added to $\alpha $ and some other attributes are deleted from $\alpha $ at the same time, $\alpha $ generates ${\alpha}^{F}$ and ${\alpha}^{\overline{F}}$, i.e., $\alpha $ generates $({\alpha}^{F},{\alpha}^{\overline{F}})$; then X generates ${X}^{\overline{F}}$ and ${X}^{F}$, i.e., X generates the p-set $({X}^{\overline{F}},{X}^{F})$.
4. If the process of adding some attributes to $\alpha $ while deleting others continues, X generates multiple p-sets $({X}_{1}^{\overline{F}},{X}_{1}^{F}),({X}_{2}^{\overline{F}},{X}_{2}^{F}),\cdots ,({X}_{n}^{\overline{F}},{X}_{n}^{F})$, which form the p-set family shown in Equation (12).
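The apple facts above can be replayed directly in Python. This is a toy sketch, not part of the paper: only the attribute names come from the text, and the string encodings of elements and attributes are assumptions.

```python
# The apple example: attribute sets per element; encodings are illustrative.
RED, SWEET, FUJI, YANTAI = "red", "sweet", "red_fuji", "from_yantai"

attrs_of = {
    "x1": {RED, SWEET, FUJI, YANTAI},
    "x2": {RED, SWEET, FUJI, YANTAI},
    "x3": {RED, SWEET, FUJI, YANTAI},
    "x4": {RED, SWEET, FUJI},
    "x5": {RED, SWEET, FUJI},
    "x6": {RED, SWEET},   # apples outside X that are merely red and sweet
    "x7": {RED, SWEET},
}
U = set(attrs_of)
select = lambda alpha: {x for x in U if alpha <= attrs_of[x]}

X = select({RED, SWEET, FUJI})                     # the five apples
assert X == {"x1", "x2", "x3", "x4", "x5"}

# Fact 1: adding alpha_4 contracts the boundary, giving the inner p-set X^F-bar.
assert select({RED, SWEET, FUJI, YANTAI}) == {"x1", "x2", "x3"}

# Fact 2: deleting alpha_3 expands the boundary, giving the outer p-set X^F.
assert select({RED, SWEET}) == U
```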

For $X=\{{x}_{1}$, ${x}_{2}$, ⋯, ${x}_{q}\}$, $\alpha =\{{\alpha}_{1}$, ${\alpha}_{2}$, ⋯, ${\alpha}_{\eta}$, ${\alpha}_{\eta +1}$, ⋯, ${\alpha}_{k}\}$ is the attribute set of X; for ${X}^{\overline{F}}=\{{x}_{1},{x}_{2}$, ⋯, ${x}_{p}\}$, ${\alpha}^{F}=\{{\alpha}_{1}$, ${\alpha}_{2}$, ⋯, ${\alpha}_{k}$, ${\alpha}_{k+1}$, ⋯, ${\alpha}_{\lambda}\}$ is the attribute set of ${X}^{\overline{F}}$; for ${X}^{F}=\{{x}_{1},{x}_{2}$, ⋯, ${x}_{r}\}$, ${\alpha}^{\overline{F}}=\{{\alpha}_{1},{\alpha}_{2},\cdots ,{\alpha}_{\eta}\}$ is the attribute set of ${X}^{F}$; $p<q<r$, $p,q,r\in {N}^{+}$; $\eta <k<\lambda $, $\eta ,k,\lambda \in {N}^{+}$. Some general conclusions can be obtained from facts 1–4 above, as follows:

1. The attribute ${\alpha}_{i}$ for $\forall {x}_{i}\in X$ satisfies the attribute conjunctive normal form:$${\alpha}_{i}={\wedge}_{t=1}^{k}{\alpha}_{t}.$$
2. The attribute ${\alpha}_{i}$ for $\forall {x}_{i}\in {X}^{\overline{F}}$ satisfies the expansion of the attribute conjunctive normal form:$${\alpha}_{i}=\left({\wedge}_{t=1}^{k}{\alpha}_{t}\right){\wedge}_{t=k+1}^{\lambda}{\alpha}_{t}.$$
3. The attribute ${\alpha}_{i}$ for $\forall {x}_{i}\in {X}^{F}$ satisfies the contraction of the attribute conjunctive normal form:$${\alpha}_{i}=\left({\wedge}_{t=1}^{k}{\alpha}_{t}\right)-{\wedge}_{t=\eta +1}^{k}{\alpha}_{t}.$$
4. The attribute ${\alpha}_{i}$ for $\forall {x}_{i}\in {X}^{\overline{F}}$ and the attribute ${\alpha}_{j}$ for $\forall {x}_{j}\in {X}^{F}$ satisfy the expansion and contraction of the attribute conjunctive normal form:$$({\alpha}_{i},{\alpha}_{j})=(({\wedge}_{t=1}^{k}{\alpha}_{t}){\wedge}_{t=k+1}^{\lambda}{\alpha}_{t},({\wedge}_{t=1}^{k}{\alpha}_{t})-{\wedge}_{t=\eta +1}^{k}{\alpha}_{t}).$$

#### 2.3. Structure and Generation of P-Augmented Matrix

By using the structure of the p-set, the definition and structure of the improved general augmented matrix ${A}^{*}$ are given in the literature [38]:

Given a finite set of ordinary elements $X=\{{x}_{1}$, ${x}_{2}$, ⋯, ${x}_{q}\}$, each ${x}_{i}$ ($\forall {x}_{i}\in X$) has n values ${y}_{i,1},{y}_{i,2}$, ⋯, ${y}_{i,n}$; ${y}_{i}=({y}_{i,1}$, ${y}_{i,2}$, ⋯, ${y}_{i,n}{)}^{T}$ is the vector generated by ${y}_{i,1}$, ${y}_{i,2}$, ⋯, ${y}_{i,n}$, and the matrix A is obtained by using the ${y}_{i}$ as columns. A is called the element value matrix generated by X,

$$A=\left[\begin{array}{cccc}{y}_{1,1}& {y}_{1,2}& \cdots & {y}_{1,q}\\ {y}_{2,1}& {y}_{2,2}& \cdots & {y}_{2,q}\\ \vdots & \vdots & \ddots & \vdots \\ {y}_{n,1}& {y}_{n,2}& \cdots & {y}_{n,q}\end{array}\right].$$

${A}^{\overline{F}}$ is called the internal p-augmented matrix of A generated by the internal p-set ${X}^{\overline{F}}=\left\{{x}_{1},{x}_{2},\cdots ,{x}_{p}\right\}$,

$${A}^{\overline{F}}=\left[\begin{array}{cccc}{y}_{1,1}& {y}_{1,2}& \cdots & {y}_{1,p}\\ {y}_{2,1}& {y}_{2,2}& \cdots & {y}_{2,p}\\ \vdots & \vdots & \ddots & \vdots \\ {y}_{n,1}& {y}_{n,2}& \cdots & {y}_{n,p}\end{array}\right].$$

${A}^{F}$ is called the outer p-augmented matrix of A generated by the outer p-set ${X}^{F}=\left\{{x}_{1},{x}_{2},\cdots ,{x}_{r}\right\}$,

$${A}^{F}=\left[\begin{array}{cccc}{y}_{1,1}& {y}_{1,2}& \cdots & {y}_{1,r}\\ {y}_{2,1}& {y}_{2,2}& \cdots & {y}_{2,r}\\ \vdots & \vdots & \ddots & \vdots \\ {y}_{n,1}& {y}_{n,2}& \cdots & {y}_{n,r}\end{array}\right].$$

The matrix pair consisting of the internal p-augmented matrix ${A}^{\overline{F}}$ and the outer p-augmented matrix ${A}^{F}$ is as follows:

$$({A}^{\overline{F}},{A}^{F}).$$

$({A}^{\overline{F}},{A}^{F})$ is called the p-augmented matrix of A generated by the p-set $({X}^{\overline{F}},{X}^{F})$, where, in Equations (19)–(21), $p<q<r$, $p,q,r\in {N}^{+}$. The outer p-augmented matrix ${A}^{F}$ of A is the same concept as the ordinary augmented matrix ${A}^{*}$ of A.
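The column structure of Equations (19)–(21) can be sketched with NumPy: the inner p-augmented matrix drops the value columns of deleted elements, while the outer one appends columns for supplemented elements. The value matrix and the helper names `inner_p_matrix`/`outer_p_matrix` below are illustrative assumptions, not the paper's notation.

```python
import numpy as np

# Element value matrix A (Eq. 19): column i holds the n values of x_i.
A = np.array([[1.0, 2.0, 3.0, 4.0],
              [5.0, 6.0, 7.0, 8.0]])   # n = 2 rows, q = 4 columns

def inner_p_matrix(A, kept):
    """A^F-bar (Eq. 20): keep only the columns of elements remaining in X^F-bar (p < q)."""
    return A[:, sorted(kept)]

def outer_p_matrix(A, new_cols):
    """A^F (Eq. 21): append the value columns of elements supplementing X (q < r)."""
    return np.hstack([A, np.asarray(new_cols)])

A_in  = inner_p_matrix(A, kept={0, 1, 2})   # X^F-bar = {x1, x2, x3}
A_out = outer_p_matrix(A, [[4.5], [8.5]])   # X^F = X with one supplemented element
assert A_in.shape == (2, 3) and A_out.shape == (2, 5)
```

Note that neither operation touches the rows: the n values per element, and hence the attribute set carried by the matrix, are unchanged, which is what the reasoning in Section 4 relies on.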

Figure 1 shows a two-dimensional visual representation of the p-set $({X}^{\overline{F}},{X}^{F})$.

1. The boundary of X contracts inward when some attributes are added to the attribute set $\alpha $ of X; that is, X dynamically generates the internal p-set ${X}^{\overline{F}}$.
2. The boundary of X expands outward when some attributes are deleted from the attribute set $\alpha $ of X; that is, X dynamically generates the outer p-set ${X}^{F}$.
3. The boundary of X contracts inward and expands outward when some attributes are added to and some attributes are deleted from the attribute set $\alpha $ of X; that is, X dynamically generates the p-set $({X}^{\overline{F}},{X}^{F})$. As the process of adding and deleting attributes in $\alpha $ continues, X dynamically generates p-set families.

The concepts in this section are important for understanding the research and results given in Section 3, Section 4 and Section 5. More features and applications of p-sets and p-augmented matrices can be found in the literature [13,14,15,16,17,18,19,20,21,22,23,24,25,26,27,28,29,30,31,32,33,34,35,36,37,38].

Convention: X, ${X}^{\overline{F}}$, ${X}^{F}$, and $({X}^{\overline{F}},{X}^{F})$ are defined as the information $\left(x\right)$, the inner p-information ${\left(x\right)}^{\overline{F}}$, the outer p-information ${\left(x\right)}^{F}$, and the p-information $({\left(x\right)}^{\overline{F}},{\left(x\right)}^{F})$ respectively; i.e., $\left(x\right)=X$, ${\left(x\right)}^{\overline{F}}={X}^{\overline{F}}$, ${\left(x\right)}^{F}={X}^{F}$, and $({\left(x\right)}^{\overline{F}},{\left(x\right)}^{F})=({X}^{\overline{F}},{X}^{F})$. These concepts and symbols are used in Section 3, Section 4, Section 5 and Section 6.

## 3. Dynamic Boundary of P-Sets and Dynamic Generation of Information Equivalence Classes

**Theorem 3.**

(The dynamic generation theorem of the ${\alpha}^{F}$-information equivalence class ${\left[x\right]}^{\overline{F}}$) If some attributes are added to the attribute set α of the information $\left(x\right)$, α generates ${\alpha}^{F}$, $\alpha \subseteq {\alpha}^{F}$; then the internal p-information ${\left(x\right)}^{\overline{F}}$ with the attribute set ${\alpha}^{F}$ is the ${\alpha}^{F}$-information equivalence class ${\left[x\right]}^{\overline{F}}$ generated by $\left(x\right)$. That is,

$${\left(x\right)}^{\overline{F}}={\left[x\right]}^{\overline{F}}.$$

**Proof.**

Suppose that ${\left(x\right)}^{\overline{F}}$ is the internal p-information generated by the information $\left(x\right)$, and that the attribute set ${\alpha}^{F}$ of ${\left(x\right)}^{\overline{F}}$ is a relation R on ${\left(x\right)}^{\overline{F}}\times {\left(x\right)}^{\overline{F}}$, i.e., $R={\alpha}^{F}$. The following equivalence-relation properties can be obtained: 1. For $\forall {x}_{i}\in {\left(x\right)}^{\overline{F}}$, ${x}_{i}$ has the relation R with itself, i.e., ${x}_{i}{\alpha}^{F}{x}_{i}$, so reflexivity is satisfied. 2. For $\forall {x}_{i},{x}_{j}\in {\left(x\right)}^{\overline{F}}$, if ${x}_{i}$ has the relation R with ${x}_{j}$, then ${x}_{j}$ has the relation R with ${x}_{i}$; i.e., if ${x}_{i}{\alpha}^{F}{x}_{j}$, then ${x}_{j}{\alpha}^{F}{x}_{i}$, so symmetry is satisfied. 3. For $\forall {x}_{i},{x}_{j},{x}_{k}\in {\left(x\right)}^{\overline{F}}$, if ${x}_{i}$ has the relation R with ${x}_{j}$ and ${x}_{j}$ has the relation R with ${x}_{k}$, then ${x}_{i}$ has the relation R with ${x}_{k}$; i.e., if ${x}_{i}{\alpha}^{F}{x}_{j}$ and ${x}_{j}{\alpha}^{F}{x}_{k}$, then ${x}_{i}{\alpha}^{F}{x}_{k}$, so transitivity is satisfied. From 1–3 we obtain that, for $\forall {x}_{i},{x}_{j},{x}_{k}\in {\left(x\right)}^{\overline{F}}$, ${\alpha}^{F}$ satisfies reflexivity ${x}_{i}{\alpha}^{F}{x}_{i}$, symmetry ${x}_{i}{\alpha}^{F}{x}_{j}\Rightarrow {x}_{j}{\alpha}^{F}{x}_{i}$, and transitivity ${x}_{i}{\alpha}^{F}{x}_{j},{x}_{j}{\alpha}^{F}{x}_{k}\Rightarrow {x}_{i}{\alpha}^{F}{x}_{k}$. It follows that the internal p-information ${\left(x\right)}^{\overline{F}}$ is the ${\alpha}^{F}$-information equivalence class ${\left[x\right]}^{\overline{F}}$ generated by the information $\left(x\right)$, ${\left(x\right)}^{\overline{F}}={\left[x\right]}^{\overline{F}}$. □
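As a small sanity check of Theorem 3 (a toy sketch, not part of the paper), the relation "x_i and x_j both satisfy every attribute of ${\alpha}^{F}$" can be tested directly on an assumed inner p-information set; under it, all elements of ${\left(x\right)}^{\overline{F}}$ fall into a single equivalence class.

```python
# Toy check of Theorem 3: the relation R = alpha^F is an equivalence relation
# on the inner p-information, and puts all its elements in one class.

attrs_of = {"x1": {"a", "b", "c"}, "x2": {"a", "b", "c"}, "x3": {"a", "b"}}
alpha_F = {"a", "b", "c"}

inner = [x for x in attrs_of if alpha_F <= attrs_of[x]]   # (x)^F-bar = [x1, x2]
related = lambda xi, xj: alpha_F <= attrs_of[xi] and alpha_F <= attrs_of[xj]

# Reflexivity, symmetry, and transitivity all hold on (x)^F-bar:
assert all(related(x, x) for x in inner)
assert all(related(xj, xi) for xi in inner for xj in inner if related(xi, xj))
assert all(related(xi, xk)
           for xi in inner for xj in inner for xk in inner
           if related(xi, xj) and related(xj, xk))

# Hence the inner p-information equals the alpha^F-information equivalence class:
assert set(inner) == {"x1", "x2"}
```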

**Theorem 4.**

(The dynamic generation theorem of the ${\alpha}^{\overline{F}}$-information equivalence class ${\left[x\right]}^{F}$) If some attributes are deleted from the attribute set α of the information $\left(x\right)$, α generates ${\alpha}^{\overline{F}}$, ${\alpha}^{\overline{F}}\subseteq \alpha $; then the outer p-information ${\left(x\right)}^{F}$ with the attribute set ${\alpha}^{\overline{F}}$ is the ${\alpha}^{\overline{F}}$-information equivalence class ${\left[x\right]}^{F}$ generated by $\left(x\right)$; that is,

$${\left(x\right)}^{F}={\left[x\right]}^{F}.$$

The proof is similar to that of Theorem 3, so the proof of Theorem 4 is omitted.

From Theorems 3 and 4, Theorem 5 can be obtained directly.

**Theorem 5.**

(The dynamic generation theorem of the $({\alpha}^{F},{\alpha}^{\overline{F}})$-information equivalence class $[{\left[x\right]}^{\overline{F}},{\left[x\right]}^{F}]$) If some attributes are added to and deleted from the attribute set α of the information $\left(x\right)$ at the same time, α generates ${\alpha}^{F}$ and ${\alpha}^{\overline{F}}$, ${\alpha}^{\overline{F}}\subseteq \alpha \subseteq {\alpha}^{F}$; then the p-information $({\left(x\right)}^{\overline{F}},{\left(x\right)}^{F})$ with the attribute set $({\alpha}^{F},{\alpha}^{\overline{F}})$ is the $({\alpha}^{F},{\alpha}^{\overline{F}})$-information equivalence class $[{\left[x\right]}^{\overline{F}},{\left[x\right]}^{F}]$ generated by $\left(x\right)$; that is,

$$({\left(x\right)}^{\overline{F}},{\left(x\right)}^{F})=[{\left[x\right]}^{\overline{F}},{\left[x\right]}^{F}].$$

Obviously, the information $\left(x\right)$ with the attribute set $\alpha $ is the $\alpha $-information equivalence class $\left[x\right]$, $\left[x\right]=\left(x\right)$.

Some propositions can be obtained from Theorems 3–5 and Equations (1)–(7) in Section 2, as follows:

**Proposition 1.**

The dynamic generation of the ${\alpha}^{F}$-information equivalence class ${\left[x\right]}^{\overline{F}}$ is synchronous with the inward dynamic contraction of the boundary of the internal p-set ${X}^{\overline{F}}$.

**Proposition 2.**

The dynamic generation of the ${\alpha}^{\overline{F}}$-information equivalence class ${\left[x\right]}^{F}$ is synchronous with the outward dynamic expansion of the boundary of the outer p-set ${X}^{F}$.

**Proposition 3.**

The dynamic generation of the $({\alpha}^{F},{\alpha}^{\overline{F}})$-information equivalence class $[{\left[x\right]}^{\overline{F}},{\left[x\right]}^{F}]$ is synchronous with the inward dynamic contraction and outward dynamic expansion of the boundary of the p-set $({X}^{\overline{F}},{X}^{F})$.

## 4. Matrix Reasoning and the Intelligent Acquisition Theorem of Information Equivalence Classes

Conventions: the internal p-augmented matrix ${A}^{\overline{F}}$, the outer p-augmented matrix ${A}^{F}$, and the p-augmented matrix $({A}^{\overline{F}},{A}^{F})$ of Section 2 are written as the internal p-matrix ${A}^{\overline{F}}$, the outer p-matrix ${A}^{F}$, and the p-matrix $({A}^{\overline{F}},{A}^{F})$ respectively; this causes no ambiguity.

Given internal p-matrices ${A}_{k}^{\overline{F}}$ and ${A}_{k+1}^{\overline{F}}$, let ${\alpha}_{k}^{F}$ and ${\alpha}_{k+1}^{F}$ be the attribute sets of ${A}_{k}^{\overline{F}}$ and ${A}_{k+1}^{\overline{F}}$ respectively; ${A}_{k}^{\overline{F}}$, ${A}_{k+1}^{\overline{F}}$ and ${\alpha}_{k}^{F}$, ${\alpha}_{k+1}^{F}$ satisfy the following equation:

$$if\phantom{\rule{1.em}{0ex}}{A}_{k+1}^{\overline{F}}\Rightarrow {A}_{k}^{\overline{F}},\phantom{\rule{1.em}{0ex}}then\phantom{\rule{1.em}{0ex}}{\alpha}_{k}^{F}\Rightarrow {\alpha}_{k+1}^{F}.$$

Equation (26) is called the internal p-matrix reasoning generated by the internal p-matrix; ${A}_{k+1}^{\overline{F}}\Rightarrow {A}_{k}^{\overline{F}}$ is called the internal p-matrix reasoning condition, and ${\alpha}_{k}^{F}\Rightarrow {\alpha}_{k+1}^{F}$ is called the internal p-matrix reasoning conclusion. In Equation (26), ${A}_{k+1}^{\overline{F}}\Rightarrow {A}_{k}^{\overline{F}}$ is equivalent to ${A}_{k+1}^{\overline{F}}\subseteq {A}_{k}^{\overline{F}}$, and ${\alpha}_{k}^{F}\Rightarrow {\alpha}_{k+1}^{F}$ is equivalent to ${\alpha}_{k}^{F}\subseteq {\alpha}_{k+1}^{F}$.

Given outer p-matrices ${A}_{k}^{F}$ and ${A}_{k+1}^{F}$, let ${\alpha}_{k}^{\overline{F}}$ and ${\alpha}_{k+1}^{\overline{F}}$ be the attribute sets of ${A}_{k}^{F}$ and ${A}_{k+1}^{F}$ respectively; ${A}_{k}^{F}$, ${A}_{k+1}^{F}$ and ${\alpha}_{k}^{\overline{F}}$, ${\alpha}_{k+1}^{\overline{F}}$ satisfy the following equation:

$$if\phantom{\rule{1.em}{0ex}}{A}_{k}^{F}\Rightarrow {A}_{k+1}^{F},\phantom{\rule{1.em}{0ex}}then\phantom{\rule{1.em}{0ex}}{\alpha}_{k+1}^{\overline{F}}\Rightarrow {\alpha}_{k}^{\overline{F}}.$$

Equation (27) is called the outer p-matrix reasoning generated by the outer p-matrix; ${A}_{k}^{F}\Rightarrow {A}_{k+1}^{F}$ is called the outer p-matrix reasoning condition, and ${\alpha}_{k+1}^{\overline{F}}\Rightarrow {\alpha}_{k}^{\overline{F}}$ is called the outer p-matrix reasoning conclusion.

Given p-matrix $({A}_{k+1}^{\overline{F}}$, ${A}_{k}^{F})$ and $({A}_{k}^{\overline{F}}$, ${A}_{k+1}^{F})$, $({\alpha}_{k+1}^{F}$, ${\alpha}_{k}^{\overline{F}})$ and $({\alpha}_{k}^{F}$, ${\alpha}_{k+1}^{\overline{F}})$ are the attribute sets of $({A}_{k+1}^{\overline{F}}$, ${A}_{k}^{F})$, $({A}_{k}^{\overline{F}}$, ${A}_{k+1}^{F})$ respectively. $({A}_{k+1}^{\overline{F}}$, ${A}_{k}^{F})$, $({A}_{k}^{\overline{F}}$, ${A}_{k+1}^{F})$, $({\alpha}_{k+1}^{F}$, ${\alpha}_{k}^{\overline{F}})$ and $({\alpha}_{k}^{F}$, ${\alpha}_{k+1}^{\overline{F}})$ satisfy the following equation

$$\begin{array}{cc}\hfill \phantom{\rule{1.em}{0ex}}& if\phantom{\rule{1.em}{0ex}}({A}_{k+1}^{\overline{F}},{A}_{k}^{F})\Rightarrow ({A}_{k}^{\overline{F}},{A}_{k+1}^{F}),\hfill \\ \hfill \phantom{\rule{1.em}{0ex}}& then\phantom{\rule{1.em}{0ex}}({\alpha}_{k}^{F},{\alpha}_{k+1}^{\overline{F}})\Rightarrow ({\alpha}_{k+1}^{F},{\alpha}_{k}^{\overline{F}}).\hfill \end{array}$$

Equation (28) is called the p-matrix reasoning generated by the p-matrix; $({A}_{k+1}^{\overline{F}},{A}_{k}^{F})\Rightarrow ({A}_{k}^{\overline{F}},{A}_{k+1}^{F})$ is called the p-matrix reasoning condition, and $({\alpha}_{k}^{F},{\alpha}_{k+1}^{\overline{F}})\Rightarrow ({\alpha}_{k+1}^{F},{\alpha}_{k}^{\overline{F}})$ is called the p-matrix reasoning conclusion. In Equation (28), $({A}_{k+1}^{\overline{F}},{A}_{k}^{F})\Rightarrow ({A}_{k}^{\overline{F}},{A}_{k+1}^{F})$ means that ${A}_{k+1}^{\overline{F}}\Rightarrow {A}_{k}^{\overline{F}}$ and ${A}_{k}^{F}\Rightarrow {A}_{k+1}^{F}$.

Some special explanations: from Section 2, ${A}_{k+1}^{\overline{F}}$ is generated by the values of ${X}_{k+1}^{\overline{F}}$; ${A}_{k+1}^{\overline{F}}$ does not change the attribute set of ${X}_{k+1}^{\overline{F}}$, so ${A}_{k+1}^{\overline{F}}$ and ${X}_{k+1}^{\overline{F}}$ have the same attribute set ${\alpha}_{k+1}^{F}$. Likewise, ${A}_{k}^{F}$ is generated by the values of ${X}_{k}^{F}$; ${A}_{k}^{F}$ does not change the attribute set of ${X}_{k}^{F}$, so ${A}_{k}^{F}$ and ${X}_{k}^{F}$ have the same attribute set ${\alpha}_{k}^{\overline{F}}$.
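Since the paper equates "⇒" with set inclusion, the reasoning conditions (26) and (27) reduce to subset tests. A toy Python sketch, assuming each p-matrix is identified by the element set generating its columns (an assumption for illustration, not the paper's representation):

```python
# Reasoning (26)-(27) as subset tests; `implies` encodes the paper's "=>".

def implies(small, big):
    return set(small) <= set(big)

# Internal p-matrix reasoning (26):
# if A_{k+1}^F-bar => A_k^F-bar, then alpha_k^F => alpha_{k+1}^F.
A_k, A_k1         = {"x1", "x2", "x3"}, {"x1", "x2"}      # column sets
alpha_k, alpha_k1 = {"a", "b", "c"}, {"a", "b", "c", "d"}
if implies(A_k1, A_k):
    # attribute sets grow as the matrix (column set) shrinks
    assert implies(alpha_k, alpha_k1)

# Outer p-matrix reasoning (27):
# if A_k^F => A_{k+1}^F, then alpha_{k+1}^F-bar => alpha_k^F-bar.
B_k, B_k1       = {"x1", "x2"}, {"x1", "x2", "x4"}
beta_k, beta_k1 = {"a", "b"}, {"a"}
if implies(B_k, B_k1):
    # attribute sets shrink as the matrix (column set) grows
    assert implies(beta_k1, beta_k)
```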

**Theorem 6.**

(The intelligent acquisition theorem of the ${\alpha}^{F}$-information equivalence class ${\left[x\right]}^{\overline{F}}$) If the internal p-matrices ${A}_{k}^{\overline{F}}$, ${A}_{k+1}^{\overline{F}}$ and the ${\alpha}^{F}$-information equivalence classes ${\left[x\right]}_{k}^{\overline{F}}$, ${\left[x\right]}_{k+1}^{\overline{F}}$ satisfy

$$if\phantom{\rule{1.em}{0ex}}{A}_{k+1}^{\overline{F}}\Rightarrow {A}_{k}^{\overline{F}},\phantom{\rule{1.em}{0ex}}then\phantom{\rule{1.em}{0ex}}{\left[x\right]}_{k+1}^{\overline{F}}\Rightarrow {\left[x\right]}_{k}^{\overline{F}}.$$

Then, under the condition of ${A}_{k+1}^{\overline{F}}\Rightarrow {A}_{k}^{\overline{F}}$, ${\alpha}^{F}$-information equivalence class ${\left[x\right]}_{k+1}^{\overline{F}}$ is acquired intelligently from ${\left[x\right]}_{k}^{\overline{F}}$; ${\left[x\right]}_{k+1}^{\overline{F}}\subseteq {\left[x\right]}_{k}^{\overline{F}}$.

**Proof.**

From Section 2, we know that ${A}_{k+1}^{\overline{F}}$, ${A}_{k}^{\overline{F}}$ are generated by ${\left(x\right)}_{k+1}^{\overline{F}}$, ${\left(x\right)}_{k}^{\overline{F}}$, respectively, and that ${A}_{k+1}^{\overline{F}}$ and ${A}_{k}^{\overline{F}}$ satisfy ${A}_{k+1}^{\overline{F}}\subseteq {A}_{k}^{\overline{F}}$, that is, ${A}_{k+1}^{\overline{F}}\Rightarrow {A}_{k}^{\overline{F}}$. By Theorem 3, ${\left[x\right]}_{k+1}^{\overline{F}}$, ${\left[x\right]}_{k}^{\overline{F}}$ are the ${\alpha}^{F}$-information equivalence classes generated by the information $\left(x\right)$, and they satisfy ${\left[x\right]}_{k+1}^{\overline{F}}\subseteq {\left[x\right]}_{k}^{\overline{F}}$, that is, ${\left[x\right]}_{k+1}^{\overline{F}}\Rightarrow {\left[x\right]}_{k}^{\overline{F}}$. Hence, under the internal p-matrix reasoning condition ${A}_{k+1}^{\overline{F}}\Rightarrow {A}_{k}^{\overline{F}}$, we obtain ${\left[x\right]}_{k+1}^{\overline{F}}\Rightarrow {\left[x\right]}_{k}^{\overline{F}}$, that is, ${\left[x\right]}_{k+1}^{\overline{F}}\subseteq {\left[x\right]}_{k}^{\overline{F}}$; ${\left[x\right]}_{k+1}^{\overline{F}}$ is acquired intelligently in ${\left[x\right]}_{k}^{\overline{F}}$. □
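The reasoning in Theorem 6 can be sketched in code. This is a minimal illustration, not the paper's formal construction: p-matrices are modelled as lists of hashable row tuples, the condition ${A}_{k+1}^{\overline{F}}\Rightarrow {A}_{k}^{\overline{F}}$ as row-set inclusion, and the function name and sample data are invented for the example.

```python
# Sketch of internal p-matrix reasoning (Theorem 6): when the reasoning
# condition A_{k+1} => A_k holds, the class [x]_{k+1} is acquired inside
# [x]_k, so [x]_{k+1} is a subset of [x]_k.

def is_sub_pmatrix(A_small, A_big):
    """A_{k+1} => A_k holds when every row of A_{k+1} appears in A_k."""
    return set(A_small).issubset(set(A_big))

def acquire_equivalence_class(A_k1, A_k, class_k1, class_k):
    """Return [x]_{k+1} if the reasoning condition holds, else None."""
    if not is_sub_pmatrix(A_k1, A_k):
        return None  # reasoning condition fails; nothing can be concluded
    assert set(class_k1).issubset(set(class_k))  # [x]_{k+1} ⊆ [x]_k
    return class_k1

# Illustrative data: the internal p-matrix shrinks as attributes are added.
A_k  = [(87, 93), (80, 88), (91, 90)]
A_k1 = [(87, 93), (91, 90)]
class_k, class_k1 = ["x1", "x2", "x4"], ["x1", "x4"]
print(acquire_equivalence_class(A_k1, A_k, class_k1, class_k))  # ['x1', 'x4']
```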

**Theorem 7.**

(The intelligent acquisition theorem of the ${\alpha}^{\overline{F}}$-information equivalence class ${\left[x\right]}^{F}$) If the outer p-matrices ${A}_{k}^{F}$, ${A}_{k+1}^{F}$ and the ${\alpha}^{\overline{F}}$-information equivalence classes ${\left[x\right]}_{k}^{F}$, ${\left[x\right]}_{k+1}^{F}$ satisfy

$$if\phantom{\rule{4pt}{0ex}}{A}_{k}^{F}\Rightarrow {A}_{k+1}^{F},\phantom{\rule{4pt}{0ex}}then\phantom{\rule{4pt}{0ex}}{\left[x\right]}_{k}^{F}\Rightarrow {\left[x\right]}_{k+1}^{F}.$$

Then, under the condition of ${A}_{k}^{F}\Rightarrow {A}_{k+1}^{F}$, the ${\alpha}^{\overline{F}}$-information equivalence class ${\left[x\right]}_{k+1}^{F}$ is acquired intelligently from ${\left[x\right]}_{k}^{F}$; ${\left[x\right]}_{k}^{F}\subseteq {\left[x\right]}_{k+1}^{F}$.

The proof of Theorem 7 is similar to that of Theorem 6, so it is omitted.

From Theorems 6 and 7, we can directly obtain the following theorem:

**Theorem 8.**

(The intelligent acquisition theorem of the $({\alpha}^{F},{\alpha}^{\overline{F}})$-information equivalence class $({\left[x\right]}^{\overline{F}},{\left[x\right]}^{F})$) If the p-matrices $({A}_{k+1}^{\overline{F}},{A}_{k}^{F})$, $({A}_{k}^{\overline{F}},{A}_{k+1}^{F})$ and the $({\alpha}^{F},{\alpha}^{\overline{F}})$-information equivalence classes $[{\left[x\right]}_{k+1}^{\overline{F}},{\left[x\right]}_{k}^{F}]$, $[{\left[x\right]}_{k}^{\overline{F}},{\left[x\right]}_{k+1}^{F}]$ satisfy

$$\begin{array}{cc}\hfill \phantom{\rule{1.em}{0ex}}& if\phantom{\rule{1.em}{0ex}}({A}_{k+1}^{\overline{F}},{A}_{k}^{F})\Rightarrow ({A}_{k}^{\overline{F}},{A}_{k+1}^{F}),\hfill \\ \hfill \phantom{\rule{1.em}{0ex}}& then\phantom{\rule{1.em}{0ex}}[{\left[x\right]}_{k+1}^{\overline{F}},{\left[x\right]}_{k}^{F}]\Rightarrow [{\left[x\right]}_{k}^{\overline{F}},{\left[x\right]}_{k+1}^{F}].\hfill \end{array}$$

Then, under the condition of $({A}_{k+1}^{\overline{F}},{A}_{k}^{F})\Rightarrow ({A}_{k}^{\overline{F}},{A}_{k+1}^{F})$, the classes ${\left[x\right]}_{k+1}^{\overline{F}}$ and ${\left[x\right]}_{k+1}^{F}$ in the $({\alpha}^{F},{\alpha}^{\overline{F}})$-information equivalence class $[{\left[x\right]}_{k+1}^{\overline{F}},{\left[x\right]}_{k+1}^{F}]$ are acquired intelligently in ${\left[x\right]}_{k}^{\overline{F}}$ and ${\left[x\right]}_{k}^{F}$, respectively; ${\left[x\right]}_{k+1}^{\overline{F}}\subseteq {\left[x\right]}_{k}^{\overline{F}}$, ${\left[x\right]}_{k}^{F}\subseteq {\left[x\right]}_{k+1}^{F}$.

**Corollary 1.**

If ${\left[x\right]}^{\overline{F}}$ is the ${\alpha}^{F}$-information equivalence class generated intelligently by internal p-matrix reasoning, ${\left[x\right]}^{\overline{F}}\subseteq \left(x\right)$, then some attributes ${\alpha}_{i}$ must have been supplemented to the attribute set $\alpha $ of the information $\left(x\right)$.

Corollary 1 follows directly from Theorem 3; its proof is omitted.

**Corollary 2.**

If ${\left[x\right]}^{F}$ is the ${\alpha}^{\overline{F}}$-information equivalence class generated intelligently by outer p-matrix reasoning, $(x)\subseteq {\left[x\right]}^{F}$, then some attributes ${\alpha}_{j}$ must have been deleted from the attribute set $\alpha $ of the information $\left(x\right)$.

The proof of Corollary 2 is similar to that of Corollary 1 and is omitted.

From Corollaries 1 and 2, we can directly obtain the following corollary:

**Corollary 3.**

If $[{\left[x\right]}^{\overline{F}},{\left[x\right]}^{F}]$ is the $({\alpha}^{F},{\alpha}^{\overline{F}})$-information equivalence class generated intelligently by p-matrix reasoning, ${\left[x\right]}^{\overline{F}}\subseteq \left(x\right)$, $\left(x\right)\subseteq {\left[x\right]}^{F}$, then the attributes ${\alpha}_{i}$ must have been added to, and the attributes ${\alpha}_{j}$ deleted from, the attribute set $\alpha $ of the information $\left(x\right)$.

The intelligent acquisition algorithm of the information equivalence class can be obtained by using the concepts and results given in Section 4 (shown in Figure 2). It should be noted that the intelligent algorithm diagram of the ${\alpha}^{\overline{F}}$-information equivalence class ${\left[x\right]}_{k}^{F}$ is similar to Figure 2 and is omitted.
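A minimal sketch of the acquisition loop of Figure 2 follows. It assumes each information element is described by a set of attribute names and that the given class ${\left[x\right]}^{\overline{F},*}$ is supplied as a target set; all identifiers and sample data below are illustrative, not from the paper.

```python
# Sketch of the intelligent acquisition algorithm for the α^F-information
# equivalence class: keep supplementing attributes α_{k+1} = α_k ∪ {α_i};
# each supplement deletes the elements lacking the new attribute, so the
# boundary of (x) contracts until the class matches the given one.

def acquire_inner_class(elements, base_attrs, extra_attrs, target):
    """elements: dict mapping element name -> set of its attributes."""
    attrs = set(base_attrs)
    cls = set(elements)                              # [x]_0 = (x)
    for a in extra_attrs:                            # α_k -> α_{k+1}
        attrs.add(a)
        cls = {x for x in cls if a in elements[x]}   # boundary contracts
        if cls == target:                            # matches [x]^{F̄,*}
            return attrs, cls
    return attrs, cls

elements = {
    "x1": {"math", "physics", "shandong"},
    "x2": {"math", "physics"},
    "x3": {"math", "physics", "shandong"},
}
attrs, cls = acquire_inner_class(elements, {"math", "physics"},
                                 ["shandong"], {"x1", "x3"})
print(sorted(cls))  # ['x1', 'x3']
```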

## 5. The Relationship between Information Equivalence and Information Fusion

#### 5.1. Two Types of Information Fusion

For example, suppose there are two boxes A and B on a table; box A contains m grains of soybeans, and box B contains n grains of wheat.

- I.
- Child w pours the n grains of wheat in box B into box A; the m grains of soybeans are then mixed with the n grains of wheat.
- II.
- Child w pours the mixture of $m+n$ grains into a sieve. The wheat grains are filtered out by the sieve and separated, and the m grains of soybeans are left in the sieve.

If the m grains of soybeans in box A are considered as m information elements ${x}_{i}$, and the n grains of wheat in box B as n information elements ${x}_{j}$, ${x}_{i}\ne {x}_{j}$, then the two conclusions I${}^{*}$ and II${}^{*}$ are obtained by understanding the above facts I and II with the concept of information fusion, as follows:

- I${}^{*}$.
- The n information elements ${x}_{j}$ are merged into A from outside A, which generates the information fusion ${\left(x\right)}^{F}$. There are $m+n$ information elements ${x}_{k}$ in ${\left(x\right)}^{F}$; ${\left(x\right)}^{F}$ is the first type of information fusion, called information supplementation fusion.
- II${}^{*}$.
- The n information elements ${x}_{j}$ among the $m+n$ information elements in box A are transferred from inside box A to outside, which generates the information fusion ${\left(x\right)}^{\overline{F}}$. There are m information elements ${x}_{i}$ in ${\left(x\right)}^{\overline{F}}$; ${\left(x\right)}^{\overline{F}}$ is the second type of information fusion, called information redundancy fusion. The characteristics of the two types of information fusion are exactly the same as those of the p-set $({X}^{\overline{F}},{X}^{F})$; the p-set is a new model and method for researching information fusion.
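The box example above can be sketched with ordinary Python sets, assuming the soybean and wheat grains are distinguishable labels (the labels themselves are invented for the example):

```python
# (x)^F is the first type of fusion (information supplementation fusion);
# (x)^F̄ is the second type (information redundancy fusion). Together they
# mirror the p-set pair (X^F̄, X^F).

soybeans = {f"s{i}" for i in range(1, 4)}   # m = 3 information elements x_i
wheat    = {f"w{j}" for j in range(1, 3)}   # n = 2 information elements x_j

x_F    = soybeans | wheat   # I*: wheat merged in from outside -> (x)^F
x_Fbar = x_F - wheat        # II*: wheat sieved back out       -> (x)^F̄

assert len(x_F) == 5 and x_Fbar == soybeans
print(sorted(x_F), sorted(x_Fbar))
```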

#### 5.2. The Relationship between Two Types of Information Fusion and Information Equivalence Classes

Using this simple example, we now analyze the relationship between the two types of information fusion and information equivalence classes.

The information supplementation fusion ${\left(x\right)}^{F}$ is called the ${\alpha}^{\overline{F}}$-information equivalence class ${\left[x\right]}^{F}$ on the attribute set ${\alpha}^{\overline{F}}$ if ${\left(x\right)}^{F}$ is generated by deleting attributes from the attribute set $\alpha $ of the information $\left(x\right)$.

The information redundancy fusion ${\left(x\right)}^{\overline{F}}$ is called the ${\alpha}^{F}$-information equivalence class ${\left[x\right]}^{\overline{F}}$ on the attribute set ${\alpha}^{F}$ if ${\left(x\right)}^{\overline{F}}$ is generated by supplementing attributes in the attribute set $\alpha $ of the information $\left(x\right)$.

The information fusion pair $({\left(x\right)}^{\overline{F}},{\left(x\right)}^{F})$ is composed of the information redundancy fusion ${\left(x\right)}^{\overline{F}}$ and the information supplementation fusion ${\left(x\right)}^{F}$. $({\left(x\right)}^{\overline{F}},{\left(x\right)}^{F})$ is called the $({\alpha}^{F},{\alpha}^{\overline{F}})$-information equivalence class $[{\left[x\right]}^{\overline{F}},{\left[x\right]}^{F}]$ on the attribute set $({\alpha}^{F},{\alpha}^{\overline{F}})$.

By using these concepts, we can get the following theorems.

**Theorem 9.**

(The relationship theorem of information supplementation fusion and the ${\alpha}^{\overline{F}}$-information equivalence class) The information supplementation fusion ${\left(x\right)}^{F}$ is the ${\alpha}^{\overline{F}}$-information equivalence class generated by the information $\left(x\right)$ if and only if $\forall {x}_{i}$, ${x}_{j}$, ${x}_{k}\in {\left(x\right)}^{F}$ satisfy

$$\begin{array}{l}1.\phantom{\rule{4pt}{0ex}}\text{reflexivity:}\phantom{\rule{1.em}{0ex}}{x}_{i}{\alpha}^{\overline{F}}{x}_{i}\\ 2.\phantom{\rule{4pt}{0ex}}\text{symmetry:}\phantom{\rule{1.em}{0ex}}{x}_{i}{\alpha}^{\overline{F}}{x}_{j}\Rightarrow {x}_{j}{\alpha}^{\overline{F}}{x}_{i}\\ 3.\phantom{\rule{4pt}{0ex}}\text{transitivity:}\phantom{\rule{1.em}{0ex}}{x}_{i}{\alpha}^{\overline{F}}{x}_{j},\phantom{\rule{4pt}{0ex}}{x}_{j}{\alpha}^{\overline{F}}{x}_{k}\Rightarrow {x}_{i}{\alpha}^{\overline{F}}{x}_{k}.\end{array}$$

**Theorem 10.**

(The relationship theorem of information redundancy fusion and the ${\alpha}^{F}$-information equivalence class) The information redundancy fusion ${\left(x\right)}^{\overline{F}}$ is the ${\alpha}^{F}$-information equivalence class ${\left[x\right]}^{\overline{F}}$ generated by the information $\left(x\right)$ if and only if $\forall {x}_{i}$, ${x}_{j}$, ${x}_{k}\in {\left(x\right)}^{\overline{F}}$ satisfy

$$\begin{array}{l}1.\phantom{\rule{4pt}{0ex}}\text{reflexivity:}\phantom{\rule{1.em}{0ex}}{x}_{i}{\alpha}^{F}{x}_{i}\\ 2.\phantom{\rule{4pt}{0ex}}\text{symmetry:}\phantom{\rule{1.em}{0ex}}{x}_{i}{\alpha}^{F}{x}_{j}\Rightarrow {x}_{j}{\alpha}^{F}{x}_{i}\\ 3.\phantom{\rule{4pt}{0ex}}\text{transitivity:}\phantom{\rule{1.em}{0ex}}{x}_{i}{\alpha}^{F}{x}_{j},\phantom{\rule{4pt}{0ex}}{x}_{j}{\alpha}^{F}{x}_{k}\Rightarrow {x}_{i}{\alpha}^{F}{x}_{k}.\end{array}$$

Theorems 9 and 10 follow directly from Theorem 3; their proofs are omitted.
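Theorems 9 and 10 characterize the two fusions as equivalence classes via the three properties above. A small sketch, assuming the relation ${x}_{i}{\alpha}^{F}{x}_{j}$ means "${x}_{i}$ and ${x}_{j}$ agree on the attribute set" (the attribute "province" and the data are illustrative only):

```python
# Check reflexivity, symmetry, and transitivity of a binary relation; a
# relation induced by agreement on an attribute set automatically has all
# three, so it partitions the elements into equivalence classes.
from itertools import product

def is_equivalence(elems, rel):
    refl = all(rel(a, a) for a in elems)
    symm = all(rel(b, a) for a, b in product(elems, repeat=2) if rel(a, b))
    trans = all(rel(a, c) for a, b, c in product(elems, repeat=3)
                if rel(a, b) and rel(b, c))
    return refl and symm and trans

# Elements are related when they agree on the illustrative attribute value.
profile = {"x1": "shandong", "x2": "shandong", "x3": "hebei"}
rel = lambda a, b: profile[a] == profile[b]
print(is_equivalence(profile.keys(), rel))  # True
```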

**Corollary 4.**

The information redundancy and supplementation fusion $({\left(x\right)}^{\overline{F}},{\left(x\right)}^{F})$ is the $({\alpha}^{F},{\alpha}^{\overline{F}})$-information equivalence class $[{\left[x\right]}^{\overline{F}},{\left[x\right]}^{F}]$.

The following Propositions 4–6 are obtained directly by Theorems 9 and 10 and Corollary 4:

**Proposition 4.**

Information redundancy fusion ${\left(x\right)}^{\overline{F}}$ and the ${\alpha}^{F}$-information equivalence class ${\left[x\right]}^{\overline{F}}$ are two equivalent concepts.

**Proposition 5.**

Information supplementation fusion ${\left(x\right)}^{F}$ and the ${\alpha}^{\overline{F}}$-information equivalence class ${\left[x\right]}^{F}$ are two equivalent concepts.

**Proposition 6.**

Information redundancy and supplementation fusion $({\left(x\right)}^{\overline{F}},{\left(x\right)}^{F})$ and the $({\alpha}^{F},{\alpha}^{\overline{F}})$-information equivalence class $[{\left[x\right]}^{\overline{F}},{\left[x\right]}^{F}]$ are two equivalent concepts.

## 6. Application on Intelligent Acquisition of Information Equivalence Class in Information Fusion and Unknown Information Discovery

To keep the conceptual and theoretical results given in Section 3, Section 4 and Section 5 simple and easy to follow, this section gives only a simple application of ${\alpha}^{F}$-information equivalence class intelligent acquisition in information redundancy fusion and unknown information discovery.

Suppose that ${x}_{1}$, ${x}_{2}$, ${x}_{3}$, ${x}_{4}$, ${x}_{5}$, ${x}_{6}$, ${x}_{7}$ are PhD students enrolled in 2018 who will complete their PhD within four years; ${x}_{1}\sim {x}_{7}$ come from different provinces in China; ${x}_{1}\sim {x}_{7}$ constitute the information $\left(x\right)$:

$$\left(x\right)=\left\{{x}_{1},{x}_{2},{x}_{3},{x}_{4},{x}_{5},{x}_{6},{x}_{7}\right\},$$

$\forall {x}_{i}\in \left(x\right)$ has test scores in math, physics, computer, and information technology: math = ${y}_{1,i}$, physics = ${y}_{2,i}$, computer = ${y}_{3,i}$, information technology = ${y}_{4,i}$; $i=1,2,\cdots ,7$. A is the information value matrix generated by $\left(x\right)$:

$$A=\left[\begin{array}{cccc}87& 93& 79& 97\\ 80& 88& 91& 87\\ 74& 83& 92& 77\\ 91& 90& 93& 88\\ 96& 73& 82& 91\\ 85& 89& 90& 78\\ 91& 91& 70& 85\end{array}\right],$$

where, for ${x}_{i}\in \left(x\right)$, the i-th row of A consists of the 4 scores ${y}_{1,i}$, ${y}_{2,i}$, ${y}_{3,i}$, ${y}_{4,i}$ of ${x}_{i}$, which constitute the vector ${y}_{i}=({y}_{1,i},{y}_{2,i},{y}_{3,i},{y}_{4,i}{)}^{T}$; $i=1,2,\cdots ,7$.

Math, physics, computer, and information technology are defined as the attributes ${\alpha}_{1}$ = math, ${\alpha}_{2}$ = physics, ${\alpha}_{3}$ = computer, ${\alpha}_{4}$ = information technology, respectively. ${\alpha}_{1}$, ${\alpha}_{2}$, ${\alpha}_{3}$, ${\alpha}_{4}$ constitute the attribute set $\alpha $ of $\left(x\right)$:

$$\alpha =\left\{{\alpha}_{1},{\alpha}_{2},{\alpha}_{3},{\alpha}_{4}\right\}$$

Because each ${x}_{i}\in \left(x\right)$ has the attributes ${\alpha}_{1}$, ${\alpha}_{2}$, ${\alpha}_{3}$ and ${\alpha}_{4}$, the attribute ${\alpha}_{i}$ of ${x}_{i}\in \left(x\right)$ satisfies the attribute “conjunctive normal form” of Equation (15); that is,

$${\alpha}_{i}={\alpha}_{1}\wedge {\alpha}_{2}\wedge {\alpha}_{3}\wedge {\alpha}_{4}={\wedge}_{t=1}^{4}{\alpha}_{t.}$$

If we want to know which of ${x}_{1}\sim {x}_{7}$ come from Shandong Province, China, we can add the attribute ${\alpha}_{5}$ = Shandong Province to the attribute set $\alpha $, so that $\alpha $ generates ${\alpha}^{F}$:

$${\alpha}^{F}=\alpha \cup \left\{{\alpha}_{5}\right\}=\left\{{\alpha}_{1},{\alpha}_{2},{\alpha}_{3},{\alpha}_{4},{\alpha}_{5}\right\}.$$

We then get ${\left(x\right)}^{\overline{F}}$ with the attribute set ${\alpha}^{F}$:

$${\left(x\right)}^{\overline{F}}=\left(x\right)-\left\{{x}_{3},{x}_{5},{x}_{6},{x}_{7}\right\}=\left\{{x}_{1},{x}_{2},{x}_{4}\right\}.$$

The attribute ${\alpha}_{i}$ of $\forall {x}_{i}\in {\left(x\right)}^{\overline{F}}$ satisfies the attribute “expansion of conjunctive normal form” of Equation (16), that is,

$$\begin{array}{cc}\hfill {\alpha}_{i}& =({\alpha}_{1}\wedge {\alpha}_{2}\wedge {\alpha}_{3}\wedge {\alpha}_{4})\wedge {\alpha}_{5}\hfill \\ \hfill \phantom{\rule{1.em}{0ex}}& =\left({\wedge}_{t=1}^{4}{\alpha}_{t}\right)\wedge {\alpha}_{5}\hfill \\ \hfill \phantom{\rule{1.em}{0ex}}& ={\wedge}_{t=1}^{5}{\alpha}_{t}.\hfill \end{array}$$

From Theorem 3, ${\left(x\right)}^{\overline{F}}$ is the ${\alpha}^{F}$-information equivalence class ${\left[x\right]}^{\overline{F}}$ generated by $\left(x\right)$; from Theorem 10 and Proposition 4, ${\left(x\right)}^{\overline{F}}$ is the information redundancy fusion generated by $\left(x\right)$ by deleting ${x}_{3}$, ${x}_{5}$, ${x}_{6}$, ${x}_{7}$. The information value matrix A generates the internal p-matrix ${A}^{\overline{F}}$:

$${A}^{\overline{F}}=\left[\begin{array}{cccc}87& 93& 79& 97\\ 80& 88& 91& 87\\ 91& 90& 93& 88\end{array}\right].$$

From Equations (35) and (40), A and ${A}^{\overline{F}}$ satisfy ${A}^{\overline{F}}\subseteq A$, or ${A}^{\overline{F}}\Rightarrow A$; Equations (35) and (40) and Equations (34) and (39) satisfy the internal p-matrix reasoning: $if\phantom{\rule{4pt}{0ex}}{A}^{\overline{F}}\Rightarrow A$, $then\phantom{\rule{4pt}{0ex}}{\left(x\right)}^{\overline{F}}\Rightarrow \left(x\right)$. Because the internal p-matrix reasoning condition ${A}^{\overline{F}}\Rightarrow A$ is satisfied, the information redundancy fusion ${\left(x\right)}^{\overline{F}}$ is intelligently acquired in the information $\left(x\right)$; the students ${x}_{1}$, ${x}_{2}$, ${x}_{4}$ who come from Shandong Province are found among ${x}_{1}\sim {x}_{7}$.
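The worked example above can be sketched as follows. The provinces of ${x}_{3}$, ${x}_{5}$, ${x}_{6}$, ${x}_{7}$ are invented placeholders (the paper only states that ${x}_{1}$, ${x}_{2}$, ${x}_{4}$ come from Shandong), so this is an illustration under that assumption, not the paper's survey data.

```python
# Supplementing α with α5 = "Shandong Province" contracts (x) to (x)^F̄;
# the internal p-matrix A^F̄ keeps exactly the rows of A indexed by (x)^F̄.

scores = {                      # information value matrix A, one row per x_i
    "x1": [87, 93, 79, 97], "x2": [80, 88, 91, 87], "x3": [74, 83, 92, 77],
    "x4": [91, 90, 93, 88], "x5": [96, 73, 82, 91], "x6": [85, 89, 90, 78],
    "x7": [91, 91, 70, 85],
}
province = {"x1": "shandong", "x2": "shandong", "x3": "henan",     # x3, x5,
            "x4": "shandong", "x5": "hunan", "x6": "hebei",        # x6, x7
            "x7": "anhui"}                                         # invented

# α -> α^F = α ∪ {α5}: keep only the elements that have the new attribute.
x_Fbar = [x for x in scores if province[x] == "shandong"]
A_Fbar = [scores[x] for x in x_Fbar]      # internal p-matrix A^F̄ ⊆ A

print(x_Fbar)     # ['x1', 'x2', 'x4'], matching Equation (39)
print(A_Fbar[0])  # [87, 93, 79, 97], the first row of A^F̄ in Equation (40)
```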

From this simple example, we conclude that if the attribute ${\alpha}_{5}$ is added to the attribute set $\alpha $ of the information $\left(x\right)$, the redundancy fusion ${\left(x\right)}^{\overline{F}}$ of unknown information is discovered intelligently from $\left(x\right)$; the unknown information ${\left(x\right)}^{\overline{F}}$ was hidden in $\left(x\right)$ before the attribute ${\alpha}_{5}$ was added to $\alpha $.

For ${x}_{1}\sim {x}_{7}$, we conducted a survey of the students' home provinces; the survey results are the same as those given in Equation (39).

## 7. Conclusions

For the p-set and its augmented matrix, the analyses given in this paper yield the following new conclusions:

- 1.
- Under the condition that some attributes are added to the attribute set $\alpha $ of $\left(x\right)$, some information elements are deleted from $\left(x\right)$, so $\left(x\right)$ generates ${\left(x\right)}^{\overline{F}}$; that is, the boundary of $\left(x\right)$ shrinks inward to generate ${\left(x\right)}^{\overline{F}}$. By using the equivalence class concept in mathematics, we get that ${\left(x\right)}^{\overline{F}}$ is the ${\alpha}^{F}$-information equivalence class ${\left[x\right]}^{\overline{F}}$ generated by $\left(x\right)$, because the attribute set ${\alpha}^{F}$ for $\forall {x}_{i}$, ${x}_{j}$, ${x}_{k}\in {\left[x\right]}^{\overline{F}}$ satisfies the characteristics of an equivalence class: reflexivity, symmetry, and transitivity. Obviously, $\left(x\right)$ generates multiple ${\alpha}^{F}$-information equivalence classes ${\left[x\right]}_{1}^{\overline{F}}$, ${\left[x\right]}_{2}^{\overline{F}}$, ⋯, ${\left[x\right]}_{n}^{\overline{F}}$ under the condition of constantly supplementing attributes in $\alpha $; accordingly, $\left(x\right)$ continuously deletes information elements ${x}_{i}$ to yield multiple information fusions ${\left(x\right)}_{1}^{\overline{F}}$, ${\left(x\right)}_{2}^{\overline{F}}$, ⋯, ${\left(x\right)}_{n}^{\overline{F}}$; each ${\left(x\right)}_{i}^{\overline{F}}$ is called an information redundancy fusion, $i=1,2,\cdots ,n$. We thus get a new concept of information fusion: information redundancy fusion.
- 2.
- Under the condition that some attributes are deleted from the attribute set $\alpha $ of $\left(x\right)$, some information elements are added to $\left(x\right)$, so $\left(x\right)$ generates ${\left(x\right)}^{F}$; that is, the boundary of $\left(x\right)$ expands outward to generate ${\left(x\right)}^{F}$. By using the equivalence class concept in mathematics, we get that ${\left(x\right)}^{F}$ is the ${\alpha}^{\overline{F}}$-information equivalence class ${\left[x\right]}^{F}$ generated by $\left(x\right)$, because the attribute set ${\alpha}^{\overline{F}}$ for $\forall {x}_{i}$, ${x}_{j}$, ${x}_{k}\in {\left[x\right]}^{F}$ satisfies the characteristics of an equivalence class: reflexivity, symmetry, and transitivity. Obviously, $\left(x\right)$ generates multiple ${\alpha}^{\overline{F}}$-information equivalence classes ${\left[x\right]}_{1}^{F}$, ${\left[x\right]}_{2}^{F}$, ⋯, ${\left[x\right]}_{n}^{F}$ under the condition of continually deleting attributes in $\alpha $; accordingly, $\left(x\right)$ constantly supplements information elements ${x}_{j}$ to yield multiple information fusions ${\left(x\right)}_{1}^{F}$, ${\left(x\right)}_{2}^{F}$, ⋯, ${\left(x\right)}_{n}^{F}$; each ${\left(x\right)}_{j}^{F}$ is called an information supplementation fusion, $j=1,2,\cdots ,n$. We thus get a new concept of information fusion: information supplementation fusion. Information redundancy fusion and information supplementation fusion exist in many applied researches on information fusion.

The cited literature [1,2,3,4,5,6,7,8,42,43] gives many excellent studies on information fusion. Compared with this literature, the contributions of this article are as follows: two new concepts of information fusion, information redundancy fusion and information supplementation fusion, are presented by using mathematical methods to understand the concept and characteristics of information fusion; the concept of the information equivalence class is presented by using a new mathematical model, the p-set; information equivalence classes and information fusions are shown to be two equivalent concepts, which is an important theoretical conclusion; and an information fusion intelligent acquisition method and intelligent acquisition algorithm are presented under matrix reasoning conditions. The results given in the paper are all new.

## Author Contributions

Conceptualization and formal analysis, S.L. and K.S.; writing—original draft preparation, Y.X. All authors have read and agreed to the published version of the manuscript.

## Funding

This research was funded by the National Natural Science Foundation of China (71663010), the National Philosophy and Social Science Foundation of China (17BGL001), and the Shandong Provincial Natural Science Foundation of China (ZR2019MG015).

## Conflicts of Interest

The authors declare no conflicts of interest regarding the publication of this paper.

## References

- Dubois, D.; Liu, W.; Ma, J.; Prade, H. The basic principles of uncertain information fusion. An organized review of merging rules in different representation frameworks. Inf. Fusion **2016**, 32, 12–39.
- Wei, W.; Liang, J. Information fusion in rough set theory: An overview. Inf. Fusion **2019**, 48, 107–118.
- Sycara, K.; Glinton, R.; Yu, B.; Giampapa, J.; Owens, S.; Lewis, M.; Grindle, L.C. An integrated approach to high-level information fusion. Inf. Fusion **2009**, 10, 25–50.
- Blasch, E.P.; Pribilski, M.; Daughtery, B.; Roscoe, B.; Gunsett, J. Fusion metrics for dynamic situation analysis. In Proceedings of the Signal Processing, Sensor Fusion, and Target Recognition XIII, Orlando, FL, USA, 12–16 April 2004; International Society for Optics and Photonics: Bellingham, WA, USA, 2004; Volume 5429, pp. 428–439.
- Campomanes-Alvarez, C.; Ibáñez, O.; Cordón, O.; Wilkinson, C. Hierarchical information fusion for decision making in craniofacial superimposition. Inf. Fusion **2018**, 39, 25–40.
- Beliakov, G. How to build aggregation operators from data. Int. J. Intell. Syst. **2003**, 18, 903–923.
- Huete, M.I.; Ibáñez, O.; Wilkinson, C.; Kahana, T. Past, present, and future of craniofacial superimposition: Literature and international surveys. Leg. Med. **2015**, 17, 267–278.
- Bronselaer, A.; Van Britsom, D.; De Tré, G. A framework for multiset merging. Fuzzy Sets Syst. **2012**, 191, 1–20.
- Imani, M. Estimation, Inference and Learning in Nonlinear State-Space Models. Ph.D. Thesis, Texas A&M University, College Station, TX, USA, 2019.
- Xie, S.; Imani, M.; Dougherty, E.R.; Braga-Neto, U.M. Nonstationary linear discriminant analysis. In Proceedings of the 2017 51st Asilomar Conference on Signals, Systems, and Computers, Pacific Grove, CA, USA, 29 October–1 November 2017; pp. 161–165.
- Imani, M.; Braga-Neto, U.M. Finite-horizon LQR controller for partially-observed Boolean dynamical systems. Automatica **2018**, 95, 172–179.
- Imani, M.; Braga-Neto, U. Control of gene regulatory networks using Bayesian inverse reinforcement learning. IEEE/ACM Trans. Comput. Biol. Bioinform. **2018**, 16, 1250–1261.
- Kaiquan, S. P-sets. J. Shandong Univ. (Nat. Sci.) **2008**, 43, 77–84.
- Chengxian, F.; Hongkang, L. P-sets and the reasoning-identification of disaster information. Int. J. Converg. Inf. Technol. **2012**, 7, 337–345.
- Kaiquan, S. P-sets and its applications. Int. J. Adv. Syst. Sci. Appl. **2009**, 9, 209–219.
- Kaiquan, S. P-sets and its Applied Characteristics. Comput. Sci. **2010**, 37, 1–8.
- Kaiquan, S. P-reasoning and P-reasoning Discovery-identification of Information. Comput. Sci. **2011**, 38, 1–9.
- Kaiquan, S. P-sets, Inverse P-sets and the Intelligent Fusion-filter Identification of Information. Comput. Sci. **2012**, 39, 1–13.
- Hongkang, L.; Chengxian, F. The dual form of P-reasoning and identification of unknown attribute. Int. J. Digit. Content Technol. Its Appl. **2012**, 6, 121–131.
- Ling, Z.; Yuquan, C.; Kaiquan, S. Outer P-sets and data internal-recovery. Syst. Eng. Electron. **2010**, 32, 1233–1238.
- Yang, W.; Hongqin, G.; Kaiquan, S. P-sets and dependence-discovery of dynamic information. Syst. Eng. Electron. **2011**, 33, 2033–2038.
- Yuying, L.; Hongkang, L.; Kaiquan, S. Characteristics of data discrete interval and data discovery-application. Syst. Eng. Electron. **2011**, 33, 2258–2262.
- Ling, Z.; Jihua, T.; Kaiquan, S. The fusion of internal P-information and its feature of attribute conjunction. J. Shandong Univ. (Nat. Sci.) **2014**, 49, 93–97.
- Kaiquan, S.; Ling, Z. Internal P-sets and data outer-recovery. J. Shandong Univ. (Nat. Sci.) **2009**, 44, 8–14.
- Ling, Z.; Xuefang, R. P-Set and Its (f,$\overline{f}$)-Heredity. Quant. Log. Soft Comput. **2010**, 2, 735–743.
- Li, Z.; Ming, X.; Kaiquan, S. P-sets and applications of internal-outer data circle. Quant. Log. Soft Comput. **2010**, 2, 581–591.
- Yufeng, Q.; Baohui, C. f-Model generated by P-set. Quant. Log. Soft Comput. **2010**, 2, 613–620.
- Yuying, L.; Li, Z.; Kaiquan, S. Generation and recovery of compressed data and redundant data. Quant. Log. Soft Comput. **2010**, 2, 661–671.
- Ming, X.; Kaiquan, S.; Li, Z. P-Sets and $\overline{F}$-Data Selection-Discovery. Quant. Log. Soft Comput. **2010**, 2, 791–799.
- Shuli, Z.; Chengxian, F.; Kaiquan, S. Outer P-information generation and its reasoning-searching discovery. J. Shandong Univ. (Nat. Sci.) **2012**, 47, 99–104.
- Shuli, Z.; Songli, W.; Kaiquan, S. Internal P-reasoning Information Recovery and Attribute Hiding Reasoning Discovery. Comput. Sci. **2013**, 10, 209–213.
- Kaiquan, S. Function P-sets. J. Shandong Univ. (Nat. Sci.) **2011**, 46, 62–69.
- Kaiquan, S. Function P-sets. Int. J. Mach. Learn. Cybern. **2011**, 2, 281–288.
- Kaiquan, S. P-information law intelligent fusion and soft information image intelligent generation. J. Shandong Univ. (Nat. Sci.) **2014**, 49, 1–17.
- Jihua, T.; Ling, Z.; Kaiquan, S. Intelligent Fusion of Information Law and its Inner Separating. Comput. Sci. **2015**, 42, 204–209.
- Xuefang, R.; Ling, Z.; Kaiquan, S. Two Types of Dynamic Information Law Models and Their Applications in Information Camouflage and Risk Identification. Comput. Sci. **2018**, 45, 230–236.
- Jihua, T.; Ling, Z.; Baohui, C.; Kaiquan, S.; Hsien-Wei, T.; Yilun, C. Outer P-information law reasoning and its application in intelligent fusion and separating of information law. Microsyst. Technol. **2018**, 24, 4389–4398.
- Kaiquan, S. P-augmented matrix and dynamic intelligent discovery-identification of information. J. Shandong Univ. (Nat. Sci.) **2015**, 50, 1–12.
- Kaiquan, S. Inverse P-sets. J. Shandong Univ. (Nat. Sci.) **2012**, 47, 98–109.
- Kaiquan, S. Function inverse P-sets and information law fusion. J. Shandong Univ. (Nat. Sci.) **2012**, 47, 73–80.
- Kaiquan, S. Function inverse P-sets and the hiding information generated by function inverse P-information law fusion. In Proceedings of the Conference on e-Business, e-Services and e-Society, Sanya, China, 28–30 November 2014; Springer: Berlin/Heidelberg, Germany, 2014; pp. 224–237.
- Benferhat, S.; Sossai, C. Reasoning with multiple-source information in a possibilistic logic framework. Inf. Fusion **2006**, 7, 80–96.
- Forbus, K.D.; Mahoney, J.V.; Dill, K. How qualitative spatial reasoning can improve strategy game AIs. IEEE Intell. Syst. **2002**, 17, 25–30.

**Figure 1.**The positional relationship between the finite ordinary element set X and the p-set $({X}^{\overline{F}},{X}^{F})$, ${X}^{\overline{F}}\subseteq X\subseteq {X}^{F}$; X is represented by a solid line, and ${X}^{\overline{F}}$ and ${X}^{F}$ are indicated by dashed lines; the p-set is composed of ${X}^{\overline{F}}$ and ${X}^{F}$.

**Figure 2.**The intelligent algorithm diagram of ${\alpha}^{F}$-information equivalence class ${\left[x\right]}^{\overline{F}}$. In the figure, $\left(x\right)$ is the given information; $\alpha $ is the attribute set of $\left(x\right)$; ${\left[x\right]}_{k}^{\overline{F}}$ is ${\alpha}_{k}^{F}$-information equivalence class; ${\left[x\right]}^{\overline{F},*}$ is the given ${\alpha}^{F}$- information equivalence class; ${A}_{k}^{\overline{F}}$, ${A}_{k+t}^{\overline{F}}$ are the internal p-matrix generated by ${\left[x\right]}_{k}^{\overline{F}}$, ${\left[x\right]}_{k+t}^{\overline{F}}$ respectively; A is the information value matrix generated by $\left(x\right)$.

© 2020 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (http://creativecommons.org/licenses/by/4.0/).