
ISPRS Int. J. Geo-Inf. 2018, 7(2), 42; https://doi.org/10.3390/ijgi7020042

Article
Incrementally Detecting Change Types of Spatial Area Object: A Hierarchical Matching Method Considering Change Process
by 1,2,3, 1,2,3 and 1,2,3,*
1 College of Resources Environment and Tourism, Capital Normal University, Beijing 100048, China
2 3D Information Collection and Application Key Laboratory of Education Ministry, Capital Normal University, Beijing 100048, China
3 Beijing State Key Laboratory Incubation Base of Urban Environmental Processes and Digital Simulation, Capital Normal University, Beijing 100048, China
* Author to whom correspondence should be addressed.
Received: 2 November 2017 / Accepted: 28 January 2018 / Published: 30 January 2018

Abstract:
Detecting and extracting the change types of spatial area objects can track the spatiotemporal change patterns of area objects and provide a change-backtracking mechanism for incrementally updating spatial datasets. To address the problems of high complexity of detection methods, a high redundancy rate of detection factors, and a low degree of automation during the incremental update process, we take the change process of area objects into account in an integrated way and propose a hierarchical matching method to detect the nine types of changes of area objects, while minimizing the complexity of the algorithm and the redundancy rate of the detection factors. We illustrate in detail the identification, extraction, and database entry of change types, and how we achieve a close connection and organic coupling of incremental information extraction and object type-of-change detection so as to characterize the whole change process. The experimental results show that this method can successfully detect incremental information about area objects in practical applications, with the overall accuracy reaching above 90%, which is much higher than the existing weighted matching method, making it both feasible and applicable. It helps establish the corresponding relation between new-version and old-version objects, and facilitates the linked update processing and quality control of spatial data.
Keywords:
incremental update; incremental information extraction; type-of-change detection; hierarchical matching operator; hierarchical matching

1. Introduction

Spatial databases are mainly updated incrementally, and one difficulty with the incremental updating of spatial databases is the extraction of the incremental (i.e., changed) information [1,2,3]. The detection and extraction of information about object changes is the basis for the incremental updating of spatial databases [4,5]. A complete record of object changes should not only include the changed object itself, but should also contain information about the types of changes that trigger the physical change of the object [6,7,8]. Recording the changed object is the key to collecting and storing incremental update information, while further detection of the types of changes influences the linked update processing, quality control, and release of the incremental change information [9]. Information about object change can be applied in fields such as spatial object update, lifecycle tracking, historical data backtracking, statistics and analysis of change information, and prediction of the spatiotemporal transmission patterns and change trends of spatial objects [10,11,12]. In updating practice, if incremental updating only records the old objects in the history database and pays little attention to the change process of geographical objects, then multiple updates will inevitably hide the correspondence between the old and new data.
On the one hand, the nature of incremental extraction is the matching of spatial objects, and feature matching is often used to detect the changed objects. This is because single-geometric-feature (e.g., centroid) matching usually cannot accurately measure the similarity among different objects or determine the spatial relationships of polygon objects. Although many alternative approaches exist that solve these problems (e.g., MBR-Minimum Bounding Rectangle, Hausdorff, Voronoi, etc.), most researchers prefer integrating a number of operators reflecting the different geometric characteristics of the objects from the perspective of reducing the complexity of the algorithm [10,13,14,15,16,17]. One typical method conducts a weighted processing of several operators reflecting different geometric features of the objects, obtaining a comprehensive similarity index to judge whether there is a geometric matching relationship between those objects [11,18,19,20]. This method avoids the selection of single-operator thresholds at the cost of increased algorithmic complexity, and can better describe the overall geometric features of the objects, greatly improving the matching integrity and accuracy. However, both the weighted method and the single-operator matching method ignore an important fact: geometric matching requires that similarity in position, shape, and size be satisfied simultaneously; similarity in only one or two geometric features is not enough [12]. They are liable to produce matching redundancy, and the matching accuracy varies with the weight assignment method.
On the other hand, as far as change types are concerned, changes are usually divided into nine types in existing studies: vanish, appearance, reappearance, attribute change, expansion, shrink, translation, rotation, and deformation [21,22,23,24]. For the detection and extraction of incremental information, scholars mainly conduct studies from two perspectives: geographic entity change (change event-driven) and feature change (change information-driven). Of these, most researchers update a spatial database with information about the types of changes in spatial objects based on event-driven incremental updates [25,26,27], which alleviates the difficulty of detecting the types of changes during the incremental information extraction process. However, this method requires information about the types of changes ahead of time and conducts incremental updates with that information (the geographic change event), which means that the information needs to be stored in the database before the update. In essence, this method requires manually adding the type-of-change information to the database; however, in most cases, when updating a large amount of spatial data, it is impossible to know ahead of time the types of changes experienced by these objects. In addition, the topological relations among objects, or non-topological attributes such as size, shape, and position, are also used to determine the types of changes [19,27,28,29]. Some scholars apply the snapshot difference triple-descriptive model [30] or the quartet model [31] to detect the types of changes. Compared with methods based on change events, these methods (which determine the types of changes using object information) have made remarkable progress towards automation, but they face problems such as high complexity of the detection method, high redundancy rates of detection factors, and rough, inaccurate inference of type-of-change information.
Moreover, most of the existing literature either studies the extraction of changed objects or the detection of the types of changes, and there is little research that combines both the extraction of incremental information about area objects and the determination of the types of changes in order to completely detect the change information and to record the change process. The link between updating data and history data remains a difficult issue to date [8,11].
To solve the above problems, this study considers the organic unity of the extraction of change information about the objects and the detection of the types of changes, and proposes a method to completely detect the incremental information about area objects. Taking into account the process of object change, we propose an algorithm which implements the identification, extraction, and database entry of the types of changes of area objects during the incremental information extraction process. While ensuring the accuracy of the algorithm, both its complexity and the redundancy of the detection factors involved are reduced to the greatest possible extent.

2. Incremental Detection of Change Information of Area Objects

We propose a new, universal method that is based on object matching features and detects both the changing objects and their types of changes. This method first establishes a hierarchical matching model to extract incremental information datasets of area objects in both the old and new versions. Subsequently, based on the attribute, geometric position, size, shape, and directional information of the area objects, the method defines appropriate matching operators and rules to automatically detect the types of changes. It then detects and extracts the nine types of changes of individual area objects. It uses concise detection conditions and operational operators to ensure the accuracy of detection, reduce algorithmic complexity, and achieve effective tracking and management of the various change tracks within the lifecycle of area objects. The incremental information extraction process and its connection with the extraction of incremental objects are shown in Figure 1.

2.1. Basic Idea

The identification and extraction of change information depends largely on matching objects in the new-version database to the corresponding objects in the old-version database, and the matching accuracy will directly affect the accuracy of the data update. The matching of spatial objects includes both geometric matching and attribute matching. The geometrical characteristics of area objects include their position, size, shape, and direction [21], whose influence on feature matching should be fully considered. We use M to represent the matching model, with M = (Attribute, Position, Size, Shape, Direction).
To detect the change information, we propose to use matching operators reflecting information such as attribute, position, size, shape, and direction in the matching model M to identify the changes of the area object and discriminate the types of changes. When the attribute data are not included in the dataset, other geometric information apart from attribute can also be used to conduct geometric matching and discriminate the types of changes [32]. Therefore, this study only focuses on geometric matching. The matching operators are screened and combined according to their complexity from the lowest to the highest, reducing the calculation workload while ensuring accuracy. For any two matching objects in the old and new databases under the same measuring scale, if any of the operators matches successfully, the condition judgment value will be denoted as T; otherwise, it will be denoted as F. Therefore, when M = (T, T, T, T, T), the match succeeds. After traversing all the objects, those that cannot be successfully matched are the changing ones.
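The hierarchical screening idea described above can be sketched in code. The following is a minimal Python illustration (the paper's implementation is in C# with ArcGIS Engine; the function name, operator signatures, and T/F return convention here are our own assumptions):

```python
def matches(a, b, operators, thresholds):
    """Evaluate matching operators in order of increasing complexity.

    Each operator returns a similarity in [0, 1]; a result at or above its
    threshold is recorded as 'T', otherwise 'F'. As soon as one operator
    fails, the later (costlier) operators are skipped - the hierarchical
    screening described in the text. The pair matches only if every
    operator judgment is 'T'.
    """
    results = []
    for op, threshold in zip(operators, thresholds):
        ok = op(a, b) >= threshold          # condition judgment value
        results.append('T' if ok else 'F')
        if not ok:                          # skip more complex operators
            break
    matched = len(results) == len(operators) and all(r == 'T' for r in results)
    return matched, results
```

With dummy operators, a pair passing both layers yields `(True, ['T', 'T'])`, while a pair failing the second layer stops early with `(False, ['T', 'F'])`.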
In addition, the types of changes in the objects are detected. Vanish, appearance, and reappearance are combinations of delete and add, while the other six types of changes (attribute change, expansion, shrink, translation, rotation, and deformation) may be inferred using the combination of operators in Model M shown in Figure 2. For vanish and appearance, we only need to find out whether an area object in one database intersects with any area object in the other database within the same geographical range. If there is an object A in the new database with no corresponding object in the old database, the change is classified as appearance; if an object in the old database has no corresponding object in the new database, the change is classified as vanish. For reappearance, it is necessary to identify objects in the databases over three periods, since reappearance is the succession of vanish and appearance. The specific solution and reasoning are shown in the following sections.

2.2. Selection of Hierarchical Matching Operators

Once the changing objects are discriminated, provided attribute data exist for each, only the attribute fields of the two area objects need to be matched directly. However, when selecting geometric discriminant operators, their integrity must be guaranteed: the objects' shape, size, and position should all be considered to ensure the accuracy of the matching, and the complexity of the algorithm should also be taken into account. Finally, the detection model formed by all the operators should not be affected by the data types or the measurement scale, and it should accurately describe both the overall and individual geometric features of the objects.
Among the discriminant operators, the simple and accurate centroid distance operator measures the position similarity of area objects and is used by many researchers [8,33]. As an alternative, some scholars use the Hausdorff distance model, the generalized Hausdorff distance model, or the median Hausdorff distance model to measure position similarity [25,34]. These models not only measure the position difference of spatial objects, but also reflect differences between objects in terms of shape and overall distribution. However, as far as the measurement of position similarity alone is concerned, the Hausdorff methods are not as accurate as the centroid matching method, and they are considerably more complicated, with a heavy computational burden.
As far as shape similarity is concerned, there are also various operators to measure it [20,28,31]. Among them, the steering angle function matching algorithm can accurately describe the local, detailed features of complex shapes in terms of both shape and direction, and it is easier to operate and more accurate than other matching algorithms. Other operators, such as the fractal dimension and compactness index, the function describing the distance from edge feature points to the centroid of the area object, the Fourier transform method, the symmetric difference index, and the tangent-space shape operator, can also measure the shape similarity of area objects [21,25,27]. Although these operators can reflect the shape similarity between two area objects to a certain extent, they either fail to describe the details of the shape or involve complicated calculations.
Additionally, most researchers use the degree of area overlapping as a criterion to measure the size similarity between area objects [20]. The Hausdorff distance and the symmetric difference can also reflect the size similarity of objects to a certain extent [11]. However, measuring size similarity using the degree of area overlapping is not only simple to operate, but also relatively accurate.
In summary, after a systematic analysis and modification of the operators that come from the existing studies, this study uses the centroid discriminant operator to measure the position features of area objects, the area overlapping operator to measure their size features, and the steering angle cumulative function matching operator to measure their shape and direction characteristics. The specific calculation method is as follows:
(1) Centroid discriminant operator. This is an index that reflects the distance (position) between objects. The smaller the centroid distance, the higher the position similarity between two area objects. Here, F1 and F2 denote the two area objects to be matched, D(F1, F2) denotes the centroid distance between the two area objects, and r1 and r2 represent half the diagonal of the minimum enclosing rectangles of F1 and F2, respectively, used so that the operator is not affected by the actual size of the objects. The formula is as follows:
$$O_{centroid} = \frac{r_1 + r_2}{r_1 + r_2 + D(F_1, F_2)} \tag{1}$$
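As an illustration, the centroid operator can be computed as follows (a minimal Python sketch; the function name and argument convention are our own assumptions, not the paper's implementation):

```python
import math

def o_centroid(c1, c2, r1, r2):
    """Centroid discriminant operator: (r1 + r2) / (r1 + r2 + D(F1, F2)).

    c1, c2 are the centroid (x, y) tuples of the two area objects; r1, r2
    are half the diagonals of their minimum enclosing rectangles. Returns
    1.0 for identical centroids and tends to 0 as the distance grows.
    """
    d = math.dist(c1, c2)               # Euclidean centroid distance
    return (r1 + r2) / (r1 + r2 + d)
```

For example, two objects with coincident centroids give 1.0, while centroids 5 units apart with r1 = r2 = 2.5 give 5 / (5 + 5) = 0.5.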
(2) Area discriminant operator. Assume that S(F1) and S(F2) represent the area of F1 and that of F2, respectively. Formula (2), which discriminates the size similarity between area objects with area difference, is shown as follows:
$$O_{area} = 1 - \frac{\left| S(F_1) - S(F_2) \right|}{\mathrm{Max}\left( S(F_1), S(F_2) \right)} \tag{2}$$
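A direct transcription of Formula (2) in Python (function name is our own):

```python
def o_area(s1, s2):
    """Area discriminant operator: 1 - |S(F1) - S(F2)| / Max(S(F1), S(F2)).

    s1 and s2 are the areas of the two objects; equal areas give 1.0,
    and the value decreases as the relative area difference grows.
    """
    return 1.0 - abs(s1 - s2) / max(s1, s2)
```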
(3) Steering angle cumulative function discriminant operator. This operator measures the shape similarity between area objects and can measure the direction similarity between them to a certain extent. The principle is as follows: First, a shape feature point in a specific direction in the vertices of the two objects is chosen as a reference point. Giving consideration to information about the direction of the area objects, the reference point is selected per the following standard: the sine value of the angle between the line formed by connecting the reference point to the object centroid and the abscissa axis of the coordinate system should be the maximum, and the X axis coordinate of the reference point should be larger than that of the object centroid. The angle between the counterclockwise segmental arc of the reference point and the X axis is then recorded, along with the normalized length (the ratio of arc length to perimeter) of each counterclockwise arc. Then, the normalized length is taken as the X axis, and the accumulated value of each point along the peripheral steering angle is taken as the Y axis. Assuming that F1(x) and F2(x) are the steering angle cumulative functions of F1 and F2, then the steering angle cumulative function discriminant operator formula can be expressed as follows:
$$O_{angle} = 1 - \frac{\int_0^1 \left| F_1(x) - F_2(x) \right| dx}{\mathrm{Max}\left( \int_0^1 F_1(x)\,dx, \int_0^1 F_2(x)\,dx \right)} \tag{3}$$
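Formula (3) can be approximated numerically once the two cumulative steering angle functions are available. The following Python sketch assumes the functions are already given as callables over normalized arc length (constructing them from polygon vertices, as described above, is omitted here):

```python
def o_angle(f1, f2, n=1000):
    """Steering angle cumulative function operator, approximated numerically.

    f1 and f2 map normalized arc length x in [0, 1] to the accumulated
    turning angle; the integrals in Formula (3) are approximated with a
    midpoint Riemann sum over n subintervals.
    """
    xs = [(i + 0.5) / n for i in range(n)]             # midpoints
    diff = sum(abs(f1(x) - f2(x)) for x in xs) / n     # integral of |F1 - F2|
    i1 = sum(f1(x) for x in xs) / n                    # integral of F1
    i2 = sum(f2(x) for x in xs) / n                    # integral of F2
    return 1.0 - diff / max(i1, i2)
```

Identical functions yield 1.0; for f1(x) = 2x and f2(x) = x, the integrals are 1 and 0.5 and the operator evaluates to 0.5.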
Although this operator can simultaneously measure the shape and direction information of area objects, certain combinations of shape and direction differences can make the discriminant inadequate, in which case the combination of the above three operators cannot identify the rotation type of change. Therefore, when the centroid discriminant operator and the area discriminant operator are satisfied while the steering angle function discriminant operator is not, an auxiliary discriminant operator is introduced in this paper to discriminate the rotation type of change. If the computed result of the auxiliary operator is larger than the set threshold value, the two objects have the same shape but different directions, and the type of change can be identified as rotation. This discriminant operator measures the shape difference using the similarity of the form factor [8]. Assuming that S(F1), S(F2), L(F1), and L(F2) represent the areas and perimeters of the two area objects F1 and F2, respectively, the formula is as follows:
$$O_{Form} = 1 - \left| \frac{S(F_1)}{L(F_1)^2} - \frac{S(F_2)}{L(F_2)^2} \right| \Bigg/ \mathrm{Max}\left( \frac{S(F_1)}{L(F_1)^2}, \frac{S(F_2)}{L(F_2)^2} \right) \tag{4}$$
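Formula (4) in Python (function name is our own). Because S/L² is invariant under rotation, translation, and uniform scaling, two rotated copies of the same shape score 1.0:

```python
def o_form(s1, l1, s2, l2):
    """Auxiliary form-factor operator comparing S / L^2 for two objects.

    s1, s2 are areas; l1, l2 are perimeters. The ratio S / L^2 depends
    only on shape, so it distinguishes a genuine shape change from a
    pure rotation, which leaves the ratio unchanged.
    """
    k1, k2 = s1 / l1 ** 2, s2 / l2 ** 2
    return 1.0 - abs(k1 - k2) / max(k1, k2)
```

For example, a unit square (S = 1, L = 4) and a doubled square (S = 4, L = 8) share S/L² = 1/16 and score 1.0, whereas a unit square against a 1 × 3 rectangle scores below 1.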

2.3. Extraction of Incremental Information Based on Hierarchical Matching

This study proposes a geometric matching method for area objects based on hierarchical matching, with the hierarchy determined by the accuracy of object matching, the increase in operator complexity, and the decrease in the number of filtered objects. Using the matching operators described above, the algorithm applies the three operators layer by layer in order of increasing computational complexity, reducing the computational burden while ensuring the accuracy of the matching.
In this study, the similarity thresholds of the four geometric matching operators (namely O centroid, O area, O angle, and O Form) are set as φ1, φ2, φ3, and φ4, respectively. Let IsAttributeMatch denote whether the attributes of the two area objects match; if they match, it is denoted by 1 (True), otherwise by 0 (False). For any A and B, which are area objects from the new-version and old-version databases, respectively, the detailed matching method to extract the change information based on hierarchical feature matching is as follows, as also shown in Figure 2.
The first layer: Matching of centroid distance. This layer determines which pairs of candidate objects meet the 1:1 criterion of centroid distance matching, with the remaining mismatched ones regarded as changed objects. Centroid distance matching is conducted among all objects of the same semantic theme in the new and old databases in pairs. If the centroid distance between two objects is smaller than the given matching threshold φ1, the ID number of the pair of objects will be stored in the ID set IDCollection1 for the matching calculation at the next layer.
The second layer: Matching of area overlapping degree. The algorithm's first layer returns a one-to-one relationship between objects, indicating matched positions. A one-to-one relationship at the second layer indicates that the two objects are matched not only in centroid distance but also in area similarity, with mismatched objects regarded as changed ones. Each pair of objects in IDCollection1 satisfying the centroid distance matching condition is traversed, and the area overlapping degree of the pair is calculated. If the overlapping degree is greater than the given area matching threshold φ2, the two objects meet the overlapping degree criterion and are re-saved to a new ID set (IDCollection2). The pairs of objects ultimately retained are those matched in both centroid distance and area overlapping degree.
The third layer: Matching of steering angle function. The algorithm proceeds only on those pairs matched in the first two layers. If the two objects also match at the third layer, they are successfully matched geometrically; otherwise, they are changed objects. The matching process is similar to that of the second layer: each pair of objects in IDCollection2 is traversed and their steering angle function similarity is calculated. If it is greater than the matching threshold φ3, the two objects are successfully matched in terms of shape.
After the above hierarchical matching, the finally retained pairs of objects are considered successfully matched, which can be formalized as follows.
MatchingRule: if ((IsAttributeMatch = True) and (PositionResult ≤ φ1) and (AreaResult ≥ φ2) and (ShapeDirectionResult ≥ φ3)) then Matching (A, B). From the above, the matching result set IDCollection (M) is obtained. Unsuccessfully matched objects in the old and new databases are considered to be the changed objects and are recorded as ChangeCollection (N) and ChangeCollection (O), respectively, to be used in the subsequent detection of the types of changes.
Let the numbers of objects in the new and old databases be m and n, respectively. We compare the hierarchical matching algorithm proposed in this study with the existing weighted matching algorithm from the perspective of algorithm complexity. Between the two methods, the complexity of the three individual operators (distance, area, and shape) is the same. If the number of pairs of objects meeting the distance matching criterion is k1, and the number of pairs meeting both the distance and area criteria is k2, then Min(m, n) > k1 > k2. The numbers of calculations of O centroid, O area, and O angle performed by the hierarchical matching algorithm are m·n, k1, and k2, respectively, whereas the weighted algorithm performs m·n calculations of every operator, as shown in Table 1. Therefore, when the time and space complexity of each operator is the same, the hierarchical matching algorithm is much less complex than the weighted matching algorithm.
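The three-layer filtering of Section 2.3 can be sketched as follows. This Python illustration is schematic: the object representation, operator signatures, and function names are our own assumptions, and the operators are assumed to return similarities compared against the thresholds as in the MatchingRule above:

```python
def hierarchical_match(new_objs, old_objs, o_centroid, o_area, o_angle,
                       phi1, phi2, phi3):
    """Three-layer hierarchical matching sketch.

    new_objs / old_objs map IDs to objects. Layer 1 evaluates the cheap
    centroid operator over all m*n pairs; layers 2 and 3 evaluate the
    costlier operators only on the k1 and k2 surviving pairs. Returns the
    matched ID pairs plus the changed IDs of each database.
    """
    # Layer 1: centroid screening over all pairs (m * n evaluations)
    id_collection1 = [(i, j) for i in new_objs for j in old_objs
                      if o_centroid(new_objs[i], old_objs[j]) >= phi1]
    # Layer 2: area similarity, only on the k1 surviving pairs
    id_collection2 = [(i, j) for (i, j) in id_collection1
                      if o_area(new_objs[i], old_objs[j]) >= phi2]
    # Layer 3: steering angle function, only on the k2 surviving pairs
    matched = [(i, j) for (i, j) in id_collection2
               if o_angle(new_objs[i], old_objs[j]) >= phi3]
    changed_new = set(new_objs) - {i for i, _ in matched}  # ChangeCollection(N)
    changed_old = set(old_objs) - {j for _, j in matched}  # ChangeCollection(O)
    return matched, changed_new, changed_old
```

Unmatched IDs on each side correspond to ChangeCollection (N) and ChangeCollection (O), which feed the type-of-change detection in Section 2.4.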

2.4. Detection of Change Types Based on Matching Operators

Based on the above extracted change datasets, that is, ChangeCollection (N) and ChangeCollection (O), assume that ChangeType denotes the detected results about the types of changes. Meanwhile, the corresponding dataset (i.e., GeoDatabase (P)), earlier than GeoDatabase (O), is introduced to help detect the change type of Reappearance. For Fn, Fo, and Fp, which are any objects derived from ChangeCollection (N), ChangeCollection (O), and GeoDatabase (P), respectively, the rules to infer the nine basic types of changes of single area objects are defined as follows, also shown in Figure 3.
  • Rule 1: If there is no intersection between the area object Fo and any object Fn within the same geographical theme, the change of Fo is classified as Vanish.
    ∃Fo ∈ ChangeCollection (O), ∀Fn ∈ ChangeCollection (N): if (Fo ∩ Fn = Ø) then ChangeType (Fo) ← Vanish.
  • Rule 2: If there is no intersection between the area object Fn and any object Fo within the same geographical theme, the change of Fn is classified as Appearance.
    ∃Fn ∈ ChangeCollection (N), ∀Fo ∈ ChangeCollection (O): if (Fn ∩ Fo = Ø) then ChangeType (Fn) ← Appearance.
  • Rule 3: If Fp and Fn are matched, and the types of changes of Fp and Fn are Vanish and Appearance, respectively, then the change of Fn is classified as Reappearance.
    if ((Matching (Fp, Fn) = True) and (ChangeType (Fp) = Vanish) and (ChangeType (Fn) = Appearance)) then ChangeType (Fn) ← Reappearance.
  • Rule 4: If geometrical features of Fo and Fn are determined to be unchanged and the attribute information has changed, then the change of the object is classified as AttributeChange.
    if ((IsAttributeMatch = False) and (PositionResult ≥ φ1) and (AreaResult ≥ φ2) and (ShapeDirectionResult ≥ φ3)) then ChangeType (Fo → Fn) ← AttributeChange.
  • Rule 5: If the area features of Fn and Fo are determined to have changed, with the area of Fn decreased from that of Fo, while the other discriminant indexes are unchanged, then the change of the object is classified as Shrink.
    if ((IsAttributeMatch = True) and (PositionResult ≥ φ1) and (AreaResult < φ2) and (ShapeDirectionResult ≥ φ3) and (S(Fn) < S(Fo))) then ChangeType (Fo → Fn) ← Shrink.
  • Rule 6: If the area features of Fn and Fo are determined to have changed, with the area of Fn increased from that of Fo, while the other discriminant indexes are unchanged, then the change of the object is classified as Expansion.
    if ((IsAttributeMatch = True) and (PositionResult ≥ φ1) and (AreaResult < φ2) and (ShapeDirectionResult ≥ φ3) and (S(Fn) > S(Fo))) then ChangeType (Fo → Fn) ← Expansion.
  • Rule 7: If the position features of Fn and Fo are determined to have changed while the other discriminant indexes are unchanged, then the change of the object is classified as Translation.
    if ((IsAttributeMatch = True) and (PositionResult < φ1) and (AreaResult ≥ φ2) and (ShapeDirectionResult ≥ φ3)) then ChangeType (Fo → Fn) ← Translation.
  • Rule 8: If the direction features of Fn and Fo are determined to have changed while the other discriminant indexes are unchanged, then the change of the object is classified as Rotation.
    if ((IsAttributeMatch = True) and (PositionResult ≥ φ1) and (AreaResult ≥ φ2) and (ShapeDirectionResult < φ3) and (AssistResult ≥ φ4)) then ChangeType (Fo → Fn) ← Rotation.
  • Rule 9: Types of changes other than those stated in Rules 1–8 are defined as Deformation.
    ∀Fo ∈ ChangeCollection (O), ∀Fn ∈ ChangeCollection (N): if (ChangeType ∉ (Rule 1 ∪ Rule 2 ∪ Rule 3 ∪ Rule 4 ∪ Rule 5 ∪ Rule 6 ∪ Rule 7 ∪ Rule 8)) then ChangeType (Fo → Fn) ← Deformation.
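Rules 4–9 operate on a single candidate pair and can be summarized in code. The following Python sketch (function and parameter names are our own, not the paper's implementation) mirrors the decision order of Figure 3; Rules 1–3 are omitted because they require the full object sets and the earlier-period database GeoDatabase (P):

```python
def classify_change(attr_match, pos, area, shape, assist, s_new, s_old,
                    phi1, phi2, phi3, phi4):
    """Apply Rules 4-9 to one changed pair (Fo, Fn).

    pos, area, shape, assist are the results of O_centroid, O_area,
    O_angle, and O_Form; phi1..phi4 are their thresholds; s_new and s_old
    are the areas S(Fn) and S(Fo).
    """
    if not attr_match and pos >= phi1 and area >= phi2 and shape >= phi3:
        return 'AttributeChange'                            # Rule 4
    if attr_match and pos >= phi1 and shape >= phi3 and area < phi2:
        return 'Shrink' if s_new < s_old else 'Expansion'   # Rules 5-6
    if attr_match and pos < phi1 and area >= phi2 and shape >= phi3:
        return 'Translation'                                # Rule 7
    if (attr_match and pos >= phi1 and area >= phi2
            and shape < phi3 and assist >= phi4):
        return 'Rotation'                                   # Rule 8
    return 'Deformation'                                    # Rule 9 (fallback)
```

Using the experimental thresholds 0.85, 0.85, 0.71, and 0.95 from Section 3, a pair with matched attributes, position, and shape but a reduced area is classified as Shrink, while a pair failing only the shape operator yet passing the auxiliary form-factor check is classified as Rotation.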

3. Experimental Test

To verify the feasibility of the method proposed in this study, an experiment was implemented in C# with ArcGIS Engine. Large-scale thematic maps of buildings in the same area of Fangshan District, Beijing, during different periods were selected for the discrimination experiment. Because few real-world cases contain the rotation and translation change types, and because detecting reappearance requires change data from three periods, simulated buildings with translation and rotation changes were added to the extracted set of changed objects to fully demonstrate the feasibility of the method.
In the algorithm implementation, the threshold settings of the discriminant operators directly affect the discriminating effect of the method. The OTSU (maximization of interclass variance) algorithm is one of the most popular automatic threshold selection methods and is widely applied in image segmentation because of its simplicity and intuitiveness. When the object and the background in an image obey normal distributions, a global threshold can be selected from the histogram statistics of the whole image [3,35]. In this study, we found that the sampled values of each operator fit the assumptions of the OTSU method. Therefore, to obtain the optimal thresholds, an adaptive threshold calculation method for geometric matching based on OTSU was used. With this method, the thresholds of the four operators (O centroid, O area, O angle, and O Form) used to determine the types of changes were 0.85, 0.85, 0.71, and 0.95, respectively. A hierarchical matching experiment was carried out using these thresholds, and a weighted matching was performed simultaneously among the objects meeting the centroid matching conditions.
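The paper does not list its OTSU-based implementation; the following Python sketch of Otsu's between-class variance maximization over a one-dimensional sample of operator similarity values (function name, bin count, and histogram details are our own assumptions) illustrates the threshold selection step:

```python
def otsu_threshold(values, bins=256):
    """Otsu's method on a 1D sample: pick the cut maximizing the
    between-class variance w0 * w1 * (mu0 - mu1)^2 of the two classes."""
    lo, hi = min(values), max(values)
    width = (hi - lo) / bins or 1.0          # guard against constant input
    hist = [0] * bins
    for v in values:
        hist[min(int((v - lo) / width), bins - 1)] += 1
    total = len(values)
    total_sum = sum((lo + (i + 0.5) * width) * h for i, h in enumerate(hist))
    best_t, best_var = lo, -1.0
    w0 = sum0 = 0.0
    for i, h in enumerate(hist):
        w0 += h                              # class-0 weight
        sum0 += (lo + (i + 0.5) * width) * h # class-0 value sum
        w1 = total - w0
        if w0 == 0 or w1 == 0:
            continue
        mu0, mu1 = sum0 / w0, (total_sum - sum0) / w1
        var = w0 * w1 * (mu0 - mu1) ** 2     # between-class variance
        if var > best_var:
            best_var, best_t = var, lo + (i + 1) * width
    return best_t
```

On a clearly bimodal sample of operator values, the returned threshold falls between the two modes, separating matched from mismatched pairs.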
Additionally, in order to better measure and compare the experimental results, integrity (Ri) and accuracy (Ra) were chosen as the indices to evaluate the performance of the algorithm. The number of objects with correct results detected by this algorithm and the number of objects in the old and new databases with actual corresponding relationships with each other (i.e., matching relationships or the actual types of changes) are represented by t and T respectively, with R denoting the number of objects with corresponding relationships determined by the algorithm. The calculation method is as follows:
Ri = t/T
Ra = t/R
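The two evaluation indices are straightforward ratios; as a small worked example (the figures 93, 100, and 97 below are illustrative, not the paper's results):

```python
def evaluate(t, T, R):
    """Integrity Ri = t / T and accuracy Ra = t / R, as defined above."""
    return t / T, t / R

# e.g. 93 correct detections, 100 actual relations, 97 relations reported
ri, ra = evaluate(93, 100, 97)
```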

3.1. Incremental Information Extraction

Comparative analysis of the experimental results shown in Figure 4 and Table 2 shows that the area object matching method based on hierarchical matching not only achieves high accuracy and stability, but is also superior to the weighted algorithm in terms of matching integrity and accuracy. From a theoretical point of view, there are cases where some indices of the weighted matching method are too large or too small but the final weighted result still satisfies the set threshold, which was also verified by the experiment.
A further analysis was carried out with the pairs MD1_32 and MD2_30 (MD1_32 meaning the object in the old-version candidate dataset MD1 with ID 32, and so on; MD2 being the new-version candidate dataset), MD1_33 and MD2_32, and MD1_65 and MD2_47 as cases, as shown in Table 3. The analysis indicates that when two of the three indices are large while the other is small, or vice versa, the weighted matching method matches the two objects with each other while the hierarchical matching method does not. In fact, these objects are mismatched: the first two pairs each contain an object with a shape mutation, and the last pair shows a significant difference in positional distance, as shown in Figure 5. The reason is that the weighted matching result cannot reliably reflect the overall matching probability of the geometric features of area objects when objects match only some geometric features. The above analysis not only explains why the hierarchical matching method is superior to the weighted matching method, but also points out the shortcomings of the weighted method.
To verify the reliability of the method, four further datasets from different study areas were selected for the matching experiment. As shown in Table 4, as the number of matching objects in the study area increases, the integrity and accuracy of both the hierarchical and the weighted matching methods remain within a certain range, and the hierarchical matching method outperforms the weighted algorithm in both integrity and accuracy. This indicates that both methods have good matching ability, with the hierarchical matching method being the better of the two.

3.2. Detection of Change Types

The detection results obtained by the algorithm were compared with the actual types of changes, accompanied by a precision analysis. As the data samples in Figure 6 and the results in Table 5 imply, the accuracy and integrity of discriminating rotation and deformation remain between about 89% and 93%, while the accuracies of detecting the other types of changes mostly reach 100%.
Similarly, to verify the reliability and robustness of the method, four further datasets at different scales were selected for the type-of-change detection experiment. As shown in Table 6, both the integrity (Ri) and accuracy (Ra) of the method are relatively high, reaching above 90%, indicating that the proposed detection method can accurately discriminate the types of changes.
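A simplified sketch of how a matching relationship could be mapped to one of the nine change types named in Table 5 is given below. The cardinality tests and similarity thresholds are illustrative assumptions for the sake of a runnable example, not the paper's exact decision rules:

```python
def classify_change(n_old, n_new, area_ratio=None, angle_sim=None,
                    shape_sim=None, attr_equal=True):
    """Map a correspondence to a change type (illustrative rules only).

    n_old / n_new: how many old / new objects participate in the correspondence.
    area_ratio: new area / old area; angle_sim, shape_sim: similarity in [0, 1].
    The 0.9 / 1.1 thresholds are assumed, not the paper's calibrated values.
    """
    if n_old == 1 and n_new == 0:
        return "vanish"
    if n_old == 0 and n_new == 1:
        return "appearance"
    if n_old == 1 and n_new == 1:
        if not attr_equal:
            return "attribute change"
        if area_ratio is not None and area_ratio < 0.9:
            return "shrink"
        if area_ratio is not None and area_ratio > 1.1:
            return "expansion"
        if angle_sim is not None and angle_sim < 0.9:
            return "rotation"
        if shape_sim is not None and shape_sim < 0.9:
            return "deformation"
        return "translation"
    return "unclassified"  # e.g., reappearance needs history across versions

print(classify_change(1, 0))                                  # vanish
print(classify_change(1, 1, area_ratio=1.5))                  # expansion
print(classify_change(1, 1, area_ratio=1.0, angle_sim=0.6))   # rotation
```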

4. Discussion and Conclusions

The detection and extraction of information about object change provides the basis for incrementally updating spatial databases. When spatial objects' change process is considered, complete information about object change comprises both the extraction of the changed objects and the detection of their types of changes. Our goals were to detect the change types of spatial area objects while extracting incremental information, so as to track objects' spatiotemporal change patterns and provide a change backtracking mechanism for incrementally updating spatial datasets, and to address problems such as the difficulty of detecting types of changes, the high complexity of detection methods, the high redundancy rate of detection factors, and the low degree of automation during the incremental update of the spatial database. To this end, we proposed a method based on a hierarchical matching model to detect the change types of area objects in an integrated way. Using hierarchical matching operators that reflect different geometric features of area objects, the method orders the operators by complexity from low to high, creating hierarchical matching rules to extract incremental information. On this basis, we established rules to judge the types of changes of objects from the relevant matching operators, and designed a process to detect the nine types of changes of area objects while minimizing the complexity of the algorithm and the redundancy rate of detection factors. To track the spatial objects' change process, we discussed the identification, extraction, and database entry of the types of changes of area objects during incremental information extraction, achieved a close connection and organic coupling of incremental information extraction and object type-of-change detection, and ensured the accuracy and integrity of incremental information detection.
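The complexity advantage of ordering operators from cheap to expensive can be sketched as a short-circuiting cascade. The synthetic score functions and the 0.8 threshold below are assumptions; only the structure (later, costlier operators run solely for pairs that survive the earlier levels) follows the method, mirroring the m*n versus k1, k2 evaluation counts in Table 1:

```python
import random

random.seed(0)

def make_operator(name, calls):
    """Stand-in for a matching operator; real ones would compare geometry."""
    def op(old_id, new_id):
        calls[name] += 1
        return random.random()  # synthetic similarity score in [0, 1)
    return op

def hierarchical_match(old_ids, new_ids, threshold=0.8):
    # Operators ordered by computational cost: centroid distance is cheapest,
    # the direction-angle comparison the most expensive.
    calls = {"centroid": 0, "area": 0, "angle": 0}
    ops = [make_operator(n, calls) for n in ("centroid", "area", "angle")]
    matches = []
    for i in old_ids:
        for j in new_ids:
            # all() short-circuits: area runs only if centroid passed,
            # angle only if both passed -- this keeps k1, k2 << m*n.
            if all(op(i, j) >= threshold for op in ops):
                matches.append((i, j))
    return matches, calls

matches, calls = hierarchical_match(range(20), range(20))
print(calls)  # centroid evaluated 20*20 = 400 times; area and angle far fewer
```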
The experimental results show that this method can successfully detect incremental information about area objects in practical applications, with the overall accuracy reaching above 90%, which is much higher than that of the existing weighted matching method, making it quite feasible and applicable. It helps establish the corresponding relation between new-version and old-version objects, and facilitates the backtracking of historical data and the maintenance of spatial data.
However, this study only explored the detection of the types of changes of simple area objects at the same scale. Further work is needed on adaptive thresholds in the algorithm implementation, on methods for examining the types of changes of complex objects across different scales, and on more varied test datasets to demonstrate the robustness of the method. Meanwhile, owing to the limits of objective conditions, we compared the method only with the existing weighted matching method and did not test it on non-convex or multi-part polygons; the method's sensitivity to these cases remains an omission of this study and needs to be addressed. Moreover, because of uncertainty and possibly fuzzy definitions, a vanished object may still intersect another object of the same kind, so the existing rules for extracting incremental information are not robust enough for real-world data; this should also be addressed in future work.

Acknowledgments

This work is supported by the National Natural Science Foundation of China (Nos. 41771157, 41371375), as well as by a research project of the Beijing Municipal Education Committee (No. KM201810028014). We also thank the anonymous referees for their helpful suggestions.

Author Contributions

Yanhui Wang designed the research flow and wrote the manuscript. Qisheng Zhang performed the data analysis of the study. Hongliang Guan contributed significantly to the conception of the study and constructive discussion. All authors read and approved the final manuscript.

Conflicts of Interest

The authors declare that they have no conflict of interest.

Figure 1. The incremental information extraction process.
Figure 2. Illustration of extracting the change information based on hierarchical feature matching.
Figure 3. Illustration of type-of-change detection.
Figure 4. Illustration of hierarchical matching result (a) and weighted matching result (b), respectively (the highlighted represents matched objects).
Figure 5. Case comparison of matching objects from hierarchical matching and weighted algorithm.
Figure 6. Illustration of change samples from both old and new dataset.
Table 1. Comparison of calculation times of sub-operators from the two algorithms.

| Algorithm | Calculation Times of O_centroid | Calculation Times of O_area | Calculation Times of O_angle |
|---|---|---|---|
| Weighted matching algorithm | m*n | m*n | m*n |
| Hierarchical matching algorithm | m*n | k1 | k2 |
Table 2. Comparison of experimental results from the study area.

| Matching Method | Number of Samples | R | t | T | Ri | Ra |
|---|---|---|---|---|---|---|
| Hierarchical matching | 487 | 409 | 384 | 424 | 90.6% | 93.9% |
| Weighted matching | 487 | 432 | 363 | 424 | 85.6% | 84.0% |
Table 3. Matching analysis of object instances.

| Corresponding ID | O_centroid | O_area | O_angle | Weight of Weighted Matching Method | Hierarchical Matching Detection | Weighted Matching Detection |
|---|---|---|---|---|---|---|
| MD1_32/MD2_30 | 0.999 | 0.979 | 0.513 | 0.830 | No | Yes |
| MD1_33/MD2_32 | 0.998 | 0.965 | 0.495 | 0.819 | No | Yes |
| MD1_65/MD2_47 | 0.781 | 0.753 | 0.986 | 0.840 | No | Yes |
Table 4. Matching results in different study areas.

| No. of Study Area | Number of Samples | Hierarchical Ri (%) | Hierarchical Ra (%) | Weighted Ri (%) | Weighted Ra (%) |
|---|---|---|---|---|---|
| 1 | 313 | 91.7 | 97.6 | 87.2 | 88.9 |
| 2 | 696 | 94.2 | 94.3 | 84.7 | 86.3 |
| 3 | 1092 | 92.9 | 95.1 | 82.9 | 87.4 |
| 4 | 1721 | 93.4 | 95.6 | 86.1 | 86.7 |
| Overall | 3822 | 93.3 | 95.3 | 85.0 | 87.2 |
Table 5. Detection results of change type in the study area.

| Change Type | R | t | T | Ra | Ri |
|---|---|---|---|---|---|
| Vanish | 24 | 24 | 24 | 100% | 100% |
| Appearance | 37 | 37 | 37 | 100% | 100% |
| Reappearance | - | - | - | 100% | 100% |
| Attribute change | 21 | 21 | 21 | 100% | 100% |
| Shrink | 11 | 11 | 11 | 100% | 100% |
| Expansion | 46 | 46 | 46 | 100% | 100% |
| Translation | 4 | 4 | 4 | 100% | 100% |
| Rotation | 27 | 25 | 28 | 92.6% | 89.3% |
| Deformation | 43 | 39 | 44 | 90.7% | 88.6% |
| Overall | 213 | 207 | 215 | 97.2% | 96.2% |
Table 6. Detection results of change type in different study areas.

| No. of Study Area | R | t | T | Ra (%) | Ri (%) |
|---|---|---|---|---|---|
| 1 | 230 | 209 | 224 | 90.9 | 93.3 |
| 2 | 613 | 584 | 621 | 95.3 | 94.0 |
| 3 | 874 | 809 | 863 | 92.6 | 93.7 |
| 4 | 1203 | 1136 | 1178 | 94.4 | 96.4 |
| Overall | 2920 | 2738 | 2886 | 93.8 | 94.9 |

© 2018 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (http://creativecommons.org/licenses/by/4.0/).