Article

The Assessment of More Suitable Image Spatial Resolutions for Offshore Aquaculture Areas Automatic Monitoring Based on Coupled NDWI and Mask R-CNN

1 Key Laboratory of Regional Ecology and Environmental Change, School of Geography and Information Engineering, China University of Geosciences, Wuhan 430074, China
2 Hubei Key Laboratory of Yangtze Catchment Environmental Aquatic Science, China University of Geosciences, Wuhan 430074, China
3 United Center for Eco-Environment in Yangtze River Economic Belt, Chinese Academy of Environmental Planning, Beijing 100012, China
4 MNR Key Laboratory for Geo-Environmental Monitoring of Great Bay Area & Guangdong Key Laboratory of Urban Informatics, Shenzhen University, Shenzhen 518060, China
5 GEOXAIR (Fujian) Technology Co., Ltd., Fuzhou 350003, China
* Author to whom correspondence should be addressed.
Remote Sens. 2022, 14(13), 3079; https://doi.org/10.3390/rs14133079
Submission received: 6 April 2022 / Revised: 27 May 2022 / Accepted: 1 June 2022 / Published: 27 June 2022
(This article belongs to the Section Remote Sensing in Agriculture and Vegetation)

Abstract

Wide-scale automatic monitoring of aquaculture areas based on the Normalized Difference Water Index (NDWI) and the Mask Region-based Convolutional Neural Network (Mask R-CNN) applied to remote sensing images is of great significance for the management of aquaculture areas. However, different spatial resolutions entail different costs and different model performance. To find the more suitable image spatial resolutions for automatic monitoring of offshore aquaculture areas, remote sensing images at seven spatial resolutions, from 2 m and 4 m up to 50 m, covering the Sandu’ao area of China were compared. The results show that, when cost is not considered, images with a resolution of 15 m or finer achieve an adequate recognition effect, with F1 scores above 0.75. By establishing a cost-effectiveness evaluation formula that jointly considers image price and recognition effect, the best image resolution for different scenarios can be identified, providing the most appropriate data scheme for the automatic monitoring of offshore aquaculture areas.

Graphical Abstract

1. Introduction

Global aquaculture production has increased by 7.5% per year since 1970, creating economic benefits but also posing significant environmental challenges [1] and causing severe pollution of the marine environment [2]. In recent years, owing to excessive aquaculture, water pollution has become increasingly serious in mariculture areas. Many countries, such as New Zealand [3], China [4], and Turkey [5], face problems with pollution from excessive aquaculture. Protecting the marine ecological environment, properly planning aquaculture areas, and avoiding over-farming are essential for mariculture management. This requires real-time, high-precision monitoring data of aquaculture areas to help managers quickly discover changes in offshore aquaculture areas and carry out aquaculture plans [6].
Coupling the Normalized Difference Water Index (NDWI) with the Mask Region-based Convolutional Neural Network (Mask R-CNN) makes it possible to identify aquaculture areas in remote sensing images quickly and accurately, analyze their temporal and spatial changes, explore aquaculture patterns, and provide early warning of illegal aquaculture in prohibited and restricted areas; it is therefore an essential means for automatic monitoring of aquaculture areas. However, the resolution of the remote sensing images greatly influences the recognition effect. At present, the primary remote sensing data sources used to identify mariculture areas are multispectral satellite images and microwave remote sensing images [7]. The multispectral images mainly come from SPOT, GF-1, GF-2, and Landsat [4,8,9,10,11], while the microwave images mainly come from Radarsat-2, GF-3, and Sentinel-1, sometimes combined with Sentinel-2 optical data [12,13,14,15,16,17]. The spatial resolutions of these data sources differ considerably, and so do their prices and the effects they can achieve. For example, GF-1 reaches a spatial resolution of 2 m, but one scene costs about USD 230, whereas Landsat-8 has a spatial resolution of 30 m and can be obtained free of charge. The higher the resolution of the data, the higher the monitoring accuracy: higher-resolution data reflect changes in the mariculture area in more detail, but they are also more expensive and more time-consuming to process. Nevertheless, many current studies have detected mariculture areas without considering how image price and resolution affect the analysis and comparison of results [9,18,19]. There are mainly two kinds of marine aquaculture areas: the raft culture area (RCA) and the cage culture area (CCA) [20], whose individual sizes range from a few meters to tens of meters. Because a large amount of image data is needed to monitor changes in aquaculture areas in near real-time, it is necessary to balance the price and practical effect of remote sensing images.
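For reference, the NDWI used for water masking is, in its common McFeeters form, computed from the green and near-infrared bands as NDWI = (Green - NIR) / (Green + NIR); values above zero generally indicate open water. A minimal NumPy sketch is given below; the band choice and the zero threshold are illustrative assumptions, not necessarily the exact variant used by the authors.

```python
import numpy as np

def ndwi(green: np.ndarray, nir: np.ndarray) -> np.ndarray:
    """McFeeters NDWI = (Green - NIR) / (Green + NIR), valued in [-1, 1]."""
    green = green.astype(np.float32)
    nir = nir.astype(np.float32)
    denom = green + nir
    # Guard against division by zero over no-data pixels.
    return np.where(denom == 0, 0.0, (green - nir) / denom)

# Hypothetical usage with two co-registered single-band arrays:
# water_mask = ndwi(green_band, nir_band) > 0
```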
To address these problems, this study used remote sensing images with different resolutions to carry out experiments tailored to the characteristics of offshore aquaculture areas and established cost-effectiveness evaluation formulas for different scenarios, in order to identify the image resolution most suitable for monitoring aquaculture areas.

2. Materials and Methods

2.1. Study Area and Data Sources

2.1.1. Study Area

The mariculture area of Sandu’ao in Fujian Province was selected as the study area. The area ranges from 119°28′8″ to 120°9′44″E and from 26°21′34″ to 27°0′24″N, as shown in Figure 1. Fujian Province is located on the southeast coast of China, with a sea area of 136,000 square kilometers, a land coastline of 3752 km, and numerous harbors. There are six deep-water harbors from north to south, including Shacheng Port, Sandu’ao, Luoyuan Bay, Meizhou Bay, Xiamen Port, and Dongshan Bay. A total of 366,000 plastic fishing rafts and more than 3100 deep-water storm-resistant cages have been built. In 2018, the total output of the top ten aquatic products was 3.38 million tons, and the output of large yellow croaker, abalone, kelp, laver, and oyster ranked first in China [21]. However, because the status of local aquaculture areas cannot be monitored in real-time, problems such as over-cultivation and encroachment on non-farming areas are common, resulting in severe marine pollution [22]. Therefore, in order to meet the government’s demand for fishery development planning and to build a modern fishing port system with a beautiful environment and orderly management, it is necessary to carry out dynamic monitoring of this area.

2.1.2. Data Sources

The GF-1 remote sensing image data were used in this study. The data sources and contents are shown in Table 1.
In the stage of model training and validation, experiments were carried out using GF-1 remote sensing image data of Sandu’ao mariculture area. The image was acquired on 13 June 2020, and the spatial resolution after image fusion processing was 2 m.

2.2. Method

2.2.1. Data Pre-Processing

Production of Training and Test Sets

The GF-1 remote sensing images were cropped into 80 samples of 500 × 500 pixels, 64 of which were used for training and 16 for testing. The boundaries of the RCA and CCA in the samples were finely delineated by manual visual interpretation to obtain the ground truth, in which colored blocks denote aquaculture areas and black denotes the background. As the GF-1 image has a 10-bit depth and four bands, it was converted into an 8-bit RGB image for input to the network.
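A minimal sketch of this pre-processing is given below. It assumes the scene is held as a NumPy array of shape (bands, height, width) with 10-bit values and the GF-1 band order blue, green, red, NIR; the band order and the simple linear rescaling are illustrative assumptions rather than the authors' exact pipeline.

```python
import numpy as np

def to_8bit_rgb(img_10bit: np.ndarray) -> np.ndarray:
    """Convert a 10-bit, 4-band array (bands, H, W) to an 8-bit RGB image (H, W, 3).

    Assumes band order blue, green, red, NIR and a linear rescale from
    the 10-bit range [0, 1023] to [0, 255].
    """
    rgb = img_10bit[[2, 1, 0], :, :].astype(np.float32)   # select R, G, B
    rgb = np.clip(rgb / 1023.0 * 255.0, 0, 255).astype(np.uint8)
    return np.transpose(rgb, (1, 2, 0))

def crop_tiles(img: np.ndarray, size: int = 500):
    """Yield non-overlapping size x size tiles from an (H, W, C) image."""
    h, w = img.shape[:2]
    for y in range(0, h - size + 1, size):
        for x in range(0, w - size + 1, size):
            yield img[y:y + size, x:x + size]
```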
Due to the small size of the dataset, data augmentation was carried out to prevent overfitting during training, improve the robustness of the classifier to different sensors, atmospheric conditions, and lighting conditions, and improve the generalization ability of the model. By adding Gaussian noise, applying Gaussian blur, and adjusting the contrast, the 64 training images were expanded to simulate the image state under different influences, increasing the number of training samples to 256; the ground truth corresponding to the augmented samples was generated accordingly.
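The three perturbations can be implemented with NumPy and SciPy as sketched below; the noise standard deviation, blur sigma, and contrast factor are illustrative values, not those reported by the authors.

```python
import numpy as np
from scipy.ndimage import gaussian_filter

rng = np.random.default_rng(0)

def add_gaussian_noise(img: np.ndarray, sigma: float = 10.0) -> np.ndarray:
    """Add zero-mean Gaussian noise to an 8-bit (H, W, C) image."""
    noisy = img.astype(np.float32) + rng.normal(0.0, sigma, img.shape)
    return np.clip(noisy, 0, 255).astype(np.uint8)

def gaussian_blur(img: np.ndarray, sigma: float = 1.5) -> np.ndarray:
    """Blur each channel independently (no blurring across channels)."""
    blurred = gaussian_filter(img.astype(np.float32), sigma=(sigma, sigma, 0))
    return np.clip(blurred, 0, 255).astype(np.uint8)

def adjust_contrast(img: np.ndarray, factor: float = 1.3) -> np.ndarray:
    """Scale pixel deviations from the per-channel mean by 'factor'."""
    mean = img.astype(np.float32).mean(axis=(0, 1), keepdims=True)
    out = (img.astype(np.float32) - mean) * factor + mean
    return np.clip(out, 0, 255).astype(np.uint8)

# Each original tile plus its three perturbed copies: 64 x 4 = 256 samples.
```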

Resampling of Multi-Resolution Data

In order to investigate the influence of different resolutions on the monitoring effect, the 2 m resolution images were down-sampled. According to the spatial resolutions of the satellite images commonly used in this kind of research, lower-resolution images of 4 m, 10 m, 15 m, 20 m, 30 m, and 50 m were simulated using bilinear down-sampling, and the ground truth corresponding to these samples was generated.
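A simple way to simulate a coarser sensor from the 2 m data is bilinear rescaling, as sketched below; this ignores the true point-spread function of the target sensor and is only an approximation.

```python
import numpy as np
from scipy.ndimage import zoom

def simulate_resolution(img: np.ndarray, native_res: float, target_res: float) -> np.ndarray:
    """Down-sample an (H, W, C) image from native_res (m/pixel) to target_res (m/pixel).

    Bilinear interpolation (order=1); e.g. 2 m -> 30 m shrinks each side by 15x.
    """
    scale = native_res / target_res   # < 1 when coarsening
    return zoom(img, (scale, scale, 1), order=1)

# coarse_tile = simulate_resolution(tile_2m, native_res=2.0, target_res=30.0)
```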

2.2.2. Model Training and Validation

In order to make full use of existing resources and improve model performance, transfer learning was adopted: the model was initialized with weights pre-trained on the large public COCO dataset [23]. On this basis, the training samples and ground truth were input into the model for fine-tuning.
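The paper does not specify the deep learning framework used. As one illustrative possibility, the sketch below adapts a COCO-pretrained Mask R-CNN from torchvision to the two aquaculture classes (plus background) by replacing the box and mask heads; the framework choice and head sizes are assumptions, not the authors' implementation.

```python
import torchvision
from torchvision.models.detection.faster_rcnn import FastRCNNPredictor
from torchvision.models.detection.mask_rcnn import MaskRCNNPredictor

def build_model(num_classes: int = 3):  # background + raft (RCA) + cage (CCA)
    # Start from COCO-pretrained weights (transfer learning).
    model = torchvision.models.detection.maskrcnn_resnet50_fpn(pretrained=True)

    # Replace the box classification head for the new class count.
    in_feat = model.roi_heads.box_predictor.cls_score.in_features
    model.roi_heads.box_predictor = FastRCNNPredictor(in_feat, num_classes)

    # Replace the mask prediction head as well.
    in_mask = model.roi_heads.mask_predictor.conv5_mask.in_channels
    model.roi_heads.mask_predictor = MaskRCNNPredictor(in_mask, 256, num_classes)
    return model
```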
The model effect was quantified using the test set: the 16 aquaculture area recognition results were compared with the ground truth. The precision, recall, and F1 score were calculated with pixels as the basic unit, which differs from the usual evaluation of object detection algorithms [24], to assess the model performance on the two categories of aquaculture area. Precision evaluates the proportion of pixels assigned to a category that are correct; recall evaluates the proportion of a category's ground-truth pixels that were correctly recognized. The F1 score is an accuracy index calculated as the harmonic mean of precision and recall; the harmonic mean is better suited than the arithmetic mean to combining ratios such as precision and recall. These three indicators are defined as follows:
$\mathrm{Precision} = \dfrac{TP}{TP + FP}$ (1)
$\mathrm{Recall} = \dfrac{TP}{TP + FN}$ (2)
$F1\ \mathrm{score} = \dfrac{2 \times \mathrm{Precision} \times \mathrm{Recall}}{\mathrm{Precision} + \mathrm{Recall}}$ (3)
where TP is the number of pixels correctly identified as belonging to the class in the ground truth, FP is the number of pixels identified as the class that do not belong to it in the ground truth, and FN is the number of ground-truth pixels of the class that were not detected.
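A minimal sketch of this pixel-wise evaluation for a single class, assuming boolean prediction and ground-truth masks, is shown below.

```python
import numpy as np

def pixel_metrics(pred: np.ndarray, truth: np.ndarray):
    """Pixel-wise precision, recall, and F1 for one class.

    pred and truth are boolean masks of identical shape (True = class pixel).
    """
    tp = np.logical_and(pred, truth).sum()
    fp = np.logical_and(pred, ~truth).sum()
    fn = np.logical_and(~pred, truth).sum()
    precision = tp / (tp + fp) if tp + fp else 0.0
    recall = tp / (tp + fn) if tp + fn else 0.0
    f1 = 2 * precision * recall / (precision + recall) if precision + recall else 0.0
    return precision, recall, f1
```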

2.2.3. Cost-Effectiveness Evaluation

To balance the cost against the monitoring effect in the automatic monitoring of aquaculture areas, a cost-effectiveness evaluation formula was established to determine the image resolution most suitable for monitoring aquaculture areas.

Cost Analysis

The cost of automatic monitoring mainly includes the time consumed and the cost of purchasing remote sensing images of the monitored areas. The time consumed includes the time to train the identification model and the time to identify the aquaculture areas. Considering that the model only needs to be trained the first time it is used and does not need to be retrained in subsequent monitoring, and that the automatic monitoring of aquaculture areas has modest real-time requirements, the cost of automatic monitoring is defined as the cost of purchasing images per square kilometer in USD, recorded as C.

Effectiveness Analysis

The precision, recall, and F1 score are used to evaluate the recognition effect for aquaculture areas. Since the F1 score synthesizes the first two indicators, the effectiveness of automatic monitoring is defined as the average F1 score of the two types of aquaculture areas, recorded as E.

Comprehensive Performance Evaluation

The cost and effectiveness of automatic monitoring in aquaculture areas are combined to evaluate the comprehensive performance of different remote sensing images used for automatic monitoring, recorded as P. The calculation formula is:
$P = \beta E - \alpha C$ (4)
where α and β are positive weight parameters; the cost term is subtracted so that a larger P indicates better comprehensive performance. Different parameters can be set for different scenarios. The best image resolution for monitoring the aquaculture area is obtained by calculating and comparing the comprehensive performance of the candidate images.
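A minimal sketch of this scoring, using the sign convention in Formula (4) and the scenario weights that appear later in Table 3, is given below; the example values correspond to a 5 m SPOT image.

```python
def performance(cost: float, effect: float, alpha: float, beta: float) -> float:
    """Comprehensive performance P = beta * E - alpha * C (higher is better)."""
    return beta * effect - alpha * cost

# Scenario weights reproduced from Table 3 of this article:
scenarios = {
    "price first":  (0.30, 1.0),
    "effect first": (0.06, 2.0),
    "balanced":     (0.06, 1.0),
}

# Example: a 5 m image at USD 1.417/km2 with a mean F1 of 0.858.
alpha, beta = scenarios["price first"]
print(performance(1.417, 0.858, alpha, beta))   # ~0.433, as in Table 3
```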

3. Results

3.1. Model Validation

The model training ended after 100 epochs, then evaluation indicators were calculated quantitatively by using the test set, as shown in Table 2. The precision, recall, and F1 score of the cage culture area are 0.014, 0.069, and 0.043 higher than those of the raft culture area, respectively, which may be related to the clearer boundaries of the cage culture areas.
The precision, recall, and F1 score for the identification of aquaculture areas in different provinces were in the range of 0.79–0.98, 0.71–1.00, and 0.83–0.91, respectively, in the study of Liu et al. [4], and around 0.89 in the study of Cui et al. [18]. The accuracy of the model constructed in this study reached the level of other current studies, indicating that the model is reliable. In the study of Liu et al. [4], the F1 score of the aquaculture area extracted in Fujian from 15 m resolution data was 0.83; by comparison, the higher-resolution images used in this study yielded correspondingly higher accuracy for the aquaculture area extracted in Sandu’ao, Fujian Province.
Combining NDWI and Mask R-CNN, the extraction results for the aquaculture area obtained from the 2 m resolution data of the Sandu’ao area in June 2020 are shown in Figure 2, in which blue boxes represent RCA and red boxes represent CCA; most of the RCA and CCA were successfully identified. A total of 10,234 raft culture areas were extracted, covering about 66.19 km2, and 4245 cage culture areas were extracted, covering about 25.33 km2. The actual number of aquaculture areas should be higher, owing to the influence of waves or turbid waters and because several aquaculture areas merge into one when the culture density is high. The identification results at several boxed locations in Figure 2 show that this method can identify both types of aquaculture area: the boundaries of the aquaculture areas are accurate, and there are few missed and false detections.
In addition, other lower-resolution images were also used to identify aquaculture areas. Several 15 m resolution Landsat-8 images of the Xiapu area of Sandu’ao were processed with this model, and the results are shown in Figure 3. As can be seen from the figure, the model can identify the overall distribution of aquaculture areas in low-resolution images; however, it cannot accurately reflect their area and quantity, so it is necessary to explore the influence of image resolution on the performance of the model.

3.2. The Impact of Different Resolutions on Model Performance

The model trained with different resolution data was used to identify samples with the corresponding resolutions, and the precision, recall, and F1 score were calculated and are shown in Figure 4.
Figure 4 shows the variation in precision, recall, and F1 score with resolution for the raft culture area and the cage culture area. With decreasing spatial resolution, the F1 score shows an overall downward trend. For the raft culture area, the F1 score is 0.863 at 2 m resolution and drops to 0.433 at 50 m resolution, a decrease of 49.83%, whereas it is 0.855 at 4 m resolution, a decrease of only 0.93%. For the cage culture area, the F1 score is 0.906 at 2 m resolution and decreases to 0.349 at 50 m resolution, a decrease of 61.48%, whereas it is 0.894 at 4 m resolution, a decrease of only 1.32%. The precision and recall of the two culture types likewise span a wide range, with precision decreasing from 0.890 and 0.904 to 0.393 and 0.266, respectively, and recall decreasing from 0.839 and 0.908 to 0.481 and 0.510, respectively.
The results in Figure 4 indicate that the extraction performance for aquaculture areas depends on the spatial resolution of the images used. At finer resolutions (2–4 m), the extraction performance is good and rather stable. At resolutions coarser than 4 m, performance gradually decreases, and at resolutions coarser than 20 m, it drops rapidly. Monitoring the actual situation of the aquaculture area with such low-resolution remote sensing images is difficult or impossible.

3.3. Cost-Effectiveness Analysis of Aquaculture Monitoring

In the automatic monitoring of aquaculture areas, the effect of remote sensing images with different resolutions is quite different, so it is necessary to analyze the cost and effectiveness. Based on the results obtained in the experiment of resolution influence on model performance, the polynomial fitting function was established to obtain the average value of the F1 score for two types of aquaculture areas with different resolutions. The fitting function is:
$y = 1.534 \times 10^{-6} x^{3} - 9.722 \times 10^{-5} x^{2} - 0.009282\, x + 0.9063$ (5)
where x is the spatial resolution of the image and y is the average value of the F1 score for two types of aquaculture areas identified by the image.
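As a quick check, the fitted polynomial can be evaluated with NumPy; the snippet below reproduces the effect values used in Table 3 and notes how such a fit could be re-derived from measured (resolution, mean F1) pairs.

```python
import numpy as np

# Coefficients of the cubic fit above, highest degree first.
coeffs = [1.534e-6, -9.722e-5, -0.009282, 0.9063]

def mean_f1(resolution_m: float) -> float:
    """Estimated mean F1 of the two aquaculture classes at a given resolution (m)."""
    return float(np.polyval(coeffs, resolution_m))

print(mean_f1(2.5), mean_f1(5), mean_f1(10))   # ~0.883, ~0.858, ~0.805 (cf. Table 3)

# A comparable fit could be re-derived from measured points with, e.g.:
# coeffs = np.polyfit(resolutions, mean_f1_values, deg=3)
```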
The SPOT series of satellites, which offers wide coverage and a rich set of spatial resolutions, was selected as the price reference, and the effect at each resolution was obtained from the fitting function. Since the importance of each factor may vary between scenarios, the cost-effectiveness of three types of chargeable images was evaluated under different scenarios, as shown in Table 3.

3.3.1. Price First

Of the two factors, price and effect, price matters more in the price-priority scenario. The weights were set to α = 0.3 and β = 1, and the comprehensive performance of each image was evaluated according to Formula (4), as shown in Table 3. Under price priority, images with 10 m resolution obtain relatively good results at a lower price and are the best choice in this scenario.

3.3.2. Effect First

In the effect-priority scenario, the effect matters more. The weights were set to α = 0.06 and β = 2, and the comprehensive performance of each image was evaluated according to Formula (4), as shown in Table 3. When the effect is given priority, images with 5 m resolution become the best choice, combining an excellent effect with a relatively low price.

3.3.3. Balancing Price with Effect

In the balanced scenario, price and effect are given comparable importance. The weights were set to α = 0.06 and β = 1, and the comprehensive performance of each image was evaluated according to Formula (4), as shown in Table 3. The results show that in this scenario the price and effect of images with 5 m resolution are both moderate, making them the best choice.

4. Discussion

In the recognition of aquaculture areas, as the spatial resolution of the image data decreases, the actual area represented by a single pixel increases and the features of the aquaculture area gradually blur; more and more boundary pixels mix the aquaculture area with the background, resulting in a poorer recognition effect. In addition, there is an inherent trade-off between precision and recall: when more pixels are identified as aquaculture, more of them may be wrong, so precision falls while recall rises; when the model identifies pixels more conservatively, precision rises while recall falls. This explains why precision drops abruptly at 10 m resolution for the cage culture areas, deviating from the overall trend, whereas the F1 score, which combines precision and recall, shows a relatively smooth decreasing trend.
Because the boundary of the cage culture areas is clearer but their features are more complex, while the boundary of the raft culture areas is blurred but their color tone is uniform, the recognition effect for cage culture is slightly better than that for raft culture at high spatial resolution. As the spatial resolution decreases, the boundary information of the aquaculture areas is gradually blurred. As can be seen from Figure 5, the recognition effect for cage culture degrades slightly faster than that for raft culture, and at 20 m resolution two of the indicators fall below 0.7 for both raft culture and cage culture, so the recognition effect can no longer meet the needs of monitoring. Considering the recognition accuracy of the two types of aquaculture area under different resolutions, images with a resolution of 15 m or finer should be selected for automatic monitoring of aquaculture areas when funding is not a constraint, so as to reflect the changes in the aquaculture area.
Because the effect of remote sensing images with different resolutions differs considerably, cost and effect must be analyzed together when cost matters. On this basis, the best images for different scenarios can be identified quantitatively to obtain the best data scheme for automatic monitoring of offshore aquaculture areas. This study explores a cost-effectiveness analysis method for aquaculture area monitoring. In Formula (4), the selection of the parameters took into account the requirements of the relevant government budget and management departments, consulted the opinions of relevant experts and managers, and referred to the research of Zhang et al. [25]. Three parameter selection schemes for different scenarios and requirements were finally obtained. Taking the SPOT series satellites as an example: when focusing on cost, images with 10 m resolution can be selected first, followed by 5 m resolution; when focusing on effect, images with 5 m resolution can be selected first, followed by 2.5 m resolution; when considering performance comprehensively, images with 5 m resolution can be selected first, followed by those with 2.5 m and 10 m resolution. This scheme is proposed for the Sandu’ao area but can be extended to other marine aquaculture areas according to users’ needs. Moreover, this cost-effectiveness analysis method can be extended to other image classification applications, offering a further reference for the selection of image classification schemes.

5. Conclusions

In order to assess the suitable spatial resolution of images for automatic monitoring of offshore aquaculture areas, this study experimented with images of different resolutions and carried out a corresponding cost-effectiveness evaluation to explore the most suitable data scheme for automatic monitoring. First, the identification method coupling NDWI and Mask R-CNN was trained and validated. Secondly, the image resolution required for aquaculture area monitoring was examined by comparing the extraction results at 2 m, 4 m, 10 m, 15 m, 20 m, 30 m, and 50 m resolution; it was found that a resolution of at least 15 m is necessary for aquaculture area extraction. On this basis, taking the SPOT series satellites as an example, the cost-effectiveness of automatic monitoring in aquaculture areas was analyzed and the best image resolution for each scenario was obtained, providing the best data scheme for the automatic monitoring of offshore aquaculture areas.
The method proposed in this study has some limitations. The selection of cost and benefit items is not comprehensive enough; for example, the average F1 score of the two types of aquaculture areas is used directly as the effect term, without weighting their different contributions. Further research in the field of marine aquaculture will be carried out in the future, and the allocation of cost and effect items in automatic monitoring will be explored more comprehensively to obtain a more scientific data scheme and promote the development of aquaculture area monitoring.

Author Contributions

Conceptualization, Y.W. and Y.Z.; Data curation, J.W.; Funding acquisition, T.Z.; Investigation, Y.W.; Methodology, Y.W.; Project administration, W.L. and S.L.; Resources, Y.C.; Software, Y.Z.; Supervision, H.B. and B.W.; Validation, Y.W., Y.C. and J.W.; Visualization, Y.Z.; Writing—Original draft, Y.W.; Writing—Review and editing, Y.Z. and Y.C. All authors have read and agreed to the published version of the manuscript.

Funding

This work was supported by the Open Research Fund Program of MNR Key Laboratory for Geo-Environmental Monitoring of Great Bay Area (SZU51029202010), Key Laboratory of Marine Environmental Survey Technology and Application, Ministry of Natural Resources, China.

Data Availability Statement

Not applicable.

Conflicts of Interest

The authors declare no conflict of interest.

References

  1. FAO. The State of World Fisheries and Aquaculture. Sustainability in Action; FAO: Rome, Italy, 2020. [Google Scholar]
  2. Penczak, T.; Galicka, W.; Molinski, M.; Kusto, E.; Zalewski, M. The Enrichment of a Mesotrophic Lake by Carbon, Phosphorus and Nitrogen from the Cage Aquaculture of Rainbow Trout, Salmo gairdneri. J. Appl. Ecol. 1982, 19, 371–393. [Google Scholar] [CrossRef]
  3. Mcginnis, M.V.; Collins, M. A Race for Marine Space: Science, Values, and Aquaculture Planning in New Zealand. Coast. Manag. 2013, 41, 401–419. [Google Scholar] [CrossRef]
  4. Liu, Y.; Wang, Z.; Yang, X.; Zhang, Y.; Yang, F.; Liu, B.; Cai, P. Satellite-based monitoring and statistics for raft and cage aquaculture in China’s offshore waters. Int. J. Appl. Earth Obs. 2020, 91, 102118. [Google Scholar] [CrossRef]
  5. Demirak, A.; Balci, A.; Tuefekci, M. Environmental impact of the marine aquaculture in Güllük Bay, Turkey. Environ. Monit. Assess. 2006, 123, 1. [Google Scholar] [CrossRef] [PubMed]
  6. Fu, Y.; Ye, Z.; Deng, J.; Zheng, X.; Wang, K. Finer Resolution Mapping of Marine Aquaculture Areas Using WorldView-2 Imagery and a Hierarchical Cascade Convolutional Neural Network. Remote Sens. 2019, 11, 1678. [Google Scholar] [CrossRef] [Green Version]
  7. Xu, Y.; Hu, Z.; Zhang, Y.; Wang, J.; Yin, Y.; Wu, G. Mapping Aquaculture Areas with Multi-Source Spectral and Texture Features: A Case Study in the Pearl River Basin (Guangdong), China. Remote Sens. 2021, 13, 4320. [Google Scholar] [CrossRef]
  8. Chu, J.; Shao, G.; Zhao, J.; Gao, N.; Wang, F.; Cui, B. Information extraction of floating raft aquaculture based on GF-1. Sci. Surv. Mapp. 2020, 45, 92–98. [Google Scholar]
  9. Liu, Y.; Yang, X.; Wang, Z.; Lu, C.; Li, Z.; Yang, F. Aquaculture area extraction and vulnerability assessment in Sanduao based on richer convolutional features network model. J. Oceanol. Limnol. 2019, 37, 1941–1954. [Google Scholar] [CrossRef]
  10. Lin, Q.; Lin, G.; Chen, Z.; Chen, Y. The Analysis on Spatial-temporal Evolution of Beach Cultivation and Its Policy Driving in Xiamen in Recent Two Decades. Geo-Inf. Sci. 2007, 9, 9–13. [Google Scholar]
  11. Lu, X.; Gu, Y.; Wang, X.; Lin, Y.; Zhao, Q.; Wang, K.; Liu, X.; Fei, X. The identification of Porphyra culture area by remote sensing and spatial distribution change and driving factors analysis. Mar. Sci. 2018, 42, 87–96. [Google Scholar]
  12. Zhang, Y.; Wang, C.; Chen, J.; Wang, F. Shape-Constrained Method of Remote Sensing Monitoring of Marine Raft Aquaculture Areas on Multitemporal Synthetic Sentinel-1 Imagery. Remote Sens. 2022, 14, 1249. [Google Scholar] [CrossRef]
  13. Fan, J.; Zhao, J.; An, W.; Hu, Y. Marine Floating Raft Aquaculture Detection of GF-3 PolSAR Images Based on Collective Multikernel Fuzzy Clustering. IEEE J. Sel. Top. Appl. Earth Obs. Remote Sens. 2019, 12, 2741–2754. [Google Scholar] [CrossRef]
  14. Geng, J.; Fan, J.; Wang, H. Weighted Fusion-Based Representation Classifiers for Marine Floating Raft Detection of SAR Images. IEEE Geosci. Remote Sens. Lett. 2017, 14, 444–448. [Google Scholar] [CrossRef]
  15. Hu, Y.; Fan, J.; Wang, J. Target recognition of floating raft aquaculture in SAR image based on statistical region merging. In Proceedings of the 2017 Seventh International Conference on Information Science and Technology (ICIST), Da Nang, Vietnam, 16–19 April 2017. [Google Scholar]
  16. Zhang, Y.; Wang, C.; Ji, Y.; Chen, J.; Deng, Y.; Chen, J.; Jie, Y. Combining Segmentation Network and Nonsubsampled Contourlet Transform for Automatic Marine Raft Aquaculture Area Extraction from Sentinel-1 Images. Remote Sens. 2020, 12, 4182. [Google Scholar] [CrossRef]
  17. Ottinger, M.; Bachofer, F.; Huth, J.; Kuenzer, C. Mapping Aquaculture Ponds for the Coastal Zone of Asia with Sentinel-1 and Sentinel-2 Time Series. Remote Sens. 2022, 14, 153. [Google Scholar] [CrossRef]
  18. Cui, B.; Fei, D.; Shao, G.; Lu, Y.; Chu, J. Extracting Raft Aquaculture Areas from Remote Sensing Images via an Improved U-Net with a PSE Structure. Remote Sens. 2019, 11, 2053. [Google Scholar] [CrossRef] [Green Version]
  19. Fu, Y.; Deng, J.; Wang, H.; Comber, A.; Yang, W.; Wu, W.; You, S.; Lin, Y.; Wang, K. A new satellite-derived dataset for marine aquaculture areas in the China’s coastal region. Earth Syst. Sci. Data 2020, 13, 1829–1842. [Google Scholar] [CrossRef]
  20. Liang, C.; Cheng, B.; Xiao, B.; He, C.; Liu, X.; Jia, N.; Chen, J. Semi-/Weakly-Supervised Semantic Segmentation Method and Its Application for Coastal Aquaculture Areas Based on Multi-Source Remote Sensing Images—Taking the Fujian Coastal Area (Mainly Sanduo) as an Example. Remote Sens. 2021, 13, 1083. [Google Scholar] [CrossRef]
  21. Fujian Development and Reform Commission. Layout and Construction Planning of Fishing Ports in Fujian Province; Fujian Development and Reform Commission: Fuzhou, China, 2020. [Google Scholar]
  22. Wang, Z. Analysis of Variation Trend of Water Quality Based on Time Series in Sansha Bay. Environ. Impact Assess. 2017, 39, 76–81. [Google Scholar]
  23. Lin, T.Y.; Maire, M.; Belongie, S.; Hays, J.; Zitnick, C.L. Microsoft COCO: Common Objects in Context; Springer International Publishing: Cham, Switzerland, 2014. [Google Scholar]
  24. Padilla, R.; Netto, S.L.; Silva, E.A.B.D. A Survey on Performance Metrics for Object-Detection Algorithms. In Proceedings of the 2020 International Conference on Systems, Signals and Image Processing (IWSSIP), Rio de Janeiro, Brazil, 1–3 July 2020; pp. 237–242. [Google Scholar]
  25. Zhang, A.; Que, L.; Li, X.; Wang, Y.; Cui, W. Cost-benefit Model and Its Application of Reclaimed Water Project Based on Perspective of Stakeholders. Water Resour. Power 2021, 39, 136–139. [Google Scholar]
Figure 1. The location and range of Sandu’ao mariculture area in Fujian Province.
Figure 2. (a) Extraction results of aquaculture area in Sandu’ao; (b) extraction results of six aquaculture areas (A–F in (a)), where the blue box represents raft culture area, and the red box represents cage culture area.
Figure 3. Extraction results of aquaculture area in Xiapu using Landsat-8 images, where the blue box represents raft culture area, and the red box represents cage culture area.
Figure 4. Precision, recall, and F1 score of test samples with different spatial resolutions in two types of aquaculture area.
Figure 5. Aquaculture areas with seven different spatial resolutions under the same location and the identification effect, where the blue box represents raft culture area and the red box represents cage culture area.
Table 1. Details of experimental data.
Source | Format | Time Range | Space Range | Spatial Resolution/m
GF-1 | .tif | 13 June 2020 | 119°28′8″–120°9′44″E, 26°21′34″–27°0′24″N | 2
Table 2. Precision, recall, and F1 score of test samples in two types of aquaculture area.
Indicator | Raft Culture Area | Cage Culture Area
Precision | 0.890 | 0.904
Recall | 0.839 | 0.908
F1 score | 0.863 | 0.906
Table 3. Cost-effectiveness evaluation under different scenarios, where α and β are weight parameters, C is the cost, E is the effectiveness, and P is the comprehensive performance.
Spatial Resolution/m | Cost/(USD/km2) | Effect | Price First: αC / βE / P | Effect First: αC / βE / P | Balancing: αC / βE / P
2.5 | 2.361 | 0.883 | 0.708 / 0.883 / 0.175 | 0.142 / 1.766 / 1.624 | 0.142 / 0.883 / 0.741
5 | 1.417 | 0.858 | 0.425 / 0.858 / 0.433 | 0.085 / 1.716 / 1.631 | 0.085 / 0.858 / 0.773
10 | 1.102 | 0.805 | 0.331 / 0.805 / 0.474 | 0.066 / 1.610 / 1.544 | 0.066 / 0.805 / 0.739
Publisher’s Note: MDPI stays neutral with regard to jurisdictional claims in published maps and institutional affiliations.
