Multi-Task Learning for Ocean-Front Detection and Evolutionary Trend Recognition
Highlights
- We construct the Zhejiang–Fujian Coastal Front Mask (ZFCFM) and Evolutionary Trend (ZFCFET) datasets for daily ocean-front detection (OFD) and point-level ocean-front evolutionary trend recognition (OFETR), built from filled and normalized sea surface temperature (SST) gradients, with trend labels derived from hand-annotated fronts.
- A multi-task 3D U-Net shares 3D spatiotemporal features between OFD and OFETR; it matches strong 2D baselines on OFD with fewer parameters and comparable computation, while markedly improving OFETR over single-task and traditional methods.
- The proposed multi-task framework provides a practical alternative to cascade designs, offering more robust trend recognition under imperfect segmentation and reducing reliance on binary OFD results or auxiliary input channels.
- The datasets, model, and analysis framework introduced in this study provide a reproducible benchmark and design guidelines for future work on spatiotemporal OFD and OFETR in other regions and at different spatial and temporal resolutions.
Abstract
1. Introduction
- Most OFETR methods adopt a cascaded paradigm that first performs OFD and then conducts OFETR based on the resulting masks. This paradigm is sensitive to upstream recognition errors, which can be amplified in the OFETR stage. Moreover, the masks produced by OFD discard the original intensity distribution, often forcing the introduction of extra channels or handcrafted rules to compensate for the information gap, which in turn impedes effective information sharing between the two tasks.
- Most short-term OFETR methods produce results that are restricted to the window level and therefore lack explicit representations of point-level changes at a temporal resolution commensurate with the underlying data. As illustrated in Figure 1, relying solely on window-level labels can obscure the within-window, fine-grained evolution of the front, including the strengthening during the stage shown in Figure 1a–d and the weakening during the stage shown in Figure 1d–f. Moreover, because available datasets differ markedly in temporal resolution, and because many studies now analyze daily or even subdaily scales, a single window-level output is increasingly inadequate for probing the mechanisms that govern front evolution [7,8,9,10,11,12,13,14].
- We introduce a 3D U-Net-based multi-task learning framework that lets OFETR explicitly share the spatiotemporal representations learned by OFD, avoiding the information loss caused by treating the two tasks in isolation. With only a slight change in OFD performance, our framework improves OFETR, significantly reduces the parameter count, and suppresses cascaded error propagation by design.
- The model is end-to-end and requires only SST gradient heatmaps as input, without any external data sources, thereby simplifying the data collection and implementation pipeline.
- Compared with window-level outputs, the framework directly provides point-level trends, preserving the strengthening and weakening processes within a window and aligning more closely with mechanistic studies of front evolution.
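The hard parameter sharing behind these contributions can be illustrated with a minimal numpy sketch: one shared trunk turns a clip of daily SST-gradient maps into spatiotemporal features, and both an OFD head and an OFETR head read from those same features. All shapes, the random weights, and the three-class trend scheme below are illustrative assumptions, not the paper's implementation.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy clip of daily SST-gradient maps: (T days, H, W).
T, H, W, F = 6, 8, 8, 16
clip = rng.random((T, H, W))

# --- Shared "encoder": one per-pixel projection reused by both heads ---
# (stands in for the shared 3D U-Net trunk; weights are random here)
W_shared = rng.standard_normal((1, F))
features = clip[..., None] @ W_shared          # (T, H, W, F)

# --- OFD head: per-pixel front / no-front logits ---
W_ofd = rng.standard_normal((F, 2))
ofd_logits = features @ W_ofd                  # (T, H, W, 2)
ofd_mask = ofd_logits.argmax(-1)               # binary front mask per day

# --- OFETR head: pool each day's shared features, classify its trend ---
W_trend = rng.standard_normal((F, 3))          # e.g. weaken / stable / strengthen
daily_trend_logits = features.mean(axis=(1, 2)) @ W_trend   # (T, 3)
daily_trend = daily_trend_logits.argmax(-1)    # one label per day (point level)

print(ofd_mask.shape, daily_trend.shape)       # (6, 8, 8) (6,)
```

Because the OFETR head consumes the shared continuous features rather than the binarized OFD mask, an error in the mask does not propagate into the trend branch, which is the design contrast with cascade methods drawn above.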
2. Materials and Methods
2.1. Dataset
2.1.1. Zhejiang–Fujian Coastal Front Mask Dataset
2.1.2. Zhejiang–Fujian Coastal Front Evolutionary Trend Dataset
Algorithm 1: Two-step rule for daily trend labels
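A two-step labeling rule of this kind can be sketched as follows. This is an illustrative reconstruction, not the paper's exact listing: the scalar intensity summary, the tolerance `tol`, and the three label names are all assumptions.

```python
import numpy as np

def daily_trend_labels(intensity, tol=0.02):
    """Two-step rule for daily trend labels (illustrative sketch).

    Step 1: summarize each day's front by a scalar intensity
            (assumed precomputed, e.g. mean SST gradient over front pixels).
    Step 2: compare consecutive days; changes within +/- tol are 'stable'.
    """
    labels = []
    for prev, curr in zip(intensity[:-1], intensity[1:]):
        delta = curr - prev
        if delta > tol:
            labels.append("strengthen")
        elif delta < -tol:
            labels.append("weaken")
        else:
            labels.append("stable")
    return labels

series = np.array([0.30, 0.35, 0.36, 0.28])
print(daily_trend_labels(series))   # ['strengthen', 'stable', 'weaken']
```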
2.2. Method
2.2.1. Network Architecture
2.2.2. Compared Models
2.2.3. Loss
2.3. Metrics
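The tables below report mIoU and mF1 for OFD and Accuracy and F1 for OFETR. A sketch of the standard per-class definitions behind mIoU/mF1 is given here for concreteness; the paper's exact averaging and class handling may differ.

```python
import numpy as np

def per_class_iou_f1(pred, target, num_classes=2):
    """Mean IoU and mean F1 over classes for segmentation-style masks.

    IoU_c = TP / (TP + FP + FN),  F1_c = 2*TP / (2*TP + FP + FN),
    then average over classes (standard definitions; sketch only).
    """
    ious, f1s = [], []
    for c in range(num_classes):
        tp = np.sum((pred == c) & (target == c))
        fp = np.sum((pred == c) & (target != c))
        fn = np.sum((pred != c) & (target == c))
        ious.append(tp / (tp + fp + fn))
        f1s.append(2 * tp / (2 * tp + fp + fn))
    return np.mean(ious), np.mean(f1s)

pred   = np.array([[0, 1], [1, 1]])
target = np.array([[0, 1], [0, 1]])
miou, mf1 = per_class_iou_f1(pred, target)
print(round(miou, 3), round(mf1, 3))   # 0.583 0.733
```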
3. Results
3.1. Comparison Across Models
3.2. Effect of Clip Length on Performance
3.3. Effect of the P3D Block on Model Performance
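The P3D block [74] factorizes a full 3×3×3 convolution into a 1×3×3 spatial convolution followed by a 3×1×1 temporal one. A quick parameter count (assuming equal input/output channels and ignoring biases; the channel width below is illustrative) shows why this roughly halves the model size relative to C3D in the tables:

```python
def conv3d_params(c_in, c_out, kt, kh, kw):
    """Weight count of a 3D convolution (biases ignored)."""
    return c_in * c_out * kt * kh * kw

c = 64  # illustrative channel width
c3d = conv3d_params(c, c, 3, 3, 3)                                  # full 3x3x3
p3d = conv3d_params(c, c, 1, 3, 3) + conv3d_params(c, c, 3, 1, 1)   # spatial + temporal
print(c3d, p3d, round(c3d / p3d, 2))   # 110592 49152 2.25
```

The 27-weight kernel shrinks to 9 + 3 = 12 weights per channel pair, a 2.25× reduction per block, consistent in spirit with the roughly halved parameter counts reported for P3D variants.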
3.4. Effect of Input Representation
3.5. Sensitivity of Cascade Models to OFD Mask Errors
4. Discussion
5. Conclusions
Supplementary Materials
Author Contributions
Funding
Institutional Review Board Statement
Informed Consent Statement
Data Availability Statement
Conflicts of Interest
References
- McWilliams, J.C. Oceanic Frontogenesis. Annu. Rev. Mar. Sci. 2021, 13, 227–253.
- Horner-Devine, A.R.; Hetland, R.D.; MacDonald, D.G. Mixing and Transport in Coastal River Plumes. Annu. Rev. Fluid Mech. 2015, 47, 569–594.
- Walker, N.D.; Wiseman, W.J.; Rouse, L.; Babin, A. Effects of River Discharge, Wind Stress, and Slope Eddies on Circulation and the Satellite-Observed Structure of the Mississippi River Plume. J. Coast. Res. 2005, 21, 1228–1244.
- Ou, H.W.; Dong, C.M.; Chen, D. Tidal Diffusivity: A Mechanism for Frontogenesis. J. Phys. Oceanogr. 2003, 33, 840–847.
- Chen, C.; Xue, P.; Ding, P.; Beardsley, R.C.; Xu, Q.; Mao, X.; Gao, G.; Qi, J.; Li, C.; Lin, H.; et al. Physical Mechanisms for the Offshore Detachment of the Changjiang Diluted Water in the East China Sea. J. Geophys. Res. Ocean. 2008, 113, C02002.
- Amos, C.M.; Castelao, R.M. Influence of the El Niño–Southern Oscillation on SST Fronts Along the West Coasts of North and South America. J. Geophys. Res. Ocean. 2022, 127, e2022JC018479.
- O’Neill, L.W.; Chelton, D.B.; Esbensen, S.K. Observations of SST-Induced Perturbations of the Wind Stress Field over the Southern Ocean on Seasonal Timescales. J. Clim. 2003, 16, 2340–2354.
- Liu, Y.; Meng, Z.; Chen, W.; Liang, Y.; Chen, W.; Chen, Y. Ocean Fronts and Their Acoustic Effects: A Review. J. Mar. Sci. Eng. 2022, 10, 2021.
- Yao, Y.; Zhong, W.; He, H.; Sun, Y.; Feng, Z. The relationship between the North Pacific midlatitude oceanic frontal intensity and the storm track and its future changes. In Proceedings of the One Ocean Science Congress 2025, Nice, France, 3–6 June 2025. OOS2025-241.
- Lévy, M.; Haëck, C.; Mangolte, I.; Cassianides, A.; El Hourany, R. Shift in phytoplankton community composition over fronts. Commun. Earth Environ. 2025, 6, 591.
- Hsu, T.Y.; Chang, Y.; Lee, M.A.; Wu, R.F.; Hsiao, S.C. Predicting Skipjack Tuna Fishing Grounds in the Western and Central Pacific Ocean Based on High-Spatial-Temporal-Resolution Satellite Data. Remote Sens. 2021, 13, 861.
- Coadou-Chaventon, S.; Speich, S.; Zhang, D.; Rocha, C.B.; Swart, S. Oceanic Fronts Driven by the Amazon Freshwater Plume and Their Thermohaline Compensation at the Submesoscale. J. Geophys. Res. Ocean. 2024, 129, e2024JC021326.
- Zhu, R.; Yu, J.; Zhang, X.; Yang, H.; Ma, X. Air–Sea Interaction During Ocean Frontal Passage: A Case Study from the Northern South China Sea. Remote Sens. 2025, 17, 3024.
- Cronin, M.F.; Zhang, D.; Wills, S.M.; Reeves Eyre, J.E.J.; Thompson, L.; Anderson, N. Diurnal warming rectification in the tropical Pacific linked to sea surface temperature front. Nat. Geosci. 2024, 17, 316–322.
- Castelao, R.M.; Barth, J.A.; Mavor, T.P. Flow-topography interactions in the northern California Current System observed from geostationary satellite data. Geophys. Res. Lett. 2005, 32, L24612.
- Castelao, R.M.; Wang, Y. Wind-Driven Variability in Sea Surface Temperature Front Distribution in the California Current System. J. Geophys. Res. Ocean. 2014, 119, 1861–1875.
- Cayula, J.F.; Cornillon, P. Edge Detection Algorithm for SST Images. J. Atmos. Ocean. Technol. 1992, 9, 67–80.
- Belkin, I.M.; O’Reilly, J.E. An algorithm for oceanic front detection in chlorophyll and SST satellite imagery. J. Mar. Syst. 2009, 78, 319–326.
- Xing, Q.; Yu, H.; Wang, H.; Ito, S.i. An improved algorithm for detecting mesoscale ocean fronts from satellite observations: Detailed mapping of persistent fronts around the China Seas and their long-term trends. Remote Sens. Environ. 2023, 294, 113627.
- Wang, Y.; Zhou, F.; Meng, Q.; Zhou, M.; Hu, Z.; Zhang, C.; Zhao, T. An ocean front detection and tracking algorithm. arXiv 2025, arXiv:2502.15250.
- He, Q.; Gong, B.; Song, W.; Du, Y.; Zhao, D.; Zhang, W. DCENet: A Dense Contextual Ensemble Network for Multiclass Ocean Front Detection. IEEE Geosci. Remote Sens. Lett. 2024, 21, 1–5.
- Wang, Y.; Zhang, D.; Zhang, X. A New Method for Ocean Fronts’ Identification With Res-U-Net and Remotely Sensed Data in the Northwestern Pacific Area. IEEE Trans. Geosci. Remote Sens. 2025, 63, 1–11.
- Wan, X.; Zhang, L.; Ma, X.; Xu, W.; Chen, Q.; Zhao, R.; Zeng, M. Dynamic gradient orientation and multi-scale fusion network for ocean front detection. J. Sea Res. 2025, 206, 102601.
- Xing, Q.; Yu, H.; Wang, H. Global mapping and evolution of persistent fronts in Large Marine Ecosystems over the past 40 years. Nat. Commun. 2024, 15, 4090.
- Yang, K.; Meyer, A.; Strutton, P.G.; Fischer, A.M. Global trends of fronts and chlorophyll in a warming ocean. Commun. Earth Environ. 2023, 4, 489.
- Yang, Y.; Lam, K.M.; Sun, X.; Dong, J.; Lguensat, R. An Efficient Algorithm for Ocean-Front Evolution Trend Recognition. Remote Sens. 2022, 14, 259.
- Embury, O.; Merchant, C.J.; Good, S.A.; Rayner, N.A.; Høyer, J.L.; Atkinson, C.; Block, T.; Alerskans, E.; Pearson, K.J.; Worsfold, M.; et al. Satellite-based time-series of sea-surface temperature since 1980 for climate applications. Sci. Data 2024, 11, 326.
- Cao, W.; Xie, C.; Han, B.; Dong, J. Automatic Fine Recognition of Ocean Front Fused with Deep Learning. Comput. Eng. 2020, 46, 266–274.
- Zhang, Y.; Yang, Q. A Survey on Multi-Task Learning. IEEE Trans. Knowl. Data Eng. 2022, 34, 5586–5609.
- Rafiq, G.; Rafiq, M.; Choi, G.S. Video Description: A Comprehensive Survey of Deep Learning Approaches. Artif. Intell. Rev. 2023, 56, 13293–13372.
- Çiçek, Ö.; Abdulkadir, A.; Lienkamp, S.S.; Brox, T.; Ronneberger, O. 3D U-Net: Learning Dense Volumetric Segmentation from Sparse Annotation. In Proceedings of the Medical Image Computing and Computer-Assisted Intervention—MICCAI 2016, Athens, Greece, 17–21 October 2016; Ourselin, S., Joskowicz, L., Sabuncu, M.R., Unal, G., Wells, W., Eds.; Lecture Notes in Computer Science; Springer: Cham, Switzerland, 2016; Volume 9901, pp. 424–432.
- Xie, B.; Qi, J.; Yang, S.; Sun, G.; Feng, Z.; Yin, B.; Wang, W. Sea Surface Temperature and Marine Heat Wave Predictions in the South China Sea: A 3D U-Net Deep Learning Model Integrating Multi-Source Data. Atmosphere 2024, 15, 86.
- Hu, B.; Gao, B.; Woo, W.L.; Ruan, L.; Jin, J.; Yang, Y.; Yu, Y. A Lightweight Spatial and Temporal Multi-Feature Fusion Network for Defect Detection. IEEE Trans. Image Process. 2021, 30, 472–486.
- Qiu, Z.; Yao, T.; Mei, T. Learning Spatio-Temporal Representation with Pseudo-3D Residual Networks. In Proceedings of the 2017 IEEE International Conference on Computer Vision (ICCV), Venice, Italy, 22–29 October 2017; pp. 5534–5542.
- Xing, Q.; Yu, H.; Yu, W.; Chen, X.; Wang, H. A global daily mesoscale front dataset from satellite observations: In situ validation and cross-dataset comparison. Earth Syst. Sci. Data 2025, 17, 2831–2848.
- Szegedy, C.; Vanhoucke, V.; Ioffe, S.; Shlens, J.; Wojna, Z. Rethinking the Inception Architecture for Computer Vision. In Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition (CVPR), Las Vegas, NV, USA, 27–30 June 2016; pp. 2818–2826.
- Ronneberger, O.; Fischer, P.; Brox, T. U-Net: Convolutional Networks for Biomedical Image Segmentation. In Proceedings of the Medical Image Computing and Computer-Assisted Intervention—MICCAI 2015, Munich, Germany, 5–9 October 2015; Navab, N., Hornegger, J., Wells, W.M., Frangi, A.F., Eds.; Lecture Notes in Computer Science; Springer: Cham, Switzerland, 2015; Volume 9351, pp. 234–241.
- Zhou, Z.; Rahman Siddiquee, M.M.; Tajbakhsh, N.; Liang, J. UNet++: A Nested U-Net Architecture for Medical Image Segmentation. In Proceedings of the Deep Learning in Medical Image Analysis and Multimodal Learning for Clinical Decision Support, Granada, Spain, 20 September 2018; Stoyanov, D., Taylor, Z., Carneiro, G., Syeda-Mahmood, T., Martel, A., Maier-Hein, L., Tavares, J.M.R., Bradley, A., Papa, J.P., Belagiannis, V., et al., Eds.; Lecture Notes in Computer Science; Springer: Cham, Switzerland, 2018; Volume 11045, pp. 3–11.
- Crameri, F.; Shephard, G.E.; Heron, P.J. The misuse of colour in science communication. Nat. Commun. 2020, 11, 5444.














| Model | OFD mIoU (Mean ± SD, %) | OFD mF1 (Mean ± SD, %) | OFETR Accuracy (Mean ± SD, %) | OFETR F1 (Mean ± SD, %) | Params (M) | FLOPs (G) |
|---|---|---|---|---|---|---|
| CCAIM | 46.84 | 62.54 | – | – | – | – |
| U-Net | 92.84 ± 0.11 | 96.13 ± 0.06 | – | – | 31.04 | 18.84 |
| U-Net++ | 92.78 ± 0.19 | 96.11 ± 0.11 | – | – | 36.63 | 53.97 |
| DCENet | 92.93 ± 0.04 | 96.18 ± 0.02 | – | – | 65.65 | 77.69 |
| Res-U-Net | 92.11 ± 0.32 | 95.73 ± 0.18 | – | – | 8.04 | 15.17 |
| 3D U-Net-S | 92.55 ± 0.27 | 95.99 ± 0.16 | – | – | 9.08 | 159.64 |
| ETR | – | – | 69.04 ± 1.54 | 59.43 ± 4.05 | 21.79 | 6.40 |
| 3D U-Net-S(e1) | – | – | 82.07 ± 0.49 | 78.44 ± 0.71 | 0.07 | 11.45 |
| 3D U-Net-S(e2) | – | – | 82.88 ± 0.57 | 79.31 ± 0.94 | 0.36 | 18.43 |
| 3D U-Net-S(e3) | – | – | 84.09 ± 0.40 | 80.36 ± 0.83 | 1.52 | 25.39 |
| 3D U-Net-S(e4) | – | – | 84.19 ± 0.69 | 80.83 ± 1.02 | 6.13 | 32.32 |
| 3D U-Net-S(d1) | – | – | 85.15 ± 0.15 | 82.16 ± 0.52 | 8.40 | 63.22 |
| 3D U-Net-S(d2) | – | – | 85.68 ± 0.84 | 82.92 ± 1.45 | 8.97 | 94.19 |
| 3D U-Net-S(d3) | – | – | 84.82 ± 0.75 | 82.00 ± 0.75 | 9.12 | 125.3 |
| 3D U-Net-M(e1) | 92.59 ± 0.13 | 96.02 ± 0.08 | 82.01 ± 0.56 | 78.43 ± 0.87 | 9.12 | 125.31 |
| 3D U-Net-M(e2) | 92.35 ± 0.11 | 95.88 ± 0.07 | 83.45 ± 0.64 | 80.16 ± 1.00 | 9.22 | 125.27 |
| 3D U-Net-M(e3) | 91.98 ± 0.40 | 95.68 ± 0.23 | 85.35 ± 0.53 | 82.26 ± 0.89 | 9.66 | 125.26 |
| 3D U-Net-M(e4) | 91.98 ± 0.29 | 95.67 ± 0.16 | 85.99 ± 0.31 | 83.32 ± 0.41 | 11.38 | 125.25 |
| 3D U-Net-M(d1) | 91.87 ± 0.18 | 95.62 ± 0.10 | 85.40 ± 0.76 | 82.55 ± 1.50 | 9.66 | 125.26 |
| 3D U-Net-M(d2) | 91.49 ± 0.54 | 95.39 ± 0.31 | 84.09 ± 0.80 | 81.09 ± 1.40 | 9.22 | 125.27 |
| 3D U-Net-M(d3) | 91.08 ± 0.88 | 95.18 ± 0.49 | 84.20 ± 1.09 | 81.08 ± 1.35 | 9.12 | 125.31 |
| 3D U-Net-C(e1) | – | – | 87.60 ± 0.46 | 85.05 ± 0.71 | 9.15 | 131.23 |
| 3D U-Net-C(e2) | – | – | 88.37 ± 0.26 | 86.16 ± 0.38 | 9.44 | 138.21 |
| 3D U-Net-C(e3) | – | – | 87.94 ± 0.58 | 85.45 ± 0.65 | 10.60 | 145.16 |
| 3D U-Net-C(e4) | – | – | 87.58 ± 0.79 | 85.28 ± 0.89 | 15.21 | 152.10 |
| 3D U-Net-C(d1) | – | – | 87.79 ± 0.59 | 85.55 ± 0.72 | 17.48 | 183.00 |
| 3D U-Net-C(d2) | – | – | 87.48 ± 0.92 | 85.19 ± 0.91 | 18.05 | 213.97 |
| 3D U-Net-C(d3) | – | – | 86.67 ± 1.20 | 84.21 ± 1.84 | 18.20 | 245.08 |
| Model | Clip Length | OFD mIoU (Mean ± SD, %) | OFD mF1 (Mean ± SD, %) | OFETR Accuracy (Mean ± SD, %) | OFETR F1 (Mean ± SD, %) | Daily FLOPs (G/day) |
|---|---|---|---|---|---|---|
| 3D U-Net-S | 4 | 92.80 ± 0.09 | 96.13 ± 0.05 | – | – | 79.82 |
| 3D U-Net-S | 6 | 92.55 ± 0.27 | 95.99 ± 0.16 | – | – | 59.87 |
| 3D U-Net-S | 8 | 92.48 ± 0.14 | 95.96 ± 0.08 | – | – | 53.21 |
| 3D U-Net-S (e3) | 4 | – | – | 84.52 ± 0.41 | 81.37 ± 0.53 | 16.92 |
| 3D U-Net-S (e3) | 6 | – | – | 84.09 ± 0.40 | 80.36 ± 0.83 | 12.70 |
| 3D U-Net-S (e3) | 8 | – | – | 83.06 ± 0.37 | 79.30 ± 0.23 | 11.28 |
| 3D U-Net-S (e4) | 4 | – | – | 84.46 ± 0.48 | 81.33 ± 0.80 | 26.38 |
| 3D U-Net-S (e4) | 6 | – | – | 84.19 ± 0.69 | 80.83 ± 1.02 | 16.16 |
| 3D U-Net-S (e4) | 8 | – | – | 83.72 ± 0.47 | 80.11 ± 1.43 | 14.37 |
| 3D U-Net-S (d1) | 4 | – | – | 84.80 ± 0.55 | 82.06 ± 0.89 | 42.15 |
| 3D U-Net-S (d1) | 6 | – | – | 85.15 ± 0.15 | 82.16 ± 0.52 | 31.61 |
| 3D U-Net-S (d1) | 8 | – | – | 84.29 ± 0.70 | 81.37 ± 0.95 | 28.10 |
| 3D U-Net-M (e3) | 4 | 92.26 ± 0.25 | 95.84 ± 0.14 | 85.30 ± 0.78 | 82.55 ± 0.75 | 83.50 |
| 3D U-Net-M (e3) | 6 | 91.98 ± 0.40 | 95.68 ± 0.23 | 85.35 ± 0.53 | 82.26 ± 0.89 | 62.63 |
| 3D U-Net-M (e3) | 8 | 91.64 ± 0.57 | 95.49 ± 0.33 | 84.70 ± 0.37 | 81.32 ± 0.39 | 55.67 |
| 3D U-Net-M (e4) | 4 | 92.12 ± 0.18 | 95.74 ± 0.11 | 84.82 ± 1.05 | 82.26 ± 1.00 | 83.50 |
| 3D U-Net-M (e4) | 6 | 91.98 ± 0.29 | 95.67 ± 0.16 | 85.99 ± 0.31 | 83.32 ± 0.41 | 62.63 |
| 3D U-Net-M (e4) | 8 | 91.85 ± 0.19 | 95.60 ± 0.10 | 85.83 ± 0.53 | 83.27 ± 0.74 | 55.66 |
| 3D U-Net-M (d1) | 4 | 91.42 ± 0.94 | 95.35 ± 0.54 | 85.40 ± 1.25 | 82.79 ± 1.72 | 83.50 |
| 3D U-Net-M (d1) | 6 | 91.87 ± 0.18 | 95.62 ± 0.10 | 85.40 ± 0.76 | 82.55 ± 1.50 | 62.63 |
| 3D U-Net-M (d1) | 8 | 91.81 ± 0.28 | 95.59 ± 0.15 | 85.45 ± 0.17 | 83.07 ± 0.19 | 55.67 |
| Model | Block | OFD mIoU (Mean ± SD, %) | OFD mF1 (Mean ± SD, %) | OFETR Accuracy (Mean ± SD, %) | OFETR F1 (Mean ± SD, %) | Params (M) | Daily FLOPs (G/day) |
|---|---|---|---|---|---|---|---|
| 3D U-Net-S | C3D | 92.88 ± 0.08 | 96.18 ± 0.05 | – | – | 17.70 | 129.42 |
| 3D U-Net-S | P3D | 92.55 ± 0.27 | 95.99 ± 0.16 | – | – | 9.08 | 59.87 |
| 3D U-Net-S (e3) | C3D | – | – | 85.00 ± 0.44 | 81.80 ± 0.93 | 2.29 | 20.01 |
| 3D U-Net-S (e3) | P3D | – | – | 84.09 ± 0.40 | 80.36 ± 0.83 | 1.52 | 12.70 |
| 3D U-Net-S (e4) | C3D | – | – | 84.49 ± 0.43 | 81.11 ± 1.02 | 9.32 | 21.55 |
| 3D U-Net-S (e4) | P3D | – | – | 84.19 ± 0.69 | 80.83 ± 1.02 | 6.13 | 16.16 |
| 3D U-Net-S (d1) | C3D | – | – | 85.61 ± 0.91 | 82.76 ± 1.21 | 15.73 | 61.63 |
| 3D U-Net-S (d1) | P3D | – | – | 85.15 ± 0.15 | 82.16 ± 0.52 | 8.40 | 31.61 |
| 3D U-Net-M (e3) | C3D | 91.86 ± 0.55 | 95.63 ± 0.31 | 85.15 ± 0.47 | 82.24 ± 0.57 | 18.27 | 132.18 |
| 3D U-Net-M (e3) | P3D | 91.98 ± 0.40 | 95.68 ± 0.23 | 85.35 ± 0.53 | 82.26 ± 0.89 | 9.66 | 62.63 |
| 3D U-Net-M (e4) | C3D | 91.99 ± 0.32 | 95.69 ± 0.18 | 85.11 ± 0.48 | 82.26 ± 0.67 | 19.99 | 132.17 |
| 3D U-Net-M (e4) | P3D | 91.98 ± 0.29 | 95.67 ± 0.16 | 85.99 ± 0.31 | 83.32 ± 0.41 | 11.38 | 62.63 |
| 3D U-Net-M (d1) | C3D | 92.02 ± 0.31 | 95.70 ± 0.17 | 85.17 ± 0.69 | 82.38 ± 0.73 | 18.27 | 132.18 |
| 3D U-Net-M (d1) | P3D | 91.87 ± 0.18 | 95.62 ± 0.10 | 85.40 ± 0.76 | 82.55 ± 1.50 | 9.66 | 62.63 |
| Model | Input | OFD mIoU (Mean ± SD, %) | OFD mF1 (Mean ± SD, %) | OFETR Accuracy (Mean ± SD, %) | OFETR F1 (Mean ± SD, %) |
|---|---|---|---|---|---|
| 3D U-Net-S | Gray | 92.67 ± 0.15 | 96.06 ± 0.09 | – | – |
| 3D U-Net-S | Jet | 92.55 ± 0.27 | 95.99 ± 0.16 | – | – |
| 3D U-Net-S | Viridis | 92.53 ± 0.15 | 95.99 ± 0.08 | – | – |
| 3D U-Net-S (e3) | Gray | – | – | 83.43 ± 0.54 | 80.02 ± 0.95 |
| 3D U-Net-S (e3) | Jet | – | – | 84.09 ± 0.40 | 80.36 ± 0.83 |
| 3D U-Net-S (e3) | Viridis | – | – | 84.54 ± 0.65 | 80.79 ± 1.29 |
| 3D U-Net-S (e4) | Gray | – | – | 82.56 ± 1.31 | 78.59 ± 1.59 |
| 3D U-Net-S (e4) | Jet | – | – | 84.19 ± 0.69 | 80.83 ± 1.02 |
| 3D U-Net-S (e4) | Viridis | – | – | 84.44 ± 0.54 | 81.65 ± 0.70 |
| 3D U-Net-S (d1) | Gray | – | – | 83.02 ± 0.97 | 79.47 ± 1.99 |
| 3D U-Net-S (d1) | Jet | – | – | 85.15 ± 0.15 | 82.16 ± 0.52 |
| 3D U-Net-S (d1) | Viridis | – | – | 85.10 ± 0.75 | 82.51 ± 0.86 |
| 3D U-Net-M (e3) | Gray | 91.38 ± 0.64 | 95.35 ± 0.35 | 84.16 ± 0.55 | 80.51 ± 1.27 |
| 3D U-Net-M (e3) | Jet | 91.98 ± 0.40 | 95.68 ± 0.23 | 85.35 ± 0.53 | 82.26 ± 0.89 |
| 3D U-Net-M (e3) | Viridis | 91.74 ± 0.51 | 95.55 ± 0.30 | 85.34 ± 0.61 | 82.61 ± 0.44 |
| 3D U-Net-M (e4) | Gray | 92.00 ± 0.20 | 95.68 ± 0.11 | 84.34 ± 0.26 | 81.53 ± 0.41 |
| 3D U-Net-M (e4) | Jet | 91.98 ± 0.29 | 95.67 ± 0.16 | 85.99 ± 0.31 | 83.32 ± 0.41 |
| 3D U-Net-M (e4) | Viridis | 91.92 ± 0.26 | 95.64 ± 0.14 | 85.39 ± 0.43 | 82.85 ± 0.68 |
| 3D U-Net-M (d1) | Gray | 92.21 ± 0.17 | 95.81 ± 0.09 | 84.35 ± 0.74 | 81.79 ± 0.78 |
| 3D U-Net-M (d1) | Jet | 91.87 ± 0.18 | 95.62 ± 0.10 | 85.40 ± 0.76 | 82.55 ± 1.50 |
| 3D U-Net-M (d1) | Viridis | 91.47 ± 0.55 | 95.39 ± 0.30 | 84.71 ± 0.57 | 82.24 ± 0.74 |
© 2025 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).
Share and Cite
He, Q.; Huang, A.; Geng, L.; Zhao, W.; Du, Y. Multi-Task Learning for Ocean-Front Detection and Evolutionary Trend Recognition. Remote Sens. 2025, 17, 3862. https://doi.org/10.3390/rs17233862

