# Development of Water-Wheel Tail Measurement System Based on Image Projective Transformation


## Abstract


## 1. Introduction

## 2. Materials and Methods

## 3. Results

## 4. Discussion

## 5. Conclusions

## Author Contributions

## Funding

## Data Availability Statement

## Acknowledgments

## Conflicts of Interest

## References

1. Li, H.-T.; Wang, X.-A.; Feng, Y.; Lan, C.-J. Intelligent Ecological Aquaculture System. *Comput. Syst. Appl.* **2017**, *26*, 73–76.
2. Yuan, F.; Huang, Y.-F.; Chen, X.; Cheng, E. A Biological Sensor System Using Computer Vision for Water Quality Monitoring. *IEEE Access* **2018**, *6*, 61535–61546.
3. Kassem, T.; Shahrour, I.; El Khattabi, J.; Raslan, A. Smart and Sustainable Aquaculture Farms. *Sustainability* **2021**, *13*, 10685.
4. Oh, S.-H.; Oh, Y.M.; Kim, J.-Y.; Kang, K.-S. A Case Study on the Design of Condenser Effluent Outlet of Thermal Power Plant to Reduce Foam Emitted to Surrounding Seacoast. *Ocean Eng.* **2012**, *47*, 58–64.
5. Jenkinson, I.R.; Seuront, L.; Ding, H.; Elias, F. Biological Modification of Mechanical Properties of the Sea Surface Microlayer, Influencing Waves, Ripples, Foam and Air-Sea Fluxes. *Elem. Sci. Anthr.* **2018**, *6*, 26.
6. Lánský, M.; Ruzičková, I.; Benáková, A.; Wanner, J. Effect of Coagulant Dosing on Physicochemical and Microbiological Characteristics of Activated Sludge and Foam Formation. *Acta Hydrochim. Hydrobiol.* **2005**, *33*, 266–269.
7. Westlund, Å.D.; Hagland, E.; Rothman, M. Foaming in Anaerobic Digesters Caused by Microthrix Parvicella. *Water Sci. Technol.* **1998**, *37*, 51–55.
8. Wang, Y.; Sun, M.; Tang, Y.; Xu, A.; Tang, J.; Song, Z. Effects of Haematococcus Pluvialis on the Water Quality and Performance of Litopenaeus Vannamei Using Artificial Substrates and Water Exchange Systems. *Aquac. Int.* **2022**, *30*, 1779–1797.
9. Abdel-Raouf, N.; Al-Homaidan, A.A.; Ibraheem, I.B.M. Microalgae and Wastewater Treatment. *Saudi J. Biol. Sci.* **2012**, *19*, 257–275.
10. Faugeras, O.D.; Luong, Q.T.; Maybank, S.J. Camera Self-Calibration: Theory and Experiments; Springer: Berlin/Heidelberg, Germany, 1992; pp. 321–334.
11. Hartley, R.I. Estimation of Relative Camera Positions for Uncalibrated Cameras; Springer: Berlin/Heidelberg, Germany, 1992; pp. 579–587.
12. Hartley, R.I.; Gupta, R.; Chang, T. Stereo from Uncalibrated Cameras. In Proceedings of the 1992 IEEE Computer Society Conference on Computer Vision and Pattern Recognition, Champaign, IL, USA, 15–18 June 1992; Volume 92, pp. 761–764.
13. Chong, C.F.; Wang, Y.; Ng, B.; Yang, X. Projective Transformation Rectification for Camera-Captured Chest X-ray Photograph Interpretation with Synthetic Data. *arXiv* **2022**, arXiv:2210.05954.
14. Sze, V.; Chen, Y.-H.; Yang, T.-J.; Emer, J.S. Efficient Processing of Deep Neural Networks: A Tutorial and Survey. *Proc. IEEE* **2017**, *105*, 2295–2329.
15. Szegedy, C.; Toshev, A.; Erhan, D. Deep Neural Networks for Object Detection. *Adv. Neural Inf. Process. Syst.* **2013**, *26*. Available online: https://papers.nips.cc/paper_files/paper/2013/file/f7cade80b7cc92b991cf4d2806d6bd78-Paper.pdf (accessed on 6 October 2021).
16. Montufar, G.F.; Pascanu, R.; Cho, K.; Bengio, Y. On the Number of Linear Regions of Deep Neural Networks. *Adv. Neural Inf. Process. Syst.* **2014**, 2924–2932.
17. Yandouzi, M.; Grari, M.; Berrahal, M.; Idrissi, I.; Moussaoui, O.; Azizi, M.; Ghoumid, K.; Elmiad, A.K. Investigation of Combining Deep Learning Object Recognition with Drones for Forest Fire Detection and Monitoring. *Int. J. Adv. Comput. Sci. Appl.* **2023**, *14*.
18. Van Dyk, D.A.; Meng, X.L. The Art of Data Augmentation. *J. Comput. Graph. Stat.* **2001**, *10*, 1–50.
19. Shorten, C.; Khoshgoftaar, T.M.; Furht, B. Text Data Augmentation for Deep Learning. *J. Big Data* **2021**, *8*, 101.
20. Zhong, Z.; Zheng, L.; Kang, G.; Li, S.; Yang, Y. Random Erasing Data Augmentation. *Proc. AAAI Conf. Artif. Intell.* **2020**, *34*, 13001–13008.
21. Chen, P.; Liu, S.; Zhao, H.; Jia, J. GridMask Data Augmentation. *arXiv* **2020**, arXiv:2001.04086.
22. Tanner, M.A.; Wong, W.H. The Calculation of Posterior Distributions by Data Augmentation. *J. Am. Stat. Assoc.* **1987**, *82*, 528–540.
23. Taylor, L.; Nitschke, G. Improving Deep Learning with Generic Data Augmentation. In 2018 IEEE Symposium Series on Computational Intelligence; IEEE: Piscataway, NJ, USA, 2018; pp. 1542–1547.
24. Park, D.S.; Chan, W.; Zhang, Y.; Chiu, C.-C.; Zoph, B.; Cubuk, E.D.; Le, Q.V. SpecAugment: A Simple Data Augmentation Method for Automatic Speech Recognition. *arXiv* **2019**, arXiv:1904.08779.
25. De Vries, T.; Taylor, G.W. Improved Regularization of Convolutional Neural Networks with Cutout. *arXiv* **2017**, arXiv:1708.04552.
26. Cubuk, E.D.; Zoph, B.; Mane, D.; Vasudevan, V.; Le, Q.V. AutoAugment: Learning Augmentation Policies from Data. *arXiv* **2018**, arXiv:1805.09501.
27. Zhang, Y.; Lu, S. Data Augmented Network for Facial Landmark Detection. In Proceedings of the 5th International Conference on Compute and Data Analysis, Sanya, China, 2–4 February 2021; pp. 180–186.
28. Guo, Y.; Fu, Y.; Hao, F.; Zhang, X.; Wu, W.; Jin, X.; Bryant, C.R.; Senthilnath, J. Integrated Phenology and Climate in Rice Yields Prediction Using Machine Learning Methods. *Ecol. Indic.* **2021**, *120*, 106935.
29. Kobler, A.; Adamic, M. Identifying Brown Bear Habitat by a Combined GIS and Machine Learning Method. *Ecol. Model.* **2000**, *135*, 291–300.
30. Pham, B.T.; Tien Bui, D.; Prakash, I.; Dholakia, M.B. Hybrid Integration of Multilayer Perceptron Neural Networks and Machine Learning Ensembles for Landslide Susceptibility Assessment at Himalayan Area (India) Using GIS. *CATENA* **2017**, *149*, 52–63.
31. Liu, C.-S.; Chen, X.-T.; Shih, W.-Y.; Lin, C.-C.; Yen, J.-H.; Huang, C.-J.; Yen, Y.-T. Smart Water Quality Monitoring Technology for Fish Farms Using Cellphone Camera Sensor. *Sens. Mater.* **2023**, *35*, 3019–3029.
32. Annan District. Available online: https://en.wikipedia.org/wiki/Annan_District (accessed on 5 December 2022).
33. Camera Calibration: From Image to World Coordinate. Available online: https://www.youtube.com/watch?v=Sjx1Db1CVic (accessed on 6 October 2021).

**Figure 2.** Illustration of the projective transformation effect [33].
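A projective transformation of the kind illustrated in Figure 2 maps image-plane pixel coordinates to world coordinates on the water surface through a 3 × 3 homography. The sketch below, in plain NumPy, recovers that homography from four point correspondences via the direct linear transform; the pixel and world coordinates are illustrative values, not the paper's actual calibration data:

```python
import numpy as np

def homography_from_points(src, dst):
    """Estimate the 3x3 projective matrix H such that dst ~ H @ src
    for four point correspondences (DLT with the last entry fixed to 1)."""
    A, b = [], []
    for (x, y), (u, v) in zip(src, dst):
        # u = (h1 x + h2 y + h3) / (h7 x + h8 y + 1), cleared of the denominator
        A.append([x, y, 1, 0, 0, 0, -u * x, -u * y]); b.append(u)
        A.append([0, 0, 0, x, y, 1, -v * x, -v * y]); b.append(v)
    h = np.linalg.solve(np.array(A, float), np.array(b, float))
    return np.append(h, 1.0).reshape(3, 3)

def project(H, pt):
    """Map an image point through H and dehomogenize."""
    x, y, w = H @ np.array([pt[0], pt[1], 1.0])
    return x / w, y / w

# Four corners of a calibration board as seen in the image (pixels, made up)
src = [(100, 200), (500, 220), (480, 400), (120, 380)]
# Their assumed world coordinates on the water surface (metres)
dst = [(0.0, 0.0), (11.5, 0.0), (11.5, 0.7), (0.0, 0.7)]

H = homography_from_points(src, dst)
u, v = project(H, src[0])  # maps back to (0.0, 0.0) up to float error
```

With exactly four correspondences the 8 × 8 linear system has an exact solution, so H reproduces all four world points; in practice more correspondences and a robust least-squares fit (e.g. OpenCV's `cv2.findHomography`) are preferred.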

**Figure 4.** Two calibration boards with different specifications (**left**: specification A; **right**: specification B).

**Figure 13.** Verification of calibration effects using different specifications of calibration boards (specifications A and B): (**a**) A-1; (**b**) B-1; (**c**) A-2; (**d**) B-2.

**Figure 14.** Measurement results of water-wheel tail using different specifications of calibration boards (specifications A and B): (**a**) A-1; (**b**) B-1; (**c**) A-2; (**d**) B-2.

| Item | Model | Specifications |
|---|---|---|
| Camera | Canon EOS R10 | Effective pixels: 24.2 million; resolution: 6000 × 4000; sensor type: CMOS; sensor size: APS-C (22.3 mm × 14.8 mm) |
| | FUJIFILM FinePix HS10 | Effective pixels: 10.3 million; resolution: 3648 × 2736; sensor type: CMOS; sensor size: 1/2.3″ (6.17 mm × 4.55 mm) |
| Calibration board | Custom | See specifications below |

| Type | Square Side Length | Pattern Array | Pattern Size |
|---|---|---|---|
| Spec. A | 100 mm | 7 × 115 | 0.7 m × 11.5 m |
| Spec. B | 144 mm | 5 × 80 | 0.72 m × 11.52 m |
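Each board's pattern size is simply its square side length multiplied by the pattern array dimensions. A quick arithmetic check in Python, with the values copied from the table above:

```python
# Pattern size = square side length (m) x pattern array (rows x columns)
specs = {
    "A": (0.100, (7, 115)),  # 100 mm squares, 7 x 115 array
    "B": (0.144, (5, 80)),   # 144 mm squares, 5 x 80 array
}
for name, (side, (rows, cols)) in specs.items():
    width, length = side * rows, side * cols
    print(f"Spec. {name}: {width:.2f} m x {length:.2f} m")
# Spec. A: 0.70 m x 11.50 m
# Spec. B: 0.72 m x 11.52 m
```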

| | Predicted Positive | Predicted Negative |
|---|---|---|
| Actual positive | TP | FN (Type II Error) |
| Actual negative | FP (Type I Error) | TN |
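The standard evaluation metrics follow directly from the four confusion-matrix counts. A minimal sketch; the counts used here are illustrative, not results from the paper:

```python
def classification_metrics(tp, fn, fp, tn):
    """Derive the usual metrics from confusion-matrix counts."""
    accuracy  = (tp + tn) / (tp + fn + fp + tn)
    precision = tp / (tp + fp)
    recall    = tp / (tp + fn)  # a.k.a. sensitivity, true positive rate
    f1        = 2 * precision * recall / (precision + recall)
    return accuracy, precision, recall, f1

# Hypothetical counts for illustration only
acc, prec, rec, f1 = classification_metrics(tp=90, fn=10, fp=5, tn=95)
```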

**Table 3.** Calibration effects using different specifications of calibration boards (specifications A-1, B-1, A-2 and B-2).

| Position | No. | Calculated Length of Calibration Board (m) | Theoretical Length of Calibration Board (m) | Error | Average Error |
|---|---|---|---|---|---|
| I (A-1) | 1 | 11.5108 | 11.3 | 1.87% | 1.10% |
| | 2 | 11.3683 | | 0.60% | |
| | 3 | 11.3927 | | 0.82% | |
| II (A-1) | 1 | 11.5773 | | 2.45% | 0.76% |
| | 2 | 11.3221 | | 0.20% | |
| | 3 | 11.2566 | | 0.38% | |
| III (A-1) | 1 | 11.3814 | | 0.72% | 0.38% |
| | 2 | 11.2224 | | 0.69% | |
| | 3 | 11.4258 | | 1.11% | |
| IV (A-1) | 1 | 11.3609 | | 0.54% | 0.27% |
| | 2 | 11.288 | | 0.11% | |
| | 3 | 11.3427 | | 0.38% | |
| V (A-1) | 1 | 11.3473 | | 0.42% | 1.55% |
| | 2 | 11.307 | | 0.06% | |
| | 3 | 10.7216 | | 5.12% | |
| I (B-1) | 1 | 11.3098 | 11.232 | 0.69% | 0.50% |
| | 2 | 11.2871 | | 0.49% | |
| | 3 | 11.2679 | | 0.32% | |
| II (B-1) | 1 | 11.3932 | | 1.44% | 0.72% |
| | 2 | 11.1791 | | 0.47% | |
| | 3 | 11.3677 | | 1.21% | |
| III (B-1) | 1 | 11.4283 | | 1.75% | 2.12% |
| | 2 | 11.5952 | | 3.23% | |
| | 3 | 11.388 | | 1.39% | |
| IV (B-1) | 1 | 11.4596 | | 2.03% | 2.05% |
| | 2 | 11.4688 | | 2.11% | |
| | 3 | 11.46 | | 2.03% | |
| V (B-1) | 1 | 10.9396 | | 2.60% | 0.53% |
| | 2 | 11.4851 | | 2.25% | |
| | 3 | 11.451 | | 1.95% | |
| I (A-2) | 1 | 11.2938 | 11.3 | 0.06% | 0.24% |
| | 2 | 11.2856 | | 0.13% | |
| | 3 | 11.2394 | | 0.54% | |
| II (A-2) | 1 | 11.2483 | | 0.46% | 0.41% |
| | 2 | 11.2323 | | 0.60% | |
| | 3 | 11.2794 | | 0.18% | |
| III (A-2) | 1 | 11.2735 | | 0.23% | 0.21% |
| | 2 | 11.2746 | | 0.22% | |
| | 3 | 11.2794 | | 0.18% | |
| IV (A-2) | 1 | 11.2918 | | 0.07% | 0.14% |
| | 2 | 11.2982 | | 0.02% | |
| | 3 | 11.2615 | | 0.34% | |
| V (A-2) | 1 | 11.2499 | | 0.44% | 0.24% |
| | 2 | 11.2778 | | 0.20% | |
| | 3 | 11.2902 | | 0.09% | |
| I (B-2) | 1 | 11.0882 | 11.232 | 1.28% | 1.27% |
| | 2 | 11.092 | | 1.25% | |
| | 3 | 11.0863 | | 1.30% | |
| II (B-2) | 1 | 11.1129 | | 1.06% | 1.07% |
| | 2 | 11.1036 | | 1.14% | |
| | 3 | 11.1198 | | 1.00% | |
| III (B-2) | 1 | 11.108 | | 1.10% | 1.10% |
| | 2 | 11.107 | | 1.11% | |
| | 3 | 11.1093 | | 1.09% | |
| IV (B-2) | 1 | 11.1305 | | 0.90% | 0.81% |
| | 2 | 11.1219 | | 0.98% | |
| | 3 | 11.1708 | | 0.54% | |
| V (B-2) | 1 | 11.087 | | 1.29% | 1.32% |
| | 2 | 11.0963 | | 1.21% | |
| | 3 | 11.0673 | | 1.47% | |
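Each Error entry in Table 3 is the relative deviation of a calculated board length from the theoretical length, and each Average Error is the mean over the three repetitions at a position. A short sketch reproducing the first block (position I, board A-1):

```python
def relative_error(calculated, theoretical):
    """Percentage deviation of a calculated length from the theoretical one."""
    return abs(calculated - theoretical) / theoretical * 100.0

# Position I, board A-1: three repeated measurements from Table 3
theoretical = 11.3
measurements = [11.5108, 11.3683, 11.3927]

errors = [relative_error(m, theoretical) for m in measurements]
average = sum(errors) / len(errors)
# errors round to 1.87%, 0.60%, 0.82%; average rounds to 1.10%, as in Table 3
```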


© 2023 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).

## Share and Cite

**MDPI and ACS Style**

Chen, X.-T.; Liu, C.-S.; Yen, J.-H.
Development of Water-Wheel Tail Measurement System Based on Image Projective Transformation. *Water* **2023**, *15*, 3889.
https://doi.org/10.3390/w15223889
