Article

Development of a Cost-Effective UUV Localisation System Integrable with Aquaculture Infrastructure

1 Department of Mechanical Engineering, Auckland University of Technology, Auckland 1010, New Zealand
2 Blue Economy Cooperative Research Centre, Launceston 7248, Australia
3 The New Zealand King Salmon Company, Nelson 7011, New Zealand
* Author to whom correspondence should be addressed.
J. Mar. Sci. Eng. 2026, 14(2), 115; https://doi.org/10.3390/jmse14020115
Submission received: 19 November 2025 / Revised: 10 December 2025 / Accepted: 19 December 2025 / Published: 7 January 2026
(This article belongs to the Special Issue Infrastructure for Offshore Aquaculture Farms)

Abstract

In many aquaculture farms, Unmanned Underwater Vehicles (UUVs) are being deployed to perform dangerous and time-consuming repetitive tasks (e.g., fish net-pen visual inspection) on behalf of, or in collaboration with, farm operators. Most are remotely operated, and one of the main barriers to deploying them autonomously is UUV localisation. Specifically, the cost of the localisation sensor suite, sensor reliability in constrained operational workspaces, and the return on investment (ROI) for the large initial outlay on the UUV and its localisation system hinder R&D work and the adoption of autonomous UUV deployment on an industrial scale. The proposed system, which leverages AprilTag (a fiducial marker used as a frame of reference) detection, provides cost-effective UUV localisation for initial trials of autonomous UUV deployment, requiring only minor modifications to the aquaculture infrastructure. With such a cost-effective approach, UUV R&D engineers can demonstrate and validate the advantages and challenges of autonomous UUV deployment to farm operators, policymakers, and governing authorities, supporting informed decision-making for the future large-scale adoption of autonomous UUVs in aquaculture. Initial validation of the proposed cost-effective localisation system indicates that centimetre-level accuracy can be achieved with a single monocular camera and only 10 AprilTags, without requiring physical measurements, in a 115.46 m3 laboratory workspace under various lighting conditions.

1. Introduction

UUV localisation is one of the most challenging technical problems hindering the large-scale adoption of autonomous UUVs in aquaculture or any constrained underwater operational workspace. Although autonomous UUV deployments are reported in oceanographic research, such as bathymetry scanning, environmental monitoring, and defence applications [1,2], they are not widely adopted in constrained operational workspaces (e.g., aquaculture). An open workspace (e.g., the open ocean in oceanographic research) allows a large error tolerance in UUV localisation. This is not the case for a constrained workspace (e.g., an aquaculture farm with mooring lines and other infrastructure), where deficiencies in the precision, accuracy, and reliability of UUV localisation create safety hazards for the aquaculture infrastructure, the human workforce, and the UUV itself. With such concerns, the potential benefits of adopting autonomous UUVs in aquaculture cannot be fully demonstrated to farm operators, policymakers, and governing authorities. Conversely, without financial backing and strong interest from industry and government, which in turn depend on further evidence supporting these benefits, researchers cannot expand R&D work on autonomous deployment to the large-scale validation needed to produce convincing evidence of technical, environmental, and commercial value [3]. A cost-effective UUV localisation system is therefore one way to address this bottleneck in technology transfer from research to commercial-scale deployment.
Autonomous UUV deployment can benefit aquaculture, or any underwater operation in a constrained workspace such as offshore wind farms and the oil and gas industry, in terms of productivity, reliability, and economically viable frequent operations. For instance, one of the most relevant tasks for autonomous UUVs in aquaculture is the routine visual inspection of fish net-pens [4,5]. The New Zealand King Salmon (NZKS) company, the world’s largest producer of King Salmon, deploys a remotely operated underwater vehicle (ROV, a type of UUV) in its fish farms for net-pen visual inspection and cleaning. For a square net-pen of 30 m × 30 m × 15 m, these tasks take 2.5 h to complete. Recently, NZKS started its Blue Endeavour project, the first-of-its-kind offshore salmon farm in New Zealand, located 7 km off Cape Lambert in the Cook Strait [6]. Aiming to harvest 10,000 metric tonnes annually, a total of twenty circular floating flexible net-pens, each with a diameter of 53.5 m and a depth of 15 m, are configured into two blocks of a two-by-five layout. In performing fish net-pen inspection and cleaning on such a large scale, autonomous UUV deployment is more sustainable and economical in the long term, given the shortage of trained workers available for offshore farms and the need for safety, productivity, and reliability in a harsh working environment.
There is extensive ongoing research on the autonomy and localisation of UUVs for various applications [4,7,8,9,10,11]. However, to the best of our knowledge, there are no fully autonomous large-scale UUV deployments in the aquaculture industry beyond preliminary trials. The standard sensor suite for UUV localisation (e.g., Ultra-Short Baseline (USBL), Doppler Velocity Logger (DVL), Inertial Measurement Unit (IMU), Attitude Heading Reference System (AHRS), and Inertial Navigation System (INS)) is very costly, depending on accuracy, reliability, and performance grade (hobby to industrial). As shown in Table 1, some industry-grade sensors cost substantially more than a hobby-grade UUV (e.g., the BlueROV2 Heavy Configuration, costing USD 6700), and several of them, except for the SPRINT-NAV U, must be fused to acquire UUV localisation. The large initial capital investment in sensors for R&D work on UUV autonomy and localisation is therefore one of the main concerns, in addition to actual sensor performance in real-world operations. In other words, the cost of an industrial-grade UUV with an industrial-grade sensor suite exceeds the project budgets of many research institutes, except those focused on marine research or system development/integration. Since hobby-grade UUVs are affordable, a cost-effective localisation system would enable more research outputs supporting the potential benefits of autonomous UUV deployment in the aquaculture industry. These research capabilities and demonstrations would, in turn, attract aquaculture farm operators, policymakers, and government authorities to invest further in UUV research for aquaculture.
In this work, a cost-effective UUV localisation system is proposed using AprilTags, fiducial markers [25], which require minimal time and effort to set up, with only minor modifications (e.g., the fixture for AprilTags) to the existing infrastructure, unlike other AprilTags-based localisation approaches [26,27]. The proposed system is suitable for deployment in aquaculture with various fish pen designs, such as floating or semi-submersible rigid cages, floating flexible fish net-pens with additional rigid body structures, and a closed containment tank system, and is certainly suitable for a controlled laboratory environment. AprilTags with their fixtures must be attached within the UUV-operating zone of the existing infrastructure. For the whole localisation deployment, there are three steps, namely, (1) AprilTags installation, (2) AprilTags extraction and data-logging of the relative poses of AprilTags for frame transformation, and (3) AprilTags poses publishing and localisation. Among these three steps, only the first step requires slightly more time for the manual installation of AprilTags, while the remaining steps can be completed in just a few minutes. The novelties and contributions of the proposed method are listed as follows.
  • No measurements are required for the initial AprilTag installation (Step 1); the only requirement is that each consecutive pair of AprilTags is simultaneously visible to the UUV’s camera. This saves a substantial amount of installation time.
  • The proposed extraction algorithm (Step 2) captures and updates, in real-time, the relative poses of all AprilTag pairs at each user confirmation and subsequently performs data-logging at the end.
  • The proposed localisation algorithm (Step 3) publishes the relative poses of AprilTags using the previously logged data and performs the UUV’s localisation using the real-time camera feedback.
  • The overall deployment of the localisation system is simple, cost-effective, and time-efficient.
  • This research work provides the initial phase of the full deployment procedure and algorithm development towards the actual underwater deployment in aquaculture infrastructure in the near future.
The remaining part of this article is structured as follows. Section 2 presents the types of aquaculture infrastructure that are suitable for the proposed cost-effective UUV localisation system. Section 3 details the deployment of the proposed UUV localisation system, covering three main areas: AprilTags installation, AprilTags extraction and data-logging, and AprilTags pose publishing and localisation. Section 4 explains the experimental setup. Section 5 discusses the experimental results. Section 6 summarises the research findings and outlines future research directions.

2. Aquaculture Infrastructure

In this work, the operational definition of aquaculture infrastructure is the structure inside which fish (e.g., salmon) are cultivated. Generally, it consists of a floating collar, a fish net-pen, a sinker tube, a side rope, a mooring system, and a buoy. There are different types of aquaculture infrastructure, so not all aforementioned components are installed in all types. The following four types of aquaculture infrastructure are considered suitable for deploying the proposed cost-effective localisation system.

2.1. Floating/Semi-Submersible Rigid Cage

Usually, floating/semi-submersible rigid cage designs, such as Ocean Farm 1 and Havfarm 1, are built to operate in a harsh open ocean environment [28,29]. A simplified illustration of fish net-pen setup inside the rigid cage design is shown in Figure 1. Such designs provide the rigid body structures to which AprilTags can be mounted with the additional fixtures.

2.2. Floating Flexible Fish Net-Pen with a Rigid Body Structure

Many aquaculture farms are designed with floating flexible fish net-pens, as shown in Figure 2. Using the floating flexible fish net-pen design, the New Zealand King Salmon company started the trial phase of its first-of-its-kind offshore salmon farm, located 7 km off Cape Lambert in the Cook Strait, in June 2025 [30]. As the floating collars and other parts of the infrastructure are attached not to a rigid body but only to the sea floor via anchor lines, the existing infrastructure is not suitable for AprilTags installation. Additional rigid body structures, acting as inertial frames, need to be installed so that the AprilTags remain stationary during operation.

2.3. Closed Containment Tank System

Another cage design suitable for the proposed cost-effective localisation system is the closed containment tank system, such as ECO-ARK® AQUACULTURE 4.0 by Aquaculture Centre of Excellence Pte Ltd. [31]. Due to the rigid body structure of the tank in a controlled environment, it is one of the most suitable infrastructures for the proposed localisation system. Its simplified version is illustrated in Figure 3.

2.4. Controlled Laboratory Environment

Just like any engineering research and development work, the proposed cost-effective localisation system must undergo concept generation and testing in a controlled laboratory environment, followed by field trial validation on an industrial scale. For this purpose, another suitable infrastructure is a controlled laboratory environment, as shown in Figure 4.
For all the aforementioned infrastructure, relatively minor modifications to the existing infrastructure are required for AprilTags installation, with the level of modification depending on the type of aquaculture infrastructure. The focus of this article is not on the AprilTags installation and mechanism design but on the algorithm implementation. As the AprilTags act as inertial frames against which the UUV’s pose is measured in real time via its camera, little to no movement of the AprilTags is an essential requirement for deploying the proposed localisation system.

3. UUV Localisation System Deployment

As mentioned previously, there are three main steps for the deployment of the proposed cost-effective localisation system, namely (1) AprilTags installation, (2) AprilTags extraction and data-logging, and (3) AprilTags pose publishing and localisation. In the following subsections, all three steps, together with the overall deployment procedure, will be detailed, along with their advantages.

3.1. AprilTags Installation

Through simple matrix manipulations of homogeneous transformations, and without the need for visual odometry-based landmark SLAM [32], the proposed AprilTags installation strategy involves no manual measurement of each AprilTag’s pose. In other words, during installation, the only requirement is that each consecutive pair of AprilTags (e.g., Tags [0, 1], then Tags [1, 2]) is simultaneously visible to the UUV’s camera. It is essential to note that this requirement applies only to Step (2); real-time UUV localisation requires only one visible AprilTag at any given instant.
There are a few possible scenarios of AprilTags installation, such as a setup with multiple cameras and multiple AprilTags as shown in Figure 5, a setup with a single camera, a single AprilTag attached to the UUV, and multiple stationary AprilTags, as shown in Figure 6, or a setup with a camera attached to the UUV and multiple stationary AprilTags as shown in Figure 7. Scenario 1 is suitable for tracking an object of interest (OOI) with multiple AprilTags configured around it within the field of view of the cameras [33]. However, for underwater applications, this scenario requires a tethered solution to transfer the estimated OOI’s pose or to transfer the transformation data from cameras to the OOI for onboard processing. Scenario 2 also requires a tethered solution to transfer the data of transformations between the single AprilTag attached to the UUV and other stationary AprilTags. Scenario 3 is suitable when the UUV operation requires a tetherless solution, as the transformation data from the camera is directly accessible by the UUV. Therefore, the UUV must be equipped with sufficient computing capacity. In this work, the last scenario will be demonstrated as the UUV’s tethered cable can be entangled within the constrained operation workspace of an aquaculture infrastructure.
Regardless of the aforementioned scenarios, the proposed algorithm for AprilTag extraction and data-logging allows for the ease of installing AprilTags without the need for manual measurements of each AprilTag’s relative pose. This factor substantially reduces the installation time. The only requirement for AprilTags installation is to place a pair of AprilTags within the field of view of the UUV’s camera at each instant of time, and so is the same for other subsequent pairs, as shown in Figure 8. This requirement is only compulsory for AprilTags extraction and data-logging, and it is not necessary for real-time localisation, which can be carried out once one of the AprilTags is captured in the field of view of the UUV’s camera.

3.2. AprilTags Extraction and Data-Logging

AprilTags extraction and data-logging of the relative poses of AprilTags can be carried out using the UUV’s camera. Alternatively, if a camera with better specifications (e.g., higher resolution, a larger field of view, or a higher frame rate) is available, it can be used in place of the UUV’s camera. In either approach, the relative poses of the AprilTags are stored in a YAML (YAML Ain’t Markup Language) file at the end of the extraction process. The stored YAML file can be utilised later for AprilTags pose publishing and localisation. The overall process of AprilTags extraction and data-logging is illustrated in Figure 8.
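As a concrete illustration of the data-logging step, the registered relative poses could be serialised with PyYAML as sketched below. The file name, the tag-pair keys, and the position-plus-quaternion layout are assumptions for illustration only, not the authors’ actual schema.

```python
import yaml  # PyYAML

# Hypothetical file layout: one entry per register (Tag i -> Tag i+1),
# each storing a relative position and a quaternion (x, y, z, w).
registers = {
    "tag_0_to_1": {"position": [1.2, 0.0, 0.05],
                   "quaternion": [0.0, 0.0, 0.0, 1.0]},
    "tag_1_to_2": {"position": [0.95, -0.1, 0.0],
                   "quaternion": [0.0, 0.0, 0.7071, 0.7071]},
}

# Data-logging at the end of Step (2) ...
with open("apriltag_registers.yaml", "w") as f:
    yaml.safe_dump(registers, f, default_flow_style=False)

# ... and loading in Step (3) for pose publishing.
with open("apriltag_registers.yaml") as f:
    loaded = yaml.safe_load(f)
```

Loading with `safe_load` restores the nested structure, so the pose-publishing node can rebuild the static transformations without re-running the extraction.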
For $n$ AprilTags, there are $n-1$ registers that the user needs to confirm by pressing the ’Enter’ key after orienting the camera to fully capture each pair. For the $i$-th of these $n-1$ registers, two homogeneous transformations are generated by the existing AprilTag detection system [26]:
$H^{c}_{i} = \begin{bmatrix} R^{c}_{i} & r^{c}_{i} \\ 0_{1\times 3} & 1 \end{bmatrix} \in \mathbb{R}^{4\times 4}$, (1)
$H^{c}_{i+1} = \begin{bmatrix} R^{c}_{i+1} & r^{c}_{i+1} \\ 0_{1\times 3} & 1 \end{bmatrix} \in \mathbb{R}^{4\times 4}$ (2)
where $R^{c}_{i}$ and $R^{c}_{i+1}$ are the rotation matrices from Frame $\{i\}$ and Frame $\{i+1\}$ to Frame $\{c\}$, respectively, and $r^{c}_{i}$ and $r^{c}_{i+1}$ are the position vectors of the origins ($O_{i}$, $O_{i+1}$) of Frame $\{i\}$ and Frame $\{i+1\}$ with respect to Frame $\{c\}$, respectively. Note: For ease of readability, the AprilTag Frame $\{a_i\}$ is denoted as $\{i\}$, although the full description is used in the figures.
For the $i$-th register in constructing the full set of static transformations between AprilTags, $H^{c}_{i+1}$ and $H^{c}_{i}$ in Equations (1) and (2) are further manipulated as follows.
$H^{i}_{i+1} = H^{i}_{c}\, H^{c}_{i+1} \in \mathbb{R}^{4\times 4}$ (3)
where
$H^{i}_{i+1} = \begin{bmatrix} R^{i}_{i+1} & r^{i}_{i+1} \\ 0_{1\times 3} & 1 \end{bmatrix}, \quad R^{i}_{i+1} = R^{i}_{c}\, R^{c}_{i+1}, \quad r^{i}_{i+1} = R^{i}_{c}\left( r^{c}_{i+1} - r^{c}_{i} \right)$
Using Equation (3) for the $i$-th register over $n$ AprilTags results in acquiring the $n-1$ registers $H^{0}_{1}, H^{1}_{2}, H^{2}_{3}, \ldots, H^{n-2}_{n-1}, H^{n-1}_{n}$, giving the full set of static transformations between each AprilTag pair.
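The register computation of Equation (3) can be sketched in a few lines of NumPy. This is an illustrative reimplementation under the paper’s definitions, not the authors’ released code; the toy poses at the end are arbitrary values for checking.

```python
import numpy as np

def make_H(R, r):
    """Assemble a 4x4 homogeneous transform from rotation R (3x3) and translation r (3,)."""
    H = np.eye(4)
    H[:3, :3] = R
    H[:3, 3] = r
    return H

def relative_register(H_i_c, H_ip1_c):
    """Equation (3): H^i_{i+1} = (H^c_i)^{-1} H^c_{i+1},
    using the closed-form inverse of a rigid transform."""
    R, r = H_i_c[:3, :3], H_i_c[:3, 3]
    H_c_i = make_H(R.T, -R.T @ r)  # inverse of H^c_i
    return H_c_i @ H_ip1_c

# Toy check: both tags face the camera, Tag i 2 m ahead, Tag i+1 shifted 1 m in x,
# so the relative register should carry a translation of about [1, 0, 0].
H_tag_i = make_H(np.eye(3), np.array([0.0, 0.0, 2.0]))
H_tag_ip1 = make_H(np.eye(3), np.array([1.0, 0.0, 2.0]))
H_rel = relative_register(H_tag_i, H_tag_ip1)
```

The closed-form inverse avoids a general 4×4 matrix inversion and keeps the result exactly rigid.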

3.3. AprilTags Pose Publishing and Localisation

In the previous subsection, the acquisition of the $n-1$ registers $H^{0}_{1}, H^{1}_{2}, H^{2}_{3}, \ldots, H^{n-2}_{n-1}, H^{n-1}_{n}$ for $n$ AprilTags was presented in detail. Although these registers provide the static transformation between each AprilTag pair, the transformation required for localisation is $H^{I}_{i}$, where $\{I\}$ is the world inertial frame and $\{i\}$ is whichever AprilTag frame is currently captured in the field of view of the UUV’s camera. Since all $n-1$ registers are static transformations, every possible $H^{I}_{i}$ can be pre-computed as a constant matrix at the start of executing the localisation algorithm, avoiding unnecessary real-time computation of constant matrices. To compute all possible $H^{I}_{i}$, the world inertial frame must first be defined. Suppose Frame $\{I\}$ is the world inertial frame and $H^{I}_{0} = I_4$, where $I_4 \in \mathbb{R}^{4\times 4}$ is the identity matrix,
$H^{I}_{i} = H^{0}_{i} = \prod_{k=1}^{i} H^{k-1}_{k}, \quad i \in \{1, 2, \ldots, n\}$ (4)
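Pre-computing every $H^{I}_{i}$ as in Equation (4) is a running matrix product over the logged registers. The sketch below assumes the registers are supplied as an ordered list of 4×4 NumPy arrays, which is an implementation convenience rather than the authors’ stated data structure.

```python
import numpy as np

def world_transforms(registers):
    """Equation (4): H^I_i = H^0_1 H^1_2 ... H^{i-1}_i, with Frame {I}
    anchored at Tag 0 (H^I_0 = identity). `registers` is the ordered
    list [H^0_1, H^1_2, ...] of 4x4 arrays logged in Step (2)."""
    H_I = [np.eye(4)]  # H^I_0 = I_4
    for H_step in registers:
        H_I.append(H_I[-1] @ H_step)  # chain one more register
    return H_I
```

Because all registers are static, this runs once at node start-up and the results are cached as constants, exactly as the text recommends.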
During real-time execution of the localisation algorithm, $H^{c}_{i}$ in Equation (1) becomes available once any $i$-th AprilTag is in the field of view of the UUV’s camera. Therefore, the UUV’s pose with respect to Frame $\{I\}$ can be computed in real time, using Equation (4), as follows.
$H^{I}_{b} = H^{I}_{i}\, H^{i}_{c}\, H^{c}_{b}$ (5)
where
$H^{i}_{c} = \begin{bmatrix} (R^{c}_{i})^{T} & -(R^{c}_{i})^{T} r^{c}_{i} \\ 0_{1\times 3} & 1 \end{bmatrix}, \quad H^{c}_{b} = \begin{bmatrix} R^{c}_{b} & r^{c}_{b} \\ 0_{1\times 3} & 1 \end{bmatrix}$
From Equation (5), $R^{I}_{b}$ and $r^{I}_{b}$ can be obtained. Subsequently, the quaternion and Euler angles of the UUV’s orientation with respect to Frame $\{I\}$ can be computed from $R^{I}_{b}$ [34]. The position vector of the origin $O_b$ of Frame $\{b\}$ with respect to Frame $\{I\}$ is given directly by $r^{I}_{b}$.
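The real-time pose update of Equation (5) reduces to two matrix products plus one closed-form rigid-transform inverse. In the sketch below, the camera-to-body offset $H^{c}_{b}$ defaults to identity (camera frame taken as the body frame), an assumption for illustration; the toy poses are arbitrary check values.

```python
import numpy as np

def inv_rigid(H):
    """Closed-form inverse of a rigid homogeneous transform."""
    R, r = H[:3, :3], H[:3, 3]
    Hinv = np.eye(4)
    Hinv[:3, :3] = R.T
    Hinv[:3, 3] = -R.T @ r
    return Hinv

def localise(H_i_I, H_i_c, H_b_c=None):
    """Equation (5): H^I_b = H^I_i (H^c_i)^{-1} H^c_b.
    H_b_c defaults to identity, i.e. the camera frame is the body frame."""
    if H_b_c is None:
        H_b_c = np.eye(4)
    return H_i_I @ inv_rigid(H_i_c) @ H_b_c

# Toy check: Tag i sits 3 m along the world x-axis; the camera sees it
# 2 m straight ahead, so the camera/body must be at world [3, 0, -2].
H_i_I = np.eye(4); H_i_I[0, 3] = 3.0   # pre-computed H^I_i
H_i_c = np.eye(4); H_i_c[2, 3] = 2.0   # live detection H^c_i
H_b_I = localise(H_i_I, H_i_c)
```
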
The detailed step-by-step process from AprilTags installation to localisation is presented above. Another plausible question from an implementation perspective is how to determine which AprilTag should be used when multiple tags are within the field of view of the UUV’s camera. As AprilTag detection is heavily affected by the camera’s distance from the AprilTag and its viewing angle [35,36], the norm distance $\lambda$ and orientation magnitude $\kappa$ are used to select the most reliable AprilTag transformation $H^{i}_{c}$ among them, as shown below.
$i = \operatorname*{arg\,min}_{i \in \{1, \ldots, n\}} \left( \lambda_i, \kappa_i \right)$ (6)
where
$r^{i}_{c} \rightarrow (x_i, y_i, z_i), \quad \lambda_i = \sqrt{x_i^2 + y_i^2 + z_i^2}; \qquad R^{i}_{c} \rightarrow (\phi_i, \theta_i, \psi_i), \quad \kappa_i = \sqrt{\phi_i^2 + \theta_i^2 + \psi_i^2}$
Finally, using Equations (5) and (6), the UUV localisation or the estimation of the UUV’s pose with respect to Frame { I } can be carried out solely using the UUV’s onboard camera.
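The tag-selection rule of Equation (6) can be sketched as below. Since the text does not state how $\lambda_i$ and $\kappa_i$ are jointly minimised, the lexicographic ordering used here (distance first, then orientation magnitude) is one plausible interpretation, not the authors’ definitive rule.

```python
import numpy as np

def lam_kappa(r, rpy):
    """lambda_i and kappa_i from the translation r = (x, y, z) and the
    roll/pitch/yaw angles of H^i_c, per the definitions under Equation (6)."""
    return float(np.linalg.norm(r)), float(np.linalg.norm(rpy))

def select_tag(criteria):
    """Equation (6): `criteria` maps tag id -> (lambda_i, kappa_i).
    Python's tuple ordering picks the closest tag, breaking ties on
    viewing-angle magnitude (an assumed combination of the two criteria)."""
    return min(criteria, key=lambda i: criteria[i])

# Toy check: Tag 1 is closer than Tag 0, so it should be selected.
best = select_tag({0: (2.0, 0.5), 1: (1.5, 0.9)})
```

A weighted sum of the two criteria would be an equally valid reading of the joint arg-min; only the definitions of $\lambda_i$ and $\kappa_i$ come from the paper.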

3.4. The Overall Deployment Procedure

The overall deployment procedure for the UUV localisation is illustrated in Figure 9. After installing the AprilTags in the desired poses as shown in Figure 10, $H^{i}_{i+1}$ in Equation (3) is extracted by moving the camera as demonstrated in Figure 8 for the $n-1$ registers, and the poses (position and quaternion) are then stored in a YAML file. These extractions are performed in a ROS node and live-updated in RViz (a ROS visualisation tool), as shown in Figure 11. Once the extraction and data-logging are completed, the relevant ROS node is shut down.
Using the previously generated YAML file, the extracted AprilTags poses with respect to Frame $\{I\}$, as given by Equation (4), are published by another ROS node. In the same ROS node, the UUV localisation is performed using Equation (5), and the UUV’s pose is published for real-time visualisation in RViz.
As summarised above, the overall deployment procedure is specifically designed to be simple, cost-effective, and time-efficient. Among all deployment steps, installing AprilTags is the only one that is relatively time-consuming. However, compared with the conventional AprilTags localisation approach, the proposed method requires no physical measurements, resulting in significant time savings. Moreover, AprilTags extraction and data-logging take only about 2 min. After this step, localisation can be executed immediately. The following section describes the experimental setup in detail.

4. Experimental Setup

For the experiments, a single ZED 2i camera and 10 AprilTags, installed inside the 115.46 m3 workspace, are used. The AprilTags are installed on both horizontal and vertical surfaces, and under various lighting conditions, as shown in Figure 10. The printed AprilTags are matt-laminated, effectively minimising the glare even under the direct ceiling lighting.
The specifications of the camera, AprilTags, workspace, ROS 2, and laptop are reported in Table 2. The camera intrinsic parameters are provided by the software development kit (SDK) of the ZED 2i camera from Stereolabs (San Francisco, New York, USA and Paris, France). For the AprilTag detection, the library ‘lib-dt-apriltags’, a Python binding for the AprilTags 3 library developed by AprilRobotics, is called in the ROS nodes, using Python 3.10.12. The dimensions of the workspace are produced using the homogeneous transformation matrices. To gauge the relative position-estimation error, the ground-truth measurements between the relative AprilTags were taken as shown in Table 3.
It is important to note that the current experiment is not conducted underwater due to the limited access to a large underwater environment. However, the overall deployment procedure remains the same for the underwater deployment. Therefore, this research work provides the initial phase of the full deployment procedure and algorithm development towards the actual underwater deployment in aquaculture infrastructure in the near future.

5. Results and Discussion

The overall deployment procedure for the localisation is carried out as shown in Figure 9. As presented in Section 3, the results of the three main deployment steps, namely (1) AprilTags installation, (2) AprilTags extraction and data-logging, and (3) AprilTags pose publishing and localisation, will be discussed in this section. The recorded videos of the experiments for AprilTags extraction and data-logging and AprilTags pose publishing and localisation are available at this hyperlink: https://youtube.com/playlist?list=PLG3nO3TEqwOmjvLVynK54pUugt_U1ChFA&si=tKaJysXCAkjKN2rn (accessed on 18 November 2025).

5.1. AprilTags Installation

Due to the application-oriented nature of this work, the camera and AprilTag specifications reported in Table 2 play a crucial role in the AprilTags installation step. To accommodate future deployment in a large aquaculture environment, the tag family tag36h11 is chosen, as it features 587 tags with unique IDs (0–586). In this experiment, 10 tags (IDs: 0–9) are utilised; each tag measures 0.224 m and is printed on standard A4 paper. A preliminary test found that AprilTag detection using a ZED 2i camera provides the estimated pose of an AprilTag of this size up to a working distance of 3 m.
Although the ZED 2i camera provides stereo vision, it is treated as a monocular camera, and the visual feedback from the left lens is used for AprilTag detection. During the preliminary test of AprilTag detection, it was found that a higher FPS with lower resolution is more reliable than a lower FPS with higher resolution under fast motion. Therefore, the ZED 2i camera is configured for 60 FPS at 720p resolution. As recalibration of the ZED 2i camera is not recommended by the manufacturer unless necessary, the default calibration parameters are acquired via its SDK. Note that, for future underwater deployment, recalibration of the ZED 2i camera is essential.
After all the aforementioned preliminary tests, 10 AprilTags are installed on the horizontal and vertical surfaces of the laboratory workspace of 5.79 m [W] × 8.67 m [L] × 2.30 m [H] under different lighting conditions, as shown in Figure 10. Compared with the conventional approach, the proposed method requires no physical measurements of the relative poses of the 10 AprilTags, so the installation time is substantially reduced. The only requirement is that each pair of AprilTags must be visible in the ZED 2i camera view so that Step (2), AprilTags extraction and data-logging, can be carried out properly, as shown in Figure 8.

5.2. AprilTags Extraction and Data-Logging

Once the AprilTags installation is completed, the AprilTags extraction and data-logging can be carried out by a single person carrying a laptop and a ZED 2i camera, following the procedure illustrated in Figure 8. The results of the extracted and data-logged poses of AprilTags, installed inside the 115.46 m3 workspace, are reported in Figure 11. The recorded video on the real-time update of the extracted and data-logged poses (frames) of the AprilTags is available at this hyperlink: https://youtube.com/playlist?list=PLG3nO3TEqwOmjvLVynK54pUugt_U1ChFA&si=tKaJysXCAkjKN2rn (accessed on 18 November 2025).
During the AprilTags extraction test, it was found that AprilTag detection on the first available frame is adversely affected by motion blur and unstable image processing conditions. Therefore, the initially developed automated extraction procedure in ROS is not used in this work, and the registration of each AprilTag pair is performed only after user confirmation is provided by pressing the ’Enter’ key. At the end of registering all 10 AprilTags, the data-logging process is software-automated to save all poses into a YAML file. The entire process of extracting and data-logging the poses of all AprilTags installed inside the 115.46 m3 workspace takes approximately 2 min to complete.
The ground-truth poses of the 10 AprilTags installed inside the 115.46 m3 workspace are not readily available. To gauge the relative position-estimation error (%) defined in Equation (7), ground-truth manual measurements of the position between Tag ID $i$ and Tag ID $i+1$ were taken and compared with the values obtained from the AprilTags extraction; the results are reported in Table 3. The results show that relative position estimation with centimetre-level accuracy (min. error: 0.03744 cm, or 0.01%; max. error: 8.42708 cm, or 4.28%) can be achieved via the proposed AprilTags extraction. It is essential to note that inaccuracies in manual measurement are unavoidable, which is the primary reason why $e_{\lambda^{0}_{9}}$ exhibits the largest error: errors propagate through the chained transformations $H^{i}_{i+1}$ from the AprilTags extraction when compared against the direct manual measurement between Tag ID 9 and Tag ID 0. The relative orientation (roll, pitch, yaw), however, cannot be measured manually without an advanced and expensive 3D localisation system. Therefore, no quantitative orientation-estimation errors are reported; Figure 10 and Figure 11 allow the AprilTag frames to be observed qualitatively.
$e_{\lambda^{i}_{i+1}} = \dfrac{\left| \hat{\lambda}^{i}_{i+1} - \lambda^{i}_{i+1} \right|}{\hat{\lambda}^{i}_{i+1}} \times 100$ (7)
where $\lambda^{i}_{i+1}$ is the position norm of the translation part of $H^{i}_{i+1}$ resulting from the AprilTags extraction, and $\hat{\lambda}^{i}_{i+1}$ is the position norm resulting from the manual measurement.
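Equation (7) is a one-line computation; the sketch below reproduces it for checking reported values. The sample numbers are illustrative, not taken from Table 3.

```python
def relative_position_error_pct(lam_measured, lam_estimated):
    """Equation (7): percentage error of the estimated inter-tag distance
    against the manually measured ground truth (both in metres)."""
    return abs(lam_measured - lam_estimated) / lam_measured * 100.0

# Illustrative values: a 2.000 m measured separation estimated as 1.900 m.
err = relative_position_error_pct(2.0, 1.9)
```
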
Note: For underwater deployment, the same procedure for AprilTags extraction and data-logging can be carried out via remotely controlled operation on a UUV equipped with a ROS-installed Ubuntu machine and a ZED 2i camera (or any underwater camera with calibrated intrinsic parameters).

5.3. AprilTags Pose Publishing and Localisation

As shown in Figure 9, provided that the poses of the AprilTags remain unchanged after their initial registration, the same YAML file can be used to publish the static homogeneous transformations among the AprilTags. The results of publishing the poses of AprilTags, installed inside the 115.46 m3 workspace, and localisation using those AprilTags are reported in Figure 12. The recorded video on the real-time update of localisation using AprilTags is available at this hyperlink: https://youtube.com/playlist?list=PLG3nO3TEqwOmjvLVynK54pUugt_U1ChFA&si=tKaJysXCAkjKN2rn (accessed on 18 November 2025).
The full localisation path with its start and end points is illustrated in Figure 13. Noise is also observed in the AprilTags detection output because no filtering or noise-reduction procedures have been incorporated into the current implementation. Due to the lack of a ground-truth path (e.g., from a 3D motion-capture system), the relative localisation error cannot be quantified directly. However, it can be inferred from Table 3 that the proposed localisation system can provide centimetre-level accuracy. Additionally, based on the recorded video with real-time localisation updates, it can be qualitatively concluded that the proposed localisation method performs well, with minimal noise and small detection error.
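As one example of the filtering left to future work, an exponential moving average over successive position estimates would suppress detection jitter at the cost of a small lag. This is a generic smoothing sketch, not part of the authors’ implementation, and the smoothing factor is an arbitrary illustrative choice.

```python
import numpy as np

def ema_filter(positions, alpha=0.3):
    """Exponential moving average over a sequence of 3D position estimates.
    alpha (an arbitrary illustrative value) trades smoothness for lag:
    smaller alpha smooths more but responds more slowly."""
    smoothed = [np.asarray(positions[0], dtype=float)]
    for p in positions[1:]:
        smoothed.append(alpha * np.asarray(p, dtype=float)
                        + (1.0 - alpha) * smoothed[-1])
    return smoothed
```

A Kalman filter driven by a UUV motion model would be the more principled choice for the underwater deployment the paper anticipates.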
In summary, three main factors are validated in this work. Firstly, since no physical measurements of the relative poses of the AprilTags are required, the installation can be completed with minimal setup time. Secondly, the proposed AprilTag-extraction algorithm requires the user to confirm each AprilTag pair by pressing the ’Enter’ key only $n-1$ times for $n$ AprilTags; this second factor therefore complements the first. Thirdly, the matt-laminated AprilTags can serve as a cost-effective localisation target under different lighting conditions in the laboratory workspace. For underwater applications, the same deployment procedure shown in Figure 9 can be carried out using matt-laminated or transparent anti-fouling-coated AprilTags, along with a recalibrated underwater camera mounted on a UUV.

6. Conclusions

In this work, a simple and cost-effective localisation method using AprilTags, with minimal setup time, is proposed and validated under various lighting conditions. The results show that, in the 115.46 m3 laboratory workspace, AprilTag-based localisation achieves centimetre-level accuracy even before any additional filtering algorithm is applied. The overall deployment procedure is detailed and demonstrated in the laboratory workspace, and the factors to consider for future underwater deployment are presented. This cost-effective localisation work supports initial trials of autonomous UUV deployment, requiring only minor modifications to the aquaculture infrastructure. Future work will cover full-state (pose and twist) estimation of the UUV using AprilTags installed underwater, as well as filter implementations to minimise the noise due to water turbidity.

Author Contributions

Conceptualisation, L.H. and T.T.T.; methodology, L.H. and T.T.T.; software, T.T.T.; validation, L.H. and T.T.T.; formal analysis, T.T.T.; investigation, T.T.T. and L.H.; resources, L.H.; data curation, T.T.T.; writing—original draft preparation, T.T.T.; writing—review and editing, L.H. and M.A.P.; visualisation, T.T.T.; supervision, L.H.; project administration, L.H.; funding acquisition, L.H. All authors have read and agreed to the published version of the manuscript.

Funding

This research was funded by the Blue Economy Cooperative Research Centre (CRC), established and supported under the Australian Government’s CRC Program, grant number CRC-20180101.

Data Availability Statement

The data presented in this study are available from the corresponding author upon request, subject to the terms of the project contract governing the study.

Acknowledgments

The authors acknowledge the financial support of the Blue Economy Cooperative Research Centre (CRC), established and supported under the Australian Government’s CRC Program, grant number CRC-20180101. The CRC Program supports industry-led collaborations between industry, researchers, and the community. The authors also acknowledge the graduate research facilities and the Ph.D. fees scholarship of the Auckland University of Technology (AUT). We greatly appreciate the technical support of Simon Hartley, technical specialist (mechatronics) at AUT, and Adam Poloha, research assistant at the Mechatronics Lab, AUT.

Conflicts of Interest

Author Mark Anthony Preece was employed by the company New Zealand King Salmon Co. Ltd. The remaining authors declare that the research was conducted in the absence of any commercial or financial relationships that could be construed as a potential conflict of interest. The funders had no role in the design of the study; in the collection, analyses, or interpretation of data; in the writing of the manuscript; or in the decision to publish the results.

References

  1. Kragelund, S.; Walton, C.; Kaminer, I.; Dobrokhodov, V. Generalized Optimal Control for Autonomous Mine Countermeasures Missions. IEEE J. Ocean. Eng. 2020, 46, 466–496. [Google Scholar] [CrossRef]
  2. Salavasidis, G.; Munafò, A.; McPhail, S.D.; Harris, C.A.; Fenucci, D.; Pebody, M.; Rogers, E.; Phillips, A.B. Terrain-Aided Navigation with Coarse Maps—Toward an Arctic Crossing with an AUV. IEEE J. Ocean. Eng. 2021, 46, 1192–1212. [Google Scholar] [CrossRef]
  3. Tun, T.T.; Huang, L.; Preece, M.A. High-Fidelity Simulation Platform for Autonomous Fish Net-Pen Visual Inspection with Unmanned Underwater Vehicles in Offshore Aquaculture. In Proceedings of the Philip Liu Honoring Symposium on Water Wave Mechanics and Hydrodynamics; Blue Economy Symposium, International Conference on Offshore Mechanics and Arctic Engineering, Singapore, 9–14 June 2024; Volume 9, p. V009T13A014. [Google Scholar] [CrossRef]
  4. Amundsen, H.B.; Caharija, W.; Pettersen, K.Y. Autonomous ROV inspections of aquaculture net pens using DVL. IEEE J. Ocean. Eng. 2021, 47, 1–19. [Google Scholar] [CrossRef]
  5. Akram, W.; Casavola, A.; Kapetanović, N.; Miškovic, N. A visual servoing scheme for autonomous aquaculture net pens inspection using ROV. Sensors 2022, 22, 3525. [Google Scholar] [CrossRef] [PubMed]
  6. New Zealand King Salmon. Blue Endeavour. Available online: https://www.kingsalmon.co.nz/open-ocean-blue-endeavour/ (accessed on 25 September 2025).
  7. Li, Y.; Wen, M.; Wan, P.; Mu, Z.; Wu, D.; Chen, J.; Zhou, H.; Zhang, S.; Yao, H. Deformable USV and Lightweight ROV Collaboration for Underwater Object Detection in Complex Harbor Environments: From Acoustic Survey to Optical Verification. J. Mar. Sci. Eng. 2025, 13, 1862. [Google Scholar] [CrossRef]
  8. Bjerkeng, M.; Kirkhus, T.; Caharija, W.; Thielemann, J.T.; Amundsen, H.B.; Ohrem, S.J.; Grøtli, E.I. ROV Navigation in a Fish Cage with Laser-Camera Triangulation. J. Mar. Sci. Eng. 2021, 9, 79. [Google Scholar] [CrossRef]
  9. Constantinou, C.C.; Georgiades, G.P.; Loizou, S.G. A Laser Vision System for Relative 3-D Posture Estimation of an Underwater Vehicle with Hemispherical Optics. Robotics 2021, 10, 126. [Google Scholar] [CrossRef]
  10. Duecker, D.A.; Hansen, T.; Kreuzer, E. RGB-D Camera-based Navigation for Autonomous Underwater Inspection using Low-cost Micro AUVs. In Proceedings of the 2020 IEEE/OES Autonomous Underwater Vehicles Symposium (AUV), St John’s, NL, Canada, 30 September–2 October 2020; pp. 1–7. [Google Scholar]
  11. Karlsen, Ø.; Amundsen, H.B.; Caharija, W.; Martin Ludvigsen, H. Autonomous Aquaculture: Implementation of an autonomous mission control system for unmanned underwater vehicle operations. In Proceedings of the OCEANS 2021: San Diego–Porto, San Diego, CA, USA, 20–23 September 2021; pp. 1–10. [Google Scholar]
  12. Sonardyne. Micro-Ranger 2 USBL. Available online: https://shop.sonardyne.com/products/Systems/Micro%20Ranger%202%20USBL (accessed on 30 September 2025).
  13. Advanced Navigation. Subsonus. Available online: https://landing.advancednavigation.com/acoustic-navigation/usbl/subsonus/ (accessed on 30 September 2025).
  14. Teledyne Marine Technologies. TrackIt USBL System. Available online: https://www.teledynemarine.com/brands/benthos/trackit (accessed on 30 September 2025).
  15. EvoLogics GmbH. S2C R 18/34H USBL. Available online: https://www.evologics.com/product/s2c-r-18-34h-usbl-29 (accessed on 30 September 2025).
  16. Deep Trekker Inc. MicronNav USBL Positioning. Available online: https://www.deeptrekker.com/shop/products/micronnav-usbl (accessed on 30 September 2025).
  17. Cerulean Sonar. ROV Locator Bundle Mark III. Available online: https://ceruleansonar.com/product/rov-locator-mark-iii/ (accessed on 30 September 2025).
  18. Water Linked. DVL A125. Available online: https://waterlinked.com/shop/dvl-a125-178#attribute_values=47,78 (accessed on 30 September 2025).
  19. Nortek Group. DVL 500 Compact. Available online: https://www.nortekgroup.com/products/dvl500-c-300-m (accessed on 30 September 2025).
  20. Teledyne Marine Technologies. Wayfinder DVL. Available online: https://www.teledynemarine.com/wayfinder (accessed on 30 September 2025).
  21. Cerulean Sonar. Tracker 650. Available online: https://ceruleansonar.com/product/tracker650/#specs (accessed on 30 September 2025).
  22. Advanced Navigation. Boreas A70 & A90. Available online: https://landing.advancednavigation.com/imu-ahrs/fog-imu/boreas-a/#h-specifications (accessed on 30 September 2025).
  23. Norwegian Subsea. MRU Marine. Available online: https://norwegian-subsea.no/products/marine (accessed on 30 September 2025).
  24. Sonardyne. SPRINT-Nav U. Available online: https://www.sonardyne.com/product/sprint-nav-u/ (accessed on 30 September 2025).
  25. Kallwies, J.; Forkel, B.; Wuensche, H.J. Determining and Improving the Localization Accuracy of AprilTag Detection. In Proceedings of the 2020 IEEE International Conference on Robotics and Automation (ICRA), Paris, France, 31 May–31 August 2020; pp. 8288–8294. [Google Scholar] [CrossRef]
  26. Chen, J.; Sun, C.; Zhang, A. Autonomous Navigation for Adaptive Unmanned Underwater Vehicles Using Fiducial Markers. In Proceedings of the 2021 IEEE International Conference on Robotics and Automation (ICRA), Xi’an, China, 30 May–5 June 2021; pp. 9298–9304. [Google Scholar] [CrossRef]
  27. Jung, J.; Choi, H.T.; Lee, Y. Persistent Localization of Autonomous Underwater Vehicles Using Visual Perception of Artificial Landmarks. J. Mar. Sci. Eng. 2025, 13, 828. [Google Scholar] [CrossRef]
  28. Wen, X.; Sakaris, C.; Schlanbusch, R.; Ong, M.C. Numerical modelling and analysis of tendon failures in nets of semi-submersible fish cages. Ocean Eng. 2025, 325, 120768. [Google Scholar] [CrossRef]
  29. Åland, P.A.; Fjellheim, A.; Torgersen, T. Large Offshore Floating Fish Pens: Ocean Farm 1 and Havfarm 1. In Large Floating Solutions: Design, Construction, Legality of Offshore Structures and Buoyant Urbanism; Wang, B.T., Wang, C.M., Weinert, K., de Graaf-van Dinther, R., Eds.; Springer Nature: Singapore, 2025; pp. 249–273. [Google Scholar] [CrossRef]
  30. Tun, T.T.; Huang, L.; Preece, M.A. Development and High-Fidelity Simulation of Trajectory Tracking Control Schemes of a UUV for Fish Net-Pen Visual Inspection in Offshore Aquaculture. IEEE Access 2023, 11, 135764–135787. [Google Scholar] [CrossRef]
  31. Chu, Y.; Wang, C.M.; Park, J.C.; Lader, P.F. Review of cage and containment tank designs for offshore fish farming. Aquaculture 2020, 519, 734928. [Google Scholar] [CrossRef]
  32. Koide, K.; Oishi, S.; Yokozuka, M.; Banno, A. Scalable Fiducial Tag Localization on a 3D Prior Map via Graph-Theoretic Global Tag-Map Registration. In Proceedings of the 2022 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), Kyoto, Japan, 23–27 October 2022; pp. 5347–5353. [Google Scholar] [CrossRef]
  33. Tang, G.; Yang, T.; Yang, Y.; Zhao, Q.; Xu, M.; Xie, G. Relative Localization and Dynamic Tracking of Underwater Robots Based on 3D-AprilTag. J. Mar. Sci. Eng. 2025, 13, 833. [Google Scholar] [CrossRef]
  34. Sarabandi, S.; Thomas, F. A Survey on the Computation of Quaternions From Rotation Matrices. J. Mech. Robot. 2019, 11, 021006. [Google Scholar] [CrossRef]
  35. Bauschmann, N.; Duecker, D.A.; Alff, T.L.; Seifried, R. Evaluation of Underwater AprilTag Localization for Highly Agile Micro Underwater Robots. In Proceedings of the 2023 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), Detroit, MI, USA, 1–5 October 2023; pp. 9926–9932. [Google Scholar] [CrossRef]
  36. Abbas, S.M.; Aslam, S.; Berns, K.; Muhammad, A. Analysis and Improvements in AprilTag Based State Estimation. Sensors 2019, 19, 5480. [Google Scholar] [CrossRef] [PubMed]
Figure 1. A simplified illustration of a floating/semi-submersible rigid cage [Note: Artist impression on fish net-pen of Ocean Farm 1 and Havfarm 1].
Figure 2. A simplified illustration of the floating flexible fish net-pen with a rigid body structure.
Figure 3. A simplified illustration of the closed containment tank system.
Figure 4. A simplified illustration of the controlled laboratory environment.
Figure 5. Scenario 1 with the use of multiple cameras and multiple AprilTags.
Figure 6. Scenario 2 with the use of a single camera, a single AprilTag attached to the UUV, and multiple stationary AprilTags.
Figure 7. Scenario 3 with the use of a single camera attached to the moving UUV and multiple AprilTags.
Figure 8. A simplified illustration of the AprilTags extraction and data-logging. Note: The orange colored lines and the intersecting rectangular plane illustrate the field of view of the camera.
Figure 9. Overall deployment procedure for the UUV localisation.
Figure 10. Installation of 10 AprilTags inside the 115.46 m3 workspace.
Figure 11. Different perspectives of the extracted and data-logged poses of AprilTags, installed inside the 115.46 m3 workspace. Note: The axes of the frames are color-coded, representing the x-axis, y-axis and z-axis with the red, green and blue colors, respectively.
Figure 12. Different perspectives of AprilTags pose publishing and localisation in the 115.46 m3 workspace. Note: The axes of the frames are color-coded, representing the x-axis, y-axis and z-axis with the red, green and blue colors, respectively.
Figure 13. Full localisation path with its start and end points and noise. Note: The axes of the frames are color-coded, representing the x-axis, y-axis and z-axis with the red, green and blue colors, respectively.
Table 1. Sensors for UUV localisation. Note: The prices for some sensors are not available and thus, denoted as “N.A”.
| Company | Sensor Type | Model | Price |
|---|---|---|---|
| Sonardyne [12] | USBL | Micro-Ranger 2 USBL | £32,147 |
| Advanced Navigation [13] | USBL, INS | Subsonus | N.A |
| Teledyne Marine Technologies [14] | USBL | TrackIt USBL System | N.A |
| EvoLogics GmbH [15] | USBL | S2C R 18/34H | N.A |
| Deep Trekker Inc. [16] | USBL | MicronNav | N.A |
| Cerulean Sonar [17] | USBL | ROV Locator Bundle Mark III | $5990 |
| Water Linked [18] | DVL | DVL-A125 | $11,380 |
| Nortek Group [19] | DVL | DVL 500 Compact | N.A |
| Teledyne Marine Technologies [20] | DVL | Wayfinder | N.A |
| Cerulean Sonar [21] | DVL | Tracker 650 | $2990 |
| Advanced Navigation [22] | IMU, AHRS | Boreas A70 & A90 | N.A |
| Norwegian Subsea [23] | AHRS | MRU Marine | N.A |
| Sonardyne [24] | AHRS, DVL, INS, pressure sensor | SPRINT-Nav U | N.A |
Table 2. Specifications of the camera, AprilTags, workspace, ROS 2, and laptop.
| Item | Parameter | Value |
|---|---|---|
| Camera | Model | ZED 2i |
| | Resolution | 1280 × 720 (720p) |
| | Focal Length (fx, fy) | 523.9381 px, 523.9381 px |
| | Principal Point (cx, cy) | 643.8251 px, 340.0925 px |
| | Horizontal FoV | 101.3918 [deg] |
| | Vertical FoV | 68.9883 [deg] |
| | Frame Rate | 60 FPS |
| AprilTag | Tag size | 0.224 m |
| | Tag family | tag36h11 (587 tags with unique IDs: 0–586) |
| | No. of Tags | 10 |
| | Tag IDs used | 0–9 |
| Workspace | Dimension (W × L × H) | 5.79 m × 8.67 m × 2.30 m |
| ROS 2 | ROS 2 Distribution | Humble Hawksbill |
| | Platform | Ubuntu 22.04 (Jammy) |
| Laptop | Model | ROG Strix G15 G513 (Manufacturer: ASUS) |
| | CPU | AMD Ryzen™ 7 5800H |
| | GPU | NVIDIA® GeForce RTX™ 3050 Laptop GPU |
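The field-of-view entries in Table 2 are consistent with the pinhole camera model, under which FoV $= 2\arctan\!\big(\tfrac{W}{2 f_x}\big)$ for an image dimension of $W$ pixels and a focal length of $f_x$ pixels. The short check below reproduces the tabulated values from the listed intrinsics (the small residual comes from the rounding of the published parameters):

```python
import math

def fov_deg(pixels, focal_px):
    """Field of view implied by the pinhole model, for an image dimension
    (in pixels) and a focal length (in pixels), returned in degrees."""
    return math.degrees(2.0 * math.atan(pixels / (2.0 * focal_px)))

# ZED 2i intrinsics from Table 2 at 1280 x 720 resolution.
h_fov = fov_deg(1280, 523.9381)  # horizontal, approximately 101.39 deg
v_fov = fov_deg(720, 523.9381)   # vertical, approximately 68.99 deg
```

Running the same check after an underwater recalibration would immediately reveal the narrower effective field of view caused by refraction at the camera housing.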
Table 3. Error % of relative position-estimation between Tag ID $i$ and Tag ID $i+1$. Note: $|\hat{\lambda}_{i+1}^{i} - \lambda_{i+1}^{i}|$ is intentionally converted to centimetres for ease of readability and comparison. For the calculation of $e_{\lambda_{i+1}^{i}}$, the measurement unit of the position norm is still metres.
| IDs | Estimated norm $\hat{\lambda}_{i+1}^{i}$ [m] | Measured norm $\lambda_{i+1}^{i}$ [m] | Difference [cm] | Error $e_{\lambda_{i+1}^{i}}$ [%] |
|---|---|---|---|---|
| 0 to 1 | 2.82000 | 2.81963 | 0.03744 | 0.01 |
| 1 to 2 | 3.72700 | 3.76166 | 3.46567 | 0.93 |
| 2 to 3 | 2.62000 | 2.67113 | 5.11334 | 1.95 |
| 3 to 4 | 3.39500 | 3.40263 | 0.76294 | 0.22 |
| 4 to 5 | 2.59500 | 2.59444 | 0.05619 | 0.02 |
| 5 to 6 | 3.56000 | 3.57354 | 1.35420 | 0.38 |
| 6 to 7 | 2.43100 | 2.44756 | 1.65609 | 0.68 |
| 7 to 8 | 1.90700 | 1.92705 | 2.00458 | 1.05 |
| 8 to 9 | 1.29000 | 1.27864 | 1.13569 | 0.88 |
| 9 to 0 (Special case: $e_{\lambda_{0}^{9}}$) | 1.97000 | 1.88573 | 8.42708 | 4.28 |
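The error metric in Table 3 can be reproduced directly, assuming the percentage is taken relative to the measured norm. The sketch below applies that assumed formula to the first row; because the printed norms are rounded to five decimal places, the reproduced difference (about 0.0370 cm) deviates slightly from the tabulated 0.03744 cm, while the percentage still rounds to the tabulated 0.01%.

```python
def relative_position_error(est_m, truth_m):
    """Absolute difference in centimetres, and percentage error, of an
    estimated position norm against its physically measured value (assumed
    formula: e = |est - truth| / truth * 100, with norms in metres)."""
    diff_cm = abs(est_m - truth_m) * 100.0
    err_pct = abs(est_m - truth_m) / truth_m * 100.0
    return diff_cm, err_pct

# First row of Table 3: Tag 0 to Tag 1.
diff_cm, err_pct = relative_position_error(2.82000, 2.81963)
```

Applying the same function to the final row (Tag 9 to Tag 0) exposes why it is flagged as a special case: that pair closes the loop around the workspace, so its error accumulates the drift of the whole registration chain.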
