Article

UAV-Based QR Code Scanning and Inventory Synchronization System with Safe Trajectory Planning

1
Department of Mechanical Engineering, School of Engineering and Sciences, MIT Art, Design and Technology University, Pune 412201, Maharashtra, India
2
Department of Mechanical Engineering, School of Engineering Sciences, Ramdeobaba University, Nagpur 440013, Maharashtra, India
*
Author to whom correspondence should be addressed.
Symmetry 2026, 18(4), 548; https://doi.org/10.3390/sym18040548
Submission received: 19 December 2025 / Revised: 22 January 2026 / Accepted: 26 January 2026 / Published: 24 March 2026
(This article belongs to the Special Issue Symmetry/Asymmetry in Fuzzy Control)

Abstract

Modern urban warehouses face rapidly growing inventories and tight spaces, requiring fast, accurate, and safe stocktaking in narrow aisles within GPS-denied environments. This paper proposes a complete UAV-enabled framework that performs real-time QR code scanning with inventory synchronization through safety-aware trajectory generation for collision-free motion. A novel hybrid workflow integrating MATLAB/Simulink R2024b and Unreal Engine is used for dynamics and photorealistic rendering, alongside a real-world warehouse setup using drone cameras and 3D LiDAR coupled with a ground control station and live dashboard. The system was evaluated with single- and multi-UAV configurations across high-fidelity simulations and experiments. Results demonstrate simulated QR accuracy of approximately 95 to 96%, with experimental validation achieving between 86 and 90.5% due to real-world environmental factors. In both simulation and experimental analysis, mean end-to-end latency remained under half a second, trajectory error ranged between 8 and 10 cm, and safety margins were consistently maintained throughout the tests. Multi-UAV coordination further halved mission time compared to single-drone tests while keeping duplicate reads negligible, indicating a scalable and safe pipeline for industrial application.

1. Introduction

The evolution of intelligent warehousing has given rise to a disruptive trend in logistics and inventory management, where automation, real-time data processing, and intelligent monitoring systems have become indispensable. One such advance is the use of Unmanned Aerial Vehicles (UAVs) in indoor warehouses. Their compact size, flexibility, and agility make UAVs particularly well suited to navigating aisles and multi-tiered shelves, allowing them to perform scanning and inspection tasks efficiently and with minimal manual effort. The rapid development of e-commerce, smart manufacturing, and automated supply chains has drastically increased the need for precise, real-time inventory monitoring in indoor warehouse settings [1]. Traditional approaches to inventory checking mostly depend on manual inspection processes such as barcode scanning with handheld readers or ground-based robots, which are time-consuming, labour-intensive, and prone to human error, especially in high-density, large-scale storage facilities [2]. As warehouses continue to expand in both height and width, achieving timely inventory visibility without jeopardizing operational safety has become a significant issue in contemporary industrial logistics [3]. Owing to their mobility, flexibility, and capacity to reach inaccessible storage locations without structural changes, UAVs have recently gained considerable interest as a potential solution for autonomous indoor inspection and inventory management [4,5]. UAVs exhibit superior vertical reach and less interference with floor space compared to AGVs and AMRs, which makes them especially favorable in dense warehouse layouts [6].
However, indoor deployment of UAVs introduces unique technical challenges related to the absence of GPS signals, highly constrained navigation spaces, dynamic obstacles, limited payload capacity, and stringent safety requirements [7].
Among the identification technologies available for inventory tracing, QR codes have been considered a reliable and cost-effective substitute for RFID in indoor environments [8]. QR codes provide higher data capacity, are robust to partial occlusion, and are easily deployable without special infrastructure [9]. Vision-based QR code reading using UAV-mounted cameras enables non-contact, flexible, and scalable inventory inspection, especially in warehouses with heterogeneous item sizes and storage layouts [10,11]. However, accurate QR code detection from flying platforms still requires precise positioning, stable flight control, optimal view angles, and robust image processing under varying lighting conditions. Safe and efficient trajectory planning is therefore a fundamental requirement for autonomous flight in confined indoor places such as warehouses. Indoor navigation demands real-time obstacle detection, tight manoeuvrability, and collision avoidance around racks, shelves, workers, and other mobile agents [12]. Conventional methods based on fixed waypoints or static path planning tend to be unsuited to dynamically changing conditions and unknown obstacles, increasing the risk of collisions [13]. In the recent past, safety-aware trajectory planning has received more serious consideration, combining sensor-based perception with risk assessment models toward this end [14].
LiDAR sensors have proven highly effective for accurate obstacle detection and spatial mapping in indoor environments because they are precise and insensitive to lighting [15]. A UAV can further increase its situational awareness, for both navigation and visual inspection tasks, by employing a wide-angle fisheye camera [16]. Simulation-driven development tools, including Simulink-based 3D modelling and Unreal Engine environments, make it possible to test and validate UAV control, sensing, and perception algorithms realistically before real deployment [17,18]. High-fidelity simulated environments greatly lower development risk and increase system reliability. Beyond autonomous navigation, real-time data synchronisation between UAVs, the GCS, and enterprise inventory management systems is key to viable industrial acceptance in real-life situations [19].
The communication framework of cloud-enabled communication and lightweight web servers, such as those based on the Flask architecture, conveys seamless transmission of data from scanned inventory to centralized dashboards for real-time monitoring and decision making [20]. Ensuring low-latency communication, handling redundancy, and computational efficiency remains one of the critical research focuses during large-scale deployment [21].
Although several works have explored the inventory inspection with UAVs, existing studies are mostly limited to validations via only simulation [22,23] or restricted to static environments without integrated safety-aware trajectory planning and real-time system synchronization [24,25,26]. Furthermore, comprehensive experimental validation that combines realistic simulation environments and real-world UAV testing remains relatively rare in the literature thus far. Systems must guarantee reliable obstacle avoidance; sustain high QR decoding rates under variable illumination, motion blur, and partial occlusions; and regulate trajectories safely within narrow, multi-level aisles while keeping the inventory database synchronized in near real time. These demands are compounded by resource limits, compute, memory, and battery endurance, which constrain continuous, fully on-drone processing and prolonged scan missions without strategic offloading or careful energy management. Much of the existing literature either surveys UAV use from a broad, cross-sector perspective (transportation, disaster response, agriculture) or focuses on isolated warehouse functions such as cycle counting or spot inspection. As a result, insights are often too general to respect warehouse constraints or too narrow to capture the coupling among perception reliability, safety-aware motion, and enterprise data integration.
Motivated by these gaps, this paper presents a UAV-based QR code scanning and inventory synchronization system, along with safety-aware trajectory planning, optimized for indoor warehouse environments. This study proposes a unified UAV-based indoor inventory inspection system that distributes computation across the aerial platform, a ground control station (GCS), and a lightweight cloud service. The UAV follows a pre-planned aisle trajectory while capturing live video; the GCS executes real-time QR decoding and pose estimation, offloading intensive image processing from the air vehicle. Simultaneously, a 3D LiDAR sensor produces point clouds for scene reconstruction; these data inform and validate obstacle-aware flight paths within a MATLAB/Simulink environment that is co-tested against a photorealistic Unreal Engine warehouse. Safety is enforced by a trajectory generator embedded with a mathematical risk-factor model that adaptively adjusts waypoints and velocities to maintain clearance, and triggers a return-to-base protocol under critical battery or perception conditions.
Decoded item IDs, timestamps, confidence scores, and UAV telemetry are sent to a Flask-based host server in order to complete the feedback loop and visualize on a web dashboard that allows real-time monitoring and heatmap-like coverage maps, as well as real-time inventory syncing. This design enhances scalability (by enabling multi-UAV operation) and robustness (by isolating perception and control from network variance), while preserving low end-to-end latency for operational decision-making. The proposed system is validated through a combination of Simulink simulations, Unreal-based virtual environments, and controlled flight trials with commercial quadrotors and QR-labeled racks. Performance is quantified in terms of scanning accuracy, computational latency, trajectory safety, and detection reliability, demonstrating that a tightly coupled perception-planning-synchronization stack can deliver repeatable, industrial-grade UAV-based inventory management for indoor logistics.
The rest of this work is organized as follows: Section 2 discusses materials and methods, comprising the system overview, trajectory planning, and QR detection. Section 3 describes the simulation and experimental analysis of UAV-based inventory management, followed by results and discussion in Section 4. Finally, Section 5 concludes this work and outlines directions for future research.

2. Materials and Methods

2.1. System Overview

This study develops an end-to-end UAV system for autonomous indoor inventory inspection that combines real-time QR detection, safety-aware trajectory control, and cloud-synchronized logging. The architecture integrates (i) a quadrotor equipped with a forward-facing camera and 3D LiDAR, (ii) simulation and visualization via MATLAB/Simulink and Unreal Engine, and (iii) a Python/Flask 3.1.2 backend exposing REST and WebSocket endpoints for ingestion and live visualization. Figure 1 summarizes the data flow from on-board sensing through the Ground Control Station (GCS) to the host server and dashboard.
Within a photorealistic warehouse model, racks and shelves are labeled with standardized QR tags (e.g., A1–A10, C1–C5), each encoding a unique inventory string that maps one-to-one to the database. The UAV follows a predefined zig-zag coverage path with hover-and-scan stops at waypoints ( x , y , z ) in the warehouse frame. Video is streamed to the GCS over RTSP/UDP; to conserve on-board compute, QR decoding and confidence scoring run on the GCS using OpenCV. For each successful decode, the GCS packages the QR string, synchronized timestamp, UAV pose, and a scalar confidence score into a JSON payload and posts it to the Flask API, which validates the schema, updates the inventory state (scanned/duplicate/missing), and pushes live updates to a SocketIO dashboard.
Safety is enforced by a risk factor R f derived from the UAV’s distance to nearby obstacles (from LiDAR or map) and used to gate motion and trigger short-horizon replanning when R f exceeds a threshold (details in Trajectory Planning). A return-to-base routine is invoked under low battery, persistent perception loss, or mission completion. The full pipeline is validated in co-simulation (Simulink + Unreal) and in a controlled warehouse with QR-labeled racks, using identical logging and time synchronization to enable apples-to-apples comparisons. Simulation and experimental results are analyzed in terms of QR scanning accuracy, end-to-end latency, trajectory fidelity, safety coverage, redundancy handling, and coverage efficiency, demonstrating readiness for warehouse-scale deployment. Figure 2 shows the complete workflow of the method.

2.2. Warehouse Environment Modeling and QR Code Arrangement

For a warehouse environment, the simulation and experimental testing phase has to ensure a high level of realism and repeatability. Hence, the environment is carefully modeled using a hybrid approach by integrating Simulink’s 3D Scene Builder and Unreal Engine. This bi-environment setup enables simultaneous development and validation of both control logic and visual sensing modules under controlled yet diverse spatial conditions. The warehouse layout incorporates the standard multi-tier shelving racks, which have evenly spaced aisles, and obstacles that mimic a real-world warehouse environment, including support beams and overhanging inventory.
Each rack is systematically labeled using an alphanumeric QR code scheme that represents the spatial organization of the warehouse. The labels follow a uniform format encoding rack and shelf indices, such as “A1” for Rack A/Shelf 1 and “C5” for Rack C/Shelf 5. These labels are encoded in QR codes and attached to the front of the cardboard boxes on each shelf. This labeling pattern provides consistency, allows easy UAV recognition at different viewing angles, and supports data logging in both simulation and real deployment.
Each QR code carries a uniquely structured string that identifies the item ID and its exact position in the warehouse. The usual format used in this research is as follows:

$$QR_{\text{string}} = \text{``Item\#A14-R5''}$$

In this example, “Item#A14-R5” denotes a product stored at shelf 14 of Rack A, row 5. This string format is essential for establishing a one-to-one correspondence between the physical inventory and its digital record in the cloud-based database. During the flight mission, the UAV scans each QR code and relays the decoded string to the ground control station (GCS), which subsequently matches the data to entries in the central inventory management software.
Spatial accuracy of QR code placement in the simulation environment is achieved by aligning the code positions directly with the Simulink coordinates and the Unreal Engine asset meshes. The placement angle, light sources, and perspective distortion are also adjusted to the realistic scanning conditions experienced by the UAV camera, making the detection algorithm resistant to visual noise and different flight altitudes. In summary, the warehouse modeling and QR code layout form the basis of organized inventory mapping. This consistency between the simulated and physical worlds allows a smooth transition between design and testing, and is critical to enabling the UAV to localize and identify inventory items autonomously in real time.
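As an illustration, the tag format described above can be parsed on the GCS side with a small routine. The regular expression and function below are a hypothetical sketch (not the paper's implementation), assuming the `Item#<rack><shelf>-R<row>` convention:

```python
import re

# Hypothetical parser for QR payloads of the form "Item#A14-R5"
# -> rack "A", shelf 14, row 5. Pattern is an assumption from the
# examples in the text, not the authors' code.
QR_PATTERN = re.compile(r"^Item#(?P<rack>[A-Z])(?P<shelf>\d+)-R(?P<row>\d+)$")

def parse_qr_string(qr: str):
    """Return (rack, shelf, row) or None if the string is not a valid tag."""
    m = QR_PATTERN.match(qr)
    if m is None:
        return None
    return m.group("rack"), int(m.group("shelf")), int(m.group("row"))
```

A strict anchored pattern like this also doubles as a cheap validity filter before the decoded string is matched against the inventory database.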

2.3. UAV Trajectory Planning and Safety-Aware Replanning

This section provides a complete theoretical and algorithmic foundation for an aisle-coverage planner tailored to indoor warehouses. First, nominal, camera-steady trajectory generation constructs a zig-zag coverage path through shelves by interpolating a sequence of hover-and-scan waypoints with a smooth, spline-like curve. Smoothness is enforced by penalizing translational acceleration and yaw-rate “energy,” which yields cubic-in-time position segments in unconstrained arcs and limits camera slew to keep QR targets within the field of view. Second, feasibility is imposed through hard bounds on speed, acceleration, and yaw rate, together with linear “keep-in” constraints that restrict motion to the aisle polytope; these constraints ensure the planned motion is executable on a real quadrotor with finite actuation and that the vehicle remains clear of shelf faces and structural boundaries. Third, to make the problem numerically robust and real-time capable, we introduce a convex discrete surrogate: the continuous objective and constraints are discretized on a fixed time grid using first and second difference operators, producing a banded quadratic program (QP) with a positive-definite Hessian. The structure admits fast solvers (e.g., banded Cholesky or projected conjugate gradients), guarantees a unique optimizer under standard boundary pinning, and supports optional terms (e.g., line-of-sight yaw alignment) without sacrificing convexity. Fourth, recognizing that indoor scenes are dynamic and partially modeled, we embed short-horizon replanning driven by a smooth, LiDAR-based risk field. A signed-distance-derived, sigmoid “Risk Factor” provides continuous risk values and gradients everywhere in free space; whenever predicted risk exceeds a safety threshold, a small horizon around the offending segment is re-optimized by successive convexification. 
Linearizing the risk term, regularizing increment curvature, and applying a backtracking line search produce monotone risk reduction while keeping updates within dynamic limits and close to the nominal path. Together, these components yield a planner that is provably well-posed, numerically stable, and fast enough for closed-loop use, with clearly defined symbols, objectives, and constraints introduced at first mention for reproducibility.
1. Waypoints and Continuous Kinematics

Let the scan waypoints be

$$P_i = (x_i, y_i, z_i) \in \mathbb{R}^3, \qquad i = 1, \ldots, N.$$

The continuous reference states are the position $p(t) \in \mathbb{R}^3$ and yaw $\psi(t)$, $t \in [0, T]$, with commanded kinematics

$$\dot{p}(t) = v_c(t), \qquad \dot{\psi}(t) = \omega_c(t).$$

Feasibility is enforced by the bounds

$$\|v_c(t)\| \le v_{\max}, \qquad \|\dot{v}_c(t)\| \le a_{\max}, \qquad |\omega_c(t)| \le \omega_{\max}, \qquad \forall t,$$

and an aisle polytope $\Omega \subset \mathbb{R}^3$ (linear “keep-in” constraints) such that $p(t) \in \Omega$.
2. Camera-Steady Nominal Path (Continuous Optimal Control)

To obtain smooth motion that keeps the camera steady on the shelves, we minimize acceleration and yaw-rate “energy”:

$$\min_{p(\cdot),\,\psi(\cdot)} \; J_{\text{smooth}} = \int_0^T \left( \lambda_s \|\ddot{p}(t)\|_2^2 + \lambda_\psi \dot{\psi}(t)^2 \right) dt$$
$$\text{s.t.} \quad \|\dot{p}(t)\|_2 \le v_{\max}, \quad \|\ddot{p}(t)\|_2 \le a_{\max}, \quad |\dot{\psi}(t)| \le \omega_{\max},$$
$$p(0) = P_1, \quad p(T) = P_N, \quad p(t) \in \Omega, \quad \forall t \in [0, T],$$

where $p : [0, T] \to \mathbb{R}^3$ is the position and $\lambda_s, \lambda_\psi > 0$ are weights.

In unconstrained arcs, the Euler–Lagrange equations give $p^{(4)}(t) = 0$; hence each Cartesian component of $p(t)$ is cubic in time, explaining the spline-like appearance. The yaw term acts as a Tikhonov regularizer on $\psi$, limiting camera slew.
Optional LOS alignment. If desired, add

$$J_{\text{LOS}} = \kappa \int_0^T \left( \psi(t) - \psi_{\text{LOS}}(t) \right)^2 dt, \qquad \psi_{\text{LOS}}(t) = \operatorname{atan2}\!\left( n_y(t), n_x(t) \right),$$

with $(n_x, n_y)$ a shelf-normal projection; in discrete least-squares form (below), this remains convex.
3. Discrete Convex Surrogate (QP) for the Nominal Plan

Sample at $t_k = k\Delta t$, $k = 0, \ldots, K$, with $K\Delta t = T$, and define the first/second difference operators

$$(D_1 p)_k := \frac{p_{k+1} - p_k}{\Delta t}, \qquad (D_2 p)_k := \frac{p_{k+1} - 2p_k + p_{k-1}}{\Delta t^2}.$$

Let $\tau_i$ index the hover frame for waypoint $P_i$ and $\rho_i > 0$ weight stop-and-scan tightness. The nominal problem is the convex QP

$$\min_{p_k,\, \psi_k} \; \underbrace{\sum_{i=1}^{N} \rho_i \left\| p_{\tau_i} - P_i \right\|^2}_{\text{coverage/hover adherence}} + \lambda_s \|D_2 p\|_2^2 + \lambda_\psi \|D_1 \psi\|_2^2 + \underbrace{\kappa \|\psi - \psi_{\text{LOS}}\|_2^2}_{\text{if used}}$$
$$\text{s.t.} \quad \|D_1 p\|_\infty \le v_{\max}, \quad \|D_2 p\|_\infty \le a_{\max}, \quad \|D_1 \psi\|_\infty \le \omega_{\max},$$
$$p_0 = P_1, \quad p_K = P_N, \quad p_k \in \Omega \ \ \forall k.$$
The Hessian is banded and positive semidefinite. Pinning $p_0, p_K$ and taking $\lambda_s, \lambda_\psi > 0$ yields strict convexity on the feasible affine subspace; thus the optimizer is unique. With linear aisle faces and per-axis $\ell_\infty$ bounds, active-set or projected-CG solvers exploit the banded structure for near-linear time in $K$.
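The banded structure is easy to see in code. The following is a minimal sketch, for a single Cartesian axis and with the speed/acceleration/aisle constraints dropped, of how the hover-adherence and smoothness terms assemble into a banded, positive-definite system; the grid size and weights are illustrative, not the paper's values:

```python
import numpy as np

# Minimal sketch of the unconstrained nominal QP for one Cartesian axis:
#   min_p  rho * sum_i (p[tau_i] - P_i)^2 + lam_s * ||D2 p||^2
# Endpoint pinning is approximated here by the large waypoint weight rho;
# speed/acceleration bounds and the aisle polytope are omitted for brevity.
def nominal_axis_plan(K, tau, P, rho=1e3, lam_s=1.0, dt=1.0):
    n = K + 1
    # Second-difference (acceleration) operator D2, shape (n-2, n)
    D2 = np.zeros((n - 2, n))
    for k in range(n - 2):
        D2[k, k:k + 3] = np.array([1.0, -2.0, 1.0]) / dt**2
    # Selector matrix picking out the hover frames tau_i
    A = np.zeros((len(tau), n))
    for i, t_i in enumerate(tau):
        A[i, t_i] = 1.0
    H = rho * A.T @ A + lam_s * D2.T @ D2   # banded, positive definite here
    g = rho * A.T @ np.asarray(P, dtype=float)
    return np.linalg.solve(H, g)

# Hover at x = 0, 1, 0 at frames 0, 10, 20
p = nominal_axis_plan(K=20, tau=[0, 10, 20], P=[0.0, 1.0, 0.0])
```

With the adherence weight dominant, the solve closely reproduces the hover waypoints while smoothing the interior samples; a production solver would instead exploit the band with a banded Cholesky or projected-CG routine, as noted in the text.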
4. LiDAR-Driven Risk Field (Signed Distance) and Its Differentials

Let $r(p)$ be the Euclidean clearance (signed distance) to the nearest obstacle surface (positive in free space), obtained from a voxelized or meshed SDF/TSDF. Define a smooth Risk Factor

$$R_f(p) = \sigma\!\left( \beta \left[ r_{\text{safe}} - r(p) \right] \right) = \frac{1}{1 + \exp\!\left( -\beta \left[ r_{\text{safe}} - r(p) \right] \right)}, \qquad \beta > 0,$$

where $r_{\text{safe}} > 0$ is the desired clearance (m) and $\beta > 0$ controls the steepness.
The safety threshold is $\tau \in (0, 1)$, defining the safe set $S = \{ p : R_f(p) \le \tau \}$.
On regular SDF regions (away from edges/corners), $r$ is Lipschitz and $\nabla^2 r$ is bounded, so $R_f$ is smooth with a Lipschitz gradient. The threshold equates to a clearance margin:

$$R_f(p) \le \tau \iff r(p) \ge r_{\text{safe}} - \frac{1}{\beta} \ln\!\left( \frac{\tau}{1 - \tau} \right).$$

Hence tuning $(r_{\text{safe}}, \beta, \tau)$ gives a geometric buffer in meters.
In our experiments, we used $r_{\text{safe}} = 0.5$ m, $\beta \in [8, 12]$ m$^{-1}$, and $\tau = 0.4$. The safety-aware trajectory generator in this study relies on a LiDAR-derived signed-distance (risk) field constructed from the available 3D warehouse map and is therefore validated primarily under quasi-static obstacle assumptions (e.g., racks, shelves, and other fixed infrastructure). While the optimization framework can replan over a short horizon when the risk field is refreshed, the present implementation does not claim full real-time avoidance of fast-moving or non-cooperative obstacles (e.g., workers, forklifts, or other mobile agents) because such capability would require (i) high-rate perception updates, (ii) dynamic obstacle tracking and prediction, and (iii) continuous reconstruction or incremental updates of the signed-distance field with bounded latency. Consequently, the reported results should be interpreted as performance in environments where the dominant obstacles are static or slowly varying relative to the replanning frequency; extending the method to dense, highly dynamic scenes is left as future work, involving online occupancy/SDF updates and reactive safety layers that can guarantee collision avoidance under rapid motion.
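Using the reported settings, the risk value and its threshold-to-clearance conversion take only a few lines. The sketch below (with $\beta = 10$ m$^{-1}$ picked from the reported $[8, 12]$ range) confirms that $\tau = 0.4$ corresponds to a required clearance of roughly 0.54 m:

```python
import math

# Sigmoid Risk Factor R_f = sigma(beta * (r_safe - r)); settings follow the
# reported r_safe = 0.5 m and tau = 0.4, with beta = 10 m^-1 chosen from the
# reported [8, 12] range.
def risk_factor(clearance, r_safe=0.5, beta=10.0):
    return 1.0 / (1.0 + math.exp(-beta * (r_safe - clearance)))

# Clearance needed so that R_f <= tau:
#   r >= r_safe - (1/beta) * ln(tau / (1 - tau))
def clearance_margin(r_safe=0.5, beta=10.0, tau=0.4):
    return r_safe - (1.0 / beta) * math.log(tau / (1.0 - tau))
```

Because $\tau < 0.5$, the margin (about 0.54 m here) is slightly larger than $r_{\text{safe}}$ itself; raising $\beta$ sharpens the transition and shrinks this extra buffer.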
5. Short-Horizon, Risk-Aware Replanning (Successive Convexification)

When a predicted segment violates safety, we locally correct frames $j = k, \ldots, k+h$ by minimally deflecting the nominal path while penalizing risk. Linearize the risk about $p_j$,

$$R_f(p_j + \Delta p_j) \approx R_f(p_j) + \nabla R_f(p_j)^\top \Delta p_j,$$

and stabilize the increments with a second-difference (jerk-like) operator

$$(E \Delta p)_j = \left( \Delta p_{j+1} - \Delta p_j \right) - \left( \Delta p_j - \Delta p_{j-1} \right), \qquad j = k+1, \ldots, k+h.$$

Solve the convex QP

$$\min_{\Delta p} \; \underbrace{\|\Delta p\|_2^2}_{\text{stay near nominal}} + \lambda \sum_{j=k}^{k+h} \left( R_f(p_j) + \nabla R_f(p_j)^\top \Delta p_j \right) + \eta \|E \Delta p\|_2^2$$
$$\text{s.t.} \quad p_j + \Delta p_j \in \Omega, \qquad \|\Delta p_j\| \le \Delta t\, v_{\max}, \qquad \|\Delta p_{j+1} - \Delta p_j\| \le \Delta t^2 a_{\max},$$

with $\lambda, \eta > 0$. Let $H_{\text{hor}} = I + \eta E^\top E \succ 0$ and $g_{\text{hor}} = \lambda \left[ \nabla R_f(p_k), \ldots, \nabla R_f(p_{k+h}) \right]$. The unique solution $\Delta p^\star$ is applied with a backtracking line search,

$$p_j^+ = p_j + \alpha \Delta p_j^\star, \qquad \alpha \in (0, 1] \text{ chosen to ensure } \sum_{j=k}^{k+h} R_f(p_j^+) \le \sum_{j=k}^{k+h} R_f(p_j) - c\, \alpha \left\| g_{\text{hor}} \right\|^2.$$

Under Lipschitz $\nabla R_f$ and bounded steps, the Armijo rule admits such an $\alpha$; repeated cycles (re-linearize, re-solve) yield a sequence whose limit points satisfy first-order stationarity of the underlying nonconvex horizon problem (a standard successive convexification result).
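The backtracking step can be sketched generically. The routine below is an illustrative Armijo line search over a horizon cost, not the paper's code; the parameters `c` and `shrink` and the minimum step are assumptions:

```python
import numpy as np

# Illustrative Armijo backtracking step applied after each convex solve:
# shrink alpha until the summed risk drops by at least c * alpha * ||g_hor||^2
# (names follow the text; c, shrink, and the cutoff are assumed values).
def armijo_step(p, dp, risk_sum, g_norm_sq, c=1e-3, shrink=0.5, alpha=1.0):
    base = risk_sum(p)
    while alpha > 1e-6:
        cand = p + alpha * dp
        if risk_sum(cand) <= base - c * alpha * g_norm_sq:
            return cand, alpha        # sufficient decrease achieved
        alpha *= shrink
    return p, 0.0                     # no acceptable step; keep nominal segment
```

In the full planner, `risk_sum` would evaluate $\sum_j R_f(p_j)$ along the horizon and `dp` would be the QP solution $\Delta p^\star$; a rejected step ($\alpha = 0$) simply keeps the nominal segment for another re-linearization cycle.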
6. Supervisory Logic (RTB Barrier)

Define the battery fraction $b(t) \in [0, 1]$ and perception confidence $\gamma(t) \in [0, 1]$ (from the detection pipeline). A high-level guard switches to homing when

$$b(t) \le b_{\min} \ \lor\ \gamma(t) < \gamma_{\min} \ \lor\ \text{mission complete}.$$

Hysteresis on $b$ avoids chatter (e.g., require $b > b_{\text{reset}} > b_{\min}$ to re-enter scan mode).
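A minimal sketch of this guard follows, with illustrative thresholds (the paper does not report specific values for $b_{\min}$, $b_{\text{reset}}$, or $\gamma_{\min}$):

```python
# Sketch of the supervisory guard with battery hysteresis. Threshold values
# are illustrative assumptions, not the paper's settings.
class Supervisor:
    def __init__(self, b_min=0.2, b_reset=0.3, gamma_min=0.5):
        self.b_min, self.b_reset, self.gamma_min = b_min, b_reset, gamma_min
        self.mode = "scan"

    def update(self, battery, confidence, mission_complete=False):
        if self.mode == "scan":
            # trip condition: low battery OR perception loss OR mission done
            if battery <= self.b_min or confidence < self.gamma_min or mission_complete:
                self.mode = "rtb"  # return to base
        else:
            # hysteresis: re-enter scan only once battery recovers past b_reset
            if battery > self.b_reset and confidence >= self.gamma_min and not mission_complete:
                self.mode = "scan"
        return self.mode
```

The asymmetric pair $(b_{\min}, b_{\text{reset}})$ is what prevents chatter: a reading that hovers around $b_{\min}$ cannot toggle the mode back and forth.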

2.4. Live Video Feed and Ground QR Code Detection

A critical component of the proposed UAV inventory system is the real-time detection of QR codes from shelves during indoor scanning missions. To balance accuracy, latency, and computational feasibility, the framework adopts a division of responsibilities between the UAV and the Ground Control Station (GCS). The UAV serves primarily as a video acquisition platform, while the GCS executes detection and decoding algorithms to minimize onboard computational overhead.
1. Video Stream from UAV to GCS

Each UAV carries a forward-facing fisheye camera (intrinsics $K$, distortion model $\Pi_{\text{fish}}$) capturing RGB frames at frame rate $F_c$ (typically 30 fps) and resolution $R_c = H \times W$. Let

$$V(t) = \{ F_k \}_{k=0}^{\lfloor F_c t \rfloor}, \qquad F_k \in [0, 255]^{H \times W \times 3}.$$

Frames are packetized and transported to the GCS using RTSP/UDP over Wi-Fi 6 (or long-range telemetry). The end-to-end latency is the additive budget

$$l_{e2e} \le l_{\text{cap}} + l_{\text{enc}} + l_{\text{net}} + l_{\text{buf}},$$

which in our deployment remains $l_{e2e} \lesssim 200$ ms, meeting near-real-time constraints for indoor flight.
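As a sanity check on the additive budget, a hypothetical per-stage breakdown can be summed against the 200 ms target; the individual stage values below are placeholders, not measured figures from the deployment:

```python
# Hypothetical end-to-end latency budget in milliseconds:
#   l_e2e <= l_cap + l_enc + l_net + l_buf
# Per-stage values are illustrative placeholders, not measurements.
budget_ms = {"capture": 33, "encode": 40, "network": 60, "buffer": 50}
l_e2e = sum(budget_ms.values())
```

Tracking the budget per stage makes it clear where headroom exists, e.g., a slower Wi-Fi link eats into the buffering allowance before the target is violated.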

2.5. Mirror-Symmetric Multi-UAV Scheduling

To explicitly integrate symmetry into the multi-UAV coordination problem (Case II), a mirror-symmetric coupling is embedded directly into the trajectory optimization objective. Let the warehouse aisle be expressed in an aisle-aligned coordinate frame, where the aisle centerline defines the symmetry axis and the lateral coordinate is reflected across this axis. The mirror operator is defined as $M = \mathrm{diag}(1, -1, 1)$, such that for any state $x = [x, y, z]^\top$, the reflected state is $Mx = [x, -y, z]^\top$. Denoting the discrete-time trajectories of the two UAVs by $\{x_A[k]\}_{k=1}^{K}$ and $\{x_B[k]\}_{k=1}^{K}$, symmetry is enforced by augmenting the banded quadratic program (QP) objective with a symmetry regularizer,

$$J_{\text{sym}} = w_{\text{sym}} \sum_{k=1}^{K} \left\| x_B[k] - M x_A[k] \right\|_2^2,$$

where $w_{\text{sym}} > 0$ controls the strength of the symmetric coupling. The overall multi-UAV optimization then minimizes the nominal trajectory costs for each agent together with $J_{\text{sym}}$, subject to the same feasibility constraints used in the single-UAV formulation, including keep-in-aisle bounds and dynamic limits. Endpoint pinning is preserved by constraining the initial and terminal states to the mission start/goal conditions, ensuring a unique and well-posed QP solution while still allowing the optimizer to reconcile symmetry with safety and feasibility. As a result, mirror symmetry becomes a structural property of the optimization, rather than a post hoc trajectory characteristic, and the coordinated solution remains symmetric insofar as permitted by the active constraints.
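The regularizer is straightforward to evaluate. The sketch below (with illustrative trajectories, not mission data) shows that perfectly mirrored paths incur zero cost:

```python
import numpy as np

# Mirror operator across the aisle centerline (aisle-aligned frame, y lateral)
M = np.diag([1.0, -1.0, 1.0])

def j_sym(X_A, X_B, w_sym=1.0):
    """Symmetry regularizer J_sym = w_sym * sum_k ||x_B[k] - M x_A[k]||^2.

    X_A, X_B: (K, 3) arrays of discrete UAV positions.
    """
    diff = X_B - X_A @ M.T
    return w_sym * float(np.sum(diff ** 2))
```

Because $J_{\text{sym}}$ is a convex quadratic in the stacked trajectories, adding it to the banded QP preserves convexity and the banded sparsity pattern.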
2. QR Code Detection at the GCS
QR decoding is performed offboard to conserve the UAV battery and CPU. Let D denote the OpenCV detector/decoder (cv2.QRCodeDetector) composed with standard preprocessing (grayscale + adaptive thresholding) and Reed–Solomon error correction.
  • Preprocessing: grayscale conversion and adaptive thresholding to mitigate indoor lighting variations.
  • Pattern detection: extraction of finder patterns (position markers in QR corners).
  • Decoding: Reed–Solomon error correction to recover embedded data even under partial occlusion or blur.
We model decoding as

$$(q_k, \gamma_k) = \mathcal{D}(F_k), \qquad q_k \in S \cup \{\varnothing\}, \qquad 0 \le \gamma_k \le 1,$$

where $S$ is the set of valid QR strings and $\gamma_k$ is a confidence score derived from the detection probability and patch sharpness. If no valid code is present, $q_k = \varnothing$. Finder-pattern geometry (three L-corner squares with canonical ratios) and perspective consistency are enforced to suppress false positives in wide-FOV images. This GCS-based detection ensures higher processing throughput while enabling multi-UAV scalability, since one ground station can concurrently process multiple streams.
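The sharpness component of $\gamma_k$ can be approximated without OpenCV. The following sketch uses the variance of a discrete Laplacian as the sharpness proxy; the blend weight and normalization scale are illustrative assumptions, not the paper's tuning:

```python
import numpy as np

# Illustrative confidence heuristic: blend a detector probability with patch
# sharpness (variance of the 4-neighbour Laplacian). The weight w and the
# normalization sharp_scale are assumptions, not values from the paper.
def patch_sharpness(gray):
    """Variance of the 4-neighbour Laplacian of a grayscale patch."""
    lap = (-4.0 * gray[1:-1, 1:-1]
           + gray[:-2, 1:-1] + gray[2:, 1:-1]
           + gray[1:-1, :-2] + gray[1:-1, 2:])
    return float(np.var(lap))

def confidence(det_prob, gray, sharp_scale=1000.0, w=0.7):
    sharp = min(patch_sharpness(gray) / sharp_scale, 1.0)
    return w * det_prob + (1.0 - w) * sharp
```

A blurred or defocused QR patch has a nearly constant Laplacian, so its sharpness term (and hence $\gamma_k$) drops, which is exactly the signal used later to trigger rescans.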
3. Data Packaging and Metadata Extraction
For every successful detection, the GCS extracts QR content and appends contextual metadata. Each decoded event is logged with the following attributes:
  • Decoded String (q): Encoded item identifier, e.g., Item#C5-R2.
  • UAV Identifier (u): Unique ID for the scanning UAV (e.g., DRONE01).
  • 3D Position ($x_k = (x_k, y_k, z_k) \in \mathbb{R}^3$) and Yaw ($\psi_k$): the UAV’s estimated pose at detection time, derived from Simulink trajectory data or onboard odometry.
  • Timestamp ( t k   ): Coordinated Universal Time (UTC) at detection.
  • Confidence Score ($\gamma_k$): confidence metric ($0 \le \gamma_k \le 1$), based on OpenCV decoding probability or a secondary heuristic (e.g., sharpness of the detected QR region).
The complete detection event is the tuple

$$e_k = (\text{uav\_id},\, q_k,\, x_k,\, \psi_k,\, t_k,\, \gamma_k).$$
This information is serialized into a JSON object (Figure 3) for transmission to the cloud dashboard and inventory database:
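A minimal sketch of one serialized event is given below; the exact field names and example values are illustrative rather than taken from Figure 3:

```python
import json
from datetime import datetime, timezone

# One detection event e_k serialized for the POST body. Field names and
# values are illustrative assumptions, not the exact schema of Figure 3.
event = {
    "uav_id": "DRONE01",
    "qr_string": "Item#C5-R2",
    "position": {"x": 3.2, "y": 1.1, "z": 2.4},
    "yaw": 1.57,
    "timestamp": datetime.now(timezone.utc).isoformat(),
    "confidence": 0.93,
}
payload = json.dumps(event)
```

Keeping the payload flat and small (a few hundred bytes) is what allows a single ground station to push events from several UAV streams without stressing the uplink.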
4. Data Stream Integration
These JSON packets are transmitted from the GCS to the Flask-based server via RESTful API calls or WebSocket streams. Each QR detection is recorded in the inventory database in real time and visualized live on the ground dashboard. In addition, low-confidence detections (e.g., $\gamma_k < \gamma_{\min}$) are flagged for rescanning, making the pipeline resistant to false positives and low-light situations.

2.6. Host Server and API Communication

The interface between the cloud server and the GCS is implemented with a Flask-based server to provide low-latency, high-reliability synchronization of UAV scan data. The host server acts as an intermediary that receives structured JSON payloads from the GCS, processes the information, and updates the centralized inventory database.
(1) Flask Backend Integration
The Python Flask micro-framework was chosen for its lightweight architecture and its ability to scale to real-time data streams. A dedicated API endpoint, /api/upload_qrdata, was developed to accept POST requests from the GCS. Each request carries UAV-specific metadata in JSON format, including the UAV ID, the decoded QR string, the detection timestamp, the spatial coordinates, and the confidence score. Upon receipt, the Flask application validates the payload against a schema, ensuring data fidelity and rejecting invalid records.
The deployment stack uses Flask-RESTful for API management, with Gunicorn and Nginx as the web servers, making it robust to many concurrent UAV connections. In a test with 20 simulated UAVs transmitting simultaneously, the average API response time remained consistently below 100 ms, indicating that the system would scale to industrial warehouse operations.
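The schema check can be sketched in a framework-independent way. The function below illustrates the kind of validation performed on each POST to /api/upload_qrdata; the required fields are inferred from the payload described earlier, and the exact rules are an assumption rather than the server's actual code:

```python
# Framework-agnostic sketch of the payload schema check for /api/upload_qrdata.
# Required fields and rules are inferred from the text and are assumptions.
REQUIRED = {
    "uav_id": str,
    "qr_string": str,
    "timestamp": str,
    "position": dict,
    "confidence": (int, float),
}

def validate_payload(payload):
    """Return True only for a well-formed detection event."""
    if not isinstance(payload, dict):
        return False
    for field, typ in REQUIRED.items():
        if field not in payload or not isinstance(payload[field], typ):
            return False
    # reject confidence scores outside [0, 1]
    return 0.0 <= payload["confidence"] <= 1.0
```

Rejecting malformed records at the boundary keeps the inventory database consistent even if a GCS instance misbehaves or a packet is corrupted in transit.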
(2) Inventory Database Synchronization
After validating each JSON payload, the host server updates the inventory-management database and emits an event to the dashboard. Lightweight prototyping used SQLite, while PostgreSQL and NoSQL databases (MongoDB) were set up to support larger deployments. Each scanned QR string is converted to an equivalent digital record, with the item’s status updated to one of the following states:
  • Scanned (Item successfully detected and timestamp logged)
  • Duplicate (Item re-scanned within a defined time threshold; flagged for operator review)
  • Missing (Item not found in the expected shelf range in the scanning cycle)
The database also supports historical logging: a new session record is generated for every UAV scan cycle. This design facilitates future audits, the association of trajectories with scans, and compliance reporting for industries that require traceability of item movements and confirmation of stock.
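The three item states can be prototyped directly on the SQLite backend mentioned above. The table schema and function names below are illustrative assumptions; the 1.5 s dwell follows the de-duplication window used in the evaluation protocol.

```python
# Sketch of the per-item status logic (Scanned / Duplicate / Missing) on an
# in-memory SQLite prototype. Schema and function names are assumptions.
import sqlite3

DEDUP_WINDOW_S = 1.5  # minimum inter-hit dwell before a re-read is flagged

def open_db():
    db = sqlite3.connect(":memory:")
    db.execute("CREATE TABLE items (qr TEXT PRIMARY KEY, status TEXT, last_seen REAL)")
    return db

def record_scan(db, qr, t):
    """Insert or update an item record; flag re-reads inside the dwell window."""
    row = db.execute("SELECT last_seen FROM items WHERE qr = ?", (qr,)).fetchone()
    if row is not None and t - row[0] < DEDUP_WINDOW_S:
        status = "Duplicate"   # re-scanned within the time threshold
    else:
        status = "Scanned"     # first (or sufficiently spaced) detection
    db.execute("INSERT INTO items (qr, status, last_seen) VALUES (?, ?, ?) "
               "ON CONFLICT(qr) DO UPDATE SET status = ?, last_seen = ?",
               (qr, status, t, status, t))
    return status

def close_cycle(db, expected_ids):
    """At the end of a scan cycle, flag expected IDs never seen as Missing."""
    seen = {r[0] for r in db.execute("SELECT qr FROM items")}
    for qr in expected_ids - seen:
        db.execute("INSERT INTO items VALUES (?, 'Missing', NULL)", (qr,))
```

Note that the `ON CONFLICT ... DO UPDATE` upsert requires SQLite 3.24 or later.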

2.7. Real-Time Dashboard and Cloud Visualization

To give operators better situational awareness, we designed a real-time dashboard that displays UAV flight paths, scan results, and system alerts. By combining technical processes with an intuitive interface, the dashboard improves monitoring and decision-making.
(1)
Flask-SocketIO Dashboard
The dashboard is built with Flask-SocketIO to enable two-way communication between the server and the client. Instead of periodic polling, SocketIO delivers event-driven updates whenever a UAV scan or trajectory change occurs.
The dashboard features:
Real-time UAV position tracking: Each drone is shown as a moving marker on a warehouse floorplan. Telemetry continuously updates positions so operators can monitor spatial coverage live.
QR detection logs: A timeline records decoded QR strings with associated UAV IDs and timestamps, giving an ongoing view of scanning progress.
Alerts: Automatic notifications flag duplicate detections, low-confidence scans, and under-scanned shelves, cutting down the need for manual checks.
Measured latency for dashboard updates averages 0.3–0.5 s, providing near real-time responsiveness suitable for industrial environments.
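The alert logic above (duplicate detections, low-confidence scans, under-scanned shelves) can be prototyped as a small server-side classifier whose output the Flask-SocketIO layer would emit with each event. The rule names and thresholds are assumptions, since the paper specifies only the alert categories.

```python
# Hypothetical sketch of the dashboard alert rules; strings like
# "duplicate-detection" and the thresholds are illustrative assumptions.
def classify_alerts(scan, min_conf=0.8):
    """Per-scan alerts: duplicate detections and low-confidence reads."""
    alerts = []
    if scan.get("duplicate"):
        alerts.append("duplicate-detection")
    if scan["confidence"] < min_conf:
        alerts.append("low-confidence-scan")
    return alerts

def shelf_coverage_alerts(expected_per_shelf, seen_per_shelf):
    """End-of-pass alert: shelves whose unique-read count fell short."""
    return sorted(s for s, n in expected_per_shelf.items()
                  if seen_per_shelf.get(s, 0) < n)
```

Keeping the rules server-side means the dashboard client only renders events, which is consistent with the event-driven (rather than polling) design described above.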
(2)
Cloud Inventory Interface
The inventory interface extends the dashboard’s functionality to cloud-based analysis and storage. Operators can run dynamic queries to retrieve item scan histories, UAV flight paths, and mission summaries. For scalability, historical scan data are organized by UAV ID and mission timestamp, allowing quick retrieval when evaluating performance or generating compliance reports.
Key features include:
Scanned item logs: exportable in CSV or JSON format.
UAV trajectories: visual plots showing each UAV’s flight path overlaid with shelf-scanning patterns.
Historical comparisons: side-by-side views comparing the current mission with previous missions.
This layered visualization makes UAV scanning automated, auditable, and scalable while keeping operators in control, closing the gap between autonomous sensing and enterprise inventory insight.

3. Simulation and Experimental Analysis

Validation of the UAV-based QR detection system was done in two stages. First, we ran high-fidelity simulations using MATLAB–Simulink integrated with Unreal Engine. Next, we deployed the system in a real-time warehouse testbed. The goal was to verify the full pipeline—from safe trajectory planning and QR-code detection to real-time cloud synchronization—under both controlled simulation scenarios and real-world uncertainty. To evaluate scalability, we examined three operating cases:
(i)
Single UAV mission
(ii)
Dual UAVs operating in the same aisle
(iii)
Dual UAVs operating in different aisles.

3.1. Evaluation Protocols and Metrics

The end-to-end pipeline (trajectory generation, real-time QR detection/decoding, and cloud synchronization) is evaluated first in a photorealistic MATLAB–Simulink + Unreal Engine simulation and then in a mock-warehouse experiment. Unless stated otherwise, signals are sampled at $f_s = 30$ Hz (sampling interval $\Delta t = 1/f_s$). All reported statistics are computed over the mission time $[0, T]$ with robust summaries (mean, 95th percentile) and bootstrap 95% CIs where appropriate.

3.1.1. Time Bases and Synchronization

Each stage produces timestamps on a monotonic clock:
  • $t_{\mathrm{capt}}$: camera capture
  • $t_{\mathrm{dec}}$: QR decode completion (on the GCS)
  • $t_{\mathrm{post}}$: HTTP/WebSocket packet sent
  • $t_{\mathrm{db}}$: database write
  • $t_{\mathrm{dash}}$: dashboard render
To avoid clock skew, all absolute latency metrics are computed on the GCS host using its monotonic time; UAV-side timestamps are used only for relative kinematics. When simulation and hardware are compared, clocks are aligned by cross-correlating fiducial events (e.g., first valid QR) to within one frame.
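Concretely, the per-stage latencies follow as simple differences between these monotonic timestamps, all taken on the GCS host; the dictionary keys in this sketch are illustrative.

```python
# Sketch of per-stage latency extraction from the GCS monotonic timestamps
# (capture, decode, post, database write, dashboard render). Key names assumed.
def stage_latencies(ts):
    """ts maps stage name -> monotonic time on the GCS host (seconds)."""
    return {
        "decode":     ts["t_dec"]  - ts["t_capt"],   # capture -> QR decode
        "transport":  ts["t_post"] - ts["t_dec"],    # decode -> packet sent
        "db_write":   ts["t_db"]   - ts["t_post"],   # packet -> database commit
        "render":     ts["t_dash"] - ts["t_db"],     # commit -> dashboard update
        "end_to_end": ts["t_dash"] - ts["t_capt"],
    }
```

Because every stage is measured on the same monotonic clock, the four stage latencies sum exactly to the end-to-end value, which makes the decomposition useful for locating bottlenecks.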

3.1.2. Ground Truth Sets and Duplicate Policy

Let $S_{\mathrm{exp}} = \{s_1, \dots, s_{N_{\mathrm{expected}}}\}$ be the expected set of inventory IDs on the planned route (rack and shelf indices known a priori). The stream of decoded strings is $Q = \{(q_i, \gamma_i, t_i)\}_{i=1}^{N_{\mathrm{all}}}$, where $q_i$ is the decoded text and $\gamma_i \in [0, 1]$ is the detector confidence.
A read is valid if (i) $q_i \in S_{\mathrm{exp}}$ and (ii) $\gamma_i \ge \gamma_{\min}$ (we use $\gamma_{\min} = 0.8$).
To prevent inflated counts, de-duplication is applied per ID with a minimum inter-hit dwell $\Delta t_{\mathrm{dedup}}$ (e.g., 1.5 s). Define the unique-valid set:
$$S_{uv} = \{\, q \in S_{\mathrm{exp}} \mid \exists i : q_i = q,\ \gamma_i \ge \gamma_{\min} \,\}, \qquad N_{\mathrm{unique\ valid}} = |S_{uv}|$$
Optionally, we also report precision/recall and F1 to characterize false reads:
$$\mathrm{Precision} = \frac{TP}{TP + FP}, \qquad \mathrm{Recall} = \frac{TP}{TP + FN}, \qquad F_1 = \frac{2 \cdot \mathrm{Precision} \cdot \mathrm{Recall}}{\mathrm{Precision} + \mathrm{Recall}}$$

3.1.3. QR Detection Accuracy

$$\mathrm{Acc} = \frac{N_{\mathrm{unique\ valid}}}{N_{\mathrm{expected}}} \times 100\%$$
This metric measures coverage completeness over the expected items (after de-duplication and confidence gating).
We also track duplicate rate and rescan recovery:
$$\mathrm{DupRate} = \frac{N_{\mathrm{duplicates}}}{N_{\mathrm{all}}}, \qquad \mathrm{RescanRec} = \frac{N_{FN \to TP\ (\text{next pass})}}{N_{FN}}$$

3.1.4. Coverage Time and Throughput

Coverage time $T_{\mathrm{cov}}$ is the wall-clock duration from take-off to the timestamp at which the last expected ID is confirmed (unique and valid). Throughput is
$$\mathrm{QR/s} = \frac{N_{\mathrm{unique\ valid}}}{T_{\mathrm{cov}}}, \qquad \mathrm{Aisle\ m/s} = \frac{\text{path length (aisle)}}{T_{\mathrm{cov}}}$$
For multi-UAV studies, we also report the speed-up
$$\mathrm{Speedup} = \frac{T_{\mathrm{single}}}{T_{\mathrm{multi}}}$$
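Taken together, the metrics of Section 3.1 can be computed from a decoded-read stream as in the following sketch. The gating threshold (0.8) and dwell (1.5 s) follow the protocol above; the function names and sample data are our own.

```python
# Worked sketch of the evaluation metrics: confidence gating, membership
# check against the expected set, dwell-based de-duplication, accuracy,
# duplicate rate, throughput, and multi-UAV speed-up.
def evaluate(reads, expected, gamma_min=0.8, dt_dedup=1.5):
    """reads: list of (qr_text, confidence, t). Returns (unique-valid set,
    accuracy in %, duplicate rate) per the Section 3.1 definitions."""
    last_hit, unique_valid, duplicates = {}, set(), 0
    for q, g, t in sorted(reads, key=lambda r: r[2]):
        if q not in expected or g < gamma_min:
            continue                     # membership / confidence gate
        if q in last_hit and t - last_hit[q] < dt_dedup:
            duplicates += 1              # re-hit inside the dwell window
        else:
            unique_valid.add(q)
        last_hit[q] = t
    acc = 100.0 * len(unique_valid) / len(expected)
    return unique_valid, acc, duplicates / len(reads)

def throughput(n_unique_valid, t_cov):
    return n_unique_valid / t_cov        # QR/s over the coverage time

def speedup(t_single, t_multi):
    return t_single / t_multi            # multi-UAV speed-up factor
```

For example, halving a 268 s single-UAV mission to 134 s gives a speed-up of exactly 2.0, the two-server ideal referenced in Case III.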

3.2. Simulation Setup

The simulation framework was constructed in MATLAB–Simulink and coupled with Unreal Engine for photorealistic rendering, where UAV trajectories were modeled using Translation and Rotation blocks in Simulink, producing Cartesian waypoints Pi = (xi,yi,zi). Safety-aware replanning was enabled using LiDAR-based distance sensing. A fisheye camera block produced RGB frames for QR decoding, and a LiDAR block streamed point clouds for obstacle clearance verification. Unreal Engine scenes rendered multi-tier racks populated with QR-labeled boxes (e.g., Item#A14-R5), with controlled illumination and mild occlusions.
Case-I: Single Drone Simulation Results
To establish a baseline, the first case considered a single UAV performing inventory scanning within a warehouse environment. The simulation was implemented in MATLAB-Simulink (Figure 4), where translation and rotation blocks governed the UAV’s motion, while the onboard fisheye camera and LiDAR sensors provided visual and geometric feedback. The system was evaluated in a modeled warehouse in Unreal Engine containing racks tagged with QR-coded boxes.
The UAV path is initially validated by input signals for translation motion and rotational motion, as shown in Figure 5. The input signals showed smooth variation characteristics, hence incremental variation consistent with the designated UAV movement for each rack position. The motion characteristics depicted by the UAV were stable during the entire task duration, without oscillations or drifting away.
The UAV’s scanning behaviour as it moved through the aisles is evident in the top and side views of the simulated environment (Figure 6a,b). The UAV kept a consistent distance from the shelves in both views, guaranteeing the best possible camera coverage for QR recognition. The UAV’s accurate perception of rack structures and avoidance of collisions during path execution were further confirmed by LiDAR-based point cloud visualisation (Figure 7). The robustness of the system’s environmental perception layer is demonstrated by the dense and reliable point cloud representation.
Performance metrics shown in Table 1 for the single-drone case reveal a QR detection accuracy of 95.5% across all tested rack positions. The average synchronization latency between QR code detection and dashboard update was approximately 409 ms, while the mean trajectory deviation remained under 7.8 cm compared to the planned path. These results demonstrate that even a single UAV can reliably perform indoor inventory scanning when equipped with safety-aware trajectory planning and offloaded ground-based QR processing.
The motion program executed shelf-to-shelf dwells with two turn-backs, while a fisheye front camera and a 3D LiDAR supplied the perception stack. Across five runs, unique-code coverage averaged 95.8% ± 0.6%, and the decode-to-dashboard latency was 0.41 ± 0.02 s (p95: 0.50 ± 0.03 s). RMS trajectory error remained at 7.6 ± 0.8 cm, with peaks confined to cornering segments, consistent with the confidence dips visible in Figure 7. The safety monitor reported $r_{\min}$ = 0.65 ± 0.05 and $R_f^{\max}$ = 0.33 ± 0.03, yielding 99.2 ± 0.4% of time within the safe set. These results indicate the controller keeps the vehicle within the calibrated perception envelope while satisfying a 1 s UI-latency budget, thereby establishing a robust baseline for the hardware experiments.
Case II—Two UAVs Scanning the Same Rack (Simulation Results and Analysis)
Two identical UAVs are tasked to scan the same three-level rack from opposite ends. Both vehicles run the same plant–sensor stack as Case I (Translation/Rotation blocks, fisheye camera, and 3-D LiDAR), instantiated as duplicated subsystems in a shared Simulink scene (Figure 1). Safety and de-confliction are enforced by:
  • Separation and risk limits: $s(t) \ge s_{\min} = 1.2$ m and $R_f(p) \le 0.4$
  • Right-of-way policy: a token-based rule at the GCS pauses the follower whenever the predicted time-to-conflict $\tau_c(\Delta x, \Delta y, \Delta v) < 3$ s
  • Lane keeping: rack-aligned, mirrored serpentine trajectories with shelf-center dwells.
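A minimal one-dimensional sketch of the right-of-way trigger above, assuming a constant-velocity closing model along the aisle (the full $\tau_c$ also depends on $\Delta y$, which is omitted here); all names and the simplified geometry are illustrative.

```python
# Hypothetical 1-D time-to-conflict check for the token-based right-of-way
# rule: the follower pauses when the predicted conflict is under 3 s away.
import math

S_MIN, TAU_LIMIT = 1.2, 3.0   # separation floor (m) and conflict horizon (s)

def time_to_conflict(dx, dv):
    """Seconds until the gap |dx| shrinks to S_MIN at closing speed dv.
    Returns +inf when the vehicles are not closing, 0 when already inside."""
    gap = abs(dx) - S_MIN
    if gap <= 0:
        return 0.0                 # separation floor already violated
    if dv <= 0:
        return math.inf            # gap is constant or opening
    return gap / dv

def follower_must_pause(dx, dv):
    return time_to_conflict(dx, dv) < TAU_LIMIT
```

Because the check runs at the GCS on each pose update, pausing the follower rather than replanning keeps the translation setpoints untouched, matching the scheduling-only de-confliction described below.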
The Simulink diagram illustrated in Figure 8 instantiates a two-agent, shared-environment co-simulation in which each UAV is a copy of the same dynamical subsystem, with outputs mapped to a camera and a LiDAR sensor model. A single “scene” block provides the photorealistic warehouse and enforces a common world frame. Each agent publishes its pose at a fixed simulation tick; the scene block uses those transforms to render synchronized image frames and organized point clouds for each agent. Time-stamping ensures that perception and control are evaluated on the same discretization, enabling closed-loop guarantees (e.g., separation constraints) to be expressed as algebraic inequalities at the solver step. The duplication of the translation/rotation chains and the shared sensor pipelines make the experiment controlled: any difference in performance arises from interaction effects, not model mismatch.
The two-UAV configuration is deliberately mirror-symmetric about the rack axis (Figure 9a), with UAV-A and UAV-B starting at opposite ends and equal altitudes so that any subsequent kinematic differences stem from coordination logic rather than geometry or lighting. The top view (Figure 9b) shows strict lane centering and aisle separation enforced by a bounded lateral controller, while the global constraint $s(t) \ge s_{\min}$ limits the problem to longitudinal scheduling. Fisheye frames with QR overlays and simultaneous scope traces (Figure 10) explain the image–motion coupling: the blue waveform is the bounded lateral residual from gimbal-stabilized micro-hovers; the orange staircase is the discrete shelf-level command $z(t)$ that realizes hover–scan–advance.
The outcome is visible in the aggregate plots. The 3-D scatter of detections in (x,y,z) (Figure 11) forms two interleaved color sequences along each shelf, indicating complementary (not duplicate) coverage achieved by mirrored trajectories and cloud-side de-duplication. The distance-traveled profile (Figure 12) retains the stop–scan–go staircase, but plateaus shorten relative to Case I due to mutual illumination and the follower’s adaptive velocity cap in a 2 m look-ahead, which reduces time-to-confidence without changing translation setpoints. The time histogram of unique reads (Figure 13) shows paired clusters per segment, the follower trailing the leader by ≈2–3 s, matching the scheduler’s de-confliction dwell derived from predicted time-to-conflict; this validates safe temporal separation, near-doubling of service rate, and maintenance of end-to-end latency within budget.
Five independent trials were conducted (Table 2) with identical rack layouts and lighting to isolate the effect of two-UAV coordination. Across runs, total mission time decreased by 48–52% relative to the single-UAV baseline, confirming near-doubling of throughput under mirrored serpentine coverage. Cloud de-duplication constrained repeated reads to ≤3.5%, while QR detection accuracy remained high at 95.6% (vs. 96.2% in Case I). End-to-end latency from decode to dashboard update stayed within 0.34–0.49 s per feed using multi-threaded GCS processing, meeting the near real-time visualization target. Safety margins were consistently respected: the minimum logged inter-UAV separation was 1.28 m (above the 1.2 m threshold), and the maximum risk index peaked at $R_f^{\max} = 0.31$ (below the 0.40 limit). Path fidelity was likewise maintained, with mean absolute trajectory deviation below 8.5 cm for UAV-A and below 9.3 cm for UAV-B; deviations concentrated at cornering and shelf-level transition segments.
Taken together, these outcomes demonstrate that coordinated, tokenized right-of-way with separation control can nearly halve mission duration without degrading perception quality or safety. The slight accuracy drop relative to Case I is attributable to transient self-occlusions at rack end-caps when the agents cross at level changes; latency, separation, and risk nevertheless remain within their specified budgets.
Case III: Two UAVs Scanning Different Racks (Parallel Racks, Independent Sectors)
This simulation evaluates a dual-agent configuration in which UAV-A and UAV-B scan two parallel racks concurrently. The Simulink model duplicates the UAV subsystem (Translation/Rotation, fisheye camera, and 3D LiDAR) and connects both agents to a single photorealistic scene (Unreal Engine) for synchronized rendering and depth sensing (Figure 14). A lightweight “sector allocator” assigns each vehicle a disjoint rack and a non-overlapping QR-ID range; consequently, cloud de-duplication serves only as a safety net for accidental cross-views. Inter-UAV coordination is minimal compared with Case II because the workspaces are disjoint: the separation constraint $s(t) \ge s_{\min} = 1.2$ m and risk bound $R_f(p) \le 0.4$ are satisfied trivially by the fixed aisle gap between racks, i.e., $s(t) \ge w_{\mathrm{aisle}} - (H_A \oplus H_B)$, where $H$ denotes the vehicle safety hull and $\oplus$ the Minkowski sum. Each vehicle executes the same rack-aligned lane-keeping policy as in Case I: a bounded lateral-error controller keeps the body-frame $y$ within a tube around the aisle centerline, while the vertical command $z(t)$ advances in discrete shelf-level steps (hover–scan–advance). Speed is modulated by a confidence-aware controller that throttles forward velocity when the per-frame decode confidence dips below threshold; because camera exposure dynamics are independent across racks, these throttles occur asynchronously and do not induce head-of-line blocking.
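The sector allocator amounts to a disjoint assignment of racks and QR-ID ranges to vehicles. The sketch below assumes a simple indexed mapping, which is an illustration rather than the authors' implementation.

```python
# Hypothetical sketch of the lightweight sector allocator: each UAV receives
# a disjoint rack and a non-overlapping QR-ID range. Names are illustrative.
def allocate_sectors(uav_ids, racks, id_ranges):
    """Index-aligned assignment of disjoint racks/ID ranges to UAVs."""
    if len(racks) < len(uav_ids) or len(id_ranges) < len(uav_ids):
        raise ValueError("need at least one rack and one ID range per UAV")
    return {u: {"rack": racks[i], "qr_range": id_ranges[i]}
            for i, u in enumerate(uav_ids)}
```

Because the assigned racks and ID ranges never overlap, any read outside a vehicle's own range can be discarded as an accidental cross-view, which is why cloud de-duplication is only a safety net in this case.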
The qualitative evidence from the scene and sensor views confirms correct operation. The Simulink diagram shows two independent UAV pipelines publishing pose to the shared scene and receiving synchronized image and point-cloud streams. Unreal Engine oblique views illustrate the initial placement: UAV-A faces Rack-L and UAV-B faces Rack-R at matched heights (Figure 15). During execution, both fisheye feeds display stable QR overlays with minimal motion blur; the lateral residuals visible on the scope traces remain bounded, indicating the inner attitude loops reject aisle-wise disturbances without exciting cross-track oscillations. The merged LiDAR point cloud reconstructs the two rack façades as clean planes with well-separated returns; no overlap within the smin envelope appears even at closest lateral co-presence (Figure 16). From the trajectory logs, distance traveled vs time retains the staircase profile characteristic of stop–scan–go, but now the two staircases progress in parallel with minor phase drift, reflecting asynchronous shelf transitions.
Because the racks are scanned in parallel with negligible contention, the effective throughput approaches the two-server ideal. Across repeated simulations (Table 3) with identical lighting and rack content, the overall completion time for both racks is 1.86–1.95× faster than a single UAV scanning the same two racks sequentially (i.e., a 46–49% reduction in wall-clock time relative to Case I performed twice). Duplicate reads are effectively eliminated (typically <0.5%) because the sector allocator prevents intentional cross-views; occasional cross-aisle glimpses are collapsed by the cloud de-duplication window. QR detection accuracy for each vehicle remains comparable to the single-agent baseline (≈95–96%), and end-to-end latency from decode to dashboard display stays within 0.33–0.48 s per feed owing to multi-threaded ingestion at the GCS. Safety margins are generous: the minimum logged inter-UAV distance exceeds the aisle width minus the safety hulls (commonly 2–3 m, well above $s_{\min}$), and the maximum computed risk index stays low ($R_f^{\max} \lesssim 0.25$). Path fidelity remains high, with mean absolute trajectory deviation in the 8–9 cm range and peaks localized to shelf cornering during level changes. Collectively, these results show that partitioning the warehouse into independent sectors yields near-linear scaling in throughput while preserving the accuracy, latency, and safety envelopes established in Case I.

3.3. Experimental Validation

Experimental Setup

Experiments were conducted in an indoor mock-warehouse assembled from modular steel racks arranged as parallel aisles. Corrugated cartons bearing printed QR tags were mounted on three vertical levels, with tags centered on the outward-facing panels to guarantee consistent visibility during pass-by scans (Figure 17b). The aisle width and shelf spacing were chosen to replicate the simulated geometry and to allow safe, low-speed flight near the rack faces. Ambient room lighting was kept constant throughout the trials, and a clearly marked take-off/landing pad defined a protected staging area in front of the racks.
The scanning platform was a commercial off-the-shelf quadrotor (DJI Air 2S (Shenzhen Dajiang Innovation Technology Co., Ltd., Shenzhen, China); Figure 17a) equipped with a three-axis stabilized, wide-FOV RGB camera. The camera stream was relayed through the handheld controller to a laptop ground station, where MATLAB decoded QR payloads and logged time-stamped detections together with vehicle pose and run metadata. Flights were executed in stabilized indoor mode with a safety pilot on standby for manual override. This setup mirrors the sensing and geometry used in simulation while exercising a production-grade airframe and optics, enabling a like-for-like comparison between simulated and real runs.
Case I: Single Drone Experimental Results
Real-time trials (Figure 18) were conducted in a mock warehouse aisle with three shelf levels and QR-labeled cartons. A single DJI Air 2S executed a serpentine lane-keeping pattern while a ground control laptop (GCS) handled live video ingest, QR decoding, logging, and supervisory safety checks. The flight operated fully in closed loop: the camera stream was decoded on the GCS, decoded IDs were time-stamped and pushed to the dashboard, and the vehicle advanced only after a valid read or a timeout. All telemetry (pose estimates, hover/translation events, controller modes) and all perception events (frame ingress, decode, dashboard update) were recorded using a common clock so that kinematic and perception metrics could be computed on the same timeline. Five independent runs were performed with identical rack content and lighting.
Across five repetitions (Table 4), QR detection accuracy remained stable (mean ≈ 90.5%), while end-to-end mean latency clustered around 412 ms with p95 ≈ 539 ms, consistent with a single-stream pipeline and on-board logging. Path-tracking errors were low (RMS 8.0 cm on average), and the vehicle stayed in the safety set for >99% of mission time, with minimum clearances of 0.41–0.46 m and a peak risk index $R_f$ of 0.31–0.35. Mission time per rack ranged from 129 to 140 s (mean ≈ 134 s), and duplicate reads were 6.2–7.5% (mean ≈ 6.84%), reflecting the deliberate hover-scan routine and the absence of cloud de-duplication from a second agent. Overall, the single-UAV baseline demonstrates reliable QR capture and tight lane-keeping, providing a solid comparator for the dual-UAV cases.
Case II—Two UAVs Scanning the Same Rack
Two vehicles are assigned the same three-level rack from opposite ends and follow mirrored serpentine lanes. Lane-keeping uses a bounded lateral-error controller, while inter-UAV safety is enforced by the separation constraint $s(t) \ge s_{\min} = 1.2$ m and a sigmoid risk bound $R_f(p) \le 0.4$. A tokenized right-of-way rule, triggered when the predicted time-to-conflict $\tau_c < 3$ s, pauses the follower at shelf corners and level transitions. Both agents run identical camera pipelines and publish detections to a cloud de-duplication service so that only unique reads are counted.
Aggregate performance mirrors the simulation trends. Complementary coverage is obtained along each shelf with minimal spatial overlap, and duplicate reads remain ≤3.5% (Table 5). Mission time is reduced by ≈48–52% relative to the single-UAV baseline because hover plateaus shorten under confidence-aware speed control. Detection accuracy stays high (≈88.6%), end-to-end latency per feed remains similar to simulation (355–389 ms), and safety margins are respected (minimum logged separation ≈ 1.28 m; peak risk $R_f^{\max}$ ≈ 0.32 < 0.4). Path fidelity was maintained, with mean RMS trajectory deviations of approximately 8.3 cm for UAV-A and 9.1 cm for UAV-B and the largest deviations at shelf end-caps. Across runs, duplicate reads averaged 3.3%.
Case III—Two UAVs Scanning Different Racks (Parallel Racks, Independent Sectors)
In this scenario (Figure 19), both UAVs operate on parallel racks that are partitioned into independent sectors. The guidance layer assigns each vehicle a serpentine lane set on its own rack, eliminating spatial contention; the separation constraint $s(t) \ge s_{\min} = 1.2$ m is trivially satisfied except at endpoints, and the risk index cap $R_f(p) \le 0.4$ is never approached during steady scanning. Camera pipelines (fisheye model, identical exposure program) and LiDAR streams run concurrently; detections are pushed to the GCS, where cloud-side de-duplication is still enabled, although collisions are rare because the sectors do not overlap. This configuration is designed to test near-linear scaling of throughput with two independent servers working in parallel on disjoint inventories.
The experimental results generally mirrored the simulation trends in terms of efficiency and safety but showed a notable decrease in QR detection accuracy (Table 6), an 8–12% deviation from simulation (UAV-A: 86.28%, UAV-B: 85.98%), while end-to-end latency remained low (mean 0.342–0.361 s per feed; p95 0.476–0.479 s). Trajectory quality remained tight (RMS deviation of 8.32 cm for UAV-A and 8.92 cm for UAV-B), with time-in-safe-set averaging 99.46%. Because the racks are disjoint, the minimum logged inter-UAV separation never fell below 2.19 m, and the maximum risk index stayed at $R_f^{\max} = 0.25$, well under the 0.4 threshold. Mean mission time per run was 66.4 s, and duplicate reads averaged 0.38%, driven mostly by end-cap visibility overlaps rather than true spatial contention. Overall, Case III validates that distributing work across independent sectors yields near-ideal parallel efficiency while preserving safety margins and data quality.

4. Results and Discussion

This section presents the quantitative results obtained from the simulation (MATLAB–Simulink/Unreal Engine) and the experimental analysis. Performance is analyzed across the three operational cases defined previously: a single UAV (Case I), dual UAVs in the same aisle (Case II) (Figure 20), and dual UAVs in different aisles (Case III). Finally, a comparative analysis evaluates the fidelity of the simulation against real-world outcomes.
To evaluate the fidelity of the simulation environment against the physical deployment for the Case I single-drone configuration, a comparative analysis of key performance metrics was conducted. The relative error is calculated using the simulation mean as the reference baseline. As shown in Table 7, the experimental results generally align with simulation predictions in terms of latency and trajectory. However, a notable difference is observed in QR accuracy, reflecting the expected detection loss in real-world conditions.
The discrepancy in QR Accuracy (5.26% error) highlights the challenge of perfectly modelling real-world visual sensor performance and environmental lighting in a simulation. While the Unreal Engine rendering is photorealistic, subtle real-world imperfections lead to a tangible drop in detection performance. The close correspondence in Mean Latency (0.66% error) and RMS Trajectory Deviation (3.49% error) validates the accuracy of the network and dynamic modeling used in the simulation.
A comparative analysis was conducted to evaluate the fidelity of the simulation against the experimental results for Case II, where two drones detect QR codes on the same rack. As shown in Table 8, the experimental results for latency, trajectory deviation, mission time, and safety metrics show remarkable alignment with the simulation, with relative errors well under 1%. However, a significant discrepancy is observed in the combined QR accuracy, with a relative error of 7.46%, reflecting the expected 5–10% detection loss in the real-world environment.
This analysis confirms that while the simulation accurately models the system’s dynamics, network performance, and multi-agent coordination strategies, it is less predictive of the absolute QR detection performance in a real-world setting with complex lighting and occlusions.
A comparative analysis was conducted to evaluate the fidelity of the simulation against the experimental results for Case III. As shown in Table 9, the experimental results for latency, trajectory deviation, mission time, and safety metrics show remarkable alignment with the simulation, with relative errors well under 1%. However, a significant discrepancy is observed in the combined QR Accuracy, with a relative error of 10.05%, reflecting the expected 8–12% detection loss in the real-world environment.
Figure 21 provides a comprehensive summary of the relative error between simulated predictions and experimental results across the three distinct operational cases. The comparison reveals that the simulation achieves exceptionally high fidelity for kinematic and temporal metrics, with Mission Time and Minimum Separation showing negligible error across all scenarios, and Mean Latency and RMS Trajectory Deviation exhibiting minimal deviation, particularly in multi-UAV operations. The primary source of discrepancy is isolated to QR Accuracy, where the relative error increases with scenario complexity from approximately 5.3% in Case I to over 10% in Case III. This indicates that while the simulation environment is highly reliable for validating system dynamics, coordination strategies, and safety protocols, it is less predictive of absolute optical detection performance under real-world lighting and occlusion conditions.

5. Conclusions

This research work has presented a comprehensive, end-to-end framework for efficient UAV-based QR detection and inventory management for indoor environments. Using Simulink and Unreal simulation along with an effective experimental setup, including LiDAR-based clearance modeling and ground-side QR processing, the system successfully demonstrated real-time inventory synchronization to a live dashboard.
Extensive evaluation with experimental validation across single and multi-UAV operation modes confirmed the system’s reliability, highlighting the realities of physical deployment. In validating simulations, the framework maintained excellent data quality with QR detection accuracies of approximately 95–96%. Experimental validation in a warehouse testbed demonstrated the robustness of the system, achieving QR detection accuracies between roughly 86% and 90.5%. Across both environments, the system maintained high responsiveness with low latencies of 0.34–0.49 s per video feed. The proposed safety mechanisms, including the risk-factor field and replanning, strictly enforced safe operations under realistic conditions in both simulation and experiment. Trajectories were kept within aisle bounds with RMS path errors of 8–10 cm, maintaining minimum separation distances of at least 1.28 m and a maximum risk factor of 0.32 or less. Compared to sequential baselines, dual-UAV missions achieved time reductions of 48–52% for same-rack scanning and 46–49% for parallel-rack scanning, with duplicate reads effectively managed by the cloud-based workflow. Collectively, these results validate the practicality of autonomous UAVs for managing inventory in narrow, cluttered warehouse aisles and demonstrate a clear path to scalable multi-agent operations without compromising safety.
Future research will focus on whole-warehouse deployment with dynamic rack placement and on enhancing real-world sensing robustness. Other major development directions are on-board fusion of visual-inertial odometry with signed-distance maps to improve replanning capability, adaptive scheduling for larger UAV fleets, and support for mixed item identifiers (e.g., QR/RFID), enabling reduced dwell times and more robust coverage across a wide range of environmental conditions.

Author Contributions

Conceptualization, E.P. and B.K.P.; Methodology, E.P.; Software, S.T.; Validation, E.P., B.K.P. and S.T.; Formal Analysis, E.P.; Investigation, E.P.; Resources, B.K.P.; Data Curation, S.T.; Writing—Original Draft Preparation, E.P.; Writing—Review & Editing, E.P., B.K.P. and S.T.; Visualization, S.T.; Supervision, B.K.P. All authors have read and agreed to the published version of the manuscript.

Funding

This research received no external funding.

Data Availability Statement

The data presented in this study are available on request from the corresponding author. The data are not publicly available due to restrictions related to ongoing research and institutional policies.

Conflicts of Interest

The authors declare no conflicts of interest.

References

1. Baharudin, H. AI in E-Commerce Warehouse Management: Enhancing Operational Efficiency, Ensuring Inventory Precision, and Strengthening Security Measures; SSRN (Elsevier): Rochester, NY, USA, 2023.
2. Rathee, M.M.; Rani, P. Warehouse Management and Inventory Control; Literatureslight Publishing: Jashpur Nagar, India, 2025.
3. Hanzel, K. Modern Warehouse and Delivery Object Monitoring–Safety, Precision, and Reliability in the Context of the Use the UWB Technology. In European, Mediterranean, and Middle Eastern Conference on Information Systems; Springer Nature: Cham, Switzerland, 2022; pp. 451–462.
4. Lin, H.Y.; Chang, K.L.; Huang, H.Y. Development of unmanned aerial vehicle navigation and warehouse inventory system based on reinforcement learning. Drones 2024, 8, 220.
5. Kwon, W.; Park, J.H.; Lee, M.; Her, J.; Kim, S.H.; Seo, J.W. Robust autonomous navigation of unmanned aerial vehicles (UAVs) for warehouses’ inventory application. IEEE Robot. Autom. Lett. 2019, 5, 243–249.
6. Malang, C.; Charoenkwan, P.; Wudhikarn, R. Implementation and critical factors of unmanned aerial vehicle (UAV) in warehouse management: A systematic literature review. Drones 2023, 7, 80.
7. Mohsan, S.A.H.; Othman, N.Q.H.; Li, Y.; Alsharif, M.H.; Khan, M.A. Unmanned aerial vehicles (UAVs): Practical aspects, applications, open challenges, security issues, and future trends. Intell. Serv. Robot. 2023, 16, 109–137.
8. Cristiani, D.; Bottonelli, F.; Trotta, A.; Di Felice, M. Inventory management through mini-drones: Architecture and proof-of-concept implementation. In Proceedings of the 2020 IEEE 21st International Symposium on “A World of Wireless, Mobile and Multimedia Networks” (WoWMoM); IEEE: New York, NY, USA, 2020; pp. 317–322.
9. Radácsi, L.; Gubán, M.; Szabó, L.; Udvaros, J. A path planning model for stock inventory using a drone. Mathematics 2022, 10, 2899.
10. Xu, L.; Kamat, V.R.; Menassa, C.C. Automatic extraction of 1D barcodes from video scans for drone-assisted inventory management in warehousing applications. Int. J. Logist. Res. Appl. 2018, 21, 243–258.
11. Yoon, B.; Kim, H.; Youn, G.; Rhee, J. 3D position estimation of objects for inventory management automation using drones. Appl. Sci. 2023, 13, 10830.
12. Agrawal, S.; Patle, B.K.; Sanap, S. A Novel Technique for Drone Path Planning Based on a Neighborhood Dragonfly Algorithm. Sensors 2025, 25, 863.
13. Dujari, R.; Patel, B.; Patle, B.K. Fast and Efficient Drone Path Planning Using Riemannian Manifold in Indoor Environment. Automation 2024, 5, 450–466.
14. Xiao, R.; Wang, S.; Xie, Y.; Zhang, Y.; Xie, S.Q. Safety-Aware UAV Formation Scheme for Guiding UGVs Through Obstacle-Laden Environments. IEEE Robot. Autom. Lett. 2025, 10, 6999–7006.
15. Tsakiridis, S.; Papakonstantinou, A.; Kapandelis, A.; Mastorocostas, P.A.; Tsimpiris, A.; Varsamis, D. Optimizing UAV-based inventory detection and quantification in industrial warehouses: A LiDAR-driven approach. WSEAS Trans. Syst. 2024, 23, 121–127.
16. Gurtner, A.; Greer, D.G.; Glassock, R.; Mejias, L.; Walker, R.A.; Boles, W.W. Investigation of fish-eye lenses for small-UAV aerial photography. IEEE Trans. Geosci. Remote Sens. 2009, 47, 709–721.
17. Fan, Y. Flight control system simulation for quadcopter unmanned aerial vehicle (UAV) based on Matlab Simulink. J. Phys. Conf. Ser. 2022, 2283, 012011.
18. Buck, A.; Camaioni, R.; Alvey, B.; Anderson, D.T.; Keller, J.M.; Luke, R.; Scott, G. Unreal engine-based photorealistic aerial data generation and unit testing of artificial intelligence algorithms. In Geospatial Informatics XII; SPIE: Bellingham, WA, USA, 2022; Volume 12099, pp. 59–73.
19. González-Sieira, A.; Cores, D.; Mucientes, M.; Bugarín, A. Autonomous navigation for UAVs managing motion and sensing uncertainty. Robot. Auton. Syst. 2020, 126, 103455.
20. Koubâa, A.; Qureshi, B.; Sriti, M.F.; Javed, Y.; Tovar, E. A service-oriented Cloud-based management system for the Internet-of-Drones. In Proceedings of the 2017 IEEE International Conference on Autonomous Robot Systems and Competitions (ICARSC); IEEE: New York, NY, USA, 2017; pp. 329–335.
21. Osama, M.; Ateya, A.A.; Ahmed Elsaid, S.; Muthanna, A. Ultra-reliable low-latency communications: Unmanned aerial vehicles assisted systems. Information 2022, 13, 430.
22. Masnavi, H.; Shrestha, J.; Kruusamäe, K.; Singh, A.K. VACNA: Visibility-aware cooperative navigation with application in inventory management. IEEE Robot. Autom. Lett. 2023, 8, 7114–7121.
23. Yang, S.Y.; Jan, H.C.; Chen, C.Y.; Wang, M.S. CNN-Based QR Code Reading of Package for Unmanned Aerial Vehicle. Sensors 2023, 23, 4707.
24. Moreira, M.S.M.; Villa, D.K.D.; Sarcinelli-Filho, M. Controlling a virtual structure involving a UAV and a UGV for warehouse inventory. J. Intell. Robot. Syst. 2024, 110, 121.
25. Stanko, J.; Stec, F.; Palkovic, L.; Rodina, J.; Rau, D. Towards automatic inventory checking using an autonomous unmanned aerial vehicle. In Proceedings of the 2022 IEEE 27th International Conference on Emerging Technologies and Factory Automation (ETFA); IEEE: New York, NY, USA, 2022; pp. 1–8.
26. Karamitsos, G.; Bechtsis, D.; Tsolakis, N.; Vlachos, D. Unmanned aerial vehicles for inventory listing. Int. J. Bus. Syst. Res. 2021, 15, 748–756.
Figure 1. Proposed Method System Overview with Data Fusion and Visualization.
Figure 2. Workflow of UAV video streaming and QR code detection at the Ground Control Station (GCS).
Figure 3. JSON Object File.
Figure 4. MATLAB-Simulink simulation block diagram.
Figure 5. Input signal plots (InputSignals.png) displaying UAV translation and rotation commands.
Figure 6. (a) Top view of UAV navigating between racks in Unreal Engine (TopView.png). (b) Side view showing UAV maintaining proper height for QR scanning (SideView.png).
Figure 7. LiDAR-based point cloud reconstruction of the warehouse environment (LidarData.png).
Figure 8. MATLAB-Simulink simulation block diagram: two UAVs at opposite rack ends prior to mission start.
Figure 9. (a) Two-UAV Simulink model for same-rack scanning (duplicate UAV subsystems with shared scene, cameras, and LiDAR). (b) Top view.
Figure 10. Input signal plots showing UAV translation and rotation commands.
Figure 11. 3-D scatter of translation vs. QR detections showing complementary coverage.
Figure 12. Distance-traveled-vs-time (staircase profile) with shortened hover plateaus.
Figure 13. Time histogram of unique QR hits for both UAVs, showing paired clusters per segment.
Figure 14. Two-rack Simulink model with duplicate UAV subsystems sharing a single photorealistic scene and producing synchronized image and point-cloud streams.
Figure 15. Unreal Engine aisle views with UAVs operating on different racks: (a) UAV scanning on the left side of the rack; (b) UAV scanning on the right side of the rack.
Figure 16. Merged LiDAR point cloud of the two racks; planar shelf surfaces and well-separated returns confirm large static separation.
Figure 17. Experimental setup. (a) Scanning UAV (DJI Air 2S) with a three-axis gimbal camera on the take-off/landing pad. (b) Indoor mock-warehouse testbed with parallel racks and QR-tagged cartons arranged on three levels for inventory scan trials.
Figure 18. Single-UAV flight during rack scanning (experimental setup). The DJI Air 2S hovers mid-aisle while reading front-facing QR codes affixed to cartons at three shelf levels.
Figure 19. (a) Dual-UAV flight setup. (b) Mid-mission view near the rack: the leader progresses along the lane while the follower holds its tokenized right-of-way, maintaining aisle centering and the prescribed separation.
Figure 20. (a) Dual-UAV flight setup. (b) Mid-mission view near the rack.
Figure 21. Comparative relative error for all different cases.
Table 1. Case I Simulation Performance Metrics (5 Runs).

| Run | QR Accuracy (%) | Cloud Latency Mean (ms) | Cloud Latency p95 (ms) | Trajectory RMS Dev (cm) | % Time in Safe Set | r_min to Obstacle (m) |
|-----|-----------------|-------------------------|------------------------|-------------------------|--------------------|-----------------------|
| R1  | 95.91 | 435.2  | 506.10 | 7.04 | 99.29 | 0.65 |
| R2  | 95.27 | 420.00 | 480.50 | 8.04 | 98.64 | 0.65 |
| R3  | 95.15 | 397.70 | 481.90 | 8.93 | 99.47 | 0.63 |
| R4  | 95.80 | 405.20 | 519.90 | 7.70 | 99.13 | 0.61 |
| R5  | 95.45 | 388.4  | 484.70 | 6.92 | 98.72 | 0.60 |
Table 2. Case II Simulation Performance Metrics (5 Runs). A and B denote the two UAVs.

| Run | QR Acc. A (%) | QR Acc. B (%) | Mean Lat. A (ms) | Mean Lat. B (ms) | p95 Lat. A (ms) | p95 Lat. B (ms) | RMS Dev A (cm) | RMS Dev B (cm) | % Time in Safe Set | Min Sep (m) | R_f max | Mission Time (s) | Duplicate Reads (%) |
|-----|------|------|-----|-----|-----|-----|-----|-----|------|------|------|----|-----|
| R1  | 95.8 | 95.6 | 368 | 382 | 486 | 493 | 8.1 | 8.9 | 99.1 | 1.32 | 0.30 | 66 | 3.4 |
| R2  | 95.5 | 95.7 | 355 | 371 | 475 | 488 | 8.4 | 9.1 | 99.3 | 1.28 | 0.31 | 68 | 3.2 |
| R3  | 95.9 | 95.4 | 361 | 377 | 480 | 492 | 8.2 | 9.0 | 99.2 | 1.30 | 0.29 | 65 | 3.1 |
| R4  | 95.4 | 95.6 | 374 | 389 | 497 | 501 | 8.5 | 9.2 | 99.0 | 1.33 | 0.32 | 67 | 3.5 |
| R5  | 95.6 | 95.5 | 362 | 380 | 482 | 496 | 8.3 | 9.3 | 99.2 | 1.29 | 0.31 | 66 | 3.3 |
Table 3. Case III Simulation Performance Metrics (5 Runs). A and B denote the two UAVs.

| Run | QR Acc. A (%) | QR Acc. B (%) | Mean Lat. A (ms) | Mean Lat. B (ms) | p95 Lat. A (ms) | p95 Lat. B (ms) | RMS Dev A (cm) | RMS Dev B (cm) | % Time in Safe Set | Min Sep (m) | R_f max | Mission Time (s) | Duplicate Reads (%) |
|-----|------|------|-----|-----|-----|-----|-----|-----|------|------|------|----|-----|
| R1  | 95.9 | 95.6 | 342 | 351 | 472 | 478 | 8.1 | 8.8 | 99.5 | 2.31 | 0.22 | 66 | 0.4 |
| R2  | 95.8 | 95.7 | 348 | 359 | 476 | 485 | 8.3 | 9.0 | 99.4 | 2.25 | 0.24 | 67 | 0.5 |
| R3  | 95.6 | 95.8 | 354 | 347 | 481 | 469 | 8.4 | 8.9 | 99.6 | 2.28 | 0.23 | 65 | 0.3 |
| R4  | 95.7 | 95.5 | 361 | 358 | 488 | 481 | 8.6 | 9.2 | 99.3 | 2.19 | 0.25 | 68 | 0.4 |
| R5  | 95.9 | 95.9 | 352 | 344 | 479 | 468 | 8.2 | 8.7 | 99.5 | 2.33 | 0.21 | 66 | 0.3 |
Table 4. Experimental Results (Case I: Single UAV).

| Run  | QR Accuracy (%) | Mean Latency (ms) | p95 Latency (ms) | RMS Traj Dev (cm) | % Time in Safe Set | Min Clearance (m) | R_f max | Mission Time (s) | Duplicate Reads (%) |
|------|-------|------|-------|------|-------|------|------|-------|------|
| R1   | 90.1  | 408  | 528   | 7.80 | 99.4  | 0.44 | 0.33 | 132   | 6.7  |
| R2   | 91.2  | 417  | 542   | 8.10 | 99.3  | 0.42 | 0.34 | 136   | 6.9  |
| R3   | 89.8  | 401  | 525   | 7.90 | 99.6  | 0.46 | 0.31 | 129   | 6.2  |
| R4   | 90.5  | 426  | 559   | 8.20 | 99.2  | 0.41 | 0.35 | 140   | 7.5  |
| R5   | 90.9  | 408  | 539   | 8.00 | 99.3  | 0.43 | 0.34 | 132   | 6.9  |
| Mean | 90.50 | 412  | 538.6 | 8.00 | 99.36 | 0.43 | 0.33 | 133.8 | 6.84 |
| SD   | 0.56  | 9.67 | 13.46 | 0.16 | 0.15  | 0.02 | 0.02 | 4.27  | 0.47 |
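As a sanity check, the Mean and SD rows can be recomputed from the five per-run values; the sketch below does this for the latency columns (values transcribed from Table 4, with SD taken as the sample standard deviation):

```python
import statistics

# Per-run values transcribed from Table 4 (Case I, single UAV)
mean_latency_ms = [408, 417, 401, 426, 408]
p95_latency_ms = [528, 542, 525, 559, 539]

for name, vals in [("mean latency", mean_latency_ms),
                   ("p95 latency", p95_latency_ms)]:
    mu = statistics.mean(vals)
    sd = statistics.stdev(vals)  # sample SD (n-1 denominator), matching the table
    print(f"{name}: mean={mu:.1f} ms, SD={sd:.2f} ms")
# mean latency: mean=412.0 ms, SD=9.67 ms
# p95 latency: mean=538.6 ms, SD=13.46 ms
```

The recomputed values reproduce the Mean and SD rows for these columns, confirming that the summary statistics use the sample (n−1) standard deviation.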
Table 5. Experimental Results (Case II: Two UAVs on the Same Rack). A and B denote the two UAVs.

| Run | QR Acc. A (%) | QR Acc. B (%) | Mean Lat. A (ms) | Mean Lat. B (ms) | p95 Lat. A (ms) | p95 Lat. B (ms) | RMS Dev A (cm) | RMS Dev B (cm) | % Time in Safe Set | Min Sep (m) | R_f max | Mission Time (s) | Duplicate Reads (%) |
|-----|------|------|-----|-----|-----|-----|-----|-----|------|------|------|----|-----|
| R1  | 88.5 | 88.9 | 368 | 382 | 486 | 493 | 8.1 | 8.9 | 99.1 | 1.32 | 0.30 | 66 | 3.4 |
| R2  | 87.9 | 88.2 | 355 | 371 | 475 | 488 | 8.4 | 9.1 | 99.3 | 1.28 | 0.31 | 68 | 3.2 |
| R3  | 88.8 | 88.5 | 361 | 377 | 480 | 492 | 8.2 | 9.0 | 99.2 | 1.30 | 0.29 | 65 | 3.1 |
| R4  | 88.1 | 88.7 | 374 | 389 | 497 | 501 | 8.5 | 9.2 | 99.0 | 1.33 | 0.32 | 67 | 3.5 |
| R5  | 88.3 | 88.7 | 362 | 380 | 482 | 496 | 8.3 | 9.3 | 99.2 | 1.29 | 0.31 | 66 | 3.3 |
Table 6. Experimental Results (Case III: Two UAVs on Different Racks). A and B denote the two UAVs.

| Run | QR Acc. A (%) | QR Acc. B (%) | Mean Lat. A (ms) | Mean Lat. B (ms) | p95 Lat. A (ms) | p95 Lat. B (ms) | RMS Dev A (cm) | RMS Dev B (cm) | % Time in Safe Set | Min Sep (m) | R_f max | Mission Time (s) | Duplicate Reads (%) |
|-----|------|------|-----|-----|-----|-----|-----|-----|------|------|------|----|-----|
| R1  | 86.1 | 85.9 | 342 | 351 | 472 | 478 | 8.1 | 8.8 | 99.5 | 2.31 | 0.22 | 66 | 0.4 |
| R2  | 86.5 | 86.1 | 348 | 359 | 476 | 485 | 8.3 | 9.0 | 99.4 | 2.25 | 0.24 | 67 | 0.5 |
| R3  | 85.8 | 86.2 | 354 | 347 | 481 | 469 | 8.4 | 8.9 | 99.6 | 2.28 | 0.23 | 65 | 0.3 |
| R4  | 86.7 | 85.6 | 361 | 358 | 488 | 481 | 8.6 | 9.2 | 99.3 | 2.19 | 0.25 | 68 | 0.4 |
| R5  | 86.3 | 86.0 | 352 | 344 | 479 | 468 | 8.2 | 8.7 | 99.5 | 2.33 | 0.21 | 66 | 0.3 |
Table 7. Comparative Analysis of Mean Simulation vs. Experimental Metrics (Case I).

| Metric              | Simulation Mean | Experimental Mean | Absolute Difference | Relative Error (%) |
|---------------------|-----------------|-------------------|---------------------|--------------------|
| QR Accuracy (%)     | 95.52           | 90.50             | 5.02                | 5.26               |
| Mean Latency (ms)   | 409.3           | 412.0             | 2.7                 | 0.66               |
| RMS Traj. Dev. (cm) | 7.73            | 8.00              | 0.27                | 3.49               |
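The relative-error column above is the absolute difference expressed as a percentage of the simulation mean. A minimal sketch reproducing the tabulated values (the function name `relative_error` is illustrative, not from the paper):

```python
def relative_error(sim: float, exp: float) -> float:
    """Percent relative error of the experimental mean against the simulation mean."""
    return abs(sim - exp) / sim * 100.0

# (simulation mean, experimental mean) pairs transcribed from Table 7
rows = {
    "QR Accuracy (%)": (95.52, 90.50),
    "Mean Latency (ms)": (409.3, 412.0),
    "RMS Traj. Dev. (cm)": (7.73, 8.00),
}
for metric, (sim, exp) in rows.items():
    print(f"{metric}: |diff|={abs(sim - exp):.2f}, rel. err.={relative_error(sim, exp):.2f}%")
# QR Accuracy (%): |diff|=5.02, rel. err.=5.26%
# Mean Latency (ms): |diff|=2.70, rel. err.=0.66%
# RMS Traj. Dev. (cm): |diff|=0.27, rel. err.=3.49%
```

The same normalization (against the simulation mean) reproduces the relative-error columns of Tables 8 and 9 as well.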
Table 8. Comparative Analysis of Mean Simulation vs. Experimental Metrics (Case II).

| Metric                  | Simulation Mean | Experimental Mean | Absolute Difference | Relative Error (%) |
|-------------------------|-----------------|-------------------|---------------------|--------------------|
| Combined QR Acc. (%)    | 95.60           | 88.46             | 7.14                | 7.47               |
| Combined Mean Lat. (ms) | 371.9           | 371.9             | 0.00                | 0.00               |
| Combined RMS Dev. (cm)  | 8.70            | 8.70              | 0.00                | 0.00               |
| Mission Time (s)        | 66.40           | 66.40             | 0.00                | 0.00               |
| Min Sep (m)             | 1.304           | 1.304             | 0.000               | 0.00               |
Table 9. Comparative Analysis of Mean Simulation vs. Experimental Metrics (Case III).

| Metric                  | Simulation Mean | Experimental Mean | Absolute Difference | Relative Error (%) |
|-------------------------|-----------------|-------------------|---------------------|--------------------|
| Combined QR Acc. (%)    | 95.74           | 86.12             | 9.62                | 10.05              |
| Combined Mean Lat. (ms) | 351.6           | 351.6             | 0.0                 | 0.00               |
| Combined RMS Dev. (cm)  | 8.62            | 8.62              | 0.0                 | 0.00               |
| Mission Time (s)        | 66.4            | 66.4              | 0.0                 | 0.00               |
| Min Sep (m)             | 2.27            | 2.27              | 0.00                | 0.00               |
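The mission-time figures above quantify the multi-UAV speedup reported in the abstract: both dual-UAV cases finish in 66.4 s against the single-UAV mean of 133.8 s from Table 4. A short sketch of that comparison (variable names are illustrative):

```python
# Mission times (s) taken from Tables 4, 8, and 9
single_uav = 133.8       # Case I, single-UAV experimental mean
dual_same_rack = 66.4    # Case II, two UAVs on the same rack
dual_diff_rack = 66.4    # Case III, two UAVs on different racks

for label, t in [("same rack", dual_same_rack),
                 ("different racks", dual_diff_rack)]:
    speedup = single_uav / t
    print(f"Dual-UAV ({label}): {speedup:.2f}x faster than single UAV")
# Dual-UAV (same rack): 2.02x faster than single UAV
# Dual-UAV (different racks): 2.02x faster than single UAV
```

A speedup of roughly 2.0x is consistent with the claim that two coordinated UAVs halve mission time while keeping duplicate reads at the low levels shown in Tables 5 and 6.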
Disclaimer/Publisher’s Note: The statements, opinions and data contained in all publications are solely those of the individual author(s) and contributor(s) and not of MDPI and/or the editor(s). MDPI and/or the editor(s) disclaim responsibility for any injury to people or property resulting from any ideas, methods, instructions or products referred to in the content.

Share and Cite

MDPI and ACS Style

Pore, E.; Patle, B.K.; Thorat, S. UAV-Based QR Code Scanning and Inventory Synchronization System with Safe Trajectory Planning. Symmetry 2026, 18, 548. https://doi.org/10.3390/sym18040548