Abstract
Today, the use of unmanned aerial systems (UASs) is expanding rapidly across civil, military, and scientific applications. Deploying drones in close proximity to urban areas is becoming increasingly common, particularly during missions conducted beyond visual line of sight (BVLOS) or in fully autonomous modes. Technological advances have enabled platforms that no longer require a human operator on board and that offer progressively higher levels of autonomy. Enhancing onboard systems is therefore crucial to ensuring a high level of operational safety, particularly during missions in harsh and complex environments such as urban and suburban areas, where static and dynamic obstacles, including pedestrians, vehicles, and other aircraft, are pervasive. In this context, the implementation and integration of multiple onboard devices and sensors constitute the core focus of this work, with the objective of improving the perception, navigation, and safety capabilities of autonomous UAV operations. In particular, communication channels, hardware integration, and data fusion techniques were implemented and evaluated to improve system performance and situational awareness. This work presents the hardware and software integration of LiDAR and radar sensors with a Pixhawk autopilot and a Raspberry Pi companion computer, aimed at developing obstacle detection applications.