Editorial

Drones—An Open Access Journal

TIDOP Research Group, Department of Cartography and Land Engineering, Polytechnical Higher School of Avila, University of Salamanca, 05003 Avila, Spain
Author to whom correspondence should be addressed.
Received: 15 December 2016 / Revised: 16 December 2016 / Accepted: 16 December 2016 / Published: 4 January 2017


Since the beginning of aviation, unmanned aerial systems have posed a challenge for scientists and engineers. The Hewitt–Sperry Automatic Airplane of 1916 and the target drone used by the British Royal Navy for gunnery practice in 1933 serve as early examples. The possibility of controlling an aircraft without a pilot on board has been a challenge from both the civil and the military points of view. Nowadays, the proliferation of unmanned aerial systems, popularly known as “drones”, is a reality for local policy makers, regulatory bodies, mapping authorities, start-ups and consolidated companies. The number of developed drones has increased threefold from 2005 to the present and, in addition, a marked increase has been observed in civil/commercial platforms, especially in 2012 and 2013 [1]. The many uses and benefits of drones stem from the combination of their piloting system (autonomous or remotely controlled) and their on-board sensors, which together provide accurate positioning and acquire a great variety of data. Through this combination, drones are an efficient solution for the observation, inspection, measurement and monitoring of territory, offering better spatial, radiometric, spectral and temporal resolutions than any manned aerial vehicle or satellite.
Drones have had an unprecedented impact on society and the economy. According to recent market research [2], the global drone market is worth $6,800 million United States Dollars (USD) as of 2016 and is expected to grow to $36,900 million USD by 2022. More than their military origin, it is the low cost of the sensors integrated into aerial platforms of different designs that has boosted their application and proliferation, offering new applications and attracting more clients in the civil sector. It is in this sector where most progress is expected. For instance, new developments in robotics, computer vision and geomatic technologies, together with research and development supported by technological centres and universities, improve technology transfer and open up new markets. While it is true that these civil drones are still far from the systems used in military applications, studies and advances in the development of the technology can deliver professional systems with new, improved and promising features.
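As a quick check on the market figures above, the implied compound annual growth rate can be worked out in a few lines (a minimal sketch; the 2016 and 2022 revenue values are taken from the text, and the six-year horizon is an assumption):

```python
# Implied compound annual growth rate (CAGR) for the market figures cited
# above: $6,800 M in 2016 growing to an expected $36,900 M by 2022.
start_revenue_musd = 6_800
end_revenue_musd = 36_900
years = 2022 - 2016  # six-year horizon

cagr = (end_revenue_musd / start_revenue_musd) ** (1 / years) - 1
print(f"Implied CAGR: {cagr:.1%}")  # about 32.6% per year
```

A sustained growth rate above 30% per year is what makes this market forecast so striking for the civil sector.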
In this sense, the main advances in drones are expected along the following lines: (1) the emergence of new sensors that improve geometric and radiometric resolution as well as spectral range; (2) the evolution of new platforms with improved robustness and increased autonomy; (3) the development of software, from navigation and communication with the platform to the processing and analysis of the captured images; and (4) new applications in emerging sectors: logistics, disaster assistance, security and surveillance, health and marine science, among others.
The emergence of new sensors, thanks to advances in microelectronics and nanotechnology [3,4,5], will be crucial in the coming years. In fact, the evolution of integrated circuitry and radio-controlled systems in the late twentieth century was key to the advent of modern drones. At present, improvements in miniaturization and the development of new sensors (e.g., computer boards, Global Navigation Satellite System (GNSS) receivers and antennas, Inertial Measurement Units (IMU), cameras, etc.) have driven a very important evolution of drones [6,7,8]. These advances appear across the range of existing sensors, both active (e.g., Light Detection and Ranging (LiDAR) and Synthetic Aperture Radar (SAR)) and passive (e.g., multispectral, hyperspectral and thermographic cameras). Specifically for drones, many imaging and ranging sensors are identified in [9], including active and passive ones, optical systems—from the visible band and the Near Infrared (NIR) to the Thermal Infrared (TIR)—and microwave systems. The possibilities offered by these new sensors have created new opportunities for research and study. For instance, RGB sensors have improved their capabilities, including multiple-head RGB cameras and wide-angle lenses [10,11]. Multispectral sensors, for their part, have improved their dynamic range and sensitivity [12], including geometric and radiometric calibration procedures [13,14]. In parallel, hyperspectral cameras provide more detailed information than multispectral sensors because an entire spectrum is acquired at each pixel [15]. Thermographic cameras have seen remarkable progress in miniaturization in recent years, improving in weight and quality, and are applied to inspect or monitor structures and pathologies that would otherwise be invisible to the human eye [16]. They have also been applied in the military context for remote reconnaissance [17] and in applications such as forest fire monitoring [18].
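One widely used radiometric calibration procedure of the kind mentioned above is the empirical line method, which fits a linear DN-to-reflectance mapping from reference panels imaged in the scene. The sketch below is an illustration of that general idea, not the specific procedure of [13] or [14]; all panel values are hypothetical:

```python
# Empirical line calibration: map raw digital numbers (DN) in one band to
# surface reflectance using two reference panels of known reflectance.
# Panel DN and reflectance values are hypothetical, for illustration only.
dn_dark, dn_bright = 30.0, 200.0      # mean image DN over the two panels
refl_dark, refl_bright = 0.05, 0.50   # known panel reflectances

gain = (refl_bright - refl_dark) / (dn_bright - dn_dark)
offset = refl_dark - gain * dn_dark

def dn_to_reflectance(dn: float) -> float:
    """Linear DN-to-reflectance conversion for this band."""
    return gain * dn + offset

print(dn_to_reflectance(115.0))  # DN midway between panels -> 0.275
```

In practice each band gets its own gain and offset, and more than two panels can be used with a least-squares fit.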
The new platforms have also undergone a great evolution: from the first captive unmanned vehicles, such as kites [19], balloons [20] and zeppelins [21], to modern drones based on multi-rotors [22], fixed wings [23] and hybrid platforms (fixed-wing hybrid vertical take-off and landing) [24], without forgetting nano-drones [25]. Some characteristics of these new platforms, which in many cases make them unique, are: flight heights below the limits of aerial navigation, with a consequent increase in spatial resolution; low flight speed, which can reach zero while maintaining a static hover (multi-rotors); the possibility of flight in complex environments, for example between buildings or even indoors for indoor mapping [26]; and the use of durable and lightweight materials such as carbon fiber. These platforms have been made possible thanks to advances in electronics: inertial measurement units (IMU) for navigation and stabilization of the drones, global navigation satellite systems for positioning and navigation, and ground stations and control systems that allow drones to be operated efficiently. However, most civil drones are small vehicles, sensitive to wind and of limited autonomy, which restricts the effective usability of the platform in terms of coverage area, payload and mission time. This is a hot topic for the scientific community, which is looking to improve the power storage system. Some of these efforts focus on the employment of photovoltaic cells in the quest for perpetual flight [27]. Alternative research lines focus on the use of hydrogen fuel cells [28] or on new materials, such as graphene, to improve energy storage density and conversion to electrical current [29].
Advances in algorithms supported by mathematics and computer science have enabled new achievements in navigation and in the processing of images from both active and passive sensors. Without these advances in processing algorithms, it would be unthinkable to deal with the enormous amount of information obtained by the sensors. This information must be processed in the shortest time possible while guaranteeing quality. The advances in the treatment of large volumes of information (big data) have therefore also led to significant advances in drone technology [30]. Moreover, these algorithms have been automated so that they provide final products with almost total automation of the process: from automatic flight planning [31] and autonomous navigation (autopilot), even in complicated environments [32], to the generation of quality cartographic products, mainly digital surface models and orthoimages [33,34,35]. Another important aspect to consider is sensor fusion, which serves a twofold purpose: on the one hand, offering robust approaches to fuse data from different sensors (visible, near infrared, middle infrared, thermographic) [36,37,38]; on the other hand, fusing data from different sensors mounted on different platforms [39,40,41].
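To give a flavour of the geometry behind automatic flight planning for mapping missions, the sketch below derives the ground sample distance, strip spacing and photo base from a simple pinhole-camera model. It is not the planner of [31]; real planners also account for terrain relief, wind and turn geometry, and every camera parameter in the example is hypothetical:

```python
def flight_plan_spacing(altitude_m, focal_mm, sensor_w_mm, sensor_h_mm,
                        img_w_px, forward_overlap, side_overlap):
    """Nadir-survey geometry from a pinhole-camera model.

    Returns the ground sample distance (m/px), the spacing between
    adjacent flight strips (m) and the photo base between consecutive
    exposures along a strip (m).
    """
    scale = altitude_m / (focal_mm / 1000.0)          # image-to-ground scale
    footprint_w = (sensor_w_mm / 1000.0) * scale      # across-track footprint (m)
    footprint_h = (sensor_h_mm / 1000.0) * scale      # along-track footprint (m)
    gsd = footprint_w / img_w_px                      # ground sample distance
    line_spacing = footprint_w * (1 - side_overlap)   # distance between strips
    photo_base = footprint_h * (1 - forward_overlap)  # distance between shots
    return gsd, line_spacing, photo_base

# Hypothetical example: 24 mm lens on a 23.5 x 15.6 mm sensor, 6000 px wide,
# flown at 100 m with 80% forward and 60% side overlap.
gsd, line, base = flight_plan_spacing(100, 24, 23.5, 15.6, 6000, 0.8, 0.6)
print(f"GSD: {gsd*100:.1f} cm/px, strip spacing: {line:.0f} m, base: {base:.1f} m")
```

Higher overlaps tighten the strip spacing and photo base, trading longer flight lines and more images for more robust image matching.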
Last but not least, new applications will define new market niches in the coming years. Regarding logistics, the integration of drones into commercial delivery networks is already being tested to optimize the trade-off between mission time and range on the one hand, and security and profit on the other [42,43]. Medical applications could also benefit, with rapid response in cardiac arrest scenarios [44] or the delivery and transportation of laboratory specimens [45]. Regarding disaster assistance, security and surveillance, automatic recognition based on semantic attributes extracted from images or videos will be crucial for real-time response [46]. In addition, nanotechnology could be used to create nano-drones, which are very promising for indoor navigation and inspections [47]. Regarding monitoring, complex infrastructures (e.g., high- and medium-voltage lines, oil and gas pipelines, roads, railways, etc.) could be monitored with drones [48].
Drones (ISSN 2504-446X) is an open access journal that aims to be a central forum for researchers engaged in drone research and applications. Drones encourages researchers worldwide to contribute original research and technical manuscripts on the topics described above and to take the opportunity to share their findings with the scientific community. The cooperation of everyone involved will result in the success of this interdisciplinary journal. Let us wish Drones a bright future.

Author Contributions

All authors contributed extensively to the work presented in this paper.

Conflicts of Interest

The authors declare no conflicts of interest.


  1. Colomina, I.; Molina, P. Unmanned aerial systems for photogrammetry and remote sensing: A review. ISPRS J. Photogramm. Remote Sens. 2014, 92, 79–97. [Google Scholar] [CrossRef]
  2. Markets Research. Drones Market Shares, Strategies, and Forecasts, Worldwide, 2016 to 2022; Technical Report; Wintergreen Research: Lexington, MA, USA, 2016. [Google Scholar]
  3. Lyshevski, S.E. Nano and molecular technologies in microelectronics, MEMS and electronic systems. In Proceedings of the 2013 IEEE XXXIII International Scientific Conference Electronics and Nanotechnology (ELNANO), Kiev, Ukraine, 16–19 April 2013; pp. 38–42.
  4. Grondel, S.; Cattan, E. Controlled Lift for an Efficient ARtificial insect flight. Impact 2016, 2016, 35–37. [Google Scholar]
  5. Lyshevski, S.E. Power Electronics, Microelectronics and Propulsion Systems for Solar-Powered Unmanned Aerial Vehicles. In Proceedings of the 2016 IEEE 36th International Conference on Electronics and Nanotechnology (ELNANO), Kyiv, Ukraine, 19–21 April 2016; pp. 304–308.
  6. Honkavaara, E.; Saari, H.; Kaivosoja, J.; Pölönen, I.; Hakala, T.; Litkey, P.; Mäkynen, J.; Pesonen, L. Processing and assessment of spectrometric, stereoscopic imagery collected using a lightweight UAV spectral camera for precision agriculture. Remote Sens. 2013, 5, 5006. [Google Scholar] [CrossRef][Green Version]
  7. Ramasamy, S.; Sabatini, R.; Gardi, A. Avionics sensor fusion for small size unmanned aircraft Sense-and-Avoid. In Proceedings of the 2014 IEEE Metrology for Aerospace (MetroAeroSpace), Benevento, Italy, 29–30 May 2014; pp. 271–276.
  8. Troglia Gamba, M.; Marucco, G.; Pini, M.; Ugazio, S.; Falletti, E.; Lo Presti, L. Prototyping a GNSS-based passive radar for UAVs: An instrument to classify the water content feature of lands. Sensors 2015, 15, 28287. [Google Scholar] [CrossRef] [PubMed]
  9. Van Blyenburgh, P. 2013–2014 RPAS Yearbook: Remotely Piloted Aircraft Systems: The Global Perspective 2013/2014; UVS International: Paris, France, 2013. [Google Scholar]
  10. Grenzdörffer, G.; Niemeyer, F.; Schmidt, F. Development of four vision camera system for a micro-UAV. Int. Arch. Photogramm. Remote Sens. Spatial Inform. Sci. 2012, XXXIX-B1, 369–374. [Google Scholar] [CrossRef]
  11. Xie, F.; Lin, Z.; Gui, D.; Lin, H. Study on construction of 3D building based on UAV images. Int. Arch. Photogramm. Remote Sens. Spatial Inform. Sci. 2012, XXXIX-B1, 469–473. [Google Scholar] [CrossRef]
  12. Geelen, B.; Blanch, C.; Gonzalez, P.; Tack, N.; Lambrechts, A. A tiny VIS-NIR snapshot multispectral camera. Proc. SPIE 2015. [Google Scholar] [CrossRef]
  13. Kelcey, J.; Lucieer, A. Sensor correction and radiometric calibration of a 6-band multispectral imaging sensor for UAV remote sensing. Int. Arch. Photogramm. Remote Sens. Spatial Inform. Sci. 2012, XXXIX-B1, 393–398. [Google Scholar] [CrossRef]
  14. Del Pozo, S.; Rodríguez-Gonzálvez, P.; Hernández-López, D.; Felipe-García, B. Vicarious radiometric calibration of a multispectral camera on board an Unmanned Aerial System. Remote Sens. 2014, 6, 1918–1937. [Google Scholar] [CrossRef]
  15. Rikola, Ltd. Rikola Hyper-Spectral Camera Specifications. 2012. Available online: (accessed on 22 December 2016).
  16. Carrio, A.; Pestana, J.; Sanchez-Lopez, J.-L.; Suarez-Fernandez, R.; Campoy, P.; Tendero, R.; García-De-Viedma, M.; González-Rodrigo, B.; Bonatti, J.; Rejas-Ayuga, J.G.; et al. UBRISTES: UAV-Based Building Rehabilitation with Visible and Thermal Infrared Remote Sensing. In Robot 2015: Second Iberian Robotics Conference: Advances in Robotics, Volume 1; Reis, L.P., Moreira, A.P., Lima, P.U., Montano, L., Muñoz-Martinez, V., Eds.; Springer International Publishing: Cham, Switzerland, 2016; pp. 245–256. [Google Scholar]
  17. Kostrzewa, J.; Meyer, W.; Laband, S.; Terre, W.; Petrovich, P.; Swanson, K.; Sundra, C.; Sener, W.; Wilmott, J. Infrared microsensor payload for miniature unmanned aerial vehicles. Proc. SPIE 2003, 5090. [Google Scholar] [CrossRef]
  18. Scholtz, A.; Kaschwich, C.; Kruger, A.; Kufieta, K.; Schnetter, P.; Wilkens, C.; Kruger, T.; Vorsmann, P. Development of a new multi-purpose UAS for scientific application. Int. Arch. Photogramm. Remote Sens. Spatial Inform. Sci. 2011, XXXVIII-1/C22, 149–154. [Google Scholar] [CrossRef]
  19. Verhoeven, G.J.J.; Loenders, J.; Vermeulen, F.; Docter, R. Helikite aerial photography—A versatile means of unmanned, radio controlled, low-altitude aerial archaeology. Archaeol. Prospect. 2009, 16, 125–138. [Google Scholar] [CrossRef]
  20. Altan, M.; Celikoyan, T.; Kemper, G.; Toz, G. Balloon photogrammetry for cultural heritage. Int. Arch. Photogramm. Remote Sens. Spatial Inform. Sci. 2004, XXXV-B5, 964–968. [Google Scholar]
  21. Gomez-Lahoz, J.; Gonzalez-Aguilera, D. Recovering traditions in the digital era: The use of blimps for modelling the archaeological cultural heritage. J. Archaeol. Sci. 2009, 36, 100–109. [Google Scholar] [CrossRef]
  22. Segui-Gasco, P.; Al-Rihani, Y.; Shin, H.-S.; Savvaris, A. A novel actuation concept for a multi rotor UAV. J. Intell. Robot. Syst. 2014, 74, 173–191. [Google Scholar] [CrossRef]
  23. Beard, R.W.; Kingston, D.; Quigley, M.; Snyder, D.; Christiansen, R.; Johnson, W.; McLain, T.; Goodrich, M. Autonomous Vehicle Technologies for Small Fixed-Wing UAVs. J. Aerosp. Comput. Inf. Commun. 2005, 2, 92–108. [Google Scholar] [CrossRef]
  24. Ozdemir, U.; Aktas, Y.O.; Vuruskan, A.; Dereli, Y.; Tarhan, A.F.; Demirbag, K.; Erdem, A.; Kalaycioglu, G.D.; Ozkol, I.; Inalhan, G. Design of a commercial hybrid VTOL UAV system. J. Intell. Robot. Syst. 2014, 74, 371–393. [Google Scholar] [CrossRef]
  25. Petricca, L.; Ohlckers, P.; Grinde, C. Micro- and nano-air vehicles: State of the art. Int. J. Aerospace Eng. 2011, 2011, 17. [Google Scholar] [CrossRef]
  26. Grzonka, S.; Grisetti, G.; Burgard, W. A Fully Autonomous Indoor Quadrotor. IEEE Trans. Robot. 2012, 28, 90–100. [Google Scholar] [CrossRef]
  27. Oettershagen, P.; Melzer, A.; Mantel, T.; Rudin, K.; Stastny, T.; Wawrzacz, B.; Hinzmann, T.; Alexis, K.; Siegwart, R. Perpetual flight with a small solar-powered UAV: Flight results, performance analysis and model validation. In Proceedings of the 2016 IEEE Aerospace Conference, Big Sky, MT, USA, 5–12 March 2016.
  28. González-Espasandín, Ó.; Leo, T.J.; Navarro-Arévalo, E. Fuel cells: A real option for Unmanned Aerial Vehicles propulsion. Sci. World J. 2014, 2014, 12. [Google Scholar] [CrossRef] [PubMed]
  29. Cho, E.S.; Ruminski, A.M.; Aloni, S.; Liu, Y.-S.; Guo, J.; Urban, J.J. Graphene oxide/metal nanocrystal multilaminates as the atomic limit for safe and selective hydrogen storage. Nat. Commun. 2016, 7, 10804. [Google Scholar] [CrossRef] [PubMed]
  30. Ofli, F.; Meier, P.; Imran, M.; Castillo, C.; Tuia, D.; Rey, N.; Briant, J.; Millet, P.; Reinhard, F.; Parkan, M. Combining human computing and machine learning to make sense of big (aerial) data for disaster response. Big Data 2016, 4, 47–59. [Google Scholar] [CrossRef] [PubMed]
  31. Hernandez-Lopez, D.; Felipe-Garcia, B.; Gonzales-Aguilera, D.; Arias-Perez, B. An automatic approach to UAV flight planning and control for photogrammetric applications: A test case in the Asturias Region (Spain). Photogramm. Eng. Remote Sens. 2013, 79, 87–98. [Google Scholar] [CrossRef]
  32. Chen, H.; Chang, K.; Agate, C.S. UAV Path Planning with Tangent-plus-Lyapunov Vector Field Guidance and Obstacle Avoidance. IEEE Trans. Aerosp. Electron. Syst. 2013, 49, 840–856. [Google Scholar] [CrossRef]
  33. Remondino, F.; Barazzetti, L.; Nex, F.; Scaioni, M.; Sarazzi, D. UAV photogrammetry for mapping and 3D modeling—Current status and future perspectives. Int. Arch. Photogramm. Remote Sens. Spatial Inform. Sci. 2011, XXXVIII-1/C22, 25–31. [Google Scholar] [CrossRef]
  34. Barazzetti, L.; Brumana, R.; Oreni, D.; Previtali, M.; Roncoroni, F. True-orthophoto generation from UAV images: Implementation of a combined photogrammetric and computer vision approach. ISPRS Ann. Photogramm. Remote Sens. Spat. Inf. Sci. 2014, 2, 57. [Google Scholar] [CrossRef]
  35. Gonçalves, J.A.; Henriques, R. UAV photogrammetry for topographic monitoring of coastal areas. ISPRS J. Photogramm. Remote Sens. 2015, 104, 101–111. [Google Scholar] [CrossRef]
  36. Lindner, M.; Kolb, A.; Hartmann, K. Data-fusion of PMD-based distance-information and high-resolution RGB-images. In Proceedings of the 2007 International Symposium on Signals, Circuits and Systems, Iasi, Romania, 13–14 July 2007; pp. 1–4.
  37. Susperregi, L.; Martínez-Otzeta, J.M.; Ansuategui, A.; Ibarguren, A.; Sierra, B. RGB-D, laser and thermal sensor fusion for people following in a mobile robot. Int. J. Adv. Robot. Syst. 2013, 10. [Google Scholar] [CrossRef]
  38. Näsi, R.; Honkavaara, E.; Tuominen, S.; Saari, H.; Pölönen, I.; Hakala, T.; Viljanen, N.; Soukkamäki, J.; Näkki, I.; Ojanen, H. UAS based tree species identification using the novel FPI based hyperspectral cameras in visible, NIR and SWIR spectral ranges. Int. Arch. Photogramm. Remote Sens. Spatial Inform. Sci. 2016, XLI-B1, 1143–1148. [Google Scholar] [CrossRef]
  39. Lin, Y.; Hyyppä, J.; Rosnell, T.; Jaakkola, A.; Honkavaara, E. Development of a UAV-MMS-Collaborative Aerial-to-Ground Remote Sensing System – A Preparatory Field Validation. IEEE J. Sel. Top. Appl. Earth Obs. Remote Sens. 2013, 6, 1893–1898. [Google Scholar] [CrossRef]
  40. Kato, A.; Obanawa, H.; Hayakawa, Y.; Watanabe, M.; Yamaguchi, Y.; Enoki, T. Fusion between UAV-SFM and terrestrial laser scanner for field validation of satellite remote sensing. In Proceedings of the 2015 IEEE International Geoscience and Remote Sensing Symposium (IGARSS), Milan, Italy, 26–31 July 2015; pp. 2642–2645.
  41. Torres-Martínez, J.; Seddaiu, M.; Rodríguez-Gonzálvez, P.; Hernández-López, D.; González-Aguilera, D. A multi-data source and multi-sensor approach for the 3D reconstruction and web visualization of a complex archaelogical site: The case study of “Tolmo De Minateda”. Remote Sens. 2016, 8, 550. [Google Scholar] [CrossRef]
  42. Park, S.; Zhang, L.; Chakraborty, S. Design space exploration of drone infrastructure for large-scale delivery services. In Proceedings of the 35th International Conference on Computer-Aided Design, Austin, TX, USA, 7–10 November 2016; pp. 1–7.
  43. Yang, N.K.; San, K.T.; Chang, Y.S. A novel approach for real time monitoring system to manage UAV delivery. In Proceedings of the 5th IIAI International Congress on Advanced Applied Informatics (IIAI-AAI), Kumamoto, Japan, 10–14 July 2016; pp. 1054–1057.
  44. Fleck, M. Usability of lightweight defibrillators for UAV delivery. In Proceedings of the 2016 CHI Conference Extended Abstracts on Human Factors in Computing Systems, San Jose, CA, USA, 7–12 May 2016; pp. 3056–3061.
  45. Amukele, T.K.; Sokoll, L.J.; Pepper, D.; Howard, D.P.; Street, J. Can Unmanned Aerial Systems (Drones) be used for the routine transport of chemistry, hematology, and coagulation laboratory specimens? PLoS ONE 2015, 10, e0134020. [Google Scholar] [CrossRef] [PubMed]
  46. Fernandez Galarreta, J.; Kerle, N.; Gerke, M. UAV-based urban structural damage assessment using object-based image analysis and semantic reasoning. Nat. Hazards Earth Syst. Sci. 2015, 15, 1087–1101. [Google Scholar] [CrossRef]
  47. Zhang, X.; Xian, B.; Zhao, B.; Zhang, Y. Autonomous flight control of a nano quadrotor helicopter in a GPS-denied environment using on-board vision. IEEE Trans. Ind. Electron. 2015, 62, 6392–6403. [Google Scholar] [CrossRef]
  48. Ham, Y.; Han, K.K.; Lin, J.J.; Golparvar-Fard, M. Visual monitoring of civil infrastructure systems via camera-equipped Unmanned Aerial Vehicles (UAVs): A review of related works. Vis. Eng. 2016, 4, 1. [Google Scholar] [CrossRef]