1. Introduction
Accountability is a value often mentioned in the debate on autonomous systems and their increased pervasiveness in our society (see Verdiesen, de Sio, and Dignum [1] for an overview). In a narrow sense, it is regarded as a mechanism for corporate and public governance to impart responsibility into agents and organisations. Bovens [2] (p. 450) focuses on this narrow sense of accountability and defines it as follows: “Accountability is a relationship between an actor and a forum, in which the actor has an obligation to explain and to justify his or her conduct, the forum can pose questions and pass judgement, and the actor may face consequences”. The relationship between an actor and a forum is a key notion in the concept of accountability. In a broad sense, accountability is seen as a virtue and used to criticise or praise the performance of organisations or states regarding policy and decisions and their willingness to give information and explanations about their actions [3]. Throughout the literature, the notion of accountability is often framed as a form of backward-looking responsibility [4], and there is much public administration literature on accountability procedures and the sanctions that can be imposed when ex-post explanations are inadequate [2,5,6]. However, accountability should not be limited to scrutiny after an event has occurred: it also has an anticipatory and preventive use to (re)produce, internalise, and adjust norms [1]. Broadly construed, the ability to hold an actor accountable hinges on having control mechanisms [7] to oversee, discuss, and verify the behaviour of the system and check its alignment with determined values and norms.
When focusing specifically on autonomous systems, a critical gap emerges: although there is much work on governance and the attribution of accountability [8,9], there is a significant lack of methods for the operationalisation of accountability within the socio-technical layer of autonomous systems [1]. This is particularly salient where autonomous systems are concerned: as executive autonomy is delegated to the system, guaranteeing deployment accountability is a challenge, both in terms of specifications (what does it mean operationally for accountability to be ensured during an autonomous deployment?) and processes (which verifiable behaviours of the autonomous system, and of the socio-technical system around it, guarantee accountability?). In the case of autonomous unmanned aerial vehicles, or drones as we shall refer to them in the remainder of the text, the critical question of how to maintain accountability as they undertake fully autonomous flights becomes increasingly important as their uses multiply in both the commercial and military fields. Although the level of autonomy that should be granted to drones, particularly in the military context, is the subject of debate [10], applications in, e.g., emergency response [11,12] already consider autonomous flights a necessity due to the possibility of failing communication infrastructure or operator unpreparedness. We therefore assume in this paper that in-flight communication is not possible, and that a monitoring process before and after the flight must be implemented to ensure human oversight. To the best of our knowledge, no other accountability framework addresses human oversight in the absence of in-flight contact; this is the gap we aim to fill.
In this paper, we aim to fill the operationalisation gap by proposing a socio-technical framework to guarantee human oversight and accountability in drone deployments, showing its enforceability in the real case of military surveillance drones. For this purpose, we adapt the Glass Box method of Aler Tubella et al. [13] to provide a monitoring framework for the socio-technical system composed of drone and operator, focusing solely on observable constraints on pre- and post-flight processes. By keeping a focus on accountability and human oversight as values, we align with the emphasis placed on human responsibility [14], while requiring a concretisation of what these principles mean for each specific application, connecting them with concrete socio-technical requirements. In addition, by constraining the framework to observable elements of pre- and post-deployment, we rely neither on assumptions about the internal workings of the drone nor on the technical fluency of the operator. This paper has a conceptual focus: as a novel approach to ensuring human oversight, it provides an implementation concept for the pre- and post-deployment observable elements as an illustration of the Glass Box method.
In the remainder of this paper, we first describe related work on accountability and human oversight before describing the Glass Box framework with its interpretation and observation stages. In the following section, we describe our proposed two-stage accountability framework for drone deployment. To illustrate it, we then showcase an initial implementation concept for the real case of military surveillance drones, formalised in the discrete-event modelling language given by Coloured Petri Nets (CPNs). Finally, in the conclusion we discuss our findings, limitations of our work, and directions for future work.
3. Framework
Ensuring accountability and adherence to values in the context of drone deployment is inextricably tied to the notion of human oversight and human accountability. For this reason, we propose to consider drone deployment a “process within a socio-technical system”, the monitoring of which includes not only examining the behaviour of the drone itself but also examining human-led procedures in pre- and post-deployment. A specific adaptation of the Glass Box approach to this context is therefore the explicit inclusion of the operator(s) as an entity to which norms can apply.
A significant choice in this framework is the decision to consider the drone a “black box”, the internal logic of which is not accessible. This responds to two motivations. First, relying on access to, and monitoring capabilities over, the internal workings of drones would be a strong assumption, since the proprietary nature of this technology often precludes observation of its software. Second, for auditability purposes, the users of this framework should be able to follow the monitoring process transparently. However, such users, who will respond to the monitoring process, do not necessarily possess the technical background required to understand or check constraints on the internal logic of a drone. Thus, our framework is based on monitoring adherence to norms constraining purely observable elements of pre- and post-deployment. A further choice is that we purposely designed a technology-agnostic approach, so that it can be used on many different systems independently of the AI techniques and algorithms that constitute the internal workings of the drone. We consider these part of the black box.
A final adaptation is the explicit call to restrict the specifications and monitoring to pre- and post-flight processes. This choice is due to our focus on autonomous drones: after landing, we can check what has happened during the flight, but during it we assume no contact between the drone and its operator, for example, due to a failing communication structure, an electronic warfare threat, or operator unpreparedness. Of course, if the possibility of in-flight communication exists, expanding the norms to include in-flight behaviour is a possibility.
In what follows, we present an adaptation of the Glass Box approach for the inclusion of human oversight in autonomous drone deployment. The proposed framework includes an interpretation and an observation stage, each discussed in detail.
3.1. Interpretation
The interpretation stage entails turning values into concrete norms constraining observable elements and actions within the socio-technical system. As high-level concepts, values are abstract, whereas norms are prescriptive and impose or forbid courses of action. This translation is carried out by constructing norms progressively, subsuming each norm into several more concrete ones, until the level of norms containing concrete, testable requirements is reached. This concretisation of norms is to be carried out by all stakeholders involved in the deployment, ideally with legal advice as well as with participation from the operators themselves (whose processes will be subject to the norms identified).
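The progressive subsumption of norms described above can be pictured as a tree whose leaves carry testable checks. The following is a minimal Python sketch of that idea, not part of the paper's formalisation (the paper later uses CPNs); all norm texts, field names, and thresholds are illustrative assumptions.

```python
from dataclasses import dataclass, field
from typing import Callable, List, Optional

@dataclass
class Norm:
    """A node in the concretisation tree: either subsumed into more
    concrete subnorms, or a leaf with a testable check on observables."""
    description: str
    check: Optional[Callable[[dict], bool]] = None  # only leaves are testable
    subnorms: List["Norm"] = field(default_factory=list)

    def leaves(self) -> List["Norm"]:
        if not self.subnorms:
            return [self]
        return [leaf for n in self.subnorms for leaf in n.leaves()]

# Hypothetical concretisation of the value "lawfulness" into testable norms
lawfulness = Norm("Deployment must be lawful", subnorms=[
    Norm("Respect flight rules", subnorms=[
        Norm("Stay below the maximum altitude",
             check=lambda obs: obs["max_altitude_m"] <= 120),
        Norm("Avoid airport surroundings",
             check=lambda obs: not obs["entered_airport_zone"]),
    ]),
    Norm("Respect privacy regulations", subnorms=[
        Norm("Collected footage is reviewed post-flight",
             check=lambda obs: obs["footage_reviewed"]),
    ]),
])

# Checking one deployment's observable record against all leaf norms
observations = {"max_altitude_m": 95, "entered_airport_zone": False,
                "footage_reviewed": True}
violations = [n.description for n in lawfulness.leaves()
              if n.check and not n.check(observations)]
print(violations)  # → [] (no norm violated)
```

Only the leaves are machine-checkable; the inner nodes document how each requirement traces back to the value it concretises, which supports the auditability the framework calls for.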
Through a Design for Values perspective [4,24,25,26,27], concretising values requires carefully adapting to the specific context, as values may take different meanings in different contexts. In the case of drone deployment, the context is made up of two main factors: the context of deployment itself, and the organisation doing the deployment. Thus, some norms may apply generally to any deployment (such as organisational rules), whereas others may be highly specific (such as regulations governing specific areas or purposes). For this reason, the interpretation stage does not produce a one-size-fits-all normative framework; rather, it needs to be updated upon any change of context. This specific tying of norms to a context enforces human oversight at this stage: new human-designed norms are needed for any new context of deployment, thus necessarily implicating the deploying organisation in the process of considering each situation’s specificity and risk.
Even though values and their interpretations vary by culture, purpose, organisation, and context, some values are fundamentally tied to the context of drone deployment. As with any technology deployed into society, a fundamental value is that of lawfulness. A requirement for any drone deployment is, for example, to respect flight rules (e.g., maximum flight altitude and avoidance of airport surroundings). Thus, the identification of requirements for the trajectory taken by the drone is a fundamental aspect of this stage. Given the different capabilities that drones may be equipped with, aspects of the law related to flying over public spaces, commercial liability, or privacy [28], as well as surveillance [29] or warfare, must be considered. The purpose of the deployment itself (e.g., humanitarian aid, commercial delivery, or bird observation) will determine the relevant values that guide the process, such as privacy [30], safety [31], humanity [32], or ecological sustainability [33].
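As an illustration of trajectory requirements derived from flight rules, the sketch below checks a post-flight log against an altitude ceiling and an airport exclusion zone. It is a hedged example only: the 120 m ceiling, the airport coordinates, and the 8 km radius are hypothetical stand-ins for whatever the applicable regulation prescribes in a given context.

```python
import math

MAX_ALTITUDE_M = 120.0        # assumed regulatory ceiling, illustrative only
AIRPORT = (52.31, 4.76)       # hypothetical (lat, lon) of exclusion-zone centre
EXCLUSION_RADIUS_KM = 8.0     # hypothetical exclusion radius

def haversine_km(p, q):
    """Great-circle distance between two (lat, lon) points in kilometres."""
    lat1, lon1, lat2, lon2 = map(math.radians, (*p, *q))
    a = (math.sin((lat2 - lat1) / 2) ** 2
         + math.cos(lat1) * math.cos(lat2) * math.sin((lon2 - lon1) / 2) ** 2)
    return 6371.0 * 2 * math.asin(math.sqrt(a))

def trajectory_violations(log):
    """Return flight-rule violations found in a log of (lat, lon, alt) fixes."""
    violations = []
    for i, (lat, lon, alt) in enumerate(log):
        if alt > MAX_ALTITUDE_M:
            violations.append((i, "altitude ceiling exceeded"))
        if haversine_km((lat, lon), AIRPORT) < EXCLUSION_RADIUS_KM:
            violations.append((i, "inside airport exclusion zone"))
    return violations

# A two-fix log: the second fix breaches the altitude ceiling
log = [(52.10, 4.30, 80.0), (52.12, 4.32, 130.0)]
print(trajectory_violations(log))  # → [(1, 'altitude ceiling exceeded')]
```

Because the check runs over a recorded log rather than live telemetry, it fits the framework's restriction to observable post-flight elements.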
Requirements need to refer to the observable behaviour of drone and operator, and are considered in the context of pre- and post-flight procedures. They may apply to checkable behaviours of the drone (such as flying above a certain altitude or over certain areas), to pre-flight processes (such as obtaining approval or checking weather conditions), or to post-flight processes (such as evaluating the route followed or the treatment of the data obtained). Crucially, they are not limited to the drone’s behaviour, but must include the system around it to ensure human oversight: procedures such as pre-flight safety checks, acquiring authorisations, or human review of the data obtained should all be mandated and constrained, so that we can guarantee that the entire flight process has been subject to human oversight.
The norms and observable requirements identified at this stage form the basis for the next stage, indicating what should be monitored and checked, and which actions constitute norm violations.
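The two-stage gating idea, where a flight is only released once every pre-flight norm check passes and a deployment is only closed once every post-flight procedure is recorded, can be sketched in plain Python as follows. This is an illustrative reduction of the monitoring logic, not the paper's CPN formalisation; all check names are hypothetical.

```python
# Hypothetical mandated procedures derived from the interpretation stage
PRE_FLIGHT_CHECKS = ["authorisation_obtained", "weather_checked",
                     "safety_inspection_done"]
POST_FLIGHT_CHECKS = ["route_evaluated", "data_reviewed_by_human"]

def release_flight(record: dict) -> bool:
    """Pre-flight gate: the flight proceeds only if every mandated
    pre-flight procedure is recorded as completed."""
    return all(record.get(c, False) for c in PRE_FLIGHT_CHECKS)

def close_deployment(record: dict) -> list:
    """Post-flight audit: return the omitted procedures, each of which
    constitutes a norm violation to be acted upon by the forum."""
    return [c for c in POST_FLIGHT_CHECKS if not record.get(c, False)]

# One deployment record, missing the human data review
record = {"authorisation_obtained": True, "weather_checked": True,
          "safety_inspection_done": True, "route_evaluated": True}
print(release_flight(record))   # → True: all pre-flight checks passed
print(close_deployment(record)) # → ['data_reviewed_by_human'] was omitted
```

Note that both gates constrain human-led procedures rather than the drone's internals, matching the framework's treatment of the drone as a black box.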
Author Contributions
Conceptualization, I.V., A.A.T., and V.D.; methodology, I.V.; formal analysis, A.A.T.; domain knowledge, I.V.; writing—original draft preparation, I.V. and A.A.T.; writing—review and editing, V.D.; visualization, I.V.; supervision, V.D. All authors have read and agreed to the published version of the manuscript.
Funding
Aler Tubella and Dignum are supported by the Wallenberg AI, Autonomous Systems and Software Program (WASP), funded by the Knut and Alice Wallenberg Foundation.
Data Availability Statement
Conflicts of Interest
The authors declare no conflict of interest. The funders had no role in the design of the study; in the collection, analyses, or interpretation of the data; in the writing of the manuscript; or in the decision to publish the results.
References
- Verdiesen, I.; de Sio, F.S.; Dignum, V. Accountability and Control Over Autonomous Weapon Systems: A Framework for Comprehensive Human Oversight. Minds Mach. 2020, 31, 137–163. [Google Scholar] [CrossRef]
- Bovens, M. Analysing and assessing accountability: A conceptual framework 1. Eur. Law J. 2007, 13, 447–468. [Google Scholar] [CrossRef]
- Bovens, M.; Goodin, R.E.; Schillemans, T. The Oxford Handbook Public Accountability; Oxford University Press: Oxford, UK, 2014. [Google Scholar]
- Van de Poel, I. Translating values into design requirements. In Philosophy and Engineering: Reflections on Practice, Principles and Process; Springer: Berlin/Heidelberg, Germany, 2013; pp. 253–266. [Google Scholar]
- Keohane, R.O. Global Governance and Democratic Accountability; Citeseer: Princeton, NJ, USA, 2003. [Google Scholar]
- Greer, S.L.; Wismar, M.; Figueras, J.; McKee, C. Governance: A framework. Strength. Health Syst. Gov. 2016, 22, 27–56. [Google Scholar]
- Schedler, A. Conceptualizing accountability. Self-Restraining State Power Account. New Democr. 1999, 13, 17. [Google Scholar]
- Pagallo, U. From automation to autonomous systems: A legal phenomenology with problems of accountability. In Proceedings of the 26th International Joint Conference on Artificial Intelligence, Melbourne, Australia, 19–25 August 2017; pp. 17–23. [Google Scholar] [CrossRef] [Green Version]
- De Sio, F.S.; van den Hoven, J. Meaningful human control over autonomous systems: A philosophical account. Front. Robot. AI 2018, 5, 28. [Google Scholar] [CrossRef] [Green Version]
- Horowitz, M.C. The Ethics & Morality of Robotic Warfare: Assessing the Debate over Autonomous Weapons. Daedalus 2016, 145, 25–36. [Google Scholar] [CrossRef]
- López, L.B.; van Manen, N.; van der Zee, E.; Bos, S. DroneAlert: Autonomous Drones for Emergency Response. In Multi-Technology Positioning; Springer: Berlin/Heidelberg, Germany, 2017; pp. 303–321. [Google Scholar] [CrossRef]
- Waharte, S.; Trigoni, N. Supporting search and rescue operations with UAVs. In Proceedings of the 2010 International Conference on Emerging Security Technologies, Canterbury, UK, 6–7 September 2010; pp. 142–147. [Google Scholar] [CrossRef]
- Aler Tubella, A.; Theodorou, A.; Dignum, F.; Dignum, V. Governance by Glass-Box: Implementing Transparent Moral Bounds for AI Behaviour. In Proceedings of the Twenty-Eighth International Joint Conference on Artificial Intelligence (IJCAI’2019), Macao, China, 10–16 August 2019. [Google Scholar]
- GGE. Emerging Commonalities, Conclusions and Recommendations; Possible Guiding Principles; United Nations: Geneva, Switzerland, 2018. [Google Scholar]
- Caparini, M. Media and the Security Sector: Oversight and Accountability; Geneva Centre for the Democratic Control of Armed Forces (DCAF) Publication: Addis Ababa, Ethiopia, 2004; pp. 1–49. [Google Scholar]
- Scott, C. Accountability in the regulatory state. J. Law Soc. 2000, 27, 38–60. [Google Scholar] [CrossRef] [Green Version]
- Pelizzo, R.; Stapenhurst, R.; Olson, D. Parliamentary Oversight for Government Accountability. 2006. Available online: https://ink.library.smu.edu.sg/cgi/viewcontent.cgi?article=1136&context=soss_research (accessed on 15 September 2021).
- Åström, K.J.; Kumar, P.R. Control: A perspective. Automatica 2014, 50, 3–43. [Google Scholar] [CrossRef]
- Pigeau, R.; McCann, C. Re-Conceptualizing Command and Control; Technical Report; Defence and Civil Institute of Environmental Medicine: Toronto, ON, Canada, 2002.
- Koppell, J.G. Pathologies of accountability: ICANN and the challenge of “multiple accountabilities disorder”. Public Adm. Rev. 2005, 65, 94–108. [Google Scholar] [CrossRef]
- Busuioc, M. Autonomy, Accountability and Control. The Case of European Agencies. In Proceedings of the 4th ECPR General Conference, Pisa, Italy, 6–8 September 2007; pp. 5–8. [Google Scholar]
- Pesch, U. Engineers and active responsibility. Sci. Eng. Ethics 2015, 21, 925–939. [Google Scholar] [CrossRef]
- Castelfranchi, C.; Falcone, R. From automaticity to autonomy: The frontier of artificial agents. In Agent Autonomy; Springer: Berlin/Heidelberg, Germany, 2003; pp. 103–136. [Google Scholar]
- Friedman, B.; Kahn, P.H.; Borning, A.; Huldtgren, A. Value sensitive design and information systems. In Early Engagement and New Technologies: Opening Up the Laboratory; Springer: Berlin/Heidelberg, Germany, 2013; pp. 55–95. [Google Scholar]
- Cummings, M.L. Integrating ethics in design through the value-sensitive design approach. Sci. Eng. Ethics 2006, 12, 701–715. [Google Scholar] [CrossRef] [PubMed]
- Davis, J.; Nathan, L.P. Value sensitive design: Applications, adaptations, and critiques. In Handbook of Ethics, Values, and Technological Design: Sources, Theory, Values and Application Domains; Springer: Berlin/Heidelberg, Germany, 2015; pp. 11–40. [Google Scholar]
- Van den Hoven, J.; Vermaas, P.; Van de Poel, I. Design for values: An introduction. In Handbook of Ethics, Values, and Technological Design; Springer: Berlin/Heidelberg, Germany, 2015; pp. 1–7. [Google Scholar]
- Rao, B.; Gopi, A.G.; Maione, R. The societal impact of commercial drones. Technol. Soc. 2016, 45, 83–90. [Google Scholar] [CrossRef]
- Rosen, F. Extremely Stealthy and Incredibly Close: Drones, Control and Legal Responsibility. J. Confl. Secur. Law 2014, 19, 113–131. [Google Scholar] [CrossRef] [Green Version]
- Luppicini, R.; So, A. A technoethical review of commercial drone use in the context of governance, ethics, and privacy. Technol. Soc. 2016, 46, 109–119. [Google Scholar] [CrossRef]
- Clarke, R.; Bennett Moses, L. The regulation of civilian drones’ impacts on public safety. Comput. Law Secur. Rev. 2014, 30, 263–285. [Google Scholar] [CrossRef]
- van Wynsberghe, A.; Comes, T. Drones in humanitarian contexts, robot ethics, and the human–robot interaction. Ethics Inf. Technol. 2020, 22, 43–53. [Google Scholar] [CrossRef] [Green Version]
- Vas, E.; Lescroël, A.; Duriez, O.; Boguszewski, G.; Grémillet, D. Approaching birds with drones: First experiments and ethical guidelines. Biol. Lett. 2015, 11, 20140754. [Google Scholar] [CrossRef] [PubMed] [Green Version]
- Aler Tubella, A.; Dignum, V. The Glass Box Approach: Verifying Contextual Adherence to Values. In Proceedings of the AISafety 2019, Macao, China, 11–12 August 2019. CEUR-WS. [Google Scholar]
- Von Wright, G.H. On the Logic of Norms and Actions. In New Studies in Deontic Logic; Springer: Dordrecht, The Netherlands, 1981; pp. 3–35. [Google Scholar] [CrossRef]
- Jensen, K.; Kristensen, L.M.; Wells, L. Coloured Petri Nets and CPN Tools for modelling and validation of concurrent systems. Int. J. Softw. Tools Technol. Transf. 2007, 9, 213–254. [Google Scholar] [CrossRef]
- JUAS-COE. JFCOM-UAS-PocketGuide; JUAS-COE: Arlington County, VA, USA, 2010.
Publisher’s Note: MDPI stays neutral with regard to jurisdictional claims in published maps and institutional affiliations.
© 2021 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).