Article

Simplification of Administrative Procedures through Fully Automated Decision-Making: The Case of Norway

by
Emily M. Weitzenboeck
Oslo Business School, Faculty of Social Sciences, OsloMet-Oslo Metropolitan University, Pilestredet 35, 0166 Oslo, Norway
Adm. Sci. 2021, 11(4), 149; https://doi.org/10.3390/admsci11040149
Submission received: 31 October 2021 / Revised: 26 November 2021 / Accepted: 29 November 2021 / Published: 7 December 2021

Abstract

Norway has a high degree of digitalisation. In the public sector, there is a long tradition of automation of parts of case management. This includes automation of cases where a public sector body makes a so-called individual administrative decision, that is, a decision made in the exercise of public authority through which the rights or duties of one or more specified private persons are determined. In the last five years, various amendments to public sector legislation have been proposed by a number of government departments and agencies in Norway to ensure that the relevant administrative agency has a legal basis to carry out fully automated individual decisions. This is challenging both from an administrative law and from a data protection law standpoint. Among the main reasons for the move towards fully automated legal decision-making that are mentioned in the preparatory works to the proposed amendments are greater efficiency in decision-making, equal treatment of citizens and a claim that such decisions will be less prone to error than human decisions. This paper examines this trend in Norway and identifies the statutes and regulations that have been amended or are in the process of being amended. It analyses the measures specified in these amendments to safeguard the individual party’s rights, freedoms and legitimate interests. Finally, it discusses the tightrope that must be walked to safeguard important administrative law principles and rules such as protection from arbitrary decisions, the audi alteram partem rule and the right under the European Union’s General Data Protection Regulation not to be subject to fully automated decisions.

1. Introduction

Norway has a high degree of digitalisation, not least in the public sector. In 2020, Norway was ranked thirteenth out of 193 countries in the United Nations (“UN”) e-government survey of digitalisation in the public sector (UN 2020). Both state and municipal organisations increasingly offer services digitally, and public use of such services is on the rise (Norwegian Ministry of Foreign Affairs 2019; OECD 2017).
Within the public sector, Norway has a tradition of using computers and automation in case processing that dates back several decades, with the first totally automated routine–the Government Housing Benefit System–already in use in the early 1970s (Hildonen and Gulustuen 2012; Schartum 2020). Since then, there has been even more digitization of documents, materials and data and further digitalisation of processes in the public sector. Furthermore, the availability of massive troves of data (Big Data), coupled with access to greater computing power at a lower cost than ever before, has facilitated the use of more automation in case processing (Norwegian Ministry of Government Administration, Reform and Church Affairs 2012–2013). In various white papers and strategy documents, the increased automation of case processing, as well as the full or partial automation of administrative procedures, has broadly been seen as a positive and desirable aim by the government (Norwegian Ministry of Local Government and Modernisation 2019; Norwegian Ministry of Local Government and Modernisation 2020; Norwegian Ministry of Government Administration, Reform and Church Affairs 2012–2013, 2015–2016) and by various public sector bodies and agencies (Norwegian Ministry of Labour and Social Affairs 2019). Among the advantages of automation is that it can result in substantial efficiency gains and contribute to increased equal treatment of citizens (Norwegian Ministry of Labour and Social Affairs 2019). The Law Commission on the Public Administration Act (2019) has also stated that automated case processing can both ensure that procedural requirements for case processing are complied with and facilitate the correct implementation of rights and obligations. Seen in this light, the more automation there is, the more efficient the public sector becomes. Efficiency is, of course, an admirable goal to strive for as long as the fundamental administrative law principles such as lawfulness and conformity with statutory purpose, equality of treatment, objectivity and impartiality, proportionality, legal certainty, transparency and the individual’s right to be heard are observed (Council of Europe 2018; Graver 2019).
In Norway, the main legislation that sets down rules on case processing by the public administration is the Public Administration Act of 1967 (“PAA”), which entered into force on 1 January 1970. The PAA applies to activities carried out by administrative agencies. The term “administrative agency” comprises all central and local government bodies, as well as any “private legal person […] in cases where such person makes individual decisions or issues regulations”, c.f. PAA Section 1. An individual decision is a decision made in the exercise of public authority through which the rights or duties of one or more specified private persons are determined, c.f. PAA Section 2 first paragraph letter b, cf. letter a.
Although the PAA has been amended several times since its enactment, it was only in 2001 that significant amendments were made to remove barriers that could impede electronic case processing. Through these amendments, the term “document” was made technology-neutral and electronic communication was equated with paper-based communication, also for public sector purposes. Another set of amendments to the PAA in 2013 strengthened the public sector’s possibility of communicating electronically both internally within the public administration as well as externally with the business sector and the general public. The legal framework to facilitate digitization, electronic communication and automation of case processing in the public sector has thus existed for several years. Partially automated case processing, where case officers use digital support tools to assist them with various parts of case processing and case management, is commonplace. This includes the use of decision-making support tools to assist case officers when a formal individual decision pursuant to the PAA is to be made. The use of fully automated legal decision-making systems also appears to be on the rise in certain sectors of the public administration. Thus, for example, most tax decisions concerning individual taxpayers are totally automated, as are more than 70 percent of applications to the Norwegian State Educational Loan Fund and the majority of applications for housing benefits (Schartum 2020). With the push towards further digitalisation and digital transformation as underlined in Norway’s digital strategy for the public sector for 2019–2025 (Norwegian Ministry of Local Government and Modernisation 2019), one expects an increase in the use of fully automated legal decision-making systems. In fact, in the last five years, various amendments to public sector legislation have been proposed by a number of government departments and agencies in Norway to ensure that the relevant administrative agency has a legal basis to carry out fully automated administrative decisions in the case of certain specific tasks or individual decisions.
The issue of fully automated individual decisions poses challenges both from an administrative law as well as from a data protection law standpoint. Besides having to comply with the fundamental administrative law principles mentioned above, a decision that is “based solely on automated processing, including profiling, which produces legal effects concerning [an individual] or similarly significantly affects him or her” is subject to Article 22 of the European Union’s (“EU”) General Data Protection Regulation 2016/679 (“GDPR”) (my emphasis). Individual decisions by government agencies would thus, by definition, fall within GDPR Article 22 when such decisions are based “solely on automated processing”.
Pursuant to GDPR Article 22(1), an individual has the right to not be subject to such fully automated legal decisions. However, GDPR Article 22(1) does not apply if there is a legal basis for such a fully automated decision pursuant to GDPR Article 22(2) and, in the case of special categories of personal data, a legal basis pursuant to GDPR Article 22(4). One such legal basis—and the one that is more pertinent to processing by the public sector—is that the solely automated processing is authorized by either EU or national law “which also lays down suitable measures to safeguard the data subject’s rights and freedoms and legitimate interests”, c.f. GDPR Article 22(2)(b). It is this requirement for a legal basis in, inter alia, national law, that appears to be the reason behind the flurry of legislative amendments to public sector legislation–both primary legislation (i.e., statutes) and secondary legislation (i.e., regulations)–that have been proposed or passed in Norway in the last five years, i.e., since the entry into force of the GDPR.
This paper identifies and examines the various amendments to public sector legislation—both statutes and regulations—that have been passed in Norway in the last few years as well as those that have been proposed and are still pending. The focus is on amendments that permit fully automated legal decision-making by public sector bodies and agencies. The scope and wording of such amendments are analyzed to identify whether the legislator laid down any limitations, requirements, measures or other safeguards regarding the use of fully automated decisions. The amendments are then examined in light of the fundamental principles of Norwegian public administrative law and the right not to be subject to fully automated legal decision-making pursuant to the GDPR. This paper questions whether the piecemeal approach taken so far in Norway is conducive to having clear, precise and foreseeable legal bases that are suited for full automation. Though the study looks at the Norwegian implementation of GDPR Article 22 in the public sector, it is likely to be of interest to other European Economic Area (“EEA”) countries that are still deliberating on the implications and scope of GDPR Article 22.

2. Research Questions and Methodological Outline

2.1. Research Questions

The aim of this paper is twofold.
The paper first identifies and maps legislation that has been amended, as well as proposals for amendment that are still pending, which give public administration agencies or bodies a legal basis to issue individual decisions that are based solely on automated processing (also referred to in this paper as “fully automated legal decisions”) with regard to certain types of processing. The mapping exercise also includes identifying amendments to statutes that consist only of a provision enabling the issuing of regulations that allow administrative agencies to make fully automated legal decisions. Both primary and secondary legislation are the object of the study.
The mapping exercise also identifies any limitations included in the wording of the amendment or proposed amendment that circumscribe or set requirements on the type of decision that can be fully automated, as well as any other measures that are expressly mentioned in the wording of the amendment or proposed amendment to safeguard the individual’s rights, freedoms and legitimate interests.
Following this, the paper then discusses the efficacy of the limitations and measures in the amended legislation or in the proposed amendments in light of the requirement for “suitable measures” laid down in GDPR Article 22 and the rules underlying Norwegian public administrative law.

2.2. Methodological Outline

A document search was carried out in two databases that contain various Norwegian legal sources—the government’s public database “Regjeringen.no” and the Lovdata database that is run by the Lovdata foundation.
Firstly, a document search was made in the Regjeringen.no database for all documents that contain the string “automatiserte avgjørelser” (in English: automated decisions). Among the documents contained in this database are preparatory works to legislation, such as proposals for legislation or amendments to existing legislation, law commission reports on specific legal issues (known in Norwegian as “Norges offentlige utredninger” or “NOUs”) and comments and replies to public consultations on proposed legislation. It also contains other documents and reports such as white papers, fiscal budgets and government reports, plans and strategies. The Regjeringen.no database includes all proposals for new statutes and proposals to amend existing statutes since the legislative session of 1997–1998.
Following the document search in the Regjeringen.no database, the results of the search were analyzed and all proposals for legislation or amendments to existing legislation were identified. A search for each of the identified proposed amendments and new legislation was then made in the Lovdata database to check their status (i.e., whether they had been passed by Parliament) and, if so, what the final text of the enacted amendment or new legislation was. The Lovdata database contains legislation (both statutes and regulations) that has been enacted, including legislation that has been passed by the Storting (the Norwegian Parliament) even if it has not yet come into effect, as well as abrogated legislation. However, the Lovdata database does not always include proposals for legislation or proposals to amend legislation that are still pending public consultation or still to be debated in the Storting, that is, that have not yet been passed by the Storting. Hence the use of both the Regjeringen.no and the Lovdata databases. Furthermore, to ensure that no relevant statutes or regulations were inadvertently omitted in the Regjeringen.no search, the same search string was also run in the Lovdata database.
Statutes or regulations that apply only for a temporary period expiring by the end of 2021, for example, those providing subsidies for certain months during the COVID-19 pandemic, have been omitted from this study.
Although the search in Regjeringen.no was carried out in the first week of August 2021, the status of the proposals for amendments that had not been discussed or passed by the Storting was checked in the third week of October 2021 and the findings were updated accordingly.

3. Theoretical Framework

3.1. Safeguards under the Personal Data Act and the GDPR

3.1.1. Article 22 on Fully Automated Decision-Making

Though a member state of the EEA, Norway is not a member state of the EU and thus, EU regulations do not have direct application in Norwegian law. The GDPR was therefore incorporated into Norwegian law by the Personal Data Act of 2018, c.f. Section 1. According to GDPR Article 22(1), an individual has a “right not to be subject to a decision based solely on automated processing, including profiling, which produces legal effects concerning him or her or similarly significantly affects him or her”. However, as further discussed below, Article 22(1) does not apply in respect of three types of decisions specified in Article 22(2). As stated earlier in this paper, an individual decision pursuant to the PAA is, by definition, a decision that determines the rights or duties of one or more specified persons and thus falls within the provisions of GDPR Article 22 when it is “based solely on automated processing” (my emphasis).
Solely automated decision-making is described as “the ability to make decisions by technological means without human intervention” (Article 29 Data Protection Working Party 2018). To qualify as human involvement, the public body or agency that determines the purposes and means of the processing (i.e., as data controller) must ensure that any oversight of the decision is meaningful and not a token gesture. In other words, “it should be carried out by someone who has the authority and competence to change the decision” (Article 29 Data Protection Working Party 2018). As explained by Bygrave and Mendoza, “[e]ven if a decision is formally ascribed to a person, it is to be regarded as based solely on automated processing if a person does not actively assess the result of the processing prior to its formalization as a decision” (Mendoza and Bygrave 2017).

3.1.2. Interpreting Article 22(1)

There has been much discussion among data protection scholars on whether GDPR Article 22(1) is to be interpreted as a general prohibition or as a right to be exercised at the choice of the data subject. (Mendoza and Bygrave 2017; Tosoni 2021; Wachter et al. 2017).
If GDPR Article 22(1) is interpreted as a prohibition, data controllers would basically not be allowed to make individual decisions solely based on automated processing unless one of the exceptions specified in Article 22(2) applies. According to GDPR Article 22(2), solely automated decision-making is permitted: (a) if it “is necessary for entering into, or performance of, a contract between the data subject and a data controller”, (b) if it “is authorised by Union or Member State law to which the controller is subject and which also lays down suitable measures to safeguard the data subject’s rights and freedoms and legitimate interests”, or (c) if “it is based on the data subject’s explicit consent”. Contract formation or performance is not a suitable legal basis for the processing of personal data in the context of the exercise of public authority. Nor is explicit consent a suitable legal basis, because of the unequal bargaining position and the imbalance in power dynamics between, on one side, the administrative agency and, on the other side, the individuals to whom such decisions apply. Thus, if one were to interpret Article 22(1) as a prohibition, the use of fully automated individual decisions would only be lawful as long as such decisions are authorized by EU or Norwegian law and provided the safeguards specified in Article 22(2)(b) are in place.
However, if GDPR Article 22(1) is interpreted as a right, the use of fully automated decisions would only be restricted where the data subject has expressly objected to such decisions. This would mean that data controllers such as administrative agencies would be free to issue decisions based solely on automated processing, but the addressees of the decisions resulting from such processing would remain free not to accept them. This interpretation implies that to oblige data subjects to accept a fully automated legal decision, there must be a legal basis in national or EU law that authorizes the public administration body to issue such decisions.
The Article 29 Data Protection Working Party has interpreted Article 22(1) as establishing a general prohibition of decision-making based solely on automated processing. According to the Working Party’s guidelines, “[t]his prohibition applies whether or not the data subject takes an action regarding the processing of their personal data” (Article 29 Data Protection Working Party 2018). These guidelines were expressly endorsed by the European Data Protection Board (“EDPB”), the successor of the Article 29 Data Protection Working Party (European Data Protection Board 2018). Though not legally binding, these guidelines are highly authoritative as they emanate from a body (the Article 29 Data Protection Working Party and its successor, the EDPB) which comprises all the data protection authorities in the EEA, i.e., the supervisory authorities that have the power to enforce data protection legislation. Following the publication of the guidelines, the prevailing view has been that Article 22(1) contains a general prohibition. However, a recently published article by Tosoni presents a thorough consideration of the ‘right-vs.-prohibition’ issue and argues persuasively that Article 22(1) is best regarded as laying down a right to be exercised by data subjects, rather than as a general prohibition (Tosoni 2021). It remains to be seen whether Tosoni’s article changes the current view and tips the balance in favour of the interpretation that Article 22(1) establishes a right rather than a general prohibition. Nevertheless, it appears that the Norwegian drafters of the various proposed (and adopted) amendments examined in this paper have been highly influenced by the Article 29 Data Protection Working Party’s guidelines and have simply stated, often referring to such guidelines, that Article 22(1) lays down a general prohibition and that, in light of this, there was a need for a specific legal basis in Norwegian law for individual administrative decisions to be fully automated.
Even if the tide turns in favour of interpreting Article 22(1) as a right and not a general prohibition, there are still strong arguments to be made in favour of the legislator ensuring that there is a specific legal basis that authorizes fully automated decision-making by administrative agencies. A legal basis as specified in Article 22(2)(b) strengthens the rule of law by providing a clear rule that lays down when such decisions can be taken by administrative agencies and also what measures are to be taken to safeguard the data subject’s rights, freedoms and legitimate interests. It also makes for more effective public sector automation by restricting the individual’s right to be able to challenge automated administrative decisions purely on the basis that such decisions were fully automated.
As to the nature of the legal basis that authorizes fully automated decision-making pursuant to GDPR Article 22(2)(b), the wording of this provision refers to “Union or Member State law” that provides suitable measures. As Recital 41 states, the requirement in the GDPR for a legal basis or a legislative measure does not necessarily require a legislative act adopted by a parliament. What is essential is that “such a legal basis or legislative measure should be clear and precise, and its application should be foreseeable to persons subject to it”, c.f. GDPR Recital 41. It is thus clear that both primary and secondary legislation are encompassed. Moreover, a clear and precise pronouncement in the preparatory works to a sector-specific law or regulation to the effect that a certain processing activity or activities are fully automated may also suffice as a legal basis pursuant to Article 22(2)(b) (Schartum 2018). This is, for example, the case with Section 9-2 of the Tax Management Act of 2016, which deals with tax calculation. The preparatory works to the Tax Management Bill (2015–2016) refer to “mass administrative systems” where the calculation of tax and the making of individual decisions are fully automated, but the Act itself does not have any provision which specifically regulates the issue of fully or partially automated decisions (Norwegian Ministry of Finance 2018).
A further distinction is introduced in GDPR Article 22(4) where special categories of personal data are processed. Special categories of personal data are data revealing racial or ethnic origin, political opinions, religious or philosophical beliefs or trade union membership, genetic data, biometric data processed for the purpose of uniquely identifying an individual, health data and data concerning one’s sex life or sexual orientation, c.f. GDPR Article 9(1). Where the personal data processed in respect of a fully automated legal decision include special categories of data, the processing by the public body or agency must either be “necessary for reasons of substantial public interest, on the basis of Union or Member State law which shall be proportionate to the aim pursued, respect the essence of the right to data protection and provide for suitable and specific measures to safeguard the fundamental rights and the interests of the data subject”, c.f. GDPR Article 9(2)(g), or based on the data subject’s explicit consent.

3.1.3. The Requirement for Suitable Measures for Automated Individual Decision-Making

According to GDPR Article 22(2)(b), the legal basis that authorizes fully automated decisions must lay down “suitable measures to safeguard the data subject’s rights and freedoms and legitimate interests”. What these “suitable measures” are to be is not further specified in this provision and EEA states thus have a broad leeway in this regard. However, Article 22(3) lists three suitable measures that must be provided as a minimum in cases where fully automated decisions are based either on the data subject’s explicit consent (c.f. Article 22(2)(c)) or where it is necessary for entering into or performance of a contract with the data subject (c.f. Article 22(2)(a)). Though on their face, the suitable measures required of a national legal basis pursuant to Article 22(2)(b) do not need to include the safeguards specified in Article 22(3), “[i]n many if not all contexts, however, these safeguards (or elements of them) are likely to figure as measures for the purposes of Article 22(2)(b)” (Bygrave 2020a). The measures that must be provided to data subjects pursuant to Article 22(3) are the following:
  • the right to obtain human intervention on the part of the controller;
  • the right to express his or her point of view;
  • the right to contest the decision.
As stated above, the list of measures in Article 22(3) is not exhaustive, and GDPR Recital 71 also mentions the data subject’s right to specific information, and the right “to obtain an explanation of the decision reached after such assessment”. Scholars differ on whether a data subject has a right to an ex post explanation of automated decisions affecting him or her (Edwards and Veale 2017; Edwards and Veale 2018; Kaminski 2019; Mendoza and Bygrave 2017; Selbst and Powles 2017; Wachter et al. 2017) and on the extent to which such a right is inherent, if not in Article 22, in the various other provisions of the GDPR (Malgieri 2019). Bygrave holds that “solid grounds exist for viewing the right as inherent in the penumbra” of various other rights in the GDPR, such as the right to contest a decision in Article 22(3), in the overarching principle that personal data must be processed “fairly and in a transparent manner” in Article 5(1)(a), and in the data subject’s right to obtain “meaningful information about the logic involved, as well as the significance and the envisaged consequences of such processing for the data subject” in Article 15(1)(h) (Bygrave 2020a).
The Article 29 Data Protection Working Party provides a list of other good practice suggestions for additional safeguards, which include: regular quality assurance checks of systems against discrimination and unfair treatment; algorithmic auditing (internal and/or by an independent third party); contractual assurances for third-party algorithms; data minimisation measures that incorporate clear retention periods; anonymisation or pseudonymisation techniques; ways to allow the data subject to express his or her point of view and contest the decision; and a mechanism for human intervention (Article 29 Data Protection Working Party 2018).

3.2. General Safeguards under the Public Administration Act

Underlying Norwegian public administrative law are the key principles of lawfulness and conformity with statutory purpose (i.e., the duty not to act arbitrarily), proper and trustworthy case processing, the duty of objectivity and impartiality, the duty of proportionality and the individual’s right to be heard, right to review and right to appeal administrative decisions (Graver 2019). The principle of lawfulness and non-arbitrariness is enshrined in Section 113 of the Constitution of Norway, which states that any intervention by public authorities against an individual must have a basis in law. Many of the other fundamental principles were codified in the PAA. The requirement of impartiality is regulated in PAA Section 6, which lays down rules specifying when public officials and anyone else performing services for or working for an administrative agency shall be disqualified from preparing the basis for a decision and/or from making any decision in an administrative case. Further fundamental administrative law principles are likewise enshrined in the PAA. Thus, administrative agencies have a duty to provide general guidance to individuals seeking assistance on matters within the specific agency’s competence, c.f. PAA Section 11. An individual in respect of whom an administrative decision is about to be issued must receive advance notice of the decision that is to be issued, unless such individual is already aware of that fact, c.f. PAA Section 16. This, coupled with the administrative agency’s duty to ensure that the case before it is clarified as thoroughly as possible before any individual decision is made, enables the individual to participate in the case, provide any other information or evidence relevant to the case, and otherwise be heard before a final administrative decision is made in his or her respect, c.f. PAA Section 17. Furthermore, public sector bodies and agencies must give grounds for their individual decisions, c.f. PAA Section 24. The grounds must refer to (i) the rules on which the individual decision is based, (ii) the factual circumstances upon which the administrative decision is based, and (iii) where the decision involves the use of administrative discretion, the chief considerations that were decisive for the exercise of discretionary powers. If guidelines for the exercise of administrative discretion exist, a reference to such guidelines would be deemed sufficient, c.f. PAA Section 25. Once an individual decision is made, it must be notified to the party, together with information on, inter alia, the right to appeal, the time limit for an appeal and the specific procedure to be followed for an appeal. Thus, the combined effect of the abovementioned provisions enables the individual (i.e., the data subject) “to express his or her point of view” and “to contest the decision”, i.e., two of the measures specified in GDPR Article 22(3).

3.3. Findings

As mentioned earlier, a search for the string “automatiserte avgjørelser” (in English: automated decisions) was made in the Regjeringen.no public database during the first week of August 2021. The search identified forty-three documents containing this string. Some documents made proposals for amendments to more than one piece of legislation or regulation. Each of the forty-three documents was examined, and twenty-seven different proposals for amendments to legislation (statutes or regulations) were identified that provided a specific legal basis for a public sector body or agency to issue fully automated decisions in respect of certain types of processing and/or contained an enabling provision permitting such a body or agency to issue more detailed regulations on automated decision-making.
As stated earlier, to determine whether the proposal for amendment was passed, as well as what the final text of the amendment as approved by the Storting was, a search was made in Lovdata for all the twenty-seven proposals for amendments to laws and regulations. From this, it emerged that twenty-one of the proposals for amendments have already been passed, whereas six proposals for statutory amendments are still pending adoption.
Fourteen of the twenty-one amendments that were passed by the Storting contained a substantive legal basis that permits the use of fully automated legal decisions in respect of certain specific types of processing. These are set out in Table 1 below, with the Norwegian short name of the respective legislation included in brackets. Table 1 also indicates (i) the relevant section number and the type of processing activity that the legal basis is in respect of, (ii) any limitations, included in the wording of the legal provision, that circumscribe or set requirements on the type of decision that can be fully automated, e.g., regarding the exercise of discretion, the need for proper and trustworthy case processing, or the requirement that the processing must be compatible with the right to protection of personal data, and (iii) any other measures or safeguards that are expressly mentioned in the wording of the amendment such as the individual’s right to demand that a human being reviews the decision, the requirement for frequent manual checks, the need for quality assurance, and any specification as to when the legal provision(s) may be given effect.
The other seven amendments that were passed only contain an enabling section, i.e., a section which states that the public agency or body concerned may issue regulations on fully automated legal decisions. They contain no other substantive provision on the matter. These seven amendments are set out in Table 2 below. Four of these enabling provisions have given rise to amendments to regulations that specify when and how fully automated decisions may be made by a public sector body. As indicated in the “Remarks” column in Table 2, the amendments to these four regulations are examined in more detail in Table 1.
Table 3 identifies two proposals for statutory amendments that were published for public consultation in July 2021 in one and the same consultation document and which are still pending, i.e., they have not yet undergone formal discussion by the Storting. These are proposals for amendment to, respectively, the Patient Medical Records Act and the National Insurance Act. Table 3 also includes a reference to a bill for a new Carriage of Goods Act and a bill for a new Customs Duties Act that were published for consultation in September 2021 (the earlier proposal was sent for public consultation in 2019 by the Ministry of Finance). Both bills were published in one and the same consultation document (Norwegian Ministry of Finance 2021).
Similar to Table 1, Table 3 identifies any limitations included in the wording of the proposed amendment that circumscribe or set requirements on the specific processing that the legal basis is in respect of, and any other measures or safeguards that are expressly included in the wording of the proposed amendment.
Table 3 also includes a reference to two different Law Commission reports which respectively propose a new Public Administration Act and a new Archives Act. Both reports were published for public consultation in 2019. Both Law Commissions discussed the use of fully automated legal decision-making systems, and each of them suggested wording in its respective proposal to regulate this issue. The report of the Law Commission on the Public Administration Act and its proposal for a new Act are currently being examined by the Ministry of Justice and Public Security—the government ministry which has responsibility for this field of law. As regards the report of the Law Commission on the Archives Act, this has progressed further and, on 5 October 2021, the government published a bill that proposes a new Archives Act. The bill is open for public consultation until 14 January 2022. Though the bill is based on the Law Commission on the Archives Act’s draft law, the Law Commission’s proposed provisions that dealt specifically with fully automated legal decisions were not taken forward in the bill. This is discussed in more detail in Section 4 below.
The bill proposing a new Archives Act and the Law Commission’s proposal for a new Public Administration Act are horizontal legislation that have a broad substantive scope and apply to all administrative agencies. The other amendments specified in Table 1, Table 2 and Table 3 are in respect of sector-specific statutes and regulations.

4. Discussion: Results of Mapping Exercise

4.1. Overview

An analysis of the results in Table 1 and Table 2 shows that the amendments passed to permit fully automated decisions may be broadly grouped into the following sectors: education, citizenship and immigration, driving licenses, and welfare and pension. The two sector-specific proposals for amendments (Table 3) concern the health and care sector as well as welfare. Whereas some of the legal bases, e.g., the amendments to the Universities and University Colleges Act, the Primary, Lower and Upper Secondary Schools Act, the Higher Vocational Act and the Integration Regulations, are limited to very specific types of processing, such as the approval of foreign education/training, the computation of multiple-choice test results in certain specific exams, or the processing in a specific system (e.g., an administrative system), other legal bases are worded more broadly to encompass virtually all processing that is necessary to enable the particular public sector body to perform its public tasks or to exercise official authority (e.g., the amendments to the NAV Act). It also appears that where the scope of the legal basis is relatively broad, the legislator sought to introduce very specific limitations, presumably to offset the fact that the administrative decisions will be fully automated. These limitations relate to the type of legal rules that may be fully automated, that is: (i) whether and to what extent one may fully automate rules that allow the exercise of discretion, (ii) distinguishing between the automation of decisions which are “only slightly invasive/intrusive” and decisions which are “invasive/intrusive”, and (iii) requiring that the full automation of such rules is compatible with the right to protection of personal data. These limitations are more closely examined in Section 4.2 below. Furthermore, the amendments also specify which measures must be implemented to safeguard the data subject’s rights, freedoms and legitimate interests, such as: (i) providing a right to obtain human review; (ii) requiring quality assurance; (iii) specifying the time/period when the fully automated legal provision applies; and (iv) emphasizing specific data protection principles. These measures are further examined in Section 4.3 below. Section 4.4 of this paper then examines how a right to explanation of fully automated decisions is to be construed.

4.2. Limitations

4.2.1. Limitations to the Exercise of Discretion and the Right to Proper Case Processing

Nine of the statutes and regulations in Table 1 specifically state that the fully automated decision “cannot be based on discretionary terms in a statute or a regulation, unless the decision is indubitable”. This requirement is discussed in the preparatory works to all the nine legislative amendments. While three of the legislative amendments—the Educational Financial Support Act, the Regulations on Citizenship and the Regulations on Foreigners’ Access to Norway—were proposed in separate preparatory works documents (Norwegian Ministry of Education 2019, 2021; Norwegian Ministry of Justice and Public Security 2019), the amendments to the NAV Act, Public Service Pension Fund Act, Employed Seamen’s Pension Scheme Act, Fishermen’s Pension Insurance Act, Nurses’ Pension Scheme Act, and Pension Scheme for Persons Accompanying Foreign Service Employees Act were all proposed and discussed in one and the same comprehensive document (Norwegian Ministry of Labour and Social Affairs 2019).
The preparatory works do not elaborate on what the term "indubitable" means in this context. The question that arises is how one can determine whether a decision is "indubitable" and what the criteria for determining this are.
Perhaps in an effort to shed some light on this, the preparatory works to the aforementioned six legislative amendments (Norwegian Ministry of Labour and Social Affairs 2019) link the abovementioned prohibition (i.e., against the use of fully automated decisions where a provision has discretionary terms, unless the decision is indubitable) with a party’s right to “proper” or “trustworthy” case processing by administrative agencies (termed, in Norwegian, “forsvarlig saksbehandling”). The proposed amendment to each of these six statutes, as well as the proposed amendment to the Educational Financial Support Act, specifically includes wording to the effect that “the processing must respect the party’s right to proper case processing”.
The principle that case processing by administrative agencies must be “proper/trustworthy” (in Norwegian, “forsvarlig”) has been recognised in Norwegian administrative legal theory (Eckhoff and Smith 2018; Frihagen and Rasmussen 2010; Graver 2019) since at least 1968 (Stub 2011). Norway has also adopted the Council of Europe’s 2007 Recommendation on good administration, though this did not bring about any further amendments to the PAA (Stub 2011). Graver (2019) explains that the two main aspects of the principle of proper/trustworthy case processing that emerge from Norwegian jurisprudence are the individual’s right to be heard and the duty of administrative agencies to clarify a case before issuing individual decisions. These two aspects of the principle of proper/trustworthy case processing were codified in Chapter IV of the PAA, which contains rules regarding the preparation of cases concerning individual decisions. According to Section 17, an administrative agency must clarify a case before an individual decision is made. As mentioned earlier, this duty includes giving advance notice to a party, who has not already expressed his or her opinion on the case, that an individual decision will be issued in his or her respect, thereby giving such party the opportunity to express his or her views (i.e., a manifestation of the audi alteram partem rule), c.f. Section 16. For a party to be able to exercise his or her right to express his or her views on the case, such party must also have a right of access to the documents in the case, c.f. Sections 18 to 20 (Graver 2019).
The question that arises is whether the principle of proper/trustworthy case processing has any independent meaning beyond the codified right to express one’s views (including a right to access case documents) and the administrative agency’s duty to clarify a case. Recent research by Stub (2011) has claimed that, contrary to scholarly writings, there is no such general right to proper case processing beyond the codified rules in the PAA. However, according to Stub (2011), there could be grounds for holding that there is an unwritten right to proper case processing that has a more limited sphere of application such as, for example, in cases regarding decisions that are rather invasive.
According to the preparatory works to the nine amendments identified in Table 1, the duty of proper/trustworthy case processing here implies that one must, as a first step, assess whether a specific legal provision is suited for full automation. One should therefore assess how the specific legal provision is worded, for example, how straightforward and easy it is to interpret the provision, whether its terms are clear and objective, and whether those terms in the provision that are vague, open-ended or otherwise appear to be discretionary, can be operationalized (Norwegian Ministry of Labour and Social Affairs 2019). Where discretionary terms cannot be operationalized, the preparatory works emphasize, the principle that case processing must be proper/trustworthy does not permit the use of fully automated legal decisions (Norwegian Ministry of Labour and Social Affairs 2019). This view is also echoed in the preparatory works to the amendments to, respectively, the Educational Financial Support Act (Norwegian Ministry of Education 2021), the Regulations on Citizenship (Norwegian Ministry of Education 2019) and the Regulations on Foreigners’ Access to Norway (Norwegian Ministry of Justice and Public Security 2019), although the wording of the amendment to the latter two regulations does not specifically mention the principle that case processing must be proper/trustworthy. The duty to refrain from fully automating legal decisions where discretionary terms cannot be operationalized could perhaps be seen as an illustration of what Stub (2011) refers to as the limited sphere of application of the unwritten duty of proper case processing.
To date, in Norway, the process of transforming legal provisions in statutes and regulations into algorithms and programming code that enable fully automated legal decisions has had one common characteristic. As stated in Norway’s national strategy for the use of artificial intelligence (“AI”), “[a] feature common to all of the current automated case management systems is that they are rule-based. The regulations are programmed into the solution, making it possible to give reasons for the decisions made” (my emphasis) (Norwegian Ministry of Local Government and Modernisation 2020). Current established techniques employ technology that is based on fixed algorithms and well-defined use of databases (Schartum 2020). The development of automated legal decision-making systems necessitates that legal sources are transformed and embedded as legal rules in the programming code (Schartum 2020). In the case of individual administrative decisions, the typical sources of law are statutes and regulations that determine how government agencies must proceed to reach valid legal decisions in individual cases. These statutes and regulations typically have a substantive scope that is limited to the particular branch of government that is competent to issue the individual decision (e.g., regulating the granting of social benefits, driving licenses or citizenship) (Schartum 2020). Furthermore, the process of transformation of specific substantive legislation must also take account of legislation that has a wide or general substantive scope and that applies, as it were, across the board as a type of framework legislation. In Norwegian public administration, such framework or background laws include the PAA and the GDPR, the latter applying where personal data are processed, which is always the case for individual decisions. A detailed analysis of the process of transformation of legislation into legal rules is beyond the scope of this paper, this topic having been extensively examined by other scholars, foremost of whom, in Norway, is Schartum (1993, 2012, 2018, 2020); Schartum et al. (2017). Suffice it to highlight that the process of transformation involves the interpretation of the specific substantive legal provisions according to accepted legal methods and legal sources, resulting in a high number of legal rules which must then be formalized by means of a programming language. As Schartum (2020) emphasizes, “[d]erived rules must be precise and complete: computers only follow unambiguous rules, and there is no room for doubt or discretion.”
However, legal provisions in statutes and regulations often allow for the exercise of some measure of discretion in the process of interpretation. This typically requires that the person interpreting the legal provision must assess, weigh or otherwise carry out a balancing exercise between various factors that may be relevant to the matter at issue. Indeed, some scholars have questioned whether there are any norms which are totally free from the need to carry out such a balancing exercise. Eng illustrates this with the example of a seemingly clear-cut norm such as “Men over 18 years of age are obliged to do military service” (Eng 2007). As Eng shows, difficulties can still arise in interpreting words such as “men” (with respect to persons who have had gender reassignment), “over 18 years” (with respect to the starting point of this period), and “military service” (with respect to the type of services encompassed). For a norm such as this to be automated, the person interpreting the legal provision must map out all the possible interpretations of each term and condition in that norm. Following that, the results of interpretation must be transformed from natural language to programming code.
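To make this concrete, the following minimal sketch (a hypothetical illustration written in Python, not drawn from any actual Norwegian system or from Norwegian law) encodes Eng’s example norm only after each of the interpretive questions noted above has been resolved. The constants represent interpretive choices that a human must fix in advance before the rule can be formalized as code.

from datetime import date

# Hypothetical encoding of "Men over 18 years of age are obliged to do military service".
# Each constant below fixes one interpretive choice in advance; the choices are
# illustrative assumptions, not statements of Norwegian law.
LEGAL_GENDERS_COVERED = {"male"}        # assumed: legal gender as currently registered
AGE_THRESHOLD = 18                      # assumed: the obligation starts on the 18th birthday
SERVICE_TYPES = {"armed", "civilian"}   # assumed: which services count (used by further derived rules, omitted here)

def is_obliged_to_serve(legal_gender: str, birth_date: date, on_date: date) -> bool:
    """Apply the (interpreted) norm on a given date."""
    age = on_date.year - birth_date.year - (
        (on_date.month, on_date.day) < (birth_date.month, birth_date.day)
    )
    return legal_gender in LEGAL_GENDERS_COVERED and age >= AGE_THRESHOLD

print(is_obliged_to_serve("male", date(2003, 5, 1), date(2021, 10, 1)))  # True

Every branch of the function follows mechanically from the constants; the interpretive work happens before, not during, execution.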
The assessment made by a person (typically a judge, lawyer or case officer) during the process of interpreting a term or condition in a statute or regulation while applying it to a specific case must, of course, be distinguished from situations where the law permits an administrative agency to exercise administrative discretion. The former types of assessment must be made because a word or phrase is either vague or ambiguous. Ambiguity may be semantic (where a word may have a different meaning depending on the context) or syntactic (where the meaning depends on punctuation or sentence structure). These situations are different from those where a legal provision empowers an administrative agency to “freely” exercise administrative discretion, i.e., cases where it is up to the administrative agency that has competence over a particular matter to decide whether and/or to what extent it should apply a particular provision in a statute or regulation to a concrete case before it (Moen 2019). The manner in which a provision in a statute or regulation is worded indicates whether the provision calls for the exercise of administrative discretion or not. Provisions in legislation that permit the exercise of administrative discretion typically contain the verb “may” to signify that the application of the legal provision is in the administrative agency’s discretion. Of course, although an administrative agency would have a wide measure of discretion in such cases, its discretion will never be completely free or untethered because the underlying administrative law principles that safeguard the rule of law remain applicable.
The discretion alluded to in the words “discretionary terms” in the nine legislative amendments mentioned above and identified in Table 1 is not the exercise of discretionary powers by an administrative agency. The preparatory works state that the fact that a law or regulation requires a case-by-case assessment of whether a condition or requirement in a legal provision is fulfilled does not necessarily entail the exercise of administrative discretion (Norwegian Ministry of Labour and Social Affairs 2019). For example, the requirement in the Social Security Act for a case-by-case analysis to ensure that sickness benefit is paid only if the conditions to obtain such benefit have been fulfilled (i.e., there is a medical condition/illness) does not require any discretionary evaluation by the public sector employee handling the case. If the claim for sickness benefit is accompanied by a medical certificate with a valid diagnosis, the Department for Labour and Welfare will consider the condition fulfilled. The same applies in cases concerning claims for disability pensions pursuant to the Public Service Pension Fund Act. In each of these cases, the condition that requires a case-by-case evaluation (Is the applicant sick or disabled?) may be operationalized by means of automation (Norwegian Ministry of Labour and Social Affairs 2019). Thus, where all the potential outcomes or conditions of a legal requirement can be mapped out beforehand, the legal rule may be operationalized and hence automated. This is usually the case where a legal rule may be operationalized by means of a decision tree. A decision tree is a hierarchically structured, predictive classification model that maps observations about a particular unit of analysis to arrive at conclusions about its character (Russell and Norvig 2020). According to the abovementioned preparatory works to the welfare and pension legislation, if the outcome of the discretionary assessment of the statutory/regulatory condition would be indubitable if the assessment were made by a human being, the decision may be fully automated (Norwegian Ministry of Labour and Social Affairs 2019).
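The decision-tree logic described in the preparatory works can be illustrated with a minimal, hypothetical sketch in Python. The conditions and field names below are illustrative assumptions and do not reproduce the actual rules applied by the Norwegian welfare administration; the point is simply that every branch can be resolved from registered data, so no discretionary evaluation is needed.

def assess_sickness_benefit(claim: dict) -> str:
    # Each node checks one non-discretionary condition against registered data.
    if not claim.get("medical_certificate_attached"):
        return "reject: no medical certificate"
    if not claim.get("valid_diagnosis_code"):
        return "refer to case officer"  # cannot be resolved automatically
    if not claim.get("member_of_national_insurance_scheme"):
        return "reject: claimant not insured"
    return "grant sickness benefit"

print(assess_sickness_benefit({
    "medical_certificate_attached": True,
    "valid_diagnosis_code": True,
    "member_of_national_insurance_scheme": True,
}))  # grant sickness benefit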
Where open-ended or discretionary terms such as “reasonably”, “adequately”, “justifiably”, “suitably” or “likely” are used in legislation to give the administrative agency a certain leeway when assessing each individual case, automation becomes very difficult if not impossible (Schartum 2018). If, in each case-by-case assessment, it is possible to keep adducing new arguments, factors or elements, full automation of the legal provision is not possible. Where, however, the assessment of, for example, what is “reasonable” or “justifiable” is limited to a very specific area or context in such a way that it is possible to map out all the possible criteria that give an indication of what “reasonable” or “justifiable” means, automation of such an assessment may be possible (Schartum et al. 2017; Schartum 2018). Though full automation may be possible by removing discretion and instead introducing a limited number of firmer conditions, the “lawful application of an Act or regulation may imply a duty to exercise discretion” (Schartum 2020). The question here is whether it is at all lawful to replace the discretionary assessment with a finite list of firm conditions. A potential way forward would be to carry out the discretionary assessment outside the automated system and arrange for inputs expressing the results of discretionary assessments carried out by the case officer (for example: reasonable/adequate/justifiable/suitable/likely? Y/N) (Schartum 2020). Such systems would not be fully automated and would thus fall outside the scope of GDPR Article 22. Furthermore, the pace of processing would be slowed down and efficiency somewhat reduced.
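A minimal sketch, assuming a purely hypothetical benefit scheme, of the arrangement Schartum describes: the objective conditions are checked automatically, while the discretionary term is resolved by a case officer whose yes/no conclusion is fed into the system as an input. Because a human actively assesses part of the decision, the processing is no longer based solely on automated processing.

def decide(application: dict, officer_finds_reasonable: bool) -> str:
    # Automated, objective conditions checked against registered data (illustrative only).
    if application["age"] < 18:
        return "reject: applicant under 18"
    if not application["application_complete"]:
        return "reject: incomplete application"
    # Discretionary condition: the conclusion is supplied by the human case officer.
    if not officer_finds_reasonable:
        return "reject: the condition of reasonableness is not met"
    return "grant"

# The case officer records his or her discretionary assessment (Y/N) as an input to the system.
print(decide({"age": 34, "application_complete": True}, officer_finds_reasonable=True))  # grant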
Decisional systems that encode and apply static legislative requirements, employing decision trees, by their very nature preclude the use of machine learning techniques (Bygrave 2020b). Machine learning techniques are often classified into three categories, depending on whether they employ supervised learning, reinforcement learning or unsupervised learning. Both supervised learning and reinforcement learning involve the training of a system by way of examples. In supervised learning, the machine learns to recognize and classify new cases in a manner that is patterned on examples of correct answers fed into it. In reinforcement learning, the system learns from the outcomes of its own actions through rewards (e.g., points awarded) or penalties (e.g., points deducted) that are linked to the outcomes of such actions. In unsupervised learning, the system is given data and learns without receiving external instructions, either in advance or as feedback (Sartor and Lagioia 2020). One challenge with learning algorithms that predict outcomes on the basis of previous cases is that the algorithmic model that develops will reflect the attitudes of the decision-makers whose decisions are in the training set, i.e., both their virtues and their biases (Sartor and Lagioia 2020). Unless compensating routines are introduced, machine learning will thus only reinforce previous practice, leading to the development of echo chambers (Schartum 2020). As Bygrave (2020b) notes, machine learning “needs discretionary or logical ‘space’ in which to develop and will thus be shut out of decisional systems where there is no such facility.”
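The echo-chamber risk can be illustrated with a small supervised-learning sketch on synthetic data (the features and figures are invented for illustration and do not come from any real case set): a classifier trained on historical case-officer decisions simply learns to repeat the historical pattern, including any bias embedded in it.

from sklearn.tree import DecisionTreeClassifier

# Synthetic training data: each row is [applicant_age, annual_income_in_1000s];
# label 1 = application granted, 0 = refused. If past case officers systematically
# refused low-income applicants, that pattern becomes the model's rule.
X_train = [[25, 200], [40, 550], [33, 180], [58, 620], [29, 150], [47, 700]]
y_train = [0, 1, 0, 1, 0, 1]

model = DecisionTreeClassifier(random_state=0).fit(X_train, y_train)

# A new applicant resembling those previously refused is refused again:
print(model.predict([[36, 190]]))  # [0] -- the historical refusal pattern is echoed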
The future use of machine learning in public administrative systems was only superficially touched upon by the Law Commission on the Public Administration Act (2019). The Law Commission stated that machine learning provides new possibilities but also raises questions about transparency and control/verification. It referred with approval to a Swedish law commission report (Swedish Law Commission on the Benefit Crime Act 2018) that proposed that administrative agencies that consider using AI with machine learning algorithms must in advance draw up procedures whereby third parties may perform audits, supervision, certification or other types of control of any algorithms that will be used. The Norwegian Law Commission’s comments were broad-based and did not touch upon whether, or the extent to which, AI or machine learning ought to be permitted when administrative agencies issue individual decisions. However, the preparatory works to all the nine amendments (Norwegian Ministry of Education 2019, 2021; Norwegian Ministry of Justice and Public Security 2019; Norwegian Ministry of Labour and Social Affairs 2019) did posit that in the future, it may become possible to fully automate more complex assessments by using AI and machine learning. If that were to happen, according to the preparatory works, there would be need for more specific and concrete regulation as well as a comprehensive review and impact assessment. It does, however, seem somewhat ironic that, in its efforts to circumscribe what technology ought to be used in fully automated decisions, the legislator opted for open-ended phrases such as “cannot be based on discretionary statutory terms” and “indubitable decision” rather than utilizing plain language which is more automation-friendly.
Though it does not appear that Norwegian administrative agencies, as of October 2021, are using AI and machine learning in legal decision-making systems that are fully automated, at least in the sense of GDPR Article 22, there is growing interest in the use of AI and machine learning in some parts or stages of case processing. The Labour and Welfare Agency (known as “NAV” in Norway) has initiated a project to explore the use of AI to predict how long persons on sick leave are likely to remain on such leave (NAV Sandbox Project 2021). The Norwegian State Educational Loan Fund has also used machine learning to select candidates for so-called “residential verification”, that is, to verify the residential address of students registered as living away from home by checking their address against that of their parents (Norwegian Ministry of Local Government and Modernisation 2019). The use of AI and machine learning is thus not as remote as the Law Commission on the Public Administration Act may have thought.

4.2.2. The Processing Must Be Compatible with the Right to Protection of Personal Data

Seven amendments identified in Table 1 state that the processing must be compatible with the right to protection of personal data. Although the preparatory works to the amendments do not discuss this further, “the protection of personal integrity” is recognized in Article 102 of the Constitution of Norway as a fundamental human right, together with the right to private and family life. The protection of personal data is also enshrined in Article 8 of the Charter of Fundamental Rights of the European Union (“EU Charter”). According to Article 8 of the EU Charter, personal data must be processed fairly for specified purposes and pursuant to a legitimate basis laid down by law. It also states that “[e]veryone has the right of access to data which has been collected concerning him or her and the right to have it rectified.” These are key data subject rights that are elaborated further in the GDPR. Individuals should thus have the possibility to find out what personal data is processed by fully automated legal decision-making tools and to demand rectification of erroneous data. The preparatory works to the amendments to the NAV Act, Public Service Pension Fund Act, Employed Seamen’s Pension Scheme Act, Fishermen’s Pension Insurance Act, Nurses’ Pension Scheme Act, and Pension Scheme for Persons Accompanying Foreign Service Employees Act state that where the data that is fed into the algorithm contains errors, there must be a possibility to manually correct the decision (Norwegian Ministry of Labour and Social Affairs 2019). Although the EU Charter is not directly binding on Norway as it is not a member state of the EU, the GDPR refers to the importance of the EU Charter in several of its Recitals, and the EU Charter is thus an important source for interpreting the GDPR, which is law in Norway, c.f. Personal Data Act Section 1.
Moreover, the right to data protection, as Hijmans (2020) points out, is often a prerequisite for the effective exercise of other fundamental rights such as the freedom of expression. To this one may add the right to freedom from discrimination, c.f. Article 98 of the Constitution. The interplay of the right to protection of personal data with the other fundamental rights is highlighted in GDPR Article 1(2), which provides that a key objective of the regulation is to protect the “fundamental rights and freedoms of natural persons and in particular the right to the protection of personal data”.

4.2.3. The Notion of a Decision Which Is “Only to a Little Degree Invasive”

The two sector-specific proposals for amendment identified in Table 3, that is, the proposed amendments to the Patient Medical Records Act and to the National Insurance Act, were published for public consultation in July 2021 with a deadline of 15 October 2021 for the submission of responses. Both amendments were published in one and the same preparatory works document (Norwegian Ministry of Health and Care Services 2021). Both proposals state that fully automated legal decisions may be made when the decision has a low impact on the individual. More precisely, the Norwegian term used is “lite inngripende”, which can be freely translated as “slightly invasive/intrusive” or “to a little degree invasive/intrusive”.
This term is further discussed in the preparatory works as referring to individual decisions that have limited consequences for the individual, such as, for example, decisions regarding small amounts (Norwegian Ministry of Health and Care Services 2021). What is slightly invasive must be determined on a case-by-case basis, taking into account the extent of the personal data processed, the sensitivity of the data and whether there is reason to believe that the personal data is correct. Furthermore, according to the preparatory works, the conditions for the decision must also be so clear and objective that it is easy to determine whether they are fulfilled or not. Such a decision, according to the preparatory works, will typically not contain elements where there is a need for assessment or the exercise of judgment/discretion. Examples of decisions that are only slightly invasive are decisions concerning the settlement of payment for patient journeys (e.g., to and from a hospital), the issue of a European Health Insurance Card, and the automatic issue of a card exempting an individual from further payment for health services once a particular threshold has been reached. Although these examples can be considered as having a low impact on the individual, the phrase “slightly invasive/intrusive”, when applied to other types of processing, may give rise to difficulties of interpretation. Where should the line be drawn between “slightly” invasive/intrusive decisions and those which are invasive or intrusive? That this line-drawing is more than an academic exercise is evidenced by the fact that the preparatory works state that decisions that are “more complicated and invasive/intrusive” may be permitted if regulations have been issued that allow such types of decisions. These types of decisions, as the preparatory works emphasize (Norwegian Ministry of Health and Care Services 2021), require more precise rules and more “customized” due process guarantees such as, for example, the requirement to carry out random sampling and other measures to ensure quality improvement. Other examples of what may be regulated in secondary legislation in order to limit the risk of error, and that are mentioned in the preparatory works, are the need for frequent audits of the algorithms used, as well as regular review of the correctness and relevance of the automated decisions.
The term “slightly invasive/intrusive” appears to be inspired by wording used by the Law Commission on the Public Administration Act in its proposal for a new Public Administration Act. The Law Commission was split in its views on the use of fully automated legal decisions by administrative agencies. The majority proposed a new section 11 that would empower administrative agencies to make fully automated legal decisions where the decision is only slightly invasive/intrusive. For more invasive/intrusive decisions, the majority held, specific regulations providing a legal basis for such decisions must be issued. The minority went even further and, in what appears to be a circular argument, held it sufficient for section 11 to state that administrative agencies may issue fully automated legal decisions as long as the processing is necessary to exercise official authority or to comply with a legal obligation.
The use of vague and open-ended terms, such as “slightly invasive/intrusive decisions” and “more invasive/intrusive decisions” in legislation, is unfortunate. It also fails to meet the requirement in GDPR Recital 41 that a legal basis (as required in GDPR Article 22(2)(b)) must be “clear and precise and its application should be foreseeable by persons subject to it”. Ironically, instead of a clear legal rule that is itself suited for automation and operationalization, the proposed amendments provide a rule that is rather open-ended and little suited to automation.

4.3. Measures

4.3.1. The Right to Obtain Human Review

As stated earlier in this paper, thirteen of the amended laws and regulations (Table 1) explicitly state that a party in respect of whom a decision has been made has a right to obtain human review of the fully automated decision. Besides being referred to in GDPR Recital 71, the right to human intervention is one of the minimum rights that must be granted to data subjects who either explicitly consent to a fully automated legal decision or in respect of whom such a decision is necessary for contract formation or performance purposes, c.f. GDPR Article 22(3). Though a right to human review is absent from GDPR Article 22(2)(b), as stated in Section 3.2 of this paper, the Norwegian PAA provides a right of appeal from individual administrative decisions, c.f. Section 28. According to the preparatory works, by codifying the right to human review in each of these sector-specific laws and regulations, the legislator is ensuring that the right to human intervention will also apply to cases where there is no automatic right of appeal under the PAA (for example, in respect of interlocutory/provisional decisions). Another positive effect of this amendment is that it clarifies that the review must be made by a human being. This may today seem obvious: running the same facts through the same rule-based algorithm will give the same result and would thus be futile. Were the public sector, however, to move from the current practice of using decision tree-based algorithms to the uncharted territory of machine learning techniques, one cannot completely rule out that a different outcome may follow from running the same facts through a different machine learning algorithm at the appellate stage.
The amendment to the Integration Regulations permits the full automation of tests in the Norwegian language and in Norwegian social studies where there is only one possible correct answer to each question in such tests, for example, multiple-choice tests. When these types of tests are held, an appeal is only permitted on formal errors, c.f. Section 64, c.f. Section 42 of the Integration Regulations.
Unlike the abovementioned fourteen laws and regulations, the two proposed amendments which are still to be debated in the Storting, i.e., the proposed amendment to the Patient Medical Records Act and to the National Insurance Act, do not contain specific wording that clarifies that an individual has a right to human review. However, that such a right exists is mentioned in the preparatory works to the proposed amendments (Norwegian Ministry of Health and Care Services 2021).

4.3.2. Quality Assurance and Audits

The system that processes fully automated tests pursuant to the Integration Regulations, discussed in Section 4.3.1 above, must be subject to satisfactory quality assurance to ensure the correct result of the processing.
On similar lines, the amendments to the Regulations on Citizenship and to the Regulations on Foreigners’ Access to Norway state that the use of fully automated legal decisions must be subject to frequent manual checks.
Although these regulations are the only ones which specifically mention quality assurance and manual checks as a specific measure, most of the preparatory works to the other amended legislation also refer to the need for frequent checks and audits as an additional measure (Norwegian Ministry of Labour and Social Affairs 2019).
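One possible, purely illustrative form that such a measure could take in practice is the random selection of a share of automated decisions for manual review; the sampling fraction and the function below are hypothetical and are not taken from any of the regulations or preparatory works:

```python
import random

def sample_for_manual_review(decision_ids, fraction=0.05, seed=None):
    """Draw a random subset of automated decisions to be re-checked by a
    human case officer, as one possible form of 'frequent manual checks'."""
    rng = random.Random(seed)
    k = max(1, int(len(decision_ids) * fraction))
    return rng.sample(decision_ids, k)

# Example: 2,000 automated decisions, 5% drawn for manual audit.
audited = sample_for_manual_review(list(range(2000)), fraction=0.05, seed=42)
print(len(audited))  # 100
```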

4.3.3. Time: Applicability Only in Extraordinary Situations

The amendment to the National Insurance Act (see Table 2) is an emergency provision that was passed in April 2020 as a result of the outbreak of the COVID-19 pandemic. It allows the issue of fully automated legal decisions upon an outbreak, or risk of an outbreak, of an infectious disease that endangers public health or safety. The amendment is qualified and applies when either of two alternative conditions is present: a period of emergency and/or a large case load. As to the former, the amendment only applies during an outbreak or when the risk of an outbreak is present. As to the latter, the amendment also applies when the Labour and Welfare Agency has an unusually high case load or long processing periods because of the outbreak or risk of outbreak.

4.3.4. Data Protection Principles

The proposed amendment to the Patient Medical Records Act (Table 3) states that health data may be processed without the patient’s consent. However, the amendment continues, the degree of personal identification must not be greater than is necessary for the purpose in question, and information about diagnosis or illness may only be processed when it is necessary to achieve the purpose of the processing. These two qualifications are a manifestation of the data minimisation principle (c.f. GDPR Article 5(1)(c)) and the purpose limitation principle (c.f. GDPR Article 5(1)(b)) and are likely meant as a reminder to entities that provide health care to take these key data protection principles into account.

4.4. Right to an Explanation of the Decision

According to GDPR Recital 71, a data subject should have a right to obtain an explanation of a decision that is based solely on automated processing. There has been much discussion in data protection legal literature on the extent to which this is a legally enforceable right since it is only found in a recital and thus does not have the direct force of law as an article does. This discussion becomes rather redundant in the case of individual decisions because, as is the case in other democratic jurisdictions, public sector bodies and agencies in Norway are obliged by public administrative law to provide grounds for their individual decisions. The contents of the grounds are specified in PAA Section 25 and must include: (i) a reference to the rules on which the individual decision is based, (ii) the factual circumstances upon which the administrative decision is based, and (iii) where the decision involves the use of administrative discretion, the chief considerations that were decisive for the exercise of discretionary powers.
Coupled with the requirements of PAA Section 25 is the right of the data subject to obtain access to personal data concerning him or her that is processed by the data controller and, where there is fully automated decision-making, the right to “meaningful information about the logic involved, as well as the significance and the envisaged consequences of such processing for the data subject”, c.f. GDPR Article 15(1)(h).
As Schartum (2020) explains: “Fully automated decisions must logically be based on two main premises: first, data representing every relevant fact of the case must exist in machine-readable form and be digitally accessible in the appropriate technical format; second, it must be possible to process all data of the case by means of computer programs which contain correct and complete representation of all relevant applicable legal rules.” Opaque systems that defy human interpretability and that do not provide an explanation for how a decision was reached, also known as “black boxes”, would run counter to the right to obtain the grounds for an individual decision.
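Schartum’s two premises, and the contrast with opaque “black boxes”, can be illustrated by a minimal sketch (the benefit rule, threshold and section number are entirely hypothetical) in which a rule-based system produces not only the outcome but also the grounds that PAA Section 25 would require:

```python
from dataclasses import dataclass

@dataclass
class Decision:
    outcome: str
    legal_basis: str   # (i) the rules on which the decision is based
    facts: dict        # (ii) the factual circumstances relied upon

def decide_benefit(facts: dict) -> Decision:
    # Hypothetical rule: the benefit is granted where annual income is below
    # a fixed threshold. The facts must exist in machine-readable form
    # (first premise) and the rule must be fully representable in code
    # (second premise).
    eligible = facts["annual_income_nok"] < 250_000
    return Decision(
        outcome="grant" if eligible else "refuse",
        legal_basis="Hypothetical Benefits Act s. 5(1)",
        facts={"annual_income_nok": facts["annual_income_nok"]},
    )

print(decide_benefit({"annual_income_nok": 180_000}))
```

Because every decisive fact and rule reference is retained, the grounds can be generated mechanically; a model whose internal logic cannot be inspected offers no such guarantee.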
The Law Commission on the Archives Act proposed a duty to document automated application of the law (the Law Commission’s proposed Section 10) and a duty to document decisions (the Law Commission’s proposed Section 11). The former was a duty to document, inter alia, the data types used, the sources used for these data types and the processing rules derived from the legal rules that are determinative for the decision. The latter was a duty to document, inter alia, the extent and manner in which a decision is based on automated application of the law, the statutes, regulations and instructions/orders that were determinative for the decision, and the factual circumstances that affected the outcome. The proposed wording was not taken forward by the drafters of the bill proposing a new Archives Act that was published for consultation in October 2021. The drafters held that such requirements are more suited for inclusion in a new Public Administration Act than in a law regulating archives. Unfortunately, as shown in Section 4.2.3 of this paper, the text proposed by the Law Commission on the Public Administration Act does not contain an assessment of this issue comparable to that made by the Law Commission on the Archives Act. It is hoped that any new Public Administration Act will include a duty to document automated application of the law and a duty to document decisions along lines similar to those proposed by the Law Commission on the Archives Act.
Another potential source of inspiration for the drafters of the new Public Administration Act would be Article R311-3-1-2 of the French Code of Administrative Procedure, which has parallels with the Law Commission on the Archives Act’s now-defunct proposals. According to Article R311-3-1-2, where an individual decision is based on algorithmic processing, the administrative agency must provide the party to whom such a decision is directed with the following information: (i) the degree and mode of contribution of the algorithmic processing to the decision-making; (ii) the data processed and their source; (iii) the processing parameters and, where appropriate, their weighting, as applied to the situation of the person concerned; and (iv) the operations carried out by the processing. An advantage of the wording in both Article R311-3-1-2 and the text proposed by the Law Commission on the Archives Act is that the information/documentation duty would apply not just when decisions are fully automated but also where automation is partial, and thus the sphere of application would be wider than that of GDPR Article 22.
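The four information items of Article R311-3-1-2 lend themselves naturally to a structured, machine-readable record. The following sketch shows one possible shape of such a disclosure; the class and field names are illustrative choices by way of example only and are not prescribed by the French provision or by any Norwegian proposal:

```python
from dataclasses import dataclass, field

@dataclass
class AlgorithmicDecisionDisclosure:
    contribution: str               # (i) degree and mode of the algorithm's contribution
    data_and_sources: dict          # (ii) the data processed and their sources
    parameters_and_weights: dict    # (iii) parameters and, where relevant, their weighting
    operations: list = field(default_factory=list)  # (iv) operations carried out

# Example with hypothetical content.
disclosure = AlgorithmicDecisionDisclosure(
    contribution="decision fully automated",
    data_and_sources={"annual_income_nok": "tax register"},
    parameters_and_weights={"income_threshold_nok": 250_000},
    operations=["compare income against threshold", "issue decision"],
)
print(disclosure)
```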

5. Conclusions

The activity in the last five years has shown that the requirements of GDPR Article 22 are being addressed on a piecemeal basis in Norway. Though the legislation and the respective preparatory works identify some similar limitations on the type of legal rules that may be fully automated, and some similar suitable measures, there are also marked differences between them. This piecemeal approach may in itself lead to fragmentation and complexity. A simpler, clearer and more foreseeable path forward would be one similar to the approach taken in the French Code of Administrative Procedure vis-à-vis the use of algorithms in individual decisions, or to that proposed by the Law Commission on the Archives Act with regard to cases where there is automated application of the law. It is hoped that the new Public Administration Act will address these issues and take a more proactive and comprehensive approach that lays down requirements and measures to be applied across the board, i.e., by all administrative agencies. This could then be supplemented by clearer and simpler amendments to the respective sector-specific statutes or regulations to provide a legal basis that facilitates fully automated decision-making in those cases where the legislator deems this necessary. This is essential if digitalisation of the public sector is to remain truly transparent, inclusive and trustworthy.

Funding

This research received no external funding.

Institutional Review Board Statement

Not applicable.

Informed Consent Statement

Not applicable.

Acknowledgments

The author is grateful to the research group in public administration and governance at the Oslo Business School at Oslo Metropolitan University for incisive comments to an earlier draft of this paper.

Conflicts of Interest

The author declares no conflict of interest.

References

  1. Article 29 Data Protection Working Party. 2018. Guidelines on Automated Individual Decision-Making and Profiling for the Purposes of Regulation 2016/679. WP251rev. 01. Adopted 3 October 2017, Last Revised and Adopted on 6 February 2018. Available online: https://ec.europa.eu/newsroom/article29/items/612053 (accessed on 9 October 2021).
  2. Bygrave, Lee A. 2020a. Article 22. In The EU General Data Protection Regulation (GDPR)—A Commentary. Edited by Christopher Kuner, Lee A. Bygrave and Christopher Docksey. Oxford: Oxford University Press, pp. 522–42. [Google Scholar]
  3. Bygrave, Lee A. 2020b. Machine Learning, Cognitive Sovereignty and Data Protection Rights with Respect to Automated Decisions, Version 2.0. In The Cambridge Handbook of Information Technology, Life Sciences and Human Rights (Forthcoming). Edited by Marcello I., Oreste P., Laura L., Elisa S. and Roberto A. Cambridge: Cambridge University Press, Also available as University of Oslo Faculty of Law Research Paper No. 2020–35. Available online: https://ssrn.com/abstract=3721118 (accessed on 22 October 2021).
  4. Council of Europe. 2018. The Administration and You: Principles of Administrative Law Concerning Relations between Individuals and Public Authorities. Available online: https://rm.coe.int/the-administration-and-you/16808eb47e (accessed on 10 August 2021).
  5. Eckhoff, Torstein, and Eivind Smith. 2018. Forvaltningsrett, 11th ed. Oslo: Universitetsforlaget. [Google Scholar]
  6. Edwards, Lilian, and Michael Veale. 2017. Enslaving the Algorithm: From a “Right to an Explanation” to a “Right to Better Decisions”? IEEE Security & Privacy 16: 46–54. [Google Scholar]
  7. Edwards, Lilian, and Michael Veale. 2018. Slave to the Algorithm? Why a ‘Right to an explanation’ is probably not the remedy you are looking for’. Duke Law & Technology Review 18: 18–84. [Google Scholar]
  8. Eng, Svein. 2007. Rettsfilosofi. Oslo: Universitetsforlaget. [Google Scholar]
  9. European Data Protection Board. 2018. Endorsement 1/2018. Available online: https://edpb.europa.eu/sites/default/files/files/news/endorsement_of_wp29_documents_en_0.pdf (accessed on 8 October 2021).
  10. Frihagen, Arvid, and Ørnulf Rasmussen. 2010. Frihagens forvaltningsrett: Bind 1: Innledning til forvaltningsretten, dokumentoffentlighet, informasjonsbehandling, inhabilitet, saksbehandling, klage og omgjøring, 2nd ed. Bergen: Fagbokforlaget. [Google Scholar]
  11. Graver, Hans Petter. 2019. Alminnelig forvaltningsrett, 5th ed. Oslo: Universitetsforlaget. [Google Scholar]
  12. Hijmans, Hielke. 2020. Article 1. In The EU General Data Protection Regulation (GDPR)—A Commentary. Edited by Christopher Kuner, Lee A. Bygrave and Christopher Docksey. Oxford: Oxford University Press, pp. 48–59. [Google Scholar]
  13. Hildonen, Benita Haftorn, and Siri Gulstuen. 2012. Kartlegging av automatiserte avgjørelser i offentlig forvaltning [English: Mapping of Automated Decisions in Public Administration]. A report commissioned by the Norwegian Data Protection Authority. Available online: https://www.uio.no/studier/emner/jus/afin/FINF4001/h12/undervisningsmateriale/automatisert_forvaltning.pdf (accessed on 15 August 2021).
  14. Kaminski, Margot E. 2019. The Right To Explanation, Explained. Berkeley Technology Law Journal 34: 189–218. [Google Scholar] [CrossRef] [Green Version]
  15. Law Commission on the Public Administration Act. 2019. NOU 2019:5 Ny forvaltningslov. (English: A New Public Administration Act). Available online: https://www.regjeringen.no/no/dokumenter/nou-2019-5/id2632006/ (accessed on 16 October 2021).
  16. Malgieri, Gianclaudio. 2019. Automated decision-making in the EU Member States: The right to explanation and other “suitable safeguards” in the national legislation. Computer Law & Security Review 35: 105327. [Google Scholar]
  17. Mendoza, Isak, and Lee A. Bygrave. 2017. The Right Not to Be Subject to Automated Decisions Based on Profiling. In EU Internet Law: Regulation and Enforcement. Edited by Tatiana-Eleni Synodinou, Philippe Jougleux, Christiana Markou and Thalia Prastitou. Cham: Springer, pp. 77–98. [Google Scholar]
  18. Moen, Olav Haugen. 2019. Forvaltningsskjønn og domstolskontroll [Administrative Discretion and Judicial Review]. Oslo: Gyldendal. [Google Scholar]
  19. NAV—Norwegian Labour and Welfare Agency’s Sandbox Project. 2021. Prosjektplan: Prediksjon av sykefravær for mer effektiv oppfølging [English: Project Plan: Prediction of Sick Leave for More Effective Follow-Up]. Available online: https://www.datatilsynet.no/contentassets/ea0683afe9d04ff1a9e87be9747380c9/prosjektplan-for-nav---sladdet.pdf (accessed on 24 October 2021).
  20. Norwegian Ministry of Education. 2019. Høringsnotat—Forslag til endringer i statsborgerloven og statsborgerforskriften—Behandling av personopplysninger og automatiserte avgjørelser [English: Consultation Paper: Amendments to the Regulations on Citizenship—Processing of Personal Data, Use of Automated Decisions, etc.]. Available online: https://www.regjeringen.no/contentassets/ec5f3f7d0e804ebd9aaefceac4342b02/horingsnotat-forslag-til-endringer-i-statsborgerloven-og-statsborgerforskriften-behandling-av-personopplysninger-og-automatiserte-avgjorelser.pdf (accessed on 10 August 2021).
  21. Norwegian Ministry of Education. 2021. Prop. 111 L (2020–2021) Proposisjon til Stortinget Endringer i universitets- og høyskoleloven, utdanningsstøtteloven, fagskoleloven og yrkeskvalifikasjonsloven mv. [English: Bill 111 L (2020–2021) Proposal to amend the Universities and University Colleges Act, Educational Financial Support Act, Higher Vocational Education Act and Vocational Qualifications Act.] Document 03/2021. Available online: https://www.regjeringen.no/no/dokumenter/prop.-111-l-20202021/id2840742/ (accessed on 10 August 2021).
  22. Norwegian Ministry of Finance. 2018. Høringsnotat—forslag til endringer i reglene om Skatteetatens informasjonsbehandling [English: Consultation paper—Amendments to the rules regarding the processing of information by the Norwegian Tax Administration]. Available online: https://www.regjeringen.no/no/dokumenter/hoyring---forslag-om-endringar-i-reglane-om-informasjonshandsaminga-i-skatteetaten/id2594901/ (accessed on 15 October 2021).
  23. Norwegian Ministry of Finance. 2021. Prop. 237L (2020–2021) Proposisjon til Stortinget—Lov om inn- og utførsel av varer (vareførselsloven) og lov om tollavgift (tollavgiftsloven) [Bill 237L (2020–2021) Proposal for a New Carriage of Goods Act and New Customs Act]. Available online: https://www.regjeringen.no/no/dokumenter/prop.-237-l-20202021/id2871469/ (accessed on 15 October 2021).
  24. Norwegian Ministry of Foreign Affairs. 2019. White Paper: Digital transformasjon og utviklingspolitikken (Meld.St.11 (2019–2020)) [English: Digital Transformation and Development Policy]. Available online: https://www.regjeringen.no/no/dokumenter/meld.-st.-11-20192020/id2682394/?ch=1 (accessed on 1 August 2021).
  25. Norwegian Ministry of Government Administration, Reform and Church Affairs. 2012–2013. Meld.St.23 (2012–2013). Report to the Storting (White Paper) Digital Agenda for Norway: ICT for Growth and Value Creation. Available online: https://www.regjeringen.no/en/dokumenter/meld.-st.-23-2012-2013/id718084/ (accessed on 17 October 2021).
  26. Norwegian Ministry of Health and Care Services. 2021. Høringsnotat om endringer i pasientjournalloven mv.—nasjonal digital samhandling til beste for pasienter og brukere [English: Consultation Paper: Amendments to the Patient Medical Records Act etc.—National Digital Collaboration in the Best Interest of Patients and Users]. Available online: https://www.regjeringen.no/no/dokumenter/horing-om-endringer-i-pasientjournalloven-mv.-nasjonal-digital-samhandling-til-beste-for-pasienter-og-brukere/id2865518/ (accessed on 29 October 2021).
  27. Norwegian Ministry of Justice and Public Security. 2019. Høringsnotat: Endringer i utlendingsforskriften—Behandling av personopplysninger, bruk av automatiserte avgjørelser m.m. [English: Consultation paper: Amendments to the Regulations on Foreigners Access to Norway—Processing of Personal Data, Use of Automated Decisions, etc.]. Available online: https://www.regjeringen.no/contentassets/907ec51527ef4e8fa3c7f1ff2a92ffc3/horingsnotat.pdf (accessed on 10 August 2021).
  28. Norwegian Ministry of Labour and Social Affairs. 2019. Prop. 135L (2019–2020) Endringer i arbeids- og velferdsforvaltningsloven, sosialtjenesteloven, lov om Statens pensjonskasse og enkelte andre lover (behandling av personopplysninger) [English: Bill 135L (2019–2020) Proposal to amend the Labour and Welfare Act, Social Services Act, National Insurance Act, Public Service Pension Fund Act and other laws.]. Available online: https://www.regjeringen.no/no/dokumenter/prop.-135-l-20192020/id2714572/ (accessed on 10 August 2021).
  29. Norwegian Ministry of Local Government and Modernisation. 2015–2016. Meld.St.27 (2015–2016). Digital agenda for Norge: IKT for en enklere hverdag og økt produktivitet [English: Digital Agenda for Norway: ICT for a Simpler Everyday Life and Increased Productivity]. Available online: https://www.regjeringen.no/no/dokumenter/meld.-st.-27-20152016/id2483795/ (accessed on 17 October 2021).
  30. Norwegian Ministry of Local Government and Modernisation. 2019. Strategy paper: One digital public sector: Digital strategy for the public sector 2019–2025. Available online: https://www.regjeringen.no/en/dokumenter/one-digital-public-sector/id2653874/ (accessed on 17 October 2021).
  31. Norwegian Ministry of Local Government and Modernisation. 2020. National Strategy for Artificial Intelligence. Document 01/2020. Available online: https://www.regjeringen.no/contentassets/1febbbb2c4fd4b7d92c67ddd353b6ae8/en-gb/pdfs/ki-strategi_en.pdf (accessed on 16 October 2021).
  32. OECD. 2017. Digital Government Review of Norway: Boosting the Digital Transformation of the Public Sector. Paris: OECD Publishing. [Google Scholar] [CrossRef]
  33. Preparatory Works to the Tax Management Bill. 2015–2016. Preparatory Works to Bill 38L (2015–2016) proposing an Act on Tax Management 2015. (Available only in Norwegian: Prop. 38L (2015–2016) Lov om skatteforvaltning). [Google Scholar]
  34. Russell, Stuart, and Peter Norvig. 2020. Artificial Intelligence: A Modern Approach, 4th ed. Englewood Cliffs: Prentice Hall. [Google Scholar]
  35. Sartor, Giovanni, and Francesca Lagioia. 2020. The Impact of the General Data Protection Regulation (GDPR) on Artificial Intelligence. Brussels: European Parliamentary Research Service, Scientific Foresight Unit (STOA) PE 641.530, Available online: https://www.europarl.europa.eu/stoa/en/document/EPRS_STU(2020)641530 (accessed on 23 October 2021).
  36. Schartum, Dag Wiese. 1993. Rettssikkerhet og systemutvikling i offentlig forvaltning (Rule of Law and Systems Development in Public Administration). Oslo: Universitetsforlaget. [Google Scholar]
  37. Schartum, Dag Wiese. 2012. Fra lovtekst til programkode. Utvikling av rettslige beslutningssystemer i elektronisk forvaltning. Available online: https://www.uio.no/studier/emner/jus/afin/FINF4001/h11/Fra%20lovtekst%20til%20programkode%202011-1.pdf (accessed on 22 October 2021).
  38. Schartum, Dag Wiese. 2018. Digitalisering av offentlig forvaltning—Fra lovtekst til programkode. Oslo: Fagbokforlaget. [Google Scholar]
  39. Schartum, Dag Wiese. 2020. From Legal Sources to Programming Code: Automatic Individual Decisions in Public Administration and Computers under the Rule of Law. In The Cambridge Handbook of the Law of Algorithms. Edited by Woodrow Barfield. Cambridge: Cambridge University Press, pp. 301–335. [Google Scholar]
  40. Schartum, Dag Wiese, Arild Jansen, and Tommy Tranvik. 2017. Digital Forvaltning—En innføring. Oslo: Fagbokforlaget. [Google Scholar]
  41. Selbst, Andrew D, and Julia Powles. 2017. Meaningful Information and the Right to Explanation. International Data Privacy Law 7: 233. [Google Scholar] [CrossRef]
  42. Stub, Marius. 2011. Tilsynsforvaltningens kontrollvirksomhet: Undersøkelse og beslag I feltet mellom forvaltningsprosess og straffeprosess. Oslo: Universitetsforlaget. [Google Scholar]
  43. Swedish Law Commission on the Benefit Crime Act. 2018. Statens Offentliga Utredningar SOU 2018:14. Bidragsbrott och underrättelseskyldighet vid felaktiga utbetalningar från välfärdssystemen—en utvärdering [English: Benefit Violations and Notification Obligations in the Event of Incorrect Payments from the Welfare System—An Evaluation]. Available online: http://www.sou.gov.se/wp-content/uploads/2018/02/SOU-2018_14_till-webben.pdf (accessed on 24 October 2021).
  44. Tosoni, Luca. 2021. The right to object to automated individual decisions: Resolving the ambiguity of Article 22(1) of the General Data Protection Regulation. International Data Privacy Law 11: 145–62. [Google Scholar] [CrossRef]
  45. UN. 2020. E-Government Survey 2020: Digital Government in the Decade of Action for Sustainable Development. New York: UN, Available online: https://www.un.org/development/desa/publications/publication/2020-united-nations-e-government-survey (accessed on 10 August 2021).
  46. Wachter, Sandra, Brent Mittelstadt, and Luciano Floridi. 2017. Why a Right to Explanation of Automated Decision-Making Does Not Exist in the General Data Protection Regulation. International Data Privacy Law 7: 76–99. [Google Scholar] [CrossRef] [Green Version]
Table 1. Substantive amendments to primary and secondary legislation passed by parliament.

| Name of Statute/Regulations | Section and Context of the Legal Basis | Limitations | Measures |
| --- | --- | --- | --- |
| 1A—Universities and University Colleges Act (universitets- og høyskoleloven) | s. 4-15(4): case processing of administrative systems of educational institutions | - | Right to human review |
| 1B—Universities and University Colleges Act (universitets- og høyskoleloven) | s. 3-4: approval of foreign higher education | - | Right to human review |
| 2—Primary, lower and upper secondary schools Act (opplæringslova) | s. 3-4a: approval of foreign education/training; power to issue regulations | - | Right to human review |
| 3—Higher Vocational Education Act (fagskoleloven) | s. 7: approval of relevant foreign education | - | Right to human review |
| 4—Educational Financial Support Act (utdanningsstøtteloven) | new s. 20 (in effect from 1 January 2022): power of the State Educational Support Fund to make fully automated decisions; power to issue regulations | Non-discretionary unless indubitable; proper case processing; compatible with data protection right | Right to human review |
| 5—Regulations on Citizenship (statsborgerforskriften) | s. 13A-3: power of Directorate of Immigration to make fully automated decisions | Non-discretionary unless indubitable | Right to human review; frequent human checks |
| 6—Regulations on Foreigners’ Access to Norway (utlendingsforskriften) | s. 17-7c: power of Directorate of Immigration to make fully automated decisions | Non-discretionary unless indubitable | Right to human review; frequent manual checks |
| 7—Integration Regulations (integreringsforskriften) | s. 64: results of multiple-choice tests in Norwegian and social studies may be fully automated | - | Satisfactory quality assurance; right to appeal formal errors |
| 8—Driving License Regulations (førerkortforskriften) | s. 15-1: issue of driving license and temporary driving license | - | Right to human review |
| 9—Labour and Welfare Act, “NAV Act” (NAV-loven) | s. 4a: Labour and Welfare Agency may issue fully automated decisions; power to issue regulations | Non-discretionary unless indubitable; proper case processing; compatible with data protection right | Right to human review |
| 10—Public Service Pension Fund Act (Statens pensjonskasseloven) | s. 45b: the Public Service Pension Fund can issue fully automated decisions; power to issue regulations | Non-discretionary unless indubitable; proper case processing; compatible with data protection right | Right to human review |
| 11—Employed Seamen’s Pension Scheme Act (Pensjonsordning for arbeidstakere til sjøs) | s. 21a: the Employed Seamen’s Pension Scheme may issue fully automated decisions; power to issue regulations | Non-discretionary unless indubitable; proper case processing; compatible with data protection right | Right to human review |
| 12—Fishermen’s Pension Insurance Act (fiskerpensjonsloven) | s. 29a: the directorate administering the fund may issue fully automated decisions; power to issue regulations | Non-discretionary unless indubitable; proper case processing; compatible with data protection right | Right to human review |
| 13—Nurses’ Pension Scheme Act (sykepleierpensjonsloven) | s. 36: the Public Service Pension Fund may issue fully automated decisions; power to issue regulations | Non-discretionary unless indubitable; proper case processing; compatible with data protection right | Right to human review |
| 14—Pension Scheme for Persons Accompanying Foreign Service Employees Act (lov om ledsagerpensjon i utenrikstjenesten) | s. 3a: the Public Service Pension Fund may issue fully automated decisions; power to issue regulations | Non-discretionary unless indubitable; proper case processing; compatible with data protection right | Right to human review |
Table 2. Statutes which solely contain an enabling section.

| Name of Statute | Relevant Regulations Issued Y/N | Name of Regulations | Remarks |
| --- | --- | --- | --- |
| Citizenship Act (statsborgerloven) | Y | Regulations on Citizenship | See Table 1 |
| Act on Foreigners’ Access to Norway (utlendingsloven) | Y | Regulations on Foreigners’ Access to Norway | See Table 1 |
| Integration Act (integreringsloven) | Y | Integration Regulations | See Table 1 |
| Road Traffic Act (vegtrafikkloven) | Y | Driving License Regulations | See Table 1 |
| Animal Welfare Act (dyrevelferdsloven) | N | - | - |
| National Insurance Act (folketrygdeloven) | N | - | Power to issue regulations upon outbreak or risk of outbreak of infectious disease that endangers public health/safety. To be given effect as long as the outbreak or risk of outbreak is present or where the Labour and Welfare Agency has an unusually high case load or long processing periods because of the outbreak or risk thereof. |
| State Pensions Fund Act (Statens pensjonsfond-loven) | N | - | S. 10 enables regulations whereby data subjects’ rights may be restricted |
Table 3. Substantive proposals for amendments to legislation.

| Name of Proposed Statute/Regulations | Section and Context of the Legal Basis | Limitations | Measures/Remarks |
| --- | --- | --- | --- |
| 1—Proposed amendment to Patient Medical Records Act (pasientjournalloven) | s. 11: case processing, administration, settlement and implementation of healthcare; power to issue regulations | Fully automated decisions permitted where the decision is slightly invasive | The degree of identification shall not be greater than necessary for the purpose; processing of information on diagnosis or illness only when necessary to achieve the purpose of processing the information |
| 2—Proposed amendment to National Insurance Act (folketrygdeloven) | s. 21-11: decisions on health care benefits pursuant to chapter V of the Act; power to issue regulations | Fully automated decisions permitted where the decision is slightly invasive | - |
| 3—Bill proposing a new Carriage of Goods Act (forslag til ny vareførselsloven) | s. 7-15: customs authorities can issue fully automated decisions; power to issue regulations | Proper case processing | Right to human review |
| 4—Bill proposing a new Customs Duty Act (forslag til ny tollavgiftsloven) | s. 8-5: customs authorities can issue fully automated decisions; power to issue regulations | Proper case processing | Right to human review |
| 5—Bill proposing a new Archives Act (forslag til ny arkivloven) | - | - | Note: the Law Commission on the Archives Act’s proposal to introduce a duty to document automated application of the law was not taken up in the Bill. |
| 6—Law Commission on the Public Administration Act’s proposal for a new PAA (lovutvalgets forslag til ny forvaltningsloven) | s. 11 (majority view): power to issue regulations that an administrative agency may issue decisions based on fully automated case processing; decisions that are only slightly invasive may be made by fully automated processing without need for a legal basis in regulations. s. 12: duty to document the legal content in automated case processing systems; the documentation shall be made public, unless provided otherwise by law or special considerations require otherwise; power to issue regulations on system requirements and on publication | - | - |
Publisher’s Note: MDPI stays neutral with regard to jurisdictional claims in published maps and institutional affiliations.
