Article

According to Whose Morals? The Decision-Making Algorithms of Self-Driving Cars and the Limits of the Law

1 Department of Legal Theory, Széchenyi István University, Áldozat utca 12, H-9026 Győr, Hungary
2 Department of Road and Rail Vehicles, Audi Hungaria Faculty of Vehicle Engineering, Széchenyi István University, Egyetem tér 1, H-9026 Győr, Hungary
* Author to whom correspondence should be addressed.
Future Transp. 2026, 6(1), 5; https://doi.org/10.3390/futuretransp6010005
Submission received: 11 November 2025 / Revised: 16 December 2025 / Accepted: 19 December 2025 / Published: 27 December 2025

Abstract

The emergence of self-driving vehicles raises not only technological challenges but also profound moral and legal ones, especially when the decisions these vehicles make can affect human lives. The aim of this study is to examine the moral and legal dimensions of algorithmic decision-making and their codifiability, approaching the issue from the perspective of the classic trolley dilemma and the principle of double effect. Using a normative-analytical method, it explores the moral models behind decision-making algorithms, the possibilities and limitations of legal regulation, and the technological and ethical dilemmas of artificial intelligence development. One of the study's main theses is that, in the case of self-driving cars, the programming of moral decisions is not merely a theoretical problem but also a question requiring legal and social legitimacy. The analysis concludes that, given the nature of this borderline area between law and ethics, such dilemmas cannot always be avoided, and it is therefore necessary to develop a public, collective, principle-based normative framework that establishes the social acceptability of algorithmic decision-making.
Keywords: trolley problem; doctrine of double effect; autonomous vehicles; self-driving cars; law and morals; algorithmic decision-making

Share and Cite

MDPI and ACS Style

Pődör, L.; Lakatos, I. According to Whose Morals? The Decision-Making Algorithms of Self-Driving Cars and the Limits of the Law. Future Transp. 2026, 6, 5. https://doi.org/10.3390/futuretransp6010005

AMA Style

Pődör L, Lakatos I. According to Whose Morals? The Decision-Making Algorithms of Self-Driving Cars and the Limits of the Law. Future Transportation. 2026; 6(1):5. https://doi.org/10.3390/futuretransp6010005

Chicago/Turabian Style

Pődör, Lea, and István Lakatos. 2026. "According to Whose Morals? The Decision-Making Algorithms of Self-Driving Cars and the Limits of the Law" Future Transportation 6, no. 1: 5. https://doi.org/10.3390/futuretransp6010005

APA Style

Pődör, L., & Lakatos, I. (2026). According to Whose Morals? The Decision-Making Algorithms of Self-Driving Cars and the Limits of the Law. Future Transportation, 6(1), 5. https://doi.org/10.3390/futuretransp6010005
