
Is AI Driverless Technology Here, and Are Drivers Obsolete?

When drivers are no longer needed, who is responsible if an accident occurs? During the initial boom of AI, a popular joke circulated: AI can never replace accountants and lawyers because AI can't take the blame when things go wrong, but accountants and lawyers can. This joke now applies to the field of driverless technology, presenting an awkward yet unavoidable issue.


If you were the driverless AI, how would you choose? Some might say I'm overly concerned: driverless cars at the current stage, such as the Luobo Kuaipao (Apollo Go) robotaxis, don't reach speeds at which emergency braking would cause a loss of control, and they are equipped with safety officers, so how could accidents happen?


In reality, Germany's Ethics Commission, in its report on automated and connected driving, lists the scenarios that driverless technology must take into account. Simply put, in fully autonomous driving at the L4-L5 levels, programmers or the AI itself must decide whom to prioritize in a "dilemma" situation, that is, the "life-safety conflict" problem in driverless scenarios.

The definition of driverless technology is straightforward:

It refers to vehicles autonomously completing driving tasks without relying on a driver.

Academically, it means using sensors, control systems, artificial intelligence, and related technologies to enable a vehicle to perceive its environment, make decisions, and execute driving tasks without direct driver control. Internationally, the SAE's L0-L5 scale is used to classify driving automation: L0 is no automation, while L5 is full automation. In China, the Ministry of Industry and Information Technology likewise defines levels 0-4, with level 4 being full automation.
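To make the taxonomy easier to refer back to, here is a minimal sketch (not taken from any of the sources below) that encodes the SAE levels as a simple Python lookup table. The level names follow the commonly cited SAE J3016 summary, and the accountability rule is merely this article's simplification, not a legal definition.

```python
# A minimal sketch of the SAE J3016 levels as a lookup table.
# The human_driver_accountable() rule is this article's simplification,
# not a statement of law.

SAE_LEVELS = {
    0: "No Driving Automation",
    1: "Driver Assistance",
    2: "Partial Driving Automation",
    3: "Conditional Driving Automation",  # driver must take over when requested
    4: "High Driving Automation",         # no driver needed within a defined domain
    5: "Full Driving Automation",         # no driver needed anywhere
}

def human_driver_accountable(level: int) -> bool:
    """At L0-L3 a human driver (or safety officer) still holds control,
    so responsibility for an accident can be attributed to a person."""
    return level <= 3

for level, name in SAE_LEVELS.items():
    print(f"L{level}: {name} -> human accountable: {human_driver_accountable(level)}")
```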



Why discuss the definition of driverless technology?

To answer the question of "who is responsible in an accident," we first need to clarify that at SAE levels L0-L3 the issue does not really arise: the driver still retains control and can be held accountable if an accident occurs. The real challenge is the "collision" problem at the L4-L5 levels of driverless technology.

Currently, autonomous vehicles operate with safety officers on board (or at least remote safety officers). In China, because these officers have the ability to take control of and handle the vehicle, they bear the driver's responsibility if an accident occurs. Moreover, if the operator of an autonomous vehicle fails to fulfill its primary safety obligations, it can be held liable for a major accident.


However, real-life moral dilemmas are more complex and involve probabilities. Suppose an autonomous driving algorithm prioritizes protecting passengers and an emergency suddenly arises: braking hard might leave passenger A with a 40% chance of death and a 50% chance of serious injury, whereas not braking might leave a pedestrian outside the vehicle with a 99% chance of death and a 1% chance of serious injury.
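To make the trade-off concrete, here is a hypothetical sketch of the expected-harm arithmetic behind this example. The probabilities come from the scenario above, while the harm weights (death = 1.0, serious injury = 0.5) are arbitrary assumptions chosen purely for illustration, not a proposal for how a real algorithm should weigh lives.

```python
# Hypothetical expected-harm comparison for the dilemma described above.
# Harm weights are assumed (death = 1.0, serious injury = 0.5); a real
# system would face far harder valuation questions than this sketch.

DEATH_WEIGHT = 1.0
INJURY_WEIGHT = 0.5

def expected_harm(p_death: float, p_injury: float) -> float:
    """Probability-weighted harm for one affected person."""
    return p_death * DEATH_WEIGHT + p_injury * INJURY_WEIGHT

# Option 1: emergency braking -> the risk falls on passenger A
harm_if_braking = expected_harm(0.40, 0.50)      # 0.65

# Option 2: no braking -> the risk falls on the pedestrian
harm_if_not_braking = expected_harm(0.99, 0.01)  # 0.995

print(f"brake (passenger bears the risk):        {harm_if_braking:.3f}")
print(f"don't brake (pedestrian bears the risk): {harm_if_not_braking:.3f}")

# A purely utilitarian rule would brake, since 0.65 < 0.995; a
# "protect the passenger first" policy would not. That gap is the
# tension the next question points at.
```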


Should the algorithm still prioritize protecting passengers under such circumstances? This complex question requires further exploration.

References:

1. Automated Vehicles Act 2018, UK Public General Acts, 2018 c. 18.

2. The Automated and Electric Vehicles Act 2018 (Commencement No. 2) Regulations 2022, UK Statutory Instruments, 2022 No. 587 (C. 31).

3. The Road Traffic (Highway Code) (Increase in Price) Order (Northern Ireland) 1969, Northern Ireland Statutory Rules and Orders, 1969 No. 88.

4. Bai Huiren, "The 'Moral Algorithm Dilemma' of Autonomous Vehicles," Kexue Yanjiu (Scientific Research), No. 1, 2019.

5. Immanuel Kant, Critique of Practical Reason, trans. Deng Xiaomang, rev. Yang Zutao, People's Publishing House, 2016.

6. John Rawls, A Theory of Justice (revised edition), trans. He Huaihong et al., China Social Sciences Press, 2009.

7. Zheng Yushuang, "Algorithmic Justice and the Legal Liability System of Autonomous Driving," Law and Social Development, No. 4, 2022.

8. Xiao Sa (lawyer), "In Depth: If a Luobo Kuaipao Robotaxi Hits Someone, Who Is Responsible?", August 9, 2024.

9. https://www.autopilotreview.com/self-driving-cars-sae-levels/
