When an autonomous vehicle crashes, who is responsible, and to what degree? Consider the following story:

Alice buys an autonomous vehicle from Bob, a dealer. The vehicle has different settings, some more aggressive (the vehicle drives faster and brakes harder) and some less. Alice sets the vehicle to its most aggressive setting. One night, on a dark and wet road, Alice hits a pedestrian, Carlos, who was jaywalking. Carlos is badly hurt.

First, some assumptions I have made about this story: the vehicle still has a steering wheel, but offers a fully autonomous mode. I also assume the government has established some sort of standardized test that all autonomous systems must pass. This would include a maximum braking distance, a minimum distance to other cars at varying speeds, and a minimum distance at which pedestrians must be recognized by the system (given a clear line of sight) – let’s assume 50 meters in urban areas.
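To see why these assumed test parameters interact, here is a minimal sketch of the stopping-distance check implied above. All numbers are illustrative assumptions, not real regulatory values: a 0.5 s system reaction time and a friction coefficient of 0.4 for a wet road, using the standard kinematic model (reaction distance plus braking distance).

```python
G = 9.81  # gravitational acceleration, m/s^2


def stopping_distance(speed_kmh: float, reaction_time_s: float,
                      friction: float) -> float:
    """Total stopping distance in meters: reaction distance plus
    braking distance, d = v*t + v^2 / (2*mu*g)."""
    v = speed_kmh / 3.6  # convert km/h to m/s
    return v * reaction_time_s + v ** 2 / (2 * friction * G)


def can_stop_in_time(speed_kmh: float, detection_range_m: float = 50.0,
                     reaction_time_s: float = 0.5,
                     friction: float = 0.4) -> bool:
    """True if the vehicle can stop within the assumed mandated
    detection range. friction=0.4 roughly models a wet road;
    about 0.7 would model a dry one (assumed values)."""
    return stopping_distance(speed_kmh, reaction_time_s,
                             friction) <= detection_range_m


# On a wet road with these assumed parameters:
print(can_stop_in_time(50))  # True: ~31.5 m, within the 50 m range
print(can_stop_in_time(70))  # False: ~57.9 m, beyond the 50 m range
```

The point of the sketch: a 50 m recognition distance only guarantees safety up to some speed, so an "aggressive" setting that drives faster on a wet road can push the vehicle past what the certified detection range can cover.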
Given these circumstances, I would argue that the state and the manufacturer share responsibility for this injury. The government bears the main responsibility because it allowed this vehicle on the road. The manufacturer might not be responsible per se, provided it was truthful during testing (recall the Volkswagen diesel emissions scandal). Furthermore, the driver of an autonomous vehicle should not really be held responsible in this situation, as she is entitled to assume that the vehicle fully complies with all safety regulations.

This means the problem lies in the testing of the vehicle itself: either the government's safety assurance was poorly conducted, or the manufacturer cheated on the test. Of course, it could also be a freak accident, but that would likely stem from a malfunction or bug in the system, again leaving the manufacturer responsible for poor quality or poor design.


The question at hand is rather complex, and I have not given a direct answer. That is because so many variables have to be accounted for; a full investigation is needed to find the root cause. However, I think that once vehicles become fully autonomous, the driver cannot be held responsible – they take for granted that the system will work no matter what, so a crash is not their responsibility. (Besides, being involved in such a crash would be “punishment” enough for any driver.)

Comments

  1. It is indeed a complex situation. Yes, being involved in a crash would be a punishment in itself. Thank you for your thoughts. - The CDIO Academy Team

