TechDecisions spoke with Tom Kiley of Tom Kiley Law Group about the legal issues surrounding nuTonomy’s self-driving car pilot program coming to Boston.
How did Boston get involved with autonomous cars?
nuTonomy is a start-up whose genesis is at MIT, so Boston is its home turf. Boston also presents the ultimate challenge for an autonomous vehicle due to its labyrinth of ancient streets, congestion, and pedestrian traffic.
In what capacity are they coming?
It’s a test whose stated purpose is to gather data on how the vehicles respond to conditions in Boston, including inclement weather and ice and snow interfering with the cameras, and to evaluate the efficacy of the software.
When will this be happening?
Which autonomous cars will be coming? Who are the manufacturers?
nuTonomy has a fleet of electric Renaults that are subcompact in size.
How will these cars know the rules of the road, especially in a city as convoluted in terms of roadways as Boston?
Presumably the vehicles will know the rules of the road, which will be programmed into the software. The cameras will detect traffic signals, stop lights, speed limit signs, yield signs, etc. The convoluted pattern of roadways and traffic is precisely the variable nuTonomy is testing for. Additionally, there will be drivers in all of the cars, prepared to intervene in the event of a problem.
If there is an accident with an autonomous car, how will the law interpret who is at fault?
In the event of an accident, since there is no new statutory law, general negligence principles would apply. In other words, the conduct of the autonomous vehicle would be evaluated against that of the non-autonomous vehicle or pedestrian involved. If the fault lies mostly with the autonomous vehicle, there is a host of potentially responsible parties: the owner of the autonomous vehicle, the developers and installers of the software, and anyone involved in the setup, design, or maintenance of the vehicle’s mechanical or electrical components, if those components are determined to have been faulty.
How will we know whether a crash was caused by an error in the autonomous car’s programming, a failure of its mechanics, or an error on the part of another driver involved?
Presumably there is a control module, much like the airbag control module (ACM) in most vehicles, along with other software that can diagnose a failure in the programming and determine the speed of the vehicle at the time of the crash.
How will passengers be involved with these autonomous cars? If an autonomous car crashes with a passenger inside, who is liable for damages or injuries?
With respect to passengers, the same general tort principles would apply. Presumably a passenger would be fault-free, and therefore the analysis described above would govern liability for damages or injuries.