This week’s question is from Marianne K. in Pacific Heights, who asks: “I read your column last week about the first Uber robot-car pedestrian death in Arizona. Then, the next day, there was a fatal crash involving a Tesla on ‘Autopilot.’ What is going on? Who is responsible when a car on Autopilot crashes?”
An Update On New Regulations Concerning Automated Vehicles
This week’s column covers recent developments in laws and regulations concerning automated vehicles, which you may know as “driverless cars” or “robot cars.”
The area of Automated Vehicle (AV) law and regulation is rapidly evolving. It is a complex process involving elected legislative bodies, regulatory agencies, and vehicle manufacturers. Countless human lives and billions, if not trillions, of dollars are at stake as this technology develops and these vehicles enter our transportation infrastructure.
Are self-driving cars safe if no one can take control of the car in an emergency?
This week’s question comes from Adrian H. in San Francisco, who asks: “I read something in the paper about self-driving vehicles the other day. Did the government recently state that these cars can be operated without someone being able to take control in an emergency? What does that mean: will we be seeing cars on the road without someone in the driver’s seat?”
Is Your Car Safe From Hackers?
This week’s question comes from John J. in the Mission, who asks: “I have been reading about the vulnerability of today’s modern cars. Is it true that they can be hacked and taken over? If they are taken over and a crash happens, who is responsible?”
John, your question is timely. Articles have recently been published about hackers remotely taking over a vehicle’s control systems from ten miles away. To date, no published case involving such an incident has been brought, but that simply means a collision caused by a vehicle hacking has yet to occur.