
Accidents With Driverless Cars

What was once thought to exist only in cartoons, or not in our lifetime, is quickly becoming a reality. Driverless cars, also referred to as self-driving cars, robotic cars, and autonomous vehicles, are motor vehicles that need no driver; they are capable of sensing their environment and navigating the road on their own.

The auto and tech industries believe that driverless cars could help ease traffic congestion, lower pollution, and prevent accidents. There are also many questions and ethical debates surrounding driverless cars. For example, how can a machine or program make life-and-death decisions to avoid an accident, or decide what to do once an accident is unavoidable?

Almost every major manufacturer (General Motors, Ford, Toyota, Audi, BMW, and Tesla), and even companies like Apple and Google, has a self-driving vehicle program, with most expecting their vehicles to be available for consumer purchase in the next 5-10 years.

Who Is at Fault in an Accident?

A study by the University of Michigan’s Transportation Research Institute in Ann Arbor, Michigan, reports that accident rates for self-driving cars are twice as high as for regular cars. The study noted that the total distance driven by self-driving cars is still relatively low and that they have been driven only in limited conditions. The study also reported that self-driving vehicles were not at fault in any of the crashes they were involved in, and that the overall severity of crash-related injuries involving self-driving vehicles has been lower than for conventional vehicles. There is no question that more testing and studies are needed.

In February 2016, Google’s self-driving car caused its first crash when it changed lanes and put itself in the path of an oncoming bus. In earlier crashes, Google stated that the collisions were caused by the human drivers of the other vehicles, often as a result of human error and inattention (driver distraction), and not by the driverless car or its technology.

According to a recent CNN report, U.S. safety regulators stated that, for the purposes of federal law, they would consider the “driver” of Google’s new self-driving car to be … the car itself. The law uses the “reasonable driver” standard in evaluating negligence liability. Until now, that simply meant comparison to a reasonable person. But if a “driver” can now be defined both as a “reasonable person” and as a computer, then what does it mean to say “reasonable driver” anymore?

The Future of Self-Driving Cars

There are many questions that need to be discussed and will be debated for some time to come.

  • Are human drivers ready to give up control? A study released by AAA this month found that three-quarters of American drivers are reluctant to ride in an autonomous vehicle.
  • California regulators recently released draft rules specifically requiring vehicles to have a human back-up driver ready and able to take control in the event of a problem. Will other states follow, and what other restrictions or qualifications will be applied to self-driving cars?
  • How will self-driving cars be able to communicate not only with human drivers but also with the pedestrians and bicyclists with whom they share the road?
  • What will happen to the “reasonable driver” standard? Who will be liable in a self-driving auto accident: the self-driving car owner, the designers, programmers, manufacturers, or the human driver? Will humans fail to meet the “reasonable driver” standard and be banned from ever driving again? And what type of insurance coverage will one have to carry?