What, exactly, is a “self-driving” car?
For our purposes, we can define a “self-driving” car to be a vehicle that is capable of safely moving from Point A to Point B over public streets and highways, under any and all possible combinations of traffic and weather conditions, by making decisions based solely on its internal software without the need for human interaction or oversight.
The term “self-driving car” is actually a misnomer, since there are currently no vehicles that are truly self-driving. A more accurate description would be vehicles equipped with varying degrees of driver-assistive technology. Driver-assistive technology can range from relatively simple (such as cruise control or GPS-assisted navigation) to sophisticated (such as collision avoidance and highway lane-following).
Automotive engineers assign each vehicle’s driver-assistance technology a “degree” or “level” of autonomy, ranging from “0” (a human driver is in complete control at all times) to “5” (the vehicle is in complete control at all times under any traffic conditions). Currently, no production models have been rated at Level 5; the sensor/software package installed in Tesla products, for example, is classified as a Level 2 system (it performs steering and speed control and monitors traffic during a trip, but it cannot safely drive in all conditions or on all roads and requires constant driver supervision).
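The level taxonomy described above comes from the SAE J3016 standard for driving automation. As a rough illustration only, the following minimal Python sketch encodes the six SAE levels and the point at which responsibility for monitoring the road notionally shifts from the driver to the vehicle; the AutonomyLevel enum, its member names, and the driver_must_supervise helper are hypothetical constructs created for this article, not part of any vehicle’s actual software.

    # Illustrative sketch of the SAE J3016 driving-automation levels (0-5).
    # Hypothetical code for explanation only; not any manufacturer's real software.
    from enum import IntEnum

    class AutonomyLevel(IntEnum):
        NO_AUTOMATION = 0           # Level 0: human driver performs all driving tasks
        DRIVER_ASSISTANCE = 1       # Level 1: one assist feature, e.g. adaptive cruise control
        PARTIAL_AUTOMATION = 2      # Level 2: steering + speed control; driver must supervise
        CONDITIONAL_AUTOMATION = 3  # Level 3: drives itself in limited conditions; driver takes over on request
        HIGH_AUTOMATION = 4         # Level 4: drives itself within a defined domain (e.g. a geofenced area)
        FULL_AUTOMATION = 5         # Level 5: drives itself everywhere, in all conditions

    def driver_must_supervise(level: AutonomyLevel) -> bool:
        # At Levels 0-2 the human driver remains responsible for monitoring the road at all times.
        return level <= AutonomyLevel.PARTIAL_AUTOMATION

    for level in AutonomyLevel:
        duty = "driver must supervise" if driver_must_supervise(level) else "system monitors the road"
        print(f"Level {level.value} ({level.name}): {duty}")

Under this sketch, today’s production driver-assistance packages fall at Level 2 or below, which is why the driver, not the software, is still expected to watch the road.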
Who is responsible in “self-driving” car accidents?
The driver of any vehicle, from a bicycle to a high-performance sports car, is responsible for the safe operation of that vehicle. If a driver operates a vehicle in violation of a city’s or state’s traffic laws, or in an otherwise unsafe manner, and an accident occurs, that driver can be held liable for the consequences of the accident. By the same line of reasoning, operating a “self-driving” car, or a car with sophisticated driver-assistive technology, does not relieve a driver of the obligation to drive safely. Although cases in which a “self-driving” car has been involved in a traffic accident are still relatively rare, the courts have so far held drivers responsible, even where a technological failure led to the accident.
“Third-party” lawsuits and “self-driving”/driver-assistive technology
In addition to questions of driver responsibility in “self-driving” car accidents, the courts are also deciding whether vehicle manufacturers can be held liable when problems with their self-driving/driver-assistive technology may have contributed to an accident. These issues have arisen in the trials of third-party lawsuits.
In an auto accident, you are the first party, the other driver (and his or her insurance company) is the second party, and anyone (or anything) that may have indirectly contributed to the accident is a third party. If a vehicle was being operated in its self-driving mode at the time of an accident, the vehicle’s manufacturer, as well as the manufacturers of any components of its driver-assist technology, could be named as defendants in a third-party lawsuit.
Third-party lawsuits prevent manufacturers from avoiding liability for selling products that were known to contain serious safety defects. As an example, Tesla Motors has long been aware that the sensors and software critical to its hazard detection and avoidance technology have had difficulty distinguishing the light-colored trailers of semi-trucks from a blue-sky background, yet it has continued to use that technology in its vehicles. Tesla Motors could therefore be the target of a third-party lawsuit alleging that it knowingly sold a defective product. It would then be up to a jury to hear the evidence and decide the issue of liability.
Injured in a car accident?
Schedule an appointment with us now! 210-342-2777
*Information obtained from: thedoanlawfirm.com