It’s incredible how far technology has come in recent years, but it can also be a little unnerving. While the world races toward self-driving cars, SteinLaw has tapped the brakes because of the dangers that still exist when operating self-driving vehicles. As it turns out, we aren’t the only ones aware that self-driving cars aren’t 100% danger-free. Just recently, The Wall Street Journal wrote about the dangers of self-driving cars. Let’s take a closer look at what makes self-driving cars dangerous and what you should know.
How Do Self-Driving Cars Operate?
Self-driving cars utilize several complicated systems to run correctly. The automobile contains sensors, cameras, and high-tech software. These cameras and sensors continuously scan and navigate the terrain.
The Levels of Self-Driving Cars
There are six levels of driving automation. These include:
- Level 0: The human driver controls all functions. This is how cars have worked since the beginning of the automobile.
- Level 1: A handful of systems operate without human interaction. A good example is cruise control.
- Level 2: Some tasks, such as acceleration or steering, are performed autonomously, with occasional human intervention.
- Level 3: The vehicle handles all safety-critical functions, but the driver must intervene when the system issues an alert.
- Level 4: In specific situations, autonomous driving is fully operational, though a driver may still need to step in. This level is still in beta, meaning it is being tested and improved upon.
- Level 5: Lacking brakes and a steering wheel, the vehicle is entirely autonomous. There is no need for human intervention.
The Dangers of Self-Driving Cars
While most people want to proclaim all the goodness involved with this new technology, it’s just as vital that we look at the downsides. Here are a few disadvantages to cars that operate themselves.
- Inability to understand – While an autonomous vehicle can learn and become smarter, it won’t be able to interpret every instruction. For example, how will it know what to do when a construction worker is directing traffic by hand? That is worth considering before putting your trust in a machine.
- Poor weather – Rain, snow, and fog interfere with cameras and sensors, degrading the data they collect and making the vehicle less reliable.
- Human reaction – However advanced AI becomes, it lacks human judgment and the ability to reason about the physical world the way a person does (see the example below).
- Possible hacking threat – Because the car runs on software, it is vulnerable to hacking, and owners must be vigilant against malware.
Self-Driving Car Accidents
Fatalities from car accidents aren’t decreasing at rates you would expect with all of this technology in play. From 2017 to 2018, the National Highway Traffic Safety Administration (NHTSA) estimated that there were only a few hundred fewer deaths. Most motor vehicle accidents occur due to distraction, human error, drunk driving, and fatigue. The thought is that many of these accidents will be avoided when autonomous cars take over, but that’s not necessarily true.
How does a self-driving vehicle make ethical decisions? What will it do when a child runs out into the road? Sometimes, human reasoning outweighs what a machine is capable of. Many drivers will put themselves in harm’s way to protect that child, but will a Level 5 autonomous car do the same? We aren’t sure of that.
We don’t have sufficient data yet to make clear distinctions about how safe or unsafe these vehicles are, but time will tell if the numbers drop.
When it comes to features such as Automatic Emergency Braking (AEB), there is some data supporting the safety benefits for drivers.
Automatic Emergency Braking (AEB) is a system that automatically applies the brakes when the vehicle’s sensors, combined with its current speed, indicate an imminent collision with another vehicle or obstacle.
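The core idea behind AEB can be sketched in a few lines of Python. This is a simplified illustration only, not any manufacturer’s actual implementation, and the threshold value is hypothetical: the system estimates the time until a collision from the sensed gap and closing speed, and triggers the brakes when that time falls below a safety threshold.

```python
def aeb_should_brake(gap_m: float, closing_speed_mps: float,
                     ttc_threshold_s: float = 1.5) -> bool:
    """Simplified AEB decision: brake if time-to-collision drops below a threshold.

    gap_m: sensor-measured distance to the obstacle ahead (meters)
    closing_speed_mps: rate at which the gap is shrinking (meters/second)
    ttc_threshold_s: hypothetical trigger threshold (seconds)
    """
    if closing_speed_mps <= 0:
        # The gap is steady or growing, so no collision is predicted.
        return False
    time_to_collision = gap_m / closing_speed_mps
    return time_to_collision < ttc_threshold_s

# Closing at 20 m/s with only 25 m of gap gives a time-to-collision of
# 1.25 seconds, which is below the 1.5-second threshold.
print(aeb_should_brake(gap_m=25.0, closing_speed_mps=20.0))  # prints True
```

Real systems are far more sophisticated, fusing radar, camera, and sometimes lidar data and accounting for braking distance, but the decision ultimately rests on this kind of collision-time estimate.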
The Wall Street Journal states, “According to new data from the Insurance Institute for Highway Safety, AEB reduces rear-end crashes by 50%, and reduces crashes with injuries by 56%.”
The fact is that semi-autonomous self-driving car features could provide many safety benefits, but it is dangerous to rely on them fully.
In fact, CBS News recently covered a story, “Disturbing Video Shows Driver Apparently Asleep in Moving Tesla on Highway.” As the title indicates, the article highlights various instances reporting Tesla drivers and passengers caught on video apparently fast asleep on the highway. One Massachusetts man, Dakota Randall, even tweeted a video of a sleeping Tesla driver (see below).
Some guy literally asleep at the wheel on the Mass Pike (great place for it).
Teslas are sick, I guess? pic.twitter.com/ARSpj1rbVn
— Dakota Randall (@DakRandall) September 8, 2019
The CBS article shared another tweet, from a motorist who also appears to be asleep behind the wheel of a Tesla on a California highway:
@RalphNader Ralph can you do anything about stopping this disturbing phenomenon? My wife and I shot this video last week on the busy I-5 in LA. @Tesla #sleepingdrivers #unsafeatanyspeed pic.twitter.com/ADbpt0uSZ4
— Clint Olivier (@ClintOlivier) August 21, 2019
A Tesla spokesperson claimed the “videos appear to be dangerous pranks or hoaxes.” Tesla explained that the driver-monitoring system repeatedly reminds drivers to remain engaged. “At highway speeds, drivers typically receive warnings every 30 seconds or less if their hands aren’t detected on the wheel,” a company spokesperson said.
Tesla also says “Tesla owners have driven billions of miles using Autopilot, and data from our quarterly Vehicle Safety Report indicates that drivers using Autopilot experience fewer accidents than those operating without assistance.”
These very well could be pranks, if Tesla’s claim that its driver-monitoring system warns drivers every 30 seconds or less when their hands aren’t detected on the wheel is accurate. And while Autopilot has likely prevented crashes that would otherwise have been fatal, that is no excuse for drivers to become so overconfident that they fall asleep at the wheel. The gray area of autonomous car accidents is tricky, especially when it comes to determining who is liable after a motorist causes an accident while using Tesla’s Autopilot feature.
Liability Concerns in Autonomous Car Accident Cases
Where liability falls after an accident depends on whether the driver was at fault. If the driver failed, standard liability rules apply. If the self-driving car caused the accident, things get more complicated: the cause is likely a software flaw or a sensor failure, though it’s also possible that something was amiss with the road infrastructure.
While the latter would fall on the state or city, the other scenarios bring us to a gray area. If the sensors or software caused the incident, the lawsuit must be brought against the vehicle designer or parts manufacturer.
What To Do After a Self-Driving Car Accident
If you’ve been involved in an accident with a self-driving car, time is of the essence. Start by making sure everyone in the car is okay and call 9-1-1 if necessary. Turn on your hazard lights and get off the road if possible. Don’t leave the car in the middle of the road where it can cause another accident.
If you can’t move the vehicle, you must call a tow truck. Once you are safe, make sure you contact the police and file a report. Then, exchange information with the other drivers and take pictures of everything related to the scene. This includes pictures of the damage, vital road marks, or anything that could prove your case.
Stay at the scene until the police arrive. If you plan to seek reimbursement for your medical bills, you must get treatment within 14 days. If you have life-threatening injuries, head to the ER immediately. Once your health has been checked, contact your insurance company to report the incident and give them any key information. Don’t forget to call your lawyer to ensure you get the compensation you deserve if the accident was not your fault. Self-driving car accident cases are very complex, and it’s important to get legal advice from an experienced car accident attorney, such as Brandon Stein.
What Does a Self-Driving Car Lawyer Handle?
We don’t disagree that self-driving cars have many benefits to the world. However, right now there simply isn’t enough proof that indicates they are safe. Still, they are making their way onto the road where dangers abound.
If you are involved in a crash because of a self-driving car on the roadway, you need guidance. A top attorney can help prove that there were design defects, a failure to alert the driver, or another type of negligence.
With the right lawyer by your side, you can recover the following damages:
- Physical therapy or rehabilitation
- Medical bills
- Long-term care
- Lost earning capacity
- Lost wages
- Pain and suffering
You may also need a lawyer who pursues wrongful death claims against the car manufacturer. You deserve justice for your loved one.
Get the Help You Need
If you’ve been hurt by a self-driving car, the experienced team at SteinLaw can help guide you through the process. We handle the paperwork, the defense attorneys, and the insurance adjusters so you can focus on your health.
At SteinLaw, you are more than just a client to us; we seek to know you personally. Attorney Brandon Stein fights to secure the compensation you need to heal from your injuries. Call us today for a free legal consultation at 877.STEINLAW or fill out our online form.