Tuesday, 18 December 2018 12:50

Consumers Confused by Partially Automated Driving Features

Written by David A. Wood, CarComplaints.com



However, three of the four test vehicles required the drivers to intervene and take control to avoid an imminent crash when a lead vehicle changed lanes to reveal a stationary vehicle.


Real-world examples exist: drivers have claimed they crashed because they believed their cars could take full control in all driving situations, even though the owners' manuals clearly said otherwise.


Within the past few months, Tesla has been sued twice by drivers who crashed while not paying attention to the road. In one crash, a Tesla Model S slammed into a stalled vehicle at 80 mph because the driver believed the car would stop on its own.


A separate lawsuit alleges the driver was deceived by Tesla's marketing of its Autopilot system, leading her to remove her hands from the wheel for 80 seconds just before the crash.


AAA's research isn't the first to show how drivers can be confused by the different names used for similar features offered by various automakers.


In a paper released by the Thatcham Research Center and the Association of British Insurers, researchers found that consumers can easily fail to understand the differences between advanced driver assistance systems and fully automated driving technology. That confusion can cause drivers to ignore the need to stay fully alert and ready to handle all driving conditions.


In addition, researchers said confused drivers may fail to understand that they can be held fully legally responsible for any crashes that occur.
