How Safe Do Autonomous Cars Need to Be to Gain Your Trust?
- July 20, 2018
- Categories: Vehicle Accidents
Driverless cars were once the realm of science fiction. That fiction is quickly becoming reality, and driverless cars are poised to become a major presence on our roads. A recent survey highlighted by the Washington Post indicates that a majority of Americans expect autonomous vehicles to be commonplace within the next fifteen years.
The potential for a driverless takeover excites some drivers but makes others nervous. Can autonomous cars truly be trusted? Is the technology advanced enough to prevent accidents or wide-scale hacking? Below, we delve into these and other concerns.
Driver-Assist Versus Fully Autonomous
Those who worry about driverless cars often point to recent headlines in which vehicles with autonomous technology crashed, leading to devastating injuries or fatalities. For example, one of Uber’s autonomous test cars struck and killed a pedestrian in March 2018. Soon after, a Waymo test vehicle carrying a human occupant crashed when another car swerved into its path. Sadly, this occurred shortly after Waymo’s CEO claimed that the company’s self-driving car could have avoided Uber’s crash.
When examining these headlines, remember this: not all vehicles with driverless technology are built alike. Many are not truly driverless; they offer a driver-assist mode, which is intended to improve the driver’s performance behind the wheel, not to allow full autonomy. Driver-assist features work best when the person behind the wheel remains vigilant. Unfortunately, many drivers become distracted in such vehicles and are therefore unable to take over when necessary. The best remedy is increased training for drivers of vehicles that aren’t fully autonomous.
Can Autonomous Cars Handle Bad Weather?
It’s no secret that humans struggle to drive in rain or on icy roads. But are autonomous vehicles that much better? Experts at the National Center for Atmospheric Research answer this question with a resounding ‘no.’ Solar storms are especially concerning: although not immediately evident to human drivers, these sudden flare-ups can wreak havoc on autonomous systems by severing GPS connections. Likewise, sensors on driverless cars can easily be disabled by a condition researchers refer to as ‘snow smoke.’
Hacking: Real Threat or Overblown Concern?
A common concern regarding driverless vehicles is the potential for ill-intentioned individuals to hack into driverless systems and control vehicles. This is a valid concern; the systems that underlie driverless cars are computers, and any computer holds the potential for hacking. The University of Michigan’s autonomous vehicle center highlights these fears, warning that new vulnerabilities are sure to emerge as driverless cars become more common.
To demonstrate the potential for hacking, researchers from Zhejiang University, the University of South Carolina, and the security firm Qihoo 360 deliberately tried to fool Tesla’s Autopilot sensors. All it took to succeed? Simple sound- and light-emitting tools. In some cases, their efforts caused the vehicle’s autonomous system to perceive objects that didn’t actually exist; in others, they caused the vehicle to miss real objects clearly in its path. The researchers warn that companies manufacturing autonomous vehicles will need to take such threats seriously or face dire consequences.
Ultimately, there is no escaping driverless technology. You can choose not to ride in autonomous vehicles, but you’re bound to encounter them on the road in the years to come. Driverless cars could significantly reduce the role of human error in accidents, but further testing is needed before we know the full effect these vehicles will have on our roads.
Ready to learn more? The team at Neale & Fhima is happy to answer your questions, so reach out today for more information.