source: Google Images
search terms: Tesla fatal autopilot crash Florida
date: Sep. 13, 2017
The Tortuous Case of the Tesla Autopilot
by Christopher D. D. Cabrall
Sep. 13, 2017
The coverage of the infamous fatal Tesla crash, in which Mr. Joshua Brown’s car struck a tractor-trailer truck on Florida Highway 27A on May 7, 2016, has certainly been full of twists and turns. Because of the notoriety and attention this single “accident” has generated, some reflection on its impact at the national scale is warranted.
Today, nearly 500 days later, the importance of the critically cynical foresight of the human factors engineering community (especially those involved in automated/autonomous vehicle research and development) is becoming clearer. When lives are at stake on our open public roads, it is a real shame that safety has had to take a backseat to convenience and commodity for this long. I should candidly admit that the relentless release of unsafe systems in the automotive domain has led me to seriously doubt my choice to branch out from my background in the more highly regulated aviation domain, and, out of feelings of futility, to consider even quitting my current day job as a PhD student in Human Factors of Automated Driving (www.hf-auto.eu).
But now, making headlines today, “new” safety recommendations have been issued by the National Transportation Safety Board (NTSB), including:
“To manufacturers of vehicles equipped with Level 2 vehicle automation systems (Audi of America, BMW of North America, Infiniti USA, Mercedes-Benz USA, Tesla Inc., and Volvo Car USA) –
5. Incorporate system safeguards that limit the use of automated vehicle control systems to those conditions for which they were designed.
6. Develop applications to more effectively sense the driver’s level of engagement and alert the driver when engagement is lacking while automated vehicle control systems are in use.”
title: Tesla crash report leads USDOT to issue new autonomous vehicle recommendations
date: Sep. 13, 2017
It is eye-opening to contrast the above take with the consensus being spun at the beginning of this year. A perhaps overly technophilic or tech-centric camp seemed to dominate the spotlight and regrettably reinforced a fallacious assurance in reliable machine hardware/software and advanced AI systems. That framing could not be more wrong. Worse than merely ignoring or overlooking the implications of inadequate safety system design, companies with released LoDA 2 systems (SAE levels of driving automation) were literally being reported as “exculpated,” and the misdirected human-error blame game was being noxiously repeated.
“‘A safety-related defect trend has not been identified at this time and further examination of this issue does not appear to be warranted,’ NHTSA’s investigators found. In other words: Tesla didn’t cause Brown’s death.
Even before Brown’s death, some worried the inevitable first death caused—or not prevented—by autonomous tech could cripple the shift toward a safer, self-driving future. Tesla’s perceived hubris in pushing immature technology into consumers hands enhanced fears of regulatory intervention. Had NHTSA come to a different conclusion, it could have decided to recall every Tesla with the tech.
Instead, the agency exculpated Tesla, and then some. It crunched the numbers to find that among Tesla cars on the road, those carrying its Autosteer technology, which can keep the car within clear lane markings, crashed 40 percent less frequently than those without.
‘It’s very positive for Tesla,’ says Bart Selman, who studies AI safety systems at Cornell University. ‘It puts the whole issue of the Florida accident in the right context.’
It’s a near guarantee Joshua Brown won’t be the only person to die at the wheel of a semi-autonomous car. The systems aren’t perfect, but the statistics now show the humans who cause 94 percent of crashes are worse. From that perspective, driverless cars can’t come quickly enough.”
title: AFTER PROBING TESLA’S DEADLY CRASH, FEDS SAY YAY TO SELF-DRIVING
date: Jan. 20, 2017
Warnings from subject matter experts were already being voiced in the months before and after the crash. And while I laud the government process for giving them “due” attention (i.e., at the level of congressional testimony), that attention is clearly not enough, or the red tape and politics too burdensome, to rein in what profitable players freely choose to dangerously (over)sell and a lay public chooses to buy and (mis)use.
“The most important thing to know is that the Tesla that was involved in the crash was not a self-driving car. That is, a car that ‘performs all safety-critical functions for the entire trip,’ or even a car in which ‘the driver can fully cede control of all safety-critical functions in certain conditions’—otherwise known as ‘level 4’ and ‘level 3’ cars in the National Highway Traffic Safety Administration’s (NHTSA) classification of automated cars.
Instead, the Tesla was equipped with an Advanced Driver Assistance System (ADAS), which performs some steering and speed functions but still requires continuous driver monitoring.
In the NHTSA’s classification, it was a ‘level 2’ car, meaning it automated ‘at least two primary control functions’ —in this case, adaptive cruise control (controlling speeds to avoid hitting vehicles in front) and lane centering (steering within the stripes).
Thus, the Tesla driver who was killed in the accident, Joshua Brown, probably should have been paying more attention. There are conflicting reports about whether Brown was speeding or was watching a movie at the time of the accident.
Just two months before the accident, Duke University roboticist Missy Cummings presciently testified before Congress that auto companies were ‘rushing to market’ before self-driving cars are ready and that ‘someone is going to die.’”
title: DEATH BY TESLA: WHAT WENT WRONG?
date: Jul. 08, 2017
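For readers keeping the taxonomy straight, the automation levels referenced in the excerpt above can be sketched as a simple lookup. The one-line summaries below are my own paraphrases for illustration, not official NHTSA/SAE wording:

```python
# Simplified sketch of the driving-automation levels referenced above.
# Summaries are informal paraphrases, not official NHTSA/SAE text.
AUTOMATION_LEVELS = {
    0: "No automation: the human driver performs all driving tasks",
    1: "Driver assistance: one primary control function automated",
    2: "Partial automation: at least two primary control functions automated "
       "(e.g., adaptive cruise control plus lane centering); "
       "continuous driver monitoring is still required",
    3: "Conditional automation: the driver can cede control of safety-critical "
       "functions in certain conditions but must be ready to take over",
    4: "High automation: the vehicle performs all safety-critical functions "
       "for the entire trip",
}

def describe(level: int) -> str:
    """Return the informal summary for a given automation level."""
    return AUTOMATION_LEVELS.get(level, "Unknown level")
```

The crashed Tesla sat at level 2 in this scheme, which is exactly why continuous driver monitoring was still part of the operating contract.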
A dangerous creep has been lurking in plain sight. While sourcing dash-cam videos for research purposes during my PhD studies, I have been both extremely impressed and terrified by what is “out there.” On the one hand, the explosive availability of, and proclivity for, video content generation and network sharing may represent a powerful boon for understanding driving norms, behaviors, and risky situations in ways not previously possible. On the other hand, I am also frightened by the increasing comfort that seems to be emerging with untrained novices playing the role of test driver or engineer out on the public roads (yes, even driving right there next to you and me!).
I believe in the authoritative power of the checks and balances represented by the certificate/degree vetting process that we collectively refer to as “being qualified,” and I am appalled at how easily this sort of security can be broken these days. I would strongly encourage my readers to also question what it means when prototypes and betas are released for the public to try out in lieu of years of professional, expert, and regulated evaluation beforehand. Note: Dr. Cummings, quoted above, has warned that we need to check a trend in which “demonstrations are substituted for rigorous testing.” I may be a bit of an old soul, but I value and expect products and services that are more responsibly completed pre-release, in contrast to the push-out patch/update model that has become prevalent in our computers and smartphones over the last decade or so.
Far from blaming the end consumer for misuse, or for the temptation to go beyond what is stated in the owner’s manual as an early adopter/reviewer, I would implore that we keep some important responsibility with the accredited designers and builders, who by now should clearly be considering (rather than ignoring or even encouraging) such risks.
“But Mr. Brown became a victim of an innovation geared precisely to people like him when his Tesla Model S electric sedan collided with a semitrailer truck on a Florida highway in May, making him the first known fatality in a self-driving car.
Mr. Brown was particularly interested in testing the limits of the Autopilot function, documenting how the vehicle would react in blind spots, going around curves and other more challenging situations.
‘He knew the hill that it would give up on, because it couldn’t see far enough,’ Mr. Vernon said. ‘He knew all the limitations that it would find and he really knew how it was supposed to work.’
Mr. Brown attended the University of New Mexico, where he studied physics and computer science, but did not graduate, the school said. Instead, he joined the Navy, where he served for more than a decade and specialized in disarming explosives, according to his company’s website.
Ricky Hammer, a retired Navy master chief who worked with Mr. Brown at the development group, said he had strong computer skills and ‘was the equivalent of an electrical engineer even though he didn’t have the degree.’
‘He’d probably fly an F-18 to test-drive it,’ she said, referring to the military fighter jet. Tesla enthusiasts often also share a loyalty to the company, much the way Apple has engendered true believers whom it relies on to back the introduction of new iPhones, Macs and other products.”
title: Joshua Brown, Who Died in Self-Driving Accident, Tested Limits of His Tesla
date: Jul. 01, 2017
In conclusion, I believe more work is needed to promote a proactive, safety-driven culture and safer roads for everyone. In line with the new guidelines from the NTSB and US DOT, I believe a strong way forward is to invest in the development and improvement of monitors for the monitoring that is required of our drivers (across all levels of automated/autonomous vehicles).
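As a toy illustration of what such driver-engagement monitoring might involve, here is a minimal sketch of a hands-off-wheel watchdog. All thresholds, signal names, and escalation actions are hypothetical assumptions of mine for illustration, not any manufacturer’s or regulator’s actual design:

```python
# Toy sketch of a driver-engagement watchdog for a Level 2 system.
# Thresholds and signal names are hypothetical, for illustration only.
HANDS_OFF_WARNING_S = 10.0    # seconds without hands-on-wheel before warning
HANDS_OFF_ESCALATE_S = 30.0   # seconds before escalating (e.g., slow down)

class EngagementMonitor:
    """Tracks how long the driver has gone without touching the wheel."""

    def __init__(self):
        self.last_hands_on = 0.0

    def update(self, now: float, steering_torque_detected: bool) -> str:
        """Return the action the automation should take at time `now`."""
        if steering_torque_detected:
            self.last_hands_on = now
            return "continue"
        elapsed = now - self.last_hands_on
        if elapsed >= HANDS_OFF_ESCALATE_S:
            return "escalate"   # e.g., slow the vehicle, demand takeover
        if elapsed >= HANDS_OFF_WARNING_S:
            return "warn"       # e.g., visual/audible engagement alert
        return "continue"
```

Even a sketch this simple makes the design questions concrete: what signal counts as engagement, how long is too long, and what the system does when the driver stops monitoring it.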