Levels of driving automation
The Society of Automotive Engineers defines six levels of driving automation, from Level 0 through Level 5.
Level 0, which is now common in even economy cars, offers no automation beyond issuing warnings or taking momentary action if a collision is imminent. At the other end of the spectrum is Level 5, in which the car is capable of full autonomy and could function without a steering wheel or other driver controls.
Level 4 is very nearly as autonomous as Level 5, but may require human intervention in certain circumstances, such as inclement weather, navigating around an unusual obstruction, or operating outside well-mapped areas.
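The six-level taxonomy can be sketched in code. This is an illustrative enum, not any official API; the level names follow SAE's published terminology, though the article above only discusses Levels 0, 4, and 5 in detail.

```python
from enum import IntEnum

class SAELevel(IntEnum):
    """SAE J3016 driving-automation levels, as summarized above."""
    NO_AUTOMATION = 0          # warnings or momentary action only
    DRIVER_ASSISTANCE = 1      # steering OR speed assistance
    PARTIAL_AUTOMATION = 2     # steering AND speed; driver supervises
    CONDITIONAL_AUTOMATION = 3 # car drives; driver must take over on request
    HIGH_AUTOMATION = 4        # fully autonomous within a defined domain
    FULL_AUTOMATION = 5        # autonomous everywhere; no controls needed

def may_need_human(level: SAELevel) -> bool:
    """Only Level 5 never falls back to a human driver."""
    return level < SAELevel.FULL_AUTOMATION
```

The key distinction the article draws is visible here: `may_need_human(SAELevel.HIGH_AUTOMATION)` is still true, while Level 5 alone dispenses with the driver entirely.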
Even KITT was Level 4
“Gimme all ya got, KITT!”
KITT, pictured, is the fictional autonomous car from the ’80s TV show Knight Rider. Way ahead of its time, KITT could handle almost all day-to-day driving, and is the benchmark for what most people think of as a “robot car”.
Nevertheless, when the bad guys were up to their villainous ways and turbo mode was called for to jump trains or slam through brick walls, driver Michael Knight had to intervene.
Tesla claims its new cars are Level 5–ready as far as hardware goes, needing only a software upgrade to operate fully autonomously. This claim has been challenged because, while most autonomy efforts include LIDAR sensors, Tesla relies on cameras alone to sense the world around the car. Tesla's stance is that cameras are inexpensive enough to install on every car it sells, allowing it to gather the big data its software needs to learn the skill of driving earlier than the competition.
“Seamless Autonomous Mobility,” or SAM, is Nissan’s autonomous vehicle program, run by Dr. Liam Pedersen, who was an engineer on NASA’s Mars rover team.
SAM works by having human operators stand by remotely in case an autonomous Nissan needs human intervention to navigate a situation it is not prepared for. A “mobility manager” paints a line for the car to traverse safely and monitors its progress until it is clear of the obstruction.
At this point, a mobility manager can supervise ten cars at a time. As the autonomous software learns from each unknown situation, it should be possible to increase that ratio.
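The supervision workflow described above can be sketched as follows. All of the names here (`MobilityManager`, `paint_path`, and so on) are hypothetical illustrations of the process, not Nissan's actual software.

```python
from dataclasses import dataclass, field

@dataclass
class Car:
    """A fleet vehicle that may get stuck and request remote help."""
    car_id: str
    stuck: bool = False
    path: list = field(default_factory=list)

class MobilityManager:
    """One human operator supervising a small fleet remotely."""
    MAX_CARS = 10  # the roughly 1:10 supervision ratio mentioned above

    def __init__(self):
        self.fleet = []

    def assign(self, car: Car):
        if len(self.fleet) >= self.MAX_CARS:
            raise RuntimeError("mobility manager at capacity")
        self.fleet.append(car)

    def paint_path(self, car: Car, waypoints):
        # The operator draws a safe line around the obstruction; the
        # car follows it, then resumes normal autonomous operation.
        car.path = list(waypoints)
        car.stuck = False
```

As the article notes, the interesting scaling lever is `MAX_CARS`: each resolved situation becomes training data, so the same operator should eventually be able to supervise more vehicles.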
“This is not only a demonstration of the transfer of space technology to industry, but also the application of their research back to our space technology, with additional uses for our unmanned aircraft systems research. This is a perfect example of technology literally driving exploration and enabling future space missions,”
Eugene Tu, Center Director, NASA Ames Research Center.
As we have seen in other posts, Elon Musk and his companies are big fans of Principle 6 – Universality. It should come as no surprise then that Tesla is adamant that cameras are good enough for autonomous driving – they are already in place and inexpensive. And after all, humans have been driving with two “cameras” three inches apart for a hundred years.
The notion of having humans standing by to intervene is a great example of Principle 3 – Local Quality. By dividing the workload between the cars’ autonomous systems and human mobility managers, Nissan creates a seamless system that can address local, confusing situations while still permitting driverless operation the vast majority of the time.