Jan. 6 – Advances in artificial intelligence are making vehicles smarter, more responsive, and better at making decisions in a variety of driving environments. But autonomous vehicles still cannot know exactly how to handle every unpredictable situation, and that remains one of the roadblocks to a fully autonomous future for driving. Nissan’s answer is the Seamless Autonomous Mobility system, or SAM.
During CES, Nissan conducted a live demonstration of the system in operation via a link-up to our Silicon Valley research center. The demonstration showed just how SAM will work in the real world.
Maarten Sierhuis, Director of Nissan Research Center
SAM will ensure a seamless mobility system in which millions of autonomous cars can operate safely and smoothly. It helps cars navigate unforeseen situations on city streets, such as accidents, road construction, or other obstacles.
Here’s how it works: imagine an autonomous vehicle is moving through city streets and comes across an accident, with police using hand signals to direct traffic, perhaps across double yellow lines and against traffic lights. The vehicle cannot, and should not, reliably judge what to do by itself.
Vehicle sensors (LIDAR, cameras, radar) can tell the car where obstacles are and what state the traffic lights are in, and can even recognize some hand gestures, but human judgment is required to understand what other drivers and pedestrians are doing and to decide on the appropriate course of action.
A mobility manager maps a new path for the autonomous vehicle
With SAM, the autonomous vehicle becomes smart enough to know when it should not attempt to negotiate the problem by itself, as in this instance. Instead, it brings itself to a safe stop and requests help from the command center. The request is routed to the first available mobility manager – a person who uses vehicle images and sensor data (streamed over the wireless network) to assess the situation, decide on the correct action, and create a safe path around the obstruction.
The mobility manager does this by “painting” a virtual lane for the vehicle to drive itself through. When the police wave the vehicle past, the manager releases the car to continue by itself along the designated route. Once clear of the area, the vehicle resumes fully autonomous operation, and the mobility manager is free to assist other vehicles calling for help.
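The escalate, stop, guide, and release flow described above can be pictured as a simple state machine. The sketch below is a hypothetical illustration only; the class, mode, and method names are assumptions for clarity, not Nissan’s actual software:

```python
from enum import Enum, auto

class Mode(Enum):
    AUTONOMOUS = auto()  # normal self-driving operation
    SAFE_STOP = auto()   # stopped, waiting on a mobility manager
    GUIDED = auto()      # following a manager-painted virtual lane

class SamVehicle:
    """Hypothetical sketch of SAM's escalation-and-release flow."""

    def __init__(self):
        self.mode = Mode.AUTONOMOUS
        self.virtual_lane = None

    def encounter_unresolvable_scene(self):
        # The vehicle recognizes it should not negotiate the scene
        # itself: it brings itself to a safe stop and (in the real
        # system) requests help from the command center.
        self.mode = Mode.SAFE_STOP

    def receive_virtual_lane(self, waypoints):
        # A mobility manager "paints" a safe path around the obstruction.
        self.virtual_lane = list(waypoints)
        self.mode = Mode.GUIDED

    def release(self):
        # Waved past and clear of the area: resume full autonomy.
        self.virtual_lane = None
        self.mode = Mode.AUTONOMOUS
```

In this framing, the mobility manager never drives the car; they only supply the missing judgment (the virtual lane) and the moment of release, while the vehicle itself executes the driving at every step.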
As this is all happening, other autonomous vehicles in the area are also communicating with SAM. The system learns from and shares the new information created by the mobility manager: once a solution is found, it is sent to the other vehicles.
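The sharing step can be sketched as a command center that caches each manager-created solution and hands it to other vehicles approaching the same obstruction. Again, this is a minimal illustrative sketch with assumed names, not a description of Nissan’s actual implementation:

```python
class CommandCenter:
    """Hypothetical sketch: reuse a manager's solution across the fleet."""

    def __init__(self):
        # Map each known obstruction to the virtual lane that solves it.
        self.known_solutions = {}

    def record_solution(self, obstruction_id, virtual_lane):
        # Store the path a mobility manager painted for one vehicle.
        self.known_solutions[obstruction_id] = list(virtual_lane)

    def solution_for(self, obstruction_id):
        # A later vehicle at the same obstruction reuses the stored path
        # instead of stopping and calling for a manager again; None means
        # no solution exists yet and the vehicle must escalate.
        return self.known_solutions.get(obstruction_id)
```

The design choice this illustrates is that each human intervention is paid for once: the first vehicle waits for a manager, and every subsequent vehicle benefits from the recorded answer.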
As the system learns from experience and autonomous technology improves, vehicles will require less assistance, and each mobility manager will be able to guide a large number of vehicles simultaneously. Several factors will determine how many managers are necessary: for example, how busy the zone is and what service each vehicle is providing, whether robo-taxi, robo-shuttle, or robo-delivery.
Live demonstration from the NASA Ames Research Facility
NASA’s Visual Environment for Remote Virtual Exploration (VERVE) software, used to visualize and supervise interplanetary robots, was the starting point for Nissan’s SAM platform. NASA’s robots use autonomous technology to avoid obstacles and calculate safe driving paths through unpredictable and uncertain environments. Where the environment makes autonomous decision-making difficult, NASA supervisors draw the desired route and send it to the robot for execution.
Back on Earth, SAM is intended not just for Nissan vehicles, but for all vehicles.
For more information, please visit our global newsroom.