
Tesla owners have come to expect frequent, and sometimes substantial, over-the-air (OTA) updates to their cars, particularly for systems like the Autopilot software, which is technically still a beta. But there is little doubt that the extent and timing of the latest version 8 of Autopilot were driven in part by the widely covered fatal crash in Florida, where a Model S failed to detect a white truck crossing its path against a bright sky. For those tracking the details, the issue wasn't really related to the autonomous features of the car, but to the AEB (Automatic Emergency Braking) system. Either way, the blame was placed on a shortcoming in Mobileye's camera-based object detection system, leading to a parting of ways between the two companies.

Radar comes to the fore in Autopilot Version 8.0

Unlike typical cameras, radar is unaffected by lighting, and much less sensitive to atmospheric conditions like fog. Tesla models compatible with Autopilot (those built since October 2014) include a front-facing radar, but until now the system relied primarily on camera input. With Version 8.0, which Tesla CEO Elon Musk said was an effort to fit within the memory capacity of some models, the radar will be pulsed up to 10 times per second and used to create a 3D image of what is in front of the car.

Overhead signs present a challenge for autopilot systems.

While radar is much better than a camera in poor weather, and is great at detecting metal, it has a harder time seeing people, wood, and plastic. That means Tesla has had to put a lot of work into the signal-processing software that operates on the raw radar data, both to avoid over-reacting to small pieces of metal like a soda can, and to correctly detect non-metallic objects. Part of the update is increasing the density of the raw signal cloud from the radar by a factor of six.

Tesla cars learn as a fleet

One interesting aspect of Tesla's system is that if one car detects an object (for instance an overhead sign) that turns out not to be an issue, the data is geotagged and shared. If several cars report the same experience, then the entire fleet can be taught to ignore it. Personally, while this type of learning works great for marking speed traps and red light cameras, it is a little chilling to think about an emergency system ignoring an input because of crowd-sourced data. Nvidia's Jen-Hsun Huang demonstrated a different version of this technique at GTC, where vehicles could share image data that could be post-processed and used to provide an entire fleet of cars with improved "ground truth" maps.
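The geotag-and-share idea can be sketched in a few lines of code. This is a minimal illustration of the general technique, not Tesla's actual implementation: the report threshold, the grid size, and all class and function names here are assumptions made up for the example.

```python
from collections import defaultdict

# Hypothetical sketch: each car reports a geotagged radar false positive
# (e.g. an overhead sign). Once enough independent cars report the same
# spot, the location is whitelisted for the whole fleet.
REPORT_THRESHOLD = 5   # assumed number of distinct cars needed
GRID = 0.0001          # assumed grid cell size in degrees (~11 m latitude)

def grid_key(lat, lon):
    """Bucket a GPS fix into a coarse grid cell so nearby reports match."""
    return (round(lat / GRID), round(lon / GRID))

class FleetWhitelist:
    def __init__(self):
        self.reports = defaultdict(set)  # cell -> IDs of reporting cars
        self.whitelist = set()           # cells the fleet should ignore

    def report_false_positive(self, car_id, lat, lon):
        cell = grid_key(lat, lon)
        self.reports[cell].add(car_id)   # a set de-duplicates repeat reports
        if len(self.reports[cell]) >= REPORT_THRESHOLD:
            self.whitelist.add(cell)

    def should_ignore(self, lat, lon):
        return grid_key(lat, lon) in self.whitelist

fleet = FleetWhitelist()
for car in range(5):
    fleet.report_false_positive(car, 37.4275, -122.1697)
print(fleet.should_ignore(37.4275, -122.1697))  # True once 5 cars agree
```

Using a set per grid cell means one car repeatedly passing the same sign only counts once, which is presumably why several independent reports would be required before an emergency system is told to ignore anything.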

Offending drivers will have to take a time out

Tesla is clearly both concerned and frustrated by drivers who abuse the Autopilot system by repeatedly taking their hands off the wheel for long periods. So Version 8.0 also includes a feature that will disable the system after three incidents in an hour. The driver will need to pull over and put the car in park to re-enable it. Musk is upfront about his enthusiasm for Tesla's Autopilot offering, as it is part of what he says makes "the Model S and X by far the safest cars on the road." Making better use of its radar should help make them even safer. There are also a variety of other small upgrades and bug fixes in the release that are detailed by Tesla.
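The three-strikes lockout is essentially a rolling-window rate limit, and can be sketched as follows. Only the one-hour window and the three-warning limit come from the article; the class, its method names, and the parked-car reset logic are assumptions for illustration.

```python
import time

# Hypothetical sketch of the "three strikes in an hour" lockout: after
# three hands-off-wheel warnings within a rolling hour, Autopilot stays
# disabled until the driver pulls over and shifts into park.
WINDOW_SECONDS = 3600
MAX_WARNINGS = 3

class AutopilotLockout:
    def __init__(self, clock=time.monotonic):
        self.clock = clock      # injectable clock makes the logic testable
        self.warnings = []      # timestamps of recent hands-off warnings
        self.locked = False

    def record_warning(self):
        now = self.clock()
        # Keep only warnings inside the rolling one-hour window.
        self.warnings = [t for t in self.warnings
                         if now - t < WINDOW_SECONDS]
        self.warnings.append(now)
        if len(self.warnings) >= MAX_WARNINGS:
            self.locked = True  # Autopilot disabled for this drive

    def shift_to_park(self):
        # Pulling over and parking re-enables the system.
        self.warnings.clear()
        self.locked = False

    def autopilot_available(self):
        return not self.locked
```

Passing the clock in as a parameter (defaulting to `time.monotonic`, which never jumps backward) lets the rolling window be exercised with a fake clock instead of waiting an hour.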