Tesla Removing Ultrasonic Sensors in Move to Vision Only


Tesla made a bold move when it removed radar from its cars in favor of Tesla Vision, its camera-based, vision-only approach.

People said it was a bad idea and couldn't be done, but it didn't take long for safety organizations to test the vision-only features and approve them. Now, Tesla has announced it will take the vision-only setup even further by eliminating ultrasonic sensors from its cars.

Much like the situation with the removal of radar, Tesla will certainly face some pushback here. Its cars' safety features will also need to be retested and approved all over again, since the systems will work differently. That means its cars will likely lose various awards and recommendations until they're proven to work as intended without the ultrasonic sensors.

Tesla's original Autopilot suite featured eight cameras, a forward-facing radar and a host of ultrasonic sensors. The company insists that if a car is going to drive like a human, it needs to "see" the world around it and react accordingly.

Tesla has also noted the various technologies can contradict one another, and it has learned that camera-based vision, when paired with AI and a neural network (essentially the brain), appears to have the most success, seeing much more than any human can.

Tesla announced: "Today, we are taking the next step in Tesla Vision by removing ultrasonic sensors (USS) from Model 3 and Model Y. We will continue this rollout with Model 3 and Model Y, globally, over the next few months, followed by Model S and Model X in 2023."

As with the removal of radar, Tesla is starting with the Model 3 and Model Y. It will begin removing the USS from its flagship Model S and Model X vehicles next year.

According to Electrek, the USS were especially helpful for short-range object detection; for example, they supported Autopark and collision warnings. However, if the car can "see" the world around it, render it and analyze distances, Tesla could argue the sensors create unnecessary redundancy.

Tesla went on to explain it's using software and its neural network to replace the data provided by the USS with data collected via its vision-based system. The company said "this approach gives Autopilot high-definition spatial positioning, longer range visibility and ability to identify and differentiate between objects. As with many Tesla features, our occupancy network will continue to improve rapidly over time."

We thank InsideEVs for reprint permission.

Abby Andrews
Online & Web Content Editor, Autobody News
