Tesla's Autopilot struggles against deceptive visual obstacles

futurism.com

A recent video by YouTuber Mark Rober showed a Tesla on Autopilot crashing into a wall painted to look like the road ahead. Rober, a former NASA engineer, used high-speed footage to demonstrate that the Tesla never recognized the wall. The test renewed questions about Tesla's reliance on cameras alone for its self-driving technology, in contrast to competitors that also use LIDAR and radar.

The video sparked debate among Tesla fans, some of whom argued that Rober should have used Tesla's more advanced Full Self-Driving (FSD) software for the test. Regardless, concerns remain about the safety of the camera-only approach. YouTuber Kyle Paul later ran a similar test with a Model Y on FSD and found that it, too, crashed into the fake wall; he noted the car could not see the obstacle until it was very close. When he repeated the test with a Cybertruck running newer hardware and software, however, the vehicle detected the wall and stopped.

The varying results highlight the uncertainty surrounding Tesla's self-driving capabilities. Newer vehicles may perform better, but many older models still run on outdated hardware, which raises safety concerns. That gap recalls Tesla CEO Elon Musk's promise to upgrade older models to the newer systems for free, though it remains unclear whether he will follow through, a worry for many Tesla drivers. In real-world conditions the current systems can still struggle, whether by failing to recognize obstacles or by faltering in poor weather. Significant work remains before Tesla reaches its goal of safe, fully autonomous driving.