A newly released report investigating Tesla’s internal testing procedures for its autonomous driving technology has uncovered alarming practices.
Tesla’s prospects hinge on autonomous driving: the company is deploying its “Full Self-Driving (Supervised)” (FSD) technology to customers while maintaining an internal fleet of test vehicles.
Tesla previously announced plans to hire drivers across the country to test its latest ‘Full Self-Driving’ (FSD) software updates.
A new report from Business Insider draws on interviews with nine test drivers involved in a project called “Rodeo”. They describe the mission:
Test drivers, particularly those on Project Rodeo’s “critical intervention” team, say they are trained to intervene at a moment’s notice, seizing control of the vehicle before dangerous situations escalate. The longer the software is allowed to keep driving, the more data engineers have to fine-tune the system and refine its performance. Consultants in self-driving technology and safety say that such accelerated testing methods can speed up software development, but they warn that they also pose significant risks to test drivers and to the general public on open roads.
Former drivers described the project as “a cowboy on a bull, where you’re just trying to hold on for as long as you can” – hence the name.
The test drivers run unreleased builds of Tesla’s FSD software, using it much as actual customers would – with the key difference that they frequently try to push it to its limits.
Business Insider explains the role of the “critical intervention team” within Project Rodeo.
According to the three critical-intervention drivers and five additional drivers familiar with the team’s mandate, they were trained to perform “interventions” – seizing manual control of the vehicle – only to prevent an imminent collision. The rationale is that letting the AI-powered driver-assistance system make, and recover from, occasional mistakes generates the data it needs to improve. Drivers reported incidents in which vehicles ran stop signs, drove through red lights, veered into opposing lanes, and exceeded posted speed limits while FSD was active. Despite the drivers’ concerns, supervisors reportedly encouraged them to let FSD remain in control during these incidents.
While FSD handles many driving tasks in customer-owned vehicles, human drivers typically take over well before the system gets into trouble.
The goal of this team is to push boundaries further.
As one of the drivers put it:
You’re running on caffeine and adrenaline for the entire eight-hour shift. There’s a persistent feeling that you’re on the edge of something going seriously wrong.
One driver recounted a close call in which FSD nearly collided with a cyclist just a few feet away.
I still remember the moment he jumped off his bicycle. He was terrified. The car lunged at him, and all I could do was stomp on the brakes.
His trainer reportedly seemed pleased afterward. “He told me, ‘That was perfect’ – exactly what I was supposed to do,” the driver recounted.
The Business Insider report details more instances of the team engaging in risky behavior around vulnerable road users, including pedestrians and cyclists.
So how does Tesla’s approach compare with that of other companies developing autonomous vehicles?
Waymo, Tesla’s main rival in the autonomous driving market, reportedly has a team performing similar tasks, but its approach differs significantly: it conducts this kind of testing exclusively in controlled environments, using mannequins instead of real people.
Electrek’s Take
This appears to stem from Tesla’s start-up roots, but it is far from justifiable.
Notably, none of the nine test drivers interviewed by Business Insider reported being involved in an accident, but each described hazardous situations in which bystanders were put at risk without their consent during testing.
It’s distasteful to even consider, let alone perpetuate. Elon Musk’s assertion that Tesla takes a “safety-first” approach appears dubious in light of the evidence presented here.