Working as a Tesla self-driving car tester seems like the most stressful job in the world.

If a job description listed “all sense of legal duty, moral ethics and humanity must be eliminated” among its duties, would you apply? According to a new report, that is essentially the role of a Tesla test driver.

Self-driving cars are the future. If tech companies have their way, self-driving cars will be the only future. Many of us, however, live in reality and understand that the software is still far from ready for such sci-fi levels of automation. But people with the means will keep trying to speed things up, even if it means beta testing on public roads where we’re all the guinea pigs.

Business Insider revealed details about the work of a specialized team of Tesla test drivers known as Project Rodeo. This group of testers is trained to push the limits of the automaker’s Full Self-Driving (FSD) and Autopilot software. And what exactly is the limit? Failure. The rule of thumb seems to be to get as close as possible to colliding with something or someone. The worse the situation, the better.

“You drive on adrenaline for the entire eight-hour shift,” said one former test driver. “There is a feeling that you are on the verge of something going wrong.”

Business Insider interviewed nine current and former Project Rodeo drivers and three Autopilot engineers in California, Florida, and Texas; most asked to remain anonymous. The situations they describe are eye-opening, but not surprising. Although FSD-related accidents are well documented, none of those interviewed were involved in any of them.

Project Rodeo is a testing group made up of smaller teams. The “golden manual” team, for example, drives to instructions, follows traffic rules, and does not use any driver assistance systems. At the opposite end of the spectrum is the “critical intervention” team. More passengers than drivers, critical-intervention testers let the software control every aspect of driving and “intervene” only to prevent a collision.

One reason test drivers wait until the eleventh hour to take manual control is that it gives the software time to react and make the right, or wrong, decision. The more data collected, especially in real-world scenarios, the easier it is for engineers to tune and update the software.

“We want the data to show what led the car to make that decision,” said a former Autopilot engineer. “If you keep intervening too early, we won’t get to the point where we say, ‘OK, we understand what happened.’”

However, this means vehicles are allowed to run red lights, cross double yellow lines, ignore stop signs, and speed, all on public roads. Even when a situation became uncomfortable, drivers were told by management that they had taken control too early. As a result, Project Rodeo drivers, even those in non-critical-intervention roles, felt pressure to sustain risky driving situations, or sometimes create them outright, to test the software and keep their jobs.

John Bernal, a former test driver and data analyst, said he was never explicitly told to break the law to collect data, but it was clearly implied. “My training was to wait until the wheels touched the white line before I could hit the brakes,” he said.

On top of this, some drives were dedicated solely to training the software to recognize and adapt to “vulnerable road users” such as pedestrians, cyclists, and wheelchair users. One former tester said that while driving with his trainer, their car came within three feet of a cyclist before he hit the brakes.

“I remember this guy jumping off his bike,” he said. “He was terrified. The car rushed at him, and all I could do was slam on the brakes.” Apparently his trainer was actually pleased, telling him that his late reaction was “perfect” and exactly what they wanted him to do. “It seemed to me that the goal was to simulate an accident and then prevent it at the last second.”

Cruise and Waymo are also developing self-driving cars, but they say they conduct rigorous software testing in controlled environments or consider their autonomous systems “fundamentally different” from Tesla’s. Hmm, then why do these companies have the same problems with vehicles that can’t read the road on their own? In the case of Uber’s now-shuttered self-driving car division, the results were sometimes deadly.

“If you have a parent who holds the bike the whole time, they’ll never learn,” said the former Tesla engineer. Ultimately, data is king. For these self-driving tech companies, now at the mercy of shareholders, it’s a high-risk, high-reward environment that the public did not sign up for.