Every morning, at about 8 a.m., Anthony Levandowski walks out of his house in Berkeley and folds his six-foot-six-inch frame into the driver’s seat of his white Lexus. Levandowski is embarking on his daily commute to work. It’s the most ordinary, familiar moment there is. Most of us perform this ritual five times a week, 50 weeks out of the year. Levandowski’s commute, however, is decidedly different. He’s got a chauffeur, and it’s a robot.
Levandowski backs out of his suburban driveway in the usual manner. By the time he points his self-driving car down the street, it has used its GPS and other sensors to determine its location in the world. On the dashboard, right in front of the windshield, is a low-profile heads-up display. MANUAL, it reads, in a sober sans serif font, white on black. But the moment Levandowski enters the freeway ramp near his house, a colorful graphic appears. It’s a schematic view of the road: two solid white vertical lines marking the boundaries of the highway and three dashed lines dividing it into four lanes. The message now reads GO TO AUTODRIVE LANE; there are two on the far side of the freeway, shown in green on the schematic. Levandowski’s car and those around him are represented by little white squares. The graphics are reminiscent of Pong. But the game play? Pure Frogger.
There are two buttons on Levandowski’s steering wheel, OFF and ON, and after merging into an auto-drive lane, he hits ON with his thumb. A dulcet female voice marks the moment by enunciating the words AUTO DRIVING with textbook precision. And with that, Levandowski has handed off control of his vehicle to software named Chauffeur. He takes his feet off the pedals and puts his hands in his lap. The car’s computer is now driving him to work.

Self-driving cars have been around in one form or another since the 1970s, but three DARPA Grand Challenges, in 2004, 2005, and 2007, jump-started the field. Grand Challenge alumni now populate self-driving laboratories worldwide. It’s not just Google that’s developing the technology, but also most of the major car manufacturers: Audi, Volkswagen, Toyota, GM, Volvo, BMW, Nissan. Arguably the most important outcome of the DARPA field trials was the development of a robust and reliable laser range finder. It’s the all-seeing eye mounted on top of Levandowski’s car, and it’s used by virtually every other experimental self-driving system ever built.
This year will mark another key milestone in self-driving technology. The National Highway Traffic Safety Administration (NHTSA) is widely expected to announce standards and mandates for car-borne beacons that will broadcast location information to other vehicles on the road. The beacons will warn drivers when a collision seems imminent—when the car ahead brakes hard, for example, or another vehicle swerves erratically into traffic. Automakers may then use this information to take the next step: program automated responses.
Levandowski’s commute inside his self-driving car is 45 miles long, and if Chauffeur were perfect, he might use the time napping in the backseat. In reality, Levandowski has to stay awake and behind the wheel, because when Chauffeur encounters a situation in which it’s slightly unsure of itself, it asks him to retake control. Following Google policy, Levandowski drives through residential roads and surface streets himself, while Chauffeur drives the freeways. Still, it’s a lot better than driving the whole way. Levandowski has his hands on the wheel for just 14 minutes of his hour-long commute: at the very beginning, at the very end, and during the tricky freeway interchanges on the San Mateo Bridge. The rest of the time, he can relax. “Automatic driving is a fundamentally different experience than driving myself,” he told automotive engineers attending the 2012 SAE International conference. “When I arrive at work, I’m ready. I’m just fresh.”
Levandowski works at Google headquarters in Mountain View, California. He’s the business lead of Google’s self-driving-car project, an initiative that the company has been developing for the better part of a decade. Google has a small fleet of driverless cars now plying public roads. They are test vehicles, but they are also simply doing their job: ferrying employees back and forth from work. Commuters in Silicon Valley report seeing one of the cars—easily identifiable by a spinning turret mounted on the roof—an average of once an hour. Google itself reports that collectively the cars have driven more than 500,000 miles without crashing.

At a ceremony at Google headquarters last year, where Governor Jerry Brown signed California’s self-driving-car bill into law, Google co-founder Sergey Brin said, “You can count on one hand the number of years until ordinary people can experience this.” In other words, a self-driving car will be parked on a street near you by 2018. Yet releasing a car will require more than a website and a “click here to download” button. For Chauffeur to make it to your driveway, it will have to run a gauntlet: it must navigate a path through a skeptical Detroit, a litigious society, and a host of technical catch-22s.
Right now, Chauffeur is undergoing what’s known in Silicon Valley as a closed beta test. In the language particular to Google, the researchers are “dogfooding” the car—driving to work each morning in the same way that Levandowski does. It’s not so much a perk as it is a product test. Google needs to put the car in the hands of ordinary drivers in order to test the user experience. The company also wants to prove—in a statistical, actuarial sense—that the auto-drive function is safe: not perfect, not crash-proof, but safer than a competent human driver. “We have a saying here at Google,” says Levandowski. “In God we trust—all others must bring data.”
Currently, the data reveal that so-called release versions of Chauffeur will, on average, travel 36,000 miles before making a mistake severe enough to require driver intervention. A mistake doesn’t mean a crash—it just means that Chauffeur misinterprets what it sees. For example, it might mistake a parked truck for a small building or a mailbox for a child standing by the side of the road. It’s scary, but it’s not the same thing as an accident.
The software also performs hundreds of diagnostic checks a second. Glitches occur about every 300 miles. This spring, Chris Urmson, the director of Google’s self-driving-car project, told a government audience in Washington, D.C., that the vast majority of those glitches are nothing to worry about. “We’ve set the bar incredibly low,” he said. For the errors worrisome enough to require human hands back on the wheel, Google’s crew of young testers have been trained in extreme driving techniques—including emergency braking, high-speed lane changes, and preventing and maneuvering through uncontrolled slides—just in case.