Tesla Autopilot is beta technology. You should not trust it with your life.

You might feel like you can. You might feel like you're in good hands with the multi-sensor suite and the lightning fast reflexes of a computer and electronic drive system. You might be lulled into a sense of safety by the automatic braking and lane changing and adept handling of stop-and-go rush hour traffic that would make your blood boil in frustration.

But Autopilot is beta technology. You should not stop paying attention to the road.

How Autopilot works

Tesla Autopilot is best described as an advanced "super cruise" system. It is meant to take over some functions on the highway and in similar driving situations, but it will keep going until you tell it to stop.

Pulling twice on the stalk on the lower left side of the steering wheel column engages Autopilot. The system relies on a combination of forward-looking long-range radar, 12 ultrasonic sensors positioned around the car, a camera mounted at the top of the windshield, precision GPS, and high-resolution digital maps.

The radar is used to detect objects in front of the car with a range of around 150 meters; it discerns what kind of vehicle is in front of it based on the radar profile — sedans, tractor trailers, and motorcycles are all generally accurately rendered on the instrument cluster display. The radar also picks up a broader picture of the environment in front of the car, including overhead road signs, most of which are ignored as not relevant to the driving of the car. By analyzing this information over time, the radar can determine the relative speed of other vehicles on the road. The radar unit processes this information and relays it to the Autopilot control computer to make decisions.
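
As an illustration of that over-time analysis, here's a minimal sketch in Python. All the names are invented (Tesla's firmware is not public), and production automotive radar typically measures closing speed directly via the Doppler effect; differencing two range samples is just the simplest way to show the idea:

```python
from dataclasses import dataclass

@dataclass
class RadarReading:
    timestamp_s: float  # when the sample was taken, in seconds
    range_m: float      # distance to the tracked object, in meters

def relative_speed_mps(prev: RadarReading, curr: RadarReading) -> float:
    """Approximate relative speed of a tracked object in m/s.

    Negative means the gap is shrinking (we are closing on it). A real
    tracker would filter noise rather than difference two raw samples.
    """
    dt = curr.timestamp_s - prev.timestamp_s
    if dt <= 0:
        raise ValueError("readings must be in time order")
    return (curr.range_m - prev.range_m) / dt

# A car 100 m ahead closes to 97 m over one second: -3 m/s.
print(relative_speed_mps(RadarReading(0.0, 100.0), RadarReading(1.0, 97.0)))
```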

Ringing the car are 12 ultrasonic sensors. These sensors, essentially compact sonar units, have a range of 16 feet, giving the car awareness of objects in its immediate vicinity. The car uses the sensors in automatic parking, in reverse remote piloting (the Summon feature), and in Autopilot to determine if the adjacent lanes are clear. The sensors have no awareness of what an object is or precisely where it is, just that there is something in their vicinity returning the sonar signal. If an object is picked up by two sensors, it's assumed to be in the overlapping area of that pair.
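
A rough sketch of that pair-overlap inference, with sensor names and layout invented for illustration:

```python
from typing import Dict, Optional

SENSOR_RANGE_FT = 16.0  # approximate range of each ultrasonic unit

def locate(echoes: Dict[str, Optional[float]]) -> str:
    """Hypothetical localization from bearingless echo distances.

    Each sensor reports only "something is N feet away in my cone",
    never a direction. When two sensors both return an echo, the object
    is assumed to sit in the overlap of their detection cones.
    """
    hits = {name: d for name, d in echoes.items() if d is not None}
    if not hits:
        return "nothing within ultrasonic range"
    if len(hits) == 1:
        name, dist = next(iter(hits.items()))
        return f"object ~{dist:.1f} ft away, somewhere in {name}'s cone"
    return "object assumed in the overlap of " + " and ".join(sorted(hits))

print(locate({"left_front": 9.5, "left_rear": 10.1, "rear_left": None}))
```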

The camera sitting at the top of the windshield is supplied by Mobileye. It watches the environment ahead of the car, picking up on road signs (the system currently acknowledges only speed limit signs), lane markings, and obstructions in the roadway for the Automatic Emergency Braking system.
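
To make that "only speed limit signs matter" behavior concrete, here's a hedged sketch of a camera-event handler. The names and the capping rule are assumptions for illustration, not Tesla's actual API:

```python
def on_sign_detected(sign_type: str, value_mph: int, state: dict) -> None:
    """Hypothetical handler for a camera sign detection.

    Speed limit signs are the only sign type the system currently acts
    on; anything else the camera classifies is simply ignored here.
    """
    if sign_type != "speed_limit":
        return  # lane markings and obstructions feed other subsystems
    state["speed_limit_mph"] = value_mph
    # Assume the cruise target is capped at the newly posted limit.
    state["target_mph"] = min(state["driver_set_mph"], value_mph)

state = {"driver_set_mph": 70, "speed_limit_mph": 65, "target_mph": 65}
on_sign_detected("speed_limit", 55, state)
print(state["target_mph"])  # 55: the car slows for the new limit
```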

The car uses high-resolution maps of roadways to help look ahead on the road. The maps work in conjunction with the lane information from the camera, and will perform actions like slowing the car when coming up on a sharp turn in the road.
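
The map-driven slowing follows from simple physics: through a curve of radius r, keeping lateral acceleration under some limit a means keeping speed under the square root of a times r. A sketch, with an invented comfort limit (the real tuning is not public):

```python
import math

COMFORT_LAT_ACCEL = 3.0  # m/s^2; an invented comfort limit, not Tesla's

def curve_speed_mph(radius_m: float) -> float:
    """Fastest speed through a curve of the given radius (read ahead
    from the map) that keeps lateral acceleration under the limit."""
    v_mps = math.sqrt(COMFORT_LAT_ACCEL * radius_m)
    return v_mps * 2.237  # convert m/s to mph

# Approaching a 150 m radius bend, the car would ease down to ~47 mph
# before entry, using curve geometry from the high-resolution map.
print(round(curve_speed_mph(150.0)))
```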

All of this combined makes for a car that can slow and maintain a safe distance behind other vehicles it approaches on the highway, engage emergency braking, follow the curves of the lane markings, obey speed limits, and change lanes when triggered by the driver. It's cruise control, but better.

The NHTSA defines five levels of driving autonomy, where Level 0 is full driver control and Level 4 is a car that can safely drive from A to B with no input from the driver — or no driver at all. Tesla classifies Autopilot as Level 2 autonomy: the car takes control of some functions, but the driver is ultimately in charge and responsible.
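
Expressed as data, that classification looks like the following (the levels are per NHTSA's 2013 policy; placing Autopilot at Level 2 is Tesla's own call):

```python
from enum import IntEnum

class NhtsaLevel(IntEnum):
    """NHTSA's 2013 vehicle automation levels, 0 through 4."""
    NO_AUTOMATION = 0         # driver controls everything, all the time
    FUNCTION_SPECIFIC = 1     # one function automated, e.g. cruise control
    COMBINED_FUNCTION = 2     # steering + speed together; driver responsible
    LIMITED_SELF_DRIVING = 3  # car drives itself, driver on standby
    FULL_SELF_DRIVING = 4     # no driver input needed, or no driver at all

AUTOPILOT = NhtsaLevel.COMBINED_FUNCTION
print(f"Autopilot is Level {int(AUTOPILOT)}")  # Level 2
```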

As such, the branding of Autopilot might seem like false advertising. You say the word "Autopilot" and people picture a car that can fully drive itself. That simply isn't the case. Airplane pilots know the true nature of the term autopilot — you pilot the plane on takeoff and landing and autopilot manages the mundane task of cruising through the skies. But it doesn't really matter how a small portion of the population knows "autopilot"; when you mention it to a layperson their first impression is that the car can fully drive itself from A to B, which a Tesla currently cannot.

Tesla officially calls Autopilot a "driver assist system". Read that again. Driver. Assist. Given that, perhaps the name "Autopilot" is something of a misnomer. But that doesn't change what it does.

Autopilot is beta technology. It has limited functionality and awareness.

Using Autopilot

On enabling Autopilot, the driver is presented with a warning that the system is a beta, along with what it can do and what their responsibilities are (everything). In short:

  • The driver is ultimately responsible for actions the car takes.
  • Autopilot can maintain the current lane and spacing with slower cars, and can change lanes when prompted.
  • The driver must maintain their hands on the steering wheel.
  • If the car encounters a situation that Autopilot is not equipped to handle, it will hand control back over to the driver with little warning.
  • If the driver fails to heed the warnings of the car, it will activate the hazard lights and slowly come to a complete stop (see the sketch after this list).
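
Here's a minimal sketch of that escalation path as a state machine. Every state name, condition, and the timeout below is invented for illustration; Tesla publishes no such interface:

```python
import enum

class APState(enum.Enum):
    ENGAGED = enum.auto()      # steering and managing speed normally
    WARNING = enum.auto()      # beyond its limits: alerting the driver
    DISENGAGED = enum.auto()   # the human has taken control back
    FAILSAFE = enum.auto()     # driver unresponsive: hazards on, stopping

def next_state(state: APState, can_handle: bool,
               driver_took_wheel: bool, warning_elapsed_s: float) -> APState:
    """One tick of a hypothetical escalation loop."""
    if state is APState.ENGAGED:
        return APState.ENGAGED if can_handle else APState.WARNING
    if state is APState.WARNING:
        if driver_took_wheel:
            return APState.DISENGAGED  # handed back, with little warning
        if warning_elapsed_s > 10.0:   # invented timeout
            return APState.FAILSAFE    # hazard lights, slow to a stop
        return APState.WARNING
    return state  # DISENGAGED and FAILSAFE persist until the driver acts

print(next_state(APState.ENGAGED, can_handle=False,
                 driver_took_wheel=False, warning_elapsed_s=0.0))
```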

Upon activating Autopilot while driving, the car again prompts the driver to keep their hands on the steering wheel and be prepared to take control at a moment's notice.

And then the car just ... drives. It can really lull you into a false sense of security — the car adeptly keeps itself in the current lane even as the road curves, maintains its speed or slows if it encounters an obstacle to that speed, and changes lanes with confidence when you tell it to do so. It even slows itself when it picks up on changed speed limit signs.

It's oddly relaxing. At first, you feel on edge as the car just cruises along in its current lane, and then the highway turns and it freaks you out as the steering wheel turns itself. But then the car just keeps on driving like it's no big deal. It cruises along, waiting for you to give it a command.

Autopilot promotes you to the role of vehicle supervisor. Your job is no longer controlling the vehicle's speed and direction — you're now observing, making sure the car is doing what it's supposed to do, and intervening when you feel it is not.

It doesn't take long as a driver of a conventional vehicle for maintaining a lane and speed to become tasks you put little, if any, conscious thought into. They're just basic parts of driving, things you do without thinking. But you are thinking. Your brain is actively processing a constant stream of incoming information, from the condition of the road to the vehicles around you to road signs and billboards and the color of the sky to the sound and vibrations of your car — all of it feeding into processes, learned to the point of muscle memory, that allow you to make the minute adjustments needed to steer your car.

After a while, the confidence with which Autopilot drives a Tesla stops being so weird. It doesn't take long to get used to it, to be willing to cede control to the car on long stretches of highway or when stuck in rush hour traffic. It's mind-blowing at first and then slowly becomes mundane.

Autopilot is beta technology. Don't let it lull you into a false sense of safety.

Autopilot's faults

Autopilot is not a perfect system. In fact, it's far from it. It is equipped to handle a multitude of driving conditions and a huge number of scenarios. But Autopilot is not smart; it is not an artificial intelligence. It applies the inputs it receives from the sensor suite to the scenarios it has been programmed to deal with and, if it finds a solution it can apply with a certain degree of confidence, it will. If it cannot find a solution, it hands control back to the driver.
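
That loop — apply a programmed response only above some confidence, otherwise give control back — might look like this in miniature. Every name and the threshold are hypothetical:

```python
from typing import Callable, List, Optional, Tuple

CONFIDENCE_THRESHOLD = 0.9  # invented; the real gating is not public

# A "scenario" pairs a matcher, which scores how well the current sensor
# inputs fit a pre-programmed situation, with the action for it.
Scenario = Tuple[Callable[[dict], float], str]

def plan_action(inputs: dict, scenarios: List[Scenario]) -> Optional[str]:
    """Return the most confident programmed action, or None to hand
    control back to the driver. There is no reasoning here, only
    matching against situations the system was built to recognize."""
    best_conf, best_action = 0.0, None
    for matcher, action in scenarios:
        conf = matcher(inputs)
        if conf > best_conf:
            best_conf, best_action = conf, action
    return best_action if best_conf >= CONFIDENCE_THRESHOLD else None

scenarios = [(lambda s: 0.95 if s["lead_car_slowing"] else 0.0, "brake")]
print(plan_action({"lead_car_slowing": True}, scenarios))   # brake
print(plan_action({"lead_car_slowing": False}, scenarios))  # None
```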

Autopilot has an awareness of its immediate surroundings and what is in front of it, but that awareness is not complete and total. The ultrasonic sensors can tell if something is returning their signal within roughly 16 feet of the car, but they cannot tell what it is or precisely where it is, just that there is something in their range and how far away that something is. The radar has a range of several hundred feet, but it can only see what's in the line of sight from its position in the nose of the car — it cannot see through cars or over hills, and it can only make assumptions about what vehicles, environmental objects, and obstacles are in front of it.

The car has practically no rear visibility at highway speeds. The quartet of sensors in the rear bumper can see 16 feet to the rear, and rarely will you find a car in that space when you're on the highway. It's up to the driver to verify that lanes are clear of approaching traffic when triggering a lane change.
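
A sketch of why the human check is load-bearing for lane changes. The 16-foot figure is from above; the gating logic is invented:

```python
REAR_SENSOR_RANGE_FT = 16.0

def lane_change_ok(adjacent_echo: bool, driver_checked_mirror: bool) -> bool:
    """The ultrasonics can veto a lane change only if something is
    already within ~16 ft of the car's flank. A car closing at 20 mph
    (about 29 ft/s) covers that entire zone in roughly half a second,
    so anything approaching from farther back is invisible to the car,
    and only the driver's own check can catch it."""
    return (not adjacent_echo) and driver_checked_mirror

# No echo, but the driver hasn't checked: the change still shouldn't go.
print(lane_change_ok(adjacent_echo=False, driver_checked_mirror=False))
```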

Tesla Autopilot simultaneously gives you more information about what it sees than any competing system, and yet not enough to instill 100% confidence.

Autopilot also has some occasionally frustrating tendencies, like coming to a near complete stop when a vehicle is turning off the road, instead of behaving like a normal driver and accelerating once it's clear the vehicle will be out of the way.

Tesla's remote over-the-air updates have also frequently changed the behavior of Autopilot without notice. Most of the time this is for the better, but when a driver has become accustomed to a specific behavior (e.g. the car hugging the left side of the road), a change to that behavior without warning comes as a surprise.

The most frustrating and mystifying part of Autopilot is its level of communication. It shows on the instrument cluster display exactly what vehicles it has registered in its vicinity and which lane markings and lead vehicle it is tracking. It's a level of data that no other adaptive cruise control or lane-keeping system provides, but it's also lacking in communicating what the car is going to do. If you're coming up on a slower car in your lane, Autopilot gives you no indication of how far out it is from slowing down to match its speed — it just goes until it decides to slow down. It can be disconcerting; where you might have lifted off the accelerator to let your car coast until the other car matches speed, a Tesla under Autopilot doesn't tell you what it's thinking. This makes the observation aspect occasionally stressful — you're watching the system, poised to take control, when the car finally opts to brake.
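
What's missing is essentially a time-to-close figure. The system presumably computes something like it internally but never surfaces it; this sketch computes the number a driver is left guessing at:

```python
def seconds_to_close(gap_ft: float, closing_speed_mph: float) -> float:
    """Seconds until you reach the slower car if neither of you changes
    speed. Autopilot draws the tracked car on the display but gives no
    hint of when it plans to brake; this is the quantity that would
    answer that question."""
    if closing_speed_mph <= 0:
        return float("inf")  # not closing on anyone
    closing_fps = closing_speed_mph * 5280.0 / 3600.0  # mph -> ft/s
    return gap_ft / closing_fps

# 300 ft behind a car you're overtaking at 10 mph: about 20 s of margin.
print(round(seconds_to_close(300.0, 10.0), 1))
```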

There's also no communication when the car is verging on handing control back to the driver. It will carry on driving, not communicating that anything is wrong, until it finally reaches the limits of its programming and abruptly hands over control.

Autopilot is beta technology. It is not a flawless system.

The human component

As mentioned earlier, Tesla Autopilot is not an artificial intelligence. It is not capable of making judgment calls — it has a decision tree or matrix that it works through for every action. It cannot make the call of whether or not to drive itself into a wall instead of plowing through a class of schoolchildren. That is on the driver.

Ultimately, the human sitting in the driver's seat holds the final authority and responsibility for what the car does. You may not have the same 360-degree awareness and electron-fast reflexes as a Tesla, but you have the gift of judgment. You can perceive what is on the road and make predictions about human behavior that Autopilot today cannot.

More importantly, you can and should take control when Autopilot isn't reacting in a safe manner. We've already seen it a few times — with Teslas driving into the rear of an obscured stalled vehicle and, most recently, misinterpreting a tractor-trailer crossing the road and striking it at full speed. The latter incident resulted in the death of the driver, the first known Autopilot fatality, and one that comes after more than 130 million miles of otherwise fatality-free Autopilot driving.

Here's what we know about the Autopilot accident that took the life of Tesla driver Joshua Brown:

According to eyewitness accounts, his car was traveling east at about 85mph down U.S. 27A outside of Williston, Florida. U.S. 27A in this area is a four-lane limited-access divided highway with a speed limit of 65mph. The driver of a semi-truck heading west made a turn across the oncoming lanes at the intersection of NE 140th Ct.

According to the truck driver's account of the incident, and Tesla's own data, Brown's Tesla did not slow on approaching the truck that had crossed the eastbound lanes. The truck was riding high, and the nose of the Model S passed beneath the trailer before the upper windshield made contact at full speed. Neither the Automatic Emergency Braking system nor Brown activated the brakes prior to impact.

While we don't have information on it, it's safe to assume that Brown died on impact. The Tesla's roof was sheared off and the car continued under the trailer and down the road for a few hundred yards before veering off the highway and into a field where it finally came to a stop.

According to the data Tesla relayed from the accident, the Autopilot camera did not see the white trailer against the brightly lit sky, while the Autopilot radar interpreted the trailer's high ride height as an overhead road sign. Together, these signals told the Autopilot system it was safe to carry on, and it did not activate the brakes.
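
Based on Tesla's description, the failure reduces to two independent channels each reporting nothing to brake for, so even a rule where either channel alone can trigger braking fires nothing. A simplified reconstruction (the real fusion logic is not public):

```python
def aeb_should_brake(camera_sees_obstacle: bool, radar_class: str) -> bool:
    """Simplified reconstruction of the reported failure mode.

    Radar returns classified as overhead road signs are deliberately
    ignored so the car doesn't brake under every bridge and gantry.
    In this crash, per Tesla: the camera lost the white trailer against
    the bright sky, and the radar filed the high-riding trailer under
    exactly that ignored class.
    """
    radar_sees_obstacle = radar_class not in ("overhead_sign", "none")
    return camera_sees_obstacle or radar_sees_obstacle

# Both channels said "clear", so no braking was commanded.
print(aeb_should_brake(camera_sees_obstacle=False,
                       radar_class="overhead_sign"))
```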

This is the point where the driver, as the ultimate authority and responsible entity in the car, should have taken control and applied the brakes.

But he did not.

According to eyewitness reports, Brown was watching a movie on a portable DVD player while driving under Autopilot. The Florida Highway Patrol reports having recovered such a portable DVD player from the wreckage of the car. While we can only go by the witness account here, if Brown was indeed watching a movie while cruising on Autopilot, we can only call that incredibly reckless.

It bears repeating: Autopilot is a driver assist system. Key words: driver and assist. It is not yet meant to replace the driver, nor is it meant to operate without supervision. It is a beta, and it can and will make mistakes that the driver needs to be prepared to correct at a moment's notice — whether or not the car gives warning that it is ill-equipped to handle the situation it faces.

Autopilot is beta technology. It is dependent on human supervision.

Trusting technology

There's a lot to be said for trusting technology. We do it all the time — we trust the control systems of nuclear power plants to manage their radioactive fuel safely, we trust cruise control to maintain the speed we specify and not deviate wildly from it, and we trust an oven to heat to a set temperature, hold it, and not ruin the roast.

But all of this still requires human supervision. There are humans at the nuclear power plant monitoring the fuel rod temperatures and ensuring that things don't spiral out of control into a meltdown situation. Human drivers sit in the driver's seat of their cars under cruise control, periodically checking the speedometer to ensure that the car is still at the same speed, all while controlling the direction of the car and ensuring that they safely avoid any obstacles. The human cook is in the kitchen or nearby, checking up on the roast to ensure that it's cooking as intended and the oven isn't on fire.

All of these systems, from the simplicity of the oven to the complexity of the nuclear reactor, are programmed to handle an array of situations and are equipped with fail-safes to ensure safe operation. There are always going to be unforeseen situations — a tsunami overwhelming a nuclear power plant, for example — that these systems just are not designed to handle, and it takes the intervention of a human to prevent catastrophe. Until we have a true artificial intelligence that we're confident can make decisions as good as or better than a human, faster and more reliably (which will require an expanded array of inputs and increased quality of those inputs), humans are ultimately in charge of these machines.

It's not Whirlpool's fault if your roast was ruined because you didn't pay attention to temperature or time. It's not Westinghouse's fault if humans failed to react in time to stop a nuclear meltdown before it happened. It's not Tesla's fault if a car under Autopilot drives itself into an accident if the driver wasn't paying attention to the road.

Autopilot is beta technology. You, the human, are in command.