Your Tesla just wrecked. Was it Autopilot? Or did you make a mistake like a normal human?

With more Autopilot-equipped Tesla vehicles on the road than ever, it might seem like an easy cop-out to blame the self-driving functions of the car for your wreck. After all, it'll draw media attention to the wreck while simultaneously deflecting blame from your actions as the driver. Except that this tactic is almost certain to backfire — Tesla will not hesitate to pull the car's automated logs and call out your shenanigans.

That's not to say that there aren't legitimate concerns about the potential for a car operating under Autopilot or a similar competing system to take an unexpected action leading to a wreck. We're still in the early days of self-driving car technology, as the fatal accident involving a Model S operating on Autopilot with an allegedly inattentive driver all too clearly demonstrated. Even so, the 360-degree awareness and instant reflexes of Autopilot make it a safer driver than you.

Tesla Autopilot is a beta system, one that requires attentiveness on the part of the driver. Tesla makes this very clear when the feature is first activated, and again every time the system is engaged. The human driver is in charge; they bear the ultimate authority over, and responsibility for, the car's actions, even if Autopilot is turned on.

Here's the thing: aside from the deadly Model S wreck that's been pinned on the reckless actions of the driver, not one of the accidents publicly blamed on Autopilot has turned out to actually be the fault of Autopilot. Take the Model X owner who blamed Autopilot rather than the simple and common pedal mix-up that sent the car into a building. Or the recent rollover accident in which there's no indication that Autopilot was engaged at all. Or the time a Model S drove right into a car parked on the side of a highway (with only the Traffic-Aware Cruise Control system engaged, not the full auto-steering Autopilot).

It's easy to place blame on Autopilot. It's a new technology, and a good number of Tesla owners apparently don't fully grasp its capabilities or how it functions. There's nothing wrong with not understanding how a system works (billions of people have no concept of how a computer or a car actually works) so long as you understand how to operate the system and recognize its limitations.


The problem comes when that shallow knowledge base is called upon to explain an unexpected occurrence. When you don't understand how a system works and what it is capable of, the system becomes the default blame target. You can see it happen all the time with computers: somebody who doesn't understand how computers work fat-fingers a command in a window they never should have opened in the first place, or clicks the wrong button, and then blames the computer for the missing file or strange settings or compromised security, anything but their own unknowingly incorrect action. The same thing happens in technological, economic, and political systems alike. Not understanding how the system works is fine 99.9% of the time, but when something goes wrong, odds are it's because of a human's unknowing error.

New technology is too easy a blame target. We're hardwired to believe that we took the correct action, even if we did the exact opposite of what we think we did. That's especially the case after a violent accident, where heat-of-the-moment memories are often foggy and misinterpreted (I know post-wreck confusion all too well). Much of our human experience and memory is inferred: our brains are great at recognizing patterns and filling in the blanks, and most of our lives fit into these sorts of patterns.

Driving very much falls into a practically instinctual pattern, so when you're forced to think about what happened after an accident, you're almost certainly going to recall doing things correctly, because that's what you've always done without even thinking about it. You don't think about turning on the turn signal to change lanes; you think "change lanes," and your brain automatically goes through the process of checking the mirrors and looking over your shoulder to see if the lane is clear, hitting the turn signal, and then steering the car into the next lane. That automatic pattern execution makes driving predictable and far less stressful and tiring than it would be if our conscious brains were fully engaged in every driving process, as they are when we first learn to drive. We learn what to tune out, what to pay attention to, what we can do without thinking about it, and what inputs should trigger our full and undivided attention.

So when something goes wrong and your car ends up upside down or in the side of a building, you have to figure out what happened. You've driven thousands of times before, so your brain fills in your memory with exactly what you'd expect: you performing mundane driving tasks correctly, even if that's exactly what you didn't do. It only takes one slip-up on the road for your drive to end in catastrophe, and your feeble human brain is going to focus more on the immediate concern of correcting the car's wayward trajectory than on what went wrong.

And odds are you're going to make things worse. Despite all of our training on how to handle everyday driving concerns, it's difficult to achieve that same sort of internalization through physical repetition for emergencies like safely guiding a car through a skid or correcting a rollover. If your car is accelerating out of control as you press on the brake pedal, your first instinct will be to press harder on the brake pedal, not to question whether your foot might actually be on the accelerator.

Much of our human experience is inferred: our brains are great at recognizing patterns and filling in the blanks. We unconsciously assume that we acted according to that pattern.

And so, after filling in the blanks with what we believe to be the correct, logical information ("I was pressing the brake pedal the entire time!"), the only plausible explanation left is that the car is at fault. Lo and behold, there's a fancy self-driving system that you know kind of works ... it must be at fault here, right? Obviously.

And then that statement gets recorded in the police report, picked up by the local press, and very quickly amplified by the national press. Given Tesla's status as the automotive industry darling, it's no surprise to see sensationalist stories about how a seemingly fantastical system like Autopilot could actually be at fault. The 24-hour news cycle operates too quickly for fact-checking and technical consultation, and so we end up with overblown stories about Autopilot car wrecks and the dangers of fancy new technology. It's not inherently malicious, but there is certainly an incentive, in the form of reader and viewer counts, to get stories about frightening technology out quickly.

Which then, inevitably, leads to Tesla feeling like they have to make a public statement. And Tesla responds by pulling the individual vehicle's logs and verifying that Autopilot was in fact not activated. Or that the driver wasn't paying attention, as they are repeatedly advised to do. Or that it was a freak accident beyond Autopilot's programming, and, again, the driver should have intervened anyway.

Autopilot is far from infallible, and it will likely be many years before we can safely tune out and let the car do all of the driving. But when your Tesla wrecks, even if Autopilot is engaged, it's probably your fault anyway.