The car is absolutely, 100% capable of driving autonomously coast to coast, right now. Using that statement as a knock against Tesla is just silly.
Per Tesla's own literature, the driver must remain attentive and ready to retake control at a moment's notice. That's not "full autonomy" according to the SAE, NHTSA, etc. It's only Level 2.
Elon and Tesla have great marketing. Publicity and hype can generate billions for them, and they're not going to miss a chance to show off and build their brand. If they were confident the car could do it, I have a very hard time believing they wouldn't have done it already.

Level 2 autonomy is intended as a driving aid that reduces fatigue in simple situations, such as long stretches of highway cruising. It's not meant to replace all driving, and it's not capable of handling very complex situations, or in some cases even relatively simple ones, like roads with poor lane markings.
The problem is that people focus on the accidents. So yeah, Teslas on Autopilot have crashed, but guess what: there would have been 10 times more crashes if they hadn't been on Autopilot, and humans are just bad at wrapping their heads around that.
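For what it's worth, a claim like that comes from comparing crash rates per mile driven, not raw crash counts. A toy Python example (every number below is made up, purely to show the math):

```python
# How a "10x safer" claim gets computed: crashes per mile, not raw crash counts.
# Every number below is hypothetical, purely to illustrate the arithmetic.

autopilot_crashes, autopilot_miles = 10, 40_000_000
manual_crashes, manual_miles = 50, 20_000_000

ap_rate = autopilot_crashes / autopilot_miles    # crashes per mile with Autopilot engaged
manual_rate = manual_crashes / manual_miles      # crashes per mile without it

print(f"Autopilot: 1 crash per {autopilot_miles // autopilot_crashes:,} miles")
print(f"Manual:    1 crash per {manual_miles // manual_crashes:,} miles")
print(f"Manual rate / Autopilot rate: {manual_rate / ap_rate:.1f}x")   # 10.0x here
```

Raw counts look bad for Autopilot because the fleet drives so many miles; the denominator is the part people's intuition drops.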
Elon calls it a problem of how many 9s people expect. Going from 99% accident-free to 99.9% to 99.99% to 99.999% is a really big step each time.
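To put rough numbers on that (hypothetical, just to show the scaling):

```python
# Why each extra 9 is such a big step: every added 9 divides the
# failure rate by ten. Purely illustrative; only the scaling matters.

for nines in range(2, 6):          # 99%, 99.9%, 99.99%, 99.999%
    p_fail = 10 ** -nines          # failure rate implied by that many 9s
    print(f"{1 - p_fail:.3%} accident-free -> 1 failure per {round(1 / p_fail):,} situations")
```

Each step means the system has to handle ten times more situations between failures, which is part of why each extra 9 is so much harder to demonstrate than the last.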
The thing is, they wouldn't have to be crashing at all if Tesla had better controls in place, and many of those safeguards could probably be added with simple OTA updates. Every other system like this puts more restrictions on the user. GM's Super Cruise is also a Level 2 driving aid, but it uses an in-cabin camera that tracks the driver's eyes and makes sure their attention stays on the road. GM has logged far fewer miles than Tesla, but they also have zero accidents while the system is in use. Waymo operates at Level 4 autonomy (no safety driver at all) on public roads, and it has a far, far better safety record than Autopilot.
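To be clear about what "better controls" would even mean here, the gating logic itself isn't the hard part. A rough sketch (hypothetical hooks, not GM's or Tesla's actual code):

```python
import time

# A minimal sketch of a driver-attention gate for a Level 2 system.
# eyes_on_road(), warn_driver(), and disengage() are hypothetical hooks,
# not any manufacturer's real implementation; the point is how little
# logic the enforcement loop actually takes.

ATTENTION_TIMEOUT_S = 3.0        # how long the driver may look away before a warning

def supervise(eyes_on_road, warn_driver, disengage):
    last_attentive = time.monotonic()
    while True:
        if eyes_on_road():                      # camera says eyes are forward
            last_attentive = time.monotonic()
        else:
            away = time.monotonic() - last_attentive
            if away > 2 * ATTENTION_TIMEOUT_S:  # still ignoring the warning
                disengage()                     # hand back control / slow to a stop
                return
            if away > ATTENTION_TIMEOUT_S:
                warn_driver()                   # escalate: chime, flash, seat buzz
        time.sleep(0.1)                         # poll the camera ~10x a second
```

The enforcement loop is trivial next to the driving stack itself, which is why "they could ship stricter checks OTA" is a fair expectation.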
How they're wrecking matters too. Most of the Autopilot accidents I'm aware of involve a failure to react to a pretty basic driving situation. The human behind the wheel is to blame, but shouldn't these autonomous safety systems be able to see a stationary object the size of a semi blocking the road? Autopilot struggles with things that most human drivers (even partially distracted ones) can navigate without wrecking. Wrecks like https://www.youtube.com/watch?v=X3hrKnv0dPQ only make the general public, legislators, and regulators apprehensive about adopting this tech. It's better for everybody (users and manufacturers) to make sure this tech is developed and applied responsibly, even if that takes a couple of years longer. High-profile, easily avoidable wrecks a couple of times per year only hurt the cause.
So since you're clearly a skeptic, what's your answer? Statistics clearly show Autopilot is 10 times better right now; do you need it to be 100 times better before you believe in it? 1,000 times better? 1,000,000 times better? Personally, if I had to get in a taxi, I'd rather have it running on Tesla Autopilot than driven by a random human. Autopilot is getting better every day, while human drivers, if anything, are just getting more and more distracted.
The thing is, Tesla is contributing to those distracted drivers by putting a huge screen front and center. They're adding to the problem they claim to be solving by moving the HVAC controls into that screen, so changing the temperature or redirecting the vents means taking your eyes off the road. They make it worse by letting drivers use functions on that screen (unrelated to operating the vehicle) while the car is moving. Most of the Autopilot accidents I'm aware of involve drivers misusing the tech. Imagine how good Tesla's safety record could be if they put in place the kinds of safeguards others in the industry already use.
As for me, it's not about a single safety metric; it's about Autopilot routinely failing in ways a human driver wouldn't. I want to see it navigate difficult situations as well as or better than an average human driver (a pretty low bar). When I see videos of it failing to see a stationary object (like a tipped-over truck 9 ft tall and 20 ft wide) in broad daylight, that's not going to cut it for me to trust it with my life or my family's. I'd be more comfortable using it for its intended purpose (augmented cruise control), but that breeds complacency too, which leads to situations where the driver needs to take over but isn't prepared to do so.

There's a growing contingent of people in the car world who say we shouldn't allow public use of anything between Level 1 and Level 5 autonomy on public roads. There just aren't enough safeguards in place, and drivers assume the tech is more capable than it currently is.