100% misleading at best
Go to their flashy Autopilot webpage with lots of slick graphics:
https://www.tesla.com/autopilot

The primary focus is a 2-minute video (sped up) that shows a Tesla driving through a city or town. The driver never touches the wheel or the pedals. The footage is sped up from what's probably more like a 10-minute drive in real time. So several miles, in something like 10 minutes, and zero input from the driver.
But if you read this text-based page:
https://www.tesla.com/support/autopilot

Tesla clearly says "Autopilot and Full Self-Driving Capability are intended for use with a fully attentive driver, who has their hands on the wheel and is prepared to take over at any moment."
"While using Autopilot, it is your responsibility to stay alert, keep your hands on the steering wheel at all times and maintain control of your car."
So at the very least, they're sending mixed signals to consumers. Marketing says "look what we can do!" while the legal department says "Hey, don't do what we just showed you we can do in our own marketing." And of course a skeptic could say that they're sending those mixed signals knowing full well that consumers will push the limits, and that Tesla is fine with that because it gets them more data to aid their quest to be the first fully self-driving tech on the market. It's just another case of big tech using consumers as beta testers so they don't have to fund the research themselves.
Tesla could easily do an OTA update that required more supervision from the drivers. They could do the same thing and limit the use of the tech to pre-mapped areas so that the car has to react to fewer unknowns in real time. They do neither. Wonder why?
Autopilot's biggest advantage is that it's limitless to its users right now. Its biggest drawback is that it's limitless to its users right now. GM's Super Cruise, and Ford's upcoming tech, seem like much more responsible applications of this technology.
Look at the safety records of Autopilot vs Super Cruise or other driver-assist tech and it's not close. It's much, much easier to misuse or abuse Autopilot than other similar systems, and that leads to serious flaws being tested in public. There are multiple cases of Autopilot failures contributing to accidents that a reasonably observant human wouldn't have gotten into. This is just the most recent:
https://twitter.com/i/status/1267304975069261824