Author Topic: What does the Tesla autopilot death mean for the future of self-driving cars  (Read 29843 times)

BCBiker

  • Stubble
  • **
  • Posts: 187
  • Location: Colorado
    • Business Casual Biker - Health, Wealth, and Mental Stealth BTYB Bicycle Commuting
It has been interesting to see how people have reacted to the death of a Tesla driver using autopilot. I typed up my thoughts on my blog, and I'm curious what Mustachians have to say about the event. I'm looking forward to seeing where this thread goes.
http://wp.me/p5wdgE-4b

Lagom

  • Handlebar Stache
  • *****
  • Posts: 1258
  • Age: 40
  • Location: SF Bay Area
Probably won't change things much, tbh. Maybe that's wishful thinking on my part, but the math will overwhelmingly support self-driving tech when it comes to safety, so the only hurdle (albeit a big one) will be overcoming people's visceral dislike of not being in control. But we already trust airline pilots and Uber drivers, so I would like to think it's not so much of a leap. Sooner or later, self-driving cars will be pervasive. I, for one, can't wait until that day.

SoccerLounge

  • Stubble
  • **
  • Posts: 240
Fully agree. And that airline pilot comparison is extra pertinent, were the public only to realize just how much of modern airline flight is automated - including, frequently, takeoffs and landings ;)

trashmanz

  • Bristles
  • ***
  • Posts: 338
Forward progress won't be slowed.

ransom132

  • 5 O'Clock Shadow
  • *
  • Posts: 91
Wasn't the driver killed because the truck in front of him moved into the wrong lane? I mean, it was more the other person's fault than the autopilot driver's...unless I didn't read correctly.

seattlecyclone

  • Walrus Stache
  • *******
  • Posts: 7266
  • Age: 39
  • Location: Seattle, WA
    • My blog
Probably won't change things much, tbh. Maybe that's wishful thinking on my part, but the math will overwhelmingly support self-driving tech when it comes to safety, so the only hurdle (albeit a big one), will be overcoming people's visceral dislike of not being in control. But we already trust airline pilots and uber drivers, so I would like to think it's not so much of a leap. Sooner or later, self-driving cars will be pervasive. I, for one, can't wait until that day.

Agreed. Humans die in human-controlled vehicles all the time. Self-driving cars don't need to be perfect to be better than humans.

dandarc

  • Walrus Stache
  • *******
  • Posts: 5488
  • Age: 41
  • Pronouns: he/him/his
Wasn't the driver killed because the truck in front of him went on the wrong lane? I mean it was more the other person's fault than it was the auto pilot driver...unless I didn't read correctly.
Eh. I read somewhere that the car ran up under the truck that cut across the road, then kept driving for a while after the top of the car was sheared off.

The car basically couldn't "see" the trailer across the road due to insufficient color contrast. The driver likely wasn't paying attention either, which of course is part of the allure of autopilot. Tesla also claims that if the impact had been with a lower part of the trailer, the outcome might have been better.

So a better autopilot, or a more attentive driver in either vehicle, would have made a difference here. I'm betting on autopilot improving faster than human drivers.

wienerdog

  • Pencil Stache
  • ****
  • Posts: 587
Wasn't the driver killed because the truck in front of him went on the wrong lane? I mean it was more the other person's fault than it was the auto pilot driver...unless I didn't read correctly.

From my understanding, the truck was turning left at an access road to get off the highway. The Tesla struck the right side of the trailer and drove under it, so yes, the truck pulled in front of the Tesla. I would think that somebody paying attention would have slowed down even if the Tesla didn't detect it. You almost always see skid marks in these types of events; sometimes it's too late to stop, but at least some of the speed is scrubbed off. The truck driver said the Tesla went under the truck so fast he didn't even see it. The Tesla continued down the road another 1/4 mile, crashing through two fences and finally hitting a telephone pole, breaking the pole in half. The car ended up spinning counterclockwise, coming to its final resting spot sideways about 100 feet off the highway.

The truck driver says the driver was watching Harry Potter, though he only heard it playing and never saw it. The sheriff mentioned there was a portable DVD player in the car. I assume the Tesla driver was killed instantly when the car hit the truck. He probably never saw it coming.

Rocket

  • 5 O'Clock Shadow
  • *
  • Posts: 99
  • Location: Los Angeles
What Google found with their cars is that people quickly come to rely on the tech, after about 5 minutes, and then do really stupid stuff like unbuckling and crawling around the car looking for something. The Tesla is still a decade away from fully autonomous.

wienerdog

  • Pencil Stache
  • ****
  • Posts: 587
The tesla is still a decade away from fully autonomous.

Google and Apple both use lidar and radar to build a model of the environment, along with their other sensors. Tesla uses the same kind of crash-avoidance hardware that other auto-braking systems use, plus a computer vision system. I read that the trailer the Tesla hit was white and was hard to distinguish against the brightly lit sky. That seems odd, though, since the Tesla was heading east, so the sun should have been behind the car at 4:40 pm. I could see the computer vision system getting washed out if it was heading west into the sun. I think both systems have to agree before there is a reaction.
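That "both must agree" rule can be pictured with a toy sketch. To be clear, this is just my guess at the logic, not Tesla's actual code, and all the names are made up:

```python
# Toy model of requiring sensor agreement before reacting.
# Hypothetical: Tesla's real fusion logic is not public.

def should_brake(radar_sees_obstacle: bool, vision_sees_obstacle: bool) -> bool:
    """Brake only when radar AND vision both report an obstacle.

    Requiring agreement suppresses false alarms (e.g. phantom braking
    under overhead signs), but it also means that if either sensor
    misses the obstacle, the car does nothing.
    """
    return radar_sees_obstacle and vision_sees_obstacle

# A white trailer against a bright sky: the camera misses it,
# so the agreement rule yields no reaction.
print(should_brake(radar_sees_obstacle=True, vision_sees_obstacle=False))  # False
```

On that model, the crash is exactly the case where agreement fails: even if radar saw something, vision didn't, and the AND gate stayed closed.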

mrpercentage

  • Handlebar Stache
  • *****
  • Posts: 1235
  • Location: PHX, AZ
This won't slow it. But once you have a few self-driving cars plow over some pedestrians, you will see real backlash. People are fine with robots killing those who chose to use them, or those who get paid to work around them, but they will go all Terminator/Matrix when the robots start running us over like Maximum Overdrive.

mrpercentage

  • Handlebar Stache
  • *****
  • Posts: 1235
  • Location: PHX, AZ
Just in case you don't know what I'm talking about:

https://youtu.be/DVigoLjor0Q?t=1m32s

By the way, I miss Emilio Estevez. He is one of my favorites. I make it a point to see anything he is in, directs, or produces. Some of his movies, like "Wisdom," carry a serious message.
« Last Edit: July 03, 2016, 02:48:21 PM by mrpercentage »

big_owl

  • Handlebar Stache
  • *****
  • Posts: 1051
I think accidents like these are being underestimated and could quickly torpedo progress, if not for technical reasons then for emotional ones. I also think the comparison to the airline industry is tenuous.

wienerdog

  • Pencil Stache
  • ****
  • Posts: 587
I also think the comparison to the airline industry is tenuous.

I agree. The Tesla driver thought he was in a Google car with probably 2-3 times the smarts of the Tesla. I believe Tesla is foolish for calling it "Autopilot" mode. I guess they feel the warning on the dash is enough, but I don't think the general public understands the difference. If the Tesla driver was watching a movie, then he clearly didn't. The Tesla is barely smarter than other auto-braking or rear-end collision avoidance systems. It was never designed to behave as anything I would call, or trust as, an autopilot.

I am still amazed that the thing didn't detect some kind of impact that would have applied the brakes after shearing off the roof. The airbags didn't go off either, so maybe there wasn't enough deceleration to detect, but reports say the car looked like a sardine can with the roof peeled back.

Lagom

  • Handlebar Stache
  • *****
  • Posts: 1258
  • Age: 40
  • Location: SF Bay Area
I was comparing self-driving technology in its mature form to airlines. The Tesla is far from a fully autonomous vehicle.

dang1

  • Pencil Stache
  • ****
  • Posts: 515
I'm looking forward to autonomous vehicles. Apparently, today, a pretty good driver-assist system in an affordably priced car is Honda Sensing in a Civic.

dachs

  • Bristles
  • ***
  • Posts: 253
Fully agree. And that airline pilot comparison is extra pertinent, were the public only to realize just how much of modern airline flight is automated - including, frequently, takeoffs and landings ;)

True, but pilots who watch Harry Potter during an autopilot approach will get fired.

SoccerLounge

  • Stubble
  • **
  • Posts: 240
Fully agree. And that airline pilot comparison is extra pertinent, were the public only to realize just how much of modern airline flight is automated - including, frequently, takeoffs and landings ;)

True, but pilots who watch Harry Potter during an autopilot approach will get fired.

What about if they play video games during an approach? Just kidding, it's an Airbus ;)

wienerdog

  • Pencil Stache
  • ****
  • Posts: 587
What about if they play video games during an approach? Just kidding, it's an Airbus ;)

My buddy works on planes, and they use a FedEx pilot for any test flights needed after completing work. He said FedEx doesn't want them to mess with anything. I guess they just pay them to back up the computer and, of course, stay alert. Not sure if he really meant totally hands-off or just a little hands-off.

deborah

  • Senior Mustachian
  • ********
  • Posts: 16090
  • Age: 14
  • Location: Australia or another awesome area
At the very least, the accident should get Tesla looking at disengaging autopilot when the roof shears off!

They said that there is a death every x million miles, and one death after something like 2x million miles on autopilot showed how safe the cars were! This doesn't sound like a manufacturer who is interested in improving the safety of their vehicle.

wienerdog

  • Pencil Stache
  • ****
  • Posts: 587
They said that there is a death every x million miles, and one death after something like 2x million miles on autopilot showed how safe the cars were! This doesn't sound like a manufacturer who is interested in improving the safety of their vehicle.

Wonder how statistically meaningful that number is if they have another death tomorrow on autopilot?

BCBiker

  • Stubble
  • **
  • Posts: 187
  • Location: Colorado
    • Business Casual Biker - Health, Wealth, and Mental Stealth BTYB Bicycle Commuting
They said that there is a death every x million miles, and one death after something like 2x million miles on autopilot showed how safe the cars were! This doesn't sound like a manufacturer who is interested in improving the safety of their vehicle.
I don't understand your comment. They were illustrating that their system has so far performed around twice as well as typical human driving. Flip that around and say there will be twice as many deaths if humans continue to be allowed to drive cars without computer assistance. Humans don't seem like a bunch who care too much about their brothers and sisters. And since we have been driving for over 100 years, I don't think we can count on much improvement. :)

Wonder how statistically correct that number is when they have another death tomorrow in autopilot?

The answer is that 1 death per 130 million miles is not statistically significant. There could be another event tomorrow, which would roughly double the death rate to about 1 death per 65 million miles. Or there could be no further deaths over the next 1 billion miles driven, dropping the rate to about 1 death per 1.13 billion miles. It will take several billion miles driven before we have a reasonable handle on the actual death rate of autonomous driving. And I argue that the death rate will likely decrease steadily, asymptotically approaching zero. The winning autonomous systems might look nothing like Google's or Tesla's current systems, or might be a combination of the two. The key is that the technology will progress as long as the public allows it.
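To put rough numbers on that swing, here is a quick back-of-the-envelope in Python, using the mileage figures above. This is a sketch, not Tesla's published analysis:

```python
# How much the implied fatality rate moves with a single event,
# given how few Autopilot miles exist so far.

def rate_per_100m_miles(deaths, miles):
    """Deaths per 100 million miles driven."""
    return deaths / miles * 100e6

miles_so_far = 130e6  # ~130 million Autopilot miles, 1 death

print(rate_per_100m_miles(1, miles_so_far))        # ~0.77 today
print(rate_per_100m_miles(2, miles_so_far))        # ~1.54 if a second death happened tomorrow
print(rate_per_100m_miles(1, miles_so_far + 1e9))  # ~0.09 if the next billion miles are clean
```

A single extra data point doubles the estimate or nearly wipes it out, which is exactly why several billion miles are needed before the rate means much.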

True, but pilots who watch Harry Potter during an autopilot approach will get fired.
My wife's friend is a pilot and apparently the shenanigans that happen up there would freak most people out, especially when autopilot is engaged.
« Last Edit: July 03, 2016, 11:23:40 PM by BCBiker »

FIRE me

  • Handlebar Stache
  • *****
  • Posts: 1097
  • Location: Louisville, KY
  • So much technology, so little talent.
I think accidents like these are being  underestimated and could quickly torpedo progress.  If not for technical reasons then emotional ones.  I also think the comparison to the airline industry is tenuous.

I don't think the recent accident will slow the development and adoption of self-driving vehicles. But if it does, I don't think the reason will be technical or emotional. It would be financial, due to lawsuits.

For example, if you own and drive a Tesla, you agreed to the TOS and were informed that Autopilot is a beta feature. But if a pedestrian (for example) were mowed down by a Tesla on autopilot, he would have agreed to no such thing. For that reason, I think the manufacturer's liability could become quite high.

deborah

  • Senior Mustachian
  • ********
  • Posts: 16090
  • Age: 14
  • Location: Australia or another awesome area
They said that there is a death every x million miles, and one death after something like 2x million miles on autopilot showed how safe the cars were! This doesn't sound like a manufacturer who is interested in improving the safety of their vehicle.
I don't understand your comment. They were illustrating that their system has so far performed around twice as well as standard human-driving. Flip that around and say there will be 2x as many deaths if humans are continued to be allowed to drive cars without computer assistance.  Humans don't seem like a bunch who care too much about their brothers and sisters. And since they have been driving for over 100 years, I don't think we can count on much improvement. :)
They were protecting themselves ("we have had half the deaths you would expect") rather than acknowledging that there were actual safety issues (the car sped on even after the accident).

BCBiker

  • Stubble
  • **
  • Posts: 187
  • Location: Colorado
    • Business Casual Biker - Health, Wealth, and Mental Stealth BTYB Bicycle Commuting
They said that there is a death every x million miles, and one death after something like 2x million miles on autopilot showed how safe the cars were! This doesn't sound like a manufacturer who is interested in improving the safety of their vehicle.
I don't understand your comment. They were illustrating that their system has so far performed around twice as well as standard human-driving. Flip that around and say there will be 2x as many deaths if humans are continued to be allowed to drive cars without computer assistance.  Humans don't seem like a bunch who care too much about their brothers and sisters. And since they have been driving for over 100 years, I don't think we can count on much improvement. :)
They were protecting themselves (we have had half the deaths you would expect), rather than suggesting that there were actually some safety issues (car continued to speed on even after an accident).
I actually prefer their response to saying "we don't know what we are doing." When your product outperforms the competition, you don't dwell on the one failure; leave that to the media and the competitors. In their statement, they detail exactly the circumstances under which their system failed, including that there are multiple warnings for the driver to pay attention when autopilot is engaged, and that the trailer was too high for the sensors to recognize it. I guarantee Tesla will have either a software or hardware upgrade in the near future to address the issue.

Playing with Fire UK

  • Magnum Stache
  • ******
  • Posts: 3449
I think the message that needs to get out is that this wasn't a 'self-driving' car; it was a slightly fancier cruise control that clearly required a human driver to be paying attention, and the human failed to do that.

To me this is more akin to an accident that occurs in a regular car when someone is texting than to a 'self-driving' failure. However, I only think this because I read up on Autopilot and what it can and can't do.

Tesla's accident statistics are in no way helping the matter. Proper self driving cars should be orders of magnitude safer than human-controlled cars.

I think there is a danger of a public perception, based on how the people in my office are talking about it, that self-driving cars aren't safe.

wienerdog

  • Pencil Stache
  • ****
  • Posts: 587
I guarantee Tesla will have either a software or hardware upgrade in the near future to address the issue.

I agree. Mobileye has already said that in 2017 they should have "crossing traffic avoidance," which will require hardware upgrades (some of the newest cars might already have that hardware). If you look around, you'll find many close calls with stopped traffic, which the Tesla just can't detect far enough out due to hardware limitations, and lane changes where the Tesla moves over way too soon, leaving only a few feet between it and the car ahead. I read somewhere that the trailer was classified as an overhead sign, and that is why the Tesla never attempted to brake, so they know what happened.

Here is Mobileye's video from back in 2014 predicting 2016, though I think they have pushed it back to 2017.

https://www.youtube.com/watch?v=dhEgD6ZFlQE

GuitarStv

  • Senior Mustachian
  • ********
  • Posts: 23261
  • Age: 42
  • Location: Toronto, Ontario, Canada
Probably won't change things much, tbh. Maybe that's wishful thinking on my part, but the math will overwhelmingly support self-driving tech when it comes to safety, so the only hurdle (albeit a big one), will be overcoming people's visceral dislike of not being in control. But we already trust airline pilots and uber drivers, so I would like to think it's not so much of a leap. Sooner or later, self-driving cars will be pervasive. I, for one, can't wait until that day.

Humans die in human-controlled vehicles all the time. Self-driving cars don't need to be perfect to be better than humans.

This is the crux of the matter.

I work as an engineer designing automated train controls. It's similar to, but far less complex than, automated driving of cars. The moment you remove a human from decision-making and replace it with well-tested automation, a vehicle becomes astoundingly safer and less accident-prone.

Humans are very flawed critters. Robo-driving will be safer than human driving in the not-too-distant future. It's really hard to find grounds to argue against a safer option.

music lover

  • Pencil Stache
  • ****
  • Posts: 652
The fact that a test driver in a test vehicle was not paying attention is a pretty good indication of what will happen in the real world with real people.

Comparing miles per accident is meaningless unless you include night driving, winter driving, roads with no lines, construction zone driving, gravel road driving, and every other situation that appears in the real world.

music lover

  • Pencil Stache
  • ****
  • Posts: 652
I work as an engineer designing automated train controls.  It's similar, but far less complex than automated driving cars.  The moment you remove a human from decision making and replace it with a well tested automation, a vehicle becomes astoundingly safer and less accident prone.

The car drove into the side of a semi in broad daylight, so to claim that removing human decisions is safer is wrong...at least at this point in the technology's development. That may change in the future, but it's not safer today...and as was just proven, it can be more dangerous.

dandarc

  • Walrus Stache
  • *******
  • Posts: 5488
  • Age: 41
  • Pronouns: he/him/his
I work as an engineer designing automated train controls.  It's similar, but far less complex than automated driving cars.  The moment you remove a human from decision making and replace it with a well tested automation, a vehicle becomes astoundingly safer and less accident prone.

The car drove into the side of a semi in broad daylight, so to claim that removing human decisions is safer is wrong...at least at this point in the technology's development. That may change in the future, but it's not safer today...and as was just proven, it can be more dangerous.
Not sure if you're serious . . . People drive their cars into fatal accidents every single day.  Automation doesn't have to be perfect to be better than human.

music lover

  • Pencil Stache
  • ****
  • Posts: 652
I work as an engineer designing automated train controls.  It's similar, but far less complex than automated driving cars.  The moment you remove a human from decision making and replace it with a well tested automation, a vehicle becomes astoundingly safer and less accident prone.

The car drove into the side of a semi in broad daylight, so to claim that removing human decisions is safer is wrong...at least at this point in the technology's development. That may change in the future, but it's not safer today...and as was just proven, it can be more dangerous.
Not sure if you're serious . . . People drive their cars into fatal accidents every single day.  Automation doesn't have to be perfect to be better than human.

Are you serious? Perhaps you drank too much of the Elon Musk Kool-Aid??

Driving into a semi in broad daylight might put a computer at the level of the bottom 1% of drivers...but, it certainly isn't safer than the other 99% of drivers.

Guses

  • Pencil Stache
  • ****
  • Posts: 915
I work as an engineer designing automated train controls.  It's similar, but far less complex than automated driving cars.  The moment you remove a human from decision making and replace it with a well tested automation, a vehicle becomes astoundingly safer and less accident prone.

The car drove into the side of a semi in broad daylight, so to claim that removing human decisions is safer is wrong...at least at this point in the technology's development. That may change in the future, but it's not safer today...and as was just proven, it can be more dangerous.

But the thing is, the same outcome would have happened if the car had been manually driven and the driver was not paying attention.

The good thing about this unfortunate accident is that after they fix the software/hardware to detect this scenario, there will never be another identical incident.

Technology is like natural selection on steroids. Each incident makes the system much safer.

The real cause of the accident is the trailer driver making a dangerous lane change. I can't wait for when driving manually is entirely banned.

music lover

  • Pencil Stache
  • ****
  • Posts: 652
The real cause of the accident is the trailer driver making a dangerous lane change. I can't wait for when driving manually is entirely banned.

Some people welcome government control over every aspect of their lives. I'm not one of them.

GuitarStv

  • Senior Mustachian
  • ********
  • Posts: 23261
  • Age: 42
  • Location: Toronto, Ontario, Canada
Technology is like natural selection on steroids. Each incident makes the system much safer.

This depends on how they implement the system changes to detect and correct the problem.  (With some adaptive systems, implementing a fix can actually cause a lot of weird unseen errors in other places . . . which is why regression testing is so important.)  It's generally a true statement though.
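A toy illustration of that regression-testing point (this has nothing to do with Tesla's actual code; the threshold and labels are invented):

```python
# A "fix" for one scenario must not silently break scenarios that
# already passed -- so the old passing cases stay in the test suite.

def classify(obstacle_height_m: float) -> str:
    # Pretend v1 treated anything above 2.5 m as an overhead sign.
    # The fix raises the cutoff so a high trailer counts as an obstacle,
    # while a genuine sign gantry is still ignored.
    return "obstacle" if obstacle_height_m < 4.0 else "overhead sign"

# Regression suite: old cases plus the newly fixed one.
assert classify(1.2) == "obstacle"       # car ahead: still detected
assert classify(3.0) == "obstacle"       # high trailer: the new fix
assert classify(6.0) == "overhead sign"  # sign gantry: must not regress
print("all regression cases pass")
```

If the fix had simply been "treat everything as an obstacle," the third case would catch it, which is the whole point of keeping the old cases around.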


The real cause of the accident is the trailer driver making a dangerous lane change. I can't wait for when driving manually is entirely banned.

Some people welcome government control over every aspect of their lives. I'm not one of them.

If an automated system eventually becomes a thousand times safer than the average human driver, what exactly would your opposition to banning manual driving be?

As a libertarian I'd think you would welcome the legislation.  It follows Mill's harm principle . . . in that it only limits personal freedom to prevent harm to others.

Guses

  • Pencil Stache
  • ****
  • Posts: 915
The real cause of the accident is the trailer driver making a dangerous lane change. I can't wait for when driving manually is entirely banned.

Some people welcome government control over every aspect of their lives. I'm not one of them.

Oh, it's not about government control; it's the realization that the average human is not equipped (biologically or otherwise) to safely and efficiently drive machines at speeds over 20-30 mph. Even at low speed, efficiency is not part of the game.

I mean, have you ever seen a traffic jam where drivers constantly change lanes to be "in the fast lane", speed up and then slam on the brakes because another "lane hopper" is also in a hurry? It's effin ridiculous.

The human element is obsolete and deprecated.

You want to drive manually, inefficiently and dangerously? Go right ahead on a private road! Public roads should be reserved for the safest, fastest and most efficient systems. Human drivers are not part of it.

I will make the bold statement that mass adoption of intelligent cars and "carsharing 2.0" (i.e., like Uber, but you share a car with other people) will mark the end of traffic jams and make public infrastructure much cheaper in the future. That is, if we eliminate the weakest link (humans).

BCBiker

  • Stubble
  • **
  • Posts: 187
  • Location: Colorado
    • Business Casual Biker - Health, Wealth, and Mental Stealth BTYB Bicycle Commuting
I work as an engineer designing automated train controls.  It's similar, but far less complex than automated driving cars.  The moment you remove a human from decision making and replace it with a well tested automation, a vehicle becomes astoundingly safer and less accident prone.

The car drove into the side of a semi in broad daylight, so to claim that removing human decisions is safer is wrong...at least at this point in the technology's development. That may change in the future, but it's not safer today...and as was just proven, it can be more dangerous.
Not sure if you're serious . . . People drive their cars into fatal accidents every single day.  Automation doesn't have to be perfect to be better than human.

Are you serious? Perhaps you drank too much of the Elon Musk Kool-Aid??

Driving into a semi in broad daylight might put a computer at the level of the bottom 1% of drivers...but, it certainly isn't safer than the other 99% of drivers.

Your logic seems flawed. In the last 100+ years of human driving in the US, there have been approximately 3.5 million auto-related deaths. Many, many thousands of them occurred under circumstances where the driver was at least as much at fault as in the Tesla event (a truck pulls out in front and the driver fails to react). Just on my 28-mile-per-day bicycle commute, I see multiple cars plowed into inanimate poles, fences, bus stops, etc. every week. I know this is an anecdote, but one Sunday morning while we were in a grocery store we heard a loud crash. A northbound truck driver, either drunk, high, or asleep, crossed oncoming traffic and plowed into a store at over 60 miles per hour, killing at least one person in the building. How can you praise the human driver when these events happen multiple times daily? Yes, this was a major failure of the autonomous driving system, but saying it is worse than a human driver is very naive (I'm not sure who is passing out this Kool-Aid?).

I agree with your sentiments about concern over an all-out ban on human driving, but when it becomes clear that thousands of people are dying simply because humans want to drive, there will be a practical ban on human driving. By that time, virtual reality will probably be so good that you can get your adrenaline fill in a safe setting, and you will be willing to give up the "driving experience" that is apparently so cherished. Also, you will still be able to bicycle to get your rush (and you should).
« Last Edit: July 04, 2016, 09:07:24 AM by BCBiker »

music lover

  • Pencil Stache
  • ****
  • Posts: 652
I understand the elements of human error. I also understand the elements of government creep and stifling legislation that is sure to accompany self-driving cars. Don't be blinded by visions of a self-driving utopia without giving serious thought to all the pitfalls that will accompany it. As the saying goes...be careful what you wish for.

seattlecyclone

  • Walrus Stache
  • *******
  • Posts: 7266
  • Age: 39
  • Location: Seattle, WA
    • My blog
I work as an engineer designing automated train controls.  It's similar, but far less complex than automated driving cars.  The moment you remove a human from decision making and replace it with a well tested automation, a vehicle becomes astoundingly safer and less accident prone.

The car drove into the side of a semi in broad daylight, so to claim that removing human decisions is safer is wrong...at least at this point in the technology's development. That may change in the future, but it's not safer today...and as was just proven, it can be more dangerous.

Humans are different from computers. It's reasonable to assume that when they fail, they fail in different ways. Of course a human probably wouldn't crash into the side of a semi in this way, but how many situations are there where a human would have crashed and the Tesla would have survived?

It all should come down to numbers: if the autopilot kills fewer people than human drivers, it should be embraced over manual driving even if the few cases where the autopilot fails look incredibly stupid to a human driver.

The upshot to all this is that Tesla now has the opportunity to see what went wrong here and teach all their cars how not to make this same mistake ever again. Good luck doing that with humans.

Guses

  • Pencil Stache
  • ****
  • Posts: 915
I understand the elements of human error. I also understand the elements of government creep and stifling legislation that is sure to accompany self-driving cars. Don't be blinded by visions of a self-driving utopia without giving serious thought to all the pitfalls that will accompany it. As the saying goes...be careful what you wish for.

Can you expand on what you think the pitfalls are?

FYI, I don't think additional regulation, in itself, is a pitfall.

big_owl

  • Handlebar Stache
  • *****
  • Posts: 1051
The real cause of the accident is the trailer driver making a dangerous lane change. I can't wait for when driving manually is entirely banned.

I personally don't enjoy driving my car all that much, but a lot of people do. I do, however, love riding motorcycles for enjoyment. If automatic driving can reduce the number of people I have to worry about T-boning me or rear-ending me, then I'm for its adoption, but not mandatory enforcement. Of course it's a NIMBY situation: I would never voluntarily give up the ability to ride a motorcycle freely, and in your utopia I assume that would be outlawed as well. But then so would riding bicycles on the street, since they're manually operated. Lucky for me, your dream won't materialize in either of our lifetimes.

How is banning manual driving at all compatible with mustachianism? 

seattlecyclone

  • Walrus Stache
  • *******
  • Posts: 7266
  • Age: 39
  • Location: Seattle, WA
    • My blog
The real cause of the accident is the trailer driver making a dangerous lane change. I can't wait for when driving manually is entirely banned.

I personally don't enjoy driving my car all that much, but a lot of people do.  I do however love to drive motorcycles for enjoyment.  If automatic driving can reduce the number of people I have to worry about t-boning me or rear-ending me then I'm for its adoption, but not mandatory enforcement.  Of course it's a NIMBY situation - I would never voluntarily give up the ability to drive a motorcycle freely, and in your utopia I assume that would be outlawed as well.  But then so would riding bicycles on the street since they're a manual operation.  Lucky for me your dream won't materialize in either of our lifetimes. 

How is banning manual driving at all compatible with mustachianism? 

I don't think manual driving will be banned, since people do love their classic cars. Also many people won't be able to afford a new enough car to have an autopilot feature for many years after the feature becomes widespread. What I do expect to happen is that the insurance industry will run the numbers on humans vs. autopilots, set their liability rates accordingly, and the market will work it out. Most people will use autonomous vehicles because it's cheaper and easier, while car enthusiasts will still get to have their fun while paying a fair price for the privilege.
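A back-of-the-envelope version of that actuarial math might look like the sketch below. Every number here is invented purely for illustration; real insurers would use far more granular data.

```python
# Toy expected-loss premium comparison: human vs. autopilot drivers.
# All figures are hypothetical, not real actuarial data.

def annual_premium(crash_rate_per_year, avg_claim_cost, overhead=1.25):
    """Expected yearly loss times an overhead/profit loading."""
    return crash_rate_per_year * avg_claim_cost * overhead

human = annual_premium(crash_rate_per_year=0.045, avg_claim_cost=20_000)
auto = annual_premium(crash_rate_per_year=0.010, avg_claim_cost=20_000)

print(f"human-driven premium: ${human:,.0f}")   # $1,125
print(f"self-driving premium: ${auto:,.0f}")    # $250
```

If the safety numbers really do come out that lopsided, the premium gap alone would push most people toward the autonomous option without any ban being needed.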

mrpercentage

  • Handlebar Stache
  • *****
  • Posts: 1235
  • Location: PHX, AZ
I think people are getting carried away with thinking all manual driving will be replaced. Even if all future cars are auto drive, the older cars should be grandfathered, and there will be times a machine just can't do the job (off road, and perhaps ice road trucking, etc.).

Also, drivers will still be needed. An auto Uber with no driver would be the nastiest vehicle ever witnessed by man: soda, cum, and graffiti everywhere.

JLee

  • Walrus Stache
  • *******
  • Posts: 7529
The real cause of the accident is the trailer driver making a dangerous lane change. I can't wait for when driving manually is entirely banned.

I personally don't enjoy driving my car all that much, but a lot of people do.  I do however love to drive motorcycles for enjoyment.  If automatic driving can reduce the number of people I have to worry about t-boning me or rear-ending me then I'm for its adoption, but not mandatory enforcement.  Of course it's a NIMBY situation - I would never voluntarily give up the ability to drive a motorcycle freely, and in your utopia I assume that would be outlawed as well.  But then so would riding bicycles on the street since they're a manual operation.  Lucky for me your dream won't materialize in either of our lifetimes. 

How is banning manual driving at all compatible with mustachianism? 

I don't think manual driving will be banned, since people do love their classic cars. Also many people won't be able to afford a new enough car to have an autopilot feature for many years after the feature becomes widespread. What I do expect to happen is that the insurance industry will run the numbers on humans vs. autopilots, set their liability rates accordingly, and the market will work it out. Most people will use autonomous vehicles because it's cheaper and easier, while car enthusiasts will still get to have their fun while paying a fair price for the privilege.

If you removed all the people who don't give a crap about driving and put them in autonomous vehicles, you may find that manually operated vehicles become much safer.

I work as an engineer designing automated train controls.  It's similar to, but far less complex than, automating a car.  The moment you remove a human from decision-making and replace it with well-tested automation, a vehicle becomes astoundingly safer and less accident-prone.

The car drove into the side of a semi in broad daylight, so to claim that removing human decisions is safer is wrong...at least at this point in the technology's development. That may change in the future, but it's not safer today...and as was just proven, it can be more dangerous.

What decided to drive the semi into the road, if not a human?
« Last Edit: July 04, 2016, 10:13:18 AM by JLee »

Guses

  • Pencil Stache
  • ****
  • Posts: 915

Also drivers will be needed. Auto uber with no driver would be the nastiest soda, cum, and graffiti everywhere vehicle ever witnessed by man.

In a world with self driving cars, don't you think that ubiquitous cameras + ID scans would deter the vandals?
The real cause of the accident is the trailer driver making a dangerous lane change. I can't wait for when driving manually is entirely banned.


Of course it's a NIMBY situation - I would never voluntarily give up the ability to drive a motorcycle freely, and in your utopia I assume that would be outlawed as well.  But then so would riding bicycles on the street since they're a manual operation.  Lucky for me your dream won't materialize in either of our lifetimes. 

How is banning manual driving at all compatible with mustachianism? 

There is already room in the infrastructure to have both bikes and cars and most places already have lanes for bikes anyways. IMO this is a non issue as the car needs to be programmed to avoid pedestrians, animals and such. That being said, I would be all for more stringent enforcement of biking and pedestrian rules. Law enforcement will likely have lots of free time anyways given how self driving cars are not good for making money off of fines...


Cpa Cat

  • Handlebar Stache
  • *****
  • Posts: 1692
When I was commuting, I saw all sorts of shenanigans in cars. I witnessed drivers doing the following: applying make up, eating with both hands occupied with food items, reading a novel, watching a portable dvd player on the dash, the entire range of cell phone interactions, turning full around to talk to someone in the back seat, leaning over to retrieve an item from the floor, changing clothes, getting a blow job, and a whole host of other activities that made me say things like, "WTF - Is that guy reading a newspaper while driving??"

I would prefer that all such people drive self-driving vehicles. The problem is that almost every driver is prone to multitasking when they have confidence in the autopilot feature of their vehicle. At the least, most drivers will fall asleep at some point on a longer trip. Why wouldn't they? People already fall asleep at the wheel now.  Having an autopilot feature will make drivers more confident to drive when they're tired, or drunk.

So if a company is going to release an auto pilot feature, then it should be a true auto pilot. It needs to compensate for the fact that human drivers are lazy and stupid, and they get lazier and stupider if their car drives itself. There aren't enough dash warnings in the world to make human drivers actively engage their minds on driving when they aren't actually driving. Studies show that even cruise control causes a significant reduction in attentiveness, and you still have to steer on cruise control!

Was the Tesla driver in this accident paying attention? I don't know. I certainly don't put any faith in anything the semi truck driver says, given that he is being accused of driving dangerously and the Tesla driver's family is blaming him for the accident. For all we know, the Tesla driver had spent enough time in his auto pilot car not having to do anything that by the time he realized the Tesla wasn't going to brake, it was too late. If reaction time is slow on cruise control, then how slow does it get on Ultimate Do-Nothing Cruise Control?

If a company is going to put Ultimate Do-Nothing Cruise Control in a car, then it seems to me that there's a fallacy in design if there's any reliance whatsoever on human reaction time to stop accidents.

That said, I agree with others - the time can't come quickly enough for this technology to be perfected. Human drivers are morons. Maybe this accident would have been avoided if the semi-truck was driven by a robot, too.

Spork

  • Walrus Stache
  • *******
  • Posts: 5742
    • Spork In The Eye
Fully agree. And that airline pilot comparison is extra pertinent, were the public only to realize just how much of modern airline flight is automated - including, frequently, takeoffs and landings ;)

Yes and no.  While it seems counterintuitive, aircraft autopiloting is much less complicated.  Following an ILS signal (or VOR or GPS or...) is much simpler than the car's requirement to see/follow roads.  Air traffic is generally separated by ATC.  There's additional support from transponders that can communicate with other aircraft.  And... there is MUCH less traffic in the air than there is on the ground.

Guses

  • Pencil Stache
  • ****
  • Posts: 915
Fully agree. And that airline pilot comparison is extra pertinent, were the public only to realize just how much of modern airline flight is automated - including, frequently, takeoffs and landings ;)

Yes and no.  While it seems counterintuitive, aircraft autopiloting is much less complicated.  Following an ILS signal (or VOR or GPS or...) is much simpler than the car's requirement to see/follow roads.  Air traffic is generally separated by ATC.  There's additional support from transponders that can communicate with other aircraft.  And... there is MUCH less traffic in the air than there is on the ground.

BUUUT, a car is much more maneuverable, does not go as fast, is easier to stop (versus a plane, which can stall), and only goes in 2D (almost 1.5D, really). I agree that the sheer number of cars is an issue that needs to be addressed, though...


Spork

  • Walrus Stache
  • *******
  • Posts: 5742
    • Spork In The Eye
Fully agree. And that airline pilot comparison is extra pertinent, were the public only to realize just how much of modern airline flight is automated - including, frequently, takeoffs and landings ;)

Yes and no.  While it seems counterintuitive, aircraft autopiloting is much less complicated.  Following an ILS signal (or VOR or GPS or...) is much simpler than the car's requirement to see/follow roads.  Air traffic is generally separated by ATC.  There's additional support from transponders that can communicate with other aircraft.  And... there is MUCH less traffic in the air than there is on the ground.

BUUUT, a car is much more maneuverable, does not go as fast, is easier to stop (versus a plane, which can stall), and only goes in 2D (almost 1.5D, really). I agree that the sheer number of cars is an issue that needs to be addressed, though...

It's not just cars.  It's potholes, construction cones, kids on bicycles, red lights, stop signs, ... and on and on. 

Stalls are easy to autopilot.  If Airspeed < threshold, then lower nose.  It's not about stopping.  It's about keeping air moving over the airfoil.

It SEEMS like planes would be more complicated.  But there just are way fewer variables.
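That stall rule really is about that simple. A minimal sketch, with the threshold and pitch numbers invented rather than taken from any real avionics system:

```python
# Sketch of the stall-avoidance rule described above: if airspeed
# drops below a threshold, lower the nose to keep air moving over
# the airfoil. All numbers are hypothetical.

STALL_SPEED_KNOTS = 120.0  # invented minimum safe airspeed

def pitch_command(airspeed_knots, current_pitch_deg):
    """Return a new pitch target: nose down when too slow."""
    if airspeed_knots < STALL_SPEED_KNOTS:
        return current_pitch_deg - 2.0  # lower nose to regain airspeed
    return current_pitch_deg            # otherwise hold attitude

print(pitch_command(110.0, 5.0))  # 3.0 -> nose lowered
print(pitch_command(150.0, 5.0))  # 5.0 -> no change
```

Compare that to a car's autopilot, which has to classify everything a camera or radar sees before it can even decide what the "threshold" is.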

Guses

  • Pencil Stache
  • ****
  • Posts: 915
Oh, I agree with you that planes are simpler to automate. I was just pointing out that there can be more flexibility in the programming given how maneuverable cars are.

For instance, even if a flock of birds could be detected 300 feet in front of a plane, the plane could not react in time to avoid it because of its limitations and the speed at which it is travelling.