Chuck • 8 years ago

The more interesting question is: when a self-driving car is forced to make an ethical decision between protecting the driver and protecting pedestrian(s) in the road, what decision does it make? Should algorithms be written to preserve the maximum number of lives, or to always prioritize the driver?
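For illustration only, here is a minimal sketch of how that trade-off could be framed as a weighted cost over possible maneuvers. The names, numbers, and the single occupant_weight parameter are all hypothetical, not any manufacturer's actual logic.

```python
# Hypothetical sketch of the occupant-vs-pedestrian trade-off as a weighted cost.
# Nothing here reflects any real vehicle's decision logic.

from dataclasses import dataclass

@dataclass
class Maneuver:
    name: str
    expected_occupant_harm: float    # expected fatalities among occupants
    expected_pedestrian_harm: float  # expected fatalities among pedestrians

def choose_maneuver(options: list[Maneuver], occupant_weight: float = 1.0) -> Maneuver:
    """Pick the maneuver with the lowest weighted expected harm.

    occupant_weight = 1.0 treats all lives equally ("maximize lives saved");
    occupant_weight > 1.0 prioritizes the people inside the car.
    """
    def cost(m: Maneuver) -> float:
        return occupant_weight * m.expected_occupant_harm + m.expected_pedestrian_harm
    return min(options, key=cost)

options = [
    Maneuver("brake straight", expected_occupant_harm=0.6, expected_pedestrian_harm=0.0),
    Maneuver("swerve into crowd", expected_occupant_harm=0.0, expected_pedestrian_harm=2.0),
]

print(choose_maneuver(options, occupant_weight=1.0).name)  # -> "brake straight"
print(choose_maneuver(options, occupant_weight=5.0).name)  # -> "swerve into crowd"
```

Much of the thread below is really an argument about who gets to set that occupant_weight: the buyer, the manufacturer, or a regulator.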

Alan • 8 years ago

Another interesting question: if all self-driving cars are designed to operate very conservatively, i.e., below speed limits and with large spacing between themselves and other cars, and this known behavior induces other drivers to frequently 'cut off' the self-driving cars, causing more disruptive / less safe driving on the highway, how will highway traffic regulators respond?

AMTbuff • 8 years ago

Large spacing means reduced total flow of cars. I thought self-driving cars were supposed to bunch close together, increasing capacity. There seems to be a transition challenge here.

marcioab • 8 years ago

That is my point in another post here. The overall speed limit should be reduced. Self-driving cars will respect it, and human-driven cars will be ticketed until they respect it too. In the end, (low) SPEED and, as you mention, distance will be the key elements.

DStuff • 8 years ago

I think that (if the manufacturer is going to be financially responsible for injuries) the cars will be optimized to put the manufacturer on the hook for as little as possible.

Jeff • 8 years ago

That's an interesting point, and not one I've heard discussed. Thanks for your comment.

keith12345 • 8 years ago

It should always be to preserve the maximum number of lives, with pedestrians getting higher priority in ambiguous situations.

DStuff • 8 years ago

"with pedestrians getting higher priority in ambiguous situations." -- And that will make the streets of New York a parking lot, as every pedestrian in the city need never fear stepping out into traffic ever again.

As to the priority, I will do the research and buy (paying more if I have to) the one that will prioritize the well-being of my family over strangers.

keith12345 • 8 years ago

So your solution is just to have the cars automatically run them down and kill them? Really? Wow. I also find your notion that your family's lives are somehow worth more than other people's lives, simply because you don't know them, to be rather odious. Sorry, but no, you are not better than everyone else, just because you happen to think so.

Tom Billings • 8 years ago

It's not a question of DStuff thinking others are worth less. It's a question of whom he is responsible for protecting. His family has the strongest call on his protection. I would not trust anyone with supervising a child who did not view that child in such a manner.

While we are all equal before the law, we are *not* equally responsible for all others.

keith12345 • 8 years ago

Which is exactly why this type of decision can't be left to individual drivers to do as they please. Naturally, people will look to maximize saving their own hide, everyone else be damned. They'll select/program their cars so that they will plow into 8 pedestrians if they have to, just as long as it saves their own life. Self-driving car protocols must come from the top down, and be uniformly applied across the board so that drivers don't endanger countless others in order to maximize their own personal safety. Saving the maximum number of lives in any situation should always be the mandated protocol.

DStuff • 8 years ago

And when car buyers fully understand that the car they're buying will not put their children's safety first and foremost, they will walk away and buy the competitor's car that will. If by force of law (and the guns that back it up) all cars are required to discount the customer's safety, buyers won't buy (see the Volt and other mandated flops).

keith12345 • 8 years ago

So, you are by yourself, alone in your self-driving car. You encounter a situation where you (and only you) are probably going to die unless your car veers off the road and into a large crowd of pedestrians waiting on the sidewalk. Several of those pedestrians will die, and several more will be severely injured as a result. But you will live. Do you think it's OK for your car to be programmed to take that course of action, in order to put YOUR safety first and foremost?

DStuff • 8 years ago

Yep, I have the right of self-preservation, and I expect my equipment to assist me in exercising that right. I also expect my estate to go after the manufacturer of a defective product if said product (a self-driving car in this case) fails to assist and instead inhibits those efforts.

But I also expect that a computer will not be able to do sufficiently complex ethical calculus to do anything more than freeze up in the moment of crisis, and Blue Screen of Death (literally) / reboot.

keith12345 • 8 years ago

It's a pretty universally accepted principle that one's individual rights extend only up to the line where they begin to infringe upon another's rights, and then they stop. You may have the right to self-preservation, but you certainly do not have the right to kill and maim innocent people to exercise it, because then you are quite clearly infringing on their rights. Frankly it's sickening and appalling that you are just fine with killing a whole bunch of innocent people just to save yourself. Society would not accept this behavior, and yes, there would eventually be legislation prohibiting such wanton, reckless and self-serving actions. If not from the get-go, then certainly soon after the first incidents where innocent people are killed by the actions of self-driving cars that are trying to save their drivers without any regard for the safety of anyone else.

Sharia M • 8 years ago

Actually, this is a very gray area in the law. Your intention in swerving the car is not to kill but to escape death. You are right that it appears reckless and self-serving to swerve the car, disregarding a substantial risk to others. However, to be reckless, the risk you create must also be unjustifiable. Is it justified to redirect harm aimed at you onto others? Under a "necessity" defense, generally, the harm you are averting (death to yourself) must be greater than the harm you create (death to many others). So it appears a necessity defense fails. However, we of course value our own lives more than others', and this must be factored in.

I think the better question is: are these cars programmed to act in ways that reasonable humans would act anyway? Would a reasonable person swerve a car to avoid hitting a pole even if it meant potentially killing people in a crowd? Probably, yes. A car should act as ethically as a reasonable person. Of course, in the heat of the moment humans make ethically flawed decisions, and you could argue cars should aspire to eliminate the self-preservational human bias in the name of the greater good (utility). But to say someone is "selfish" for choosing a car that would act the way a reasonable person would seems to be going a bit far.

DStuff • 8 years ago

And the day that legislation goes into effect, I'll start making some serious cash opening a grey-market shop that alters the programming. It probably won't even be that hard; it'll most likely be 4th-rate code done in 3rd-world code shops.

Human Nature will always override Utopian pipe dreams.

marcioab • 8 years ago

That is for sure. And if that means the streets will become a parking lot, so be it. At that point, we will recognize the city needs a new architecture.

DStuff • 8 years ago

Or the self-driving cars will go unbought.

marcioab • 8 years ago

The new generation does not want to spend time driving anymore. They want to spend that time interacting with their smartphones and let the car get them to the destination (guided by Waze). Getting there 10, 20, or 30 minutes sooner or later does not matter. The only problem is that this safety-first driverless technology is 10 years late.

GManJamin • 8 years ago

There are a couple of things that articles like this never address. Are these cars going to be fully autonomous, or is a human required as a backup? The article mentions not being able to drive in the snow, but what exactly does that mean? Does the car just pull over and stop, or does a human have to take over immediately? If a human also has to be ready to take over whenever the computer faces an issue it can't solve (like ice, snow, hydroplaning, or a deer jumping out), it would really put a damper on demand. What if the human behind the wheel falls asleep? The same issue is why there are still human pilots in airplanes.

There is also the question of maintenance. Would it be fully covered for the life of the car at purchase? What happens 5-10 years in when sensors start failing? Does the manufacturer still have liability in that situation? Would this potentially mean that people would not truly own the car but just rent it from the manufacturer, so the manufacturer guarantees it is maintained to a certain standard?

Salva • 8 years ago

My only concern, in light of the VW scandal, is what the best way would be to ensure the algorithms are always doing what they are supposed to do, rather than gaining a questionable competitive advantage for the company. As an engineer I always thought doing the right thing was cheaper for the company in the long run, as the VW case is now demonstrating again, and yet a corporate culture was capable of misapplying technology systematically.

GlitchTrax • 7 years ago

Lovely sentiments, but isn't this a precious dream given that we do not have legal access to proprietary software without a long, wicked court fight, nor do we have design controls over the "little black boxes" installed in cars that now determine fault? After all, it took over a decade, deaths, congressional hearings, court-ordered access to Toyota's engine control code, and 7 engineers from The Barr Group poring over millions of lines of code for 18 months before the software that triggers sudden unintended acceleration was confirmed. All the while, Toyota's PR spin went into overdrive.

We now know that Toyota vehicles contained (still contain?) faulty software that can cause a car to race out of control, and yet the public really isn't up to speed on what exactly happened.

Before we get comfy cozy with the notion that companies will accept blame if their systems indicate that it was their code's fault, let's remember this: they design the very code that determines fault without any oversight or regulations.

Until safety critical software is regulated, we can not rest on our laurels and give ourselves and the public a false sense of security.

Schreuder Partners • 7 years ago

Public liability refers to accidents on public or private property and can occur in a wide range of circumstances and situations. The common theme in public liability accidents is that they result in a person being injured or killed due to another person's negligence or failure to apply a reasonable standard of care. How autonomous cars will fit completely into this definition and overcome the challenges, I'm puzzled. We cover some of this stuff here too.

Stephen Bieda • 8 years ago

It is going to be interesting to see what happens to the auto insurance business over the next 10-20 years with increasing driverless functionality. Rates should go down, and maybe auto insurance won't even need to exist at some point.

Morgan Folland • 8 years ago

Theft, damage from nature or weather, fire, etc. There will still be some liability on the owner/operator.

Pete_EE • 8 years ago

Suppose you've been drinking and ask your car to take you home. While you are between towns (or in a rough neighborhood) it starts to snow. What to do? Where is the legal responsibility?

Jon Steiner • 7 years ago

The car would have to make the judgement that it wasn't mathematically certain it couldn't drive in the snow. Your car's maker would be sued if it drove you and put you in harm's way. Maybe there could be an override feature as well, for emergencies.

costume • 8 years ago

I wonder what happens to driver skills if they atrophy while most of a person's time is spent as a passenger. What if you live in LA, are driven everywhere for 11 months, and then suddenly have to drive because it rains? And every other car on the road is now piloted by equally poor drivers?

marcioab • 8 years ago

The max speed limit in my town (São Paulo) is 50 km/h. If a self-driving car stays at 40 km/h, that is reasonably slow to avoid accidents, and even if one happens because of some malfunction, it will not be that bad and can be fixed, even if that means the speed needs to go down to 30 km/h.

Donald S Brant Jr • 8 years ago

Please have a friend hit you/run you down with their car at 30 km/h and let us know exactly how "not that bad" it is and how well you "will be fixed."

marcioab • 8 years ago

Sure, there will be damage. There must be; by definition there is no 100% safe system. But at 30 km/h the damage is far smaller (or "not that bad") than at 90 km/h. In the city where I live, the speed limit was reduced from 90 km/h to 50 km/h and the number of accidents was reduced significantly (officially confirmed). But my point here is: (low) speed will be the key factor.
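A rough physical reason speed dominates: impact energy grows with the square of speed, so dropping from 90 km/h to 30 km/h cuts the energy by a factor of nine. A quick back-of-the-envelope sketch (assumed 1,500 kg car, plain kinetic energy only, not real crash statistics):

```python
# Illustrative only: kinetic energy grows with the square of speed,
# so modest speed reductions cut impact energy disproportionately.

def kinetic_energy_joules(mass_kg: float, speed_kmh: float) -> float:
    """E = 1/2 * m * v^2, with speed converted from km/h to m/s."""
    v_ms = speed_kmh / 3.6
    return 0.5 * mass_kg * v_ms ** 2

car_mass = 1500.0  # assumed mid-size car, kg
for speed in (90, 50, 30):
    e = kinetic_energy_joules(car_mass, speed)
    print(f"{speed} km/h -> {e / 1000:.0f} kJ")

# 90 km/h carries 9x the energy of 30 km/h (ratio = (90/30)**2),
# which is one reason lower urban limits reduce crash severity so sharply.
```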

PMiranda • 8 years ago

Makes sense. The carmaker was going to get sued anyway, so they might as well lean in and accept some liability so they can get the ball rolling. Over time, I expect it will pay off since an autonomous car should get into fewer accidents and carmakers already get sued in major accidents.

Biff Henerson • 8 years ago

Simple minds need a simple example. If an accident occurs with a self-driving Volvo and it is the car's fault, Volvo will accept all liability. This is a misleading way of saying that all Volvo customers will have to pay for all damages related to said liability.
There's no free lunch, folks. If Volvo pays, they simply pass the cost on to the customer. It doesn't cost Volvo anything other than a tarnished brand, which is generally short-lived.
Perhaps they need to face a fine of 5% of the net worth of the company. Then they might spend a few extra hours reviewing their software and hardware.

djb72 • 8 years ago

That's a pretty dumb idea. When a human being causes a car crash we don't fine them 5% of their net worth so they learn to drive more responsibly.

DStuff • 8 years ago

I'm sure that some insurance company will be happy to take money to cover the "infinitesimal" chance that a program will get something wrong. That's a no-brainer compared to covering most human drivers.

keith12345 • 8 years ago

So if that cost becomes too high, you can choose not to buy a Volvo. Not really a life-altering event, unlike being hit with liability in an accident, which can ruin someone financially. For a lot of people, having that built-in security may be well worth a modest increase in the cost of the car.