I would like to present a mind-blowing article about the ethics of self-driving cars:
Autonomous self-driving cars are already on our (ok, their ;-)) streets!
Just check this out:
With the convenience of autonomous cars come moral questions about the algorithms that decide what to do in case of an accident.
"How should the car be programmed to act in the event of an unavoidable accident? Should it minimize the loss of life, even if it means sacrificing the occupants, or should it protect the occupants at all costs? Should it choose between these extremes at random?"
What's your opinion about that?
Will you buy a self-driving car? (http://www.businessinsider.com/report-10-million-self-driving-cars-will-be-on-the-road-by-2020-2015-5-6)
-------------------------
There is also a scientific paper that the "Technology Review" article refers to: http://arxiv.org/abs/1510.03346
Read it if you prefer a more scientific approach to the 'popular science' style. However, reading the "Technology Review" article alone is enough for the discussion.
-------------------------
How about "http://www.bloomberg.com/features/2015-george-hotz-self-driving-car/" ?
Supposedly Hotz's work might power Tesla cars.
I put my two cents on 'semi' self-driving cars. The latest premium cars have already taken promising steps towards a widely acceptable semi self-driving car, e.g. Mercedes-Benz provides brake assist on all models as a standard safety feature. Such a feature requires a pretty decent radar sensor to function at 200 km/h. Besides the radar sensor, I would expect decent computing power and software behind all the sensors.
There are many other similar features worth mentioning. However, I wouldn't expect any of these technologies to reach the level of a fully autonomous self-driving 'personal' car.
The article is very one-sided. First of all, I suppose autonomous cars won't be speeding. At 50 km/h the chance of killing a pedestrian is smaller than at, say, 80 km/h, which is the average speed of a human-driven car in an urbanized area. Also, hitting a wall in a car equipped with airbags and side curtains doesn't have to be fatal for the driver and passengers. Second, let's think about what people do when a pedestrian runs in front of their car. They instinctively turn the wheel, without even thinking about the wall, trees or curbs at the side of the road. An autonomous car will react faster, so the speed at (possible) impact will be lower.
Yes, I'll buy a self-driving car if I can afford one. I use public transport now only because I can relax and read a book instead of driving and staying focused on the road.
I have exactly the same opinion of the article: it is very one-sided, not to say truly narrow or even naive (would you believe that the car in situation B or C would just stop at the wall without sliding further towards the pedestrians?).
The question regarding an unavoidable accident is not well articulated: what is the perspective? The driver's perspective, or the 'greater good' perspective?
I can't imagine anyone willing to buy a car programmed to kill the driver if the number of casualties exceeds the number of passengers in the car.
But... in cost-saving times, I could easily imagine a really bizarre scenario: a quick evaluation of casualties and damage vs. the total amount of the insurance claims. In such cases, algorithms will be ruthless...
Because of that, I'm not a fan of the whole idea, hence I'm not looking forward to more self-driving vehicles. But I must admit I'm a huge fan of the Tesla Model S :) and some features like adaptive cruise control can be very useful without stripping the joy from driving.
I read this article and watched the YouTube videos with great interest. I love new technology and I like driving my car. When I am driving I feel free, like a medieval knight: I can drive slowly or fast (usually), I can drive here or there, and that is a kind of freedom for me. But what about these self-driving cars? I feel confused. On the one hand I see a lot of customers wanting to buy these cars: the elderly, the disabled, women. But on the other hand, for a typical man it is not acceptable to use such a car. I would be frustrated if I couldn't decide about my car. My colleagues and I can't imagine getting into a car where all we can do is press one button. I hate functions which work against my wishes. I like watching SF movies, and in those movies all vehicles drive or fly without human control, but fortunately it is only an SF vision. For me this is not an idea acceptable to everybody.
Another issue is programming self-driving cars to drive safer and safer. A machine must be programmed to work, and it is very difficult to program something without any mistakes. I think future self-driving cars should have a so-called decision panel. For example, if I am driving with my children, I care about their safety and it is the most important thing to me. If I am driving alone, I care first about the children on the road, and so on. I think I will never buy a self-driving car, but if it becomes mandatory I will only buy a car with a decision panel. I must decide about my children's and other people's safety from my individual point of view.
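The "decision panel" described above could be imagined as a small, owner-configurable priority setting chosen before the trip. The sketch below is purely hypothetical: the DecisionPanel class and its priority names are invented for illustration and do not correspond to any real car interface.

```python
# A hypothetical "decision panel": the owner selects, before the trip,
# whose safety the car should weigh first. All names are invented for
# illustration only; no real autonomous-car API is implied.

from dataclasses import dataclass

OCCUPANTS_FIRST = "occupants_first"
PEDESTRIANS_FIRST = "pedestrians_first"

@dataclass
class DecisionPanel:
    priority: str = PEDESTRIANS_FIRST

    def configure(self, children_on_board: bool) -> None:
        # Mirrors the commenter's rule of thumb: protect the children
        # in the car when they are aboard, otherwise put the people
        # on the road first.
        self.priority = OCCUPANTS_FIRST if children_on_board else PEDESTRIANS_FIRST

panel = DecisionPanel()
panel.configure(children_on_board=True)
print(panel.priority)  # occupants_first
```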
Hi,
It is a very interesting topic, but there are a few things we must discuss before we use self-driving cars, because they can be dangerous; for example, they could be an easy way to kill people (this example is in the article).
A few questions to discuss:
1. Who will be responsible for killing a person?
2. Who will be responsible for an accident?
3. On which kinds of roads could self-driving cars be used? In my opinion we can't use them on local roads or on roads that have pedestrian crossings without traffic lights.
We must also talk about the dangers of the software, because software is never perfect and sometimes has bugs.
I think we will use this kind of car, but there is a long way to go.
I would like to use one, but only once I am sure that this kind of car is safe.
First of all, I can't agree with the primary assumption of this article - that you have to kill someone if an accident is unavoidable.
Let's consider an ideal situation (hypothetical, not very realistic) where the only cars on the streets are self-driving cars. In that case, the number of situations in which an accident is unavoidable is very limited:
- a technical fault in another vehicle;
- road surface damage that makes autonomous vehicle(s) spin out of control;
- unpredictable behavior of pedestrians or (motor-)cyclists.
There is little we could do about these situations; there is no procedure for how to behave in them, and probably there could not be one. We can address only a handful of them, not all. After all, they are all unforeseeable! We don't even know how humans should behave: everyone behaves differently, let alone autonomous vehicles.
For that reason, some legislators already say that there has to be a human behind the wheel, ready to take action in unpredictable circumstances.
Now, on to killing. Well, I am not that young anymore and it matters here. In the early 1990s, Japanese car makers promised to mass-produce cars with external airbags by 1993. It didn't happen, mostly because the onboard computers weren't powerful enough to detect an unavoidable collision. Now it is possible; for an autonomous vehicle it is a basic requirement. So why not put an external thermodynamic bag (it doesn't have to be air...) in every self-driving car? That would be enough for most accidents. Sure, when you hit a wall at 30 meters per second (or 108 km/h if you prefer), there is little we could do about it. But I believe that autonomous vehicles would never allow such an event, and for more than one reason:
- the only roads where AVs could go as fast as I mentioned above are motorways (or highways in US English), and this kind of speed limit applies only on isolated ones (no pedestrians!);
- human reaction time is at best (for a very attentive driver) about 0.8 seconds between observing a problem and hitting the brakes. The brakes themselves also have some response time (around 0.5 seconds). That means that if you're really good, you will start decelerating after about 1.3 seconds, and if you're going at only 20 m/s (72 km/h), that delay means driving (at least) 26 meters before even starting to decelerate. Computers are better than humans in this regard: first of all, they will keep to speed limits (less than 14 m/s in a city); their reaction time will basically be the braking system's delay (0.5 s), which equates to driving only about 7 m before decelerating. And while we tend not to press the brakes hard enough, computers will be better than us at that too. Meaning? No need to kill anyone.
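To make the arithmetic above explicit, here is a minimal sketch. The 0.8 s human reaction time, the 0.5 s brake response and the speeds are the figures assumed in the comment, not measured values.

```python
# Distance travelled before deceleration even begins, given a constant
# speed and a total reaction delay (figures taken from the comment above).

def distance_before_braking(speed_m_s: float, delay_s: float) -> float:
    """Metres covered at constant speed during the reaction delay."""
    return speed_m_s * delay_s

# Human driver: 0.8 s to react + 0.5 s brake response, at 20 m/s (72 km/h).
human = distance_before_braking(speed_m_s=20.0, delay_s=0.8 + 0.5)

# Autonomous car: keeps to a 14 m/s city limit, only the 0.5 s brake delay.
autonomous = distance_before_braking(speed_m_s=14.0, delay_s=0.5)

print(f"human:      {human:.0f} m before decelerating")       # ~26 m
print(f"autonomous: {autonomous:.0f} m before decelerating")  # ~7 m
```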
I really like the analysis put forward by Paweł and I agree with most of his points. I believe that autonomous, self-driving cars will be much safer both for passengers and for pedestrians, bystanders and other cars. The technology is moving forward rapidly and I'm pretty sure the safety measures will be improved constantly.
I would love to buy a self-driving car. I don't own a car right now and commute using public transport. I think that a self-driving car would be much more convenient and at least equally time-saving.
Autonomous vehicles should be programmed first to obey all traffic laws, and then to save the life of the occupant of the vehicle. This is a simple, mostly correct approach. Humans invented those laws in order to avoid accidents. Please keep in mind that the vast majority of accidents are at least partially caused by driver error, so if we magically eliminated driver error tomorrow, we would cut out about 95% of traffic collisions. In my opinion autonomous cars will be less likely to cause an accident, so there's a safety motive for having one. Theoretically, they should also be better at avoiding vehicles operated by drunk, inattentive or reckless humans. These cars replace the distracted and error-prone human operator with technology that has an unlimited attention span, processes data faster than humanly possible and is not allowed to violate traffic laws. The only real question is whether the software and sensors are up to the demands of real-world driving.
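As a rough illustration of the "obey the law first, then protect the occupant" ordering described above, here is a hypothetical sketch; the Maneuver model and the risk numbers are invented for illustration and are not any real planning API.

```python
# Hypothetical illustration of the ordering above: among the manoeuvres
# the planner considers, discard any that break traffic law, then pick
# the one that is safest for the occupants.

from dataclasses import dataclass

@dataclass
class Maneuver:
    name: str
    legal: bool              # does it respect traffic law?
    occupant_risk: float     # lower means safer for the people in the car

def choose_maneuver(options: list[Maneuver]) -> Maneuver:
    legal_options = [m for m in options if m.legal]
    # Fall back to all options only if nothing legal is available.
    candidates = legal_options or options
    return min(candidates, key=lambda m: m.occupant_risk)

options = [
    Maneuver("brake hard in lane", legal=True, occupant_risk=0.2),
    Maneuver("swerve onto sidewalk", legal=False, occupant_risk=0.1),
]
print(choose_maneuver(options).name)  # "brake hard in lane"
```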
Thank you for this article. It is very interesting. I must agree with my predecessors that the dilemma depicted in the article is artificial. If engineers were able to make the effort needed to conceive and build an autonomous car, they are certainly able to figure out ways of minimizing the risk of a collision threatening people's lives.
I would not buy such a car because, living in Warsaw, it does not matter to me whether I am stuck in traffic jams in a car (self-driving or not) or in public transport.
I read somewhere that in California they want to introduce some sort of licence for people who want to be passengers in this kind of vehicle. I must say that the self-driving car is a really great idea, especially for disabled people, and it should really improve the situation on the roads.
I would like to buy this kind of car, but I think the price will be very high, the same as with hybrid cars and Tesla models.
Hi. First of all I would like to introduce my point of view. The company where I am employed is one of the leaders in autonomous car solutions. The projects have to ensure drivers and road users a proper level of comfort and safety. Part of the technology that comes from autonomous cars is already implemented in vehicles on the roads, for example: automated parking systems, lane assist, and obstacle detection and avoidance systems.
These systems are moving to an advanced stage of autonomous car development. Such solutions cannot be implemented without extensive testing. Projects have left the controlled environment of the laboratory and are already present on the roads (of course with professional drivers and software engineers on board).
Such a car will respect traffic regulations, cannot be drunk, and has thousands of "skills" based on algorithms. Bugs in this kind of solution are not acceptable.
This is a very interesting and pretty old dilemma - should you sacrifice your life to save somebody else, or do you have the right to protect yourself from danger and death to the very end? The quantitative aspect is interesting as well - maybe your car can protect you, but only when it is a life-for-life trade? I guess there will never be a satisfying answer to this question.
It's hard to simply state that I would like to survive at any cost. There are too many variables, and so on. Not to mention the fact that your car can have a bug and actually kill you without any warning.
Look at the State of California's latest statements about this:
http://www.theverge.com/2015/12/16/10325672/california-dmv-regulations-autonomous-car
In my opinion it's against the First Law of Robotics:
A robot may not injure a human being or, through inaction, allow a human being to come to harm.
How do you code something like this?
The moral dilemmas facing the programmers of such devices are large, but I think the program depends on the capabilities of the car. For example, if we have super brakes at our disposal, we do not have a dilemma; and if we still run into such a dilemma, we can limit the speed of the car. The reactions of the car must be as close as possible to human behavior (the correct behavior, of course). In everyday life we rarely face dilemmas like "hit a tree or a column of pedestrians". All the same, the car should mainly protect its passengers.
I thought about it once and I would like to have such a car. I have always wanted to read the newspaper or a book in my car while driving to work :)
I think it might not be that easy to sell a car which kills the driver as its first choice :) Of course the issue is very complex; at first I thought it should be random, but now I tend to think there should be some kind of decision-making process aimed at minimizing the loss of life. I actually dream about self-driving cars; although I like driving, I always imagine what I would do with all the spare time I spend in the car. I also think it would be much safer with automatic cars - yes, I trust machines much more than people :)
ReplyDeleteI agree with the fact that the article is biased. The topic is very interesting. On the one hand, autonomous cars to comply with regulations, have not been traveled by intoxicated faster response time would be, there would be no so-called Sunday drivers, that this all sounds promising, but there are several problems. Firstly, such a car would have to contain a lot of sensors, which is when one of them stops working, or will generate errors, secondly, the software also can contain errors, thirdly, the car certainly will connect to the Internet, the question of security, whether hackers will be able to interfere with the operation of its system, these questions are very many. I very much like lead and even cars with manual gearboxes, certainly if I had not buy such a car.
From what I remember, this topic was already discussed on this blog, and since then my opinion on the issue has changed :) I think it is only a matter of time before such cars are legal and driving on our roads. Personally I have nothing against legalizing autonomous cars once they prove safe.
Hi, it's really impressive to see how these cars function in a real environment. Previously I had only seen footage of such cars driven on special dedicated tracks, but I guess they are starting to meet safety levels which allow them to be used on real streets. This topic raises a lot of ethical issues, and some of them are highlighted in the article that you shared. It really comes down to how we define the greater good and the best possible outcome: if something goes wrong, we should minimize casualties.
ReplyDelete"I started freaking out, before the car did!" That quote made my day (or night). ;)
"How should the car be programmed to act in the event of an unavoidable accident? Should it minimize the loss of life, even if it means sacrificing the occupants, or should it protect the occupants at all costs? Should it choose between these extremes at random?"
There is no good answer to these questions, and I think nobody will ever be able to answer them. It all depends on the situation. The biggest problem is that we can't create algorithms for situations like this, because we don't know what the answer should be. To answer it, we would have to decide who deserves to live and who deserves to die. Is anybody capable of such a thing?
Nice topic :) I've heard stories from people who have actually driven cars in "self-driving" mode, and according to them it must be an amazing experience. It sounds useful when we have to travel 200+ kilometres while, let's say, tired. On the other hand, I'm not sure I could trust a car enough to let it drive without my full attention. I don't even use the automatic parking feature (which works very well); I prefer to do it myself. That means I probably wouldn't buy a car like this (yet).
ReplyDeleteIn my opinion a self-driving car should mimic a behavior of a real driver. It should think of his safety in the first place.
ReplyDeleteHowever in this case the owner should be responsible for the damages caused by the car. Today we are responsible for our kids and pets.
If we were to drive a car that makes his on decisions and interacts with the enviroment we need to accept the responsibility.
Another option is to limit the car mobility, like it is today. A car that drives slowly and has additional markings showing that it is
autonomus can minimalize the risk of potential accident in the traffic.
The future requiers to face many others ethical dihlemas. The science-fiction writers creates many scenarios based on the clash of
the technology and human nature. This shows that more progres requiers more responsibility.
I would buy myself a autonomus car because it would allow to do more interesting things instead of being focused on driving especially
during long cruises.
This is a very interesting problem and not a simple one at all. We have to make a whole lot of assumptions before we even try making a decision here. Do we assume all human lives are equally valuable, or maybe some people's lives are worth protecting more than others'? If so, then based on what?
Also, who should be held responsible for the crash, should it ever happen? The vehicle's owner? Its passenger? Or maybe whoever programmed it?
And finally, would you ever buy an autonomous car programmed to sacrifice you under certain circumstances? I wouldn't, and neither would most people with a sense of self-preservation.
I agree with Marcin that traffic law was created to decrease the number of car accidents, so a self-driving car obeying it should almost eliminate dangerous situations. I think the technology used in such vehicles will also protect the owner and the people outside the car, so the dilemma described in the article will be a very rare situation.
I would like to have a self-driving car, but I am not sure whether driving it in Warsaw would be pleasant; at least for now, public transport (especially trams and the subway) is much faster.
In my opinion a self-driving car should not have to decide what to do in such situations. Unfortunately that's impossible; it is certain that one day such a decision will be inevitable. I don't want to decide what the car should do. If I were the driver, I would like to survive, and the same if I were the person in front of that car. The crucial question is who will make that decision when programming the car.
For now this kind of car is too strange and weird for me. On the one hand it's safer, but on the other hand the driver is less responsible for the car's actions.
An interesting moral case... If I had a car like that, obviously, as an owner, I would say that I want it always to preserve my life and the lives of my family riding in it. From the point of view of society it would probably be better to preserve the lives of the pedestrians and sacrifice the life of the car owner, since it was due to his choice (the choice of a self-driving car) that the accident occurred in the first place. So here we get to a conflict of interest, and maybe there are objective variables that can assign a numeric value to each element of this moral situation and come up with a conclusion such as: "the life of A is of higher value than the life of B", and hence... crash. In case of a draw, well, maybe both A and B should be equally at risk to keep the situation fair? Nonetheless, putting things in such a way somehow contradicts my morality...
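The "numeric value for each life, higher value wins, draw means both share the risk" idea above can be written down as a tiny comparison rule. The sketch below is purely hypothetical and the values are placeholders, which is exactly the kind of quantification the commenter finds morally uncomfortable.

```python
# Hypothetical quantification of the thought experiment above: assign
# each party a numeric "value", protect the higher one, and split the
# risk evenly in case of a draw. Purely illustrative.

def who_is_protected(value_a: float, value_b: float) -> str:
    if value_a > value_b:
        return "protect A"
    if value_b > value_a:
        return "protect B"
    return "draw: A and B share the risk equally"

print(who_is_protected(1.0, 1.0))  # draw: A and B share the risk equally
```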