Hi everyone! I would like to share with you an article about autonomous cars and the moral decisions made by the algorithms inside them.
I would like to ask you:
1. What do you think about autonomous cars? Are we ready to use them?
2. Who should take responsibility for bad decisions made by the algorithms? What, in your personal opinion, counts as a bad decision?
3. What do you think about the morality of algorithms that will make decisions in the future?
Article:
http://arxiv.org/pdf/1510.03346.pdf
Google autonomous car:
https://www.youtube.com/watch?v=CqSDWoAhvLU
M.
1. What do you think about autonomous cars? Are we ready to use them?
I have to admit I'm not a fan of the whole idea, hence I'm not looking forward to more self-driving vehicles. I must admit I'm a huge fan of the Tesla Model S :) and some features, like adaptive cruise control, can be very useful without stripping the joy from driving the car.
2. Who should take responsibility for bad decisions made by the algorithms? What, in your personal opinion, counts as a bad decision?
What do you mean by a bad decision here?
The question is not well articulated, as it misses the perspective. What is the perspective? The driver's perspective, or the 'greater good' perspective?
If it's not the driver's perspective, then I can't imagine anyone willing to buy a car programmed to kill the driver whenever the number of casualties would exceed the number of passengers in the car.
3. What do you think about the morality of algorithms that will make decisions in the future?
There is no such thing as the morality of algorithms; they will be perfectly ruthless in executing their designed logic.
Hi.
Thanks for your comment. I'm also a fan of Tesla, but for now I have an E63 AMG with a huge engine :) I love driving my car.
Back to questions.
What I meant is that a bad decision is when the algorithm has to kill either the driver or a pedestrian and there is no other choice. In that kind of situation there is no proper decision; we can only assume. In that case, who should take the responsibility?
As for the last question, I think the morality of an algorithm is a very difficult issue. I see a correlation with the morality of the developer and of the inventor of the autonomous car. As you said, there is no such thing as the morality of algorithms, but they are made by humans, who have different perceptions of morality.
Personally, I'm afraid that mentally we won't be ready until the digital generation grows up.
1. What do you think about autonomous cars? Are we ready to use them?
We are getting lazier every day. Why not? Look at Tesla: they have a beta version of their operating system with autonomous driving. However, we should first simulate the real environment in a closed one and prepare proper testing scenarios.
2. Who should take responsibility for bad decisions made by the algorithms? What, in your personal opinion, counts as a bad decision?
A bad decision: killing the driver or killing a pedestrian. Who do you want to blame? The developer, the manufacturer, other drivers? It's hard to answer.
3. What do you think about the morality of algorithms that will make decisions in the future?
As I have already written, we are not prepared to talk about morality. If we eliminate human beings from the equation, we can start talking about it. With every single car driven by a computer we can expect proper behaviour, and so on. It's hard to decide between killing 4 people in a car and 4 people on the street. How to choose?
1. What do you think about autonomous cars? Are we ready to use them?
Autonomous cars are great, but this technology is still in beta; it's sad, but true. We are not ready to use them :( The first issue is that our law is not ready for this kind of technology (I think California is or was in a similar situation), and this concerns responsibility for the bad decisions the car makes. As far as I remember, in California they wanted to add an article to the law about who would be responsible for the car's mistakes and to introduce the role of a "car administrator", but I don't think they managed to pass this provision. There is a lot of discussion about autonomous cars; this short article shows where things stood at the end of 2015:
http://www.theverge.com/2015/12/16/10325672/california-dmv-regulations-autonomous-car
In the beginning, autonomous cars will be great for disabled people and also for elderly people who should not drive. In the end, this kind of transportation will be a great improvement in our lives.
2. There should be some "red button" that stops the car immediately. The owner should take responsibility for the car. Yes, I know there will be pros and cons to this statement and a lot of discussion on the subject. It is quite similar to the situation with spaceships: the development process took 2 years, the software spent the next 5 years under intensive testing, and as we remember there were still some issues connected with bad decisions made by the program.
And there should be two kinds of mistakes: one connected to the software development process, and another connected with software mistakes made with a little help from a human :) - a similar situation took place in California with the Google car.
3. On this point I totally agree with s14772: algorithms are ruthless. Yes, algorithms are made by humans, but these are not the kind of algorithms you have in business programs. These algorithms are built to learn on their own, to gather knowledge and use it. So after the learning phase, the developer can't do much.
I guess this subject is a kind of boomerang topic that keeps coming back, but a must is a must.
1. What do you think about autonomous cars? Are we ready to use them?
I think you're asking the wrong question. The right question is whether autonomous vehicles are good enough. The answer is, I believe, not yet.
2. Who should take responsibility for bad decisions made by the algorithms? What, in your personal opinion, counts as a bad decision?
Who takes responsibility when an application's algorithm causes a failure in, say, a nuclear power plant? Who is responsible when an automatic train takes the wrong track, leading to a horrible accident?
On the other hand, who is to blame when financial forecasting software predicts the wrong numbers and you over-invest?
I believe this situation is much closer to the former cases, so I guess the vehicle and software manufacturer(s) should be held responsible.
3. What do you think about the morality of algorithms that will make decisions in the future?
Algorithms have no morality; they are dumb, abstract creatures. Unless you make them otherwise, algorithms will repeatedly take the same decisions. The question is whether these decisions are right or wrong. But I am not the one to judge.
1. What do you think about autonomous cars? Are we ready to use them?
In my opinion, autonomous cars are not ready to use.
We can talk again once the producers of these cars have run a lot of tests.
2. Who should take responsibility for bad decisions made by the algorithms? What, in your personal opinion, counts as a bad decision?
2. It is a very important question for this product, but in my opinion we can't make a robot or anything else without bugs, so I think the person who drives the car is responsible for its wrong decisions.
3. What do you think about the morality of algorithms that will make decisions in the future?
3. It is a very good question, but in my opinion algorithms will make whatever decisions they can, because a programmer made the algorithm; so in my opinion there is no morality of algorithms.
1. What do you think about autonomous cars? Are we ready to use them?
In the 2020s, the streets of our cities could well be filled with completely autonomous vehicles, able to move on their own from point A to point B. This new type of transportation involves many issues, especially regarding reactions in an emergency. On paper, no car will be fully automated, in the sense that the driver should always have the opportunity to "regain control" and make a decision in case of problems.
Frankly speaking, I'm not ready, and I can't be.
2. Who should take responsibility for bad decisions made by the algorithms? What, in your personal opinion, counts as a bad decision?
The reason it seems like an issue is that human drivers speed and get distracted, so we're often faced with "Wow, there's a pothole and I need to deal with it right now because it's right in front of me" situations. But a driverless car won't get distracted, won't speed, and will likely see the problem well in advance of it becoming a problem. So the car wouldn't take the suspension-busting hit. It would slow down, wait for traffic to pass, then avoid the obstacle in a safe fashion. Or it would decide there is no safe way to avoid the obstacle and pull over. Or it would handle it improperly, but probably at far lower rates than humans, and the programmers would then fix the problem and cars would handle it better in the future.
3. What do you think about the morality of algorithms that will make decisions in the future?
The morality of algorithms? I have never heard of it.
1. What do you think about autonomous cars? Are we ready to use them?
In my opinion a self-driving car should mimic the behavior of a real driver. Are we ready to use them? I think that in Poland, not yet :) The optimum situation is when the majority of cars are self-driving, because they would be able to communicate with each other.
2. Who should take responsibility for bad decisions made by the algorithms? What, in your personal opinion, counts as a bad decision?
The cars should 'think' of their own safety in the first place.
However, in this case the owner should be responsible for the damages caused by the car. Today we are responsible for our kids and pets.
If we want to drive a car that makes its own decisions and interacts with the environment, we need to accept the responsibility.
Another option is to limit the car's mobility, as it is today. A car that drives slowly and has additional markings showing that it is autonomous can minimize the risk of a potential accident in traffic.
The future requires us to face many other ethical dilemmas. Science-fiction writers have created many scenarios based on the clash of technology and human nature. This shows that more progress requires more responsibility.
3. What do you think about the morality of algorithms that will make decisions in the future?
I think that there is no real morality of algorithms. Programmers create the morality of the cars' behavior, and if the buyer chooses among the different available actions the car can take, he also takes the responsibility.
I read an article about a Google car that had two minor accidents. If I remember correctly, one of them was caused by another driver. I think we are ready for this kind of car, but the better question is: do we need them? I love driving, and it is a pleasure for me. Additionally, most of these cars rely on a perfect state of infrastructure. I know that an algorithm might be a better driver than a person who is not a professional driver.
Who is responsible for bad decisions of the algorithms? Let's take a look at our president's accident. Would an algorithm have stood a chance? In that kind of situation (tires, debris on the roadway, speed), is it the fault of the algorithm's producer? Sometimes good skills and focus are important, but you need luck too.
Morality. Should we (as programmers) focus on saving the lives of the passengers or of the other traffic participants? I do not know, and I do not want to know.
1. What do you think about autonomous cars? Are we ready to use them?
I like the idea of autonomous cars. They will be a great solution for people who cannot drive or are a bit afraid of being a driver. Personally, I like to drive a car, but from time to time I prefer to just sit, relax, read a book and not focus on the road.
I am not sure if we are ready for autonomous cars. It seems that they are still the technology of the future (but maybe not so far in the future).
I also think that we should somehow separate autonomous vehicles from the "regular" ones, because the behaviour of the algorithms can be very different from the way people drive, and it may affect the safety of traffic.
2. Who should take responsibility for bad decisions made by the algorithms? What, in your personal opinion, counts as a bad decision?
A bad decision is probably an action of the car that leads to an accident. It is really difficult to blame someone for that. The owner of the car probably doesn't have much influence on the car's actions, the developer of the software has probably done everything according to the requirements, and the company which produced the car probably could not have predicted this specific situation. That's a really tough question you asked.
3. What do you think about the morality of algorithms that will make decisions in the future?
I agree with many of the previous opinions: algorithms have no self-awareness, so they also have no morality.
1. What do you think about autonomous cars? Are we ready to use them?
2. Who should take responsibility for bad decisions made by the algorithms? What, in your personal opinion, counts as a bad decision?
3. What do you think about the morality of algorithms that will make decisions in the future?
1. Yes and no. We are almost there in terms of technology, but we are not ready to use it. We need to make the situation very clear: either we all use autonomous cars or nobody uses them. The problem with the technology is that it reacts badly to the illogical situations caused by human drivers. That is why, if we want to have safe roads with autonomous cars, all of the cars need to be autonomous. As long as we allow people to drive, we can't allow robots to drive.
2. I don't see any reason why it should be different than in the rest of the IT sector. The creator (team/company) always takes full responsibility for the errors made by the program, and the same applies here. If we allow only autonomous cars to drive, we will remove the illogical and unsafe drivers; that way any error will be 100% the fault of the programmers.
3. There is no good answer to this question. You will probably receive as many answers as there are people you ask, which is why we should forget about asking such questions.
The only reason I commute by train instead of by car is that I can relax and read a book instead of watching the road and staying focused. I'm quite ready for an autonomous car; just give me one. I'd really like to spend less time commuting and read a book at the same time. I think that a robot driver can predict better and react faster than a human driver. It will also be more of a defensive driver, so there will be less road rage, less tailgating and so on. Let's face the facts: according to the Polish police's report on road accidents, the main causes are failing to give priority to other vehicles, speeding, incorrect overtaking and tailgating (source: http://statystyka.policja.pl/st/ruch-drogowy/76562,Wypadki-drogowe-raporty-roczne.html).
Autonomous cars can solve all of these problems. Of course I don't think accidents will stop happening, but I'm sure their number will decrease as the number of self-driving cars on the streets increases.
I cannot answer the responsibility or morality question, but consider that a law-abiding autonomous car isn't very likely to hit a pedestrian unless the pedestrian runs in front of the car. So who is to blame: the manufacturer, the software developer, or the pedestrian?
Hey, thanks for posting. I am pretty sure we have already talked about this topic, probably during the last semester; it sounds very similar. Regarding your questions:
1. I think we are far from ready to use autonomous cars; the related technologies have not matured to an acceptable level for us to trust them with our lives. However, we are making fine progress towards getting there at some point in the not-so-distant future.
2. It's a difficult question to ask. It is logical to hold responsible the people who created the algorithms; however, then we could say we should hold responsible nature (or god), who created those people. ;) In the future, when AI becomes self-conscious and capable of being judged (by whom, that's another good question), we could blame the AI.
3. Ideally, advanced AI will make decisions unbiased by any form of influence and will make the most objective decisions to achieve the best results. But you see, we can't even decide what these best results are. :) I guess the AI will have to figure this out.
1. What do you think about autonomous cars? Are we ready to use them?
I love the idea. I think that our grandchildren won't believe us that we drove cars on the streets ourselves. I like driving, but facts are facts: the car is the most dangerous mode of transportation. Traffic-related death rate statistics are terrifying (https://en.wikipedia.org/wiki/List_of_countries_by_traffic-related_death_rate). However, I think that we are not ready yet, and it's not because of immature technology but rather because of the law.
2. Who should take responsibility for bad decisions made by the algorithms? What, in your personal opinion, counts as a bad decision?
An example of a bad decision would be not stopping at a red light, although I can imagine that even this could be a good decision if it avoided an accident. In terms of self-driving cars, most people are afraid that algorithms will make bad decisions, but I'm pretty sure that statistically algorithms will make far better decisions than humans. The only problem is that some of the decisions will be controversial.
3. What do you think about the morality of algorithms that will make decisions in the future?
Algorithms (at least for now) are not conscious of their decisions, so from my point of view we shouldn't discuss the morality of algorithms but the morality of their creators.
Hi M.
To begin with, I would like to say that I love motoring, so I should thank you for your choice of topic.
1. What do you think about autonomous cars? Are we ready to use them?
That is difficult, and it depends on the situation. I had previously heard of autonomous or self-driving vehicles. I have a positive opinion of the technology and high expectations for it. I wonder about the security issues related to self-driving vehicles; I think that kind of technology does not perform as well as actual drivers. I would like to have that technology in my car, but I wouldn't like to pay extra for it. A potential safety risk exists during the transition, if it becomes necessary to hand control back to the human driver; the ability to do so is a necessary requirement for me.
2. Who should take responsibility for bad decisions made by the algorithms? What, in your personal opinion, counts as a bad decision?
Oh, that is simple (for me): I think the whole responsibility must lie with the producer and maker. Every bad situation must be compensated by them.
3. What do you think about the morality of algorithms that will make decisions in the future?
Hmmm... it depends on the people, the people who build the structure of the programs. But, upon reflection and life experience, those algorithms get accepted by managers, bosses and presidents; it is really a whole team of decision-makers.
1. Engineers started working on autonomous cars a long time ago. These cars have been improving over the years. The army has put a lot of work into projects of this type. Autonomous cars are working better and better now. However, I like driving a car. If the technology becomes reliable (most reliable), it will be more popular than it is now. If I won the lotto, I would buy an autonomous car.
2. Many factors usually accompany an accident. If you want to say that a decision was wrong, you would have to show a better decision for that moment. You would have to prove your theory. That is usually very difficult.
3. Since we have problems evaluating human morality, we will have problems evaluating the morality of algorithms. Won't we?
1) First I must say that this idea is very right and noble, but for God's sake do not allow IT guys to design the look of the cars.
2) I think the fault for the wrong decisions taken by the algorithm lies in the hands of the maker of that algorithm. I can't say what the bad decisions are here, because this topic is very new and not well tested. So far no bad decisions have been taken in real life.
3) In my opinion there is no answer to that question.
I really like this solution; undoubtedly many people will benefit from it, mainly disabled people. But it may also lead to a situation where this type of car transports most of the goods and replaces many thousands of working drivers. In my opinion this technology still requires additional work, for example on the security algorithms, but also on the legal side of its usage. A ban on manually driven cars may have a better chance of success in the future, once it is matched with a decreasing cost of travel and rate of accidents.
I have followed this topic for a long time. It is very interesting. Observing the actions of the Google car, I think we are ready for autonomous cars, but only once they are protected from outside attack.
Algorithms have to "make decisions" that are best for the situation. Tests take several years. If it is proved that the algorithm was bad or dangerous, then the company should be held responsible.
1. What do you think about autonomous cars? Are we ready to use them?
I think that we are not ready to use fully autonomous cars, due to the fact that we are not ready to pass ethical decisions to machines. However, we are ready to utilise that technology in some circumstances, for example during parking.
(see more at https://www.technologyreview.com/s/539731/how-to-help-self-driving-cars-make-ethical-decisions/)
2. Who should take responsibility for bad decisions made by the algorithms? What, in your personal opinion, counts as a bad decision?
That question should be slightly changed: is anyone ready to take responsibility for bad decisions of the algorithms? The answer to such a question will be "no". In consequence, for a long period of time we will have a driver in each car.
3. What do you think about the morality of algorithms that will make decisions in the future?
I am not an expert, but I think the word "morality" is strictly related to the word "humanism". If we want to be human, we have to keep moral decisions to ourselves, not pass them to algorithms.
At the beginning I would rather separate autonomous cars from regular cars, or introduce the former step by step into traffic. Driving a car is a great pleasure; of course we are not accustomed to automatic driving yet, but I would be willing to try this innovative kind of vehicle. In fact, I cannot wait to do it. We should always try new things and make progress in every sphere of our lives.
I think that only the producers are to be blamed for any mistakes.
The question of morality must stay unsolved, even among humans. And such an issue, as far as algorithms are concerned, is just too far-fetched.
I've heard stories from people who have actually driven cars in "self-driving" mode, and according to what they said it must be an amazing experience. It sounds useful when we have to travel 200+ kilometres while, let's say, tired. On the other hand, I'm not sure if I could trust a car so much as to let it drive without my full attention. I don't even use the automatic parking feature (which works very well); I prefer to do it myself. It means that I probably wouldn't buy a car like this (yet).
1. What do you think about autonomous cars? Are we ready to use them?
First, tests and tests and tests.
Then maybe we will be ready to use that technology.
2. Who should take responsibility for bad decisions made by the algorithms? What, in your personal opinion, counts as a bad decision?
The car manufacturer should take responsibility for bad decisions made by the algorithms.
3. What do you think about the morality of algorithms that will make decisions in the future?
There is no morality of algorithms...
1. What do you think about autonomous cars? Are we ready to use them?
We are ready to use autonomous cars. People react slowly in difficult situations; a computer doesn't have that problem. It makes decisions fast, and that plays a crucial role in lifesaving. We should use them and not be afraid of them. At the beginning people should have the opportunity to drive the car manually, but with the passage of time that will no longer be needed, just like with elevators, for example.
2. Who should take responsibility for bad decisions made by the algorithms? What, in your personal opinion, counts as a bad decision?
The car designer should take the responsibility if something bad happens. The algorithm should calculate the best decision in every situation.
3. What do you think about the morality of algorithms that will make decisions in the future?
Algorithms don't have morality, and we can talk about ethics and responsibility, but the main goal is to eliminate such situations, and the designers are focusing on that more than on morality.
1. What do you think about autonomous cars? Are we ready to use them?
I reckon many people would love the idea of not having to drive every day when commuting to work, or of not having to focus on the road when they go on holiday. So yes, why not? As for safety, it is written in the article that "the wide adoption of self-driving, Autonomous Vehicles (AVs) promises to dramatically reduce the number of traffic accidents." We can also constantly improve safety by introducing new safety features.
2. Who should take responsibility for bad decisions made by the algorithms? What, in your personal opinion, counts as a bad decision?
Personally, I can hardly imagine a trial where a guilty party for an algorithm's bad decision is sought, unless accidents happen frequently in certain circumstances and nothing gets fixed.
3. What do you think about the morality of algorithms that will make decisions in the future?
There is always a team of people working on an algorithm. If their intention is to build something good and test it properly, then there is no question of immorality.