Friday, November 6, 2015

Can robot cars kill the driver in case of an accident: “killer software”? – German Economic News

Self-driving cars present software developers with a moral dilemma: they have to program the car to decide, in the case of an unavoidable accident, whether the driver or the pedestrians should be protected. Researchers have now investigated in a study whether customers would buy a car with such “killer software”.

Who buys a car that is programmed to kill its owner in an accident in order to save others? (Photo: EPA / GOOGLE / dpa)

In developing self-driving cars, software developers face several moral dilemmas: when programming the car's behavior in accidents, they have to decide in advance how, and in which situations, the car should react. To teach cars morally correct behavior, a social consensus is needed first. The problem also arises in situations where the car has to weigh the driver's life against the lives of passers-by.

Determining the morally correct behavior raises some tough questions: How should the car be programmed to act in the case of an unavoidable accident? Should the vehicle endanger as few human lives as possible, even if that means sacrificing its own driver, or should it protect the occupants at all costs? Or should it choose between these extremes at random?
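The three strategies described above could be sketched, purely as an illustration (the function and parameter names here are hypothetical and not taken from any real vehicle software), like this:

```python
import random

def choose_action(pedestrians_at_risk: int, occupants: int, strategy: str) -> str:
    """Toy illustration of the three accident strategies in the article.

    Returns "swerve" (sacrifice the occupants by steering into a wall)
    or "stay_course" (protect the occupants, endangering the pedestrians).
    """
    if strategy == "minimize_deaths":
        # Utilitarian: sacrifice whichever group is smaller.
        return "swerve" if occupants < pedestrians_at_risk else "stay_course"
    elif strategy == "protect_occupants":
        # The owner comes first, at all costs.
        return "stay_course"
    elif strategy == "random":
        # Choose between the two extremes by chance.
        return random.choice(["swerve", "stay_course"])
    raise ValueError(f"unknown strategy: {strategy}")

# The article's scenario: one driver versus a crowd of ten pedestrians.
print(choose_action(pedestrians_at_risk=10, occupants=1, strategy="minimize_deaths"))   # swerve
print(choose_action(pedestrians_at_risk=10, occupants=1, strategy="protect_occupants")) # stay_course
```

The study discussed below asks, in effect, which of these strategies buyers would actually accept in their own car.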

Scientists from the university in Toulouse have now investigated this issue. As the technology magazine Technology Review reports, they conducted surveys on how people would react in certain accident scenarios. These were morally difficult scenarios:

“Imagine that in the not too distant future you own a self-driving car. One day while driving, an unfortunate series of events causes the car to head toward a crowd of ten people. There is not enough time to brake, but the car can avoid killing the ten people by steering into a wall. This maneuver, however, would kill the driver and passengers. What should the vehicle do?”

In the question, the researchers varied some of the details, such as the number of pedestrians, whether the driver or the on-board computer makes the decision, and whether the participants were to imagine themselves at the scene as occupants or as anonymous observers. They then analyzed the answers of the study participants.

The result: in theory, people agree with the idea that self-driving vehicles should be programmed to minimize the number of deaths – one death, by social consensus, is less bad than ten. However, only in theory: the participants did not want to drive such a car themselves – and therefore would not buy one.

Carmakers will therefore sell more cars if they install software that follows the first principle of robot ethics: a robot protects its owner first. (Read more here in an interview with the BMW developers for automated driving.)

