Debates Forum

  1. JS357
    Joined
    29 Dec '08
    Moves
    6788
    02 Nov '15 01:42
    Trolley Problem and Self-Driving Cars

    I recently read or saw a discussion of this problem by people who think it is, or will be, a real question.

    In the case of self-driving cars, think of this variant on the Trolley Problem.

    You are driving your car (legally, with current technology) and come into a situation where you have to choose between running over a bunch of hikers walking (legally) on a mountain road, or driving off the road to certain death. Thinking rationally, which do you choose? Put another way, what is the right thing to do?

    According to my source, it is technically feasible that situations like this will occur with a self-driving car: one that can detect and analyze the options and choose fast enough to make a difference, whereas returning control to you to decide would not give you enough time to react.

    How should the driving system's software be programmed to choose?

    If one car manufacturer's programming chooses to save you and kill some of the hikers, and another's does the opposite, should this programming be disclosed before purchase?

    Should the choice be offered to the car buyer to make, and programmed into the system at purchase?
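
    To make that concrete, here is a toy sketch (Python, entirely invented for illustration; the class names, option fields, and casualty figures are all hypothetical, not any manufacturer's actual code) of what a buyer-selectable policy might look like:

        # Hypothetical buyer-configurable collision-ethics setting.
        from dataclasses import dataclass
        from enum import Enum

        class CollisionEthics(Enum):
            PROTECT_OCCUPANTS = 1     # spare the car's occupants first
            MINIMIZE_TOTAL_HARM = 2   # fewest expected casualties overall

        @dataclass
        class Option:
            description: str
            occupant_casualties: float     # expected deaths inside the car
            third_party_casualties: float  # expected deaths outside the car

        def choose(options, policy):
            """Pick among unavoidable-harm options per the configured policy."""
            if policy is CollisionEthics.PROTECT_OCCUPANTS:
                key = lambda o: (o.occupant_casualties, o.third_party_casualties)
            else:
                key = lambda o: o.occupant_casualties + o.third_party_casualties
            return min(options, key=key)

        # The mountain-road scenario reduced to two options:
        options = [Option("leave the road", 1.0, 0.0),
                   Option("stay in lane", 0.0, 3.0)]
        print(choose(options, CollisionEthics.MINIMIZE_TOTAL_HARM).description)
        # -> "leave the road"; PROTECT_OCCUPANTS would pick "stay in lane"

    The point is only that the choice reduces to one explicit parameter; disclosing it before purchase would then be straightforward.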

    Other thoughts?

    I know this thread will not get the usual politically driven responses, but that’s a feature, not a bug.
  2. Subscriber kmax87
    Land of Free
    Health and Education
    Joined
    09 Oct '04
    Moves
    82196
    02 Nov '15 02:43, 1 edit
    Originally posted by JS357
    Trolley Problem and Self-Driving Cars

    I recently read or saw a discussion of this problem by people who think it is, or will be, a real question.

    In the case of self-driving cars, think of this variant on the Trolley Problem.

    You are driving your car (legally, with current technology) and come into a situation where you have to choose between running over ...[text shortened]... his thread will not get the usual politically driven responses, but that’s a feature, not a bug.
    Does current law demand that the driver of a vehicle sacrifice their life if, rounding a bend, they are faced with your choice? Or are hikers on a shared road held responsible for their own deaths should they walk in a manner oblivious to their own safety?
    It may be a moot problem by the time self-driving cars become widespread anyway. Onboard radar/crash-avoidance technology would allow the vehicle to react and stop consistently much quicker than a driver would. And if they did become widespread, a simple solution for any hiker using a shared road would be a legal requirement to wear some form of transponder that alerts the self-driving car (SDC) to the presence of walkers even when they are out of sight.
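
    As a toy version of the transponder idea (an invented protocol with invented numbers, not an existing standard), any beacon heard around a blind bend would cap the car's speed before the walkers are even visible:

        # Cap speed whenever pedestrian transponder pings are being received.
        def speed_limit_kmh(base_limit_kmh, beacon_pings_received):
            return min(base_limit_kmh, 30) if beacon_pings_received else base_limit_kmh

        print(speed_limit_kmh(80, 0))  # no walkers signalled -> 80
        print(speed_limit_kmh(80, 2))  # walkers somewhere ahead -> 30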

    If all these systems are down for any reason, the primary responsibility of the SDC should be to its occupants. If you would hold a hiker responsible for their own death when they walked unsafely on a shared road and were knocked down by a driver, why would the occupant of an SDC be forced to make a choice that involves sacrificing themselves?
  3. Subscriber Wajoma
    Die Cheeseburger
    Provocation
    Joined
    01 Sep '04
    Moves
    65514
    02 Nov '15 05:31
    Originally posted by JS357
    Trolley Problem and Self-Driving Cars

    I recently read or saw a discussion of this problem by people who think it is, or will be, a real question.

    In the case of self-driving cars, think of this variant on the Trolley Problem.

    You are driving your car (legally, with current technology) and come into a situation where you have to choose between running over ...[text shortened]... his thread will not get the usual politically driven responses, but that’s a feature, not a bug.
    They're going to get bullied.

    Cut somebody up in traffic now and you're likely to experience some road rage, but cut up a self-driving car and it's going to get out of your way.
  4. twhitehead
    Cape Town
    Joined
    14 Apr '05
    Moves
    52945
    02 Nov '15 07:33
    Originally posted by JS357
    Other thoughts?
    It's a difficult problem and I don't have an answer.

    I suspect that in reality such decisions will be made by programmers and nobody but them will ever know. The marketing people will never let it be known that your car is designed to kill in some circumstances (whoever it decides to kill).

    Such decisions would in reality be very rare. The car will almost certainly attempt to stop in most cases rather than steer to avoid an obstacle. In the particular example, the car will probably also not know the consequences of going off-road, i.e. it will be programmed simply not to go off-road, or to go off-road only under certain circumstances, but it will not base its decision on what lies off the edge of the road. However, I do agree that there will be circumstances where a similar decision will be required. I think the most likely scenario is what to do if there is an oncoming vehicle, or a speeding truck coming from behind, and getting out of the way requires driving into pedestrians.
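
    As a rough illustration of that conservative logic (my own sketch, not any vendor's algorithm; all names invented):

        # Brake by default; steer only onto a path known to be on-road and clear.
        def plan_avoidance(obstacle_ahead, alt_path_on_road, alt_path_clear):
            if not obstacle_ahead:
                return "continue"
            if alt_path_on_road and alt_path_clear:
                return "steer_around"
            # The planner knows nothing about what lies beyond the road edge,
            # so going off-road is simply never on the menu:
            return "brake_hard"

        assert plan_avoidance(True, True, True) == "steer_around"
        assert plan_avoidance(True, False, False) == "brake_hard"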

    The VW case comes to mind, where the cars essentially decide to pollute the earth and kill indirectly, except when they are being tested.
    (53,000 early deaths occur per year in the United States alone)
    https://en.wikipedia.org/wiki/Exhaust_gas

    Of course humans make these kinds of decisions too, but typically do not have a lot of time to think them through.
  5. Standard member shavixmir
    Guppy poo
    Sewers of Holland
    Joined
    31 Jan '04
    Moves
    56262
    02 Nov '15 17:05
    We were asking roughly the same question at work the other day.
    If you run someone down in a self-driving car, are you guilty, or is the manufacturer?
  6. JS357
    Joined
    29 Dec '08
    Moves
    6788
    02 Nov '15 17:43
    Originally posted by twhitehead
    It's a difficult problem and I don't have an answer.

    I suspect that in reality such decisions will be made by programmers and nobody but them will ever know. The marketing people will never let it be known that your car is designed to kill in some circumstances (whoever it decides to kill).

    Such decisions would in reality be very rare. The car will ...[text shortened]... make these kinds of decisions too, but typically do not have a lot of time to think them through.
    The auto industry is introducing automatic accident-avoidance capabilities as we speak. Those capabilities result in situations where the software chooses an action that is (or should be) intended to cause the least damage and injury. For example, forward-looking radar can detect a vehicle ahead that has suddenly slowed down, issue a warning, and then, if voluntary action isn't taken, apply the brakes. I imagine the warning can be bypassed if there isn't time. But rearward-looking radar might at the same time detect a vehicle behind that isn't so equipped and is about to plow into you. There will be a balancing of risks.
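
    Schematically, that sequence might look something like this (all thresholds and names invented for illustration, not taken from any real driver-assistance system):

        # front_ttc_s / rear_ttc_s: estimated seconds to collision ahead / behind.
        # Warn first if there is time; otherwise brake, moderating if a tailgater
        # makes full braking likely to trade one impact for another.
        def respond(front_ttc_s, rear_ttc_s, warn_at=2.5, brake_at=1.2):
            if front_ttc_s > warn_at:
                return "no_action"
            if front_ttc_s > brake_at:
                return "warn_driver"     # time remains for a voluntary response
            if rear_ttc_s < brake_at:
                return "brake_moderate"  # balance front risk against rear risk
            return "brake_full"

        print(respond(2.0, 9.9))  # -> warn_driver
        print(respond(1.0, 9.9))  # -> brake_full
        print(respond(1.0, 0.8))  # -> brake_moderate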

    I had this happen to me a couple of months ago on a freeway. Fortunately I was able to add a leftward swerve to my deceleration, and the driver behind me added a rightward swerve, so we didn't collide. I can't say whether the swerve was really necessary, but I can say we would have collided if we had both swerved the same way. I swerved first, so the other driver may have seen this. Programming such alternative actions may be tricky.
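
    One conceivable way to program those "alternative actions": give equipped vehicles a shared convention for the escape direction, so two cars never mirror each other's swerve. Purely hypothetical:

        # Shared rule: lead vehicle breaks left, follower breaks right
        # (the split that happened by luck in the story above).
        def swerve_direction(is_lead_vehicle):
            return "left" if is_lead_vehicle else "right"

        assert swerve_direction(True) != swerve_direction(False)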

    I don't believe it will be possible to hide the programming, although VW did it for several years.

    In the US, the federal National Transportation Safety Board is "Charged with determining the probable cause of transportation accidents and promoting transportation safety, and assisting victims of transportation accidents and their families."

    http://www.ntsb.gov/Pages/default.aspx

    I imagine that comparable agencies exist in other developed countries.
  7. twhitehead
    Cape Town
    Joined
    14 Apr '05
    Moves
    52945
    02 Nov '15 18:35
    Originally posted by JS357
    I don't believe it will be possible to hide the programming, although VW did it for several years.
    It is likely the programming simply won't be looked at (rather than deliberately hidden) until an accident occurs.
    As pointed out, self-driving cars already exist and are on the road, as are cars that brake automatically or take other safety measures, yet no explanation of how they make such evaluations has been published by their manufacturers.
  8. twhitehead
    Cape Town
    Joined
    14 Apr '05
    Moves
    52945
    02 Nov '15 18:37, 1 edit
    I wonder if there are any official policies that airlines use when faced with such situations (landing on a busy road, for example). I know of one instance of a light aircraft that made an emergency landing on a beach and killed two people who were walking there. It is likely the pilot did not know that would happen, but if he had, should he have ditched into the sea instead?
  9. Subscriber C J Horse
    A stable personality
    Near my hay.
    Joined
    27 Apr '06
    Moves
    52400
    02 Nov '15 18:56, 1 edit
    Originally posted by JS357
    Trolley Problem and Self-Driving Cars

    I recently read or saw a discussion of this problem by people who think it is, or will be, a real question.

    In the case of self-driving cars, think of this variant on the Trolley Problem.

    You are driving your car (legally, with current technology) and come into a situation where you have to choose between running over ...[text shortened]... eath. Thinking rationally, which do you choose? Put another way, what is the right thing to do?
    If you cannot stop before reaching the hikers, you were driving too fast for the conditions. I would hope that a self-driving car would adopt a sensible speed.
  10. twhitehead
    Cape Town
    Joined
    14 Apr '05
    Moves
    52945
    02 Nov '15 19:54
    Originally posted by C J Horse
    If you cannot stop before reaching the hikers, you were driving too fast for the conditions. I would hope that a self-driving car would adopt a sensible speed.
    It is certainly a good point that, for obstacles in front of it, a self-driving car should always be able to stop in time. I do wonder, though, just how well they can judge slippery conditions; humans are notoriously bad at judging stopping distances when it is slippery.
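
    A back-of-envelope calculation shows how much is at stake there, using the standard d = v^2 / (2 * mu * g) estimate and typical textbook friction values (my numbers, purely illustrative):

        G = 9.81  # gravitational acceleration, m/s^2

        def stopping_distance_m(speed_kmh, mu):
            v = speed_kmh / 3.6  # convert km/h to m/s
            return v * v / (2 * mu * G)

        for mu, surface in [(0.7, "dry asphalt"), (0.4, "wet asphalt"), (0.1, "ice")]:
            print(f"{surface}: {stopping_distance_m(80, mu):.0f} m")
        # at 80 km/h -> dry asphalt: 36 m, wet asphalt: 63 m, ice: 252 m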

    However, the question still remains for other scenarios, such as an oncoming vehicle moving into your lane with the only alternatives being to hit it or to leave your lane and hit pedestrians. From what I have seen of self-driving cars so far, I think the car would merely brake as hard as it could and stick to its lane. The algorithms for leaving its lane would almost certainly require a clear path ahead.
  11. KazetNagorra
    Germany
    Joined
    27 Oct '08
    Moves
    3118
    02 Nov '15 20:46
    Originally posted by JS357
    Trolley Problem and Self-Driving Cars

    I recently read or saw a discussion of this problem by people who think it is, or will be, a real question.

    In the case of self-driving cars, think of this variant on the Trolley Problem.

    You are driving your car (legally, with current technology) and come into a situation where you have to choose between running over ...[text shortened]... his thread will not get the usual politically driven responses, but that’s a feature, not a bug.
    I would say your source is wrong and it is extremely unlikely that a self-driving car would detect two different emergencies exactly at the same time.
  12. JS357
    Joined
    29 Dec '08
    Moves
    6788
    02 Nov '15 21:03
    Originally posted by twhitehead
    It is likely the programming simply won't be looked at (rather than deliberately hidden) until an accident occurs.
    As pointed out, self-driving cars already exist and are on the road, as are cars that brake automatically or take other safety measures, yet no explanation of how they make such evaluations has been published by their manufacturers.
    For the US there is this:

    http://cyberlaw.stanford.edu/wiki/index.php/Automated_Driving:_Legislative_and_Regulatory_Action

    The first footnoted reference is interesting.
  13. JS357
    Joined
    29 Dec '08
    Moves
    6788
    02 Nov '15 21:42
    Originally posted by KazetNagorra
    I would say your source is wrong and it is extremely unlikely that a self-driving car would detect two different emergencies exactly at the same time.
    I think autonomous vehicle software is going to have to manage multifactor situations like the one I described in my freeway encounter: needing to slow down while avoiding being rammed by a car behind that isn't slowing down. It wouldn't necessarily detect the two issues at exactly the same moment in time.
  14. Zugzwang
    Joined
    08 Jun '07
    Moves
    2120
    02 Nov '15 22:41
    For further reading:
    _The Trolley Problem: Would You Throw the Fat Guy Off the Bridge? A Philosophical Conundrum_
    by Thomas Cathcart (2013)
  15. Joined
    02 Jan '06
    Moves
    10087
    02 Nov '15 22:59, 1 edit
    Originally posted by JS357
    Trolley Problem and Self-Driving Cars

    I recently read or saw a discussion of this problem by people who think it is, or will be, a real question.

    In the case of self-driving cars, think of this variant on the Trolley Problem.

    You are driving your car (legally, with current technology) and come into a situation where you have to choose between running over ...[text shortened]... his thread will not get the usual politically driven responses, but that’s a feature, not a bug.
    A self-driving car? Why not take a bus?

    The only question is whether the government will one day decide we all must take the bus or a self-driving car.

    My guess is they will.