let's say, two people if the car were to keep going straight. But if the driver pulls the lever and moves it to another track, it would only hit one person. So is it morally right or wrong to pull the lever? Because if the driver doesn't pull anything, it's not his or her responsibility, but two people will die. If you pull the lever, you save a life, but then you're deliberately killing someone who would otherwise not have died. And that person may be crossing the tracks thinking it's perfectly safe, because the trolley car wasn't supposed to go that way. That's the ethical, moral problem.

And I think the autonomous vehicle problem is akin to that, because computers, AI, will be making decisions, left turn, right turn, fast, slow, that will have lives riding on them. So I think the programming per se is not by rules. You don't really say, if there are two younger people and one older person, then do this or do that. No human will ever have to program (27/38)