More Equal than Others
Tesla Motors, a US-based electric car company, introduced Autopilot, a driver-assistance feature, in its electric cars. These cars drive millions of miles every day on highways across the world, collecting data and sending it back to a huge central database. This, in turn, promises to make autonomous driving for all a reality in the not-too-distant future.
When designing such technology, engineers confront moral questions that human drivers rarely face in real life. Suppose the car is fast approaching a tunnel entrance when its braking mechanism fails. Suppose a child has tripped and is lying across the entrance, blocking the tunnel. Should the car be designed to drive on and kill the innocent child, or to swerve into the tunnel wall and kill the innocent driver? Should the engineers program the car to choose based on the age or the number of the victims? Suppose the car could plow forward into multiple victims or swerve to the side and kill one person who was previously not in danger: should one die to save many?
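To see what "programming the choice" would actually mean, here is a deliberately crude sketch in Python. Everything in it is hypothetical: the Outcome structure, the ages, and the tie-breaking rule are inventions for the thought experiment, and nothing of the sort is claimed about any real vehicle. The point is only that any such rule, however it is written, quietly encodes a moral judgment; in this sketch, "fewer victims first, and spare the younger."

```python
from dataclasses import dataclass
from typing import List

@dataclass
class Outcome:
    """One possible maneuver in the thought experiment (hypothetical model)."""
    label: str            # e.g. "continue straight", "swerve into wall"
    victim_ages: List[int]  # ages of the people who would be harmed

def choose_maneuver(outcomes: List[Outcome]) -> Outcome:
    """Pick the outcome with the fewest victims; break ties by preferring
    to harm older victims over younger ones.

    This is not how any real vehicle is programmed. It only makes the
    chapter's question concrete: writing down a rule forces a choice
    about whose lives count, and how much.
    """
    return min(outcomes, key=lambda o: (len(o.victim_ages), -sum(o.victim_ages)))

if __name__ == "__main__":
    tunnel = [
        Outcome("continue straight", victim_ages=[7]),   # the child at the entrance
        Outcome("swerve into wall", victim_ages=[40]),   # the driver
    ]
    print(choose_maneuver(tunnel).label)  # -> "swerve into wall"
```

Under this rule the car sacrifices the driver to spare the child; swap the tie-breaker and it does the opposite. Either way, the ethical decision has been made in advance, by whoever wrote the rule.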