Self Driving Cars

A dilemma about how to program a self-driving car showed me how interdisciplinary some problems can be.

Today was my first STEM Skills and STEM in Society class. The class started with a TED-Ed video that raised an interesting question: if you were programming a self-driving car and a box fell out of a truck ahead of it, would you program the car to let the box hit the car, swerve into an SUV, or hit a motorcycle? Here is the link to the video. Here's a picture of the situation:



This is a hard dilemma to answer. If I were the programmer, I would research each situation and find or calculate the average number of deaths it causes. Then I would choose the option with the lowest average number of deaths. This method lets the choice be made purely on empirical data rather than human biases. For example, if the programmer had a bad experience with a motorcyclist, they might be more likely to choose the motorcycle. Our biases affect us whether we are conscious of them or not.
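The decision rule described above could be sketched in a few lines of code. This is only a minimal illustration, and the death figures below are made-up placeholders, not real accident data:

```python
def least_harm_option(expected_deaths):
    """Return the option whose average (expected) death count is lowest.

    expected_deaths maps an option name to its average number of deaths.
    """
    return min(expected_deaths, key=expected_deaths.get)


# Purely illustrative, made-up averages for the three choices in the video:
options = {
    "stay in lane (box hits car)": 0.4,
    "swerve into SUV": 0.7,
    "swerve into motorcycle": 1.2,
}

print(least_harm_option(options))  # -> stay in lane (box hits car)
```

Because the rule depends only on the numbers, the same code gives the same answer no matter who runs it, which is the point about removing human bias.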

This problem is interesting to think about. It raises many philosophical questions, such as:

  • How do we value life?
  • Who is to blame? 
  • What principle or theory should be applied to help solve it?
and many more. The many unknowns also make this problem difficult to answer. This exercise taught me that many problems sit at the intersection of different fields, and that critical thinking and reasoning skills can help us find answers there.




Comments

  1. I like the way you considered the possible bias of the human programmer! It foreshadows the conversations we've had on objectivity and values in science and shows us how these same issues arise in tech and applied contexts.


