Task: 21st Century Skills #Technology Literacy
Case Study: Self-driving cars
Scenario 1:
Imagine it's the year 2030. Self-driving cars, once just a
futuristic concept, are now a common sight on the roads.
You own one yourself, using it for your daily commute
and enjoying the ease it brings to your life.
Of course, with all the advanced features it offers, the car
is quite expensive. Now, picture this: one day, your father
is on his way to the office in his self-driving car. He is
seated in the back while the car drives itself. Suddenly,
a small boy runs into the road. The situation unfolds so
quickly that the car has only two options:
1. Continue straight and severely injure the boy.
2. Take a sharp right turn to avoid the boy, crashing into a metal pole, damaging the car, and injuring your father.
This scenario highlights a critical ethical dilemma that developers face when designing a self-driving car's decision-making algorithm. Ultimately, the developer's morality shapes the machine's choices: whatever they believe is the "right" decision will be prioritised in the car's programming.
If you were in the developer's position, how would you approach this dilemma?
Scenario 2:
Let's now assume that the car has hit the boy who stepped in front of it. Considering this as an accident, who should
be held responsible, and why?
1. The owner of the car.
2. The manufacturing company.
3. The developer who designed the car’s algorithm.
4. The boy who ran into the road and was severely injured.

