WHAT YOU SHOULD KNOW




Getting It Right
Humanity needs to develop clear guidelines and ethics for artificial intelligence, but doing so will not be easy.
In March 2018, an autonomously driven Uber vehicle hit and killed a woman, making her the first pedestrian to be killed by a self-driving car. This immediately raised a difficult question: who is to blame? The owner, for owning the car and using it for its intended purpose? The manufacturer of the car? The company that built the software that controls it?
As “smart” machines become more and more a part of our daily lives, the question of how we regulate and control them becomes more pressing.
Some of these are practical questions. For instance, how do we prevent robots from being hacked and misused? IOActive, a firm of security consultants, has demonstrated how real this risk is by hacking into and taking control of Alpha 2, a humanoid robot designed to be a household assistant. They instructed it to pick up a screwdriver and repeatedly stab a tomato.
And what about morals – should we encode them into robots? If so, whose morals? Science fiction offers a good starting place for considering this question. The famous science fiction writer Isaac Asimov proposed, as long ago as 1942, three laws for robots. First, a robot must not injure a human or allow one to come to harm. Second, a robot must obey its orders, except where they would conflict with the first law. Third, a robot should protect itself, as long as that protection does not go against either of the first two laws.
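One way to see both the appeal and the limits of Asimov’s scheme is to notice that the three laws form a strict priority ordering: each law applies only insofar as it does not conflict with the laws above it. The sketch below is purely illustrative and not from any real robotics system; every function and field name in it is an assumption made for the example.

```python
# A minimal sketch of Asimov's three laws as a priority ordering.
# The action fields (harms_human, disobeys_order, harms_robot) are
# illustrative placeholders, not a real robot's API.

def permitted(action):
    """Return True if an action is allowed under the three laws."""
    # First Law: never injure a human being.
    if action["harms_human"]:
        return False
    # Second Law: obey orders, except where obeying breaks the First Law.
    if action["disobeys_order"]:
        return False
    # Third Law: self-preservation is permitted, but it never overrides
    # the two laws above, so a self-sacrificing action can still pass.
    return True

def choose(actions):
    """Pick a permitted action, preferring self-preserving ones."""
    allowed = [a for a in actions if permitted(a)]
    # Third Law acts only as a tie-breaker among permitted actions.
    allowed.sort(key=lambda a: a["harms_robot"])
    return allowed[0] if allowed else None

# Example: ordered to stab a tomato near a person's hand, the robot
# should prefer a harmless alternative even at cost to itself.
options = [
    {"name": "stab", "harms_human": True, "disobeys_order": False,
     "harms_robot": False},
    {"name": "shield", "harms_human": False, "disobeys_order": False,
     "harms_robot": True},
]
print(choose(options)["name"])
```

Even this toy version exposes the gaps: the laws say nothing about how to act when every available option harms someone, which is exactly the situation discussed next.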
These three laws are a good start, but they provide no clear guidance for some of the thorny situations a machine might face. Consider again a self-driving car that sees a pedestrian step unexpectedly into the street. It has to make what is essentially a moral choice. It can swerve dangerously to protect the pedestrian but risk its owner’s life, or it can prioritize its owner’s safety at the expense of the pedestrian’s. Does a self-driving vehicle have a duty of loyalty to its owner? If so, should taxis and public transit vehicles behave differently from privately owned cars? And should a vehicle’s calculations change if the pedestrian is a child or an elderly person?
It will take some time for us to come to grips with questions like these. In the meantime, here’s some advice. Be nice to machines. In the long run, it might pay to stay in their good books.

We have barely begun to scratch the surface of the problems our technological revolution is throwing up, but I’m sure we’ll figure them out as they arise. Besides, it took thousands of years for creation or evolution (wherever you stand on that question) to perfect humanity, so what’s the rush?

Please leave your thoughts and opinions in the comments box provided below.

Have a fruitful day!



Olusola Bodunrin is a graduate of Philosophy from the University of Ado-Ekiti. He is a professional writer who writes articles for publication and anchors ‘What You Should Know’ on SHEGZSABLEZS’ blog.
‘What You Should Know’ is a column that aims to educate and enlighten the public by debunking common falsehoods and myths.

Comments

  1. Insightful, thanks!

    1. You are very welcome. Please check out the other interesting and educative articles on the blog.

      God bless you.

      Regards
      Oluwole
