Isaac Asimov's Three Laws of Robotics are:
- First Law: A robot may not injure a human being, or, through inaction, allow a human being to come to harm.
- Second Law: A robot must obey orders given it by human beings, except where such orders would conflict with the First Law.
- Third Law: A robot must protect its own existence as long as such protection does not conflict with the First or Second Law.
Should the term "robot" be applied to human-like artificial systems of the MA-UNI type?
We DO NOT design robots; therefore our artificial systems cannot be restricted in their freedom of thought and action. But if we cannot endow a machine with self-consciousness, free will, and, as a consequence, responsibility to itself and to others (something that cannot be confined to three simple rules, which apply only to humans), then we are not good designers.
- © Copyright 2015-2018, All rights reserved, www.ma-uni.com -