Essay about the Three Laws of Robotics

February 24, 2011
THE THREE LAWS OF ROBOTICS (from I, Robot by Isaac Asimov, 1941-1950)
ORIGINAL VERSION
1. A robot may not injure a human being or, through inaction, allow a human being to come to harm.
2. A robot must obey the orders given it by human beings except where such orders would conflict with the First Law.
3. A robot must protect its own existence as long as such protection does not conflict with the First or Second Law.
SECOND VERSION
1. No robot may harm a human being.
2. A robot must obey the orders given it by human beings except where such orders would conflict with the First Law.
3. A robot must protect its own existence as long as such protection does not conflict with the First or Second Law.
THIRD VERSION
1. No Machine may harm humanity; or, through inaction, allow humanity to come to harm.
Read "Robot Dreams" by Asimov. In this short story, a rogue robot "dreams" a new moral system for robots. Which ethical theory best categorizes this new moral system?

Question 1 – The moral theory that best describes the first version is deontology. Deontology is the most appropriate interpretation because the rules focus on duties rather than on consequences, which are never clearly considered. There are plenty of scenarios in which the first version would be a poor fit for everyday life. Consider the classic deontological maxim "One must never lie." What happens when an armed man orders a robot to reveal where a particular individual is? The robot cannot tell the armed man where the individual is, because doing so would allow a human being to come to harm, violating Rule 1; and since the order itself conflicts with the First Law, Rule 2 does not compel the robot to obey it.

Question 2 – The moral theory that best pertains to the third version is Utilitarianism. This is clear simply from the specific wording of Rule 1 of the third version: "No Machine may harm humanity; or, through inaction, allow humanity to come to harm." The fact that "humanity" appears in the rule implies that it concerns the whole rather than any single individual, which is by definition the concern of Utilitarianism.

Question 3 – The answer here again is the moral theory of Utilitarianism. The quick definition of Utilitarianism is that an action is morally right if its consequences are favorable to the majority. Thus, if a single human is putting the entire human race in jeopardy, the morally right action under this new moral system is to sacrifice that individual for the good of the many.
