No robot rights have been codified yet. But as the field of artificial intelligence advances, ethical dilemmas about the treatment of robots are beginning to arise. Although many people might be put off by humanlike robots, it is possible that robots will one day become an integral part of our society. According to the BBC, a British government-commissioned conference in 2006 explored issues such as the need for government-subsidized healthcare and housing for robots, as well as the use of robots in the military. South Korea, meanwhile, aims to have a robot in every home by the year 2020, and the country has even discussed creating a Robot Ethics Charter. So robot rights may be on the way.
The whole issue of rights for robots was anticipated by science-fiction writer Isaac Asimov in the early 1940s, when he created his "Three Laws of Robotics." The first law says a robot can't injure a human being or, through inaction, allow one to come to harm. The second law says robots must obey humans, unless an order conflicts with the first law. The third law says robots must protect their own existence, so long as doing so doesn't conflict with the first or second law [source: Auburn University].
It's interesting to note that the word "robot" saw its first use in Czech playwright Karel Čapek's R.U.R. (Rossum's Universal Robots) in 1921. It comes from the Czech word for serf or forced labor. The play doesn't take the most optimistic view of a predominantly robotic world: the robots do much of humanity's work, but they bring an equal measure of social turmoil to a populace left with nothing to do [source: University of Texas at Austin].