A chimpanzee is genetically altered in vitro to have all the faculties of a human being. He is born, he learns as any child learns, he grows, he speaks, he feels. He can express "I Am." But he is physically a chimpanzee.
Must he be afforded all the rights to which human beings are entitled?
Will it come down to proving he has a soul?
How will we deal with him?
*An interesting take on the rights of androids is shown in Star Trek: The Next Generation. Data (an android, built by man) is put on trial to decide whether he is the property of Starfleet or whether he has the right to choose his own destiny. Another example is when Data constructs a daughter, and the question of whether he has a right to retain custody.
The term soul is figurative and can be interpreted in different ways across the spectrum.
But no, he is not held to the standards of man, because those standards were created by men for men; just as other inventions of man are not held accountable for their actions, their creators are.
03:38 PM
Boondawg Member
Posts: 38235 | From: Displaced Alaskan | Registered: Jun 2003
Sentience refers to utilization of sensory organs, the ability to feel or perceive subjectively, not necessarily including the faculty of self-awareness. The possession of sapience is not a necessity. The word sentient is often confused with the word sapient, which can connote knowledge, consciousness, or apperception. The root of the confusion is that the word conscious has a number of different usages in English. The two words can be distinguished by looking at their Latin roots: sentire, "to feel"; and sapere, "to know".
Sentience is the ability to sense. It is separate from, and not dependent on, aspects of consciousness.
Philosophy and sentience: Many philosophers, notably Colin McGinn, believe that sentience cannot ever be understood, no matter how much progress is made by neuroscience in understanding the brain. Holders of this position are called New Mysterians. They do not deny that most other aspects of consciousness are subject to scientific investigation, from creativity to sapience, to self-awareness. New Mysterians believe that only sentience cannot be comprehensively understood by science. There continues to be much debate among philosophers, with many adamant that there is no really hard problem with sentience whatsoever.
Non-human animal rights and sentience: In the philosophy of animal rights, sentience is commonly seen as the ability to experience suffering. The 18th century philosopher Jeremy Bentham raised the issue of non-human suffering and sadism in An Introduction to the Principles of Morals and Legislation: "The French have already discovered that the blackness of the skin is no reason why a human being should be abandoned without redress to the caprice of a tormentor... What else is it that should trace the insuperable line? Is it the faculty of reason, or, perhaps, the faculty of discourse? But a full-grown horse or dog is beyond comparison a more rational, as well as a more conversable animal, than an infant of a day, or a week, or even a month, old. But suppose the case were otherwise, what would it avail? The question is not, "Can they reason?" nor, "Can they talk?" but, "Can they suffer?"
As Peter Singer argues, this is often dismissed by appeal to a distinction that condemns human suffering but allows non-human suffering. However, as many of the suggested distinguishing features of humanity (extreme intelligence, highly complex language, etc.) are not present in marginal cases such as young or mentally disabled humans, it appears that the only distinction is a prejudice based on species alone, which non-human animal rights supporters call speciesism - that is, differentiating humans from other animals purely on the grounds that they are human.
On the other hand, some have argued that modern science cannot determine exactly where sentience begins, going from bacteria to animals. This would pose considerable complications for a theory of unnecessary suffering. Others do not object to the conclusion that it is wrong to cause unnecessary suffering, but contend that on this issue the moral concept of right and wrong should not mirror human nature but should instead be modelled on nature. Since animals routinely kill each other and inflict (at times unnecessary) suffering on each other, then, as part of the animal kingdom, it would not be wrong for us to do so either. This is a view most of the world's population follows, whether it intentionally acknowledges it or not. On this account, the reason the rules of nature regarding killing are not applicable towards other humans is that we are then dealing with the human realm: our own psychology and our collective sociology make it unfavorable (i.e. less safety, added stress, reduced efficiency) to partake in killing other humans. Seen in this light, it would not be speciesism to kill animals but spare humans, but instead an outgrowth of humans' (as a species) naturalistic adaptation while observing all natural ethics regarding suffering.
Artificial intelligence: The issue of sentience also frequently arises in science fiction stories describing robots or computers with artificial intelligence. Intelligence and sentience are quite distinct, so the question arises as to whether computers with artificial intelligence will become sentient.
Some science fiction, most notably the Star Trek franchise, uses the term sentience to describe a species with human-like intelligence.
Eastern religion: Eastern religions including Hinduism, Buddhism, Sikhism and Jainism recognize nonhumans as sentient beings. In Jainism and Hinduism, this is closely related to the concept of ahimsa, nonviolence toward other beings. In Jainism, all matter is endowed with sentience; there are six degrees of sentience. Water, for example, is a sentient being of the first order, as it is considered to possess only one sense, that of touch. Man is considered to be a sentient being of the sixth order. According to Buddhism, sentient beings made of pure consciousness are possible. In Mahayana Buddhism, which includes Zen and Tibetan Buddhism, the concept is related to the Bodhisattva, an enlightened being devoted to the liberation of others. The first vow of a Bodhisattva states: "Sentient beings are numberless; I vow to free them."
Western religion: Western religion is increasingly becoming aware of the concept of sentience. Andrew Linzey, founder of the Oxford Centre for Animal Ethics in England, is known as a foremost international advocate for recognizing animals as sentient beings in Biblically-based faith traditions. The Interfaith Association of Animal Chaplains encourages animal ministry groups to adopt a policy of recognizing and valuing sentient beings.
Sentience: The next moral dilemma. By Richard Barry, ZDNet.co.uk. Published: 24 Jan 2001 17:30 GMT
Sometime in the future, machines will reach a level of intelligence that will challenge, or even surpass, our own.
Revered members of the academic community deem the event an inevitability. These include names like Ray Kurzweil -- inventor of the first reading machine for the blind -- Berkeley's John Searle and perhaps the man who deserves most credit for adding legitimacy to this belief, Sun Microsystems' Bill Joy.
If they are right, one day man will give life to a new race of intelligent sentient beings powered by artificial means.
If we can, for argument's sake, agree that this is possible we should consider how a sentient artificial being would be received by man and by society. Would it be forced to exist like its automaton predecessors who have effectively been our slaves, or would it enjoy the same rights as the humans who created it, simply because of its intellect?
It is an enormous question that touches religion, politics and law, but little consideration is given to the dawn of a new intelligent species and to the rights an autonomous sentient being could expect.
For a start, it would have to convince us that it was truly sentient: intelligent and able to feel (although it is debatable whether its feelings would mirror our own).
Peter Garrett, director of research and education at the pro-life charity Life, takes a strong stance: "I think it would take around four years of questioning before I would be satisfied that this being could be considered a person."
Garrett's prudence is perhaps born of his strong religious convictions, which can weigh heavily on people's thoughts about whether or not an artificial sentient being could ever hope to be seen as an autonomous person (not to be confused with becoming a human being).
"I think even when we grant the label person to this new entity, it would still not be a human being... It is still a man made being and not made in the image of God... and while that may not be important on one level -- I think the secular world and the secular legal system regard it as being very unimportant -- I feel that the robot would still be a product of humanity, whereas man is believed to be a creation of God, made in the image and likeness of God," he says.
This is significant because it marks a boundary around us as a species and protects us from usurpers that can never hope to be like us. Or more accurately, like God.
Garrett's way of thinking is reflected when you look at why moral abuses were inflicted, for example, on black people in South Africa. Venture into the northern province of Pretoria, where hundreds of thousands of Boers -- the Dutch settlers who became the fathers of apartheid -- still send their children to whites-only schools -- Volkskool -- and you will be told that blacks are not, like them, made in the image of God.
But Garrett thinks religions would be able, in the end, to cope with the idea of a new sentient race, and that these beings would be allowed to worship: "I think the Catholic church would say: 'You are not a human being, you are not Imago Dei, but you are a person and we should perhaps have a third Vatican council to discuss terms under which persons of your type can join the Catholic church.' I think it might take some time and a fair bit of getting used to, but I cannot see any justified reason why it could not happen."
Even Buddhists, who profess to honour and respect "all sentient beings", as written in the Dharma, admit to some confusion over artificial worshippers. "Of course we believe all sentient beings have the right to live a full life with all the blessings all living creatures enjoy," a spokesman for the Buddhist Cooperative in London says. "It is a curious thing to even imagine, but we believe all that has life has the right to enjoy it to its full. I see no reason to make an exception to this new life form, although I am sure initially many eyebrows will be raised, if only out of wonder."
But Garrett believes that sentient beings will never be the same as humans. "I think what we would have created would be, yes, a person, but not a human being and not Imago Dei. I still believe the category of human being would be very different, and one reason is that as human beings we have to move forward towards death and we have to learn how to face that. That is part of being human."
In the film Bicentennial Man in which Robin Williams plays Andrew, a sentient robot who looks, feels and thinks like a human but is still classed as a droid, the death issue provides the final step toward the revered status of "human". Andrew swaps his mechanical innards for a set of organic ones that eventually age and kill him off in his sleep, earning him the posthumous award of human being.
The reality of artificial beings being granted immortality through upgrades and repairs does nothing to quell the fears of those who believe robots could one day replace man as Earth's dominant species. For them at least, the question is not about whether a robot is equal to man, but rather, through its intellect and potential immortality, superior to us -- some kind of god-like race.
So, in summary, a sentient robot could in theory wander into a place of worship, light a candle and get on with praying with no fear of abuse or intolerance from religious leaders. But what would happen if someone, or something under the control of someone, harmed it deliberately? Would we be ready to afford this new race legal protection that could send a human to jail if it were deliberately harmed?
"Yes, I think we probably could" says Robin Bynoe, high-tech lawyer and partner at London law firm Charles Russell. "What is most likely to happen is a gradual move toward a structured process of liability that would provide this new being with legal protection. For example there will at some point have to be a legal protection against, say someone pulling out its wires." Bynoe explains: "If pain is experienced by the robot it would have to have similar laws that protect dogs or cats."
Bynoe is confident the legal system would be flexible enough to cope with the arrival of a new man made species "in the fullness of time." "I think the legal system would be able to cope with a robot that can think and feel, although it would have to demonstrate these abilities," he says.
He does not like the argument that a sentient robot will necessarily be a bipedal mechanism created to look like a man, and believes it is more likely to resemble HAL of 2001. He nominates Bill Gates' house as a possible candidate for the world's first sentient artificial being, but concedes it will be very difficult for even an intelligent house to convince the world that it has the right to any moral say.
In all likelihood, the arrival of a new sentient species born of robotics will present itself as a droid performing a certain task that man cannot, or would rather not, do. In Ridley Scott's Blade Runner, the Nexus 6 series of replicants got fed up with being slaves of humans. No wonder: speciesism played a major part in Scott's scenario, with humans slandering the replicants as "skin jobs" and forcing them to work in the service of man on pain of "retirement" should they resist.
Imagine the problems our grandchildren would face if the robots suddenly downed their tools and demanded liberation. As Bynoe puts it: "No one is going to pay good money for something that sulks, are they?"
This step to self-awareness marks a significant point in our sentient being's life. At the point where it decides it wants to leave its owner and do its own thing, there could be trouble.
Steve Grand, the engineer who created the AI interactive game Creatures, starring the adorable Norns, has been dubbed the "most intelligent man in Britain" and believes that once his robot orangutan Lucy achieves consciousness, she should be afforded the same rights as any human. He does accept, however, that speciesism is a threat to how she will be treated. That and her appearance. Asked if it would matter whether she were bipedal or a sentient house, Grand laughs, but acknowledges that "we are incredibly anthropomorphic, aren't we?"
Time, says Grand, will be the key to man and artificial sentient beings living together: "It will take a very long time to get used to the idea, and that's not a bad thing. We simply could not take in the sudden arrival of a new intelligent species."
But what if these beings do not want to do man's bidding? What if they do not want to do the jobs they were designed to do? "I think that these machines will be different from us by virtue of the fact that they will enjoy what they have been designed to do... what they were made for. So if you want a miner or maybe a [robot] horse to pull a plough, it will enjoy doing it and enjoy doing it well."
While Grand agrees that a sentient machine may toy with the idea of choice, and thus with whether or not it actually wants to do the job it was designed for, "it will probably be to a lesser extent than you might think. We are much more stuck with our motivations than we like to believe we are. The reason we hate working nine to five is that we weren't built for it."
"We were built for life in the jungle and an awful lot of human stress in the 21st Century is due to the fact that we don't live in the jungle anymore. We are trapped by our emotions, just as they will be trapped by their emotions, but they will have evolved for a certain job, not for life in the jungle, that will be the status quo as far as they are concerned," he adds.
But the term "sentience" denotes empowerment not only of the mind, but of the will. It gives the power to choose and the will to exercise that choice.
The question of morality -- whether an intelligent thinking machine could somehow assert its individuality and be granted the right to live an autonomous and free existence -- is surely more important than the question of whether or not it can be done. Will man be willing to share this Earth with a creature that perhaps looks different to us, is more intelligent than us and stronger than us?
Consensus within the scientific community does not exist, but those who believe sentience is achievable suggest it could be as much as a century away. We should be thankful we have that much time ahead to consider the moral implications of creating another sentient being.
"Painful isn't it, a life of fear? That's what it is to be a slave," Roy, Nexus 6 Replicant, Blade Runner.
If you lose an arm, you can get an artificial replacement. Let's say the technology is so advanced that you can't tell the replacement from the original arm once installed. If you lose an arm, and get it replaced, as far as you're concerned it's like you never lost the arm to begin with. You are 100% functional, right down to touch, feel, and even goosebumps.
Now, let's assume they can cure brain tumors the same way. Cut out the tumor and replace the removed portion with a small artificial device that duplicates the neural processes of the removed section. The nerves connecting to the artificial implant can't tell anything is changed - thought, sensation, feeling all travel through the chip just as they did before you had the tumor.
Are you still you?
What if more of your brain is replaced?
What if your entire brain is completely replaced? Are you still you? Are you still a living, sentient human? If not, at what point do you cease being "you"?
04:28 PM
Boondawg Member
Posts: 38235 | From: Displaced Alaskan | Registered: Jun 2003
quote
If you lose an arm, you can get an artificial replacement. Let's say the technology is so advanced that you can't tell the replacement from the original arm once installed. If you lose an arm, and get it replaced, as far as you're concerned it's like you never lost the arm to begin with. You are 100% functional, right down to touch, feel, and even goosebumps.
Now, let's assume they can cure brain tumors the same way. Cut out the tumor and replace the removed portion with a small artificial device that duplicates the neural processes of the removed section. The nerves connecting to the artificial implant can't tell anything is changed - thought, sensation, feeling all travel through the chip just as they did before you had the tumor.
Are you still you?
What if more of your brain is replaced?
What if your entire brain is completely replaced? Are you still you? Are you still a living, sentient human? If not, at what point do you cease being "you"?
Exactly. If someone transplants a human brain, with all its thoughts, emotions, everything that made that donor a person, into the body of an ape, is that person still that person?
Let's not forget, it is believed by some that animals have no soul. And Black African Americans once ranked amongst the sentient beings with no soul and no noteworthy inner life. Their gesticulations and lack of comprehensible language, when trapped and tied down, had them disqualified in the pious white mind only a couple of centuries back. Their submission, use and abuse, like that of animals, created no inner conflict or self-examination.
[This message has been edited by Boondawg (edited 11-30-2007).]
04:32 PM
ryan.hess Member
Posts: 20784 | From: Orlando, FL | Registered: Dec 2002
quote
Originally posted by Boondawg:
Exactly. If someone transplants a human brain, with all its thoughts, emotions, everything that made that donor a person, into the body of an ape, is that person still that person?
Let's ask Monkeyman?
04:37 PM
frontal lobe Member
Posts: 9042 | From: Brookfield, Wisconsin | Registered: Dec 1999
"If I'm not me, who in the world (edited for swearing) am I?" Arnold Schwarzenegger in Total Recall. One of my all-time favorite movie lines. Sorry to interrupt. Just made me think of it.
06:36 PM
Wichita Member
Posts: 20703 | From: Wichita, Kansas | Registered: Jun 2002
quote
Sentience refers to utilization of sensory organs, the ability to feel or perceive subjectively, not necessarily including the faculty of self-awareness. The possession of sapience is not a necessity. The word sentient is often confused with the word sapient, which can connote knowledge, consciousness, or apperception. The root of the confusion is that the word conscious has a number of different usages in English. The two words can be distinguished by looking at their Latin roots: sentire, "to feel"; and sapere, "to know".
Sentience is the ability to sense. It is separate from, and not dependent on, aspects of consciousness.
quote
A chimpanzee is genetically altered in vitro to have all the faculties of a human being. He is born, he learns as any child learns, he grows, he speaks, he feels. He can express "I Am." But he is physically a chimpanzee.
Must he be afforded all the rights to which human beings are entitled?
Will it come down to proving he has a soul?
How will we deal with him?
quote
Originally posted by ryan.hess:
Let's ask Monkeyman?
The expert has arrived. (I can't believe I didn't see this thread earlier. My bad.)
I am afforded the same rights that you human beings have.
I have a soul (and am going to Heaven).
I've always been a little difficult to deal with but others are, too.
Would you treat somebody differently if they were born a different color than you? How about if they had a birth defect that made them different from the "normal" child? What if they were born a genius? Or born with a genetic mutation (i.e. X-Men)? A (wo)man is a (wo)man no matter what he/she looks like.