April 12th, 2008, 12:13 AM
Shrapnel Fanatic
Join Date: Mar 2003
Location: CHEESE!
Posts: 10,009
Thanks: 0
Thanked 7 Times in 1 Post
Re: OT - Sentience
That's a formula of actions and reactions, nothing more. I could achieve the same sort of result with a bunch of gears in a box. (Well, I could if I understood mechanical things a *lot* better.)
Are you going to argue that a bunch of gears in a box are aware?
__________________
If I only could remember half the things I'd forgot, that would be a lot of stuff, I think - I don't know; I forgot!
A* E* Se! Gd! $-- C-^- Ai** M-- S? Ss---- RA Pw? Fq Bb++@ Tcp? L++++
Some of my webcomics. I've got 400+ webcomics at last count, some dead.
Sig updated to remove non-working links.
April 12th, 2008, 10:42 AM
First Lieutenant
Join Date: Jan 2005
Posts: 689
Thanks: 0
Thanked 0 Times in 0 Posts
Re: OT - Sentience
I wouldn't in that case, because the gears in the box wouldn't be able to perceive. I would, however, argue that a computer could be considered at least partially aware (though not self-aware) if it were connected to sensory devices. Say it was connected to a camera, and whenever it detected anything in the camera's view it set off an alarm. It perceives, and it reacts.
Humans aren't different, the way I see it; it's just that we react not only to what we perceive but also to our own thoughts. We're far more complex, currently, but the principle is the same.
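To make the camera-and-alarm idea concrete, here is a minimal sketch of such a perceive-and-react loop (Python; read_camera() and detect_motion() are hypothetical stand-ins for whatever capture and vision code a real setup would actually use):
Code:
import time

def read_camera():
    """Hypothetical stand-in for grabbing one frame from the camera."""
    return {"changed": False}  # placeholder frame data

def detect_motion(frame):
    """Hypothetical detector: True if anything shows up in the frame.
    A real system would call into an actual vision library here."""
    return frame is not None and frame.get("changed", False)

def sound_alarm():
    print("ALARM: something was detected!")

# The whole "partially aware" loop: perceive, evaluate, react.
while True:
    frame = read_camera()        # perceive
    if detect_motion(frame):     # evaluate the perception
        sound_alarm()            # react
    time.sleep(0.1)              # then go back to watching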
April 12th, 2008, 11:08 AM
Shrapnel Fanatic
Join Date: Mar 2003
Location: CHEESE!
Posts: 10,009
Thanks: 0
Thanked 7 Times in 1 Post
Re: OT - Sentience
Right, right... because a more complex gearbox is capable of different things than a less complex gearbox.
Sorry for the sarcasm, but this is a redux of an argument I had before that went nowhere.
__________________
If I only could remember half the things I'd forgot, that would be a lot of stuff, I think - I don't know; I forgot!
A* E* Se! Gd! $-- C-^- Ai** M-- S? Ss---- RA Pw? Fq Bb++@ Tcp? L++++
Some of my webcomics. I've got 400+ webcomics at last count, some dead.
Sig updated to remove non-working links.
April 12th, 2008, 01:50 PM
General
Join Date: Apr 2001
Location: Cincinnati, Ohio, USA
Posts: 4,547
Thanks: 1
Thanked 7 Times in 5 Posts
Re: OT - Sentience
Well yes, a more complex gear box *is* capable of different things than a less complex gear box. If you have only two gears, for instance, you certainly can't build a calculator out of them - but Charles Babbage was able to make a calculator out of hundreds of gears!
(Well, OK, I suppose you *could* build a calculator out of two gears... but the "calculator" would only be capable of very simple calculations like multiplying numbers by the ratio of the numbers of teeth on the gears... )
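As a quick illustration of that parenthetical, here is what the two-gear "calculator" amounts to (Python; the tooth counts are made-up example numbers, just to show the ratio arithmetic):
Code:
def two_gear_output(input_turns, driver_teeth, driven_teeth):
    """Two meshed gears can only 'compute' one thing:
    multiply the input turns by their fixed tooth ratio."""
    return input_turns * (driver_teeth / driven_teeth)

# Example: a 40-tooth gear driving a 10-tooth gear multiplies by 4.
print(two_gear_output(3, driver_teeth=40, driven_teeth=10))  # -> 12.0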
And while the human brain is not made of gears, the components (neurons) that it IS made out of are not much more functionally complex, and can easily be emulated. I know that neurons have internal components such as a nucleus and cytoplasm, but in my opinion at least, those are just "implementation details" - a neuron would be a neuron whether it's made of cytoplasm or crystals or whatever. I think it would be arrogant and shortsighted to assume that creatures worthy of moral consideration could only be composed of organic components. Even if God made man in his own image, could God not have created creatures that were not in his own image, but were still self-aware? And with intelligence and free will, whether they come from God or not, could not man create self-aware creatures of his own? We already do, of course; it's called reproduction. Clearly *something* is being transmitted from parents to children that gives them the capability of self-awareness. It might be genes, or the essence of God, or the Force, or farandolae (those fictitious mitochondrial parasites posited by Madeleine L'Engle in her novels that supposedly grant humans special powers), or a combination of factors, or maybe something we don't even know about yet - but whatever it is, I'm confident in humanity's ability to duplicate it artificially at some point in the future.
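As one illustration of the "neurons can be emulated" point, here is a very rough sketch of an artificial neuron - the classic weighted-sum-and-threshold model, with arbitrary example weights rather than anything measured from a real brain:
Code:
def neuron(inputs, weights, threshold):
    """A crude functional stand-in for a neuron: sum the weighted
    inputs and 'fire' (return 1) if the total clears the threshold,
    otherwise stay quiet (return 0)."""
    activation = sum(x * w for x, w in zip(inputs, weights))
    return 1 if activation >= threshold else 0

# Example: three incoming signals with different connection strengths.
print(neuron([1, 0, 1], weights=[0.6, 0.9, 0.4], threshold=1.0))  # fires -> 1
print(neuron([0, 1, 0], weights=[0.6, 0.9, 0.4], threshold=1.0))  # stays quiet -> 0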
Maybe it's just a philosophical difference between us - I tend to believe in things that I can perceive, and consider abstractions to be "less real", so if I perceive a computer to be acting in a self-aware manner, I'd consider it to be self-aware. Since I can't seem to perceive God (apparently something is lacking in me because everyone else can!), I don't consider God to be as "real" as human beings or computers or chairs or planets or whatever. Maybe that's what draws me toward the Buddhist/Jewish philosophy of focusing on improving the here and now, and drives me away from the Christian/Muslim philosophy of treating the here and now as irrelevant compared to what comes after... the strange thing is, both philosophies seem to practice the opposite of what they preach - Buddhists seem to not be much into worldly activism, while Christians go nuts about it! Perhaps things will change though as cultures blend... I've read reports of Tibetan monks being involved in protests, and yet there are a lot more interfaith dialogues than there used to be...
__________________
The Ed draws near! What dost thou deaux?
April 13th, 2008, 12:20 AM
Shrapnel Fanatic
Join Date: Mar 2003
Location: CHEESE!
Posts: 10,009
Thanks: 0
Thanked 7 Times in 1 Post
Re: OT - Sentience
Mechanical input goes in, mechanical output goes out. Also, light reflects and/or refracts as it hits it.
Basically, you're saying that, because something is complex, it must be capable of being aware.
I'm saying that making something simple more complex does not give it anything the simple thing doesn't have, unless you change its nature.
__________________
If I only could remember half the things I'd forgot, that would be a lot of stuff, I think - I don't know; I forgot!
A* E* Se! Gd! $-- C-^- Ai** M-- S? Ss---- RA Pw? Fq Bb++@ Tcp? L++++
Some of my webcomics. I've got 400+ webcomics at last count, some dead.
Sig updated to remove non-working links.
April 13th, 2008, 11:05 AM
First Lieutenant
Join Date: Jan 2005
Posts: 689
Thanks: 0
Thanked 0 Times in 0 Posts
Re: OT - Sentience
How's that so different from the brain where electrical input goes in and electrical output goes out?
Anyway, personally I'd guess the brain was also simple in the beginning, so simple that it wasn't even aware. It had to be simple; things like that don't just suddenly appear out of nowhere. As it evolved over millions of years, it slowly grew in complexity, and also in awareness.
We humans have the most complex brains (that I know of, anyway), and we could also be said to be, probably by far, the most aware, and self-aware, creatures on Earth.
April 14th, 2008, 10:48 AM
Shrapnel Fanatic
Join Date: Mar 2003
Location: CHEESE!
Posts: 10,009
Thanks: 0
Thanked 7 Times in 1 Post
Re: OT - Sentience
Basically, I have never seen nor heard of a machine that outputs anything that wasn't either in the input or in the machine.
Even an electrical generator. Before you say 'mechanical energy to electricity', let me point out, 'magnets' and 'electrons'.
Basically, you're presupposing a machine that does something no other machine I have ever heard of does. Yes, it's possible. But the odds are against it.
__________________
If I only could remember half the things I'd forgot, that would be a lot of stuff, I think - I don't know; I forgot!
A* E* Se! Gd! $-- C-^- Ai** M-- S? Ss---- RA Pw? Fq Bb++@ Tcp? L++++
Some of my webcomics. I've got 400+ webcomics at last count, some dead.
Sig updated to remove non-working links.
April 14th, 2008, 02:54 PM
National Security Advisor
Join Date: Dec 1999
Posts: 8,806
Thanks: 54
Thanked 33 Times in 31 Posts
Re: OT - Sentience
Interesting original thought extension, Ed. I would say that "sentience" is an (imperfect) term from (imperfect) Cognitive Science, but if you boil it down to logical calculations and/or behavior, then I would say it's true that machines have sentience, as do animals. Basically, my matrix looks like this:
Code:
          Logic       Language       Identity       Experience/Consciousness
Human     yes/much    yes/advanced   yes - noisy    yes
Animal    yes/some    yes/basic      yes - simple   yes
Robot     yes/lots    yes/logical    maybe          no/artificial
The lower-right cell is the clincher for my moral decisions. Go ahead and power down your computer. But please be kind to animals.
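One way to read that decision rule, purely as an illustration (the values below just restate the table above; the data structure and function names are made up for the example):
Code:
# The matrix above, restated as data.
SENTIENCE_MATRIX = {
    "Human":  {"logic": "yes/much", "language": "yes/advanced", "identity": "yes - noisy",  "experience": "yes"},
    "Animal": {"logic": "yes/some", "language": "yes/basic",    "identity": "yes - simple", "experience": "yes"},
    "Robot":  {"logic": "yes/lots", "language": "yes/logical",  "identity": "maybe",        "experience": "no/artificial"},
}

def deserves_moral_consideration(kind):
    """The 'lower-right cell is the clincher' rule: moral weight follows
    experience/consciousness, not logic or language ability."""
    return SENTIENCE_MATRIX[kind]["experience"].startswith("yes")

for kind in SENTIENCE_MATRIX:
    print(kind, deserves_moral_consideration(kind))
# Human True, Animal True, Robot False -> power down the computer, but be kind to animals.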
April 14th, 2008, 03:00 PM
General
Join Date: Apr 2001
Location: Cincinnati, Ohio, USA
Posts: 4,547
Thanks: 1
Thanked 7 Times in 5 Posts
Re: OT - Sentience
So the only reason you consider robots not to be worthy of moral consideration is that they have only "artificial" experience? What makes you think that experience is any less real than the kind that comes from naturally occurring entities? And could it not be argued that humans have artificial experience as well? Babies don't just drop from the sky - they require other humans to create them!
I guess I just don't buy into the whole "nature is good, humans are bad" thing quite completely enough to go for that argument.
__________________
The Ed draws near! What dost thou deaux?
April 14th, 2008, 04:29 PM
National Security Advisor
Join Date: Dec 1999
Posts: 8,806
Thanks: 54
Thanked 33 Times in 31 Posts
Re: OT - Sentience
Because I have programmed A.I.s, and my view of how they work shows me zero evidence that they require, or would have, anything like my experience of life in the moment - my consciousness, as opposed to my memory and data representations.
On the other hand, I also hold that morality is a choice, unless you subscribe to a doctrine such as the ones religions provide.