April 11th, 2008, 07:24 PM
|
First Lieutenant
|
|
Join Date: Jan 2005
Posts: 689
Thanks: 0
Thanked 0 Times in 0 Posts
|
|
Re: OT - Sentience
But that's just the way computers are now, and even now all those things could be programmed into one. You could connect it to a temperature sensor and tell it that whenever the reading drops below 0 degrees Celsius it should take measures to warm itself up; it would basically be as if it could feel cold. At this stage computers are limited by their own programming, but in the future that might not always be so.
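Something like this toy loop is all I mean (the sensor and heater functions are made up, of course - stand-ins for whatever hardware you'd actually hook up):
[code]
# Toy "feel the cold" loop. read_temperature(), heater_on() and
# heater_off() are made-up placeholders for real hardware drivers.
import time

def run_thermostat(read_temperature, heater_on, heater_off):
    while True:
        celsius = read_temperature()   # perceive the environment
        if celsius < 0.0:
            heater_on()                # react: take measures to warm up
        else:
            heater_off()
        time.sleep(1.0)                # poll once a second
[/code]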
Like I said, our brain is little more than an advanced biological computer connected to some sensors, a lot of small factories that keep our machinery going, and a few tools that allow us to interact with our surroundings. It collects data from those sensors and stores it, and thus becomes *us*, the sum of our own memories and experiences.
From what I recall, they've found evidence indicating life on this planet may have been evolving for as long as 3.5 billion years. Computers have been under development for 80-ish years?
Somehow I think that should development on computers continue for another 3 billion years the result would be staggering.
|

April 11th, 2008, 09:18 PM
|
 |
Shrapnel Fanatic
|
|
Join Date: Mar 2003
Location: CHEESE!
Posts: 10,009
Thanks: 0
Thanked 7 Times in 1 Post
|
|
Re: OT - Sentience
If all you think you are is the sum of your memories, then who's the 'you' that's remembering those memories?
Memories don't explain awareness.
__________________
If I only could remember half the things I'd forgot, that would be a lot of stuff, I think - I don't know; I forgot!
A* E* Se! Gd! $-- C-^- Ai** M-- S? Ss---- RA Pw? Fq Bb++@ Tcp? L++++
Some of my webcomics. I've got 400+ webcomics at Last count, some dead.
Sig updated to remove non-working links.
|

April 11th, 2008, 11:02 PM
|
First Lieutenant
|
|
Join Date: Jan 2005
Posts: 689
Thanks: 0
Thanked 0 Times in 0 Posts
|
|
Re: OT - Sentience
Does there really need to be a 'you', though? Suppose the brain's just 'designed' to automatically shuffle through memories, or rather experiences, all the time. Trying to not think about anything is impossible, right?
What's awareness but the brain getting continuous input from your environment via your senses, comparing it with previous experiences, and then deciding on the course of action that is likely to bring the best possible outcome?
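Sketched very crudely (the memory store and the scoring here are placeholders I invented, nothing more):
[code]
# Crude perceive -> compare -> decide loop. "memories" maps a perceived
# situation plus a candidate action to how well that action worked before.
def decide(perception, memories, actions):
    best_action, best_score = None, float("-inf")
    for action in actions:
        score = memories.get((perception, action), 0.0)  # unknown pairs score 0
        if score > best_score:
            best_action, best_score = action, score
    return best_action

# Past experience says warming up beat ignoring the cold:
memories = {("cold", "warm up"): 1.0, ("cold", "ignore"): -1.0}
print(decide("cold", memories, ["warm up", "ignore"]))  # -> warm up
[/code]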
|

April 12th, 2008, 12:13 AM
|
 |
Shrapnel Fanatic
|
|
Join Date: Mar 2003
Location: CHEESE!
Posts: 10,009
Thanks: 0
Thanked 7 Times in 1 Post
|
|
Re: OT - Sentience
That's a formula of actions and reactions, nothing more. I can achieve the same sort of result with a bunch of gears in a box. (Well, I could if I understood mechanical things a *lot* better.)
Are you going to argue that a bunch of gears in a box are aware?
__________________
If I only could remember half the things I'd forgot, that would be a lot of stuff, I think - I don't know; I forgot!
A* E* Se! Gd! $-- C-^- Ai** M-- S? Ss---- RA Pw? Fq Bb++@ Tcp? L++++
Some of my webcomics. I've got 400+ webcomics at Last count, some dead.
Sig updated to remove non-working links.
|

April 12th, 2008, 10:42 AM
|
First Lieutenant
|
|
Join Date: Jan 2005
Posts: 689
Thanks: 0
Thanked 0 Times in 0 Posts
|
|
Re: OT - Sentience
I wouldn't in that case, because the gears in the box wouldn't be able to perceive. I would, however, argue that a computer could be considered at least partially aware (though not self-aware) if connected to sensory devices. Say it was connected to a camera, and whenever it detected anything through that camera it would set off an alarm. It perceives, and it reacts.
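A rough sketch of that camera-and-alarm setup, with grab_frame() and sound_alarm() as stand-ins for the real devices:
[code]
# Toy perceive-and-react watcher: raise the alarm whenever the view changes.
def watch(grab_frame, sound_alarm, frames=100):
    previous = None
    for _ in range(frames):
        frame = grab_frame()                     # perceive
        if previous is not None and frame != previous:
            sound_alarm()                        # react to whatever it detected
        previous = frame
[/code]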
Humans aren't different, the way I see it; it's just that we react not only to what we perceive but also to our own thoughts. We're far more complex, for now, but the principle is the same.
|

April 12th, 2008, 11:08 AM
|
 |
Shrapnel Fanatic
|
|
Join Date: Mar 2003
Location: CHEESE!
Posts: 10,009
Thanks: 0
Thanked 7 Times in 1 Post
|
|
Re: OT - Sentience
Right, right... because a more complex gear-box is capable of different things than a less complex gear-box.
Sorry for the sarcasm, but this is a redux of an argument I had before that went nowhere.
__________________
If I only could remember half the things I'd forgot, that would be a lot of stuff, I think - I don't know; I forgot!
A* E* Se! Gd! $-- C-^- Ai** M-- S? Ss---- RA Pw? Fq Bb++@ Tcp? L++++
Some of my webcomics. I've got 400+ webcomics at Last count, some dead.
Sig updated to remove non-working links.
|

April 12th, 2008, 01:50 PM
|
 |
General
|
|
Join Date: Apr 2001
Location: Cincinnati, Ohio, USA
Posts: 4,547
Thanks: 1
Thanked 7 Times in 5 Posts
|
|
Re: OT - Sentience
Well yes, a more complex gear box *is* capable of different things than a less complex gear box. If you have only two gears, for instance, you certainly can't build a calculator out of them - but Charles Babbage was able to make a calculator out of hundreds of gears!
(Well, OK, I suppose you *could* build a calculator out of two gears... but the "calculator" would only be capable of very simple calculations like multiplying numbers by the ratio of the numbers of teeth on the gears...  )
And while the human brain is not made of gears, the components (neurons) that it IS made out of are not much more functionally complex, and can easily be emulated. I know that neurons have internal components such as a nucleus and cytoplasm, but in my opinion at least those are just "implementation details" - a neuron would be a neuron whether it's made of cytoplasm or crystals or whatever. I think it would be arrogant and shortsighted to assume that creatures worthy of moral consideration could only be composed of organic components. Even if God made man in his own image, could God not have created creatures that were not in his own image, but were still self-aware? And with intelligence and free will, whether they come from God or not, could not man create creatures of his own that are self-aware? We already do, of course; it's called reproduction. Clearly *something* is being transmitted from parents to children that gives them the capability of self-awareness. It might be genes, or the essence of God, or the Force, or farandolae (those fictitious mitochondrial parasites posited by Madeleine L'Engle in her novels that supposedly grant humans special powers), or a combination of factors, or something we don't even know about yet - but whatever it is, I'm confident in humanity's ability to duplicate it artificially at some point in the future.
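For what it's worth, the classic simplified model of a neuron - a weighted sum of inputs plus a firing threshold - fits in a few lines. Whether that simplification captures everything a real neuron does is exactly the open question, but it shows the kind of emulation I mean:
[code]
# McCulloch-Pitts style artificial neuron: fires if the weighted sum of its
# inputs reaches a threshold. A deliberate simplification of the biology.
def neuron(inputs, weights, threshold):
    activation = sum(i * w for i, w in zip(inputs, weights))
    return 1 if activation >= threshold else 0

# A single such neuron behaving like an AND gate:
print(neuron([1, 1], [0.6, 0.6], 1.0))  # -> 1 (both inputs firing)
print(neuron([1, 0], [0.6, 0.6], 1.0))  # -> 0 (only one input firing)
[/code]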
Maybe it's just a philosophical difference between us - I tend to believe in things that I can perceive, and consider abstractions to be "less real", so if I perceive a computer to be acting in a self-aware manner, I'd consider it to be self-aware. Since I can't seem to perceive God (apparently something is lacking in me because everyone else can!), I don't consider God to be as "real" as human beings or computers or chairs or planets or whatever. Maybe that's what draws me toward the Buddhist/Jewish philosophy of focusing on improving the here and now, and drives me away from the Christian/Muslim philosophy of treating the here and now as irrelevant compared to what comes after... the strange thing is, both philosophies seem to practice the opposite of what they preach - Buddhists seem to not be much into worldly activism, while Christians go nuts about it! Perhaps things will change though as cultures blend... I've read reports of Tibetan monks being involved in protests, and yet there are a lot more interfaith dialogues than there used to be...
__________________
The Ed draws near! What dost thou deaux?
|