
April 16th, 2008, 03:32 PM
Captain
Join Date: Apr 2003
Location: Burnaby
Posts: 995
Thanks: 0
Thanked 3 Times in 2 Posts
Re: OT - Sentience
Self-determination, in the context I used it, refers to the actual ability, rather than the right you are referring to. If you enslave a human, you may deprive him of his right to self-determination, but he will still have the ability to decide what he wants to do with his existence. The fact that a slavemaster may restrict his ability to act on those decisions is irrelevant in this case.
As it stands, the odds of us creating a machine with these three traits aren't very good. Given that we've been trying to figure out the source of our own consciousness for a good few thousand years at least, and have made very little progress, I don't see how we could go about imbuing machines with something we don't understand.
And machines developing a human level of awareness on their own is something limited to bad sci-fi. Machines do not evolve on their own. Yes, I know, we make better machines and call it evolution, but it's not true evolution. There's no survival of the fittest, no mutation, no genetic drift (or the mechanical equivalent thereof); there is, in fact, nothing that constitutes evolution going on.
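Just to make "the mechanical equivalent thereof" concrete, here is a toy sketch (my own illustration in Python, with an arbitrary made-up fitness target) of the minimum machinery real evolution requires: undirected mutation plus survival-of-the-fittest selection acting on a population. Ordinary machine design, where engineers deliberately spec the next model, involves none of this feedback loop.
[code]
# Toy sketch of what evolution-in-software minimally requires:
# heritable copying, random (undirected) mutation, and selection pressure.
# The target value and parameters are arbitrary, for illustration only.
import random

TARGET = 42  # an arbitrary stand-in for "the environment"

def fitness(genome):
    # Higher is better: genomes closer to the target survive.
    return -abs(genome - TARGET)

def mutate(genome):
    # Random, undirected variation.
    return genome + random.choice([-3, -2, -1, 1, 2, 3])

# Start from a random population.
population = [random.randint(0, 100) for _ in range(20)]

for generation in range(50):
    # Survival of the fittest: keep the best half...
    population.sort(key=fitness, reverse=True)
    survivors = population[:10]
    # ...and refill the population with mutated copies of the survivors.
    population = survivors + [mutate(random.choice(survivors)) for _ in range(10)]

print(max(population, key=fitness))  # drifts toward 42 over the generations
[/code]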
Outside the realm of sci-fi, the odds of machines with human-level awareness ever existing are very, very small. Why? Because at the end of the day, a machine is nothing more than a tool. There is absolutely no point in creating a tool that has a sense of self, its own thoughts and feelings, and the ability to decide for itself what kind of tool it wants to be. It would, obviously, be counterproductive to imbue our tools with such attributes: not only would they serve no purpose, they would actually hinder the tool's usability.
Such a device might make for an interesting novelty, but such machines are unlikely to become widespread, because they serve no practical purpose. And in the interest of remaining relatively civil, I'm not even going to address the concept of basing our ethical and moral beliefs on the possibility of encountering a theoretical form of life based on a substance that, by all accounts, is poorly suited to forming life outside of science fiction.
__________________
Suction feet are not to be trifled with!

April 16th, 2008, 04:14 PM
First Lieutenant
Join Date: Jan 2005
Posts: 689
Thanks: 0
Thanked 0 Times in 0 Posts
Re: OT - Sentience
That's not entirely true. They would be of great use in the creation of video games, simulation programs, and social experiments, or even as virtual buddies. Any number of things, really. Not to mention it would be a grand achievement, period.
Obviously you're not going to give your TV the option to actually disobey you and change channels at will, or some such.

April 16th, 2008, 05:27 PM
First Lieutenant
Join Date: Jul 2002
Location: Brasil
Posts: 604
Thanks: 0
Thanked 6 Times in 6 Posts
Re: OT - Sentience
We are still too far away from computers with enough capacity for true intelligence/sentience... it's a question of scale... our computers are not complex enough to simulate real sentience yet...
I read once that the most powerful supercomputer of today has the equivalent processing power of a single ant, if that...
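For what it's worth, here is the kind of back-of-envelope arithmetic behind comparisons like that (every number below is an assumption I'm making for illustration, not a measurement); the interesting part is how completely the answer depends on how expensive you assume it is to simulate one synaptic event:
[code]
# Rough, hypothetical arithmetic for the "ant vs. supercomputer" comparison.
# All figures are assumptions for illustration, not measured facts.
ant_neurons = 250_000        # rough order-of-magnitude estimate for an ant brain
synapses_per_neuron = 1_000  # assumed average
spikes_per_second = 100      # assumed average firing rate

synaptic_events_per_second = ant_neurons * synapses_per_neuron * spikes_per_second

supercomputer_flops = 5e14   # ~500 teraflops, roughly a 2008-era top machine

# How the comparison swings with the assumed cost of simulating one event:
for label, ops_per_event in [("simple point-neuron model", 10),
                             ("detailed biophysical model", 1_000_000)]:
    required = synaptic_events_per_second * ops_per_event
    verdict = "within" if required <= supercomputer_flops else "beyond"
    print(f"{label}: ~{required:.1e} FLOPS needed -> {verdict} the machine's reach")
[/code]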
__________________
Currently Playing:
Megamek (latest dev version with home-made random campaign generator), Dominions 3 (with CBM) and Sins of a Solar Empire (heavily modded)

April 16th, 2008, 06:16 PM
Second Lieutenant
Join Date: Mar 2005
Location: Seattle, WA
Posts: 417
Thanks: 0
Thanked 0 Times in 0 Posts
Re: OT - Sentience
If we were to teach a computer to use the English language the way children do, it might produce surprising and even scary results.
Steven Pinker recently wrote The Stuff of Thought: Language as a Window into Human Nature. You can see a video of his introduction to the book's topic in his TED talk on the subject.

April 17th, 2008, 09:17 AM
General
Join Date: Feb 2001
Location: Pittsburgh, PA, USA
Posts: 3,070
Thanks: 13
Thanked 9 Times in 8 Posts
Re: OT - Sentience
Quote:
AngleWyrm said: If we were to teach a computer to use English language the way children do, it might produce surprising and even scary results.
A computer would have to already be sentient to be capable of learning language the way that children do.
__________________
Cap'n Q
"Good morning, Pooh Bear," said Eeyore gloomily. "If it is a good morning," he said. "Which I doubt," said he.

April 17th, 2008, 02:13 PM
Second Lieutenant
Join Date: Mar 2005
Location: Seattle, WA
Posts: 417
Thanks: 0
Thanked 0 Times in 0 Posts
Re: OT - Sentience
The New Mysterianism perspective is that there is something unexplainable about the inner workings of a person. Possibly extendable to 'there are things that are unknowable'.
This seems to me a convoluted way to accept lack of understanding, an ornate way of saying 'I'm special'.
What about the flu virus? It uses people for food, unimpeded by mankind's attempts to stop it every year. It breeds with impunity for a season, and then rests until the next. Surely it is the king of the universe, content that it too is special.

April 17th, 2008, 02:27 PM
First Lieutenant
Join Date: Jan 2005
Posts: 689
Thanks: 0
Thanked 0 Times in 0 Posts
Re: OT - Sentience
It just has to do with humans' inability to accept that we're not more than we appear. We want to be special. Religion is much to blame, with its focus on the soul, the spirit, and other mystical subjects.

April 17th, 2008, 03:03 PM
General
Join Date: Apr 2001
Location: Cincinnati, Ohio, USA
Posts: 4,547
Thanks: 1
Thanked 7 Times in 5 Posts
Re: OT - Sentience
Quote:
capnq said:
Quote:
AngleWyrm said: If we were to teach a computer to use English language the way children do, it might produce surprising and even scary results.
A computer would have to already be sentient to be capable of learning language the way that children do.
Aren't computers already capable of learning language the way children do? Granted, none of them have been very successful using that method...
__________________
The Ed draws near! What dost thou deaux?

April 17th, 2008, 08:10 PM
Private
Join Date: Dec 2006
Posts: 16
Thanks: 0
Thanked 0 Times in 0 Posts
Re: OT - Sentience
What I meant by my post was that self-awareness may not exactly be an existing quality. After all, if we program a computer to look like it is self-aware and to "think" it is self-aware, saying it isn't only proves that we ourselves could be "not self-aware" at all. Complicated to explain in text, but I understand myself.
We cannot prove anything is or isn't self-aware. After all, animals could be a lot smarter, in a sense, than us, and we butcher them. (Dolphins will end up leaving the planet singing...)
I like raapys' point. We are nothing more than chemical reactions in a natural computer, which animals have too, and to a lower (or higher, who knows) degree, plants too. The fact that ours is a bit bigger doesn't mean we're the only sentient beings, in the terms stated above. What would make us different is the level we are at, like communication etc. After all, a brain, like a computer, is, in a way, a machine. A gearbox is too. So what matters is only the complexity of that machine. No soul or whatever bull**** religion brings to us. (Sorry, religious types, no offence intended.)

April 19th, 2008, 10:55 AM
General
Join Date: Feb 2001
Location: Pittsburgh, PA, USA
Posts: 3,070
Thanks: 13
Thanked 9 Times in 8 Posts
Re: OT - Sentience
Quote:
Ed Kolis said:
Quote:
capnq said: A computer would have to already be sentient to be capable of learning language the way that children do.
Aren't computers already capable of learning language the way children do? Granted none of them have been very successful using that method...
I was going to say, "no, there aren't any computers that learn language the way children do," but when I went looking for confirmation of my opinion, I found this article from New Scientist. The main difference is that children learn language by hearing it used, but the computer software described in the NS article learns from text input.
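For anyone curious what "learning from text input" can look like at its most basic, here is a tiny sketch (my own toy example, not the system described in the article): it builds word-transition statistics from raw text and predicts likely continuations, with no grammar built in.
[code]
# Minimal toy sketch of statistical learning from raw text: count which word
# tends to follow which, then predict continuations. This is NOT the software
# from the New Scientist article, just an illustration of the general idea.
from collections import Counter, defaultdict

def train(text):
    """Build a table of next-word counts for each word seen in the text."""
    words = text.lower().split()
    model = defaultdict(Counter)
    for current, following in zip(words, words[1:]):
        model[current][following] += 1
    return model

def predict(model, word):
    """Return the most frequently seen continuation of `word`, if any."""
    followers = model.get(word.lower())
    return followers.most_common(1)[0][0] if followers else None

corpus = "the cat sat on the mat and the cat slept on the chair"
model = train(corpus)
print(predict(model, "the"))  # -> 'cat' (seen most often after 'the')
print(predict(model, "on"))   # -> 'the'
[/code]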
__________________
Cap'n Q
"Good morning, Pooh Bear," said Eeyore gloomily. "If it is a good morning," he said. "Which I doubt," said he.