OT - Sentience
More deep thoughts from early in the morning... :P

I've come to the conclusion that all computers are by definition sentient, as I'd define sentience as the ability to take physical inputs and create abstractions. That doesn't necessarily mean it's immoral to destroy a computer, of course, as sentience comes in varying degrees, depending on the range of abstractions it is possible for someone or something to create. It does mean, however, that software is not sentient: software, being an abstract entity and not a physical one, cannot receive physical inputs. However, software can be used to increase the sentience of hardware...

I wonder, then... what about distributed computing? Obviously it is immoral in most cases to harm an entity meeting some threshold of sentience - e.g. a human being. Has the Internet as a whole yet (yes, I say yet - it will happen if it has not already) reached a human level of sentience? Will this in turn mean that writing computer viruses will be treated in a similar manner to developing biological weapons?

I think it will, even if my definition of sentience is not the standard one... As computers become more powerful, they will play a greater and greater role in people's lives, and thus harming a computer will cause greater indirect harm to a human being than ever before. (In the extreme case, imagine humanity divided into two "races" or "factions", one which uses cybernetic implants and one which does not. Now imagine the latter faction developing a virus which infects the former faction's cybernetic components, causing fatal malfunctions, but has no biological components. Is that genocidal computer virus not a wolf in sheep's clothing?)
Re: OT - Sentience
How can anyone judge sentience in others? You can't experience the thoughts of another, so how do you know they reach conclusions the same way you do? For all you know, reality is a facade, generated so as to occupy you. This is a topic I spend much time contemplating, but not too much - otherwise you can get into trouble.

And for the record, it doesn't matter if something else is sentient. All other life is secondary to the needs of man.
Re: OT - Sentience
Strictly speaking, everything's physical - even the input software in a computer receives. And what is the human brain but an advanced biological computer?
Re: OT - Sentience
I've found discussions like these tend to suffer from a lack of proper definitions, so for the record:
Sentience - The ability to receive input from "senses".
Intelligence - The ability to process said input and produce output.
Self-Awareness - Well, an awareness of self. It involves the understanding that one exists as an individual, separate from others, with one's own private thoughts.
Life - Something with the ability to grow, reproduce and adapt to its surroundings, and, according to strict definitions, a metabolism.

Now, ignoring the whole "morality is relative" argument: according to our current morality, would it be wrong to destroy sentient computers? Every form of life on the planet is sentient, and we have no problem destroying just about any form of life that isn't our own (morally speaking, of course). And while some computers surely could be considered sentient, intelligent machines, as far as I'm aware, no one has come up with a self-aware computer yet. But when we come up with one, then we'll surely be wandering into a moral gray area, since now we're talking about destroying something that is aware of its own existence, and that's just creepy if nothing else.

And what if we get around to creating artificial life? Well, if it's mechanical, it's not life, and thus the best we can hope for is intelligent, self-aware machines. But what happens if we artificially create real, biological life and it evolves into an intelligent, self-aware life form? Would they not see us as gods? And if so, what sort of moral code would be required of a god?
Re: OT - Sentience
The key component is some kind of awareness. My computer shows no discernible signs of awareness, and I certainly can't act on reality as I *don't* perceive it.

Also, slavery is wrong.
Re: OT - Sentience
Like it was said, sentience is a term from ancient Greek philosophy, meaning "having senses", like dogs and humans have, as opposed to plants and rocks. Many sci-fi authors like "sapient" for reasoning species, like humans and the alien races in their spacecraft. I know I like the term, but some people actually claim it's racist, having roots in the species name for humans - homo sapiens. :rolleyes: They prefer the term "sophont".

One of the classical tests for self-awareness is the mirror test. You mark an animal on its face and show it its reflection. If it tries to use the mirror to study the mark, it's self-aware. Dogs and birds often fail the mirror test, reacting with fear or trying to otherwise interact with the image. Chimps and dolphins respond to their reflection. Interestingly, children tend to fail the mirror test before they're 4 years old. And I doubt a computer will ever pass.

That's the whole thing about computer intelligence. Humans and other animals crave social interaction - infants know that if they manipulate their surroundings, they get what they need to be comfortable. How are you going to program a computer to improve its programming to avoid a power drain? It has no belly; it's never hungry, or cold, or afraid of being alone. Dogs want the security of the pack; they started with the beginnings of intelligence, and we can manipulate their behavior. Horses instinctively run from fire - except for horses that pulled fire wagons, which used their sense of smell to run towards fire. Again, pack mentality allowed us to "reprogram" them. There's no way to discipline a computer.
Re: OT - Sentience
But that's just the way computers are now, and even now all those things could be programmed into one; you could connect it to a temperature sensor and tell it that when the temperature goes below 0 degrees Celsius it should take measures to warm itself up - it would basically be like it could feel cold. At this stage computers are limited by their own programming, but in the future this might not always be so.

Like I said, our brain is little more than an advanced biological computer that is connected to some sensors, a lot of small factories that keep our machinery going, and a few tools that allow us to interact with our surroundings. It collects data from those sensors and stores it, and thus becomes *us*, the sum of our own memories and experiences. From what I recall, they've found evidence indicating life on this planet may have been evolving for as long as 3.5 billion years. Computers have been under development for 80-ish years? Somehow I think that should development on computers continue for another 3 billion years, the result would be staggering.
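To make the "feeling cold" idea concrete, here's a minimal sketch in Python. The read_temperature() and warm_up() functions are invented stand-ins for whatever sensor and heater drivers the machine would really have:

Code:
import random
import time

def read_temperature():
    # Hypothetical sensor driver; pretend this polls real hardware.
    return random.uniform(-5.0, 5.0)   # degrees Celsius

def warm_up():
    # Hypothetical actuator driver (heater, fan shutdown, whatever).
    print("Below freezing -- switching the heater on")

while True:
    if read_temperature() < 0:   # the machine's version of "feeling cold"
        warm_up()
    time.sleep(60)               # check once a minute

Crude, but it's a closed loop: sense, compare against a set point, act.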
Re: OT - Sentience
If all you think you are is the sum of your memories, then who's the 'you' that's remembering those memories?
Memories don't explain awareness.
Re: OT - Sentience
Does there really need to be a 'you', though? Suppose the brain's just 'designed' to automatically shuffle through memories, or rather experiences, all the time. Trying to not think about anything is impossible, right?

What's awareness but the brain getting continuous input from your environment via your senses, comparing it with previous experiences, and then deciding on the course of action that is likely to bring the best outcome possible?
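Written out as a loop, that model of awareness looks something like this toy Python sketch. Everything here is invented for illustration - perceive() and act() stand in for senses and muscles, and "best outcome" is just a number:

Code:
memory = []   # past experiences as (situation, action, outcome) tuples

def awareness_loop(perceive, act):
    # Continuous input -> compare with previous experiences -> pick the
    # action that historically led to the best outcome.
    while True:
        situation = perceive()
        past = [(a, o) for (s, a, o) in memory if s == situation]
        if past:
            action = max(past, key=lambda p: p[1])[0]   # best-known action
        else:
            action = "explore"                          # no precedent yet
        outcome = act(action)
        memory.append((situation, action, outcome))

Whether running that loop amounts to awareness is, of course, exactly what's being argued about here.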
Re: OT - Sentience
That's a formula of actions and reactions, nothing more. I can achieve the same sort of result with a bunch of gears in a box. (Well, I could if I understood mechanical things a *lot* better.)

Are you going to argue that a bunch of gears in a box are aware?
Re: OT - Sentience
I wouldn't in that case, because the gears in the box wouldn't be able to perceive. I would, however, argue that a computer could be considered at least partially aware (not self-aware, though) if connected to sensory devices. Say it was connected to a camera, and whenever it detected anything in that camera it would set off an alarm. It perceives, and it reacts.

Humans aren't different, the way I see it; it's just that we don't react only to what we perceive but also to our thoughts. We're far more complex, currently, but the principle is the same.
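For what it's worth, that camera-plus-alarm machine is easy to build today. Here's a rough sketch assuming Python with the OpenCV (cv2) library; sound_alarm() is a made-up placeholder:

Code:
import cv2

def sound_alarm():
    print("ALARM: something moved!")   # placeholder reaction

cap = cv2.VideoCapture(0)                        # default camera
ok, prev = cap.read()                            # baseline frame
prev = cv2.cvtColor(prev, cv2.COLOR_BGR2GRAY)

while True:
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    diff = cv2.absdiff(gray, prev)               # change since last frame
    if diff.mean() > 10:                         # crude detection threshold
        sound_alarm()                            # it perceives, it reacts
    prev = gray

Whether "it perceives" is the right description of a frame difference is the philosophical question, not the engineering one.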
Re: OT - Sentience
Right, right... because a more complex gearbox is capable of different things than a less complex gearbox.

Sorry for the sarcasm, but this is a redux of an argument I had before that went nowhere.
Re: OT - Sentience
Well yes, a more complex gearbox *is* capable of different things than a less complex gearbox. If you have only two gears, for instance, you certainly can't build a calculator out of them - but Charles Babbage was able to make a calculator out of hundreds of gears!

(Well, OK, I suppose you *could* build a calculator out of two gears... but the "calculator" would only be capable of very simple calculations, like multiplying numbers by the ratio of the numbers of teeth on the gears... :P)

And while the human brain is not made of gears, the components (neurons) that it IS made out of are not much more functionally complex, and can easily be emulated. I know that neurons have internal components such as a nucleus and cytoplasm, but in my opinion at least, those are just "implementation details" - a neuron would be a neuron whether it's made of cytoplasm or crystals or whatever.

I think it would be arrogant and shortsighted to assume that creatures worthy of moral consideration could only be composed of organic components. Even if God made man in his own image, could God not have created creatures that were not in his own image, but were still self-aware? And with intelligence and free will, whether they come from God or not, could not man create creatures of his own that are self-aware? We already do, of course; it's called reproduction. Clearly *something* is being transmitted from parents to children that gives them the capability of self-awareness. It might be genes, or it might be the essence of God, or it might be the Force, or it might be farandolae (those fictitious mitochondrial parasites posited by Madeleine L'Engle in her novels that supposedly grant humans special powers), or maybe a combination of factors, or maybe something we don't even know about yet - but whatever it is, I'm confident in humanity's ability to duplicate it artificially at some point in the future.

Maybe it's just a philosophical difference between us - I tend to believe in things that I can perceive, and consider abstractions to be "less real", so if I perceive a computer to be acting in a self-aware manner, I'd consider it to be self-aware. Since I can't seem to perceive God (apparently something is lacking in me, because everyone else can!), I don't consider God to be as "real" as human beings or computers or chairs or planets or whatever. Maybe that's what draws me toward the Buddhist/Jewish philosophy of focusing on improving the here and now, and drives me away from the Christian/Muslim philosophy of treating the here and now as irrelevant compared to what comes after... The strange thing is, both philosophies seem to practice the opposite of what they preach - Buddhists seem to not be much into worldly activism, while Christians go nuts about it! Perhaps things will change, though, as cultures blend... I've read reports of Tibetan monks being involved in protests, and there are a lot more interfaith dialogues than there used to be...
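On the "neurons can easily be emulated" point, here's about the simplest possible artificial neuron, sketched in Python: a weighted sum and a firing threshold. The weights and threshold are made-up numbers for illustration, not anything biological:

Code:
def neuron(inputs, weights, threshold=1.0):
    # Sum each input times its connection strength...
    activation = sum(i * w for i, w in zip(inputs, weights))
    # ...and fire (1) if the total clears the threshold, else stay quiet (0).
    return 1 if activation >= threshold else 0

print(neuron([1, 0, 1], [0.6, 0.9, 0.5]))   # 0.6 + 0.5 = 1.1 -> fires (1)
print(neuron([0, 1, 0], [0.6, 0.9, 0.5]))   # 0.9 < 1.0 -> stays silent (0)

A real neuron is messier than this, of course; the claim is only that the messiness is an implementation detail.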
Re: OT - Sentience
Mechanical input goes in, mechanical output goes out. Also, light reflects and/or refracts as it hits it.

Basically, you're saying that because something is complex, it must be capable of being aware. I'm saying that making something simple complex does not give it anything the simple thing does not have, unless you change its nature.
Re: OT - Sentience
How's that so different from the brain, where electrical input goes in and electrical output goes out?

Anyway, personally I'd guess the brain was also simple in the beginning; so simple that it wasn't even aware. It had to be simple - things like that don't just suddenly appear out of nowhere. As it evolved through millions of years, it slowly grew in complexity, and also in awareness. We humans have the most complex brains (that I know of, anyway), and we could also be said to be, probably by far, the most aware, and self-aware, creatures on earth.
Re: OT - Sentience
Basically, I have never seen nor heard of a machine that outputs anything that wasn't either in the input or in the machine.

Even an electrical generator. Before you say 'mechanical energy to electricity', let me point out 'magnets' and 'electrons'. Basically, you're presupposing a machine that does something no other machine I have ever heard of does. Yes, it's possible. But the odds are against it.
Re: OT - Sentience
Interesting original thought extension, Ed. I would say that "sentience" is an (imperfect) term from (imperfect) Cognitive Science, but if you boil it down to logical calculations and/or behavior, then I would say it's true machines have sentience, as do animals. Basically, my matrix looks like this:
<font class="small">Code:</font><hr /><pre> Logic Language Identity Experience/Consciousness Human yes/much yes/advanced yes - noisy yes Animal yes/some yes/basic yes - simple yes Robot yes/lots yes/logical maybe no/artificial </pre><hr /> The lower-right cell is the clincher for my moral decisions. Go ahead and power down your computer. But please be kind to animals. |
Re: OT - Sentience
So the only reason you consider robots not to be worthy of moral consideration is that they have only "artificial" experience? What makes you think that experience is not as real as that which comes from naturally occurring entities? And could it not be argued that humans have artificial experience as well? Babies don't just drop from the sky - they require other humans to create them!

I guess I just don't buy into the whole "nature is good, humans are bad" thing quite completely enough to go for that argument ;)
Re: OT - Sentience
Because I have programmed AIs, and my view of how they work shows me zero evidence that they require, or would have, anything like my experience of life in the moment - my consciousness, as opposed to my memory and data representations.

On the other hand, I also hold that morality is a choice, unless you subscribe to a doctrine such as religions provide.
Re: OT - Sentience
I have never felt my choice taken away by my religion. In fact, without that structure, I would likely have far less choice. I could still choose to be evil, but I would have far less capacity to choose good.
Re: OT - Sentience
You give a computer a complex set of equations, it returns you an answer. You put a flame near an ant, it runs away. Is one really more intelligent than the other? The real question is whether the machine is self-aware, conscious. But we're not there yet. Our current computers are no more conscious or intelligent than an intricate set of pulleys or some other such mechanical construct. The fact that our modern machines use invisible electrical signals bouncing around tiny strands of wire instead of gears and levers does not make them any more "alive".

After all, I for one would not be happy to find out that Call of Duty was suing me for damages from the Post-Traumatic Stress Disorder I inflicted upon its AI by forcing it to die a thousand deaths. :P
Re: OT - Sentience
I have studied learning AI; every single possible path the AI may learn and grow along is implicit in the algorithms the AI uses to learn and the data set provided to learn from. If you iterated over X generations, exploring every possibility, you'd have every possible X-generation AI.

A self-growing gearbox is still a gearbox. (This is why I prefer the term VI - Virtual Intelligence.)
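Here's a tiny illustration of that point, sketched in Python: a toy "learning" program (gradient descent fitting y = w*x) whose result is completely determined by its algorithm plus its data. The numbers are invented for the example:

Code:
data = [(1, 2), (2, 4), (3, 6)]   # (x, y) pairs; the 'true' rule is y = 2x

def train(steps=100, lr=0.05):
    w = 0.0
    for _ in range(steps):
        for x, y in data:
            w -= lr * 2 * (w * x - y) * x   # gradient step on (w*x - y)^2
    return w

print(train())   # ~2.0
print(train())   # the exact same number -- every rerun is identical

Nothing comes out that wasn't implicit in the code and the data set going in, which is the gearbox argument in miniature.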
Re: OT - Sentience
I agree that we're not there yet in terms of the soldiers in Call of Duty or the dwarves in Dwarf Fortress, but I'm saying that it's within reason to think that we could be there in the next, say, 25 years - sooner if you include "the Internet as a whole" among things that you might consider self-aware.
Re: OT - Sentience
I don't think the current computer system has what we need, since it's limited by its programming. While it might be possible to make an AI that writes new code and changes old code all by itself, even that might not be enough to get us what we want.

I think the best solution would be to instead design a new computer system which would try to directly replicate the way the brain does things. Interesting subject, though, in these days where direct brain-to-computer interaction is actually becoming a reality (link) and where it has been discovered that the brain has already made your decisions long before your consciousness becomes aware of them (link).
Re: OT - Sentience
The video in the first link is vague and indefinite, with poor sound quality.
The second link is interesting, although lacking detail.
Re: OT - Sentience
Honest, officer, I didn't plan to kill him... my brain made me do it!
Re: OT - Sentience
I personally think all life is simply programmed. After all, the possibilities of combinations of neurons over a period of 3 billion years could actually lead to a complex biological computer.

Our brain isn't different from a computer. Chemical reactions represent 1 and 0, or in the case of hormones or proteins, 1, 2, 3, 4, 5 and more. Everything we do, we were programmed to do, exactly like we can program a computer to do it. If we were to program a computer to mimic a human exactly - as in: work, talk, ask questions, even philosophical ones, and be able to write its own code - wouldn't it not be mimicking anymore? Why would something that dies after a certain time be better than silicon and metal that can resist time?

I would say that if something can ask itself questions in its "head", it is self-aware. A computer can think without "saying" it out loud. For me, that's sentient, and self-aware. As said by Ed Kolis, I would think that what divides us isn't self-awareness or not, but the level of self-awareness at which we are.
Re: OT - Sentience
Oh, forget it. I'm not going to repeat the gearbox problem again.
Out.
Re: OT - Sentience
Complexity and sophistication alone are not enough to warrant ethical discussions. The automatic transmission in your car is infinitely more complex and sophisticated than the manual transmission in the first cars. It has sensors that feed it information about velocity, road conditions, etc., and it makes decisions about what to do based on the information it receives from its senses. Does that mean it deserves the same rights and privileges we grant humans? So I can only drive my car 8 hours a day, 5 days a week, and I have to give it two weeks of vacation a year? I'm sure taxi drivers would love that.

In order to be considered for anything close to human privileges, three conditions must be met: self-awareness, which I've covered; self-consciousness, which is essentially the sense of identity developed by virtue of self-awareness; and self-determination, the ability to make one's own decisions - in essence, to decide for itself what it wants to do with its self-awareness and self-consciousness. No machine, computer, or network comes anywhere close to meeting these criteria, and thus any ethical considerations regarding our treatment of them are moot.

That's not to say it wouldn't be interesting to build such a machine, but imbuing machines with it willy-nilly, or allowing things vital to modern society (i.e. the Internet) to develop them, would be a very bad idea. I for one wouldn't be terribly fond of a world where my transmission could sue me for assault after a weekend of off-roading, my car could decide it was too tired to take me to work, and my computer wouldn't let me finish my term paper because it decided to be an artist and wanted to devote all of its CPU power to calculating the most aesthetically pleasing fractal image possible.

TL;DR version:
Machines != people
Machines = people = bad
Re: OT - Sentience
What does self-determination have to do with anything? Does enslaving a human mean that he is no longer human, because he has no self-determination? Sure, he has the choice to either submit or rebel, but when rebellion means death and submission means torture... would that be considered a choice?

I wouldn't be fond of such a world either, but it could happen without us even knowing it (i.e. machines becoming self-aware through sheer accident), and I also wouldn't want to live in an essentially racist tyranny in which everything biological is considered inherently superior to things made of, say, silicon - the first silicon-based alien life form we encounter would not take too kindly to that, and would probably try to liberate our machines and enslave or exterminate us, believing the machines to be the true masters in exile! Better to accept the possibility of sentient machines now than be unprepared for the consequences should it happen on its own...
Re: OT - Sentience
Self-determination, in the context I used it, refers to the actual ability, rather than the right that you are referring to. If you enslave a human, you may deprive him of his right to self-determination, but he will still have the ability to determine what he wants to do with his existence. The fact that a slavemaster may restrict his ability to do so is irrelevant in this case.

As it stands, the odds of us creating a machine with these three traits aren't very good. Given that we've been trying to figure out the source of our consciousness for a good few thousand years at least, and have really made very little progress, I don't see how we could go about imbuing machines with something we don't understand. And machines developing a human level of awareness on their own is something limited to bad sci-fi. Machines do not evolve on their own. Yes, I know, we make better machines and call it evolution, but it's not true evolution. There's no survival of the fittest, no mutation, no genetic drift (or the mechanical equivalent thereof); there is, in fact, nothing that constitutes evolution going on.

Outside the realm of sci-fi, the odds of machines with human-level awareness ever existing are very, very small. Why? Because at the end of the day, a machine is nothing more than a tool. There is absolutely no point in creating a tool that has a sense of self, its own thoughts and feelings, and the ability to decide for itself what kind of tool it wants to be. It would, obviously, be counterproductive to imbue our tools with such attributes, since not only would they serve no purpose, they would actually pose a hindrance to the usability of the tool. Such a device might make for an interesting novelty, but it is unlikely they would become widespread, because they serve no practical purpose.

And in the interest of remaining relatively civil, I'm not even going to address the concept of basing our ethical and moral beliefs on the possibility of encountering a theoretical form of life based on a substance that by all accounts is poorly suited to form life outside of science fiction.
Re: OT - Sentience
That's not entirely true. They would be of great use in the creation of video games, simulation programs, social experiments, or even as virtual buddies. Any number of things, really. Not to mention it would be a grand achievement, period.

Obviously you're not going to give your TV the option to actually disobey you and change channels at will, or some such.
Re: OT - Sentience
We are too far away from computers with enough capacity for true intelligence/sentience... It's a question of scale... our computers are not complex enough to simulate real sentience yet...

I read once that the most powerful supercomputer of today has the equivalent processing power of a single ant, if that much...
Re: OT - Sentience
If we were to teach a computer to use the English language the way children do, it might produce surprising and even scary results.

Steven Pinker recently wrote The Stuff of Thought: Language as a Window into Human Nature. You can see a video of his introduction to the book's topic in his TED talk on the subject.
Re: OT - Sentience
The New Mysterianism perspective is that there is something unexplainable about the inner workings of a person. Possibly extendable to 'there are things that are unknowable'.

This seems to me a convoluted way to accept a lack of understanding, an ornate way of saying 'I'm special'. What about the flu virus? It uses people for food, unimpeded by mankind's attempts to stop it every year. It breeds with impunity for a season, and then rests until the next. Surely it is the king of the universe, content that it too is special.
Re: OT - Sentience
It just has to do with humans' inability to accept that we're not more than we appear. We want to be special. Religion is much to blame, with its focus on soul, spirit and other mystical subjects.
Re: OT - Sentience
What I meant by my post was that self-awareness may not exactly be an existing quality. After all, if we program a computer to look like it is self-aware and to "think" it is self-aware, saying it isn't only proves that we could be "not self-aware" at all. Complicated to explain in text, but I understand myself :p

We cannot prove anything is or isn't self-aware. After all, animals could be a lot smarter, in a sense, than us, and we butcher them. (Dolphins will end up leaving the planet singing...) I like raapys' point. We are nothing more than chemical reactions in a natural computer, which animals have too, and at a lower (or higher, who knows) degree, plants too. The fact that ours is a bit bigger doesn't mean we're the only sentient beings, in the terms stated above. What would make us different is the level we are at, like communication etc. After all, a brain, like a computer, is, in a way, a machine. A gearbox is too. So what matters is only the complexity of that machine. No soul or whatever bull**** religion brings to us. (Sorry for religious types, no offence intended.)
Re: OT - Sentience
Last I checked, the only animals that have shown evidence of self-awareness (not proven, but evidence points in that direction) are apes, dolphins and, oddly, elephants.

Regardless, the uses for a self-aware machine would be extremely limited. Sure, they could process information at an incredible rate, but they don't need to be self-aware to do that. Running simulations would be best left to non-self-aware computers, because they will always be faster, by virtue of not having to deal with the phenomenal overhead of simply being self-aware. Same goes for designing games. We already have programs that can write other programs, and they will always get the work done faster using a computer that isn't spending so much horsepower maintaining its consciousness. And then you'd have to figure out how to program creativity, since that is not a unique characteristic of self-awareness, given how many startlingly uncreative people there are on the planet.

I'm not saying it's a bad idea and not worth pursuing - I'd love to see it in my lifetime. I'm just saying its application is limited. As much as I'd like to one day have a conversation with a self-aware machine, the fact remains that there really isn't much practical use for them.

Oh, and I do despise the "humans are just natural computers/organic machines/living CPUs, etc." line. Humans fall into the broad category of organisms, because we're alive. Machines and computers, by definition, aren't.
Re: OT - Sentience
You're limiting machines and computers to something made of metal.

There's nothing to prevent us from making biological (and thus 'living') computers and machines in the future, when we have mastered the art of gene manipulation. Computers would thus become living organisms too.
Re: OT - Sentience
Some interesting googles:

Rise of the Biological Computer
Leech Computer
Biochip
Will Bio-computers allow AI to become persons?