Re: [OT] Plato's Pub and Philosophical Society
Quote:
Why do we structurally store memories? Because the energy version is too damn fragile. One good bump and it's gone. That is why those who have had head trauma, electric shock, etc. have memory loss (usually of the preceding 20 min). Also, realize that energy is not transmitted within our brains through electrons; it is done through chemical ions (calcium, potassium, etc.). That is one of the reasons that computers are so fast: they operate through much faster electrons. Ironically, this may be a strike against them ever becoming conscious. Consciousness is very much a temporal event. In fact it only emerges with specific timing (20 msec) and complexity. Computers just might be too fast to gain consciousness. Thinking of the brain as some kind of crude anchor to which the energy of our mind is tied is somewhat baseless. The structure of our brain is our mind.
Re: [OT] Plato's Pub and Philosophical Society
Hmm, can you explain this a bit more? I didn't quite understand how faster speed could prevent consciousness... If the computer is fast, wouldn't it just have thoughts that happen faster?
Re: [OT] Plato's Pub and Philosophical Society
Quote:
Re: [OT] Plato's Pub and Philosophical Society
Suicide Junkie:
It was something that was discussed in my neuroscience class last semester. To get much more specific I would have to dig through my notes to find the relevant data (perhaps I'll get around to doing that later, if I have time).

Kamog: One way to put it is that consciousness is something that has to have the time to consider itself. If a complex task is performed too quickly it doesn't have enough time to display the emergent property of consciousness. It may be that computers are finishing their tasks before they have a chance to be more than their task. A lot of our consciousness is a result of the lingering effects (often self-perpetuated) of a stimulus rather than specifically due to any given stimulus. Sort of like the example I gave of how we encode memory: a stimulus sets up a feedback loop that stimulates the growth of memory, which is further reinforced by dreaming to consolidate it. It is from this entire process (and many others) that we derive consciousness. A computer can store data and be done with it, no further activity needed. This isn't to say that an actual AI is impossible, though; merely that it would be qualitatively different from ours, if possible at all.

Interesting fact: the brain works in binary. Don't believe me? OK. Each neuron transmits info through pulses down its axon. At any point on the axon there are two possibilities: that it is "spiking" (passing an ion charge) or that it is not. The refractory period of the axon (the minimum time a point takes to "reset" after a burst) is 1 msec. Therefore in 5 msec there are 2^5 possible combinations of 1 and 0, and in one second there are 2^1000. And that is only for a single axon. Multiply that by the number of neurons in the brain and you get an idea of its actual computational power. The latest generation of computer processors is getting up there. In fact, the tendency of Pentiums and higher to randomly(?) take an odd action, or otherwise show the odd unexplainable bug, may be a precursor to something like an AI awakening.

I have often wondered at how (if) humans and AIs would be able to understand each other. It seems as if we would reach consciousness from opposite ends of the spectrum. Our brains (organic animals) developed as a capacity for action and then eventually evolved memory. AIs would have started as pure memory and then developed the capacity for action. What differences would there be between the products of such different origins? Would we be able to reconcile such differences? Sometimes such questions keep me up at night.
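The combinatorics in the post above can be checked with a quick back-of-the-envelope sketch (an illustration of the arithmetic only, not a neuroscience model; it assumes the 1 msec refractory period stated in the post):

```python
# Back-of-the-envelope check of the "brain works in binary" argument:
# with a 1 msec refractory period, each axon has 1000 on/off slots
# per second, giving 2^1000 possible spike patterns per axon.

REFRACTORY_MS = 1                  # minimum reset time per spike, in msec
slots_per_sec = 1000 // REFRACTORY_MS

patterns_5ms = 2 ** 5              # combinations in a 5 msec window
patterns_1s = 2 ** slots_per_sec   # combinations in one second

print(patterns_5ms)                # 32
print(len(str(patterns_1s)))       # 2^1000 is a 302-digit number
```

So even before multiplying by the number of neurons, a single axon's one-second pattern space is astronomically large, which is the post's point.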
Re: [OT] Plato's Pub and Philosophical Society
Quote:
Quote:
This will eventually make them easier for us to accept, but the first few years of human/AI relations will be very difficult. People will fear AIs as a threat (I can see the "Frankenstein" headlines in parts of the British press now), and their initial alien-ness will mean people either refuse to accept their intelligence and treat them as dumb machines (effectively consigning them to slavery), or block further development in AI tech, or both.

I would like to see human rights organisations pre-empt AI technology by defining NOW what constitutes an artificial intelligence for the purposes of assigning it certain rights and protections. Unfortunately I don't think this is likely to happen, and AIs will be used as cheap labour, no doubt programmed to obey (like in Asimov's second law of robotics: "A robot must obey the orders given it by human beings except where such orders would conflict with the First Law.")

Once we are able to put AIs in human-like bodies, they will be able to gather their experiences in much the same way that we do, they will be much easier for us to relate to, and (some) humans will be able to accept their status and sympathise with them. Then the struggle for AI rights will begin, with economic interests trying to keep them in chains. However, I doubt this will manifest itself in the kind of Terminator 2-style apocalypse postulated by the likes of blatant self-publicist Kevin Warwick, because AIs will be fundamentally safe: although Asimov's positronic brain and three laws are really pure technobabble, I'm sure human fears will make sure some kind of coded inhibition against anti-social behaviour is implemented. Which brings us back around to free will...

[ March 24, 2003, 11:35: Message edited by: dogscoff ]
Re: [OT] Plato's Pub and Philosophical Society
Quote:
I'm not sure how you'd go about coding it to think about what it's doing... some set of parallel processors inspecting the incoming codes, perhaps an evolutionary programming system where it takes the majority decision of the currently top-ranked algorithms. (Ranked via various needs sensors, and perhaps a pair of "good bot"/"bad bot" social buttons on the front)
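The majority-decision idea above can be sketched minimally. This is a toy illustration only; the decision rules, rankings, and sensor reading are all made up for the example, and a real evolutionary system would also mutate and re-rank the algorithms over time:

```python
# Toy sketch of the scheme described above: the top-ranked
# "algorithms" each vote on an action, and the system takes the
# majority decision. All names here are hypothetical.
from collections import Counter

def decide(algorithms, rankings, observation, top_k=3):
    """Return the majority vote of the top_k ranked algorithms."""
    ranked = sorted(algorithms, key=lambda a: rankings[a.__name__], reverse=True)
    votes = [algo(observation) for algo in ranked[:top_k]]
    return Counter(votes).most_common(1)[0][0]

# Hypothetical decision rules voting on a sensor reading.
def cautious(x): return "retreat" if x > 5 else "advance"
def bold(x):     return "advance"
def neutral(x):  return "advance" if x < 8 else "retreat"

rankings = {"cautious": 0.9, "bold": 0.7, "neutral": 0.8}
print(decide([cautious, bold, neutral], rankings, observation=6))  # advance
```

The "good bot"/"bad bot" buttons would then feed back into `rankings`, so the population of rules that wins votes drifts toward socially rewarded behaviour.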
Re: [OT] Plato's Pub and Philosophical Society
Quote:
This is interestingly mirrored in the case of feral children (those raised by animals). Ferals tend to show a similar deficiency in self-awareness and lack of fourth- and fifth-order dendrite growth, even after extensive cultural assimilation and education. And guess what? Ferals tend to die in their forties and fifties as well. Interesting, is it not?
Re: [OT] Plato's Pub and Philosophical Society
Quote:
I'm not sure how you'd go about coding it to think about what it's doing... some set of parallel processors inspecting the incoming codes, perhaps an evolutionary programming system where it takes the majority decision of the currently top-ranked algorithms. (Ranked via various needs sensors, and perhaps a pair of "good bot"/"bad bot" social buttons on the front)

I think it has a lot to do with the way in which we each encode information (humans and computers). Humans don't just store information; we store our interpretation of information. That filtering process is part of what gives us consciousness. Computers can just store data whole cloth, no need for interpretation. I don't think that finding a way for computers to mimic our encoding process is the answer to creating an AI. For AIs an entirely different process would have to be discovered, one taking into account such fundamental differences.

As for the 20 msec time frame, we have processes within us that happen both faster and slower, but it is only those that occur at ~20 msec that produce/are a part of/define consciousness. If computers can achieve consciousness it will most likely be in a very different timeframe. Perhaps one in which we will be unable to recognize their awakening.
Re: [OT] Plato's Pub and Philosophical Society
Quote:
Re: [OT] Plato's Pub and Philosophical Society
Has anyone ever read the Hyperion series by Dan Simmons? It has an interesting account of the development of AIs, especially in the last two books (Endymion and The Rise of Endymion). It sees them basically as viruses that gained sentience through parasitic consumption of their brethren. This had some interesting implications for their group psychology and their interaction with humans.
Copyright ©1999 - 2025, Shrapnel Games, Inc. - All Rights Reserved.