Guest Posted May 21, 2003
Quote (IsthatyouJohnWayne @ 21 May 2003, 16:53): "Lastly, quantum computers making possible all kinds of astronomical wackiness thanks to 'qubits' that exist simultaneously as a one and a zero. But I personally think a full-on quantum computer will be unfeasible for a very, very long time (a century or more)."
Quote (denoir @ 21 May 2003, 17:00): "I'd say 20 years at most. You already have functioning examples that can perform simple logical and mathematical operations. What is lacking today is production technology that can mass-produce the devices with sufficient accuracy. The stability and sterility of the operating environment is also still a problem, but I'm confident that it will be solved."
What, you mean it might require WindowsQuantum to run properly? Then it's stuffed... don't even bother.
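For anyone wondering what "simultaneously a one and a zero" means in practice, here is a minimal sketch (my own illustration, not from either poster above) of a single qubit as a two-component state vector: a gate puts it into superposition, and measurement collapses it to 0 or 1 with probabilities given by the squared amplitudes.

```python
# Minimal qubit sketch (illustration only): state is a 2-element complex
# vector; a Hadamard gate creates an equal superposition of |0> and |1>.
import numpy as np

ket0 = np.array([1, 0], dtype=complex)           # classical "0"
H = np.array([[1, 1], [1, -1]], dtype=complex) / np.sqrt(2)

state = H @ ket0                                 # superposition of 0 and 1
probs = np.abs(state) ** 2                       # [0.5, 0.5]
outcome = np.random.choice([0, 1], p=probs)      # measurement collapses it
print(probs, outcome)
```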
bn880 5 Posted May 21, 2003
Quote (IsthatyouJohnWayne @ 21 May 2003, 11:10): "And will there still be people starving in Africa even as we create this artificial life in a laboratory?"
If you follow past examples (intelligence), yes, it's quite possible. Especially if there is a use for intelligent AI in warfare (which there is), you could have disastrous wars and starvation while some sophisticated artificial intelligence is created. Sucks...
CopyCon 1 Posted May 21, 2003
Quote (denoir @ 21 May 2003, 17:03): "I think that what people most often mean by 'intelligence' is self-awareness."
And here comes the problem that makes a whole society of philosophers scratch their brains out in frustration. Do you have to be self-conscious to be intelligent, or do you become self-conscious at a certain level of intelligence? For me, acting intelligent is the same as being intelligent, since you'd never notice any difference.
Tex -USMC- 0 Posted May 21, 2003
I know many humans who are self-aware, but would not fit into my definition of 'intelligent'.
Tovarish 0 Posted May 21, 2003
Quote (Tex [uSMC] @ 21 May 2003, 18:46): "I know many humans who are self-aware, but would not fit into my definition of 'intelligent'."
Which brings me to my favourite definition of a computer: Accelerator of Human Stupidity.
Guest Posted May 21, 2003
Quote (Copy Con @ 21 May 2003, 18:39): "For me acting intelligent is the same as being intelligent, since you'd never notice any difference."
Yes, that is of course one way of looking at it. That's also the basic premise of the Turing test: you lock up a computer and a human being in two separate boxes and give them the same means of communication. If you can't differentiate between them, then the computer is said to be intelligent. I think the idea is very flawed, since you only get to see one limited aspect of the whole 'intelligence' concept. To this day no computer has ever passed the Turing test, but they are getting closer and closer.
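A rough sketch of the setup described above (the question, the canned machine reply and the console interaction are all made up for illustration, not part of any real test): an interrogator exchanges messages with two hidden respondents and must guess which one is the machine.

```python
# Toy Turing-test round (illustration only): one hidden human, one hidden
# machine, and an interrogator who must tell them apart from the replies.
import random

def machine_respondent(message):
    # Canned reply - a stand-in for whatever chatbot is being tested
    return "That is an interesting point. Tell me more."

def human_respondent(message):
    return input(f"[hidden human] answer to '{message}': ")

def turing_test_round(question):
    # Randomly assign the hidden labels A and B to the machine and the human
    pair = [machine_respondent, human_respondent]
    random.shuffle(pair)
    respondents = {"A": pair[0], "B": pair[1]}
    for label, respond in respondents.items():
        print(f"{label}: {respond(question)}")
    guess = input("Which one is the machine (A/B)? ").strip().upper()
    truth = "A" if respondents["A"] is machine_respondent else "B"
    print("Correct guess!" if guess == truth else "Fooled - the machine passed this round.")

turing_test_round("What did you do last weekend?")
```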
CopyCon 1 Posted May 21, 2003
Quote (denoir @ 21 May 2003, 18:54): "That's also the basic premise of the Turing test [...] no computer has ever passed the Turing test, but they are getting closer and closer."
I don't really like the test either. Anyway, ALICE (http://www.alicebot.org/) won "most intelligent bot" some time ago, and it would not pass even the simplest tests.
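ALICE is driven by AIML pattern/template rules. The toy bot below is my own much-simplified illustration of that style (the rules and replies are invented), which also shows why such bots feel shallow: replies are keyword lookups with no model of what is actually being said.

```python
# Toy keyword-matching chatbot in the spirit of ALICE-style rule bots
# (illustration only - not the real AIML engine).
RULES = [
    ("HELLO",   "Hi there! What would you like to talk about?"),
    ("WEATHER", "I do not get outside much, being a program."),
    ("YOU",     "We were talking about you, not me."),
]
FALLBACK = "That is interesting. Please go on."

def reply(user_input):
    text = user_input.upper()
    for keyword, response in RULES:
        if keyword in text:      # first matching rule wins
            return response
    return FALLBACK

while True:
    line = input("> ")
    if line.lower() in ("quit", "bye"):
        break
    print(reply(line))
```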
Albert Schweitzer 10 Posted May 21, 2003
Of course. This is how you test the intelligence of monkeys: you put a mirror in front of them and see whether they recognise themselves. Not many animals can do so; gorillas and killer whales are two I know of. It is believed that self-recognition is the main precondition for creativity.
IsthatyouJohnWayne 0 Posted May 21, 2003
Albert Schweitzer: "You put a mirror in front of them and see whether they recognise themselves. [...] It is believed that self-recognition is the main precondition for creativity."
But for self-recognition to truly take place, cognition is first necessary. This suggests elements of reasoning, perception, awareness and judgement. It may be possible to program a system to 'recognise' itself in some sense with very little cognition. You could equip it with visual sensors, a name and a set of dimensions for its physical appearance. Then, after some fine tuning and clever programming, it could use visual object identification techniques to 'recognise' itself in the mirror. You could even program it to wiggle and say "hey, that looks like me, Joey!" each time it sees itself in the mirror (and program it to differentiate between itself and other similarly shaped objects), but would that indicate a true awareness of its own existence? No. In many ways that's just another limited test, not unlike the Turing test.
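To make the point concrete, here is a deliberately shallow "mirror test" along the lines described above. The name, dimensions and tolerance are all invented for illustration; the point is that matching a stored self-template says nothing about genuine self-awareness.

```python
# Shallow self-recognition sketch (illustration only): the robot flags any
# observed object whose measurements match its own stored profile.
SELF_PROFILE = {"name": "Joey", "height_cm": 120.0, "width_cm": 45.0}
TOLERANCE = 0.05   # allow 5 percent measurement error

def looks_like_me(height_cm, width_cm):
    h_ok = abs(height_cm - SELF_PROFILE["height_cm"]) <= TOLERANCE * SELF_PROFILE["height_cm"]
    w_ok = abs(width_cm - SELF_PROFILE["width_cm"]) <= TOLERANCE * SELF_PROFILE["width_cm"]
    return h_ok and w_ok

def react(height_cm, width_cm):
    if looks_like_me(height_cm, width_cm):
        return f"Hey, that looks like me, {SELF_PROFILE['name']}!"
    return "Just another object."

print(react(119.0, 44.5))   # "recognises itself" in the mirror
print(react(175.0, 60.0))   # some other, similarly shaped object
```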
Guest Posted May 22, 2003
The Turing test uses language as the criterion, which means computers not only have to be able to understand what is being said, but also what they themselves are saying (the second part being harder in some ways, I think). I think language is a rather dodgy criterion, since most (all?) animals and some plants have means of communicating with each other, but not all of these would pass a self-awareness test (which has its own problems anyway). I'm not sure I'd really want my PC to have self-awareness anyway.
Nagual 0 Posted May 22, 2003
Quote (IsthatyouJohnWayne @ 22 May 2003, 07:42): "But for self-recognition to truly take place, cognition is first necessary. This suggests elements of reasoning, perception, awareness and judgement."
Take out reasoning and judgement and you're getting warmer. Reasoning especially is not cognition, simply a tool to help make sense of things. There are much better filters to perceive through than that, though reason is useful when it comes to order and stability. It can be part of an overall cognitive system, but it is not the cognitive system itself. E.g. insects, which demonstrate "group intelligence", are living beings with a cognitive system too, but I get the feeling that reasoning is not high on their perceptual agenda. Also, if anyone seriously wants to ponder intelligence, consider that intelligence, awareness etc. have an actual connection to that good ol' thing called "life". There is no perception, awareness or intelligence without life itself, whatever that may be. Regarding testing, what about human testing? There is a test called EQ, testing emotional intelligence, believed by some serious researchers to be much more accurate than IQ tests. Funnily enough, most modern adult males score at the level of children in single-figure age groups.
IsthatyouJohnWayne 0 Posted May 22, 2003
Nagual: "Take out reasoning and judgement and you're getting warmer. Reasoning especially is not cognition, simply a tool to help make sense of things. There are much better filters to perceive through than that, though reason is useful when it comes to order and stability."
I do not claim reasoning or the other abilities mentioned are essential at all times to cognition, hence "suggests elements". I must disagree, though, in that to me reasoning seems to be quite an important part of cognition, at least when discussing higher intelligence (which, as we have established, AI in this discussion may pertain to). I think reasoning in some sense is essential to forming an accurate or complex understanding of things, and knowledge without understanding tends to undermine intelligence. So reasoning abilities of some kind are important to advancing artificial intelligence. Anyway, perhaps we should move on in this discussion from advanced or sentient artificial intelligence to more practical or immediately feasible areas of AI research. Narrow, task-specific AI might not entail so much philosophical quacking. Interesting new smilies.
DarkLight 0 Posted May 22, 2003
I think that machines will be able to act out emotions, and they might be able to react in an emotional way to certain things, but they'll never feel emotions the same way we humans do. If you ask me, they'll show emotions, but it'll just be some programmed stuff, nothing like what we have...
reedkiller 0 Posted May 22, 2003
If we assume for now that religion or one's spirit does not exist, then we can know for sure that artificially created human intelligence and consciousness is possible, because all you would need to do is "grow" a test-tube brain (obviously not possible yet!) or create an extremely good copy of the brain (biological or mechanical, i.e. a computer). Therefore there is no question as to whether it can be done in theory. We can also assume that an artificial human brain could be improved (more easily if it is non-biological). The only question is whether there will ever be the technology and motivation to create such things. As to the technology, we are now making more advances in science per year than ever before. In less than 100 years we have gone from the Wright brothers to Mr. Armstrong. And we are clearly a very, very long way from knowing all we can, so I believe it is only a matter of time before the technology exists.
reedkiller 0 Posted May 22, 2003
If we ever do make a perfect copy of the human brain in the form of a machine, then it would obviously feel emotions and have some idea of what it is (assuming it had some senses such as sight, hearing etc.). So would it be given human rights?
@DarkLight -- don't think you, or the human race, are anything special. You are little more important to the running of the universe than a rock of the same chemical composition as yourself. The same rules of physics apply to a human that apply to anything else. An emotion is no more than a relatively simple set of chemical and electrical changes that in theory could be emulated in something that does not live.
bn880 5 Posted May 22, 2003
Quote (reedkiller): "An emotion is no more than a relatively simple set of chemical and electrical changes that in theory could be emulated in something that does not live."
I pretty much see it the same way. Although you have to admit we are a little special.
Guest Posted May 22, 2003
Ah ha... this goes into interesting territory: the Anthropic Principle.
Simply put: we see the universe the way it is because we exist.
Not so simply:
1. The Weak Anthropic Principle: In a universe that is large or infinite in space and/or time, the conditions necessary for the development of intelligent life will be met only in certain regions that are limited in space and time. The intelligent beings in these regions should therefore not be surprised if they observe that their locality in the universe satisfies the conditions that are necessary for their existence.
2. The Strong Anthropic Principle: There are either many different universes or many different regions of a single universe, each with its own initial configuration and, perhaps, with its own laws of science. In most of these universes the conditions would not be right for the development of complicated organisms; only in the few universes that are like ours would intelligent beings develop and ask the question: "Why is the universe the way we see it?" The answer is then simple: if it had been different, we would not be here!
(From "A Brief History of Time" by Stephen Hawking.)
The second one I find a lot of fun. One way of looking at it is to say: "The universe was created so that intelligent life could exist in order to ask the question of why the universe was created." So now if you apply that to AI...
IsthatyouJohnWayne 0 Posted May 22, 2003
Actually, Leone, the end of your post raises a good question. Why are we intent (as many people are) on creating artificial sentient life that knows it exists? Perhaps just so it can ask "why do I exist?" What use actually is that to the human race? Why are we so intent on building a superior replacement for our own species? Perhaps we should focus our efforts more on just augmenting existing sentient lifeforms (i.e. us).
Reedkiller -- I suppose growing a test-tube brain (and having it live in a machine exoskeleton?) would qualify as creating artificial intelligence, but you must admit it is still more 'natural' (less artificial) than creating a brain of non-biological circuitry. Though perhaps it would be the intelligent (easier) thing to do. Still, genetically altering humans and adding metal bits is something of a taboo subject (or objective, for scientists anyway). All this talk makes me want to play Deus Ex.
CopyCon 1 Posted May 22, 2003
Quote (IsthatyouJohnWayne): "What use actually is that to the human race? Why are we so intent on building a superior replacement for our own species?"
Maybe that's the last possible evolution for mankind.
To denoir: does extrapolating just apply to curves, like in your example? Or is it the same logic used in solving IQ tests with forms and shapes, or... the way RAID 5 calculates and rebuilds the missing bits?
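For reference, the RAID 5 case is not really extrapolation: parity is simply the XOR of the data blocks, and any single missing block can be rebuilt exactly by XOR-ing the parity with the surviving blocks. A quick sketch of that idea (toy byte strings only; real RAID 5 works on fixed-size stripes distributed across disks):

```python
# RAID-5-style parity sketch (illustration only): rebuild one lost block
# from the parity block and the surviving data blocks.
from functools import reduce

def xor_blocks(a, b):
    return bytes(x ^ y for x, y in zip(a, b))

data_blocks = [b"AAAA", b"BBBB", b"CCCC"]        # blocks on three "disks"
parity = reduce(xor_blocks, data_blocks)         # stored on a fourth "disk"

# Suppose the disk holding b"BBBB" fails; XOR parity with the survivors.
survivors = [data_blocks[0], data_blocks[2]]
rebuilt = reduce(xor_blocks, survivors, parity)
assert rebuilt == b"BBBB"
print(rebuilt)
```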
reedkiller 0 Posted May 22, 2003
@Leone -- what you said is quite amusing and clever.
DarkLight 0 Posted May 22, 2003
Quote (reedkiller): "@DarkLight -- don't think you, or the human race, are anything special. [...] An emotion is no more than a relatively simple set of chemical and electrical changes that in theory could be emulated in something that does not live."
I don't think the human race is special at all; the only thing we're good at is bad stuff. Basically, the worst thing walking on our earth is humans. I know human beings are nothing really special, that's not what I wanted to say... What I was trying to say is: will the computer experience those 'emotions' the same way as us, or will it just react as its little program tells it to? I think there is a difference between doing what you are programmed for and really feeling emotions... I find it very hard to believe that a machine can have real feelings, not just programmed feelings, but that the computer can really experience them exactly as we humans and some animals do. It just sounds so... weird... After all, it's still a machine...
CopyCon 1 Posted May 22, 2003
Quote (DarkLight): "I find it very hard to believe that a machine can have real feelings, not just programmed feelings, but that the computer can really experience them exactly as we humans and some animals do. It just sounds so... weird... After all, it's still a machine..."
Do animals have REAL feelings? How do you define emotions?
****WARNING**** Below you'll find content that you might find offensive if you're a friend of pets!
If I put sensors on my computer that make it beep if you try to take it apart or smash it, does it feel pain? If I kick a cat and it meows, does it feel pain? You cannot ever measure or find out whether something is able to have emotions or feel pain or not. The only thing that is measurable is the sensor input and the signals.
reedkiller 0 Posted May 22, 2003
@DarkLight --- what is the difference between yourself and a particularly complex machine, apart from what it is made of etc.? What I am saying is there is nothing magical about you feeling pain like you do. It is only the result of PHYSICAL changes etc. (unless you are blindly religious), and physical changes can be simulated. Many people choose not to believe that a machine that does have feelings can in theory be created, because it makes them feel less special and more vulnerable.
reedkiller 0 Posted May 22, 2003
@Copy Con -- you make a very valid point; however, I think that we can tell something about whether a given thing feels pain. It is assumed (by at least one group of scientists) that certain animals do not feel pain. For example frogs, which seem to have just knee-jerk reactions to heat, because they are quite happy to sit (and die) in water that slowly heats up to 200°C, but will jump away if they are suddenly put in a hot oven (don't try this at home, assholes). We can also study how the brain works to give us clues as to whether animals feel pain. ***(I'm a bit out of my depth here, so don't trust everything I say)***
Devil 0 Posted May 22, 2003
I have found a really advanced chatbot that knows A LOT. Talk to it. http://aimovie.warnerbros.com/html/flash.html