-radkeff-

Is artificial intelligence possible?


Quote: "It just sounds so... weird... After all, it's still a machine..."

So are we.

Quote: "For example frogs, which seem to just have knee-jerk reactions to heat, because they are quite happy to sit (and die) in water that slowly heats up to 200°C, but will jump away if they are suddenly put in a hot oven"

Maybe when the water reaches a certain point they can't get out, and don't realise it till it's too late?

Quote: "(don't try this at home, assholes)."

Oopsies.


I can't be assed to read the entire thread, but have you guys ever heard of "robots"?

I've seen so many robot programs where they would give the robot the ability to do certain things, but with a blank memory, so it had to learn to figure things out itself... I personally call this "THINKING" and "LEARNING".

Another way of putting it: AI.
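That "blank memory, learn by trial and error" idea can be sketched in a few lines of Python with tabular Q-learning on a tiny corridor world. The world, rewards, and parameters below are invented purely for illustration; this is not any specific robot program from those shows.

```python
import random

# Toy "blank memory" learner: tabular Q-learning on a 5-cell corridor.
# The agent starts knowing nothing (all Q-values zero) and discovers,
# purely by trial and error, that moving right reaches the goal.
N_STATES = 5          # cells 0..4; the reward sits at cell 4
ACTIONS = [-1, +1]    # move left or right
ALPHA, GAMMA, EPSILON = 0.5, 0.9, 0.5

def train(episodes=500, seed=0):
    rng = random.Random(seed)
    q = {(s, a): 0.0 for s in range(N_STATES) for a in ACTIONS}  # blank memory
    for _ in range(episodes):
        s = 0
        while s != N_STATES - 1:
            # epsilon-greedy: exploit what was learned, sometimes explore
            if rng.random() < EPSILON:
                a = rng.choice(ACTIONS)
            else:
                a = max(ACTIONS, key=lambda x: q[(s, x)])
            s2 = min(max(s + a, 0), N_STATES - 1)
            r = 1.0 if s2 == N_STATES - 1 else 0.0
            best_next = max(q[(s2, x)] for x in ACTIONS)
            q[(s, a)] += ALPHA * (r + GAMMA * best_next - q[(s, a)])
            s = s2
    return q

q = train()
# After training, the learned policy at every non-goal cell is "move right" (+1).
policy = [max(ACTIONS, key=lambda a: q[(s, a)]) for s in range(N_STATES - 1)]
print(policy)
```

Nothing here was hand-coded about *which* way to go; the preference for "right" exists only in the learned Q-table, which is the sense in which such programs "figure things out themselves".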


If you ask me, AI is simply the point at which a computer has the potential to extend its own programming without any boundaries. Just like we can learn anything, if we want to and if we work hard enough.

Of course there are computers nowadays that can 'learn' but in my mind that isn't real AI because they can only learn so much and then that's it.

When we invent a computer that has the ability to learn anything, anytime without restriction - that would be AI.



AI is the ability of a computer to make a decision completely unassisted, such as picking a random number.

In fact, if you could develop a computer that can create random numbers unassisted, you are probably set for your lifetime in terms of money.
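The "unassisted" point is easy to demonstrate: a standard software random number generator is fully deterministic, so seeding it the same way twice reproduces the identical "random" sequence. A quick Python sketch (nothing specific to any particular machine):

```python
import random

# A software pseudo-random generator is fully deterministic:
# the same seed always reproduces the exact same "random" sequence.
def sequence(seed, n=5):
    rng = random.Random(seed)
    return [rng.randint(0, 99) for _ in range(n)]

a = sequence(42)
b = sequence(42)
print(a == b)   # prints True: the "random" numbers repeat exactly
```

This is why genuinely unassisted randomness is treated as a hardware problem (noise sources) rather than something a program can conjure on its own.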

Guest
I find it very hard to believe that a machine can have real feelings; not just programmed feelings, but that the computer can really experience them exactly as we humans and some animals do.

It just sounds so... weird...  After all, it's still a machine...

Do animals have REAL feelings?

How do you define emotions?

****WARNING****

Below you'll find content that you might find offensive if you're a friend of pets!

If I put sensors on my computer, that makes it beep if you try to take it apart or smash it, does it feel pain?

If I kick a cat and it meows, does it feel pain?

You cannot ever measure or find out whether something is able to have emotions and feel pain or not. The only thing that is measurable is the sensor input and the signals.

Pain is not an emotion. Pain is caused by certain events (heat, cold, sharp contact, etc.) triggering automatic responses in the human (and animal) nervous system. It has nothing to do with emotions at all; things like love/hate are responses based on learning/conditioning and more intangible things (i.e. why do I love this person and not someone else?).

Pain might now be possible to replicate in a machine (grow nerve cells, attach them to some sort of computer)... but emotions? I don't think so.

As for animals having emotions: how do you know humans have emotions? Because they can communicate it with language, but also because you understand how to read non-verbal cues, such as facial expressions and body language. Also because humans (generally) know how to empathise: you can see someone having something done to them, know or imagine how that would feel, and then make a judgement about that person's emotions.

It is totally possible to read animals' non-verbal cues, and to empathise with them to a certain degree. Gorillas have been taught language before, and could communicate emotions to a human.

But the day my PC says "I'm feeling a bit unhappy, so I'm not going to run Flashpoint" is the day it's gone too far.


It is most likely that AI is not possible, because our brain is probably a chaotic mathematical system. That means that a small change in one variable results in a great change in the result. To calculate a chaotic mathematical system you would need a supercomputer able to calculate numbers infinite in length. As you all know, our computers have difficulty moving a few soldiers around Nogova in OFP, so the AI will stay like the brave AI soldiers in OFP: predictable and stupid.
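The "small change in one variable, great change in the result" effect can be illustrated with the simplest textbook chaotic system, the logistic map (my example, not the poster's):

```python
# Sensitive dependence on initial conditions, sketched with the logistic map
# x_{n+1} = r * x_n * (1 - x_n) in its chaotic regime (r = 4).
# Two starting points differing by one part in a billion end up far apart.
def logistic_orbit(x, steps, r=4.0):
    for _ in range(steps):
        x = r * x * (1 - x)
    return x

a = logistic_orbit(0.300000000, 50)
b = logistic_orbit(0.300000001, 50)
print(abs(a - b))   # no longer microscopic after only 50 iterations
```

The gap roughly doubles each iteration, so any finite-precision computer loses track of the true orbit after a few dozen steps; that is the sense in which chaotic systems defeat ordinary computation.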


Of course it's possible. It's just a matter of time. Before it was done, flight, space travel and nuclear bombs were considered impossible by most. But then someone went up and did it.

Anything is possible. It's just about time.

Quote: "Of course it's possible. It's just a matter of time. Before it was done, flight, space travel and nuclear bombs were considered impossible by most. But then someone went up and did it. Anything is possible. It's just about time."

Don't be so optimistic. It's one thing to make an atom bomb; it's another to make the thing that made the atom bomb.

According to quantum theory (Einstein, Hawking, ...), everything in this universe has a certain probability of happening, but it may be far less likely to happen within the lifetime of our universe.

So: anything is possible, but it will probably never happen!


I wasn't quite sure if I should make a new post for this, but I think it's related enough to be posted as a reply to this topic, so here it goes:

Imagine you'd get your hands on a super-PC with a game featuring AI characters that are actually self-aware, conscious, sentient (to the limits of their digital environment) beings. Do you think you would feel any moral objections against hurting/killing any of the AI characters?

Personally I would, and when we get to such a point in AI development, some pretty big choices will have to be made as to the rights of such artificial sentients. They might not be "living" beings as such, but if we create them to our own level of intelligence (or higher) while basically controlling/limiting their living environment (the computer), I think it is our duty to treat them with at least a proper level of respect. Humans performing horrible psychological experiments on their digitally created counterparts would be rather questionable IMHO ...

Quote: "Do you think you would feel any moral objections against hurting/killing any of the AI characters?"

If you would backup their "conscience", and after killing them would restore that backup, would that make it any different? The same "character" would be "alive" again, and would have no idea of what has happened...
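As a sketch of that backup idea: if a character's "conscience" really were just data, backing it up and restoring it would be ordinary state copying. The Agent class and its fields below are invented purely for illustration:

```python
import copy

# Hypothetical sketch: if an AI character's "conscience" were just data,
# backing it up and restoring it would be ordinary state copying.
class Agent:
    def __init__(self):
        self.memories = ["woke up", "explored the map"]
        self.alive = True

agent = Agent()
backup = copy.deepcopy(agent.__dict__)   # take the "conscience" backup

agent.memories.append("was shot")        # a traumatic event...
agent.alive = False                      # ...and "death"

agent.__dict__.update(copy.deepcopy(backup))  # restore: the event never "happened"
print(agent.alive, agent.memories)       # True ['woke up', 'explored the map']
```

After the restore the agent has no record of the event at all, which is exactly the moral puzzle being raised: the harm happened, but no one remembers it.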


Quote: "Don't be so optimistic. It's one thing to make an atom bomb; it's another to make the thing that made the atom bomb."

All I am saying is that most people have a tendency to say "it can't be done" until someone actually goes and does it. I'd rather assume anything CAN be done than limit myself and say it CAN'T be done.


That's like saying it is morally right to cause immense pain to someone, as long as you make sure there is no permanent damage and that they don't remember it.

Quote: "Don't be so optimistic. It's one thing to make an atom bomb; it's another to make the thing that made the atom bomb."

Quote: "All I am saying is that most people have a tendency to say "it can't be done" until someone actually goes and does it. I'd rather assume anything CAN be done than limit myself and say it CAN'T be done."

I don't want to be offensive, but I've noticed that most people think there are no limits to the human brain. I'm sorry to let you know that there are.

For example:

Everybody thinks that weather is predictable. It is not! To predict it you would need a supercomputer (like the one I mentioned in one of my previous posts). One American scientist who worked on a weather study for the US Army (all the scientists in the USA who worked apart from the military have probably died from hunger) said that a wind from a butterfly's wing in California could cause a tornado in Florida.

His weather model consisted of three (nonlinear) differential equations; it is very unlikely that a mathematical model of our brain could be made from fewer than three.
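The three-equation model being described sounds like Edward Lorenz's famous system (my identification, not the poster's). A crude Euler integration of it shows the butterfly effect directly: two runs differing by one part in a billion end up completely different.

```python
# The Lorenz system, the classic three-equation "butterfly effect" model.
# Integrated here with a simple Euler scheme, purely as an illustration.
def lorenz_step(x, y, z, dt=0.01, sigma=10.0, rho=28.0, beta=8.0/3.0):
    # One Euler step of the Lorenz equations.
    dx = sigma * (y - x)
    dy = x * (rho - z) - y
    dz = x * y - beta * z
    return x + dx * dt, y + dy * dt, z + dz * dt

def trajectory(x0, steps=5000):
    state = (x0, 1.0, 1.0)
    for _ in range(steps):
        state = lorenz_step(*state)
    return state

a = trajectory(1.0)
b = trajectory(1.0 + 1e-9)   # a one-part-in-a-billion nudge
print(abs(a[0] - b[0]))      # no longer tiny after 50 time units
```

This is why long-range weather forecasting is hard in principle, not just in practice: any measurement error, however small, eventually dominates the prediction.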


"I don't want to be offensive, but I've noticed that most people think there are no limits to the human brain. I'm sorry to let you know that there are."

I know our brains are limited in their capacity. But that doesn't tell us how far we will go in the future. There is just no way to know for sure now, what we can or cannot do in the future.


Sith-

Quote: "Imagine you'd get your hands on a super-PC with a game featuring AI characters that are actually self-aware, conscious, sentient (to the limits of their digital environment) beings. Do you think you would feel any moral objections against hurting/killing any of the AI characters?"

It's a good question. For some reason I hadn't considered sentient AI being used in a conflict-type game (not in the context of this discussion, anyway).

For a start, it would probably be possible to use incredibly advanced AI that just hadn't quite gone through the last step to make it fully conscious. I imagine many legitimate game developers might be happy to use such AI for games involving combat (a near-perfect simulation of consciousness without the real thing).

As Kegetys has said it might be possible to back up or preserve the 'brain' of the AI. Also it would probably be possible to 'fake' pain and death in the AI so that the AI would become something like an actor.

Having said all that, due to the nastiness inherent in human nature, I'm sure that some sentient AI would indeed find their way into a game or situation in which death in the game world would mean real death for the AI (and pain inflicted would be 'real' pain, etc.). I would definitely have a problem with playing, or having anything to do with, a game like that.

Perhaps some kind of underground games movement would emerge, dealing in games in which you could do various nasty things to sentient lifeforms (I'm hoping it would have to be an underground movement, and not a sanctioned and legal area of games development). In fact, this is making me want to write a sci-fi novel.

Would the AI have to learn their way to self-awareness, or would they be programmed self-aware? Somehow killing AI that had memories of growing up, and a history, seems more depraved to me.

Basically, I think if we succeeded in creating sentient, self-aware beings artificially, then we would have succeeded in creating life (artificially), and we would have a duty to accord them the rights that intelligent living beings deserve. I also think some people would feel endowed with a godlike power over the AI and would waste no time in violating their rights.

Creating such AI would really open a whole Pandora's box.

Perhaps we are all just such AI in an ultra-convincing gameworld, and God is a user...

Perhaps I am the only AI, and all the rest of you are part of the game simulation?

Ahhhhhhhh!

Guest
Quote: "Imagine you'd get your hands on a super-PC with a game featuring AI characters that are actually self-aware, conscious, sentient (to the limits of their digital environment) beings. Do you think you would feel any moral objections against hurting/killing any of the AI characters?

Personally I would, and when we get to such a point in AI development, some pretty big choices will have to be made as to the rights of such artificial sentients. They might not be "living" beings as such, but if we create them to our own level of intelligence (or higher) while basically controlling/limiting their living environment (the computer), I think it is our duty to treat them with at least a proper level of respect. Humans performing horrible psychological experiments on their digitally created counterparts would be rather questionable IMHO ..."

I'd go with Douglas Adams on this one. Remember the creature in "The Restaurant At The End Of The Universe", that wants to be eaten? It even suggests nice portions of itself to eat.

Perhaps the type of AI you are talking about would be "happy" to be killed, so long as it made you work for it.

But I think any programmer that made an AI that had this sort of intelligence and could feel pain, and then put it into such an environment, would have to be a very sick monkey indeed. So that's what you are working on, is it?

Quote: "Imagine you'd get your hands on a super-PC with a game featuring AI characters that are actually self-aware, conscious, sentient (to the limits of their digital environment) beings. Do you think you would feel any moral objections against hurting/killing any of the AI characters?"

I guess that game would pose a serious threat to multiplayer.

Nah... I think I would blast away... Even if they could be conscious, I guess humans would see a difference between someone standing next to you and someone living in a box (a computer).

I think you would even rate animals higher, because if that AI was in a game, you could create and delete it like we are creating and deleting OFP's AI...

Quote: "I find it very hard to believe that a machine can have real feelings; not just programmed feelings, but that the computer can really experience them exactly as we humans and some animals do.

It just sounds so... weird... After all, it's still a machine...

Do animals have REAL feelings?

How do you define emotions?"

'Real' feelings? I dunno, and as far as I know, there's no good way to test it...

Quote: "@DarkLight: what is the difference between yourself and a particularly complex machine, apart from what it is made of, etc.?

What I am saying is that there is nothing magical about you feeling pain like you do. It is only a result of PHYSICAL changes, etc. (unless you are blindly religious), and physical changes can be simulated.

Many people choose not to believe that a machine that does have feelings can, in theory, be created, because it makes them feel less special and more vulnerable.

As I said before, I do not consider humans as something special; comparing us to other animals, I see nothing but stupid animals that were lucky to have a good evolution and some pretty good brains (compared to others)."

I know that feelings are nothing special, but I still find it hard to accept that a machine could experience them in the same way as an animal (such as us humans)...

Quote: "Imagine you'd get your hands on a super-PC with a game featuring AI characters that are actually self-aware, conscious, sentient (to the limits of their digital environment) beings. Do you think you would feel any moral objections against hurting/killing any of the AI characters?"

Yes, I would have moral objections about that. Even though they would lack a physical presence, it would be a lot like killing a fully paralysed person on life support with proper brain function. I was thinking about this lately also...

EDIT: The only problem with killing natural life (humans etc.), as opposed to an identical artificial simulation (if possible), is that life has propagated through millennia. Is it possible that some "processes" in the brains of mammals get transferred from mothers to offspring as "initialization" routines? Wonder how that would be done... Anyway, killing a "being" in a machine is destroying your creation and the being; killing a natural lifeform is terminating a process that has been going on for a very long time, and it kind of upsets the species, to say the least.

I sound like I'm going crazy there.

Quote: "I also think some people would feel endowed with a godlike power over the AI and would waste no time in violating their rights."

That's exactly the thing that bothers me. If we can control something and use it to increase our sense of power (even if it's merely virtual power), we will. It's human nature. And somehow I'm pretty sure that such things will get back to us in a very nasty way ...


Quote: "Nah... I think I would blast away... Even if they could be conscious, I guess humans would see a difference between someone standing next to you and someone living in a box (a computer).

I think you would even rate animals higher, because if that AI was in a game, you could create and delete it like we are creating and deleting OFP's AI..."

So even if these virtual beings are mentally identical to normal humans, but simply lack a physical body structure, it's OK for us to delete the very existence we set them up with in the first place? Remember, we're not just talking about some pile of code here (OK, in theory we are), but something that actually feels joy, fear, panic and pain. It has memories, and it will beg for its life when you put a gun against its head. And this time that is not the result of some scripted event, but because the AI character is truly terrified, like you would be in a similar situation.

Quote: "Is it possible that some "processes" in the brains of mammals get transferred from mothers to offspring as "initialization" routines?"

If I recall correctly, all basic human and animal instincts are part of our genetic code. Every creature that is born has a certain amount of "genetic knowledge" to be able to survive, to some extent, the period after its birth (breathing, feeding, reflexes, etc.). For information about genetically passing on knowledge gathered after birth, I have to refer you to Ellen Ripley.


Sith-"I'm pretty sure that such things will get back to us in a very nasty way ..."

Well, if it's any consolation, probably none of us will live to see it.

Oh wait, that's a bad thing, isn't it...


Quote: "I think you would even rate animals higher, because if that AI was in a game, you could create and delete it like we are creating and deleting OFP's AI..."

Quote: "So even if these virtual beings are mentally identical to normal humans, but simply lack a physical body structure, it's OK for us to delete the very existence we set them up with in the first place? [...] It has memories, and it will beg for its life when you put a gun against its head."

Like the guy said, you can create and delete them at will... Big fish eat little fish = bye bye, cyber brains.

At the end of the day, the robots' prime purpose would be to serve us. We already have enough problems with people, let alone introducing another group to deal with, unless the entire human race was dying of some sort of superbug.


And what about introducing some laws into the main AI program, like Asimov's three laws? I suppose this kind of machine would be created by engineers, not by a crazy scientist: tested operations and security, free from bugs...

Guest
Quote: "It has memories, and it will beg for its life when you put a gun against its head. And this time that is not the result of some scripted event, but because the AI character is truly terrified, like you would be in a similar situation."

A not so extreme, but similar thing comes to mind. At my university a department called SANS (Studies of Artificial Neural Systems) made a complete simulation of the sea lamprey's (a very primitive fish with only about 100,000 neurons total) nervous system. They made the simulation exact to the molecular level.

The result? It behaved exactly like the real thing. They tested all forms of extreme conditions on it (giving it adrenaline shocks, boiling it, etc.), and the simulated fish reacted the same way as the real thing did.

So was that cruelty to animals? I'd say no in our world, but yes in the simulated one...

