Nic

Robots rule!


Quote (der bastler @ Feb. 26 2003,01:17):
> Deadly serious these days?
>
> -=> wink.gif <=-

I don't get you.

----

I don't understand.

What's with the monster?

Although I understand the picture of the Amish people.

And that's not an insult. Personally I respect them Amish people. They have principles against technology and stand by them.

----

Quote (IceFire @ Feb. 26 2003,01:18):
> Quote (der bastler @ Feb. 26 2003,01:17):
> > Deadly serious these days?
> >
> > -=> wink.gif <=-
> I don't get you.

wink.gif = the wink smilie, although I don't like using it because it doesn't really look like it should. wink.gif

----

Quote (IceFire @ Feb. 26 2003,01:27):
> I don't understand.
>
> What's with the monster?
>
> Although I understand the picture of the Amish people.
>
> And that's not an insult. Personally I respect them Amish people. They have principles against technology and stand by them.

Don't worry, I was making a funny, hence the two little smileys at the bottom. smile.gif But I do wonder if the Amish people would mind a robot if it was entirely mechanical.

----

Quote (peanuckle_00 @ Feb. 26 2003,03:02):
> Without technology the black would be slaves, blacks like me.

I don't think technology had much to do with slavery... confused.gif

----

Quote (IceFire @ Feb. 26 2003,00:59):
> The love between a man and woman is only possible between humans. And is a perfect reason why our humanity should not be recreated or imitated by mankind.

What you're saying is that we should not make anything as our own image. You are afraid that it would somehow reduce the value of "humanity" if mere machines could be just like we are.

I have seen "humanity" reduced to a code of four different letters, neatly stored on the hard disk of a computer. That was called the Human Genome Project, and it is now more or less complete (accessible at http://www.ncbi.nlm.nih.gov/mapview....f&query). The essence of humanity is the information coded in the biological storage medium of DNA.

Like it or not, humans and computers are both essentially just machines running a program.

----

Quote (peanuckle_00 @ Feb. 26 2003,03:02):
> Without technology the black would be slaves, blacks like me.

It was an advance in technology that turned slavery into a big business in the first place. The cotton gin allowed the process of separating cotton fiber from cotton seed to be accomplished much more efficiently. So, instead of plantation owners having to own a large number of slaves for cotton harvesting AND a large number of slaves for cotton separating (which was a prohibitively expensive and inefficient means of farming), they could concentrate the vast majority of their slave labor on harvesting, and the cotton gin allowed the separators to keep up with the harvesting rate, no matter how much cotton was picked.

Of course, Eli Whitney, the guy who invented the cotton gin, was also a pioneer in interchangeable parts and the industrial production of firearms in the Northern states. So yes, technology did help win the Civil War and, by proxy, free the slaves.

----

Quote (Oligo @ Feb. 26 2003,09:31):
> Like it or not, humans and computers are both essentially just machines running a program.

At an objective level, this is accurate. However, to allow the base concept of life to be assigned a value no greater than a computer program opens up a Pandora's Box of abuses that would otherwise be prevented by maintaining, for lack of a better term, the sanctity of life. To maintain the viewpoint that life is not special and therefore not valuable, you slip into a form of nihilism that at first seems reasonable and very fashionably post-modern, but is ultimately an incredibly dangerous outlook. Why is it dangerous? Well, it's pretty simple: what if everybody thought the same way you do? I mean, everyone. Suddenly, there's no reason to be decent to your fellow human beings. Why should we? They're just carbon-based robots running on a genetic program enforced by chemical balances; fuck 'em. And that's just at the very basic day to day level. Think about the consequences at the national or world level. Hell, it's bad enough as it is, when leaders still try to maintain a veneer of righteousness.

So, whether your statement is correct or not, it is not a thought to be nurtured. I know it sounds a bit fascist to deny the truth, but in this one case, it is the smart thing to do. By believing that life is special and something that shouldn't be messed with, we then have the moral option to hold ourselves to a higher standard.

----

Quote (Tex [uSMC] @ Feb. 26 2003,10:19):
> At an objective level, this is accurate. However, to allow the base concept of life to be assigned a value no greater than a computer program opens up a Pandora's Box of abuses that would otherwise be prevented by maintaining, for lack of a better term, the sanctity of life. To maintain the viewpoint that life is not special and therefore not valuable, you slip into a form of nihilism that at first seems reasonable and very fashionably post-modern, but is ultimately an incredibly dangerous outlook. Why is it dangerous? Well, it's pretty simple: what if everybody thought the same way you do? I mean, everyone. Suddenly, there's no reason to be decent to your fellow human beings. Why should we? They're just carbon-based robots running on a genetic program enforced by chemical balances; fuck 'em. And that's just at the very basic day to day level. Think about the consequences at the national or world level. Hell, it's bad enough as it is, when leaders still try to maintain a veneer of righteousness.
>
> So, whether your statement is correct or not, it is not a thought to be nurtured. I know it sounds a bit fascist to deny the truth, but in this one case, it is the smart thing to do. By believing that life is special and something that shouldn't be messed with, we then have the moral option to hold ourselves to a higher standard.

Come on, do you really think that decent behaviour should rise from some kind of inherent "value" and "speciality" of human life? If you look at things in their base level, all life indeed is just carbon-based robots running their genetic program. But we are still free to believe that it is wrong to kill and harm other life. We do not need a magic sanctity of life to justify our behaviour, all we need is the skill of empathy: "I'm a carbon-robot but I nevertheless don't want to die or endure pain, therefore those similar carbon-robots next to me probably don't want to die or be in pain either, thus I should not harm them."

If we move our moral basis from sanctity of life to empathy, we'll never need to adjust it again. I'd be happy to think that if we ever create true artificial intelligence, we would be able to treat it as an equal, not as a slave, a mere silicon-robot.

As for "messing" with life: it's not messing, it's improving. Our cultural evolution is lifting us above natural evolution. We will soon be free to improve upon what nature has accidentally given us. What's so wrong with that?

----

Quote:
> Come on, do you really think that decent behaviour should rise from some kind of inherent "value" and "speciality" of human life?

Can you think of a better reason that we should treat other humans well? For example, I'm a bit of an existentialist, and think that a human's life is only as valuable or meaningful A) as far as their society inherently values human life, and B) as the value that a human adds to his/her own life through activities or deeds. Without our society to lay down a fundamental baseline value of life, people would be judged solely by their accomplishments, and those who, for whatever reason, fall behind would be worthless. Now, if the society's baseline were that life is a standard natural phenomenon that deserves no special recognition, how could that society justify laws against, for example, murder? I mean, it's not like life is special or anything, so why should taking a life carry a special punishment?

Quote:
> But we are still free to believe that it is wrong to kill and harm other life. We do not need a magic sanctity of life to justify our behaviour, all we need is the skill of empathy: "I'm a carbon-robot but I nevertheless don't want to die or endure pain, therefore those similar carbon-robots next to me probably don't want to die or be in pain either, thus I should not harm them."

If you think about it, a large number of the human-caused problems in the world are caused by a human deciding to value life less than their own goals or ideology. Taking this into account, one can see how devaluing life itself cannot possibly improve the situation. By valuing life as something special and worthy of preservation, you at least offer a check against the behavior that leads to many of the problems in the world. It won't entirely prevent that behavior, but it will create a means by which people who operate largely on values and ideas will have to weigh a value they have learned since infancy before taking any action.

Quote:
> If we move our moral basis from sanctity of life to empathy, we'll never need to adjust it again. I'd be happy to think that if we ever create true artificial intelligence, we would be able to treat it as an equal, not as a slave, a mere silicon-robot.

When you consider it, empathy also stems from a type of sanctity of life. You see, empathy relies on your value and preference of your own life. You consider yourself to be special, and would prefer that you were not harmed. So, empathy is basically the projection of your own sanctity of life onto other people.

Quote:
> What comes to "messing" with life: It's not messing, it's improving. Our cultural evolution is lifting us above natural evolution. We are soon free to improve upon what nature has accidentally given us. What's so wrong with that?

For one thing, there's nothing wrong with improving; for another, we are not yet at a point where we can adequately decide what is a change for the better and what would be a mistake for the worse. As for our cultural evolution, I can't help but laugh a little. Our cultural evolution, in the past century, has included: murdering those who practice a different religion; taking large amounts of mind-altering substances; and of course, watching Jerry Springer.

----

Quote (Tex [uSMC] @ Feb. 26 2003,10:19):
> Quote (Oligo @ Feb. 26 2003,09:31):
> > Like it or not, humans and computers are both essentially just machines running a program.
> At an objective level, this is accurate. However, to allow the base concept of life to be assigned a value no greater than a computer program opens up a Pandora's Box of abuses that would otherwise be prevented by maintaining, for lack of a better term, the sanctity of life. To maintain the viewpoint that life is not special and therefore not valuable, you slip into a form of nihilism that at first seems reasonable and very fashionably post-modern, but is ultimately an incredibly dangerous outlook. Why is it dangerous? Well, it's pretty simple: what if everybody thought the same way you do? I mean, everyone. Suddenly, there's no reason to be decent to your fellow human beings. Why should we? They're just carbon-based robots running on a genetic program enforced by chemical balances; fuck 'em. And that's just at the very basic day to day level. Think about the consequences at the national or world level. Hell, it's bad enough as it is, when leaders still try to maintain a veneer of righteousness.
>
> So, whether your statement is correct or not, it is not a thought to be nurtured. I know it sounds a bit fascist to deny the truth, but in this one case, it is the smart thing to do. By believing that life is special and something that shouldn't be messed with, we then have the moral option to hold ourselves to a higher standard.

That same argument could also be used in a discussion on cloning. And yes, Tex, there is still a reason to be decent to other folk even if they regarded one another as simply machines: the reason is cause and effect. You hit someone, they hit you back; a bit like it is today, when you scrape the surface.

Quote:
> If we move our moral basis from sanctity of life to empathy, we'll never need to adjust it again. I'd be happy to think that if we ever create true artificial intelligence, we would be able to treat it as an equal, not as a slave, a mere silicon-robot.

Robots should be our slaves, within context. They should ease our load, be our tools, and give us greater freedoms; the last thing we'd need is another minority to complicate matters.

The phrase "when in Rome, do as the Romans do" comes to mind with regard to that.

----
Guest

I wonder what will destroy our civilization first: uncontrollable AI, irresponsible genetic alterations, or will we just do it the old-fashioned way and nuke ourselves to death? smile.gif

----

Ah, but would the AI, if it did wipe us out, do a better job of running the planet (assuming it had the intellectual capability for it) than we did, or would it eventually wipe itself out the same way we did? smile.gif

My bet's on atomic war. Hey, who knows, maybe Iraq will kickstart it. wow.gif

----

I think the likelihood of AI physically taking control of society against the wishes of a majority of humans, in some kind of 'hostile takeover', is low to very low.

On the contrary, we will simply hand it over to 'them'. By the time it is possible to run a society with AI, said AI will be so thoroughly ingrained in that society that the vast majority will not regard it as particularly sinister. Humans will gradually become less and less important in the decision-making process. Assuming humanity survives that long.

If AI was running the planet, then the long-term survival of intelligent beings (whether human or AI) would, I believe, be more certain.

It could hardly be less certain now. We're not even taking significant measures to detect and enable the destruction of meteorites or other extra-terrestrial threats, or to develop the technologies that would enable such things.

People, including those in government, always tend to end up looking to the short term. But such threats, however seemingly unlikely in the short term, could wipe out all life, and a catastrophe WILL happen eventually.

An AI determined to continue life as a whole would view protection against this as a priority and organise accordingly.

----

Quote (Renagade @ Feb. 26 2003,21:30):
> robots should be our slaves within context. They should ease the load of us and be our tools and give us greater freedoms; the last thing we'd need is another minority to complicate matters.
>
> The phrase "when in rome do as the romans do" comes to mind with regards to that.

Have you ever watched Bicentennial Man? Nice movie.

----

Quote (IsthatyouJohnWayne @ Feb. 26 2003,21:38):
> On the contrary we will simply hand it over to 'them'. By the time it is possible to run a society with AI, said AI will be so thoroughly ingrained in that society that the vast majority will not regard it as particularly sinister. It will happen gradually that humans will become less and less important in the decision making process. Assuming humanity survives that long.

Have you ever played the game "Omikron: The Nomad Soul"? While being a good game in itself, it also touches on this subject.

The world that you play in has one big supercomputer controlling the things that the government controls now, and more. Unfortunately, a demon manages to take control of it and screws things up, and a few corrupt officials make matters worse.

Now, if such a system was implemented, it could suffer from the same problems; obviously not a demon, but you get the idea.

No, I haven't seen Bicentennial Man. Why?

----
