Views: 122891 | Replies: 189

Will Robots, Cyborgs, and Super-Humans Kill Off the Chinese People?

Post time 2013-3-21 05:45:27
This post was edited by WarmWeather at 2013-3-21 06:52

For centuries people have feared Hell, the place where "sinful" souls suffer eternal, unimaginable torment. Gradually, as the light of reason began to dawn upon the world, or at least certain parts of it, the concept of Hell began to lose its power. It became a somewhat politically incorrect relic of a bygone dark age, not to be taken too literally or seriously. Nowadays, few people in the civilized world believe that Hell exists, although, oddly enough, many still claim to believe in God. Well, they're wrong. While God, Heaven, or angels may be little more than childish fantasies, Hell most certainly does exist. It is right here, right now. Hell is all around us. We live it, breathe it; it's our reality. A reality ruled by entropy, where suffering and death are the sickening defaults for every living thing. The question isn't whether you will suffer and die, but rather how much, and when. Some may be luckier than others, but in the end no flesh is spared. The shadow of death and misery hangs over us like Damocles' sword, and poisons all pleasures. Ours is a brutal natural order where organisms have to kill each other for food, the struggle for status and resources is constant, and disease, violence, disasters, and accidents perpetually vie with steady degeneration in their bid to inflict suffering and death upon the living. In a world such as this, it can hardly be deemed surprising that many seek blissful oblivion via drugs, sex, religious psychosis...or suicide.

After aeons of pain, fear, despair, frustration, and perspectiveless drudgery, of countless lost generations, there is, however, finally some genuine hope. Ex machina libertas; technology will set you free. Science and technology have, ever since man learned to control fire and began to make simple tools, steadily improved the quality of life. It is they who gave birth to civilization, man's ongoing rebellion against a harsh, chaotic, uncaring universe. Now, in what is commonly known as the 21st century, this technology-driven rebellion is about to reach its zenith. Soon, our species will have the ability to not only liberate itself from Hell, but to create Heaven on Earth; to finally fulfill its destiny among the stars and "become as gods".

Within thirty years, we will have the technological means to create superhuman intelligence. Shortly after, the human era will be ended. This event, the relatively sudden emergence of superintelligence (SI), is often referred to as the (technological) Singularity in Transhumanist circles. The longer definition is: SINGULARITY: the postulated point or short period in our future when our self-guided evolutionary development accelerates enormously (powered by nanotech, neuroscience, AI, and perhaps uploading) so that nothing beyond that time can reliably be conceived.

Assuming, of course, that we actually survive the next 15-40 years and reach the Singularity, which may be assuming too much. There is no shortage of existential risks, after all, with runaway or military nanotech, genetically engineered or synthetic bacteria and viruses, and good old nuclear warfare being among the most likely candidates for near-term human extinction. Even if superhuman intelligence wins the race, survival is by no means guaranteed for those who don't participate or who fall behind in this burst of self-directed hyperevolution. Technology, like most things, is a double-edged sword, and will give us or our creations not just the means to improve life immeasurably --to banish aging, disease, and suffering forever-- but also the means to extinguish it on an unprecedented scale, practically unopposed.

Directed efforts to stop some of the more dangerous technologies which might cause a "malevolent" Singularity (or global destruction in general) can only slow the process down somewhat, not stop it entirely. Unless, of course, one is prepared to sacrifice science, technology, and civilization itself. This, however, would be a certain death sentence for every individual and, eventually, for the species itself, while the alternative offers at least some hope, even if the odds are stacked against us. Rather than trying to enforce an essentially immoral and ultimately doomed program of relinquishment, as suggested by Bill Joy et al., we should try to develop the most empowering, mind- and body-enhancing technologies as soon as possible. Above all, we need to become smarter and more rational than we currently are in order to deal intelligently with the complex challenges ahead.

Evolution teaches that organisms are replaced by species of superior adaptability. When our robots tire of taking orders, they may, if we're lucky, show more compassion to us than we've shown the species we pushed into oblivion. Perhaps they will put us into zoos, throw peanuts at us, and make us dance inside our cages.

The Posthuman future may be glorious, filled with wonders far beyond our current comprehension, but what good is that to a person if he can't be part of it? If, for example, AIs become superintelligent before humans do, this will reduce us to second-rate beings almost completely at the mercy of this new "master race". During our dominion of the Earth we have wiped out countless animal species, brought others (including our "cousins", the apes) to the brink of extinction, used them for scientific experiments, put them in cages for our enjoyment, and so on. This is the privilege of (near-absolute) power. If we lose the top spot to our own creations, we will find ourselves in the same precarious position that animals are in now. While it may be more or less impossible to predict exactly what an "alien" Superintelligence would do to or with lower life forms such as humans, the mere fact that we'd be completely at its mercy should be reason enough for concern.


[continued below]


Post time 2013-3-21 05:46:09
Needless to say, from a personal perspective it doesn't matter much who or what exactly will become superintelligent (AIs, genetically engineered humans, cyborgs) -- in each case you'd be faced with an unpredictable, vastly superior being. A god, in effect. Because one's personality would almost certainly change, perhaps even completely beyond recognition, once the augmentation process starts, it doesn't even really matter whether the person was "good" or "bad" to begin with; the result would be "unknowable" anyway. Many (most? all?) of our current emotions and attitudes, the legacy of our evolutionary past, could easily become as antiquated as our biological bodies in the Posthuman world.

Altruism may be useful in an evolutionary context where weak, imperfect beings have to rely on cooperation to survive, but to a solitary god-like SI it would just be a dangerous handicap. What would it gain by letting others ascend? Most likely nothing. What could it lose? Possibly everything. Consequently, if its concept of logic even remotely resembles ours, it probably won't let others become its peers. And even if it's completely, utterly alien, it could still harm or kill us for other (apparently incomprehensible) reasons, or even more or less accidentally -- as a side-effect of its ascension, for example.

How many insects does the average human crush or otherwise kill during his lifetime? Many thousands, no doubt, and often without even knowing about it. Usually it's not malice or anything of the sort, merely utter indifference. The insects simply aren't important enough to care about -- unless they get in the way, that is, in which case they're bound to be exterminated with some chemical weapon of mass destruction. They're non-entities to be ignored and casually stepped on at best, annoying pests to be eradicated at worst. Such are the eternal laws of power.

So what's the moral of the story here? Well, make sure that you'll be one of the first Posthumans, obviously, but more on that later.

True, the future doesn't necessarily have to be bad for the less-than-superintelligent -- the SIs could be "eternal" philanthropists for all we know, altruism might turn out to be the most logically stable "Objective Morality", they could be our obedient, genie-like servants, or they might simply choose to ignore us altogether and fly off into space -- but depending on such positive scenarios in the face of unknowability is dangerously naive wishful thinking. Yet, though lip service is occasionally paid to the dangers of the Singularity and of powerful new technologies in general, there is no known coordinated effort within the Transhumanist community to actively prepare for the coming changes. This has to do with the generally (too) optimistic, idealistic, and technophilic attitude of many Transhumanists, and perhaps with a desire to make or keep the philosophy socially acceptable and thus easier to spread. Visions of a harsh, devouring technocalypse, no matter how realistic, are usually dismissed as being too "pessimistic". Of course lethargy, defeatism, strife, and conservative thinking also contribute to the lack of focus and momentum in Transhumanism, but the main problem seems to be that "we" aren't taking our own ideas seriously enough and fail to fully grasp the implications of things like nanotech, AI, and the Singularity. It's all talk and no action.

My main message is as simple as it is radical: assuming that we don't destroy ourselves first, technological progress will profoundly impact society in the (relatively) near future, culminating in the emergence of superintelligence and thus the Singularity. Those who acquire a dominant position during this event, quite possibly a classical Darwinian struggle (survival of the fittest), will undoubtedly reap enormous benefits; they will become "persons of unprecedented physical, intellectual, and psychological capacity. Self-programming, self-constituting, potentially immortal, unlimited individuals". Those who for whatever reason won't participate or who fall behind will face a very uncertain future, and quite possibly extermination.

So, the question is, is China prepared for the coming future?


Post time 2013-3-21 12:11:54
This post was edited by WarmWeather at 2013-3-21 12:12
WarmWeather Post time: 2013-3-21 05:46
So, the question is, is China prepared for the coming future?

I know that Israel is ready.


Post time 2013-3-21 14:17:28
WarmWeather Post time: 2013-3-21 12:11
I know that Israel is ready.

Horse Manure

What a load of horse manure. A huge solar flare will take those who survive back into the neolithic world to which we belong.
9/11 was an inside job.
There was no second plane. It was a bomb. A bomb in the other building.
You KNOW without a doubt the videos are fake, right?!
Planes don't meld into steel and concrete buildings. They crash into them!
It's amazing how the building ate the plane!
Imagine those fragile wings cutting slots in massive steel columns!
How STUPID do they think people are to believe that crap?!


Post time 2013-3-21 17:14:46
TECHNOLOGICALLY ACCELERATED EVOLUTION (TAE) is the next horizon of life (or non-life) on earth. With the "creation" of life from artificially coded DNA sequences, life has lost its aura of spirituality and merges with non-life as a continuum, and morality itself loses its transcendence, becoming nothing more than a code that repeats itself in behavior which accidentally promotes the survival of the individual, of society, or of the species. Such accidents do happen by chance, and are selected for reproduction by the genetic or digital programming framework in which they occur.

Is China prepared for this kind of a world? You bet it is. China leads in genetics, nanotech, and computerization. China has no fear of robots controlled by supercomputers, or of nanobots for that matter, because at Santa's workshop, all toys are possible, and in the largest volumes too.


Post time 2013-3-21 19:42:37
So many Chinese, so scared of extinction.
I've made my living, Mr. Thompson, in large part as a gambler. Some days I make twenty bets, some days I make none. There are weeks, sometimes months, in fact, when I don't make any bet at all because ...


Post time 2013-3-21 23:39:54
A silly little terror campaign ("omg, they're gonna kill us with robots", etc.) to support Chinese ultra-nationalism.

Simply sad.

