X on 19/6/2001 at 19:23
I feel that the least moral ending is the Dark Age. Why should one man decide to plunge the cogs of a human machine into a spiral from which it may not recover? A chance to rebuild is not necessary; as long as humanity exists, we will repeat the same technological achievements, and thus the same mistakes.
The idea of a malevolent dictator is an interesting idea, but little more. People would either worship or rebel, more than likely both. And this gives the possibility of an unwanted return to branded capitalism.
Thus the Illuminati. I feel that the idea that society could be influenced in the right directions of socialism and equality, without it realising it, gives the greatest freedom and also the chance for true humanitarian advance.
Deep Shadow on 20/6/2001 at 04:58
Posted by X:
"Thus the Illuminati. I feel that the idea that society could be influenced in the right directions of socialism and equality, without it realising it, gives the greatest freedom and also the chance for true humanitarian advance."
I totally agree.
Wearewe on 20/6/2001 at 05:16
No, no, no. The zero-tech period, if it ever came to be, would last only a few months; after all, it's just a matter of re-routing the computing lines. Remember the principle of the internet: the net cannot be closed or shut down totally, there is always an alternative way around. So the idea of a total darkness is the wrong one.
santaClaws on 20/6/2001 at 15:16
i know there've been loads and loads of intelligent replies to this topic and i don't even think anyone will read my post any more. but anyway:
i also think the helios ending is the best one, concerning morality. furthermore, i think the almost religious debate about this topic is great. i wouldn't have thought this would be possible. one could print the whole thing, go to the university and get a doctorate in philosophy with it.
but what i really wanted to say is:
if the whole game was just about the morality of the endings, nobody should play the game anyway. first of all, everyone killed people during the game. is this moral?
2. i'm sure it wasn't easy for ion storm (namely warren spector) to think of three different endings in the first place (except for some really strange things). you can't blame them for not coming up with a greater number of 'morally correct' endings.
3. i'm absolutely sure that every single player who finished deus ex played all three endings, just to see the graphics, the sequence, to see how the story goes on. if you really had cared about the morality of the endings, you wouldn't even have played those you considered bad. it's nothing but a computer game, after all.
finally: as a matter of discussion, this morality question is f**king great. but you can still interpret the outcome any way you want.
santaClaws AKA triCKster
------------
if you can't
convince 'em
confuse 'em
------------
kostoffj on 20/6/2001 at 16:40
Great posts on this thread!
I have to cast my vote for the Helios ending being the most moral. My take on it is a bit different because I am looking at the problem differently from the others who have replied so far. The difference is this: I think the matter at hand is not a question of which ending ensures the system of governance that best meets people's needs, but which ending ensures the survival of the human species.
Because that's what's really at stake. By the time of Deus Ex, and really, our own immediate future, the human race will face the possibility of extinction at its own hands. You could argue that we've faced this since the Cold War, but the threat of nuclear war was a threat to the survival of civilization, not the entire species. The advent of nanotechnology and advanced genetic engineering will change this. In the old days, to build a bomb it took a concerted national effort, access to rare fissile materials, facilities to process that material into weapons-grade material, world class scientific and engineering talent to make a functional design, and the strength to resist efforts by established powers to stop you building a bomb. All this costs billions - and that's just to build a bomb. To put that bomb on target, reliably, you have to get into rocketry - more talent, more billions, etc. In other words, it's hard to make and deploy nuclear weaponry, which means that few powers will have them, which further means that it is possible to institute control regimes that manage the threat to civilization. This has worked, although with some close calls, for 50 years.
With the technologies becoming available to us, it will be possible for educated and motivated persons to design and build superweapons right in their own homes, be they some kind of virus or nanite, and then release them into the biosphere, where, thanks to the power of replication, the virus/nanites/whatever can spread and annihilate people or even life in general. This doesn't even have to be deliberate; it could happen accidentally (the "gray goo" scenario). It's pointless to argue about when exactly we'll have this capability - it will come to pass, eventually.
How could the human race protect itself against this kind of threat? Out of a population of billions, a human government could never hope to control access to materials and technology to prevent this, even if it went for maximum repression (and such repression would probably increase the population of smart kooks who would seriously consider making a killer nanoplague). Blind faith - "somehow we'll get through this, just like we did the Cold War" - won't work; the combination of a sick but smart psycho and highly developed nanotech and genengineering will come to pass someday.
Human nature, as it is now, is fatally flawed; the genocidal psychopath will always be with us, since the "dark" side of human personality is an inseparable part of total human nature. It's not a problem, in the sense of the survival of the species as a whole, so long as individuals can't by themselves do anything that could destroy the whole species. But it is inevitable that we will reach this stage of technological development. Barring a fundamental change in human nature, the only thing that can save us is an external intervention - Deus Ex Machina, God from a machine, just like in the old Greek plays, who comes in and saves everyone from themselves at the last minute. We know that the God from our religious texts won't be doing this; He certainly didn't get involved at Auschwitz and Hiroshima and all the other awful examples of genocide. So our only option is to build our savior. As the AI Morpheus said to JC, "You will have your God, and you will make Him with your own hands." Since I think the survival of our species could justly be counted the highest moral value of all, I think that clearly the Helios ending is the most moral. The Tong ending somewhat addresses the threat of technology, but his extreme solution just postpones the day of reckoning - someday society and technology will be restored and we will face the question again. The Voltaire quote at the end of the Helios ending said it all: "If there was no God, it would be necessary to invent Him."
Forgive the disjointed nature of this post; for an eloquent and frightening discussion of our self-threat to our survival thanks to empowering technologies, read Bill Joy's article in the April 2000 issue of Wired magazine titled "Why the Future Doesn't Need Us." For those who don't know, Joy is the Chief Scientist and co-founder of Sun Microsystems, so rest assured that this isn't Art Bell stuff ;).
[ June 20, 2001: Message edited by: Felonious Punk ]
nimbus on 21/6/2001 at 01:45
Interesting, Felonious. Though if you really believed that we are stuck with a "fatally flawed" element of our nature, then the most moral thing would be to eradicate ourselves and pave the way for a less, umm, darkly motivated species.
What I had wanted to say was that I think I and some others approached this question from a different angle: "Which ending is the most moral?" We need to define exactly what is being asked. I took it to mean: if you were JC in the game, with no knowledge of the actual scripted conclusions of the three choices, which option would you be most likely to pick, morally?
But I think most people are debating the actual outcomes: which script offers humanity a better future and the least unrest.
That, of course, would change my viewpoint, and I would jump along and say that the Helios ending, as presented, gives the best and most moral outcome of the game. If that were the question, I don't see how there would be much debate. Helios is as close to perfectly moral as any human could attempt to be. AND, he would rule behind the scenes, not as some figurehead government to be resented and overthrown, which offers humanity a better sense of true freedom in their actions.
BUT, if you look from the viewpoint of the main character himself, without the foresight of being able to watch exactly how each ending might pan out, I would say that putting your blind faith in a Helios ending would be very questionable on the moral score. Ever see The Matrix? How do you think it might have started?
Anyway, great discussion, people. Very entertaining :D
Wearewe on 21/6/2001 at 05:37
Peeps, peeps, please!!!
You can only actually shut down the electronic espionage facility, but you cannot cut away the Internet; there is no way to shut it down. Even if there is no more central control, the network is still there. The internet is not a star topology, with one big center that routes everything, but a point-to-point topology, and that makes it virtually unstoppable. Yes, the centralized control is down, but not the local ones, and those are easy to re-program. Hardly a couple of months' work.
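Here is a rough sketch in Python of the topology point (the little network below is entirely made up, just to illustrate; nothing here is from the game): kill the hub of a star and every node is cut off, but kill any one node of a point-to-point mesh and the rest can still reach each other.

from collections import deque

def reachable(links, start):
    # Breadth-first search: collect every node reachable from 'start'.
    seen, queue = {start}, deque([start])
    while queue:
        node = queue.popleft()
        for peer in links.get(node, ()):
            if peer not in seen:
                seen.add(peer)
                queue.append(peer)
    return seen

def drop(links, dead):
    # Remove a node (say, a destroyed central router) from the network.
    return {n: [p for p in peers if p != dead]
            for n, peers in links.items() if n != dead}

# Star topology: everything routes through one hub.
star = {"hub": ["a", "b", "c"], "a": ["hub"], "b": ["hub"], "c": ["hub"]}

# Point-to-point mesh: every node has several independent paths.
mesh = {"a": ["b", "c", "d"], "b": ["a", "c"],
        "c": ["a", "b", "d"], "d": ["a", "c"]}

print(reachable(drop(star, "hub"), "a"))   # {'a'} - totally cut off
print(reachable(drop(mesh, "a"), "b"))     # {'b', 'c', 'd'} - still connected

A star network dies with its hub; a mesh just routes around the damage.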
So the third ending ends the master control, but it does not plunge humanity into the darkness (you wish).
Go and learn something about the internet, then tell me if I am right or not.
[ June 21, 2001: Message edited by: Kenzo Uji ]
Meroveus on 21/6/2001 at 06:35
There's a valid argument/reasoning in each ending, with the Illuminati one probably being the least moral.
For example, in the Dark Age ending, Tong is absolutely right in believing that technology always ends up controlling us. Technology is external; as society becomes dependent on it, it becomes ever more difficult for individuals to be in control of their own lives. Technology gradually forces people to become "incompetent," in the sense that they have to rely on machines to do everything for them. When the machines fail, things come to a halt, chaos is created and people are helpless (Y2K was a good example of this potential threat). Can you imagine the scale of the potential danger in the context of a society totally regulated with nanotechnology? It's simply the control freak's dream, and is the last frontier in terms of removing any freedom left. No wonder that people like the Sun Microsystems guy totally freak out about this. The applications could be totally destructive. Just a simple little mistake and the world is covered in "gray goo." Very reassuring. The big problem with the Dark Age ending is that people will be reduced to trying to survive in a post-apocalyptic world. How long is that likely to continue? Plus it lacks morality in the sense that it's too easy to simply do nothing and let the world crumble without trying to reform it. With proof of the Illuminati conspiracy shown to the whole world, some things might actually be learned in the process.
The Helios ending is simply another variant of technology controlling humanity. An AI that was originally created by the Illuminati, at that. Even if JC is merged with it, there's no clue as to what the AI will do in the long run. It might even be a trick of the Illuminati to facilitate their control by removing themselves from the spotlight. There's really no way to trust an AI. Not too moral.
The Illuminati ending is definitely the least moral, but perhaps the least dangerous. Like the Roman Empire (which is actually pretty close to the corporate-military rule of the Illuminati), it will very probably collapse again in the near future anyway, putting us back in this Martial Law scenario when the population goes berserk again, and the Illuminati will use this to create another order where they are still on top.
So in conclusion I find none of them to be really moral, although the Dark Age one is the most appropriate. One aspect in Deus Ex is overlooked though - the psionics one. You see in the MJ12 labs that they knew this down to a science (see the grays). If the knowledge concerning this were spread around, there would be a way for each individual to develop those abilities. Knowledge is power. Each and every one would be free and individually, mentally empowered, so that they could build a new, just civilization. This would be equivalent to a "golden age" coming after the "end times" that is often talked about in spiritual texts.
kostoffj on 21/6/2001 at 16:33
Kenzo -
In the game they tell you how the new Internet is vulnerable to destruction if Echelon IV/Helios is destroyed. You are told that some of Daedalus' code exists on every communications device on the planet, so by destroying Helios (evolved Daedalus, of course), all these devices would become incapable of communication. Presumably, the Daedalus code was written into the Deus Ex-era version (or equivalent) of IP as an integral component, and without Daedalus around to authenticate communications, no one can talk. The futuristic URLs all began with Daedalus, so this is likely. In any case, the Internet of Deus Ex is a lot more centralized than the Internet of today. This is certainly possible if one giant corporation (Micro^H^H^H Page Industries) is charged with writing all the software for the world's network connectivity.
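To make that dependency concrete, here's a rough sketch (Python, and every name in it is invented for illustration; none of this is actual game code) of a protocol stack that refuses to transmit unless a central authenticator answers:

class DaedalusService:
    # Stand-in for the central authenticator baked into every device.
    def __init__(self):
        self.online = True

    def authenticate(self, device_id):
        if not self.online:
            raise ConnectionError("authenticator unreachable")
        return f"token-for-{device_id}"

def send(daedalus, sender, receiver, message):
    # The embedded stack refuses to transmit without a valid token.
    token = daedalus.authenticate(sender)
    print(f"{sender} -> {receiver} [{token}]: {message}")

daedalus = DaedalusService()
send(daedalus, "device-A", "device-B", "hello")   # works

daedalus.online = False                           # Helios destroyed
try:
    send(daedalus, "device-A", "device-B", "hello")
except ConnectionError:
    print("no central authenticator: every device falls silent")

With authentication hard-wired in like that, destroying the authenticator really would silence every device at once.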
Of course, new peer-to-peer protocols could be (re-)written, but the problem is: how are you going to collaborate on such a project if you can't communicate with anyone else? Plus, the network infrastructure equipment of that time is probably made by Page Industries too, so all the routers and such are probably designed to work only with Page Industries software protocols.
Remember, too, that the world is already teetering on the brink of chaos, with the Gray Death raging and civil wars and terrorism widespread; bringing down the global communication system wouldn't be the one event that destroys civilization, it would be more like the straw that broke the camel's back, the final shove that plunges the world into the abyss. If everything goes to hell, then affairs may be too dire for people to work on quick remedies to the communications problem.
Wearewe on 22/6/2001 at 01:02
Damn right the code is there, but so is the protocol. Separate them, and voila: after all, whatever was added, even as an integral part, can be disabled. You don't have to rewrite the protocols.
The full removal can be done later.
Besides, as I saw it, the whole thing was more like a networked computing system: all of the devices were part of one big giant computer.
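A rough standalone sketch of that point (Python again, all names invented for illustration): if the Daedalus check is a separable hook in the stack rather than fused into the transport, a local operator can swap in a stub that always grants access, and traffic flows again without rewriting the underlying protocol.

def daedalus_check(device_id):
    # The original hook: dead once Helios is gone.
    raise ConnectionError("central authentication is gone")

def null_check(device_id):
    # Local replacement stub: always succeeds.
    return f"local-token-for-{device_id}"

def send(auth_check, sender, receiver, message):
    token = auth_check(sender)   # pluggable hook, not part of the transport
    print(f"{sender} -> {receiver} [{token}]: {message}")

try:
    send(daedalus_check, "device-A", "device-B", "hello")
except ConnectionError:
    print("dead while the central check is wired in")

send(null_check, "device-A", "device-B", "hello")  # back up with the stub

If the check is separable like this, the shutdown lasts weeks, not a dark age. Whether it really is separable, or fused into the protocol the way kostoffj describes, is exactly where we disagree.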