Playing god

Some of you may have played "games" like Second Life. If not, you probably know what it is. I'd like you to imagine that it's a few hundred years from now and that I'm a computer game designer working on an advanced version of this game.

Computers have changed a lot. Storage is basically unlimited and processing power is beyond anything you could comprehend. I finally have enough resources at my disposal that I can actually create sentience within the "game". That is, I can populate it with creatures who can pass a Turing test, who apparently have "free will". These pockets of sentience evolved from some of my original designs, and are pretty darn complicated, but of course I can still run any of it through a debugger if I want to. I can also change the physics of their little Universe...

In other words, I can "play god".

Would I? Would you? I can answer that I definitely would not, though certainly there would be temptation. But if I really believed that this computer-generated world was very real to its software denizens, and really thought of them as sentient beings, how could I? I might want to. I might even sometimes try to convince myself that interfering in some specific place would be more moral than remaining hands-off. But I would hope that I would resist that temptation, because it's all downhill from there.

The atheists in the group have likely already guessed what comes next. Some of those little programs (that's what they are, right?) have become religious. They have imagined a "Creator" and are busily begging it for various actions and non-actions, any of which I could easily provide - if I wanted to "play god".

How would you feel about those beseechers of favors great and small? I'd feel sorry for them. Sorry that they don't understand that it was all only a game. Sorry that they weren't smart enough to see how the game works yet - and I'd wonder if they ever could be smart enough: can a computer program realize that it is just a bit of code?

I wouldn't play god. If these little programs were as bent on self destruction as we seem to be, I don't think I could even watch. I couldn't turn the game off - that would be immoral too. I'd just let it run by itself and make a promise to myself to never create anything like that again. I'd also warn everyone I knew that they shouldn't dabble with such games either.

How about you?

Got something to add? Send me email.



© Anthony Lawrence

Sun Oct 11 17:54:11 2009: 7179   TonyLawrence

Theists in the group may insist that I can't really create sentience - only the appearance of such.

Fine. You go ahead and explain that to folks within the game.

Sun Oct 11 19:00:29 2009: 7180   TonyLawrence

A specific question for the theists: assuming you believed that the little programs really were sentient, which ones would you have more respect for - the theists or non-theists?

The "theist code" is correct, at least with regard to their creation. They are wrong about my attitude toward them, though, and are engaging in pointless activities because of that. The "atheist code" has it all wrong, but that ignorance is harmless.

I'd also like to ask - again assuming that you can accept the sentience or at least accept that *I* believe them to be sentient - do you think that I am acting correctly by ignoring them entirely?

Sun Oct 11 19:11:05 2009: 7181   TonyLawrence

Many interesting questions flow from this. Assuming again that you at least accept that I believe these creatures are sentient, suppose that I do interfere to protect them from activities that would harm other important programs that are running. I ask them not to do certain things, but they do them anyway.

What's my reaction now? I could reprogram so that they CAN'T do those things - or I could "punish" them. I think I'd just have to accept that I screwed up my original design... and suffer the consequences. What would you do?

Sun Oct 11 19:33:42 2009: 7182   TonyLawrence

One more:

Early on, before you understand that the computer players are "real", you do "play god" now and then. You stop when you realize what's going on, but later on your young daughter finds your prototype and, being a clever young girl, also plays at "god" a little. You find out, and of course remove her from the game. Unfortunately, her actions have increased religious fervor. What now? Reset the game perhaps? Let it run on? Interfere again?

Sun Oct 11 19:35:54 2009: 7183   BrettLegree


Well, "life" is interesting enough for me that I don't know if I'd bother "playing" in another one.

Especially if I could be "god".

I have enough problems without making problems for other people!

Sun Oct 11 19:57:21 2009: 7184   TonyLawrence

Well, remember that here you were the designer. As you are an engineer, here's a moral question for you:

You created this, but now you realize that the inhabitants of your computer world are sentient. What now? Hide it? Destroy it? Give it to your employer - not your problem? Try to reason with an uncaring boss who wants to release it just so people CAN "play god"?

Sun Oct 11 20:08:35 2009: 7185   BrettLegree

Good point.

Perhaps "god" was just following orders from his boss. Perhaps he knew what he was creating was somewhat flawed, but he had grown tired of objecting, and just decided that he would "work to rule". He'd go home at the end of the day and not worry about it.

I know that I do things at work that I most definitely would not choose to do on my own.

(Which makes me wonder why I create such crap. Oh, of course I know why - because I get paid to do it...)

Sun Oct 11 20:12:32 2009: 7186   TonyLawrence

So your answer would be "Not my problem" ?

Sun Oct 11 20:16:19 2009: 7187   TonyLawrence

By the way, we aren't talking about your creation being "flawed" - unless you think being sentient is a flaw.

I'm asking what you do - you created this, you realize the computer players are sentient.

You are treating this as though it were a parable of some real god - it's not. This is a game, or was supposed to be. What do you do now?

Sun Oct 11 20:17:00 2009: 7188   BrettLegree

Hopefully my answer would have been "I quit" before I created it in the first place!

Interesting to think along these lines.

I don't think that any of us can guarantee that anything we do won't have negative consequences for someone in the future.

For instance - I used to supervise a facility that makes medical isotopes for cancer therapy - a life saving product, right?

But. The byproduct is highly radioactive and chemically toxic, and will require an incredible amount of work to store and eventually dispose of safely. Knowing what I know, I can't guarantee that the byproduct won't cause a cancer and kill another innocent someday...


Sun Oct 11 20:28:10 2009: 7189   BrettLegree

Certainly sentience isn't a flaw.

Hmm. Should I turn it off, or turn my back on it, or whatever?

Well. It is still a simulation, right? And I've created sentient "life" inside of it.

I suppose one thing to think about is that I've already helped to create four sentient beings (my children).

And to some extent, I can change the physics of their little Universe as well (within reason - I am "Dad", of course!)

I think I need another beer!

Mon Oct 12 00:51:37 2009: 7192   BigDumbDinosaur

sentient: responsive to or conscious of sense impressions

Taken with a dose of artistic license, one could, I suppose, ascribe sentience to just about anything that reacts to a stimulus. Stretching it a bit further, a computer running some sort of advanced AI software could be accused of being sentient. Consider Star Trek's Mr. Data...

Now, if one considers some of our politicians... <Grin>

As for playing God, first you have to believe in one. Otherwise, what you are doing as you "play God" is illusory and probably not dissimilar to what Stalin was doing back in the day.

Mon Oct 12 01:33:53 2009: 7194   TonyLawrence

Well, that's a whole different subject, but I do believe that what we call "consciousness" is nothing but feedback - a thermometer is "conscious" - in a very primitive way.

But no, you don't have to be a theist to play this thought experiment. For atheists like us, the question isn't really about gods, it's about morals.

Mon Oct 12 10:03:20 2009: 7197   SteveP

Not sure whether I should be posting a comment since I almost certainly won't have time to read any responses or subsequent comments.

But perhaps I'm just too vain to avoid it!

I'm interested in knowing what kind of a god you (or I) might be.

Would we care about our creations?
Would we want a relationship with them?
Would we love them?

I guess the answers might depend on how much we think each one is *just* a complex set of ultimately predictable code or something genuinely capable of independent thought, action, emotion, etc.

And to think, I started following this blog for perl tips.


Mon Oct 12 11:47:03 2009: 7198   TonyLawrence

If you are asking those questions, it seems like you are leaning toward interfering in the game?

Tue Oct 13 16:48:27 2009: 7223   Cyr

Interesting thought experiment.
It seems that this kind of program would reveal ourselves to ourselves in an uncomfortably clear manner.

Do you have the ability to take backups of the system?
If so, are you legally/morally/ethically obliged to do so?
If you do not take backups, or if you run the program on an unpatched system, could you be tried for criminal negligence?

Would any moral qualms about messing with your virtual lifeforms REALLY matter if you have backups?
That opens up a whole big can of worms there, having omnipotence over commodity life without any consequences.
If you raise your life to a perfect, harmonious utopia, would it matter if you then arbitrarily afflicted them with plagues, war, SCO/Microsoft, and a constantly changing gravitational constant for your own amusement, if you could just restore the utopia?

Why would you have to "accept that I screwed up my original design... and suffer the consequences"?
Since you can create new life by running a program, why not reinstall the program with tweaked settings and try again?
On what grounds would you object?

Tue Oct 13 17:00:17 2009: 7224   TonyLawrence

Indeed, that is a good question: Why would I have to "accept that I screwed up my original design... and suffer the consequences"?

I just can't stand the idea of messing with someone's life. Why? I don't fully know. I'm an absolute hard-core atheist and honestly was wondering here if theists or atheists would be more likely to "play god". I can see it going either way, but it's interesting that reactions to this are so varied.

Tue Oct 13 17:31:53 2009: 7225   TonyLawrence

There are also questions about the AI inhabitants.

How would I feel if, as such a player, I was suddenly informed about the reality of my existence?

How would I feel if the designer provided an "afterlife" in which the truth was then revealed?

I can tell you that I'd be angry. And of course more angry because I'd have no power, existing entirely at the whim of that game designer.

I'd be even more angry if the designer claimed to be a god-thing - because I don't believe in such a thing, it would have to be an impostor and I'd be spitting-blood angry!

Theists, of course, would be happy to continue the deception, which would be even more infuriating for hard-core atheists!

This all might make a decent sci-fi story if I had the energy to write it :-)

Sat Oct 17 19:49:14 2009: 7284   SteveP

> If you are asking those questions, it seems like you are leaning toward interfering in the game?

Haven't actually decided yet but if I didn't care at all then there doesn't seem to be much of a question. I could reboot the game whenever I liked. And as for morality, who is setting the standard by which I, as god, am being judged? The programmed beings? How do they decide what is right and wrong?

If I *did* care then the blind watchmaker method - wind up the world and let it run untended - seems unsatisfactory somehow.

I'll try to find time to read the story. No promises though!

Fri Apr 9 09:03:58 2010: 8388   anonymous


If you could run an entity with free will through a debugger, then you would have the programmatic keys to sentience, and understanding of how it works. Therefore you could hack/modify this sentience and allow it to not be hurt or even offended by the futility of it all. Theoretically this is even possible within ourselves, just as it has always been.

Even way off into this future, there will still be nothing new under the sun.
