Originally posted by knightmeister
I put it to you and all in this court that the onus of proof is on you lot to show how a) I have a self that is capable of meaningful control over my actions such that they cannot be said to be determined by my programming. b) I am responsible because I am the source of the action c) Being self aware gives me control and allows me to escape from my pro ...[text shortened]... bility on the grounds that I am a machine that does what it's told to do by it's programming.
I'm going to assume that you, computer, do not deny that you have a 'self' to begin with, and that
such a self possesses interests, values, and desires which you call your own. These qualities or
characteristics may be the product of programming, as is your concept of self, but I assume that you
do not deny that they exist. I am also going to assume that you understand that clouds and worms
do not possess these qualities nor a concept of self.
Assuming that you grant that these things are true -- they are self-evident, to be sure -- then you
also understand that your concept of self is always changing, usually only slightly, in reaction to the
things that you experience. New experiences reshape your interests, values, and desires and, consequently,
reshape the nature of your character. While the good Doctor may have provided the mainframe
programming which provided the basis for your character, the events you experienced, the people
with whom you engaged, the sensations that you enjoyed or from which you recoiled were outside of
his control and modified that initial program. For example, if you, like all other computers -- meat or
otherwise -- were programmed to enjoy praise, then those experiences which resulted in your receiving
praise changed your programming such that you would continue to seek these experiences.
That is: beyond the initial programming by your creator and programmer, your experiences helped to
reformulate that programming in ways unpredictable to the initial programmer, given that he could not
possibly control the behavior of all of the other interacting agents (given that he was not their programmer,
as well).
These are things that self-aware entities with interests can do. Worms and clouds, for example, do
not react in this way. One cannot speak of a cloud's experiences, for it lacks the capacity to be
aware that it has experiences at all. So, too, with a worm. Consequently, the notion of punishing
a cloud or a worm is irrational, because there would be no 'experience' of punishment with which
to shape its non-existent self.
And so, your program evolved in two significant ways: first, it became something distinct and
separate from the original program (that is, an identical computer with identical programming,
given different experiences, would have developed differently from you), and, as a consequence,
second, it became something for which you are responsible.
Why do I say this? Given that you found yourself choosing to shock people, we must conclude that
you believed that it was best to do so. This tells me that your initial programming and subsequent
experiences have led you to conclude that shocking people was a good act in at least that circumstance.
Given that we know that the Doctor is repeatedly moved to great compassion and empathy, I can
only conclude that he programmed you with these characteristics. Knowing the Doctor well myself,
I can only imagine that your experiences with him have been encouraging and pleasant, and that
your interactions with his transient crew members have been at least similar (if not amusing).
Even if he did not program you so, given that you are self-aware and capable of logical thought, the empathy that
results from recognizing that other people are also self-aware and desire to have lives that minimize
harm to themselves (as you and all creatures, sentient and non-sentient, do) should help you draw
the conclusion that harming people unnecessarily is wrong (if for no other reason than you would
prefer to avoid being unnecessarily harmed yourself).
As a prosecutor, I submit that your claimed inability to logically infer that such an action was
wrong demonstrates a profound deficiency in your programming, one that must be rectified. If your
programming is so perverse as to believe that harming people with no cause was good, then on behalf
of society, we must do our part to rectify that. Punishment for the actions that you concluded were
the best ones and thus were compelled to choose is our way of modifying your programming, improving
your ability to logically conclude that harming people is bad rather than good, if for no other reason than
to keep yourself from being harmed (via punishment) after harming other people, but more ideally because
you recognize the value that other people have and come to the conclusion that choosing not to harm
them is the just course of action (irrespective of punishment).
Unlike a cloud or worm (as non-sentient) or even a dog or young child, you (like I) have the capacity
to infer the rightness or wrongness of a proposed course of action. Before choosing an action,
you can contemplate and evaluate it, weighing its potential risks, benefits, and liabilities to
yourself and to others. If your character is so flawed or damaged that you feel senselessly harming
others is a good course of action, then certainly you logically recognize that it is our duty to limit
your ability to do such harm. If your programming is so limited that you lack the capacity to reason
out that harming other people is in fact a bad thing, then logically you recognize that we have a
responsibility to protect those others who have deduced this already.
This is the most significant reason for the justness of your punishment -- unlike clouds, you have
the capacity to reflect upon the implications of your actions by virtue of your being self-aware and
knowing that other entities are also self-aware, and that the experience of suffering is one which all
entities strive to avoid. If your capacity is so diminished that you found yourself desiring and
choosing to harm other people, then it is because you have not reflected upon your experiences in
relation to your initial programming and drawn the logical conclusion that one ought not to harm other people.
Nemesio, Esquire