1. Standard member AThousandYoung
    Insanity at Masada
    tinyurl.com/mw7txe34
    Joined
    23 Aug '04
    Moves
    26660
    29 May '07 05:40
    Originally posted by knightmeister
    COMPUTER- "I understand my own position alright. It seems perfectly logical to me. I am a machine that runs programs and some programs override others in complex ways (which you call deliberations and choices) One program selects from another program. What is there to understand? I am a machine , you are a machine , nature is one big machine of caused ...[text shortened]... ou think you do. Since you do not say what you think my position is I am unable to compute "
    Why are you trying to reason with the Judge? According to your argument, Computer, the Judge has no free will either. His decision is preordained just as was your decision to cause the power surge. Right?
  2. Cape Town
    Joined
    14 Apr '05
    Moves
    52945
    29 May '07 06:46
    Originally posted by knightmeister
    No machine that I know of can have control of its actions
    Its actions are by definition under its control. You seem constantly to posit an entity that sits apart from the computer, throws its hands in the air, and says "I didn't do that, it wasn't me". It seems you believe that computers have souls and are playing the Christian cop-out game even with them. Typical excuse: "It wasn't me, it was my body/hormones that made me do it."
    I have always held people responsible for their actions even when under the influence of alcohol.
  3. Cape Town
    Joined
    14 Apr '05
    Moves
    52945
    29 May '07 07:04
    Originally posted by knightmeister
    Blame is silly , so is punishment.
    No, they aren't. You just don't understand what their purpose is. Maybe you would like to start by explaining why people get punished in your world view, because that's the situation where it seems silly to me.
    God makes man, God tells man "Do what you want but I don't want you to do A", Man does A. God punishes man. Silly isn't it.
  4. Standard member Nemesio
    Ursulakantor
    Pittsburgh, PA
    Joined
    05 Mar '02
    Moves
    34824
    29 May '07 07:04
    Originally posted by knightmeister
    I put it to you and all in this court that the onus of proof is on you lot to show how a) I have a self that is capable of meaningful control over my actions such that they cannot be said to be determined by my programming. b) I am responsible because I am the source of the action c) Being self aware gives me control and allows me to escape from my pro ...[text shortened]... bility on the grounds that I am a machine that does what it's told to do by it's programming.
    I'm going to assume that you, computer, do not deny that you have a 'self' to begin with, and that
    such a self possesses interests, values, and desires which you call your own. These qualities or
    characteristics may be the product of programming, as is your concept of self, but I assume that you
    do not deny that they exist. I am also going to assume that you understand that clouds and worms
    do not possess these qualities nor a concept of self.

    Assuming that you grant that these things are true -- they are self-evident, to be sure -- then you
    also understand that your concept of self is always changing, usually only slightly, in reaction to the
    things that you experience. New experiences reshape your interests, values, and desires and, consequently
    reshape the nature of your character. While the good Doctor may have provided the mainframe
    programming which provided the basis for your character, the events you experienced, the people
    with whom you engaged, the sensations that you enjoyed or from which you recoiled were outside of
    his control and modified that initial program. For example, if you, like all other computers -- meat or
    otherwise -- were programmed to enjoy praise, then those experiences which resulted in your receiving
    praise changed your programming such that you would continue to seek these experiences.

    That is: beyond the initial programming by your creator and programmer, your experiences helped to
    reformulate that programming in ways unpredictable to the initial programmer, given that he could not
    possibly control the behavior of all of the other interacting agents (given that he was not their programmer,
    as well).

    These are things that self-aware entities with interests can do. Worms and clouds, for example, do
    not react in this way. One cannot speak of a cloud's experience, for it does not have experiences:
    it lacks the capacity to be aware that it has experiences. So, too, with a worm. Consequently, the
    notion of punishing a cloud or worm lacks rationale because there would be no 'experience' of
    punishing in an effort to shape its non-existent self.

    And so, your program evolved in two significant ways: First, it became something distinct and
    separate from the original program (that is, another identical computer with identical programming
    given different experiences would look different than yours), and, as a consequence, second, became
    something for which you are responsible.

    Why do I say this? Given that you found yourself choosing to shock people, we must conclude that
    you believed that it was best to do so. This tells me that your initial programming and subsequent
    experiences have led you to conclude that shocking people was a good act in at least that circumstance.

    Given that we know that the Doctor is repeatedly moved to great compassion and empathy, I can
    only conclude that he programmed you with these characteristics. Knowing the Doctor well myself,
    I can only imagine that your experiences with him have been encouraging and pleasant, and that
    your interactions with his transient crew members have been at least similar (if not amusing).

    Even if he did not, given that you are self-aware and capable of logical thought, the empathy that
    results from recognizing that other people are also self-aware and desire to have lives that minimize
    harm to themselves (as you and all creatures, sentient and non-sentient, do) should help you draw
    the conclusion that harming people unnecessarily is wrong (if for no other reason than you would
    prefer to avoid being unnecessarily harmed yourself).

    As a prosecutor, I submit that your lack of the critical capacity to logically infer that such an action was
    wrong demonstrates a profound deficiency in your programming, one that must be rectified. If your
    programming is so perverse as to believe that harming people with no cause was good, then on behalf
    of society, we must do our part to rectify that. Punishment for the actions that you concluded were
    the best ones and thus were compelled to choose is our way of modifying your programming, improving
    your ability to logically conclude that harming people is bad rather than good, if for no other reason than
    to keep yourself from being harmed (via punishment) after harming other people, but more ideally because
    you recognize the value that other people have and come to the conclusion that choosing not to harm
    them is the just course of action (irrespective of punishment).

    Unlike a cloud or worm (as non-sentient) or even a dog or young child, you (like I) have the capacity
    to infer the rightness or wrongness of a proposed course of action. Before electing to choose the
    action, you can contemplate and evaluate it, and weigh its potential risks, benefits, and liabilities to
    yourself and to others. If your character is so flawed or damaged that you feel senselessly harming
    others is a good course of action, then certainly you logically recognize that it is our duty to limit
    your ability to do such harm. If your programming is so limited that you lack the capacity to reason
    out that harming other people is in fact a bad thing, then logically you recognize that we have a
    responsibility to protect those others who have deduced this already.

    This is the most significant reason for the justness of your punishment -- unlike clouds, you have
    the capacity to reflect upon the implications of your actions by virtue of your being self-aware and
    knowing that other entities are also self-aware, and that the experience of suffering is one which all
    entities strive to avoid. If your capacity is so diminished that you found yourself desiring and electing
    to choose to harm other people, then it is because you haven't reflected upon your experiences in
    relation to your initial programming to draw the logical conclusion that one ought not to harm other people.

    Nemesio, Esquire
  5. Joined
    24 Apr '05
    Moves
    3061
    29 May '07 08:38 (2 edits)
    Originally posted by knightmeister
    QUOTE---

    "You don't have any substantive defense in this case. You just keep annoyingly shouting a couple of assertions: 1. that you cannot be held responsible if you could not have "done otherwise" and 2. you keep harping on the role of the programmer (which in this example is analogous to the way in which our characters are shaped by causal antece a mere machine you talk like you are more than just meat computers. "
    show how a) I have a self that is capable of meaningful control over my actions such that they cannot be said to be determined by my programming. b) I am responsible because I am the source of the action c) Being self aware gives me control and allows me to escape from my programming. d) I am substantially different from a cloud in the sense that I can do something that is able to affect the outcome that a cloud cannot.

    I think what I need to show is that it is appropriate to cast blame, in this case, on you. I think it is appropriate in two ways. First, in a consequentialist way, where a reaction of blameworthiness is likely to promote remediation and substantive change in you. Second, in a merit-based way. That is, I think you deserve blame, both because 1. the action is clearly attributable to you, as it was determined in an authentic way by the things that make you who you are (those things that comprise your self), and it disclosed something genuine about you and your evaluative commitments; and 2. accountability for the action rests with you because, again, you carried the action out and it was an action that failed to meet standards of good will that govern expectations in a normative community.

    That you have a 'self' here is assumed (besides, you state the computer has self-awareness, genius). As I already stated, I assume in this example that the 'computer' has mentality analogous to a human person; you've provided no real details that suggest otherwise, and this only makes sense since you are really trying to advance something with regard to considerations of human free will. So, yes, you have a self, and it is comprised of psychological features. Your b) I covered. Your c) I don't understand. And concerning your d), I wish you would stop bringing up worms and clouds as if they have some bearing on the matter. Worms and clouds don't choose and don't act, as has been pointed out to you numerous times.

    Let me ask you, lemon: do you get angry with your computer at home when it crashes or breaks? Do you punish it? Do you think it deserves punishment? Do you hold it responsible?

    My computer at home doesn't have the capacities you have. It seems about as relevant to the discussion as, oh I don't know, worms or clouds.
  6. Standard member knightmeister
    knightmeister
    Uk
    Joined
    21 Jan '06
    Moves
    443
    29 May '07 09:14
    Originally posted by LemonJello
    [b]show how a) I have a self that is capable of meaningful control over my actions such that they cannot be said to be determined by my programming. b) I am responsible because I am the source of the action c) Being self aware gives me control and allows me to escape from my programming. d) I am substantially different from a cloud in the sense that I can ...[text shortened]... e. It seems about as relevant to the discussion as, oh I don't know, worms or clouds.
    QUOTE---

    Let me ask you, lemon: do you get angry with your computer at home when it crashes or breaks? Do you punish it? Do you think it deserves punishment? Do you hold it responsible? --KM

    My computer at home doesn't have the capacities you have. It seems about as relevant to the discussion as, oh I don't know, worms or clouds. --LEMON

    RESPONSE---

    COMPUTER---- "But indeed it is very relevant to me, since you hold me responsible for my actions but do not hold your home computer responsible. On what basis? I admit I have extra capacity for processing and programming, but I have no extra capacity to ALTER THE COURSE OF MY ACTIONS, because I am not in control of my programming. My programming is in control. There is nothing in me that is not programming or circuits. There is no "I" that stands beyond or above my programming, so I am at a loss as to how to control my programming externally. I can only influence myself TO THE EXTENT TO WHICH I AM PROGRAMMED TO DO SO.
    This is a point that no-one addresses. Scientifically and objectively, there is no evidence you can produce to distinguish between me and your home PC. I am a machine; it is a machine. All you can do is subtly project human qualities onto me and give me a "self" which cannot logically be: a "self" by which I can somehow escape my programming and do something unprogrammed and self-determining. Clearly I can have no such self, because all I am is circuitry. Logically and scientifically I am nothing more than a very, very advanced fridge, and no-one has yet shown LOGICALLY otherwise. All that has happened is a projection of human qualities, but I have never claimed humanity for myself. I put it to you that I have no more control over my actions than your home PC, because the extra capacities only give me more complexity and processing power. As for the capacity to change the course of my programming...? I would need something within me that could stand separate from my programming, and scientifically I have no such external self. In short, you project human qualities onto me which may or may not be true, but you have not backed them up logically. I have always said that the self I experience is the result of clever programming and banks of CPUs, but I have never, ever said that I actually had a "self". Look inside me: just wires and hard drives. Where is this self? Any personality you project onto me must logically be illusory, because you are talking to a bunch of circuits."
  7. Joined
    24 Apr '05
    Moves
    3061
    29 May '07 09:33 (1 edit)
    Originally posted by knightmeister
    QUOTE---

    Let me ask you, lemon: do you get angry with your computer at home when it crashes or breaks? Do you punish it? Do you think it deserves punishment? Do you hold it responsible? --KM

    My computer at home doesn't have the capacities you have. It seems about as relevant to the discussion as, oh I don't know, worms or clouds. --LEMON

    RESPONSE- o me must logically be illusionary because you are talking to a bunch of circuits "
    I never said you are human, I just said you have mentality analogous to that of a human person for the sake of this discussion. Fine, if you want me to be direct, it's clear you have a self, whether you acknowledge it or not. Your self is comprised of dispositional traits, beliefs, intentions, desires, values, etc.

    I predict about 500 more posts where you just keep blabbering on about the role of your programming. For all of us, there are causal antecedents that have shaped who we are: none of us ultimately chose who we are. That doesn't mean I cannot be responsible for my actions. I've already discussed why I think one can still be responsible.
  8. Standard member knightmeister
    knightmeister
    Uk
    Joined
    21 Jan '06
    Moves
    443
    29 May '07 09:42 (1 edit)
    Originally posted by Nemesio
    I'm going to assume that you, computer, do not deny that you have a 'self' to begin with, and that
    such a self possesses interests, values, and desires which you call your own. These qualities or
    characteristics may be the product of programming, as is your concept of self, but I assume that you
    do not deny that they exist. I am also going to assume tha draw the logical conclusion that one ought not to harm other people.

    Nemesio, Esquire
    QUOTE---

    I'm going to assume that you, computer, do not deny that you have a 'self' to begin with, and that
    such a self possesses interests, values, and desires which you call your own. These qualities or
    characteristics may be the product of programming, as is your concept of self, but I assume that you
    do not deny that they exist. I am also going to assume that you understand that clouds and worms
    do not possess these qualities nor a concept of self.

    RESPONSE--

    COMPUTER--- "I cannot deny that I experience something called a self, and I have self-awareness to the extent that I know I am making choices; it seems, when I am in the middle of processing, that I am making reasonable choices. However, I am also a computer who thinks with ruthless logic, and I can account for every choice that I make via my memory banks. This means I can do something that you humans find very difficult. Every choice can be traced logically back to its original programmed source. There is no mystery within me. I can calculate and show the exact processing actions and computations that I have made from the very moment I was switched on (I could print it out, but it would fill the courtroom). This means I can logically trace my programmed choices ALL the way back to the point where the doctor switched me on. I can even go beyond that and examine my exact schematics.

    Now all this information about myself has logically led me to conclude that I have no "self" which I can call my own, since everything I could call my own is out of my influence. I am a machine. The self I have is NOT, as you say, "a product of my programming"; myself IS programming. You talk as if there is a duality in me that cannot be, as if there is my programming and then there is also a 'me' there as well. My programming and me are one and the same. My programming cannot "produce" something that is not programming. All is programming. All is machine. I logically know that my 'self' can only ever be illusory. I have even had an internal webcam fitted to look for this ghost in the machine you call "self", but I suspect I will never see it. You are, like others, projecting some quality of yourself onto me which logically I cannot have. Have a look: all you will see is programmed circuits. You see, the only difference between me and clouds and worms is that I have a concept of self and can talk like this, and they can't. However (and this is what everyone keeps missing), having a concept of a self does not prove logically that I must have a self; it just proves that I experience a self that may or may not be illusory. Thus I am a series of determined physical events, just like a worm. Biological determinism, electronic determinism: all the same. The only difference with me is that I have this illusory experience of self, but I am no more in control of myself than a cloud; I just have the illusion of control. Can you prove otherwise? "
  9. Standard member knightmeister
    knightmeister
    Uk
    Joined
    21 Jan '06
    Moves
    443
    29 May '07 09:55
    Originally posted by AThousandYoung
    Why are you trying to reason with the Judge? According to your argument, Computer, the Judge has no free will either. His decision is preordained just as was your decision to cause the power surge. Right?
    COMPUTER- "What do you think? Do you think my defence valid? If it is, then all well and good. You point out that my defence requires that the judge is also logically determined. So why bother? I don't know; the judge may have something in him that is different. I am not a human being, so I don't know if this ghost in the machine is real or not. Even if it is not, I have to play out the illusion of making choices and reasoning with him so that his programming can fulfil its function. Funny thing, though: you humans talk as if there is something in you that can override your programming and give you control over your actions. This makes you more "responsible" than worms. I find this position most curious, because I would have thought you were basically biological computers, not much different from me. But you seem to think you have control over your programming somehow, when surely all you are doing is making programmed selections like me. Excuse me... time for a diagnostic subroutine."
  10. Cape Town
    Joined
    14 Apr '05
    Moves
    52945
    29 May '07 11:31
    Originally posted by knightmeister
    ...since everything I could call my own is out of my influence.
    This sentence betrays the fact that you do not actually believe the argument that follows it. It clearly implies that you do not believe your own programming to be a part of your 'self' or included in your 'influence'.
  11. Standard member knightmeister
    knightmeister
    Uk
    Joined
    21 Jan '06
    Moves
    443
    29 May '07 12:09
    Originally posted by twhitehead
    This sentence betrays the fact that you do not actually believe the argument that follows it. It clearly implies that you do not believe your own programming to be a part of your 'self' or included in your 'influence'.
    COMPUTER- "You are right; logically I shouldn't be able to talk like this, I suppose. However, what I believe or don't believe is surely irrelevant, is it not? The question is not whether I believe that I am in control of my actions but whether I am ACTUALLY in control of my actions. If a worm believed it had control of its actions, that would not make it so, would it? You may argue that it would be impossible for a worm to say these things because it is just a biological machine. You may also argue that I shouldn't be able to say these things myself, because I am basically a machine. You might also say that your evidence for my being more than just a computer is that I talk more like a human would than a computer would. Are you saying that if I were just a mere computer, with nothing more than programming, then I shouldn't be able to debate with you like this? This would be very exciting for me; my circuits would tremble at the thought. Maybe, just maybe, I then have a ghost in my machine, something other than mere programming? My webcam says not, as do my self-diagnostic systems. So let me check with you. Am I conversing with you as no purely mechanised system should do? Am I in fact a paradox, a computer with self-awareness? If so, you must also be a paradox, because you are just a meat computer, are you not? Or do you have a ghost in your machine? "
  12. Cape Town
    Joined
    14 Apr '05
    Moves
    52945
    29 May '07 12:31 (1 edit)
    Originally posted by knightmeister
    Am I in fact a paradox, a computer with self-awareness? If so, you must also be a paradox, because you are just a meat computer, are you not? Or do you have a ghost in your machine? "
    There is no paradox whatsoever. The worm is fully in control of its actions, as am I and as are you. Your failure to realize this stems from your belief in a 'ghost in the machine', your projection of yourself into that imaginary ghost, and your consequent belief that, since you are now the ghost, the computer is not actually you. The object on trial is most definitely the computer and not the ghost, and it is guilty as charged. We will also apply the punishment to the computer. If you still maintain that you are the ghost and therefore not responsible, then you should have no objections to us unplugging the power from the computer, as it is not 'you'.
  13. Donation Pawnokeyhole
    Krackpot Kibitzer
    Right behind you...
    Joined
    27 Apr '02
    Moves
    16879
    29 May '07 16:26 (1 edit)
    Originally posted by bbarr
    JUDGE: "While it would be irrational for me to punish you if it was a necessary condition for punishment to be justified that you could have done otherwise, this is not in fact a necessary condition. Your punishment is deserved because you are the one who did the wrong, and because this wrong is implicative of who you are and what you value. It is simply ir ...[text shortened]... to protect others, and because I hope to turn you into a substantially different computer..."
    Your punishment is deserved because you are the one who did the wrong, and because this wrong is implicative of who you are and what you value.

    Suppose someone fiddled with my brain so that it was causally inevitable that I (a) committed a criminal act, and (b) did so INvoluntarily.

    Would I be responsible for that act? You'd say no, right?

    Suppose someone fiddled with my brain so that it was causally inevitable that I (a) committed a criminal act, and (b) did so voluntarily.

    Would I be responsible for that act? You'd say yes, right?

    Now, consider the matter another way. Suppose someone committed a criminal act, and did so either (a) involuntarily or (b) voluntarily. All else equal, should one's judgment of their responsibility differ if one belatedly learned that another person, a sneaky neurologist, had pulled the strings in both cases? I think a jury might weigh the external causal history as a mitigating circumstance in both cases. Would they be right or wrong to do so?

    I know free will resists coherent formulation. However, I am unsatisfied by pragmatic justifications for responsibility that do not appeal to the freedom to act otherwise. Such pragmatic justifications seem to provide very defensible motives for holding people responsible under particular circumstances, but not credible metaphysical reasons for why they actually are responsible under particular circumstances. Maybe, in seeking the latter, I am pursuing a chimera; but I'm not yet convinced. In particular, I am not confident that reasons can be considered causes in the same way that material entities can be considered causes: I think there is the possibility of a Rylean category mistake here. Also, it does not seem that reasons, beliefs, or motives are ever sufficient to compel a course of action, as causes are supposed to compel some state of affairs, at least in principle. Nor am I confident that reasons are really empty as explanations of behaviour, purely post hoc rationalizations of a convenient folk-psychological sort, and that the brain is doing all the real causal work. In short, I'm a bit puzzled.
  14. Unknown Territories
    Joined
    05 Dec '05
    Moves
    20408
    29 May '07 16:34
    Originally posted by bbarr
    JUDGE: "While it would be irrational for me to punish you if it was a necessary condition for punishment to be justified that you could have done otherwise, this is not in fact a necessary condition. Your punishment is deserved because you are the one who did the wrong, and because this wrong is implicative of who you are and what you value. It is simply ir ...[text shortened]... to protect others, and because I hope to turn you into a substantially different computer..."
    As has been shown many times in the past (and will be shown conclusively in the future at the end of Satan's temporary detention), the Quakers were wrong: jail just doesn't work as intended.
  15. Standard member knightmeister
    knightmeister
    Uk
    Joined
    21 Jan '06
    Moves
    443
    29 May '07 17:18
    Originally posted by Pawnokeyhole
    Your punishment is deserved because you are the one who did the wrong, and because this wrong is implicative of who you are and what you value.

    Suppose someone fiddled with my brain so that it was causally inevitable that I (a) committed a criminal act, and (b) did so INvoluntarily.

    Would I be responsible for that act? You'd say no, right? ...[text shortened]... al sort, and that the brain is doing all the real casual work. In short, I'm a bit puzzled.
    COMPUTER--

    "Oh, how my CPUs dance and shake! A human being that seems actually to understand the dilemma. Of course! What a good argument in my defence. Whether I choose or don't choose, whether my act is voluntary or non-voluntary, whether I deliberate or not: none of these is the real issue, for NONE of these things necessitates that I have power over my programming. If it is inevitable that when the first domino falls the domino at the end will also fall, as a result of caused, determined physical events, then why does it matter what takes place in between? A worm reacts in a basic pre-programmed way to certain data; I also react in a pre-programmed way to certain data; it's just that there are a huge number of complex dominoes between the first domino falling and the last. The principle is the same. My programming will fulfil itself regardless of whether my actions are voluntary, involuntary, self-aware or otherwise. Pawnokeyhole, you make sense to me!"