24 May '07 19:20>
Originally posted by twhitehead
QUOTE---------------
I am more than happy with the word exact. My reason for using 'general' had to do with the random component.
In any case, what you describe as free will is not the free will we experience. The free will we experience is one that gives us the feeling that we do have two or more very real possible actions that we could do and that it is us that needs ...[text shortened]... s what I call free will. Your repeated insistence that I have 'feelings' otherwise is false.
What I have described allows me, as a meat computer, to have two real possible actions presented externally and for the meat computer (me) to make a choice. The fact that the meat computer may be programmed to always make the same choice given the set of initial conditions does not take away the choice. That is what I experience and that is what I call free will. Your repeated insistence that I have 'feelings' otherwise is false. -----whitey
RESPONSE----------------------
So you accept that you are programmed to make the same choice given identical circumstances even though externally there are other options available. Surely then the programming takes away one of the choices.
If you as a meat computer are programmed to make choice A, then although choices B-Z may be theoretically available externally, they are not really available internally, or as an actual, real possibility.
In theory, a computer running Microsoft Word might start throwing up an options screen from Rome: Total War from within that Word programming, but we know this isn't going to happen, so there is no point putting it forward as a real option. We know the computer will do what it is programmed to do.
You make a choice as a meat computer, but your logic tells you that you can only make one choice (the choice you are programmed to make). Yet you insist on talking about having two "real" possible actions. In order for the other choice to be real and possible, you would have to have different programming, which you don't have. This makes the other choice impossible.
This is the "if I were a different person, then I would choose differently" version of free will. But it's not free will, because in this version the choice is programmed and inevitable. There is little point in saying "if I were a different person then I would choose differently" because... ahem... you are not a different person. The fact that you are a specific meat computer with specific programming means the other "real possible actions" can be ruled out as possibilities. What they are is "fantasy theoretical actions" based on another reality altogether. Your internal programming has eliminated these options as being possible.
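The point about programming can be put in a toy program (a minimal sketch; the function names and the choice rule are hypothetical, not anything twhitehead specified): a deterministic system may be presented with several options externally, but a fixed rule over fixed inputs can only ever land on one of them.

```python
def programmed_rule(inputs):
    # Hypothetical stand-in for the meat computer's "programming":
    # a fixed, repeatable rule with no random component.
    return sum(inputs) % 3

def meat_computer(inputs):
    # Three options are "available" externally...
    options = ["A", "B", "C"]
    # ...but identical inputs always yield the identical choice,
    # so the other options are never actually reachable.
    return options[programmed_rule(inputs)]

# Same initial conditions, same choice, every single run:
print(meat_computer([1, 2, 3]))  # the programmed choice
print(meat_computer([1, 2, 3]))  # identical, inevitably
```

Run it as often as you like with the same inputs: options B and C sit on the menu, but the programming has already excluded them.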
Now notice what we experience. We experience a sense of regret, guilt, relief, agonising, etc. over our choices. We really do believe that the other choice could have been made by us. It feels (sometimes) as if we reach a fork in the road and we really can go either way. With programming, you logically can't have that other choice, and you cannot hold a programmed meat computer morally accountable for its actions if only one programmed choice is possible. All systems make choices of a kind, even trees and worms, but only systems that have more than one REAL option can be held accountable.