1. Standard member KellyJay
    Walk your Faith
    USA
    Joined
    24 May '04
    Moves
    157807
    31 Jul '06 14:31
    Originally posted by Pawnokeyhole
    Can't you distinguish an assertion from a question?

    Can you specify the properties of neurons that guarantee consciousness, or those of silicon circuits that forbid it?

    If not, why are you so certain of your contrary position?
    Can you talk about all that is required for consciousness? We can talk
    about the current flow through a processor, all that goes into it, but
    it is just current through a processor. Is there a magical circuit
    design that turns that current into a real thought that makes the
    computer become aware? If not, the only thing man can do is simply
    create a faster electrical abacus with the ability to program it.
    Kelly
  2. Standard member KellyJay
    Walk your Faith
    USA
    Joined
    24 May '04
    Moves
    157807
    31 Jul '06 14:31
    Originally posted by Mixo
    What was this event?
    The fall of man.
    Kelly
  3. The Tao Temple
    Joined
    08 Mar '06
    Moves
    33857
    31 Jul '06 18:48
    Originally posted by KellyJay
    The fall of man.
    Kelly
    Is this the expulsion from Eden? If so, do you think any of the Old Testament accounts are just allegorical rather than actual?
  4. Standard member Halitose
    I stink, ergo I am
    On the rebound
    Joined
    14 Jul '05
    Moves
    4464
    31 Jul '06 19:31, 3 edits
    Originally posted by Pawnokeyhole
    *sighs*

    Note: The below differs slightly from the version.

    Okay, suppose I make beings--robots if you like, something else if you don't--with roughly equally balanced natural tendencies to be nasty or nice. Some people might think this approximates humans.

    I leave these beings to interact with humans in significant ways.

    But they also have s ...[text shortened]... other reasons that eliminate that liability. But, all else equal, He would be liable.
    This is the point I voiced from the beginning as to the disingenuity -- I hope it is merely subjective bias -- of the robot analogy:

    How can a creator be liable for the harm his creation does to itself and of its own choice?! Allow me to use an analogy of my own:

    I set up a "virtual environment" (VE) with a series of artificial intelligence (AI) nodes having distinct identities and personalities. The intention is for me to interact with the virtual environment to "channel" the nodes to allow for wholesome interaction in the creation of new "super-code". This obviously allows for the possible risk of creating a cyber-virus, or perhaps a virtual masterpiece of unthinkable magnitude. To further complicate the issue, the AI nodes firewall my access into their VE, turn on each other and wreck their environment.

    Would I still be liable for damage done to my own "virtual" creation? To whom would I be liable?

    Edit: Try to understand my point of a closed, created system. Your argument of "unleashing them on the public" has no valid application to reality, since by definition there are no "uncreated" (i.e. non-robot) beings within the created universe.
  5. Standard member frogstomp
    Bruno's Ghost
    In a hot place
    Joined
    11 Sep '04
    Moves
    7707
    31 Jul '06 19:49
    Originally posted by Halitose
    This is the point I voiced from the beginning as to the disingenuity -- I hope it is merely subjective bias -- of the robot analogy:

    How can a creator be liable for the harm his creation does to itself and of its own choice?! Allow me to use an analogy of my own:

    I set up a "virtual environment" (VE) with a series of artificial intellige ...[text shortened]... nition, there is no "uncreated" (i.e. non-robot beings) within the created universe.
    Seems like you would have to be a tad reckless to allow your creation to disrupt the purpose you created it for.
  6. Standard member blakbuzzrd
    Buzzardus Maximus
    Joined
    03 Oct '05
    Moves
    23729
    31 Jul '06 19:51
    Originally posted by frogstomp
    Seems like you would have to be a tad reckless to allow your creation to disrupt the purpose you created it for.
    Or disingenuous as to your real purpose.
  7. Standard member Halitose
    I stink, ergo I am
    On the rebound
    Joined
    14 Jul '05
    Moves
    4464
    31 Jul '06 19:51
    Originally posted by Pawnokeyhole
    *sighs*

    Note: The below differs slightly from the version.

    Okay, suppose I make beings--robots if you like, something else if you don't--with roughly equally balanced natural tendencies to be nasty or nice. Some people might think this approximates humans.

    I leave these beings to interact with humans in significant ways.

    But they also have s ...[text shortened]... other reasons that eliminate that liability. But, all else equal, He would be liable.
    That's the sort of liability God faces for creating beings with free will, with conflicting instincts towards good and evil. There may be other reasons that eliminate that liability. But, all else equal, He would be liable.

    Your ceteris paribus distinction raises another point:

    What if the intention was not to have purely "good" or "nice" robots, but "willingly good" or "willingly nice" robots? Take love -- would the act (or emotion) be the same if it didn't arise by choice? You can programme your cellular phone to say "I love you", but I'm sure you'd agree that an SMS message from a lover saying the same holds so much more meaning -- even pathos.

    I would contend that God willed rather to create beings capable of true love (with the added risk of hate, i.e. evil) than automated-response-robotrons.
  8. Standard member Halitose
    I stink, ergo I am
    On the rebound
    Joined
    14 Jul '05
    Moves
    4464
    31 Jul '06 19:56, 1 edit
    Originally posted by frogstomp
    Seems like you would have to be a tad reckless to allow your creation to disrupt the purpose you created it for.
    Choice entails risk. Recklessness is the subjective vista of risk.
  9. Standard member frogstomp
    Bruno's Ghost
    In a hot place
    Joined
    11 Sep '04
    Moves
    7707
    01 Aug '06 15:24
    Originally posted by Halitose
    Choice entails risk. Recklessness is the subjective vista of risk.
    But you aren't taking any risks, because you don't give a hoot how your world works out.
    BTW, would you subject your only begotten son to the whims of your AIs' machinations?
  10. Joined
    31 May '06
    Moves
    1795
    01 Aug '06 16:08
    Originally posted by KellyJay
    Another statement of faith, not a fact.
    Kelly
    Semantically, he was actually asking a question, not making any statement of fact.
  11. Joined
    31 May '06
    Moves
    1795
    01 Aug '06 16:21, 1 edit
    Originally posted by KellyJay
    Can you talk about all that is required for consciousness? We can talk
    about the current flow through a processor, all that goes into it, but
    it is just current through a processor. Is there a magical circuit
    design that turns that current into a real thought that makes the
    computer become aware? If not, the only thing man can do is simply
    create a faster electrical abacus with the ability to program it.
    Kelly
    Neurons 'talk' to each other through electrical impulses as well. What part of a 'neuron' net (brain) do you think is not possible to replicate by other means? Do you suggest that neurons are somehow exempt from the laws of physics? If so, on what grounds? If you don't think that neurons are exempt from the laws of physics, then there will be some system you can construct that exactly replicates the important (for the purposes of thought, in this case) processes that go on inside and between neurons, and therefore an entire working human brain could be constructed. Are you saying that this would not be sentient (assuming humans are)? And if you allow this to be a sentient being, then why would it not be possible to build a non-human sentience? Or do you not allow for any creature other than Homo sapiens to be sentient? If you think that sentience can't be explained through physical means, then where is your evidence for any non-physical entity that somehow imbues sentience?
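    [Editor's note: the claim above, that a neuron's impulse-based signalling can be replicated by other means, can be illustrated with a minimal leaky integrate-and-fire simulation. This is a standard textbook model chosen purely for illustration; the post itself names no particular model.]

    ```python
    # Minimal leaky integrate-and-fire neuron: the membrane potential rises
    # with input current, decays ("leaks") toward rest, and emits a spike
    # when it crosses a threshold -- the same impulse-based signalling the
    # post describes, reproduced outside biological tissue.

    def simulate_lif(current, steps=100, dt=1.0, tau=10.0,
                     v_rest=0.0, v_thresh=1.0, v_reset=0.0):
        """Return the time steps at which the model neuron spikes."""
        v = v_rest
        spikes = []
        for t in range(steps):
            # Leak toward the resting potential, then integrate the input.
            v += dt * ((v_rest - v) / tau + current)
            if v >= v_thresh:  # threshold crossed: fire and reset
                spikes.append(t)
                v = v_reset
        return spikes

    # A constant drive produces a regular spike train; zero drive stays silent.
    print(simulate_lif(0.15))
    ```

    [With a constant input current the model fires periodically; with no input it never spikes, which mirrors the "current through a processor" framing in the quoted post.]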
  12. Joined
    31 May '06
    Moves
    1795
    01 Aug '06 16:27
    Originally posted by frogstomp
    Seems like you would have to be a tad reckless to allow your creation to disrupt the purpose you created it for.
    But what if the only way to achieve the required goal is to allow the risk of not achieving it (at least this time around)?
  13. Standard member frogstomp
    Bruno's Ghost
    In a hot place
    Joined
    11 Sep '04
    Moves
    7707
    01 Aug '06 16:34
    Originally posted by googlefudge
    But what if the only way to achieve the required goal is to allow the risk of not achieving it (at least this time around)?
    Still, to allow your AIs to act like viruses so they can erase other AIs is only counterproductive, unless you don't care about the final result.