05 Jul 12
Originally posted by Thequ1ck
We could be, but the fact is good old carbon has way more chemical tricks up its sleeve than anything else in the periodic table.
Shouldn't we put away our egos and recognize that we could be tools to enable other life?
Silicon runs a distant second to carbon. Which is not to say it can't be done, just that we would have to be a lot trickier with the chemistry.
A lot of room to play in, like this new work on 'artificial atoms' for instance.
05 Jul 12
Originally posted by Thequ1ck
I am sure we will eventually make a computer intelligence to rival our own. I am less convinced that it will ever be self-reproducing (and therefore worthy of the label 'life').
Shouldn't we put away our egos and recognize that we could be tools to enable other life?
I just don't think there would be much motivation for it. Of course, if one does become self-reproducing and has the inbuilt urge to keep reproducing, then they will start evolving and reproduction will become the norm.
But I suspect that intelligent beings capable of repairing themselves are far more likely, and that they will have no need nor desire for reproduction. What they might do is create their own 'slaves', an army of robots to do their bidding, but not necessarily equal in intelligence.
As for ego, we need to first 'put away our ego' and recognize that even now we are not the pinnacle of evolution and other life forms that enabled our presence are in no way diminished by our existence.
05 Jul 12
Originally posted by twhitehead
Sorry, I thought you meant silicon-based biological life forms. You are talking about computers à la Ray Kurzweil's singularity:
I am sure we will eventually make a computer intelligence to rival our own. I am less convinced that it will ever be self-reproducing (and therefore worthy of the label 'life').
I just don't think there would be much motivation for it. Of course, if one does become self reproducing and has the inbuilt urge to keep reproducing then they will start evolving ...[text shortened]... and other life forms that enabled our presence are in no way diminished by our existence.
http://en.wikipedia.org/wiki/Technological_singularity
As computers get more and more complex and faster, they are now thinking in terms of exaflop computers, 1000 quadrillion instructions per second, having passed the petaflop mark some time ago; I think we are up to around 14 petaflops, or 0.014 exaflops. That number will invariably go up and up over the next decade or so, and computers will eventually surpass human intelligence, at least according to Kurzweil.
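As a quick sanity check on those unit conversions, here is a minimal Python sketch (the figures are just the ones quoted above, not current benchmarks):

```python
# SI prefixes used in the post: peta = 10^15, exa = 10^18.
PETA = 10**15
EXA = 10**18

# 1 exaflop/s expressed in quadrillions (10^15) of operations per second:
print(EXA / 10**15)     # 1000.0 -> "1000 quadrillion" checks out

# 14 petaflops expressed in exaflops:
print(14 * PETA / EXA)  # 0.014, as stated
```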
Everything human so far has been exceeded by computers: chess, as we know, and Go is probably not too far behind. So it seems not such a long shot that our intelligence will be exceeded in the near future (within 40 years, some say).
At first, even if they do score higher on IQ tests and so forth, they will be limited to hulking cabinets by the dozens, with millions of individual CPU boards and such, and would not be likely to be any kind of threat, since we can just pull the power plug if they got too uppity 🙂
Later, though, when individual CPUs are the size of viruses, it would be a different matter. I would see them at first being like a brain graft, part of a human.
Later still, they could house their brains in bodies of any sort; who knows, they could even rewire a disabled person's body to work again at that point in the technology race.
That alone could give rise to a new class of humans, hybrids with the minds of geniuses in all fields.
They could, if they wanted to, give us unlimited energy or transport systems and such, but I think it more likely they would just build their own spacecraft and leave us in the lurch, founding a new civilization somewhere else in the solar system or even going to the stars. At that point they could probably control hibernation to such an extent as to allow sub-light-speed trips of a thousand years or more, so they would not have to put up with interference from pesky inferior humans.
05 Jul 12
Originally posted by sonhouse
The will to live or compete etc is a result of evolution. Unless computers evolve, or we deliberately code for it, these desires will not exist. It will be interesting to see how an advanced intelligence without human emotions actually behaves. Maybe it will be necessary to include a will to live just to keep it from self destructing.
They could if they wanted to, give us unlimited energy or transport systems, and such but I think it more likely they would just build their own spacecraft and leave us in the lurch, founding a new civilization somewhere else in the solar system or even going to the stars. They could at that point probably control hibernation to such an extent as to allow s ...[text shortened]... and years or more so they would not have to put up with interference from pesky inferior humans.
Originally posted by twhitehead
The motivations of the rich for immortality should be enough to allow for this to happen.
The will to live or compete etc is a result of evolution. Unless computers evolve, or we deliberately code for it, these desires will not exist. It will be interesting to see how an advanced intelligence without human emotions actually behaves. Maybe it will be necessary to include a will to live just to keep it from self destructing.
I guess you're right in that it is a matter of entropy. We feel the need to destroy and rebuild as part of humanity.
Without emotions and the cycle of reorder, do we then cease to be human?
We go on about our human needs like Captain Kirk beating his chest, but if we had a spell inside a computer, without emotions and needs, would we then long for our fears and insecurities, or would we shy from them?
05 Jul 12
Originally posted by sonhouse
Actually, silicon has a good number of properties carbon doesn't, not least of which is abundance.
We could be, but the fact is good old carbon has way more chemical tricks up its sleeve than anything else in the periodic table.
Silicon runs a distant second to carbon. Which is not to say it can't be done, just that we would have to be a lot trickier with the chemistry.
A lot of room to play in, like this new work on 'artificial atoms' for instance.
Originally posted by sonhouse
Gonna have to get back to you on that. That's a shed load of good info and I need time to digest it.
Sorry, I thought you meant silicon based biological life forms. You are talking about computers ala Ray Kurzweil's singularity:
http://en.wikipedia.org/wiki/Technological_singularity
As computers get more and more complex and faster, now they are thinking in terms of Exaflop computers, 1000 quadrillion instructions per second, having way passed petafl ...[text shortened]... d years or more so they would not have to put up with interference from pesky inferior humans.
06 Jul 12
Originally posted by twhitehead
How would this hypothetical artificial sentience make decisions without emotions?
The will to live or compete etc is a result of evolution. Unless computers evolve, or we deliberately code for it, these desires will not exist. It will be interesting to see how an advanced intelligence without human emotions actually behaves. Maybe it will be necessary to include a will to live just to keep it from self destructing.
Without emotions you don't 'care' about anything.
Sure, emotions can get in the way of making good decisions once you have decided what you care about and what your goals are. But I don't see how you can possibly come up with any goals in the first place without emotions.
Now you could have goals imposed on you from an outside agent/creator, but I think to truly qualify as a sentient being you need to be able to set your own goals, not just mindlessly follow others' instructions.
Which requires emotions.
Originally posted by googlefudge
How would this hypothetical artificial sentience make decisions without emotions?
Without emotions you don't 'care' about anything.
Sure, emotions can get in the way of making good decisions once you have decided what you care about and what your goals are.
But I don't see how you can possibly come up with any goals in the first place without ...[text shortened]... your own goals.
Not just mindlessly follow others' instructions.
Which requires emotions.
Now you could have goals imposed on you from an outside agent/creator but I think to
truly qualify as a sentient being you need to be able to set your own goals.
Not just mindlessly follow others' instructions.
Yes; else, from the perspective of goals, how would you differ from just a stupid robot blindly obeying orders?
Originally posted by humy
António Damásio has written a good book called 'The Feeling of What Happens'. It's a very interesting read and provides a language structure to help decipher emotions.
Now you could have goals imposed on you from an outside agent/creator but I think to
truly qualify as a sentient being you need to be able to set your own goals.
Not just mindlessly follow others' instructions.
Yes; else, from the perspective of goals, how would you differ from just a stupid robot blindly obeying orders?
Originally posted by sonhouse
Thanks for this link, sonhouse; there are so many good reads in it.
Sorry, I thought you meant silicon based biological life forms. You are talking about computers ala Ray Kurzweil's singularity:
http://en.wikipedia.org/wiki/Technological_singularity
As computers get more and more complex and faster, now they are thinking in terms of Exaflop computers, 1000 quadrillion instructions per second, having way passed petafl ...[text shortened]... d years or more so they would not have to put up with interference from pesky inferior humans.
My question remains, though. Should a superintelligence singularity occur, whether now or in another hundred years (although it is expected around 2030), is it not possible that this intelligence could manipulate electron-based machinery of the past and present?
If so are we not then ghosts in the machine?
How do you know, for example, that what you are reading is what I have typed?
It exists as a possibility that has traversed through a quantum router.
As humans, our macro-molecular structure has us deeply seated on Highway 66; no getting away from that. But computers and routers that base decisions on a single electron's energy? Well... that's different. Why shouldn't the first time travelers be computers?
Originally posted by Thequ1ck
I guess you could make a case that time travel has happened from a computer's POV, because we are leading up to a higher class of computer, but other than that there doesn't seem to be anything happening in our world that would benefit such computers.
Thanks for this link sonhouse there are so many good reads in it.
My question remains though. Should a superintelligence singularity occur, whether now or in another hundred years (although it is expected around 2030) is it not possible that this intelligence could manipulate electron based machinery of the past and present?
If so are we not then gho ...[text shortened]... electron energy. Well..that's different. Why shouldn't the first time travelers be computers?
Besides, there is also a theory that there is a universal roadblock to time travel; whether true or not remains to be seen.
Here is Hawking's take on time travel:
http://www.foxnews.com/scitech/2010/05/03/time-travel-possible-says-stephen-hawking/