1. Joined
    11 Nov '05
    Moves
    43938
    10 May '15 20:11
    Originally posted by Shallow Blue
    Oh, come on. That's a mere linguistic quibble. It's not a computer as we understand it now. If we allow any meaning of the word "computer" and "digital", Blaise Pascal may have something to say in the matter. Or Babbage, for that matter. But those are not computers "within the meaning of the act"; and neither is this one.

    Nevertheless, ENIAC wasn't t ...[text shortened]... f knowledge by never having heard of Konrad Zuse, but there you have it.

    Kvikkalkuel, anyone?
    "Oh, come on. That's a mere linguistic quibble."

    And this is what I've been saying all the time!

    If we want the ENIAC to be the first computer, then we just construct a definition that makes ENIAC the first computer. And this is what the quibble is all about.

    The only thing I say is that ENIAC perhaps wasn't the first computer and it's all about definitions.

    There are alternatives, and some of them are brought up in this thread. I am not the only one saying that there are other candidates for being the first computer.

    But for those who bring up the Neuman property as a prerequisite for being called a computer - according to that particular definition, of course, ENIAC is the first computer. There are other definitions too, though, and we shouldn't forget them.
  2. Standard memberDeepThought
    Losing the Thread
    Quarantined World
    Joined
    27 Oct '04
    Moves
    87415
    10 May '15 22:49
    Originally posted by FabianFnas
    "Oh, come on. That's a mere linguistic quibble."

    And this is what I've been saying all the time!

    If we want the ENIAC to be the first computer, then we just construct a definition that makes ENIAC the first computer. And this is what the quibble is all about.

    The only thing I say is that ENIAC perhaps wasn't the first computer and it's all about ...[text shortened]... e first computer in that respect. But there are other definitions too, we shouldn't forget them.
    What do you mean by "the Neuman property"?
  3. Standard memberDeepThought
    Losing the Thread
    Quarantined World
    Joined
    27 Oct '04
    Moves
    87415
    10 May '15 23:00
    Originally posted by Shallow Blue
    Oh, come on. That's a mere linguistic quibble. It's not a computer as we understand it now. If we allow any meaning of the word "computer" and "digital", Blaise Pascal may have something to say in the matter. Or Babbage, for that matter. But those are not computers "within the meaning of the act"; and neither is this one.

    Nevertheless, ENIAC wasn't t ...[text shortened]... f knowledge by never having heard of Konrad Zuse, but there you have it.

    Kvikkalkuel, anyone?
    I think the difficulty with the Z3 and Z4 is that they are electro-mechanical rather than fully electronic, although that's not an objection from my point of view. The Z4 was definitely Turing complete. The Z3 sort of was: it had no conditional branch, so to simulate one you have to calculate all possible branches serially and then select the relevant answer, which may have exceeded the practical resources of the machine.
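    The branch-selection trick described here can be sketched roughly like this (a toy illustration, not actual Z3 code; the two branch workloads are invented for the example):

```python
def branch_free_select(cond, x, y):
    """Compute an if/else without a conditional branch instruction:
    evaluate both branches unconditionally, then select arithmetically.
    cond must be 0 or 1."""
    then_result = x * 2       # hypothetical "then" branch workload
    else_result = y + 1       # hypothetical "else" branch workload
    # Arithmetic selection: exactly one term survives.
    return cond * then_result + (1 - cond) * else_result

print(branch_free_select(1, 3, 10))  # then branch: 3 * 2 = 6
print(branch_free_select(0, 3, 10))  # else branch: 10 + 1 = 11
```

    Both branches are always paid for, which is why a program made of such selections could exceed a small machine's practical resources.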
  4. Joined
    11 Nov '05
    Moves
    43938
    11 May '15 04:23
    Originally posted by DeepThought
    What do you mean by "the Neuman property"?
    Sorry, I meant 'Turing complete'.
  5. Standard memberDeepThought
    Losing the Thread
    Quarantined World
    Joined
    27 Oct '04
    Moves
    87415
    11 May '15 22:57
    Originally posted by FabianFnas
    Sorry, I meant 'Turing complete'.
    I thought you probably did. Although it's possible to construct a definition of computer that will make any given device the first one, I think the more important question is "What do we mean when we say computer?"; once that's been answered, the first machine that fulfills that definition is the first computer. For me it's equivalence to a Turing machine - up to finiteness of storage, since Turing machines are theoretical and have an infinite tape and therefore infinite memory. Others may regard being completely electronic as important. I think we're looking at the Zuse Z3 or Z4 as the first, as both predate ENIAC. Babbage's machine was never built, although his son produced and demonstrated a simplified version, and if that was Turing complete (it depends on what he left out) then it is the first.
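    For anyone unfamiliar with the yardstick being applied here, "equivalent to a Turing machine" can be illustrated with a minimal simulator (a sketch for illustration only; the bit-flipping rule table is an invented example):

```python
def run_tm(tape, rules, state="start", blank="_", max_steps=10_000):
    """Simulate a one-tape Turing machine.
    rules maps (state, symbol) -> (new_state, symbol_to_write, head_move),
    where head_move is -1 (left), 0 (stay) or +1 (right).
    The machine halts when it reaches the state 'halt'."""
    cells = dict(enumerate(tape))  # sparse tape: position -> symbol
    pos = 0
    for _ in range(max_steps):
        if state == "halt":
            break
        symbol = cells.get(pos, blank)
        state, cells[pos], move = rules[(state, symbol)]
        pos += move
    return "".join(cells[i] for i in sorted(cells)).strip(blank)

# A toy machine that walks right, inverting a binary string, then halts.
flip_bits = {
    ("start", "0"): ("start", "1", +1),
    ("start", "1"): ("start", "0", +1),
    ("start", "_"): ("halt", "_", 0),
}
print(run_tm("1011", flip_bits))  # prints 0100
```

    A physical machine is "Turing complete up to finiteness of storage" if, given enough tape, it can simulate any such rule table.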
  6. Joined
    11 Nov '05
    Moves
    43938
    12 May '15 05:26
    Originally posted by DeepThought
    I thought you probably did. Although it's possible to construct a definition of computer that will make any given device the first one, I think the more important question is "What do we mean when we say computer?", once that's been answered the first machine that fulfills that definition is the first computer. For me it's that it is equivalent to a Tu ...[text shortened]... d version and if that was Turing complete (it depends on what he left out) then it is the first.
    I agree. First the definition, then the conclusion.

    Many people go the other way around. They start from the conclusion: "I want ENIAC to be the first computer in the world; how would I formulate a definition of a computer so that ENIAC, according to this definition, will be the first computer? Well, let's use the 'Turing complete' property to accomplish this."

    The same goes for the controversy over whether Pluto is a planet or not. If we use a certain definition, then Pluto in fact is a planet. If we use another definition, then we exclude Pluto from being a planet.
    If I were to decide the definitions, I would say that Jupiter is *not* a planet but a star that wasn't big enough to ignite. Not many agree with this.

    So it is all about definitions.
  7. Joined
    31 Aug '06
    Moves
    40565
    12 May '15 08:58
    Anything that works on binary logic is a computer; and then there are analog computers, which are more like measuring devices and which don't use binary logic. Simple, really. 😏

    Feel free to ignore me.
  8. Joined
    11 Nov '05
    Moves
    43938
    12 May '15 10:02
    Originally posted by C Hess
    Anything that works on binary logic is a computer, and then there are analog computers who are more like measuring devices and who don't use binary logic. Simple, really. 😏

    Feel free to ignore me.
    Thank you for this definition.

    Do you think the human brain is a computer?
  9. Joined
    31 Aug '06
    Moves
    40565
    12 May '15 11:40 (1 edit)
    Originally posted by FabianFnas
    Thank you for this definition.

    Do you think the human brain is a computer?
    It can be likened to a computer I suppose, but the truth is, I know way too little about computers to judge what is and isn't a computer.

    From my superficial understanding of computers, I think it's not entirely flawed to liken the brain to a computer. It is connected to peripheral equipment, both for input and output, it can store results in memory (albeit somewhat flawed memory), and it sort of runs a program in every evaluation of input it performs.

    😕
  10. Joined
    11 Nov '05
    Moves
    43938
    12 May '15 13:00
    Originally posted by C Hess
    From my superficial understanding of computers I think it's not entirely flawed to liken the brain with a computer. It is connected to peripheral equipment, both for input and output, it can store results in memory (albeit somewhat flawed memory) and it sort of runs programs in every evaluation of input it performs.

    😕
    If, and I mean if, we define a biological brain as a computer, then, and only if the if part is fulfilled, ENIAC is certainly *not* the first computer in the world.

    But then we have to define, too - how many nerve cells does it take to be defined as a computer?

    (This is not a very serious posting. It is only intended as some food for thought...)
  11. Joined
    31 Aug '06
    Moves
    40565
    12 May '15 13:05
    Originally posted by FabianFnas
    (This is not a very serious posting. It is only intended to be some food for thoughts...)
    As were my posts.

    The brain is a very different beast from a processor, I suspect. How many logic gates does it take before it can be considered a processor? Also, is a processor enough to call it a computer, or do you need at least some kind of peripheral?
  12. Standard memberDeepThought
    Losing the Thread
    Quarantined World
    Joined
    27 Oct '04
    Moves
    87415
    13 May '15 15:33
    Originally posted by FabianFnas
    If, and I mean if, we define a biological brain as a computer, then, and only if the if part is fulfilled, ENIAC is certainly *not* the first computer in the world.

    But then we have to define too - how many nervous cells does it need to be defined as a computer?

    (This is not a very serious posting. It is only intended to be some food for thoughts...)
    If anything that processes information counts then the first strand of RNA should probably be considered a computer. So by that definition computers are at least 3.8 billion years old.
  13. Joined
    18 Jan '07
    Moves
    12438
    14 May '15 10:16
    Originally posted by FabianFnas
    If we want the ENIAC to be the first computer, then we just construct a definition that makes ENIAC the first computer. And this is what the quibble is all about.

    The only thing I say is that ENIAC perhaps wasn't the first computer and it's all about definitions.

    There are alternatives, and some of them is brought up in this thread. I am not the onl ...[text shortened]... e first computer in that respect. But there are other definitions too, we shouldn't forget them.
    Well, yes, partly, but it's generally considered a good idea to stick to the definitions used by the people who, you know, make a living off these things?

    And we define a computer as, in the proper sense, a programmable, Turing-complete Von Neumann machine using digital technology. More general types of computer - analog ones, for example - need to be specified as such. The Analytical Engine was a proto-computer, not a computer.

    Sure, you, the outsider, the non-professional, can deny this. You can also insist that 1 is prime, but see how far you get with the average mathematician. Or you can claim that an ogee arch should be counted as a lancet arch - just don't go complaining to your architect when your house looks silly. Or the windows fall out.

    By the definitions as used by professionals, ENIAC was the first true computer in what was then the Western world, and was beaten as the first true computer overall by one of Konrad Zuse's machines (one can quibble over whether it was the Z3 or the Z4, but not over the fact that he beat ENIAC). And there may have been an even earlier one in Stalinist Russia which is still classified, who knows? But the subject of this thread? No. It's a fore-runner of true computers, and an interesting one, but it doesn't beat ENIAC. Zuse did.
  14. Standard memberDeepThought
    Losing the Thread
    Quarantined World
    Joined
    27 Oct '04
    Moves
    87415
    14 May '15 15:06
    Originally posted by Shallow Blue
    Well, yes, partly, but it's generally considered a good idea to stick to the definitions used by the people who, you know, make a living off these things?

    And we define a computer as, in the proper sense, a programmable, Turing-complete Von Neumann machine using digital technology. More general types of computer - analog ones, for example - ne ...[text shortened]... 's a fore-runner of true computers, and an interesting one, but it doesn't beat ENIAC. Zuse did.
    No, it doesn't have to be a von Neumann machine. The von Neumann architecture is a specific architecture that all modern machines are based on, but it is not the only way to do it. Turing completeness really is the only criterion. For example, the von Neumann machine specifies registers, but there is no reason a computer couldn't be built without them. The registers store the internal state of the machine, but that state could be held in other ways, although registers are probably more efficient than anything else easily imagined.

    The problem with Babbage's analytical engine is not that it was a "proto-computer" - it was certainly not an analogue machine as you seemed to imply - but that it was not built. Although, as I mentioned above, his son produced a simplified version and if that was Turing complete then it should have priority.

    Just a note on the von Neumann architecture. Modern machines almost invariably have what is normally called a modified Harvard architecture. The (unmodified) Harvard architecture has strict separation between program and data storage, which makes security easier. The modified architecture only enforces this at the level of the L1 cache.
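    The distinction being drawn here can be sketched as follows (a toy model only - real hardware is vastly more involved, and the class names are made up for the example):

```python
class VonNeumann:
    """One address space: instructions and data share the same memory,
    so a running program can overwrite its own code (flexible, but a
    classic security hazard)."""
    def __init__(self, memory):
        self.mem = list(memory)      # holds both code and data
    def fetch(self, addr):
        return self.mem[addr]
    def store(self, addr, value):
        self.mem[addr] = value       # may clobber an instruction!

class Harvard:
    """Two separate address spaces: data writes can never touch the code."""
    def __init__(self, code, data):
        self.code = list(code)       # instruction memory, read-only here
        self.data = list(data)       # data memory
    def fetch(self, addr):
        return self.code[addr]
    def store(self, addr, value):
        self.data[addr] = value      # code is unreachable from here
```

    A modified Harvard machine behaves like the second class at the L1 cache level but like the first from the programmer's point of view further out in the memory hierarchy.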
  15. Joined
    11 Nov '05
    Moves
    43938
    14 May '15 20:15
    Originally posted by Shallow Blue
    Well, yes, partly, but it's generally considered a good idea to stick to the definitions used by the people who, you know, make a living off these things?
    So if we use one definition instead of another - does this mean that those people who make a living out of it earn less cash? In short - does it matter much?

    I say the definition is just an opinion. The writer of the article has one opinion; you have another opinion. And you say it matters - how?

    If you want ENIAC to be the first computer, then you should stick with yours. I stick with the opinion that it just doesn't matter. And see - the world didn't fall apart.