If the infinite set of fractions is countable why is the set of decimals between 0 and 1 not countable? Surely any decimal can be written as a fraction and therefore the decimals are a subset of the fractions.

Originally posted by wolfgang59 If the infinite set of fractions is countable why is the set of decimals between 0 and 1 not countable? Surely any decimal can be written as a fraction and therefore the decimals are a subset of the fractions.

Can anyone explain it simply?

If by 'decimals' you are including irrational numbers then no, they cannot all be written as fractions. That is, after all, the difference between an irrational number and a rational number ('rational' means 'can be written as a ratio').

One way to prove non-rationals exist is to prove that the square root of two cannot be rational:
http://en.wikipedia.org/wiki/Square_root_of_2

Once you understand that, read through this:
http://math.stackexchange.com/questions/666526/proving-that-an-irrational-number-exists-near-every-rational-number
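To make the √2 fact concrete before tackling the links, here is a small Python sketch (the function name is my own) that searches for a fraction p/q with p² = 2q² and finds none; the Wikipedia proof shows why none can ever exist, for any denominator:

```python
import math

def has_rational_sqrt2(max_q):
    """Return True if p^2 == 2*q^2 for some integers p, q with 1 <= q <= max_q."""
    for q in range(1, max_q + 1):
        p = math.isqrt(2 * q * q)   # floor of sqrt(2)*q, in exact integer arithmetic
        if p * p == 2 * q * q:      # would mean (p/q)^2 == 2 exactly
            return True
    return False

print(has_rational_sqrt2(100000))   # False: no fraction p/q squares to 2
```

Using math.isqrt keeps everything in exact integer arithmetic, so there is no floating-point doubt in the check.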

Originally posted by twhitehead If by 'decimals' you are including irrational numbers then no, they cannot all be written as fractions. That is after all the difference between an irrational number and a rational number. ('rational' means 'can be written as a ratio' ).

One way to prove non-rationals exist is to prove that the square root of two cannot be rational:
http://en.wikipedi ...[text shortened]... xchange.com/questions/666526/proving-that-an-irrational-number-exists-near-every-rational-number

Can a number represented by a written-out decimal really ever be an irrational number? Pi is an irrational number that can never be written out exactly in decimals.

So of course he cannot include irrational numbers among the decimals...

Originally posted by FabianFnas Pi is an irrational number that can never be written out exactly in decimals.

An irrational number requires an infinite number of non-repeating decimals, but apart from that, it could be written out on an infinite piece of paper.
All rational numbers have eventually repeating decimals, and it is purely a matter of convenience that we do not bother writing out the repeating zero digits on most of our everyday numbers.

There was a discussion in another thread about how 1.0000..... is the same number as 0.99999.....
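The long-division view makes the 'all rationals repeat' claim concrete: the digits start repeating as soon as a remainder recurs, and a terminating decimal is just one whose cycle is the repeating zero. A rough Python sketch (function name my own):

```python
def decimal_expansion(num, den):
    """Long-divide num/den (with 0 < num < den) and return the digits
    before the repeating cycle, plus the cycle itself."""
    digits, seen, r = [], {}, num
    while r and r not in seen:
        seen[r] = len(digits)        # remember where this remainder first appeared
        r *= 10
        digits.append(r // den)
        r %= den
    if r == 0:                       # terminating decimal: the cycle is just '0'
        return digits, [0]
    start = seen[r]                  # cycle begins where the remainder recurred
    return digits[:start], digits[start:]

print(decimal_expansion(1, 3))   # ([], [3])        i.e. 0.(3)
print(decimal_expansion(1, 4))   # ([2, 5], [0])    i.e. 0.25(0)
print(decimal_expansion(1, 7))   # ([], [1, 4, 2, 8, 5, 7])
```

Since there are only den possible remainders, the loop must terminate or cycle within den steps, which is exactly why every rational has an eventually repeating expansion.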

Originally posted by twhitehead An irrational number requires an infinite number of non-repeating decimals, but apart from that, it could be written out on an infinite piece of paper.
All rational numbers have eventually repeating decimals, and it is purely a matter of convenience that we do not bother writing out the repeating zero digits on most of our everyday numbers.
...[text shortened]... here was a discussion in another thread about how 1.0000..... is the same number as 0.99999.....

"An irrational number ... can be written out on an infinite piece of paper."

No, you cannot; that large a piece of paper doesn't exist. Only rational numbers can be written on a piece of paper, if (of course) you accept the '...' notation, or change from base ten to something else.

Originally posted by FabianFnas No, you cannot; that large a piece of paper doesn't exist. Only rational numbers can be written on a piece of paper, if (of course) you accept the '...' notation, or change from base ten to something else.

Well, by your definition, a decimal number that can be written out is just another way of saying 'rational number', and those are countable. The OP, however, says the decimals it is talking about are not countable.

wolfgang59 is either wrong that they are not countable, or wrong that they are rational.

Originally posted by wolfgang59 If the infinite set of fractions is countable why is the set of decimals between 0 and 1 not countable? Surely any decimal can be written as a fraction and therefore the decimals are a subset of the fractions.

Can anyone explain it simply?

Do you count irrational numbers as decimals or not?

If not, then the decimals are countable;

if yes, then the decimals aren't countable.

PS: Does anyone know the official definition of decimal?

Originally posted by wolfgang59 If the infinite set of fractions is countable why is the set of decimals between 0 and 1 not countable? Surely any decimal can be written as a fraction and therefore the decimals are a subset of the fractions.

Can anyone explain it simply?

The set of rationals is countable because one can produce a procedure to assign an integer to each of them. Suppose the set of reals between 0 and 1 were countable. Then we could write them as an infinitely long list:

(1) 0.a₁₁a₁₂a₁₃a₁₄···
(2) 0.a₂₁a₂₂a₂₃a₂₄···
(3) 0.a₃₁a₃₂a₃₃a₃₄···
etc.

where aₙₘ is the mth digit of the nth number - this attempts to define such a map.

Consider a number 0.b₁b₂b₃b₄··· where we choose b₁ = 1 unless a₁₁ = 1, in which case b₁ = 9; b₂ = 1 unless a₂₂ = 1, in which case b₂ = 9; and in general bₙ = 1 unless aₙₙ = 1, in which case bₙ = 9. Then 0.b₁b₂b₃b₄··· differs from the nth number in our list at the nth decimal place. If we add it in at the start, shift all the others along by one, and repeat the process, we will just find the same thing happening again, so the list can never exhaust the real numbers. Since we cannot produce such a list, there is no mapping between the integers and the reals which is both one-to-one (distinct integers map to distinct reals, aka injective) and onto (every real is the image of some integer, aka surjective), so the set of real numbers is not countable.
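The diagonal construction above is mechanical enough to run in code: given digit strings for the first few listed reals, build a number that differs from the nth one at the nth digit, using the same 1-unless-it's-1-then-9 rule. A Python sketch (example digit strings are my own):

```python
def diagonal(rows):
    """rows: list of digit strings, rows[n] holding the first digits of
    the nth listed real. Return digits of a number that differs from
    rows[n] at position n (1 unless the diagonal digit is 1, then 9)."""
    return ''.join('9' if row[n] == '1' else '1'
                   for n, row in enumerate(rows))

listed = ['141592', '718281', '414213', '302585']
d = diagonal(listed)
print('0.' + d)   # 0.9911

# The constructed number disagrees with every listed expansion
# at the diagonal position, so it cannot appear in the list:
assert all(d[n] != listed[n][n] for n in range(len(listed)))
```

Of course the real argument applies this to an infinite list all at once; the code only illustrates why no finite prefix of any list can pin the construction down.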

Originally posted by wolfgang59 Just to be clear:
decimals = rational numbers that can be written as a finite string of numerals after the decimal point.

I saw a proof on youtube that they are uncountable. Is that wrong?

decimals = rational numbers that can be written as a finite string of numerals after the decimal point.

Does that mean that 1/3 isn't a decimal?
Anyway, these so-called decimals are countable. Can you please provide a link to the youtube 'proof' that decimals aren't countable?
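For what it's worth, one explicit counting procedure for the positive rationals - and hence for any finite decimals among them - is the usual zig-zag over numerator/denominator pairs, skipping duplicates. A Python sketch (function name my own):

```python
from fractions import Fraction

def count_rationals(limit):
    """Enumerate positive rationals p/q in order of increasing p + q,
    skipping duplicates like 2/4 == 1/2. Each rational gets a definite
    position in one list, which is exactly what 'countable' means."""
    out, seen, s = [], set(), 2
    while len(out) < limit:
        for p in range(1, s):            # all pairs with p + q == s
            f = Fraction(p, s - p)
            if f not in seen:            # Fraction normalizes, so 2/4 == 1/2
                seen.add(f)
                out.append(f)
        s += 1
    return out[:limit]

print(count_rationals(7))
# [Fraction(1, 1), Fraction(1, 2), Fraction(2, 1), Fraction(1, 3),
#  Fraction(3, 1), Fraction(1, 4), Fraction(2, 3)]
```

Every positive rational has some finite p + q, so it eventually appears; that is the whole content of the countability claim.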

Originally posted by wolfgang59 Just to be clear:
decimals = rational numbers that can be written as a finite string of numerals after the decimal point.

I saw a proof on youtube that they are uncountable. Is that wrong?

"... written as a finite string of numerals ..."

Of course it must be a finite string of numerals; you can never write out an infinite string of numerals. There is not enough paper in the universe to hold the numerals, and furthermore, the universe will have turned to ash (or something) before you are finished with it.

Avoid 'finite' in your definition. It's ambiguous. And obvious.

Originally posted by wolfgang59 Just to be clear:
decimals = rational numbers that can be written as a finite string of numerals after the decimal point.

I saw a proof on youtube that they are uncountable. Is that wrong?

The proof I reproduced above is the standard one due to Cantor. If that is the one from the YouTube video then it is correct, but other proofs are possible.

I'd regard a decimal number as a number written in base 10. Any decimal which can be written down with a finite number of digits is automatically rational; 0.123, for example, is 123÷1000. I would not regard the restriction to a finite number of digits as necessary to its categorization as a decimal - that is just saying that you are using the standard positional notation for numbers in base 10. Talking about a decimal number with an infinite number of digits is perfectly sound.

Writing out such a number is a supertask, but that in itself isn't any great objection. You could have a Turing machine calculate pi and write the first digit after ½ a second, the next digit ¼ of a second later, and so on. Assuming continuous paper (as opposed to the normal atomistic stuff), it could make each successive digit narrower and narrower so that the whole thing fits on a finite-sized piece of paper.

The Wikipedia page might present the proof better than I did:
http://en.wikipedia.org/wiki/Cantor's_diagonal_argument
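The point that any finite decimal is automatically rational (0.123 = 123÷1000) is easy to check with Python's fractions module, which parses a decimal string straight into a ratio:

```python
from fractions import Fraction

# A decimal with n digits after the point is just (those digits) / 10^n.
x = Fraction('0.123')
print(x)                          # 123/1000
print(x == Fraction(123, 1000))   # True

# Any finite decimal therefore reduces to a fraction whose denominator
# divides a power of ten. 1/3 fails that test, which is why it has no
# finite decimal form.
print(Fraction(1, 3).denominator) # 3, which divides no power of 10
```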

Originally posted by DeepThought The set of rationals is countable because one can produce a procedure to assign an integer to each of them. Suppose the set of reals between 0 and 1 were countable. Then we could write them as an infinitely long list:

(1) 0.a₁₁a₁₂a₁₃a₁₄···
(2) 0.a₂₁a₂₂a₂₃a₂₄···
(3) 0.a₃₁a₃₂a₃₃a₃₄···
etc.
where anm is the mth digit of the nth number - this at ...[text shortened]... onto (for each integer there is a real, surjective) so the set of real numbers is not countable.

Yes this is similar to the proof that confused me.

Originally posted by wolfgang59 Yes I'm familiar with Cantor's argument, elegant and simple.

The "proof" of decimals being uncountable was actually a
proof of the irrationals being uncountable.

Playing blitz, demolishing a bottle of red and watching Maths
youtube vids is obviously beyond my multi-tasking ability!

Sorry for confusion everyone! 😳

Don't worry about it, this forum died a bit of a death recently so it's good to have something to talk about. Just one thing though. The irrationals are indeed uncountable, but the sharper cut is at the transcendentals: algebraic irrationals such as √2 still form a countable set. All numbers which are solutions to polynomial equations with integer coefficients, such as:

3x² - 2x + 1 = 0

or

ax^N + bx^(N-1) + ··· + gx + h = 0

up to some finite, but arbitrary, N are countable. This is because you can produce a counting procedure for the coefficients (which are integers), and each polynomial has no more than N roots. Things like pi are transcendental because they are not solutions to any polynomial equation with integer coefficients and a finite number of terms. Removing this countable set of algebraic numbers from the uncountable reals leaves the transcendentals, which are therefore the uncountable part.
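The counting procedure hinted at here can be sketched: order integer-coefficient polynomials by a 'height' (degree plus the sum of the absolute values of the coefficients), so that each height class is finite and every polynomial appears at exactly one height; since each polynomial has finitely many roots, walking heights 2, 3, 4, ... counts all algebraic numbers. A Python sketch of the polynomial enumeration only (names my own, root extraction omitted):

```python
from itertools import product

def polynomials_of_height(h):
    """All integer-coefficient polynomials (a_n, ..., a_0) with
    degree n >= 1, leading coefficient a_n != 0, and
    n + |a_n| + ... + |a_0| == h. Each class is finite, and every
    polynomial lands in exactly one class - a counting procedure."""
    polys = []
    for n in range(1, h):                       # the degree
        budget = h - n                          # sum of |coefficients|
        for coeffs in product(range(-budget, budget + 1), repeat=n + 1):
            if coeffs[0] != 0 and sum(abs(c) for c in coeffs) == budget:
                polys.append(coeffs)
    return polys

# Height 2: only degree-1 polynomials with |a| + |b| == 1 and a != 0,
# namely -x = 0 and x = 0:
print(polynomials_of_height(2))   # [(-1, 0), (1, 0)]
```

Concatenating the classes for h = 2, 3, 4, ... lists every integer polynomial once, and tagging each root of each polynomial with its position in that list assigns an integer to every algebraic number.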