Originally posted by David113
Prove:
If a, b, c are integers such that |a| < 10^32, |b| < 10^32, |c| < 10^32, and a, b, c are not all zero, then
|a*sqrt(2) + b*sqrt(3) + c*sqrt(5)| > 10^(-100)
Quite an interesting problem! I've had a few ideas so far, but none of them is any good. 😕 Any hints? Here's what I have so far.
The form of the problem reminds me of the triangle inequality, which, coupled with some sort of squeezing argument, was my first guess at how to solve it. However, the inequality signs are pointing the wrong way, so I don't think the triangle inequality is going to help much. 😕
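To spell out the wrong-direction issue (this is just the triangle inequality combined with the problem's bounds, nothing new):

```latex
\left|a\sqrt{2} + b\sqrt{3} + c\sqrt{5}\right|
  \le |a|\sqrt{2} + |b|\sqrt{3} + |c|\sqrt{5}
  <   10^{32}\left(\sqrt{2} + \sqrt{3} + \sqrt{5}\right)
```

which is an upper bound, while the problem demands a lower bound of 10^(-100).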
My next idea was to look at the problem as a Diophantine equation, and see if any ready-made Diophantine results came pre-packaged for this problem. I didn't find any, but I did stumble across the concept of Diophantine approximation (approximating irrational numbers by rational ones) and the methods used to gauge just how "good" such approximations can be. I figure that if I can find some expression approximating 0, and then prove that, given the bounds of the problem, the approximation can only estimate 0 with an error greater than 10^(-100), then I will have a fully formed solution. The trick is figuring out how to do all that. 😕
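To get a feel for the approximation-to-0 idea, here's a quick numerical probe (emphatically not a proof, and the helper name `smallest` and the bound `N` are my own throwaway choices): it brute-forces how small |a*sqrt(2) + b*sqrt(3) + c*sqrt(5)| can get as the coefficient bound grows.

```python
# Numerical probe: how small can |a*sqrt(2) + b*sqrt(3) + c*sqrt(5)| get
# for nonzero integer triples with |a|, |b|, |c| <= N?
# Double precision is fine here, since for small N the minima are far
# larger than the ~1e-13 rounding error of the sum.
import math
import itertools

S2, S3, S5 = math.sqrt(2), math.sqrt(3), math.sqrt(5)

def smallest(N):
    """Minimum of |a*S2 + b*S3 + c*S5| over nonzero triples with entries in [-N, N]."""
    best = None
    for a, b, c in itertools.product(range(-N, N + 1), repeat=3):
        if (a, b, c) == (0, 0, 0):
            continue
        v = abs(a * S2 + b * S3 + c * S5)
        if best is None or v < best:
            best = v
    return best

for N in (1, 2, 5, 10):
    print(N, smallest(N))
```

Watching how slowly the minimum shrinks as N grows is exactly the kind of quantitative "how good can the approximation be" question that Diophantine approximation is about; the real task is turning that empirical trend into a proven bound for N up to 10^32.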
My latest idea was to use calculus to find the stationary points of a real-valued function matching the given one, evaluate the function at those stationary points, and see if I could exhaust the values at the surrounding lattice points to determine by brute force that the given inequality must hold. However, my thoughts are a bit muddled on this one at present, because the gradient is constant: it's ±(sqrt(2), sqrt(3), sqrt(5)), with the sign flipping across the plane where the expression vanishes (so really only two regions, not one per "orthant" - had to look up that word! 🙂), which doesn't help to narrow down the search one bit. 😕
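Here's a tiny sanity check of that constant-gradient observation, treating f(a, b, c) = |a*sqrt(2) + b*sqrt(3) + c*sqrt(5)| as a function of real variables (`num_grad` is just my throwaway central-difference helper, not anything standard):

```python
# Sanity check: away from the plane a*sqrt(2) + b*sqrt(3) + c*sqrt(5) = 0,
# the gradient of f(a,b,c) = |a*sqrt(2) + b*sqrt(3) + c*sqrt(5)| should be
# the constant vector +/-(sqrt(2), sqrt(3), sqrt(5)), with the sign given
# by the sign of the linear form at that point.
import math

W = (math.sqrt(2), math.sqrt(3), math.sqrt(5))

def f(p):
    return abs(sum(w * x for w, x in zip(W, p)))

def num_grad(p, h=1e-6):
    """Central-difference estimate of the gradient of f at p."""
    g = []
    for i in range(3):
        q = list(p); q[i] += h
        r = list(p); r[i] -= h
        g.append((f(q) - f(r)) / (2 * h))
    return g

for p in [(1.0, 1.0, 1.0), (-3.0, 0.5, 0.2), (2.0, -1.0, -1.0)]:
    sign = 1 if sum(w * x for w, x in zip(W, p)) > 0 else -1
    print(p, [round(gi, 6) for gi in num_grad(p)],
          "expected", [round(sign * w, 6) for w in W])
```

Since the gradient never vanishes, there are no interior stationary points at all, which is why this approach stalls: the infimum over any box is approached along the plane where the expression is zero, not at a critical point.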
Of course, this problem may end up being solved by a much simpler method by manipulating the inequality appropriately. If so, I'm all ears. Any help is appreciated! 🙂