Originally posted by FreakyKBH
The OP is the question which is put to anyone who has an idea on the topic.
Given the distance of the source, at what angle will the light hit any object, namely, the earth?
It is a spherical wavefront, and you can calculate the difference between the points the light hits at the exact same time. Suppose the sun were putting out pulses a trillionth of a second apart and you could know exactly which pulse you were tracking: you would see the wavefront hit Earth at the closest point first, then reach the edge about 21,000 microseconds later (the edge of the lit disk sits roughly one Earth radius farther along the line of sight).
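A quick sanity check on that delay, assuming an Earth radius of about 3,960 miles and light at about 186,282 miles per second (both round numbers, not anything exact):

```python
# Time for a plane wavefront from a very distant source to sweep
# from the nearest point on Earth to the edge of the lit disk:
# one Earth radius of extra travel at the speed of light.
EARTH_RADIUS_MILES = 3960        # mean radius, rounded
LIGHT_SPEED_MI_PER_S = 186_282   # miles per second, rounded

delay_s = EARTH_RADIUS_MILES / LIGHT_SPEED_MI_PER_S
delay_us = delay_s * 1e6
print(f"delay: {delay_us:,.0f} microseconds")  # roughly 21,000
```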
But the angle you can do pretty simply. Earth's orbit around the sun is about 584 million miles in circumference (2 pi times 93 million miles), close enough for government work. Earth is about 8,000 miles wide, so across Earth is a change in angle of about 1 part in roughly 73,000; in other words, you could fit about 73,000 Earths side by side around the circle of Earth's orbit. One arc second cuts a circle into 1,296,000 parts (360 degrees * 60 minutes * 60 seconds of arc), so divide that by 73,000 and you get a difference of roughly 18 arc seconds from one side of Earth to the other if you point a very accurate caliper at the sun and measure the angles. So from dead center of Earth (the closest point on Earth to the sun, with a line drawn from there to the closest point on the sun, center line of Earth to center line of Sol), the angle would be +/- about 9 arc seconds. This is a back-of-the-envelope thing, I didn't look up exact numbers, but these numbers are fairly close.
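The same estimate done the direct way, Earth's diameter over the Earth-to-sun distance converted to arc seconds (the 7,918-mile and 93-million-mile figures are just round numbers):

```python
import math

EARTH_DIAMETER_MILES = 7918        # equatorial, rounded
SUN_DISTANCE_MILES = 93_000_000    # round-number AU
ARCSEC_PER_RADIAN = 3600 * 180 / math.pi  # about 206,265

# Small-angle approximation: angle (radians) = width / distance
angle_rad = EARTH_DIAMETER_MILES / SUN_DISTANCE_MILES
angle_arcsec = angle_rad * ARCSEC_PER_RADIAN
print(f"{angle_arcsec:.1f} arc seconds across Earth, "
      f"+/- {angle_arcsec / 2:.1f} from the center line")
```

Run it and you land right around 18 arc seconds total, +/- 9 from center, which matches the orbit-circumference version above.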
But why is this of interest?
My guess is you will tie this to the flat Earth somehow, perhaps saying: see, the sun is, say, 3,000 miles above Earth and subtends an angle of 18 arc seconds, so it is OBVIOUSLY about 0.3 miles across. Something like that? Saying, see, you can't tell the difference between it being 93 million miles away vs 3,000 miles. Is that your point?
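For what it's worth, the arithmetic inside that hypothetical is at least internally consistent; a quick check (the 3,000-mile altitude is the flat-Earth figure, not mine):

```python
import math

ARCSEC_PER_RADIAN = 3600 * 180 / math.pi  # about 206,265

# Hypothetical flat-Earth setup: a sun 3,000 miles up
# subtending 18 arc seconds.
altitude_miles = 3000
angle_arcsec = 18

implied_size = altitude_miles * (angle_arcsec / ARCSEC_PER_RADIAN)
print(f"implied diameter: {implied_size:.2f} miles")  # about 0.26
```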