18 Apr '15 10:36>5 edits
This seems to me like a fairly simple maths problem and yet, somehow, it has me frustratingly and completely stumped:
Suppose we observed how often a completely random event happened over an arbitrarily 'large' number of seconds, say, a trillion seconds.
So, using conventional probability terminology, we are using that observed trillion seconds as the sample space and one second as our unit of measure.
There is no pattern to when the event occurs, so in any given one-second period there is a non-zero probability of no event occurring, a non-zero probability of the event occurring exactly once, a non-zero probability of it occurring twice, and so on ad infinitum. (Let's say the event can only occur a finite number of times in each second, but with no definable finite upper limit.)
Suppose, on average, we observe the event occurring with some average frequency F of v events over s seconds. So we know the approximate value of F, and:
F = v/s.
Suppose F > 0 but F may or may not be greater than 1.
Now suppose we completely randomly pick just one of those one second periods.
Even if F >> 1, there must still be a small but non-zero probability that there will be no event in that second.
So what is the correct algebraic equation (or a reasonable numerical approach, if no such algebraic equation exists) for at least a good approximation of the probability of no event occurring in our randomly chosen second?
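One way to explore the question numerically is a direct simulation of the setup described above: scatter F*T events uniformly at random over T one-second bins (a standard model for "completely random" events at a constant average rate, which is an assumption not stated in the question), then measure the fraction of bins containing no event. This is a sketch, not a definitive answer; the function name and parameters are mine.

```python
import math
import random

def empty_second_fraction(F, T, seed=0):
    """Scatter round(F*T) events uniformly at random over T one-second
    bins and return the observed fraction of bins with no event."""
    rng = random.Random(seed)
    counts = [0] * T
    for _ in range(round(F * T)):
        counts[rng.randrange(T)] += 1
    return sum(c == 0 for c in counts) / T

# With F = 3 events per second over T = 100,000 seconds, the empty-bin
# fraction should be close to exp(-3), since (1 - 1/T)^(F*T) -> e^-F
# as T grows large.
print(empty_second_fraction(3, 100_000))
print(math.exp(-3))
```

Under this uniform-scatter assumption, the chance a given second misses all F*T events is (1 - 1/T)^(F*T), which tends to e^(-F) for large T, so the simulation gives a way to check any proposed algebraic formula against the observed frequency F.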