1. Joined
    06 Mar '12
    Moves
    642
    21 Oct '15 10:3611 edits
    I have invented a new kind of probability density function, but I am not sure what kind of probability it represents in this case, so I wonder if someone here would form an opinion on this.

    Let the notation we use here be:

    density(x) means the probability density of continuous random variable x

    cumulative(x) means probability density of random variable x

    The general equation for my new kind of probability density function is:

    density( cumulative(h) = p ) = 1 where 0≤p≤1
    and h is an observed value (in this case, necessarily the first and only observed value) of the continuous random variable x in the sample space.

    The equation reads as "the probability density of the cumulative of the observed h equaling p is 1".

    Note that this general equation is deliberately generic: it doesn't specify which continuous probability distribution cumulative(h) belongs to. It could be any continuous distribution you like, although, obviously, to actually use the equation you must specify one.
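
    One standard fact may help make this concrete (the exponential distribution in the sketch below is just an arbitrary illustrative choice, not part of the argument above): if you draw h from a known continuous distribution and feed it back through that same distribution's cumulative, the value cumulative(h) is spread uniformly over [0,1], i.e. it has density 1 there. A short Python check:

    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(0)

    # Draw many observations h from a continuous distribution (exponential with
    # rate 1, chosen purely for illustration) and push each one back through
    # that same distribution's cumulative.
    h = rng.exponential(scale=1.0, size=100_000)
    p = stats.expon(scale=1.0).cdf(h)  # p = cumulative(h)

    # If density( cumulative(h) = p ) = 1 on [0,1], a histogram of p is flat.
    counts, _ = np.histogram(p, bins=10, range=(0.0, 1.0), density=True)
    print(counts)  # each entry should be close to 1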

    But, for starters, I am not sure whether this " density( cumulative(h) = p ) = 1 " can be defined as a prior probability density or a posterior probability density. Although it can only be defined after observing the evidence h, I cannot see how it could come from the equation for posterior probability (which I take to be P(H|E) = P(E|H) P(H) / P(E), where E = evidence and H = hypothesis), because that equation seems to assume that a prior probability (a prior probability density in this case) exists, and yet, in this case, I can tell you that no probability or probability density exists prior to observing h!
    Instead, this " density( cumulative(h) = p ) = 1 " is meant to be taken as a first principle, so it doesn't need to be derived from anything else, and in particular doesn't need to be derived from the equation for posterior probability.
    So, can a probability still be called a "posterior probability" even if it cannot be derived from the equation for posterior probability? If not, does that mean that " density( cumulative(h) = p ) = 1 " is neither a prior nor a posterior probability? If so, what kind of probability would you call it, if neither of those two kinds?
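
    For reference, the standard posterior-probability equation referred to above, with its terms labelled (just the textbook form, written out in LaTeX for clarity):

    \[
    \underbrace{P(H \mid E)}_{\text{posterior}}
      = \frac{\overbrace{P(E \mid H)}^{\text{likelihood}} \; \overbrace{P(H)}^{\text{prior}}}
             {\underbrace{P(E)}_{\text{evidence}}}
    \]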

    I am also not sure whether, in this case, I have defined a likelihood function or a probability that isn't a likelihood function. Which sort is it, and why?

    Any insight or opinion would be appreciated.
  2. Joined
    06 Mar '12
    Moves
    642
    21 Oct '15 12:15
    misprint:

    "cumulative(x) means probability density of random variable x "

    should be

    "cumulative(x) means cumulative of random variable x "
  3. Joined
    06 Mar '12
    Moves
    642
    21 Oct '15 14:318 edits
    Originally posted by humy

    I am also not sure whether, in this case, I have defined a likelihood function or a probability that isn't a likelihood function. Which sort is it, and why?
    Just had another think about this and, unless I am missing something, this is a lot simpler than I thought: this equation is not a likelihood function, assuming I currently understand the term 'likelihood function' correctly.
    My current line of reasoning is:
    this equation isn't for density(h) but rather for density( cumulative(h) = p ).
    To be a likelihood function, it must be of the form P(E|H), not P(H|E).
    Here hypothesis H is:

    H = "cumulative(h) = p"

    And evidence E is simply:

    E = h

    But that would mean, because it is density( cumulative(h) = p ) and not density(h), it is the probability density of H given E, NOT the probability density of E given H.
    Therefore, this is P(H|E) and not P(E|H), and thus it is not a likelihood function. Simple (I think; hope I got that right and haven't just made a complete idiot of myself).
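
    (For reference, the textbook distinction relied on here: the likelihood is the probability of the evidence viewed as a function of the hypothesis, with the evidence held fixed, whereas the posterior is the probability of the hypothesis given the evidence.)

    \[
    \mathcal{L}(H; E) = P(E \mid H) \quad \text{(likelihood: } E \text{ fixed, } H \text{ varies)},
    \qquad
    P(H \mid E) \quad \text{(posterior)}
    \]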

    But is it correct to call it a "posterior probability" even though it cannot possibly be derived from the equation for posterior probability, because no prior probabilities exist?
  4. Standard memberDeepThought
    Losing the Thread
    Quarantined World
    Joined
    27 Oct '04
    Moves
    87415
    21 Oct '15 16:35
    Hi Humy, I don't get what you're doing, or at least what is different about what you are doing. A concrete example will probably help. Can you apply this to a single radioactive atom? In other words, try deriving the rule that the probability that the atom has not decayed drops exponentially with time.
  5. Joined
    06 Mar '12
    Moves
    642
    21 Oct '15 19:2414 edits
    Originally posted by DeepThought
    Can you apply this to a single radioactive atom?
    No. h has got to be a specific observed value of a continuous random variable (such as, say, 2.234), not merely a single event/non-event.

    Say you have a load of live electric wires, each with a different voltage, and it is this voltage that is the continuous random variable x we are considering. Let's say you are given information from a trustworthy source that the voltages in the wires follow, say, a continuous uniform distribution with a lower limit of 0 volts and some maximum upper limit of voltage, but you are not told what that upper limit is, only that it exists.

    You then randomly pick one of these live wires and make a measurement with a voltmeter. Say the reading you get just happens to be exactly 1.5 volts. In this context that means:

    h = 1.5

    hence the equation:

    density( cumulative(h) = p ) = 1 where 0≤p≤1

    becomes:

    density( cumulative(1.5) = p ) = 1 where 0≤p≤1

    and now what that equation says about the cumulative of h, i.e. cumulative(1.5), is this: from your limited information, you should assume that the probability of cumulative(1.5) being between, say, 0.1 and 0.2 of the actual distribution of x (so the unknown upper limit of the uniform distribution is somewhere between 1.5/0.1 = 15 volts, corresponding to that 0.1, and 1.5/0.2 = 7.5 volts, corresponding to that 0.2) is the same as the probability of cumulative(1.5) being between, say, 0.2 and 0.3 of the actual distribution of x, with that probability in each case being 0.1.
    Likewise, the probability of cumulative(1.5) being less than half (< 0.5) of the actual distribution is the same as the probability of it being more than half (> 0.5), with that probability in each case being 0.5; so the probability of 1.5 being above the median of the actual distribution is the same as the probability of it being below the median. In other words, for any given width of the range that cumulative(1.5) might lie within, the probability is the same regardless of, i.e. independent of, where that range starts and ends, and that is why my equation says the probability density of "cumulative(h) = p" is 1.
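
    To spell out the arithmetic in the paragraph above, a short Python sketch (the variable names are just illustrative):

    # Arithmetic from the wire example: h = 1.5 volts observed, x assumed
    # uniform on [0, upper] with 'upper' unknown.
    # If cumulative(1.5) = p, then p = 1.5 / upper, so upper = 1.5 / p.
    h = 1.5
    for p_lo, p_hi in [(0.1, 0.2), (0.2, 0.3), (0.0, 0.5), (0.5, 1.0)]:
        upper_hi = h / p_lo if p_lo > 0 else float("inf")  # smaller p -> larger upper limit
        upper_lo = h / p_hi
        prob = p_hi - p_lo  # under density( cumulative(h) = p ) = 1 on [0,1]
        print(f"p in [{p_lo}, {p_hi}]: upper limit in [{upper_lo:g}, {upper_hi:g}] volts, probability {prob:g}")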

    I hope I haven't made any part of that confusing; I confess I struggle somewhat to explain that concisely.

    P.S. I'm still not sure, but I now think it probably is correct to call it a "posterior probability" even though it cannot be derived from the standard equation for posterior probability. If that is right, I think I ought to invent a special name for this new kind of posterior probability. It is "new" because: 1, it cannot come from the standard equation for posterior probability; 2, unusually for a legitimate probability, no prior probabilities exist for the evidence; thus 3, it doesn't come from updating priors.
  6. Standard memberDeepThought
    Losing the Thread
    Quarantined World
    Joined
    27 Oct '04
    Moves
    87415
    22 Oct '15 02:27
    Originally posted by humy
    No. h has got to be a specific observed value of a continuous random variable (such as, say, 2.234 ) , not merely a single event/non-event.

    Say you have a load of live electric wires, each with a different voltage and it is this voltage that is the continuous random variable x we are considering. Lets say you are given info from a trustworthy source that th ...[text shortened]... re exists no prior probabilities for the evidence thus; 3, it doesn't come from updating priors.
    What do you mean by cumulative? Are you trying to assign probabilities to outcomes based on data? I'd assumed you meant it in some integration-over-a-probability sense.
  7. Joined
    06 Mar '12
    Moves
    642
    22 Oct '15 07:417 edits
    Originally posted by DeepThought
    What do you mean by cumulative?
    Just the standard, conventional meaning of 'cumulative', i.e. as in:

    https://en.wikipedia.org/wiki/Cumulative_distribution_function

    I don't attach any special non-standard meaning to the word 'cumulative', if that is what you are thinking?

    From that link, "...The cumulative distribution function of a real-valued random variable X is the function given by

    F_X(x) = P(X ≤ x)

    ..."
    (the X in "F_X" is a subscript, which I cannot format correctly in here)

    and what I mean by "cumulative(x)" is exactly the same as that "F_X(x)" definition above; I just used my own different, more wordy notation.
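
    As a concrete illustration of that standard meaning (the uniform distribution on [0, 10] below is chosen arbitrarily, just to show the notation), a short Python sketch:

    import numpy as np
    from scipy import stats

    # cumulative(x) here is just the standard CDF, F_X(x) = P(X <= x).
    dist = stats.uniform(loc=0.0, scale=10.0)  # X uniform on [0, 10]
    print(dist.cdf(1.5))  # 0.15, i.e. P(X <= 1.5)

    # Cross-check against the fraction of random draws landing at or below 1.5.
    rng = np.random.default_rng(1)
    samples = dist.rvs(size=100_000, random_state=rng)
    print((samples <= 1.5).mean())  # close to 0.15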


    Are you trying to assign probabilities to outcomes based on data?

    Yes, exactly that.
    Incidentally, in this case, it necessarily has to be just one piece of data, from just the first observation of an instance of x (which I call h here). As soon as another observation of x is made from the same sample space, my probability density equation (i.e. " density( cumulative(h) = p ) = 1 where 0≤p≤1 ") no longer applies; a different and much more complex equation that I have deduced applies instead, although you don't need to know any of that to understand my probability density equation.
  8. Joined
    06 Mar '12
    Moves
    642
    22 Oct '15 08:443 edits
    I think my OP is pretty much redundant now, as I think I have now pretty much worked out the answers. It isn't a likelihood function but is a posterior probability; however, it is a new kind of posterior probability, fundamentally different from any conventional posterior probability, as it doesn't obey the same rules of probability. I will come up with a special name for it in due course.