05 Mar '16 19:28 · 10 edits
This one has really got me totally stumped:
what is the integral of:
∫[θ = 0, ∞] ( h (θ^m) ) / ( e^( θ(h + s) ) ) dθ
m ∈ ℕ, m > 0,
h, s, θ ∈ ℝ,
h > 0, s ≥ 0, θ ≥ 0
( so all non-negative, with h strictly positive — none are allowed to be negative )
and why?
I tried this in Wolfram Alpha, but it wouldn't give me the answer directly. From the results of a series of reduced versions of the above integral, I indirectly inferred that it implies the answer is:
h ( (m-1)! ) / ( ( h + s )^m )
(bear in mind that m∈ℕ and m>0 else cannot have that factorial )
BUT it seems to me that this MUST be false, because I get complete nonsense when I test it with a computer program.
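One way to test a candidate closed form is to compare it numerically against the integral itself. A minimal sketch (plain Python, composite Simpson's rule; the parameter values h = 2, s = 1.5, m = 3 are arbitrary choices of mine) comparing the integral against the standard gamma-function identity ∫[0,∞] θ^m e^(−aθ) dθ = m!/a^(m+1), applied here with a = h + s:

```python
import math

def integrand(theta, h, s, m):
    # h * theta^m * e^(-theta*(h+s)), the integrand from the question
    return h * theta**m * math.exp(-theta * (h + s))

def numeric_integral(h, s, m, upper=60.0, n=20000):
    # Composite Simpson's rule on [0, upper]; the e^(-theta*(h+s)) tail
    # beyond `upper` is negligible for these parameter values.
    dx = upper / n
    total = integrand(0.0, h, s, m) + integrand(upper, h, s, m)
    for i in range(1, n):
        total += (4 if i % 2 else 2) * integrand(i * dx, h, s, m)
    return total * dx / 3.0

def closed_form(h, s, m):
    # Gamma identity with a = h + s: integral = h * m! / (h+s)^(m+1)
    return h * math.factorial(m) / (h + s) ** (m + 1)

h, s, m = 2.0, 1.5, 3
print(numeric_integral(h, s, m))
print(closed_form(h, s, m))  # the two values should agree closely
```

If the two printed values agree, that supports h·m!/(h+s)^(m+1) rather than h·(m−1)!/(h+s)^m, which would explain why testing the inferred formula produced nonsense.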