Ok, this one requires only a bit more mathematics than the last two. It is a classic example, introducing a concept called "asymmetric information".
There are 2 players, P1 and P2.
P1 owns a business. The business is worth V to P1.
P2 is considering buying the business. P2 values the business at 1.5V (maybe P2 is a shrewder entrepreneur and thus can get more out of the business, or P1 is subject to a 33% tax on earnings from which P2 is exempt).
Here's where asymmetric information comes in. Only P1 knows V. P2 only knows that V is uniformly distributed on the interval [0,1].
So P1 knows V.
P2 knows V~U(0,1).
Now P2 makes P1 a take-it-or-leave-it offer for the business. If the offer is greater than or equal to V, P1 will accept. If the offer is less than V, P1 will refuse.
What offer should P2 make to P1?
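If it helps to see the rules in code, here is a minimal sketch of a single round (Python; the offer x = 0.5 is just an arbitrary example, not a hint at the answer):

import random

V = random.uniform(0, 1)       # P1's private value, uniform on [0,1]; only P1 sees this
x = 0.5                        # P2's take-it-or-leave-it offer (arbitrary example)

if x >= V:                     # P1 accepts any offer at or above V
    p1_payoff = x - V          # P1 pockets the premium over his own value
    p2_payoff = 1.5 * V - x    # P2 gets a business worth 1.5V to him and pays x
else:                          # offer below V: P1 refuses, nothing changes hands
    p1_payoff = 0.0
    p2_payoff = 0.0

print(V, p1_payoff, p2_payoff)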
Originally posted by telerion
could you please explain your notation?
What does V~U(0,1) mean? Keep in mind that I read your entire post and still couldn't figure it out.
Originally posted by The Plumber
If you have no idea what it means you should just leave this thread. It won't get any easier 😀
Some of us don't understand what "distributed as uniform on the interval [0,1]" means.
Anyway, uniformly distributed means that all numbers in the interval have equal chance of being hit, or whatever the word is.
Hmm, how about this:
Suppose P1 offers to sell at price x. Then P2 knows that x is greater than or equal to V, as P1 wouldn't sell at a loss, so V must lie somewhere between 0 and x. Since this is a one-off transaction, P2 has no reason to believe that P1 is not trying to oversell the business, so from P2's point of view V~U[0,x] (or U[0,1] if x > 1). The business therefore has an expected value to P2 of 1.5 times the average of that interval, i.e. 3x/4 or 3/4, whichever is less, so he declines the offer.
=> No transaction occurs at any price.
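In case the 3x/4 step isn't obvious: conditional on V lying below x, its average is x/2, and 1.5 * x/2 = 3x/4. A quick numerical check (Python; x = 0.8 is an arbitrary example):

import random

x = 0.8                                          # arbitrary example price
draws = [random.uniform(0, 1) for _ in range(200000)]
below = [v for v in draws if v <= x]             # only the values consistent with a sale at x
value_to_p2 = 1.5 * sum(below) / len(below)      # average worth of such a business to P2
print(value_to_p2, 3 * x / 4)                    # the two numbers should be close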
You're thinking along the correct lines. This is where you go astray just a bit.
Originally posted by Acolyte
Suppose P1 offers to sell at price x.
But notice from the OP that P2 is the player making the offer to P1. Amend your reasoning with this in mind, and I'm sure you will have it in a snap. Good work by the way.
Originally posted by telerion
😳 Oops, misread the question.
Suppose P2 offers to buy at x. Then P1 will only sell if V ≤ x, so on average, if P1 does sell, P2 will get something worth 3x/4 or 3/4 to him, whichever is lower, for a net profit of -x/4 or worse. If P1 doesn't sell then of course P2 breaks even, so overall for any x > 0, P2 has an expected loss. Therefore P2 should offer nothing for the business.
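A quick Monte Carlo sketch backs this up (Python; the offers tried are arbitrary examples). Conditional on a sale, P2's average profit comes out near -x/4, and averaged over all attempts it is about -x^2/4, so every positive offer loses money in expectation:

import random

def p2_profits(x, n=200000):
    # simulate n businesses; P1 sells whenever his value V is at most the offer x
    profits = [1.5 * v - x for v in (random.uniform(0, 1) for _ in range(n)) if v <= x]
    given_sale = sum(profits) / len(profits)     # average profit when a sale happens
    overall = sum(profits) / n                   # average profit over all n attempts
    return given_sale, overall

for x in (0.2, 0.5, 0.9):
    given_sale, overall = p2_profits(x)
    print(x, given_sale, -x / 4, overall, -x * x / 4)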
Bingo!
I should say just a bit about the puzzle. Since P2 values the business more than P1 does, trade would occur under complete information (ceteris paribus, and assuming P2 can afford V). Since it is a take-it-or-leave-it offer, the bid would equal V. If they could negotiate, then the accepted bid would lie somewhere between V and 1.5V. In this case, the uncertainty built into the model prevents trade.
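For the record, here is Acolyte's argument written out in symbols, for an offer 0 ≤ x ≤ 1:

\[
E[\pi_2(x)] \;=\; \Pr(V \le x)\,\bigl(1.5\,E[V \mid V \le x] - x\bigr)
\;=\; x\left(\frac{3x}{4} - x\right) \;=\; -\frac{x^2}{4} \;\le\; 0,
\]

with equality only at x = 0, so P2's best offer is zero and trade never happens. Under complete information P2 would simply bid V and pocket 1.5V - V = 0.5V.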