*Originally posted by Ramned*

**Isn't this a matter of ΔV / ΔT to find the acceleration on contact? So you do not need the mass, unless it wants to find the force later.**

a = ΔV / ΔT = -30.5 / 0.0050 = -6100 m/s²

then

F = ma = 0.145 × (-6100) = -884.5 N
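For anyone who wants to check the arithmetic, here is a minimal Python sketch of the same calculation, using the values from the post (a 0.145 kg baseball going from 30.5 m/s to rest in 5.0 ms):

```python
# Average acceleration and force on a baseball during bat contact.
# Values from the thread; signs follow the post's convention (ball decelerating).
dv = -30.5    # change in speed, m/s
dt = 0.0050   # contact time, s
m = 0.145     # baseball mass, kg

a = dv / dt   # average acceleration, m/s^2  (about -6100)
F = m * a     # average force on the ball, N (about -884.5)

print(a, F)
```

The sign just encodes direction: the force on the ball opposes its incoming motion.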

Pretty easy. Is this an intro physics class or something?

Meanwhile, for those looking for some hard-time physics try to solve mine ...[text shortened]... l fall vertically by the time it reached home plate, 60.5 feet away. (*Hint: Convert to meters)?

Wouldn't that depend on whether it was pitched by Cy Young vs. Shaquille O'Neal?

A release point 6 feet up vs. 7 feet up seems to me would make a difference, and no release height is given. What if it is pitched by a much shorter pitcher? Then it's 4 feet vs. 7 feet.

Also, the ball can only start out horizontal; it immediately begins a parabolic curve downwards. So at best, if the ball were thrown in outer space with no downward force, it would take 0.409 seconds to reach home plate.

So the answer depends on the initial pitching height. S=(A(T^2))/2.

Keeping it in feet: 60 mph is 88 ft/s, so 101 mph is (101/60) × 88 = 1.6833 × 88 ≈ 148.13 ft/s, and

(60.5 feet) / (148.13 ft/s) = 0.409 seconds.

So S = (32 × 0.409²)/2 ≈ 2.67, i.e. the ball drops about 2 2/3 feet in that time. It doesn't matter whether it is dropped straight down or pitched; it still falls at 32 ft/s². But it sure matters what height it was released from if it is to land anywhere near the catcher's mitt.

Suppose it was pitched by an 8-foot dude who let go at the top of his swing, so the ball was 12 feet in the air when released: it would still be a bit over 9 feet up as it whizzed past home plate, making it near impossible to catch. If it was pitched by a 6-footer who let loose at shoulder height, about 5 feet up, it would end up at about half that, a perfect height to be caught by the catcher.

Why is it a hint to do the arithmetic in metric?
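The flight-time and drop arithmetic above can be sketched in a few lines of Python, kept in feet as in the post (assuming g = 32 ft/s² and ignoring air resistance, as the post does):

```python
# Horizontal flight time and vertical drop of a 101 mph pitch
# over the 60.5 ft from the mound to home plate (no air resistance).
mph_to_fps = 88.0 / 60.0   # 60 mph = 88 ft/s
v = 101 * mph_to_fps       # pitch speed, ft/s   (about 148.13)
d = 60.5                   # pitching distance, ft
g = 32.0                   # gravitational acceleration, ft/s^2

t = d / v                  # flight time, s      (about 0.409)
drop = g * t**2 / 2        # vertical drop, ft   (about 2.67)

print(t, drop)
```

As the post notes, the ~2.67 ft drop is independent of release height; the release height only shifts where the ball ends up relative to the catcher.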