Question
A baseball is thrown by a pitcher from the mound and caught by the catcher at home plate, 60 feet 6 inches away. Assume the baseball is thrown at a speed of 101 miles per hour and that it moves in a straight-line path without changing speed (we make these assumptions to keep the problem simple).
(a) If in the process of the throw, before the ball has left the pitcher’s hand, the ball moves over a distance of 1.50 m, what is the acceleration of the ball? Assume that the ball is uniformly accelerated. (b) How long will it take the ball to arrive at the catcher’s position?
Explanation / Answer
Here, the unit conversions we need:
1 ft = 0.3048 m
1 inch = 0.0254 m
1 mph = 0.44704 m/s
net distance to catcher, s = 60 ft 6 in = 60 × 0.3048 + 6 × 0.0254 = 18.288 + 0.1524 = 18.4404 m
velocity of ball, v = 101 mph = 45.151 m/s
Part A:
For distance, d = 1.50 m
From the equation of motion v^2 = u^2 + 2ad, with the ball starting from rest (u = 0):
a = v^2 / (2d)
a = 45.151^2 / (2 × 1.50)
a ≈ 679.5 m/s^2
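The Part A arithmetic can be sketched in a few lines of Python (conversion factor taken from the list above):

```python
# Uniform acceleration from rest over the throw: v^2 = 2*a*d  =>  a = v^2 / (2*d)
v = 101 * 0.44704   # release speed: 101 mph -> 45.151 m/s
d = 1.50            # distance over which the hand accelerates the ball, in m
a = v**2 / (2 * d)
print(f"a = {a:.1f} m/s^2")   # ≈ 679.5 m/s^2
```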
Part B:
Once the ball leaves the pitcher's hand it stops accelerating; by the problem's assumption it covers the full mound-to-plate distance at constant speed. So,
t = s / v
t = 18.4404 / 45.151
t ≈ 0.408 s
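As a quick Python check of the flight time, converting the distance and speed given in the question and dividing (constant-speed motion, t = s/v):

```python
# Convert 60 ft 6 in and 101 mph to SI units (factors from above)
s = 60 * 0.3048 + 6 * 0.0254   # mound-to-plate distance: 18.4404 m
v = 101 * 0.44704              # pitch speed: 45.151 m/s

# After release the ball moves at constant speed, so the flight time is s / v
t = s / v
print(f"t = {t:.3f} s")        # ≈ 0.408 s
```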