A person standing at the top of a hemispherical rock of radius R = 11.00 m kicks a ball (initially at rest on top of the rock) to give it a horizontal velocity v_i.
1) What must be its minimum initial speed if the ball is never to hit the rock after it is kicked?
2) With this initial speed, how far from the base of the rock does the ball hit the ground?
I'm a bit confused.
What I'm doing is just finding the time of flight, t = sqrt(2*11/9.8), which is roughly 1.5 s, and then dividing the horizontal radius of 11 m by this time to get the "minimum initial speed for the ball never to hit the rock".
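Just to make my working concrete, here's the calculation I'm describing as a quick Python sketch (same numbers as above, nothing extra):

```python
import math

g = 9.8    # m/s^2, gravitational acceleration
R = 11.0   # m, radius of the rock = fall height from the top = horizontal distance to the base

# Time to fall a height R from rest: R = (1/2) * g * t^2
t = math.sqrt(2 * R / g)      # roughly 1.5 s

# My (possibly wrong) reasoning: the ball must cover the horizontal
# distance R out to the base of the rock in that fall time
v_min = R / t                 # roughly 7.3 m/s

# Landing point measured from the base with that speed
distance_from_base = v_min * t - R   # comes out to 0
print(t, v_min, distance_from_base)
```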
But then the second question asks how far from the base the ball lands, and with my answer that distance is 0.
So clearly I'm doing something wrong. Any thoughts?