PLEASE HELP!! Brainliest to first answer
An archer shoots an arrow into the sky, and the motion of the arrow can be modeled by the equation f(t) = −5t^2 + 20t + 4, where t is time in seconds and f(t) is height in feet. Find how long it will take the arrow to hit the ground using an algebraic method. Round to the nearest tenth if necessary. Show all of the reasoning used with this algebraic method.
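Not a substitute for showing the algebra, but here is a minimal numeric check of the quadratic-formula approach, assuming the model really is f(t) = −5t^2 + 20t + 4 and "hits the ground" means f(t) = 0:

```python
import math

# Coefficients assumed from the question: f(t) = -5*t**2 + 20*t + 4
a, b, c = -5.0, 20.0, 4.0

# The arrow hits the ground when f(t) = 0, so apply the quadratic formula:
# t = (-b ± sqrt(b^2 - 4ac)) / (2a)
disc = b**2 - 4*a*c                 # discriminant: 400 - 4*(-5)*(4) = 480
t1 = (-b + math.sqrt(disc)) / (2*a)
t2 = (-b - math.sqrt(disc)) / (2*a)

# Keep the positive root; the negative time is before the arrow was released
t_ground = max(t1, t2)
print(round(t_ground, 1))           # prints 4.2 (seconds)
```

This just confirms the arithmetic: with a discriminant of 480, the positive root is (20 + √480) / 10 ≈ 4.2 seconds.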