Math Central: Quandaries & Queries

Question from Mike:

If a 100-mile trip averages 50 miles per hour, how much of the distance does one need to travel at 90 miles per hour to decrease the travel time by 10%?

Hi Mike.

The original trip takes 100 miles ÷ 50 miles per hour = 2 hours overall. So 10% less time means 90% of 2 hours, which is 0.90 × 2 = 1.8 hours.

For the revised trip, you drive 90 mph for part of the distance, then 50 mph for the rest of the way. Let's say you go x miles at 90 mph. Then you are left with 100 - x miles remaining to cover at 50 mph.

Time equals distance divided by speed, so the 90 mph portion takes x/90 hours and the 50 mph portion takes (100 - x)/50 hours. The total time is the sum of these two, and we want it to equal the target of 1.8 hours:

1.8 = x/90 + (100 - x)/50.
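
If it helps to see how the total time behaves, here is a minimal Python sketch of the expression above (the name total_time is just our label for it, not part of the question):

    def total_time(x):
        # Total hours: x miles at 90 mph, then the remaining
        # 100 - x miles at 50 mph.
        return x / 90 + (100 - x) / 50

    # Trying a few values of x shows the total time shrinking as
    # more of the trip is driven at the faster speed.
    for x in (0, 50, 100):
        print(f"x = {x:>3} miles -> {total_time(x):.3f} hours")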

Can you solve for x from here and finish the question? I would start by multiplying both sides by 450.
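
Once you have a value for x, you can test it by plugging it back into the same formula; the right answer makes the total come out to exactly 1.8 hours. For example, in Python (30 here is only a trial value, not the answer):

    # A trial value: 30 miles at 90 mph plus 70 miles at 50 mph.
    print(30 / 90 + (100 - 30) / 50)   # 1.733... hours, under the
                                       # 1.8-hour target, so 30 miles
                                       # is more than you need at 90 mph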

Cheers,
Stephen La Rocque and Harley.
