Math Central - Quandaries & Queries

Question from Adori, a student:

An airline runs a commuter flight between two cities that are 720 miles apart. If the average speed of the planes is increased by 40 miles per hour, the travel time is decreased by 12 minutes. What airspeed is required to obtain this decrease in travel time?

Hi Adori,

The key here is that average rate, or speed, is distance divided by time. (The units tell you this: miles per hour is miles over hours.) For your problem, suppose the airplane travels at a speed of s miles per hour and takes t hours to fly the 720 miles between the two cities. Since speed is distance divided by time, this gives

s = 720/t miles per hour.

If the speed increases by 40 miles per hour to (s + 40) miles per hour and the distance remains 720 miles, then the time decreases by 12 minutes, which is 12/60 = 1/5 hours, to (t - 1/5) hours. The relationship "speed is distance divided by time" then gives

s + 40 = 720/(t - 1/5) miles per hour.

This gives you two equations in s and t that you can solve, for example by substituting s = 720/t from the first equation into the second.
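
If you want to check your algebra afterwards, here is a minimal sketch (not part of Penny's answer) that solves the same pair of equations, assuming Python with the SymPy library is available; the symbols s and t are the ones defined above.

    # Rough check of the algebra, assuming SymPy is installed.
    from sympy import symbols, Eq, Rational, solve

    # s = original speed in miles per hour, t = original time in hours
    s, t = symbols('s t', positive=True)

    # speed = distance / time for the original flight
    eq1 = Eq(s, 720 / t)

    # 40 mph faster over the same 720 miles takes 1/5 hour (12 minutes) less
    eq2 = Eq(s + 40, 720 / (t - Rational(1, 5)))

    # solve the pair of equations; the increased airspeed is s + 40
    print(solve([eq1, eq2], [s, t]))

Running this should reproduce whatever you get by doing the substitution by hand.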

I hope this helps,
Penny
