Hi Nicole.

Let's first try to guess. If the distance were 100 miles, then it took two hours to drive there at 50 mph, right? You get the time by dividing the distance by the speed: 100/50 = 2. On the trip home, it would take 100/55 = 1.82 hours, which isn't a 30-minute (0.5 hour) difference. Guessing helped us understand what we need to do to find an answer.

Now that we know how to calculate things, we can create an equation. Let d = the distance from home to the beach. Then d/50 is the time it takes to get to the beach at 50 mph, and d/55 is the time it takes to get home at 55 mph. The difference between these times is 0.5 hours, and "difference" means subtraction. Therefore,

(d/50) - (d/55) = 0.5.

Can you solve the problem from here?

Hope this helps,
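If you like, the guessing step above can be automated. Here is a small sketch in Python (the function name `time_difference` and the default speeds are just illustrative choices, not part of the original problem) that computes, for any trial distance d, the gap between the two travel times; you can feed it guesses until the gap comes out to 0.5 hours.

```python
def time_difference(d, speed_there=50, speed_home=55):
    """Return (time to get there) - (time to get home), in hours,
    for a trial distance d in miles."""
    return d / speed_there - d / speed_home

# The guess from the letter: 100 miles gives 2 - 1.82 = about 0.18 hours,
# which is not the 0.5-hour difference we need, so 100 miles is too short.
print(time_difference(100))
```

Trying larger and larger distances narrows in on the same answer the equation gives you.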
Math Central is supported by the University of Regina and The Pacific Institute for the Mathematical Sciences.