My name is Zac and I need help with a 9th grade algebra question I've been stuck on for two days. It's for a math project that's due tomorrow.

A computer is advertised as having a processing speed of 11 million instructions per second. On average, how long does it take to process one instruction at such a speed?

Hi Zac,

Maybe it would be easier for you to see how to find the answer if we solved the same problem using smaller numbers.

Suppose the computer could only process 2 instructions per second. Do you see that it takes the computer, on average, 1/2 of a second to perform each one of those instructions?

2 instructions x 1/2 second each = 1 second

Now suppose the computer could process 3 instructions per second. Now it takes the computer, on average, 1/3 of a second to perform each one of those instructions.

3 instructions x 1/3 second each = 1 second

Now suppose the computer could process 100 instructions per second.

100 instructions x 1/100 second each = 1 second

Now suppose the computer could process 11 million instructions per second ... (use the pattern you observed in the previous examples).
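
If you want to double-check your answer after you have worked out the pattern, the short sketch below (written in Python, purely as an illustration and not something the original question asks for) runs the same calculation for each rate we used above, including 11 million instructions per second.

# Check the pattern: at a rate of r instructions per second, each
# instruction takes 1/r of a second on average, and r times (1/r)
# comes back to 1 second (up to tiny floating-point rounding).
rates = [2, 3, 100, 11_000_000]   # instructions per second
for rate in rates:
    seconds_each = 1 / rate       # average seconds per instruction
    print(rate, "instructions per second ->", seconds_each, "second each;",
          "check:", rate * seconds_each, "second")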

Hope this helps.

Leeanne