Math Central: Quandaries & Queries


Question from Ritika, a student:

We say that one divided by zero gives us infinity, so why does zero multiplied by infinity not give us one?

Ritika,

Often when we have an arithmetic operation that we cannot do, we can make it possible by inventing a new type of number for the purpose. So we invent negative numbers to let us "subtract a larger number from a smaller number," fractions to let us do most of the divisions that don't work in the natural numbers, etc. Usually we do this based on the principle that the new numbers should work almost exactly like the old ones.

This principle actually helps us; for instance, it tells us that the number that gives an answer to 2 - 4 must be the same one that gives an answer to 3 - 5, because the old rules of arithmetic say this follows from 2 + 5 = 3 + 4. Even at this level we have to give up a few things: for instance, unlike the natural numbers, neither the integers nor the fractions can be counted through in increasing order (the integers have no smallest member to start from, and between any two fractions there is always another).
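Here is a minimal sketch of that idea in Python (the class and names are my own choices for illustration, not a standard construction in any library): represent a "new" number by a pair of naturals (a, b), read as "a - b", and declare two pairs equal exactly when the old rules of addition demand it.

    class Diff:
        """An integer represented as the formal difference a - b of two naturals."""
        def __init__(self, a, b):
            self.a, self.b = a, b

        def __eq__(self, other):
            # (a, b) equals (c, d) exactly when a + d == b + c,
            # which uses only the old addition of naturals.
            return self.a + other.b == self.b + other.a

    print(Diff(2, 4) == Diff(3, 5))  # True: both stand for -2, since 2 + 5 == 3 + 4

Notice that subtraction of naturals is never needed: equality of the new numbers is tested entirely with the arithmetic we already had.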

But it turns out that to create a new number to be the answer to 1/0 we would have to give up a lot! So much, in fact, that in many contexts we simply say we can't divide by zero: it is easier to let zero be a number without a reciprocal than to allow an "infinity" hedged about with many restrictions.
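Your own question shows exactly what would go wrong. If 1/0 = infinity and the usual rule "a/b = c means b × c = a" still held, then:

    1/0 = infinity  =>  0 × infinity = 1
    2/0 = infinity  =>  0 × infinity = 2
    =>  1 = 2

So 0 × infinity cannot be given any single value without forcing all numbers to be equal; that is why it is left undefined.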

We can allow things (other than 0) to be multiplied by infinity, or added to infinity, with the result being infinity. But if we allow subtraction of one infinity from another, multiplication of infinity by zero, division of infinity by itself, or cancelling of infinities, and then apply the usual rules, we get contradictions. For instance, to adapt an argument of Bertrand Russell:

#({me, Harry Potter}) = 2, where "#(A)" means "the number of elements in the set A"
1 + infinity = 2 + infinity (both sides are simply infinity)
=> 1 = 2 (cancelling the infinities)
=> #({me, Harry Potter}) = 1,
and I am Harry Potter.

Cancelling a multiplication by zero or by infinity leads to the same sort of problem.

Many computer arithmetic systems (the IEEE 754 floating-point standard used by most hardware, for instance) do allow as many operations with infinity as they can, so that a program can fail gracefully when given bad data; but the rules for handling infinity are complicated.
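Here is a quick demonstration in Python, whose floats follow the IEEE 754 rules. The operations described above as safe keep giving infinity; the dangerous ones give NaN ("not a number") rather than a misleading answer.

    import math

    inf = math.inf    # IEEE 754 positive infinity

    # Allowed: these extensions cause no contradictions.
    print(inf + 1)    # inf
    print(2 * inf)    # inf
    print(1 / inf)    # 0.0

    # Forbidden: exactly the operations that lead to contradictions,
    # so the standard returns NaN instead of guessing a value.
    print(inf - inf)  # nan
    print(0 * inf)    # nan
    print(inf / inf)  # nan

    # Python's exact arithmetic takes the simplest route of all:
    # 1 / 0 raises ZeroDivisionError rather than returning inf.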

To sum up: you have to stop extending arithmetic somewhere, and where you stop is to some extent a matter of choice; but the simplest place to stop is usually at banning division by zero and the use of infinity as a number.

Good Hunting!
RD
