



 
Ritika,

Often when we have an arithmetic operation that we cannot do, we can make it possible by inventing a new type of number for the purpose. So we invent negative numbers to let us "subtract a larger number from a smaller number," fractions to let us do most of the divisions that don't work in the natural numbers, and so on. Usually we do this based on the principle that the new numbers should work almost exactly like the old ones. This actually helps us; for instance, it tells us that the number that gives an answer to 2 - 4 must be the same one that gives an answer to 3 - 5, because the old rules of arithmetic say this follows from 2 + 5 = 3 + 4. Even at this level we have to give up a few things: for instance, neither the integers nor the fractions let us count through them in increasing order.

But it turns out that to create a new number to be the answer to 1/0 we would have to give up a lot! In fact, so much that in many contexts we say we simply can't divide by zero; it is easier to allow zero to be a number without a reciprocal than to allow "infinity" with many restrictions. We can allow things (other than 0) to be multiplied by infinity or added to infinity (giving infinity). But if we allow subtraction of two infinities, multiplying infinity by zero, dividing infinity by itself, or cancelling infinities, and apply the usual rules, we get contradictions: for instance, to adapt an argument of Bertrand Russell, since infinity + 1 = infinity and infinity + 2 = infinity, subtracting infinity from both would give 1 = 2.
Cancelling multiplications by zero or infinity gives the same sort of problem. Many computers do allow as many operations with infinity as they can, to permit "graceful failure" when bad data are given; but the rules for handling it are complicated.

To sum up: you have to stop extending arithmetic somewhere; where you do so is to some extent optional, but the simplest place is usually to ban division by zero and the use of infinity as a number.

Good Hunting!
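P.S. The computer behaviour mentioned above is standardized in IEEE-754 floating-point arithmetic, which most programming languages expose. A small Python sketch (purely illustrative) shows exactly the pattern in the answer: operations that have a sensible single value give infinity, while the forbidden combinations give "NaN" (not a number) rather than a contradiction:

```python
import math

inf = math.inf  # the IEEE-754 infinity value

# Operations the answer says are safe to allow:
print(inf + 1)    # infinity plus a number is still infinity
print(2 * inf)    # a nonzero number times infinity is infinity

# Operations the answer says must be given up;
# IEEE-754 makes each of them NaN instead of a number:
print(inf - inf)  # subtracting two infinities
print(0 * inf)    # multiplying infinity by zero
print(inf / inf)  # dividing infinity by itself
```

Note that the safe operations stay infinite, while each forbidden one produces NaN, a special "no answer" marker that lets a long computation fail gracefully instead of crashing.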


Math Central is supported by the University of Regina and The Pacific Institute for the Mathematical Sciences. 