Math Central: Quandaries & Queries


Question from Douglas:

I realize that $0^a = 0$ if $a > 0$ and is undefined if $a \leq 0.$

I have read that $0^{bi}$ is undefined for all $b.$

What I don't understand is why $0^{a+bi} = 0$ if $a$ and $b$ are not equal to zero.

Is this purely by definition or is there a logical reason why this is the case?
(I have taken Complex Analysis, so have a fairly good understanding of complex numbers.)

Thank you so much for taking the time!

Hi Douglas,

I assume that $a$ and $b$ are real. Then, at least formally,

\[0^{a + bi} = 0^a \times 0^{bi}.\]

If $a > 0$ then $0^a = 0$ and hence

\[0^{a + bi} = 0^a \times 0^{bi} = 0 \times 0^{bi} = 0. \]
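Read literally, the middle step multiplies $0$ by the undefined quantity $0^{bi},$ so it is really shorthand for a limiting argument. For real $x > 0,$

\[ \left| x^{a+bi} \right| = \left| x^a \, e^{ib \ln x} \right| = x^a \left| e^{ib \ln x} \right| = x^a, \]

and $x^a \to 0$ as $x \to 0^+$ when $a > 0.$ Since the modulus of $x^{a+bi}$ tends to $0,$ so does $x^{a+bi}$ itself, and this limit is what justifies defining $0^{a+bi} = 0.$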

If $a = 0$ then $a+bi = bi$ and hence

\[0^{a + bi} = 0^{bi}, \mbox{ which is undefined:}\]

for $x > 0,$ $x^{bi} = e^{ib \ln x}$ stays on the unit circle while its argument $b \ln x$ grows without bound as $x \to 0^+,$ so $x^{bi}$ has no limit. If $a < 0$ then $0^a$ is itself undefined, just as for real exponents.

Hence if $a$ and $b$ are real then

\[0^{a + bi} = \left\{ \begin{array}{ll} 0 & \mbox{if $a > 0$} \\ \mbox{undefined} & \mbox{if $a \leq 0$} \end{array} \right.\]
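If you would like to see this numerically, here is a quick check in Python (a sketch only; the values $a = 0.5$ and $b = 3$ are arbitrary choices for illustration):

    # |x^(a+bi)| = x^a, so x^(a+bi) -> 0 as x -> 0+ whenever a > 0.
    a, b = 0.5, 3.0   # arbitrary illustrative exponent a + bi
    for x in [1e-2, 1e-4, 1e-8, 1e-16]:
        z = x ** complex(a, b)   # x^(a+bi) as a complex number
        print(f"x = {x:.0e}  |x^(a+bi)| = {abs(z):.3e}  x^a = {x ** a:.3e}")

    # With a = 0 the modulus stays 1 while the argument b*ln(x) never
    # settles down, which is why 0^(bi) is left undefined.
    for x in [1e-2, 1e-4, 1e-8, 1e-16]:
        z = x ** complex(0.0, b)
        print(f"x = {x:.0e}  x^(bi) = {z:.3f}")

The first loop prints a modulus that shrinks along with $x$, while the second prints values that wander around the unit circle without approaching anything.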

Penny
