Let j be a number such that j^2 = -1 (technically it's not true to say that j = sqrt(-1)).
Let (a + bj)^2 = j, then
a^2 - b^2 + 2abj = j, as you correctly point out. Since j = 0 + 1j,
a^2 - b^2 = 0 (hence a^2 = b^2 and (a = b or a = -b))
while
2abj = 1j
This last implies 2ab=1, or ab=1/2. Since a is plus or minus b, ab = plus or minus a^2, and either a^2 = 1/2 or a^2 = -1/2 (as your teacher deduced). Since a and b are both reals, the second case is not possible; a = sqrt(1/2), and b = a = sqrt(1/2). This gives one sqrt of j as sqrt(1/2) + j*sqrt(1/2), which is your teacher's first root. Now, since (-a)^2 = a^2 for all a (both real and complex), -(sqrt(1/2) + j*sqrt(1/2)) = -sqrt(1/2) - j*sqrt(1/2) is also a root; the second that your teacher gave you.
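If you want to check this derivation numerically, here is a quick sketch in Python, where the imaginary unit happens to be written 1j:

```python
# The two square roots of j derived above: +/-(sqrt(1/2) + j*sqrt(1/2)).
# In Python the imaginary unit is spelled 1j.
r = (0.5 ** 0.5) + (0.5 ** 0.5) * 1j

for root in (r, -r):
    # Each root, squared, should give back j (up to floating-point rounding).
    print(root, "squared is", root ** 2)
```

Both printed squares come out as (approximately) 1j, confirming that the teacher's two roots are indeed the square roots of j.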
Let's look at your roots.
If x = (sqrt(1/2) - j*sqrt(1/2)), then
x^2 = (sqrt(1/2))^2 - 2*sqrt(1/2)*sqrt(1/2)*j + (sqrt(1/2)*j)^2
= 1/2 - 2*(1/2)*j + (1/2)*j^2
= 1/2 - j - 1/2
= -j
Similar calculations on your second root give the same result; i.e., it's also sqrt(-j) rather than sqrt(j).
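The same numerical check applied to your first root shows the problem (a sketch in Python, where the imaginary unit is written 1j):

```python
# The student's proposed root: sqrt(1/2) - j*sqrt(1/2).
x = (0.5 ** 0.5) - (0.5 ** 0.5) * 1j

# Squaring it gives approximately -1j, not 1j: it is a square root of -j.
print(x ** 2)
```

The printed value is (up to rounding) -1j, matching the algebra above.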
Your mistake, I believe, is in writing (and thinking of) j as sqrt(-1). There is a small, subtle, but important difference between this phrasing and the definition of j which says "j^2 = -1": the second definition does not single out one of the two square roots of -1, while the first phrasing implicitly does, since for nonnegative reals the sqrt symbol conventionally denotes just one of the two roots (the nonnegative one). For -1 there is no such convention to fall back on: j is neither positive nor negative; it's measured on a different scale.
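For what it's worth, math libraries run into exactly this ambiguity and resolve it by picking a "principal" root by convention. Python's cmath module illustrates the point:

```python
import cmath

# Every nonzero complex number has two square roots; cmath.sqrt returns
# the principal one, chosen by convention.
print(cmath.sqrt(-1))   # 1j  (even though (-1j)**2 == -1 as well)
print(cmath.sqrt(1j))   # the root with positive real part
```

So even "sqrt(-1) = j" is really a convention about which root to name, not a consequence of the definition j^2 = -1.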