Thursday, October 1, 2009

The Square Root of 2 is Irrational

You remember that fun piece of trivia from some math class in high school? Some teacher, in their infinite wisdom, laid down the law: The square root of 2 (√2) is an irrational number, which means that its digits go on forever without ever settling into a repeating pattern. The first few are 1.4142135623730950488016887242097...
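In case you don't trust those digits, they're easy to check with exact integer arithmetic. Here's a quick Python sketch (the language choice is mine, not anything official) that uses math.isqrt to get the first 31 significant digits of √2 with no floating-point rounding involved:

```python
from math import isqrt

# floor(sqrt(2) * 10**30): the first 31 significant digits of sqrt(2),
# computed exactly with integers (isqrt never touches floating point).
digits = isqrt(2 * 10**60)
print(digits)  # 1414213562373095048801688724209
```

(Note that isqrt truncates rather than rounds, so the last digit printed is one below the rounded string above.)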

But I'm in college now. I'm supposed to be challenging preconceptions about religion, culture, and basic properties of numbers, right? Of course, but fortunately, religion and culture aren't nearly as interesting, so we're going to be limiting ourselves to the third subject.

So if √2 really is an irrational number, I should be able to prove it. Before I prove it, we're going to need something to prove it with:
  • A rational number is a number that can be represented as a ratio of two integers. This means that I can represent 4 as 4/1, 2.5 as 5/2, and 3.333... as 10/3. An irrational number is a number which cannot be written as any such ratio. Some familiar ones are π and e. φ has also been coming up in discussion a lot this week.
  • The Well Ordering Property of Integers: Given any nonempty set of positive integers, I claim that there is one of them that is less than all the others. That's right, when given the numbers {4, 9, 123, 19244000}, you can always look at them and say, "Ah ha! That four is less than all the other numbers!" This is important, I promise.
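Both of these ingredients are easy to poke at on a computer. A quick Python sketch (purely my illustration): the fractions module stores rational numbers as exact integer ratios, and min plays the role of the well ordering property on a finite set.

```python
from fractions import Fraction

# Rational numbers as exact ratios of two integers.
assert Fraction(4, 1) == 4
assert Fraction(5, 2) == Fraction("2.5")

# Well ordering on a finite set of positive integers:
# there is always a least element.
s = {4, 9, 123, 19244000}
assert min(s) == 4
```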
So now that we have something to work with, let us start making the magic. A good way to prove something in mathematics is a proof by contradiction: show that it can't possibly be anything else, so the only option left must be the answer.

Lucky for us, there are only two options here: √2 is either going to be rational, or irrational. We're claiming it's irrational, so let's show that it's not rational:

ASSUME √2 IS RATIONAL. This is mathematics, so we can play pretend. This means that we can express it as the division of two integers: √2 = a/b. Now if we multiply both sides by b, we get the following:
b√2 = a.
We know that a is an integer, since we called it an integer to begin with, so that means that b√2 must also be an integer (it equals a). And b isn't special: plenty of other positive integers would do the same job, since any c/d with √2 = c/d works just as well (2a and 2b come to mind, because 2a/2b = a/b).
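Before the proof proper, some empirical reassurance can't hurt. If √2 = a/b, then squaring both sides forces a² = 2b², and a quick brute-force Python search (the bound of 100,000 is an arbitrary choice of mine) turns up no such pair:

```python
from math import isqrt

# If sqrt(2) = a/b, then a*a == 2*b*b.  Search for such a pair:
# the only candidate for a is floor(b * sqrt(2)), computed exactly.
hits = []
for b in range(1, 100_000):
    a = isqrt(2 * b * b)
    if a * a == 2 * b * b:
        hits.append((a, b))
print(hits)  # []
```

Of course, a finite search proves nothing by itself; that's what the rest of the post is for.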

Let's call the set of all the possible positive integers of the form b√2 the set S. This means S is a really big group of positive integers, all of which are products of a positive integer and √2. This set must have a smallest member, as per the well ordering property. Let's call that smallest member s = t√2.

Since s is the smallest member of this set S, that means that if we can show that there is yet a SMALLER member of S than s, we've got a problem, since smaller than the smallest doesn't make much sense.

s = t√2
s√2 - s = s√2 - t√2
s√2 - s = (s - t)√2

Now we know s and t are integers, by definition. If we multiply s = t√2 by another √2, we get s√2 = 2t, which means that s√2 is also an integer. Since s and s√2 are both integers, that means the left side of the equation is an integer, which means both sides are integers (they're the same quantity written two ways).
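You can check this algebra exactly by representing numbers of the form x + y√2 as integer pairs — a little Python sketch of my own devising, not part of the original argument:

```python
# Elements of the form x + y*sqrt(2), stored as the integer pair (x, y).
def mul_sqrt2(p):
    x, y = p
    return (2 * y, x)  # (x + y*sqrt(2)) * sqrt(2) = 2y + x*sqrt(2)

def sub(p, q):
    return (p[0] - q[0], p[1] - q[1])

t = 7                  # any integer t will do; s = t*sqrt(2) is the pair (0, t)
s = (0, t)

lhs = mul_sqrt2(s)                  # s*sqrt(2) = 2t, an ordinary integer
assert lhs == (2 * t, 0)

lhs = sub(mul_sqrt2(s), s)          # s*sqrt(2) - s
rhs = mul_sqrt2(sub(s, (t, 0)))     # (s - t) * sqrt(2)
assert lhs == rhs                   # the identity from the equations above
```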

We also know that both sides are positive. The left side can be factored as s(√2 - 1), and √2 > 1, since 1² is 1 and 2² is 4, so √2 sits somewhere in between. Better yet, √2 - 1 < 1, so that left side s(√2 - 1) is not just positive but strictly smaller than s.

Now we've got a problem. We've got this positive integer (s - t), and its product with √2, namely (s - t)√2 = s√2 - s, is a positive integer smaller than s. That makes (s - t)√2 a member of S, and a smaller one than s, which we picked as the smallest member of S. Smaller than the smallest doesn't make much sense, so our assumption must be wrong: √2 can't be written as a/b, and √2 is an irrational number. Q.E.D.
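For fun, here's the proof's descent run on a "near miss". Since no pair hits a² = 2b² exactly, we start from 99/70, which is off by exactly one: 99² − 2·70² = 1. The step (a, b) → (2b − a, a − b) mirrors passing from s to (s − t): it preserves |a² − 2b²| while strictly shrinking b (a quick Python sketch, not from the post):

```python
# Descent on a near miss: 99/70 satisfies 99**2 - 2*70**2 == 1.
# Each step preserves |a*a - 2*b*b| while strictly shrinking b --
# the same shrinking that powers the contradiction in the proof.
a, b = 99, 70
while b > 0:
    assert abs(a * a - 2 * b * b) == 1
    a, b = 2 * b - a, a - b
```

If a² = 2b² ever held exactly, the same step would spit out an endless strictly decreasing chain of positive integers, which the well ordering property forbids.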

Well that was fun, wasn't it?
