Recently, we covered the greedy algorithmic paradigm. We emphasized the importance of proving that your greedy algorithm is optimal (greedy algorithms can lead to pretty decent solutions, but they can't always be trusted), yet we never went over how to do this.
In this article I will explain The Exchange Argument, a method you can use for mathematically proving your greedy algorithm is optimal.
I will warn you now: since we are going to be working with abstract proofs, this article is going to get into a lot of mathematical theory. So... prepare yourself for that! Along with teaching you about the exchange argument, I will also give a quick lesson on mathematical induction, which the exchange argument needs. A nice little bonus for you (if you don't already know induction)!
Structure of Optimization Algorithms
It's important to note that greedy algorithms are usually used in the context of an optimization problem. Any optimization problem can be formally defined by these 5 things:
Instance: what does the input look like? We will call this I
Solution Format: what does an output look like?
Constraints: what properties must a solution have?
Objective Function: what makes a solution better or worse than another solution?
Maximize or minimize?
Today we'll be working with the event scheduling problem from the last article. We can define the event scheduling problem as follows:
Instance: a list of events [(s_1, f_1), (s_2, f_2), ..., (s_n, f_n)]
Solution Format: a subset of the events, S
Constraints: NO overlapping events in S
Objective Function: |S| (the size of S)
Maximize
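If you need a refresher on the greedy algorithm itself before we start proving things about it, here is a rough Python sketch of the Earliest Finish Time greedy (the function name and the representation of events as (start, finish) tuples are just my choices for this sketch, not anything official):

```python
def earliest_finish_time(events):
    """Greedy schedule: repeatedly take the event that finishes earliest
    among those that don't overlap with what we've already taken."""
    schedule = []
    last_finish = float("-inf")
    # Sorting by finish time means the next compatible event we see
    # is always the one with the earliest finish time.
    for start, finish in sorted(events, key=lambda e: e[1]):
        if start >= last_finish:
            schedule.append((start, finish))
            last_finish = finish
    return schedule

# Example: the greedy picks (1, 3) and (4, 6), skipping the overlapping (2, 5).
print(earliest_finish_time([(1, 3), (2, 5), (4, 6)]))
```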
So now that we have a mathematical definition of the problem, here's what we need to do to prove that our greedy algorithm is correct:
Let GS be the greedy algorithm's solution, and let OS be ANY OTHER solution to the problem.
If we are minimizing, then we must show:
$$Cost(OS) \ge Cost(GS)$$
If we are maximizing, then we must show:
$$Value(GS) \ge Value(OS)$$
Seems simple enough, right? We just need to show that the greedy solution is better than or as good as any other solution. You probably could've guessed that that's what we need to do without the fancy mathematical definitions, right?
Well, the tricky part is that OS is arbitrary. Unless we go through and calculate every single other possible solution (which is basically impossible), how do we really know that GS is always better than or just as good as OS?
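(For tiny inputs you actually can enumerate every possible solution, and it makes for a nice sanity check, though of course not a proof. Here's a rough sketch of that comparison; the helper function names are mine and the instance sizes are deliberately small.)

```python
from itertools import combinations
import random

def is_valid(schedule):
    # A schedule is valid if no two of its (start, finish) events overlap.
    events = sorted(schedule, key=lambda e: e[1])
    return all(a[1] <= b[0] for a, b in zip(events, events[1:]))

def brute_force_best(events):
    # Try every subset, largest first -- exponential, so tiny inputs only.
    for size in range(len(events), 0, -1):
        if any(is_valid(subset) for subset in combinations(events, size)):
            return size
    return 0

def greedy_size(events):
    # The same Earliest Finish Time greedy as in the sketch above.
    count, last_finish = 0, float("-inf")
    for start, finish in sorted(events, key=lambda e: e[1]):
        if start >= last_finish:
            count, last_finish = count + 1, finish
    return count

# Spot-check: on random 8-event instances, greedy should match the optimum.
# This is evidence, not a proof -- which is exactly why we need this article.
for _ in range(100):
    events = [tuple(sorted(random.sample(range(20), 2))) for _ in range(8)]
    assert greedy_size(events) == brute_force_best(events)
print("greedy matched the brute force on all 100 random instances")
```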
There are a couple of techniques you can use to do this:
The Exchange Argument
Greedy-Stays-Ahead
Greedy Achieves the Bound
Today, we will go over The Exchange Argument. I may go over the other ones in future articles.
Introduction to Induction
The Exchange Argument has a step that requires using mathematical induction, so I will give a quick introduction to it now. If you already know how to write inductive proofs, you can skip this part!
Imagine you are trying to prove that you can climb a ladder. First, of course, you need to prove that you can get onto the ladder. Next, you can complete this proof by showing that if you are at any rung of the ladder, then you can get to the next rung of the ladder. This is basically what induction does.
Typically, you want to use induction to prove that something is true for all numbers or input sizes. Let the proposition we are trying to prove be called p, and let p(n) be either true or false, with n being the input size or length. We will see examples in a sec, but here are the main steps of the proof:
Base Case: prove the statement for the smallest input, usually n = 1 or n = 0. In mathematical terms, this is proving that p(0) or p(1) is true.
Inductive Step: there are two sub-steps here:
Inductive Hypothesis: assume that p(n) is true.
Inductive Goal: prove that p(n + 1) is true (using the inductive hypothesis, which says that p(n) is true).
If that didn't make sense, it's made much easier with an example:
Say we want to prove that, for all n in the natural numbers (greater than 0),
$$1+2+...+n=\frac{n(n+1)}{2}$$
Base Case: Our base case is n = 1. Plugging in 1 for n in the equation above, we get 1 = 1. Thus, p(1) is true, or in other words, our base case holds. Note that since we are trying to prove this proposition for all of the natural numbers, our base case has to be the very first one (it's like proving you can get on the ladder).
Inductive Step:
Inductive Hypothesis: assume inductively that 1+2+...+n = n(n+1)/2.
Inductive Goal: we must now show that
$$1+2+...+(n+1)=\frac{(n+1)(n+2)}{2}$$
Let's work with the left side of the equation. We have
$$1+2+...+(n+1)$$
Which equals
$$(1+2+...+n)+(n+1)$$
From our inductive hypothesis, we know that 1+2+...+n = n(n+1)/2, so we have:
$$\frac{n(n+1)}{2}+(n+1)$$
Adding these two together as fractions, we get:
$$\frac{n(n+1)+2(n+1)}{2}$$
And simplifying, we get:
$$\frac{(n+2)(n+1)}{2}$$
Thus, our proof is finished!
Congratulations, you have just finished your first induction proof!
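(If you like, you can also spot-check the formula numerically with a couple of lines of Python. This is just a sanity check, not a substitute for the proof.)

```python
# Quick numeric spot check of 1 + 2 + ... + n == n(n + 1)/2.
for n in range(1, 5001):
    assert sum(range(1, n + 1)) == n * (n + 1) // 2
print("the formula checks out for n = 1 through 5000")
```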
It's important to note that induction proofs can vary, so:
sometimes, your induction proof may have multiple base cases
sometimes, in your inductive step, you may want to make your inductive hypothesis for n - 1, and then prove p(n) instead
Strong Induction
The Exchange Argument typically calls for a variation of inductive proofs known as Strong Induction. Here is the difference between strong induction and weak induction (the kind we were doing earlier):
Weak Induction:
Base Case: prove p(n_0)
Inductive Step: prove that if p(n) is true, then p(n + 1) is true
Strong Induction:
Base Case: prove p(n_0)
Inductive Step: prove that if p(k) is true for all n_0 <= k <= n, then p(n + 1) is true
The main difference is in the inductive hypothesis, where you make a much stronger assumption than before.
Imagine the ladder analogy again. First, you prove the base case. Now, instead of proving that you can climb from the current rung to the next, you first assume that you can climb the ladder all the way up to the current rung, and you then show that you can get to the next rung.
Let's look at an example. We will prove that every number greater than 1 has a prime factorization. In other words, for any number n greater than or equal to 2, there exist prime numbers p_1, p_2, ..., p_m such that n = p_1 p_2 ... p_m.
Base Case: n = 2. 2 is prime, so this checks out.
Inductive Hypothesis: Let n be greater than or equal to 2. Assume inductively that every number k from 2 to n inclusive has a prime factorization.
Inductive Goal: We must now show that n + 1 has a prime factorization.
Well, if (n + 1) is prime, then we're good.
If (n + 1) is not prime, then it's a bit more complicated. Since n + 1 is greater than 1 and not prime, it must be composite: there exist an x and a y, neither of which is 1 (and both of which are at most n), such that n + 1 = xy.
Now, we know from our inductive hypothesis that x and y both have prime factorizations! So let x = p_1 p_2 ... p_a and y = q_1 q_2 ... q_b. This means that n + 1 = p_1 ... p_a q_1 ... q_b. Thus, (n + 1) has a prime factorization and we are done with the proof!
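By the way, this proof maps almost directly onto a recursive function: the base case of the recursion is the base case of the induction, and each recursive call is an appeal to the strong inductive hypothesis. Here's a rough Python sketch of that idea (my own illustration, not part of any standard library):

```python
def prime_factorization(n):
    """Return a list of primes whose product is n (for n >= 2)."""
    # Look for a divisor x with 1 < x < n. If there is none, n is prime:
    # that's the "if n + 1 is prime, then we're good" case of the proof.
    for x in range(2, int(n ** 0.5) + 1):
        if n % x == 0:
            # n = x * y with both factors smaller than n, so the
            # "inductive hypothesis" (the recursive calls) applies to each.
            return prime_factorization(x) + prime_factorization(n // x)
    return [n]

print(prime_factorization(360))  # [2, 2, 2, 3, 3, 5]
```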
Alright! So now that we know strong induction, we are finally ready to talk about and see the exchange argument in action!
The Exchange Argument
The idea behind the Exchange Argument is this: Let g be the first greedy decision that our algorithm makes. Let OS be any legal solution that DOES NOT pick g. Then there is a solution OS' that DOES pick g, and OS' is at least as good as OS.
This is the general structure of an Exchange Argument proof:
The Exchange Part:
Let g be the first greedy choice the algorithm makes
Let OS be any arbitrary solution achieved by NOT choosing g
Show how to transform OS into some solution OS' that chooses g, and that is at least as good as OS (this is the part that gives the exchange argument its name). To do this, we must show that:
OS' is a valid solution
OS' is better than (or equal to) OS
The Inductive Part:
Use the results from the previous section in an inductive proof:
Base Case: the greedy strategy works for an instance of size 1.
Inductive Hypothesis: assume the greedy strategy works for all instances of size < n.
Inductive Step: Let OS be any solution for I (remember, I is the Instance of the problem, or the input), with |I| = n. Then by the exchange argument, there is another solution OS' such that |OS| <= |OS'| and OS' includes the first greedy choice, g. (In math notation, |x| means the size/length of the set x, by the way.) The rest of the inductive step applies the inductive hypothesis to the smaller instance left over after committing to g; we'll see exactly how in the example below.
Okay, I know that's a lot. And this is definitely really tricky to do, even if you know what you're doing. But let's see an example of this in action by proving that our "Earliest Finish Time" algorithm is optimal, and hopefully it will all make more sense.
Proving the Earliest Finish Time Algorithm: Exchange Argument
We will perform the exchange argument, following the template from earlier:
Say G is the event with the earliest finish time.
Let OS be an arbitrary non-overlapping (that is, valid) schedule that DOES NOT include G.
We will claim that there is a schedule OS' that DOES include G, such that:
$$|OS'| \ge |OS|$$
Proof of Step 3: Forming OS'
Let the events in OS be J_1, ..., J_k, ordered by start and finish times (J_1 cannot be G, since OS doesn't include G).
What we have to do now is change OS into OS', and make sure that OS' is valid, includes G, and is better than or at least as good as OS.
We will define OS' as:
$$OS'= (OS-\{J_1\}) \cup \{G\}$$
If you're not a math nerd and this notation doesn't make sense, it just means that OS' is the same as OS, but with J_1 exchanged for G. Now do you see where the "exchange" in "exchange argument" comes from?
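To make the exchange concrete, here is what it looks like on a made-up toy schedule (these particular events are hypothetical, purely for illustration):

```python
# A toy valid schedule OS that skips the earliest-finishing event G.
G = (1, 3)                       # the event with the earliest finish time
OS = {(2, 5), (6, 8), (9, 10)}   # valid, but doesn't include G

# The exchange: drop OS's earliest-finishing event J_1, put G in its place.
J_1 = min(OS, key=lambda e: e[1])   # (2, 5)
OS_prime = (OS - {J_1}) | {G}       # {(1, 3), (6, 8), (9, 10)}

print(sorted(OS_prime), len(OS_prime) >= len(OS))  # still valid, same size
```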
Proof of Step 3: Proving that OS' is a valid solution
Now that we have defined OS', we have to show that it is a valid solution.
First, since we know that OS is a valid solution, we know that any pair of events J_i and J_l from OS will not overlap. This means that all we need to show is that G doesn't overlap with any of the J_i events.
Well, since the J_i's are ordered by start/finish times, we just need to show that G doesn't overlap with J_2 (if G doesn't overlap the earliest remaining event, it certainly won't overlap the later ones).
We know that J_1 doesn't overlap with J_2, so J_2 starts at or after J_1 finishes. We also know that G finishes no later than J_1 does (G has the earliest finish time of all events, which is exactly why it was our algorithm's first greedy choice). Therefore, G won't overlap with J_2 either.
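Written with the start/finish notation from our problem definition (where f_G is G's finish time, f_{J_1} is J_1's finish time, and s_{J_2} is J_2's start time; these subscripted names are just shorthand I'm introducing here), the whole argument is one chain of inequalities:
$$f_G \le f_{J_1} \le s_{J_2}$$
In other words, G is already finished by the time J_2 starts.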
Nice! We have proved that OS' is valid.
Proof of Step 3: Proving that OS' is as good or better than OS
Since all we did was exchange one event for another to get from OS to OS', OS' has exactly as many events as OS, so it is at least as good.
Whew! This means that we have proved the exchange argument! Unfortunately, there's still the inductive step.
Proving the Earliest Finish Time Algorithm: Inductive Step
Base Case: We know that our greedy algorithm will work on input size 1.
Inductive Hypothesis: Our greedy algorithm works on inputs of any size < n.
Let OS be any solution of the set of events I.
Then, by the exchange argument, there exists a solution OS' such that OS' is as good as or better than OS, and OS' includes the first greedy choice G.
Let I' be the set of events that don't conflict with G. Then
$$OS'=\{G\}\cup S(I')$$
where S(I') is some solution of I'.
Now, since |I'| < n, by our inductive hypothesis,
$$|S(I')| \le |GS(I')|$$
And by definition, we know
$$GS(I)=\{G\}\cup GS(I')$$
So, finally, we know that:
$$|OS| \le |OS'| = |\{G\}\cup S(I')| \le |\{G\}\cup GS(I')|=|GS(I)|$$
Or
$$|GS(I)| \ge |OS|$$
Which is exactly what we wanted! The greedy solution is at least as good as any other solution to the event scheduling problem, no matter the input! We are finally done with the proof.
I hope that this article helped give you a basic understanding of the exchange argument, and how to use it to prove the optimality of your greedy algorithms! If not, then I at least hope it was interesting!
Check out my previous articles in the algorithms series!