This is a brief statement of the LP duality theorem. We probably won't have time to go over this in lecture.

Consider a "primal" linear program, in this form:

    MAX = max { c.x, over all x such that A*x <= b and x >= 0 }

Everything is real valued. A is a rectangular matrix, and b, c, x are column vectors of the appropriate lengths. A, b, and c are given, while x is unknown (variable). A*x denotes matrix-vector product, and c.x denotes dot product.

Its "dual" linear program is the following:

    MIN = min { b.y, over all y such that A'*y >= c and y >= 0 }

Now the vector y is the unknown (variable). A' is the transpose of A.

It is relatively easy to argue "weak duality": MAX <= MIN. For any feasible x and y we have c.x <= (A'*y).x = y.(A*x) <= y.b, so every primal objective value is at most every dual objective value.

If we rewrite the dual as a maximization in the primal form (replacing A by -A', b by -c, and c by -b) and take its dual, we get back the original LP (negated), so the dual of the dual is the original LP.

The surprising "strong duality theorem" is this: if both programs are feasible, then MAX == MIN.

Note that if we require x and y to be integer valued (IP), then strong duality can fail (the two integer optima may differ), but we still have weak duality, since integer feasible solutions are in particular real feasible solutions.
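
For a concrete check, here is a minimal numerical sketch, assuming Python with numpy and scipy.optimize.linprog (the data A, b, c below are made up for illustration, not from the notes). It solves the primal and the dual and verifies that the two optimal values coincide:

    import numpy as np
    from scipy.optimize import linprog

    # Example data (chosen for illustration; both LPs are feasible and bounded).
    A = np.array([[1.0, 2.0],
                  [3.0, 1.0]])
    b = np.array([4.0, 6.0])
    c = np.array([3.0, 2.0])

    # Primal:  MAX = max { c.x : A*x <= b, x >= 0 }.
    # linprog minimizes, so minimize -c.x; its default bounds already give x >= 0.
    primal = linprog(-c, A_ub=A, b_ub=b)
    MAX = -primal.fun

    # Dual:  MIN = min { b.y : A'*y >= c, y >= 0 }.
    # Rewrite A'*y >= c as -A'*y <= -c to fit linprog's <= form.
    dual = linprog(b, A_ub=-A.T, b_ub=-c)
    MIN = dual.fun

    print(MAX, MIN)                  # both come out to 7.2 for this data
    assert abs(MAX - MIN) < 1e-8     # strong duality: MAX == MIN

Changing A, b, and c (while keeping both problems feasible) is an easy way to experiment with the theorem numerically.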