Removing redundant constraints from a linear system of equations
Suppose you are solving an optimization problem with linear equality constraints of the form $Ax = b$. Several off-the-shelf optimizers, including those available in CVXOPT and scipy, will complain if $A$ does not have full row rank. That is, if $A$ contains redundant constraints, the optimization will likely fail. So how can we easily deal with this issue? The code below is something you can plug into your program as a quick fix. Simply call `A, b = reduce_row_echelon(A, b)` to get an equivalent set of constraints with the redundancies removed.

```python
import numpy as np

# Consumes a *consistent* rectangular system of equations
# Produces a new compacted system of equations with linearly dependent rows removed
# Note: this algorithm computes the row echelon form of the augmented matrix
# This algorithm should be numerically robust
def reduce_row_echelon(B, y, tol=1e-10):
    # tol: entries smaller than this in absolute value are treated as zero
    A = np.concatenate([B, y[:, np.newaxis]], axis=1).astype(float)
    m, n = A.shape
    c = 0
    for r in range(m):
        # find the first column with a non-zero entry at or below row r
        while c < n and np.all(np.abs(A[r:, c]) < tol):
            c += 1
        if c == n:
            break
        # swap the row with the largest pivot into position r (partial pivoting)
        p = r + np.argmax(np.abs(A[r:, c]))
        A[[r, p]] = A[[p, r]]
        # normalize the pivot row and eliminate column c from all other rows
        A[r] /= A[r, c]
        others = np.arange(m) != r
        A[others] -= np.outer(A[others, c], A[r])
        c += 1
    # rows that are now all zeros were linear combinations of the others; drop them
    keep = np.any(np.abs(A) > tol, axis=1)
    return A[keep, :-1], A[keep, -1]
```
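As a quick sanity check, here is a small made-up example: the third constraint below is the sum of the first two, so the reduced system should keep only two independent rows.

```python
A = np.array([[1.0, 0.0],
              [0.0, 1.0],
              [1.0, 1.0]])      # third row = first row + second row (redundant)
b = np.array([2.0, 3.0, 5.0])   # consistent right-hand side: 5 = 2 + 3

A_red, b_red = reduce_row_echelon(A, b)
print(A_red)  # 2 x 2 matrix with full row rank
print(b_red)  # corresponding right-hand side of length 2
```

The reduced system describes exactly the same feasible set as the original one, but now has full row rank, so it can be handed directly to the solver.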