Machine Learning Part 5, ILP Part 51 — Linear regression

This is the fifty-first part of the ILP series. For your convenience you can find other parts in the table of contents in Part 1 – Boolean algebra

This is the fifth part of the ML series. For your convenience you can find other parts in the table of contents in Part 1 – Linear regression in MXNet

Today we are going to implement linear regression with ILP. Let’s go!

We start by defining the sample set. Next, we create real variables for the linear regression coefficients and add arbitrary bounds to them. Then we calculate the predicted value for each sample as a linear expression of the coefficients and the sample's inputs.
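This setup can be sketched with a generic LP modeling library; below I use PuLP as a stand-in for the solver wrapper used in the series, and the sample set, bounds, and variable names are all illustrative:

```python
import pulp

# hypothetical sample set of (x, y) pairs
samples = [(1, 2), (2, 3), (3, 5), (4, 7)]

# real (continuous) variables for the coefficients of y = a*x + b,
# with arbitrary bounds to keep the search space bounded
a = pulp.LpVariable("a", lowBound=-100, upBound=100)
b = pulp.LpVariable("b", lowBound=-100, upBound=100)

# the predicted value for each sample is just the linear expression a*x + b
predictions = [a * x + b for x, _ in samples]
```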

Next, we calculate the distance of each prediction from its target, and the overall error.
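The standard linearization of a distance |prediction − y| is a fresh nonnegative variable bounded from below by both the positive and the negative difference; minimization later presses it down onto the absolute value. A sketch under the same illustrative setup (PuLP and the sample data are my assumptions, not the original code):

```python
import pulp

samples = [(1, 2), (2, 3), (3, 5), (4, 7)]
prob = pulp.LpProblem("regression", pulp.LpMinimize)
a = pulp.LpVariable("a", lowBound=-100, upBound=100)
b = pulp.LpVariable("b", lowBound=-100, upBound=100)

distances = []
for i, (x, y) in enumerate(samples):
    prediction = a * x + b
    d = pulp.LpVariable(f"d{i}", lowBound=0)
    # d >= |prediction - y| expressed as two linear constraints
    prob += d >= prediction - y
    prob += d >= y - prediction
    distances.append(d)

# the error as a sum of distances, which stays a linear expression
error = pulp.lpSum(distances)
```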

The error is the hard part. We can easily calculate the sum of distances, an approximated average of them, or even the sum of approximated squared distances. But in a linear program we don't know how to multiply two real variables, so we can't calculate the true MSE.
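One common workaround, since a squared distance d² is convex, is to under-approximate it with tangent cuts s ≥ 2kd − k² at chosen breakpoints k; under minimization the auxiliary variable s settles onto this piecewise-linear approximation. This is my guess at the shape of the approximation, not necessarily the one used in the original code:

```python
import pulp

prob = pulp.LpProblem("approx_square", pulp.LpMinimize)
d = pulp.LpVariable("d", lowBound=0, upBound=10)
s = pulp.LpVariable("s", lowBound=0)

# tangent cuts of d^2 at integer breakpoints: d^2 >= 2*k*d - k^2
for k in range(11):
    prob += s >= 2 * k * d - k * k

prob += d >= 3.5  # force a distance to see the approximation at work
prob += s         # minimize the approximated square
prob.solve(pulp.PULP_CBC_CMD(msg=False))
# at d = 3.5 the binding cuts (k = 3 and k = 4) give s = 12, vs the true 12.25
```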

Finally, we minimize the error and solve the problem. Here goes the solution:
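Putting the pieces together, here is a complete, runnable sketch of sum-of-distances regression (again with PuLP and made-up samples standing in for the original code):

```python
import pulp

samples = [(1, 2), (2, 3), (3, 5), (4, 7)]
prob = pulp.LpProblem("linear_regression", pulp.LpMinimize)
a = pulp.LpVariable("a", lowBound=-100, upBound=100)
b = pulp.LpVariable("b", lowBound=-100, upBound=100)

distances = []
for i, (x, y) in enumerate(samples):
    d = pulp.LpVariable(f"d{i}", lowBound=0)
    # the two constraints together force d >= |a*x + b - y|
    prob += d >= (a * x + b) - y
    prob += d >= y - (a * x + b)
    distances.append(d)

prob += pulp.lpSum(distances)  # minimize the sum-of-distances error
prob.solve(pulp.PULP_CBC_CMD(msg=False))
print(pulp.LpStatus[prob.status], pulp.value(a), pulp.value(b), pulp.value(prob.objective))
```

Note that for these four points several lines achieve the same minimal sum of distances, so the reported coefficients may differ between solvers while the error stays the same.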

As you can see, it is very fast (around 0.44 seconds). You can see the error and the coefficients. Calculation time for the sum of approximated squares is worse but not terrible: `Total (root+branch&cut) = 8.28 sec. (7154.17 ticks)`