Machine Learning Part 8 — Backpropagation in neural net in SQL (https://blog.adamfurmanek.pl/2019/07/27/machine-learning-part-8/)

This is the eighth part of the ML series. For your convenience you can find other parts in the table of contents in Part 1 – Linear regression in MXNet.

Last time we saw forward propagation in a neural net. Today we are going to extend the process to backpropagate the errors. Let’s begin.

We need one more table holding the expected (target) output values:

CREATE TABLE outputs (
  outputNode NUMERIC,
  outputValue NUMERIC
);

INSERT INTO outputs VALUES
    (1, 290)
   ,(2, 399)
   ,(3, 505)
;

Before we see some SQL code, let’s do some math. We had three layers (input, hidden, output); the input and output layers used a linear activation function, and the hidden layer used ReLU.

We start with calculating the loss function. We use the usual squared error, divided by two so the derivative comes out cleaner:

    \begin{gather*} Loss = \left[\begin{array}{c} \frac{\left(y^{out}_1 - target_1\right)^2 }{ 2 } \\ \frac{\left(y^{out}_2 - target_2\right)^2 }{ 2 } \\ \frac{\left(y^{out}_3 - target_3\right)^2 }{ 2 } \end{array}\right] \end{gather*}

Now let’s calculate partial derivatives to update weights between hidden layer and output layer:

    \begin{gather*} \left[\begin{array}{ccc} \frac{\partial Loss}{\partial W^2_{1,1}} & \frac{\partial Loss}{\partial W^2_{1,2}} & \frac{\partial Loss}{\partial W^2_{1,3}} \\ \frac{\partial Loss}{\partial W^2_{2,1}} & \frac{\partial Loss}{\partial W^2_{2,2}} & \frac{\partial Loss}{\partial W^2_{2,3}} \end{array}\right] =  \left[\begin{array}{ccc}  \frac{\partial Loss}{\partial y^{out}_1 } \frac{\partial y^{out}_1 }{\partial y^{in}_1} \frac{\partial y^{in}_1}{\partial W^2_{1,1}} & \frac{\partial Loss}{\partial y^{out}_2 } \frac{\partial y^{out}_2 }{\partial y^{in}_2} \frac{\partial y^{in}_2}{\partial W^2_{1,2}} & \frac{\partial Loss}{\partial y^{out}_3 } \frac{\partial y^{out}_3 }{\partial y^{in}_3} \frac{\partial y^{in}_3}{\partial W^2_{1,3}} \\ \frac{\partial Loss}{\partial y^{out}_1 } \frac{\partial y^{out}_1 }{\partial y^{in}_1} \frac{\partial y^{in}_1}{\partial W^2_{2,1}} & \frac{\partial Loss}{\partial y^{out}_2 } \frac{\partial y^{out}_2 }{\partial y^{in}_2} \frac{\partial y^{in}_2}{\partial W^2_{2,2}} & \frac{\partial Loss}{\partial y^{out}_3 } \frac{\partial y^{out}_3 }{\partial y^{in}_3} \frac{\partial y^{in}_3}{\partial W^2_{2,3}} \end{array}\right]  =\\ \left[\begin{array}{ccc}  (y^{out}_1 - target_1) \cdot 1 \cdot h^{out}_1 & (y^{out}_2 - target_2) \cdot 1 \cdot h^{out}_1 & (y^{out}_3 - target_3) \cdot 1 \cdot h^{out}_1 \\ (y^{out}_1 - target_1) \cdot 1 \cdot h^{out}_2 & (y^{out}_2 - target_2) \cdot 1 \cdot h^{out}_2 & (y^{out}_3 - target_3) \cdot 1 \cdot h^{out}_2 \\ \end{array}\right]  \end{gather*}

Now, the same for biases:

    \begin{gather*} \left[\begin{array}{ccc} \frac{\partial Loss}{\partial b^2_{1,1}} & \frac{\partial Loss}{\partial b^2_{1,2}} & \frac{\partial Loss}{\partial b^2_{1,3}} \\ \frac{\partial Loss}{\partial b^2_{2,1}} & \frac{\partial Loss}{\partial b^2_{2,2}} & \frac{\partial Loss}{\partial b^2_{2,3}} \end{array}\right] =  \left[\begin{array}{ccc}  \frac{\partial Loss}{\partial y^{out}_1 } \frac{\partial y^{out}_1 }{\partial y^{in}_1} \frac{\partial y^{in}_1}{\partial b^2_{1,1}} & \frac{\partial Loss}{\partial y^{out}_2 } \frac{\partial y^{out}_2 }{\partial y^{in}_2} \frac{\partial y^{in}_2}{\partial b^2_{1,2}} & \frac{\partial Loss}{\partial y^{out}_3 } \frac{\partial y^{out}_3 }{\partial y^{in}_3} \frac{\partial y^{in}_3}{\partial b^2_{1,3}} \\ \frac{\partial Loss}{\partial y^{out}_1 } \frac{\partial y^{out}_1 }{\partial y^{in}_1} \frac{\partial y^{in}_1}{\partial b^2_{2,1}} & \frac{\partial Loss}{\partial y^{out}_2 } \frac{\partial y^{out}_2 }{\partial y^{in}_2} \frac{\partial y^{in}_2}{\partial b^2_{2,2}} & \frac{\partial Loss}{\partial y^{out}_3 } \frac{\partial y^{out}_3 }{\partial y^{in}_3} \frac{\partial y^{in}_3}{\partial b^2_{2,3}} \end{array}\right]  =\\ \left[\begin{array}{ccc}  (y^{out}_1 - target_1) \cdot 1 \cdot 1 & (y^{out}_2 - target_2) \cdot 1 \cdot 1 & (y^{out}_3 - target_3) \cdot 1 \cdot 1 \\ (y^{out}_1 - target_1) \cdot 1 \cdot 1 & (y^{out}_2 - target_2) \cdot 1 \cdot 1 & (y^{out}_3 - target_3) \cdot 1 \cdot 1 \\ \end{array}\right]  \end{gather*}

That was easy. Now we use a learning rate equal to 0.1 and we can update both the weights and the biases between the hidden layer and the output layer.
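Written out, the update is just one step of gradient descent:

    \begin{gather*} W^2_{i,j} \leftarrow W^2_{i,j} - 0.1 \cdot \frac{\partial Loss}{\partial W^2_{i,j}} \qquad b^2_{i,j} \leftarrow b^2_{i,j} - 0.1 \cdot \frac{\partial Loss}{\partial b^2_{i,j}} \end{gather*}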

The same reasoning applies to the remaining updates. If you are lost, you can find a great explanation here.
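For the weights and biases between the input layer and the hidden layer the chain rule just gets one step longer — the error from every output node flows back through the corresponding second-layer weight and through the derivative of ReLU:

    \begin{gather*} \frac{\partial Loss}{\partial W^1_{i,j}} = \sum_{k=1}^{3} (y^{out}_k - target_k) \cdot W^2_{j,k} \cdot \left[ h^{in}_j > 0 \right] \cdot input_i \\ \frac{\partial Loss}{\partial b^1_{i,j}} = \sum_{k=1}^{3} (y^{out}_k - target_k) \cdot W^2_{j,k} \cdot \left[ h^{in}_j > 0 \right] \end{gather*}

where \left[ h^{in}_j > 0 \right] is 1 when the hidden layer input is positive and 0 otherwise (the derivative of ReLU). This is what the phase = 6 branches for weight1Value and weight1Bias aim to compute in the query below.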

Let’s now see the code:

WITH RECURSIVE currentPhase AS(
	SELECT CAST(0 AS NUMERIC) AS phase
),
oneRow AS(
	SELECT CAST(NULL AS NUMERIC) AS rowValue
),
solution AS (
	SELECT I.*, O1.rowValue AS inputLayerOutput, W1.*, I2.rowValue AS hiddenLayerInput, O2.rowValue AS hiddenLayerOutput, W2.*, I3.rowValue AS outputLayerInput, O3.rowValue AS outputLayerOutput, O.*, E.rowValue AS errorValue, P.*
	FROM inputs AS I
	CROSS JOIN oneRow AS O1
	JOIN weights1 AS W1 ON W1.weight1InputNodeNumber = I.inputNode
	CROSS JOIN oneRow AS I2
	CROSS JOIN oneRow AS O2
	JOIN weights2 AS W2 ON W2.weight2InputNodeNumber = W1.weight1OutputNodeNumber
	CROSS JOIN oneRow AS I3
	CROSS JOIN oneRow AS O3
	JOIN outputs AS O ON O.outputNode = W2.weight2OutputNodeNumber
	CROSS JOIN oneRow AS E
	CROSS JOIN currentPhase AS P

	UNION ALL
	
    SELECT
		inputNode,
		inputValue,

		CASE
			WHEN phase = 0 THEN inputValue
			ELSE inputLayerOutput
		END AS inputLayerOutput,

		weight1InputNodeNumber,
		weight1OutputNodeNumber,
		
		CASE
			WHEN phase = 6 THEN weight1Value - 0.1 * (SUM(outputLayerOutput - outputValue) OVER (PARTITION BY weight1InputNodeNumber, weight1OutputNodeNumber))  * 1 * weight2Value * (CASE WHEN hiddenLayerInput > 0 THEN 1 ELSE 0 END) * inputLayerOutput
			ELSE weight1Value
		END AS weight1Value,
		
		CASE
			WHEN phase = 6 THEN weight1Bias - 0.1 * (SUM(outputLayerOutput - outputValue) OVER (PARTITION BY weight1InputNodeNumber, weight1OutputNodeNumber)) * 1 * weight2Value * (CASE WHEN hiddenLayerInput > 0 THEN 1 ELSE 0 END) * 1
			ELSE weight1Bias
		END AS weight1Bias,

		CASE
			WHEN phase = 1 THEN SUM(weight1Value * inputLayerOutput + weight1Bias) OVER (PARTITION BY weight1OutputNodeNumber, phase) / 3
			ELSE hiddenLayerInput
		END AS hiddenLayerInput,

		CASE
			WHEN phase = 2 THEN CASE WHEN hiddenLayerInput > 0 THEN hiddenLayerInput ELSE 0 END
			ELSE hiddenLayerOutput
		END AS hiddenLayerOutput,

		weight2InputNodeNumber,
		weight2OutputNodeNumber,
		
		CASE
			WHEN phase = 6 THEN weight2Value - 0.1 * (outputLayerOutput - outputValue) * 1 * hiddenLayerOutput
			ELSE weight2Value
		END AS weight2Value,
		
		CASE
			WHEN phase = 6 THEN weight2Bias - 0.1 * (outputLayerOutput - outputValue) * 1 * 1
			ELSE weight2Bias
		END AS weight2Bias,

		CASE
			WHEN phase = 3 THEN SUM(weight2Value * hiddenLayerOutput + weight2Bias) OVER (PARTITION BY weight2OutputNodeNumber, phase) / 3
			ELSE outputLayerInput
		END AS outputLayerInput,

		CASE
			WHEN phase = 4 THEN outputLayerInput
			ELSE outputLayerOutput
		END AS outputLayerOutput,
		
		outputNode,
		outputValue,
		
		CASE
			WHEN phase = 5 THEN (outputLayerOutput - outputValue) * (outputLayerOutput - outputValue) / 2
			ELSE errorValue
		END AS errorValue,

		phase + 1 AS phase

	FROM solution
	WHERE phase <= 6
)
SELECT DISTINCT *
FROM solution WHERE phase = 7
ORDER BY weight1InputNodeNumber, weight1OutputNodeNumber, weight2OutputNodeNumber

It is very similar to the solution from the previous post. This time in phase 5 we calculate the error, and in phase 6 we update the weights and biases. You can find the results here.
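If you only want to inspect the loss per output node instead of the full rows, you can swap the final SELECT for something along these lines (a small variation on the query above):

SELECT DISTINCT outputNode, errorValue
FROM solution WHERE phase = 7
ORDER BY outputNode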

Machine Learning Part 7 — Forward propagation in neural net in SQL (https://blog.adamfurmanek.pl/2019/07/20/machine-learning-part-7/)

This is the seventh part of the ML series. For your convenience you can find other parts in the table of contents in Part 1 – Linear regression in MXNet.

Today we are going to create a neural net and calculate forward propagation using PostgreSQL. Let’s go.

We start with the definition of the network: we will have an input layer, a hidden layer, and an output layer. The input layer will have 3 nodes, the hidden layer 2, and the output layer 3. In the input layer we don’t do any transformation on the input data, in the hidden layer we use ReLU, and in the output layer we use a linear activation function (so no transformation).

Let’s start with the following definitions:

DROP TABLE IF EXISTS inputs;
DROP TABLE IF EXISTS weights1;
DROP TABLE IF EXISTS weights2;
DROP TABLE IF EXISTS biases;

CREATE TABLE inputs (
  inputNode NUMERIC,
  inputValue NUMERIC
);

INSERT INTO inputs VALUES
    (1, 1)
   ,(2, 3)
   ,(3, 5)
;

CREATE TABLE weights1 (
  weight1InputNodeNumber NUMERIC,
  weight1OutputNodeNumber NUMERIC,
  weight1Value NUMERIC,
  weight1Bias NUMERIC
);

INSERT INTO weights1 VALUES
    (1, 1, 2, 1)
   ,(1, 2, 3, 1)
   ,(2, 1, 4, 2)
   ,(2, 2, 5, 2)
   ,(3, 1, 6, 3)
   ,(3, 2, 7, 3)
;

CREATE TABLE weights2 (
  weight2InputNodeNumber NUMERIC,
  weight2OutputNodeNumber NUMERIC,
  weight2Value NUMERIC,
  weight2Bias NUMERIC
);

INSERT INTO weights2 VALUES
    (1, 1, 1, 2)
   ,(1, 2, 2, 2)
   ,(1, 3, 3, 2)
   ,(2, 1, 4, 3)
   ,(2, 2, 5, 3)
   ,(2, 3, 6, 3)
;

We define some input values, weights, and biases. The values are completely made up; their exact magnitudes don’t matter for the mechanics.

Before we write the SQL code, let’s calculate the result manually.

We have the following variables:

    \begin{gather*} input = \left[\begin{array}{c} 1 \\ 3 \\ 5 \end{array}\right] \\ W^1 = \left[\begin{array}{cc} 2 & 3 \\ 4 & 5 \\ 6 & 7 \end{array}\right] \\ b^1 = \left[\begin{array}{cc} 1 & 1 \\ 2 & 2 \\ 3 & 3 \end{array}\right] \\ W^2 = \left[\begin{array}{ccc} 1 & 2 & 3 \\ 4 & 5 & 6 \end{array}\right] \\ b^2 = \left[\begin{array}{ccc} 2 & 2 & 2 \\ 3 & 3 & 3 \end{array}\right] \\ \end{gather*}

Now, let’s calculate input for hidden layer:

    \begin{gather*} h^{in} = \left[\begin{array}{c} W^1_{1, 1} \cdot input_1 + b^1_{1, 1} + W^1_{2, 1} \cdot input_2 + b^1_{2, 1} + W^1_{3, 1} \cdot input_3 + b^1_{3, 1} \\ W^1_{1, 2} \cdot input_1 + b^1_{1, 2} + W^1_{2, 2} \cdot input_2 + b^1_{2, 2} + W^1_{3, 2} \cdot input_3 + b^1_{3, 2} \end{array}\right] \end{gather*}

Now, we use ReLU activation function for hidden layer:

    \begin{gather*} h^{out} = \left[\begin{array}{c} \max(h^{in}_1, 0) \\ \max(h^{in}_2, 0) \end{array}\right] \end{gather*}

We carry on with calculating input for output layer:

    \begin{gather*} y^{in} = \left[\begin{array}{c} W^2_{1, 1} \cdot h^{out}_1 + b^2_{1, 1} +  W^2_{2, 1} \cdot h^{out}_2 + b^2_{2, 1} \\ W^2_{1, 2} \cdot h^{out}_1 + b^2_{1, 2} +  W^2_{2, 2} \cdot h^{out}_2 + b^2_{2, 2} \\ W^2_{1, 3} \cdot h^{out}_1 + b^2_{1, 3} +  W^2_{2, 3} \cdot h^{out}_2 + b^2_{2, 3} \end{array}\right] \end{gather*}

Activation function for output layer is linear, so it is easy now:

    \begin{gather*} y^{out} = y^{in} \end{gather*}
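
Just so we have something to compare the query against, plugging our made-up values into these formulas gives:

    \begin{gather*} h^{in} = \left[\begin{array}{c} 2 \cdot 1 + 1 + 4 \cdot 3 + 2 + 6 \cdot 5 + 3 \\ 3 \cdot 1 + 1 + 5 \cdot 3 + 2 + 7 \cdot 5 + 3 \end{array}\right] = \left[\begin{array}{c} 50 \\ 59 \end{array}\right] = h^{out} \\ y^{out} = y^{in} = \left[\begin{array}{c} 1 \cdot 50 + 2 + 4 \cdot 59 + 3 \\ 2 \cdot 50 + 2 + 5 \cdot 59 + 3 \\ 3 \cdot 50 + 2 + 6 \cdot 59 + 3 \end{array}\right] = \left[\begin{array}{c} 291 \\ 400 \\ 509 \end{array}\right] \end{gather*}

Both hidden activations are positive, so ReLU leaves them unchanged. This is also what the query below should return.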

We will calculate errors next time.

Now, let’s calculate the result:

WITH RECURSIVE currentPhase AS(
	SELECT CAST(0 AS NUMERIC) AS phase
),
oneRow AS(
	SELECT CAST(NULL AS NUMERIC) AS rowValue
),
solution AS (
	SELECT I.*, O1.rowValue AS inputLayerOutput, W1.*, I2.rowValue AS hiddenLayerInput, O2.rowValue AS hiddenLayerOutput, W2.*, I3.rowValue AS outputLayerInput, O3.rowValue AS outputLayerOutput, P.*
	FROM inputs AS I
	CROSS JOIN oneRow AS O1
	JOIN weights1 AS W1 ON W1.weight1InputNodeNumber = I.inputNode
	CROSS JOIN oneRow AS I2
	CROSS JOIN oneRow AS O2
	JOIN weights2 AS W2 ON W2.weight2InputNodeNumber = W1.weight1OutputNodeNumber
	CROSS JOIN oneRow AS I3
	CROSS JOIN oneRow AS O3
	CROSS JOIN currentPhase AS P

	UNION ALL
	
    SELECT
		inputNode,
		inputValue,

		CASE
			WHEN phase = 0 THEN inputValue
			ELSE inputLayerOutput
		END AS inputLayerOutput,

		weight1InputNodeNumber,
		weight1OutputNodeNumber,
		weight1Value,
		weight1Bias,

		CASE
			WHEN phase = 1 THEN SUM(weight1Value * inputLayerOutput + weight1Bias) OVER (PARTITION BY weight1OutputNodeNumber, phase) / 3
			ELSE hiddenLayerInput
		END AS hiddenLayerInput,

		CASE
			WHEN phase = 2 THEN CASE WHEN hiddenLayerInput > 0 THEN hiddenLayerInput ELSE 0 END
			ELSE hiddenLayerOutput
		END AS hiddenLayerOutput,

		weight2InputNodeNumber,
		weight2OutputNodeNumber,
		weight2Value,
		weight2Bias,

		CASE
			WHEN phase = 3 THEN SUM(weight2Value * hiddenLayerOutput + weight2Bias) OVER (PARTITION BY weight2OutputNodeNumber, phase) / 3
			ELSE outputLayerInput
		END AS outputLayerInput,

		CASE
			WHEN phase = 4 THEN outputLayerInput
			ELSE outputLayerOutput
		END AS outputLayerOutput,

		phase + 1 AS phase

	FROM solution
	WHERE phase <= 4
)
SELECT DISTINCT weight2OutputNodeNumber, outputLayerOutput
FROM solution WHERE phase = 5

This is actually quite simple. We divide the process into multiple phases. Each row of the CTE represents one complete path from some input node to some output node. Initially the row carries only metadata and the input value; in each phase we fill in the next value using different CASE expressions.

In phase 0 we take the input and transform it into the input layer’s output; since the input layer has no logic, we just copy the value.
In phase 1 we calculate the inputs for the next layer by multiplying weights and values (the sum is divided by 3 because in the joined rows every first-layer weight appears three times, once per output node).
In phase 2 we activate the hidden layer. Since we use ReLU, this is a very simple comparison.
In phase 3 we once again use weights and values to calculate the input for the next layer, this time with the second set of weights (again divided by 3, as every second-layer weight appears once per input node).
In phase 4 we activate the output layer, which just copies the values (since we use a linear activation function).
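
By the way, if you want to peek at an intermediate phase instead of only the final output, you can swap the final SELECT, for example to look at the hidden layer after ReLU:

SELECT DISTINCT weight1OutputNodeNumber, hiddenLayerOutput
FROM solution WHERE phase = 5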

So in our query we start by defining the schema. We simply join all the tables and cross join a dummy one-row table which we use to define the additional columns. We fill these columns in later throughout the process.

In the recursive part of the CTE we either rewrite the values as they are or apply some logic, depending on the phase number.

You can see the results here.

Next time we will see how to backpropagate errors.

Machine Learning Part 6 — Matrix multiplication in SQL (https://blog.adamfurmanek.pl/2019/06/29/machine-learning-part-6/)

This is the sixth part of the ML series. For your convenience you can find other parts in the table of contents in Part 1 – Linear regression in MXNet.

Today we are going to implement matrix multiplication in Redshift. Let’s go.

First, let’s see what we want to calculate:

    \begin{gather*} \left[\begin{array}{cc}2&3\\4&5\end{array}\right] \left[\begin{array}{cc}5&3\\2&4\end{array}\right] = \left[\begin{array}{cc}16&18\\30&32\end{array}\right] \end{gather*}

Nothing fancy. We would like our algorithm to work for matrices of any size, including non-square ones.

Let’s start with matrix representation:

DROP TABLE IF EXISTS matrix1;

CREATE TEMP TABLE matrix1 (
  rowNumber INT,
  columnNumber INT,
  value INT
);

DROP TABLE IF EXISTS matrix2;

CREATE TEMP TABLE matrix2 (
  rowNumber INT,
  columnNumber INT,
  value INT
);

INSERT INTO matrix1 VALUES
   (1, 1, 2)
  ,(1, 2, 3)
  ,(2, 1, 4)
  ,(2, 2, 5)
;

INSERT INTO matrix2 VALUES
   (1, 1, 5)
  ,(1, 2, 3)
  ,(2, 1, 2)
  ,(2, 2, 4)
;

We store each matrix as a set of rows, where each row holds one value for a given row and column. Rows and columns are one-based.

First, we need to calculate the size of the result:

WITH maxWidth AS(
  SELECT MAX(columnNumber) AS width FROM matrix2
),
maxHeight AS (
  SELECT MAX(rowNumber) AS height FROM matrix1
),
resultDimensions AS (
  SELECT width, height FROM maxWidth CROSS JOIN maxHeight
),

So we just get the maximum width and maximum height from the respective matrices. Now, we want to generate all the cells we need to fill:

rowNums AS (
  SELECT (row_number() OVER (ORDER BY 1)) AS rowNumber FROM matrix1 WHERE rowNumber <= (SELECT MAX(height) FROM resultDimensions)
),
columnNums AS (
  SELECT (row_number() OVER (ORDER BY 1)) AS columnNumber FROM matrix2 WHERE columnNumber <= (SELECT width FROM resultDimensions)
),
positions AS (
  SELECT rowNumber, columnNumber FROM rowNums CROSS JOIN columnNums
),

So we basically take the Cartesian product and we are done. Now we would like to get the correct pairs of values for each cell:

pairsForPositions AS (
  SELECT P.rowNumber, P.columnNumber, M1.value AS M1, M2.value AS M2
  FROM positions AS P
  JOIN matrix1 AS M1 ON M1.rowNumber = P.rowNumber
  JOIN matrix2 AS M2 ON M2.columnNumber = P.columnNumber AND M2.rowNumber = M1.columnNumber
),

This is what we get for our sample matrices:

row	column	m1	m2
1	1	2	5
1	1	3	2
1	2	2	3
1	2	3	4
2	1	4	5
2	1	5	2
2	2	4	3
2	2	5	4

Looks good. Now we just need to aggregate the pairs:

results AS (
  SELECT rowNumber, columnNumber, SUM(M1 * M2) AS value
  FROM pairsForPositions
  GROUP BY rowNumber, columnNumber
)
SELECT * FROM results ORDER BY rowNumber, columnNumber

And we are done. You can see the code here.
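As a quick sanity check that the query really handles non-square matrices, you could clear the tables and load, say, a 2×3 and a 3×2 matrix (values picked arbitrarily):

DELETE FROM matrix1;
DELETE FROM matrix2;

INSERT INTO matrix1 VALUES
   (1, 1, 1)
  ,(1, 2, 2)
  ,(1, 3, 3)
  ,(2, 1, 4)
  ,(2, 2, 5)
  ,(2, 3, 6)
;

INSERT INTO matrix2 VALUES
   (1, 1, 7)
  ,(1, 2, 8)
  ,(2, 1, 9)
  ,(2, 2, 10)
  ,(3, 1, 11)
  ,(3, 2, 12)
;

Running the same query should then give a 2×2 result: 58 and 64 in the first row, 139 and 154 in the second.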

Machine Learning Part 4 — Linear regression in T-SQL (https://blog.adamfurmanek.pl/2018/11/10/machine-learning-part-4/)

This is the fourth part of the ML series. For your convenience you can find other parts in the table of contents in Part 1 – Linear regression in MXNet.

This time we are going to implement linear regression as a function. This gives us a little more flexibility in terms of debugging and reading the code later, and it also lets us implement much more complex algorithms. Unfortunately, we can’t use this in Redshift at the time of writing, as it doesn’t support such functions or stored procedures, so I will use T-SQL and test the code with MS SQL 2017. I assume you have a table named samples with the Iris dataset.

We start with declaring a type for the function parameter:

CREATE TYPE SamplesTable 
AS TABLE (id int, feature int, value float, target float)

Next, let’s prepare samples for training:

DECLARE @numbers TABLE (N int)

INSERT INTO @numbers SELECT TOP 5 row_number() OVER(ORDER BY t1.number) AS N FROM master..spt_values AS t1 CROSS JOIN master..spt_values AS t2

DECLARE @samples TABLE(
	sepal_length float
	,sepal_width float
	,petal_length float
	,petal_width float
	,iris varchar(255)
	,is_setosa float
	,is_virginica float
	,sample_id int
)

INSERT INTO @samples SELECT TOP 100 S.*,
CASE WHEN S.iris = 'setosa' THEN 1.0 ELSE 0.0 END AS is_setosa, 
CASE WHEN S.iris = 'virginica' THEN 1.0 ELSE 0.0 END AS is_virginica,
row_number() OVER(ORDER BY (SELECT NULL)) AS sample_id
FROM samples AS S ORDER BY (SELECT ABS(CHECKSUM(NewId()))) 

DECLARE @samplesPivoted SamplesTable

INSERT INTO @samplesPivoted 
SELECT
	S.sample_id,
	N.N,
	CASE
		WHEN N.N = 1 THEN S.sepal_width
		WHEN N.N = 2 THEN S.petal_length
		WHEN N.N = 3 THEN S.petal_width
		WHEN N.N = 4 THEN S.is_setosa
		ELSE S.is_virginica
	END,
	S.sepal_length
FROM @samples AS S CROSS JOIN @numbers AS N

We generate a table with numbers, add the two extra features, and then pivot the samples just like in the previous part.

Finally, our function:

CREATE FUNCTION Train(@samplesPivoted SamplesTable READONLY)
RETURNS @coefficients TABLE(feature int, w float, b float, mse float)
AS
BEGIN
    DECLARE @featureIds TABLE(feature int)
	INSERT INTO @featureIds SELECT DISTINCT feature from @samplesPivoted

	INSERT INTO @coefficients SELECT feature, 0.0, 0.0, -1.0 FROM @featureIds

	DECLARE @gradients TABLE(feature int, gw float, gb float)
	INSERT INTO @gradients SELECT feature, 0.0, 0.0 FROM @featureIds

	DECLARE @learningRate float
	SELECT @learningRate = 0.01

	DECLARE @iterations int
	SELECT @iterations = 500

	DECLARE @currentIteration int
	SELECT @currentIteration = 0

	DECLARE @newCoefficients TABLE(feature int, w float, b float)
	DECLARE @distances TABLE(id int, distance float)
	DECLARE @mse float

	WHILE @currentIteration < @iterations
	BEGIN
		DELETE FROM @newCoefficients
		INSERT INTO @newCoefficients SELECT C.feature, C.w - @learningRate * G.gw, C.b - @learningRate * G.gb FROM @coefficients AS C JOIN @gradients AS G ON C.feature = G.feature

		DELETE FROM @distances;

		INSERT INTO @distances SELECT 
			S.id, 
			SUM(N.w * S.value + N.b) - MAX(S.target)
		FROM 
			@samplesPivoted AS S
			JOIN @newCoefficients AS N ON S.feature = N.feature
		GROUP BY S.id

		SELECT @mse = AVG(D.distance * D.distance) FROM @distances AS D
		
		DELETE FROM @gradients;

		INSERT INTO @gradients SELECT
			S.feature,
			AVG(S.value * D.distance),
			AVG(D.distance)
		FROM 
			@samplesPivoted AS S
			JOIN @distances AS D ON S.id = D.id
		GROUP BY S.feature

		DELETE FROM @coefficients;

		INSERT INTO @coefficients SELECT *, @mse FROM @newCoefficients
		
		SELECT @currentIteration = @currentIteration + 1
	END

	RETURN
END

We extract the featureIds so we can pass basically any dataset for training and it should still work. We initialize the coefficients with default values, do the same with the gradients, and prepare some bookkeeping like the iteration count and the learning rate.

Next, in every iteration we start by calculating the new coefficients based on the old coefficients and the old gradients. We clear the distances table and calculate the distance (the difference between the predicted value and the expected value) for each sample. Then we calculate the mean squared error.

Next, we need to calculate the new gradients. For each feature we calculate the derivatives, as sketched below. Finally, we just need to store the new coefficients and increase the counter.
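
In formulas, with the per-sample distance d_i = \sum_f (w_f x_{i,f} + b_f) - y_i, the gradients the loop computes are (up to a constant factor that gets absorbed into the learning rate):

    \begin{gather*} g_{w_f} = \frac{1}{n} \sum_i x_{i,f} \, d_i \qquad g_{b_f} = \frac{1}{n} \sum_i d_i \end{gather*}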

Now we can execute the code:

SELECT * FROM Train(@samplesPivoted)

And the result is:

feature     w                      b                      mse
----------- ---------------------- ---------------------- ----------------------
1           0.746997439342549      0.282176586393152      0.098274347087078
2           0.563235001391582      0.282176586393152      0.098274347087078
3           0.0230764649956309     0.282176586393152      0.098274347087078
4           0.193704294614636      0.282176586393152      0.098274347087078
5           -0.110068224303597     0.282176586393152      0.098274347087078
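
The mse column tells us how well the fit converged. If you want to turn the coefficients into actual predictions, a minimal sketch (reusing the same pivoted samples and mirroring the distance calculation from inside the function; the variable and column names are just an example) could look like this:

DECLARE @coefficients TABLE(feature int, w float, b float, mse float)
INSERT INTO @coefficients SELECT * FROM Train(@samplesPivoted)

SELECT
	S.id,
	SUM(C.w * S.value + C.b) AS predicted_sepal_length,
	MAX(S.target) AS actual_sepal_length
FROM @samplesPivoted AS S
JOIN @coefficients AS C ON S.feature = C.feature
GROUP BY S.id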

Machine Learning Part 3 — Linear regression in SQL revisited (https://blog.adamfurmanek.pl/2018/11/03/machine-learning-part-3/)

This is the third part of the ML series. For your convenience you can find other parts in the table of contents in Part 1 – Linear regression in MXNet.

Last time we saw how to calculate linear regression for the Iris dataset. However, we had to hardcode all the features. Today we are going to make our query much more flexible. Let’s begin.

We start with the same schema as before. Now, the query, this time for PostgreSQL 10:

WITH RECURSIVE constants AS (
	SELECT 1 AS column_1
	UNION
	SELECT 2 AS column_1
	UNION
	SELECT 3 AS column_1
	UNION
	SELECT 4 AS column_1
	UNION
	SELECT 5 AS column_1
),
extended AS (
	SELECT
		S.*, 
		CASE WHEN S.iris = 'setosa' THEN 1.0 ELSE 0.0 END AS is_setosa, 
		CASE WHEN S.iris = 'virginica' THEN 1.0 ELSE 0.0 END AS is_virginica
	FROM samples AS S order by random() 
),
training AS (
  SELECT * FROM extended LIMIT 100
),
test AS (
  SELECT * FROM extended EXCEPT SELECT * FROM training
),
numbered AS(
    SELECT 
		*, 
		ROW_NUMBER() OVER (ORDER BY (SELECT NULL)) AS row_number 
	FROM training
),
pivoted_training AS (
	SELECT
		N.row_number AS sample, C.column_1 AS feature,
		CAST(CASE
			WHEN C.column_1 = 1 THEN N.sepal_width
			WHEN C.column_1 = 2 THEN N.petal_length
			WHEN C.column_1 = 3 THEN N.petal_width
			WHEN C.column_1 = 4 THEN N.is_setosa
			ELSE N.is_virginica
		END AS float) AS value,
		N.sepal_length AS y
	FROM numbered AS N, constants AS C
),
 learning AS (
  SELECT 
		C.column_1 AS feature,
		CAST(0.0 AS float) as w,
		CAST(0.0 AS float) as b,
		CAST(0.0 AS float) as gw,
		CAST(0.0 AS float) as gb,
		1 as iteration,
		CAST(0.0 AS float) as mse,
		CAST(0.0 AS float) as distance,
		1 as dummy
  FROM constants AS C
	  
  UNION ALL
  
  SELECT R.feature, R.w, R.b, R.gw, R.gb, R.iteration, R.mse, R.distance, R.dummy
  FROM (
	  SELECT
		  CAST(Z.w AS float) AS w,
		  CAST(Z.b AS float) AS b,
		  CAST(AVG(Z.gw) OVER(PARTITION BY Z.feature) AS float) AS gw,
		  CAST(AVG(Z.gb) OVER(PARTITION BY Z.feature) AS float) AS gb, 
		  Z.iteration + 1 AS iteration,
		  Z.feature,
		  CAST(AVG(Z.squared_distance) OVER(PARTITION BY Z.dummy) AS float) AS mse,
		  Z.sample AS sample,
		  CAST(Z.distance AS FLOAT) AS distance,
		  Z.dummy
	  FROM (
		SELECT
		  X.*, 
		  X.distance * x.distance AS squared_distance,
		  X.distance * X.value AS gw,
		  X.distance AS gb
		FROM (
			SELECT 
				K.*,
				SUM(K.value * K.w + K.b) OVER(PARTITION BY K.sample) - K.y AS distance
			FROM (
			  SELECT
				T.*,
				L.w,
				L.b,
				L.iteration,
				L.dummy
			  FROM pivoted_training AS T INNER JOIN (
				SELECT
				  L.w - 0.01 * L.gw AS w,
				  L.b - 0.01 * L.gb AS b,
				  L.feature,
				  L.iteration,
				  MAX(L.iteration) OVER(PARTITION BY L.dummy) AS max_iteration,
				  L.dummy
				FROM learning AS L
			  ) AS L ON T.feature = L.feature AND L.iteration = max_iteration 
			  WHERE 
				L.iteration < 100
			) AS K
		) AS X
	  ) AS Z
  ) AS R
  WHERE R.sample = 1
)
SELECT * FROM learning

Uuu, terrible. Let’s go step by step.

First, constants is just a table with some numbers. We have 5 features so we have 5 rows there. This could be done much more easily with a recursive CTE or any other dynamic solution.
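
For example, in PostgreSQL the same five rows could be produced with generate_series, so the list doesn’t have to be edited by hand when the number of features changes:

constants AS (
	SELECT generate_series(1, 5) AS column_1
),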

Next, extended: we just add two more features and randomize the rows.

training, test, and numbered are just tables for bookkeeping the samples.

pivoted_training: here comes some magic. We don’t want rows with all the features inside; we want one row per sample per feature. So we do the translation and emit rows with the sample number, feature id, feature value, and target variable.
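
For instance, the classic first setosa sample (sepal_length 5.1, sepal_width 3.5, petal_length 1.4, petal_width 0.2) would become five rows looking roughly like this:

sample	feature	value	y
1	1	3.5	5.1
1	2	1.4	5.1
1	3	0.2	5.1
1	4	1	5.1
1	5	0	5.1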

Next comes our recursive CTE for training. We start with some rows representing the coefficients for each feature. We initialize w and b, as well as the iteration counter and the distances. We have a dummy column again.

Next, we do the calculation. Let’s go from the middle.

We do the training in a similar way as before. Assuming we have 100 samples, each with 5 features, the inner join target has 5\cdot i rows, where i stands for the number of iterations. We join this with the samples (100 \cdot 5 rows) based on the feature id and the maximum iteration. So finally we should have 100 \cdot 5 rows per iteration.

Next, we calculate the distance for each sample (this is the PARTITION BY K.sample part).

Next, we square the distance and calculate gradient coefficients for each sample.

Finally, we cast the variables and calculate the final gradients by averaging over the samples for each feature. We also calculate the mse and we are almost done.

The only tricky part is how to get exactly 5 rows representing the new coefficients. This is done with WHERE R.sample = 1: every sample carries exactly the same coefficient values, so we can just take any of them.
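
As a side note, if you only care about the final coefficients rather than the whole training history, you can filter the outer query on the last iteration, for example:

SELECT * FROM learning
WHERE iteration = (SELECT MAX(iteration) FROM learning)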

Finally, we get our training results with SELECT * FROM learning. You can see it here:

feature	w	b	gw	gb	iteration	mse	distance	dummy
3	0	0	0	0	1	0	0	1
2	0	0	0	0	1	0	0	1
1	0	0	0	0	1	0	0	1
5	0	0	0	0	1	0	0	1
4	0	0	0	0	1	0	0	1
1	0	0	-16.52	-5	2	25.01	-5.1	1
2	0	0	-7.245	-5	2	25.01	-5.1	1
3	0	0	-0.755	-5	2	25.01	-5.1	1
4	0	0	-5	-5	2	25.01	-5.1	1
5	0	0	0	-5	2	25.01	-5.1	1
1	0.1652	0.05	-13.3746025	-4.048655	3	16.39653605105	-4.11886	1
2	0.07245	0.05	-5.8670395	-4.048655	3	16.39653605105	-4.11886	1
3	0.00755	0.05	-0.6108085	-4.048655	3	16.39653605105	-4.11886	1
4	0.05	0.05	-4.048655	-4.048655	3	16.39653605105	-4.11886	1
5	0	0.05	0	-4.048655	3	16.39653605105	-4.11886	1
1	0.298946025	0.09048655	-10.8278890377	-3.278385532	4	10.7499354232339	-3.3244694425	1
2	0.131120395	0.09048655	-4.751354825875	-3.278385532	4	10.7499354232339	-3.3244694425	1
3	0.013658085	0.09048655	-0.494062025325	-3.278385532	4	10.7499354232339	-3.3244694425	1
4	0.09048655	0.09048655	-3.278385532	-3.278385532	4	10.7499354232339	-3.3244694425	1
5	0	0.09048655	0	-3.278385532	4	10.7499354232339	-3.3244694425	1
1	0.407224915377	0.12327040532	-8.76590822437997	-2.65472632382273	5	7.04827711689888	-2.6812831026476	1
2	0.17863394325875	0.12327040532	-3.84802533060171	-2.65472632382273	5	7.04827711689888	-2.6812831026476	1
3	0.01859870525325	0.12327040532	-0.399536787514652	-2.65472632382273	5	7.04827711689888	-2.6812831026476	1
4	0.12327040532	0.12327040532	-2.65472632382273	-2.65472632382273	5	7.04827711689888	-2.6812831026476	1
5	0	0.12327040532	0	-2.65472632382273	5	7.04827711689888	-2.6812831026476	1
1	0.4948839976208	0.149817668558227	-7.09639777302516	-2.14977210051383	6	4.62163562959118	-2.16052130716148	1
2	0.217114196564767	0.149817668558227	-3.11663208541266	-2.14977210051383	6	4.62163562959118	-2.16052130716148	1
3	0.0225940731283965	0.149817668558227	-0.323003275409457	-2.14977210051383	6	4.62163562959118	-2.16052130716148	1
4	0.149817668558227	0.149817668558227	-2.14977210051383	-2.14977210051383	6	4.62163562959118	-2.16052130716148	1
5	0	0.149817668558227	0	-2.14977210051383	6	4.62163562959118	-2.16052130716148	1
1	0.565847975351051	0.171315389563366	-5.7446562719125	-1.74092897782157	7	3.03083789510468	-1.73888220332818	1
2	0.248280517418894	0.171315389563366	-2.52444935656594	-1.74092897782157	7	3.03083789510468	-1.73888220332818	1
3	0.0258241058824911	0.171315389563366	-0.261037007948566	-1.74092897782157	7	3.03083789510468	-1.73888220332818	1
4	0.171315389563366	0.171315389563366	-1.74092897782157	-1.74092897782157	7	3.03083789510468	-1.73888220332818	1
5	0	0.171315389563366	0	-1.74092897782157	7	3.03083789510468	-1.73888220332818	1
1	0.623294538070176	0.188724679341581	-4.65020071011763	-1.40990351099703	8	1.98798177898635	-1.39749913013413	1
2	0.273525010984553	0.188724679341581	-2.04498030998884	-1.40990351099703	8	1.98798177898635	-1.39749913013413	1
3	0.0284344759619768	0.188724679341581	-0.21086530760641	-1.40990351099703	8	1.98798177898635	-1.39749913013413	1
4	0.188724679341581	0.188724679341581	-1.40990351099703	-1.40990351099703	8	1.98798177898635	-1.39749913013413	1
5	0	0.188724679341581	0	-1.40990351099703	8	1.98798177898635	-1.39749913013413	1
1	0.669796545171352	0.202823714451552	-3.76406019771898	-1.14188416444708	9	1.30433157451662	-1.12109643966513	1
2	0.293974814084442	0.202823714451552	-1.65677142468737	-1.14188416444708	9	1.30433157451662	-1.12109643966513	1
3	0.0305431290380408	0.202823714451552	-0.170243238427965	-1.14188416444708	9	1.30433157451662	-1.12109643966513	1
4	0.202823714451552	0.202823714451552	-1.14188416444708	-1.14188416444708	9	1.30433157451662	-1.12109643966513	1
5	0	0.202823714451552	0	-1.14188416444708	9	1.30433157451662	-1.12109643966513	1
1	0.707437147148542	0.214242556096022	-3.04658478966489	-0.924878577539922	10	0.856160630419917	-0.897305996455662	1
2	0.310542528331315	0.214242556096022	-1.3424525664871	-0.924878577539922	10	0.856160630419917	-0.897305996455662	1
3	0.0322455614223205	0.214242556096022	-0.137353157576775	-0.924878577539922	10	0.856160630419917	-0.897305996455662	1
4	0.214242556096022	0.214242556096022	-0.924878577539922	-0.924878577539922	10	0.856160630419917	-0.897305996455662	1
5	0	0.214242556096022	0	-0.924878577539922	10	0.856160630419917	-0.897305996455662	1
1	0.737902995045191	0.223491341871422	-2.46567137030609	-0.749176972878157	11	0.562359311948542	-0.716113771919023	1
2	0.323967053996186	0.223491341871422	-1.08795977072128	-0.749176972878157	11	0.562359311948542	-0.716113771919023	1
3	0.0336190929980882	0.223491341871422	-0.110723385883767	-0.749176972878157	11	0.562359311948542	-0.716113771919023	1
4	0.223491341871422	0.223491341871422	-0.749176972878157	-0.749176972878157	11	0.562359311948542	-0.716113771919023	1
5	0	0.223491341871422	0	-0.749176972878157	11	0.562359311948542	-0.716113771919023	1
1	0.762559708748252	0.230983111600203	-1.99532721675111	-0.606917697531082	12	0.369755786024592	-0.569411772023755	1
2	0.334846651703399	0.230983111600203	-0.881905957695436	-0.606917697531082	12	0.369755786024592	-0.569411772023755	1
3	0.0347263268569259	0.230983111600203	-0.089162358354296	-0.606917697531082	12	0.369755786024592	-0.569411772023755	1
4	0.230983111600203	0.230983111600203	-0.606917697531082	-0.606917697531082	12	0.369755786024592	-0.569411772023755	1
5	0	0.230983111600203	0	-0.606917697531082	12	0.369755786024592	-0.569411772023755	1
1	0.782512980915763	0.237052288575514	-1.61450696845941	-0.491735457602315	13	0.243492987372605	-0.450635249461157	1
2	0.343665711280353	0.237052288575514	-0.715071423930415	-0.491735457602315	13	0.243492987372605	-0.450635249461157	1
3	0.0356179504404689	0.237052288575514	-0.0717053082332894	-0.491735457602315	13	0.243492987372605	-0.450635249461157	1
4	0.237052288575514	0.237052288575514	-0.491735457602315	-0.491735457602315	13	0.243492987372605	-0.450635249461157	1
5	0	0.237052288575514	0	-0.491735457602315	13	0.243492987372605	-0.450635249461157	1
1	0.798658050600357	0.241969643151537	-1.30617096390228	-0.398476506577675	14	0.160720189784965	-0.354468967557447	1
2	0.350816425519658	0.241969643151537	-0.579991311488641	-0.398476506577675	14	0.160720189784965	-0.354468967557447	1
3	0.0363350035228018	0.241969643151537	-0.0575710990356399	-0.398476506577675	14	0.160720189784965	-0.354468967557447	1
4	0.241969643151537	0.241969643151537	-0.398476506577675	-0.398476506577675	14	0.160720189784965	-0.354468967557447	1
5	0	0.241969643151537	0	-0.398476506577675	14	0.160720189784965	-0.354468967557447	1
1	0.81171976023938	0.245954408217314	-1.05652281007167	-0.322968043709101	15	0.106457483619503	-0.276609372867293	1
2	0.356616338634544	0.245954408217314	-0.470621596920286	-0.322968043709101	15	0.106457483619503	-0.276609372867293	1
3	0.0369107145131582	0.245954408217314	-0.0461272730142747	-0.322968043709101	15	0.106457483619503	-0.276609372867293	1
4	0.245954408217314	0.245954408217314	-0.322968043709101	-0.322968043709101	15	0.106457483619503	-0.276609372867293	1
5	0	0.245954408217314	0	-0.322968043709101	15	0.106457483619503	-0.276609372867293	1
1	0.822284988340097	0.249184088654405	-0.854392070294768	-0.261831504289324	16	0.0708847130155275	-0.213572034989326	1
2	0.361322554603747	0.249184088654405	-0.382068654684519	-0.261831504289324	16	0.0708847130155275	-0.213572034989326	1
3	0.0373719872433009	0.249184088654405	-0.0368617521783987	-0.261831504289324	16	0.0708847130155275	-0.213572034989326	1
4	0.249184088654405	0.249184088654405	-0.261831504289324	-0.261831504289324	16	0.0708847130155275	-0.213572034989326	1
5	0	0.249184088654405	0	-0.261831504289324	16	0.0708847130155275	-0.213572034989326	1
1	0.830828909043044	0.251802403697298	-0.690734449052578	-0.212331387591044	17	0.0475642249141984	-0.16253573760171	1
2	0.365143241150592	0.251802403697298	-0.31037029450648	-0.212331387591044	17	0.0475642249141984	-0.16253573760171	1
3	0.0377406047650849	0.251802403697298	-0.0293599256391899	-0.212331387591044	17	0.0475642249141984	-0.16253573760171	1
4	0.251802403697298	0.251802403697298	-0.212331387591044	-0.212331387591044	17	0.0475642249141984	-0.16253573760171	1
5	0	0.251802403697298	0	-0.212331387591044	17	0.0475642249141984	-0.16253573760171	1
1	0.83773625353357	0.253925717573209	-0.558227109840943	-0.172252858358044	18	0.0322757831507861	-0.121216244655038	1
2	0.368246944095657	0.253925717573209	-0.252318475304314	-0.172252858358044	18	0.0322757831507861	-0.121216244655038	1
3	0.0380342040214768	0.253925717573209	-0.0232860980685563	-0.172252858358044	18	0.0322757831507861	-0.121216244655038	1
4	0.253925717573209	0.253925717573209	-0.172252858358044	-0.172252858358044	18	0.0322757831507861	-0.121216244655038	1
5	0	0.253925717573209	0	-0.172252858358044	18	0.0322757831507861	-0.121216244655038	1
1	0.84331852463198	0.255648246156789	-0.450941018789407	-0.139802645192794	19	0.0222527904694817	-0.0877640934587252	1
2	0.3707701288487	0.255648246156789	-0.205315763116255	-0.139802645192794	19	0.0222527904694817	-0.0877640934587252	1
3	0.0382670650021624	0.255648246156789	-0.0183684691922157	-0.139802645192794	19	0.0222527904694817	-0.0877640934587252	1
4	0.255648246156789	0.255648246156789	-0.139802645192794	-0.139802645192794	19	0.0222527904694817	-0.0877640934587252	1
5	0	0.255648246156789	0	-0.139802645192794	19	0.0222527904694817	-0.0877640934587252	1
1	0.847827934819874	0.257046272608717	-0.364075653229332	-0.113528801592203	20	0.0156815899853819	-0.0606818414675168	1
2	0.372823286479863	0.257046272608717	-0.167259110314929	-0.113528801592203	20	0.0156815899853819	-0.0606818414675168	1
3	0.0384507496940845	0.257046272608717	-0.0143869722325961	-0.113528801592203	20	0.0156815899853819	-0.0606818414675168	1
4	0.257046272608717	0.257046272608717	-0.113528801592203	-0.113528801592203	20	0.0156815899853819	-0.0606818414675168	1
5	0	0.257046272608717	0	-0.113528801592203	20	0.0156815899853819	-0.0606818414675168	1
1	0.851468691352167	0.258181560624639	-0.293744204888794	-0.0922557393821863	21	0.0113732297144537	-0.0387570640200821	1
2	0.374495877583012	0.258181560624639	-0.136445755872275	-0.0922557393821863	21	0.0113732297144537	-0.0387570640200821	1
3	0.0385946194164105	0.258181560624639	-0.0111634271392227	-0.0922557393821863	21	0.0113732297144537	-0.0387570640200821	1
4	0.258181560624639	0.258181560624639	-0.0922557393821863	-0.0922557393821863	21	0.0113732297144537	-0.0387570640200821	1
5	0	0.258181560624639	0	-0.0922557393821863	21	0.0113732297144537	-0.0387570640200821	1
1	0.854406133401055	0.259104118018461	-0.236799666746822	-0.0750316276570682	22	0.0085482861437936	-0.0210081050495532	1
2	0.375860335141735	0.259104118018461	-0.111497036233125	-0.0750316276570682	22	0.0085482861437936	-0.0210081050495532	1
3	0.0387062536878027	0.259104118018461	-0.00855356801818448	-0.0750316276570682	22	0.0085482861437936	-0.0210081050495532	1
4	0.259104118018461	0.259104118018461	-0.0750316276570682	-0.0750316276570682	22	0.0085482861437936	-0.0210081050495532	1
5	0	0.259104118018461	0	-0.0750316276570682	22	0.0085482861437936	-0.0210081050495532	1
1	0.856774130068523	0.259854434295032	-0.190694021896671	-0.0610858036175914	23	0.00669580423005917	-0.00664015341069035	1
2	0.376975305504066	0.259854434295032	-0.0912966977558526	-0.0610858036175914	23	0.00669580423005917	-0.00664015341069035	1
3	0.0387917893679846	0.259854434295032	-0.00644058803229366	-0.0610858036175914	23	0.00669580423005917	-0.00664015341069035	1
4	0.259854434295032	0.259854434295032	-0.0610858036175914	-0.0610858036175914	23	0.00669580423005917	-0.00664015341069035	1
5	0	0.259854434295032	0	-0.0610858036175914	23	0.00669580423005917	-0.00664015341069035	1
1	0.85868107028749	0.260465292331207	-0.153364233899679	-0.049794289678438	24	0.00548082479888949	0.00499032051739423	1
2	0.377888272481624	0.260465292331207	-0.0749409505435267	-0.049794289678438	24	0.00548082479888949	0.00499032051739423	1
3	0.0388561952483075	0.260465292331207	-0.00472991294197409	-0.049794289678438	24	0.00548082479888949	0.00499032051739423	1
4	0.260465292331207	0.260465292331207	-0.049794289678438	-0.049794289678438	24	0.00548082479888949	0.00499032051739423	1
5	0	0.260465292331207	0	-0.049794289678438	24	0.00548082479888949	0.00499032051739423	1
1	0.860214712626487	0.260963235227992	-0.123139937329302	-0.0406518739267479	25	0.00468376366185414	0.0144043592180827	1
2	0.37863768198706	0.260963235227992	-0.0616980288510259	-0.0406518739267479	25	0.00468376366185414	0.0144043592180827	1
3	0.0389034943777272	0.260963235227992	-0.00334496943177065	-0.0406518739267479	25	0.00468376366185414	0.0144043592180827	1
4	0.260963235227992	0.260963235227992	-0.0406518739267479	-0.0406518739267479	25	0.00468376366185414	0.0144043592180827	1
5	0	0.260963235227992	0	-0.0406518739267479	25	0.00468376366185414	0.0144043592180827	1
1	0.86144611199978	0.261369753967259	-0.0986686981684429	-0.0332495046867876	26	0.00416067128862917	0.0220238318029926	1
2	0.37925466227557	0.261369753967259	-0.050975448620331	-0.0332495046867876	26	0.00416067128862917	0.0220238318029926	1
3	0.0389369440720449	0.261369753967259	-0.00222375887852913	-0.0332495046867876	26	0.00416067128862917	0.0220238318029926	1
4	0.261369753967259	0.261369753967259	-0.0332495046867876	-0.0332495046867876	26	0.00416067128862917	0.0220238318029926	1
5	0	0.261369753967259	0	-0.0332495046867876	26	0.00416067128862917	0.0220238318029926	1
1	0.862432798981464	0.261702249014127	-0.0788554998766926	-0.0272559877227097	27	0.0038171808332192	0.0281903103185366	1
2	0.379764416761773	0.261702249014127	-0.0422934970999913	-0.0272559877227097	27	0.0038171808332192	0.0281903103185366	1
3	0.0389591816608302	0.261702249014127	-0.00131608325634414	-0.0272559877227097	27	0.0038171808332192	0.0281903103185366	1
4	0.261702249014127	0.261702249014127	-0.0272559877227097	-0.0272559877227097	27	0.0038171808332192	0.0281903103185366	1
5	0	0.261702249014127	0	-0.0272559877227097	27	0.0038171808332192	0.0281903103185366	1
1	0.863221353980231	0.261974808891354	-0.0628137474641037	-0.0224031671305815	28	0.00359142963020953	0.0331803532034955	1
2	0.380187351732773	0.261974808891354	-0.0352637683560471	-0.0224031671305815	28	0.00359142963020953	0.0331803532034955	1
3	0.0389723424933937	0.261974808891354	-0.000581299052883377	-0.0224031671305815	28	0.00359142963020953	0.0331803532034955	1
4	0.261974808891354	0.261974808891354	-0.0224031671305815	-0.0224031671305815	28	0.00359142963020953	0.0331803532034955	1
5	0	0.261974808891354	0	-0.0224031671305815	28	0.00359142963020953	0.0331803532034955	1
1	0.863849491454872	0.26219884056266	-0.049825597275204	-0.0184739268466894	29	0.0034428632948798	0.037217879747665	1
2	0.380539989416334	0.26219884056266	-0.0295717842574173	-0.0184739268466894	29	0.0034428632948798	0.037217879747665	1
3	0.0389781554839225	0.26219884056266	0.0000135013027143136	-0.0184739268466894	29	0.0034428632948798	0.037217879747665	1
4	0.26219884056266	0.26219884056266	-0.0184739268466894	-0.0184739268466894	29	0.0034428632948798	0.037217879747665	1
5	0	0.26219884056266	0	-0.0184739268466894	29	0.0034428632948798	0.037217879747665	1
1	0.864347747427624	0.262383579831127	-0.039309837460669	-0.0152924759060284	30	0.00334489619415933	0.0404841892400958	1
2	0.380835707258908	0.262383579831127	-0.0249629233210475	-0.0152924759060284	30	0.00334489619415933	0.0404841892400958	1
3	0.0389780204708954	0.262383579831127	0.000494961871401944	-0.0152924759060284	30	0.00334489619415933	0.0404841892400958	1
4	0.262383579831127	0.262383579831127	-0.0152924759060284	-0.0152924759060284	30	0.00334489619415933	0.0404841892400958	1
5	0	0.262383579831127	0	-0.0152924759060284	30	0.00334489619415933	0.0404841892400958	1
1	0.864740845802231	0.262536504590187	-0.0307958819656946	-0.0127164827701165	31	0.00328009998108039	0.0431260731083327	1
2	0.381085336492118	0.262536504590187	-0.0212310278105914	-0.0127164827701165	31	0.00328009998108039	0.0431260731083327	1
3	0.0389730708521813	0.262536504590187	0.000884655378404986	-0.0127164827701165	31	0.00328009998108039	0.0431260731083327	1
4	0.262536504590187	0.262536504590187	-0.0127164827701165	-0.0127164827701165	31	0.00328009998108039	0.0431260731083327	1
5	0	0.262536504590187	0	-0.0127164827701165	31	0.00328009998108039	0.0431260731083327	1
1	0.865048804621888	0.262663669417888	-0.0239027144100668	-0.0106307067788558	32	0.0032370494140968	0.0452623830219308	1
2	0.381297646770224	0.262663669417888	-0.0182091793193802	-0.0106307067788558	32	0.0032370494140968	0.0452623830219308	1
3	0.0389642242983973	0.262663669417888	0.00120004847321096	-0.0106307067788558	32	0.0032370494140968	0.0452623830219308	1
4	0.262663669417888	0.262663669417888	-0.0106307067788558	-0.0106307067788558	32	0.0032370494140968	0.0452623830219308	1
5	0	0.262663669417888	0	-0.0106307067788558	32	0.0032370494140968	0.0452623830219308	1
1	0.865287831765988	0.262769976485677	-0.018321839715122	-0.0089418417691709	33	0.00320825461791578	0.0469893488465392	1
2	0.381479738563418	0.262769976485677	-0.0157622300960833	-0.0089418417691709	33	0.00320825461791578	0.0469893488465392	1
3	0.0389522238136652	0.262769976485677	0.00145528326540987	-0.0089418417691709	33	0.00320825461791578	0.0469893488465392	1
4	0.262769976485677	0.262769976485677	-0.0089418417691709	-0.0089418417691709	33	0.00320825461791578	0.0469893488465392	1
5	0	0.262769976485677	0	-0.0089418417691709	33	0.00320825461791578	0.0469893488465392	1
1	0.86547105016314	0.262859394903369	-0.0138034806573657	-0.00757434114092659	34	0.00318880556658337	0.0483848843975334	1
2	0.381637360864379	0.262859394903369	-0.0137807559312666	-0.00757434114092659	34	0.00318880556658337	0.0483848843975334	1
3	0.0389376709810111	0.262859394903369	0.00166181010578401	-0.00757434114092659	34	0.00318880556658337	0.0483848843975334	1
4	0.262859394903369	0.262859394903369	-0.00757434114092659	-0.00757434114092659	34	0.00318880556658337	0.0483848843975334	1
5	0	0.262859394903369	0	-0.00757434114092659	34	0.00318880556658337	0.0483848843975334	1
1	0.865609084969713	0.262935138314778	-0.0101454017209268	-0.00646703756493272	35	0.00317548346748419	0.0495120736518233	1
2	0.381775168423691	0.262935138314778	-0.0121761600299903	-0.00646703756493272	35	0.00317548346748419	0.0495120736518233	1
3	0.0389210528799532	0.262935138314778	0.00182889992609789	-0.00646703756493272	35	0.00317548346748419	0.0495120736518233	1
4	0.262935138314778	0.262935138314778	-0.00646703756493272	-0.00646703756493272	35	0.00317548346748419	0.0495120736518233	1
5	0	0.262935138314778	0	-0.00646703756493272	35	0.00317548346748419	0.0495120736518233	1
1	0.865710538986923	0.262999808690427	-0.00718386017816922	-0.00557040608370096	36	0.0031661782246097	0.0504219934065189	1
2	0.381896930023991	0.262999808690427	-0.0108767087958774	-0.00557040608370096	36	0.0031661782246097	0.0504219934065189	1
3	0.0389027638806923	0.262999808690427	0.00196405906195585	-0.00557040608370096	36	0.0031661782246097	0.0504219934065189	1
4	0.262999808690427	0.262999808690427	-0.00557040608370096	-0.00557040608370096	36	0.0031661782246097	0.0504219934065189	1
5	0	0.262999808690427	0	-0.00557040608370096	36	0.0031661782246097	0.0504219934065189	1
1	0.865782377588704	0.263055512751264	-0.00478627950938307	-0.00484434814385226	37	0.0031595065536436	0.0511559986827947	1
2	0.38200569711195	0.263055512751264	-0.00982432214991813	-0.00484434814385226	37	0.0031595065536436	0.0511559986827947	1
3	0.0388831232900727	0.263055512751264	0.00207336511975451	-0.00484434814385226	37	0.0031595065536436	0.0511559986827947	1
4	0.263055512751264	0.263055512751264	-0.00484434814385226	-0.00484434814385226	37	0.0031595065536436	0.0511559986827947	1
5	0	0.263055512751264	0	-0.00484434814385226	37	0.0031595065536436	0.0511559986827947	1
1	0.865830240383798	0.263103956232703	-0.00284531733771712	-0.00425639740791617	38	0.00315456163536681	0.051747573134115	1
2	0.382103940333449	0.263103956232703	-0.00897197476858001	-0.00425639740791617	38	0.00315456163536681	0.051747573134115	1
3	0.0388623896388752	0.263103956232703	0.00216173891591414	-0.00425639740791617	38	0.00315456163536681	0.051747573134115	1
4	0.263103956232703	0.263103956232703	-0.00425639740791617	-0.00425639740791617	38	0.00315456163536681	0.051747573134115	1
5	0	0.263103956232703	0	-0.00425639740791617	38	0.00315456163536681	0.051747573134115	1
1	0.865858693557175	0.263146520206782	-0.00127406245226594	-0.00378026706552692	39	0.00315074899967499	0.0522238272543376	1
2	0.382193660081135	0.263146520206782	-0.00828159196100726	-0.00378026706552692	39	0.00315074899967499	0.0522238272543376	1
3	0.038840772249716	0.263146520206782	0.00223316465616419	-0.00378026706552692	39	0.00315074899967499	0.0522238272543376	1
4	0.263146520206782	0.263146520206782	-0.00378026706552692	-0.00378026706552692	39	0.00315074899967499	0.0522238272543376	1
5	0	0.263146520206782	0	-0.00378026706552692	39	0.00315074899967499	0.0522238272543376	1
1	0.865871434181698	0.263184322877437	-0.00000214601263577585	-0.00339467364422052	40	0.00314767893851267	0.0526067114222393	1
2	0.382276476000745	0.263184322877437	-0.00772234603744276	-0.00339467364422052	40	0.00314767893851267	0.0526067114222393	1
3	0.0388184406031544	0.263184322877437	0.00229086820668991	-0.00339467364422052	40	0.00314767893851267	0.0526067114222393	1
4	0.263184322877437	0.263184322877437	-0.00339467364422052	-0.00339467364422052	40	0.00314767893851267	0.0526067114222393	1
5	0	0.263184322877437	0	-0.00339467364422052	40	0.00314767893851267	0.0526067114222393	1
1	0.865871455641824	0.263218269613879	0.00102740706694644	-0.00308238469191702	41	0.00314509597662611	0.052913998059446	1
2	0.38235369946112	0.263218269613879	-0.00726927694084783	-0.00308238469191702	41	0.00314509597662611	0.052913998059446	1
3	0.0387955319210875	0.263218269613879	0.0023374614337806	-0.00308238469191702	41	0.00314509597662611	0.052913998059446	1
4	0.263218269613879	0.263218269613879	-0.00308238469191702	-0.00308238469191702	41	0.00314509597662611	0.052913998059446	1
5	0	0.263218269613879	0	-0.00308238469191702	41	0.00314509597662611	0.052913998059446	1
1	0.865861181571155	0.263249093460799	0.00186072743721444	-0.00282944772011939	42	0.00314283263575629	0.0531600768479228	1
2	0.382426392230528	0.263249093460799	-0.00690217542257524	-0.00282944772011939	42	0.00314283263575629	0.0531600768479228	1
3	0.0387721573067497	0.263249093460799	0.0023750590703842	-0.00282944772011939	42	0.00314283263575629	0.0531600768479228	1
4	0.263249093460799	0.263249093460799	-0.00282944772011939	-0.00282944772011939	42	0.00314283263575629	0.0531600768479228	1
5	0	0.263249093460799	0	-0.00282944772011939	42	0.00314283263575629	0.0531600768479228	1
1	0.865842574296783	0.263277387938	0.00253516540503397	-0.0026245659073183	43	0.00314077912452132	0.0533565985886035	1
2	0.382495413984754	0.263277387938	-0.00660467879040763	-0.0026245659073183	43	0.00314077912452132	0.0533565985886035	1
3	0.0387484067160458	0.263277387938	0.00240537333869835	-0.0026245659073183	43	0.00314077912452132	0.0533565985886035	1
4	0.263277387938	0.263277387938	-0.0026245659073183	-0.0026245659073183	43	0.00314077912452132	0.0533565985886035	1
5	0	0.263277387938	0	-0.0026245659073183	43	0.00314077912452132	0.0533565985886035	1
1	0.865817222642732	0.263303633597073	0.00308096215279101	-0.00245859262879344	44	0.00313886346846463	0.0535129965102534	1
2	0.382561460772658	0.263303633597073	-0.00636353876870284	-0.00245859262879344	44	0.00313886346846463	0.0535129965102534	1
3	0.0387243529826588	0.263303633597073	0.00242979056263333	-0.00245859262879344	44	0.00313886346846463	0.0535129965102534	1
4	0.263303633597073	0.263303633597073	-0.00245859262879344	-0.00245859262879344	44	0.00313886346846463	0.0535129965102534	1
5	0	0.263303633597073	0	-0.00245859262879344	44	0.00313886346846463	0.0535129965102534	1
1	0.865786413021204	0.263328219523361	0.00352260286385637	-0.00232412219580569	45	0.00313703848420755	0.05363690835427	1
2	0.382625096160345	0.263328219523361	-0.00616802871142204	-0.00232412219580569	45	0.00313703848420755	0.05363690835427	1
3	0.0387000550770325	0.263328219523361	0.00244943319813293	-0.00232412219580569	45	0.00313703848420755	0.05363690835427	1
4	0.263328219523361	0.263328219523361	-0.00232412219580569	-0.00232412219580569	45	0.00313703848420755	0.05363690835427	1
5	0	0.263328219523361	0	-0.00232412219580569	45	0.00313703848420755	0.05363690835427	1
1	0.865751186992566	0.263351460745319	0.00387991229892592	-0.00221515849204623	46	0.00313527324028808	0.0537345181213462	1
2	0.382686776447459	0.263351460745319	-0.00600946364413666	-0.00221515849204623	46	0.00313527324028808	0.0537345181213462	1
3	0.0386755607450512	0.263351460745319	0.00246521005686269	-0.00221515849204623	46	0.00313527324028808	0.0537345181213462	1
4	0.263351460745319	0.263351460745319	-0.00221515849204623	-0.00221515849204623	46	0.00313527324028808	0.0537345181213462	1
5	0	0.263351460745319	0	-0.00221515849204623	46	0.00313527324028808	0.0537345181213462	1
1	0.865712387869576	0.263373612330239	0.00416894184430072	-0.00212684668063279	47	0.00313354745927142	0.0538108327713118	1
2	0.382746871083901	0.263373612330239	-0.00588081165951478	-0.00212684668063279	47	0.00313354745927142	0.0538108327713118	1
3	0.0386509086444825	0.263373612330239	0.00247785697050231	-0.00212684668063279	47	0.00313354745927142	0.0538108327713118	1
4	0.263373612330239	0.263373612330239	-0.00212684668063279	-0.00212684668063279	47	0.00313354745927142	0.0538108327713118	1
5	0	0.263373612330239	0	-0.00212684668063279	47	0.00313354745927142	0.0538108327713118	1
1	0.865670698451134	0.263394880797046	0.00440268772252717	-0.00205525597704881	48	0.00313184784802383	0.0538699062568924	1
2	0.382805679200496	0.263394880797046	-0.00577637927841784	-0.00205525597704881	48	0.00313184784802383	0.0538699062568924	1
3	0.0386261300747775	0.263394880797046	0.00248796971513974	-0.00205525597704881	48	0.00313184784802383	0.0538699062568924	1
4	0.263394880797046	0.263394880797046	-0.00205525597704881	-0.00205525597704881	48	0.00313184784802383	0.0538699062568924	1
5	0	0.263394880797046	0	-0.00205525597704881	48	0.00313184784802383	0.0538699062568924	1
1	0.865626671573908	0.263415433356816	0.00459167250139285	-0.00199720376830514	49	0.00313016569200614	0.053915020915694	1
2	0.38286344299328	0.263415433356816	-0.00569155669824242	-0.00199720376830514	49	0.00313016569200614	0.053915020915694	1
3	0.0386012503776261	0.263415433356816	0.00249603066895419	-0.00199720376830514	49	0.00313016569200614	0.053915020915694	1
4	0.263415433356816	0.263415433356816	-0.00199720376830514	-0.00199720376830514	49	0.00313016569200614	0.053915020915694	1
5	0	0.263415433356816	0	-0.00199720376830514	49	0.00313016569200614	0.053915020915694	1
1	0.865580754848894	0.263435405394499	0.00474441592057651	-0.00195011320863214	50	0.00312849527820012	0.0539488343366807	1
2	0.382920358560262	0.263435405394499	-0.00562261152978225	-0.00195011320863214	50	0.00312849527820012	0.0539488343366807	1
3	0.0385762900709366	0.263435405394499	0.00250243039597082	-0.00195011320863214	50	0.00312849527820012	0.0539488343366807	1
4	0.263435405394499	0.263435405394499	-0.00195011320863214	-0.00195011320863214	50	0.00312849527820012	0.0539488343366807	1
5	0	0.263435405394499	0	-0.00195011320863214	50	0.00312849527820012	0.0539488343366807	1
1	0.865533310689689	0.263454906526586	0.00486781610281425	-0.00191189791990531	51	0.00312683286124984	0.0539734982726037	1
2	0.38297658467556	0.263454906526586	-0.00556652179348816	-0.00191189791990531	51	0.00312683286124984	0.0539734982726037	1
3	0.0385512657669769	0.263454906526586	0.00250748512163965	-0.00191189791990531	51	0.00312683286124984	0.0539734982726037	1
4	0.263454906526586	0.263454906526586	-0.00191189791990531	-0.00191189791990531	51	0.00312683286124984	0.0539734982726037	1
5	0	0.263454906526586	0	-0.00191189791990531	51	0.00312683286124984	0.0539734982726037	1
1	0.86548463252866	0.263474025505785	0.00496745820668987	-0.00188086863778159	52	0.00312517598571044	0.0539907549190639	1
2	0.383032249893495	0.263474025505785	-0.00552084070262557	-0.00188086863778159	52	0.00312517598571044	0.0539907549190639	1
3	0.0385261909157605	0.263474025505785	0.00251145088217504	-0.00188086863778159	52	0.00312517598571044	0.0539907549190639	1
4	0.263474025505785	0.263474025505785	-0.00188086863778159	-0.00188086863778159	52	0.00312517598571044	0.0539907549190639	1
5	0	0.263474025505785	0	-0.00188086863778159	52	0.00312517598571044	0.0539907549190639	1
1	0.865434957946593	0.263492834192163	0.00504786433157811	-0.00185565762646966	53	0.00312352304174514	0.0540020148681704	1
2	0.383087458300521	0.263492834192163	-0.00548358718311302	-0.00185565762646966	53	0.00312352304174514	0.0540020148681704	1
3	0.0385010764069387	0.263492834192163	0.00251453498076155	-0.00185565762646966	53	0.00312352304174514	0.0540020148681704	1
4	0.263492834192163	0.263492834192163	-0.00185565762646966	-0.00185565762646966	53	0.00312352304174514	0.0540020148681704	1
5	0	0.263492834192163	0	-0.00185565762646966	53	0.00312352304174514	0.0540020148681704	1
1	0.865384479303278	0.263511390768427	0.00511269585651886	-0.0018351574801394	54	0.0031218729738595	0.054008420224755	1
2	0.383142294172352	0.263511390768427	-0.00545315723144686	-0.0018351574801394	54	0.0031218729738595	0.054008420224755	1
3	0.0384759310571311	0.263511390768427	0.00251690526322381	-0.0018351574801394	54	0.0031218729738595	0.054008420224755	1
4	0.263511390768427	0.263511390768427	-0.0018351574801394	-0.0018351574801394	54	0.0031218729738595	0.054008420224755	1
5	0	0.263511390768427	0	-0.0018351574801394	54	0.0031218729738595	0.054008420224755	1
1	0.865333352344712	0.263529742343229	0.00516491726669041	-0.00181847157263526	55	0.00312022508996155	0.0540108957092986	1
2	0.383196825744667	0.263529742343229	-0.00542825214441783	-0.00181847157263526	55	0.00312022508996155	0.0540108957092986	1
3	0.0384507620044989	0.263529742343229	0.00251869762820141	-0.00181847157263526	55	0.00312022508996155	0.0540108957092986	1
4	0.263529742343229	0.263529742343229	-0.00181847157263526	-0.00181847157263526	55	0.00312022508996155	0.0540108957092986	1
5	0	0.263529742343229	0	-0.00181847157263526	55	0.00312022508996155	0.0540108957092986	1
1	0.865281703172046	0.263547927058955	0.0052069287976972	-0.00180487393842599	56	0.00311857893618953	0.0540101900340888	1
2	0.383251108266111	0.263547927058955	-0.00540782040934343	-0.00180487393842599	56	0.00311857893618953	0.0540101900340888	1
3	0.0384255750282169	0.263547927058955	0.00252002210786184	-0.00180487393842599	56	0.00311857893618953	0.0540101900340888	1
4	0.263547927058955	0.263547927058955	-0.00180487393842599	-0.00180487393842599	56	0.00311857893618953	0.0540101900340888	1
5	0	0.263547927058955	0	-0.00180487393842599	56	0.00311857893618953	0.0540101900340888	1
1	0.865229633884069	0.263565975798339	0.00524067383281657	-0.00179377678967141	57	0.0031169342148546	0.0540069094039897	1
2	0.383305186470204	0.263565975798339	-0.0053910106547066	-0.00179377678967141	57	0.0031169342148546	0.0540069094039897	1
3	0.0384003748071382	0.263565975798339	0.00252096779123234	-0.00179377678967141	57	0.0031169342148546	0.0540069094039897	1
4	0.263565975798339	0.263565975798339	-0.00179377678967141	-0.00179377678967141	57	0.0031169342148546	0.0540069094039897	1
5	0	0.263565975798339	0	-0.00179377678967141	57	0.0031169342148546	0.0540069094039897	1
1	0.86517722714574	0.263583913566236	0.00526772585866157	-0.00178470421596755	58	0.00311529073064823	0.0540015446408049	1
2	0.383359096576751	0.263583913566236	-0.00537713355599156	-0.00178470421596755	58	0.00311529073064823	0.0540015446408049	1
3	0.0383751651292259	0.263583913566236	0.00252160681044349	-0.00178470421596755	58	0.00311529073064823	0.0540015446408049	1
4	0.263583913566236	0.263583913566236	-0.00178470421596755	-0.00178470421596755	58	0.00311529073064823	0.0540015446408049	1
5	0	0.263583913566236	0	-0.00178470421596755	58	0.00311529073064823	0.0540015446408049	1
1	0.865124549887154	0.263601760608396	0.00528935886997606	-0.00177727088999946	59	0.00311364835537781	0.0539944931448719	1
2	0.383412867912311	0.263601760608396	-0.00536563099224279	-0.00177727088999946	59	0.00311364835537781	0.0539944931448719	1
3	0.0383499490611215	0.263601760608396	0.00252199756824365	-0.00177727088999946	59	0.00311364835537781	0.0539944931448719	1
4	0.263601760608396	0.263601760608396	-0.00177727088999946	-0.00177727088999946	59	0.00311364835537781	0.0539944931448719	1
5	0	0.263601760608396	0	-0.00177727088999946	59	0.00311364835537781	0.0539944931448719	1
1	0.865071656298454	0.263619533317296	0.0053066043738685	-0.00177116482627326	60	0.00311200700484911	0.0539860766765781	1
2	0.383466524222234	0.263619533317296	-0.0053560510732388	-0.00177116482627326	60	0.00311200700484911	0.0539860766765781	1
3	0.038324729085439	0.263619533317296	0.00252218735120158	-0.00177116482627326	60	0.00311200700484911	0.0539860766765781	1
4	0.263619533317296	0.263619533317296	-0.00177116482627326	-0.00177116482627326	60	0.00311200700484911	0.0539860766765781	1
5	0	0.263619533317296	0	-0.00177116482627326	60	0.00311200700484911	0.0539860766765781	1
1	0.865018590254715	0.263637244965558	0.00532029754403256	-0.00176613342149867	61	0.00311036662371103	0.0539765557533922	1
2	0.383520084732966	0.263637244965558	-0.00534802791991762	-0.00176613342149867	61	0.00311036662371103	0.0539765557533922	1
3	0.038299507211927	0.263637244965558	0.00252221444551974	-0.00176613342149867	61	0.00311036662371103	0.0539765557533922	1
4	0.263637244965558	0.263637244965558	-0.00176613342149867	-0.00176613342149867	61	0.00311036662371103	0.0539765557533922	1
5	0	0.263637244965558	0	-0.00176613342149867	61	0.00311036662371103	0.0539765557533922	1
1	0.864965387279275	0.263654906299773	0.00533111459015369	-0.00176197215199103	62	0.00310872717552127	0.0539661413066295	1
2	0.383573565012165	0.263654906299773	-0.00534126529331802	-0.00176197215199103	62	0.00310872717552127	0.0539661413066295	1
3	0.0382742850674718	0.263654906299773	0.00252210985013237	-0.00176197215199103	62	0.00310872717552127	0.0539661413066295	1
4	0.263654906299773	0.263654906299773	-0.00176197215199103	-0.00176197215199103	62	0.00310872717552127	0.0539661413066295	1
5	0	0.263654906299773	0	-0.00176197215199103	62	0.00310872717552127	0.0539661413066295	1
1	0.864912076133374	0.263672526021293	0.0053396030145576	-0.00175851542236849	63	0.00310708863623283	0.0539550041194996	1
2	0.383626977665099	0.263672526021293	-0.00533552333952772	-0.00175851542236849	63	0.00310708863623283	0.0539550041194996	1
3	0.0382490639689705	0.263672526021293	0.00252189866373813	-0.00175851542236849	63	0.00310708863623283	0.0539550041194996	1
4	0.263672526021293	0.263672526021293	-0.00175851542236849	-0.00175851542236849	63	0.00310708863623283	0.0539550041194996	1
5	0	0.263672526021293	0	-0.00175851542236849	63	0.00310708863623283	0.0539550041194996	1
1	0.864858680103228	0.263690111175517	0.00534620610990562	-0.00175562915607941	64	0.00310545098992522	0.0539432824687589	1
2	0.383680332898494	0.263690111175517	-0.00533060785755706	-0.00175562915607941	64	0.00310545098992522	0.0539432824687589	1
3	0.0382238449823331	0.263690111175517	0.00252160120783	-0.00175562915607941	64	0.00310545098992522	0.0539432824687589	1
4	0.263690111175517	0.263690111175517	-0.00175562915607941	-0.00175562915607941	64	0.00310545098992522	0.0539432824687589	1
5	0	0.263690111175517	0	-0.00175562915607941	64	0.00310545098992522	0.0539432824687589	1
1	0.864805218042129	0.263707667467078	0.00535128279409584	-0.00175320479621854	65	0.00310381422600463	0.0539310883118667	1
2	0.383733638977069	0.263707667467078	-0.00532636160992115	-0.00175320479621854	65	0.00310381422600463	0.0539310883118667	1
3	0.0381986289702548	0.263707667467078	0.00252123393597148	-0.00175320479621854	65	0.00310381422600463	0.0539310883118667	1
4	0.263707667467078	0.263707667467078	-0.00175320479621854	-0.00175320479621854	65	0.00310381422600463	0.0539310883118667	1
5	0	0.263707667467078	0	-0.00175320479621854	65	0.00310381422600463	0.0539310883118667	1
1	0.864751705214188	0.26372519951504	0.00535512366984955	-0.00175115444821072	66	0.0031021783373702	0.053918512296514	1
2	0.383786902593169	0.26372519951504	-0.00532265728714179	-0.00175115444821072	66	0.0031021783373702	0.053918512296514	1
3	0.0381734166308951	0.26372519951504	0.00252081017000463	-0.00175115444821072	66	0.0031021783373702	0.053918512296514	1
4	0.26372519951504	0.26372519951504	-0.00175115444821072	-0.00175115444821072	66	0.0031021783373702	0.053918512296514	1
5	0	0.26372519951504	0	-0.00175115444821072	66	0.0031021783373702	0.053918512296514	1
1	0.86469815397749	0.263742711059522	0.00535796402758328	-0.00174940694701453	67	0.00310054331921004	0.0539056278166417	1
2	0.38384012916604	0.263742711059522	-0.00531939181135388	-0.00174940694701453	67	0.00310054331921004	0.0539056278166417	1
3	0.0381482085291951	0.263742711059522	0.00252034069613063	-0.00174940694701453	67	0.00310054331921004	0.0539056278166417	1
4	0.263742711059522	0.263742711059522	-0.00174940694701453	-0.00174940694701453	67	0.00310054331921004	0.0539056278166417	1
5	0	0.263742711059522	0	-0.00174940694701453	67	0.00310054331921004	0.0539056278166417	1
1	0.864644574337214	0.263760205128992	0.00535999437335181	-0.0017479046728841	68	0.00309890916821358	0.0538924942964627	1
2	0.383893323084154	0.263760205128992	-0.00531648172414929	-0.0017479046728841	68	0.00309890916821358	0.0538924942964627	1
3	0.0381230051222338	0.263760205128992	0.00251983424753472	-0.0017479046728841	68	0.00309890916821358	0.0538924942964627	1
4	0.263760205128992	0.263760205128992	-0.0017479046728841	-0.0017479046728841	68	0.00309890916821358	0.0538924942964627	1
5	0	0.263760205128992	0	-0.0017479046728841	68	0.00309890916821358	0.0538924942964627	1
1	0.86459097439348	0.263777684175721	0.00536136895295311	-0.00174660097320301	69	0.00309727588205438	0.0538791598494122	1
2	0.383946487901395	0.263777684175721	-0.00531385945227512	-0.00174660097320301	69	0.00309727588205438	0.0538791598494122	1
3	0.0380978067797584	0.263777684175721	0.00251929789515031	-0.00174660097320301	69	0.00309727588205438	0.0538791598494122	1
4	0.263777684175721	0.263777684175721	-0.00174660097320301	-0.00174660097320301	69	0.00309727588205438	0.0538791598494122	1
5	0	0.263777684175721	0	-0.00174660097320301	69	0.00309727588205438	0.0538791598494122	1
1	0.864537360703951	0.263795150185453	0.00536221265356525	-0.00174545807504289	70	0.00309564345905061	0.0538656634309911	1
2	0.383999626495918	0.263795150185453	-0.0053114702841139	-0.00174545807504289	70	0.00309564345905061	0.0538656634309911	1
3	0.0380726138008069	0.263795150185453	0.00251873736404526	-0.00174545807504289	70	0.00309564345905061	0.0538656634309911	1
4	0.263795150185453	0.263795150185453	-0.00174545807504289	-0.00174545807504289	70	0.00309564345905061	0.0538656634309911	1
5	0	0.263795150185453	0	-0.00174545807504289	70	0.00309564345905061	0.0538656634309911	1
1	0.864483738577415	0.263812604766203	0.00536262659176794	-0.00174444539503416	71	0.00309401189794438	0.0538520365818691	1
2	0.384052741198759	0.263812604766203	-0.00530926992164469	-0.00174444539503416	71	0.00309401189794438	0.0538520365818691	1
3	0.0380474264271665	0.263812604766203	0.00251815728959004	-0.00174444539503416	71	0.00309401189794438	0.0538520365818691	1
4	0.263812604766203	0.263812604766203	-0.00174444539503416	-0.00174444539503416	71	0.00309401189794438	0.0538520365818691	1
5	0	0.263812604766203	0	-0.00174444539503416	71	0.00309401189794438	0.0538520365818691	1
1	0.864430112311497	0.263830049220154	0.0053626926379485	-0.00174353817093165	72	0.00309238119775453	0.0538383048391831	1
2	0.384105833897975	0.263830049220154	-0.00530722249835663	-0.00174353817093165	72	0.00309238119775453	0.0538383048391831	1
3	0.0380222448542706	0.263830049220154	0.00251756142486599	-0.00174353817093165	72	0.00309238119775453	0.0538383048391831	1
4	0.263830049220154	0.263830049220154	-0.00174353817093165	-0.00174353817093165	72	0.00309238119775453	0.0538383048391831	1
5	0	0.263830049220154	0	-0.00174353817093165	72	0.00309238119775453	0.0538383048391831	1
1	0.864376485385118	0.263847484601863	0.00536247707956901	-0.00174271635363832	73	0.00309075135768194	0.053824488879239	1
2	0.384158906122959	0.263847484601863	-0.00530529897441943	-0.00174271635363832	73	0.00309075135768194	0.053824488879239	1
3	0.0379970692400219	0.263847484601863	0.00251695280859812	-0.00174271635363832	73	0.00309075135768194	0.053824488879239	1
4	0.263847484601863	0.263847484601863	-0.00174271635363832	-0.00174271635363832	73	0.00309075135768194	0.053824488879239	1
5	0	0.263847484601863	0	-0.00174271635363832	73	0.00309075135768194	0.053824488879239	1
1	0.864322860614322	0.2638649117654	0.00536203358713654	-0.00174196371012991	74	0.00308912237704695	0.0538106054426963	1
2	0.384211959112703	0.2638649117654	-0.00530347583732969	-0.00174196371012991	74	0.00308912237704695	0.0538106054426963	1
3	0.0379718997119359	0.2638649117654	0.00251633390112183	-0.00174196371012991	74	0.00308912237704695	0.0538106054426963	1
4	0.2638649117654	0.2638649117654	-0.00174196371012991	-0.00174196371012991	74	0.00308912237704695	0.0538106054426963	1
5	0	0.2638649117654	0	-0.00174196371012991	74	0.00308912237704695	0.0538106054426963	1
1	0.864269240278451	0.263882331402501	0.00536140561569884	-0.00174126709710842	75	0.00308749425524835	0.0537966680836748	1
2	0.384264993871076	0.263882331402501	-0.00530173404984638	-0.00174126709710842	75	0.00308749425524835	0.0537966680836748	1
3	0.0379467363729247	0.263882331402501	0.0025157066944729	-0.00174126709710842	75	0.00308749425524835	0.0537966680836748	1
4	0.263882331402501	0.263882331402501	-0.00174126709710842	-0.00174126709710842	75	0.00308749425524835	0.0537966680836748	1
5	0	0.263882331402501	0	-0.00174126709710842	75	0.00308749425524835	0.0537966680836748	1
1	0.864215626222294	0.263899744073472	0.00536062834920563	-0.00174061587291829	76	0.0030858669917361	0.0537826877762617	1
2	0.384318011211575	0.263899744073472	-0.00530005819819053	-0.00174061587291829	76	0.0030858669917361	0.0537826877762617	1
3	0.03792157930598	0.263899744073472	0.00251507280152126	-0.00174061587291829	76	0.0030858669917361	0.0537826877762617	1
4	0.263899744073472	0.263899744073472	-0.00174061587291829	-0.00174061587291829	76	0.0030858669917361	0.0537826877762617	1
5	0	0.263899744073472	0	-0.00174061587291829	76	0.0030858669917361	0.0537826877762617	1
1	0.864162019938802	0.263917150232201	0.00535973027479027	-0.00174000142139574	77	0.00308424058599398	0.0537686734055853	1
2	0.384371011793557	0.263917150232201	-0.00529843580237288	-0.00174000142139574	77	0.00308424058599398	0.0537686734055853	1
3	0.0378964285779647	0.263917150232201	0.00251443352813969	-0.00174000142139574	77	0.00308424058599398	0.0537686734055853	1
4	0.263917150232201	0.263917150232201	-0.00174000142139574	-0.00174000142139574	77	0.00308424058599398	0.0537686734055853	1
5	0	0.263917150232201	0	-0.00174000142139574	77	0.00308424058599398	0.0537686734055853	1
1	0.864108422636054	0.263934550246415	0.00535873445743906	-0.0017394167663376	78	0.00308261503752822	0.0537546321654281	1
2	0.384423996151581	0.263934550246415	-0.0052968567577778	-0.0017394167663376	78	0.00308261503752822	0.0537546321654281	1
3	0.0378712842426834	0.263934550246415	0.00251378993163764	-0.0017394167663376	78	0.00308261503752822	0.0537546321654281	1
4	0.263934550246415	0.263934550246415	-0.0017394167663376	-0.0017394167663376	78	0.00308261503752822	0.0537546321654281	1
5	0	0.263934550246415	0	-0.0017394167663376	78	0.00308261503752822	0.0537546321654281	1
1	0.864054835291479	0.263951944414078	0.00535765957200497	-0.00173885625936254	79	0.00308099034585972	0.0537405698801443	1
2	0.384476964719158	0.263951944414078	-0.00529531288305103	-0.00173885625936254	79	0.00308099034585972	0.0537405698801443	1
3	0.037846146343367	0.263951944414078	0.00251314286807096	-0.00173885625936254	79	0.00308099034585972	0.0537405698801443	1
4	0.263951944414078	0.263951944414078	-0.00173885625936254	-0.00173885625936254	79	0.00308099034585972	0.0537405698801443	1
5	0	0.263951944414078	0	-0.00173885625936254	79	0.00308099034585972	0.0537405698801443	1
1	0.864001258695759	0.263969332976672	0.00535652073882113	-0.00173831532717461	80	0.00307936651051865	0.0537264912653121	1
2	0.384529917847989	0.263969332976672	-0.00529379755402753	-0.00173831532717461	80	0.00307936651051865	0.0537264912653121	1
3	0.0378210149146863	0.263969332976672	0.00251249303054815	-0.00173831532717461	80	0.00307936651051865	0.0537264912653121	1
4	0.263969332976672	0.263969332976672	-0.00173831532717461	-0.00173831532717461	80	0.00307936651051865	0.0537264912653121	1
5	0	0.263969332976672	0	-0.00173831532717461	80	0.00307936651051865	0.0537264912653121	1
1	0.863947693488371	0.263986716129944	0.00535533020024959	-0.00173779026693754	81	0.00307774353104218	0.0537124001387799	1
2	0.384582855823529	0.263986716129944	-0.00529230540734531	-0.00173779026693754	81	0.00307774353104218	0.0537124001387799	1
3	0.0377958899843808	0.263986716129944	0.00251184098024524	-0.00173779026693754	81	0.00307774353104218	0.0537124001387799	1
4	0.263986716129944	0.263986716129944	-0.00173779026693754	-0.00173779026693754	81	0.00307774353104218	0.0537124001387799	1
5	0	0.263986716129944	0	-0.00173779026693754	81	0.00307774353104218	0.0537124001387799	1
1	0.863894140186369	0.264004094032613	0.00535409786846586	-0.00173727808059354	82	0.00307612140697126	0.0536982995915292	1
2	0.384635778877603	0.264004094032613	-0.00529083210046677	-0.00173727808059354	82	0.00307612140697126	0.0536982995915292	1
3	0.0377707715745783	0.264004094032613	0.0025111871715171	-0.00173727808059354	82	0.00307612140697126	0.0536982995915292	1
4	0.264004094032613	0.264004094032613	-0.00173727808059354	-0.00173727808059354	82	0.00307612140697126	0.0536982995915292	1
5	0	0.264004094032613	0	-0.00173727808059354	82	0.00307612140697126	0.0536982995915292	1
1	0.863840599207684	0.264021466813419	0.00535283176898006	-0.00173677634071812	83	0.0030745001378501	0.0536841921260311	1
2	0.384688687198607	0.264021466813419	-0.00528937411737873	-0.00173677634071812	83	0.0030745001378501	0.0536841921260311	1
3	0.0377456597028632	0.264021466813419	0.00251053197222975	-0.00173677634071812	83	0.0030745001378501	0.0536841921260311	1
4	0.264021466813419	0.264021466813419	-0.00173677634071812	-0.00173677634071812	83	0.0030745001378501	0.0536841921260311	1
5	0	0.264021466813419	0	-0.00173677634071812	83	0.0030745001378501	0.0536841921260311	1
1	0.863787070889994	0.264038834576826	0.00535153839973894	-0.00173628308190743	84	0.00307287972322492	0.0536700797682599	1
2	0.384741580939781	0.264038834576826	-0.00528792861127414	-0.00173628308190743	84	0.00307287972322492	0.0536700797682599	1
3	0.0377205543831409	0.264038834576826	0.00250987568022225	-0.00173628308190743	84	0.00307287972322492	0.0536700797682599	1
4	0.264038834576826	0.264038834576826	-0.00173628308190743	-0.00173628308190743	84	0.00307287972322492	0.0536700797682599	1
5	0	0.264038834576826	0	-0.00173628308190743	84	0.00307287972322492	0.0536700797682599	1
1	0.863733555505997	0.264056197407645	0.00535022302187017	-0.00173579671284063	85	0.00307126016264287	0.0536559641583807	1
2	0.384794460225894	0.264056197407645	-0.00528649327717998	-0.00173579671284063	85	0.00307126016264287	0.0536559641583807	1
3	0.0376954556263386	0.264056197407645	0.00250921853663497	-0.00173579671284063	85	0.00307126016264287	0.0536559641583807	1
4	0.264056197407645	0.264056197407645	-0.00173579671284063	-0.00173579671284063	85	0.00307126016264287	0.0536559641583807	1
5	0	0.264056197407645	0	-0.00173579671284063	85	0.00307126016264287	0.0536559641583807	1
1	0.863680053275778	0.264073555374774	0.00534888989509494	-0.00173531594507859	86	0.00306964145565276	0.0536418466241928	1
2	0.384847325158666	0.264073555374774	-0.00528506624882752	-0.00173531594507859	86	0.00306964145565276	0.0536418466241928	1
3	0.0376703634409723	0.264073555374774	0.00250856073670178	-0.00173531594507859	86	0.00306964145565276	0.0536418466241928	1
4	0.264073555374774	0.264073555374774	-0.00173531594507859	-0.00173531594507859	86	0.00306964145565276	0.0536418466241928	1
5	0	0.264073555374774	0	-0.00173531594507859	86	0.00306964145565276	0.0536418466241928	1
1	0.863626564376827	0.264090908534225	0.0053475424683492	-0.00173483973540867	87	0.00306802360180356	0.0536277282405804	1
2	0.384900175821154	0.264090908534225	-0.00528364601514202	-0.00173483973540867	87	0.00306802360180356	0.0536277282405804	1
3	0.0376452778336053	0.264090908534225	0.00250790243848815	-0.00173483973540867	87	0.00306802360180356	0.0536277282405804	1
4	0.264090908534225	0.264090908534225	-0.00173483973540867	-0.00173483973540867	87	0.00306802360180356	0.0536277282405804	1
5	0	0.264090908534225	0	-0.00173483973540867	87	0.00306802360180356	0.0536277282405804	1
1	0.863573088952144	0.264108256931579	0.0053461835340788	-0.00173436723917764	88	0.00306640660064492	0.0536136098776474	1
2	0.384953012281305	0.264108256931579	-0.00528223135264883	-0.00173436723917764	88	0.00306640660064492	0.0536136098776474	1
3	0.0376201988092204	0.264108256931579	0.00250724376996461	-0.00173436723917764	88	0.00306640660064492	0.0536136098776474	1
4	0.264108256931579	0.264108256931579	-0.00173436723917764	-0.00173436723917764	88	0.00306640660064492	0.0536136098776474	1
5	0	0.264108256931579	0	-0.00173436723917764	88	0.00306640660064492	0.0536136098776474	1
1	0.863519627116803	0.264125600603971	0.00534481535321064	-0.00173389777249344	89	0.00306479045172715	0.0535994922397016	1
2	0.385005834594832	0.264125600603971	-0.00528082127072525	-0.00173389777249344	89	0.00306479045172715	0.0535994922397016	1
3	0.0375951263715207	0.264125600603971	0.00250658483473574	-0.00173389777249344	89	0.00306479045172715	0.0535994922397016	1
4	0.264125600603971	0.264125600603971	-0.00173389777249344	-0.00173389777249344	89	0.00306479045172715	0.0535994922397016	1
5	0	0.264125600603971	0	-0.00173389777249344	89	0.00306479045172715	0.0535994922397016	1
1	0.863466178963271	0.264142939581695	0.00534343975631799	-0.00173343078162702	90	0.00306317515460085	0.0535853758968088	1
2	0.385058642807539	0.264142939581695	-0.00527941496728097	-0.00173343078162702	90	0.00306317515460085	0.0535853758968088	1
3	0.0375700605231734	0.264142939581695	0.00250592571667774	-0.00173343078162702	90	0.00306317515460085	0.0535853758968088	1
4	0.264142939581695	0.264142939581695	-0.00173343078162702	-0.00173343078162702	90	0.00306317515460085	0.0535853758968088	1
5	0	0.264142939581695	0	-0.00173343078162702	90	0.00306317515460085	0.0535853758968088	1
1	0.863412744565707	0.264160273889512	0.00534205822553266	-0.00173296581823745	91	0.00306156070881689	0.0535712613103438	1
2	0.385111436957212	0.264160273889512	-0.00527801179287338	-0.00173296581823745	91	0.00306156070881689	0.0535712613103438	1
3	0.0375450012660066	0.264160273889512	0.00250526648369345	-0.00173296581823745	91	0.00306156070881689	0.0535712613103438	1
4	0.264160273889512	0.264160273889512	-0.00173296581823745	-0.00173296581823745	91	0.00306156070881689	0.0535712613103438	1
5	0	0.264160273889512	0	-0.00173296581823745	91	0.00306156070881689	0.0535712613103438	1
1	0.863359323983452	0.264177603547694	0.00534067196086156	-0.00173250251931423	92	0.00305994711392649	0.0535571488536783	1
2	0.385164217075141	0.264177603547694	-0.00527661122165526	-0.00173250251931423	92	0.00305994711392649	0.0535571488536783	1
3	0.0375199486011697	0.264177603547694	0.0025046071907525	-0.00173250251931423	92	0.00305994711392649	0.0535571488536783	1
4	0.264177603547694	0.264177603547694	-0.00173250251931423	-0.00173250251931423	92	0.00305994711392649	0.0535571488536783	1
5	0	0.264177603547694	0	-0.00173250251931423	92	0.00305994711392649	0.0535571488536783	1
1	0.863305917263843	0.264194928572887	0.00533928193388257	-0.00173204059093646	93	0.00305833436948097	0.053543038828928	1
2	0.385216983187357	0.264194928572887	-0.00527521282785109	-0.00173204059093646	93	0.00305833436948097	0.053543038828928	1
3	0.0374949025292622	0.264194928572887	0.00250394788235275	-0.00173204059093646	93	0.00305833436948097	0.053543038828928	1
4	0.264194928572887	0.264194928572887	-0.00173204059093646	-0.00173204059093646	93	0.00305833436948097	0.053543038828928	1
5	0	0.264194928572887	0	-0.00173204059093646	93	0.00305833436948097	0.053543038828928	1
1	0.863252524444505	0.264212248978797	0.00533788893123721	-0.00173157979511851	94	0.00305672247503216	0.053528931480523	1
2	0.385269735315636	0.264212248978797	-0.00527381626670392	-0.00173157979511851	94	0.00305672247503216	0.053528931480523	1
3	0.0374698630504386	0.264212248978797	0.0025032885945143	-0.00173157979511851	94	0.00305672247503216	0.053528931480523	1
4	0.264212248978797	0.264212248978797	-0.00173157979511851	-0.00173157979511851	94	0.00305672247503216	0.053528931480523	1
5	0	0.264212248978797	0	-0.00173157979511851	94	0.00305672247503216	0.053528931480523	1
1	0.863199145555192	0.264229564776748	0.00533649358982165	-0.00173111993916608	95	0.00305511143013205	0.0535148270061825	1
2	0.385322473478303	0.264229564776748	-0.00527242125905825	-0.00173111993916608	95	0.00305511143013205	0.0535148270061825	1
3	0.0374448301644935	0.264229564776748	0.00250262935639252	-0.00173111993916608	95	0.00305511143013205	0.0535148270061825	1
4	0.264229564776748	0.264229564776748	-0.00173111993916608	-0.00173111993916608	95	0.00305511143013205	0.0535148270061825	1
5	0	0.264229564776748	0	-0.00173111993916608	95	0.00305511143013205	0.0535148270061825	1
1	0.863145780619294	0.264246875976139	0.00533509642527563	-0.00173066086705953	96	0.00305350123433273	0.0535007255658009	1
2	0.385375197690893	0.264246875976139	-0.00527102757887934	-0.00173066086705953	96	0.00305350123433273	0.0535007255658009	1
3	0.0374198038709296	0.264246875976139	0.00250197019158409	-0.00173066086705953	96	0.00305350123433273	0.0535007255658009	1
4	0.264246875976139	0.264246875976139	-0.00173066086705953	-0.00173066086705953	96	0.00305350123433273	0.0535007255658009	1
5	0	0.264246875976139	0	-0.00173066086705953	96	0.00305350123433273	0.0535007255658009	1
1	0.863092429655041	0.26426418258481	0.00533369785509552	-0.00173020245246347	97	0.00305189188718688	0.0534866272886614	1
2	0.385427907966682	0.26426418258481	-0.00526963504312827	-0.00173020245246347	97	0.00305189188718688	0.0534866272886614	1
3	0.0373947841690137	0.26426418258481	0.00250131111918672	-0.00173020245246347	97	0.00305189188718688	0.0534866272886614	1
4	0.26426418258481	0.26426418258481	-0.00173020245246347	-0.00173020245246347	97	0.00305189188718688	0.0534866272886614	1
5	0	0.26426418258481	0	-0.00173020245246347	97	0.00305189188718688	0.0534866272886614	1
1	0.86303909267649	0.264281484609335	0.00533229821728032	-0.00172974459308683	98	0.00305028338824717	0.0534725322792475	1
2	0.385480604317113	0.264281484609335	-0.00526824350359263	-0.00172974459308683	98	0.00305028338824717	0.0534725322792475	1
3	0.0373697710578219	0.264281484609335	0.00250065215465369	-0.00172974459308683	98	0.00305028338824717	0.0534725322792475	1
4	0.264281484609335	0.264281484609335	-0.00172974459308683	-0.00172974459308683	98	0.00305028338824717	0.0534725322792475	1
5	0	0.264281484609335	0	-0.00172974459308683	98	0.00305028338824717	0.0534725322792475	1
1	0.862985769694318	0.264298782055265	0.00533089778548161	-0.00172928720610077	99	0.0030486757370664	0.05345844062197	1
2	0.385533286752149	0.264298782055265	-0.00526685284024966	-0.00172928720610077	99	0.0030486757370664	0.05345844062197	1
3	0.0373447645362753	0.264298782055265	0.00249999331048842	-0.00172928720610077	99	0.0030486757370664	0.05345844062197	1
4	0.264298782055265	0.264298782055265	-0.00172928720610077	-0.00172928720610077	99	0.0030486757370664	0.05345844062197	1
5	0	0.264298782055265	0	-0.00172928720610077	99	0.0030486757370664	0.05345844062197	1
1	0.862932460716463	0.264316074927326	0.00532949678123926	-0.00172883022443804	100	0.00304706893319777	0.0534443523849859	1
2	0.385585955280552	0.264316074927326	-0.00526546295590635	-0.00172883022443804	100	0.00304706893319777	0.0534443523849859	1
3	0.0373197646031704	0.264316074927326	0.00249933459680549	-0.00172883022443804	100	0.00304706893319777	0.0534443523849859	1
4	0.264316074927326	0.264316074927326	-0.00172883022443804	-0.00172883022443804	100	0.00304706893319777	0.0534443523849859	1
5	0	0.264316074927326	0	-0.00172883022443804	100	0.00304706893319777	0.0534443523849859	1

Now you can evaluate the model.

]]>
https://blog.adamfurmanek.pl/2018/11/03/machine-learning-part-3/feed/ 2
Machine Learning Part 2 — Linear regression in SQL https://blog.adamfurmanek.pl/2018/10/27/machine-learning-part-2/ https://blog.adamfurmanek.pl/2018/10/27/machine-learning-part-2/#comments Sat, 27 Oct 2018 08:00:18 +0000 https://blog.adamfurmanek.pl/?p=2633 Continue reading Machine Learning Part 2 — Linear regression in SQL]]>

This is the second part of the ML series. For your convenience you can find other parts in the table of contents in Part 1 – Linear regression in MXNet

Imagine that you have only a data warehouse with SQL capabilities to train and evaluate your models. Last time we ran Python code to calculate linear regression for the Iris dataset; today we are going to do exactly the same but in SQL.

The code provided below is for MS SQL 2017.

Let’s start with the dataset and schema:

CREATE TABLE samples(
sepal_length float
,sepal_width float
,petal_length float
,petal_width float
,iris varchar(255)
);

INSERT INTO samples
VALUES
(5.1,3.5,1.4,0.2,'setosa'),
(4.9,3,1.4,0.2,'setosa'),
(4.7,3.2,1.3,0.2,'setosa'),
(4.6,3.1,1.5,0.2,'setosa'),
(5,3.6,1.4,0.2,'setosa'),
(5.4,3.9,1.7,0.4,'setosa'),
(4.6,3.4,1.4,0.3,'setosa'),
(5,3.4,1.5,0.2,'setosa'),
(4.4,2.9,1.4,0.2,'setosa'),
(4.9,3.1,1.5,0.1,'setosa'),
(5.4,3.7,1.5,0.2,'setosa'),
(4.8,3.4,1.6,0.2,'setosa'),
(4.8,3,1.4,0.1,'setosa'),
(4.3,3,1.1,0.1,'setosa'),
(5.8,4,1.2,0.2,'setosa'),
(5.7,4.4,1.5,0.4,'setosa'),
(5.4,3.9,1.3,0.4,'setosa'),
(5.1,3.5,1.4,0.3,'setosa'),
(5.7,3.8,1.7,0.3,'setosa'),
(5.1,3.8,1.5,0.3,'setosa'),
(5.4,3.4,1.7,0.2,'setosa'),
(5.1,3.7,1.5,0.4,'setosa'),
(4.6,3.6,1,0.2,'setosa'),
(5.1,3.3,1.7,0.5,'setosa'),
(4.8,3.4,1.9,0.2,'setosa'),
(5,3,1.6,0.2,'setosa'),
(5,3.4,1.6,0.4,'setosa'),
(5.2,3.5,1.5,0.2,'setosa'),
(5.2,3.4,1.4,0.2,'setosa'),
(4.7,3.2,1.6,0.2,'setosa'),
(4.8,3.1,1.6,0.2,'setosa'),
(5.4,3.4,1.5,0.4,'setosa'),
(5.2,4.1,1.5,0.1,'setosa'),
(5.5,4.2,1.4,0.2,'setosa'),
(4.9,3.1,1.5,0.1,'setosa'),
(5,3.2,1.2,0.2,'setosa'),
(5.5,3.5,1.3,0.2,'setosa'),
(4.9,3.1,1.5,0.1,'setosa'),
(4.4,3,1.3,0.2,'setosa'),
(5.1,3.4,1.5,0.2,'setosa'),
(5,3.5,1.3,0.3,'setosa'),
(4.5,2.3,1.3,0.3,'setosa'),
(4.4,3.2,1.3,0.2,'setosa'),
(5,3.5,1.6,0.6,'setosa'),
(5.1,3.8,1.9,0.4,'setosa'),
(4.8,3,1.4,0.3,'setosa'),
(5.1,3.8,1.6,0.2,'setosa'),
(4.6,3.2,1.4,0.2,'setosa'),
(5.3,3.7,1.5,0.2,'setosa'),
(5,3.3,1.4,0.2,'setosa'),
(7,3.2,4.7,1.4,'versicolor'),
(6.4,3.2,4.5,1.5,'versicolor'),
(6.9,3.1,4.9,1.5,'versicolor'),
(5.5,2.3,4,1.3,'versicolor'),
(6.5,2.8,4.6,1.5,'versicolor'),
(5.7,2.8,4.5,1.3,'versicolor'),
(6.3,3.3,4.7,1.6,'versicolor'),
(4.9,2.4,3.3,1,'versicolor'),
(6.6,2.9,4.6,1.3,'versicolor'),
(5.2,2.7,3.9,1.4,'versicolor'),
(5,2,3.5,1,'versicolor'),
(5.9,3,4.2,1.5,'versicolor'),
(6,2.2,4,1,'versicolor'),
(6.1,2.9,4.7,1.4,'versicolor'),
(5.6,2.9,3.6,1.3,'versicolor'),
(6.7,3.1,4.4,1.4,'versicolor'),
(5.6,3,4.5,1.5,'versicolor'),
(5.8,2.7,4.1,1,'versicolor'),
(6.2,2.2,4.5,1.5,'versicolor'),
(5.6,2.5,3.9,1.1,'versicolor'),
(5.9,3.2,4.8,1.8,'versicolor'),
(6.1,2.8,4,1.3,'versicolor'),
(6.3,2.5,4.9,1.5,'versicolor'),
(6.1,2.8,4.7,1.2,'versicolor'),
(6.4,2.9,4.3,1.3,'versicolor'),
(6.6,3,4.4,1.4,'versicolor'),
(6.8,2.8,4.8,1.4,'versicolor'),
(6.7,3,5,1.7,'versicolor'),
(6,2.9,4.5,1.5,'versicolor'),
(5.7,2.6,3.5,1,'versicolor'),
(5.5,2.4,3.8,1.1,'versicolor'),
(5.5,2.4,3.7,1,'versicolor'),
(5.8,2.7,3.9,1.2,'versicolor'),
(6,2.7,5.1,1.6,'versicolor'),
(5.4,3,4.5,1.5,'versicolor'),
(6,3.4,4.5,1.6,'versicolor'),
(6.7,3.1,4.7,1.5,'versicolor'),
(6.3,2.3,4.4,1.3,'versicolor'),
(5.6,3,4.1,1.3,'versicolor'),
(5.5,2.5,4,1.3,'versicolor'),
(5.5,2.6,4.4,1.2,'versicolor'),
(6.1,3,4.6,1.4,'versicolor'),
(5.8,2.6,4,1.2,'versicolor'),
(5,2.3,3.3,1,'versicolor'),
(5.6,2.7,4.2,1.3,'versicolor'),
(5.7,3,4.2,1.2,'versicolor'),
(5.7,2.9,4.2,1.3,'versicolor'),
(6.2,2.9,4.3,1.3,'versicolor'),
(5.1,2.5,3,1.1,'versicolor'),
(5.7,2.8,4.1,1.3,'versicolor'),
(6.3,3.3,6,2.5,'virginica'),
(5.8,2.7,5.1,1.9,'virginica'),
(7.1,3,5.9,2.1,'virginica'),
(6.3,2.9,5.6,1.8,'virginica'),
(6.5,3,5.8,2.2,'virginica'),
(7.6,3,6.6,2.1,'virginica'),
(4.9,2.5,4.5,1.7,'virginica'),
(7.3,2.9,6.3,1.8,'virginica'),
(6.7,2.5,5.8,1.8,'virginica'),
(7.2,3.6,6.1,2.5,'virginica'),
(6.5,3.2,5.1,2,'virginica'),
(6.4,2.7,5.3,1.9,'virginica'),
(6.8,3,5.5,2.1,'virginica'),
(5.7,2.5,5,2,'virginica'),
(5.8,2.8,5.1,2.4,'virginica'),
(6.4,3.2,5.3,2.3,'virginica'),
(6.5,3,5.5,1.8,'virginica'),
(7.7,3.8,6.7,2.2,'virginica'),
(7.7,2.6,6.9,2.3,'virginica'),
(6,2.2,5,1.5,'virginica'),
(6.9,3.2,5.7,2.3,'virginica'),
(5.6,2.8,4.9,2,'virginica'),
(7.7,2.8,6.7,2,'virginica'),
(6.3,2.7,4.9,1.8,'virginica'),
(6.7,3.3,5.7,2.1,'virginica'),
(7.2,3.2,6,1.8,'virginica'),
(6.2,2.8,4.8,1.8,'virginica'),
(6.1,3,4.9,1.8,'virginica'),
(6.4,2.8,5.6,2.1,'virginica'),
(7.2,3,5.8,1.6,'virginica'),
(7.4,2.8,6.1,1.9,'virginica'),
(7.9,3.8,6.4,2,'virginica'),
(6.4,2.8,5.6,2.2,'virginica'),
(6.3,2.8,5.1,1.5,'virginica'),
(6.1,2.6,5.6,1.4,'virginica'),
(7.7,3,6.1,2.3,'virginica'),
(6.3,3.4,5.6,2.4,'virginica'),
(6.4,3.1,5.5,1.8,'virginica'),
(6,3,4.8,1.8,'virginica'),
(6.9,3.1,5.4,2.1,'virginica'),
(6.7,3.1,5.6,2.4,'virginica'),
(6.9,3.1,5.1,2.3,'virginica'),
(5.8,2.7,5.1,1.9,'virginica'),
(6.8,3.2,5.9,2.3,'virginica'),
(6.7,3.3,5.7,2.5,'virginica'),
(6.7,3,5.2,2.3,'virginica'),
(6.3,2.5,5,1.9,'virginica'),
(6.5,3,5.2,2,'virginica'),
(6.2,3.4,5.4,2.3,'virginica'),
(5.9,3,5.1,1.8,'virginica')

Nothing fancy, just a table with Iris data. Next, the training:

WITH transformed AS (
	SELECT TOP 100000
		S.*, 
		CASE WHEN S.iris = 'setosa' THEN 1.0 ELSE 0.0 END AS is_setosa, 
		CASE WHEN S.iris = 'virginica' THEN 1.0 ELSE 0.0 END AS is_virginica
	FROM samples AS S ORDER BY (SELECT ABS(CHECKSUM(NewId())))
),
training AS (
  SELECT TOP 100 * FROM transformed ORDER BY (SELECT RAND())
),
test AS (
  SELECT * FROM transformed EXCEPT SELECT * FROM training
),
learning AS (
  SELECT 
	  CAST(0.0 AS float) as w1, 
	  CAST(0.0 AS float) as w2, 
	  CAST(0.0 AS float) as w3, 
	  CAST(0.0 AS float) as w4,
	  CAST(0.0 AS float) as w5, 
	  CAST(0.0 AS float) as b1, 
	  CAST(0.0 AS float) as b2, 
	  CAST(0.0 AS float) as b3, 
	  CAST(0.0 AS float) as b4, 
	  CAST(0.0 AS float) as b5, 
	  
	  CAST(0.0 AS float) as gw1,
	  
	  CAST(0.0 AS float) as gw2, 
	  CAST(0.0 AS float) as gw3, 
	  CAST(0.0 AS float) as gw4, 
	  CAST(0.0 AS float) as gw5, 
	  CAST(0.0 AS float) as gb1, 
	  CAST(0.0 AS float) as gb2, 
	  CAST(0.0 AS float) as gb3, 
	  CAST(0.0 AS float) as gb4, 
	  CAST(0.0 AS float) as gb5, 
	  1 as iteration,
	  CAST(0.0 AS float) as mse,
	  1 as dummy
	  
  UNION ALL
  SELECT R.w1, R.w2, R.w3, R.w4, R.w5, R.b1, R.b2, R.b3, R.b4, R.b5, R.gw1, R.gw2, R.gw3, R.gw4, R.gw5, R.gb1, R.gb2, R.gb3, R.gb4, R.gb5, R.iteration, R.mse, R.dummy
  FROM (
	  SELECT
		  CAST(Z.w1 AS float) AS w1, 
		  CAST(Z.w2 AS float) AS w2, 
		  CAST(Z.w3 AS float) AS w3, 
		  CAST(Z.w4 AS float) AS w4,
		  CAST(Z.w5 AS float) AS w5, 
		  CAST(Z.b1 AS float) AS b1,
		  CAST(Z.b2 AS float) AS b2, 
		  CAST(Z.b3 AS float) AS b3, 
		  CAST(Z.b4 AS float) AS b4,
		  CAST(Z.b5 AS float) AS b5, 
		  CAST(AVG(Z.gw1) OVER(PARTITION BY Z.iteration) AS float) AS gw1,
		  CAST(AVG(Z.gw2) OVER(PARTITION BY Z.iteration) AS float) AS gw2,
		  CAST(AVG(Z.gw3) OVER(PARTITION BY Z.iteration) AS float) AS gw3, 
		  CAST(AVG(Z.gw4) OVER(PARTITION BY Z.iteration) AS float) AS gw4, 
		  CAST(AVG(Z.gw5) OVER(PARTITION BY Z.iteration) AS float) AS gw5, 
		  CAST(AVG(Z.gb1) OVER(PARTITION BY Z.iteration) AS float) AS gb1, 
		  CAST(AVG(Z.gb2) OVER(PARTITION BY Z.iteration) AS float) AS gb2, 
		  CAST(AVG(Z.gb3) OVER(PARTITION BY Z.iteration) AS float) AS gb3,
		  CAST(AVG(Z.gb4) OVER(PARTITION BY Z.iteration) AS float) AS gb4, 
		  CAST(AVG(z.gb5) OVER(PARTITION BY Z.iteration) AS float) AS gb5,
		  Z.iteration + 1 AS iteration,
		  CAST(AVG(z.squared_distance) OVER(PARTITION BY Z.w1, Z.w2, Z.w3, Z.w4, Z.w5, Z.b1, Z.b2, Z.b3, Z.b4, Z.b5, Z.iteration) AS float) AS mse,
		  Z.dummy AS dummy,
		  ROW_NUMBER() OVER(PARTITION BY Z.dummy ORDER BY Z.dummy) AS row_number
	  FROM (
		SELECT
		  X.*, 
		  X.distance * x.distance AS squared_distance, 
		  X.distance * X.sepal_width AS gw1, 
		  X.distance * X.petal_length AS gw2,
		  X.distance * X.petal_width AS gw3,
		  X.distance * X.is_setosa AS gw4,
		  X.distance * X.is_virginica AS gw5,
		  X.distance AS gb1,
		  X.distance AS gb2,
		  X.distance AS gb3,
		  X.distance AS gb4,
		  X.distance AS gb5,
		  1 as dummy
		FROM (
		  SELECT T.*, L.*, 
		  (T.sepal_width * L.w1 + L.b1) + 
		  (T.petal_length * L.w2 + L.b2) + 
		  (T.petal_width * L.w3 + L.b3) + 
		  (T.is_setosa * L.w4 + L.b4) + 
		  (T.is_virginica * L.w5 + L.b5)
		  - T.sepal_length AS distance
		  FROM training AS T, (
			SELECT
			  l.w1 - 0.01 * l.gw1 AS w1,
			  l.w2 - 0.01 * l.gw2 AS w2,
			  l.w3 - 0.01 * l.gw3 AS w3,
			  l.w4 - 0.01 * l.gw4 AS w4,
			  l.w5 - 0.01 * l.gw5 AS w5,
			  l.b1 - 0.01 * l.gb1 AS b1,
			  l.b2 - 0.01 * l.gb2 AS b2,
			  l.b3 - 0.01 * l.gb3 AS b3,
			  l.b4 - 0.01 * l.gb4 AS b4,
			  l.b5 - 0.01 * l.gb5 AS b5,
			  l.iteration,
			  MAX(l.iteration) OVER(PARTITION BY L.dummy) AS max_iteration
			FROM learning AS L
		  ) AS L
		  WHERE L.iteration = max_iteration
		  AND L.iteration < 100
		) AS X
	  ) AS Z
  ) AS R
  WHERE R.row_number = 1
)
SELECT DISTINCT * FROM learning ORDER BY iteration

Whoa, looks terrible. Let’s go step by step.

First, we get the transformed table with the samples in randomized order and two new features. This is the same as in the Python code; a standalone sketch of this step is shown below.
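If you want to look at just this step in isolation, a minimal sketch of the same idea (the same CASE expressions and the same NEWID() trick used above for shuffling) would be:

SELECT TOP 100000
	S.*,
	CASE WHEN S.iris = 'setosa'    THEN 1.0 ELSE 0.0 END AS is_setosa,    -- one-hot flag for setosa
	CASE WHEN S.iris = 'virginica' THEN 1.0 ELSE 0.0 END AS is_virginica  -- one-hot flag for virginica
FROM samples AS S
ORDER BY (SELECT ABS(CHECKSUM(NEWID())));                                 -- randomize the row order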

Next, we get the training and test tables representing the datasets for training and evaluation respectively.

Next, the learning table. We want to represent the formula Aw + b - y, where A is the matrix of samples, w and b are the vectors of parameters we calculate with linear regression (representing the line), and y is the vector of target variables. gw# and gb# are variables holding the gradients, mse is the mean squared error, and dummy is just a variable we need in the windowing functions, since we cannot use grouping in the recursive part of the CTE.
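In more explicit notation (x_1 through x_5 stand for sepal_width, petal_length, petal_width, is_setosa and is_virginica, y for sepal_length, and n for the number of training samples), the distance, mse and gradient columns boil down to:

    \begin{gather*} d = \sum_{i=1}^{5} \left( x_i w_i + b_i \right) - y \qquad mse = \frac{1}{n} \sum_{samples} d^2 \\ gw_i = \frac{1}{n} \sum_{samples} d \cdot x_i \qquad gb_i = \frac{1}{n} \sum_{samples} d \end{gather*}

Note that gw_i and gb_i are the derivatives of d^2/2 rather than d^2, so the missing factor of 2 is effectively folded into the learning rate.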

Next, we go through the recursive CTE part. Let’s start from the most nested part.

Our initial learning values represent the coefficients together with the gradients calculated in the last iteration. We could start with random values as well; here we start with constants. In the innermost view we do the actual training: for every feature we subtract the gradient multiplied by the learning rate (0.01 here), and this is how we calculate the new coefficients. For performance reasons we also calculate the highest iteration available so far.
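Written as a formula, every recursive step is just one batch gradient descent update with learning rate 0.01:

    \begin{gather*} w_i^{(t+1)} = w_i^{(t)} - 0.01 \cdot gw_i^{(t)} \qquad b_i^{(t+1)} = b_i^{(t)} - 0.01 \cdot gb_i^{(t)} \end{gather*}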

Next, we join the training samples with the coefficients and calculate the actual distance (the signed prediction error). We multiply the feature values by the coefficients, add the biases, and finally subtract the target variable. Just before that we keep only the last iteration (with WHERE L.iteration = max_iteration) to decrease the dataset size. We also limit the number of iterations to 100.

Now we have the distance calculated. We calculate the squared distance and the components of the gradient. Since we need to find the derivatives on our own (and we know the result, don’t we?), we multiply the distance by the features to get the partial derivatives with respect to w, and take just the distance for the partial derivatives with respect to b.
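This is just the chain rule applied to the (halved) squared distance, since the distance is linear in every w_i and b_i:

    \begin{gather*} \frac{\partial}{\partial w_i} \frac{d^2}{2} = d \cdot \frac{\partial d}{\partial w_i} = d \cdot x_i \qquad \frac{\partial}{\partial b_i} \frac{d^2}{2} = d \cdot \frac{\partial d}{\partial b_i} = d \end{gather*}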

Next, we do a lot of ugly casting to satisfy the CTE requirement of uniform data types between the anchor and the recursive member. We also calculate the averages of the gradients for every feature. We partition the dataset by iteration; in practice there is just one iteration per pass, but we need some partition for syntax purposes. We could use Z.dummy as well.
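The casting is there because SQL Server insists that the anchor and the recursive member of a recursive CTE return exactly matching column types. A tiny toy example of the rule (unrelated to the Iris data): with a bare 0.0 literal in the anchor the column would be a narrow NUMERIC, the addition in the recursive part would widen it, and SQL Server would complain that the types don’t match between the anchor and the recursive part, so we cast both sides to float:

WITH counter AS (
	-- anchor: explicit float so the type matches the recursive member below
	SELECT CAST(0.0 AS float) AS x, 1 AS iteration
	UNION ALL
	-- recursive member: cast the result of the addition back to float as well
	SELECT CAST(C.x + 0.1 AS float), C.iteration + 1
	FROM counter AS C
	WHERE C.iteration < 5
)
SELECT * FROM counter;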

Ultimately, we keep just the values from the first row, as all the rows carry the same values. We could skip this filtering, but then our dataset would grow very large and training would take much longer.

And here are the results of the fiddle:

w1	w2	w3	w4	w5	b1	b2	b3	b4	b5	gw1	gw2	gw3	gw4	gw5	gb1	gb2	gb3	gb4	gb5	iteration	mse	dummy
0	0	0	0	0	0	0	0	0	0	0	0	0	0	0	0	0	0	0	0	1	0	1
0	0	0	0	0	0	0	0	0	0	-17.866099999999992	-23.68590000000001	-7.787099999999996	-1.54	-2.298	-5.8580000000000005	-5.8580000000000005	-5.8580000000000005	-5.8580000000000005	-5.8580000000000005	2	34.993599999999994	1
0.17866099999999993	0.23685900000000012	0.07787099999999997	0.0154	0.02298	0.05858000000000001	0.05858000000000001	0.05858000000000001	0.05858000000000001	0.05858000000000001	-12.380883275799999	-15.605772535299998	-5.0526124829	-1.2608740210000005	-1.4953080819999993	-4.007251468	-4.007251468	-4.007251468	-4.007251468	-4.007251468	3	16.27646348154281	1
0.30246983275799993	0.3929167253530001	0.12839712482899995	0.028008740210000008	0.03793308081999999	0.09865251468	0.09865251468	0.09865251468	0.09865251468	0.09865251468	-8.418834585943394	-10.81760488381647	-3.4164139831366573	-0.8216200035865951	-0.9625399001012132	-2.7631314482787417	-2.7631314482787417	-2.7631314482787417	-2.7631314482787417	-2.7631314482787417	4	7.769214971609398	1
0.3866581786174339	0.5010927741911648	0.16256126466036652	0.036224940245865964	0.04755847982101212	0.1262838291627874	0.1262838291627874	0.1262838291627874	0.1262838291627874	0.1262838291627874	-6.035317318894683	-6.606228048514185	-2.023863003680904	-0.8444615479627321	-0.5368810335347928	-1.928314905725471	-1.928314905725471	-1.928314905725471	-1.928314905725471	-1.928314905725471	5	3.90475896095533	1
0.44701135180638074	0.5671550546763067	0.18279989469717556	0.04466955572549328	0.05292729015636005	0.1455669782200421	0.1455669782200421	0.1455669782200421	0.1455669782200421	0.1455669782200421	-4.259932246247001	-4.69904967785691	-1.4272812920919014	-0.5994414351159882	-0.3482019192777488	-1.3810151619909217	-1.3810151619909217	-1.3810151619909217	-1.3810151619909217	-1.3810151619909217	6	2.104835405441499	1
0.48961067426885074	0.6141455514548758	0.19707270761809456	0.050663970076653166	0.05640930934913754	0.15937712983995134	0.15937712983995134	0.15937712983995134	0.15937712983995134	0.15937712983995134	-2.9131502368507523	-2.7900047108941357	-0.786858726083015	-0.4902770512098376	-0.13835673111718788	-0.9360854954297377	-0.9360854954297377	-0.9360854954297377	-0.9360854954297377	-0.9360854954297377	7	1.1812001776943115	1
0.5187421766373582	0.6420455985638172	0.2049412948789247	0.055566740588751544	0.057792876660309425	0.1687379847942487	0.1687379847942487	0.1687379847942487	0.1687379847942487	0.1687379847942487	-2.2815822515924356	-1.8669176720067389	-0.48251503714682115	-0.45540670681884726	-0.061890674774057554	-0.7178696773491847	-0.7178696773491847	-0.7178696773491847	-0.7178696773491847	-0.7178696773491847	8	0.8633620171570588	1
0.5415579991532826	0.6607147752838846	0.20976644525039292	0.060120807656940015	0.05841178340805	0.17591668156774054	0.17591668156774054	0.17591668156774054	0.17591668156774054	0.17591668156774054	-1.5999202323884023	-1.1506719996479482	-0.2718566121871	-0.3411146806640698	0.0033881819862846907	-0.49616500591193463	-0.49616500591193463	-0.49616500591193463	-0.49616500591193463	-0.49616500591193463	9	0.5882617765607544	1
0.5575572014771667	0.672221495280364	0.21248501137226392	0.06353195446358072	0.058377901588187155	0.1808783316268599	0.1808783316268599	0.1808783316268599	0.1808783316268599	0.1808783316268599	-1.4486656783545695	-0.7126912415655796	-0.10067134875629	-0.3912972375425979	0.05674696038537284	-0.43717447937772547	-0.43717447937772547	-0.43717447937772547	-0.43717447937772547	-0.43717447937772547	10	0.5623460089803844	1
0.5720438582607124	0.6793484076960198	0.21349172485982681	0.0674449268390067	0.05781043198433343	0.18525007642063715	0.18525007642063715	0.18525007642063715	0.18525007642063715	0.18525007642063715	-0.9306475612833495	-0.15288151866185962	0.0744573029382314	-0.3115818095388638	0.11786433111911637	-0.2758898465515766	-0.2758898465515766	-0.2758898465515766	-0.2758898465515766	-0.2758898465515766	11	0.47691160165459484	1
0.5813503338735458	0.6808772228826384	0.21274715183044451	0.07056074493439533	0.056631788673142266	0.18800897488615292	0.18800897488615292	0.18800897488615292	0.18800897488615292	0.18800897488615292	-0.7351425771472415	0.07290133335083944	0.15718629093419806	-0.29006779678979683	0.14059860645334357	-0.21346090333136303	-0.21346090333136303	-0.21346090333136303	-0.21346090333136303	-0.21346090333136303	12	0.4716605760544138	1
0.5887017596450183	0.68014820954913	0.21117528892110254	0.07346142290229331	0.05522580260860883	0.19014358391946656	0.19014358391946656	0.19014358391946656	0.19014358391946656	0.19014358391946656	-0.8496869040548067	0.025167947969057334	0.13655829480440038	-0.32553628103124616	0.13687662647187548	-0.24475728986359135	-0.24475728986359135	-0.24475728986359135	-0.24475728986359135	-0.24475728986359135	13	0.4339828812327394	1
0.5971986286855663	0.6798965300694394	0.20980970597305854	0.07671678571260578	0.05385703634389007	0.1925911568181025	0.1925911568181025	0.1925911568181025	0.1925911568181025	0.1925911568181025	-0.7795646413472435	-0.0028445948234325025	0.11892653879067865	-0.2834392136806792	0.12863334072334467	-0.22656991588415598	-0.22656991588415598	-0.22656991588415598	-0.22656991588415598	-0.22656991588415598	14	0.4378123399820016	1
0.6049942750990387	0.6799249760176738	0.20862044058515175	0.07955117784941257	0.052570702936656624	0.19485685597694405	0.19485685597694405	0.19485685597694405	0.19485685597694405	0.19485685597694405	-0.6199022285354157	0.2241419869804357	0.18844403042810082	-0.27631647315104274	0.14659705709607304	-0.17750043058490167	-0.17750043058490167	-0.17750043058490167	-0.17750043058490167	-0.17750043058490167	15	0.4041025964725166	1
0.6111932973843929	0.6776835561478695	0.20673600028087075	0.082314342580923	0.05110473236569589	0.19663186028279306	0.19663186028279306	0.19663186028279306	0.19663186028279306	0.19663186028279306	-0.5982201415162909	0.223020429269901	0.18786410062666778	-0.2778417715564467	0.14219223365502676	-0.16766557583295694	-0.16766557583295694	-0.16766557583295694	-0.16766557583295694	-0.16766557583295694	16	0.37677359384506365	1
0.6171754987995558	0.6754533518551704	0.20485735927460408	0.08509276029648746	0.049682810029145624	0.19830851604112262	0.19830851604112262	0.19830851604112262	0.19830851604112262	0.19830851604112262	-0.504727184910006	0.3488735090236196	0.23169940182123533	-0.27304021389090055	0.15137057707747892	-0.14594812814182823	-0.14594812814182823	-0.14594812814182823	-0.14594812814182823	-0.14594812814182823	17	0.377184784439225	1
0.6222227706486558	0.6719646167649342	0.2025403652563917	0.08782316243539647	0.048169104258370836	0.1997679973225409	0.1997679973225409	0.1997679973225409	0.1997679973225409	0.1997679973225409	-0.44998764800328744	0.2717811698570403	0.18838367144926862	-0.23068727674683934	0.13729583498280745	-0.12885900816583054	-0.12885900816583054	-0.12885900816583054	-0.12885900816583054	-0.12885900816583054	18	0.3358015439419573	1
0.6267226471286887	0.6692468050663638	0.20065652854189903	0.09013003520286486	0.04679614590854276	0.2010565874041992	0.2010565874041992	0.2010565874041992	0.2010565874041992	0.2010565874041992	-0.3327449242778986	0.4218061610425528	0.250603029713697	-0.2186705079506715	0.16399901690035443	-0.09281485221264772	-0.09281485221264772	-0.09281485221264772	-0.09281485221264772	-0.09281485221264772	19	0.3435586135251613	1
0.6300500963714677	0.6650287434559383	0.19815049824476205	0.09231674028237158	0.04515615573953922	0.20198473592632568	0.20198473592632568	0.20198473592632568	0.20198473592632568	0.20198473592632568	-0.3978821591273137	0.37818526431069854	0.22397317697432803	-0.2301645447314299	0.15656659519583066	-0.10520392419282784	-0.10520392419282784	-0.10520392419282784	-0.10520392419282784	-0.10520392419282784	20	0.32909737647738074	1
0.6340289179627409	0.6612468908128314	0.19591076647501876	0.09461838572968588	0.04359048978758091	0.20303677516825397	0.20303677516825397	0.20303677516825397	0.20303677516825397	0.20303677516825397	-0.49501548321294875	0.2824756186489414	0.19154839534497095	-0.2408611483352118	0.13186150356062115	-0.1346065552874162	-0.1346065552874162	-0.1346065552874162	-0.1346065552874162	-0.1346065552874162	21	0.34489232659720287	1
0.6389790727948703	0.658422134626342	0.19399528252156906	0.097026997213038	0.0422718747519747	0.20438284072112814	0.20438284072112814	0.20438284072112814	0.20438284072112814	0.20438284072112814	-0.501011104499183	0.24610496419058692	0.18246922326963586	-0.24294075989110828	0.12900402506295225	-0.1393312449730318	-0.1393312449730318	-0.1393312449730318	-0.1393312449730318	-0.1393312449730318	22	0.3276205502872738	1
0.6439891838398621	0.6559610849844362	0.1921705902888727	0.09945640481194908	0.040981834501345175	0.20577615317085846	0.20577615317085846	0.20577615317085846	0.20577615317085846	0.20577615317085846	-0.324775351310746	0.3746107213205154	0.2111139416712132	-0.20058332957969047	0.13686659765338857	-0.08655469582421822	-0.08655469582421822	-0.08655469582421822	-0.08655469582421822	-0.08655469582421822	23	0.2935346775960134	1
0.6472369373529696	0.652214977771231	0.19005945087216058	0.10146223810774598	0.03961316852481129	0.20664170012910063	0.20664170012910063	0.20664170012910063	0.20664170012910063	0.20664170012910063	-0.4838045379993712	0.1006841330157371	0.10643052467775646	-0.20234130338363415	0.09292148398897694	-0.1354541866632277	-0.1354541866632277	-0.1354541866632277	-0.1354541866632277	-0.1354541866632277	24	0.25669597691151474	1
0.6520749827329633	0.6512081364410737	0.18899514562538303	0.10348565114158231	0.03868395368492152	0.2079962419957329	0.2079962419957329	0.2079962419957329	0.2079962419957329	0.2079962419957329	-0.35575362508485314	0.3141589432080337	0.20155588679506228	-0.20586212623287087	0.137290032171127	-0.10217281513696584	-0.10217281513696584	-0.10217281513696584	-0.10217281513696584	-0.10217281513696584	25	0.28212690272956203	1
0.6556325189838118	0.6480665470089934	0.1869795867574324	0.10554427240391102	0.03731105336321025	0.20901797014710258	0.20901797014710258	0.20901797014710258	0.20901797014710258	0.20901797014710258	-0.3875722593205403	0.39575152037580225	0.2376765951470362	-0.22980463433325074	0.14741462189301477	-0.09721841813781816	-0.09721841813781816	-0.09721841813781816	-0.09721841813781816	-0.09721841813781816	26	0.3275866369928804	1
0.6595082415770173	0.6441090318052354	0.18460282080596205	0.10784231874724354	0.0358369071442801	0.20999015432848075	0.20999015432848075	0.20999015432848075	0.20999015432848075	0.20999015432848075	-0.38666612411329326	0.30298380325383223	0.18544748254748933	-0.2109305062431924	0.1207459374420806	-0.10279820476131978	-0.10279820476131978	-0.10279820476131978	-0.10279820476131978	-0.10279820476131978	27	0.28304267674482325	1
0.6633749028181501	0.6410791937726971	0.18274834598048714	0.10995162380967546	0.034629447769859295	0.21101813637609396	0.21101813637609396	0.21101813637609396	0.21101813637609396	0.21101813637609396	-0.24847682120208867	0.3094963979968232	0.18946065318285155	-0.16684214138339748	0.12473991223078539	-0.07595889935143632	-0.07595889935143632	-0.07595889935143632	-0.07595889935143632	-0.07595889935143632	28	0.246960439850011	1
0.6658596710301711	0.6379842297927288	0.18085373944865862	0.11162004522350943	0.03338204864755144	0.21177772536960832	0.21177772536960832	0.21177772536960832	0.21177772536960832	0.21177772536960832	-0.26713664218284927	0.36465094054170605	0.21039792537887425	-0.18595615618522174	0.12832303993252667	-0.0712989816305829	-0.0712989816305829	-0.0712989816305829	-0.0712989816305829	-0.0712989816305829	29	0.24185518216726717	1
0.6685310374519996	0.6343377203873117	0.17874976019486988	0.11347960678536165	0.03209881824822618	0.21249071518591414	0.21249071518591414	0.21249071518591414	0.21249071518591414	0.21249071518591414	-0.285372851000564	0.3905166535024734	0.2273788959907782	-0.1908825686291652	0.14691956832436104	-0.07806912091024437	-0.07806912091024437	-0.07806912091024437	-0.07806912091024437	-0.07806912091024437	30	0.28331571855985055	1
0.6713847659620052	0.630432553852287	0.1764759712349621	0.1153884324716533	0.030629622564982566	0.21327140639501657	0.21327140639501657	0.21327140639501657	0.21327140639501657	0.21327140639501657	-0.28485721231197203	0.30590558562656417	0.19669736501813603	-0.17821444469252792	0.1307925544131432	-0.08172656365838715	-0.08172656365838715	-0.08172656365838715	-0.08172656365838715	-0.08172656365838715	31	0.2785500859852849	1
0.674233338085125	0.6273734979960214	0.17450899758478072	0.11717057691857857	0.029321697020851134	0.21408867203160045	0.21408867203160045	0.21408867203160045	0.21408867203160045	0.21408867203160045	-0.5129761189347142	-0.020461479708326444	0.07474153979340704	-0.19273808445220872	0.08150043014763933	-0.15869989140145196	-0.15869989140145196	-0.15869989140145196	-0.15869989140145196	-0.15869989140145196	32	0.2402129621304276	1
0.6793630992744721	0.6275781127931046	0.17376158218684665	0.11909795776310066	0.02850669271937474	0.21567567094561496	0.21567567094561496	0.21567567094561496	0.21567567094561496	0.21567567094561496	-0.3645442663637262	0.16150327033185932	0.13174406832208205	-0.16820261994603983	0.1031770297602549	-0.10040680977537035	-0.10040680977537035	-0.10040680977537035	-0.10040680977537035	-0.10040680977537035	33	0.20801326313092441	1
0.6830085419381094	0.625963080089786	0.17244414150362583	0.12077998396256105	0.027474922421772192	0.21667973904336865	0.21667973904336865	0.21667973904336865	0.21667973904336865	0.21667973904336865	-0.16005878697371478	0.4342485263778689	0.23041246929945536	-0.1625295605496381	0.12296231586278182	-0.04583500123245548	-0.04583500123245548	-0.04583500123245548	-0.04583500123245548	-0.04583500123245548	34	0.24620358056538802	1
0.6846091298078466	0.6216205948260073	0.1701400168106313	0.12240527956805744	0.026245299263144374	0.2171380890556932	0.2171380890556932	0.2171380890556932	0.2171380890556932	0.2171380890556932	-0.41601032764657925	0.11475000070103901	0.11587338094101118	-0.18685988461561295	0.0887760397414091	-0.11831151476938896	-0.11831151476938896	-0.11831151476938896	-0.11831151476938896	-0.11831151476938896	35	0.23356966469422705	1
0.6887692330843124	0.6204730948189969	0.16898128300122117	0.12427387841421357	0.02535753886573028	0.2183212042033871	0.2183212042033871	0.2183212042033871	0.2183212042033871	0.2183212042033871	-0.2785437161017364	0.3233998047974474	0.18920131636751455	-0.18286457427062713	0.11775577342967697	-0.0764754557323482	-0.0764754557323482	-0.0764754557323482	-0.0764754557323482	-0.0764754557323482	36	0.24040606051671987	1
0.6915546702453297	0.6172390967710224	0.16708926983754602	0.12610252415691983	0.024179981131433513	0.21908595876071058	0.21908595876071058	0.21908595876071058	0.21908595876071058	0.21908595876071058	-0.22878912220235262	0.34081474260147404	0.20446731002501103	-0.16554152172856654	0.12462164430370101	-0.06870368708650554	-0.06870368708650554	-0.06870368708650554	-0.06870368708650554	-0.06870368708650554	37	0.2591299535666149	1
0.6938425614673532	0.6138309493450077	0.1650445967372959	0.1277579393742055	0.022933764688396502	0.21977299563157562	0.21977299563157562	0.21977299563157562	0.21977299563157562	0.21977299563157562	-0.2365525548720244	0.19374497966192147	0.15472041810991788	-0.14106522179189995	0.11214128223363509	-0.07543948357823392	-0.07543948357823392	-0.07543948357823392	-0.07543948357823392	-0.07543948357823392	38	0.21101976832048863	1
0.6962080870160735	0.6118934995483885	0.16349739255619672	0.1291685915921245	0.02181235186606015	0.22052739046735798	0.22052739046735798	0.22052739046735798	0.22052739046735798	0.22052739046735798	-0.24388468328031931	0.22846748194350855	0.15135068447016434	-0.13594214266327828	0.10725768425018055	-0.06449821320142851	-0.06449821320142851	-0.06449821320142851	-0.06449821320142851	-0.06449821320142851	39	0.1976078850328914	1
0.6986469338488767	0.6096088247289534	0.16198388571149508	0.13052801301875727	0.020739775023558345	0.22117237259937225	0.22117237259937225	0.22117237259937225	0.22117237259937225	0.22117237259937225	-0.33139653494040894	0.21233479979061756	0.15611986380630455	-0.1776685173153882	0.1016600568301543	-0.09823540572578526	-0.09823540572578526	-0.09823540572578526	-0.09823540572578526	-0.09823540572578526	40	0.2119873530999341	1
0.7019608991982808	0.6074854767310472	0.16042268707343205	0.13230469819191115	0.0197231744552568	0.2221547266566301	0.2221547266566301	0.2221547266566301	0.2221547266566301	0.2221547266566301	-0.1820100756511364	0.3385437748726924	0.1895384437749324	-0.14677758015871256	0.11365139288940825	-0.04657451528497105	-0.04657451528497105	-0.04657451528497105	-0.04657451528497105	-0.04657451528497105	41	0.21098584005326665	1
0.7037809999547922	0.6041000389823203	0.1585273026356827	0.1337724739934983	0.01858666052636272	0.22262047180947983	0.22262047180947983	0.22262047180947983	0.22262047180947983	0.22262047180947983	-0.24014841963522296	0.247211257489466	0.1577555298296247	-0.14619268030256632	0.0929074227806532	-0.06841922501316199	-0.06841922501316199	-0.06841922501316199	-0.06841922501316199	-0.06841922501316199	42	0.1907255411266933	1
0.7061824841511444	0.6016279264074257	0.15694974733738645	0.13523440079652396	0.017657586298556186	0.22330466405961144	0.22330466405961144	0.22330466405961144	0.22330466405961144	0.22330466405961144	-0.16124098721985047	0.33210451867969654	0.18635267245402815	-0.14390401892582097	0.10331979261932978	-0.045951929828262045	-0.045951929828262045	-0.045951929828262045	-0.045951929828262045	-0.045951929828262045	43	0.2026762296511086	1
0.7077948940233428	0.5983068812206287	0.15508622061284616	0.13667344098578219	0.016624388372362887	0.22376418335789405	0.22376418335789405	0.22376418335789405	0.22376418335789405	0.22376418335789405	-0.26076824328866843	0.1079584865588379	0.09819151125046531	-0.12024340228460798	0.09233343337573185	-0.07922248754202584	-0.07922248754202584	-0.07922248754202584	-0.07922248754202584	-0.07922248754202584	44	0.18878720771229177	1
0.7104025764562295	0.5972272963550403	0.15410430550034152	0.13787587500862827	0.01570105403860557	0.22455640823331433	0.22455640823331433	0.22455640823331433	0.22455640823331433	0.22455640823331433	-0.17740718061463628	0.22014674324259673	0.1362254765522256	-0.11907598110044319	0.0918081731169333	-0.05258883358242868	-0.05258883358242868	-0.05258883358242868	-0.05258883358242868	-0.05258883358242868	45	0.17577173056129552	1
0.7121766482623759	0.5950258289226144	0.15274205073481925	0.13906663481963272	0.014782972307436236	0.2250822965691386	0.2250822965691386	0.2250822965691386	0.2250822965691386	0.2250822965691386	-0.17988621609542654	0.2806248857396519	0.16684906001955507	-0.13179877335974524	0.11636925597792469	-0.0546258399629982	-0.0546258399629982	-0.0546258399629982	-0.0546258399629982	-0.0546258399629982	46	0.1981644076210309	1
0.7139755104233302	0.5922195800652178	0.1510735601346237	0.14038462255323017	0.01361927974765699	0.2256285549687686	0.2256285549687686	0.2256285549687686	0.2256285549687686	0.2256285549687686	-0.2584227412524325	0.10067043498952959	0.09950617905851229	-0.12221606979845091	0.07900578792054862	-0.07504610852125593	-0.07504610852125593	-0.07504610852125593	-0.07504610852125593	-0.07504610852125593	47	0.17812710443683283	1
0.7165597378358545	0.5912128757153226	0.15007849834403858	0.14160678325121467	0.012829221868451503	0.22637901605398117	0.22637901605398117	0.22637901605398117	0.22637901605398117	0.22637901605398117	-0.22804415541398007	0.23192581929406977	0.16116113538750568	-0.13938740548221923	0.10589767726938647	-0.06233184369892504	-0.06233184369892504	-0.06233184369892504	-0.06233184369892504	-0.06233184369892504	48	0.20346020018740268	1
0.7188401793899942	0.5888936175223819	0.14846688699016353	0.14300065730603687	0.011770245095757638	0.22700233449097043	0.22700233449097043	0.22700233449097043	0.22700233449097043	0.22700233449097043	-0.26365809483682423	0.0798335159629457	0.10408220872868917	-0.12185364491356804	0.08895869156554267	-0.08330394668375415	-0.08330394668375415	-0.08330394668375415	-0.08330394668375415	-0.08330394668375415	49	0.20362719044891814	1
0.7214767603383625	0.5880952823627524	0.14742606490287663	0.14421919375517256	0.010880658180102212	0.22783537395780798	0.22783537395780798	0.22783537395780798	0.22783537395780798	0.22783537395780798	-0.12070713376803383	0.19327870847710976	0.1327363086814136	-0.09172258738844023	0.10350211081461654	-0.04023221007679314	-0.04023221007679314	-0.04023221007679314	-0.04023221007679314	-0.04023221007679314	50	0.17307091982205539	1
0.7226838316760429	0.5861624952779814	0.1460987018160625	0.14513641962905696	0.009845637071956046	0.2282376960585759	0.2282376960585759	0.2282376960585759	0.2282376960585759	0.2282376960585759	-0.15208272219677954	0.36597879356450186	0.201455586587483	-0.14214553955958847	0.11307259585551618	-0.04113035316340337	-0.04113035316340337	-0.04113035316340337	-0.04113035316340337	-0.04113035316340337	51	0.20510799671368793	1
0.7242046588980107	0.5825027073423363	0.14408414595018768	0.14655787502465284	0.008714911113400885	0.22864899959020996	0.22864899959020996	0.22864899959020996	0.22864899959020996	0.22864899959020996	-0.23546368590662695	0.07863444953185318	0.08998641352883109	-0.11249190149937331	0.0754169796131565	-0.07477214861370918	-0.07477214861370918	-0.07477214861370918	-0.07477214861370918	-0.07477214861370918	52	0.16452781662027022	1
0.7265592957570769	0.5817163628470178	0.14318428181489937	0.14768279403964657	0.007960741317269319	0.22939672107634704	0.22939672107634704	0.22939672107634704	0.22939672107634704	0.22939672107634704	-0.22412747127965854	0.048052278123844744	0.0775914843586476	-0.10227680547335986	0.06254776208605048	-0.07319435972676704	-0.07319435972676704	-0.07319435972676704	-0.07319435972676704	-0.07319435972676704	53	0.14017649291733109	1
0.7288005704698735	0.5812358400657793	0.1424083669713129	0.14870556209438016	0.007335263696408814	0.2301286646736147	0.2301286646736147	0.2301286646736147	0.2301286646736147	0.2301286646736147	-0.25624041560747846	0.029390761535552937	0.06828636833776426	-0.10898589105571262	0.07017707601687669	-0.0791487536477615	-0.0791487536477615	-0.0791487536477615	-0.0791487536477615	-0.0791487536477615	54	0.14970429593083914	1
0.7313629746259482	0.5809419324504238	0.14172550328793526	0.14979542100493728	0.006633492936240047	0.23092015221009232	0.23092015221009232	0.23092015221009232	0.23092015221009232	0.23092015221009232	-0.22913911184216446	0.13226417066134594	0.11863807691589294	-0.12507494662909818	0.08910521429375504	-0.07101960353629404	-0.07101960353629404	-0.07101960353629404	-0.07101960353629404	-0.07101960353629404	55	0.18622134861905248	1
0.7336543657443699	0.5796192907438104	0.14053912251877634	0.15104617047122826	0.005742440793302496	0.23163034824545525	0.23163034824545525	0.23163034824545525	0.23163034824545525	0.23163034824545525	-0.09998887300596754	0.28464046462850695	0.16586450497279256	-0.1004024281651072	0.09738975422044112	-0.02962061506449232	-0.02962061506449232	-0.02962061506449232	-0.02962061506449232	-0.02962061506449232	56	0.16501636741652853	1
0.7346542544744296	0.5767728860975253	0.1388804774690484	0.15205019475287934	0.004768543251098085	0.2319265543961002	0.2319265543961002	0.2319265543961002	0.2319265543961002	0.2319265543961002	-0.19611257838146476	0.07306489202052129	0.0839723571585399	-0.09336818629413685	0.07303650082337178	-0.062142889850532394	-0.062142889850532394	-0.062142889850532394	-0.062142889850532394	-0.062142889850532394	57	0.16714329194422128	1
0.7366153802582442	0.5760422371773201	0.138040753897463	0.15298387661582072	0.004038178242864367	0.23254798329460552	0.23254798329460552	0.23254798329460552	0.23254798329460552	0.23254798329460552	-0.22363147525512367	-0.0024376351496950654	0.052456962471046274	-0.09354886698006908	0.06150269888035172	-0.07397019068577668	-0.07397019068577668	-0.07397019068577668	-0.07397019068577668	-0.07397019068577668	58	0.1568818655691469	1
0.7388516950107955	0.576066613528817	0.13751618427275256	0.15391936528562142	0.00342315125406085	0.23328768520146329	0.23328768520146329	0.23328768520146329	0.23328768520146329	0.23328768520146329	-0.08953988837717489	0.19607299980858844	0.12001900614550602	-0.07744170356369351	0.07989846080456761	-0.02776372627805278	-0.02776372627805278	-0.02776372627805278	-0.02776372627805278	-0.02776372627805278	59	0.15146671249226407	1
0.7397470938945673	0.5741058835307312	0.1363159942112975	0.15469378232125836	0.002624166646015174	0.2335653224642438	0.2335653224642438	0.2335653224642438	0.2335653224642438	0.2335653224642438	-0.10934864957790269	0.15525500702030612	0.1133993364010108	-0.079016224469531	0.08861978417259017	-0.041351412806578634	-0.041351412806578634	-0.041351412806578634	-0.041351412806578634	-0.041351412806578634	60	0.1510732057321453	1
0.7408405803903463	0.5725533334605282	0.1351820008472874	0.15548394456595366	0.0017379688042892722	0.23397883659230959	0.23397883659230959	0.23397883659230959	0.23397883659230959	0.23397883659230959	-0.2160275996543884	0.11691690812699129	0.10078229606147529	-0.1096247693793104	0.08362684381319943	-0.06698467735528844	-0.06698467735528844	-0.06698467735528844	-0.06698467735528844	-0.06698467735528844	61	0.17161510513564956	1
0.7430008563868902	0.5713841643792582	0.13417417788667266	0.15658019225974676	0.0009017003661572779	0.23464868336586248	0.23464868336586248	0.23464868336586248	0.23464868336586248	0.23464868336586248	-0.055536840564336075	0.31286777483113964	0.17508517551052347	-0.0959580020511796	0.10970668633538562	-0.019674767901056676	-0.019674767901056676	-0.019674767901056676	-0.019674767901056676	-0.019674767901056676	62	0.1752640440036056	1
0.7435562247925336	0.5682554866309468	0.13242332613156743	0.15753977228025856	-0.00019536649719657827	0.23484543104487304	0.23484543104487304	0.23484543104487304	0.23484543104487304	0.23484543104487304	-0.18244391053238806	0.07504035522087159	0.08444011577478755	-0.08784467564824183	0.07069882358929407	-0.057543812056915555	-0.057543812056915555	-0.057543812056915555	-0.057543812056915555	-0.057543812056915555	63	0.14813242245230082	1
0.7453806638978575	0.5675050830787381	0.13157892497381957	0.15841821903674097	-0.000902354733089519	0.2354208691654422	0.2354208691654422	0.2354208691654422	0.2354208691654422	0.2354208691654422	-0.09810495103647211	0.21942831404466884	0.12957345616493116	-0.08822113899863321	0.07607507223736298	-0.02574705261269635	-0.02574705261269635	-0.02574705261269635	-0.02574705261269635	-0.02574705261269635	64	0.14548512205790703	1
0.7463617134082222	0.5653107999382915	0.13028319041217026	0.1593004304267273	-0.0016631054554631488	0.23567833969156915	0.23567833969156915	0.23567833969156915	0.23567833969156915	0.23567833969156915	-0.32812328713649097	-0.14080925333981334	0.010850431329594932	-0.10260966746205227	0.05049462621900181	-0.11064259443801606	-0.11064259443801606	-0.11064259443801606	-0.11064259443801606	-0.11064259443801606	65	0.16076785344633143	1
0.7496429462795872	0.5667188924716896	0.1301746860988743	0.16032652710134784	-0.0021680517176531668	0.2367847656359493	0.2367847656359493	0.2367847656359493	0.2367847656359493	0.2367847656359493	-0.1979877928574162	0.04136037632766339	0.06919498812733316	-0.0953543502035378	0.06001726142107431	-0.06606925753376086	-0.06606925753376086	-0.06606925753376086	-0.06606925753376086	-0.06606925753376086	66	0.1540183055576599	1
0.7516228242081613	0.566305288708413	0.12948273621760098	0.16128007060338323	-0.0027682243318639097	0.2374454582112869	0.2374454582112869	0.2374454582112869	0.2374454582112869	0.2374454582112869	-0.08920006127166223	0.19791213757769224	0.11667310986796635	-0.08501533410229058	0.0714916421354736	-0.026501943724348456	-0.026501943724348456	-0.026501943724348456	-0.026501943724348456	-0.026501943724348456	67	0.14287594000870538	1
0.752514824820878	0.5643261673326361	0.12831600511892133	0.16213022394440613	-0.003483140753218646	0.2377104776485304	0.2377104776485304	0.2377104776485304	0.2377104776485304	0.2377104776485304	0.01905716157834171	0.3048941600109156	0.15246005973911841	-0.06397268858760383	0.08826052003745843	0.004439930870313775	0.004439930870313775	0.004439930870313775	0.004439930870313775	0.004439930870313775	68	0.14354684658906344	1
0.7523242532050946	0.5612772257325269	0.12679140452153015	0.16276995083028217	-0.00436574595359323	0.23766607833982725	0.23766607833982725	0.23766607833982725	0.23766607833982725	0.23766607833982725	-0.12454143429966495	0.08808459085958564	0.08611866524566845	-0.0756576129359348	0.07663404484811812	-0.046410981816632	-0.046410981816632	-0.046410981816632	-0.046410981816632	-0.046410981816632	69	0.1552713068383162	1
0.7535696675480912	0.5603963798239311	0.12593021786907346	0.1635265269596415	-0.005132086402074411	0.23813018815799358	0.23813018815799358	0.23813018815799358	0.23813018815799358	0.23813018815799358	-0.14121596124163496	0.06456803442600145	0.07155879643214269	-0.07373620627454074	0.06202810952350835	-0.04698016250964484	-0.04698016250964484	-0.04698016250964484	-0.04698016250964484	-0.04698016250964484	70	0.13120812533515577	1
0.7549818271605075	0.559750699479671	0.12521462990475205	0.16426388902238692	-0.005752367497309495	0.23859998978309002	0.23859998978309002	0.23859998978309002	0.23859998978309002	0.23859998978309002	-0.04107865859575949	0.17660052977850144	0.1173143825892387	-0.060542654804122	0.07215316852100737	-0.014136748077008078	-0.014136748077008078	-0.014136748077008078	-0.014136748077008078	-0.014136748077008078	71	0.1504912691420637	1
0.7553926137464652	0.557984694181886	0.12404148607885966	0.16486931557042814	-0.006473899182519569	0.23874135726386012	0.23874135726386012	0.23874135726386012	0.23874135726386012	0.23874135726386012	-0.18856706766651427	0.003499916117622832	0.05590851270829903	-0.0772715012971761	0.06349906696270137	-0.06627499595695469	-0.06627499595695469	-0.06627499595695469	-0.06627499595695469	-0.06627499595695469	72	0.15533830139959728	1
0.7572782844231303	0.5579496950207098	0.12348240095177668	0.1656420305833999	-0.007108889852146583	0.23940410722342967	0.23940410722342967	0.23940410722342967	0.23940410722342967	0.23940410722342967	-0.22750105830278453	-0.05423550300058153	0.03548031707478287	-0.08248403931937454	0.04950287068564042	-0.07735624258596711	-0.07735624258596711	-0.07735624258596711	-0.07735624258596711	-0.07735624258596711	73	0.1467113877863856	1
0.7595532950061581	0.5584920500507156	0.12312759778102884	0.16646687097659366	-0.007603918559002987	0.24017766964928933	0.24017766964928933	0.24017766964928933	0.24017766964928933	0.24017766964928933	-0.10512526149587802	0.09710354354836108	0.0764814073253542	-0.062317819415637035	0.064312522127424	-0.03415974583314118	-0.03415974583314118	-0.03415974583314118	-0.03415974583314118	-0.03415974583314118	74	0.12477147666327537	1
0.7606045476211168	0.557521014615232	0.1223627837077753	0.16709004917075002	-0.008247043780277227	0.24051926710762073	0.24051926710762073	0.24051926710762073	0.24051926710762073	0.24051926710762073	-0.1312889166516503	0.043072316463475493	0.06466703346740718	-0.06822191816703964	0.0644501410408365	-0.048748283322623286	-0.048748283322623286	-0.048748283322623286	-0.048748283322623286	-0.048748283322623286	75	0.139848343937992	1
0.7619174367876334	0.5570902914505972	0.12171611337310123	0.1677722683524204	-0.008891545190685593	0.24100674994084695	0.24100674994084695	0.24100674994084695	0.24100674994084695	0.24100674994084695	0.05608208253031271	0.3332334320911999	0.16612740246198984	-0.05842284436754449	0.08401380854343107	0.016064210179425747	0.016064210179425747	0.016064210179425747	0.016064210179425747	0.016064210179425747	76	0.1439294009963182	1
0.7613566159623302	0.5537579571296852	0.12005483934848134	0.16835649679609585	-0.009731683276119904	0.2408461078390527	0.2408461078390527	0.2408461078390527	0.2408461078390527	0.2408461078390527	-0.1444919742242204	0.05212588366251118	0.0757039417929166	-0.0709897148297897	0.06129848216074554	-0.05281231904163748	-0.05281231904163748	-0.05281231904163748	-0.05281231904163748	-0.05281231904163748	77	0.14755249301554693	1
0.7628015357045724	0.5532366982930601	0.11929779993055217	0.16906639394439374	-0.01034466809772736	0.24137423102946906	0.24137423102946906	0.24137423102946906	0.24137423102946906	0.24137423102946906	-0.08210801404100974	0.0651040977250028	0.06013389048457362	-0.05381548528687653	0.05599204646535418	-0.03137487372005904	-0.03137487372005904	-0.03137487372005904	-0.03137487372005904	-0.03137487372005904	78	0.12778180880043968	1
0.7636226158449825	0.5525856573158101	0.11869646102570644	0.1696045487972625	-0.010904588562380902	0.24168797976666964	0.24168797976666964	0.24168797976666964	0.24168797976666964	0.24168797976666964	0.04244081758396698	0.2869107655779289	0.14777111986448882	-0.048953794555473884	0.09078126424653137	0.011781732644253298	0.011781732644253298	0.011781732644253298	0.011781732644253298	0.011781732644253298	79	0.14254292725697765	1
0.7631982076691428	0.5497165496600308	0.11721874982706156	0.17009408674281723	-0.011812401204846217	0.2415701624402271	0.2415701624402271	0.2415701624402271	0.2415701624402271	0.2415701624402271	-0.15486226059221975	0.0536218177160101	0.08320641094516558	-0.07734953852493864	0.07291625238903855	-0.054606701182151475	-0.054606701182151475	-0.054606701182151475	-0.054606701182151475	-0.054606701182151475	80	0.15134168925096764	1
0.764746830275065	0.5491803314828707	0.1163866857176099	0.1708675821280666	-0.012541563728736603	0.24211622945204864	0.24211622945204864	0.24211622945204864	0.24211622945204864	0.24211622945204864	-0.07666026528371934	0.08965244539838306	0.07969940256955325	-0.056744964738265076	0.05740035259615096	-0.02969818755109547	-0.02969818755109547	-0.02969818755109547	-0.02969818755109547	-0.02969818755109547	81	0.12497887736713718	1
0.7655134329279022	0.5482838070288869	0.11558969169191437	0.17143503177544925	-0.013115567254698111	0.24241321132755958	0.24241321132755958	0.24241321132755958	0.24241321132755958	0.24241321132755958	-0.1955624427970849	-0.1490763449540694	-0.005952343631955764	-0.04892586145794186	0.036323511504754685	-0.0713828999909823	-0.0713828999909823	-0.0713828999909823	-0.0713828999909823	-0.0713828999909823	82	0.1308424008423824	1
0.767469057355873	0.5497745704784276	0.11564921512823392	0.17192429039002866	-0.013478802369745657	0.2431270403274694	0.2431270403274694	0.2431270403274694	0.2431270403274694	0.2431270403274694	-0.07958764279476577	-0.02787911452614944	0.020861408995979117	-0.03534351779308108	0.033310314330203886	-0.03357450313536812	-0.03357450313536812	-0.03357450313536812	-0.03357450313536812	-0.03357450313536812	83	0.10959137096965264	1
0.7682649337838207	0.5500533616236891	0.11544060103827414	0.17227772556795948	-0.013811905513047696	0.24346278535882307	0.24346278535882307	0.24346278535882307	0.24346278535882307	0.24346278535882307	-0.10697400824212892	0.0005825758405553483	0.036658693836451546	-0.04760596259891188	0.042994003382070184	-0.039010095579500075	-0.039010095579500075	-0.039010095579500075	-0.039010095579500075	-0.039010095579500075	84	0.12988951457702805	1
0.769334673866242	0.5500475358652835	0.11507401409990962	0.1727537851939486	-0.014241845546868397	0.24385288631461807	0.24385288631461807	0.24385288631461807	0.24385288631461807	0.24385288631461807	-0.16818210932614797	-0.05800927001748076	0.023743373104544156	-0.06003821194789818	0.03987184518080429	-0.05934805135985457	-0.05934805135985457	-0.05934805135985457	-0.05934805135985457	-0.05934805135985457	85	0.12130372159084488	1
0.7710164949595034	0.5506276285654583	0.11483658036886418	0.1733541673134276	-0.01464056399867644	0.2444463668282166	0.2444463668282166	0.2444463668282166	0.2444463668282166	0.2444463668282166	0.08977679614187536	0.2757420815255268	0.14310472026978727	-0.03784614043050057	0.08319118404603752	0.020958453855994943	0.020958453855994943	0.020958453855994943	0.020958453855994943	0.020958453855994943	86	0.13311264087386843	1
0.7701187269980847	0.547870207750203	0.1134055331661663	0.1737326287177326	-0.015472475839136815	0.24423678228965665	0.24423678228965665	0.24423678228965665	0.24423678228965665	0.24423678228965665	-0.01020537860094696	0.18597865909212818	0.10807216653184656	-0.04405159118881292	0.0760369607846935	-0.0058501303198060664	-0.0058501303198060664	-0.0058501303198060664	-0.0058501303198060664	-0.0058501303198060664	87	0.13400454338996654	1
0.7702207807840942	0.5460104211592818	0.11232481150084785	0.17417314462962072	-0.01623284544698375	0.2442952835928547	0.2442952835928547	0.2442952835928547	0.2442952835928547	0.2442952835928547	0.009841974683875019	0.2543513898421049	0.13829510046539067	-0.06201639569073605	0.07077867639799883	0.0024863556241352127	0.0024863556241352127	0.0024863556241352127	0.0024863556241352127	0.0024863556241352127	88	0.1386872357874715	1
0.7701223610372554	0.5434669072608608	0.11094186049619394	0.17479330858652808	-0.01694063221096374	0.24427042003661337	0.24427042003661337	0.24427042003661337	0.24427042003661337	0.24427042003661337	-0.18770427314422566	-0.11783529327564121	0.004345102911476412	-0.04901970381455122	0.032903348126294966	-0.06577320591135308	-0.06577320591135308	-0.06577320591135308	-0.06577320591135308	-0.06577320591135308	89	0.1231833502421304	1
0.7719994037686977	0.5446452601936171	0.11089840946707917	0.1752835056246736	-0.01726966569222669	0.2449281520957269	0.2449281520957269	0.2449281520957269	0.2449281520957269	0.2449281520957269	-0.10777513643651629	-0.02946688859150442	0.0358105315654737	-0.041458818753371436	0.049801760690163246	-0.0430904153575444	-0.0430904153575444	-0.0430904153575444	-0.0430904153575444	-0.0430904153575444	90	0.14234524490869968	1
0.7730771551330629	0.5449399290795321	0.11054030415142443	0.1756980938122073	-0.017767683299128322	0.24535905624930235	0.24535905624930235	0.24535905624930235	0.24535905624930235	0.24535905624930235	-0.03029478545601165	0.12063189104715981	0.0926271105306644	-0.04494968801482946	0.06225696531422806	-0.017560619020668335	-0.017560619020668335	-0.017560619020668335	-0.017560619020668335	-0.017560619020668335	91	0.1457493115440951	1
0.7733801029876229	0.5437336101690605	0.10961403304611779	0.17614759069235558	-0.018390252952270602	0.24553466243950903	0.24553466243950903	0.24553466243950903	0.24553466243950903	0.24553466243950903	-0.06523603548853846	0.09667365400886294	0.0713368152859959	-0.049380507728132364	0.05282451252486076	-0.028140412239059866	-0.028140412239059866	-0.028140412239059866	-0.028140412239059866	-0.028140412239059866	92	0.11709471026673818	1
0.7740324633425083	0.5427668736289719	0.10890066489325784	0.17664139576963692	-0.01891849807751921	0.24581606656189964	0.24581606656189964	0.24581606656189964	0.24581606656189964	0.24581606656189964	-0.08166418125447414	0.011620192693396562	0.04235739894196954	-0.04497095901436263	0.032569707128320594	-0.028603080067397365	-0.028603080067397365	-0.028603080067397365	-0.028603080067397365	-0.028603080067397365	93	0.12100758485735008	1
0.774849105155053	0.542650671702038	0.10847709090383814	0.17709110535978054	-0.019244195148802413	0.2461020973625736	0.2461020973625736	0.2461020973625736	0.2461020973625736	0.2461020973625736	-0.054049878408753055	0.09529952496794664	0.07496089108022766	-0.051295296110918515	0.053311038307389834	-0.022739074718073474	-0.022739074718073474	-0.022739074718073474	-0.022739074718073474	-0.022739074718073474	94	0.14030289681359853	1
0.7753896039391406	0.5416976764523584	0.10772748199303586	0.1776040583208897	-0.019777305531876312	0.24632948810975433	0.24632948810975433	0.24632948810975433	0.24632948810975433	0.24632948810975433	-0.004634836658378023	0.11994905030666918	0.0796069470946916	-0.03872300046315407	0.05500233694519574	-0.00908898025347773	-0.00908898025347773	-0.00908898025347773	-0.00908898025347773	-0.00908898025347773	95	0.13482222346592368	1
0.7754359523057244	0.5404981859492918	0.10693141252208894	0.17799128832552125	-0.02032732890132827	0.2464203779122891	0.2464203779122891	0.2464203779122891	0.2464203779122891	0.2464203779122891	0.0019463231711243912	0.09639414315953916	0.07592558455146892	-0.02516854515566532	0.06450413358614304	-0.00854093627988571	-0.00854093627988571	-0.00854093627988571	-0.00854093627988571	-0.00854093627988571	96	0.11817866295073942	1
0.7754164890740132	0.5395342445176964	0.10617215667657426	0.1782429737770779	-0.020972370237189703	0.24650578727508796	0.24650578727508796	0.24650578727508796	0.24650578727508796	0.24650578727508796	-0.09513208220936142	0.048799027507884826	0.06116810689537701	-0.04946169106847582	0.056547844377849064	-0.034589467035792396	-0.034589467035792396	-0.034589467035792396	-0.034589467035792396	-0.034589467035792396	97	0.1280206781422752	1
0.7763678098961068	0.5390462542426175	0.10556047560762048	0.17873759068776265	-0.021537848680968193	0.24685168194544588	0.24685168194544588	0.24685168194544588	0.24685168194544588	0.24685168194544588	0.021823360536357136	0.24381010531203992	0.13430132877809084	-0.0542497680525091	0.06769974962677341	0.0017672575722143958	0.0017672575722143958	0.0017672575722143958	0.0017672575722143958	0.0017672575722143958	98	0.12127154804897983	1
0.7761495762907432	0.5366081531894972	0.10421746231983957	0.17928008836828774	-0.022214846177235927	0.24683400936972374	0.24683400936972374	0.24683400936972374	0.24683400936972374	0.24683400936972374	-0.01971157818123563	0.13430145878528907	0.0901747839551022	-0.04185287529821893	0.05974885555957459	-0.010383945054739047	-0.010383945054739047	-0.010383945054739047	-0.010383945054739047	-0.010383945054739047	99	0.124218103071711	1
0.7763466920725556	0.5352651386016443	0.10331571448028855	0.17969861712126992	-0.022812334732831674	0.24693784882027114	0.24693784882027114	0.24693784882027114	0.24693784882027114	0.24693784882027114	-0.12080450395174498	-0.008428871457682084	0.042768930169876214	-0.0453162870866954	0.05141831036282574	-0.04464964907170844	-0.04464964907170844	-0.04464964907170844	-0.04464964907170844	-0.04464964907170844	100	0.12942311376115329	1

You can now evaluate the model on the dataset.

This query works but has a lot of drawbacks. In the next parts we will try to fix some of them.

Machine Learning Part 1 — Linear regression in MXNet (https://blog.adamfurmanek.pl/2018/10/20/machine-learning-part-1/)

This is the first part of the Machine Learning series. For your convenience you can find the other parts using the links below (or by guessing the address):
Part 1 — Linear regression in MXNet
Part 2 — Linear regression in SQL
Part 3 — Linear regression in SQL revisited
Part 4 — Linear regression in T-SQL
Part 5 — Linear regression
Part 6 — Matrix multiplication in SQL
Part 7 — Forward propagation in neural net in SQL
Part 8 — Backpropagation in neural net in SQL

In this series I assume you know the basics of machine learning. I will provide some source code for different use cases but no extensive explanations. Let’s go.

Today we will take a look at linear regression in MXNet. We will predict the sepal length in the well-known Iris dataset.

I assume you have the dataset uploaded to S3. Let’s start by loading it:

from mxnet import nd, autograd
import pandas as pd
import matplotlib.pyplot as plt
from sklearn.model_selection import train_test_split

local_file="/tmp/Iris.csv"
df = pd.read_csv("/blabla/" + local_file, delimiter=',', header = 0)

print df.shape

We can see some records with print(df.head(3)) or check the different iris categories with df.iris.unique().
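
For example, a quick exploration sketch (print written as a function so it also runs on Python 3; the value_counts call is just an extra convenience, not something from the original listing):

print(df.head(3))                  # preview the first three rows
print(df['iris'].unique())         # the distinct iris species
print(df['iris'].value_counts())   # how many rows we have per species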

We have one target variable and four features. Let’s create two additional ones:

df['i_setosa'] = 0
df.loc[(df['iris']=='setosa'), 'i_setosa']= 1
df['i_versicolor'] = 0
df.loc[(df['iris']=='versicolor'), 'i_versicolor']= 1

These two features act like a one-hot encoding of the categorical iris feature.
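
The same pair of indicator columns could also be built with pandas’ get_dummies. This is just an alternative sketch, assuming the species labels are exactly setosa and versicolor as in the comparisons above:

dummies = pd.get_dummies(df['iris'])        # one 0/1 column per species
df['i_setosa'] = dummies['setosa']
df['i_versicolor'] = dummies['versicolor']  # virginica stays implicit, as in the manual version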

Time to prepare the training and test datasets with: df_train, df_test = train_test_split(df, test_size=0.3, random_state=1)

Let’s get down to training. We start by defining the feature variables and the target one:

independent_var = ['sepal_width','petal_length','petal_width','i_setosa','i_versicolor']
y_train = nd.array(df_train['sepal_length'])
X_train = nd.array(df_train[independent_var]) 
y_test = nd.array(df_test['sepal_length'])
X_test = nd.array(df_test[independent_var])

Let’s prepare a class representing a data instance:

class data:
    def __init__(self,X,y):
        self.X = nd.array(X) 
        self.y = nd.array(y)
        cols = X.shape[1]
        self.initialize_parameter(cols)

    def initialize_parameter(self,cols):
        self.w = nd.random.normal(shape = [cols, 1])
        self.b = nd.random.normal(shape = 1)
        self.params = [self.w, self.b]

        for x in self.params:
            x.attach_grad()

We initialize the parameters and attach gradient calculation. This is a very nice feature: we don’t need to take care of the derivatives ourselves, everything is handled for us.
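
To see what attach_grad and autograd buy us, here is a tiny standalone sketch (not from the original listing) that differentiates a simple function:

x = nd.array([1.0, 2.0, 3.0])
x.attach_grad()              # allocate storage for the gradient of x
with autograd.record():      # record operations into a computational graph
    y = nd.sum(x * x)        # y = x1^2 + x2^2 + x3^2
y.backward()                 # fills x.grad with dy/dx = 2x
print(x.grad)                # [2. 4. 6.]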

Let’s now implement a single gradient descent step:

class optimizer:
    def __init__(self):
        pass
    
    def GD(self,data_instance,lr):
        for x in data_instance.params:
            x[:] = x - x.grad * lr

We just subtract the gradient multiplied by the learning rate. Note that we assign to x[:] instead of x, so the parameter NDArray is updated in place and keeps its attached gradient. If we used plain x instead, we would see the following error:

Check failed: !AGInfo::IsNone(*i) Cannot differentiate node because it is not in a computational graph. You need to set is_recording to true or use autograd.record() to save computational graphs for backward. If you want to differentiate the same graph twice, you need to pass retain_graph=True to backward.
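
To make the difference concrete, here is a sketch of the two assignments (reusing the names from the GD method above). Only the in-place form actually modifies the stored parameters:

for x in data_instance.params:
    x[:] = x - x.grad * lr   # in-place: the parameter NDArray itself is updated

for x in data_instance.params:
    x = x - x.grad * lr      # rebinds only the local name to a brand new NDArray,
                             # so the parameters kept in data_instance.params never change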

Now, let’s train our model:

def main():
    # Modeling parameters
    learning_rate = 1e-2
    num_iters = 100
    
    data_instance = data(X_train,y_train) 

    opt = optimizer()
    
    loss_sequence = []
    
    for iteration in range(num_iters):
        with autograd.record():
            loss = nd.mean((nd.dot(X_train, data_instance.w) + data_instance.b - y_train)**2)
            
        loss.backward()
        opt.GD(data_instance, learning_rate)
        
        print ("iteration %s, Mean loss: %s" % (iteration,loss))
        loss_sequence.append(loss.asscalar())
        
    plt.figure(num=None,figsize=(8, 6))
    plt.plot(loss_sequence)
    plt.xlabel('iteration',fontsize=14)
    plt.ylabel('Mean loss',fontsize=14)
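
To run the training end to end as a plain script, something like this at the bottom should do (a usage sketch, not part of the original listing; in a notebook calling main() directly is enough):

if __name__ == "__main__":
    main()
    plt.show()   # display the loss curve when running outside a notebook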

We should get the following:

[Figure: training loss per iteration when using the mean loss]

Note that our loss uses the mean, whereas we could calculate just a sum. However, due to overflow problems we would then get the following:

[Figure: training loss per iteration when using the sum loss — the values blow up]
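
For reference, the sum-based variant only changes the loss line; a sketch using the same names as in main above:

with autograd.record():
    loss = nd.sum((nd.dot(X_train, data_instance.w) + data_instance.b - y_train)**2)
# Without the 1/n factor the gradients are roughly a hundred times larger on ~105 training rows,
# so the same learning rate makes the updates explode and the loss quickly overflows.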

Finally, let’s check the performance of the trained model:

MSE = nd.mean(((nd.dot(X_test, data_instance.w) + data_instance.b) - y_test)**2)
print ("Mean Squared Error on Test Set: %s" % (MSE))

Done.

Summary

We can see that linear regression in MXNet is pretty concise and easy. However, this uses Python and Spark, which we might want to avoid. In the next parts we will take a look at different solutions.
