MSSQL – Random IT Utensils
https://blog.adamfurmanek.pl
IT, operating systems, maths, and more.

Windowing functions in recursive CTE
https://blog.adamfurmanek.pl/2019/07/13/windowing-functions-in-recursive-cte/
Sat, 13 Jul 2019 08:00:24 +0000

Today we will see an interesting case of incompatibility between MS SQL Server 2017 and PostgreSQL 9.6 (and other versions as well). Let’s start with this code:

WITH dummy AS(
    SELECT 1 AS rowValue, 0 AS phase
    UNION ALL
    SELECT 2 AS rowValue, 0 AS phase
),
solution AS (
    SELECT * FROM dummy
),
solution2 AS(
    SELECT
        SUM(rowValue) OVER (PARTITION BY phase) AS rowValue,
        phase + 1 AS phase
    FROM solution
    WHERE phase = 0
)
SELECT *
FROM solution2
WHERE phase = 1

We emulate a recursive CTE. We have two columns in the source dataset; we want to sum the first column for rows partitioned by the second column. This gives the expected result:

rowValue    phase
----------- -----------
3           1
3           1
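The windowed sum can be sketched in Python (a minimal emulation of what SUM ... OVER (PARTITION BY ...) computes, not how the engine actually evaluates it):

```python
# What SUM(rowValue) OVER (PARTITION BY phase) computes: every row keeps
# its identity but receives the total of its partition.
rows = [(1, 0), (2, 0)]  # (rowValue, phase) from the dummy CTE

partition_sums = {}
for value, phase in rows:
    partition_sums[phase] = partition_sums.get(phase, 0) + value

# solution2: phase = 0 rows become (partition sum, phase + 1)
solution2 = [(partition_sums[phase], phase + 1) for value, phase in rows if phase == 0]
print(solution2)  # [(3, 1), (3, 1)]
```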

Now let’s use a recursive CTE in MS SQL:

WITH dummy AS(
    SELECT 1 AS rowValue, 0 AS phase
    UNION ALL
    SELECT 2 AS rowValue, 0 AS phase
),
solution AS (
    SELECT * FROM dummy
    UNION ALL
    SELECT
        SUM(rowValue) OVER (PARTITION BY phase) AS rowValue,
        phase + 1 AS phase
    FROM solution
    WHERE phase = 0
)
SELECT * FROM solution WHERE phase = 1;

And the result is:

rowValue    phase
----------- -----------
2           1
1           1

However, PostgreSQL gives the correct values:

rowValue    phase
----------- -----------
3           1
3           1

Beware! Also, see this great post explaining the row-based and set-based approaches for implementing CTEs.
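The disagreement can be illustrated with a Python sketch of the two evaluation strategies (my own illustration; I assume MS SQL’s row-based evaluation feeds the recursive member one row at a time, so each window partition contains a single row):

```python
rows = [(1, 0), (2, 0)]  # (rowValue, phase) from the dummy CTE

# Set-based (PostgreSQL): the window sees the whole recursive step at once.
sums = {}
for value, phase in rows:
    sums[phase] = sums.get(phase, 0) + value
set_based = [(sums[phase], phase + 1) for value, phase in rows if phase == 0]

# Row-based (MS SQL): each row flows through alone, so the partition
# holds a single row and SUM degenerates to the row's own value.
row_based = [(value, phase + 1) for value, phase in rows if phase == 0]

print(set_based)   # [(3, 1), (3, 1)]
print(row_based)   # [(1, 1), (2, 1)]
```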

Machine Learning Part 4 — Linear regression in T-SQL
https://blog.adamfurmanek.pl/2018/11/10/machine-learning-part-4/
Sat, 10 Nov 2018 09:00:03 +0000

This is the fourth part of the ML series. For your convenience, you can find the other parts in the table of contents in Part 1 – Linear regression in MXNet.

This time we are going to implement linear regression as a function. This gives us a little more flexibility in terms of debugging the code and reading it later; it also lets us implement much more complex algorithms. Unfortunately, we can’t use this in Redshift at this time, as it doesn’t support such functions or stored procedures. So I will use T-SQL and test the code with MS SQL 2017. I assume you have the table samples with the Iris dataset.

We start with declaring a type for the function parameter:

CREATE TYPE SamplesTable 
AS TABLE (id int, feature int, value float, target float)

Next, let’s prepare samples for training:

DECLARE @numbers TABLE (N int)

INSERT INTO @numbers SELECT TOP 5 row_number() OVER(ORDER BY t1.number) AS N FROM master..spt_values AS t1 CROSS JOIN master..spt_values AS t2

DECLARE @samples TABLE(
	sepal_length float
	,sepal_width float
	,petal_length float
	,petal_width float
	,iris varchar(255)
	,is_setosa float
	,is_virginica float
	,sample_id int
)

INSERT INTO @samples SELECT TOP 100 S.*,
CASE WHEN S.iris = 'setosa' THEN 1.0 ELSE 0.0 END AS is_setosa, 
CASE WHEN S.iris = 'virginica' THEN 1.0 ELSE 0.0 END AS is_virginica,
row_number() OVER(ORDER BY (SELECT NULL)) AS sample_id
FROM samples AS S ORDER BY (SELECT ABS(CHECKSUM(NewId()))) 

DECLARE @samplesPivoted SamplesTable

INSERT INTO @samplesPivoted 
SELECT
	S.sample_id,
	N.N,
	CASE
		WHEN N.N = 1 THEN S.sepal_width
		WHEN N.N = 2 THEN S.petal_length
		WHEN N.N = 3 THEN S.petal_width
		WHEN N.N = 4 THEN S.is_setosa
		ELSE S.is_virginica
	END,
	S.sepal_length
FROM @samples AS S CROSS JOIN @numbers AS N

We generate a table with numbers, then add more features, and finally pivot them just like in the last part.
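The pivoting can be sketched in Python (hypothetical values for a single sample, producing the same (id, feature, value, target) shape as SamplesTable):

```python
# One wide Iris row: sepal_length is the target, the rest become features.
sample_id = 1
target = 5.1                          # sepal_length
features = [3.5, 1.4, 0.2, 1.0, 0.0]  # sepal_width, petal_length, petal_width, is_setosa, is_virginica

# The CROSS JOIN with numbers 1..5 turns the wide row into tall
# (id, feature, value, target) rows, one per feature.
pivoted = [(sample_id, n, features[n - 1], target) for n in range(1, 6)]
print(pivoted)
# [(1, 1, 3.5, 5.1), (1, 2, 1.4, 5.1), (1, 3, 0.2, 5.1), (1, 4, 1.0, 5.1), (1, 5, 0.0, 5.1)]
```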

Finally, our function:

CREATE FUNCTION Train(@samplesPivoted SamplesTable READONLY)
RETURNS @coefficients TABLE(feature int, w float, b float, mse float)
AS
BEGIN
    DECLARE @featureIds TABLE(feature int)
	INSERT INTO @featureIds SELECT DISTINCT feature from @samplesPivoted

	INSERT INTO @coefficients SELECT feature, 0.0, 0.0, -1.0 FROM @featureIds

	DECLARE @gradients TABLE(feature int, gw float, gb float)
	INSERT INTO @gradients SELECT feature, 0.0, 0.0 FROM @featureIds

	DECLARE @learningRate float
	SELECT @learningRate = 0.01

	DECLARE @iterations int
	SELECT @iterations = 500

	DECLARE @currentIteration int
	SELECT @currentIteration = 0

	DECLARE @newCoefficients TABLE(feature int, w float, b float)
	DECLARE @distances TABLE(id int, distance float)
	DECLARE @mse float

	WHILE @currentIteration < @iterations
	BEGIN
		DELETE FROM @newCoefficients
		INSERT INTO @newCoefficients SELECT C.feature, C.w - @learningRate * G.gw, C.b - @learningRate * G.gb FROM @coefficients AS C JOIN @gradients AS G ON C.feature = G.feature

		DELETE FROM @distances;

		INSERT INTO @distances SELECT 
			S.id, 
			SUM(N.w * S.value + N.b) - MAX(S.target)
		FROM 
			@samplesPivoted AS S
			JOIN @newCoefficients AS N ON S.feature = N.feature
		GROUP BY S.id

		SELECT @mse = AVG(D.distance * D.distance) FROM @distances AS D
		
		DELETE FROM @gradients;

		INSERT INTO @gradients SELECT
			S.feature,
			AVG(S.value * D.distance),
			AVG(D.distance)
		FROM 
			@samplesPivoted AS S
			JOIN @distances AS D ON S.id = D.id
		GROUP BY S.feature

		DELETE FROM @coefficients;

		INSERT INTO @coefficients SELECT *, @mse FROM @newCoefficients
		
		SELECT @currentIteration = @currentIteration + 1
	END

	RETURN
END

We extract the feature IDs so we can pass basically any dataset for training and it should still work. We initialize the coefficients with default values, do the same with the gradients, and prepare some bookkeeping like the iteration count and the learning rate.

Next, in every iteration we start by calculating new coefficients based on the old coefficients and the old gradients. We clear the distances table and calculate the distance (the difference between the predicted value and the expected value) for each sample. Then we calculate the mean squared error.

Next, we calculate the new gradients. For each feature we calculate the derivatives, and we are done. We just need to store the new coefficients and increase the counter.
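The whole loop can be sketched in Python (a simplified mirror of the T-SQL function, with a tiny made-up dataset; not the exact training run from the post):

```python
# A simplified Python mirror of the Train function: per-feature weights,
# gradient descent on (sum over features of w*value + b) - target.
samples = [  # (id, feature, value, target) rows, hypothetical values
    (1, 1, 3.5, 5.1), (1, 2, 1.4, 5.1),
    (2, 1, 3.0, 4.9), (2, 2, 1.4, 4.9),
]
features = sorted({f for _, f, _, _ in samples})
ids = sorted({i for i, _, _, _ in samples})
w = {f: 0.0 for f in features}
b = {f: 0.0 for f in features}
gw = {f: 0.0 for f in features}
gb = {f: 0.0 for f in features}
learning_rate, iterations = 0.01, 500

for _ in range(iterations):
    # new coefficients from the old coefficients and old gradients
    for f in features:
        w[f] -= learning_rate * gw[f]
        b[f] -= learning_rate * gb[f]
    # distance per sample: prediction minus target
    distance = {}
    for i in ids:
        rows = [r for r in samples if r[0] == i]
        distance[i] = sum(w[f] * v + b[f] for _, f, v, _ in rows) - rows[0][3]
    mse = sum(d * d for d in distance.values()) / len(distance)
    # new gradients: averages of value*distance and of distance, per feature
    for f in features:
        rows = [r for r in samples if r[1] == f]
        gw[f] = sum(v * distance[i] for i, _, v, _ in rows) / len(rows)
        gb[f] = sum(distance[i] for i, _, _, _ in rows) / len(rows)

print(mse)  # decreases from ~25 towards 0 over the iterations
```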

Now we can execute the code:

SELECT * FROM Train(@samplesPivoted)

And the result is:

feature     w                      b                      mse
----------- ---------------------- ---------------------- ----------------------
1           0.746997439342549      0.282176586393152      0.098274347087078
2           0.563235001391582      0.282176586393152      0.098274347087078
3           0.0230764649956309     0.282176586393152      0.098274347087078
4           0.193704294614636      0.282176586393152      0.098274347087078
5           -0.110068224303597     0.282176586393152      0.098274347087078

Machine Learning Part 2 — Linear regression in SQL
https://blog.adamfurmanek.pl/2018/10/27/machine-learning-part-2/
Sat, 27 Oct 2018 08:00:18 +0000

This is the second part of the ML series. For your convenience, you can find the other parts in the table of contents in Part 1 – Linear regression in MXNet.

Imagine that you have only a data warehouse with SQL capabilities to train and evaluate your models. Last time we ran Python code to calculate linear regression for the Iris dataset; today we are going to do exactly the same, but in SQL.

The code provided below is for MS SQL 2017.

Let’s start with dataset and schema:

CREATE TABLE samples(
sepal_length float
,sepal_width float
,petal_length float
,petal_width float
,iris varchar(255)
);

INSERT INTO samples
VALUES
(5.1,3.5,1.4,0.2,'setosa'),
(4.9,3,1.4,0.2,'setosa'),
(4.7,3.2,1.3,0.2,'setosa'),
(4.6,3.1,1.5,0.2,'setosa'),
(5,3.6,1.4,0.2,'setosa'),
(5.4,3.9,1.7,0.4,'setosa'),
(4.6,3.4,1.4,0.3,'setosa'),
(5,3.4,1.5,0.2,'setosa'),
(4.4,2.9,1.4,0.2,'setosa'),
(4.9,3.1,1.5,0.1,'setosa'),
(5.4,3.7,1.5,0.2,'setosa'),
(4.8,3.4,1.6,0.2,'setosa'),
(4.8,3,1.4,0.1,'setosa'),
(4.3,3,1.1,0.1,'setosa'),
(5.8,4,1.2,0.2,'setosa'),
(5.7,4.4,1.5,0.4,'setosa'),
(5.4,3.9,1.3,0.4,'setosa'),
(5.1,3.5,1.4,0.3,'setosa'),
(5.7,3.8,1.7,0.3,'setosa'),
(5.1,3.8,1.5,0.3,'setosa'),
(5.4,3.4,1.7,0.2,'setosa'),
(5.1,3.7,1.5,0.4,'setosa'),
(4.6,3.6,1,0.2,'setosa'),
(5.1,3.3,1.7,0.5,'setosa'),
(4.8,3.4,1.9,0.2,'setosa'),
(5,3,1.6,0.2,'setosa'),
(5,3.4,1.6,0.4,'setosa'),
(5.2,3.5,1.5,0.2,'setosa'),
(5.2,3.4,1.4,0.2,'setosa'),
(4.7,3.2,1.6,0.2,'setosa'),
(4.8,3.1,1.6,0.2,'setosa'),
(5.4,3.4,1.5,0.4,'setosa'),
(5.2,4.1,1.5,0.1,'setosa'),
(5.5,4.2,1.4,0.2,'setosa'),
(4.9,3.1,1.5,0.1,'setosa'),
(5,3.2,1.2,0.2,'setosa'),
(5.5,3.5,1.3,0.2,'setosa'),
(4.9,3.1,1.5,0.1,'setosa'),
(4.4,3,1.3,0.2,'setosa'),
(5.1,3.4,1.5,0.2,'setosa'),
(5,3.5,1.3,0.3,'setosa'),
(4.5,2.3,1.3,0.3,'setosa'),
(4.4,3.2,1.3,0.2,'setosa'),
(5,3.5,1.6,0.6,'setosa'),
(5.1,3.8,1.9,0.4,'setosa'),
(4.8,3,1.4,0.3,'setosa'),
(5.1,3.8,1.6,0.2,'setosa'),
(4.6,3.2,1.4,0.2,'setosa'),
(5.3,3.7,1.5,0.2,'setosa'),
(5,3.3,1.4,0.2,'setosa'),
(7,3.2,4.7,1.4,'versicolor'),
(6.4,3.2,4.5,1.5,'versicolor'),
(6.9,3.1,4.9,1.5,'versicolor'),
(5.5,2.3,4,1.3,'versicolor'),
(6.5,2.8,4.6,1.5,'versicolor'),
(5.7,2.8,4.5,1.3,'versicolor'),
(6.3,3.3,4.7,1.6,'versicolor'),
(4.9,2.4,3.3,1,'versicolor'),
(6.6,2.9,4.6,1.3,'versicolor'),
(5.2,2.7,3.9,1.4,'versicolor'),
(5,2,3.5,1,'versicolor'),
(5.9,3,4.2,1.5,'versicolor'),
(6,2.2,4,1,'versicolor'),
(6.1,2.9,4.7,1.4,'versicolor'),
(5.6,2.9,3.6,1.3,'versicolor'),
(6.7,3.1,4.4,1.4,'versicolor'),
(5.6,3,4.5,1.5,'versicolor'),
(5.8,2.7,4.1,1,'versicolor'),
(6.2,2.2,4.5,1.5,'versicolor'),
(5.6,2.5,3.9,1.1,'versicolor'),
(5.9,3.2,4.8,1.8,'versicolor'),
(6.1,2.8,4,1.3,'versicolor'),
(6.3,2.5,4.9,1.5,'versicolor'),
(6.1,2.8,4.7,1.2,'versicolor'),
(6.4,2.9,4.3,1.3,'versicolor'),
(6.6,3,4.4,1.4,'versicolor'),
(6.8,2.8,4.8,1.4,'versicolor'),
(6.7,3,5,1.7,'versicolor'),
(6,2.9,4.5,1.5,'versicolor'),
(5.7,2.6,3.5,1,'versicolor'),
(5.5,2.4,3.8,1.1,'versicolor'),
(5.5,2.4,3.7,1,'versicolor'),
(5.8,2.7,3.9,1.2,'versicolor'),
(6,2.7,5.1,1.6,'versicolor'),
(5.4,3,4.5,1.5,'versicolor'),
(6,3.4,4.5,1.6,'versicolor'),
(6.7,3.1,4.7,1.5,'versicolor'),
(6.3,2.3,4.4,1.3,'versicolor'),
(5.6,3,4.1,1.3,'versicolor'),
(5.5,2.5,4,1.3,'versicolor'),
(5.5,2.6,4.4,1.2,'versicolor'),
(6.1,3,4.6,1.4,'versicolor'),
(5.8,2.6,4,1.2,'versicolor'),
(5,2.3,3.3,1,'versicolor'),
(5.6,2.7,4.2,1.3,'versicolor'),
(5.7,3,4.2,1.2,'versicolor'),
(5.7,2.9,4.2,1.3,'versicolor'),
(6.2,2.9,4.3,1.3,'versicolor'),
(5.1,2.5,3,1.1,'versicolor'),
(5.7,2.8,4.1,1.3,'versicolor'),
(6.3,3.3,6,2.5,'virginica'),
(5.8,2.7,5.1,1.9,'virginica'),
(7.1,3,5.9,2.1,'virginica'),
(6.3,2.9,5.6,1.8,'virginica'),
(6.5,3,5.8,2.2,'virginica'),
(7.6,3,6.6,2.1,'virginica'),
(4.9,2.5,4.5,1.7,'virginica'),
(7.3,2.9,6.3,1.8,'virginica'),
(6.7,2.5,5.8,1.8,'virginica'),
(7.2,3.6,6.1,2.5,'virginica'),
(6.5,3.2,5.1,2,'virginica'),
(6.4,2.7,5.3,1.9,'virginica'),
(6.8,3,5.5,2.1,'virginica'),
(5.7,2.5,5,2,'virginica'),
(5.8,2.8,5.1,2.4,'virginica'),
(6.4,3.2,5.3,2.3,'virginica'),
(6.5,3,5.5,1.8,'virginica'),
(7.7,3.8,6.7,2.2,'virginica'),
(7.7,2.6,6.9,2.3,'virginica'),
(6,2.2,5,1.5,'virginica'),
(6.9,3.2,5.7,2.3,'virginica'),
(5.6,2.8,4.9,2,'virginica'),
(7.7,2.8,6.7,2,'virginica'),
(6.3,2.7,4.9,1.8,'virginica'),
(6.7,3.3,5.7,2.1,'virginica'),
(7.2,3.2,6,1.8,'virginica'),
(6.2,2.8,4.8,1.8,'virginica'),
(6.1,3,4.9,1.8,'virginica'),
(6.4,2.8,5.6,2.1,'virginica'),
(7.2,3,5.8,1.6,'virginica'),
(7.4,2.8,6.1,1.9,'virginica'),
(7.9,3.8,6.4,2,'virginica'),
(6.4,2.8,5.6,2.2,'virginica'),
(6.3,2.8,5.1,1.5,'virginica'),
(6.1,2.6,5.6,1.4,'virginica'),
(7.7,3,6.1,2.3,'virginica'),
(6.3,3.4,5.6,2.4,'virginica'),
(6.4,3.1,5.5,1.8,'virginica'),
(6,3,4.8,1.8,'virginica'),
(6.9,3.1,5.4,2.1,'virginica'),
(6.7,3.1,5.6,2.4,'virginica'),
(6.9,3.1,5.1,2.3,'virginica'),
(5.8,2.7,5.1,1.9,'virginica'),
(6.8,3.2,5.9,2.3,'virginica'),
(6.7,3.3,5.7,2.5,'virginica'),
(6.7,3,5.2,2.3,'virginica'),
(6.3,2.5,5,1.9,'virginica'),
(6.5,3,5.2,2,'virginica'),
(6.2,3.4,5.4,2.3,'virginica'),
(5.9,3,5.1,1.8,'virginica')

Nothing fancy, just a table with Iris data. Next, the training:

WITH transformed AS (
	SELECT TOP 100000
		S.*, 
		CASE WHEN S.iris = 'setosa' THEN 1.0 ELSE 0.0 END AS is_setosa, 
		CASE WHEN S.iris = 'virginica' THEN 1.0 ELSE 0.0 END AS is_virginica
	FROM samples AS S ORDER BY (SELECT ABS(CHECKSUM(NewId())))
),
training AS (
  SELECT TOP 100 * FROM transformed ORDER BY (SELECT RAND())
),
test AS (
  SELECT * FROM transformed EXCEPT SELECT * FROM training
),
learning AS (
  SELECT 
	  CAST(0.0 AS float) as w1, 
	  CAST(0.0 AS float) as w2, 
	  CAST(0.0 AS float) as w3, 
	  CAST(0.0 AS float) as w4,
	  CAST(0.0 AS float) as w5, 
	  CAST(0.0 AS float) as b1, 
	  CAST(0.0 AS float) as b2, 
	  CAST(0.0 AS float) as b3, 
	  CAST(0.0 AS float) as b4, 
	  CAST(0.0 AS float) as b5, 
	  
	  CAST(0.0 AS float) as gw1,
	  
	  CAST(0.0 AS float) as gw2, 
	  CAST(0.0 AS float) as gw3, 
	  CAST(0.0 AS float) as gw4, 
	  CAST(0.0 AS float) as gw5, 
	  CAST(0.0 AS float) as gb1, 
	  CAST(0.0 AS float) as gb2, 
	  CAST(0.0 AS float) as gb3, 
	  CAST(0.0 AS float) as gb4, 
	  CAST(0.0 AS float) as gb5, 
	  1 as iteration,
	  CAST(0.0 AS float) as mse,
	  1 as dummy
	  
  UNION ALL
  SELECT R.w1, R.w2, R.w3, R.w4, R.w5, R.b1, R.b2, R.b3, R.b4, R.b5, R.gw1, R.gw2, R.gw3, R.gw4, R.gw5, R.gb1, R.gb2, R.gb3, R.gb4, R.gb5, R.iteration, R.mse, R.dummy
  FROM (
	  SELECT
		  CAST(Z.w1 AS float) AS w1, 
		  CAST(Z.w2 AS float) AS w2, 
		  CAST(Z.w3 AS float) AS w3, 
		  CAST(Z.w4 AS float) AS w4,
		  CAST(Z.w5 AS float) AS w5, 
		  CAST(Z.b1 AS float) AS b1,
		  CAST(Z.b2 AS float) AS b2, 
		  CAST(Z.b3 AS float) AS b3, 
		  CAST(Z.b4 AS float) AS b4,
		  CAST(Z.b5 AS float) AS b5, 
		  CAST(AVG(Z.gw1) OVER(PARTITION BY Z.iteration) AS float) AS gw1,
		  CAST(AVG(Z.gw2) OVER(PARTITION BY Z.iteration) AS float) AS gw2,
		  CAST(AVG(Z.gw3) OVER(PARTITION BY Z.iteration) AS float) AS gw3, 
		  CAST(AVG(Z.gw4) OVER(PARTITION BY Z.iteration) AS float) AS gw4, 
		  CAST(AVG(Z.gw5) OVER(PARTITION BY Z.iteration) AS float) AS gw5, 
		  CAST(AVG(Z.gb1) OVER(PARTITION BY Z.iteration) AS float) AS gb1, 
		  CAST(AVG(Z.gb2) OVER(PARTITION BY Z.iteration) AS float) AS gb2, 
		  CAST(AVG(Z.gb3) OVER(PARTITION BY Z.iteration) AS float) AS gb3,
		  CAST(AVG(Z.gb4) OVER(PARTITION BY Z.iteration) AS float) AS gb4, 
		  CAST(AVG(z.gb5) OVER(PARTITION BY Z.iteration) AS float) AS gb5,
		  Z.iteration + 1 AS iteration,
		  CAST(AVG(z.squared_distance) OVER(PARTITION BY Z.w1, Z.w2, Z.w3, Z.w4, Z.w5, Z.b1, Z.b2, Z.b3, Z.b4, Z.b5, Z.iteration) AS float) AS mse,
		  Z.dummy AS dummy,
		  ROW_NUMBER() OVER(PARTITION BY Z.dummy ORDER BY Z.dummy) AS row_number
	  FROM (
		SELECT
		  X.*, 
		  X.distance * x.distance AS squared_distance, 
		  X.distance * X.sepal_width AS gw1, 
		  X.distance * X.petal_length AS gw2,
		  X.distance * X.petal_width AS gw3,
		  X.distance * X.is_setosa AS gw4,
		  X.distance * X.is_virginica AS gw5,
		  X.distance AS gb1,
		  X.distance AS gb2,
		  X.distance AS gb3,
		  X.distance AS gb4,
		  X.distance AS gb5,
		  1 as dummy
		FROM (
		  SELECT T.*, L.*, 
		  (T.sepal_width * L.w1 + L.b1) + 
		  (T.petal_length * L.w2 + L.b2) + 
		  (T.petal_width * L.w3 + L.b3) + 
		  (T.is_setosa * L.w4 + L.b4) + 
		  (T.is_virginica * L.w5 + L.b5)
		  - T.sepal_length AS distance
		  FROM training AS T, (
			SELECT
			  l.w1 - 0.01 * l.gw1 AS w1,
			  l.w2 - 0.01 * l.gw2 AS w2,
			  l.w3 - 0.01 * l.gw3 AS w3,
			  l.w4 - 0.01 * l.gw4 AS w4,
			  l.w5 - 0.01 * l.gw5 AS w5,
			  l.b1 - 0.01 * l.gb1 AS b1,
			  l.b2 - 0.01 * l.gb2 AS b2,
			  l.b3 - 0.01 * l.gb3 AS b3,
			  l.b4 - 0.01 * l.gb4 AS b4,
			  l.b5 - 0.01 * l.gb5 AS b5,
			  l.iteration,
			  MAX(l.iteration) OVER(PARTITION BY L.dummy) AS max_iteration
			FROM learning AS L
		  ) AS L
		  WHERE L.iteration = max_iteration
		  AND L.iteration < 100
		) AS X
	  ) AS Z
  ) AS R
  WHERE R.row_number = 1
)
SELECT DISTINCT * FROM learning ORDER BY iteration

Whoa, looks terrible. Let’s go step by step.

First, we get the transformed table with the samples in randomized order and two new features. The same as in the Python code.

Next, we get the training and test tables, representing the datasets for training and evaluation respectively.

Next, the learning table. We want to represent the formula Aw + b - y, where A is a matrix of samples, w and b are vectors of parameters we calculate with linear regression (representing the line), and y is a vector of target variables. gw# and gb# are variables representing the gradient, and mse is the mean squared error. dummy is just a variable we need in the windowing functions, since we cannot use grouping.
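The formula Aw + b - y can be sketched in plain Python (made-up numbers, one distance per sample):

```python
# A: matrix of samples (rows) by features, w/b: parameter vectors, y: targets.
A = [[3.5, 1.4, 0.2, 1.0, 0.0],
     [3.0, 1.4, 0.2, 1.0, 0.0]]
w = [0.1, 0.2, 0.3, 0.4, 0.5]
b = [0.01, 0.02, 0.03, 0.04, 0.05]
y = [5.1, 4.9]

# distance = Aw + b - y, computed per sample
distance = [sum(a_ij * w_j + b_j for a_ij, w_j, b_j in zip(row, w, b)) - y_i
            for row, y_i in zip(A, y)]
print(distance)  # approximately [-3.86, -3.71]
```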

Next, we get to the recursive CTE part. Let’s start from the innermost part.

Our initial learning values represent the starting coefficients, with gradients calculated in the last iteration. We could start with random values as well; here we start with constants. In the innermost view we do the actual training: for every feature we subtract the gradient multiplied by the learning rate (0.01 here), and this is how we calculate the new coefficients. For performance reasons we also calculate the highest iteration available so far.

Next, we join the training samples with the coefficients and calculate the actual l^2 metric. We multiply the coefficients by the values and finally subtract the target variable. Just before that we filter only the last iteration (with WHERE L.iteration = max_iteration) to decrease the dataset size. We also limit the number of iterations.

Now we have the distance calculated. We calculate the squared distance and the components of the gradient. Since we need to find the derivatives on our own (and we know the result, don’t we?), we multiply the distance by the features to get the partial derivatives for w, and take just the distance for the partial derivatives for b.
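As a sanity check, the gradient components can be verified numerically in Python (my own check, not part of the original post; the constant factor 2 from differentiating the square is conventionally absorbed into the learning rate):

```python
# For the squared distance (w*x + b - y)^2, the analytic derivatives are
# d/dw = 2 * x * distance and d/db = 2 * distance. Compare them against
# central finite differences.
def squared_distance(w, b, x, y):
    d = w * x + b - y
    return d * d

w, b, x, y = 0.3, 0.1, 1.4, 5.1
distance = w * x + b - y
eps = 1e-6
num_dw = (squared_distance(w + eps, b, x, y) - squared_distance(w - eps, b, x, y)) / (2 * eps)
num_db = (squared_distance(w, b + eps, x, y) - squared_distance(w, b - eps, x, y)) / (2 * eps)
print(num_dw, 2 * x * distance)  # the two values agree
print(num_db, 2 * distance)      # the two values agree
```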

Next, we do a lot of ugly casting to match the CTE requirement of uniform data types. We also calculate the averages of the gradients for every feature. We partition the dataset, and although there is actually just one iteration per partition, we need some partition for syntax purposes. We could use Z.dummy as well.

Ultimately, we just take the values from the first row, as all the rows have the same values. We could skip this filtering, but then our dataset would be very big and training would take much longer.

And here are the results of the fiddle:

w1	w2	w3	w4	w5	b1	b2	b3	b4	b5	gw1	gw2	gw3	gw4	gw5	gb1	gb2	gb3	gb4	gb5	iteration	mse	dummy
0	0	0	0	0	0	0	0	0	0	0	0	0	0	0	0	0	0	0	0	1	0	1
0	0	0	0	0	0	0	0	0	0	-17.866099999999992	-23.68590000000001	-7.787099999999996	-1.54	-2.298	-5.8580000000000005	-5.8580000000000005	-5.8580000000000005	-5.8580000000000005	-5.8580000000000005	2	34.993599999999994	1
0.17866099999999993	0.23685900000000012	0.07787099999999997	0.0154	0.02298	0.05858000000000001	0.05858000000000001	0.05858000000000001	0.05858000000000001	0.05858000000000001	-12.380883275799999	-15.605772535299998	-5.0526124829	-1.2608740210000005	-1.4953080819999993	-4.007251468	-4.007251468	-4.007251468	-4.007251468	-4.007251468	3	16.27646348154281	1
0.30246983275799993	0.3929167253530001	0.12839712482899995	0.028008740210000008	0.03793308081999999	0.09865251468	0.09865251468	0.09865251468	0.09865251468	0.09865251468	-8.418834585943394	-10.81760488381647	-3.4164139831366573	-0.8216200035865951	-0.9625399001012132	-2.7631314482787417	-2.7631314482787417	-2.7631314482787417	-2.7631314482787417	-2.7631314482787417	4	7.769214971609398	1
0.3866581786174339	0.5010927741911648	0.16256126466036652	0.036224940245865964	0.04755847982101212	0.1262838291627874	0.1262838291627874	0.1262838291627874	0.1262838291627874	0.1262838291627874	-6.035317318894683	-6.606228048514185	-2.023863003680904	-0.8444615479627321	-0.5368810335347928	-1.928314905725471	-1.928314905725471	-1.928314905725471	-1.928314905725471	-1.928314905725471	5	3.90475896095533	1
0.44701135180638074	0.5671550546763067	0.18279989469717556	0.04466955572549328	0.05292729015636005	0.1455669782200421	0.1455669782200421	0.1455669782200421	0.1455669782200421	0.1455669782200421	-4.259932246247001	-4.69904967785691	-1.4272812920919014	-0.5994414351159882	-0.3482019192777488	-1.3810151619909217	-1.3810151619909217	-1.3810151619909217	-1.3810151619909217	-1.3810151619909217	6	2.104835405441499	1
0.48961067426885074	0.6141455514548758	0.19707270761809456	0.050663970076653166	0.05640930934913754	0.15937712983995134	0.15937712983995134	0.15937712983995134	0.15937712983995134	0.15937712983995134	-2.9131502368507523	-2.7900047108941357	-0.786858726083015	-0.4902770512098376	-0.13835673111718788	-0.9360854954297377	-0.9360854954297377	-0.9360854954297377	-0.9360854954297377	-0.9360854954297377	7	1.1812001776943115	1
0.5187421766373582	0.6420455985638172	0.2049412948789247	0.055566740588751544	0.057792876660309425	0.1687379847942487	0.1687379847942487	0.1687379847942487	0.1687379847942487	0.1687379847942487	-2.2815822515924356	-1.8669176720067389	-0.48251503714682115	-0.45540670681884726	-0.061890674774057554	-0.7178696773491847	-0.7178696773491847	-0.7178696773491847	-0.7178696773491847	-0.7178696773491847	8	0.8633620171570588	1
0.5415579991532826	0.6607147752838846	0.20976644525039292	0.060120807656940015	0.05841178340805	0.17591668156774054	0.17591668156774054	0.17591668156774054	0.17591668156774054	0.17591668156774054	-1.5999202323884023	-1.1506719996479482	-0.2718566121871	-0.3411146806640698	0.0033881819862846907	-0.49616500591193463	-0.49616500591193463	-0.49616500591193463	-0.49616500591193463	-0.49616500591193463	9	0.5882617765607544	1
0.5575572014771667	0.672221495280364	0.21248501137226392	0.06353195446358072	0.058377901588187155	0.1808783316268599	0.1808783316268599	0.1808783316268599	0.1808783316268599	0.1808783316268599	-1.4486656783545695	-0.7126912415655796	-0.10067134875629	-0.3912972375425979	0.05674696038537284	-0.43717447937772547	-0.43717447937772547	-0.43717447937772547	-0.43717447937772547	-0.43717447937772547	10	0.5623460089803844	1
0.5720438582607124	0.6793484076960198	0.21349172485982681	0.0674449268390067	0.05781043198433343	0.18525007642063715	0.18525007642063715	0.18525007642063715	0.18525007642063715	0.18525007642063715	-0.9306475612833495	-0.15288151866185962	0.0744573029382314	-0.3115818095388638	0.11786433111911637	-0.2758898465515766	-0.2758898465515766	-0.2758898465515766	-0.2758898465515766	-0.2758898465515766	11	0.47691160165459484	1
0.5813503338735458	0.6808772228826384	0.21274715183044451	0.07056074493439533	0.056631788673142266	0.18800897488615292	0.18800897488615292	0.18800897488615292	0.18800897488615292	0.18800897488615292	-0.7351425771472415	0.07290133335083944	0.15718629093419806	-0.29006779678979683	0.14059860645334357	-0.21346090333136303	-0.21346090333136303	-0.21346090333136303	-0.21346090333136303	-0.21346090333136303	12	0.4716605760544138	1
0.5887017596450183	0.68014820954913	0.21117528892110254	0.07346142290229331	0.05522580260860883	0.19014358391946656	0.19014358391946656	0.19014358391946656	0.19014358391946656	0.19014358391946656	-0.8496869040548067	0.025167947969057334	0.13655829480440038	-0.32553628103124616	0.13687662647187548	-0.24475728986359135	-0.24475728986359135	-0.24475728986359135	-0.24475728986359135	-0.24475728986359135	13	0.4339828812327394	1
0.5971986286855663	0.6798965300694394	0.20980970597305854	0.07671678571260578	0.05385703634389007	0.1925911568181025	0.1925911568181025	0.1925911568181025	0.1925911568181025	0.1925911568181025	-0.7795646413472435	-0.0028445948234325025	0.11892653879067865	-0.2834392136806792	0.12863334072334467	-0.22656991588415598	-0.22656991588415598	-0.22656991588415598	-0.22656991588415598	-0.22656991588415598	14	0.4378123399820016	1
0.6049942750990387	0.6799249760176738	0.20862044058515175	0.07955117784941257	0.052570702936656624	0.19485685597694405	0.19485685597694405	0.19485685597694405	0.19485685597694405	0.19485685597694405	-0.6199022285354157	0.2241419869804357	0.18844403042810082	-0.27631647315104274	0.14659705709607304	-0.17750043058490167	-0.17750043058490167	-0.17750043058490167	-0.17750043058490167	-0.17750043058490167	15	0.4041025964725166	1
0.6111932973843929	0.6776835561478695	0.20673600028087075	0.082314342580923	0.05110473236569589	0.19663186028279306	0.19663186028279306	0.19663186028279306	0.19663186028279306	0.19663186028279306	-0.5982201415162909	0.223020429269901	0.18786410062666778	-0.2778417715564467	0.14219223365502676	-0.16766557583295694	-0.16766557583295694	-0.16766557583295694	-0.16766557583295694	-0.16766557583295694	16	0.37677359384506365	1
0.6171754987995558	0.6754533518551704	0.20485735927460408	0.08509276029648746	0.049682810029145624	0.19830851604112262	0.19830851604112262	0.19830851604112262	0.19830851604112262	0.19830851604112262	-0.504727184910006	0.3488735090236196	0.23169940182123533	-0.27304021389090055	0.15137057707747892	-0.14594812814182823	-0.14594812814182823	-0.14594812814182823	-0.14594812814182823	-0.14594812814182823	17	0.377184784439225	1
0.6222227706486558	0.6719646167649342	0.2025403652563917	0.08782316243539647	0.048169104258370836	0.1997679973225409	0.1997679973225409	0.1997679973225409	0.1997679973225409	0.1997679973225409	-0.44998764800328744	0.2717811698570403	0.18838367144926862	-0.23068727674683934	0.13729583498280745	-0.12885900816583054	-0.12885900816583054	-0.12885900816583054	-0.12885900816583054	-0.12885900816583054	18	0.3358015439419573	1
0.6267226471286887	0.6692468050663638	0.20065652854189903	0.09013003520286486	0.04679614590854276	0.2010565874041992	0.2010565874041992	0.2010565874041992	0.2010565874041992	0.2010565874041992	-0.3327449242778986	0.4218061610425528	0.250603029713697	-0.2186705079506715	0.16399901690035443	-0.09281485221264772	-0.09281485221264772	-0.09281485221264772	-0.09281485221264772	-0.09281485221264772	19	0.3435586135251613	1
0.6300500963714677	0.6650287434559383	0.19815049824476205	0.09231674028237158	0.04515615573953922	0.20198473592632568	0.20198473592632568	0.20198473592632568	0.20198473592632568	0.20198473592632568	-0.3978821591273137	0.37818526431069854	0.22397317697432803	-0.2301645447314299	0.15656659519583066	-0.10520392419282784	-0.10520392419282784	-0.10520392419282784	-0.10520392419282784	-0.10520392419282784	20	0.32909737647738074	1
0.6340289179627409	0.6612468908128314	0.19591076647501876	0.09461838572968588	0.04359048978758091	0.20303677516825397	0.20303677516825397	0.20303677516825397	0.20303677516825397	0.20303677516825397	-0.49501548321294875	0.2824756186489414	0.19154839534497095	-0.2408611483352118	0.13186150356062115	-0.1346065552874162	-0.1346065552874162	-0.1346065552874162	-0.1346065552874162	-0.1346065552874162	21	0.34489232659720287	1
0.6389790727948703	0.658422134626342	0.19399528252156906	0.097026997213038	0.0422718747519747	0.20438284072112814	0.20438284072112814	0.20438284072112814	0.20438284072112814	0.20438284072112814	-0.501011104499183	0.24610496419058692	0.18246922326963586	-0.24294075989110828	0.12900402506295225	-0.1393312449730318	-0.1393312449730318	-0.1393312449730318	-0.1393312449730318	-0.1393312449730318	22	0.3276205502872738	1
0.6439891838398621	0.6559610849844362	0.1921705902888727	0.09945640481194908	0.040981834501345175	0.20577615317085846	0.20577615317085846	0.20577615317085846	0.20577615317085846	0.20577615317085846	-0.324775351310746	0.3746107213205154	0.2111139416712132	-0.20058332957969047	0.13686659765338857	-0.08655469582421822	-0.08655469582421822	-0.08655469582421822	-0.08655469582421822	-0.08655469582421822	23	0.2935346775960134	1
0.6472369373529696	0.652214977771231	0.19005945087216058	0.10146223810774598	0.03961316852481129	0.20664170012910063	0.20664170012910063	0.20664170012910063	0.20664170012910063	0.20664170012910063	-0.4838045379993712	0.1006841330157371	0.10643052467775646	-0.20234130338363415	0.09292148398897694	-0.1354541866632277	-0.1354541866632277	-0.1354541866632277	-0.1354541866632277	-0.1354541866632277	24	0.25669597691151474	1
0.6520749827329633	0.6512081364410737	0.18899514562538303	0.10348565114158231	0.03868395368492152	0.2079962419957329	0.2079962419957329	0.2079962419957329	0.2079962419957329	0.2079962419957329	-0.35575362508485314	0.3141589432080337	0.20155588679506228	-0.20586212623287087	0.137290032171127	-0.10217281513696584	-0.10217281513696584	-0.10217281513696584	-0.10217281513696584	-0.10217281513696584	25	0.28212690272956203	1
0.6556325189838118	0.6480665470089934	0.1869795867574324	0.10554427240391102	0.03731105336321025	0.20901797014710258	0.20901797014710258	0.20901797014710258	0.20901797014710258	0.20901797014710258	-0.3875722593205403	0.39575152037580225	0.2376765951470362	-0.22980463433325074	0.14741462189301477	-0.09721841813781816	-0.09721841813781816	-0.09721841813781816	-0.09721841813781816	-0.09721841813781816	26	0.3275866369928804	1
0.6595082415770173	0.6441090318052354	0.18460282080596205	0.10784231874724354	0.0358369071442801	0.20999015432848075	0.20999015432848075	0.20999015432848075	0.20999015432848075	0.20999015432848075	-0.38666612411329326	0.30298380325383223	0.18544748254748933	-0.2109305062431924	0.1207459374420806	-0.10279820476131978	-0.10279820476131978	-0.10279820476131978	-0.10279820476131978	-0.10279820476131978	27	0.28304267674482325	1
0.6633749028181501	0.6410791937726971	0.18274834598048714	0.10995162380967546	0.034629447769859295	0.21101813637609396	0.21101813637609396	0.21101813637609396	0.21101813637609396	0.21101813637609396	-0.24847682120208867	0.3094963979968232	0.18946065318285155	-0.16684214138339748	0.12473991223078539	-0.07595889935143632	-0.07595889935143632	-0.07595889935143632	-0.07595889935143632	-0.07595889935143632	28	0.246960439850011	1
0.6658596710301711	0.6379842297927288	0.18085373944865862	0.11162004522350943	0.03338204864755144	0.21177772536960832	0.21177772536960832	0.21177772536960832	0.21177772536960832	0.21177772536960832	-0.26713664218284927	0.36465094054170605	0.21039792537887425	-0.18595615618522174	0.12832303993252667	-0.0712989816305829	-0.0712989816305829	-0.0712989816305829	-0.0712989816305829	-0.0712989816305829	29	0.24185518216726717	1

(rows for iterations 30–99 omitted; the error in the second-to-last column falls steadily from about 0.24 to about 0.13)

0.7763466920725556	0.5352651386016443	0.10331571448028855	0.17969861712126992	-0.022812334732831674	0.24693784882027114	0.24693784882027114	0.24693784882027114	0.24693784882027114	0.24693784882027114	-0.12080450395174498	-0.008428871457682084	0.042768930169876214	-0.0453162870866954	0.05141831036282574	-0.04464964907170844	-0.04464964907170844	-0.04464964907170844	-0.04464964907170844	-0.04464964907170844	100	0.12942311376115329	1

You can evaluate the dataset now.

This query works but has a lot of drawbacks. In the next parts we will try to fix some of them.
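The iteration trace above can be read as the state of a batch gradient-descent loop: each row carries the current model parameters, the iteration number, and the error, which shrinks as the iterations proceed. As a minimal sketch (this is an illustration in Python, not the article's SQL query, and the data and learning rate below are made up), the same kind of trace can be produced like this:

```python
# Hypothetical sketch of batch gradient descent for y = a*x + b,
# producing a per-iteration trace similar in spirit to the output above.
# The dataset, learning rate, and step count are illustrative assumptions.
def gradient_descent(xs, ys, lr=0.1, steps=100):
    a, b = 0.0, 0.0          # current model parameters
    n = len(xs)
    trace = []
    for step in range(1, steps + 1):
        # residuals of the current model on every sample
        errs = [a * x + b - y for x, y in zip(xs, ys)]
        # partial derivatives of the mean squared error
        grad_a = 2.0 / n * sum(e * x for e, x in zip(errs, xs))
        grad_b = 2.0 / n * sum(errs)
        # take one step against the gradient
        a -= lr * grad_a
        b -= lr * grad_b
        mse = sum(e * e for e in errs) / n
        trace.append((step, a, b, mse))
    return trace

# sample data generated from y = 2x + 1
trace = gradient_descent([1, 2, 3, 4], [3, 5, 7, 9])
```

Each tuple in `trace` plays the role of one output row: the step number, the parameters at that step, and the error, which should decrease from iteration to iteration just as in the dump above.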
