The gradient descent method searches for the best fit by iteratively adjusting the coefficients of the line in the direction that lowers the cost. The "Learning rate" parameter sets the size of each step. "Number of training epochs" specifies how many passes the algorithm makes over the data; entering a higher value is useful, especially when there is little data. "L2 regularization weight", also known as ridge regularization, sets the weight of a penalty term added to the cost function to keep the model from overfitting. "Normalize features" rescales the attributes to the range 0 to 1. "Decrease learning rate" reduces the learning rate over the iterations, so the algorithm takes bigger steps at first and then smaller steps.
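The loop described above can be sketched in plain Python. This is a minimal illustration, not the module's actual implementation: the parameter names mirror the Studio options, but the exact update rule and the decay schedule are assumptions.

```python
import numpy as np

def gradient_descent(X, y, learning_rate=0.1, epochs=1000,
                     l2_weight=0.0, normalize=True, decrease_lr=False):
    """Minimal gradient-descent linear regression sketch.

    Parameter names mirror the Studio options described above; the
    precise update rule and decay schedule are assumptions.
    """
    X = np.asarray(X, dtype=float)
    y = np.asarray(y, dtype=float)
    if normalize:  # "Normalize features": rescale each column to [0, 1]
        X = (X - X.min(axis=0)) / (X.max(axis=0) - X.min(axis=0))
    n, d = X.shape
    w = np.zeros(d)  # coefficients of the line
    b = 0.0          # intercept
    for epoch in range(epochs):
        # "Decrease learning rate": bigger steps first, smaller later
        lr = learning_rate / (1 + epoch) if decrease_lr else learning_rate
        error = X @ w + b - y
        # L2 ("ridge") term penalizes large weights in the cost function
        grad_w = (X.T @ error) / n + l2_weight * w
        grad_b = error.mean()
        w -= lr * grad_w
        b -= lr * grad_b
    return w, b
```

On a toy dataset such as points along y = 2x + 1, enough epochs with a moderate learning rate drive the coefficients close to the true line.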

# Score Model

We connect the test half of the data we split earlier into this box. The module appends a new column to the end of the test data and fills it with the value the model predicts for each row.
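Conceptually, the step looks like the sketch below: take the test rows, run each one through the trained model, and append the prediction as a new column. The column name "Scored Labels" follows Studio's convention; the helper itself is an illustrative assumption, not the module's code.

```python
def score_model(predict, test_rows, feature_cols, scored_col="Scored Labels"):
    """Mimic the Score Model step: append the model's prediction as a
    new column at the end of each test row.

    `predict` is any callable mapping a feature list to a prediction;
    this helper is illustrative, not the module's real implementation.
    """
    scored = []
    for row in test_rows:
        features = [row[c] for c in feature_cols]
        new_row = dict(row)            # keep the original columns
        new_row[scored_col] = predict(features)  # add prediction at the end
        scored.append(new_row)
    return scored
```

For example, with a hypothetical price model that charges 1000 per square meter, each row gains a "Scored Labels" value next to its actual value, which is exactly what Evaluate Model consumes next.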

# Evaluate Model

Score Model wrote the model's predictions into the rightmost column. The test data also contains the values the predictions should actually match. We could calculate for ourselves how successful the model is from these two columns, but that becomes burdensome with big data. This box does it for us. Since I went into the details in the previous articles, here I will just share the result of my own run.

These are the results I obtained according to LSM.

| Metric | Value | Explanation |
| --- | --- | --- |
| Mean Absolute Error | 46418 | The average of the absolute differences between the true values and the predicted values |
| Root Mean Squared Error | 52791 | The square root of the mean of the squared differences between the true values and the predicted values |
| Relative Absolute Error | 0.576456 | The total absolute error divided by the total absolute deviation of the true values from their mean |
| Relative Squared Error | 0.269637 | The total squared error divided by the total squared deviation of the true values from their mean |
| Coefficient of Determination | 0.730363 | Shows proportionally how well the predictions reflect reality; also written as R², and equal to 1 − RSE |
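The five metrics in the table follow standard definitions, so they can be verified by hand. The sketch below computes them from the actual and predicted columns; it is illustrative and not how Evaluate Model is implemented internally.

```python
import math

def regression_metrics(actual, predicted):
    """Compute the five Evaluate Model metrics from their standard
    definitions (illustrative, not the module's own code)."""
    n = len(actual)
    abs_err = [abs(a - p) for a, p in zip(actual, predicted)]
    sq_err = [(a - p) ** 2 for a, p in zip(actual, predicted)]
    mean_a = sum(actual) / n
    abs_dev = [abs(a - mean_a) for a in actual]   # deviation from the mean
    sq_dev = [(a - mean_a) ** 2 for a in actual]
    mae = sum(abs_err) / n                # Mean Absolute Error
    rmse = math.sqrt(sum(sq_err) / n)     # Root Mean Squared Error
    rae = sum(abs_err) / sum(abs_dev)     # Relative Absolute Error
    rse = sum(sq_err) / sum(sq_dev)       # Relative Squared Error
    r2 = 1 - rse                          # Coefficient of Determination
    return {"MAE": mae, "RMSE": rmse, "RAE": rae, "RSE": rse, "R2": r2}
```

Note how R² = 1 − RSE ties the last two rows of the table together: 0.730363 = 1 − 0.269637.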

# Note

This is the last part of our blog series. I hope you now understand the whole scenario, and that it will be beneficial for newbies as well as professionals.