Using gradient descent for curve fitting in Python

I am trying to write code that fits the curve M = A*L^b using both gradient descent and the least squares method. I took logarithms of both sides and set ln M = y, ln L = x and a = ln A, so the problem becomes fitting the line f(x) = b*x + a. The data file I am using is [data file][1]; the first column is the length L and the second is the weight M. I wrote code that fits the curve with the least squares method and it came out great. The result is shown in the following picture: [![curve-fit using least squares method][2]][2]. The a and b coefficients I got are -12.28937227761337 and 3.138974848045538.
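For reference, here is a minimal sketch of what that least-squares fit can look like on the log-transformed data (np.polyfit is just a stand-in for whatever least-squares routine is used; my actual script may differ slightly):

import numpy as np
import matplotlib.pyplot as plt

data = np.loadtxt("/content/drive/MyDrive/Colab Notebooks/fish.txt")
L = data[:, 0]
M = data[:, 1]
X = np.log(L)
Y = np.log(M)

# Ordinary least squares on the log-log data: polyfit returns the
# highest power first, so this gives the slope b and the intercept a = ln(A).
b_ls, a_ls = np.polyfit(X, Y, 1)
print(a_ls, b_ls)

# Plot the fitted curve back in the original (L, M) coordinates.
plt.scatter(L, M, label="data")
L_grid = np.linspace(L.min(), L.max(), 200)
plt.plot(L_grid, np.exp(a_ls) * L_grid**b_ls, color="red", label="least-squares fit")
plt.legend()
plt.show()
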
Then I tried the gradient descent method by implementing this code:

import numpy as np
import matplotlib.pyplot as plt

data=np.loadtxt("/content/drive/MyDrive/Colab Notebooks/fish.txt")
L=data[:,0]
M=data[:,1]
X=np.log(data[:,0])
Y=np.log(data[:,1])
plt.scatter(L,M)
a = 0
b = 0
learning_rate = 0.0001  # The learning Rate
epochs = 1000  # The number of iterations to perform gradient descent

n = float(len(X)) # Number of elements in X
A=np.exp(a)
# Performing Gradient Descent 
for i in range(len(X)): 
    Y_pred = b*X+a  # The current predicted value of Y
    D_b = (-2/n) * sum(X * (Y - Y_pred))  # Derivative wrt b
    D_a = (-2/n) * sum(Y- Y_pred)  # Derivative wrt a
    b = b - learning_rate * D_b  # Update b
    a = a - learning_rate * D_a  # Update a
    
print (a, b)

However, the result I got for a and b is 0.03524500694986668 and 0.3671228276778898, which does not seem right at all. The coefficients from gradient descent and from the least squares method should be the same, right? Can you help me understand what is wrong with my code?

[1]: https://drive.google.com/file/d/1Pku2ZPiArgB5zSSBC70Xi2aJRpodaXZz/view?usp=sharing
[2]: https://i.stack.imgur.com/ecyxf.png
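For what it is worth, here is a minimal sketch of a convergence check that could be added on top of the same loop (this is not part of my original script; it just records the mean squared error at every iteration so the progress of the descent can be seen):

import numpy as np
import matplotlib.pyplot as plt

data = np.loadtxt("/content/drive/MyDrive/Colab Notebooks/fish.txt")
X = np.log(data[:, 0])
Y = np.log(data[:, 1])

a = 0.0
b = 0.0
learning_rate = 0.0001
n = float(len(X))

costs = []
for i in range(len(X)):  # same loop bounds as in my script above
    Y_pred = b * X + a
    costs.append(np.mean((Y - Y_pred) ** 2))  # mean squared error before this update
    D_b = (-2 / n) * np.sum(X * (Y - Y_pred))
    D_a = (-2 / n) * np.sum(Y - Y_pred)
    b = b - learning_rate * D_b
    a = a - learning_rate * D_a

plt.plot(costs)
plt.xlabel("iteration")
plt.ylabel("mean squared error on the log data")
plt.show()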


