Gradients not changing in co-ordinate descent for logistic regression


I am trying to implement a co-ordinate descent algorithm for logistic regression. My gradients are not changing, and as a result I end up updating the same single co-ordinate in every epoch. Here is the code:

import numpy as np

class LogisticRegression():

  def coordinate(self, x, y, pred):
    # choose the co-ordinate with the largest absolute gradient
    grads = self.gradient(x, y, pred)
    grads = abs(grads)
    # print(grads, np.argmax(grads))

    return np.argmax(grads)

  def gradient(self, x, y, pred):
    # gradient of the logistic loss with respect to the weights
    dw = np.dot(x.T, (pred - y))
    dw = np.mean(dw.T, axis=0)
    # print(dw)
    return dw

Am I doing something wrong? Thank you!
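If x has shape (n_samples, n_features) and y and pred are 1-D arrays of shape (n_samples,), which seems likely here, then np.dot(x.T, pred - y) is already 1-D with one entry per feature, and np.mean(dw.T, axis=0) averages across the features, collapsing the gradient to a single scalar. np.argmax of a scalar is always 0, which would explain why the same co-ordinate is picked every epoch. A minimal standalone sketch under that shape assumption (the toy arrays below are made up purely for illustration):

import numpy as np

# toy data: 5 samples, 3 features
rng = np.random.default_rng(0)
x = rng.standard_normal((5, 3))
y = np.array([0., 1., 0., 1., 1.])
pred = np.full(5, 0.5)                     # dummy predictions

dw = np.dot(x.T, pred - y)                 # shape (3,): one entry per feature
collapsed = np.mean(dw.T, axis=0)          # 0-d scalar: averages across features
print(collapsed.shape)                     # () -- so argmax is always 0
print(np.argmax(abs(collapsed)))           # 0, every time

fixed = np.dot(x.T, pred - y) / x.shape[0] # mean over samples instead
print(fixed.shape)                         # (3,): one entry per co-ordinate
print(np.argmax(abs(fixed)))               # a real per-feature argmax

Under that assumption, the fix inside gradient would be to average over samples by dividing the summed gradient by x.shape[0], i.e. dw = np.dot(x.T, (pred - y)) / x.shape[0], and to drop the np.mean line; the returned vector then has one entry per co-ordinate, and np.argmax(abs(grads)) can vary from epoch to epoch as the weights update. If pred and y are instead (n_samples, 1) column vectors, the original code returns a per-feature sum rather than a mean, so the same division is still the safer choice.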
