
Commit

Add files via upload
eskinderit authored Nov 8, 2021
0 parents commit 7b676f2
Showing 3 changed files with 834 additions and 0 deletions.
812 changes: 812 additions & 0 deletions Gradient Descent.ipynb


Binary file added Gradient Descent.pdf
22 changes: 22 additions & 0 deletions utils/backtracking.py
@@ -0,0 +1,22 @@
import numpy as np

def backtracking(f, grad_f, x):
    """
    A simple implementation of the backtracking line search for the
    GD (Gradient Descent) method.

    f: function. The objective function that we want to minimize.
    grad_f: function. The gradient of f(x).
    x: ndarray. The current iterate x_k.
    """
    alpha = 1
    c = 0.8
    tau = 0.25

    # Shrink alpha until the sufficient decrease (Armijo) condition holds:
    # f(x - alpha * grad_f(x)) <= f(x) - c * alpha * ||grad_f(x)||^2
    while f(x - alpha * grad_f(x)) > f(x) - c * alpha * np.linalg.norm(grad_f(x), 2) ** 2:
        alpha = tau * alpha

        # Safeguard: stop shrinking once the step length becomes negligible.
        if alpha < 1e-5:
            break

    return alpha
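For context, a minimal sketch of how this helper might be called from a plain gradient descent loop (the quadratic objective `f`, its gradient `grad_f`, the starting point `x0`, and the iteration cap and tolerance are illustrative assumptions, not part of the committed code):

```python
import numpy as np

from utils.backtracking import backtracking  # assumes utils/ is importable as a package

# Illustrative quadratic objective f(x) = 0.5 * x^T A x - b^T x (assumed, not from the repo)
A = np.array([[3.0, 1.0], [1.0, 2.0]])
b = np.array([1.0, -1.0])

f = lambda x: 0.5 * x @ A @ x - b @ x
grad_f = lambda x: A @ x - b

x = np.zeros(2)            # starting point x0 (assumed)
for k in range(1000):      # iteration cap (assumed)
    g = grad_f(x)
    if np.linalg.norm(g) < 1e-8:        # stop when the gradient is small
        break
    alpha = backtracking(f, grad_f, x)  # step length from the committed helper
    x = x - alpha * g

print("approximate minimizer:", x)
```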
