
Question


Problem 4
(a) Compute one iteration of the gradient descent algorithm for
f(x, y) = (x - 2)² + (y + x)² + 2
starting at [1,-1] with a learning rate of 1/2.
(b) Run the gradient descent algorithm starting from [1,-1] with learning rate set to 1/10
and a threshold of 0.001. How many iterations does it take for the algorithm to
terminate and to what vector [x,y] does the algorithm converge?
(c) Is [2,-2] a critical point of f(x, y)?
(d) Do you think [2,-2] is a global minimum of f(x, y)? Explain why or why not.
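The iterations described above can be sketched in Python. The gradient is ∇f = (2(x - 2) + 2(y + x), 2(y + x)), and the stopping rule below assumes "threshold" means the Euclidean norm of the gradient falling below 0.001, a common convention that the problem statement does not pin down.

```python
import math

def grad_f(x, y):
    # f(x, y) = (x - 2)**2 + (y + x)**2 + 2
    # df/dx = 2(x - 2) + 2(y + x),  df/dy = 2(y + x)
    return (2 * (x - 2) + 2 * (y + x), 2 * (y + x))

def gradient_descent(x, y, lr, threshold=None, max_iters=10_000):
    """Plain gradient descent on f; stops when the gradient norm drops
    below `threshold` (assumed convention) or after max_iters steps.
    Returns the final point and the number of iterations performed."""
    for i in range(max_iters):
        gx, gy = grad_f(x, y)
        if threshold is not None and math.hypot(gx, gy) < threshold:
            return (x, y), i
        x, y = x - lr * gx, y - lr * gy
    return (x, y), max_iters

# Part (a): one step from [1, -1] with learning rate 1/2.
# grad_f(1, -1) = (-2, 0), so the update gives [2, -1].
point_a, _ = gradient_descent(1, -1, lr=0.5, max_iters=1)

# Part (b): learning rate 1/10, threshold 0.001.
point_b, iters_b = gradient_descent(1, -1, lr=0.1, threshold=0.001)
```

Under this stopping convention, part (b) converges to approximately [2, -2]. That point answers (c) and (d) as well: both partial derivatives vanish there, so it is a critical point, and since f is a sum of two squares plus 2, f ≥ 2 everywhere with equality exactly at [2, -2], making it the global minimum.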