
Question
Asked Mar 5, 2020
1 view

Consider basic gradient descent (Algorithm 1) and Newton’s algorithm (Algorithm 2) applied to the data in the tables.

(a) Apply both to the three-dimensional data in categories ω1 and ω3. For gradient descent, use η(k) = 0.1. Plot the criterion function as a function of the iteration number.

(b) Estimate the total number of mathematical operations in the two algorithms.

(c) Plot the convergence time versus learning rate. What is the minimum learning rate that fails to lead to convergence?
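
Here is a minimal sketch of how parts (a) and (c) could be set up. Two things are assumptions, not part of the question: the table data is not reproduced here, so small random point clouds stand in for the ω1 and ω3 samples, and the criterion the two algorithms minimize is taken to be an averaged squared-error criterion J(a) = (1/n)‖Ya − b‖², since the question does not restate which criterion Algorithms 1 and 2 use. With the real data and criterion, only the data loading and the J / gradient / Hessian definitions would change.

```python
# Sketch only: random placeholder data and an assumed squared-error criterion.
import numpy as np
import matplotlib.pyplot as plt

rng = np.random.default_rng(0)
w1 = rng.normal(0.0, 0.5, size=(10, 3))    # placeholder for the omega_1 points
w3 = rng.normal(1.0, 0.5, size=(10, 3))    # placeholder for the omega_3 points

# Augment each sample with a bias component and negate the omega_3 rows,
# so a single weight vector a should give Y a > 0 for every row.
Y = np.vstack([np.hstack([np.ones((len(w1), 1)), w1]),
               -np.hstack([np.ones((len(w3), 1)), w3])])
b = np.ones(len(Y))          # target margins
n = len(Y)

def J(a):                    # assumed criterion: averaged squared error
    return float(np.sum((Y @ a - b) ** 2) / n)

def grad_J(a):               # gradient of J
    return 2.0 / n * (Y.T @ (Y @ a - b))

H = 2.0 / n * (Y.T @ Y)      # Hessian (constant, since J is quadratic in a)

def gradient_descent(eta, theta=1e-6, max_iter=2000):
    """Basic gradient descent: a <- a - eta * grad J(a)."""
    a = np.zeros(Y.shape[1])
    history = [J(a)]
    for _ in range(max_iter):
        step = eta * grad_J(a)
        a = a - step
        history.append(J(a))
        if not np.isfinite(history[-1]):   # diverged for this learning rate
            break
        if np.linalg.norm(step) < theta:   # converged
            break
    return a, history

def newton(theta=1e-6, max_iter=100):
    """Newton's algorithm: a <- a - H^{-1} grad J(a)."""
    a = np.zeros(Y.shape[1])
    history = [J(a)]
    for _ in range(max_iter):
        step = np.linalg.solve(H, grad_J(a))
        a = a - step
        history.append(J(a))
        if np.linalg.norm(step) < theta:
            break
    return a, history

# (a) criterion value versus iteration number for both algorithms
_, hist_gd = gradient_descent(eta=0.1)
_, hist_nt = newton()
plt.semilogy(hist_gd, label="gradient descent, eta = 0.1")
plt.semilogy(hist_nt, label="Newton")
plt.xlabel("iteration k")
plt.ylabel("J(a(k))")
plt.legend()
plt.show()

# (c) convergence time (iterations used) versus learning rate; runs that hit
# the iteration cap or blow up mark learning rates that fail to converge.
etas = np.linspace(0.01, 1.0, 60)
iters = [len(gradient_descent(eta)[1]) - 1 for eta in etas]
plt.plot(etas, iters)
plt.xlabel("learning rate eta")
plt.ylabel("iterations to converge")
plt.show()
```

For part (b), a rough per-iteration count is usually what is wanted: with n samples in d̂ = d + 1 augmented dimensions, evaluating the gradient costs on the order of n·d̂ multiply–adds, so one gradient-descent step is O(n·d̂); a Newton step additionally needs the d̂ × d̂ Hessian (O(n·d̂²) in general, though it is constant for a quadratic criterion) and the solution of a d̂ × d̂ linear system (O(d̂³)). Newton is therefore more expensive per iteration but typically needs far fewer iterations.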

 
