
Question

Complete the linear regression math using Python in Google Colab.

Your code should start wherever you see (#YOUR CODE HERE).

Task 1
Complete the gradient_descent function below. You do not need to work on the graphs below. If the function is finished properly, you should
see logical graphs as outputs.
[ ] def gradient_descent(b_val, m_val, x_val, y_val, learning_rate, num_iterations):
        # Defining the error function
        x, y, m, b, n = sp.symbols('x y m b n')
        n_val = float(len(x_val))
        error_function = #YOUR CODE HERE
        # Calculate the partial derivatives
        error_function_b = #YOUR CODE HERE
        error_function_m = #YOUR CODE HERE
        # Repeat for num_iterations
        for j in range(num_iterations):
            b_gradient = 0
            m_gradient = 0
            for i in range(#YOUR CODE HERE):
                b_gradient = #YOUR CODE HERE
                m_gradient = #YOUR CODE HERE
            # Update the values for b and m
            b_val = #YOUR CODE HERE
            m_val = #YOUR CODE HERE
        return [b_val, m_val]
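For orientation, the loop the skeleton builds toward can be sketched in plain Python, assuming the error function is the mean squared error E = (1/n) Σ (y_i − (m·x_i + b))². The gradients used below and the helper name gradient_descent_sketch are illustrative assumptions, not the graded solution:

```python
def gradient_descent_sketch(b_val, m_val, x_val, y_val, learning_rate, num_iterations):
    """Fit y = m*x + b by gradient descent on the mean squared error (assumed)."""
    n_val = float(len(x_val))
    for _ in range(num_iterations):
        b_gradient = 0.0
        m_gradient = 0.0
        for i in range(len(x_val)):
            # Residual of the current line at point i
            residual = y_val[i] - (m_val * x_val[i] + b_val)
            b_gradient += -(2 / n_val) * residual              # dE/db
            m_gradient += -(2 / n_val) * x_val[i] * residual   # dE/dm
        # Step against the gradient
        b_val -= learning_rate * b_gradient
        m_val -= learning_rate * m_gradient
    return [b_val, m_val]
```

On data lying exactly on y = 2x (e.g. x = [1, 2, 3, 4], y = [2, 4, 6, 8]), a learning rate of 0.05 over a few thousand iterations drives m toward 2 and b toward 0.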
Initially we take m and b to be 0, which will produce fairly inaccurate predictions.
[ ] m_val = 0
    b_val = 0
    predictions = [(m_val * X[i]) + b_val for i in range(len(X))]
    plt.scatter(X, Y)
    plt.plot(X, predictions, color='r')
Here, we will use 2 iterations to see that our predictions have improved slightly.
[ ] vals = gradient_descent(0, 0, X, Y, .0001, 2)
    m_val = vals[1]
    b_val = vals[0]
    predictions = [(m_val * X[i]) + b_val for i in range(len(X))]
    plt.scatter(X, Y)
    plt.plot(X, predictions, color='r')
Now we will perform 10 iterations. We should see very accurate results here.
[ ] vals = gradient_descent(0, 0, X, Y, .0001, 10)
    m_val = vals[1]
    b_val = vals[0]
    predictions = [(m_val * X[i]) + b_val for i in range(len(X))]
    plt.scatter(X, Y)
    plt.plot(X, predictions, color='r')
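The claim that more iterations give more accurate results can also be checked without plots, by comparing the mean squared error after different iteration counts. Everything here is illustrative: the fit_line and mse helpers and the synthetic X, Y data are assumptions, with fit_line using the same MSE update rule the task's gradient_descent is meant to implement:

```python
def fit_line(b_val, m_val, x_val, y_val, learning_rate, num_iterations):
    """Gradient descent on the mean squared error for y = m*x + b (assumed rule)."""
    n = float(len(x_val))
    for _ in range(num_iterations):
        grad_b = sum(-(2 / n) * (y_val[i] - (m_val * x_val[i] + b_val))
                     for i in range(len(x_val)))
        grad_m = sum(-(2 / n) * x_val[i] * (y_val[i] - (m_val * x_val[i] + b_val))
                     for i in range(len(x_val)))
        b_val -= learning_rate * grad_b
        m_val -= learning_rate * grad_m
    return b_val, m_val

def mse(b_val, m_val, x_val, y_val):
    """Mean squared error of the line (b_val, m_val) on the data."""
    n = float(len(x_val))
    return sum((y_val[i] - (m_val * x_val[i] + b_val)) ** 2
               for i in range(len(x_val))) / n

# Made-up data lying roughly on y = 2x
X = [1.0, 2.0, 3.0, 4.0, 5.0]
Y = [2.1, 3.9, 6.2, 8.0, 9.8]
for iters in (2, 10, 100):
    b, m = fit_line(0, 0, X, Y, 0.01, iters)
    print(iters, mse(b, m, X, Y))
```

With a stable learning rate, the printed error shrinks as the iteration count grows, which is what the two plots above show visually.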