
Question

See https://pytorch.org/docs/stable/generated/torch.nn.CrossEntropyLoss.html for details. It expects an input of size Nsamples x Nclasses containing un-normalized logits, and a target, which is y_train_l of size Nsamples x 1. You may also feed it y_train_T of size Nsamples x Nclasses. Please see the documentation.
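As a small sanity check of the two target shapes the documentation describes, the sketch below feeds `CrossEntropyLoss` both a class-index target and a one-hot probability target (the tensor names are illustrative, not the notebook's own; probability targets require PyTorch 1.10 or newer):

```python
import torch

# Toy example: 4 samples, 3 classes. CrossEntropyLoss takes raw logits
# of shape (Nsamples, Nclasses) plus either class indices of shape
# (Nsamples,) or class probabilities of shape (Nsamples, Nclasses).
logits = torch.randn(4, 3)                       # un-normalized scores
target_idx = torch.tensor([0, 2, 1, 2])          # class-index targets
target_prob = torch.nn.functional.one_hot(target_idx, 3).float()

ce = torch.nn.CrossEntropyLoss()
loss_idx = ce(logits, target_idx)
loss_prob = ce(logits, target_prob)              # one-hot probability targets
print(torch.allclose(loss_idx, loss_prob))       # True: same loss value
```

For one-hot targets the two forms give the same loss, so either y_train_l or y_train_T should work here.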

Cross entropy loss expects raw un-normalized scores. Softmax converts raw un-normalized scores to probabilities, which are used to plot the labels.
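To see why the softmax is only for plotting, note that `CrossEntropyLoss` applies log-softmax internally, so it must receive the raw logits, not probabilities. A minimal check (values here are made up):

```python
import torch

# CrossEntropyLoss applies log-softmax internally; apply Softmax
# separately only when you want probabilities, e.g. for plotting.
logits = torch.tensor([[2.0, 0.5, -1.0]])
probs = torch.softmax(logits, dim=1)
print(probs.sum(dim=1))                  # each row sums to 1

ce = torch.nn.CrossEntropyLoss()
target = torch.tensor([0])
manual = -torch.log(probs[0, 0])         # cross entropy computed by hand
print(torch.allclose(ce(logits, target), manual))  # True
```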

Use SGD and run 20,000 epochs with a learning rate of 1e-2 to train the neural network.

code:

net = NeuralNet(Nfeatures,Nclasses,20,20).to(device)  # named net, to match optimizer below
sm = torch.nn.Softmax(dim=1)

# Weight the cross entropy loss to balance the classes
Nsamples_per_class = y_train_T.sum(axis=0)
Weight = Nsamples_per_class.sum()/Nsamples_per_class
loss = torch.nn.CrossEntropyLoss(weight=Weight)
learning_rate = 0.01
#YOUR CODE HERE
optimizer= optim.SGD(net.parameters(), lr=learning_rate) # define optimizer

for epoch in range(20000):
    
    #YOUR CODE HERE
    predNN = ??   # Forward pass
    error = ??    # find the loss
    optimizer.zero_grad()              # clear the gradients
    error.backward()              # Send loss backward
    optimizer.step()              # update weights 

    if epoch % 5000 == 0:
      print("Error =",error.detach().cpu().item())
      fig,ax = plt.subplots(1,2,figsize=(12,4))
      ax[0].plot(y_train_T[0:40].detach().cpu())
      ax[1].plot(sm(predNN[0:40]).detach().cpu())
      plt.show()

I need help with the predNN and error lines.
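For reference, here is one way the two placeholders could be filled in: the forward pass produces the logits, and the loss compares them to the class labels. The model, input tensor, and labels below are tiny stand-ins (the assignment's `NeuralNet`, training inputs, and `device` are not shown above), so this is a sketch of the loop structure, not the notebook's exact code:

```python
import torch

# Dummy data standing in for the notebook's training set.
torch.manual_seed(0)
Nfeatures, Nclasses = 5, 3
X_train_T = torch.randn(40, Nfeatures)            # hypothetical input name
y_train_l = torch.randint(0, Nclasses, (40,))     # class-index labels

net = torch.nn.Sequential(                        # stands in for NeuralNet(...)
    torch.nn.Linear(Nfeatures, 20), torch.nn.ReLU(),
    torch.nn.Linear(20, Nclasses))
loss = torch.nn.CrossEntropyLoss()
optimizer = torch.optim.SGD(net.parameters(), lr=0.01)

for epoch in range(200):                          # 20,000 in the assignment
    predNN = net(X_train_T)                       # forward pass: raw logits
    error = loss(predNN, y_train_l)               # cross entropy vs labels
    optimizer.zero_grad()                         # clear old gradients
    error.backward()                              # back-propagate the loss
    optimizer.step()                              # update weights
```

Note the backward call goes on the loss tensor itself (`error.backward()`), not `backward.loss()` as written in the question.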
