Homework4_LR.pdf
School: Arizona State University
Course: 405
Subject: Electrical Engineering
Date: Dec 6, 2023
Pages: 3

EEE 498 ML with Application to FPGAs

All homework is handed in online. Don't hand in pictures of the code files; hand in the actual code files so I can see whether they work. Hand in images of the outputs of your code (displays, plots, ...) converted to PDF. If there are questions to be answered, answer them in PowerPoint, Word, or on paper and convert to PDF (it doesn't actually matter how you get to the PDF). Organize your work so that the answer to each problem can be easily identified. You will not get credit if your submission is poorly organized or the grader cannot find the answers. You can use any language (Python, C, C++, MATLAB, ...) for the coding, though Python is likely the easiest and the easiest to translate from the lectures. Be careful copying code from Office products into Spyder or other editors; some characters, such as single quotes, are changed. Sometimes invisible characters occur, causing an error on a line until you completely retype it.

Homework 4

1) Starting with the least-squares cost function, find the first and second derivatives. Show your work.

2) Show that the derivative of the sigmoid function sigma(t) = 1/(1 + exp(-t)) is sigma'(t) = sigma(t)(1 - sigma(t)). Show all steps.

3) Using the cross-entropy cost function, prove the equation for the gradient given in the lecture. Show all steps.

4) Write your own cross-entropy cost function logistic regression algorithm. Use the iris dataset as shown below. I recommend stochastic gradient descent.
from sklearn import datasets
iris = datasets.load_iris()
X = iris.data[:, 0:4]
y = iris.target

You can start by simplifying this to the first two sets of 50 points so there are only two classes. Then expand the code to multiclass with all 150 points and 3 classes. To do that you will have to one-hot code the target y and use argmax to determine the prediction.

Useful code:

##
## Convert to compact form; in compact form
## the first weight is the intercept.
##
Fones = np.ones(Nobs, dtype=float)
X_std_ = np.column_stack((Fones, X_std))

##
## Activation (numerically clipped sigmoid)
##
def activation(Z):
    act = np.ones(len(Z), float)
    for i, z in enumerate(Z):
        act[i] = 1. / (1. + np.exp(-np.clip(z, -250, 250)))
    return act

##
## Convert to multiclass one-hot coding
##
def onehoty(y):
    from sklearn.preprocessing import OneHotEncoder
    # Note: in scikit-learn >= 1.2 this keyword is sparse_output=False
    onehot_encoder = OneHotEncoder(sparse=False)
    integer_encoded = y.reshape(len(y), 1)
    onehot_encoded = onehot_encoder.fit_transform(integer_encoded)
    return onehot_encoded
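As one possible starting point for problem 4, here is a minimal sketch of the binary (two-class) case trained with stochastic gradient descent on the first 100 iris points. It uses the per-sample cross-entropy gradient (p - y)*x; the function names `fit_sgd` and `predict` are illustrative, not part of the assignment.

```python
import numpy as np
from sklearn import datasets

def activation(z):
    # Numerically clipped sigmoid, matching the assignment's helper.
    return 1.0 / (1.0 + np.exp(-np.clip(z, -250, 250)))

def fit_sgd(X, y, eta=0.1, epochs=50, seed=0):
    # Stochastic gradient descent on the cross-entropy cost.
    rng = np.random.default_rng(seed)
    Xc = np.column_stack((np.ones(len(X)), X))  # compact form: w[0] is the intercept
    w = np.zeros(Xc.shape[1])
    for _ in range(epochs):
        for i in rng.permutation(len(Xc)):
            p = activation(Xc[i] @ w)
            # Gradient of the cross-entropy cost for one sample: (p - y) * x
            w -= eta * (p - y[i]) * Xc[i]
    return w

def predict(w, X):
    Xc = np.column_stack((np.ones(len(X)), X))
    return (activation(Xc @ w) >= 0.5).astype(int)

iris = datasets.load_iris()
X, y = iris.data[:100, 0:4], iris.target[:100]  # first two classes only
w = fit_sgd(X, y)
acc = np.mean(predict(w, X) == y)
print(f"training accuracy: {acc:.2f}")
```

The first two iris classes are linearly separable, so this sketch should separate them cleanly; extending it to all three classes means one-hot coding y (as in `onehoty` above) and taking an argmax over per-class probabilities.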