
93 - Assignment Module Review Questions and Exercise

Coursera: Machine Learning (Week 3) [Assignment Solution] - Andrew NG

Implementing logistic regression and applying it to two different datasets.

I have recently completed the Machine Learning course from Coursera by Andrew NG.

While doing the course we have to go through various quizzes and assignments.

Here, I am sharing my solutions for the weekly assignments throughout the class.

These solutions are for reference only.

> It is recommended that you first try to solve the assignments honestly by yourself; only then does completing them for the grade make sense.

> Only in case you get stuck in between, feel free to refer to the solutions provided here.

NOTE:

Don't simply copy-paste the code for the sake of completion.

Even if you copy the code, make sure you understand it first.

Click here to check out the week-2 assignment solutions. Scroll down for the solutions for the week-3 assignment.

In this exercise, you will implement logistic regression and apply it to two different datasets. Before starting on the programming exercise, we strongly recommend watching the video lectures and completing the review questions for the associated topics.

It consists of the following files:

  • ex2.m - Octave/MATLAB script that steps you through the exercise
  • ex2_reg.m - Octave/MATLAB script for the later parts of the exercise
  • ex2data1.txt - Training set for the first half of the exercise
  • ex2data2.txt - Training set for the second half of the exercise
  • submit.m - Submission script that sends your solutions to our servers
  • mapFeature.m - Function to generate polynomial features
  • plotDecisionBoundary.m - Function to plot classifier's decision boundary
  • [*] plotData.m - Function to plot 2D classification data
  • [*] sigmoid.m - Sigmoid Function
  • [*] costFunction.m - Logistic Regression Cost Function
  • [*] predict.m - Logistic Regression Prediction Function
  • [*] costFunctionReg.m - Regularized Logistic Regression Cost
  • Video - YouTube videos featuring Free IOT/ML tutorials

* indicates files you will need to complete


plotData.m :

function plotData(X, y)
%PLOTDATA Plots the data points X and y into a new figure
%   PLOTDATA(X,y) plots the data points with + for the positive examples
%   and o for the negative examples. X is assumed to be a Mx2 matrix.

% ====================== YOUR CODE HERE ======================
% Instructions: Plot the positive and negative examples on a
%               2D plot, using the option 'k+' for the positive
%               examples and 'ko' for the negative examples.
%

% Separating positive and negative results
pos = find(y == 1);   % indices of positive results
neg = find(y == 0);   % indices of negative results

% Create New Figure
figure;

% Plotting positive results:
%    X_axis: Exam1 Score = X(pos,1)
%    Y_axis: Exam2 Score = X(pos,2)
plot(X(pos,1), X(pos,2), 'g+');

hold on;   % To keep the above plotted graph as it is

% Plotting negative results:
%    X_axis: Exam1 Score = X(neg,1)
%    Y_axis: Exam2 Score = X(neg,2)
plot(X(neg,1), X(neg,2), 'ro');

% =========================================================================
hold off;

end
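For readers following along outside Octave, the positive/negative split in plotData.m can be sketched in Python with NumPy. The toy exam scores and labels below are made up for illustration, not taken from ex2data1.txt:

```python
import numpy as np

# Hypothetical toy data standing in for ex2data1.txt:
# two exam scores per row, y = 1 (admitted) or 0 (not admitted).
X = np.array([[34.6, 78.0],
              [60.2, 86.3],
              [79.0, 75.3],
              [45.1, 56.3]])
y = np.array([0, 1, 1, 0])

# Same separation as find(y == 1) / find(y == 0) in plotData.m
pos = np.where(y == 1)[0]   # row indices of positive examples
neg = np.where(y == 0)[0]   # row indices of negative examples

# With matplotlib one would then plot X[pos, 0] vs X[pos, 1] with '+'
# markers and X[neg, 0] vs X[neg, 1] with 'o' markers, mirroring the
# Octave plot calls above.
```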

sigmoid.m :

function g = sigmoid(z)
%SIGMOID Compute sigmoid function
%   g = SIGMOID(z) computes the sigmoid of z.

% You need to return the following variables correctly
g = zeros(size(z));

% ====================== YOUR CODE HERE ======================
% Instructions: Compute the sigmoid of each value of z (z can be a matrix,
%               vector or scalar).

g = 1 ./ (1 + exp(-z));
% =============================================================
end
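The same element-wise sigmoid as a minimal NumPy sketch:

```python
import numpy as np

def sigmoid(z):
    """Element-wise sigmoid, mirroring sigmoid.m: g = 1 ./ (1 + exp(-z))."""
    z = np.asarray(z, dtype=float)
    return 1.0 / (1.0 + np.exp(-z))
```

Because the division and exponential are element-wise, the same function handles scalars, vectors, and matrices, just like the Octave version.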

costFunction.m :

function [J, grad] = costFunction(theta, X, y)
%COSTFUNCTION Compute cost and gradient for logistic regression
%   J = COSTFUNCTION(theta, X, y) computes the cost of using theta as the
%   parameter for logistic regression and the gradient of the cost
%   w.r.t. to the parameters.

% Initialize some useful values
m = length(y); % number of training examples

% You need to return the following variables correctly
J = 0;
grad = zeros(size(theta));

% ====================== YOUR CODE HERE ======================
% Instructions: Compute the cost of a particular choice of theta.
%               You should set J to the cost.
%               Compute the partial derivatives and set grad to the partial
%               derivatives of the cost w.r.t. each parameter in theta
%
% Note: grad should have the same dimensions as theta
%
% DIMENSIONS:
%   theta = (n+1) x 1
%   X     = m x (n+1)
%   y     = m x 1
%   grad  = (n+1) x 1
%   J     = Scalar

z   = X * theta;   % m x 1
h_x = sigmoid(z);  % m x 1

J = (1/m) * sum((-y .* log(h_x)) - ((1 - y) .* log(1 - h_x))); % scalar

grad = (1/m) * (X' * (h_x - y)); % (n+1) x 1

% =============================================================
end
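The vectorized cost and gradient translate almost line-for-line to NumPy. This sketch assumes X already carries the intercept column of ones, as in the Octave scripts:

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def cost_function(theta, X, y):
    """Vectorized logistic-regression cost and gradient (cf. costFunction.m).

    theta: (n+1,)  X: (m, n+1)  y: (m,)
    """
    m = len(y)
    h = sigmoid(X @ theta)                                    # m-vector hypothesis
    J = (1.0 / m) * np.sum(-y * np.log(h) - (1 - y) * np.log(1 - h))
    grad = (1.0 / m) * (X.T @ (h - y))                        # same shape as theta
    return J, grad
```

With theta initialized to zeros, the hypothesis is 0.5 for every example, so the cost comes out to log(2) ≈ 0.693, a handy sanity check.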

Check out our free tutorials on IOT (Internet of Things):



predict.m :

function p = predict(theta, X)
%PREDICT Predict whether the label is 0 or 1 using learned logistic
%regression parameters theta
%   p = PREDICT(theta, X) computes the predictions for X using a
%   threshold at 0.5 (i.e., if sigmoid(theta'*x) >= 0.5, predict 1)

m = size(X, 1); % Number of training examples

% You need to return the following variables correctly
p = zeros(m, 1);

% ====================== YOUR CODE HERE ======================
% Instructions: Complete the following code to make predictions using
%               your learned logistic regression parameters.
%               You should set p to a vector of 0's and 1's
%
% DIMENSIONS:
%   X     = m x (n+1)
%   theta = (n+1) x 1

h_x = sigmoid(X * theta);
p = (h_x >= 0.5);

% p = double(sigmoid(X * theta) >= 0.5);
% =========================================================================
end
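The 0.5-threshold prediction as a NumPy sketch (the theta and X used in the usage note are illustrative only, not learned parameters from the exercise):

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def predict(theta, X):
    """Threshold the hypothesis at 0.5, as in predict.m: 1 if h >= 0.5, else 0."""
    return (sigmoid(X @ theta) >= 0.5).astype(float)
```

For example, with theta = [-1, 1] and inputs 0, 1, 2 (plus the intercept column), the raw scores are -1, 0, 1, giving predictions 0, 1, 1 since sigmoid(0) = 0.5 falls on the >= side of the threshold.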

costFunctionReg.m :

function [J, grad] = costFunctionReg(theta, X, y, lambda)
%COSTFUNCTIONREG Compute cost and gradient for logistic regression with regularization
%   J = COSTFUNCTIONREG(theta, X, y, lambda) computes the cost of using
%   theta as the parameter for regularized logistic regression and the
%   gradient of the cost w.r.t. to the parameters.

% Initialize some useful values
m = length(y); % number of training examples

% You need to return the following variables correctly
J = 0;
grad = zeros(size(theta));

% ====================== YOUR CODE HERE ======================
% Instructions: Compute the cost of a particular choice of theta.
%               You should set J to the cost.
%               Compute the partial derivatives and set grad to the partial
%               derivatives of the cost w.r.t. each parameter in theta
%
% DIMENSIONS:
%   theta = (n+1) x 1
%   X     = m x (n+1)
%   y     = m x 1
%   grad  = (n+1) x 1
%   J     = Scalar

z   = X * theta;   % m x 1
h_x = sigmoid(z);  % m x 1

reg_term = (lambda / (2*m)) * sum(theta(2:end) .^ 2);

J = (1/m) * sum((-y .* log(h_x)) - ((1 - y) .* log(1 - h_x))) + reg_term; % scalar

grad(1)     = (1/m) * (X(:,1)' * (h_x - y));                                 % 1 x 1
grad(2:end) = (1/m) * (X(:,2:end)' * (h_x - y)) + (lambda/m) * theta(2:end); % n x 1

% =============================================================
end
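The regularized version in NumPy. As in costFunctionReg.m, the bias parameter theta(1) (theta[0] in Python's 0-based indexing) is excluded from the penalty in both the cost and the gradient:

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def cost_function_reg(theta, X, y, lam):
    """Regularized logistic-regression cost/gradient (cf. costFunctionReg.m).

    The bias term theta[0] is not regularized.
    """
    m = len(y)
    h = sigmoid(X @ theta)
    reg_term = (lam / (2.0 * m)) * np.sum(theta[1:] ** 2)
    J = (1.0 / m) * np.sum(-y * np.log(h) - (1 - y) * np.log(1 - h)) + reg_term
    grad = (1.0 / m) * (X.T @ (h - y))      # unregularized gradient for all params
    grad[1:] += (lam / m) * theta[1:]       # add the penalty, skipping the bias
    return J, grad
```

Setting lam = 0 (or theta = zeros) reduces this to the unregularized cost_function above, which is another quick way to sanity-check the implementation.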

I tried to provide optimized solutions, like vectorized implementations, for each assignment. If you think more optimization can be done, please suggest the corrections / improvements.

--------------------------------------------------------------------------------

Click here to see solutions for all Machine Learning Coursera Assignments.

&

Click here to see more codes for Raspberry Pi 3 and similar Family.

&

Click here to see more codes for NodeMCU ESP8266  and similar Family.

&

Click here to see more codes for Arduino Mega (ATMega 2560)  and similar Family.

Feel free to ask doubts in the comment section. I will try my best to solve them.

If you find this helpful by any means, like, comment and share the post.

This is the simplest way to encourage me to keep doing such work.

Cheers and Regards,

-Akshay P. Daga



Source: https://www.apdaga.com/2018/06/coursera-machine-learning-week-3.html
