APDaga DumpBox : The Thirst for Learning...


Coursera: Machine Learning (Week 6) [Assignment Solution] - Andrew NG


Recommended Machine Learning Courses:

  • Coursera: Machine Learning
  • Coursera: Deep Learning Specialization
  • Coursera: Machine Learning with Python
  • Coursera: Advanced Machine Learning Specialization
  • Udemy: Machine Learning
  • LinkedIn: Machine Learning
  • Eduonix: Machine Learning
  • edX: Machine Learning
  • Fast.ai: Introduction to Machine Learning for Coders

The following files are included in this exercise:
  • ex5.m - Octave/MATLAB script that steps you through the exercise
  • ex5data1.mat - Dataset
  • submit.m - Submission script that sends your solutions to our servers
  • featureNormalize.m - Feature normalization function
  • fmincg.m - Function minimization routine (similar to fminunc)
  • plotFit.m - Plot a polynomial fit
  • trainLinearReg.m - Trains linear regression using your cost function
  • [*] linearRegCostFunction.m - Regularized linear regression cost function
  • [*] learningCurve.m - Generates a learning curve
  • [*] polyFeatures.m - Maps data into polynomial feature space
  • [*] validationCurve.m - Generates a cross validation curve
  • Video - YouTube videos featuring Free IOT/ML tutorials

linearRegCostFunction.m :
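Below is a minimal illustrative sketch of a regularized linear regression cost and gradient, not necessarily the exact listing posted for this exercise. It excludes the bias parameter theta(1) from regularization by zeroing it in a copy of theta; the comments below discuss an equivalent split into grad(1) and grad(2:end).

function [J, grad] = linearRegCostFunction(X, y, theta, lambda)
  % Number of training examples
  m = length(y);

  % Hypothesis of linear regression
  h = X * theta;

  % Copy of theta with the bias parameter zeroed, so it is not regularized
  theta_reg = [0; theta(2:end)];

  % Cost: mean squared error plus L2 penalty on the non-bias parameters
  J = (1 / (2 * m)) * sum((h - y) .^ 2) + (lambda / (2 * m)) * sum(theta_reg .^ 2);

  % Gradient: unregularized term plus penalty term
  grad = (1 / m) * (X' * (h - y)) + (lambda / m) * theta_reg;
end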

Check out our free tutorials on IoT (Internet of Things).

learningCurve.m :

polyFeatures.m :
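For learningCurve.m, a rough sketch consistent with the approach discussed in the comments below (train on the first i examples only, and evaluate both errors with lambda = 0); trainLinearReg and linearRegCostFunction are the functions from this exercise:

function [error_train, error_val] = learningCurve(X, y, Xval, yval, lambda)
  m = size(X, 1);
  error_train = zeros(m, 1);
  error_val   = zeros(m, 1);

  for i = 1:m
    % Use only the first i training examples in this iteration
    Xtrain = X(1:i, :);
    ytrain = y(1:i);

    % Fit theta with regularization ...
    theta = trainLinearReg(Xtrain, ytrain, lambda);

    % ... but measure both errors without the regularization term
    error_train(i) = linearRegCostFunction(Xtrain, ytrain, theta, 0);
    error_val(i)   = linearRegCostFunction(Xval, yval, theta, 0);
  end
end

And for polyFeatures.m, a sketch that maps a column vector X onto its first p polynomial powers:

function X_poly = polyFeatures(X, p)
  X_poly = zeros(numel(X), p);
  for j = 1:p
    % j-th column holds X raised to the j-th power, element-wise
    X_poly(:, j) = X(:) .^ j;
  end
end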

validationCurve.m :
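A sketch of validationCurve.m along the same lines, using a single loop over the candidate lambda values (as one commenter below also suggests); the lambda_vec values are the ones suggested by the exercise:

function [lambda_vec, error_train, error_val] = validationCurve(X, y, Xval, yval)
  % Candidate regularization values to try
  lambda_vec = [0 0.001 0.003 0.01 0.03 0.1 0.3 1 3 10]';

  error_train = zeros(length(lambda_vec), 1);
  error_val   = zeros(length(lambda_vec), 1);

  for i = 1:length(lambda_vec)
    lambda = lambda_vec(i);

    % Fit theta with the current lambda ...
    theta = trainLinearReg(X, y, lambda);

    % ... but report both errors without the regularization term
    error_train(i) = linearRegCostFunction(X, y, theta, 0);
    error_val(i)   = linearRegCostFunction(Xval, yval, theta, 0);
  end
end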

29 Comments


Thank you for your solutions :) I have 2 questions: 1) I see that the sizes of the test set and validation set are 21x1 each, while that of the training set is only 12x1. Why is the training set smaller than the test and validation sets? 2) Why do we put lambda = 0 while finding error_train and error_val in both the functions learningCurve.m and validationCurve.m?

These codes are not working for me. They're running but not giving any marks.


Is there any other code (from another website) which is working for you? If that's the case, please let me know and I will recheck my code. Otherwise, you must be making a small mistake on your end. Either way, please let me know.

Your code has been working for me. It's not good to just copy and paste; it's good to understand the code. Thanks for your help.

My code is working and shows the correct result, but when I submit the code, the grader does not give marks.

It's difficult for me to tell you what's wrong with your code without checking it. I'm afraid you have to debug your code on your own. Sorry.

Please help.

In learningCurve.m, when I write the same code as given above, it shows me a "division by zero" warning (.944305e-31), called from fmincg at line 102 column 12. Can someone help me fix it?

Hi Akshay, in the polyFeatures.m file, when X is raised to the power of 1, 2, 3, 4 and so on, the X value is divided by a thousand before the power calculation. Why is that?

-15.9368 -29.1530 36.1895 37.4922 -48.0588 -8.9415 15.3078 -34.7063 1.3892 -44.3838 7.0135 22.7627

The above dataset is divided by 1000 in the second iteration. Please help to clarify.

Hello, thank you for your effort. I have a question regarding your solution for linearRegCostFunction: I didn't understand when we need to add the sum function and when we're not supposed to add it. Could you please explain that? Thank you in advance.

If you look at the cost function equations, you have to calculate the (element-wise) square of the difference and then the summation of that. In that case you have to use the "sum" function. In general, if you do matrix multiplication, it already includes the "sum of the products", so a separate "sum" is not required. But if you do element-wise multiplication or a square operation on matrices (indicated by .* or .^ respectively), then you have to do the "sum" operation separately for the summation.
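For example, with h = X*theta and y both m-by-1 column vectors, the two lines below give the same squared-error sum; only the element-wise version needs an explicit "sum":

sq_err = sum((h - y) .^ 2);   % element-wise square, then explicit sum
sq_err = (h - y)' * (h - y);  % the matrix product already sums the products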

Ah, I get it. Thank you so much for this explanation.

Hi - for validationCurve, I don't think you need the 1:m loop... the way it is implemented, only the last iteration of the loop matters. The code below works with only one loop:

len = length(lambda_vec);
for i = 1:len
  lambda = lambda_vec(i);
  [theta] = trainLinearReg(X, y, lambda);
  error_train(i) = linearRegCostFunction(X, y, theta, 0);
  error_val(i) = linearRegCostFunction(Xval, yval, theta, 0);
end

I am not sure if you can help, but I can't find where the problem is. I am getting these answers for the learning curves, and my learning curve is identical to the one in ex5. It shows 0 points though. Can someone help me?

# Training Examples   Train Error   Cross Validation Error
 1    0.000000   205.121096
 2    0.000000   110.300366
 3    3.286595    45.010231
 4    2.842678    48.368911
 5   13.154049    35.865165
 6   19.443963    33.829962
 7   20.098522    31.970986
 8   18.172859    30.862446
 9   22.609405    31.135998
10   23.261462    28.936207
11   24.317250    29.551432
12   22.373906    29.433818

For linearRegCostFunction, I don't understand why there are two grads, grad(1) and grad(2:end)?

And why remove the bias unit column from X? X(:, 2:end)

As per the theory taught in the lectures, we don't apply regularization to the first parameter; regularization is only applied from the 2nd parameter to the end. That's why we break the gradient into two parts, grad(1) and grad(2:end), do the processing separately, and then combine them.
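In code, that split looks roughly like this (assuming h = X*theta has already been computed):

grad = (1 / m) * (X' * (h - y));                           % gradient without regularization
grad(2:end) = grad(2:end) + (lambda / m) * theta(2:end);   % regularize everything except theta(1)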

thanks man. and thank you for sharing your work. really helps people who are stuck.

reg_gradient = lambda/m*(theta(2:end))';
grad = ((1/m)*(h-y)'*X) + reg_gradient;

Here is what I did, by the way. I didn't break the grad into two parts, but it still works. Do you really need to break it into two parts? Am I applying regularization to the first term using this method?

Can you explain this part of the for loop? Xtrain = X(1:i,:); ytrain = y(1:i); How are you getting the x and y values in a for loop for the training curve? I know the cross-validation x and y values are already given, but what's your method of getting values for the training function?

Here, we are increasing the size of the training set in each iteration, from 1 up to m, and plotting the graph of train and test error. Please check the comments given in each function; you will find them helpful. E.g. "you will compute the train and test errors for dataset sizes from 1 up to m. In practice, when working with larger datasets, you might want to do this in larger intervals."

Hi.. can you please help? In validationCurve, why have you put lambda = 0 for error_train(j) and error_val(j)? Why are we not using different values of lambda here? Are different values of lambda only needed for theta?

In validationCurve, we are calculating error_train(j) and error_val(j) without regularization. So, to remove the regularization term, we have set lambda = 0.

Thank you man!! Your solutions are a saviour.

I keep getting nonconformant arguments: "linearRegCostFunction: operator *: nonconformant arguments (op1 is 12x1, op2 is 9x1)". I have even copy-pasted your code for it. What could be the reason?

When you start attempting the work, only run the section you are working on (click on the section, then Run Section). Clicking on "Run" activates other sections and may alter the saved variables in your workspace.

Even I am facing the same problem, sir. Did you figure out how to solve it? If so, please explain it to me.

Have you tried the optional part?

Whenever I try to run the code with (ex5) in MATLAB, it shows "unrecognized function or variable 'ex5'". Please help me resolve this.


