Classification And Overfitting
2017-11-20
Machine-Learning
658
This is a learning note on Logistic Regression from the Machine Learning course by Andrew Ng on Coursera.
Hypothesis Representation
Logistic regression uses the "Sigmoid Function," also called the "Logistic Function":

$h_\theta(x) = g(\theta^T x)$, where $g(z) = \dfrac{1}{1 + e^{-z}}$ and $z = \theta^T x$,

which turns the linear regression hypothesis into a classifier by mapping any real value into the interval $(0, 1)$.
The sigmoid function has an S-shaped curve: $g(z) \to 0$ as $z \to -\infty$, $g(z) \to 1$ as $z \to +\infty$, and $g(0) = 0.5$. So $h_\theta(x)$ gives us the probability that the output is 1, i.e. $h_\theta(x) = P(y = 1 \mid x; \theta)$.
In fact, the hypothesis is $h_\theta(x) = g(\theta^T x)$ for logistic regression, and is simply $h_\theta(x) = \theta^T x$ for linear regression. In more complicated cases, $z$ might be something like:

$z = \theta_0 + \theta_1 x_1^2 + \theta_2 x_2^2$
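The hypothesis above can be sketched in a few lines of NumPy (a minimal illustration; the function names are mine, not from the course):

```python
import numpy as np

def sigmoid(z):
    """Logistic function g(z) = 1 / (1 + e^{-z}); maps any real z into (0, 1)."""
    return 1.0 / (1.0 + np.exp(-z))

def hypothesis(theta, x):
    """h_theta(x) = g(theta^T x): the estimated probability that y = 1."""
    return sigmoid(np.dot(theta, x))

print(sigmoid(0.0))    # g(0) = 0.5
print(sigmoid(10.0))   # close to 1
print(sigmoid(-10.0))  # close to 0
```

Note that `sigmoid` works elementwise on NumPy arrays too, so the same code evaluates $g$ over a whole training set at once.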
Decision Boundary
The decision boundary is the line (or hyperplane) that separates the region where y = 0 from the region where y = 1 (or, more generally, separates different classes). It is created by our hypothesis function: we predict $y = 1$ when $h_\theta(x) \ge 0.5$, which happens exactly when $z \ge 0$.
The input to the sigmoid function does not need to be linear in the features; it could be a function that describes a circle (e.g. $z = \theta_0 + \theta_1 x_1^2 + \theta_2 x_2^2$) or any other shape that fits the data.
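For instance, with hypothetical parameters chosen so that $z = -1 + x_1^2 + x_2^2$ (the $\theta$ values here are made up for illustration), the decision boundary $h_\theta(x) = 0.5$ is the unit circle $x_1^2 + x_2^2 = 1$:

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# Hypothetical theta = [-1, 0, 0, 1, 1] over features [1, x1, x2, x1^2, x2^2]
# gives z = -1 + x1^2 + x2^2, so z = 0 (i.e. h = 0.5) on the unit circle.
def predict(x1, x2):
    z = -1.0 + x1**2 + x2**2
    return 1 if sigmoid(z) >= 0.5 else 0

print(predict(0.2, 0.3))  # inside the circle:  z < 0, predict 0
print(predict(2.0, 2.0))  # outside the circle: z > 0, predict 1
```

Points inside the circle get $z < 0$ and are classified as 0; points outside get $z > 0$ and are classified as 1.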
Cost Function
Reusing the squared-error cost function from linear regression would make $J(\theta)$ non-convex ("wavy") for classification, because the sigmoid inside the hypothesis is nonlinear; gradient descent could then get stuck in one of many local optima.