Logistic Regression: Types, Hypothesis and Decision Boundary

The term logistic regression can be deceptive. Despite its name, it is actually a classification model. We use logistic regression to solve problems like:

Email: spam or not spam.

Online transaction: fraudulent (yes/no).

Tumor: malignant / benign.

Classification problems can be binary classification: the target value y is either 0 or 1. Generally, 0 represents the negative class and 1 represents the positive class. For instance, when classifying emails, y may be 1 if an email is spam and 0 when it is not. So the value of y can only be 0 or 1. There is also multiclass classification, where the value of y can be 0, 1, 2, 3, 4 and so on. But in this article, I am only focusing on binary classification. I will definitely talk about multiclass classification in future articles.

Let’s see why logistic regression gained importance. We already know linear regression, but it has several problems for classification. First, linear regression only captures a linear relationship between the dependent and independent variables, and in the real world these relationships are not always linear. Second, linear regression only models the mean of the variables, which can make the model imperfect from the start; sometimes it is important to look at the extreme cases too. For example, if we are looking at the birth weight of infants versus the weight or age of the mother, sometimes an infant’s weight might be too low, and that is a serious health issue. The picture above shows that a few data points at the far right end pull the fitted line in a way that we get a negative probability. So, a lot of the time, it is not enough to take a mean and draw a decision boundary based on that. There are a few other issues as well, but we are not going deeper into those. I want to focus on a different model that works better for these kinds of problems.
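The "negative probability" problem above can be reproduced with a minimal sketch: fit an ordinary least-squares line to binary labels and evaluate it outside the training range. The data points here are hypothetical, chosen only to make the effect visible.

```python
# Least-squares line fit on binary (0/1) labels — hypothetical data.
xs = [10.0, 11.0, 12.0, 13.0, 14.0]
ys = [0.0, 0.0, 1.0, 1.0, 1.0]

n = len(xs)
mean_x = sum(xs) / n
mean_y = sum(ys) / n

# Standard simple-linear-regression formulas for slope and intercept.
slope = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys)) / \
        sum((x - mean_x) ** 2 for x in xs)
intercept = mean_y - slope * mean_x

# The fitted line leaves the [0, 1] range just outside the data:
print(intercept + slope * 9.0)   # below 0: a "negative probability"
print(intercept + slope * 15.0)  # above 1: also not a valid probability
```

This is exactly why the linear hypothesis is a poor fit for classification, and why the next section replaces it with a function whose output is always between 0 and 1.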

Logistic Function or Sigmoid Function

Logistic regression uses a more complex formula for its hypothesis. The hypothesis in logistic regression is defined by the sigmoid function, also called the logistic function. The hypothesis should output values between 0 and 1, but a linear function can output values less than 0 or more than 1. So, we cannot use the linear regression hypothesis. The logistic regression hypothesis can be expressed as follows:

hθ(x) = g(θᵀx), where g(z) = 1 / (1 + e⁻ᶻ)

In this equation, x is the input feature vector, θ is the vector of model parameters, and z = θᵀx is their linear combination. The sigmoid function g maps any real number into the interval (0, 1), so the hypothesis can be interpreted as a probability.
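The hypothesis above can be sketched in a few lines of plain Python. The function names `sigmoid` and `hypothesis` are illustrative, not from any particular library.

```python
import math

def sigmoid(z):
    """Logistic (sigmoid) function: maps any real z into (0, 1)."""
    return 1.0 / (1.0 + math.exp(-z))

def hypothesis(theta, x):
    """Logistic regression hypothesis h_theta(x) = g(theta^T x)."""
    z = sum(t * xi for t, xi in zip(theta, x))  # theta^T x
    return sigmoid(z)

print(sigmoid(0.0))   # exactly 0.5
print(sigmoid(6.0))   # close to 1
print(sigmoid(-6.0))  # close to 0
```

Note how large positive z pushes the output toward 1 and large negative z pushes it toward 0, while the output never leaves the (0, 1) interval.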

Decision Boundary

From the picture above, the value of the sigmoid function is greater than or equal to 0.5 when z is greater than or equal to 0, and the hypothesis is less than 0.5 when z is less than 0. For example, suppose we are trying to identify from a bunch of pictures whether each picture contains a car or not. If the hypothesis is 0.5 or greater, we predict that there is a car, and if the hypothesis is less than 0.5, the prediction will be ‘no car’.
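This thresholding rule can be sketched as a small prediction function. The parameter vector θ = [-3, 1, 1] below is hypothetical, chosen so that the decision boundary is the line x₁ + x₂ = 3: points with z = θᵀx ≥ 0 are classified as 1 (‘car’), the rest as 0 (‘no car’).

```python
import math

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def predict(theta, x, threshold=0.5):
    """Return 1 if h_theta(x) >= threshold, else 0."""
    z = sum(t * xi for t, xi in zip(theta, x))  # theta^T x
    return 1 if sigmoid(z) >= threshold else 0

# Hypothetical parameters; x = [1, x1, x2] includes the bias term.
theta = [-3.0, 1.0, 1.0]
print(predict(theta, [1.0, 2.0, 2.0]))  # z = 1  >= 0 -> prediction 1
print(predict(theta, [1.0, 1.0, 1.0]))  # z = -1 < 0 -> prediction 0
```

Because sigmoid(z) ≥ 0.5 exactly when z ≥ 0, thresholding the hypothesis at 0.5 is the same as checking the sign of θᵀx, which is why the decision boundary is where θᵀx = 0.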
