
A Deeper Dive into Logistic Regression

    In the previous posts, we looked at the idea behind Logistic Regression: its visual representation, the sigmoid function, and the equations related to it. In today's post, we will first recall those concepts and then go ahead to show how Logistic Regression works in real life and how it benefits the world. I will also discuss the applications of this function in addition to the variants of Logistic Regression.


    Let us start from the beginning:

    Logistic Regression is a type of supervised learning algorithm used for classification: it assigns a given input to one of 2 possible categories. The machine must classify the outcome of an input as 0 or 1, yes or no, true or false.

    In this type of regression, the target variable (the value to be predicted) is categorical in nature. The model's output is the probability of the event falling into the positive category.

    This type of regression is closely related to Linear Regression, as the equations below will show.

    For example:

    • Whether Liverpool would win the match (1) or not (0).

    • Whether stock prices would go up or down (here we are ignoring the fact that it can remain constant as well).

    • Whether an email is spam or not spam.

    This is known as "Binary Logistic Regression", since it classifies the outcome into one of the two possible results. The model predicts the probability of the event, which is then thresholded to give a 0 or a 1.

    Another case of logistic regression is when the dependent variable must be classified into more than 2 outcome classes. Such a case is known as "Multinomial Logistic Regression".
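In the multinomial case, the sigmoid is typically replaced by the softmax function, which turns K real-valued scores into K probabilities that sum to 1. Here is a minimal sketch (the scores below are made-up values, purely for illustration):

```python
import math

def softmax(scores):
    # generalizes the sigmoid to K classes: maps K real scores
    # to K probabilities that sum to 1
    m = max(scores)                          # subtract max for numerical stability
    exps = [math.exp(s - m) for s in scores]
    total = sum(exps)
    return [e / total for e in exps]

# hypothetical scores for three classes
probs = softmax([2.0, 1.0, 0.1])
```

The class with the highest score gets the highest probability, and the probabilities always sum to 1.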

    The function used in logistic regression is known as the logistic function, also called the sigmoid function, which is represented as an S-shaped graph (refer to the previous post for the visual representation). The machine (system) predicts the probability of an event occurring with the help of this logistic function.

    Now let us look at the mathematical side of it. The equation for Linear Regression is:

    z = b0 + b1x1 + b2x2 + … + bnxn;

    Applying the sigmoid function p = 1/(1 + e^(-z)) to this expression, we get,

    p = 1/(1 + e^(-(b0 + b1x1 + b2x2 + … + bnxn)))
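Putting the two equations above together in code, here is a minimal sketch. The coefficients and input values are made up for illustration, not taken from any fitted model:

```python
import math

def sigmoid(z):
    # logistic (sigmoid) function: maps any real z into (0, 1)
    return 1.0 / (1.0 + math.exp(-z))

def predict_proba(x, b0, b):
    # z = b0 + b1*x1 + ... + bn*xn, then p = sigmoid(z)
    z = b0 + sum(bi * xi for bi, xi in zip(b, x))
    return sigmoid(z)

# hypothetical coefficients and a single input with two features
p = predict_proba([2.0, -1.0], b0=0.5, b=[1.2, 0.7])
```

Whatever value z takes, the output p is always a valid probability between 0 and 1.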

    Logistic regression belongs to a class of algorithms called "Generalized Linear Models", or GLMs. The basic GLM equation is:

    g(E(y)) = b0 + b1x1 + b2x2;


    where g is the link function,

    E(y) is the expected value of the target variable (for a binary target, this is the probability of the positive class),

    and the RHS is the equation used to predict the target variable.

    Odds = p/(1-p) = probability of the event occurring / probability of the event not occurring

    ln(odds) = ln(p/(1-p))

    logit(p) = ln(p/(1-p))

    In the above expression, p is the probability that the outcome is in our favour.

    The next natural question would be: why have we used ln in the mathematical expression? The odds p/(1-p) can only take values between 0 and infinity, while p itself is squeezed between 0 and 1. Taking the natural logarithm of the odds maps them onto the full range of real numbers, so the result can be equated directly to the linear expression on the right-hand side.
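To make the log-odds concrete, here is a small sketch showing that this transformation (often called the logit) is exactly the inverse of the sigmoid:

```python
import math

def sigmoid(z):
    # maps any real z into a probability in (0, 1)
    return 1.0 / (1.0 + math.exp(-z))

def logit(p):
    # log-odds: ln(p / (1 - p)), defined for 0 < p < 1
    return math.log(p / (1.0 - p))

# the logit undoes the sigmoid: logit(sigmoid(z)) == z
z = 1.7
round_trip = logit(sigmoid(z))
```

This inverse relationship is what lets logistic regression model a bounded probability with an unbounded linear equation.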

    In Linear Regression, the aim is to minimize the sum of squared errors. Along similar lines, in Logistic Regression the aim is to obtain the coefficients of the model which relate the target variable to the predictor equation. It uses maximum likelihood estimation (MLE) to obtain the model coefficients. MLE determines the parameter values under which the observed data would be most probable, i.e. the model output that is closest to the observed data.

    Mathematically, MLE treats quantities such as the model coefficients as unknown parameters and chooses the values that maximize the likelihood of the observed output.

    The estimation is iterative: the coefficients are updated repeatedly until the log likelihood (LL) no longer changes significantly.

    The log likelihood essentially measures how well the model's predicted probabilities match the observed data.

    Assumptions while using Logistic Regression:

    • In binary logistic regression, the dependent variable needs to be binary (either a 0 or 1).

    • Include relevant variables only.

    • Large datasets are needed.

    The other commonly used regression algorithms are:

    • Polynomial Regression

    • Ridge Regression

    • Lasso Regression

    • ElasticNet Regression

    Applications of Logistic Regression:

    Finding whether an event will occur or not (weather forecasting, stock prediction, election poll outcomes).

    Think of a few examples and let us know in the comments below. In the upcoming post, we will see how Logistic Regression can be implemented in Python.

    Hi, my name is Smriti. I enjoy coding, solving puzzles, singing, blogging, and writing about new technologies. The idea of artificial intelligence, and the fact that machines can learn, impresses me every day.