Why logistic regression is sometimes used over linear regression


When I first started learning about linear regression, I believed that gradient descent could solve any problem. As I delved deeper, however, I realized that the real world is far more complex than I initially thought. A good example is modeling the probability of a binary outcome, such as whether a patient has cancer or not. While linear regression might seem like a natural choice for classifying outcomes based on certain features, it is not always the best method.

Linear regression has two problems here. First, its output is unbounded, while a probability must lie between 0 and 1. Second, it can be highly affected by outliers, which drag the fitted line and can lead to misclassifying data. In situations where we need to model binary outcomes, we therefore often turn to logistic regression instead. Interestingly, logistic regression includes linear regression as part of its formula. Logistic regression passes the linear prediction through the sigmoid function:
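To see the outlier problem concretely, here is a minimal sketch (the data points are made up for illustration): a least-squares line fit to 0/1 labels, where one extreme feature value both pushes a prediction above 1 and shifts the line enough that a 0-labeled point crosses the 0.5 decision threshold.

```python
# Toy binary-classification data; x = 50 is an outlier feature value
xs = [1, 2, 3, 4, 5, 50]
ys = [0, 0, 1, 1, 1, 1]

# Ordinary least-squares fit of y = a*x + b (closed form)
n = len(xs)
mean_x = sum(xs) / n
mean_y = sum(ys) / n
a = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys)) / \
    sum((x - mean_x) ** 2 for x in xs)
b = mean_y - a * mean_x

predict = lambda x: a * x + b

print(predict(50))  # prediction above 1 -- not a valid probability
print(predict(1))   # above 0.5, so thresholding misclassifies this 0-labeled point
```

The outlier flattens the line: the model now "predicts" more than 100% probability at the extreme point and pushes genuine negatives over the decision threshold.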

p = 1 / (1 + e^(-z))

Here, z represents the linear regression component of the model: a weighted sum of the features plus an intercept. The sigmoid squashes this unbounded value into the range (0, 1), so the output can be read as a probability.
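The formula above can be sketched in a few lines of Python. The weights and features here are hypothetical, chosen only to show the shape of the computation:

```python
import math

def sigmoid(z):
    # Squash any real-valued z into the open interval (0, 1)
    return 1 / (1 + math.exp(-z))

def predict_proba(x, w, b):
    # z is the linear regression component: weighted sum of features plus intercept
    z = sum(wi * xi for wi, xi in zip(w, x)) + b
    return sigmoid(z)

# Hypothetical weights, intercept, and feature vector for illustration
w, b = [0.8, -0.5], 0.1
p = predict_proba([2.0, 1.0], w, b)
print(p)  # a value strictly between 0 and 1
```

However large or small z gets, sigmoid(z) stays between 0 and 1, which is exactly the property a probability needs and plain linear regression lacks.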