In today’s post, we’ll explore Linear Discriminant Analysis (LDA) and demonstrate how it works with a straightforward example. We’ll be using the same dataset as our previous discussion on logistic regression, which identifies whether or not a particular internet user clicked on an advertisement. If you want to learn more...
[Read More]
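For a quick sense of what the post covers, here is a minimal LDA sketch using scikit-learn on synthetic two-class data; the actual Kaggle advertisement dataset and its columns are not reproduced here, so the features below are illustrative stand-ins.

```python
# Minimal LDA sketch on synthetic data standing in for the
# clicked / did-not-click advertisement classes.
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

rng = np.random.default_rng(0)
# Two Gaussian classes: "did not click" (0) and "clicked" (1).
X0 = rng.normal(loc=0.0, scale=1.0, size=(100, 2))
X1 = rng.normal(loc=2.0, scale=1.0, size=(100, 2))
X = np.vstack([X0, X1])
y = np.array([0] * 100 + [1] * 100)

lda = LinearDiscriminantAnalysis()
lda.fit(X, y)
print(lda.score(X, y))  # training accuracy on the toy data
```

LDA assumes each class is Gaussian with a shared covariance matrix, which is why well-separated Gaussian blobs like these are its best case.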
Today’s post is about logistic regression, which predicts a categorical dependent variable from a set of independent variables. We will go through the important concepts of logistic regression using advertisement data downloaded from Kaggle. The goal here is to accurately predict whether or not a user clicked on an advertisement...
[Read More]
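As a taste of the full post, a minimal logistic regression sketch with scikit-learn is below; the feature names and the click-generating rule are toy assumptions, not the actual Kaggle columns.

```python
# Minimal logistic regression sketch for a click-prediction style task.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(42)
n = 500
daily_time_on_site = rng.normal(60, 15, n)   # hypothetical feature
age = rng.normal(35, 10, n)                  # hypothetical feature
X = np.column_stack([daily_time_on_site, age])

# Toy rule: users who spend less time on the site click more often.
logits = -0.1 * (daily_time_on_site - 60) + 0.05 * (age - 35)
y = (rng.random(n) < 1 / (1 + np.exp(-logits))).astype(int)

model = LogisticRegression()
model.fit(X, y)
print(model.predict_proba(X[:1]))  # [P(no click), P(click)] for one user
```

The model outputs class probabilities via the logistic (sigmoid) function, which is what makes it suitable for a binary clicked/not-clicked target.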
In the last two posts, we have seen two regularization techniques: ridge and lasso. Depending on the situation, we can choose one of the two to estimate a model. When the model contains many variables that do not help predict the dependent variable, lasso regression works best because it removes...
[Read More]
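The variable-removal behaviour mentioned above can be sketched in a few lines; this is an assumed synthetic setup, not the example from the full post.

```python
# Sketch of lasso's feature selection: coefficients of uninformative
# variables are driven to exactly zero.
import numpy as np
from sklearn.linear_model import Lasso

rng = np.random.default_rng(0)
n, p = 200, 10
X = rng.normal(size=(n, p))
# Only the first two variables actually matter for y.
y = 3.0 * X[:, 0] - 2.0 * X[:, 1] + rng.normal(scale=0.5, size=n)

lasso = Lasso(alpha=0.1)
lasso.fit(X, y)
print(np.round(lasso.coef_, 3))
# The eight noise variables' coefficients come out as exactly zero,
# effectively removing them from the model.
```

Ridge regression, by contrast, would shrink those noise coefficients toward zero but not set them exactly to zero.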
In today’s article, we will look at lasso regression as an extension of ridge regression. The two are very similar in that both aim to reduce variance at the expense of bias by penalizing the model with a penalty term. The difference between them is that lasso regression takes the sum...
[Read More]
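For reference while reading, the penalty difference the teaser alludes to is the standard one (this is textbook notation, not a quote from the full post): ridge penalizes the sum of squared coefficients, lasso the sum of their absolute values.

```latex
% Ridge: squared (L2) penalty
\hat{\beta}^{\text{ridge}} = \arg\min_{\beta}
  \sum_{i=1}^{n}\Bigl(y_i - \beta_0 - \sum_{j=1}^{p} x_{ij}\beta_j\Bigr)^2
  + \lambda \sum_{j=1}^{p} \beta_j^2

% Lasso: absolute-value (L1) penalty
\hat{\beta}^{\text{lasso}} = \arg\min_{\beta}
  \sum_{i=1}^{n}\Bigl(y_i - \beta_0 - \sum_{j=1}^{p} x_{ij}\beta_j\Bigr)^2
  + \lambda \sum_{j=1}^{p} \lvert\beta_j\rvert
```

The L1 penalty's corners at zero are what allow lasso to set coefficients exactly to zero, while the smooth L2 penalty only shrinks them.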