Logistic regression is a natural next step from linear regression. While linear regression predicts a continuous value, many real-life problems call for a categorical outcome, and applying a plain linear model to them is ineffective. Logistic regression handles exactly these classification tasks: it passes a linear combination of the features through the logistic (sigmoid) function to produce a probability of class membership. You can think of lots of different scenarios where logistic regression could be applied. There is financial, demographic, health, weather and other data where the model could be used to predict upcoming events. For instance, you can classify emails into spam and non-spam, transactions as fraudulent or legitimate, or tumors as malignant or benign. In order to understand logistic regression, let’s cover some basics, do a simple classification on a data set with two features, and then test it on real-life data with multiple features.
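To make the idea concrete, here is a minimal sketch of how a trained logistic regression model produces a prediction. The weights, bias, and feature values below are made-up illustrative numbers, not fitted to any real data set:

```python
import math

def sigmoid(z):
    # Logistic (sigmoid) function: maps any real number into (0, 1)
    return 1.0 / (1.0 + math.exp(-z))

def predict_proba(weights, bias, features):
    # Linear combination of the features, squashed through the sigmoid,
    # gives the predicted probability of the positive class.
    z = bias + sum(w * x for w, x in zip(weights, features))
    return sigmoid(z)

# Hypothetical weights for a two-feature example (e.g. spam vs non-spam)
weights = [0.8, -0.5]
bias = 0.1
p = predict_proba(weights, bias, [2.0, 1.0])
label = 1 if p >= 0.5 else 0  # threshold the probability to get a class
```

In practice the weights are learned from training data (for example by maximizing the likelihood with gradient descent), which is what tools like WEKA do for you under the hood.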
This is a follow-up to a previous post where we calculated a Naive Bayes prediction on a given data set. This time I want to demonstrate how all of this can be implemented using the WEKA application. For those who don’t know what WEKA is, I highly recommend visiting their website and getting the latest release. It is a really powerful machine learning tool written in Java. You can find plenty of tutorials on YouTube on how to get started with WEKA, so I won’t go into the details here. I’m sure you’ll be able to follow along anyway.