Confusion matrix: as the name suggests, people often get confused by the terms used in the matrix. Hopefully you will not feel the same after reading this post.

A clear-cut understanding of the confusion matrix is needed for the statistics part of data science.

So let's start with the definition of a confusion matrix, then explain the terms involved with an example, and at the end discuss which metric is most important for this example.

A confusion matrix is a table that is often used to describe the performance of a classification model (or “classifier”) on a set of test data for which the true values are known.

Let's take the following example; the required details are given in the table below.

A retail store's marketing team uses analytics to predict who is likely to buy a newly introduced high-end (expensive) product.

Buyer or not          Actual Negative   Actual Positive   Total
Predicted Negative    725               158               883
Predicted Positive    75                302               377
Total                 800               460               1260

The above matrix tells us the following (a short code sketch that reproduces it follows the list).

  • 725 people are not likely to buy the product in reality, and the team also predicted the same.
  • 75 people are not likely to buy the product in reality, but the team predicted the opposite.
  • 158 people are likely to buy the product in reality, but the team predicted the opposite.
  • 302 people are likely to buy the product in reality, and the team also predicted the same.

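If you want to reproduce this matrix in code, here is a minimal sketch using scikit-learn. The y_true and y_pred arrays are synthetic and exist only so that their counts match the table above (with 0 = does not buy, 1 = buys).

    # Minimal sketch: rebuild the example's confusion matrix with scikit-learn.
    # The label arrays are synthetic; they only reproduce the counts in the table
    # (0 = does not buy, 1 = buys).
    import numpy as np
    from sklearn.metrics import confusion_matrix

    y_true = np.array([0] * 725 + [0] * 75 + [1] * 158 + [1] * 302)  # actual behaviour
    y_pred = np.array([0] * 725 + [1] * 75 + [0] * 158 + [1] * 302)  # team's predictions

    cm = confusion_matrix(y_true, y_pred)
    print(cm)
    # [[725  75]
    #  [158 302]]

Note that scikit-learn puts the actual class on the rows and the predicted class on the columns, so this output is the transpose of the table above; the four counts themselves are identical.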
Let's now define the basic terms, which are whole counts (not rates).

True positive (TP): These are cases in which the team predicted positive (they are likely to buy), and they do buy in reality.

True negative (TN): These are cases in which the team predicted negative (they are not likely to buy), and they do not buy in reality.

False positive (FP): These are cases in which the team predicted positive (they are likely to buy), but they do not buy in reality.

False negative (FN): These are cases in which the team predicted negative (they are not likely to buy), but they do buy in reality.

Note: FP is generally known as a type I error and FN as a type II error.
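Continuing the sketch above, the four counts can be read straight off the matrix; for a binary problem, scikit-learn's ravel() flattens the 2x2 matrix row by row, giving TN, FP, FN, TP in that order.

    # Continuing the earlier sketch: unpack the four whole-number counts.
    tn, fp, fn, tp = confusion_matrix(y_true, y_pred).ravel()
    print(tn, fp, fn, tp)  # 725 75 158 302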

Following is a list of rates that are often computed from a confusion matrix:

Accuracy: Overall, how often is the classifier correct?

Accuracy = (TP + TN) / (TP + TN + FP + FN) = (302 + 725) / 1260 ≈ 81.5%
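A quick check in Python, using the counts from the table:

    tn, fp, fn, tp = 725, 75, 158, 302        # counts from the table above
    accuracy = (tp + tn) / (tp + tn + fp + fn)
    print(f"{accuracy:.1%}")                  # 81.5%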

Recall: When people are actually buying, how often does the team predict it correctly?

Recall = TP / (TP + FN) = 302 / 460 ≈ 65.7%

This is also known as the true positive rate, or sensitivity.
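The same quick check for recall:

    tp, fn = 302, 158
    recall = tp / (tp + fn)
    print(f"{recall:.1%}")  # 65.7%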

Specificity: When people are actually not buying, how often does the team predict that they are not buying?

Specificity = TN / (TN + FP) = 725 / 800 ≈ 90.6%
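And for specificity:

    tn, fp = 725, 75
    specificity = tn / (tn + fp)
    print(f"{specificity:.1%}")  # 90.6%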

Precision: When the team predicts that someone is buying, how often is that correct?

Precision = TP / (TP + FP) = 302 / 377 ≈ 80.1%
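And for precision:

    tp, fp = 302, 75
    precision = tp / (tp + fp)
    print(f"{precision:.1%}")  # 80.1%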

F1-Score: This is the harmonic mean of recall and precision.

F1-Score = (2 * Recall * Precision) / (Recall + Precision) ≈ 72.2%
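Checking the F1-Score the same way; if you would rather not compute these by hand, scikit-learn's metrics module also provides precision_score, recall_score, f1_score and classification_report.

    tp, fp, fn = 302, 75, 158
    recall = tp / (tp + fn)
    precision = tp / (tp + fp)
    f1 = 2 * recall * precision / (recall + precision)
    print(f"{f1:.1%}")  # 72.2%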

So now here comes the big challenge.

Which of these errors should the team be worried about?

FP, FN, or both equally?

Answer: FN.

Why?

If the model predicts that a person will not buy, the product will not be marketed to them, and the team loses customers, money, and business. An FP is not such a big worry, since only the cost of a phone call, an SMS, or a mailed catalog is lost.

And what is more important: recall, precision, or accuracy?

Answer: Recall.

But this will not be the same in every case; it varies from case to case.

So a thorough understanding of the case is required before concluding which metric is most important.
