Precision and recall

Precision and recall are two very useful metrics. Precision is defined as the ratio of the instances that were correctly classified as positive to the total number of instances classified as positive. In other words, it's the percentage of the instances classified as positive that are actually positive. The equation for precision is:

Precision = True Positives / (True Positives + False Positives)

Let's make a brief analysis of this equation to understand it better. True positives are all the instances that were correctly classified as positive; we divide them by the total number of instances classified as positive. If you look at the previous images, you can see that the total number of instances classified as positive is the sum of the true positives and the false positives. Using the confusion matrix, the precision of our classifier comes out to 9/14.

In a practical sense, precision tells you how much you can trust your classifier when it says an instance belongs to the positive class. A high precision value means there were very few false positives, and that the classifier is strict in its criteria for classifying something as positive.

Now let's take a look at recall. Recall is defined as the ratio of the instances that were correctly classified as positive to the total number of actual members of the positive class. In other words, it tells you how many of the total number of positive instances were correctly classified. The equation for recall is:

Recall = True Positives / (True Positives + False Negatives)

Let's make another brief analysis of the equation to understand it better. True positives are all the instances that were correctly classified as positive; this time we divide them by the total number of actual members of the positive class. If you look at the previous images, you can see that the total number of members of the positive class is the sum of the true positives and the false negatives.
Using the confusion matrix, the recall of our classifier comes out to 9/13. In a practical sense, recall tells you how much you can trust your classifier to find all the members of the positive class. A high recall value means there were very few false negatives, and that the classifier is more permissive in its criteria for classifying something as positive.
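To make the two equations concrete, here is a minimal sketch in Python. The counts TP = 9, FP = 5, and FN = 4 are assumptions read off the confusion matrix discussed above (they reproduce the 9/14 and 9/13 values).

```python
# Precision and recall from confusion-matrix counts.
# TP = 9, FP = 5, FN = 4 are assumed values that reproduce the
# article's results of 9/14 for precision and 9/13 for recall.

def precision(tp, fp):
    # Of everything labeled positive, how much was actually positive?
    return tp / (tp + fp)

def recall(tp, fn):
    # Of all actual positives, how many did the classifier find?
    return tp / (tp + fn)

tp, fp, fn = 9, 5, 4
print(f"precision = {precision(tp, fp):.3f}")  # 9/14 ≈ 0.643
print(f"recall    = {recall(tp, fn):.3f}")     # 9/13 ≈ 0.692
```

Note how the numerator is the same in both functions; only the denominator changes, which is exactly the difference between "of what I flagged, how much was right" and "of what was there, how much did I find".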