Naive Bayes Classifier

Pros & Cons


Advantages

1- Easy Implementation

Probably one of the simplest, easiest to implement and most straightforward machine learning algorithms.
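
To see just how little code that takes, here is a minimal sketch using scikit-learn's GaussianNB; the library choice and the bundled iris dataset are just for illustration, not part of any particular recipe:

```python
# A minimal sketch: training and using a Naive Bayes classifier with scikit-learn.
# The iris dataset is used purely as a convenient, built-in example.
from sklearn.datasets import load_iris
from sklearn.model_selection import train_test_split
from sklearn.naive_bayes import GaussianNB

X, y = load_iris(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

model = GaussianNB()          # no tuning needed to get started
model.fit(X_train, y_train)   # training is a single pass over the data

print("Test accuracy:", model.score(X_test, y_test))
```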

2- Fast and Simple

Naive Bayes is not only simple but also fast, which makes it a perfect candidate in certain situations.

When you're looking for a model with the fewest hyperparameters that still gives very acceptable results, the Naive Bayes algorithm may be the way to go.
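
As a quick illustration of how few knobs there are to turn, the scikit-learn implementations expose only a handful of hyperparameters (the exact lists can vary slightly by library version):

```python
# Listing the (very few) hyperparameters of two common Naive Bayes variants.
from sklearn.naive_bayes import GaussianNB, MultinomialNB

print(GaussianNB().get_params())     # essentially just priors and var_smoothing
print(MultinomialNB().get_params())  # essentially just alpha and the prior settings
```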

3- Scaling Advantages

Naive Bayes scales linearly, which makes it a great candidate for large setups.

4- Noise Resilience

If your data has noise, irrelevant features, outlier values etc., no worries: Naive Bayes thrives in such situations, and its prediction capabilities won't be seriously affected the way some other algorithms' would be.

5- Easy Training

Naive Bayes is not a model that keeps getting significantly better as you continuously add training data.

This may look like a problem, especially if you're after that last bit of juice with everlasting improvements. But this also means Naive Bayes is one of the easiest machine learning algorithms to train, and it can thrive even on a very small and limited dataset.

6- Low Overfitting Risk

Naive Bayes' simplicity comes with another perk. Since it isn't sensitive to noisy, irrelevant features, these won't dominate the model. This also means the risk of overfitting is very low.

7- Frequent Updates

Another advantage of the Naive Bayes algorithm is that it can be updated on the go, quickly and incrementally, as new data arrives.
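
In scikit-learn, for instance, this maps onto the partial_fit method, which folds new batches into an existing model instead of retraining from scratch. A small sketch with made-up batches:

```python
# Incrementally updating a Naive Bayes model as new batches of data arrive.
import numpy as np
from sklearn.naive_bayes import GaussianNB

model = GaussianNB()
classes = np.array([0, 1])  # all classes must be declared on the first call

# First batch (e.g. today's data)
X1 = np.array([[1.0, 2.0], [1.5, 1.8], [5.0, 8.0]])
y1 = np.array([0, 0, 1])
model.partial_fit(X1, y1, classes=classes)

# Later batch (e.g. tomorrow's data) -- the model is updated, not rebuilt
X2 = np.array([[6.0, 9.0], [1.2, 2.1]])
y2 = np.array([1, 0])
model.partial_fit(X2, y2)

print(model.predict([[1.1, 1.9], [5.5, 8.5]]))
```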

8- Computationally Efficient

Naive Bayes uses very little resources (RAM and CPU) compared to other algorithms. Without kernels, regularization etc., Naive Bayes is a very lightweight implementation.

9- Suitable for Large Datasets

Naive Bayes runs very efficiently, much more efficiently than kernel methods, SVMs or Neural Networks. Its time complexity is O(pn) (p features, n samples), which means its computational needs will grow linearly with the data.

It also trains fast, as mentioned before.

This makes Naive Bayes implementations suitable for large datasets with millions of rows as well as small datasets.
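
One rough way to convince yourself of the linear scaling is to time training runs on increasingly large synthetic samples. A sketch (the sample sizes and feature count below are arbitrary):

```python
# Rough timing sketch: fit time should grow roughly linearly with sample count.
import time
import numpy as np
from sklearn.naive_bayes import GaussianNB

rng = np.random.default_rng(0)
for n in (10_000, 100_000, 1_000_000):
    X = rng.normal(size=(n, 20))
    y = rng.integers(0, 2, size=n)
    start = time.perf_counter()
    GaussianNB().fit(X, y)
    print(f"n={n:>9,}  fit time: {time.perf_counter() - start:.3f}s")
```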

10- Provides Intuitive Reports

Naive Bayes gives useful and intuitive outputs such as mean values, standard deviations and probability calculations for each feature and class.
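
With scikit-learn's GaussianNB, for example, these per-class statistics are available directly as fitted attributes (in older scikit-learn versions the variance attribute is called sigma_ rather than var_). A sketch using the iris data again:

```python
# Inspecting the per-class statistics that GaussianNB learns.
from sklearn.datasets import load_iris
from sklearn.naive_bayes import GaussianNB

X, y = load_iris(return_X_y=True)
model = GaussianNB().fit(X, y)

print("Class priors:", model.class_prior_)           # P(class)
print("Per-class feature means:\n", model.theta_)    # mean of each feature per class
print("Per-class feature variances:\n", model.var_)  # variance of each feature per class
print("Class probabilities for one sample:", model.predict_proba(X[:1]))
```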


Disadvantages

1- Real World Problems

Now, we mentioned Naive Bayes is fast and simple. Who doesn't like that? One problem is that if you have data with features that are dependent on each other, this might take away from Naive Bayes' prediction power.

The thing is, more often than not, real datasets come with features that are not so independent of each other at all. The implications of this can vary from extra work (feature engineering, more data processing etc.) to model unsuitability.

However, that isn't always the case, and when the problem is suitable for Naive Bayes it can be the best option to choose.
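
One cheap sanity check before committing to Naive Bayes is to look at pairwise feature correlations. A sketch on synthetic data (the 0.8 threshold is an arbitrary illustration, not a rule):

```python
# Quick check for strongly correlated (i.e. clearly non-independent) features.
import numpy as np

rng = np.random.default_rng(0)
a = rng.normal(size=500)
# Feature 1 is deliberately an almost-exact copy of feature 0.
X = np.column_stack([a, a * 2 + rng.normal(scale=0.1, size=500), rng.normal(size=500)])

corr = np.corrcoef(X, rowvar=False)                   # feature-by-feature correlation matrix
strong = np.argwhere(np.triu(np.abs(corr) > 0.8, k=1))  # pairs above the threshold
print("Highly correlated feature pairs:", strong.tolist())
```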

2- No Regression

The Naive Bayes Classifier is strictly a classification algorithm and can't be used to predict continuous numerical values, hence no regression with the Naive Bayes Classifier.


3- Biased Nature

Since Naive Bayes is such a quick-and-dirty method and it avoids noise so well, this can also mean shortcomings. Naive Bayes treats all features as independent, which means some features might be handled with a much higher bias than you'd wish.

4- Limited Application Case

As another side effect of all the other cons Naive Bayes comes with, its range of suitable applications can be quite limited depending on your domain.

If you are after a simple classification solution such as spam | not spam or soccer | hockey | golf, great. But if you have tons of complex features, or a problem that requires every bit of predictive juice possible because the consequences of a small inaccuracy can mean big trouble (think autonomous driving), Naive Bayes might be too naive for you.
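
For the spam | not spam kind of task, a bag-of-words representation fed into Multinomial Naive Bayes is the classic setup. A toy sketch with made-up messages (the tiny "dataset" is purely illustrative):

```python
# Toy spam filter: bag-of-words features fed into Multinomial Naive Bayes.
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.naive_bayes import MultinomialNB
from sklearn.pipeline import make_pipeline

messages = [
    "win a free prize now", "cheap meds limited offer",  # spam
    "meeting moved to friday", "lunch tomorrow?",        # not spam
]
labels = ["spam", "spam", "ham", "ham"]

clf = make_pipeline(CountVectorizer(), MultinomialNB())
clf.fit(messages, labels)

print(clf.predict(["free offer just for you", "see you at the meeting"]))
```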

However, because of its simplicity, Naive Bayes is probably one of the most overlooked algorithms, since it doesn't always match the machine learning hype in mainstream avenues.

Wrap-Up

Naive Bayes Classifier Pros & Cons Summary

Why Naive Bayes Classifiers?

Simple, quick, efficient, accurate, scalable.

On the other hand, it might be too quick-and-dirty when you require complex approaches and finesse. Still, it's hard not to admire its predictive potency relative to its efficiency and simplicity.

Another truly mesmerizing point about the Naive Bayes algorithm is that it still uses the formula Thomas Bayes came up with. Bayes was a progressive church minister and thinker in the early 18th century who pondered questions such as "chance's role in life" and "God's role in life, if he exists". It's fair to say he was a true pioneer of statistics before the field existed the way it does today.
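
For reference, that formula is Bayes' theorem; the "naive" part is the extra assumption that the features x_1, ..., x_p are conditionally independent given the class y:

```latex
P(y \mid x_1, \dots, x_p)
  = \frac{P(y)\, P(x_1, \dots, x_p \mid y)}{P(x_1, \dots, x_p)}
  \approx \frac{P(y) \prod_{i=1}^{p} P(x_i \mid y)}{P(x_1, \dots, x_p)}
```

The classifier then simply picks the class y with the largest numerator, since the denominator is the same for every class.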

Interesting Story of Bayes Theorem from early 18th century United Kingdom

Simple

Almost no hyperparameters and great usability out of the box.

Fast

The way Naive Bayes is implemented means fast training and fast predictions.

Large Data Friendly

The linear time complexity of Naive Bayes means it will remain efficient even when the data gets really big.

Accurate

If you're sure Naive Bayes is appropriate for your data and the task at hand, it can be surprisingly accurate for its simplicity and efficiency.

Lightweight

Naive Bayes doesn't clog the RAM like random forests or require heavy computational resources like SVMs or Neural Networks. Even if some of those models can be considered lightweight, Naive Bayes is probably ultra-lightweight by comparison.

Bias

Naive Bayes learns fast and easily, but if your training set is not ideal, NB can be highly biased and the results will be garbage. There aren't many parameters to fix things with, either.

Applicability

Lots of real-world problems have co-dependent features, meaning features that correlate highly with each other. In these cases bias and inaccuracy come into the picture. Don't you wish we could apply Naive Bayes to every problem out there?

Not So Open-Minded

If you have complex problems with complex features, Naive Bayes will tend to gloss over them, since NB is great at navigating around noise. It's kind of a my-way-or-the-highway algorithm in that sense.