What is Naive Bayesian Algorithm?
The naive Bayesian method simplifies the Bayesian algorithm by assuming that, given the target value (the class), the attributes are conditionally independent of one another.

In other words, no single attribute variable carries a disproportionately large or small weight in the decision. Although this simplification reduces the classification accuracy of the Bayesian classifier to some extent, it greatly reduces the complexity of applying the Bayesian method in practical scenarios.
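Stated in standard notation (this formula is the usual textbook form of the naive Bayes assumption, not taken from the original article), the independence assumption means the class-conditional probability of the attribute vector factors into a product of per-attribute terms:

```latex
P(x_1, x_2, \ldots, x_n \mid Y = y) = \prod_{i=1}^{n} P(x_i \mid Y = y)
```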

Naive Bayesian classification (NBC) is a method based on Bayes' theorem together with the assumption that the features are conditionally independent. First, the joint probability distribution of inputs and outputs is learned from a given training set under the feature-independence assumption. Then, for a given input X, the learned model is used to find the output Y that maximizes the posterior probability.
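As a concrete illustration of this train-then-predict procedure, here is a minimal sketch for categorical features. The function names, the Laplace smoothing parameter alpha, and the toy "weather" data are all invented for this example and are not from the original text:

```python
# A minimal sketch of naive Bayes classification on categorical features.
# All names and the toy "weather" data below are invented for illustration.
import math
from collections import Counter, defaultdict

def train_naive_bayes(X, y):
    """Count class frequencies (priors) and per-feature value counts (likelihoods)."""
    class_counts = Counter(y)
    # feature_counts[feature_index][class][value] = number of training rows
    feature_counts = defaultdict(lambda: defaultdict(Counter))
    for features, label in zip(X, y):
        for i, value in enumerate(features):
            feature_counts[i][label][value] += 1
    return class_counts, feature_counts, len(y)

def predict(x, class_counts, feature_counts, n, alpha=1.0):
    """Return the class maximizing the (unnormalized) posterior P(y) * prod_i P(x_i | y)."""
    best_label, best_score = None, float("-inf")
    for label, class_count in class_counts.items():
        # Work in log space to avoid numerical underflow.
        score = math.log(class_count / n)
        for i, value in enumerate(x):
            # Number of distinct values seen for this feature (for Laplace smoothing).
            num_values = len({v for counter in feature_counts[i].values() for v in counter})
            count = feature_counts[i][label][value]
            score += math.log((count + alpha) / (class_count + alpha * max(num_values, 1)))
        if score > best_score:
            best_label, best_score = label, score
    return best_label

# Toy data: predict whether to play outside from (weather, wind).
X = [("sunny", "weak"), ("sunny", "strong"), ("rainy", "weak"), ("rainy", "strong")]
y = ["yes", "no", "yes", "no"]
class_counts, feature_counts, n = train_naive_bayes(X, y)
print(predict(("sunny", "weak"), class_counts, feature_counts, n))  # -> "yes"
```

Because of the independence assumption, training reduces to simple counting and prediction to multiplying a handful of estimated probabilities, which is why the method scales so easily.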

Bayes's contributions:

Bayes mainly studied probability theory in mathematics. He was the first to apply inductive reasoning to the foundations of probability theory, established Bayesian statistical theory, and contributed to statistical decision functions, statistical inference, and statistical estimation. His work in this field was published in 1763 and played an important role in modern probability theory and mathematical statistics. Another book by Bayes, Introduction to Chance Theory, was published in 1758. Many of the terms Bayes used are still in use today.

His main contribution to statistical reasoning was the use of the concept of "inverse probability", which he put forward as a general method of reasoning. Bayes' theorem was originally a theorem in probability theory; it can be expressed by a mathematical formula, the famous Bayes formula, shown below.
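In its standard form the formula reads:

```latex
P(Y \mid X) = \frac{P(X \mid Y)\, P(Y)}{P(X)}
```

In classification the denominator P(X) is the same for every candidate class, so the predicted class is simply the one that maximizes P(X | Y) P(Y), which is exactly the quantity the naive Bayes classifier computes.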