Normalizing flows

Probability vs. likelihood:

https://zhuanlan.zhihu.com/p/25768606

http://sdsy888.me/%E9%9A%8F%E7%AC%94-Writing/2018/%E4%BC%BC%E7%84%B6%EF%BC%88likelihood%EF%BC%89%E5%92%8C%E6%A6%82%E7%8E%87%EF%BC%88probability%EF%BC%89%E7%9A%84%E5%8C%BA%E5%88%AB%E4%B8%8E%E8%81%94%E7%B3%BB/

https://www.psychologicalscience.org/observer/bayes-for-beginners-probability-and-likelihood
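The distinction drawn in the links above can be made concrete with a small sketch: probability fixes the parameter and asks about data, while likelihood fixes the observed data and asks which parameter explains it best. The coin-flip numbers below (7 heads in 10 flips, the candidate values of p) are illustrative assumptions, not from the source:

```python
from math import comb

def binom_pmf(k, n, p):
    """Probability of k heads in n independent flips of a coin with P(heads) = p."""
    return comb(n, k) * p**k * (1 - p)**(n - k)

# Probability: the parameter is fixed (fair coin, p = 0.5), the data varies.
prob_7_heads = binom_pmf(7, 10, 0.5)

# Likelihood: the data are fixed (7 heads in 10 flips), the parameter varies.
likelihoods = {p: binom_pmf(7, 10, p) for p in (0.3, 0.5, 0.7)}

# Among the candidates, p = 0.7 (the empirical frequency 7/10) has the
# highest likelihood — exactly the quantity MLE maximizes.
best_p = max(likelihoods, key=likelihoods.get)
```

Note that the likelihood values here need not sum to 1 over p; likelihood is a function of the parameter, not a probability distribution over it.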

To approach MLE, let's start from the Bayesian angle and use Bayes' theorem to frame the question as follows:

P(β | y) = P(y | β) × P(β) / P(y)

Or, in English:

posterior = likelihood × prior / evidence
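Bayes' theorem above can be sketched numerically on a discrete grid of candidate β values. The uniform prior, the binomial likelihood, and the 7-heads-in-10-flips data are illustrative assumptions; the point is that the evidence P(y) is just the normalizing sum of likelihood × prior:

```python
from math import comb

# Candidate parameter values β ∈ {0.0, 0.1, ..., 1.0}.
betas = [i / 10 for i in range(11)]

# Prior P(β): uniform over the grid.
prior = [1 / len(betas)] * len(betas)

# Likelihood P(y | β): binomial probability of the observed data
# (7 heads in 10 flips) under each candidate β.
likelihood = [comb(10, 7) * b**7 * (1 - b)**3 for b in betas]

# Evidence P(y): marginal probability of the data, summed over all β.
evidence = sum(l * p for l, p in zip(likelihood, prior))

# Posterior P(β | y) = likelihood × prior / evidence.
posterior = [l * p / evidence for l, p in zip(likelihood, prior)]
```

With a uniform prior the posterior is just the normalized likelihood, so its mode coincides with the maximum-likelihood estimate β = 0.7; a non-uniform prior would shift it.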


Reposted from www.cnblogs.com/dulun/p/12297635.html