Pattern Recognition and Machine Learning: Study Summary

These are my study notes from reviewing the KTH course Pattern Recognition and Machine Learning; the main reference is the course textbook.

Some of the images may fail to load; accessing the page through a VPN/proxy should solve this.

Contents

Chapter 1

△Decision & Discriminant Function

△GMM

Chapter 3 Bayesian Pattern Classification

△MAP

△ML

△ML, MAP, and the Minimum-Risk Rule

△Concrete Steps

△Fundamental Bayes Rule

△Chapter Summary

Chapter 4 Classification in Practical Applications

△Practical Problems

△Sparse Models

△Some Important Concepts in Applied Classification

Chapter 5 HMM

△Three key factors make the HMM very simple to apply

△Types of Markov Processes

△Forward & Backward Algorithm

△Viterbi Algorithm

Chapter 7 EM Algorithm

Chapter 8 Bayesian Learning


Chapter 1

△Decision & Discriminant Function

d(x): decision function; g(x): discriminant function (textbook p. 15)

(a threshold is given)
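For the two-class case, the relationship can be summarized like this (a minimal sketch in my own notation, with t for the threshold; not necessarily the book's symbols):

$$
d(x)=\begin{cases}\text{class }1, & g(x) > t\\[2pt] \text{class }2, & g(x) \le t\end{cases}
$$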

△GMM
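As a quick recap of the definition (standard form, my notation): a Gaussian mixture model represents the density as a weighted sum of Gaussian components,

$$
p(x)=\sum_{k=1}^{K} w_k\,\mathcal{N}(x \mid \mu_k, \Sigma_k), \qquad w_k \ge 0,\ \sum_{k=1}^{K} w_k = 1.
$$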

Chapter 3 Bayesian Pattern Classification

△MAP

△ML

△ML, MAP, and the Minimum-Risk Rule
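How the three rules relate can be stated in one line each (standard Bayesian decision theory, my notation; L denotes the loss function): ML maximizes the likelihood alone, MAP additionally weights by the prior, and the minimum-risk rule additionally weights by the loss:

$$
\hat{i}_{\mathrm{ML}}=\arg\max_i\, p(x\mid\omega_i),\qquad
\hat{i}_{\mathrm{MAP}}=\arg\max_i\, p(x\mid\omega_i)\,P(\omega_i),\qquad
\hat{a}=\arg\min_a \sum_i L(a,\omega_i)\,P(\omega_i\mid x).
$$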

△Concrete Steps

△Fundamental Bayes Rule

△Chapter Summary

Chapter 4 Classification in Practical Applications

△Practical Problems

△Sparse Models

Sparse models play an increasingly important role in fields such as machine learning and image processing. They perform variable selection and can mitigate modeling problems such as overfitting. A sparse model removes large numbers of redundant variables, keeping only the explanatory variables most relevant to the response variable; this simplifies the model while retaining the most important information in the data set, effectively addressing many of the difficulties of modeling high-dimensional data.

Cross-validation is frequently a good way to check for overfitting.
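A minimal sketch of both ideas using scikit-learn (my own toy example, not from the course): LassoCV fits an L1-penalized linear model, which drives irrelevant coefficients exactly to zero (variable selection), and picks the penalty strength by cross-validation, which guards against overfitting.

```python
import numpy as np
from sklearn.linear_model import LassoCV

rng = np.random.default_rng(0)

# 200 samples, 50 features, but only the first 3 features actually matter.
X = rng.normal(size=(200, 50))
true_coef = np.zeros(50)
true_coef[:3] = [2.0, -3.0, 1.5]
y = X @ true_coef + 0.1 * rng.normal(size=200)

# The L1 penalty induces sparsity; the penalty strength alpha is
# chosen by 5-fold cross-validation.
model = LassoCV(cv=5).fit(X, y)

print("selected alpha:", model.alpha_)
print("non-zero coefficients:", np.sum(model.coef_ != 0))  # typically close to 3
```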

△Some Important Concepts in Applied Classification

Chapter 5 HMM

△Three key factors make the HMM very simple to apply:

1. All sub-sources are stationary.

2. Sub-sources do not influence each other, i.e., any correlation over time is caused only by the hidden state sequence.

3. The state sequence is a time-invariant (also called homogeneous) Markov chain, i.e., the probability distribution of state S_t depends only on the previous state S_{t-1}, and this dependence is time-invariant.
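These three assumptions are easiest to see in a generator (a minimal sketch with my own toy parameters, not from the textbook):

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy 2-state HMM. A is the time-invariant transition matrix (factor 3);
# each state's output distribution is a fixed Gaussian (factor 1).
A = np.array([[0.9, 0.1],
              [0.2, 0.8]])
means = np.array([0.0, 5.0])

def sample_hmm(T, p0=(0.5, 0.5)):
    """Generate T observations; given the states, outputs are independent (factor 2)."""
    states, obs = [], []
    s = rng.choice(2, p=p0)
    for _ in range(T):
        states.append(s)
        obs.append(rng.normal(means[s], 1.0))  # output depends only on current state
        s = rng.choice(2, p=A[s])              # next state depends only on current state
    return np.array(states), np.array(obs)

states, obs = sample_hmm(100)
```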

△Types of Markov Processes

(Figure: subset relations among variants of Markov chains)

1. An HMM source can generate either a stationary or a non-stationary random process, because the hidden state sequence can be stationary or non-stationary, even though all the parameters defining the HMM are time-invariant.

Stationarity condition:
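A standard statement of the condition (my reconstruction, using the row convention $a_{ij}=P(S_{t+1}=j\mid S_t=i)$): the state sequence, and hence the HMM output, is stationary exactly when the initial state distribution $p_0$ is a stationary distribution of the transition matrix,

$$
p_0^{\mathsf{T}} A = p_0^{\mathsf{T}}.
$$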

2. If the transition matrix A is square (n×n), the HMM has infinite duration. If A is not square (there is an end state), the duration is finite (to model sequences with finite duration, we must introduce a special exit state).

3. Infinite-duration and ergodic are different properties. An ergodic chain must be able to reach every state. An example of a chain that is infinite-duration but not ergodic: one that may eventually get stuck in a single state forever.

irreducible + aperiodic ⇒ ergodic: it can be shown that an irreducible and aperiodic Markov chain with a finite number of states is guaranteed to be ergodic.

4. Left-right HMM: the state index never decreases at any allowed transition.

A left-right HMM can have either infinite or finite duration. If infinite, the chain will eventually be absorbed in the rightmost state (called the absorbing/final state).
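A toy illustration of points 2 and 4 together (my own numbers): a 3-state left-right HMM with an exit state has a non-square, upper-triangular A, where the fourth column holds the probability of leaving the model, giving finite duration:

$$
A=\begin{pmatrix}
0.8 & 0.2 & 0 & 0\\
0 & 0.7 & 0.3 & 0\\
0 & 0 & 0.9 & 0.1
\end{pmatrix}
$$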


△Forward & Backward Algorithm
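A minimal sketch of the scaled forward pass (my implementation and variable names, not the textbook's) for a discrete-output HMM; it computes log P(obs) in O(Tn²) time:

```python
import numpy as np

def forward(A, B, p0, obs):
    """Scaled forward algorithm.
    A:   (n, n) transition matrix, A[i, j] = P(S_{t+1} = j | S_t = i)
    B:   (n, m) output matrix,     B[i, k] = P(X_t = k | S_t = i)
    p0:  (n,)   initial state distribution
    obs: sequence of observation indices
    Returns the scaled forward variables and log P(obs).
    """
    n, T = len(p0), len(obs)
    alpha = np.zeros((T, n))
    c = np.zeros(T)                        # scale factors, c[t] = P(x_t | x_1..x_{t-1})
    a = p0 * B[:, obs[0]]
    c[0] = a.sum(); alpha[0] = a / c[0]
    for t in range(1, T):
        a = (alpha[t - 1] @ A) * B[:, obs[t]]
        c[t] = a.sum(); alpha[t] = a / c[t]
    return alpha, np.log(c).sum()          # log P(obs) is the sum of log scale factors
```

The backward pass has the same structure, run in reverse and reusing the same scale factors.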

△Viterbi Algorithm
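And a compact sketch of the Viterbi recursion in the same setting (my implementation; log domain to avoid underflow, so zero probabilities become -inf, which is harmless here):

```python
import numpy as np

def viterbi(A, B, p0, obs):
    """Most likely state sequence for a discrete-output HMM."""
    n, T = len(p0), len(obs)
    logA, logB = np.log(A), np.log(B)
    delta = np.log(p0) + logB[:, obs[0]]      # best log-prob of paths ending in each state
    psi = np.zeros((T, n), dtype=int)         # backpointers
    for t in range(1, T):
        scores = delta[:, None] + logA        # scores[i, j]: best path into i, then i -> j
        psi[t] = scores.argmax(axis=0)
        delta = scores.max(axis=0) + logB[:, obs[t]]
    path = [int(delta.argmax())]              # backtrack from the best final state
    for t in range(T - 1, 0, -1):
        path.append(int(psi[t][path[-1]]))
    return path[::-1]
```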

Chapter 7 EM Algorithm

Help function:

where x is known and S is unknown (hidden).
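The standard form of the help function (my reconstruction of the usual EM definition; the textbook's exact notation may differ) is the expected complete-data log-likelihood:

$$
Q(\theta,\theta') = E_{S\mid x,\theta'}\big[\ln p(x,S\mid\theta)\big].
$$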

Proof:

Therefore:
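In outline, this is the standard EM monotonicity argument (my reconstruction): decompose the log-likelihood as

$$
\ln p(x\mid\theta) = Q(\theta,\theta') + H(\theta,\theta'),\qquad
H(\theta,\theta') = -E_{S\mid x,\theta'}\big[\ln p(S\mid x,\theta)\big],
$$

and note that $H(\theta,\theta')\ge H(\theta',\theta')$ by Gibbs' inequality, so any $\theta$ that increases $Q(\theta,\theta')$ above $Q(\theta',\theta')$ also increases $\ln p(x\mid\theta)$.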

GMM:

ML estimate:

(S denotes male/female)
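For this example (a mixture over a hidden label S such as male/female), the EM updates take the standard GMM form (my notation, scalar observations):

E-step, the responsibilities:

$$
\gamma_t(k) = P(S_t=k\mid x_t,\theta') = \frac{w_k\,\mathcal{N}(x_t\mid\mu_k,\sigma_k^2)}{\sum_j w_j\,\mathcal{N}(x_t\mid\mu_j,\sigma_j^2)},
$$

M-step, the weighted ML re-estimates:

$$
w_k=\frac{1}{T}\sum_t \gamma_t(k),\qquad
\mu_k=\frac{\sum_t \gamma_t(k)\,x_t}{\sum_t \gamma_t(k)},\qquad
\sigma_k^2=\frac{\sum_t \gamma_t(k)\,(x_t-\mu_k)^2}{\sum_t \gamma_t(k)}.
$$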

Chapter 8 Bayesian Learning








 


Reposted from blog.csdn.net/m0_37622530/article/details/81143245