Machine learning algorithm notes: EM and the GMM model

GMM stands for Gaussian mixture model. Below, the GMM parameter updates are derived from the theoretical formulas of the EM framework:

    X is a random variable whose distribution is a mixture of K Gaussians. Let the probability of drawing from each Gaussian be φ_1, φ_2, ..., φ_K, and let the j-th Gaussian have mean μ_j and covariance Σ_j. Given a series of observed samples x_1, x_2, ..., x_n of the random variable X, estimate the parameters φ, μ, Σ.
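    Concretely, letting z_i ∈ {1, ..., K} denote the unobserved component label of sample x_i (z_i and the responsibilities w_{ij} used below are notation chosen here for the derivation), the mixture model can be written as

        p(z_i = j) = \phi_j, \qquad
        x_i \mid z_i = j \sim \mathcal{N}(\mu_j, \Sigma_j), \qquad
        p(x_i) = \sum_{j=1}^{K} \phi_j \, \mathcal{N}(x_i \mid \mu_j, \Sigma_j).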

    E-step
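    In this notation, the E-step computes the posterior responsibility of component j for sample x_i under the current parameters, which for a Gaussian mixture takes the standard form

        w_{ij} = p(z_i = j \mid x_i; \phi, \mu, \Sigma)
               = \frac{\phi_j \, \mathcal{N}(x_i \mid \mu_j, \Sigma_j)}{\sum_{l=1}^{K} \phi_l \, \mathcal{N}(x_i \mid \mu_l, \Sigma_l)}.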

    

 M-step

    Substitute the mixing proportions and the parameters of the K Gaussian distributions into the general EM objective:
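    With d denoting the dimension of x, the objective to be maximized in the M-step (the EM lower bound with the responsibilities w_{ij} held fixed) is

        \sum_{i=1}^{n} \sum_{j=1}^{K} w_{ij} \log \frac{\phi_j \, \mathcal{N}(x_i \mid \mu_j, \Sigma_j)}{w_{ij}}
        = \sum_{i=1}^{n} \sum_{j=1}^{K} w_{ij} \log\!\left[ \frac{1}{(2\pi)^{d/2} |\Sigma_j|^{1/2}}
          \exp\!\left( -\tfrac{1}{2} (x_i - \mu_j)^{\top} \Sigma_j^{-1} (x_i - \mu_j) \right) \frac{\phi_j}{w_{ij}} \right].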

    

    Take the partial derivative with respect to the mean:
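    Only the quadratic term depends on the mean, so differentiating the objective above with respect to \mu_l gives

        \nabla_{\mu_l} \sum_{i=1}^{n} \sum_{j=1}^{K} w_{ij}
          \left( -\tfrac{1}{2} (x_i - \mu_j)^{\top} \Sigma_j^{-1} (x_i - \mu_j) \right)
        = \sum_{i=1}^{n} w_{il} \, \Sigma_l^{-1} (x_i - \mu_l).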

     

    Setting the above expression equal to 0 and solving for the mean gives:
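    That is, the update is the responsibility-weighted sample mean

        \mu_l = \frac{\sum_{i=1}^{n} w_{il} \, x_i}{\sum_{i=1}^{n} w_{il}}.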

    

    For the covariance of each Gaussian distribution, likewise set the partial derivative equal to 0, which gives:
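    This yields the responsibility-weighted sample covariance (the intermediate matrix calculus is omitted here, as in the text above):

        \Sigma_j = \frac{\sum_{i=1}^{n} w_{ij} \, (x_i - \mu_j)(x_i - \mu_j)^{\top}}{\sum_{i=1}^{n} w_{ij}}.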

    

    Finally, consider the mixing proportion parameters φ of the distribution. Viewed as a function of φ, the M-step objective is:
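    Restating the M-step objective with the logarithm split over its factors (the same expression as above, merely regrouped), we have

        \sum_{i=1}^{n} \sum_{j=1}^{K} w_{ij}
          \left[ \log \mathcal{N}(x_i \mid \mu_j, \Sigma_j) + \log \phi_j - \log w_{ij} \right].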

    

    Dropping the terms that do not depend on φ, we get:
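    In the notation above, what remains to be maximized over \phi is

        \sum_{i=1}^{n} \sum_{j=1}^{K} w_{ij} \log \phi_j.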

    

        Lagrange multiplier method:

        Since the mixing probabilities of the distribution must sum to 1, set up the Lagrangian:
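        With \beta as the Lagrange multiplier (a symbol chosen here), the Lagrangian reads

            \mathcal{L}(\phi) = \sum_{i=1}^{n} \sum_{j=1}^{K} w_{ij} \log \phi_j
            + \beta \left( \sum_{j=1}^{K} \phi_j - 1 \right).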

        

 

        The φ_j obtained this way turns out to be non-negative automatically, so the constraint φ_j ≥ 0 need not be imposed explicitly; set the partial derivatives equal to 0:
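        Setting each partial derivative to zero gives

            \frac{\partial \mathcal{L}}{\partial \phi_j}
            = \sum_{i=1}^{n} \frac{w_{ij}}{\phi_j} + \beta = 0
            \quad\Rightarrow\quad
            \phi_j = -\frac{1}{\beta} \sum_{i=1}^{n} w_{ij}.

        Summing over j and using \sum_{j} \phi_j = 1 together with \sum_{j} w_{ij} = 1 shows that -\beta = \sum_{i}\sum_{j} w_{ij} = n, hence

            \phi_j = \frac{1}{n} \sum_{i=1}^{n} w_{ij}.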

        

This completes the derivation of the EM algorithm for the GMM.
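As a minimal sketch of the updates derived above (a NumPy/SciPy illustration written for these notes, not the original author's code; the function name em_gmm and all variable names are chosen here), assuming X is an (n, d) array of samples:

    import numpy as np
    from scipy.stats import multivariate_normal

    def em_gmm(X, K, n_iter=100, seed=0):
        """Illustrative EM for a K-component Gaussian mixture on samples X of shape (n, d)."""
        n, d = X.shape
        rng = np.random.default_rng(seed)

        # Simple initialisation: uniform weights, K random samples as means,
        # shared sample covariance for every component.
        phi = np.full(K, 1.0 / K)
        mu = X[rng.choice(n, size=K, replace=False)]
        sigma = np.array([np.cov(X, rowvar=False) + 1e-6 * np.eye(d) for _ in range(K)])

        for _ in range(n_iter):
            # E-step: w[i, j] = phi_j * N(x_i | mu_j, Sigma_j), then normalise each row.
            w = np.column_stack([
                phi[j] * multivariate_normal.pdf(X, mean=mu[j], cov=sigma[j])
                for j in range(K)
            ])
            w /= w.sum(axis=1, keepdims=True)

            # M-step: the closed-form updates derived above.
            Nk = w.sum(axis=0)                    # sum_i w_ij for each component j
            phi = Nk / n                          # phi_j = (1/n) * sum_i w_ij
            mu = (w.T @ X) / Nk[:, None]          # responsibility-weighted means
            for j in range(K):
                diff = X - mu[j]
                sigma[j] = (w[:, j, None] * diff).T @ diff / Nk[j] + 1e-6 * np.eye(d)

        return phi, mu, sigma

The small 1e-6 ridge added to each covariance is only there to keep the matrices invertible if a component collapses onto very few points.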


Origin www.cnblogs.com/yang901112/p/11621452.html