C#, Image Binarization (15): One-Dimensional Maximum Entropy (1D Maxent) Global Thresholding Algorithm and Source Code

1. Maximum entropy (maxent)

Maximum entropy (maxent) methods are rooted in information theory and have been successfully applied in many fields, including physics and natural language processing. The approach builds the model that best explains the available data, under the constraint that, absent any additional information, the model should maximize entropy. In other words, the model favors the uniform distribution by maximizing the conditional entropy. Maximum entropy models were originally developed by Berger et al. (1996) for natural language applications such as information retrieval and speech recognition. Jeon and Manmatha (2004) adapted the model for images, and their derivation applies maxent to the problem of shape classification.


The principle of maximum entropy is a rule that allows us to choose a "best" one from a number of different probability distributions, all of which represent the current state of knowledge. It tells us that the best choice is the one with the highest entropy.

This will be the system with the greatest remaining uncertainty, and by selecting it you can ensure that no additional bias or unnecessary assumptions have been added to your analysis.

We know that physical systems tend toward maximum-entropy configurations over time, so a maximum-entropy distribution is more likely to represent a system accurately than a more ordered one.

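As a quick standalone illustration (a minimal sketch, not part of the original source program), the Shannon entropy of a uniform distribution exceeds that of any skewed distribution over the same outcomes, which is exactly why maxent "prefers" the uniform choice:

```csharp
using System;

public class EntropyDemo
{
    // Shannon entropy in bits: H(p) = -sum_i p_i * log2(p_i), skipping zero bins
    public static double Entropy(double[] p)
    {
        double h = 0.0;
        foreach (double pi in p)
        {
            if (pi > 0.0) h -= pi * Math.Log(pi, 2.0);
        }
        return h;
    }

    public static void Main()
    {
        double[] uniform = { 0.25, 0.25, 0.25, 0.25 }; // maximal uncertainty
        double[] skewed  = { 0.70, 0.10, 0.10, 0.10 }; // partially "ordered"

        Console.WriteLine(Entropy(uniform)); // 2 bits: four equally likely outcomes
        Console.WriteLine(Entropy(skewed));  // strictly less than 2 bits
    }
}
```

The uniform distribution reaches the theoretical maximum log2(4) = 2 bits; any deviation from uniformity lowers the entropy.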

2. Grayscale image binarization, global algorithm, one-dimensional maximum entropy threshold algorithm source code

For an overview of binarization algorithms, please read:

C#, image binarization (01) - a review of binarization algorithms and a catalog of twenty-three algorithms https://blog.csdn.net/beijinghorn/article/details/128425225?spm=1001.2014.3001.5502

For supporting functions, please read:

C#, image binarization (02) - C# source code of some basic image processing functions for image binarization https://blog.csdn.net/beijinghorn/article/details/128425984?spm=1001.2014.3001.5502

using System;
using System.Linq;
using System.Text;
using System.Drawing;
using System.Collections;
using System.Collections.Generic;
using System.Drawing.Imaging;
 
namespace Legalsoft.Truffer.ImageTools
{
    public static partial class BinarizationHelper
    {
        #region Grayscale image binarization, global algorithm, 1D maximum entropy

        /// <summary>
        /// One-dimensional maximum entropy threshold (also known as Kapur's method):
        /// selects the gray level that maximizes the combined entropy of the
        /// background and foreground histogram classes.
        /// </summary>
        /// <param name="histogram">256-bin gray-level histogram of the image</param>
        /// <returns>The computed binarization threshold</returns>
        public static int Maxium_Entropy_1D_Threshold(int[] histogram)
        {
            int MinValue = Histogram_Left(histogram);
            int MaxValue = Histogram_Right(histogram);
            double[] HistGramD = Histogram_Normalize(histogram);

            int Threshold = 0;
            double MaxEntropy = double.MinValue;
            for (int i = MinValue + 1; i < MaxValue; i++)
            {
                // Cumulative probability of the background class [MinValue, i]
                double SumIntegral = 0.0;
                for (int j = MinValue; j <= i; j++)
                {
                    SumIntegral += HistGramD[j];
                }
                // Skip degenerate splits where one class is empty
                if (SumIntegral <= 0.0 || SumIntegral >= 1.0)
                {
                    continue;
                }

                // Entropy of the background class
                double EntropyBack = 0.0;
                for (int j = MinValue; j <= i; j++)
                {
                    if (HistGramD[j] <= 0.0) continue;   // avoid log(0)
                    EntropyBack += (-HistGramD[j] / SumIntegral * Math.Log(HistGramD[j] / SumIntegral));
                }

                // Entropy of the foreground class
                double EntropyFore = 0.0;
                for (int j = i + 1; j <= MaxValue; j++)
                {
                    if (HistGramD[j] <= 0.0) continue;   // avoid log(0)
                    EntropyFore += (-HistGramD[j] / (1.0 - SumIntegral) * Math.Log(HistGramD[j] / (1.0 - SumIntegral)));
                }

                if ((EntropyBack + EntropyFore) > MaxEntropy)
                {
                    Threshold = i;
                    MaxEntropy = EntropyBack + EntropyFore;
                }
            }
            return Threshold;
        }

        public static void Maxium_Entropy_1D_Algorithm(byte[,] data)
        {
            int[] histogram = Gray_Histogram(data);
            int threshold = Maxium_Entropy_1D_Threshold(histogram);
            Threshold_Algorithm(data, threshold);
        }

        #endregion

    }
}
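The listing above depends on helper functions (`Histogram_Left`, `Histogram_Right`, `Histogram_Normalize`, `Gray_Histogram`, `Threshold_Algorithm`) published in part 02 of this series. The self-contained sketch below re-implements the same 1D maximum-entropy criterion with minimal stand-ins for those helpers (their names and behavior here are assumptions, not the originals), so the threshold search can be tried on a synthetic bimodal histogram:

```csharp
using System;
using System.Linq;

public class MaxEntropyDemo
{
    // Minimal stand-ins for the series' part-02 helpers (assumed behavior):
    // first and last non-empty histogram bins, and normalization to probabilities.
    public static int HistogramLeft(int[] h)
    {
        for (int i = 0; i < h.Length; i++) if (h[i] > 0) return i;
        return 0;
    }

    public static int HistogramRight(int[] h)
    {
        for (int i = h.Length - 1; i >= 0; i--) if (h[i] > 0) return i;
        return h.Length - 1;
    }

    public static double[] HistogramNormalize(int[] h)
    {
        double total = h.Sum();
        return h.Select(v => v / total).ToArray();
    }

    // Same 1D maximum-entropy criterion as the article's
    // Maxium_Entropy_1D_Threshold, with zero bins skipped to avoid log(0).
    public static int MaxEntropyThreshold(int[] histogram)
    {
        int lo = HistogramLeft(histogram);
        int hi = HistogramRight(histogram);
        double[] p = HistogramNormalize(histogram);

        int threshold = lo;
        double best = double.MinValue;
        for (int t = lo + 1; t < hi; t++)
        {
            double pBack = 0.0;                      // background class probability
            for (int j = lo; j <= t; j++) pBack += p[j];
            if (pBack <= 0.0 || pBack >= 1.0) continue;

            double hBack = 0.0, hFore = 0.0;
            for (int j = lo; j <= t; j++)
                if (p[j] > 0.0) hBack -= p[j] / pBack * Math.Log(p[j] / pBack);
            for (int j = t + 1; j <= hi; j++)
                if (p[j] > 0.0) hFore -= p[j] / (1.0 - pBack) * Math.Log(p[j] / (1.0 - pBack));

            if (hBack + hFore > best)
            {
                best = hBack + hFore;
                threshold = t;
            }
        }
        return threshold;
    }

    public static void Main()
    {
        // Synthetic bimodal histogram: dark peak at 40..60, bright peak at 190..210
        int[] hist = new int[256];
        for (int g = 40; g <= 60; g++) hist[g] = 100;
        for (int g = 190; g <= 210; g++) hist[g] = 100;

        int t = MaxEntropyThreshold(hist);
        Console.WriteLine(t); // a threshold separating the two peaks
    }
}
```

With two equal flat peaks, the total entropy is maximized when each class holds one whole peak, so the returned threshold falls at or beyond the upper edge of the dark peak and before the bright one.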
 

3. Grayscale image binarization, global algorithm, one-dimensional maximum entropy threshold algorithm: results

The results are fairly mediocre.

Origin blog.csdn.net/beijinghorn/article/details/128522175