A long article published in English: Semantic Information G Theory and Logical Bayesian Inference for Statistical Learning

[This post is Lu Chenguang's reflections, written after he published a paper on semantic information theory and statistical learning. It first appeared on ScienceNet; it is excerpted and recommended here.]

2019-8-21 01:04 | Personal category: mathematics and philosophy of information | System category: paper exchange | Keywords: information theory, semantic information, machine learning, induction, philosophy of science

This article is a summary of my research over the last five years, built on my earlier findings.

The journal: Information: https://www.mdpi.com/journal/information

The article: https://www.mdpi.com/2078-2489/10/8/261

It is an English open-access journal with no length limit. The method I use, the semantic information method, is distinctive, and Information suited me well. The paper went through two rounds of peer review; I honestly followed the reviewers' requirements and responded to them one by one. Whatever I did not know, I had to learn the hard way.

Luciano Floridi and Wolfgang Johannsen, well-known Western researchers of semantic information theory, have also published articles in it.

Floridi is also on the editorial board of Information's Information Theory and Methodology section.

Interested readers can compare our approaches.

The article runs to 30 pages; I am very pleased, since that is equivalent to publishing four ordinary papers. An earlier English article of mine, at http://www.survivor99.com/lcg/english/information/GIT/index.htm, was also very long, more than 40 pages. New ideas like these are hard to present briefly; my shorter versions were only published at conferences, and new ideas that are not presented comprehensively are difficult to understand.



The background of my article consists of two parts:

1. From Shannon's information theory to the semantic information G theory, also touching on others' semantic information theories (including those of Professor Floridi and Zhong Yixin);

2. From traditional Bayesian prediction to Logical Bayesian Inference, mainly posing a challenge to standard Bayesian Inference.

The methods include four channel-matching algorithms of my own devising:

1. Matching the semantic channel to the Shannon channel: a simple method for solving the multi-label learning function, i.e., the membership function (see the sketch after this list). Its most important application, however, is solving the degree of confirmation of if-then rules.

2. Matching the two channels: solving multi-label classification; simpler than many popular methods.

3. Repeatedly matching the two channels with each other: an iterative algorithm for maximum-mutual-information classification. Solving the classification or estimation that maximizes mutual information for a given feature is a problem left over by Shannon's classical information theory.

4. Matching the two channels with each other so as to maximize the communication efficiency G/R: solving mixture models. Incidentally, this proves that the theoretical basis of the EM algorithm for mixture models is wrong.
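For item 1, here is a minimal sketch of how the semantic channel can match the Shannon channel. It assumes the matching rule is to take each label's transition probability function P(yj|x) and normalize it by its maximum over x, so that each membership function peaks at 1; the function names and the toy data are mine, not the paper's.

```python
import numpy as np

def shannon_channel(joint_pxy):
    """P(y|x) from a joint distribution P(x, y); rows index x, columns index y."""
    px = joint_pxy.sum(axis=1, keepdims=True)
    return joint_pxy / px

def match_semantic_channel(p_y_given_x):
    """Membership functions T(theta_j|x): each column P(y_j|x), scaled so its
    maximum over x equals 1 (the assumed matching rule)."""
    return p_y_given_x / p_y_given_x.max(axis=0, keepdims=True)

# Toy example: 4 values of x (e.g., age groups), 2 labels y1 and y2.
joint = np.array([[0.30, 0.02],
                  [0.20, 0.08],
                  [0.06, 0.14],
                  [0.02, 0.18]])
P_y_x = shannon_channel(joint)
T = match_semantic_channel(P_y_x)
print(np.round(T, 3))  # each membership function now peaks at 1
```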

Many examples are provided in the article; the supplementary material also provides Python 3.6 programs for these algorithms, all written by myself. The methods forced me to learn Python programming; fortunately, I am an old programmer.

For machine learning, classification methods 2 and 3 are very practical; the most difficult task was solving the mixture model, in particular proving that the iterations converge (for reference, a sketch of the standard EM baseline follows below). But of the greatest theoretical significance is the new degree of confirmation b* that the article provides.
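To make the baseline criticized in item 4 concrete, here is a short sketch of the standard EM algorithm for a two-component Gaussian mixture. This is the textbook algorithm, not the paper's channel-matching iteration; the synthetic data and initial values are mine.

```python
import numpy as np

rng = np.random.default_rng(0)
# Synthetic sample from a two-component Gaussian mixture.
x = np.concatenate([rng.normal(-2.0, 1.0, 300), rng.normal(3.0, 1.5, 700)])

# Initial guesses for weights, means, and standard deviations.
w, mu, sigma = np.array([0.5, 0.5]), np.array([-1.0, 1.0]), np.array([1.0, 1.0])

def gauss(x, mu, sigma):
    return np.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * np.sqrt(2 * np.pi))

for step in range(200):
    # E-step: posterior responsibility of each component for each point.
    p = w * gauss(x[:, None], mu, sigma)          # shape (n, 2)
    r = p / p.sum(axis=1, keepdims=True)
    # M-step: re-estimate the parameters from the responsibilities.
    n_k = r.sum(axis=0)
    w = n_k / len(x)
    mu = (r * x[:, None]).sum(axis=0) / n_k
    sigma = np.sqrt((r * (x[:, None] - mu) ** 2).sum(axis=0) / n_k)

print(np.round(w, 3), np.round(mu, 3), np.round(sigma, 3))
```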

Induction is a long-standing hard problem. Since induction to absolutely correct universal hypotheses has been refuted, the problem of induction has evolved into the problem of induction to hypotheses that are not entirely correct, that is, the problem of computing degrees of confirmation. Falsificationists such as Popper have tried to solve this problem. Early advocates of Bayesian logic, Carnap and Keynes, attempted to express the degree of confirmation by logical probability or logical conditional probability (varying between 0 and 1), but most modern advocates of induction express it by inductive credibility or support (varying between -1 and 1); see http://www.fitelson.org/probability/comp.pdf.

My degree of confirmation also varies between -1 and 1, but it differs from the popular ones: the popular degrees of confirmation depend on whether there are more positive examples, while my b* depends on whether there are fewer counterexamples, which is compatible with Popper's idea of falsification (an illustrative sketch follows). To convince everyone, there is still work to be done.
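As an illustration of a counterexample-sensitive measure in this spirit, the sketch below assumes a likelihood-ratio form, b = (LR - 1)/max(LR, 1) with LR = P(e|h)/P(e|not-h). This form is my assumption for illustration only and may not be the exact b* formula in the paper; it does lie in [-1, 1] and approaches 1 only as counterexamples vanish.

```python
def confirmation(p_e_given_h, p_e_given_not_h):
    """Counterexample-sensitive confirmation in [-1, 1] (illustrative form,
    not necessarily the paper's b*): (LR - 1) / max(LR, 1)."""
    if p_e_given_not_h == 0:          # no counterexamples at all
        return 1.0
    lr = p_e_given_h / p_e_given_not_h
    return (lr - 1) / lr if lr >= 1 else lr - 1

print(confirmation(0.9, 0.09))  # few counterexamples -> close to 1
print(confirmation(0.9, 0.90))  # evidence uninformative -> 0
print(confirmation(0.09, 0.9))  # mostly counterexamples -> close to -1
```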

I believe that my confirmation formula will sooner or later be accepted by most people, but its fate may be like that of my color vision model (http://blog.sciencenet.cn/home.php?mod=space&uid=2056&do=blog&id=1160412).

Academician Xu has called for research on algorithms: http://www.sohu.com/a/312151330_680938

My efforts go exactly in that direction. Yet publishing such an article in a domestic journal is really not easy. My article on maximum-mutual-information classification has been rejected repeatedly. I do not know whether the reviewers realized that this is a problem neither Shannon nor his successors have solved; it should be a diamond on the crown of information theory!

For a Chinese draft, see: http://www.survivor99.com/lcg/CM/Homepage-NewFrame.pdf

For more on semantic information theory and statistical learning, see: http://www.survivor99.com/lcg/books/GIT/

Comments and exchanges are welcome.

