Google's adversarial training of neural networks has been patented; what does BERT actually attend to? ... # 20200115

EDITORIAL

This series of articles shares the interesting cutting-edge events and open-source work from the AI community that I learn about every day; please get in touch before reposting.

Table of Contents

  • Google's adversarial training of neural networks has been patented
  • What does BERT actually attend to? Stanford's analysis of BERT's attention
  • New open-source community project release - Cortex v0.12: machine learning infrastructure for developers

Google's adversarial training of neural networks has been patented

Christian Szegedy and Ian Goodfellow were recently granted a patent on adversarial training of neural networks (US patent 10521718).
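For readers unfamiliar with the technique, adversarial training augments ordinary training with inputs that have been perturbed to increase the loss. Below is a minimal sketch of one such training step using the fast gradient sign method (FGSM) in PyTorch; the function name, hyperparameters, and clean-plus-adversarial loss mix are illustrative choices of mine, not a restatement of the patent's claims:

```python
# Minimal sketch of one adversarial-training step with FGSM (illustrative only).
import torch
import torch.nn.functional as F

def adversarial_training_step(model, optimizer, x, y, epsilon=0.1):
    # Compute the gradient of the loss with respect to the input.
    x_adv = x.clone().detach().requires_grad_(True)
    loss = F.cross_entropy(model(x_adv), y)
    loss.backward()

    # Perturb the input in the direction that increases the loss (FGSM).
    x_adv = (x_adv + epsilon * x_adv.grad.sign()).detach()

    # Train on a mix of clean and adversarial examples.
    optimizer.zero_grad()
    total_loss = F.cross_entropy(model(x), y) + F.cross_entropy(model(x_adv), y)
    total_loss.backward()
    optimizer.step()
    return total_loss.item()
```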

The news sparked a lively discussion on Reddit:

The original poster asked whether this would make companies and academics who use adversarial training of neural networks more dependent on Google, but no clear answer emerged.

There were some level-headed comments that I liked, such as this one from user ReginaldIII:

Like every other time this has happened, and every other time it has been posted to this subreddit, it's pointless.
Every time, the comments argue back and forth about whether it's good or bad, moral or immoral, along with other well-intentioned points, but in the end these are just people's subjective opinions.
And every time, the thread leaves us exactly where we started, because it has no effect whatsoever on the work that has been done or the work we will go on to do.
I don't like it (referring to the patent system), but I understand why others feel it's necessary. The patent system, the laws around it, and intellectual property rights have been abused and carved up by years of lobbying and money. Most people patent these concepts defensively, to guard against the patent system's serious flaws.

And this one from rhiyo:

This patent, like the dropout patent before it, will probably become prime fodder for clickbait headlines in the news media.

What does BERT actually attend to? Stanford's analysis of BERT's attention

Stanford recently published an analysis of BERT's attention, with both a paper and open-source code (Jupyter notebooks).

Code repository: https://github.com/clarkkev/attention-analysis
Paper: https://arxiv.org/abs/1906.04341

It includes code for extracting BERT's attention maps and writing them to disk, for the general analysis of BERT's attention (Sections 3 and 6 of the paper), and for comparing BERT's attention to dependency syntax (Sections 4.2 and 5). Code for the coreference resolution analysis (Section 4.3) will be added as soon as possible!
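The repo ships its own extraction scripts; as a quick illustration of what "extracting attention maps and writing them to disk" looks like, here is a minimal sketch using the HuggingFace transformers library (a recent version is assumed; the library choice, model name, and output file are my assumptions, not the repo's actual code):

```python
# Extract per-layer, per-head attention maps from BERT and save them to disk.
import pickle
import torch
from transformers import BertModel, BertTokenizer

tokenizer = BertTokenizer.from_pretrained("bert-base-uncased")
model = BertModel.from_pretrained("bert-base-uncased", output_attentions=True)
model.eval()

inputs = tokenizer("The cat sat on the mat.", return_tensors="pt")
with torch.no_grad():
    outputs = model(**inputs)

# outputs.attentions is a tuple with one tensor per layer, each of shape
# [batch, num_heads, seq_len, seq_len]; stack into [layers, heads, seq, seq].
attn = torch.stack(outputs.attentions).squeeze(1).numpy()

with open("attention_maps.pkl", "wb") as f:
    pickle.dump({"tokens": tokenizer.convert_ids_to_tokens(inputs["input_ids"][0]),
                 "attn": attn}, f)
```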

New open-source community project release - Cortex v0.12: machine learning infrastructure for developers

Cortex is a free, open-source model-serving infrastructure that lets developers deploy models as web APIs with minimal configuration. In other words, you don't need to know what backpropagation or a hidden layer is in order to turn a machine learning model into a web service (see the sketch after the links below).
Deploy machine learning models in production

Cortex has a detailed write-up on Medium: Cortex v0.12: Machine Learning Infrastructure for Developers

There is also a beginner-friendly NLP article: a list of NLP projects for beginners that use pre-trained models.

Its main features are as follows:

  • Framework agnostic: deploy models from any framework as production APIs.
  • Autoscaling: Cortex automatically scales your instances to handle fluctuations in traffic while minimizing cost.
  • GPU / CPU support: serve models on GPUs or CPUs.
  • Spot instances: Cortex supports deployment on spot instances, which can cut cloud costs by up to 90%.

GitHub link: https://github.com/cortexlabs/cortex/
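As referenced above, here is a rough sketch of the kind of serving boilerplate that Cortex abstracts away: a hand-rolled web API wrapped around a pickled model. The framework (Flask), file name, and request schema are all assumptions for illustration; this is not Cortex's own interface.

```python
# A hand-rolled model-serving web API -- the pattern Cortex automates,
# minus autoscaling, GPU scheduling, and spot-instance management.
import pickle
from flask import Flask, jsonify, request

app = Flask(__name__)

# Load a pre-trained model once at startup (hypothetical file name).
with open("model.pkl", "rb") as f:
    model = pickle.load(f)

@app.route("/predict", methods=["POST"])
def predict():
    # Expect a JSON body like {"features": [5.1, 3.5, 1.4, 0.2]}.
    features = request.get_json()["features"]
    prediction = model.predict([features])[0]
    return jsonify({"prediction": str(prediction)})

if __name__ == "__main__":
    app.run(host="0.0.0.0", port=8080)
```

You could then POST JSON to http://localhost:8080/predict; Cortex's value is producing this kind of endpoint, plus scaling and GPU support, from a declarative config rather than hand-written server code.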

Epilogue

This series of articles is updated regularly with interesting events and open-source work from the AI community; follow along so you don't miss out.

In addition, we plan to start an NLP discussion group; interested readers can contact echoooo741 and reply "NLP" to join.
