Create a text-generating web application using 100% Python (NLP)

Harness GPT-Neo, a natural language processing (NLP) text generation model, and demo it with a 100% Python web app.

English name of the course: Create a Text Generation Web App with 100% Python (NLP)

This video tutorial is 882 MB in total, with Chinese and English subtitles, clear video quality, no watermark, and the full source code included as attachments.

Course page: https://xueshu.fun/1462 | Original course: https://www.udemy.com/course/nlp-text-generation-python-web-app/

Course content

What you will learn

  • How to implement a state-of-the-art text generation AI model
  • Background information on GPT-Neo, a state-of-the-art NLP model for text generation
  • How to use Happy Transformer - a Python library for implementing NLP Transformer models (a minimal usage sketch follows this list)
  • How to train/implement GPT-2
  • How to implement different text generation algorithms
  • How to get data using Hugging Face's dataset library
  • How to train GPT-Neo with Happy Transformer
  • How to create 100% Python web applications with Anvil
  • How to host Transformer models on Paperspace
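
For orientation, here is a minimal sketch of what generating text with GPT-Neo through Happy Transformer looks like. It is not taken from the course; the checkpoint name (EleutherAI/gpt-neo-125M) and the generation settings are illustrative assumptions.

```python
# Minimal sketch: GPT-Neo text generation via Happy Transformer (assumed setup).
# pip install happytransformer
from happytransformer import HappyGeneration, GENSettings

# Load a small GPT-Neo checkpoint; larger ones (1.3B, 2.7B) use the same API.
happy_gen = HappyGeneration("GPT-NEO", "EleutherAI/gpt-neo-125M")

# Sampling-based decoding; top_k and temperature control how varied the output is.
settings = GENSettings(do_sample=True, top_k=50, temperature=0.7, max_length=50)

result = happy_gen.generate_text("Artificial intelligence is", args=settings)
print(result.text)  # the generated continuation
```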

This course includes:

  • 2 hours of video on demand
  • 6 articles
  • Lifetime access
  • Access on mobile and TV
  • Assignments

Requirements

  • Solid understanding of basic Python syntax
  • A Google account (for Google Colab)

Description

GPT-3 is a state-of-the-art text-generating natural language processing (NLP) model created by OpenAI. You can use it to generate text that reads as if a human had written it.

This course shows you how to create a web application that uses GPT-Neo, the open-source version of GPT-3, with 100% Python. That's right: no HTML, JavaScript, CSS, or any other programming language is required. Just 100% Python!

You will learn how to:

  1. Implement GPT-Neo (and GPT-2) with Happy Transformer
  2. Train GPT-Neo to generate unique text for a specific domain (a rough fine-tuning sketch follows this list)
  3. Create web applications using 100% Python and Anvil!
  4. Host your language model with Google Colab and Paperspace
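
As a rough preview of steps 1-2 (and of pulling data with Hugging Face's datasets library), the sketch below fine-tunes GPT-Neo on a plain text file with Happy Transformer's train() method. The dataset name, file path, and training arguments are illustrative assumptions, not the course's exact choices.

```python
# Rough sketch: fine-tune GPT-Neo on domain text with Happy Transformer (assumed setup).
# pip install happytransformer datasets
from datasets import load_dataset
from happytransformer import HappyGeneration, GENTrainArgs

# Fetch some example text with Hugging Face's datasets library (dataset name is illustrative).
dataset = load_dataset("ag_news", split="train[:200]")

# Happy Transformer trains from a plain text file, one training case per line.
with open("train.txt", "w", encoding="utf-8") as f:
    for case in dataset:
        f.write(case["text"].replace("\n", " ") + "\n")

happy_gen = HappyGeneration("GPT-NEO", "EleutherAI/gpt-neo-125M")
happy_gen.train("train.txt", args=GENTrainArgs(num_train_epochs=1))

# The fine-tuned model now continues prompts in the style of the training text.
print(happy_gen.generate_text("Breaking news:").text)
```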

Installation:

None! All the tools we use in this tutorial are web-based: Google Colab, Anvil, and Paperspace. So whether you're on Mac, Windows, or Linux, you don't have to worry about downloading any software.

Technologies:

  1. Model: GPT-Neo - open source version of GPT-3 created by Eleuther AI
  2. Framework: Happy Transformer - an open source Python package that allows us to implement and train GPT-Neo with just a few lines of code
  3. Web Technologies: Anvil - a website that allows us to develop web applications using Python
  4. Backend technology: We will cover how to use Google Colab and Paperspace to host the model; Anvil hosts the web app itself automatically (see the Uplink sketch after this list).
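
To give a feel for the backend piece, the usual pattern is Anvil's Uplink: the notebook running on Google Colab or Paperspace connects to your Anvil app with an Uplink key and exposes a callable function that the 100% Python front end invokes. A hedged sketch follows; the key string and function name are placeholders, not the course's exact code.

```python
# Sketch: serving the model to an Anvil web app over Anvil Uplink
# from a Google Colab or Paperspace notebook (placeholders marked below).
# pip install anvil-uplink happytransformer
import anvil.server
from happytransformer import HappyGeneration

happy_gen = HappyGeneration("GPT-NEO", "EleutherAI/gpt-neo-125M")

# Connect this process to your Anvil app (the key is a placeholder).
anvil.server.connect("YOUR-UPLINK-KEY")

@anvil.server.callable
def generate(prompt):
    # The Anvil client code (also Python) calls this function over the Uplink.
    return happy_gen.generate_text(prompt).text

# Keep the process alive so the web app can keep calling the function.
anvil.server.wait_forever()
```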

About the instructor:

My name is Eric Fillion and I am from Canada. My mission is to make state-of-the-art advances in NLP more accessible by creating open-source tools and producing educational content. In early 2020, I led a team that launched an open-source Python package called Happy Transformer. Happy Transformer allows programmers to implement and train state-of-the-art Transformer models with just a few lines of code. It has won awards and has been downloaded over 13,000 times since its release.

Requirements:

  • Basic understanding of Python
  • A Google account - for Google Colab

Who this course is for:

  • Python developers interested in AI and NLP

Academic Fun (https://xueshu.fun/) continuously adds video tutorials from online learning platforms such as Udemy and Coursera. Categories cover artificial intelligence, machine learning, programming languages, game development, network security, cloud computing, Linux operations, interview skills, and other areas of computer science.

All video tutorials include Chinese and English subtitles, exercise source code and supporting supplementary materials.

Source: blog.csdn.net/duoshehuan6005/article/details/130038183