[Book Gift-Issue 31] "TVM Compiler Principles and Practice"

Preface

Recommended audience: Engineers and technicians engaged in the development of AI algorithms, software, AI chips, and compilers.
Artificial Intelligence (AI) is now widely used across the global information industry. Deep learning frameworks such as TensorFlow, PyTorch, MXNet, and Caffe have driven this technology revolution, but most existing frameworks are optimized only for a narrow range of server-class GPUs, so deploying models on other platforms (automobiles, mobile phones, IoT devices, and specialized accelerators such as FPGAs and ASICs) requires substantial additional optimization effort. As the number of deep learning models and hardware backends grows, TVM offers a unified solution built on an intermediate representation (IR): it automatically optimizes deep learning models and provides an efficient, open-source, cross-platform deployment framework. With large models steadily gaining popularity, TVM serves as a good bridge from AI theory and algorithm frameworks to practical engineering, so this book should appeal to a wide readership.

About the Author

Wu Jianming holds a PhD in Pattern Recognition and Intelligent Systems from Shanghai Jiao Tong University and has long been engaged in artificial intelligence chip design. He specializes in theoretical research and technological innovation in TVM/LLVM compilers, AI frameworks, autonomous driving, chip manufacturing, embedded systems, and related fields. He has worked on the front line for many years, covering both product design and code implementation, and has led or participated in the development of more than 30 products. He has also taken part in projects funded by the National Natural Science Foundation of China and the Shanghai Municipal Science and Technology Commission, and has published 8 papers in core journals, 6 of them as first author.

Brief Introduction

TVM (Tensor Virtual Machine) is an open-source model compilation framework that automatically compiles machine learning models into machine code executable by the underlying hardware, making diverse kinds of computing power usable. It works by first optimizing a deep learning model's inference, memory management, and thread scheduling, and then using the LLVM framework to deploy the model on hardware such as CPUs, GPUs, FPGAs, and ARM devices.
This book comprehensively analyzes the main functions of TVM, helps readers understand the working principle of TVM, and uses TVM to optimize and deploy deep learning and machine learning.
This book draws on the author's many years of work and study, striving to integrate TVM fundamentals with case practice in its explanations. It has 9 chapters in total, covering TVM basics, developing with TVM, operator fusion and graph optimization, TVM quantization, TVM scheduling optimization, Relay IR, code generation, backend deployment and OpenCL (Open Computing Language), and auto-scheduling, auto-search, and cost models. In addition to key knowledge points and practical techniques, each chapter includes carefully selected typical cases.
This book is suitable for reading by engineering technicians, scientific research staff, and technical managers engaged in AI algorithms, software, compiler development, and hardware development. It can also be used as a reference book for teachers and students in colleges and universities related to compilers.
Book purchase link

Lottery method

Like and bookmark this article, then leave a comment such as "Life is short, refuse to be involved" or anything else (only commenters enter the prize pool, and each person may leave at most three comments).
Five winners will be randomly drawn at 8 p.m. on Friday.

Communication

Friends interested in the software qualification exams can join the blogger's communication groups. There are currently four: software designers, senior experts, system architects, and system analysts.

Past exam papers, e-books, and other materials are available in the groups for you to download yourself;
there is no marketing, they are purely communication groups;
book giveaways are held twice a week, three books each time, with free shipping to your home.
Group entrance


Origin blog.csdn.net/weixin_50843918/article/details/135357762