Qianfan large model platform upgrade: the most models and the most complete prompt templates in China

Hello everyone, I am Nezha.

On August 2, the Qianfan large model platform announced a major upgrade; the era of large models really is here.

Excited and itching to try it, I had to give it a go right away.

1. What has been upgraded on the Qianfan large model platform?

The Qianfan large model platform is the world's first one-stop, enterprise-grade large model platform launched by Baidu Smart Cloud. It provides enterprises with a full-process tool chain and a complete environment for large model development. On Qianfan, users can not only call the Wenxin Yiyan (ERNIE Bot) service directly, but also develop, deploy, and call their own large model services. Notably, the platform offers not only the full suite of Wenxin large models but also a range of third-party large models, positioning itself as a hub for large model production and distribution.

Aimed mainly at B-end (enterprise) customers, Qianfan covers the capabilities of Wenxin Yiyan and also provides the full set of Wenxin large models, plus third-party large models and their corresponding development tool chains.

In short, Qianfan fits squarely with the vision of Baidu founder Robin Li: to serve "as the underlying platform, to enable thousands of industries to realize intelligent transformation."

Qianfan now hosts 33 large models, including the full Llama 2 series, ChatGLM2-6B, RWKV-4-World, MPT-7B-Instruct, and Falcon-7B, making it the platform with the most large models in China. Every hosted model receives both performance and security enhancements, and model inference costs can be cut by 50%.

2. Qianfan hosts the most third-party large models in China, and model inference costs can be cut by 50%

1. Why does Qianfan integrate third-party large models?

The Qianfan platform is designed around enterprises' large model application needs and aims to give enterprise users a full-scenario, one-stop tool chain for large model development and services.

The open-source large model ecosystem is developing rapidly, and a large number of high-quality third-party models have emerged, each with differentiated strengths across task scenarios, parameter scales, and compute environments.

The Qianfan team selects high-quality third-party models from across the industry and integrates them seamlessly into the platform, so enterprise users can quickly experience, test, and access them as services, better meeting the needs of different business scenarios.

2. What do customers gain by calling third-party models through Qianfan instead of calling them directly?

(1) More reliable models

The third-party large models hosted on Qianfan are all curated by the platform and assessed against three main criteria: commercial usability, model quality, and model security.


(2) Stronger security

To keep the models used by enterprises and developers safe, Qianfan applies security enhancements to every integrated third-party model, ensuring content safety not only for the Wenxin large models but also for the output of third-party large models.

(3) Lower cost

Qianfan also applies a second round of performance optimization to each integrated large model. By improving throughput and shrinking model size, inference speed is greatly increased; by Baidu's estimates, an optimized model can be compressed to 25%-50% of its original size, and inference costs can be cut by 50%. Compared with calling the models directly, enterprises can therefore save substantially by using them on Qianfan.
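The article does not say which compression techniques Qianfan uses, but the 25%-50% range is consistent with ordinary weight quantization. A back-of-the-envelope sketch in Python (the parameter count and precisions are illustrative assumptions, not Qianfan's actual numbers):

```python
# Rough estimate of how quantization shrinks a model's weight footprint.
# The 7B parameter count and the precisions are illustrative only -- the
# article does not state which techniques Qianfan actually uses.

N_PARAMS = 7_000_000_000          # e.g. a 7B-parameter model such as Llama-2-7B
BYTES_PER_PARAM = {"fp16": 2.0, "int8": 1.0, "int4": 0.5}

fp16_gb = N_PARAMS * BYTES_PER_PARAM["fp16"] / 1e9
for precision, nbytes in BYTES_PER_PARAM.items():
    size_gb = N_PARAMS * nbytes / 1e9
    print(f"{precision}: {size_gb:5.1f} GB  ({size_gb / fp16_gb:.0%} of fp16)")

# fp16:  14.0 GB  (100% of fp16)
# int8:   7.0 GB  (50% of fp16)   <- the "50%" end of the quoted range
# int4:   3.5 GB  (25% of fp16)   <- the "25%" end of the quoted range
```

Smaller weights mean less memory traffic and fewer GPUs per instance, which is where the claimed inference-cost savings would come from.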


(4) A full tool chain and diverse model tuning options

Qianfan has deeply adapted each integrated large model and provides a complete tool chain for retraining, supporting multiple forms of tuning, including SFT (full-parameter fine-tuning, Prompt Tuning, LoRA) and reinforcement learning (reward model training, RL training). This helps enterprises and developers quickly retrain on top of a base large model and build an enterprise-specific one; a minimal LoRA sketch is shown below.
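Qianfan exposes these tuning options through its own console and tool chain. Purely to illustrate what LoRA fine-tuning looks like in code, here is a minimal sketch using the open-source Hugging Face transformers and peft libraries; the base model name and hyperparameters are assumptions, not Qianfan's settings:

```python
# A minimal LoRA fine-tuning sketch with Hugging Face transformers + peft.
# This is NOT Qianfan's tool chain -- the model name, target modules, and
# hyperparameters below are illustrative assumptions.
from transformers import AutoModelForCausalLM, AutoTokenizer
from peft import LoraConfig, TaskType, get_peft_model

base_model_name = "meta-llama/Llama-2-7b-hf"   # hypothetical base model
tokenizer = AutoTokenizer.from_pretrained(base_model_name)
model = AutoModelForCausalLM.from_pretrained(base_model_name)

# LoRA injects small trainable low-rank matrices into selected layers, so
# only a tiny fraction of parameters is updated during fine-tuning.
lora_config = LoraConfig(
    task_type=TaskType.CAUSAL_LM,
    r=8,                      # rank of the low-rank update matrices
    lora_alpha=32,            # scaling factor for the LoRA updates
    lora_dropout=0.05,
    target_modules=["q_proj", "v_proj"],  # typical choice for Llama-style models
)
model = get_peft_model(model, lora_config)
model.print_trainable_parameters()  # typically well under 1% of total parameters

# From here the adapter can be trained with a standard transformers Trainer
# on a small amount of domain data, then merged into the base model or served
# alongside it.
```

The appeal of LoRA is exactly what the article describes: a small amount of industry data and a small training budget are enough to specialize a base model.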

3. Qianfan has the most complete set of preset prompt templates in China: 103 in total

To apply to try out the Qianfan large model platform service: https://cloud.baidu.com/survey/qianfan.html

Qianfan has released 103 Prompt templates covering more than ten scenarios, including dialogue, programming, e-commerce, healthcare, games, translation, and speeches. They include templates distilled from Baidu Smart Cloud's industrial practice as well as templates contributed by heavy users of Wenxin Yiyan. A template can be filled in and sent straight to the large model, so the model quickly understands what we want; a good prompt greatly improves both interaction efficiency and output quality.
How about it? Let's give it a try.

Haha, exactly what I was after. It really is convenient.
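For developers who prefer to call the service programmatically rather than through the console, the idea is roughly the following. This is a hedged sketch: the endpoint URL, response fields, and template text are assumptions for illustration, so consult the official Qianfan documentation for the exact API.

```python
# Rough sketch: fill in a prompt template and send it to a chat endpoint.
# The endpoint URL, access token, response field, and template text are
# illustrative assumptions -- check the Qianfan docs for the real API.
import requests

# A "translation" style template: fixed instructions plus a slot for user input.
TEMPLATE = "Please translate the following text into English, keeping a formal tone:\n{text}"

def ask_model(user_text: str, access_token: str) -> str:
    url = (
        "https://aip.baidubce.com/rpc/2.0/ai_custom/v1/wenxinworkshop/chat/completions"
        f"?access_token={access_token}"
    )  # assumed endpoint path
    payload = {"messages": [{"role": "user", "content": TEMPLATE.format(text=user_text)}]}
    resp = requests.post(url, json=payload, timeout=30)
    resp.raise_for_status()
    return resp.json().get("result", "")  # assumed response field

# Example (with a real token):
# print(ask_model("千帆大模型平台今天发布升级。", access_token="YOUR_TOKEN"))
```

The template does the heavy lifting: the instructions stay fixed, and only the user's text changes from call to call.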

4. Summary

The Qianfan large model platform makes all of this genuinely easier. Starting from the base capabilities of a general large model, customers can inject a small amount of industry data and fine-tune their own large model at low cost, according to their own needs. It feels a bit like picking a ready-made template: very pleasant to use.

The resulting large model can be hosted directly on Baidu Smart Cloud. We simply use it, while Baidu Smart Cloud takes care of high availability, high performance, and high security, so enterprises don't have to worry about complicated deployment and operations.

Visit the Qianfan large model platform to apply for a trial, give it a spin, and you may well fall in love with it.
