ai_huggingFace Practice

Preface:

AI is advancing rapidly, and we need to take the initiative to understand and apply it.

Idea:

I want to integrate some interesting Hugging Face models into a WeChat mini program and build a toolkit out of them.

How to do it:

On Hugging Face you can search for a specific model, e.g. microsoft/DialoGPT-medium.

Main takeaway: Hugging Face - the AI community building the future.

API calls: almost all of them are POST requests carrying a JSON body.

Official examples: Detailed parameters (huggingface.co) - descriptions of some interesting models and how to call them.
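The shared request shape can be sketched as a small helper that only builds the call, so the same pattern works for any hosted model. `build_call` is a name introduced here for illustration, and `API_TOKEN` is a placeholder:

```python
import json

API_TOKEN = "hf_xxx"  # placeholder; substitute your own token

def build_call(model_id, payload):
    # Every Inference API call has the same shape: a POST to the model's
    # URL with a Bearer token and a JSON body.
    return {
        "url": f"https://api-inference.huggingface.co/models/{model_id}",
        "headers": {"Authorization": f"Bearer {API_TOKEN}"},
        "data": json.dumps(payload),
    }

call = build_call("bert-base-uncased", {"inputs": "my mom is a [MASK]."})
# requests.post(**call) would then send the actual request.
```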

Below is the practice code:

import json
import requests
from transformers import AutoModelForCausalLM, AutoTokenizer



API_TOKEN="hf_yELNcAUXRyLDOOazkesXEdvLnTtefLcMHxxxxxxxxxxxxx"
headers = {"Authorization": f"Bearer {API_TOKEN}"}

def query(payload):
    # Fill-mask model: given a sentence containing [MASK], it predicts
    # the masked word, e.g. payload = {"inputs": "my mom is a [MASK]."}
    API_URL = "https://api-inference.huggingface.co/models/bert-base-uncased"
    data = json.dumps(payload)
    response = requests.request("POST", API_URL, headers=headers, data=data)
    result = json.loads(response.content.decode("utf-8"))
    return json.dumps(result, indent=4)

def summer_text(payload):
    # Summarization model: given a passage, it generates a summary.
    # Note: the payload argument is overridden by this fixed example text.
    payload = {"inputs": "The Little Match Girl: Once upon a time, on a cold winter night, a poor little girl was trying to sell matches on the street. She didn't have any warm clothes and her shoes had holes in them, which made her feet very cold. The girl was shivering and feeling very hungry. But she hadn't sold any matches all day. As she was walking, she saw a warm and cozy house with a big fireplace. She imagined how lovely it would be to be inside, feeling warm and safe. The girl couldn't resist the temptation and decided to light one of the matches to warm up her hands. To her surprise, as she lit the match, she saw a beautiful vision of a Christmas feast with roast goose, Christmas pudding, and all kinds of treats. But as soon as the match burned out, the vision disappeared. The girl quickly lit another match, and this time she saw her beloved grandmother, who had passed away. In her third match, she saw a beautiful Christmas tree with shining lights. The little girl was so happy and forgot all about her cold and hunger. She kept lighting matches hoping to see more beautiful things. But she had only a limited number of matches, and soon she ran out of them. The girl felt sad and alone, and didn't dare to go back home without any money. As the night grew colder and darker, the girl curled up under a blanket of snow and closed her eyes, hoping never to wake up again. The next morning, people found her frozen body and realized that she had died from the cold. The Little Match Girl's story is a sad one. But it teaches us about the importance of kindness and compassion. We should look out for those who are less fortunate than us, and lend a helping hand whenever we can."}
    data = json.dumps(payload)
    API_URL="https://api-inference.huggingface.co/models/facebook/bart-large-cnn"
    response = requests.request("POST", API_URL, headers=headers, data=data)
    loads = json.loads(response.content.decode("utf-8"))
    print(json.dumps(loads, indent=4))
    return loads



def ai_robbot_DialoGPT(payload):
    # Chat model: given a dialogue turn, it generates a reply.
    # The request goes through the hosted API, so there is no need to load
    # the tokenizer and model locally here.
    # payload = {"inputs": "what's your name?"}
    API_URL = "https://api-inference.huggingface.co/models/microsoft/DialoGPT-medium"
    data = json.dumps(payload)
    response = requests.request("POST", API_URL, headers=headers, data=data)
    result= json.loads(response.content.decode("utf-8"))
    print(json.dumps(result, indent=4))
    return result



if __name__ == '__main__':
    # summer_text(";;")
    # Generation options belong under a "parameters" key; 50256 is
    # DialoGPT's eos/pad token id (it uses the GPT-2 tokenizer).
    ai_robbot_DialoGPT({"inputs": "How did you know that?",
                        "parameters": {"pad_token_id": 50256}})

 Translation models

Calling third-party translation services always costs money, but there are now free models that can be run locally. The practice code is below:

from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

def local_translate(input_text):
    # Loads from the local cache if present; otherwise the first call
    # downloads the model, which requires a network connection.
    tokenizer = AutoTokenizer.from_pretrained("Helsinki-NLP/opus-mt-zh-en")  # a local model path also works here
    model = AutoModelForSeq2SeqLM.from_pretrained("Helsinki-NLP/opus-mt-zh-en")  # a local model path also works here

    # To save the model and tokenizer locally:
    # model_dir = "path/to/save/model"
    # tokenizer_dir = "path/to/save/tokenizer"
    # model.save_pretrained(model_dir)
    # tokenizer.save_pretrained(tokenizer_dir)

    # Encode the input Chinese text into token IDs with the tokenizer.
    input_ids = tokenizer.encode(input_text, return_tensors="pt")

    # Generate the model output
    sample_outputs = model.generate(
        input_ids,
        max_length=1000,
        do_sample=True,
        top_k=50,
        top_p=0.95,
        temperature=0.7,
    )

    # Decode the output tokens back into readable English text.
    generated_text = tokenizer.decode(sample_outputs[0], skip_special_tokens=True)

    # Print the generated text
    print(generated_text)  # e.g.: How's the weather tomorrow? I'd like to go out for a swim.


if __name__ == '__main__':
    local_translate("明天天气怎么样,我想出去游玩")

Detailed parameter description:
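The sampling options passed to `generate()` in the translation code above can be summarized as follows (values copied from the code; the descriptions paraphrase the transformers documentation):

```python
# Decoding options used in the local_translate example:
generation_options = {
    "max_length": 1000,  # upper bound on the number of output tokens
    "do_sample": True,   # sample from the distribution instead of greedy decoding
    "top_k": 50,         # consider only the 50 most likely next tokens
    "top_p": 0.95,       # nucleus sampling: smallest token set with 95% probability mass
    "temperature": 0.7,  # values < 1.0 sharpen the distribution (less random)
}
```

Because `do_sample=True`, the translation output can vary slightly from run to run.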

 


Reprinted from: blog.csdn.net/u013372493/article/details/130092766