---
license: apache-2.0
language:
- en
- zh
datasets:
- timdettmers/openassistant-guanaco
pipeline_tag: text-generation
tags:
- chat
widget:
- text: "How can I find a girlfriend?"
  example_title: "How can I find a girlfriend"
- text: "你吃什么?"
  example_title: "你吃什么?"
---

This is a beginner's test model.

```python
from transformers import AutoTokenizer, AutoModelForCausalLM, pipeline

model_name = "MingZhuang/mz_model_merged"
model = AutoModelForCausalLM.from_pretrained(model_name)
tokenizer = AutoTokenizer.from_pretrained(model_name)

generate = pipeline(
    "text-generation",
    model=model,
    tokenizer=tokenizer,
)

result = generate(
    "</s>Human: How can I find a girlfriend?<s> Assistant:",
    max_new_tokens=120,
)
print(result)
```
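
The snippet above hard-codes one prompt. As a convenience, you can wrap the same format in a small helper; note that `build_prompt` is a hypothetical function sketched here, and the `</s>Human: ... <s> Assistant:` template is inferred from the example prompt above — adjust it if the model was trained with a different chat format.

```python
# Hypothetical helper for the prompt format shown in the snippet above.
# The "</s>Human: ... <s> Assistant:" template is an assumption inferred
# from the example prompt; verify it against the model's training format.
def build_prompt(turns):
    """Build a multi-turn prompt from (human, assistant) pairs.

    `turns` is a list of (human_text, assistant_text) tuples; pass None
    as the last assistant_text to ask the model for the next reply.
    """
    parts = []
    for human, assistant in turns:
        parts.append(f"</s>Human: {human}<s> Assistant:")
        if assistant is not None:
            parts.append(f" {assistant}")
    return "".join(parts)

# Single-turn prompt, identical to the literal string used above.
prompt = build_prompt([("How can I find a girlfriend?", None)])
print(prompt)
```

The same helper extends to multi-turn use: append each completed `(human, assistant)` exchange to the list and leave the final assistant reply as `None`.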