Tags: Question Answering · Transformers · Safetensors · Chinese · English · llama · text-generation · custom_code · text-generation-inference
Instructions to use FlagAlpha/Atom-7B with libraries, inference providers, notebooks, and local apps. Follow these links to get started.
- Libraries
  - Transformers
How to use FlagAlpha/Atom-7B with Transformers:
```python
# Use a pipeline as a high-level helper
from transformers import pipeline

# Atom-7B is a causal language model, so the matching pipeline task is
# "text-generation" (not "question-answering", which expects an extractive QA head)
pipe = pipeline("text-generation", model="FlagAlpha/Atom-7B", trust_remote_code=True)

# Load model directly
from transformers import AutoTokenizer, AutoModelForCausalLM

tokenizer = AutoTokenizer.from_pretrained("FlagAlpha/Atom-7B", trust_remote_code=True)
model = AutoModelForCausalLM.from_pretrained("FlagAlpha/Atom-7B", trust_remote_code=True)
```
- Notebooks
- Google Colab
- Kaggle
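Once the tokenizer and model are loaded, the input is expected to follow the Llama-Chinese chat template. The sketch below is an assumption based on the upstream Llama-Chinese project (this page does not state the template); verify the exact format against the model card before relying on it:

```python
# Sketch of a single-turn chat prompt for Atom-7B.
# ASSUMPTION: the "<s>Human: ... </s><s>Assistant: " template comes from the
# Llama-Chinese repository and is not confirmed on this page.

def build_prompt(question: str) -> str:
    """Wrap a user question in the assumed Human/Assistant chat template."""
    return f"<s>Human: {question}\n</s><s>Assistant: "

prompt = build_prompt("介绍一下机器学习")
print(prompt)
```

The resulting string can then be tokenized with the `tokenizer` loaded above and passed to `model.generate(...)` to produce a completion.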
Update README.md
README.md CHANGED

```diff
@@ -91,6 +91,6 @@ Github:[**Llama-Chinese**](https://github.com/LlamaFamily/Llama-Chinese)
 
 ## 🐼 社区资源
 - Llama2在线体验链接[**llama.family**](https://llama.family/),同时包含Meta原版和中文微调版本!
-- Llama2 Chat模型的[中文问答能力评测](https://github.com/
+- Llama2 Chat模型的[中文问答能力评测](https://github.com/LlamaFamily/Llama-Chinese/tree/main#-%E6%A8%A1%E5%9E%8B%E8%AF%84%E6%B5%8B)!
 - [社区飞书知识库](https://chinesellama.feishu.cn/wiki/space/7257824476874768388?ccm_open_type=lark_wiki_spaceLink),欢迎大家一起共建!
 
```