How to use with the Transformers library

```python
# Use a pipeline as a high-level helper
from transformers import pipeline

pipe = pipeline("text-generation", model="SUSTech/SUS-Chat-72B")
```
```python
# Load the tokenizer and model directly
from transformers import AutoTokenizer, AutoModelForCausalLM

tokenizer = AutoTokenizer.from_pretrained("SUSTech/SUS-Chat-72B")
model = AutoModelForCausalLM.from_pretrained("SUSTech/SUS-Chat-72B")
```
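SUS-Chat models are instruction-tuned, so the raw text-generation pipeline works best with a formatted chat prompt rather than a bare string. Below is a minimal sketch of a multi-turn prompt builder, assuming the `### Human:` / `### Assistant:` turn format used by the SUS-Chat family; verify the exact layout against the chat template shipped with the tokenizer before relying on it.

```python
# Hypothetical prompt builder for SUS-Chat-style multi-turn conversations.
# The "### Human:" / "### Assistant:" markers are an assumption here;
# check the model card's documented chat template for the canonical format.
def build_prompt(messages):
    """messages: list of {"role": "user" | "assistant", "content": str}."""
    prompt = ""
    for m in messages:
        if m["role"] == "user":
            # Open a new turn and leave the assistant slot for generation.
            prompt += "### Human: " + m["content"] + "\n\n### Assistant: "
        else:
            # Close the previous assistant turn with its reply.
            prompt += m["content"] + "\n\n"
    return prompt

print(build_prompt([{"role": "user", "content": "What is the capital of France?"}]))
```

The resulting string can be passed straight to the pipeline, e.g. `pipe(build_prompt(messages), max_new_tokens=256)`.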
🐷SUS-Chat: Instruction tuning done right
News

  • 2023-12-09: 🔥 The Tigerbot variant has been deleted; SUS-Chat-34B is now the top-ranked LLaMA model and the top-ranked chat model.

  • 2023-12-07: SUS-Chat-34B is now available on WiseModel🧠.

  • 2023-12-06: Try the SUS-Chat-34B chat-ui.

  • 2023-12-05: SUS-Chat-34B is now available on ModelScope🤖.

  • 2023-12-05: SUS-Chat-34B is ranked 2nd on the Open LLM Leaderboard, surpassing all models under 70B.

  • 2023-12-01: SUS-Chat-34B is now available on HuggingFace🤗.

Introduction
