How to use DragonLLM/Qwen-Open-Finance-R-8B with Transformers:
```python
# Use a pipeline as a high-level helper
from transformers import pipeline

pipe = pipeline("text-generation", model="DragonLLM/Qwen-Open-Finance-R-8B")
```
```python
# Load model directly
from transformers import AutoTokenizer, AutoModelForCausalLM

tokenizer = AutoTokenizer.from_pretrained("DragonLLM/Qwen-Open-Finance-R-8B")
model = AutoModelForCausalLM.from_pretrained("DragonLLM/Qwen-Open-Finance-R-8B")
```
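Once loaded, the model can be prompted like any other causal LM. The sketch below is a minimal generation example, assuming the tokenizer ships a Qwen-style chat template; the finance question and generation settings are illustrative only.

```python
# Minimal generation sketch (assumes a chat template is provided with the tokenizer)
from transformers import AutoTokenizer, AutoModelForCausalLM

model_id = "DragonLLM/Qwen-Open-Finance-R-8B"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id, device_map="auto")

# Illustrative prompt; replace with your own finance question
messages = [{"role": "user", "content": "Summarize the main drivers of bond yields."}]
prompt = tokenizer.apply_chat_template(messages, tokenize=False, add_generation_prompt=True)
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)

outputs = model.generate(**inputs, max_new_tokens=256)
# Decode only the newly generated tokens, skipping the prompt
print(tokenizer.decode(outputs[0][inputs["input_ids"].shape[-1]:], skip_special_tokens=True))
```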