---
language:
- en
license: apache-2.0
tags:
- text-generation-inference
- transformers
- unsloth
- llama
- trl
base_model: unsloth/llama-3-8b-Instruct-bnb-4bit
---
# Model: BAI_LLM_FinArg
- **Developed by:** varadsrivastava
- **License:** apache-2.0
- **Base Model:** unsloth/llama-3-8b-Instruct-bnb-4bit
# For Proper Inference, please use:
```shell
pip install "unsloth[colab-new] @ git+https://github.com/unslothai/unsloth.git"
```
### Loading the fine-tuned model and the tokenizer for inference
```python
import torch
from unsloth import FastLanguageModel

model, tokenizer = FastLanguageModel.from_pretrained(
    model_name = "varadsrivastava/BAI_LLM_FinArg",
    max_seq_length = 20,
    dtype = torch.bfloat16,
    load_in_4bit = True,
)
```
### Using FastLanguageModel for fast inference
```python
FastLanguageModel.for_inference(model)
```
# Prompt template:
```python
"""<|begin_of_text|><|start_header_id|>system<|end_header_id|>
{instruction}<|eot_id|><|start_header_id|>user<|end_header_id|>
Sentence: {row['text']}<|eot_id|><|start_header_id|>assistant<|end_header_id|>
Class: {row['label']}<|eot_id|>"""
```
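As a minimal sketch, the template above can be filled for a single input like this. The instruction text and the example sentence here are illustrative assumptions, not part of the card; at inference time the prompt is truncated after `Class:` so the model completes the label itself.

```python
# Build a Llama-3 style prompt from the template above.
# The instruction and sentence below are hypothetical examples.
def build_prompt(instruction: str, text: str) -> str:
    return (
        "<|begin_of_text|><|start_header_id|>system<|end_header_id|>\n"
        f"{instruction}<|eot_id|><|start_header_id|>user<|end_header_id|>\n"
        f"Sentence: {text}<|eot_id|><|start_header_id|>assistant<|end_header_id|>\n"
        "Class:"
    )

prompt = build_prompt(
    "Classify the argumentative role of the sentence.",  # assumed instruction
    "Revenue grew 12% year over year.",
)

# With the model and tokenizer loaded as shown earlier:
# inputs = tokenizer(prompt, return_tensors="pt").to("cuda")
# outputs = model.generate(**inputs, max_new_tokens=8)
# print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```

Ending the assistant turn at `Class:` (with no `<|eot_id|>`) prompts the model to generate the class label as its continuation, mirroring the training format.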
NOTE: This model was trained 2x faster using Unsloth and Hugging Face's TRL library.