Paper: Quantifying the Carbon Emissions of Machine Learning (arXiv:1910.09700)
This is the model card of a 🤗 Transformers model that has been pushed to the Hub. This model card has been automatically generated.
Here is an example of how to use the `tien007/llama-3-8b-sft` model with Hugging Face's Transformers library:

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

# Load the model and tokenizer
model_id = "tien007/llama-3-8b-sft"
model = AutoModelForCausalLM.from_pretrained(model_id, device_map="auto")
tokenizer = AutoTokenizer.from_pretrained(model_id)

# Define the input messages
messages = [{"role": "user", "content": "How to effectively treat cancer?"}]

# Build the prompt with the model's chat template
prompt = tokenizer.apply_chat_template(messages, tokenize=False, add_generation_prompt=True)

# Tokenize the prompt and move the tensors to the model's device
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)

# Generate a response (bound the number of new tokens, not the total length)
outputs = model.generate(**inputs, max_new_tokens=150)

# Decode only the newly generated tokens, skipping the prompt and special tokens
response = tokenizer.decode(outputs[0][inputs["input_ids"].shape[-1]:], skip_special_tokens=True)
print(response)
```
Carbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019).
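The calculator's estimate boils down to energy consumed (kWh) multiplied by the carbon intensity of the local grid. A minimal sketch of that arithmetic is below; the function name and all numeric inputs are illustrative assumptions, not measured values for this model.

```python
# Back-of-the-envelope carbon estimate in the spirit of the ML Impact
# calculator (Lacoste et al., 2019). All figures here are hypothetical.

def estimate_co2_kg(gpu_power_watts: float, num_gpus: int, hours: float,
                    carbon_intensity_kg_per_kwh: float) -> float:
    """Estimated emissions = energy used (kWh) x grid carbon intensity (kg CO2eq/kWh)."""
    energy_kwh = (gpu_power_watts / 1000.0) * num_gpus * hours
    return energy_kwh * carbon_intensity_kg_per_kwh

# Example: 8 GPUs drawing ~400 W each for 24 hours on a grid emitting
# ~0.4 kg CO2eq per kWh (hypothetical figures) -> roughly 30.7 kg CO2eq.
print(round(estimate_co2_kg(400, 8, 24, 0.4), 1))
```

Real runs should also account for data-center overhead (PUE) and the region-specific grid mix, which the calculator lets you select.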