How to use with the Transformers library
# Use a pipeline as a high-level helper
from transformers import pipeline

pipe = pipeline("text-generation", model="Alphacode-AI/Alphallama3-8B")
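The pipeline object can then be called directly on a prompt. A minimal sketch; the prompt, sampling parameters, and helper name are illustrative, not values recommended by the model authors (loading downloads roughly 16 GB of weights, so the heavy work is kept inside a function):

```python
def run_pipeline(prompt: str) -> str:
    """Generate a completion with the high-level pipeline helper."""
    from transformers import pipeline

    pipe = pipeline(
        "text-generation",
        model="Alphacode-AI/Alphallama3-8B",
        device_map="auto",  # requires accelerate; places weights on available GPUs
    )
    # Illustrative sampling settings, not author-recommended values.
    outputs = pipe(prompt, max_new_tokens=64, do_sample=True, temperature=0.7)
    return outputs[0]["generated_text"]

# Example call (commented out to avoid the large model download):
# print(run_pipeline("Write a haiku about the sea."))
```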
# Load model directly
from transformers import AutoTokenizer, AutoModelForCausalLM

tokenizer = AutoTokenizer.from_pretrained("Alphacode-AI/Alphallama3-8B")
model = AutoModelForCausalLM.from_pretrained("Alphacode-AI/Alphallama3-8B")
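With the tokenizer and model loaded as above, generation follows the usual `generate`/`decode` pattern. A sketch under stated assumptions: `torch` is installed, a GPU with enough memory for the 8B bf16 weights is available, and the sampling values are placeholders:

```python
def generate(prompt: str, max_new_tokens: int = 128) -> str:
    """Load the model directly and return only the newly generated text."""
    import torch
    from transformers import AutoModelForCausalLM, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained("Alphacode-AI/Alphallama3-8B")
    model = AutoModelForCausalLM.from_pretrained(
        "Alphacode-AI/Alphallama3-8B",
        torch_dtype=torch.bfloat16,  # matches the BF16 checkpoint
        device_map="auto",
    )
    inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
    output_ids = model.generate(
        **inputs, max_new_tokens=max_new_tokens, do_sample=True, temperature=0.7
    )
    # Strip the prompt tokens so only the completion is returned.
    new_tokens = output_ids[0][inputs["input_ids"].shape[1]:]
    return tokenizer.decode(new_tokens, skip_special_tokens=True)

# Example (commented out; requires a GPU with roughly 16 GB of memory):
# print(generate("Explain what a context window is."))
```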
This model is a version of Meta-Llama-3-8B that has been fine-tuned on our in-house custom dataset.

Training setup: a single node with 4× A100 GPUs, using DeepSpeed, the HuggingFace TRL trainer, and HuggingFace Accelerate.
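The stack above is typically wired together along these lines. This is only a sketch: the dataset file, hyperparameters, and DeepSpeed config path are placeholders (the actual in-house data and settings are not public), and the `SFTConfig`/`SFTTrainer` names assume a recent TRL release:

```python
def build_trainer():
    """Sketch of an SFT setup with TRL + DeepSpeed; all values are illustrative."""
    from datasets import load_dataset
    from trl import SFTConfig, SFTTrainer

    # Placeholder dataset path; the authors' in-house data is not public.
    dataset = load_dataset("json", data_files="custom_data.jsonl")["train"]
    config = SFTConfig(
        output_dir="alphallama3-8b-sft",
        per_device_train_batch_size=2,   # illustrative hyperparameters
        gradient_accumulation_steps=8,
        bf16=True,                       # matches the released BF16 weights
        deepspeed="ds_zero3.json",       # placeholder DeepSpeed config path
    )
    return SFTTrainer(
        model="meta-llama/Meta-Llama-3-8B",
        args=config,
        train_dataset=dataset,
    )

# Launched across the 4 GPUs with Accelerate, e.g.:
#   accelerate launch train.py
```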

Downloads last month: 13
Format: Safetensors
Model size: 8B params
Tensor type: BF16

