Argonne 2.0

A 4.9-billion-parameter decoder-only transformer language model trained from scratch.

Model Architecture

Component          Specification
Parameters         ~4.9B
Layers             24 transformer blocks
Hidden Size        4,080
Attention Heads    24 query / 8 key-value (GQA)
Context Length     4,096 tokens
Vocabulary Size    151,665
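
With 24 query heads sharing 8 key-value heads, each key-value head serves a group of 3 query heads, and the per-head dimension is 4,080 / 24 = 170. A minimal shape-level sketch of that grouped-query attention arithmetic (random placeholder tensors for illustration, not the model's actual weights or layer names):

import torch
import torch.nn.functional as F

hidden_size, n_q_heads, n_kv_heads = 4080, 24, 8
head_dim = hidden_size // n_q_heads          # 170
group_size = n_q_heads // n_kv_heads         # 3 query heads per KV head

seq = 16
q = torch.randn(1, n_q_heads, seq, head_dim)   # one projection per query head
k = torch.randn(1, n_kv_heads, seq, head_dim)  # fewer key-value heads
v = torch.randn(1, n_kv_heads, seq, head_dim)

# Repeat each KV head across its group so shapes line up with the queries
k = k.repeat_interleave(group_size, dim=1)     # -> (1, 24, 16, 170)
v = v.repeat_interleave(group_size, dim=1)

attn = F.scaled_dot_product_attention(q, k, v)
print(attn.shape)                              # torch.Size([1, 24, 16, 170])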

Usage

from transformers import AutoModelForCausalLM, AutoTokenizer
import torch

# Load the checkpoint in bfloat16 and place it across available devices
model = AutoModelForCausalLM.from_pretrained(
    "PursuitOfDataScience/Argonne-2.0",
    torch_dtype=torch.bfloat16,
    device_map="auto",
    trust_remote_code=True,
)
tokenizer = AutoTokenizer.from_pretrained(
    "PursuitOfDataScience/Argonne-2.0", trust_remote_code=True
)

prompt = "The future of AI is"
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)

# Sample up to 256 new tokens (max_new_tokens excludes the prompt length,
# unlike max_length); temperature 0.7 trades diversity for coherence
outputs = model.generate(**inputs, max_new_tokens=256, do_sample=True, temperature=0.7)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
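
For interactive use, tokens can be printed as they are generated with transformers' built-in TextStreamer; a minimal sketch reusing the model, tokenizer, and inputs objects from above:

from transformers import TextStreamer

# Stream decoded tokens to stdout as they are produced instead of
# waiting for the full sequence
streamer = TextStreamer(tokenizer, skip_prompt=True, skip_special_tokens=True)
_ = model.generate(
    **inputs,
    max_new_tokens=256,
    do_sample=True,
    temperature=0.7,
    streamer=streamer,
)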

License

Apache 2.0

Citation

@misc{argonne2,
  author = {PursuitOfDataScience},
  title = {Argonne 2.0: A 4.9B Parameter Language Model},
  year = {2026},
  publisher = {Hugging Face},
  url = {https://huggingface.co/PursuitOfDataScience/Argonne-2.0}
}
