---
license: apache-2.0
language:
- en
pipeline_tag: text-generation
tags:
- chat
- conversational
- llama
- fine-tuned
- rax
- raxcore
model_type: llama
---
# Rax 3.5 Chat

*Developed by RaxCore, a software development company serving Africa and beyond.*

## Model Description
Rax 3.5 Chat is a conversational AI model developed by RaxCore. Built on the Llama architecture with TinyLlama as its foundation, it incorporates RaxCore's proprietary optimization techniques, training methodology, and cultural context awareness, and is intended to improve on the base model's conversational performance.
## Quick Start
```python
from transformers import AutoTokenizer, AutoModelForCausalLM

tokenizer = AutoTokenizer.from_pretrained("rax-3.5-chat")
model = AutoModelForCausalLM.from_pretrained("rax-3.5-chat")

messages = [
    {"role": "system", "content": "You are Rax, a helpful AI assistant."},
    {"role": "user", "content": "Hello!"}
]

# Format the conversation with the model's chat template
input_text = tokenizer.apply_chat_template(messages, tokenize=False, add_generation_prompt=True)
inputs = tokenizer(input_text, return_tensors="pt")

outputs = model.generate(**inputs, max_new_tokens=256)

# Decode only the newly generated tokens
response = tokenizer.decode(outputs[0][inputs["input_ids"].shape[-1]:], skip_special_tokens=True)
print(response)
```
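If the tokenizer does not ship a chat template, the conversation can also be formatted manually. Many TinyLlama-based chat fine-tunes use the Zephyr-style template sketched below; the exact special tokens are an assumption here and should be verified against this model's `tokenizer_config.json`.

```python
def format_chat(messages, add_generation_prompt=True):
    """Format messages in the Zephyr-style template used by many
    TinyLlama chat fine-tunes (assumed format; verify against the
    model's tokenizer_config.json)."""
    parts = []
    for msg in messages:
        parts.append(f"<|{msg['role']}|>\n{msg['content']}</s>\n")
    if add_generation_prompt:
        # Open an assistant turn for the model to complete
        parts.append("<|assistant|>\n")
    return "".join(parts)

prompt = format_chat([
    {"role": "system", "content": "You are Rax, a helpful AI assistant."},
    {"role": "user", "content": "Hello!"},
])
```

The string produced this way should match the output of `tokenizer.apply_chat_template(..., tokenize=False, add_generation_prompt=True)` when the template matches.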
## Model Details
- Architecture: Enhanced Llama (1.1B parameters with RaxCore optimizations)
- Context Length: 2048 tokens
- Development: Extensively enhanced by RaxCore with proprietary improvements
- Base: TinyLlama foundation with significant RaxCore upgrades
- License: Apache 2.0
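As a rough sizing guide, the memory needed just to hold the weights can be estimated from the parameter count and the bytes per parameter for a given dtype. This is a back-of-the-envelope estimate only; actual usage also includes activations and the KV cache.

```python
def weight_memory_gib(num_params: float, bytes_per_param: int) -> float:
    """Estimate memory (in GiB) required to hold model weights alone."""
    return num_params * bytes_per_param / (1024 ** 3)

# 1.1B parameters in fp16/bf16 (2 bytes each) vs fp32 (4 bytes each)
fp16_gib = weight_memory_gib(1.1e9, 2)  # ~2.0 GiB
fp32_gib = weight_memory_gib(1.1e9, 4)  # ~4.1 GiB
```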
## Intended Use
- Conversational AI applications
- Research and educational purposes
- Creative writing assistance
- Chatbot development
## Limitations
- 2048 token context limit
- May generate biased or incorrect information
- Requires responsible deployment practices
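Because of the 2048-token context limit, long conversations must be trimmed before generation. One simple strategy is to keep the system message and drop the oldest turns until the history plus a generation budget fits. This is a sketch: `count_tokens` is a hypothetical stand-in for something like `len(tokenizer.encode(text))`.

```python
def trim_history(messages, count_tokens, max_context=2048, reserve=256):
    """Drop the oldest non-system turns until the conversation fits
    within max_context minus the tokens reserved for generation."""
    budget = max_context - reserve
    system = [m for m in messages if m["role"] == "system"]
    turns = [m for m in messages if m["role"] != "system"]

    def total(msgs):
        return sum(count_tokens(m["content"]) for m in msgs)

    while turns and total(system + turns) > budget:
        turns.pop(0)  # discard the oldest turn first
    return system + turns

# Example with a whitespace "tokenizer" as a crude stand-in
count = lambda text: len(text.split())
msgs = [{"role": "system", "content": "sys"}] + [
    {"role": "user", "content": "word " * 500} for _ in range(6)
]
trimmed = trim_history(msgs, count)  # keeps system + most recent turns
```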
## Links
- RaxCore Website: [www.raxcore.dev](https://www.raxcore.dev)
- Hugging Face Profile: [raxcore-dev](https://huggingface.co/raxcore-dev)