---
license: apache-2.0
tags:
- reasoning
- multilingual
- transformer
- robi-labs
- delta
- lexa
- lexa-family
- lexa-delta
pipeline_tag: text-generation
---
# Model Card for Lexa-Delta
Lexa-Delta is a multilingual reasoning large language model, developed and trained independently by **Robi Labs**. It is designed for structured reasoning and natural conversation across multiple languages.
---
## Model Details
### Model Description
* **Developed by:** Robi Labs
* **Model type:** Causal language model (decoder-only transformer)
* **Language(s):** Multilingual (English, Spanish, French, German, Chinese, Hindi, and more)
* **License:** Apache 2.0
### Model Sources
* **Repository:** [https://huggingface.co/RobiLabs/Lexa-Delta](https://huggingface.co/RobiLabs/Lexa-Delta)
* **Website:** [https://labs.robiai.com](https://labs.robiai.com)
* **Lexa Chat:** [https://lexa.chat](https://lexa.chat) (coming soon)
* **Socials:**
* [Twitter/X](https://twitter.com/justlexait)
* [LinkedIn](https://www.linkedin.com/company/robilabsai)
* [Instagram](https://www.instagram.com/robilabs)
---
## Uses
### Direct Use
Lexa-Delta can be used directly for:
* Multilingual question answering
* Chain-of-thought reasoning
* Conversational AI assistants
* Educational support (explaining concepts across languages)
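For instance, the minimal sketch below shows multilingual question answering with an explicit step-by-step prompt. The loading pattern matches the *How to Get Started* section further down; the Spanish prompt is purely illustrative, not taken from the model's documentation:
```python
from transformers import AutoModelForCausalLM, AutoTokenizer

model = AutoModelForCausalLM.from_pretrained("RobiLabs/Lexa-Delta", device_map="auto")
tokenizer = AutoTokenizer.from_pretrained("RobiLabs/Lexa-Delta")

# An illustrative Spanish prompt; the same pattern applies to any supported language.
messages = [{"role": "user", "content": "¿Cuál es la capital de Armenia? Razona paso a paso."}]
inputs = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)

outputs = model.generate(inputs, max_new_tokens=200)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```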
### Downstream Use
* Fine-tuning for domain-specific tasks (e.g., legal, medical, educational)
* Integration into applications and chat platforms
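As a rough sketch of downstream fine-tuning, the example below attaches LoRA adapters with the PEFT library. The rank, alpha, and `target_modules` names are assumptions for illustration (this card does not document Lexa-Delta's internal module names), not values published by Robi Labs:
```python
from transformers import AutoModelForCausalLM
from peft import LoraConfig, get_peft_model

model = AutoModelForCausalLM.from_pretrained("RobiLabs/Lexa-Delta", device_map="auto")

# Illustrative LoRA settings; target_modules must match the model's actual
# attention projection names, which are assumed here.
lora_config = LoraConfig(
    r=8,
    lora_alpha=16,
    lora_dropout=0.05,
    target_modules=["q_proj", "v_proj"],  # assumption
    task_type="CAUSAL_LM",
)
model = get_peft_model(model, lora_config)
model.print_trainable_parameters()  # only the adapter weights are trainable
```
Adapters keep the base weights frozen, so domain adaptation (legal, medical, educational) stays cheap to train and easy to swap.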
### Out-of-Scope Use
* Disallowed or harmful content generation
* High-stakes decision making without expert human oversight
---
## Bias, Risks, and Limitations
* May inherit biases from multilingual training data
* Reasoning ability may vary by language
* Can generate incorrect or hallucinated outputs
### Recommendations
Users should:
* Verify important information independently
* Avoid high-stakes reliance without human review
* Use responsibly with awareness of multilingual limitations
---
## How to Get Started with the Model
```python
from transformers import AutoModelForCausalLM, AutoTokenizer

# Load the model and tokenizer; device_map="auto" places weights on available GPUs.
model = AutoModelForCausalLM.from_pretrained("RobiLabs/Lexa-Delta", device_map="auto")
tokenizer = AutoTokenizer.from_pretrained("RobiLabs/Lexa-Delta")

messages = [
    {"role": "system", "content": "You are Lexa-Delta, a multilingual reasoning model from Robi Labs."},
    {"role": "user", "content": "What is the capital of Armenia?"},
]

# Build the prompt with the model's chat template and append the generation prompt.
inputs = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)

outputs = model.generate(inputs, max_new_tokens=200)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```
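For interactive assistants, you can also stream tokens as they are generated. The variation below reuses `model`, `tokenizer`, and `inputs` from the snippet above with Transformers' built-in `TextStreamer`:
```python
from transformers import TextStreamer

# Prints tokens to stdout as they are generated, reusing the objects above.
streamer = TextStreamer(tokenizer, skip_prompt=True, skip_special_tokens=True)
model.generate(inputs, max_new_tokens=200, streamer=streamer)
```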
---
## Training Details
### Training Data
* Multilingual reasoning data (sources undisclosed)
### Training Procedure
* **Method:** Full training
* **Precision:** Mixed precision
* **Compute:** NVIDIA B200 GPUs
#### Training Hyperparameters
* **Learning rate:** 2e-4
* **Sequence length:** 4096 tokens
* **Gradient accumulation:** enabled
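For orientation, these settings map onto Hugging Face `TrainingArguments` roughly as sketched below. Only the learning rate, the use of mixed precision, and the fact that accumulation was enabled come from this card; the batch size, accumulation step count, and other values are assumptions:
```python
from transformers import TrainingArguments

# Illustrative mapping of the stated hyperparameters; the sequence length
# (4096 tokens) is applied at tokenization time rather than here.
args = TrainingArguments(
    output_dir="lexa-delta-train",
    learning_rate=2e-4,              # stated above
    bf16=True,                       # "mixed precision"; bf16 is an assumption
    per_device_train_batch_size=1,   # assumption
    gradient_accumulation_steps=16,  # accumulation enabled; step count assumed
    warmup_ratio=0.03,               # assumption
)
```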
---
## Environmental Impact
* **Hardware Type:** NVIDIA B200 GPUs
* **Region:** Not disclosed
* **Carbon Emitted:** Not disclosed
---
## Technical Specifications
### Model Architecture and Objective
* Decoder-only transformer
* Objective: Causal language modeling with reasoning-oriented training
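
Concretely, causal language modeling minimizes the standard next-token cross-entropy over each training sequence (a textbook formulation, not anything Lexa-Delta-specific); the reasoning orientation presumably comes from the training mix rather than a modified loss:

$$
\mathcal{L}(\theta) = -\sum_{t=1}^{T} \log p_\theta(x_t \mid x_{<t})
$$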
### Compute Infrastructure
* **Hardware:** NVIDIA B200 GPUs
* **Software:** PyTorch, Hugging Face Transformers
---
## Citation
```bibtex
@misc{lexa-delta,
  title        = {Lexa-Delta: A Multilingual Reasoning LLM},
  author       = {Robi Labs},
  year         = {2025},
  howpublished = {\url{https://huggingface.co/RobiLabs/Lexa-Delta}},
}
```
---
## Model Card Authors
[Robi Labs](https://labs.robiai.com)
## Model Card Contact
* Website: [labs.robiai.com](https://labs.robiai.com)
* Email: [labs@robiai.com](mailto:labs@robiai.com)