---
license: apache-2.0
datasets:
- open-r1/OpenR1-Math-220k
- FreedomIntelligence/medical-o1-reasoning-SFT
language:
- en
- ar
metrics:
- accuracy
library_name: transformers
new_version: moelanoby/kok-baseV2
---
|
|
|
|
|
This model uses a custom architecture and is trained entirely from scratch.
|
|
Because the architecture is custom, you need to load it with the following Python code:
|
|
|
|
|
```python
from transformers import AutoConfig, AutoModel

config = AutoConfig.from_pretrained("moelanoby/kok-base")
# Note: if the repository ships custom modeling code, you may also need
# to pass trust_remote_code=True to the calls above.
model = AutoModel.from_config(config)
```
|
|
The script above builds the model from its published configuration; note that `from_config` initializes the weights randomly rather than downloading a trained checkpoint.
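As a small offline illustration of what `from_config` does (it constructs the architecture with randomly initialized weights), the same pattern works with any configuration class. `BertConfig` below is only a stand-in, since the actual kok-base configuration has to be fetched from the Hub:

```python
from transformers import AutoModel, BertConfig

# Stand-in config for illustration; the real model would use the
# configuration fetched from "moelanoby/kok-base" instead.
config = BertConfig(hidden_size=64, num_hidden_layers=2,
                    num_attention_heads=2, intermediate_size=128,
                    vocab_size=1000)
model = AutoModel.from_config(config)  # weights are randomly initialized

n_params = sum(p.numel() for p in model.parameters())
print(f"{n_params:,} parameters")
```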
|
|
You can fine-tune this model into an LLM.
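A minimal sketch of what one fine-tuning step looks like, assuming a causal-LM objective; a tiny `GPT2Config` stands in for the custom kok-base config so the example runs without a download, and the random token IDs are placeholder data:

```python
import torch
from transformers import GPT2Config, GPT2LMHeadModel

# Tiny stand-in model with randomly initialized weights,
# mirroring what AutoModel.from_config produces.
config = GPT2Config(n_embd=64, n_layer=2, n_head=2,
                    vocab_size=256, n_positions=64)
model = GPT2LMHeadModel(config)
optimizer = torch.optim.AdamW(model.parameters(), lr=1e-4)

# Placeholder batch; a real run would tokenize an actual dataset.
input_ids = torch.randint(0, config.vocab_size, (2, 16))
outputs = model(input_ids=input_ids, labels=input_ids)  # next-token loss
outputs.loss.backward()
optimizer.step()
```

In practice you would wrap this loop with the `transformers` `Trainer` (or your own loop) over a real tokenized corpus.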
|
|
You can use it for any use case; I place no restrictions on it beyond the Apache-2.0 license.
|
|
And while you're at it, you can support me on Buy Me a Coffee :D

http://www.buymeacoffee.com/Moelanoby
|
|
|
|
|
:> |