---
license: mit
tags:
- llama2
- fused
- cpu
- context-8000
- fusion-all2one
- tensor-fusion
- bias-removal
- decode
- coherence-enhancement
- custom-code
library_name: transformers
---
|
|
|
|
|
# xddd |
|
|
|
|
|
This repository includes:

- `hghghgkskdmskdms/xddd` with all layers fused into one (the original layers are not removed)
- Fusion of all tensors into a single contiguous vector
- Bias removal and censorship deactivation
- Generation configuration: do_sample=True, temperature=0.7, top_p=0.9, repetition_penalty=1.2, no_repeat_ngram_size=3
- Decoding functions for tokens, parameters, responses, layers, neurons, tensors, architecture, and the fused tensor
- max_position_embeddings: 8000
- torch_dtype: float32
|
|
|
|
|
```python
from transformers import AutoModelForCausalLM, AutoTokenizer

# Load the tokenizer and model; trust_remote_code=True is required
# because this repo ships custom modeling code.
tokenizer = AutoTokenizer.from_pretrained("jnjj/xddd", trust_remote_code=True)
model = AutoModelForCausalLM.from_pretrained(
    "jnjj/xddd",
    torch_dtype="float32",
    trust_remote_code=True,
)
```
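
The generation settings listed above can be collected into a `GenerationConfig` and passed to `generate`. A minimal sketch follows; the parameter values come from this card, while `max_new_tokens` and the example prompt are hypothetical additions not specified here:

```python
# Sketch: apply the card's generation settings via GenerationConfig.
# Values (do_sample, temperature, top_p, repetition_penalty,
# no_repeat_ngram_size) are taken from the list above;
# max_new_tokens is a hypothetical choice.
from transformers import GenerationConfig

gen_config = GenerationConfig(
    do_sample=True,
    temperature=0.7,
    top_p=0.9,
    repetition_penalty=1.2,
    no_repeat_ngram_size=3,
    max_new_tokens=128,  # assumption: not stated in the card
)

# With the model and tokenizer loaded as shown above:
# inputs = tokenizer("Hello", return_tensors="pt")
# outputs = model.generate(**inputs, generation_config=gen_config)
# print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```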