---
library_name: transformers
tags:
- eeg
- decoding
---

# Cobbs_Head

## Model Details

- **Developed by:** Dukee2506 (private competition release)
- **Model type:** Transformer-based sequence model
- **Languages:** English
- **License:** Research-only, non-commercial
- **Base model:** Private pre-trained backbone (not disclosed)

## Intended Use

This model is provided for a closed competition task. It is intended to decode sequential biosignal inputs into text.
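Sequential biosignal inputs are typically segmented into fixed-length windows before being fed to a transformer. The sketch below illustrates that windowing step only; the window length, stride, and function name are illustrative assumptions, not documented properties of Cobbs_Head.

```python
# Hypothetical preprocessing sketch: split a recording (one sample per
# timestep) into overlapping fixed-length windows. The window length and
# stride here are illustrative assumptions, not model requirements.

def window_signal(signal, window_len=256, stride=128):
    """Return overlapping windows of `window_len` samples, `stride` apart."""
    windows = []
    for start in range(0, len(signal) - window_len + 1, stride):
        windows.append(signal[start:start + window_len])
    return windows

# Example: a 1024-timestep recording yields 7 overlapping 256-sample windows.
samples = list(range(1024))
windows = window_signal(samples)
print(len(windows), len(windows[0]))  # 7 256
```

In practice, each window would be batched and passed through the model, with the decoded text assembled from the per-window outputs.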

### Direct Use

- Running inference on the provided competition data.

### Out-of-Scope Use

- Any deployment outside a research/competition setting.
- Use with unrelated modalities or datasets.

## Training Data

The model was adapted on a curated subset of aligned signals and transcripts. Exact dataset details are withheld for fairness in the competition.

## Evaluation

- Task: sentence-level decoding on hidden test data

## Quick Start

```python
from transformers import AutoModelForPreTraining

# Load the competition checkpoint (access may be gated or private).
model = AutoModelForPreTraining.from_pretrained("Dukee2506/Cobbs_Head")
```