How to use lxzcpro/demo2_protrek with PEFT:

```python
from peft import PeftModel
from transformers import AutoModelForSequenceClassification

base_model = AutoModelForSequenceClassification.from_pretrained("ProTrekHub/Protein_Encoder_35M")
model = PeftModel.from_pretrained(base_model, "lxzcpro/demo2_protrek")
```
Model Card for Model-demo-35M
This model is a demo fine-tune for a protein-level classification task.
Task type
Protein-level Classification
Model input type
SA (structure-aware) sequence
Label meanings
0: None
LoRA config
- r: 8
- lora_dropout: 0.0
- lora_alpha: 16
- target_modules: ['output.dense', 'key', 'value', 'query', 'intermediate.dense']
- modules_to_save: ['classifier']
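The LoRA settings above can be read as follows: each listed target module gets a low-rank update `(alpha / r) * B @ A` added to its frozen weight, with `r = 8` and `alpha = 16`. A minimal sketch of the resulting parameter overhead, using a hypothetical hidden size of 480 (an assumption for illustration; the real ProTrek encoder width may differ):

```python
# Sketch of the LoRA update used by this adapter (r=8, lora_alpha=16).
r, alpha = 8, 16
scaling = alpha / r  # LoRA scales the low-rank delta B @ A by alpha / r = 2.0

def lora_param_count(d_in: int, d_out: int, r: int) -> int:
    """Trainable parameters LoRA adds to one d_in -> d_out linear layer:
    A is (r, d_in) and B is (d_out, r); the frozen base weight is untouched."""
    return r * d_in + d_out * r

# Hypothetical hidden size (an assumption, not taken from the model card).
d = 480
per_module = lora_param_count(d, d, r)
print(per_module)  # 8*480 + 480*8 = 7680 trainable parameters per adapted matrix
```

With five target module patterns plus the `classifier` head in `modules_to_save`, only a small fraction of the 35M base parameters is trained.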
Training config
- optimizer:
- class: AdamW
- betas: (0.9, 0.98)
- weight_decay: 0.01
- learning rate: 0.001
- epoch: 10
- batch size: 2
- precision: 16-mixed
Model tree for lxzcpro/demo2_protrek
Base model
ProTrekHub/Protein_Encoder_35M