# Random LoRA Adapter for Snowflake/snowflake-arctic-embed-l-v2.0
This is a randomly initialized LoRA adapter for testing/benchmarking purposes.
## Configuration
- Base Model: Snowflake/snowflake-arctic-embed-l-v2.0
- LoRA Rank: 64
- LoRA Alpha: 64
- Scaling Factor: 1.0
- Init Scale: 2.0
- Dtype: bfloat16
- Target Modules: `['value', 'query', 'key', 'dense']`
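
For reference, the values above map roughly onto a peft `LoraConfig` as sketched below. This is an illustration, not necessarily the exact config used to generate these weights; in particular, "Init Scale" has no standard `LoraConfig` field, so it appears only as a comment.

```python
from peft import LoraConfig

# Sketch of a LoraConfig matching the values listed above.
# Scaling factor = lora_alpha / r = 64 / 64 = 1.0, consistent with the card.
# "Init Scale: 2.0" has no LoraConfig equivalent; it presumably refers to the
# multiplier applied when the random adapter weights were generated.
lora_config = LoraConfig(
    r=64,
    lora_alpha=64,
    target_modules=["value", "query", "key", "dense"],
)
```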
## Usage
```python
from transformers import AutoModel
from peft import PeftModel

# Load the base embedding model, then attach the random LoRA adapter on top.
base_model = AutoModel.from_pretrained(
    "Snowflake/snowflake-arctic-embed-l-v2.0", trust_remote_code=True
)
model = PeftModel.from_pretrained(base_model, "fred-baseten/snowflake-lora-r64-2")
```
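
Once loaded, the wrapped model behaves like the base embedding model. A minimal inference sketch, assuming the CLS-token pooling and `query: ` prefix described on the base model's card:

```python
import torch
from transformers import AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("Snowflake/snowflake-arctic-embed-l-v2.0")

texts = ["query: what is a LoRA adapter?"]
tokens = tokenizer(texts, padding=True, truncation=True, return_tensors="pt")

with torch.no_grad():
    # CLS-token embedding, L2-normalized, following the base model's usage example.
    embeddings = model(**tokens)[0][:, 0]
    embeddings = torch.nn.functional.normalize(embeddings, p=2, dim=1)

print(embeddings.shape)  # torch.Size([1, 1024])
```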
## Note
These weights are randomly initialized and not trained. They are intended for:
- Testing LoRA loading pipelines
- Benchmarking inference with adapters (see the timing sketch below)
- Initializing weights before fine-tuning
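
For the benchmarking case, here is a minimal timing sketch continuing from the snippets above (batch size, sequence length, and iteration counts are arbitrary illustrative choices):

```python
import time
import torch

# Time forward passes through the adapter-wrapped model.
texts = ["query: benchmark sentence"] * 32
tokens = tokenizer(texts, padding=True, truncation=True, return_tensors="pt")

model.eval()
with torch.no_grad():
    for _ in range(3):  # warm-up iterations
        model(**tokens)
    start = time.perf_counter()
    for _ in range(10):
        model(**tokens)
    elapsed = time.perf_counter() - start

print(f"avg latency per batch: {elapsed / 10 * 1000:.1f} ms")
```

Running the same loop against the bare `base_model` gives a rough measure of the adapter overhead.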