# The Pivot: RL-trained Startup Advisor (LoRA)
Fine-tuned with GRPO on The Pivot OpenEnv environment.
Trained to navigate hidden market phase shifts across 5 startup scenarios over 150 episodes.
## Usage
```python
from transformers import AutoTokenizer, AutoModelForCausalLM
from peft import PeftModel

# Load the base model, then attach the LoRA adapter on top of it
base = AutoModelForCausalLM.from_pretrained("Qwen/Qwen2.5-1.5B-Instruct")
model = PeftModel.from_pretrained(base, "Harshit-Makraria/the-pivot-lora-quick")
tokenizer = AutoTokenizer.from_pretrained("Harshit-Makraria/the-pivot-lora-quick")
```
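To query the loaded model, prompts should follow the base model's chat format. The sketch below builds a Qwen2.5-style ChatML prompt by hand for illustration; in practice `tokenizer.apply_chat_template(messages, add_generation_prompt=True, tokenize=False)` does this for you. The system and user strings are hypothetical examples, not the prompts used in training.

```python
def build_chatml_prompt(system: str, user: str) -> str:
    """Build a Qwen2.5-style ChatML prompt by hand (illustrative sketch;
    tokenizer.apply_chat_template is the supported way to do this)."""
    return (
        f"<|im_start|>system\n{system}<|im_end|>\n"
        f"<|im_start|>user\n{user}<|im_end|>\n"
        f"<|im_start|>assistant\n"  # the model's reply is generated after this tag
    )

# Hypothetical advisor-style prompt for this adapter
prompt = build_chatml_prompt(
    "You are a startup advisor.",
    "Our user growth just flattened. Should we pivot?",
)

# With `model` and `tokenizer` loaded as above, generation would look like:
# inputs = tokenizer(prompt, return_tensors="pt")
# output = model.generate(**inputs, max_new_tokens=256)
# print(tokenizer.decode(output[0][inputs["input_ids"].shape[-1]:]))
print(prompt)
```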
Built for the Meta PyTorch OpenEnv Hackathon 2026.