Text Classification
Transformers
TensorBoard
Safetensors
roberta
Generated from Trainer
unsloth
text-embeddings-inference
Instructions to use PiGrieco/OpenSesame with libraries, inference providers, notebooks, and local apps. Follow these links to get started.
- Libraries
- Transformers
How to use PiGrieco/OpenSesame with Transformers:
```python
# Use a pipeline as a high-level helper
from transformers import pipeline

pipe = pipeline("text-classification", model="PiGrieco/OpenSesame")
```

```python
# Load model directly
from transformers import AutoTokenizer, AutoModelForSequenceClassification

tokenizer = AutoTokenizer.from_pretrained("PiGrieco/OpenSesame")
model = AutoModelForSequenceClassification.from_pretrained("PiGrieco/OpenSesame")
```

- Notebooks
- Google Colab
- Kaggle
- Local Apps
- Unsloth Studio
How to use PiGrieco/OpenSesame with Unsloth Studio:
Install Unsloth Studio (macOS, Linux, WSL)
```sh
curl -fsSL https://unsloth.ai/install.sh | sh

# Run Unsloth Studio
unsloth studio -H 0.0.0.0 -p 8888

# Then open http://localhost:8888 in your browser
# Search for PiGrieco/OpenSesame to start chatting
```
Install Unsloth Studio (Windows)
```powershell
irm https://unsloth.ai/install.ps1 | iex

# Run Unsloth Studio
unsloth studio -H 0.0.0.0 -p 8888

# Then open http://localhost:8888 in your browser
# Search for PiGrieco/OpenSesame to start chatting
```
Using HuggingFace Spaces for Unsloth
```
# No setup required
# Open https://huggingface.co/spaces/unsloth/studio in your browser
# Search for PiGrieco/OpenSesame to start chatting
```
Load model with FastModel
```sh
pip install unsloth
```

```python
from unsloth import FastModel

model, tokenizer = FastModel.from_pretrained(
    model_name="PiGrieco/OpenSesame",
    max_seq_length=2048,
)
```
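The text-classification pipeline shown earlier returns a label/score dict per input. A minimal sketch of turning that into a buying-intention flag, assuming hypothetical label names `LABEL_0`/`LABEL_1` (check the model's `id2label` config to confirm which label means "has buying intention"):

```python
# Interpret the classifier's output as a buying-intention flag.
# LABEL_0 / LABEL_1 and the 0.5 threshold are assumptions, not part of the
# model card -- inspect model.config.id2label before relying on them.

def has_buying_intention(result: dict,
                         positive_label: str = "LABEL_0",
                         threshold: float = 0.5) -> bool:
    """`result` is one item returned by the pipeline,
    e.g. {"label": "LABEL_0", "score": 0.97}."""
    return result["label"] == positive_label and result["score"] >= threshold

# Works on a hand-written result dict, no model download needed:
print(has_buying_intention({"label": "LABEL_0", "score": 0.97}))  # True
```

In practice you would pass `pipe("your prompt")[0]` as `result`; the helper only encodes the thresholding choice, which you can tune on validation data.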
Update README.md

This model is part of the "Word Of Prompt" library and is intended to detect buying intention in users' LLM prompts.

Label 1 = user has buying intention

Label 2 = user has no buying intention

# Join The Team

You can find our pitch deck here: [Word Of Prompt - Pitch Deck](https://drive.google.com/file/d/16EdcmIdPV7qq4XihTqDHyvNdUW-N7v4A/view?usp=sharing)

Word of Prompt wants to help democratize LLMs and agents: uncertain returns on developing such technologies slow their diffusion.

WoP gives developers an alternative monetization method while enhancing the user experience: it revolutionizes advertising by integrating ads seamlessly into AI-driven conversations, maintaining the natural flow of dialogue.

Unlike traditional disruptive ads, our integrable library leverages AI to present contextually relevant ads, mirroring the trust and personal touch of word-of-mouth recommendations. This approach ensures that ads are not just seen but are also relevant and timely, significantly increasing engagement and conversion rates.

In the future, we'll develop a managed platform that gives marketers a new channel for marketing products and developers a new earning opportunity.

With Word of Prompt, we're not just changing how ads are delivered; we're transforming how they're perceived, making them a valuable addition to every conversation.

If you want to know more or join the team, contact us on LinkedIn: [Piermatteo Grieco](https://www.linkedin.com/in/piermatteo-grieco/)

## Training procedure

### Training hyperparameters