---
pipeline_tag: text2text-generation
license: apache-2.0
datasets:
- ElMater06/VSCD-dataset
language:
- en
---
|
# Uploaded model

- **What's This?** Vega Search Controlling Dataset, or VSCD.
|
**Prompt Format: Alpaca**

> Edit this user's input to make it search for relevant keywords online. Only respond with the new string of optimized text.
>
> ### Instruction: {User_query}
>
> ### Response:
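The Alpaca-style template above can be assembled programmatically before passing it to the model. The sketch below is a minimal, hypothetical helper (the function name `build_prompt` and the sample query are illustrative, not part of this repository):

```python
def build_prompt(user_query: str) -> str:
    """Assemble the Alpaca-style prompt described above.

    The fixed system text asks the model to rewrite the user's
    input into search-optimized keywords; the user's query fills
    the Instruction slot, and generation continues after Response.
    """
    return (
        "Edit this user's input to make it search for relevant keywords online. "
        "Only respond with the new string of optimized text.\n\n"
        "### Instruction:\n"
        f"{user_query}\n\n"
        "### Response:\n"
    )

# Example: turn a verbose question into a prompt the model can optimize.
prompt = build_prompt("what are good laptops under 500 dollars for college")
print(prompt)
```

The resulting string can be fed to any text-generation pipeline loaded with this model; the text the model produces after `### Response:` is the optimized search query.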
- VSCD-2b is a large language model trained on search queries and their summarized forms.
- VSCD was trained for 9 epochs on a synthetic dataset of 300 entries generated with the Gemma-2b-IT model. It was made to quickly summarize a user's query so that search algorithms run faster.
- **Developed by:** ElMater06
- **License:** apache-2.0
- **Finetuned from model:** unsloth/gemma-2b-it-bnb-4bit

This gemma model was trained 2x faster with [Unsloth](https://github.com/unslothai/unsloth) and Huggingface's TRL library.

[<img src="https://raw.githubusercontent.com/unslothai/unsloth/main/images/unsloth%20made%20with%20love.png" width="200"/>](https://github.com/unslothai/unsloth)