SadokBarbouche committed
Commit 41f6f5f · verified · 1 Parent(s): ef21fa5

Update README.md

Files changed (1):
  1. README.md +28 -27

README.md CHANGED
@@ -1,34 +1,35 @@
- ---
- license: gemma
- library_name: transformers
- tags:
- - mlx
- widget:
- - messages:
-   - role: user
-     content: How does the brain work?
- inference:
-   parameters:
-     max_new_tokens: 200
- extra_gated_heading: Access Gemma on Hugging Face
- extra_gated_prompt: To access Gemma on Hugging Face, you’re required to review and
-   agree to Google’s usage license. To do this, please ensure you’re logged-in to Hugging
-   Face and click below. Requests are processed immediately.
- extra_gated_button_content: Acknowledge license
- ---
-
- # SadokBarbouche/planned.AI-gemma-2b-it-quantized
- This model was converted to MLX format from [`google/gemma-1.1-2b-it`]().
- Refer to the [original model card](https://huggingface.co/google/gemma-1.1-2b-it) for more details on the model.
- ## Use with mlx
  ```bash
- pip install mlx-lm
  ```

  ```python
- from mlx_lm import load, generate
-
- model, tokenizer = load("SadokBarbouche/planned.AI-gemma-2b-it-quantized")
- response = generate(model, tokenizer, prompt="hello", verbose=True)
  ```

+ # Planned.AI (planned day) Personalized Trip Planner Model in Tunisia (4-bit quantized)
+
+ ## Overview
+ This repository contains a personalized trip planner built on a finetuned version of the `google/gemma-1.1-2b-it` base model, used through the Hugging Face Transformers library. Given a user's preferences and chosen destinations, the tool generates tailored trip itineraries, drawing on a dataset of places scraped from across Tunisia.
+
+ ## Model Description
+ The model is a finetuned Gemma checkpoint trained on a dataset of attractions, landmarks, and destinations across Tunisia. By incorporating user preferences and destination inputs, it generates personalized trip plans that cater to individual interests and requirements.
+
+ ## Usage
+ To use the personalized trip planner, follow these steps:
+
+ 1. Install the Hugging Face Transformers library:
  ```bash
+ pip install transformers
  ```

+ 2. Load the finetuned model and tokenizer:
  ```python
+ from transformers import AutoModelForCausalLM, AutoTokenizer
+
+ # Load the finetuned model
+ model = AutoModelForCausalLM.from_pretrained("SadokBarbouche/planned.AI-gemma-2b-it")
+
+ # Load the matching tokenizer
+ tokenizer = AutoTokenizer.from_pretrained("SadokBarbouche/planned.AI-gemma-2b-it")
  ```
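The snippet above only loads the weights. A minimal sketch of actually generating an itinerary with the standard Transformers generate API might look like the following; the helper name `plan_trip` and the example prompt are illustrative, not part of the repository, and the heavyweight import is deferred so that merely defining the helper does not trigger the model download.

```python
def plan_trip(request, max_new_tokens=200):
    """Generate a trip itinerary for a free-form user request.

    Illustrative sketch: downloads the finetuned checkpoint on first call.
    """
    # Deferred import so defining the helper stays lightweight
    from transformers import AutoModelForCausalLM, AutoTokenizer

    model_id = "SadokBarbouche/planned.AI-gemma-2b-it"
    tokenizer = AutoTokenizer.from_pretrained(model_id)
    model = AutoModelForCausalLM.from_pretrained(model_id)

    # Gemma instruction-tuned checkpoints ship a chat template
    messages = [{"role": "user", "content": request}]
    inputs = tokenizer.apply_chat_template(
        messages, add_generation_prompt=True, return_tensors="pt"
    )
    outputs = model.generate(inputs, max_new_tokens=max_new_tokens)
    # Decode only the newly generated tokens, not the echoed prompt
    return tokenizer.decode(outputs[0][inputs.shape[-1]:], skip_special_tokens=True)
```

For example, `plan_trip("Plan a one-day trip in Tunis for a history lover.")` would return a generated itinerary as a string.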
+
+ ## Data Preparation
+ The training data comprises scraped information about attractions and landmarks across Tunisia. The dataset was curated to cover a diverse range of destinations so that the model can generate comprehensive trip plans.
+
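The README does not document the dataset schema. Purely as an illustration of how a scraped place record could be phrased as an instruction-tuning pair, a sketch might look like this; every field name and string below is hypothetical, not taken from the repository.

```python
# Hypothetical scraped-place record; field names are illustrative only
place = {
    "name": "Medina of Tunis",
    "city": "Tunis",
    "category": "historic site",
    "rating": 4.6,
}

# One possible way to phrase the record as a prompt/completion pair
prompt = f"I'm visiting {place['city']} and enjoy history. What should I see?"
completion = (
    f"Consider the {place['name']}, a {place['category']} "
    f"rated {place['rating']}/5 by visitors."
)
print(prompt)
print(completion)
```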
+ ## Evaluation
+ The planner was evaluated on its ability to generate relevant, coherent, and personalized trip plans for given user preferences and destinations. The results demonstrate that the finetuned model provides valuable recommendations for travelers.
+
+ ## Acknowledgements
+ We would like to thank the contributors of the `google-maps-scraper` tool on GitHub, as well as the developers of the Hugging Face Transformers library, for their support in model integration and usage.