teolm30 committed on
Commit 4825040 · verified · 1 Parent(s): 0274e91

Add comprehensive terminal commands for Ollama usage

Files changed (1):
  1. README.md +47 -5
README.md CHANGED
@@ -94,18 +94,60 @@ Fox1.3 combines two strategies that eliminate the need for massive model sizes:
  
  ## 🚀 Usage
  
- ### Ollama
+ ### Terminal / Command Line
  
  ```bash
+ # Run the model (single prompt)
+ ollama run fox1.3 "Your question here"
+
+ # Check if model is installed
+ ollama list
+
+ # Pull the model from HuggingFace
+ ollama pull teolm30/fox1.3
+
+ # Start interactive chat
  ollama run fox1.3
+
+ # Example prompts to try:
+ # "If all birds can fly and penguins are birds, can penguins fly?"
+ # "A bat and ball cost $1.10. The bat costs $1.00 more than the ball. How much is the ball?"
+ # "Write a Python function to check if a number is even"
  ```
  
- ### Via Python (with LoRA)
+ ### Python API
  
  ```python
- from unsloth import FastLanguageModel
- model, tokenizer = FastLanguageModel.from_pretrained("unsloth/Qwen2.5-0.5B-unsloth-bnb-4bit")
- model = PeftModel.from_pretrained(model, "teolm30/fox1.3-v7-lora")
+ import requests
+
+ response = requests.post("http://localhost:11434/api/generate", json={
+     "model": "fox1.3",
+     "prompt": "Your question here",
+     "stream": False
+ })
+ print(response.json()["response"])
  ```
  
+ ### Via Ollama Python Library
+
+ ```python
+ import ollama
+
+ response = ollama.chat(model='fox1.3', messages=[
+     {'role': 'user', 'content': 'Your question here'}
+ ])
+ print(response['message']['content'])
+ ```
+
+ ### Via OpenClaw (Recommended for Web Search)
+
+ Fox1.3 works best through OpenClaw, which adds web search capability:
+
+ ```bash
+ # Start OpenClaw
+ openclaw start
+
+ # The model is automatically available through the OpenClaw interface
+ ```
+
  ---
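
The `requests` example added in this commit sets `"stream": False` to get one JSON object back. With `"stream": True`, Ollama's `/api/generate` endpoint instead returns newline-delimited JSON chunks, each carrying a `response` fragment and a `done` flag. A minimal sketch of assembling such a stream (the sample chunks below are illustrative, not real model output):

```python
import json

def collect_stream(lines):
    """Concatenate the 'response' fields of newline-delimited JSON
    chunks, stopping at the chunk whose 'done' flag is true."""
    text = []
    for line in lines:
        if not line.strip():
            continue
        chunk = json.loads(line)
        text.append(chunk.get("response", ""))
        if chunk.get("done"):
            break
    return "".join(text)

# Illustrative chunks in the shape Ollama streams them:
sample = [
    '{"response": "The ball ", "done": false}',
    '{"response": "costs $0.05.", "done": true}',
]
print(collect_stream(sample))  # The ball costs $0.05.
```

In practice the lines would come from `requests.post(..., stream=True)` via `response.iter_lines()` rather than a hard-coded list.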