manikumargouni committed on
Commit d7448ff · verified · 1 Parent(s): db31104

Upload README.md with huggingface_hub

Files changed (1): README.md (+46 -1)
README.md CHANGED
@@ -43,7 +43,30 @@ Combines multitask intent modeling, supervised IAB content classification, and p
 
 ## Deployment Options
 
-### 1. `transformers.pipeline()` one line anywhere
+### 0. Colab / Kaggle Quickstart (copy/paste)
+
+```python
+!pip -q install -U pip
+!pip -q install -U "torch>=2.0.0" "transformers>=4.36.0" "huggingface_hub>=0.20.0" "safetensors>=0.4.0"
+```
+
+```python
+from transformers import pipeline
+
+clf = pipeline(
+    "admesh-intent",
+    model="admesh/agentic-intent-classifier",
+    trust_remote_code=True,  # required (custom pipeline + multi-model bundle)
+)
+
+out = clf("Which laptop should I buy for college?")
+print(out["meta"])
+print(out["model_output"]["classification"]["intent"])
+```
+
+---
+
+### 1. `transformers.pipeline()` — anywhere (Python)
 
 ```python
 from transformers import pipeline
@@ -142,6 +165,28 @@ clf = AdmeshIntentPipeline.from_pretrained("admesh/agentic-intent-classifier")
 
 ---
 
+## Troubleshooting (avoid environment errors)
+
+### `No module named 'combined_inference'` (or similar)
+
+This means the Hub repo root is missing required Python files. Ensure these exist at the **root of the model repo** (same level as `pipeline.py`):
+
+- `pipeline.py`, `config.json`, `config.py`
+- `combined_inference.py`, `schemas.py`
+- `model_runtime.py`, `multitask_runtime.py`, `multitask_model.py`
+- `inference_intent_type.py`, `inference_subtype.py`, `inference_decision_phase.py`, `inference_iab_classifier.py`
+- `iab_classifier.py`, `iab_taxonomy.py`
+
+### `does not appear to have a file named model.safetensors`
+
+Transformers requires a standard checkpoint at the repo root for `pipeline()` to initialize. This repo includes a **small dummy** `model.safetensors` + tokenizer files at the root for compatibility; the *real* production weights live in:
+
+- `multitask_intent_model_output/`
+- `iab_classifier_model_output/`
+- `artifacts/calibration/`
+
+---
+
 ## Example Output
 
 ```json
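
The troubleshooting checklist in this commit (required files at the repo root) can be turned into a pre-flight check. Below is a minimal sketch, assuming the file list from the README above; `missing_root_files`, `check_repo`, and the `REQUIRED` set are illustrative helpers, not part of the repo, while `huggingface_hub.list_repo_files` is a real Hub API:

```python
# Pre-flight check: confirm the files the README's troubleshooting section
# expects are present at the root of the model repo before calling pipeline().

# Mirrors the file list from the troubleshooting section above.
REQUIRED = {
    "pipeline.py", "config.json", "config.py",
    "combined_inference.py", "schemas.py",
    "model_runtime.py", "multitask_runtime.py", "multitask_model.py",
    "inference_intent_type.py", "inference_subtype.py",
    "inference_decision_phase.py", "inference_iab_classifier.py",
    "iab_classifier.py", "iab_taxonomy.py",
}

def missing_root_files(all_files, required=REQUIRED):
    """Given a flat list of repo paths, return required files absent from the root."""
    root_files = {f for f in all_files if "/" not in f}  # no "/" => repo root
    return set(required) - root_files

def check_repo(repo_id):
    """Fetch the repo file listing from the Hub (needs huggingface_hub + network)."""
    from huggingface_hub import list_repo_files
    return missing_root_files(list_repo_files(repo_id))

# Usage (network):
#   missing = check_repo("admesh/agentic-intent-classifier")
#   if missing:
#       print("Missing at repo root:", sorted(missing))
```

If the returned set is non-empty, the `No module named '...'` error above is expected, since `trust_remote_code=True` can only import modules that actually exist at the repo root.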