doberst committed on
Commit e2fac82 · verified · 1 Parent(s): 9c87204

Update README.md

Files changed (1): README.md (+4 −3)
README.md CHANGED
@@ -10,7 +10,9 @@ license: apache-2.0
 
 slim-sentiment has been fine-tuned for **sentiment analysis** function calls, generating output consisting of JSON dictionary corresponding to specified keys.
 
-Each slim model has a corresponding 'tool' in a separate repository, e.g., [**'slim-sentiment-tool'**](https://huggingface.co/llmware/slim-sentiment-tool), which a 4-bit quantized gguf version of the model that is intended to be used for inference.
+Each slim model has a corresponding 'tool' in a separate repository, e.g.,
+
+[**'slim-sentiment-tool'**](https://huggingface.co/llmware/slim-sentiment-tool), which a 4-bit quantized gguf version of the model that is intended to be used for inference.
 
 Inference speed and loading time is much faster with the 'tool' versions of the model.
 
@@ -42,7 +44,7 @@ All of the SLIM models use a novel prompt instruction structured as follows:
 
 "<human> " + text + "<classify> " + keys + "</classify>" + "/n<bot>: "
 
-=
+
 ## How to Get Started with the Model
 
 The fastest way to get started with BLING is through direct import in transformers:
 
@@ -87,7 +89,6 @@ The fastest way to get started with BLING is through direct import in transforme
 except:
     print("could not convert to json automatically - ", output_only)
 
-'''
 
 ## Using as Function Call in LLMWare
 