Instructions to use OtterDev/otterchat2 with libraries, inference providers, notebooks, and local apps. Follow these links to get started.
- Libraries
- Transformers
How to use OtterDev/otterchat2 with Transformers:
```python
# Use a pipeline as a high-level helper
from transformers import pipeline

pipe = pipeline("question-answering", model="OtterDev/otterchat2")
```

```python
# Load model directly
from transformers import AutoTokenizer, AutoModelForQuestionAnswering

tokenizer = AutoTokenizer.from_pretrained("OtterDev/otterchat2")
model = AutoModelForQuestionAnswering.from_pretrained("OtterDev/otterchat2")
```

- Notebooks
- Google Colab
- Kaggle
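A question-answering pipeline extracts an answer span from a supplied context. A minimal invocation sketch, assuming the `OtterDev/otterchat2` checkpoint is reachable on the Hub; the question and context strings are placeholders, not from the model card:

```python
from transformers import pipeline

# High-level QA pipeline; downloads the checkpoint on first use
pipe = pipeline("question-answering", model="OtterDev/otterchat2")

# Placeholder inputs for illustration
result = pipe(
    question="What is the capital of France?",
    context="Paris is the capital and largest city of France.",
)

# The pipeline returns a dict with the answer span and a confidence score
print(result["answer"], result["score"])
```

The returned dict also includes `start` and `end` character offsets of the answer within the context, which is useful when highlighting the extracted span.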
Update README.md
README.md CHANGED

```diff
@@ -57,6 +57,7 @@ This model can be used to extract data from text, such as an essay.
 <!-- This section is meant to convey both technical and sociotechnical limitations. -->
 
 The main limitation of this model is you <ins>NEED</ins> to have data in order for it to work. More will be posted when more limitations are found.
+Another limitation is it is quite slow when translating languages.
 
 ### Recommendations
 
```