Instructions to use BAAI/AquilaCode-multi with libraries, inference providers, notebooks, and local apps. Follow these links to get started.
- Libraries
  - Transformers

How to use BAAI/AquilaCode-multi with Transformers:

```python
# Load model directly
from transformers import AutoModel

model = AutoModel.from_pretrained(
    "BAAI/AquilaCode-multi",
    trust_remote_code=True,
    dtype="auto",
)
```

- Notebooks
  - Google Colab
  - Kaggle
Commit 5901b16
Parent(s): b0fce2b
Upload pytorch_model-00001-of-00002.bin
pytorch_model-00001-of-00002.bin (ADDED):

```diff
@@ -0,0 +1,3 @@
+version https://git-lfs.github.com/spec/v1
+oid sha256:1faabaa19bc93930a2b12778672c74278c739e782eb0845b343e6a92ac9ad242
+size 9948612892
```
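The three added lines are a Git LFS pointer file: the repository itself stores only this small stub, while the roughly 9.9 GB weight shard lives in LFS storage, identified by its SHA-256 digest. A minimal sketch of reading such a pointer, assuming the simple `key value` line format shown above (the `parse_lfs_pointer` helper is written here for illustration and is not part of any library):

```python
# Sketch: split the "key value" lines of a Git LFS pointer into a dict.
# parse_lfs_pointer is a hypothetical helper, not a real library function.

def parse_lfs_pointer(text: str) -> dict:
    """Parse an LFS pointer file into {key: value} pairs."""
    fields = {}
    for line in text.strip().splitlines():
        key, _, value = line.partition(" ")
        fields[key] = value
    return fields

pointer = """version https://git-lfs.github.com/spec/v1
oid sha256:1faabaa19bc93930a2b12778672c74278c739e782eb0845b343e6a92ac9ad242
size 9948612892
"""

info = parse_lfs_pointer(pointer)
algo, digest = info["oid"].split(":", 1)
print(algo)               # hash algorithm of the object id
print(int(info["size"]))  # shard size in bytes
```

During checkout, a Git LFS client resolves the `oid` against the remote LFS store and replaces the stub with the real file; the `size` field lets it verify the download length.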