Instructions to use BAAI/AquilaCode-multi with libraries, inference providers, notebooks, and local apps. Follow these links to get started.
- Libraries
- Transformers
How to use BAAI/AquilaCode-multi with Transformers:
```python
# Load model directly
from transformers import AutoModel

model = AutoModel.from_pretrained(
    "BAAI/AquilaCode-multi",
    trust_remote_code=True,
    dtype="auto",
)
```
- Notebooks
- Google Colab
- Kaggle
Commit 7ab31cd (parent: aaf932a): Update README.md
README.md CHANGED:

```diff
@@ -37,7 +37,7 @@ The additional details of the Aquila model will be presented in the official tec
 
 We will continue to release improved versions of Aquila model as open source.
 
-- 2023/07/
+- 2023/07/24 :release v0.9
 - AquilaCode-mutil-01 md5: e6ea49fea7a737ffe41086ec7019cebb
 - AquilaCode-mutil-02 md5: 4bba98eac44d785358ed5b6d2144a94a
 - AquilaCode-Python-01 md5: e202e5b82db773ea369fe843fef1c34c
```