Tags: Feature Extraction · Transformers · PyTorch · English · roberta · fill-mask · smart-contract · web3 · software-engineering · embedding · codebert · solidity · code-understanding
Instructions to use web3se/SmartBERT-v2 with libraries, inference providers, notebooks, and local apps. Follow these links to get started.
- Libraries
- Transformers
How to use web3se/SmartBERT-v2 with Transformers:
```python
# Use a pipeline as a high-level helper
from transformers import pipeline

pipe = pipeline("feature-extraction", model="web3se/SmartBERT-v2")
```

```python
# Load model directly
from transformers import AutoTokenizer, AutoModelForMaskedLM

tokenizer = AutoTokenizer.from_pretrained("web3se/SmartBERT-v2")
model = AutoModelForMaskedLM.from_pretrained("web3se/SmartBERT-v2")
```

- Notebooks
- Google Colab
- Kaggle
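Since the model is tagged for feature extraction on smart contracts, a common pattern is to embed a Solidity snippet by pooling the encoder's hidden states. The sketch below uses mask-aware mean pooling over the last hidden state; that pooling choice, and the example Solidity function, are assumptions for illustration, not a recipe prescribed by the model card.

```python
# Sketch: derive a fixed-size embedding for a Solidity function with
# web3se/SmartBERT-v2. Mean pooling is an assumed strategy, not one
# specified by the model card.
import torch
from transformers import AutoTokenizer, AutoModel

tokenizer = AutoTokenizer.from_pretrained("web3se/SmartBERT-v2")
model = AutoModel.from_pretrained("web3se/SmartBERT-v2")
model.eval()

# Hypothetical Solidity input, for illustration only.
code = (
    "function transfer(address to, uint256 amount) public returns (bool) {"
    " balances[msg.sender] -= amount; balances[to] += amount; return true; }"
)
inputs = tokenizer(code, return_tensors="pt", truncation=True, max_length=512)

with torch.no_grad():
    hidden = model(**inputs).last_hidden_state   # (1, seq_len, hidden_size)

# Mask-aware mean pooling: average only over real tokens, not padding.
mask = inputs["attention_mask"].unsqueeze(-1)    # (1, seq_len, 1)
embedding = (hidden * mask).sum(dim=1) / mask.sum(dim=1)  # (1, hidden_size)
print(embedding.shape)
```

The resulting vector can be compared across contracts with cosine similarity, e.g. via `torch.nn.functional.cosine_similarity`.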
Update README.md

README.md CHANGED

```diff
@@ -173,7 +173,6 @@ If you use **SmartBERT** in your research, please cite:
 
 ## Acknowledgement
 
-This project was supported by:
-
 - [Institute of Intelligent Computing Technology, Suzhou, CAS](http://iict.ac.cn/)
+- [Macau University of Science and Technology](http://www.must.edu.mo)
 - CAS Mino (中科劢诺)
```