Sentence Similarity
sentence-transformers
PyTorch
Safetensors
Transformers
roberta
feature-extraction
text-embeddings-inference
Instructions to use mchochlov/codebert-base-cd-ft with libraries, inference providers, notebooks, and local apps. Follow these links to get started.
- Libraries
- sentence-transformers
How to use mchochlov/codebert-base-cd-ft with sentence-transformers:
```python
from sentence_transformers import SentenceTransformer

model = SentenceTransformer("mchochlov/codebert-base-cd-ft")

sentences = [
    "That is a happy person",
    "That is a happy dog",
    "That is a very happy person",
    "Today is a sunny day",
]

embeddings = model.encode(sentences)
similarities = model.similarity(embeddings, embeddings)
print(similarities.shape)  # [4, 4]
```

- Transformers
How to use mchochlov/codebert-base-cd-ft with Transformers:
```python
# Load model directly
from transformers import AutoTokenizer, AutoModel

tokenizer = AutoTokenizer.from_pretrained("mchochlov/codebert-base-cd-ft")
model = AutoModel.from_pretrained("mchochlov/codebert-base-cd-ft")
```

- Notebooks
- Google Colab
- Kaggle
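Note that loading the model with plain Transformers, as above, yields token-level hidden states rather than a single sentence vector. A common way to reduce them to one embedding is attention-mask-weighted mean pooling; the sketch below assumes mean pooling matches the pooling this model was fine-tuned with (the sentence-transformers route applies the configured pooling automatically, so prefer it when in doubt):

```python
import torch
from transformers import AutoTokenizer, AutoModel

tokenizer = AutoTokenizer.from_pretrained("mchochlov/codebert-base-cd-ft")
model = AutoModel.from_pretrained("mchochlov/codebert-base-cd-ft")
model.eval()

def embed(texts):
    # Pad/truncate so the batch forms a rectangular tensor
    batch = tokenizer(texts, padding=True, truncation=True, return_tensors="pt")
    with torch.no_grad():
        hidden = model(**batch).last_hidden_state  # (batch, seq_len, 768)
    # Mean-pool over real tokens only, masking out padding positions
    mask = batch["attention_mask"].unsqueeze(-1).float()
    return (hidden * mask).sum(dim=1) / mask.sum(dim=1)

# Two hypothetical code snippets for clone comparison
emb = embed(["def add(a, b): return a + b",
             "def sum2(x, y): return x + y"])
sim = torch.nn.functional.cosine_similarity(emb[0], emb[1], dim=0)
print(emb.shape, sim.item())
```

The example texts and the `embed` helper are illustrative, not part of the model card.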
Update README.md

README.md CHANGED

@@ -92,7 +92,7 @@ SentenceTransformer(
 
 <!--- Describe where people can find more information -->
 
 Please cite this paper if using the model.
-
+```latex
 @inproceedings{chochlov2022using,
 title={Using a Nearest-Neighbour, BERT-Based Approach for Scalable Clone Detection},
 author={Chochlov, Muslim and Ahmed, Gul Aftab and Patten, James Vincent and Lu, Guoxian and Hou, Wei and Gregg, David and Buckley, Jim},
@@ -100,4 +100,5 @@ Please cite this paper if using the model.
 pages={582--591},
 year={2022},
 organization={IEEE}
 }
+```