Instructions for using SEBIS/code_trans_t5_small_program_synthese_transfer_learning_finetune with libraries, inference providers, notebooks, and local apps. Follow the links below to get started.
- Libraries
- Transformers
How to use SEBIS/code_trans_t5_small_program_synthese_transfer_learning_finetune with Transformers:
# Use a pipeline as a high-level helper
# Warning: Pipeline type "summarization" is no longer supported in transformers v5.
# You must load the model directly (see below) or downgrade to v4.x with:
# pip install "transformers<5.0.0"
from transformers import pipeline

pipe = pipeline("summarization", model="SEBIS/code_trans_t5_small_program_synthese_transfer_learning_finetune")

# Load model directly
from transformers import AutoTokenizer, AutoModel

tokenizer = AutoTokenizer.from_pretrained("SEBIS/code_trans_t5_small_program_synthese_transfer_learning_finetune")
model = AutoModel.from_pretrained("SEBIS/code_trans_t5_small_program_synthese_transfer_learning_finetune")

An end-to-end generation sketch appears after the notebook links below.
- Notebooks
- Google Colab
- Kaggle
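For reference, here is a minimal generation sketch for this checkpoint. It is not taken verbatim from the model card: it loads AutoModelForSeq2SeqLM instead of the bare AutoModel shown above so that generate() is available, and the input description and generation settings are illustrative assumptions only.

# Minimal generation sketch (assumes transformers and torch are installed).
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

model_name = "SEBIS/code_trans_t5_small_program_synthese_transfer_learning_finetune"
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForSeq2SeqLM.from_pretrained(model_name)

# CodeTrans program synthesis maps a natural-language description to code,
# so the input is a plain-text description (the example string is illustrative).
description = "you are given an array of numbers a and a number b , compute the difference of elements in a and b"
inputs = tokenizer(description, return_tensors="pt")
output_ids = model.generate(**inputs, max_length=128)
print(tokenizer.decode(output_ids[0], skip_special_tokens=True))

The tokenizer and model name are the same as in the snippet above; only the model class differs so that the checkpoint's sequence-to-sequence head is loaded.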
Commit · 251beaa
Parent(s): 14782fb
upload flax model
Files changed: flax_model.msgpack (+3 -0)
flax_model.msgpack ADDED
@@ -0,0 +1,3 @@
+version https://git-lfs.github.com/spec/v1
+oid sha256:2ec5eaa99d63c40c50cc715764603f61f27081c2074811012dc86a92c173f2e4
+size 242032202
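The added file is a Git LFS pointer rather than the weights themselves: the actual flax_model.msgpack blob (242,032,202 bytes, roughly 242 MB) is stored in LFS and identified by the SHA-256 hash above. Since this commit adds native Flax weights, the checkpoint can presumably also be loaded through the Flax classes in Transformers; the following is a minimal sketch under that assumption (FlaxAutoModelForSeq2SeqLM and the example input are choices made for illustration, not part of the commit or model card).

# Sketch only: loading the Flax weights added by this commit (requires jax and flax to be installed).
from transformers import AutoTokenizer, FlaxAutoModelForSeq2SeqLM

model_name = "SEBIS/code_trans_t5_small_program_synthese_transfer_learning_finetune"
tokenizer = AutoTokenizer.from_pretrained(model_name)
# No from_pt conversion is needed because flax_model.msgpack ships native Flax weights.
model = FlaxAutoModelForSeq2SeqLM.from_pretrained(model_name)

# Illustrative input; the task is generating code from a natural-language description.
inputs = tokenizer("compute the sum of two numbers a and b", return_tensors="np")
output_ids = model.generate(**inputs, max_length=64).sequences
print(tokenizer.decode(output_ids[0], skip_special_tokens=True))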