How to use SEBIS/code_trans_t5_small_program_synthese_multitask with Transformers:
```python
# Use a pipeline as a high-level helper
# Warning: the "summarization" pipeline type is no longer supported in transformers v5.
# Either load the model directly (see below) or downgrade to v4.x with:
#   pip install "transformers<5.0.0"
from transformers import pipeline

pipe = pipeline("summarization", model="SEBIS/code_trans_t5_small_program_synthese_multitask")
```
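On transformers v4.x, the pipeline can then be called on a natural-language description of the desired program. A minimal usage sketch; the input string below is an illustrative, space-tokenized example, not an official one from the model card:

```python
# Minimal usage sketch (requires transformers v4.x, see the warning above).
# The description is an illustrative example only.
description = "you are given an array of numbers a and a number b , compute the difference of elements in a and b"
print(pipe(description))
```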
```python
# Load model directly
from transformers import AutoTokenizer, AutoModel

tokenizer = AutoTokenizer.from_pretrained("SEBIS/code_trans_t5_small_program_synthese_multitask")
# Note: AutoModel loads the bare T5 encoder-decoder without the language
# modeling head; for code generation, use AutoModelForSeq2SeqLM (see below).
model = AutoModel.from_pretrained("SEBIS/code_trans_t5_small_program_synthese_multitask")
```
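For actual program synthesis you need the generation head. Below is a minimal sketch assuming `AutoModelForSeq2SeqLM` and greedy decoding; the example description and the `max_length` value are illustrative choices, not taken from the model card:

```python
# Minimal generation sketch: produce code from a natural-language
# description. AutoModelForSeq2SeqLM adds the LM head that T5 needs
# for generation. The input below is an illustrative example only.
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

model_id = "SEBIS/code_trans_t5_small_program_synthese_multitask"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSeq2SeqLM.from_pretrained(model_id)

description = "you are given an array of numbers a and a number b , compute the difference of elements in a and b"
inputs = tokenizer(description, return_tensors="pt")
output_ids = model.generate(**inputs, max_length=128)
print(tokenizer.decode(output_ids[0], skip_special_tokens=True))
```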
The TensorFlow weights were validated with the `pt_to_tf` CLI: the maximum crossload and converted hidden-state differences were both 1.431e-06.