How to use microsoft/tapex-base with Transformers:
# Use a pipeline as a high-level helper
from transformers import pipeline

pipe = pipeline("table-question-answering", model="microsoft/tapex-base")
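The pipeline is then called with a table (a dict of columns or a pandas DataFrame) and a query. A minimal sketch with made-up data follows; note that microsoft/tapex-base is the pre-trained (not fine-tuned) checkpoint, so per the paper's SQL-execution pre-training it expects an SQL-style query rather than a natural-language question:

# Minimal usage sketch; the table and query below are illustrative examples.
# For natural-language questions, a fine-tuned variant such as
# microsoft/tapex-base-finetuned-wtq is the usual choice.
table = {
    "year": ["1896", "2008", "2012"],
    "city": ["athens", "beijing", "london"],
}
result = pipe(table=table, query="select year where city = beijing")
print(result)  # e.g. {'answer': '2008'}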
# Load model directly
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

tokenizer = AutoTokenizer.from_pretrained("microsoft/tapex-base")
model = AutoModelForSeq2SeqLM.from_pretrained("microsoft/tapex-base")
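When loading the model directly, the tokenizer takes the table and query together and the answer is generated as text. A minimal sketch, again with made-up data and an SQL-style query, assuming AutoTokenizer resolves to the TapexTokenizer for this checkpoint (which accepts table= and query= arguments and linearizes the table into a flat token sequence):

import pandas as pd

# Illustrative table; the tokenizer flattens it together with the query.
data = {"year": ["1896", "2008", "2012"], "city": ["athens", "beijing", "london"]}
table = pd.DataFrame.from_dict(data)

encoding = tokenizer(table=table, query="select year where city = beijing", return_tensors="pt")
outputs = model.generate(**encoding)
print(tokenizer.batch_decode(outputs, skip_special_tokens=True))  # e.g. ['2008']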
In the paper: TAPEX is conceptually simple and easy to implement. In this paper, we regard the pre-training as a sequence generation task and employ an encoder-decoder model
Thanks for fixing!