Instructions to use law-ai/InLegalBERT with libraries, inference providers, notebooks, and local apps. Follow these links to get started.
- Libraries
- Transformers
How to use law-ai/InLegalBERT with Transformers:
```python
# Use a pipeline as a high-level helper
from transformers import pipeline

pipe = pipeline("fill-mask", model="law-ai/InLegalBERT")
```

```python
# Load model directly
from transformers import AutoTokenizer, AutoModelForPreTraining

tokenizer = AutoTokenizer.from_pretrained("law-ai/InLegalBERT")
model = AutoModelForPreTraining.from_pretrained("law-ai/InLegalBERT")
```

- Inference
- Notebooks
- Google Colab
- Kaggle
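As a quick sanity check of the pipeline snippet above, the fill-mask pipeline can be called directly on a masked sentence. InLegalBERT is a BERT-style model, so the mask token is `[MASK]`; the example sentence below is illustrative, not from the model card:

```python
from transformers import pipeline

# Fill-mask pipeline for InLegalBERT (BERT-style, so the mask token is [MASK])
pipe = pipeline("fill-mask", model="law-ai/InLegalBERT")

# Illustrative legal sentence; the pipeline returns candidate fillers for [MASK],
# each a dict with "token_str", "score", and the completed "sequence"
results = pipe("The accused was convicted under Section 302 of the Indian Penal [MASK].")
for r in results:
    print(r["token_str"], round(r["score"], 3))
```

Each result is one candidate token with its probability, sorted from most to least likely.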
How to use the model??!
#7
by liz12 - opened
How exactly do we use this model? Can you provide demo code? I understand the usage of pipelines, tokenizers, etc., but how exactly do we deploy it to produce results?
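One common way to get usable results out of an encoder like this is to extract sentence embeddings for downstream tasks (search, clustering, classification). The sketch below is one such recipe, not an official one from the model authors: it loads the bare encoder with `AutoModel` and mean-pools the token embeddings, which is a standard but arbitrary pooling choice; the input sentence is illustrative.

```python
import torch
from transformers import AutoTokenizer, AutoModel

# AutoModel loads the bare BERT encoder, without the pretraining heads
tokenizer = AutoTokenizer.from_pretrained("law-ai/InLegalBERT")
model = AutoModel.from_pretrained("law-ai/InLegalBERT")
model.eval()

# Illustrative input; truncation keeps it within BERT's 512-token limit
text = "The appellant filed a writ petition before the High Court."
inputs = tokenizer(text, return_tensors="pt", truncation=True, max_length=512)

with torch.no_grad():
    outputs = model(**inputs)

# Mean-pool the per-token hidden states into a single sentence vector
embedding = outputs.last_hidden_state.mean(dim=1)
print(embedding.shape)
```

The resulting vector can then be fed to a similarity search index or a classifier head of your own.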
How is this model to be used within the LangChain framework? Can someone help?