---
license: openrail
language:
- en
library_name: transformers
pipeline_tag: text-generation
tags:
- legal
---
# Distil GPT

<!-- Provide a quick summary of what the model is/does. -->
This is a small version of Generative Pre-trained Transformer 2 (GPT-2), developed by **AI Systems** and pretrained on 10 GB of Pakistan's legal corpus using causal language modelling to generate legal text.

<!-- Provide a longer summary of what this model is. -->
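Since the card declares `library_name: transformers` and `pipeline_tag: text-generation`, the model can be used through the standard `transformers` pipeline. This card does not state the fine-tuned checkpoint's repo id, so the sketch below loads the base `distilgpt2` model it was derived from as a stand-in; swap in this model's repo id to generate legal text instead.

```python
from transformers import pipeline

# Repo id of the fine-tuned legal checkpoint is not given in this card;
# "distilgpt2" (the base model it was derived from) is used as a stand-in.
generator = pipeline("text-generation", model="distilgpt2")

prompt = "The parties to this agreement hereby"
outputs = generator(prompt, max_new_tokens=30, num_return_sequences=1)
print(outputs[0]["generated_text"])
```

By default the pipeline echoes the prompt at the start of `generated_text`; pass `return_full_text=False` to receive only the continuation.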
# Reference

This model was originally derived from "distilGPT2", developed by Hugging Face (https://huggingface.co/distilgpt2).