---
language: en
tags:
- summarization
- bart
- nlp
pipeline_tag: summarization
model_name: Automated-Text-Summarizer
---

# Automated Text Summarizer

This model is a fine-tuned version of **facebook/bart-large-cnn**, designed to generate abstractive summaries of long-form English text.
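
A minimal usage sketch with the Transformers `pipeline` API. The card does not state the fine-tuned model's Hub path, so the base checkpoint `facebook/bart-large-cnn` is used as a stand-in, and the word-level truncation helper is an illustrative assumption (word count only approximates token count):

```python
def truncate_for_bart(text: str, max_words: int = 800) -> str:
    """Rough word-level truncation so the input fits BART's ~1024-token
    context window. Word count is only a proxy for token count; the
    tokenizer's own truncation remains the authoritative limit."""
    return " ".join(text.split()[:max_words])


def summarize(text: str, model_id: str = "facebook/bart-large-cnn") -> str:
    """Summarize `text` with a BART summarization pipeline.

    `model_id` defaults to the base checkpoint; swap in the fine-tuned
    repo id once it is published on the Hub.
    """
    from transformers import pipeline  # deferred import: heavy dependency

    summarizer = pipeline("summarization", model=model_id)
    result = summarizer(
        truncate_for_bart(text),
        max_length=130,   # example cap on generated summary length
        min_length=30,    # example floor; neither value is a tuned setting
        do_sample=False,  # deterministic (beam/greedy) decoding
    )
    return result[0]["summary_text"]
```

Note that the first call to `summarize` downloads the checkpoint; the generation parameters above are example values, not settings taken from this model's training.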

## Model Description

- **Developed by:** Aditya Prasad Sahu
- **Model type:** Transformer-based encoder-decoder (BART)
- **Language(s):** English
- **Task:** Text summarization

## Project Context

This project was developed as part of my focus on **Natural Language Processing (NLP)** and **Deep Learning**. My experience includes a 20-day internship covering CNNs, RNNs, and Transformer models (BERT/GPT), and this summarizer is a practical application of those concepts.

## Intended Use

This model is intended for:

- Summarizing news articles.
- Condensing research papers or long reports.
- Integrating into personal portfolio projects as a microservice.
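
The microservice use case can be sketched with Python's standard-library `http.server`. Everything here is a hypothetical skeleton, not part of this model: the `summarize` stub returns the first sentence so the service runs without downloading the checkpoint, and a real deployment would replace it with the BART pipeline call (and likely use a proper web framework):

```python
import json
from http.server import BaseHTTPRequestHandler, HTTPServer


def summarize(text: str) -> str:
    """Hypothetical placeholder for the BART pipeline call: returns the
    first sentence so the service skeleton runs model-free."""
    return text.split(". ")[0].strip().rstrip(".")


class SummaryHandler(BaseHTTPRequestHandler):
    """Accepts POST {"text": "..."} and responds {"summary": "..."}."""

    def do_POST(self):
        length = int(self.headers.get("Content-Length", 0))
        payload = json.loads(self.rfile.read(length) or b"{}")
        body = json.dumps({"summary": summarize(payload.get("text", ""))}).encode()
        self.send_response(200)
        self.send_header("Content-Type", "application/json")
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)

    def log_message(self, fmt, *args):
        pass  # silence per-request stderr logging


def serve(port: int = 8000) -> None:
    """Block forever, serving summaries on the given port."""
    HTTPServer(("127.0.0.1", port), SummaryHandler).serve_forever()
```

A client would then POST JSON to the service, e.g. `curl -X POST localhost:8000 -d '{"text": "..."}'`.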

## Technical Details

- **Framework:** PyTorch & Transformers
- **Precision:** float32
- **Base model:** facebook/bart-large-cnn