
Automated Text Summarizer

This model is a fine-tuned version of BART-large-cnn, specifically designed to generate high-quality, abstractive summaries of long-form text.

Model Description

  • Developed by: Aditya Prasad Sahu
  • Model type: Transformer-based Encoder-Decoder (BART)
  • Language(s): English
  • Task: Text Summarization

Project Context

This project was developed as part of my focus on Natural Language Processing (NLP) and Deep Learning. My experience includes a 20-day internship covering CNNs, RNNs, and Transformer models (BERT/GPT), and this summarizer is a practical application of those concepts.

Intended Use

This model is intended for:

  • Summarizing news articles.
  • Condensing research papers or long reports.
  • Integrating into personal portfolio projects as a microservice.
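As a sketch of how the model might be wrapped for such a microservice, the snippet below uses the standard `transformers` summarization pipeline. The repository id `VoltIC/Automated-Text-Summarizer` is taken from this page; the generation parameters (`max_length`, `min_length`) are illustrative defaults, not values prescribed by the author, and running it requires accepting the repository's access conditions.

```python
# Hypothetical usage sketch — assumes the `transformers` library is installed
# and that access to the gated repository has been granted on Hugging Face.
def summarize(text: str, max_length: int = 130, min_length: int = 30) -> str:
    """Return an abstractive summary of `text` using this model."""
    from transformers import pipeline  # imported lazily; downloads the model on first call

    summarizer = pipeline(
        "summarization",
        model="VoltIC/Automated-Text-Summarizer",
    )
    result = summarizer(
        text,
        max_length=max_length,
        min_length=min_length,
        do_sample=False,  # deterministic beam-search output
    )
    return result[0]["summary_text"]
```

Wrapping the pipeline in a single function like this keeps the model-loading details behind one call, which makes it straightforward to expose as an HTTP endpoint in a portfolio microservice.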

Performance

  • Framework: PyTorch & Transformers
  • Precision: Float32 (F32)
  • Model size: 0.4B parameters
  • Base Model: facebook/bart-large-cnn