
slidehelper-outlineGen

This model is a fine-tuned version of TinyLlama/TinyLlama-1.1B-Chat-v1.0 for generating presentation outlines.

Model Description

Given a topic, the model produces a structured presentation outline, following the instruction-style prompt format shown in the usage example below.

Usage

```python
from transformers import AutoTokenizer, AutoModelForCausalLM
import torch

# Load the model and tokenizer
tokenizer = AutoTokenizer.from_pretrained("Alestin/slidehelper-outlineGen")
model = AutoModelForCausalLM.from_pretrained("Alestin/slidehelper-outlineGen")
model.eval()

def generate_outline(topic, max_new_tokens=200):
    # Instruction-style prompt format used during fine-tuning
    prompt = f"### Instruction:\nGenerate an outline for a presentation on: {topic}\n\n### Response:"

    inputs = tokenizer(prompt, return_tensors="pt")

    with torch.no_grad():
        outputs = model.generate(
            **inputs,
            max_new_tokens=max_new_tokens,
            temperature=0.7,
            top_p=0.9,
            repetition_penalty=1.2,
            do_sample=True,
            pad_token_id=tokenizer.eos_token_id,
        )

    generated_text = tokenizer.decode(outputs[0], skip_special_tokens=True)
    # Keep everything after the "### Response:" marker; using [-1] avoids an
    # IndexError if the marker is not present in the decoded text.
    response = generated_text.split("### Response:")[-1].strip()
    return response

# Example usage
outline = generate_outline("artificial intelligence")
print(outline)
```
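If you need to generate outlines for several topics, the prompt template can be factored into a small helper. This is a minimal sketch: `build_prompt` is a hypothetical helper name, and the template simply mirrors the one used in `generate_outline` above.

```python
def build_prompt(topic: str) -> str:
    # Hypothetical helper; reproduces the instruction template
    # used by generate_outline above.
    return (
        "### Instruction:\n"
        f"Generate an outline for a presentation on: {topic}\n\n"
        "### Response:"
    )

# Build prompts for a batch of topics, e.g. to pass to
# tokenizer(prompts, return_tensors="pt", padding=True)
topics = ["artificial intelligence", "renewable energy"]
prompts = [build_prompt(t) for t in topics]
```

Keeping the template in one place ensures inference prompts stay byte-identical to the fine-tuning format, which these instruction-tuned models are sensitive to.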