This is an experimental pretrained model.

- Training progress: 0.28 / 1.5 epochs (~2.5B / 13.5B tokens)
- Languages: English and Turkish
```python
import torch
from transformers import pipeline, set_seed

set_seed(1881)

# Load the text-generation pipeline
generator = pipeline(
    "text-generation",
    model="Ba2han/checkpoint-muon-12665",
    device="cuda" if torch.cuda.is_available() else "cpu",
)

# Prompts to try
prompts = [
    "Osmanlı Devleti'nin kurucusu",
    "Türkiye Cumhuriyeti'nin kurucusu",
    "Leyla ile",
    "Türkiye'nin başkenti",
    "Fransa'nın başkenti",
    "İran'ın başkenti",
    "Twitter is a",
]

print("Generated outputs for all prompts:\n")

# Generate text for each prompt individually
for i, prompt in enumerate(prompts, 1):
    try:
        output = generator(
            prompt,
            temperature=0.7,
            repetition_penalty=1.0,
            top_p=0.95,
            max_new_tokens=5,
            return_full_text=False,
        )
        # Extract the generated continuation
        generated_text = output[0]["generated_text"]
        print(f"{i}. Prompt: '{prompt}'")
        print(f"   Generated: {generated_text}")
        print()
    except Exception as e:
        print(f"{i}. Prompt: '{prompt}'")
        print(f"   Error: {e}")
        print()
```
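The `temperature` and `top_p` parameters passed to the pipeline above can be illustrated with a small self-contained sketch (not part of the model's code, just a toy example): logits are divided by the temperature, then nucleus (top-p) sampling keeps the smallest set of tokens whose cumulative probability reaches `top_p` and renormalizes over them.

```python
import math

def softmax(logits):
    # Numerically stable softmax over a list of logits.
    m = max(logits)
    exps = [math.exp(x - m) for x in logits]
    total = sum(exps)
    return [e / total for e in exps]

def top_p_filter(logits, temperature=0.7, top_p=0.95):
    """Toy temperature + nucleus (top-p) filtering.

    Scales logits by 1/temperature, keeps the highest-probability
    tokens until their cumulative mass reaches top_p, and returns
    a renormalized distribution over the kept token indices.
    """
    probs = softmax([x / temperature for x in logits])
    # Sort token indices by probability, highest first.
    order = sorted(range(len(probs)), key=lambda i: probs[i], reverse=True)
    kept, cum = [], 0.0
    for i in order:
        kept.append(i)
        cum += probs[i]
        if cum >= top_p:
            break
    # Renormalize over the surviving tokens.
    mass = sum(probs[i] for i in kept)
    return {i: probs[i] / mass for i in kept}

# Example: a four-token vocabulary with one dominant logit.
# With temperature=0.7 the distribution sharpens, and top_p=0.95
# discards the two least likely tokens.
dist = top_p_filter([4.0, 2.0, 1.0, 0.5], temperature=0.7, top_p=0.95)
print(dist)
```

Lower temperatures sharpen the distribution (so fewer tokens survive the top-p cutoff); higher temperatures flatten it, letting more tokens into the nucleus.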
Generated outputs for all prompts (English glosses in parentheses):

- Prompt: 'Osmanlı Devleti'nin kurucusu' ("the founder of the Ottoman Empire")
  Generated: Osman Bey'in oğ ("Osman Bey's so[n]", truncated)
- Prompt: 'Türkiye Cumhuriyeti'nin kurucusu' ("the founder of the Republic of Turkey")
  Generated: Mustafa Kemal Atatürk
- Prompt: 'Leyla ile' ("Leyla and")
  Generated: Mecnun izle ("Mecnun, watch")
- Prompt: 'Türkiye'nin başkenti' ("the capital of Turkey")
  Generated: Ankara'da gece ("night in Ankara")
- Prompt: 'Fransa'nın başkenti' ("the capital of France")
  Generated: Paris'te bir terör ("a terror ... in Paris")
- Prompt: 'İran'ın başkenti' ("the capital of Iran")
  Generated: Tahran'da bugün ("today in Tehran")
- Prompt: 'Twitter is a'
  Generated: site and a service.