---
license: apache-2.0
datasets:
- StudyPal/education
language:
- hr
- en
base_model:
- Qwen/Qwen2.5-32B
library_name: transformers
tags:
- education
- croatian
- qwen2
- fine-tuned
- study-assistant
---

  # StudyPal-LLM-1.0

  A fine-tuned Croatian educational assistant based on Qwen2.5-32B, designed to help students with learning and study materials.

  ## Model Details

  ### Model Description

  StudyPal-LLM-1.0 is a large language model fine-tuned specifically for educational purposes in Croatian. The model excels at generating educational content, answering study questions, creating flashcards, and providing learning assistance.

  - **Developed by:** aerodynamics21
  - **Model type:** Causal Language Model
  - **Language(s):** Croatian (primary), English (secondary)
  - **License:** Apache 2.0
  - **Finetuned from model:** Qwen/Qwen2.5-32B
  - **Parameters:** 32.8B

  ### Model Sources

  - **Repository:** https://huggingface.co/aerodynamics21/StudyPal-LLM-1.0
  - **Base Model:** https://huggingface.co/Qwen/Qwen2.5-32B
  - **Adapter:** https://huggingface.co/aerodynamics21/StudyPal-LLM-1

  ## Uses

  ### Direct Use

  This model is designed for educational applications:

  - Generating study materials in Croatian
  - Creating flashcards and quiz questions
  - Providing explanations of complex topics
  - Assisting with homework and learning

  ### Usage Examples

  ```python
  from transformers import AutoModelForCausalLM, AutoTokenizer

  model = AutoModelForCausalLM.from_pretrained("aerodynamics21/StudyPal-LLM-1.0")
  tokenizer = AutoTokenizer.from_pretrained("aerodynamics21/StudyPal-LLM-1.0")

  # Generate educational content
  prompt = "Objasni koncept fotosinteze:"  # "Explain the concept of photosynthesis:"
  inputs = tokenizer(prompt, return_tensors="pt")
  outputs = model.generate(**inputs, max_new_tokens=200, do_sample=True, temperature=0.7)
  response = tokenizer.decode(outputs[0], skip_special_tokens=True)
  print(response)
  ```

  ### API Usage

  ```python
  import requests

  API_URL = "https://api-inference.huggingface.co/models/aerodynamics21/StudyPal-LLM-1.0"
  headers = {"Authorization": f"Bearer {your_token}"}  # your_token: a Hugging Face access token

  def query(payload):
      response = requests.post(API_URL, headers=headers, json=payload)
      return response.json()

  output = query({"inputs": "Stvori kviz o hrvatskoj povijesti:"})  # "Create a quiz about Croatian history:"
  ```
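Beyond the bare `inputs` string, the hosted text-generation endpoint also accepts a `parameters` object for controlling generation. A minimal sketch of a fuller payload (the values are illustrative, not settings used for this model):

```python
# Request payload for the hosted Inference API; the "parameters" keys
# (max_new_tokens, temperature, return_full_text) follow the
# text-generation task schema, and the values here are illustrative.
payload = {
    "inputs": "Stvori kviz o hrvatskoj povijesti:",  # "Create a quiz about Croatian history:"
    "parameters": {
        "max_new_tokens": 200,      # cap on generated tokens
        "temperature": 0.7,         # sampling temperature
        "return_full_text": False,  # return only the completion, not the prompt
    },
}
```

The dictionary can be passed directly to the `query` helper above.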

  ## Training Details

  ### Training Data

  The model was fine-tuned on a Croatian educational dataset containing:

  - Educational conversations and Q&A pairs
  - Flashcard datasets
  - Quiz and summary materials
  - Croatian academic content

  ### Training Procedure

  - **Base Model:** Qwen2.5-32B
  - **Training Method:** LoRA (Low-Rank Adaptation)
  - **Training Framework:** Transformers + PEFT
  - **Hardware:** RunPod GPU instance
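LoRA keeps the base weights `W` frozen and learns only a low-rank update `ΔW = (α/r)·B·A`, which is why the adapter can be shipped separately from the 32B base model. A minimal numeric illustration of the mechanics (shapes and values are illustrative and unrelated to the actual training run):

```python
import numpy as np

rng = np.random.default_rng(0)
d, r, alpha = 8, 2, 4                # hidden size, LoRA rank, scaling factor

W = rng.standard_normal((d, d))      # frozen base weight
A = rng.standard_normal((r, d))      # trainable down-projection
B = np.zeros((d, r))                 # trainable up-projection, zero-initialized

delta = (alpha / r) * (B @ A)        # low-rank update, rank <= r
W_adapted = W + delta

# Zero-initialized B makes the adapter an exact no-op at the start,
# so fine-tuning begins from the base model's behavior.
assert np.allclose(W_adapted, W)
```

During training only `A` and `B` (2·d·r parameters here) receive gradients, versus d² for full fine-tuning of the same layer.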

  ## Evaluation

  The model demonstrates strong performance in:

  - Croatian language comprehension and generation
  - Educational content creation
  - Study material generation
  - Academic question answering

  ## Bias, Risks, and Limitations

  - Primary focus on Croatian educational content
  - May reflect biases present in training data
  - Best suited for educational contexts
  - Performance may vary on non-educational tasks

  ## Citation

  ```bibtex
  @misc{studypal-llm-1.0,
    title={StudyPal-LLM-1.0: A Croatian Educational Assistant},
    author={aerodynamics21},
    year={2025},
    url={https://huggingface.co/aerodynamics21/StudyPal-LLM-1.0}
  }
  ```

  ## Model Card Authors

  aerodynamics21

  ## Model Card Contact

  For questions about this model, please visit the repository or create an issue.