Question Answering
Transformers
Safetensors
German
mistral
text-generation
Connect-Transport
Logics Software
German support chatbot
German AI chatbot
Customer service chatbot
German chatbot
AI chatbots for companies
Chatbot for SMEs
Question-answering
QLoRA fine-tuning
LLM training
text-generation-inference
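The tags above mention QLoRA fine-tuning, but the card does not document the actual training setup. As an illustration only, a minimal QLoRA configuration for this model could look like the sketch below, assuming the peft and bitsandbytes packages are installed and a GPU is available; all hyperparameters are placeholders, not the values used for this fine-tune.

```python
# Sketch of a QLoRA setup — an illustration only, assuming peft and
# bitsandbytes are installed and a GPU is available; hyperparameters
# are placeholders, not the values used for this fine-tune.
import torch
from transformers import AutoModelForCausalLM, BitsAndBytesConfig
from peft import LoraConfig, get_peft_model

bnb_config = BitsAndBytesConfig(
    load_in_4bit=True,                      # quantize base weights to 4-bit
    bnb_4bit_quant_type="nf4",              # NormalFloat4 quantization
    bnb_4bit_compute_dtype=torch.bfloat16,  # do matmuls in bfloat16
    bnb_4bit_use_double_quant=True,         # quantize the quantization constants too
)

model = AutoModelForCausalLM.from_pretrained(
    "logicssoftwaregmbh/logicsct-mistral-nemo-instruct",
    quantization_config=bnb_config,
    device_map="auto",
)

lora_config = LoraConfig(
    r=16,                     # adapter rank (placeholder)
    lora_alpha=32,            # scaling factor (placeholder)
    lora_dropout=0.05,
    target_modules=["q_proj", "k_proj", "v_proj", "o_proj"],
    task_type="CAUSAL_LM",
)
model = get_peft_model(model, lora_config)  # only the LoRA adapters are trained
```

With this setup, the frozen base model stays in 4-bit while gradients flow only through the small LoRA adapter matrices, which is what makes QLoRA feasible on a single GPU.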
Instructions to use logicssoftwaregmbh/logicsct-mistral-nemo-instruct with libraries, inference providers, notebooks, and local apps. Follow these links to get started.
- Libraries
- Transformers
How to use logicssoftwaregmbh/logicsct-mistral-nemo-instruct with Transformers:

```python
# Use a pipeline as a high-level helper
from transformers import pipeline

pipe = pipeline("text-generation", model="logicssoftwaregmbh/logicsct-mistral-nemo-instruct")

# Load model directly
from transformers import AutoTokenizer, AutoModelForCausalLM

tokenizer = AutoTokenizer.from_pretrained("logicssoftwaregmbh/logicsct-mistral-nemo-instruct")
model = AutoModelForCausalLM.from_pretrained("logicssoftwaregmbh/logicsct-mistral-nemo-instruct")
```

- Notebooks
- Google Colab
- Kaggle
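Prompts sent to the pipeline generally need to follow the model's chat format. The sketch below builds such a prompt in plain Python; the `[INST] ... [/INST]` wrapping is an assumption based on the Mistral model family, and the German system text is a hypothetical example. In practice, prefer `tokenizer.apply_chat_template`, which uses the format stored with the model.

```python
# Build a prompt in Mistral-style instruction format.
# ASSUMPTION: the [INST] ... [/INST] wrapping follows the Mistral family
# convention; prefer tokenizer.apply_chat_template for the real format.

def build_prompt(user_question: str, system: str = "") -> str:
    """Wrap a user question (optionally with a system preamble) in instruction tags."""
    body = f"{system}\n\n{user_question}" if system else user_question
    return f"<s>[INST] {body} [/INST]"

# Hypothetical customer-service example in German:
prompt = build_prompt(
    "Wie kann ich den Status meiner Lieferung abfragen?",
    system="Du bist ein Kundenservice-Chatbot.",
)
print(prompt)
```

The returned string can then be passed to `pipe(prompt)` for generation.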
Pull request #1: training based on logicsct_train_prompts_20250208_113921.json
- opened by loghugging25 (no description provided)
- loghugging25 changed pull request status to merged