Question Answering
Transformers
Safetensors
German
phi3
text-generation
Connect-Transport
Logics Software
German support chatbot
Deutscher KI Chatbot
Kundenservice Chatbot
Deutscher Chatbot
KI-Chatbots für Unternehmen
Chatbot for SMEs
Question-answering
QLoRA fine-tuning
LLM training
custom_code
text-generation-inference
Instructions to use logicssoftwaregmbh/logicsct-phi4 with libraries, inference providers, notebooks, and local apps. Follow these links to get started.
- Libraries
- Transformers
How to use logicssoftwaregmbh/logicsct-phi4 with Transformers:
```python
# Use a pipeline as a high-level helper
from transformers import pipeline

pipe = pipeline("question-answering", model="logicssoftwaregmbh/logicsct-phi4", trust_remote_code=True)

# Load model directly
from transformers import AutoTokenizer, AutoModelForCausalLM

tokenizer = AutoTokenizer.from_pretrained("logicssoftwaregmbh/logicsct-phi4", trust_remote_code=True)
model = AutoModelForCausalLM.from_pretrained("logicssoftwaregmbh/logicsct-phi4", trust_remote_code=True)
```

- Notebooks
- Google Colab
- Kaggle
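Since the model is tagged as a phi3-based German support chatbot, prompts presumably need to follow the Phi-3-style chat format (`<|system|>`, `<|user|>`, `<|end|>`, `<|assistant|>` markers). This is an assumption about the model's template, not confirmed by the card; the helper name `build_phi_chat_prompt` is hypothetical. In practice, `tokenizer.apply_chat_template` is the safer route once the tokenizer is loaded. A minimal sketch of the prompt layout:

```python
# Sketch of a Phi-3-style chat prompt builder (format assumed, not
# confirmed by this model card; prefer tokenizer.apply_chat_template).
def build_phi_chat_prompt(question, system=None):
    parts = []
    if system:
        # Optional system message steering the assistant's behavior
        parts.append(f"<|system|>\n{system}<|end|>")
    # User turn, followed by the assistant marker the model completes after
    parts.append(f"<|user|>\n{question}<|end|>")
    parts.append("<|assistant|>")
    return "\n".join(parts)

prompt = build_phi_chat_prompt(
    "Wie kann ich meine Sendung verfolgen?",
    system="Du bist ein Kundenservice-Chatbot.",
)
```

The resulting string would then be tokenized and passed to `model.generate`, or the same structure passed as a message list to the `pipeline` helper above.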
Discussion #2: "Q4 and Q8 based on logicsct_train_prompts_20250208_113921.json" — opened over 1 year ago by loghugging25