Tags:

- Question Answering
- Transformers
- Safetensors
- German
- phi3
- text-generation
- Connect-Transport
- Logics Software
- German support chatbot
- Deutscher KI Chatbot
- Kundenservice Chatbot
- Deutscher Chatbot
- KI-Chatbots für Unternehmen
- Chatbot for SMEs
- Question-answering
- QLoRA fine-tuning
- LLM training
- custom_code
- text-generation-inference
Instructions to use logicssoftwaregmbh/logicsct-phi4 with libraries, inference providers, notebooks, and local apps. Follow these links to get started.
- Libraries
- Transformers
How to use logicssoftwaregmbh/logicsct-phi4 with Transformers:
```python
# Use a pipeline as a high-level helper
from transformers import pipeline

pipe = pipeline(
    "question-answering",
    model="logicssoftwaregmbh/logicsct-phi4",
    trust_remote_code=True,
)
```

```python
# Load model directly
from transformers import AutoTokenizer, AutoModelForCausalLM

tokenizer = AutoTokenizer.from_pretrained("logicssoftwaregmbh/logicsct-phi4", trust_remote_code=True)
model = AutoModelForCausalLM.from_pretrained("logicssoftwaregmbh/logicsct-phi4", trust_remote_code=True)
```

- Notebooks
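The snippets above only load the model. A minimal sketch of asking it a support question is shown below; the model id comes from this card, but the helper names (`build_messages`, `answer`), the sample prompt, and the assumption that the tokenizer ships a chat template are illustrative, not confirmed by the repository.

```python
MODEL_ID = "logicssoftwaregmbh/logicsct-phi4"


def build_messages(question: str) -> list[dict]:
    # Chat-style input: a single user turn carrying the support question.
    return [{"role": "user", "content": question}]


def answer(question: str, max_new_tokens: int = 128) -> str:
    # Imported lazily so the helper above stays usable without transformers
    # installed; loading downloads the weights on first use and requires
    # trust_remote_code because the repo ships custom model code.
    from transformers import AutoTokenizer, AutoModelForCausalLM

    tokenizer = AutoTokenizer.from_pretrained(MODEL_ID, trust_remote_code=True)
    model = AutoModelForCausalLM.from_pretrained(MODEL_ID, trust_remote_code=True)

    # Assumes the tokenizer provides a chat template (common for phi-style
    # chat models, but not stated on this card).
    inputs = tokenizer.apply_chat_template(
        build_messages(question), add_generation_prompt=True, return_tensors="pt"
    )
    outputs = model.generate(inputs, max_new_tokens=max_new_tokens)
    # Decode only the newly generated tokens, skipping the prompt.
    return tokenizer.decode(outputs[0][inputs.shape[-1]:], skip_special_tokens=True)


if __name__ == "__main__":
    print(answer("Wie kann ich mein Passwort zurücksetzen?"))
```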
- Google Colab
- Kaggle