---
license: mit
language:
- en
base_model:
- facebook/mbart-large-50-many-to-many-mmt
tags:
- audi
- automotive
- chronic
- problems
- dsg
- engine
- models
---
# Audi Insight AI 🚗🔧
## Model Description

This model is a fine-tuned mBART-50 sequence-to-sequence model for diagnosing chronic issues in Audi vehicles.
It maps user input describing symptoms (engine, transmission, electrical, etc.) to technical explanations.
The model covers Audi models, engine types, and powertrains, helping identify issues such as timing chain problems, turbocharger failures, injector issues, DPF clogging, and more.
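The model expects a free-text English sentence that combines the car model, engine, and symptom. A minimal sketch of composing such an input (the variable names and schema are illustrative, not a requirement of the model):

```python
# Compose a free-text complaint from its parts.
# The model reads a plain English sentence; this split into
# fields is purely illustrative.
car_model = "Audi A5"
engine = "2.0 TFSI"
symptom = "rattling noise at cold start"

complaint = f"{car_model} {engine}, {symptom}"
print(complaint)  # → Audi A5 2.0 TFSI, rattling noise at cold start
```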
## Dataset

- Collected from Audi models and their known chronic problems
- Training pairs: user complaint (input) → technical explanation (target)
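The card does not publish the dataset's on-disk format. As an illustration only, a single complaint → explanation pair could be stored as a JSONL record like this (the field names and example text are assumptions):

```python
import json

# Hypothetical training pair; keys and wording are illustrative only,
# not the actual dataset schema.
pair = {
    "input": "Audi A4 2.0 TDI, loss of power and DPF warning light on",
    "target": (
        "Symptoms consistent with DPF clogging from frequent short trips; "
        "a forced regeneration or filter cleaning may be required."
    ),
}

line = json.dumps(pair)  # one JSONL record
print(line)
```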
## Usage Example

Provide the car model, engine type, power (if relevant), and the observed problem.
The AI Agent will generate a possible technical explanation.
```python
from transformers import MBartForConditionalGeneration, MBart50TokenizerFast

repo_id = "MahmutCanBoran/audi-insight-ai"
tokenizer = MBart50TokenizerFast.from_pretrained(repo_id)
tokenizer.src_lang = "en_XX"
tokenizer.tgt_lang = "en_XX"

model = MBartForConditionalGeneration.from_pretrained(repo_id)

# Example input
inp = "Audi A5 2.0 TFSI, I hear a rattling noise at startup."
enc = tokenizer(inp, return_tensors="pt").to(model.device)
gen = model.generate(
    **enc,
    max_new_tokens=64,
    num_beams=4,
    forced_bos_token_id=tokenizer.lang_code_to_id[tokenizer.tgt_lang],
)
print(tokenizer.decode(gen[0], skip_special_tokens=True))
```
## 💻 Example app.py
```python
import gradio as gr
from transformers import MBartForConditionalGeneration, MBart50TokenizerFast
import torch

# --- Load model & tokenizer ---
REPO_ID = "MahmutCanBoran/audi-insight-ai"
tokenizer = MBart50TokenizerFast.from_pretrained(REPO_ID)
tokenizer.src_lang = "en_XX"
model = MBartForConditionalGeneration.from_pretrained(REPO_ID)

device = torch.device("cuda" if torch.cuda.is_available() else "cpu")
model.to(device)

# --- Diagnosis function ---
def diagnose(user_text, num_beams, max_new_tokens):
    txt = (user_text or "").strip()
    if not txt:
        return "Please type a symptom description like: `in my audi a5 2.0 tfsi i hear rattling noise at cold start`"
    # Always output English
    tokenizer.tgt_lang = "en_XX"
    enc = tokenizer(txt, return_tensors="pt").to(device)
    gen = model.generate(
        **enc,
        max_new_tokens=int(max_new_tokens),
        num_beams=int(num_beams),
        forced_bos_token_id=tokenizer.lang_code_to_id[tokenizer.tgt_lang],
    )
    out = tokenizer.decode(gen[0], skip_special_tokens=True)
    return f"""### Result

**Input**: `{txt}`

**AI Agent Explanation:**

{out}
"""

# --- Example text ---
EXAMPLE_TEXT = "in my audi a5 2.0 tfsi i hear rattling noise at cold start"

# --- Gradio UI ---
with gr.Blocks(theme=gr.themes.Soft()) as demo:
    gr.HTML(
        """
        <div style="text-align:center">
          <h1 style="margin:10px 0 0 0">Audi Insight AI</h1>
          <p>Type your full sentence (model + engine + symptom). Example:
          <code>in my audi a5 2.0 tfsi i hear rattling noise</code></p>
        </div>
        """
    )
    with gr.Row():
        with gr.Column(scale=2):
            user_tb = gr.Textbox(label="Describe the issue", placeholder=EXAMPLE_TEXT, lines=3)
            with gr.Accordion("Settings", open=False):
                beams = gr.Slider(1, 8, value=4, step=1, label="Beam size")
                max_tok = gr.Slider(16, 256, value=64, step=8, label="Max new tokens")
            with gr.Row():
                submit = gr.Button("Diagnose", variant="primary")
                fill_ex = gr.Button("Use Example")
                clear = gr.Button("Clear")
        with gr.Column(scale=3):
            output_md = gr.Markdown(value="### Result\n\n*(Awaiting input)*")

    def fill_example():
        return EXAMPLE_TEXT

    def clear_all():
        return "", "### Result\n\n*(Awaiting input)*"

    submit.click(diagnose, inputs=[user_tb, beams, max_tok], outputs=[output_md])
    fill_ex.click(fill_example, outputs=[user_tb])
    clear.click(clear_all, outputs=[user_tb, output_md])

# --- Launch app ---
if __name__ == "__main__":
    demo.launch()
```
## 🔧 Local Setup & Run

1. Clone the repo:

```bash
git clone https://huggingface.co/spaces/MahmutCanBoran/audi-insight-ai-space
cd audi-insight-ai-space
```

2. Install dependencies:

```bash
pip install -r requirements.txt
```

3. Run the app:

```bash
python app.py
```
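The repository's `requirements.txt` is not reproduced in this card. A minimal set of dependencies for `app.py` would look like the sketch below (exact package list and versions are assumptions; check the Space's own file):

```text
transformers
torch
gradio
sentencepiece
```

`sentencepiece` is included because the mBART-50 tokenizer depends on it; pin versions as needed for your environment.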