---
language:
  - en
license: mit
library_name: transformers
tags:
  - robotics
  - isaac-sim
  - code-generation
  - simulation
  - qwen2
  - causal-lm
  - text-generation
  - text2text-generation
  - omni
  - nvidia
  - robotics-simulation
pipeline_tag: text-generation
base_model: Qwen/Qwen2.5-Coder-7B-Instruct
model-index:
  - name: Qwen2.5-Coder-7B-Instruct-Omni1.1
    results:
      - task:
          type: text-generation
          name: Isaac Sim Robotics Code Generation
        dataset:
          type: custom
          name: Isaac Sim 5.0 Synthetic Dataset
        metrics:
          - type: accuracy
            value: 0.95
            name: Domain Accuracy
          - type: code_quality
            value: 0.9
            name: Python Code Quality
      - task:
          type: text-generation
          name: Robotics Simulation Setup
        dataset:
          type: custom
          name: Isaac Sim 5.0 Synthetic Dataset
        metrics:
          - type: accuracy
            value: 0.94
            name: Simulation Setup Accuracy
---

# Isaac Sim Robotics Qwen2.5-Coder-7B-Instruct-Omni1.1
A fine-tuned Qwen2.5-Coder-7B-Instruct model specialized for Isaac Sim 5.0 robotics development, computer vision, and simulation tasks.
## Quick Start

### Option 1: HuggingFace Transformers (Recommended)
```python
from transformers import AutoModelForCausalLM, AutoTokenizer

model_name = "TomBombadyl/Qwen2.5-Coder-7B-Instruct-Omni1.1"
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForCausalLM.from_pretrained(model_name)

# Isaac Sim robotics query in ChatML format
query = """<|im_start|>user
How do I create a robot with differential drive in Isaac Sim 5.0?
<|im_end|>
<|im_start|>assistant"""

inputs = tokenizer(query, return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=512)  # cap new tokens, not total length
response = tokenizer.decode(outputs[0], skip_special_tokens=True)
print(response)
```
### Option 2: CTransformers (Lightweight)
```python
from ctransformers import AutoModelForCausalLM

model = AutoModelForCausalLM.from_pretrained(
    "TomBombadyl/Qwen2.5-Coder-7B-Instruct-Omni1.1",
    model_type="qwen2",
    gpu_layers=0  # CPU inference; raise to offload layers to the GPU
)

# ctransformers models are called directly with the prompt string
response = model(query, max_new_tokens=512)
```
### Option 3: GGUF Conversion (Advanced)
```bash
# Convert to GGUF format for llama.cpp
python scripts/convert_to_gguf.py

# Use with llama.cpp
./llama-server --model models/gguf/isaac_sim_qwen2.5_coder_q4_0.gguf --port 8080
```
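Once the server is running, it can be queried over llama.cpp's OpenAI-compatible HTTP API; a minimal sketch, assuming the server above is listening on port 8080 (and keeping in mind the conversion caveats under Known Limitations below):

```python
import requests

# llama-server exposes an OpenAI-compatible chat completions endpoint
resp = requests.post(
    "http://localhost:8080/v1/chat/completions",
    json={
        "messages": [
            {"role": "user", "content": "How do I create a robot with differential drive in Isaac Sim 5.0?"}
        ],
        "max_tokens": 512,
    },
    timeout=300,
)
print(resp.json()["choices"][0]["message"]["content"])
```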
## Model Capabilities
- Isaac Sim 5.0 Expertise: Deep knowledge of robotics simulation APIs
- Computer Vision: Understanding of sensor integration and perception
- Robot Control: Programming differential drive, manipulators, and sensors
- Simulation Setup: Environment configuration and physics parameters
- Code Generation: Python scripts for Isaac Sim workflows
- Troubleshooting: Common issues and solutions
## Performance
- Base Model: Qwen2.5-Coder-7B-Instruct
- Training Data: 2,000 Isaac Sim-specific examples
- Training Method: LoRA fine-tuning (rank 64, alpha 128; see the sketch below)
- Hardware: NVIDIA RTX 4070 Laptop GPU (8.5GB VRAM)
- Training Steps: 300 with curriculum learning
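For reference, the reported hyperparameters map onto a PEFT configuration roughly like the one below; this is a sketch, not the actual training script, and the dropout and target modules are assumptions typical for Qwen2-style models:

```python
from peft import LoraConfig, get_peft_model
from transformers import AutoModelForCausalLM

base = AutoModelForCausalLM.from_pretrained("Qwen/Qwen2.5-Coder-7B-Instruct")

lora_config = LoraConfig(
    r=64,                    # rank reported above
    lora_alpha=128,          # alpha reported above
    lora_dropout=0.05,       # assumption: not stated in this card
    target_modules=["q_proj", "k_proj", "v_proj", "o_proj"],  # assumption
    task_type="CAUSAL_LM",
)
model = get_peft_model(base, lora_config)
model.print_trainable_parameters()  # shows the small trainable fraction LoRA adds
```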
## Installation
```bash
# Clone repository
git clone https://github.com/TomBombadyl/Qwen2.5-Coder-7B-Instruct-Omni1.1.git
cd Qwen2.5-Coder-7B-Instruct-Omni1.1

# Install dependencies
pip install -r requirements.txt

# Download models (choose one)
# Option 1: HuggingFace (5.3GB)
huggingface-cli download TomBombadyl/Qwen2.5-Coder-7B-Instruct-Omni1.1 --local-dir models/huggingface

# Option 2: CTransformers (5.2GB)
huggingface-cli download TomBombadyl/Qwen2.5-Coder-7B-Instruct-Omni1.1 --local-dir models/ctransformers

# Option 3: GGUF (616MB + conversion)
huggingface-cli download TomBombadyl/Qwen2.5-Coder-7B-Instruct-Omni1.1 --local-dir models/gguf
```
## Examples

### Isaac Sim Robot Creation
```python
from transformers import AutoModelForCausalLM, AutoTokenizer

model = AutoModelForCausalLM.from_pretrained("TomBombadyl/Qwen2.5-Coder-7B-Instruct-Omni1.1")
tokenizer = AutoTokenizer.from_pretrained("TomBombadyl/Qwen2.5-Coder-7B-Instruct-Omni1.1")

query = """<|im_start|>user
Create a Python script to spawn a UR5 robot in Isaac Sim 5.0 with proper physics properties.
<|im_end|>
<|im_start|>assistant"""

# Generate response (do_sample=True so the temperature setting takes effect)
inputs = tokenizer(query, return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=1024, temperature=0.7, do_sample=True)
response = tokenizer.decode(outputs[0], skip_special_tokens=True)
print(response)
```
### Sensor Integration
```python
query = """<|im_start|>user
How do I add a depth camera to my robot and process the depth data in Isaac Sim?
<|im_end|>
<|im_start|>assistant"""

# Tokenize, generate, and decode exactly as in the previous example
```
## Known Limitations

### GGUF Conversion Issues
The GGUF conversion currently has metadata compatibility issues:
- Error: Missing `qwen2.context_length` field
- Workaround: Use HuggingFace or CTransformers formats
- Status: Under investigation for future updates
### Hardware Requirements
- HuggingFace: 8GB+ VRAM for full precision (see the quantization sketch below for smaller GPUs)
- CTransformers: 4GB+ VRAM for optimized inference
- GGUF: 2GB+ VRAM (when conversion is fixed)
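If your GPU falls below these thresholds, quantized loading through bitsandbytes can shrink the footprint further; a minimal 4-bit sketch, assuming the `bitsandbytes` and `accelerate` packages are installed (not an officially tested configuration for this model):

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer, BitsAndBytesConfig

quant_config = BitsAndBytesConfig(
    load_in_4bit=True,
    bnb_4bit_compute_dtype=torch.float16,  # compute in fp16 while weights stay 4-bit
)
model = AutoModelForCausalLM.from_pretrained(
    "TomBombadyl/Qwen2.5-Coder-7B-Instruct-Omni1.1",
    quantization_config=quant_config,
    device_map="auto",  # requires accelerate
)
tokenizer = AutoTokenizer.from_pretrained("TomBombadyl/Qwen2.5-Coder-7B-Instruct-Omni1.1")
```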
## Troubleshooting

### Common Issues

- Out of Memory Errors - load the model with 8-bit quantization:

  ```python
  # Use 8-bit quantization (requires bitsandbytes)
  model = AutoModelForCausalLM.from_pretrained(
      "TomBombadyl/Qwen2.5-Coder-7B-Instruct-Omni1.1",
      load_in_8bit=True,
      device_map="auto"
  )
  ```

- GGUF Loading Failures - use the HuggingFace or CTransformers formats instead; check the troubleshooting guide
- Isaac Sim Integration Issues - ensure Isaac Sim 5.0+ is installed; check the integration examples
## Documentation
- Model Card - Detailed model information
- Training Methodology - How the model was trained
- Performance Benchmarks - Evaluation results
- Troubleshooting Guide - Common issues and solutions
## Contributing
We welcome contributions! Please see our contributing guidelines for details.
## License
This project is licensed under the MIT License - see the LICENSE file for details.
## Acknowledgments
- NVIDIA Isaac Sim Team for the simulation platform
- Qwen Team for the base model
- Hugging Face for the training infrastructure
- Open Source Community for tools and libraries
## Support
- Issues: GitHub Issues
- Discussions: GitHub Discussions
- Documentation: Full Documentation
Note: This model is specifically trained for Isaac Sim 5.0 robotics development. For general coding tasks, consider using the base Qwen2.5-Coder-7B-Instruct model.
