---
base_model:
  - Qwen/Qwen2.5-14B-Instruct
datasets:
  - LLM4Code/expanded_origen_126k
license: apache-2.0
tags:
  - Verilog
  - CodeGen
pipeline_tag: text-generation
library_name: transformers
---

# VeriCoder: Enhancing LLM-Based RTL Code Generation through Functional Correctness Validation

This repository hosts VeriCoder, a model presented in the paper [VeriCoder: Enhancing LLM-Based RTL Code Generation through Functional Correctness Validation](https://arxiv.org/abs/2504.15659).

VeriCoder is a model for Register Transfer Level (RTL) code generation, fine-tuned on a dataset validated for functional correctness. The fine-tuning dataset is constructed with a novel methodology that combines unit test generation with feedback-directed refinement: given a natural language specification and an initial RTL design, a teacher model iteratively revises the design based on simulation results from the generated tests. Each example in the dataset is functionally validated and consists of a natural language description, an RTL implementation, and passing tests.
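
The construction loop can be pictured roughly as follows. This is an illustrative sketch, not the authors' pipeline: `generate_tests`, `simulate`, and `revise` are hypothetical stand-ins for the teacher-model and simulation steps described above, passed in as callables to keep the sketch self-contained.

```python
from typing import Callable, NamedTuple, Optional

class SimResult(NamedTuple):
    passed: bool  # did the RTL design pass the generated testbench?
    log: str      # simulator output, fed back to the teacher model

def build_validated_example(
    spec: str,
    initial_rtl: str,
    generate_tests: Callable[[str], str],       # hypothetical: teacher model writes a testbench for the spec
    simulate: Callable[[str, str], SimResult],  # hypothetical: run RTL + tests in a Verilog simulator
    revise: Callable[[str, str, str], str],     # hypothetical: teacher revises RTL given spec + failure log
    max_iters: int = 5,
) -> Optional[dict]:
    """Iteratively refine an RTL design until it passes the generated unit tests."""
    tests = generate_tests(spec)
    rtl = initial_rtl
    for _ in range(max_iters):
        result = simulate(rtl, tests)
        if result.passed:
            # Functionally validated triple: description, RTL, passing tests.
            return {"description": spec, "rtl": rtl, "tests": tests}
        # Feedback-directed refinement: revise the design using simulation feedback.
        rtl = revise(spec, rtl, result.log)
    return None  # examples that never pass are discarded
```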

For more details and code, visit the GitHub Repository.
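
Since the card lists `library_name: transformers`, the model should load with the standard `Auto` classes. The sketch below is a minimal, unofficial example: the hub id `LLM4Code/VeriCoder_Qwen14B` is an assumption based on this repository's name, and the prompt wording is illustrative (the chat template is inherited from the Qwen2.5-14B-Instruct base model).

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "LLM4Code/VeriCoder_Qwen14B"  # assumed hub id; adjust to this repo's actual path

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id, torch_dtype="auto", device_map="auto")

# A natural language specification, as described in the paper.
spec = "Design a 4-bit synchronous up-counter with an active-high synchronous reset."
messages = [{"role": "user", "content": f"Write Verilog that implements the following specification:\n{spec}"}]

input_ids = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)

output_ids = model.generate(input_ids, max_new_tokens=512)
print(tokenizer.decode(output_ids[0][input_ids.shape[-1]:], skip_special_tokens=True))
```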

## Key Highlights

- **Functionally Validated Dataset:** 125,000+ examples with simulation-passing RTL designs.
- **Feedback-Driven Construction:** Designs and tests are iteratively refined based on simulation results.
- **Superior Performance:** Achieves up to +71.7% relative improvement on VerilogEval benchmarks.
- **Comprehensive Resources:** Includes the dataset, model weights, inference scripts, and training pipeline.

## Citation

If you find VeriCoder helpful in your research, please consider citing:

```bibtex
@article{wei2025vericoder,
  title={VeriCoder: Enhancing LLM-Based RTL Code Generation through Functional Correctness Validation},
  author={Wei, Anjiang and Tan, Huanmi and Suresh, Tarun and Mendoza, Daniel and Teixeira, Thiago SFX and Wang, Ke and Trippel, Caroline and Aiken, Alex},
  journal={arXiv preprint arXiv:2504.15659},
  year={2025}
}
```