---
base_model: 01-ai/Yi-Coder-9B-Chat
library_name: transformers
license: apache-2.0
pipeline_tag: text-generation
quantized_by: stelterlab
---
# Yi Coder 9B Chat by 01.AI
- Model creator: [01-ai](https://huggingface.co/01-ai)
- Original model: [Yi-Coder-9B-Chat](https://huggingface.co/01-ai/Yi-Coder-9B-Chat)
- AWQ quantization: done by stelterlab in INT4 GEMM with [AutoAWQ](https://github.com/casper-hansen/AutoAWQ/) by casper-hansen
## Model Summary
Yi-Coder-9B-Chat is a coding model from 01.AI that supports 52 programming languages and features a maximum context length of 128K tokens, making it well suited to ingesting large codebases. The model is tuned for chat, not autocompletion, so programming questions should be posed conversationally. It is the first model under 10B parameters to pass 20% on LiveCodeBench.
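Since the model expects a chat-style prompt, a minimal inference sketch with `transformers` looks like the following (the quantized repo id is an assumption; substitute the actual repository name):

```python
# Sketch: chatting with the AWQ-quantized model via transformers.
# Requires a GPU; the repo id below is a placeholder assumption.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "stelterlab/Yi-Coder-9B-Chat-AWQ"  # hypothetical repo id
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id, device_map="auto")

# Pose the programming question as a chat turn, not a raw completion prompt.
messages = [
    {"role": "user", "content": "Write a Python function that reverses a string."}
]
inputs = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)

outputs = model.generate(inputs, max_new_tokens=256)
# Decode only the newly generated tokens.
print(tokenizer.decode(outputs[0][inputs.shape[-1]:], skip_special_tokens=True))
```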
## Technical Details
Trained on an extensive set of languages:
- java
- markdown
- python
- php
- javascript
- c++
- c#
- c
- typescript
- html
- go
- java_server_pages
- dart
- objective-c
- kotlin
- tex
- swift
- ruby
- sql
- rust
- css
- yaml
- matlab
- lua
- json
- shell
- visual_basic
- scala
- rmarkdown
- pascal
- fortran
- haskell
- assembly
- perl
- julia
- cmake
- groovy
- ocaml
- powershell
- elixir
- clojure
- makefile
- coffeescript
- erlang
- lisp
- toml
- batchfile
- cobol
- dockerfile
- r
- prolog
- verilog
With its 128K context length, the model achieves a 23% pass rate on LiveCodeBench, surpassing even some SOTA models in the 15B–33B range.
For more information, see the original model card: [Yi-Coder-9B-Chat](https://huggingface.co/01-ai/Yi-Coder-9B-Chat).