
Qwen1.5-4B-Chat-rkllm

This is a conversion of Qwen/Qwen1.5-4B-Chat to the RKLLM format for chat on Rockchip devices.

Supported Devices

  • RK3588/RK3588s

Conversion tools

To convert LLMs for Rockchip's NPUs, see the articles [1][2] for details on the conversion process.
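For orientation, here is a minimal conversion sketch using the rkllm-toolkit Python package from airockchip/rknn-llm [1], typically run on an x86 Linux host. The function and parameter names (load_huggingface, build, export_rkllm, quantized_dtype, target_platform) follow the toolkit's example script for the 1.0.x releases and may differ between versions, so treat this as an outline rather than a drop-in script.

    # Sketch: convert Qwen1.5-4B-Chat to .rkllm with rkllm-toolkit (assumed ~1.0.1).
    # API names follow the example script in airockchip/rknn-llm and may vary by version.
    from rkllm.api import RKLLM

    llm = RKLLM()

    # Load the Hugging Face model (local path or hub id).
    if llm.load_huggingface(model='Qwen/Qwen1.5-4B-Chat') != 0:
        raise SystemExit('load_huggingface failed')

    # Quantize and build for the RK3588 NPU (w8a8 is the dtype used in the examples).
    if llm.build(do_quantization=True, optimization_level=1,
                 quantized_dtype='w8a8', target_platform='rk3588') != 0:
        raise SystemExit('build failed')

    # Write the converted model next to this script.
    if llm.export_rkllm('./Qwen1.5-4B-Chat.rkllm') != 0:
        raise SystemExit('export_rkllm failed')

The resulting .rkllm file is then deployed on the device with the RKLLM runtime listed below.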

Converted with RKLLM runtime

  • RKLLM runtime 1.0.1

License

Same as the original Qwen/Qwen1.5-4B-Chat

Troubleshooting

  • E RKNN: [10:48:59.683] failed to allocate handle, ret: -1, errno: 12, errstr: Cannot allocate memory
    firefly@firefly:~/Documents/rknn-llm$ rkllm ./chatglm3-6b.rkllm
    rkllm init start
    rkllm-runtime version: 1.0.1, rknpu driver version: 0.8.2, platform: RK3588
    Warning: Your rknpu driver version is too low, please upgrade to 0.9.6.
    E RKNN: [10:48:59.683] failed to allocate handle, ret: -1, errno: 12, errstr: Cannot allocate memory
    
    can not create weight memory for domain1
    E RKNN: [10:49:00.480] failed to allocate handle, ret: -1, errno: 12, errstr: Cannot allocate memory
    
    can not create weight memory for domain2
    E RKNN: [10:49:05.216] failed to convert handle(1020) to fd, ret: -1, errno: 24, errstr: Too many open files
    
    # Solution: raise the open-file descriptor limit before running
    # (a Python equivalent is sketched below this entry)
    firefly@firefly:~/Documents/rknn-llm$ ulimit -n 102400
    
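If you drive the RKLLM runtime from a Python process (for example through a community binding, which is an assumption here and not something this card ships), the same limit can be raised programmatically with the standard resource module instead of ulimit:

    # Sketch: raise RLIMIT_NOFILE from within Python before loading a model.
    # Only relevant when the runtime is called from a Python wrapper (an assumption);
    # the shell equivalent is `ulimit -n 102400`.
    import resource

    soft, hard = resource.getrlimit(resource.RLIMIT_NOFILE)
    target = 102400

    # Without extra privileges the soft limit can only be raised up to the hard limit.
    new_soft = target if hard == resource.RLIM_INFINITY else min(target, hard)
    resource.setrlimit(resource.RLIMIT_NOFILE, (new_soft, hard))
    print(f'RLIMIT_NOFILE soft limit is now {new_soft} (hard limit {hard})')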

References

  1. airockchip/rknn-llm
  2. Pelochus/ezrknn-llm
  3. Qwen/Qwen1.5-4B-Chat
  4. 跑大模型遇到问题 (Problems when running large models) #62