
llama-2-ko-story-7b

llama-2-ko-story-7b is a model based on beomi/llama-2-ko-7b, further trained on raw Korean novel data.
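A minimal usage sketch with the Hugging Face `transformers` library is shown below. The repository id, prompt, and generation settings are placeholders, not taken from the card; substitute the full Hub repository id for this model.

```python
# Minimal usage sketch (assumes the model is available on the Hugging Face Hub;
# replace the repo id below with the actual repository path).
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "llama-2-ko-story-7b"  # placeholder repo id

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.float16,
    device_map="auto",
)

prompt = "옛날 μ˜›λ‚ , ν•œ λ§ˆμ„μ—"  # "Once upon a time, in a village..."
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=128, do_sample=True, temperature=0.8)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```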

Training data

llama-2-ko-story-7b was trained on about 167 MB of Korean novel corpora. The main datasets are as follows.

| Source | Size (MB) | Link |
| --- | --- | --- |
| Korean novel corpus | 115.0 | |
| Gongu Madang Korean classical literature corpus | 53.0 | https://gongu.copyright.or.kr/ |

Training

llama-2-ko-story-7bλŠ” beomi/llama-2-ko-7bμ—μ„œ qlora둜 μΆ”κ°€ ν•™μŠ΅λ˜μ—ˆμŠ΅λ‹ˆλ‹€.

  • lora_alpha: 16
  • lora_dropout: 0.05
  • lora_r: 32
  • target_modules: q_proj, v_proj
  • epoch: 3
  • learning_rate: 3e-4