Model Card for fasoo/llama-3.1-legal

ν•΄λ‹Ή λͺ¨λΈμ€ 법λ₯  도메인 데이터 20만개의 ν•œκ΅­μ–΄ μƒ˜ν”Œμ— λŒ€ν•˜μ—¬ μΆ”κ°€ν•™μŠ΅ν•œ λͺ¨λΈμž…λ‹ˆλ‹€.

Training was performed on top of the Llama 3.1 8B base model; no additional instruction tuning was carried out.

λ˜ν•œ, 이 λͺ¨λΈμ€ κ³Όν•™κΈ°μˆ μ •λ³΄ν†΅μ‹ λΆ€Β·κ΄‘μ£Όκ΄‘μ—­μ‹œκ°€ 곡동 μ§€μ›ν•œ β€˜μΈκ³΅μ§€λŠ₯ 쀑심 μ‚°μ—…μœ΅ν•© 집적단지 μ‘°μ„±μ‚¬μ—…β€™μœΌλ‘œ 지원을 λ°›μ•„ κ°œλ°œλ˜μ—ˆμŠ΅λ‹ˆλ‹€.

Model Details

Model Description

This is the model card of a πŸ€— transformers model that has been pushed on the Hub. This model card has been automatically generated.

  • Developed by: fasoo
  • Model type: Causal Language Model
  • Language(s) (NLP): English, Korean
  • License: [More Information Needed]
  • Finetuned from model: Llama 3.1 8B

Model Sources [optional]

  • Paper [optional]: [More Information Needed]

Uses

```python
# Load the tokenizer and model directly from the Hub
from transformers import AutoTokenizer, AutoModelForCausalLM

tokenizer = AutoTokenizer.from_pretrained("fasoo/llama-3.1-legal")
model = AutoModelForCausalLM.from_pretrained("fasoo/llama-3.1-legal")
```

Environmental Impact

  • Hardware Type: NVIDIA H100 80GB HBM3
  • Compute Region: Gwangju, South Korea
Safetensors

  • Model size: 8.03B params
  • Tensor type: F32