---
library_name: transformers
tags:
  - kesehatan
  - stunting
  - anak
license: mit
datasets:
  - kodetr/stunting-qa-2025
language:
  - id
metrics:
  - rouge
  - bleu
base_model:
  - meta-llama/Llama-3.2-3B-Instruct
pipeline_tag: text-generation
---

Model Description

Stunting consultation (Q&A) for children.

  • Developed by: Tanwir
  • Language: Indonesian

Training


Training results

***** train metrics *****
  epoch                    =     2.9987
  num_input_tokens_seen    =    1900976
  total_flos               = 79944066GF
  train_loss               =      0.872
  train_runtime            = 1:06:36.18
  train_samples_per_second =      5.737
  train_steps_per_second   =      0.358
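
A few quantities can be derived from these figures. The snippet below is a back-of-the-envelope sketch based only on the reported throughput numbers, not values logged during training.

```python
# Rough derivations from the reported training metrics (not logged directly).
runtime_s = 1 * 3600 + 6 * 60 + 36.18            # train_runtime = 1:06:36.18
samples_per_s, steps_per_s, epochs = 5.737, 0.358, 2.9987

total_steps = steps_per_s * runtime_s             # ≈ 1,430 optimizer steps
total_samples = samples_per_s * runtime_s         # ≈ 22,900 samples seen over ~3 epochs
effective_batch = samples_per_s / steps_per_s     # ≈ 16 samples per optimizer step
examples_per_epoch = total_samples / epochs       # ≈ 7,600 training examples

print(f"steps≈{total_steps:.0f} batch≈{effective_batch:.0f} examples≈{examples_per_epoch:.0f}")
```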

Evaluation

***** predict metrics *****
  predict_bleu-4                 =    38.2196
  predict_model_preparation_time =     0.0039
  predict_rouge-1                =    42.4273
  predict_rouge-2                =    24.2559
  predict_rouge-l                =    38.6878
  predict_runtime                = 1:24:43.63
  predict_samples_per_second     =      1.503
  predict_steps_per_second       =      0.752
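
The exact scoring script behind these predict_* numbers is not included here; a generic way to compute comparable ROUGE/BLEU scores with the Hugging Face `evaluate` library looks like this (the strings are placeholders, not actual model outputs):

```python
# Minimal sketch: ROUGE/BLEU scoring with the `evaluate` library.
import evaluate

predictions = ["Stunting adalah gangguan pertumbuhan kronis pada anak."]
references = ["Stunting adalah kondisi gagal tumbuh pada anak akibat kekurangan gizi kronis."]

rouge = evaluate.load("rouge")
bleu = evaluate.load("bleu")

print(rouge.compute(predictions=predictions, references=references))
print(bleu.compute(predictions=predictions, references=[[r] for r in references]))
```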

Parameters

LlamaConfig {
  "architectures": [
    "LlamaForCausalLM"
  ],
  "attention_bias": false,
  "attention_dropout": 0.0,
  "bos_token_id": 128000,
  "eos_token_id": [
    128001,
    128008,
    128009
  ],
  "head_dim": 128,
  "hidden_act": "silu",
  "hidden_size": 3072,
  "initializer_range": 0.02,
  "intermediate_size": 8192,
  "max_position_embeddings": 131072,
  "mlp_bias": false,
  "model_type": "llama",
  "num_attention_heads": 24,
  "num_hidden_layers": 28,
  "num_key_value_heads": 8,
  "pretraining_tp": 1,
  "rms_norm_eps": 1e-05,
  "rope_scaling": {
    "factor": 32.0,
    "high_freq_factor": 4.0,
    "low_freq_factor": 1.0,
    "original_max_position_embeddings": 8192,
    "rope_type": "llama3"
  },
  "rope_theta": 500000.0,
  "tie_word_embeddings": true,
  "torch_dtype": "bfloat16",
  "transformers_version": "4.51.3",
  "use_cache": true,
  "vocab_size": 128256
}
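
The same configuration can be read programmatically from the model repository, which is a quick sanity check before downloading the full weights:

```python
# Inspect the published config without loading the weights.
from transformers import AutoConfig

config = AutoConfig.from_pretrained("kodetr/stunting-qa-v6")
print(config.hidden_size)              # 3072
print(config.num_hidden_layers)        # 28
print(config.max_position_embeddings)  # 131072
```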

Use with transformers

Make sure to update your transformers installation via `pip install --upgrade transformers`.

import torch
from transformers import pipeline

# Load the fine-tuned model as a chat-style text-generation pipeline.
model_id = "kodetr/stunting-qa-v6"
pipe = pipeline(
    "text-generation",
    model=model_id,
    torch_dtype=torch.bfloat16,
    device_map="auto",
)

# Chat messages follow the Llama 3 instruct format.
messages = [
    {"role": "system", "content": "Jelaskan definisi 1000 hari pertama kehidupan."},
    {"role": "user", "content": "Apa itu 1000 hari pertama kehidupan?"},
]
outputs = pipe(
    messages,
    max_new_tokens=256,
)
# The last message in the generated conversation is the assistant's reply.
print(outputs[0]["generated_text"][-1])
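
If you prefer to manage tokenization and generation explicitly rather than using the pipeline helper, an equivalent sketch with AutoTokenizer/AutoModelForCausalLM is:

```python
# Equivalent explicit loading; relies on the chat template shipped with the
# Llama 3.2 Instruct tokenizer.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "kodetr/stunting-qa-v6"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,
    device_map="auto",
)

messages = [{"role": "user", "content": "Apa itu 1000 hari pertama kehidupan?"}]
input_ids = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)

output_ids = model.generate(input_ids, max_new_tokens=256)
# Decode only the newly generated tokens (the assistant's answer).
print(tokenizer.decode(output_ids[0][input_ids.shape[-1]:], skip_special_tokens=True))
```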