---
library_name: transformers
language:
  - ko
---

Model Card for glm-4-ko-9b-chat

A full README is coming soon.

Model Details

Model Description

This is the model card of a 🤗 transformers model that has been pushed to the Hub. This model card has been automatically generated.

  • Developed by: 4n3mone (YongSang Yoo)
  • Model type: chatglm
  • Language(s) (NLP): Korean
  • License: glm-4
  • Finetuned from model: THUDM/glm-4-9b-chat

Model Sources [optional]

  • Repository: THUDM/glm-4-9b-chat
  • Paper [optional]: [More Information Needed]
  • Demo [optional]: [More Information Needed]

How to Get Started with the Model

Use the code below to get started with the model.

from transformers import AutoTokenizer
from vllm import LLM, SamplingParams


# glm-4-ko-9b-chat (Korean fine-tune of GLM-4-9B-Chat) served with vLLM
# If you encounter OOM (Out of Memory) issues, it is recommended to reduce max_model_len or increase tp_size.
max_model_len, tp_size = 131072, 1
model_name = "4n3mone/glm-4-ko-9b-chat-preview"
prompt = [{"role": "user", "content": "피카츄랑 아구몬 중에서 누가 더 귀여워?"}]  # "Who is cuter, Pikachu or Agumon?"

tokenizer = AutoTokenizer.from_pretrained(model_name, trust_remote_code=True)
llm = LLM(
    model=model_name,
    tensor_parallel_size=tp_size,
    max_model_len=max_model_len,
    trust_remote_code=True,
    enforce_eager=True,
    # If you encounter OOM (Out of Memory) issues, it is recommended to enable the following parameters.
    # enable_chunked_prefill=True,
    # max_num_batched_tokens=8192
)
# GLM-4 chat stop tokens: <|endoftext|>, <|user|>, <|observation|>
stop_token_ids = [151329, 151336, 151338]
sampling_params = SamplingParams(temperature=0.95, max_tokens=1024, stop_token_ids=stop_token_ids)

inputs = tokenizer.apply_chat_template(prompt, tokenize=False, add_generation_prompt=True)
outputs = llm.generate(prompts=inputs, sampling_params=sampling_params)

print(outputs[0].outputs[0].text)

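If vLLM is not available, the model can also be loaded with plain transformers. The sketch below adapts the usage example of the base model (THUDM/glm-4-9b-chat); it assumes a CUDA device with enough memory for bfloat16 weights, and the generation parameters are illustrative, not prescribed by this repository.

import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

device = "cuda"
model_name = "4n3mone/glm-4-ko-9b-chat-preview"

tokenizer = AutoTokenizer.from_pretrained(model_name, trust_remote_code=True)
model = AutoModelForCausalLM.from_pretrained(
    model_name,
    torch_dtype=torch.bfloat16,
    low_cpu_mem_usage=True,
    trust_remote_code=True,
).to(device).eval()

# Build the chat prompt with the model's own chat template.
messages = [{"role": "user", "content": "피카츄랑 아구몬 중에서 누가 더 귀여워?"}]  # "Who is cuter, Pikachu or Agumon?"
inputs = tokenizer.apply_chat_template(
    messages,
    add_generation_prompt=True,
    tokenize=True,
    return_tensors="pt",
    return_dict=True,
).to(device)

with torch.no_grad():
    outputs = model.generate(**inputs, max_new_tokens=1024, do_sample=True, temperature=0.95)
    # Strip the prompt tokens before decoding the reply.
    outputs = outputs[:, inputs["input_ids"].shape[1]:]
    print(tokenizer.decode(outputs[0], skip_special_tokens=True))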

Training Details

Training Data

[More Information Needed]

Training Procedure

Preprocessing [optional]

[More Information Needed]

Training Hyperparameters

  • Training regime: [More Information Needed]

Speeds, Sizes, Times [optional]

[More Information Needed]

Evaluation

Testing Data, Factors & Metrics

Testing Data

[More Information Needed]

Factors

[More Information Needed]

Metrics

[More Information Needed]

Results

[More Information Needed]

Summary

Model Examination [optional]

[More Information Needed]

Environmental Impact

Carbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019).

  • Hardware Type: [More Information Needed]
  • Hours used: [More Information Needed]
  • Cloud Provider: [More Information Needed]
  • Compute Region: [More Information Needed]
  • Carbon Emitted: [More Information Needed]

Technical Specifications [optional]

Model Architecture and Objective

[More Information Needed]

Compute Infrastructure

[More Information Needed]

Hardware

[More Information Needed]

Software

[More Information Needed]

Citation [optional]

BibTeX:

[More Information Needed]

APA:

[More Information Needed]

Glossary [optional]

[More Information Needed]

More Information [optional]

[More Information Needed]

Model Card Authors [optional]

[More Information Needed]

Model Card Contact

[More Information Needed]