# 🤗 KoAirBERT ✈️

A Korean BERT model specialized for the aviation safety domain.
## How to use

🤗 The model uploaded to the Hugging Face Hub can be used right away :)
```python
# Load model directly
from transformers import AutoTokenizer, AutoModelForPreTraining

tokenizer = AutoTokenizer.from_pretrained("oneonlee/KoAirBERT")
model = AutoModelForPreTraining.from_pretrained("oneonlee/KoAirBERT")
```
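Since the checkpoint carries BERT's masked-language-modeling head, it can also be queried through the `fill-mask` pipeline. The sketch below assumes network access to the Hub; the Korean example sentence ("The pilot reported an engine anomaly [MASK] takeoff") is purely illustrative.

```python
from transformers import pipeline

# Sketch: run KoAirBERT's masked-language-modeling head via the
# fill-mask pipeline (downloads the checkpoint from the Hub).
fill_mask = pipeline("fill-mask", model="oneonlee/KoAirBERT")

# Illustrative aviation-safety sentence with one [MASK] token;
# the pipeline returns the top-5 candidate tokens by default.
results = fill_mask("조종사는 이륙 [MASK] 엔진 이상을 보고했다.")
for r in results:
    print(r["token_str"], round(r["score"], 4))
```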
## Reference
- BERT
- klue/bert-base
- huggingface/transformers - pytorch language-modeling examples
- GYRO (aviation safety information magazine)
- Aviation Wiki
- Ministry of Land, Infrastructure and Transport Aviation Glossary
- Aviation Safety Voluntary Reporting White Paper (2021)
## Citation

If you use this code for research, please cite it as follows.
```bibtex
@software{lee_2023_10158254,
  author    = {Lee, DongGeon},
  title     = {KoAirBERT: Korean BERT Model Specialized for Aviation Safety Domain},
  month     = nov,
  year      = 2023,
  publisher = {Zenodo},
  version   = {v1.0.0},
  doi       = {10.5281/zenodo.10158254},
  url       = {https://doi.org/10.5281/zenodo.10158254}
}
```
## License

KoAirBERT is released under the AGPL-3.0 license. Please comply with the license terms when using the model or code. The full license text is available in the LICENSE file.