---
language:
- ko
- en
tags:
- Transformer
- korean
- romanization
- person name
- 한국어
- korean-romanization
license: cc-by-nc-sa-4.0
---
# Kraft (Korean Romanization From Transformer)
The Kraft (Korean Romanization From Transformer) model transliterates a Korean person name written in Hangul into the Roman alphabet, following the McCune–Reischauer system. Kraft uses the Transformer architecture, a neural network architecture introduced in the 2017 paper "Attention Is All You Need" by Google researchers and designed for sequence-to-sequence tasks such as machine translation, language modeling, and summarization.

Romanizing a Korean name is itself a machine translation task: the input is a sequence of characters representing a Korean name, and the output is a sequence of characters representing its English romanization. The Transformer, with its attention mechanism and its ability to handle input sequences of varying lengths, is well suited to this task and can translate Korean names into English romanization accurately.
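The model card does not specify Kraft's exact tokenization, but one common way to frame such a character-level task is to decompose each precomposed Hangul syllable into its constituent jamo (letters) before feeding it to the encoder. The sketch below shows this decomposition using the standard Unicode arithmetic for the Hangul Syllables block (U+AC00–U+D7A3); it is an illustrative assumption, not Kraft's documented preprocessing.

```python
# Decompose Hangul syllables into jamo tokens, a plausible way to
# frame a Korean name as a character-level seq2seq input.
# (Illustrative assumption; the model card does not document
# Kraft's actual tokenization.)

LEADS = list("ㄱㄲㄴㄷㄸㄹㅁㅂㅃㅅㅆㅇㅈㅉㅊㅋㅌㅍㅎ")
VOWELS = list("ㅏㅐㅑㅒㅓㅔㅕㅖㅗㅘㅙㅚㅛㅜㅝㅞㅟㅠㅡㅢㅣ")
TAILS = [""] + list("ㄱㄲㄳㄴㄵㄶㄷㄹㄺㄻㄼㄽㄾㄿㅀㅁㅂㅄㅅㅆㅇㅈㅊㅋㅌㅍㅎ")

def to_jamo(name: str) -> list[str]:
    """Decompose precomposed Hangul syllables (U+AC00-U+D7A3) into jamo."""
    tokens = []
    for ch in name:
        code = ord(ch) - 0xAC00
        if 0 <= code <= 0xD7A3 - 0xAC00:
            lead, rest = divmod(code, 21 * 28)   # 21 vowels x 28 tails
            vowel, tail = divmod(rest, 28)
            tokens += [LEADS[lead], VOWELS[vowel]]
            if tail:
                tokens.append(TAILS[tail])
        else:
            tokens.append(ch)  # pass non-Hangul characters through
    return tokens

print(to_jamo("김구"))  # ['ㄱ', 'ㅣ', 'ㅁ', 'ㄱ', 'ㅜ']
```

The decoder's target side would then be the corresponding sequence of Roman characters (e.g. "kim ku" under McCune–Reischauer).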
## Model description
The Transformer model consists of an encoder and a decoder: the encoder reads a sequence in the source language, and the decoder generates the corresponding sequence in the target language.
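The encoder–decoder interface can be sketched with PyTorch's built-in `nn.Transformer`. The dimensions below are illustrative placeholders, not Kraft's actual hyperparameters (which the card does not list); real use would also require embedding layers, a causal target mask, and an output projection over the target vocabulary.

```python
import torch
import torch.nn as nn

# Minimal encoder-decoder sketch with illustrative dimensions
# (not Kraft's actual hyperparameters, which are not documented here).
d_model = 32
model = nn.Transformer(d_model=d_model, nhead=4,
                       num_encoder_layers=2, num_decoder_layers=2,
                       dim_feedforward=64, batch_first=True)

batch, src_len, tgt_len = 1, 5, 7  # e.g. 5 Hangul jamo in, 7 Roman letters out
src = torch.rand(batch, src_len, d_model)  # embedded source (Hangul) sequence
tgt = torch.rand(batch, tgt_len, d_model)  # embedded target (Roman) sequence

out = model(src, tgt)  # one decoder state per target position
print(out.shape)       # torch.Size([1, 7, 32])
```

At inference time the decoder runs autoregressively, emitting one Roman character at a time until an end-of-sequence token.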
## Intended uses & limitations
Note that this model is primarily intended for translating Korean person names into English romanization; performance on other kinds of Korean text is not guaranteed.
## Authors
Queenie Luo
Yafei Chen
Hongsu Wang
Kanghun Ahn
Sun Joo Kim
Peter Bol
CBDB Group
## Acknowledgement
Mikyung Kang
Hyoungbae Lee
Shirley Boya Ouyang
## License
Copyright (c) 2023 CBDB
Except where otherwise noted, content on this repository is licensed under the Creative Commons Attribution-NonCommercial-ShareAlike 4.0 International License (CC BY-NC-SA 4.0). To view a copy of this license, visit http://creativecommons.org/licenses/by-nc-sa/4.0/ or send a letter to Creative Commons, PO Box 1866, Mountain View, CA 94042, USA.