<!--Copyright 2022 The HuggingFace Team. All rights reserved.
Licensed under the Apache License, Version 2.0 (the "License"); you may not use this file except in compliance with
the License. You may obtain a copy of the License at
http://www.apache.org/licenses/LICENSE-2.0
Unless required by applicable law or agreed to in writing, software distributed under the License is distributed on
an "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the License for the
specific language governing permissions and limitations under the License.
⚠️ Note that this file is in Markdown but contains specific syntax for our doc-builder (similar to MDX) that may not be
rendered properly in your Markdown viewer.
-->
# Load pretrained instances with an AutoClass[[load-pretrained-instances-with-an-autoclass]]
트랜슀포머 μ•„ν‚€ν…μ²˜κ°€ 맀우 λ‹€μ–‘ν•˜κΈ° λ•Œλ¬Έμ— μ²΄ν¬ν¬μΈνŠΈμ— λ§žλŠ” μ•„ν‚€ν…μ²˜λ₯Ό μƒμ„±ν•˜λŠ” 것이 μ–΄λ €μšΈ 수 μžˆμŠ΅λ‹ˆλ‹€. 라이브러리λ₯Ό 쉽고 κ°„λ‹¨ν•˜λ©° μœ μ—°ν•˜κ²Œ μ‚¬μš©ν•˜κΈ° μœ„ν•œ Transformer 핡심 μ² ν•™μ˜ μΌν™˜μœΌλ‘œ, `AutoClass`λŠ” 주어진 μ²΄ν¬ν¬μΈνŠΈμ—μ„œ μ˜¬λ°”λ₯Έ μ•„ν‚€ν…μ²˜λ₯Ό μžλ™μœΌλ‘œ μΆ”λ‘ ν•˜μ—¬ λ‘œλ“œν•©λ‹ˆλ‹€. `from_pretrained()` λ©”μ„œλ“œλ₯Ό μ‚¬μš©ν•˜λ©΄ λͺ¨λ“  μ•„ν‚€ν…μ²˜μ— λŒ€ν•΄ 사전 ν•™μŠ΅λœ λͺ¨λΈμ„ λΉ λ₯΄κ²Œ λ‘œλ“œν•  수 μžˆμœΌλ―€λ‘œ λͺ¨λΈμ„ μ²˜μŒλΆ€ν„° ν•™μŠ΅ν•˜λŠ” 데 μ‹œκ°„κ³Ό λ¦¬μ†ŒμŠ€λ₯Ό νˆ¬μž…ν•  ν•„μš”κ°€ μ—†μŠ΅λ‹ˆλ‹€.
μ²΄ν¬ν¬μΈνŠΈμ— ꡬ애받지 μ•ŠλŠ” μ½”λ“œλ₯Ό μƒμ„±ν•œλ‹€λŠ” 것은 μ½”λ“œκ°€ ν•œ μ²΄ν¬ν¬μΈνŠΈμ—μ„œ μž‘λ™ν•˜λ©΄ μ•„ν‚€ν…μ²˜κ°€ λ‹€λ₯΄λ”라도 λ‹€λ₯Έ 체크포인트(μœ μ‚¬ν•œ μž‘μ—…μ— λŒ€ν•΄ ν•™μŠ΅λœ 경우)μ—μ„œλ„ μž‘λ™ν•œλ‹€λŠ” 것을 μ˜λ―Έν•©λ‹ˆλ‹€.
<Tip>
μ•„ν‚€ν…μ²˜λŠ” λͺ¨λΈμ˜ 골격을 μ˜λ―Έν•˜λ©° μ²΄ν¬ν¬μΈνŠΈλŠ” 주어진 μ•„ν‚€ν…μ²˜μ— λŒ€ν•œ κ°€μ€‘μΉ˜μž…λ‹ˆλ‹€. 예λ₯Ό λ“€μ–΄, [BERT](https://huggingface.co/bert-base-uncased)λŠ” μ•„ν‚€ν…μ²˜μ΄κ³ , `bert-base-uncased`λŠ” μ²΄ν¬ν¬μΈνŠΈμž…λ‹ˆλ‹€. λͺ¨λΈμ€ μ•„ν‚€ν…μ²˜ λ˜λŠ” 체크포인트λ₯Ό μ˜λ―Έν•  수 μžˆλŠ” 일반적인 μš©μ–΄μž…λ‹ˆλ‹€.
</Tip>
이 νŠœν† λ¦¬μ–Όμ—μ„œλŠ” λ‹€μŒμ„ ν•™μŠ΅ν•©λ‹ˆλ‹€:
* 사전 ν•™μŠ΅λœ ν† ν¬λ‚˜μ΄μ € λ‘œλ“œν•˜κΈ°.
* 사전 ν•™μŠ΅λœ 이미지 ν”„λ‘œμ„Έμ„œ λ‘œλ“œν•˜κΈ°.
* 사전 ν•™μŠ΅λœ νŠΉμ§• μΆ”μΆœκΈ° λ‘œλ“œν•˜κΈ°.
* 사전 ν›ˆλ ¨λœ ν”„λ‘œμ„Έμ„œ λ‘œλ“œν•˜κΈ°.
* 사전 ν•™μŠ΅λœ λͺ¨λΈ λ‘œλ“œν•˜κΈ°.
## AutoTokenizer[[autotokenizer]]
Nearly every NLP task begins with a tokenizer. A tokenizer converts your input into a format that can be processed by the model.

Load a tokenizer with [`AutoTokenizer.from_pretrained`]:
```py
>>> from transformers import AutoTokenizer
>>> tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
```
Then tokenize your input as shown below:
```py
>>> sequence = "In a hole in the ground there lived a hobbit."
>>> print(tokenizer(sequence))
{'input_ids': [101, 1999, 1037, 4920, 1999, 1996, 2598, 2045, 2973, 1037, 7570, 10322, 4183, 1012, 102],
'token_type_ids': [0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0],
'attention_mask': [1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1]}
```
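The tokenizer also accepts batches. As a minimal sketch (the second sentence is an arbitrary example added here), `padding=True` pads every sequence to the longest one in the batch and `return_tensors` selects the tensor format:

```py
>>> sentences = [
...     "In a hole in the ground there lived a hobbit.",
...     "Not a nasty, dirty, wet hole.",
... ]
>>> # Pad to the longest sequence in the batch and return PyTorch tensors
>>> batch = tokenizer(sentences, padding=True, return_tensors="pt")
```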
## AutoImageProcessor[[autoimageprocessor]]
λΉ„μ „ μž‘μ—…μ˜ 경우 이미지 ν”„λ‘œμ„Έμ„œκ°€ 이미지λ₯Ό μ˜¬λ°”λ₯Έ μž…λ ₯ ν˜•μ‹μœΌλ‘œ μ²˜λ¦¬ν•©λ‹ˆλ‹€.
```py
>>> from transformers import AutoImageProcessor
>>> image_processor = AutoImageProcessor.from_pretrained("google/vit-base-patch16-224")
```
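As a minimal usage sketch, you can then call the image processor on an image; the blank Pillow image below is only a placeholder so the example stays self-contained:

```py
>>> from PIL import Image

>>> image = Image.new("RGB", (640, 480))  # placeholder image; use a real photo in practice
>>> # Resizes, rescales and normalizes the image; for this checkpoint the
>>> # resulting pixel_values tensor is typically of shape (1, 3, 224, 224)
>>> inputs = image_processor(images=image, return_tensors="pt")
```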
## AutoFeatureExtractor[[autofeatureextractor]]
μ˜€λ””μ˜€ μž‘μ—…μ˜ 경우 νŠΉμ§• μΆ”μΆœκΈ°κ°€ μ˜€λ””μ˜€ μ‹ ν˜Έλ₯Ό μ˜¬λ°”λ₯Έ μž…λ ₯ ν˜•μ‹μœΌλ‘œ μ²˜λ¦¬ν•©λ‹ˆλ‹€.
[`AutoFeatureExtractor.from_pretrained`]둜 νŠΉμ§• μΆ”μΆœκΈ°λ₯Ό λ‘œλ“œν•©λ‹ˆλ‹€:
```py
>>> from transformers import AutoFeatureExtractor
>>> feature_extractor = AutoFeatureExtractor.from_pretrained(
... "ehcalabres/wav2vec2-lg-xlsr-en-speech-emotion-recognition"
... )
```
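As a minimal sketch of what comes next, the feature extractor is called on a raw waveform; one second of NumPy silence stands in for real audio here:

```py
>>> import numpy as np

>>> raw_audio = np.zeros(16000, dtype=np.float32)  # one second of silence at 16 kHz (placeholder audio)
>>> # sampling_rate lets the feature extractor check the audio matches what the model expects
>>> inputs = feature_extractor(raw_audio, sampling_rate=16000, return_tensors="pt")
```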
## AutoProcessor[[autoprocessor]]
Multimodal tasks require a processor that combines two types of preprocessing tools. For example, the LayoutLMV2 model requires an image processor to handle images and a tokenizer to handle text; a processor combines both of them.

Load a processor with [`AutoProcessor.from_pretrained`]:
```py
>>> from transformers import AutoProcessor
>>> processor = AutoProcessor.from_pretrained("microsoft/layoutlmv2-base-uncased")
```
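As a sketch of the combined preprocessing, with hypothetical words and bounding boxes supplied manually (the default `apply_ocr=True` path would instead run OCR itself, which assumes `pytesseract` is installed):

```py
>>> from PIL import Image
>>> from transformers import AutoProcessor

>>> # Reload with OCR turned off so the words and boxes can be passed in manually
>>> processor = AutoProcessor.from_pretrained("microsoft/layoutlmv2-base-uncased", apply_ocr=False)
>>> document = Image.new("RGB", (1000, 1000), color="white")  # placeholder document image
>>> words = ["hello", "world"]  # hypothetical OCR output
>>> boxes = [[48, 84, 156, 108], [200, 84, 310, 108]]  # one 0-1000 normalized box per word
>>> encoding = processor(document, words, boxes=boxes, return_tensors="pt")
```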
## AutoModel[[automodel]]
<frameworkcontent>
<pt>
Finally, the `AutoModelFor` classes let you load a pretrained model for a given task (see [here](model_doc/auto) for a complete list of available tasks). For example, load a model for sequence classification with [`AutoModelForSequenceClassification.from_pretrained`]:
```py
>>> from transformers import AutoModelForSequenceClassification
>>> model = AutoModelForSequenceClassification.from_pretrained("distilbert-base-uncased")
```
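From here, inference is just tokenize-then-forward. A minimal sketch (note that `distilbert-base-uncased` ships without a fine-tuned classification head, so these logits come from a randomly initialized head until you fine-tune):

```py
>>> import torch
>>> from transformers import AutoTokenizer

>>> tokenizer = AutoTokenizer.from_pretrained("distilbert-base-uncased")
>>> inputs = tokenizer("In a hole in the ground there lived a hobbit.", return_tensors="pt")
>>> with torch.no_grad():
...     logits = model(**inputs).logits  # shape: (batch_size, num_labels)
```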
λ™μΌν•œ 체크포인트λ₯Ό μ‰½κ²Œ μž¬μ‚¬μš©ν•˜μ—¬ λ‹€λ₯Έ μž‘μ—…μ— μ•„ν‚€ν…μ²˜λ₯Ό λ‘œλ“œν•  수 μžˆμŠ΅λ‹ˆλ‹€:
```py
>>> from transformers import AutoModelForTokenClassification
>>> model = AutoModelForTokenClassification.from_pretrained("distilbert-base-uncased")
```
<Tip warning={true}>
PyTorchλͺ¨λΈμ˜ 경우 `from_pretrained()` λ©”μ„œλ“œλŠ” λ‚΄λΆ€μ μœΌλ‘œ 피클을 μ‚¬μš©ν•˜μ—¬ μ•ˆμ „ν•˜μ§€ μ•Šμ€ κ²ƒμœΌλ‘œ μ•Œλ €μ§„ `torch.load()`λ₯Ό μ‚¬μš©ν•©λ‹ˆλ‹€.
일반적으둜 μ‹ λ’°ν•  수 μ—†λŠ” μ†ŒμŠ€μ—μ„œ κ°€μ Έμ™”κ±°λ‚˜ λ³€μ‘°λ˜μ—ˆμ„ 수 μžˆλŠ” λͺ¨λΈμ€ λ‘œλ“œν•˜μ§€ λ§ˆμ„Έμš”. ν—ˆκΉ… 페이슀 ν—ˆλΈŒμ—μ„œ ν˜ΈμŠ€νŒ…λ˜λŠ” 곡개 λͺ¨λΈμ˜ 경우 μ΄λŸ¬ν•œ λ³΄μ•ˆ μœ„ν—˜μ΄ λΆ€λΆ„μ μœΌλ‘œ μ™„ν™”λ˜λ©°, 각 컀밋 μ‹œ 멀웨어λ₯Ό [κ²€μ‚¬ν•©λ‹ˆλ‹€](https://huggingface.co/docs/hub/security-malware). GPGλ₯Ό μ‚¬μš©ν•΄ μ„œλͺ…λœ [컀밋 검증](https://huggingface.co/docs/hub/security-gpg#signing-commits-with-gpg)κ³Ό 같은 λͺ¨λ²”μ‚¬λ‘€λŠ” [λ¬Έμ„œ](https://huggingface.co/docs/hub/security)λ₯Ό μ°Έμ‘°ν•˜μ„Έμš”.
ν…μ„œν”Œλ‘œμš°μ™€ Flax μ²΄ν¬ν¬μΈνŠΈλŠ” 영ν–₯을 받지 μ•ŠμœΌλ©°, `from_pretrained`λ©”μ„œλ“œμ— `from_tf` 와 `from_flax` ν‚€μ›Œλ“œ κ°€λ³€ 인자λ₯Ό μ‚¬μš©ν•˜μ—¬ 이 문제λ₯Ό μš°νšŒν•  수 μžˆμŠ΅λ‹ˆλ‹€.
</Tip>
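As a hedged sketch of that workaround (assuming TensorFlow is installed and the checkpoint actually provides TF weights, as `bert-base-uncased` does):

```py
>>> from transformers import AutoModel

>>> # Load the TensorFlow weights into a PyTorch model, bypassing the pickle-based torch.load path
>>> model = AutoModel.from_pretrained("bert-base-uncased", from_tf=True)
```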
Generally, we recommend using the `AutoTokenizer` class and the `AutoModelFor` classes to load pretrained instances of models. This will ensure you load the correct architecture every time. In the next [tutorial](preprocessing), learn how to use your newly loaded tokenizer, image processor, and feature extractor to preprocess a dataset for fine-tuning.
</pt>
<tf>
Finally, the `TFAutoModelFor` classes let you load a pretrained model for a given task (see [here](model_doc/auto) for a complete list of available tasks). For example, load a model for sequence classification with [`TFAutoModelForSequenceClassification.from_pretrained`]:
```py
>>> from transformers import TFAutoModelForSequenceClassification
>>> model = TFAutoModelForSequenceClassification.from_pretrained("distilbert-base-uncased")
```
μ‰½κ²Œ λ™μΌν•œ 체크포인트λ₯Ό μž¬μ‚¬μš©ν•˜μ—¬ λ‹€λ₯Έ μž‘μ—…μ— μ•„ν‚€ν…μ²˜λ₯Ό λ‘œλ“œν•  수 μžˆμŠ΅λ‹ˆλ‹€:
```py
>>> from transformers import TFAutoModelForTokenClassification
>>> model = TFAutoModelForTokenClassification.from_pretrained("distilbert-base-uncased")
```
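As on the PyTorch side, a minimal inference sketch follows (again, the token classification head on top of `distilbert-base-uncased` is randomly initialized until fine-tuned):

```py
>>> from transformers import AutoTokenizer

>>> tokenizer = AutoTokenizer.from_pretrained("distilbert-base-uncased")
>>> inputs = tokenizer("In a hole in the ground there lived a hobbit.", return_tensors="tf")
>>> logits = model(**inputs).logits  # shape: (batch_size, sequence_length, num_labels)
```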
Generally, we recommend using the `AutoTokenizer` class and the `TFAutoModelFor` classes to load pretrained instances of models. This will ensure you load the correct architecture every time. In the next [tutorial](preprocessing), learn how to use your newly loaded tokenizer, image processor, and feature extractor to preprocess a dataset for fine-tuning.
</tf>
</frameworkcontent>