<!--Copyright 2022 The HuggingFace Team. All rights reserved.
Licensed under the Apache License, Version 2.0 (the "License"); you may not use this file except in compliance with
the License. You may obtain a copy of the License at
http://www.apache.org/licenses/LICENSE-2.0
Unless required by applicable law or agreed to in writing, software distributed under the License is distributed on
an "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the License for the
specific language governing permissions and limitations under the License.
-->
# Pipelines for inference[[pipelines-for-inference]]

The [`pipeline`] makes it simple to use any model from the [Hub](https://huggingface.co/models) for inference on language, computer vision, audio, and multimodal tasks. Even if you don't have experience with a specific modality or aren't familiar with the code behind a model, you can still use it for inference with the [`pipeline`]! This tutorial will teach you to:

* Use a [`pipeline`] for inference.
* Use a specific tokenizer or model.
* Use a [`pipeline`] for language, computer vision, audio, and multimodal tasks.
<Tip>
μ§€μ›ν•˜λŠ” λͺ¨λ“  νƒœμŠ€ν¬μ™€ μ“Έ 수 μžˆλŠ” λ§€κ°œλ³€μˆ˜λ₯Ό 담은 λͺ©λ‘μ€ [`pipeline`] μ„€λͺ…μ„œλ₯Ό μ°Έκ³ ν•΄μ£Όμ„Έμš”.
</Tip>
## Pipeline usage[[pipeline-usage]]

While each task has an associated [`pipeline`], it is simpler to use the general [`pipeline`] abstraction which contains all the task-specific pipelines. The [`pipeline`] automatically loads a default model and a preprocessing class capable of inference for your task.

1. Start by creating a [`pipeline`] and specify an inference task:
```py
>>> from transformers import pipeline
>>> generator = pipeline(task="automatic-speech-recognition")
```
2. Pass your input to the [`pipeline`]:
```py
>>> generator("https://huggingface.co/datasets/Narsil/asr_dummy/resolve/main/mlk.flac")
{'text': 'I HAVE A DREAM BUT ONE DAY THIS NATION WILL RISE UP LIVE UP THE TRUE MEANING OF ITS TREES'}
```
κΈ°λŒ€ν–ˆλ˜ κ²°κ³Όκ°€ μ•„λ‹Œκ°€μš”? Hubμ—μ„œ [κ°€μž₯ 많이 λ‹€μš΄λ‘œλ“œλœ μžλ™ μŒμ„± 인식 λͺ¨λΈ](https://huggingface.co/models?pipeline_tag=automatic-speech-recognition&sort=downloads)둜 더 λ‚˜μ€ κ²°κ³Όλ₯Ό 얻을 수 μžˆλŠ”μ§€ ν™•μΈν•΄λ³΄μ„Έμš”.
λ‹€μŒμ€ [openai/whisper-large](https://huggingface.co/openai/whisper-large)둜 μ‹œλ„ν•΄λ³΄κ² μŠ΅λ‹ˆλ‹€.
```py
>>> generator = pipeline(model="openai/whisper-large")
>>> generator("https://huggingface.co/datasets/Narsil/asr_dummy/resolve/main/mlk.flac")
{'text': ' I have a dream that one day this nation will rise up and live out the true meaning of its creed.'}
```
Now this result looks more accurate!

Models on the Hub cover many different languages and domains, so we strongly encourage you to look for a model in your language or tailored to your field. You can check model outputs directly on the Hub, without leaving your browser, and compare them against other models to see which one fits your use case better or handles tricky inputs better. And if you don't find a model for your use case, you can always [train](training) your own!

If you have several inputs, you can pass them as a list:
```py
generator(
    [
        "https://huggingface.co/datasets/Narsil/asr_dummy/resolve/main/mlk.flac",
        "https://huggingface.co/datasets/Narsil/asr_dummy/resolve/main/1.flac",
    ]
)
```
If you want to iterate over a whole dataset, or want to use it for inference in a webserver, check out these dedicated sections:

[Using pipelines on a dataset](#using-pipelines-on-a-dataset)

[Using pipelines for a webserver](./pipeline_webserver)

## Parameters[[parameters]]

[`pipeline`] supports many parameters; some are task-specific, and some are general to all pipelines. In general, you can specify parameters anywhere you want:
```py
generator = pipeline(model="openai/whisper-large", my_parameter=1)
out = generator(...)  # This will use `my_parameter=1`.
out = generator(..., my_parameter=2)  # This will override and use `my_parameter=2`.
out = generator(...)  # This will go back to using `my_parameter=1`.
```
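The defaults-and-overrides behavior above can be sketched in plain Python. This is a hypothetical illustration of the merging logic only (the `make_pipeline` helper is made up for this sketch), not the pipeline's actual implementation:

```py
def make_pipeline(**defaults):
    # Hypothetical sketch: the "pipeline" records its init-time defaults
    # and merges call-time keyword arguments over them on every call.
    def run(inputs, **overrides):
        params = {**defaults, **overrides}  # call-time kwargs win
        return params  # a real pipeline would run the model with these
    return run

generator = make_pipeline(my_parameter=1)
print(generator("x"))                  # {'my_parameter': 1}
print(generator("x", my_parameter=2))  # {'my_parameter': 2}
print(generator("x"))                  # {'my_parameter': 1} again

```

Because the override only applies to that one call, the init-time default is restored afterwards, matching the behavior shown above.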
Let's check out three important ones:

### Device[[device]]

If you use `device=n`, the pipeline automatically puts the model on the specified device. This works regardless of whether you are using PyTorch or TensorFlow.
```py
generator = pipeline(model="openai/whisper-large", device=0)
```
λͺ¨λΈμ΄ GPU ν•˜λ‚˜μ— λŒμ•„κ°€κΈ° 버겁닀면, `device_map="auto"`λ₯Ό μ§€μ •ν•΄μ„œ πŸ€— [Accelerate](https://huggingface.co/docs/accelerate)κ°€ λͺ¨λΈ κ°€μ€‘μΉ˜λ₯Ό μ–΄λ–»κ²Œ λ‘œλ“œν•˜κ³  μ €μž₯할지 μžλ™μœΌλ‘œ κ²°μ •ν•˜λ„λ‘ ν•  수 μžˆμŠ΅λ‹ˆλ‹€.
```py
#!pip install accelerate
generator = pipeline(model="openai/whisper-large", device_map="auto")
```
### Batch size[[batch-size]]

By default, pipelines will not batch inference, for reasons explained in detail [here](https://huggingface.co/docs/transformers/main_classes/pipelines#pipeline-batching). The short answer is that batching is not necessarily faster, and can actually be quite a bit slower in some cases.

But if it works in your use case, you can use:
```py
generator = pipeline(model="openai/whisper-large", device=0, batch_size=2)
audio_filenames = [f"audio_{i}.flac" for i in range(10)]
texts = generator(audio_filenames)
```
νŒŒμ΄ν”„λΌμΈ μœ„ 제곡된 10개의 μ˜€λ””μ˜€ νŒŒμΌμ„ μΆ”κ°€λ‘œ μ²˜λ¦¬ν•˜λŠ” μ½”λ“œ 없이 (일괄 μ²˜λ¦¬μ— 보닀 효과적인 GPU μœ„) λͺ¨λΈμ— 2κ°œμ”© μ „λ‹¬ν•©λ‹ˆλ‹€.
좜λ ₯은 일괄 μ²˜λ¦¬ν•˜μ§€ μ•Šμ•˜μ„ λ•Œμ™€ λ˜‘κ°™μ•„μ•Ό ν•©λ‹ˆλ‹€. νŒŒμ΄ν”„λΌμΈμ—μ„œ 속도λ₯Ό 더 λ‚Ό μˆ˜λ„ μžˆλŠ” 방법 쀑 ν•˜λ‚˜μΌ λΏμž…λ‹ˆλ‹€.
νŒŒμ΄ν”„λΌμΈμ€ 일괄 처리의 λ³΅μž‘ν•œ 뢀뢄을 쀄여주기도 ν•©λ‹ˆλ‹€. (예λ₯Ό λ“€μ–΄ κΈ΄ μ˜€λ””μ˜€ 파일처럼) μ—¬λŸ¬ λΆ€λΆ„μœΌλ‘œ λ‚˜λˆ μ•Ό λͺ¨λΈμ΄ μ²˜λ¦¬ν•  수 μžˆλŠ” 것을 [*chunk batching*](./main_classes/pipelines#pipeline-chunk-batching)이라고 ν•˜λŠ”λ°, νŒŒμ΄ν”„λΌμΈμ„ μ‚¬μš©ν•˜λ©΄ μžλ™μœΌλ‘œ λ‚˜λˆ μ€λ‹ˆλ‹€.
### Task specific parameters[[task-specific-parameters]]

All tasks provide task-specific parameters which allow for additional flexibility and options to help you get your job done.
For instance, the [`transformers.AutomaticSpeechRecognitionPipeline.__call__`] method has a `return_timestamps` parameter which sounds promising for subtitling videos:
```py
>>> # Not using whisper, as it cannot provide timestamps.
>>> generator = pipeline(model="facebook/wav2vec2-large-960h-lv60-self", return_timestamps="word")
>>> generator("https://huggingface.co/datasets/Narsil/asr_dummy/resolve/main/mlk.flac")
{'text': 'I HAVE A DREAM BUT ONE DAY THIS NATION WILL RISE UP AND LIVE OUT THE TRUE MEANING OF ITS CREED', 'chunks': [{'text': 'I', 'timestamp': (1.22, 1.24)}, {'text': 'HAVE', 'timestamp': (1.42, 1.58)}, {'text': 'A', 'timestamp': (1.66, 1.68)}, {'text': 'DREAM', 'timestamp': (1.76, 2.14)}, {'text': 'BUT', 'timestamp': (3.68, 3.8)}, {'text': 'ONE', 'timestamp': (3.94, 4.06)}, {'text': 'DAY', 'timestamp': (4.16, 4.3)}, {'text': 'THIS', 'timestamp': (6.36, 6.54)}, {'text': 'NATION', 'timestamp': (6.68, 7.1)}, {'text': 'WILL', 'timestamp': (7.32, 7.56)}, {'text': 'RISE', 'timestamp': (7.8, 8.26)}, {'text': 'UP', 'timestamp': (8.38, 8.48)}, {'text': 'AND', 'timestamp': (10.08, 10.18)}, {'text': 'LIVE', 'timestamp': (10.26, 10.48)}, {'text': 'OUT', 'timestamp': (10.58, 10.7)}, {'text': 'THE', 'timestamp': (10.82, 10.9)}, {'text': 'TRUE', 'timestamp': (10.98, 11.18)}, {'text': 'MEANING', 'timestamp': (11.26, 11.58)}, {'text': 'OF', 'timestamp': (11.66, 11.7)}, {'text': 'ITS', 'timestamp': (11.76, 11.88)}, {'text': 'CREED', 'timestamp': (12.0, 12.38)}]}
```
As you can see, the model not only inferred the text, but also output **when** each word was pronounced.

Each task has many parameters available, so check out each task's API reference to see what you can tinker with!
For instance, the [`~transformers.AutomaticSpeechRecognitionPipeline`] has a `chunk_length_s` parameter which is helpful for working on really long audio files (for example, subtitling entire movies or hour-long videos) that a model typically cannot handle on its own.

If you can't find a parameter that would really help you out, feel free to [request it](https://github.com/huggingface/transformers/issues/new?assignees=&labels=feature&template=feature-request.yml)!

## Using pipelines on a dataset[[using-pipelines-on-a-dataset]]

The pipeline can also run inference on a large dataset. The easiest way we recommend doing this is by using an iterator:
```py
from transformers import pipeline

def data():
    for i in range(1000):
        yield f"My example {i}"

pipe = pipeline(model="gpt2", device=0)
generated_characters = 0

for out in pipe(data()):
    generated_characters += len(out[0]["generated_text"])
```
μ΄ν„°λ ˆμ΄ν„° `data()`λŠ” 각 κ²°κ³Όλ₯Ό ν˜ΈμΆœλ§ˆλ‹€ μƒμ„±ν•˜κ³ , νŒŒμ΄ν”„λΌμΈμ€ μž…λ ₯이 μˆœνšŒν•  수 μžˆλŠ” μžλ£Œκ΅¬μ‘°μž„μ„ μžλ™μœΌλ‘œ μΈμ‹ν•˜μ—¬ GPUμ—μ„œ κΈ°μ‘΄ 데이터가 μ²˜λ¦¬λ˜λŠ” λ™μ•ˆ μƒˆλ‘œμš΄ 데이터λ₯Ό κ°€μ Έμ˜€κΈ° μ‹œμž‘ν•©λ‹ˆλ‹€.(μ΄λ•Œ λ‚΄λΆ€μ μœΌλ‘œ [DataLoader](https://pytorch.org/docs/stable/data.html#torch.utils.data.DataLoader)λ₯Ό μ‚¬μš©ν•΄μš”.) 이 과정은 전체 λ°μ΄ν„°μ„ΈνŠΈλ₯Ό λ©”λͺ¨λ¦¬μ— μ μž¬ν•˜μ§€ μ•Šκ³ λ„ GPU에 μ΅œλŒ€ν•œ λΉ λ₯΄κ²Œ μƒˆλ‘œμš΄ μž‘μ—…μ„ 곡급할 수 있기 λ•Œλ¬Έμ— μ€‘μš”ν•©λ‹ˆλ‹€.
그리고 일괄 μ²˜λ¦¬κ°€ 더 λΉ λ₯Ό 수 있기 λ•Œλ¬Έμ—, `batch_size` λ§€κ°œλ³€μˆ˜λ₯Ό 쑰정해봐도 μ’‹μ•„μš”.
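The streaming behavior described above can be sketched without any GPU: group an iterator into fixed-size batches lazily, the way the internal DataLoader does, so only one batch is materialized at a time. This is a simplified illustration, not the pipeline's actual implementation:

```py
from itertools import islice

def data():
    for i in range(1000):
        yield f"My example {i}"

def batched(iterable, batch_size):
    # Lazily group a stream into fixed-size batches; only one batch
    # lives in memory at a time, so the full dataset is never loaded.
    it = iter(iterable)
    while batch := list(islice(it, batch_size)):
        yield batch

first = next(batched(data(), batch_size=8))
print(len(first), first[0])
# 8 My example 0
```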
λ°μ΄ν„°μ„ΈνŠΈλ₯Ό μˆœνšŒν•˜λŠ” κ°€μž₯ κ°„λ‹¨ν•œ 방법은 πŸ€— [Datasets](https://github.com/huggingface/datasets/)λ₯Ό ν™œμš©ν•˜λŠ” κ²ƒμΈλ°μš”.
```py
# KeyDataset is a util that will just output the item we're interested in.
from transformers.pipelines.pt_utils import KeyDataset
from transformers import pipeline
from datasets import load_dataset

pipe = pipeline(model="hf-internal-testing/tiny-random-wav2vec2", device=0)
dataset = load_dataset("hf-internal-testing/librispeech_asr_dummy", "clean", split="validation[:10]")

for out in pipe(KeyDataset(dataset, "audio")):
    print(out)
```
## Using pipelines for a webserver[[using-pipelines-for-a-webserver]]
<Tip>
Creating an inference engine is a complex topic which deserves its own page.
</Tip>
[Link](./pipeline_webserver)
## Vision pipeline[[vision-pipeline]]

Using a [`pipeline`] for vision tasks is practically identical.

Specify your task and pass your image to the classifier. The image can be a link or a local path to the image. For example, what species of cat is shown below?
![pipeline-cat-chonk](https://huggingface.co/datasets/huggingface/documentation-images/resolve/main/pipeline-cat-chonk.jpeg)
```py
>>> from transformers import pipeline
>>> vision_classifier = pipeline(model="google/vit-base-patch16-224")
>>> preds = vision_classifier(
... images="https://huggingface.co/datasets/huggingface/documentation-images/resolve/main/pipeline-cat-chonk.jpeg"
... )
>>> preds = [{"score": round(pred["score"], 4), "label": pred["label"]} for pred in preds]
>>> preds
[{'score': 0.4335, 'label': 'lynx, catamount'}, {'score': 0.0348, 'label': 'cougar, puma, catamount, mountain lion, painter, panther, Felis concolor'}, {'score': 0.0324, 'label': 'snow leopard, ounce, Panthera uncia'}, {'score': 0.0239, 'label': 'Egyptian cat'}, {'score': 0.0229, 'label': 'tiger cat'}]
```
### ν…μŠ€νŠΈ Pipeline[[text-pipeline]]
NLP νƒœμŠ€ν¬λ₯Ό μœ„ν•΄ [`pipeline`]을 μ‚¬μš©ν•˜λŠ” 일도 거의 λ™μΌν•©λ‹ˆλ‹€.
```py
>>> from transformers import pipeline
>>> # This model is a `zero-shot-classification` model.
>>> # It will classify text, except you are free to choose any label you might imagine
>>> classifier = pipeline(model="facebook/bart-large-mnli")
>>> classifier(
... "I have a problem with my iphone that needs to be resolved asap!!",
... candidate_labels=["urgent", "not urgent", "phone", "tablet", "computer"],
... )
{'sequence': 'I have a problem with my iphone that needs to be resolved asap!!', 'labels': ['urgent', 'phone', 'computer', 'not urgent', 'tablet'], 'scores': [0.504, 0.479, 0.013, 0.003, 0.002]}
```
## Multimodal pipeline[[multimodal-pipeline]]

The [`pipeline`] supports more than one modality. For example, a visual question answering (VQA) task combines text and image. Feel free to use any image link you like and a question you want to ask about the image. The image can be a URL or a local path to the image.

For example, if you want to ask for the invoice number in this [invoice image](https://huggingface.co/spaces/impira/docquery/resolve/2359223c1837a7587402bda0f2643382a6eefeab/invoice.png):
```py
>>> from transformers import pipeline
>>> vqa = pipeline(model="impira/layoutlm-document-qa")
>>> vqa(
... image="https://huggingface.co/spaces/impira/docquery/resolve/2359223c1837a7587402bda0f2643382a6eefeab/invoice.png",
... question="What is the invoice number?",
... )
[{'score': 0.42514941096305847, 'answer': 'us-001', 'start': 16, 'end': 16}]
```