NeMo Megatron
=============

Megatron :cite:`nlp-megatron-shoeybi2019megatron` is a large, powerful transformer developed by the Applied Deep Learning Research
team at NVIDIA. NeMo Megatron supports several types of models:

* GPT-style models (decoder only; a loading sketch follows this list)
* T5/BART/UL2-style models (encoder-decoder)
* BERT-style models (encoder only)
* RETRO model (decoder only)
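
For a sense of how these models are used in practice, the sketch below loads a pretrained decoder-only GPT checkpoint for inference. It is a minimal sketch, assuming a working NeMo installation; the checkpoint path is hypothetical, and exact arguments may vary between NeMo versions.

.. code-block:: python

    # Minimal sketch: load a pretrained Megatron GPT checkpoint for inference.
    # Assumes a working NeMo installation; the .nemo path below is hypothetical.
    from pytorch_lightning import Trainer

    from nemo.collections.nlp.models.language_modeling.megatron_gpt_model import MegatronGPTModel
    from nemo.collections.nlp.parts.nlp_overrides import NLPDDPStrategy

    # NLPDDPStrategy wires PyTorch Lightning's DDP into NeMo's model-parallel setup.
    trainer = Trainer(devices=1, accelerator="gpu", strategy=NLPDDPStrategy())

    # restore_from() unpacks a .nemo archive (weights, config, and tokenizer).
    model = MegatronGPTModel.restore_from(
        restore_path="megatron_gpt_345m.nemo",  # hypothetical checkpoint path
        trainer=trainer,
    )
    model.freeze()  # inference only; disables gradient computation

The other model families follow the same pattern through their own classes (for example, ``MegatronT5Model`` and ``MegatronBertModel``); the pages listed below cover training and using each of them.
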
.. note::
    NeMo Megatron has an Enterprise edition that includes tools for data preprocessing and hyperparameter tuning, containers, scripts for various clouds, and more. The Enterprise edition also includes deployment tools. Apply for `early access here <https://developer.nvidia.com/nemo-megatron-early-access>`_.

.. toctree::
    :maxdepth: 1

    mlm_migration
    gpt/gpt_training
    batching
    parallelisms
    prompt_learning
    retro/retro_model

References
----------

.. bibliography:: ../nlp_all.bib
    :style: plain
    :labelprefix: nlp-megatron
    :keyprefix: nlp-megatron-