Using a GPU for inference

#2 opened by woofadu

How do you load the model onto a GPU if there is no 'device' or 'device_map' parameter for the MegatronBert model type?

University of Florida NLP Group org

Use the `CUDA_VISIBLE_DEVICES` environment variable to select which GPU the process sees. Here is an example: https://github.com/uf-hobi-informatics-lab/ClinicalTransformerNER

```bash
# set GPU
export CUDA_VISIBLE_DEVICES=0
```
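
In case it helps, here is a minimal sketch of the same idea from Python: set `CUDA_VISIBLE_DEVICES` before torch initializes CUDA, load the model as usual, and move it to the GPU with the standard PyTorch `.to()` call (this works for MegatronBert even though `from_pretrained` has no `device` parameter, since the model is an ordinary `nn.Module`). The checkpoint name below is a placeholder; substitute your own.

```python
import os

# Must be set before torch initializes CUDA, so the process
# only sees GPU 0 and "cuda" resolves to it.
os.environ["CUDA_VISIBLE_DEVICES"] = "0"

import torch
from transformers import AutoModel, AutoTokenizer

checkpoint = "your-org/your-megatronbert-checkpoint"  # placeholder
tokenizer = AutoTokenizer.from_pretrained(checkpoint)
model = AutoModel.from_pretrained(checkpoint)

# from_pretrained loads on CPU; move the module to the GPU explicitly.
device = torch.device("cuda" if torch.cuda.is_available() else "cpu")
model = model.to(device)
model.eval()

inputs = tokenizer("Patient presents with chest pain.", return_tensors="pt").to(device)
with torch.no_grad():
    outputs = model(**inputs)
print(outputs.last_hidden_state.shape)
```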
