If enabled, Flash Attention 2 is used to compute the attention; otherwise, the standard attention implementation is used.

Flash Attention 2 is not a different attention mechanism but a faster, more memory-efficient implementation of standard attention: it computes the same result exactly, while avoiding materializing the full attention score matrix. It requires a recent NVIDIA GPU (Ampere generation or newer) and half-precision (fp16 or bf16) inputs.
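The memory saving comes from processing keys and values in tiles with an online softmax instead of building the full score matrix. A minimal NumPy sketch of that idea follows; it is illustrative only (the real kernels run fused on the GPU, and the function names here are made up for the example):

```python
import numpy as np

def standard_attention(q, k, v):
    """Standard attention: materializes the full (n, n) score matrix,
    costing O(n^2) memory."""
    d = q.shape[-1]
    scores = q @ k.T / np.sqrt(d)
    scores -= scores.max(axis=-1, keepdims=True)   # numerical stability
    weights = np.exp(scores)
    weights /= weights.sum(axis=-1, keepdims=True)
    return weights @ v

def tiled_attention(q, k, v, block=4):
    """Tiled attention with an online softmax -- the core idea behind
    FlashAttention. K/V are processed in blocks, so the full score
    matrix is never materialized, yet the result is exact."""
    n, d = q.shape
    out = np.zeros((n, v.shape[-1]))
    m = np.full(n, -np.inf)   # running row-wise max of the scores
    l = np.zeros(n)           # running softmax denominator
    for start in range(0, n, block):
        kb, vb = k[start:start + block], v[start:start + block]
        s = q @ kb.T / np.sqrt(d)             # (n, block) partial scores
        m_new = np.maximum(m, s.max(axis=-1))
        scale = np.exp(m - m_new)             # rescale earlier partials
        p = np.exp(s - m_new[:, None])
        l = l * scale + p.sum(axis=-1)
        out = out * scale[:, None] + p @ vb
        m = m_new
    return out / l[:, None]

rng = np.random.default_rng(0)
q, k, v = rng.standard_normal((3, 8, 16))
# Both paths produce the same output, up to floating-point rounding.
assert np.allclose(standard_attention(q, k, v), tiled_attention(q, k, v))
```

Because the tiled version only ever holds an (n, block) slice of the scores, its working memory is linear in sequence length rather than quadratic.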

See https://arxiv.org/abs/2205.14135 (FlashAttention) and https://arxiv.org/abs/2307.08691 (FlashAttention-2) for more details.