Graphcore and Hugging Face are working together to make training of Transformer models on IPUs fast and easy. Learn more about how to take advantage of the power of Graphcore IPUs to train Transformers models at [hf.co/hardware/graphcore](https://huggingface.co/hardware/graphcore).

# Wav2Vec2 Base model IPU config

This model contains just the `IPUConfig` files for running the Wav2Vec2 base model (e.g. [wav2vec2-base](https://huggingface.co/facebook/wav2vec2-base)) on Graphcore IPUs.

**This model contains no model weights, only an IPUConfig.** 

## Model description
From [wav2vec 2.0: A Framework for Self-Supervised Learning of Speech Representations](https://arxiv.org/pdf/2006.11477v3.pdf):

“Wav2Vec2 is a framework for self-supervised learning of speech representations. It masks the speech input in the latent space and solves a contrastive task defined over a quantization of the latent representations which are jointly learned.”

## Usage

```python
from optimum.graphcore import IPUConfig
ipu_config = IPUConfig.from_pretrained("Graphcore/wav2vec2-base-ipu")
```
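
The `IPUConfig` is typically passed to the Optimum Graphcore trainer alongside model weights loaded from a separate checkpoint. The sketch below illustrates this pattern, assuming the standard `IPUTrainer`/`IPUTrainingArguments` workflow from `optimum.graphcore`; the checkpoint, hyperparameters, and dataset placeholders are illustrative only and are not provided by this repository.

```python
# Minimal sketch: combining this IPUConfig with a Wav2Vec2 checkpoint for
# fine-tuning on IPUs. Hyperparameters and the dataset are placeholders.
from transformers import Wav2Vec2ForCTC
from optimum.graphcore import IPUConfig, IPUTrainer, IPUTrainingArguments

ipu_config = IPUConfig.from_pretrained("Graphcore/wav2vec2-base-ipu")

# Load the model weights separately -- this repository only provides the IPUConfig.
model = Wav2Vec2ForCTC.from_pretrained("facebook/wav2vec2-base")

training_args = IPUTrainingArguments(
    output_dir="./wav2vec2-base-ipu-output",  # placeholder output directory
    per_device_train_batch_size=1,
    num_train_epochs=3,
)

trainer = IPUTrainer(
    model=model,
    ipu_config=ipu_config,
    args=training_args,
    # train_dataset=...,  # supply a prepared speech dataset and data collator here
)
# trainer.train()
```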