# FExGAN-Meta: Facial Expression Generation with Meta-Humans

![FExGAN-Meta GIF Demo](https://github.com/azadlab/FExGAN-Meta/blob/master/FExGAN-Meta.gif?raw=true)

This is a demo of FExGAN-Meta, proposed in the following article:

[FExGAN-Meta: Facial Expression Generation with Meta-Humans](https://www.arxiv.com)

FExGAN-Meta is an extension of [FExGAN](http://arxiv.org/abs/2201.09061). It takes as input an image of a Meta-Human and a vector specifying the desired affect (angry, disgust, sad, surprise, joy, neutral, or fear) and converts the input image to the desired emotion while preserving the identity of the original face.
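
For orientation, the snippet below sketches what inference with the released generator might look like. It is a sketch only, assuming a Keras generator saved as `FExGAN_Meta_Generator.h5` that takes a 128x128 RGB face and a 7-dimensional one-hot affect vector; the actual file name, input resolution, and label ordering are defined in the notebook.

```python
# Inference sketch (assumptions: model file name, 128x128 input, label order).
import numpy as np
import tensorflow as tf

AFFECTS = ["angry", "disgust", "sad", "surprise", "joy", "neutral", "fear"]

# Hypothetical file name -- use the generator downloaded via the notebook.
generator = tf.keras.models.load_model("FExGAN_Meta_Generator.h5")

def generate_expression(image_path, target_affect):
    """Re-render the face in image_path with target_affect, keeping its identity."""
    img = tf.keras.preprocessing.image.load_img(image_path, target_size=(128, 128))
    img = tf.keras.preprocessing.image.img_to_array(img) / 127.5 - 1.0  # scale to [-1, 1]
    img = np.expand_dims(img, axis=0)                                    # add batch dimension

    label = np.zeros((1, len(AFFECTS)), dtype=np.float32)
    label[0, AFFECTS.index(target_affect)] = 1.0                         # one-hot affect vector

    fake = generator.predict([img, label])                               # generated face batch
    return ((fake[0] + 1.0) * 127.5).astype(np.uint8)                    # back to uint8 [0, 255]

joyful_face = generate_expression("neutral_metahuman.png", "joy")
```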

![FExGAN-Meta Results](https://github.com/azadlab/FExGAN-Meta/blob/master/results.png?raw=true)

# Requirements

In order to run this you need the following (an environment-check sketch is given after the list):

* Python >= 3.7
* TensorFlow >= 2.6
* CUDA-enabled GPU with >= 8 GB of memory (e.g. GTX 1070 / GTX 1080)
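
Before running the notebook, a quick sanity check like the one below (not part of the repository) can confirm that the TensorFlow version and GPU meet these requirements:

```python
# Environment check: TensorFlow version and visible CUDA GPUs.
import tensorflow as tf

print("TensorFlow:", tf.__version__)                 # should report >= 2.6
gpus = tf.config.list_physical_devices("GPU")        # CUDA-enabled GPUs visible to TF
print("GPUs found:", gpus)
assert gpus, "No GPU detected; an 8 GB+ CUDA GPU (e.g. GTX 1070/1080) is expected."
```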


# Usage

You can run this either on Google Colab or on your local system:

* Install the pre-requisites
* Download the models (if any link in the notebook fails due to Google Drive restrictions, try downloading them manually; see the sketch after this list)
* Execute the rest of the notebook
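
If a Google Drive link does fail, one way to fetch a model manually is with `gdown`, as sketched below; the file ID and output file name are placeholders and should be replaced with the ones used in the notebook.

```python
# Manual download sketch; the file ID below is a placeholder, not a real ID.
import gdown  # pip install gdown

FILE_ID = "<drive-file-id-from-notebook>"
gdown.download(f"https://drive.google.com/uc?id={FILE_ID}",
               "FExGAN_Meta_Generator.h5", quiet=False)
```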



# Citation

If you use any part of this code or the ideas presented in the paper, please cite the following article.

```
@article{Siddiqui_FExGAN-Meta_2022,
  author = {{Siddiqui}, J. Rafid},
  title = {{FExGAN-Meta: Facial Expression Generation with Meta-Humans}},
  journal = {ArXiv e-prints},
  archivePrefix = "arXiv",
  keywords = {Deep Learning, GAN, Facial Expressions},
  year = {2022},
  url = {http://arxiv.org/abs/2201.09061},
}

```