---
title: README
emoji: πŸ“‰
colorFrom: red
colorTo: red
sdk: static
pinned: false
---

The Nanotron team focuses on sharing open knowledge and developing open-source libraries for the efficient distributed training of large-scale AI models.

Some of its contributions are:

- the [Nanotron library](https://github.com/huggingface/nanotron)
- the [Picotron library](https://github.com/huggingface/picotron)
- the [Ultrascale-Playbook](https://huggingface.co/spaces/nanotron/ultrascale-playbook), a comprehensive book covering the distributed-training, parallelism, and low-level techniques used to efficiently train models at the largest scales.