dev-slx committed
Commit 9eb6151
1 Parent(s): bb28ab6

Update README.md

Files changed (1)
  1. README.md +25 -11
README.md CHANGED
@@ -1,17 +1,37 @@
  # SliceX AI™ ELM (Efficient Language Models)
- This repository contains code to run our ELM models.

- Models are located in the "models" folder. ELM models in this repository comes in three sizes (elm-1.0, elm-0.75 and elm-0.25) and supports the following use-case.
  - news_classification

- Try out the ELM models in HF spaces at [slicexai/elm-demo-v1](https://huggingface.co/spaces/slicexai/elm-demo-v1)

- ## Download ELM repo
  ```bash
  sudo apt-get install git-lfs
  git lfs install
- git clone git@hf.co:slicexai/elm-v0.1_news_classification
  ```
  (Optional) Installing git-lfs without sudo,
  ```bash
  wget https://github.com/git-lfs/git-lfs/releases/download/v3.2.0/git-lfs-linux-amd64-v3.2.0.tar.gz
@@ -21,12 +41,6 @@ git lfs install
  ```


- ## Installation
- ```bash
- cd elm-v0.1_news_classification
- pip install -r requirements.txt
- ```
-
  ## How to use - Run ELM on a sample task
  ```bash
  python run.py <elm-model-directory>
 
+ ---
+ license: apache-2.0
+ ---
  # SliceX AI™ ELM (Efficient Language Models)
+ **ELM** (which stands for **E**fficient **L**anguage **M**odels) is the first in a series of cutting-edge language models from [SliceX AI](https://slicex.ai), designed to achieve best-in-class performance in terms of _quality_, _throughput_ & _memory_.

+ <div align="center">
+ <img src="elm-rambutan.png" width="256"/>
+ </div>
+
+ ELM is designed to be a modular and customizable family of neural networks that are highly efficient and performant. Today we are sharing the first version in this series: the **ELM-v0.1** models.
+
+ _Model:_ ELM introduces a new type of _(de)-composable LLM model architecture_, along with the algorithmic optimizations required to train and run inference with these models. At a high level, a single ELM model is trained in a self-supervised manner during the pre-training phase, but once trained it can be sliced in many ways to fit different user/task needs. These optimizations can be applied during the pre-training and/or fine-tuning stage.
+
+ _Fast Inference with Customization:_ Once trained, the ELM model architecture permits flexible inference strategies at runtime depending on deployment needs. For instance, the ELM model can be _decomposed_ into smaller slices, i.e., smaller (or larger) models can be extracted from the original model to create multiple inference endpoints. Alternatively, the original (single) ELM model can be loaded _as is_ for inference and different slices within the model can be queried directly to power faster inference. This gives users an additional level of flexibility to trade off compute and memory depending on their application and runtime needs.
+
+ ## ELM-v0.1 Model Release
+ Models are located in the `models` folder. The ELM models in this repository come in three sizes (elm-1.0, elm-0.75 and elm-0.25) and support the following use case:
  - news_classification


+ ## Setup ELM
+ ### Download ELM repo
  ```bash
+ git clone git@hf.co:slicexai/elm-v0.1_news_classification
  sudo apt-get install git-lfs
  git lfs install
  ```
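Note: in the sequence above the repository is cloned before git-lfs is set up, so the LFS-tracked model files may exist only as pointer files after the clone. A minimal follow-up sketch, assuming the clone directory `elm-v0.1_news_classification` created by the command above:
```bash
# Fetch the LFS-tracked files (e.g., model weights) if git-lfs was installed after cloning
cd elm-v0.1_news_classification
git lfs pull
```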
+ ### Installation
+ ```bash
+ cd elm-v0.1_news_classification
+ pip install -r requirements.txt
+ ```
+
  (Optional) Installing git-lfs without sudo,
  ```bash
  wget https://github.com/git-lfs/git-lfs/releases/download/v3.2.0/git-lfs-linux-amd64-v3.2.0.tar.gz
  ...
  ```

  ## How to use - Run ELM on a sample task
  ```bash
  python run.py <elm-model-directory>
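# For example, with a hypothetical model directory under `models/`
# (check the models folder for the actual directory names):
python run.py models/elm-0.75_news_classification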