pabloce committed
Commit dcfb758
1 Parent(s): 9dc269e

Update README.md

Files changed (1): README.md (+3 -0)
README.md CHANGED
@@ -8,6 +8,9 @@ license: mit
 
 By Fernando, Eric and David
 
+[![Discord](https://img.shields.io/discord/1156064224225808488?logo=Discord&logoColor=%23ffffff&label=Discord&link=https%3A%2F%2Fdiscord.gg%2FtCMkMDDHwm)](https://discord.gg/cognitivecomputations)
+Discord: https://discord.gg/cognitivecomputations
+
 This is a hack around the PyTorch + Hugging Face Transformers libraries that makes the original Dolphin Phi-2 behave in a way inspired by Meta's paper "MobileLLM: Optimizing Sub-billion Parameter Language Models for On-Device Use Cases" [ https://arxiv.org/abs/2402.14905 ]
 
 One of the key ideas is that it works as an "online passthrough": a loop is applied over a module superclass that groups layers, so that their forward methods are repeated in a loop, reusing the same weights across repetitions.
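The "online passthrough" idea described above can be sketched roughly as follows. This is a minimal illustration under assumed names (`LayerGroup`, `AddOne` are hypothetical stand-ins, not the actual classes in this repo): a superclass groups consecutive layers and repeats their forward passes in a loop, so the same weights are applied several times instead of adding new layers.

```python
class LayerGroup:
    """Hypothetical sketch: groups consecutive layers and repeats their
    forward pass in a loop, emulating block-wise weight sharing — the
    same layer weights are reused across repetitions."""

    def __init__(self, layers, repeats=2):
        self.layers = layers
        self.repeats = repeats

    def forward(self, x):
        # Each repetition re-applies the whole group with the same weights.
        for _ in range(self.repeats):
            for layer in self.layers:
                x = layer.forward(x)
        return x


class AddOne:
    """Toy stand-in for a transformer block (purely illustrative)."""

    def forward(self, x):
        return x + 1


# Two "blocks" looped twice: the input passes through 2 layers x 2 repeats.
group = LayerGroup([AddOne(), AddOne()], repeats=2)
print(group.forward(0))  # -> 4
```

In the real model the grouped modules would be transformer decoder layers rather than toy objects, and the loop would run inside the model's forward pass; the sketch only shows the control flow, not the actual Dolphin Phi-2 integration.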