mpt-7b-chat / custom_embedding.py
Add custom embedding (#22)
c7d8463
import torch
import torch.nn as nn
import torch.nn.functional as F
from torch import Tensor
class SharedEmbedding(nn.Embedding):
    """An embedding layer whose weight matrix is shared with the output projection.

    With the default ``unembed=False``, this behaves exactly like
    ``nn.Embedding``, mapping token ids to embedding vectors. With
    ``unembed=True``, it instead projects hidden states back to vocabulary
    logits using the same weight matrix (weight tying).
    """

    def forward(self, input: Tensor, unembed: bool = False) -> Tensor:
        if unembed:
            # Unembedding: project hidden states onto the vocabulary
            # using the transposed embedding matrix.
            return F.linear(input, self.weight)
        return super().forward(input)
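The class above implements weight tying: one weight matrix serves both as the input embedding and, via `unembed=True`, as the output projection back to vocabulary logits. A minimal usage sketch (the vocabulary size, embedding dimension, and tensor shapes below are chosen arbitrarily for illustration, not taken from the MPT-7B config):

```python
import torch
import torch.nn as nn
import torch.nn.functional as F
from torch import Tensor


class SharedEmbedding(nn.Embedding):
    def forward(self, input: Tensor, unembed: bool = False) -> Tensor:
        if unembed:
            return F.linear(input, self.weight)
        return super().forward(input)


# Hypothetical sizes for demonstration only.
emb = SharedEmbedding(num_embeddings=100, embedding_dim=16)

tokens = torch.randint(0, 100, (2, 5))   # batch of token ids
hidden = emb(tokens)                     # embed: shape (2, 5, 16)
logits = emb(hidden, unembed=True)       # unembed: shape (2, 5, 100)
```

Because both directions use `self.weight`, the model has a single `(num_embeddings, embedding_dim)` parameter matrix instead of separate embedding and output-projection matrices, which reduces parameter count and is a common choice in language models.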