PrunaAI / dicta-il-dictalm-7b-QUANTO-float8bit-smashed

Organization: Pruna AI · Library: Transformers · Tags: pruna-ai, Inference Endpoints
Branch: refs/pr/1 · 1 contributor · History: 3 commits
Latest commit: e84c709 (verified) by sharpenb, 6 months ago
Commit message: 4d51ebd4667077e8cf13f7b529b1e58fe9d932e30ca6b681744f2a0096b201c7
.gitattributes · Safe · 1.52 kB · initial commit · 6 months ago
README.md · Safe · 5.3 kB · 4d51ebd4667077e8cf13f7b529b1e58fe9d932e30ca6b681744f2a0096b201c7 · 6 months ago
merges.txt · Safe · 1.27 MB · 4d51ebd4667077e8cf13f7b529b1e58fe9d932e30ca6b681744f2a0096b201c7 · 6 months ago
model.pt · pickle · 11.1 GB · LFS · a6a3dd804de3a30b25b238def7166b012d83f54dec59713a7f0dbf6c1b49be8d · 6 months ago
  Detected Pickle imports (26):
    torch._C._nn.gelu
    torch.nn.modules.sparse.Embedding
    __builtin__.set
    transformers_modules.dicta-il.dictalm-7b.c233431901e34e82235e058ff75053a292547e79.modeling_megatron_gpt.MegatronGPTMLP
    torch.device
    transformers_modules.dicta-il.dictalm-7b.c233431901e34e82235e058ff75053a292547e79.configuration_megatron_gpt.MegatronGPTConfig
    torch.float16
    torch._utils._rebuild_tensor_v2
    torch.nn.modules.dropout.Dropout
    transformers_modules.dicta-il.dictalm-7b.c233431901e34e82235e058ff75053a292547e79.modeling_megatron_gpt.MegatronGPTLayer
    transformers_modules.dicta-il.dictalm-7b.c233431901e34e82235e058ff75053a292547e79.modeling_megatron_gpt.MegatronGPTAttention
    transformers_modules.dicta-il.dictalm-7b.c233431901e34e82235e058ff75053a292547e79.modeling_megatron_gpt.MegatronGPTLayerNorm
    collections.OrderedDict
    torch.HalfStorage
    torch.nn.modules.container.ModuleList
    torch.BoolStorage
    torch.FloatStorage
    transformers_modules.dicta-il.dictalm-7b.c233431901e34e82235e058ff75053a292547e79.modeling_megatron_gpt.MegatronGPTForCausalLM
    transformers_modules.dicta-il.dictalm-7b.c233431901e34e82235e058ff75053a292547e79.modeling_megatron_gpt.MegatronGPTModel
    torch._utils._rebuild_parameter
    transformers_modules.dicta-il.dictalm-7b.c233431901e34e82235e058ff75053a292547e79.modeling_megatron_gpt.MegatronGPTRotaryEmbedding
    transformers.activations.GELUActivation
    quanto.nn.qlinear.QLinear
    torch.float8_e4m3fn
    transformers.generation.configuration_utils.GenerationConfig
    quanto.tensor.qtype.qtype
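The "Detected Pickle imports" scan above exists because model.pt is a raw pickle: loading it (e.g. via torch.load) resolves and can execute every global listed there. As a minimal stdlib sketch of the underlying safety idea (not the Hub scanner's or torch's actual implementation), a restricted unpickler can refuse any global that is not on an explicit allowlist; here the allowlist is seeded with one entry from the scan, collections.OrderedDict:

```python
import io
import pickle
from collections import OrderedDict

# Globals a pickle may resolve at load time; everything else is rejected.
# (Illustrative allowlist: one entry taken from the detected imports above.)
ALLOWED = {
    ("collections", "OrderedDict"),
}

class RestrictedUnpickler(pickle.Unpickler):
    def find_class(self, module, name):
        # Called for every global the pickle stream wants to import.
        if (module, name) in ALLOWED:
            return super().find_class(module, name)
        raise pickle.UnpicklingError(f"blocked pickle global: {module}.{name}")

def restricted_loads(data: bytes):
    """Unpickle bytes, resolving only allowlisted globals."""
    return RestrictedUnpickler(io.BytesIO(data)).load()

# An allowlisted payload round-trips normally...
ok = restricted_loads(pickle.dumps(OrderedDict(a=1)))

# ...while a non-allowlisted global is rejected before it can do anything.
try:
    restricted_loads(pickle.dumps(len))  # would import builtins.len
    blocked = ""
except pickle.UnpicklingError as exc:
    blocked = str(exc)
```

For a real 11.1 GB checkpoint like this one you would not use this sketch directly; the practical takeaways are the same, though: only load pickles from sources you trust, or prefer safetensors-format weights where available.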
smash_config.json · Safe · 1.02 kB · 4d51ebd4667077e8cf13f7b529b1e58fe9d932e30ca6b681744f2a0096b201c7 · 6 months ago
special_tokens_map.json · Safe · 567 Bytes · 4d51ebd4667077e8cf13f7b529b1e58fe9d932e30ca6b681744f2a0096b201c7 · 6 months ago
tokenizer.json · Safe · 3.87 MB · 4d51ebd4667077e8cf13f7b529b1e58fe9d932e30ca6b681744f2a0096b201c7 · 6 months ago
tokenizer_config.json · Safe · 883 Bytes · 4d51ebd4667077e8cf13f7b529b1e58fe9d932e30ca6b681744f2a0096b201c7 · 6 months ago
vocab.json · Safe · 1.65 MB · 4d51ebd4667077e8cf13f7b529b1e58fe9d932e30ca6b681744f2a0096b201c7 · 6 months ago