Latest commit: first commit (049ef97)

Repository files (some file names were lost in extraction):
  test_wavs/                       first commit
  (name missing)    1.48 kB        initial commit
  (name missing)    272 Bytes      first commit
final.zip — Detected Pickle imports (47):
- "__torch__.torch.nn.modules.conv.Conv1d",
- "__torch__.wenet.transformer.attention.MultiHeadedAttention",
- "__torch__.torch.nn.modules.loss.CTCLoss",
- "__torch__.wenet.transformer.subsampling.Conv2dSubsampling4",
- "torch._utils._rebuild_tensor_v2",
- "__torch__.wenet.transformer.decoder_layer.DecoderLayer",
- "__torch__.torch.nn.modules.container.___torch_mangle_4.Sequential",
- "__torch__.torch.nn.modules.activation.ReLU",
- "__torch__.torch.nn.modules.sparse.Embedding",
- "__torch__.wenet.transformer.embedding.RelPositionalEncoding",
- "__torch__.torch.nn.modules.normalization.LayerNorm",
- "torch.jit._pickle.build_intlist",
- "torch.QInt8Storage",
- "__torch__.torch.nn.modules.conv.Conv2d",
- "__torch__.torch.nn.modules.loss.KLDivLoss",
- "__torch__.torch.nn.modules.container.ModuleList",
- "__torch__.wenet.transformer.convolution.ConvolutionModule",
- "__torch__.wenet.transformer.decoder.TransformerDecoder",
- "__torch__.torch.nn.quantized.modules.linear.LinearPackedParams",
- "torch.per_tensor_affine",
- "collections.OrderedDict",
- "__torch__.torch.nn.modules.conv.___torch_mangle_3.Conv1d",
- "__torch__.wenet.transformer.attention.RelPositionMultiHeadedAttention",
- "torch._utils._rebuild_qtensor",
- "__torch__.torch.nn.modules.conv.___torch_mangle_0.Conv2d",
- "__torch__.torch.nn.modules.activation.SiLU",
- "__torch__.torch.nn.modules.container.Sequential",
- "__torch__.wenet.transformer.positionwise_feed_forward.PositionwiseFeedForward",
- "__torch__.wenet.transformer.positionwise_feed_forward.___torch_mangle_5.PositionwiseFeedForward",
- "__torch__.wenet.transformer.asr_model.ASRModel",
- "__torch__.torch.nn.modules.container.___torch_mangle_1.Sequential",
- "__torch__.wenet.transformer.cmvn.GlobalCMVN",
- "__torch__.torch.nn.quantized.dynamic.modules.linear.Linear",
- "__torch__.wenet.transformer.label_smoothing_loss.LabelSmoothingLoss",
- "torch.FloatStorage",
- "__torch__.torch.nn.modules.container.___torch_mangle_6.ModuleList",
- "__torch__.torch.nn.modules.conv.___torch_mangle_2.Conv1d",
- "__torch__.wenet.transformer.embedding.PositionalEncoding",
- "__torch__.wenet.transformer.encoder.ConformerEncoder",
- "__torch__.torch.nn.modules.dropout.Dropout",
- "__torch__.wenet.transformer.ctc.CTC",
- "__torch__.wenet.transformer.decoder.BiTransformerDecoder",
- "__torch__.wenet.transformer.encoder_layer.ConformerEncoderLayer",
- "collections.OrderedDict",
- "torch.BoolStorage",
- "torch.FloatStorage",
- "torch._utils._rebuild_tensor_v2"
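The import list above comes from statically scanning the pickle opcode stream inside the archive, so the model file is never deserialized or executed. A minimal sketch of that kind of scan using only the standard library's `pickletools` (the helper `scan_pickle_imports` is illustrative, not Hugging Face's actual scanner):

```python
import pickle
import pickletools
from collections import OrderedDict

def scan_pickle_imports(data: bytes) -> set[str]:
    """Collect module.name references from GLOBAL/STACK_GLOBAL opcodes
    without ever executing the pickle."""
    imports = set()
    ops = list(pickletools.genops(data))
    for i, (op, arg, pos) in enumerate(ops):
        if op.name == "GLOBAL":
            # GLOBAL carries "module name" as a single space-separated arg
            module, name = arg.split(" ", 1)
            imports.add(f"{module}.{name}")
        elif op.name == "STACK_GLOBAL":
            # module and name were pushed as the two preceding string opcodes
            # (a simplification; real scanners track the stack and memo)
            strs = [a for o, a, _ in ops[:i]
                    if o.name in ("SHORT_BINUNICODE", "BINUNICODE", "UNICODE")]
            if len(strs) >= 2:
                imports.add(f"{strs[-2]}.{strs[-1]}")
    return imports

# Example: an OrderedDict pickled the way torch checkpoints store state dicts
data = pickle.dumps(OrderedDict(weight=[1.0, 2.0]))
print(scan_pickle_imports(data))
```

Entries such as `collections.OrderedDict` and `torch._utils._rebuild_tensor_v2` in the list above are exactly what this kind of scan reports for a TorchScript/checkpoint archive.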
  final.zip          220 MB         first commit
  (name missing)     1.64 kB        first commit
  (name missing)     48.7 kB        first commit