Why are the features a (1, 577, 768) shaped tensor instead of (1, 768)?
#2 opened by Michae1bear
@Michae1bear The model card's code example explains why: forward_features returns the unpooled token features, one 768-dim vector per token (576 patch tokens plus the class token), while the pooled (1, num_features) output comes from the head.
import timm

# `img` is a PIL image and `model` is a timm model loaded as in the
# model card example (created with num_classes=0 for the first call below)

# get model specific transforms (normalization, resize)
data_config = timm.data.resolve_model_data_config(model)
transforms = timm.data.create_transform(**data_config, is_training=False)

output = model(transforms(img).unsqueeze(0))  # output is (batch_size, num_features) shaped tensor

# or equivalently (without needing to set num_classes=0)
output = model.forward_features(transforms(img).unsqueeze(0))
# output is unpooled, a (1, 577, 768) shaped tensor
output = model.forward_head(output, pre_logits=True)
# output is a (1, num_features) shaped tensor
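For intuition, where the 577 comes from and what the pooling step does can be sketched with plain array math. This is a minimal sketch, assuming a 384x384 input with 16x16 patches (hypothetical sizes that happen to yield 577 tokens; check the actual model's data_config) and showing the two common ViT pooling modes, not timm's exact internals:

```python
import numpy as np

# Where 577 comes from, assuming a 384x384 input with 16x16 patches
# (hypothetical sizes; check the actual model's data_config)
img_size, patch_size = 384, 16
num_patches = (img_size // patch_size) ** 2  # 24 * 24 = 576 patch tokens
num_tokens = num_patches + 1                 # + 1 class (CLS) token = 577

# What pooling does to the unpooled (1, 577, 768) tensor
# (timm chooses the mode from the model's global_pool setting)
tokens = np.random.rand(1, num_tokens, 768)
cls_pooled = tokens[:, 0]                    # class-token pooling -> (1, 768)
mean_pooled = tokens[:, 1:].mean(axis=1)     # mean over patch tokens -> (1, 768)
print(num_tokens, cls_pooled.shape, mean_pooled.shape)
```

Either pooling mode collapses the token axis, which is why the head's output is (1, 768) while forward_features keeps all 577 tokens.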