---
license: mit
---
# Pre-computed CLIP embeddings
Embeddings are stored as HDF5 files named

```
<DATASET_NAME>_<MODEL_NAME>_<OP>.hdf5
```

where:

- `DATASET_NAME`: name of the dataset, e.g. `"imagenette"`.
- `MODEL_NAME`: name of the model, e.g. `"open_clip:ViT-B-32"`.
- `OP`: split of the dataset (either `"train"` or `"val"`).

Each file contains two HDF5 datasets:

- `dataset["embedding"]` contains the embeddings.
- `dataset["label"]` contains the corresponding labels.
To generate the dataset, run

```
$ python make_dataset.py -h
usage: make_dataset.py [-h] [--dataset DATASET [DATASET ...]] [--model MODEL [MODEL ...]]

options:
  --dataset DATASET [DATASET ...]   List of datasets to encode.
  --model MODEL [MODEL ...]         List of models to use.
```
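For instance, an illustrative invocation encoding imagenette with the `open_clip:ViT-B-32` backbone (names taken from the examples above) might look like:

```
$ python make_dataset.py --dataset imagenette --model open_clip:ViT-B-32
```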
Supported dataset names (see `supported_datasets.txt`):

- imagenette
- [dataset]

Supported model names (see `supported_models.txt`):
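For reference, a rough sketch of the encoding step such a script might perform for one dataset/model pair, assuming `open_clip`, `torch`, `torchvision`, and `h5py` are available; the dataset loading, batching, pretrained weights, and output path below are illustrative assumptions, not the actual `make_dataset.py` implementation:

```python
import h5py
import open_clip
import torch
from torch.utils.data import DataLoader
from torchvision.datasets import ImageFolder

# Assumption: the model spec "open_clip:ViT-B-32" maps to the open_clip ViT-B-32 architecture.
model, _, preprocess = open_clip.create_model_and_transforms("ViT-B-32", pretrained="openai")
model.eval()

# Hypothetical image folder standing in for the actual dataset loader.
data = ImageFolder("data/imagenette/train", transform=preprocess)
loader = DataLoader(data, batch_size=256, num_workers=4)

embeddings, labels = [], []
with torch.no_grad():
    for images, targets in loader:
        embeddings.append(model.encode_image(images).cpu())
        labels.append(targets)

# Write the two HDF5 datasets described above.
with h5py.File("imagenette_open_clip:ViT-B-32_train.hdf5", "w") as f:
    f.create_dataset("embedding", data=torch.cat(embeddings).numpy())
    f.create_dataset("label", data=torch.cat(labels).numpy())
```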
## References
```bibtex
@misc{teneggi2024ibetdidmean,
      title={I Bet You Did Not Mean That: Testing Semantic Importance via Betting},
      author={Jacopo Teneggi and Jeremias Sulam},
      year={2024},
      eprint={2405.19146},
      archivePrefix={arXiv},
      primaryClass={stat.ML},
}
```