---
task_categories:
  - image-feature-extraction
---

# Google Image Malaysia Location Dedup

Original dataset: https://huggingface.co/datasets/malaysia-ai/crawl-google-image-malaysia-location

Source code: https://github.com/mesolitica/malaysian-dataset/tree/master/vlm/dedup-malaysia-location

## Dedup 50% similar

`dedup-0.5.jsonl`, total deduped 227937 images,

```python
{'filename': 'train-00812-of-01000.parquet',
 'keyword': 'Taman Megah Jaya Ayer Tawar',
 'no': 16,
 'selected_indices': [2556, 2559, 2575, 2577, 2586, 2587, 2595]}
```

## Dedup 60% similar

`dedup-0.6.jsonl`, total deduped 487301 images,

```python
{'filename': 'train-00404-of-01000.parquet',
 'keyword': 'Kampung Tok Wan Nik Padang Besar',
 'no': 92,
 'selected_indices': [2100, 2102, 2103, 2104]}
```

- `filename` is the parquet file from the original repository.
- `selected_indices` are the row indices of the dataframe loaded from that `filename` (see the loading sketch below).
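
As an illustration, here is a minimal sketch of how a deduped subset might be materialised from `dedup-0.5.jsonl`, assuming the original parquet shards have been downloaded locally. The paths are placeholders and the positional indexing is an assumption; check the linked source code for the exact behaviour.

```python
import json

import pandas as pd

# Placeholder paths (assumptions, not from this repository).
DEDUP_FILE = 'dedup-0.5.jsonl'
PARQUET_DIR = './crawl-google-image-malaysia-location'

with open(DEDUP_FILE) as fopen:
    for line in fopen:
        row = json.loads(line)
        # `filename` points back to a parquet shard in the original dataset.
        df = pd.read_parquet(f"{PARQUET_DIR}/{row['filename']}")
        # `selected_indices` is assumed to be positional indices into that dataframe.
        deduped = df.iloc[row['selected_indices']]
        print(row['keyword'], len(deduped))
        break  # remove to iterate over every keyword
```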

## Embedding

We convert the images to embeddings using https://huggingface.co/google/siglip-base-patch16-512, and we use MosaicML Streaming for faster indexing,

```python
from streaming import MDSWriter
from streaming.base.format.mds.encodings import Encoding, _encodings
from streaming import LocalDataset
import streaming
import numpy as np
from tqdm import tqdm

# Custom MDS encoding that stores float32 numpy arrays as raw bytes.
class Float32(Encoding):
    def encode(self, obj) -> bytes:
        return obj.tobytes()

    def decode(self, data: bytes):
        return np.frombuffer(data, np.float32)

# Register the encoding so MDSWriter / LocalDataset can handle 'float32' columns.
_encodings['float32'] = Float32

# Local MDS dataset that contains the SigLIP image embeddings.
dataset = LocalDataset('embedding')
```
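
To make the step above concrete, here is a minimal sketch of how an image embedding could be produced with the same SigLIP checkpoint and compared against an embedding read back from the MDS dataset (reusing `dataset` from the snippet above). The `'embedding'` column name, the image path, and the use of cosine similarity are assumptions; the exact dedup logic lives in the linked source code.

```python
import numpy as np
import torch
from PIL import Image
from transformers import AutoProcessor, SiglipModel

# Load the SigLIP checkpoint referenced above.
model = SiglipModel.from_pretrained('google/siglip-base-patch16-512')
processor = AutoProcessor.from_pretrained('google/siglip-base-patch16-512')

def embed(image: Image.Image) -> np.ndarray:
    # Preprocess a single image and extract its embedding as float32.
    inputs = processor(images=image, return_tensors='pt')
    with torch.no_grad():
        features = model.get_image_features(**inputs)
    return features[0].numpy().astype(np.float32)

def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

# 'some-image.jpg' and the 'embedding' column name are placeholders.
new_embedding = embed(Image.open('some-image.jpg').convert('RGB'))
stored_embedding = dataset[0]['embedding']

# Pairs scoring above the chosen threshold (0.5 or 0.6) would be the
# candidates collapsed during dedup.
print(cosine_similarity(new_embedding, stored_embedding))
```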