pretty_name: The MNIST-1D Dataset
size_categories:
- 1K<n<10K
---

> [!NOTE]
> The following is taken from the authors' GitHub repository: https://github.com/greydanus/mnist1d

# The MNIST-1D Dataset

Most machine learning models get around the same ~99% test accuracy on MNIST. Our dataset, MNIST-1D, is 100x smaller (default sample size: 4000 train + 1000 test; dimensionality: 40) and does a better job of separating models with/without nonlinearity and models with/without spatial inductive biases.

## Dataset Creation

This version of the dataset was created from the pickle file provided by the dataset authors in the original repository, [mnist1d_data.pkl](https://github.com/greydanus/mnist1d/blob/master/mnist1d_data.pkl), and was generated as follows:

```python
import sys ; sys.path.append('..')  # useful if you're running locally
import mnist1d
from datasets import Dataset, DatasetDict

# Load the data using the mnist1d library
args = mnist1d.get_dataset_args()
data = mnist1d.get_dataset(args, path='./mnist1d_data.pkl', download=True)  # this is the default setting

# Load the data into a Hugging Face dataset and push it to the hub
train = Dataset.from_dict({"x": data["x"], "y": data["y"]})
test = Dataset.from_dict({"x": data["x_test"], "y": data["y_test"]})
DatasetDict({"train": train, "test": test}).push_to_hub("christopher/mnist1d")
```

## Dataset Usage

Using the `datasets` library:

```python
from datasets import load_dataset

train = load_dataset("christopher/mnist1d", split="train")
test = load_dataset("christopher/mnist1d", split="test")
full = load_dataset("christopher/mnist1d", split="train+test")  # both splits concatenated
```

## Citation

```bibtex
@inproceedings{greydanus2024scaling,
  title={Scaling down deep learning with {MNIST}-{1D}},
  author={Greydanus, Sam and Kobak, Dmitry},
  booktitle={Proceedings of the 41st International Conference on Machine Learning},
  year={2024}
}
```