magichampz committed on
Commit efc32a6
1 Parent(s): 126f9e0

updated readme.md

Files changed (1): README.md +32 -1

README.md CHANGED
@@ -1,5 +1,36 @@
 ---
 license: mit
 ---
+# Dataset Card for Dataset Name
+Database for my lego sorter model uploaded <br>
+
+This dataset card aims to be a base template for new datasets. It has been generated using [this raw template](https://github.com/huggingface/huggingface_hub/blob/main/src/huggingface_hub/templates/datasetcard_template.md?plain=1).
+
+## Dataset Details
+
+### Dataset Description
 Sample database for my lego sorter model uploaded <br>
-Contains both sample images as well as a numpy array file (.npy) that contains every image (~6000) used to train the model
+Contains both sample images from each class as well as a numpy array file (.npy) that contains every image (~6000) used to train the model. The numpy file was created so that the dataset could be loaded into Google Colab.
+- **Curated by:** Aveek Goswami, Amos Koh
+
+### Dataset Sources [optional]
+- **Repository:** https://github.com/magichampz/lego-sorting-machine-ag-ak
+
+## Uses
+The dataset may be used to train any machine learning model.
+
+### Direct Use
+This dataset is best used to train a model with an architecture similar to the lego sorter model I uploaded. The images are designed to be classified into 7 distinct lego technic classes.
+
+
+## Dataset Structure
+database-sample contains 7 folders, each containing images from a different category of lego technic pieces. <br>
+A .npy file is also uploaded with a shape of (5953, 2): 5953 entries, each containing the full image as one data point and the category label as the other.
+
+### Source Data
+<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
+#### Data Collection and Processing
+The images were not processed; they were stored as the original images both in the folders and in the numpy array. Image processing occurs in the model training script uploaded as part of the lego sorter model repo.
+
+### Recommendations
+All images were taken under constant lighting conditions with a Raspberry Pi Camera 2, which limited the quality of the images obtained.
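For reference, the (5953, 2) array described in the card pairs each image with its label in a single object-dtype array. A minimal sketch of that layout, using a dummy 4-entry array and a hypothetical file name (neither is part of the actual dataset):

```python
import numpy as np

# Dummy stand-in for the dataset: 4 entries instead of 5953, each pairing
# a raw image array with its category label (object dtype, since the two
# columns hold values of different types).
entries = np.empty((4, 2), dtype=object)
for i in range(4):
    entries[i, 0] = np.zeros((64, 64, 3), dtype=np.uint8)  # placeholder image
    entries[i, 1] = i % 7  # one of the 7 lego technic classes

np.save("sample.npy", entries)  # hypothetical file name

# Loading mirrors how the real (5953, 2) file would be read; object arrays
# require allow_pickle=True.
data = np.load("sample.npy", allow_pickle=True)
images, labels = data[:, 0], data[:, 1]
print(data.shape)  # (4, 2)
```

Because the images and labels share one array, a single `np.load` call restores the whole training set, which is what makes the file convenient to pull into Colab.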