---
tags:
- pytorch_model_hub_mixin
- model_hub_mixin
datasets:
- scikit-learn/iris
metrics:
- accuracy
library_name: pytorch
pipeline_tag: tabular-classification
---

# mlp-iris

A multi-layer perceptron (MLP) trained on the Iris dataset.

It takes four inputs ('SepalLengthCm', 'SepalWidthCm', 'PetalLengthCm' and 'PetalWidthCm') and predicts whether the species is 'Iris-setosa', 'Iris-versicolor' or 'Iris-virginica'.

It is a PyTorch adaptation of the scikit-learn model from Chapter 10 of Aurélien Géron's book 'Hands-On Machine Learning with Scikit-Learn, Keras, and TensorFlow'. The original scikit-learn model can be found here: https://github.com/ageron/handson-ml3/blob/main/10_neural_nets_with_keras.ipynb

Code: https://github.com/sambitmukherjee/handson-ml3-pytorch/blob/main/chapter10/mlp_iris.ipynb

Experiment tracking: https://wandb.ai/sadhaklal/mlp-iris

## Usage

```
!pip install -q datasets  # also pulls in huggingface_hub; torch and scikit-learn are assumed to be installed

from datasets import load_dataset

# Load the Iris dataset from the Hugging Face Hub and convert it to a pandas DataFrame.
iris = load_dataset("scikit-learn/iris")
iris.set_format("pandas")
iris_df = iris['train'][:]

# Map the species names to integer class labels.
label2id = {'Iris-setosa': 0, 'Iris-versicolor': 1, 'Iris-virginica': 2}
iris_df['Species'] = [label2id[species] for species in iris_df['Species']]

X = iris_df[['SepalLengthCm', 'SepalWidthCm', 'PetalLengthCm', 'PetalWidthCm']].values
y = iris_df['Species'].values

from sklearn.model_selection import train_test_split

# Recreate the stratified train/validation/test splits used during training.
X_train_full, X_test, y_train_full, y_test = train_test_split(X, y, test_size=0.1, stratify=y, random_state=42)
X_train, X_valid, y_train, y_valid = train_test_split(X_train_full, y_train_full, test_size=0.1, stratify=y_train_full, random_state=42)

# Standardization statistics are computed on the training split only.
X_means, X_stds = X_train.mean(axis=0), X_train.std(axis=0)

import torch
import torch.nn as nn
from huggingface_hub import PyTorchModelHubMixin

device = torch.device("cpu")

# A small MLP: 4 input features -> 5 hidden units (ReLU) -> 3 output logits.
class MLP(nn.Module, PyTorchModelHubMixin):
    def __init__(self):
        super().__init__()
        self.fc1 = nn.Linear(4, 5)
        self.fc2 = nn.Linear(5, 3)

    def forward(self, x):
        act = torch.relu(self.fc1(x))
        return self.fc2(act)

# Load the pretrained weights from the Hub.
model = MLP.from_pretrained("sadhaklal/mlp-iris")
model.to(device)

X_new = X_test[:2] # Contains data on 2 new flowers from the test set.
X_new = ((X_new - X_means) / X_stds) # Normalize.
X_new = torch.tensor(X_new, dtype=torch.float32)

# Run inference and convert the logits to class probabilities.
model.eval()
X_new = X_new.to(device)
with torch.no_grad():
    logits = model(X_new)
probas = torch.softmax(logits, dim=-1)
confidences, preds = probas.max(dim=-1)

print(f"Predicted classes: {preds}")
print(f"Predicted confidences: {confidences}")
```

## Metric

Accuracy on the test set: 0.9333
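
To check this number, the loaded model can be evaluated on the held-out test split from the Usage section. Below is a minimal sketch that assumes the variables from that snippet (`model`, `device`, `X_test`, `y_test`, `X_means`, `X_stds`) are still in scope.

```
# Sketch: recompute test-set accuracy, reusing the variables defined in the Usage snippet.
X_test_norm = torch.tensor((X_test - X_means) / X_stds, dtype=torch.float32).to(device)
y_test_t = torch.tensor(y_test, dtype=torch.long).to(device)

model.eval()
with torch.no_grad():
    test_logits = model(X_test_norm)
test_preds = test_logits.argmax(dim=-1)
test_acc = (test_preds == y_test_t).float().mean().item()
print(f"Test accuracy: {test_acc:.4f}")
```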

---

This model has been pushed to the Hub using the [PyTorchModelHubMixin](https://huggingface.co/docs/huggingface_hub/package_reference/mixins#huggingface_hub.PyTorchModelHubMixin) integration.
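
For completeness, here is a minimal sketch of the push side using the same mixin; the local directory and repo id are illustrative, and pushing requires being logged in (e.g. via `huggingface-cli login`).

```
# Sketch: saving and pushing an MLP defined with PyTorchModelHubMixin (repo id is illustrative).
model = MLP()
# ... training loop goes here ...
model.save_pretrained("mlp-iris")            # writes the weights (e.g. model.safetensors) locally
model.push_to_hub("your-username/mlp-iris")  # uploads the model to the Hugging Face Hub
```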