Bhupen commited on
Commit
d332070
Β·
1 Parent(s): e25aab8

Add Deep Learning intro py file

Browse files
Files changed (2) hide show
  1. app.py +200 -0
  2. requirements.txt +7 -0
app.py ADDED
@@ -0,0 +1,200 @@
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
1
+ import streamlit as st
2
+ import numpy as np
3
+ import pandas as pd
4
+ from sklearn.datasets import fetch_california_housing
5
+ from sklearn.model_selection import train_test_split
6
+ from sklearn.preprocessing import StandardScaler
7
+ from sklearn.linear_model import SGDRegressor
8
+ from sklearn.metrics import mean_squared_error, mean_absolute_error, r2_score
9
+ # Plot model architecture
10
+ from tensorflow.keras.utils import plot_model
11
+ from PIL import Image
12
+ import os
13
+
14
+ # Main function to run the Streamlit app
15
+ def main():
16
+ # Title of the app
17
+ st.subheader("Understanding Deep Learning")
18
+
19
+ # Expander for Deep Learning Introduction
20
+ with st.expander(f"➑️ What is Deep Learning?"):
21
+ st.write("""
22
+ **Definition (in simple terms):**
23
+ Deep learning is a type of machine learning that uses artificial neural networks to learn from large amounts of data. It mimics the way humans think and process information, allowing machines to make decisions based on that data.
24
+
25
+ **How it is different from Classical Machine Learning:**
26
+ - **Classical ML:** Relies on `feature engineering` - column selection. It uses algorithms like decision trees, SVM, or linear regression.
27
+ - **Deep Learning:** Automatically learns features from data without needing explicit programming. It uses layers of interconnected nodes (neurons) to learn complex patterns.
28
+
29
+ **Examples of Success Stories:**
30
+ - **Image Recognition:** Convolutional Neural Networks (CNNs) powering face recognition on social media and medical imaging diagnostics.
31
+ - **Natural Language Processing:** Transformers like `GPT`, `BERT`, and `T5`, which are used in chatbots, translation apps, and content generation.
32
+ - **Autonomous Vehicles:** Deep learning models help self-driving cars make decisions like braking, steering, and avoiding obstacles.
33
+
34
+ **Is `Generative` AI the same as Deep Learning?**
35
+ - **Generative AI** is a subset of deep learning that focuses on generating new content, like images, text, or music. It uses models like GANs (Generative Adversarial Networks) and VAEs (Variational Autoencoders). While all generative AI is deep learning, not all deep learning is generative AI.
36
+ """)
37
+
38
+ with st.expander("➑️ How Features Are Learned: ML vs DL (Image Data Example)"):
39
+ st.markdown("###### 🎯 Task: Classify handwritten digits (0–9)")
40
+
41
+ st.markdown("###### πŸ” Classical ML Approach")
42
+ st.markdown("""
43
+ In classical ML, we **extract features manually** from each image.
44
+
45
+ **Example features:**
46
+ - `pixel_mean`: Average pixel brightness
47
+ - `num_white_pixels`: Count of bright pixels
48
+ - `aspect_ratio`: Width-to-height of the digit
49
+ - `num_edge_pixels`: From an edge detection filter
50
+
51
+ These features are then used in algorithms like SVM or decision trees.
52
+ """)
53
+
54
+ st.markdown("###### 🧠 Deep Learning Approach")
55
+ st.markdown("""
56
+ In Deep Learning, we **skip manual features**. Instead, the model receives the raw pixel matrix (like 28x28 for MNIST).
57
+
58
+ - The model **learns features** automatically: curves, loops, corners, even digit shapes.
59
+ - Layers in a CNN extract **increasingly abstract patterns**.
60
+
61
+ > Think of it as: *We provide raw material, and the model forges patterns on its own.*
62
+ """)
63
+
64
+ st.markdown("###### 🧩 Analogy")
65
+ st.markdown("""
66
+ | Task | Classical ML | Deep Learning |
67
+ |------|---------------|----------------|
68
+ | Image of Digit | Engineer features manually from pixels | Feed raw pixels, model learns patterns |
69
+ | You’re Like | Handcrafting parts of a watch | Giving raw metal, and model forges its own design |
70
+ """)
71
+
72
+ # Expander for How Linear Regression Expands to Neural Networks
73
+ with st.expander("➑️ How is Simple Linear Regression Expanded to Neural Network?"):
74
+ st.write("""
75
+ **Conceptually Expanding Linear Regression to Neural Network:**
76
+ - **Linear Regression:** Models a relationship between input (X) and output (Y) using a straight line (Y = mX + b). It’s a simple case of predicting continuous values based on one or more features.
77
+ - **Neural Networks:** Expand this idea by adding layers of neurons, where each neuron performs its own version of "linear regression" but with the output of each neuron being passed through an activation function (like ReLU or sigmoid) before being sent to the next layer.
78
+
79
+ **Illustration:**
80
+ - For **Linear Regression**, think of a single equation `y = mX + b`.
81
+ - For a **Neural Network**, imagine adding layers of neurons. Each neuron applies a weighted sum, then an activation function. The model learns through backpropagation, adjusting weights to minimize the error over time.
82
+ """)
83
+
84
+ st.image("neuron.PNG", caption="Linear regression - single neuron unit", use_column_width ='auto')
85
+ st.image("NN.PNG", caption=" Neural Network Illustration", use_column_width ='auto')
86
+
87
+ # Expander for Perceptron Regression on California Housing Dataset
88
+ with st.expander("➑️ Linear Regression on California Housing Dataset"):
89
+ st.write("""
90
+ The Perceptron model is applied to the California Housing dataset for regression. We will train the model and evaluate it using several metrics.
91
+ """)
92
+
93
+ # Load the California housing dataset
94
+ data = fetch_california_housing()
95
+ df = pd.DataFrame(data.data, columns=data.feature_names)
96
+ df['target'] = data.target
97
+
98
+ # Split the data into features and target
99
+ X = df.drop('target', axis=1)
100
+ y = df['target']
101
+
102
+ # Split the dataset into training and testing sets
103
+ X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=42)
104
+
105
+ # Standardize the features using StandardScaler
106
+ scaler = StandardScaler()
107
+ X_train = scaler.fit_transform(X_train)
108
+ X_test = scaler.transform(X_test)
109
+
110
+ # Train a Perceptron model
111
+ model = SGDRegressor(max_iter=1000, tol=1e-3)
112
+ model.fit(X_train, y_train)
113
+
114
+ # Make predictions
115
+ y_pred = model.predict(X_test)
116
+
117
+ # Display the regression metrics
118
+ mse = mean_squared_error(y_test, y_pred)
119
+ mae = mean_absolute_error(y_test, y_pred)
120
+ r2 = r2_score(y_test, y_pred)
121
+
122
+ st.write(f"Mean Squared Error (MSE): {mse:.2f}")
123
+ st.write(f"Mean Absolute Error (MAE): {mae:.2f}")
124
+ st.write(f"R-squared (R2): {r2:.2f}")
125
+
126
+ # Expander for Improving the Model with Deep Learning
127
+ with st.expander("➑️ Improve the Model Using Deep Learning (Keras)"):
128
+ st.write("""
129
+ We can improve our Perceptron model using a deeper neural network with multiple neurons and layers. Here we use a Keras sequential model for a better regression result.
130
+ """)
131
+
132
+ from tensorflow.keras.models import Sequential
133
+ from tensorflow.keras.layers import Dense
134
+
135
+ # Build the Keras model
136
+ deep_model = Sequential([
137
+ Dense(64, input_dim=X_train.shape[1], activation='relu'),
138
+ Dense(32, activation='relu'),
139
+ Dense(1)
140
+ ])
141
+
142
+ deep_model.compile(optimizer='adam', loss='mse')
143
+
144
+ # Train the deep learning model
145
+ deep_model.fit(X_train, y_train, epochs=50, batch_size=200, verbose=1)
146
+
147
+ # Make predictions using the deep models
148
+ y_pred_deep = deep_model.predict(X_test)
149
+
150
+ # Evaluate the deep learning model
151
+ mse_deep = mean_squared_error(y_test, y_pred_deep)
152
+ mae_deep = mean_absolute_error(y_test, y_pred_deep)
153
+ r2_deep = r2_score(y_test, y_pred_deep)
154
+
155
+ st.write(f"Mean Squared Error (Deep Model MSE): {mse_deep:.2f}")
156
+ st.write(f"Mean Absolute Error (Deep Model MAE): {mae_deep:.2f}")
157
+ st.write(f"R-squared (Deep Model R2): {r2_deep:.2f}")
158
+
159
+ # Show the model summary
160
+ st.markdown("**Deep Learning Model Summary**")
161
+ plot_path = "deep_model_plot.png"
162
+ plot_model(deep_model, to_file=plot_path, show_shapes=True, show_layer_names=True)
163
+
164
+ # Display the plot in Streamlit
165
+ if os.path.exists(plot_path):
166
+ image = Image.open(plot_path)
167
+ st.image(image, caption="Keras Sequential Model Architecture", use_column_width='auto')
168
+ else:
169
+ st.warning("Model plot could not be generated. Ensure pydot and graphviz are installed.")
170
+
171
+ with st.expander("πŸ“Š Model Performance Comparison and Key Improvements"):
172
+ st.markdown("###### πŸ”‘ Key Notes on Model Improvement")
173
+
174
+ st.markdown("""
175
+ ###### πŸ“‰ 1. **Error Reduction**
176
+ - **MSE dropped** from `0.78 β†’ 0.29`
177
+ β†’ Deep model makes **fewer big mistakes** (squared errors).
178
+ - **MAE dropped** from `0.58 β†’ 0.38`
179
+ β†’ Predictions are **closer to true values on average**.
180
+
181
+ ###### πŸ“ˆ 2. **Higher Accuracy (RΒ² Score)**
182
+ - **RΒ² improved** from `0.41 β†’ 0.78`
183
+ β†’ Deep model explains **78% of the variation** in house prices
184
+ β†’ vs. only 41% by the Linear regression.
185
+
186
+ ###### 🧠 3. **Why the Improvement?**
187
+ - **Perceptron** learns only **linear patterns**.
188
+ - **Deep Learning model** has multiple layers + nonlinear activations (`relu`), so it:
189
+ - Learns **complex, non-linear relationships**
190
+ - Detects **feature interactions** better
191
+
192
+ ###### πŸ–ΌοΈ 4. **Visual Analogy**
193
+ - Think of fitting a **straight line** (Linear regression) vs. a **flexible curve** (Deep Net).
194
+ - The curve adapts better to the **true shape of the data**.
195
+ """)
196
+
197
+ st.success("Deep learning brings a significant performance boost by capturing more complex relationships in the data.")
198
+
199
+ if __name__ == "__main__":
200
+ main()
requirements.txt ADDED
@@ -0,0 +1,7 @@
 
 
 
 
 
 
 
 
1
+ pandas>=2.1
2
+ numpy>=1.25
3
+ scikit-learn>=1.4
4
+ tensorflow>=2.15
5
+ pillow
6
+ graphviz
7
+ pydot