1️⃣ **Introduction to Neural Networks (One Hidden Layer)** πŸ€–  
   - A neural network is like a **thinking machine** that makes decisions.  
   - It **learns from data** and gets better over time.  
   - We build a network with **one hidden layer** between the input and the output to help it **think smarter** (see the sketch below).  

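A minimal sketch of a one-hidden-layer network in PyTorch (the library listed in the glossary below). The class name `SimpleNet`, the sigmoid activations, and the layer sizes are illustrative assumptions, not the exact model from the videos:

```python
import torch
import torch.nn as nn

# One hidden layer between the input and the output.
# Names and sizes here are made up for illustration.
class SimpleNet(nn.Module):
    def __init__(self, in_features=1, hidden_size=2, out_features=1):
        super().__init__()
        self.hidden = nn.Linear(in_features, hidden_size)   # input -> hidden layer
        self.output = nn.Linear(hidden_size, out_features)  # hidden -> output layer

    def forward(self, x):
        x = torch.sigmoid(self.hidden(x))      # hidden neurons "react" to the input
        return torch.sigmoid(self.output(x))   # output squashed to a 0-1 probability

model = SimpleNet()
print(model(torch.tensor([[0.5]])))  # one prediction for one input value
```
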
2️⃣ **More Neurons, Better Learning!** 🧠  
   - If a network **isn’t smart enough**, we add **more neurons** to the hidden layer (see the snippet below).  
   - More neurons = **more capacity** to capture complicated patterns.  
   - We train the network to **recognize patterns more accurately**.  

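Using the hypothetical `SimpleNet` class sketched above, widening the hidden layer is just a constructor argument (the sizes 2 and 9 are arbitrary examples):

```python
small = SimpleNet(in_features=1, hidden_size=2)  # few neurons: may be "not smart enough"
wide  = SimpleNet(in_features=1, hidden_size=9)  # more neurons: more capacity to learn
```
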
3️⃣ **Neural Networks with Multiple Inputs** πŸ”’  
   - Instead of just **one piece of data**, we give the network **many input features** (see the example below).  
   - This helps it **understand more complex problems**.  
   - Too many neurons = **overfitting (too specific)**, too few = **underfitting (too simple)**.  

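Still using the hypothetical `SimpleNet` from the first sketch, handling several inputs only changes `in_features` and the shape of the input tensor (the numbers are made up):

```python
model = SimpleNet(in_features=4, hidden_size=6)  # 4 input features instead of 1
x = torch.tensor([[0.2, 1.0, -0.5, 3.1]])        # one sample with 4 features
print(model(x))                                  # still a single probability out
```
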
4️⃣ **Multi-Class Neural Networks** 🎨  
   - Instead of choosing between **two options**, the network can choose **many!**  
   - It learns to **classify things into multiple groups**, like recognizing **different animals**.  
   - The **Softmax** function turns the network’s scores into probabilities so it can **pick the best answer** (see the sketch below).  

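A sketch of a 3-class classifier in PyTorch; the sizes and the use of `nn.Sequential` are illustrative assumptions:

```python
import torch
import torch.nn as nn

multiclass = nn.Sequential(
    nn.Linear(4, 8),   # 4 input features -> 8 hidden neurons
    nn.ReLU(),
    nn.Linear(8, 3),   # one output score (logit) per class
)

x = torch.tensor([[0.2, 1.0, -0.5, 3.1]])
logits = multiclass(x)
probs = torch.softmax(logits, dim=1)   # softmax turns scores into probabilities
print(probs, probs.argmax(dim=1))      # the highest probability is the "best answer"
```

Note that PyTorch's `nn.CrossEntropyLoss` expects the raw scores (logits) and handles the softmax internally during training.
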
5️⃣ **Backpropagation: Learning from Mistakes** πŸ”„  
   - The network **makes a guess**, checks how wrong it was, and **fixes itself**.  
   - It does this using **backpropagation**, which works out how much each weight contributed to the error so the weights can be adjusted (see the training step below).  
   - This is how AI **gets smarter with time**!  

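One training step in PyTorch, with a stand-in model, loss, and data chosen just for the sketch; the guess β†’ score β†’ backpropagate β†’ adjust order is the standard pattern:

```python
import torch
import torch.nn as nn

model = nn.Linear(1, 1)                                  # stand-in model
criterion = nn.MSELoss()                                 # loss: "how wrong" the guess is
optimizer = torch.optim.SGD(model.parameters(), lr=0.1)

x = torch.tensor([[1.0]])                                # made-up training example
y = torch.tensor([[2.0]])

prediction = model(x)                                    # 1) make a guess (forward pass)
loss = criterion(prediction, y)                          # 2) score the mistake
optimizer.zero_grad()                                    # 3) clear old gradients
loss.backward()                                          # 4) backpropagation: compute gradients
optimizer.step()                                         # 5) adjust the weights
```
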
6️⃣ **Activation Functions: Helping AI Decide** ⚑  
   - Activation functions **control how neurons react** to their inputs.  
   - Three common types (compared in the snippet below):  
     - **Sigmoid** β†’ Squashes values into **0–1**, good for probabilities.  
     - **Tanh** β†’ Squashes values into **-1 to 1**, keeping them centered around zero.  
     - **ReLU** β†’ Keeps positives, zeroes out negatives; fast and the usual default!  
   - These functions help the network **learn efficiently**.  

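A quick way to see how the three activations treat the same numbers (the sample values are arbitrary):

```python
import torch

x = torch.tensor([-2.0, -0.5, 0.0, 0.5, 2.0])

print(torch.sigmoid(x))   # squashed into (0, 1)
print(torch.tanh(x))      # squashed into (-1, 1), centered at zero
print(torch.relu(x))      # negatives become 0, positives pass through unchanged
```
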
# πŸ“– AI Terms and Definitions (Based on the Videos) πŸ€–  

### 🧠 **Neural Network**  
A **computer brain** that learns by adjusting numbers (weights) to make decisions.  

### 🎯 **Classification**  
Teaching AI to **sort things into groups**, like recognizing cats 🐱 and dogs 🐢 in pictures.  

### ⚑ **Activation Function**  
A rule that helps AI **decide which information is important**. Examples:  
- **Sigmoid** β†’ Soft decision-making, outputs between 0 and 1.  
- **Tanh** β†’ Balances positive and negative values, outputs between -1 and 1.  
- **ReLU** β†’ Fast and effective, the usual default for hidden layers!  

### πŸ”„ **Backpropagation**  
AI’s way of **fixing mistakes** by looking at errors and adjusting itself.  

### πŸ“‰ **Loss Function**  
A **score** that tells AI **how wrong** it was, so it can improve.  

### πŸš€ **Gradient Descent**  
A method that helps AI **learn step by step** by making small changes to improve.  

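A bare-bones gradient descent update on a single weight, using made-up numbers just to show the "small step downhill" rule:

```python
import torch

w = torch.tensor(1.0, requires_grad=True)  # a single made-up weight
loss = (3.0 * w - 6.0) ** 2                # toy loss, smallest at w = 2
loss.backward()                            # gradient of the loss with respect to w

with torch.no_grad():
    w -= 0.01 * w.grad                     # small step in the downhill direction
print(w)                                   # w has moved a little toward 2
```
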
### πŸ—οΈ **Hidden Layer**  
A **middle part of a neural network** that helps process complex information.  

### πŸŒ€ **Softmax Function**  
Turns the network’s scores into probabilities so AI can **choose the best answer** when there are multiple choices.  

### βš–οΈ **Cross Entropy Loss**  
A way to measure **how well AI is learning** when making choices.  

### πŸ“Š **Multi-Class Neural Networks**  
AI models that can **choose from many options**, not just two.  

### 🏎️ **Momentum**  
A trick that helps AI **learn faster** by keeping track of past updates.  

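In PyTorch, momentum is just an argument to the SGD optimizer; the 0.9 value below is a common choice, not a requirement:

```python
import torch
import torch.nn as nn

model = nn.Linear(1, 1)  # stand-in model for the sketch
optimizer = torch.optim.SGD(model.parameters(), lr=0.1, momentum=0.9)
```
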
### πŸ” **Overfitting**  
When AI **memorizes too much** and struggles with new data.  

### πŸ˜• **Underfitting**  
When AI **doesn’t learn enough** and makes bad predictions.  

### 🎨 **Convolutional Neural Network (CNN)**  
A special AI for **understanding images**, used in things like face recognition.  

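A tiny sketch of one convolutional layer scanning a fake grayscale image; the channel counts and image size are made up:

```python
import torch
import torch.nn as nn

conv = nn.Conv2d(in_channels=1, out_channels=8, kernel_size=3)  # 8 small 3x3 filters
image = torch.randn(1, 1, 28, 28)    # fake batch: one 28x28 grayscale image
features = conv(image)
print(features.shape)                # torch.Size([1, 8, 26, 26]) -> 8 feature maps
```
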
### πŸ“¦ **Batch Processing**  
Instead of training on **one piece of data at a time**, AI looks at **many pieces at once** to learn faster.  

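A `DataLoader` hands the model several samples at a time; the dataset here is random numbers invented for the sketch:

```python
import torch
from torch.utils.data import TensorDataset, DataLoader

x = torch.randn(100, 4)              # 100 fake samples with 4 features each
y = torch.randint(0, 3, (100,))      # 100 fake class labels (0, 1, or 2)

loader = DataLoader(TensorDataset(x, y), batch_size=16, shuffle=True)
for batch_x, batch_y in loader:
    print(batch_x.shape)             # torch.Size([16, 4]) -> 16 samples at once
    break
```
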
### πŸ—οΈ **PyTorch**  
A Python library that helps build and train neural networks easily.