1️⃣ **Introduction to Neural Networks (One Hidden Layer)**
- A neural network is like a **thinking machine** that makes decisions.
- It **learns from data** and gets better over time.
- We build a network with **one hidden layer** to help it **think smarter** (see the sketch below).
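A minimal sketch of a one-hidden-layer network in PyTorch. The layer sizes below are made-up examples for illustration, not values taken from the videos:

```python
import torch
import torch.nn as nn

# One hidden layer: 2 input features -> 6 hidden neurons -> 1 output decision
class OneHiddenNet(nn.Module):
    def __init__(self, in_features=2, hidden=6):
        super().__init__()
        self.hidden = nn.Linear(in_features, hidden)
        self.output = nn.Linear(hidden, 1)

    def forward(self, x):
        x = torch.sigmoid(self.hidden(x))     # the hidden layer "thinks"
        return torch.sigmoid(self.output(x))  # final answer between 0 and 1

net = OneHiddenNet()
print(net(torch.randn(4, 2)).shape)  # torch.Size([4, 1])
```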
2️⃣ **More Neurons, Better Learning!**
- If a network **isn't smart enough**, we add **more neurons**!
- More neurons give the network **more capacity to make better decisions**.
- We train the network to **recognize patterns more accurately** (see the sketch below).
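Adding neurons is just a bigger hidden layer. A hedged sketch (the 6 and 50 neuron counts are arbitrary numbers for illustration):

```python
import torch.nn as nn

# Same architecture, only the hidden layer is wider.
small_net = nn.Sequential(nn.Linear(2, 6),  nn.Sigmoid(), nn.Linear(6, 1),  nn.Sigmoid())
wide_net  = nn.Sequential(nn.Linear(2, 50), nn.Sigmoid(), nn.Linear(50, 1), nn.Sigmoid())
```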
3️⃣ **Neural Networks with Multiple Inputs**
- Instead of just **one piece of data**, we give the network **many inputs**.
- This helps it **understand more complex problems** (see the sketch below).
- Too many neurons = **overfitting (too specific)**, too few = **underfitting (too simple)**.
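A possible sketch of a network that accepts many inputs at once; the 10 input features and 20 hidden neurons are placeholder numbers:

```python
import torch
import torch.nn as nn

multi_input_net = nn.Sequential(
    nn.Linear(10, 20),  # 10 input features -> 20 hidden neurons
    nn.ReLU(),
    nn.Linear(20, 1),
    nn.Sigmoid(),
)
batch = torch.randn(8, 10)           # 8 examples, each with 10 inputs
print(multi_input_net(batch).shape)  # torch.Size([8, 1])
```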
4️⃣ **Multi-Class Neural Networks**
- Instead of choosing between **two options**, the network can choose **many!**
- It learns to **classify things into multiple groups**, like recognizing **different animals**.
- The Softmax function helps it **pick the best answer** (sketched below).
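A small sketch of a multi-class network with softmax; the 3 classes and layer sizes are assumptions for illustration:

```python
import torch
import torch.nn as nn

# 4 input features -> scores for 3 possible classes
model = nn.Sequential(nn.Linear(4, 16), nn.ReLU(), nn.Linear(16, 3))

scores = model(torch.randn(1, 4))     # raw scores (logits) for each class
probs = torch.softmax(scores, dim=1)  # softmax turns scores into probabilities
print(probs.argmax(dim=1))            # pick the best answer (highest probability)
```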
5️⃣ **Backpropagation: Learning from Mistakes**
- The network **makes a guess**, checks if it's right, and **fixes itself**.
- It does this using **backpropagation**, which adjusts the neurons (see the training-loop sketch below).
- This is how AI **gets smarter with time**!
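A minimal training-loop sketch showing the guess → check → fix cycle; the toy data and learning rate are made up for illustration:

```python
import torch
import torch.nn as nn

model = nn.Sequential(nn.Linear(1, 5), nn.Tanh(), nn.Linear(5, 1))
criterion = nn.MSELoss()
optimizer = torch.optim.SGD(model.parameters(), lr=0.1)

x = torch.randn(20, 1)
y = 3 * x + 1                      # toy relationship the network should learn

for epoch in range(100):
    optimizer.zero_grad()          # clear old gradients
    loss = criterion(model(x), y)  # make a guess and score how wrong it is
    loss.backward()                # backpropagation: push the error backwards
    optimizer.step()               # adjust the neurons a little
```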
6️⃣ **Activation Functions: Helping AI Decide**
- Activation functions **control how neurons react**.
- Three common types:
  - **Sigmoid** → Squashes values into (0, 1), good for probabilities.
  - **Tanh** → Squashes values into (-1, 1), keeping data balanced around zero.
  - **ReLU** → Keeps positives, zeroes out negatives; fast and the most widely used.
- These functions help the network **learn efficiently** (compared in the sketch below).
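A quick comparison of the three functions on the same numbers, using PyTorch's built-in versions:

```python
import torch

x = torch.linspace(-3, 3, 7)
print(torch.sigmoid(x))  # squashed into (0, 1) - good for probabilities
print(torch.tanh(x))     # squashed into (-1, 1) - balanced around zero
print(torch.relu(x))     # negatives become 0, positives pass through - fast and simple
```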
# AI Terms and Definitions (Based on the Videos)
### **Neural Network**
A **computer brain** that learns by adjusting numbers (weights) to make decisions.
### **Classification**
Teaching AI to **sort things into groups**, like recognizing cats and dogs in pictures.
### **Activation Function**
A rule that helps AI **decide which information is important**. Examples:
- **Sigmoid** → Soft decision-making.
- **Tanh** → Balances positive and negative values.
- **ReLU** → Fast and effective!
### **Backpropagation**
AI's way of **fixing mistakes** by looking at errors and adjusting itself.
### **Loss Function**
A **score** that tells AI **how wrong** it was, so it can improve.
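A hedged example of one common loss function, mean squared error; the numbers are arbitrary:

```python
import torch
import torch.nn as nn

criterion = nn.MSELoss()              # one common "how wrong" score
prediction = torch.tensor([2.5, 0.0])
target = torch.tensor([3.0, -0.5])
print(criterion(prediction, target))  # bigger number = more wrong
```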
### **Gradient Descent**
A method that helps AI **learn step by step** by making small changes to improve.
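A tiny sketch of gradient descent on a single made-up weight, without any neural network involved:

```python
import torch

w = torch.tensor(4.0, requires_grad=True)  # one weight to tune; best value is 2.0
lr = 0.1                                   # learning rate: size of each small step

for _ in range(20):
    loss = (w - 2.0) ** 2    # how wrong we are right now
    loss.backward()          # compute the slope (gradient)
    with torch.no_grad():
        w -= lr * w.grad     # take a small step downhill
        w.grad.zero_()       # reset for the next step
print(w)                     # close to 2.0
```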
### **Hidden Layer**
A **middle part of a neural network** that helps process complex information.
### **Softmax Function**
Helps AI **choose the best answer** when there are multiple choices.
### **Cross Entropy Loss**
A way to measure **how well AI is learning** when making choices.
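In PyTorch this is `nn.CrossEntropyLoss`, which combines softmax with the cross-entropy score; a small sketch with made-up scores:

```python
import torch
import torch.nn as nn

criterion = nn.CrossEntropyLoss()
logits = torch.tensor([[2.0, 0.5, -1.0]])  # raw scores for 3 classes, one example
target = torch.tensor([0])                 # the correct class is index 0
print(criterion(logits, target))           # low value = the AI chose well
```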
### **Multi-Class Neural Networks**
AI models that can **choose from many options**, not just two.
### **Momentum**
A trick that helps AI **learn faster** by keeping track of past updates.
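In PyTorch, momentum is usually just an optimizer setting; a sketch (0.9 is a typical value, not one taken from the videos):

```python
import torch
import torch.nn as nn

model = nn.Linear(3, 1)
# momentum=0.9 keeps a running memory of past updates, smoothing and speeding learning
optimizer = torch.optim.SGD(model.parameters(), lr=0.01, momentum=0.9)
```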
### **Overfitting**
When AI **memorizes too much** and struggles with new data.
### **Underfitting**
When AI **doesn't learn enough** and makes bad predictions.
### **Convolutional Neural Network (CNN)**
A special AI for **understanding images**, used in things like face recognition.
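A toy CNN sketch for 28x28 grayscale images; the channel counts and the 10 output classes are placeholder choices:

```python
import torch
import torch.nn as nn

cnn = nn.Sequential(
    nn.Conv2d(1, 8, kernel_size=3, padding=1),  # scan the image for local patterns
    nn.ReLU(),
    nn.MaxPool2d(2),                            # shrink the image, keep strong signals
    nn.Flatten(),
    nn.Linear(8 * 14 * 14, 10),                 # scores for 10 classes (e.g. digits)
)
print(cnn(torch.randn(1, 1, 28, 28)).shape)     # torch.Size([1, 10])
```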
### **Batch Processing**
Instead of training on **one piece of data at a time**, AI looks at **many pieces at once** to learn faster.
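A sketch of batching with PyTorch's `DataLoader`; the dataset here is random toy data:

```python
import torch
from torch.utils.data import TensorDataset, DataLoader

x = torch.randn(100, 4)           # 100 examples, 4 features each
y = torch.randint(0, 2, (100,))   # toy labels
loader = DataLoader(TensorDataset(x, y), batch_size=16, shuffle=True)

for batch_x, batch_y in loader:   # the network sees 16 examples at a time
    print(batch_x.shape)          # torch.Size([16, 4])
    break
```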
### **PyTorch**
A tool that helps build and train neural networks easily.