7️⃣ **Cross Entropy Loss: Teaching AI to Learn Better** 🎯
- AI makes mistakes, so we **measure how bad they are** using a **loss function**.
- Cross Entropy Loss compares the AI's predicted probabilities with the correct answers, punishing confident wrong guesses the most, so the AI can **learn from its mistakes** and get better.
- Instead of guessing randomly, the AI **adjusts its weights to shrink the loss** and improve its answers (see the sketch below).
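
Here is a minimal sketch in NumPy of how binary cross-entropy scores a good set of guesses versus a bad one; the labels and predicted probabilities below are made-up numbers, just for illustration.

```python
import numpy as np

def cross_entropy(y_true, y_pred, eps=1e-12):
    # Clip predictions so log(0) never happens.
    y_pred = np.clip(y_pred, eps, 1.0 - eps)
    # Average of -[y*log(p) + (1-y)*log(1-p)] over all examples.
    return -np.mean(y_true * np.log(y_pred) + (1 - y_true) * np.log(1 - y_pred))

y_true = np.array([1.0, 0.0, 1.0, 1.0])      # correct answers (invented)
good_guess = np.array([0.9, 0.1, 0.8, 0.7])  # confident and mostly right
bad_guess = np.array([0.4, 0.6, 0.3, 0.5])   # unsure and often wrong

print(cross_entropy(y_true, good_guess))  # small loss (~0.2)
print(cross_entropy(y_true, bad_guess))   # larger loss (~0.9)
```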
8️⃣ **Backpropagation: AI Fixing Its Own Mistakes** 🔄
- AI learns by **guessing, checking, and fixing mistakes**.
- It uses **backpropagation** to work out **how much each weight contributed to the error**, then nudges every weight in the direction that reduces it, just like **learning from practice**.
- This helps AI **get a little smarter every training step** (see the sketch below).
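
Below is a minimal sketch of that guess-check-fix loop for a one-layer logistic regression model, the simplest case of the backpropagation idea; the tiny dataset and learning rate are invented for illustration.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

X = np.array([[0.0], [1.0], [2.0], [3.0]])  # one input feature (invented data)
y = np.array([0.0, 0.0, 1.0, 1.0])          # class labels
w, b = 0.0, 0.0                             # start with a blank guess
lr = 0.5                                    # learning rate

for step in range(200):
    p = sigmoid(X[:, 0] * w + b)         # guess: predicted probabilities
    error = p - y                        # check: how far off each guess is
    grad_w = np.mean(error * X[:, 0])    # fix: gradient of the loss w.r.t. w
    grad_b = np.mean(error)              # ...and w.r.t. b
    w -= lr * grad_w                     # nudge each weight to reduce the error
    b -= lr * grad_b

print(sigmoid(np.array([0.0, 3.0]) * w + b))  # low prob for x=0, high prob for x=3
```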
9️⃣ **Multi-Class Neural Networks: Picking the Best Answer** 🎨
- AI doesn’t always choose between **just two things**; sometimes, it picks from **many choices**!
- It uses **Softmax** to turn raw scores into **probabilities that add up to 1**, then picks the answer that is **most likely**.
- This helps in **image recognition, language processing, and more** (see the sketch below)!
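
Here is a minimal sketch of Softmax turning raw scores into probabilities; the class names and scores are made up for illustration.

```python
import numpy as np

def softmax(scores):
    shifted = scores - np.max(scores)  # subtract the max for numerical stability
    exps = np.exp(shifted)
    return exps / exps.sum()

classes = ["cat", "dog", "bird"]        # hypothetical choices
scores = np.array([2.0, 1.0, 0.1])      # raw network outputs ("logits")
probs = softmax(scores)

for name, p in zip(classes, probs):
    print(f"{name}: {p:.2f}")           # probabilities that add up to 1

print("best answer:", classes[int(np.argmax(probs))])  # "cat"
```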
🔟 **Activation Functions: Helping AI Think Faster**
- AI uses **activation functions** to **decide which patterns matter**, letting it learn patterns that aren't just straight lines.
- Three important ones:
  - **Sigmoid** → Squashes values between 0 and 1, great for probabilities.
  - **Tanh** → Squashes values between -1 and 1, keeping them centered around zero.
  - **ReLU** → The simplest and fastest, and the most common choice in deep networks!
- These make AI **learn faster and make better decisions** (see the sketch below)!
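
A minimal sketch comparing the three activation functions on the same inputs, so their output ranges are easy to see; the input values are arbitrary examples.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))  # squashes values into (0, 1)

def tanh(z):
    return np.tanh(z)                # squashes values into (-1, 1), centered at 0

def relu(z):
    return np.maximum(0.0, z)        # keeps positives, zeroes out negatives

z = np.array([-2.0, -0.5, 0.0, 0.5, 2.0])
print("input:  ", z)
print("sigmoid:", np.round(sigmoid(z), 3))  # 0.119 0.378 0.5 0.622 0.881
print("tanh:   ", np.round(tanh(z), 3))     # -0.964 -0.462 0.0 0.462 0.964
print("relu:   ", relu(z))                  # 0.0 0.0 0.0 0.5 2.0
```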