---
title: Tech Stack Advisor
emoji: 🧠
colorFrom: indigo
colorTo: pink
sdk: docker
app_file: app.py
pinned: false
license: apache-2.0
duplicable: true
---
# 🧠 Tech Stack Advisor – ML App (with Docker & Hugging Face Deployment)
Tech Stack Advisor is a hands-on machine learning project designed to teach you how to build, containerize, and deploy an ML-powered web application using Docker and Hugging Face Spaces.
🎯 This project is part of the "Artificial Intelligence and Machine Learning (AI/ML) with Docker" course from School of DevOps.
## 📚 What You'll Learn

- Build and train a simple ML model using scikit-learn
- Create a UI using Gradio
- Containerize your app using a Dockerfile
- Push your Docker image to Docker Hub
- Deploy the Dockerized app on Hugging Face Spaces (free tier)
## 📁 Project Structure

```
tech-stack-advisor/
├── app.py            # Gradio web app
├── train.py          # Script to train and save ML model
├── requirements.txt  # Python dependencies
├── Dockerfile        # Docker build file (added during the lab)
├── model.pkl         # Trained ML model (generated after training)
├── encoders.pkl      # Encoders for categorical inputs (generated after training)
├── LICENSE           # Apache 2.0 license
└── README.md         # This guide
```
## 🔧 Step 1: Setup and Train Your ML Model

- Clone the repository

  ```bash
  git clone https://github.com/<your-username>/tech-stack-advisor.git
  cd tech-stack-advisor
  ```

- Install dependencies (optional: use a virtual environment)

  ```bash
  pip install -r requirements.txt
  ```

- Train the model

  ```bash
  python train.py
  ```
This creates:

- `model.pkl`: the trained ML model
- `encoders.pkl`: label encoders for input/output features
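If you want a feel for what the training script does before running it, the general pattern looks something like the sketch below. This is a rough illustration only; the actual features, data, and classifier used in `train.py` may differ.

```python
# Minimal sketch of the training flow (illustrative; the real train.py
# may use different features, data, and classifier).
import pickle
from sklearn.preprocessing import LabelEncoder
from sklearn.tree import DecisionTreeClassifier

# Hypothetical training data: project traits -> recommended stack
rows = [
    ("web",    "small", "Django"),
    ("data",   "large", "Spark"),
    ("mobile", "small", "Flutter"),
    ("web",    "large", "React"),
]
columns = ["project_type", "team_size", "stack"]

# Encode each categorical column and keep its encoder for inference
encoders = {}
encoded = {}
for i, col in enumerate(columns):
    enc = LabelEncoder()
    encoded[col] = enc.fit_transform([row[i] for row in rows])
    encoders[col] = enc

X = list(zip(encoded["project_type"], encoded["team_size"]))
y = encoded["stack"]
model = DecisionTreeClassifier().fit(X, y)

# Persist the model and encoders for app.py to load
with open("model.pkl", "wb") as f:
    pickle.dump(model, f)
with open("encoders.pkl", "wb") as f:
    pickle.dump(encoders, f)
```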
## 🖥️ Step 2: Run the App Locally (Without Docker)

```bash
python app.py
```

Visit the app in your browser at: http://localhost:7860
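For a mental model of what `app.py` does, here is a minimal sketch of a Gradio app that loads the pickled model and encoders and serves predictions. The input names and choices reuse the hypothetical ones from the training sketch above; they are assumptions, not the actual ones in the repo.

```python
# Minimal sketch of the inference app (illustrative; the real app.py
# defines its own inputs, labels, and layout).
import pickle

import gradio as gr

with open("model.pkl", "rb") as f:
    model = pickle.load(f)
with open("encoders.pkl", "rb") as f:
    encoders = pickle.load(f)

def recommend(project_type, team_size):
    # Encode raw inputs with the same encoders used at training time
    x = [[
        encoders["project_type"].transform([project_type])[0],
        encoders["team_size"].transform([team_size])[0],
    ]]
    pred = model.predict(x)[0]
    # Decode the numeric prediction back to a stack name
    return encoders["stack"].inverse_transform([pred])[0]

demo = gr.Interface(
    fn=recommend,
    inputs=[
        gr.Dropdown(["web", "data", "mobile"], label="Project type"),
        gr.Dropdown(["small", "large"], label="Team size"),
    ],
    outputs=gr.Textbox(label="Recommended stack"),
)

demo.launch()
```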
## 🐳 Step 3: Add Docker Support

Create a file named `Dockerfile` in the root of the project:

```dockerfile
FROM python:3.11-slim

WORKDIR /app

COPY requirements.txt .
RUN pip install --no-cache-dir -r requirements.txt

COPY . .

EXPOSE 7860

CMD ["python", "app.py"]
```
## 🔧 Step 4: Build and Run the Docker Container

- Build the image

  ```bash
  docker build -t tech-stack-advisor .
  ```

- Run the container

  ```bash
  docker run -p 7860:7860 tech-stack-advisor
  ```

Visit: http://localhost:7860
## ☁️ Step 5: Publish to Docker Hub

- Log in to Docker Hub

  ```bash
  docker login
  ```

- Tag the image

  ```bash
  docker tag tech-stack-advisor <your-dockerhub-username>/tech-stack-advisor:latest
  ```

- Push it

  ```bash
  docker push <your-dockerhub-username>/tech-stack-advisor:latest
  ```
## 🚀 Step 6: Deploy to Hugging Face Spaces

- Go to huggingface.co/spaces
- Click Create New Space
- Select:
  - SDK: Docker
  - Repository: link to your GitHub repo with the Dockerfile

Hugging Face will auto-build and deploy your container.
## 🧪 Test Your Skills

- Can you swap the model in `train.py` for a `LogisticRegression` model? (See the sketch below.)
- Can you add logging to show which inputs were passed?
- Try changing the Gradio layout or theme!
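For the first two challenges, here is a rough starting point. The variable and function names are assumptions for illustration, not the actual code in `train.py` or `app.py`.

```python
# Illustrative starting point for the challenges; adapt to the real
# train.py and app.py. The data here is a meaningless placeholder.
import logging
from sklearn.linear_model import LogisticRegression

logging.basicConfig(level=logging.INFO)
logger = logging.getLogger("tech-stack-advisor")

# Challenge 1: in train.py, swap the existing classifier for
# LogisticRegression; the rest of the training flow stays the same.
X = [[0, 0], [0, 1], [1, 0], [1, 1]]   # placeholder encoded features
y = [0, 1, 1, 1]                       # placeholder encoded labels
model = LogisticRegression(max_iter=1000).fit(X, y)

# Challenge 2: in app.py, log the raw inputs before predicting.
def recommend(project_type, team_size):
    logger.info("inputs: project_type=%s team_size=%s", project_type, team_size)
    return model.predict([[0, 0]])[0]  # placeholder prediction

print(recommend("web", "small"))
```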
## 🧾 License
This project is licensed under the Apache License 2.0. See the LICENSE file for details.
## 🙏 Credits
Created by [Gourav Shah](https://www.linkedin.com/in/gouravshah) as part of the AI/ML with Docker course at School of DevOps.
🚀 Happy shipping, DevOps and MLOps builders!