
# Kaggle Playground Series S6E4 - From Competition Model to Deployable API
This repository demonstrates a full path from Kaggle tabular modeling to a production-style inference service.
## What this project proves
- Competitive ML workflow for `playground-series-s6e4` (Predicting Irrigation Need).
- Reproducible training pipeline with configurable compute budgets.
- Deployable serving layer (FastAPI + Docker), not just notebook experimentation.
## Repository structure
- `kaggle_gpu_submission_workflow.ipynb` - Kaggle Notebook workflow to train, evaluate, and generate `submission_ready.csv`.
- `train_advanced_and_submit.py` - Advanced competition trainer (CatBoost + CV + optional search + report JSON).
- `kaggle_submission_guide.md` - Step-by-step notebook submission guide.
- `train_api_model.py` - Trains a CPU CatBoost model and exports serving artifacts to `model_artifacts/`.
- `app/main.py` - FastAPI service with `GET /health`, `GET /schema`, and `POST /predict`.
- `Dockerfile` + `requirements.txt` - Containerized API deployment stack.
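The request contract for `POST /predict` is not spelled out above. As an illustration only (the feature names and metadata keys below are hypothetical, not taken from this repo), a client payload can be validated against the exported serving metadata before it reaches the model:

```python
import json

# Hypothetical serving metadata, mirroring what train_api_model.py might
# export alongside the .cbm file (names are illustrative, not from the repo).
metadata = {
    "feature_names": ["soil_moisture", "temperature", "crop_type"],
    "categorical_features": ["crop_type"],
    "target": "Irrigation_Need",
}

def validate_payload(payload: dict, meta: dict) -> list:
    """Return the features in training order, or raise on missing fields."""
    missing = [f for f in meta["feature_names"] if f not in payload]
    if missing:
        raise ValueError(f"missing features: {missing}")
    return [payload[f] for f in meta["feature_names"]]

payload = {"soil_moisture": 0.21, "temperature": 31.5, "crop_type": "wheat"}
row = validate_payload(payload, metadata)
print(row)  # ordered feature vector, ready for the CatBoost model
```

Ordering the features explicitly from metadata matters because CatBoost expects columns in the same order they appeared at training time.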
## Architecture

```mermaid
flowchart LR
    A[train.csv] --> B[Feature selection + categorical preprocessing]
    B --> C[CatBoost training]
    C --> D[Model artifact .cbm]
    C --> E[Metadata JSON]
    D --> F[FastAPI inference service]
    E --> F
    F --> G[/predict -> Irrigation_Need class/]
```

## Kaggle workflow
1. Open the competition notebook environment.
2. Run `kaggle_gpu_submission_workflow.ipynb`.
3. Produce:
   - `/kaggle/working/submission_ready.csv`
   - `/kaggle/working/model_report.json`
4. Submit in the competition UI.
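After a run, the cross-validation summary in `model_report.json` can be inspected programmatically. A sketch with stdlib only (the key names here are assumed, not taken from the repo's actual report schema):

```python
import json
import tempfile
from pathlib import Path

# Hypothetical report contents; the real model_report.json keys may differ.
report = {"cv_scores": [0.91, 0.90, 0.92], "best_params": {"depth": 6}}

with tempfile.TemporaryDirectory() as tmp:
    path = Path(tmp) / "model_report.json"
    path.write_text(json.dumps(report, indent=2))

    # Load the report back and summarize the fold scores.
    loaded = json.loads(path.read_text())
    mean_cv = sum(loaded["cv_scores"]) / len(loaded["cv_scores"])
    print(f"mean CV score: {mean_cv:.3f}")
```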
## API training and local serving
### 1) Install dependencies

```shell
pip install -r requirements.txt
```

### 2) Train and export API artifacts

```shell
python train_api_model.py --data-dir . --output-dir model_artifacts --iterations 700 --seed 42
```

### 3) Run FastAPI

```shell
uvicorn app.main:app --host 0.0.0.0 --port 8000
```

### 4) Test endpoints
```shell
curl http://127.0.0.1:8000/health
curl http://127.0.0.1:8000/schema
```

## Docker deployment
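For context, a containerized FastAPI service of this shape is typically built from a Dockerfile along these lines (a sketch only; the repo's actual `Dockerfile` may differ in base image, paths, and layout):

```dockerfile
# Sketch of a typical Dockerfile for this stack; not the repo's actual file.
FROM python:3.11-slim
WORKDIR /app
COPY requirements.txt .
RUN pip install --no-cache-dir -r requirements.txt
COPY app/ app/
COPY model_artifacts/ model_artifacts/
EXPOSE 8000
CMD ["uvicorn", "app.main:app", "--host", "0.0.0.0", "--port", "8000"]
```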
Build the image (after generating `model_artifacts/`):

```shell
docker build -t irrigation-api:latest .
```

Run the container:

```shell
docker run --rm -p 8000:8000 irrigation-api:latest
```

## Business framing
This repo is designed as a public proof asset for ML/Data Engineering work:
- measurable leaderboard performance
- reproducible experiments
- deployable inference interface
- clear operational documentation
