Overview
A fully serverless, ML-powered course recommendation platform built and deployed in just 4 hours during a JCR (Junior Consultant Round) sprint challenge. The AI core is an ensemble of XGBoost and Random Forest multi-label classifiers trained on member profiles (Filière, Cellule, Sexe, plus normalized numeric features) to predict course suitability. The two models' predictions are blended 50/50 and filtered at a 0.40 probability threshold to produce up to 3 personalized course recommendations per member, excluding courses already taken. The entire platform runs on Cloudflare Workers edge compute with Neon Serverless Postgres (HTTP driver), Drizzle ORM for type-safe queries, Hono.js routing, and WebAssembly bcrypt auth, deployed globally with sub-50ms response times and $0 hosting cost.
The Problem
The JCR challenge required building a functional AI-powered application in 4 hours with specific constraints: it must be deployable without traditional servers, handle concurrent users globally, include authentication, and demonstrate practical AI integration.
Solution
Trained a custom ML ensemble for course recommendation: (1) XGBoost multi-label classifier with randomized search hyperparameter tuning, (2) Random Forest with 400 estimators and parallel processing. The data pipeline one-hot encodes categorical features (Filière, Cellule, Sexe), normalizes and clips numeric features, and applies multi-label binarization on target course labels. Both models are serialized as .joblib files for production serving.
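The preprocessing steps above can be sketched in plain Python. Note that the column vocabularies, value ranges, and course labels below are illustrative assumptions, not the actual dataset schema:

```python
# Minimal sketch of the data pipeline: one-hot encoding of categoricals,
# clip-then-normalize for numerics, and multi-label binarization of targets.
# All vocabularies and ranges here are made up for illustration.

def one_hot(value, vocabulary):
    """One-hot encode a single categorical value against a fixed vocabulary."""
    return [1.0 if value == v else 0.0 for v in vocabulary]

def normalize_clip(x, lo, hi):
    """Clip a numeric feature to [lo, hi], then scale it to [0, 1]."""
    x = max(lo, min(hi, x))
    return (x - lo) / (hi - lo)

def binarize_labels(taken_courses, all_courses):
    """Multi-label binarization: one 0/1 indicator per possible course."""
    return [1 if c in taken_courses else 0 for c in all_courses]

FILIERES = ["GL", "RT", "IIA"]           # assumed vocabulary
SEXES = ["F", "M"]                       # assumed vocabulary
COURSES = ["python", "ml", "web", "db"]  # assumed course labels

member = {"filiere": "GL", "sexe": "F", "annee": 3, "courses": {"python"}}

features = (
    one_hot(member["filiere"], FILIERES)
    + one_hot(member["sexe"], SEXES)
    + [normalize_clip(member["annee"], 1, 5)]
)
labels = binarize_labels(member["courses"], COURSES)
```

In the real pipeline the equivalent steps would be done with scikit-learn transformers and fed to the XGBoost and Random Forest trainers before serialization to .joblib.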
Designed a 50/50 weighted ensemble prediction pipeline: calculate probability scores from both XGBoost and Random Forest, blend them with configurable weights (W_XGB=0.50), filter out courses the member has already taken, apply a 0.40 probability threshold, and return the top 3 recommendations per member — balancing precision (avoiding bad recommendations) with recall (finding all good ones).
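The blend-filter-threshold-rank pipeline described above reduces to a few lines of Python. The probability values and course names below are invented for illustration:

```python
# Sketch of the 50/50 ensemble prediction pipeline described above.
W_XGB = 0.50       # XGBoost weight; Random Forest gets 1 - W_XGB
THRESHOLD = 0.40   # minimum blended probability to recommend
TOP_K = 3          # recommendations returned per member

def recommend(p_xgb, p_rf, already_taken):
    """Blend per-course probabilities, drop taken courses, threshold, take top 3."""
    blended = {
        course: W_XGB * p_xgb[course] + (1 - W_XGB) * p_rf[course]
        for course in p_xgb
    }
    candidates = [
        (course, p) for course, p in blended.items()
        if course not in already_taken and p >= THRESHOLD
    ]
    candidates.sort(key=lambda cp: cp[1], reverse=True)
    return [course for course, _ in candidates[:TOP_K]]

# Invented per-course probabilities from each model for one member:
p_xgb = {"ml": 0.9, "web": 0.5, "db": 0.3, "python": 0.8}
p_rf  = {"ml": 0.7, "web": 0.4, "db": 0.5, "python": 0.9}
print(recommend(p_xgb, p_rf, already_taken={"python"}))  # ['ml', 'web', 'db']
```

Raising the threshold trades recall for precision: a course with a blended score just under 0.40 is silently dropped rather than risked as a bad recommendation.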
Chose Cloudflare Workers for true edge computing — code runs in 300+ data centers worldwide, eliminating the cold-start penalty of traditional serverless. Integrated Neon Serverless Postgres via its HTTP driver, which works in Cloudflare's V8 isolate environment where traditional TCP connections aren't possible.
Used Drizzle ORM for type-safe, compile-time-checked database queries — catching SQL errors at build time instead of runtime, crucial when you only have 4 hours. Implemented bcrypt-based authentication using a WebAssembly-compiled bcrypt module that runs in Cloudflare's edge runtime.
Architecture
The ML pipeline (Python: XGBoost + Random Forest → .joblib models) trains offline and exports predictions. A single Cloudflare Worker handles all API routes via Hono.js, with Drizzle ORM executing queries against Neon Postgres via @neondatabase/serverless HTTP driver. Auth middleware validates bcrypt-hashed sessions on every request. The recommendation engine serves pre-computed ensemble predictions filtered per-user. The entire platform — login, auth, course recommendations, member profiles — was built and deployed in 4 hours.
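Because the Worker serves pre-computed predictions rather than running the models at the edge, the offline export step can be as simple as dumping per-member recommendations to JSON. The file shape and member keys below are assumptions, not the actual export format:

```python
# Sketch of the offline "export predictions" step: recommendations computed
# from the .joblib models are serialized to JSON the Worker can serve
# directly, so no ML code runs in the edge runtime.
import json

# In the real pipeline these would come from the ensemble's predictions;
# hard-coded here for illustration.
precomputed = {
    "member_42": ["ml", "web", "db"],
    "member_43": ["python"],
}

payload = json.dumps(precomputed)          # shipped alongside the Worker
served = json.loads(payload)["member_42"]  # what the API returns per user
```

This split is what makes the 4-hour deadline tractable: the Python side only has to produce a static artifact, and the edge side only has to look it up and filter it per authenticated user.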
Tech Stack
XGBoost + Random Forest Ensemble
Multi-label classification with 50/50 blended probabilities — randomized search tuning, 400 RF estimators, 0.40 threshold, top-3 recommendations
Data Pipeline (Python)
One-hot encoding (Filière, Cellule, Sexe), numeric normalization/clipping, multi-label binarization, course-already-taken filtering
Cloudflare Workers
Edge compute runtime in 300+ global data centers — zero cold starts
Neon Serverless Postgres
HTTP driver for edge compatibility in V8 isolates — zero-cold-start database access
Drizzle ORM + Hono.js
Type-safe SQL queries + ultra-lightweight edge web framework
WebAssembly bcrypt
Edge-compatible password hashing compiled to WASM for Cloudflare runtime
Results
Won the 4-hour JCR sprint challenge
XGBoost + Random Forest ensemble trained, tuned, and deployed in the 4-hour window
Sub-50ms response times from any location via Cloudflare's edge network
Up to 3 personalized course recommendations per member, filtered at the 0.40 probability threshold
Runs on Cloudflare's free tier (100K requests/day)
Achievement Gallery
Moments that made it all worth it

Achievement 01
🥇 1st Place — JCR Sprint Closing Ceremony at INSAT, April 2025
Key Takeaways
Training a real ML model (XGBoost + Random Forest ensemble) and deploying a full platform with auth in 4 hours required extreme prioritization. The key was splitting work: one person on the ML pipeline (data cleaning → training → prediction export) while the other built the serverless API and frontend in parallel.
The 50/50 ensemble blend with a 0.40 threshold was a pragmatic choice — we didn't have time to optimize weights, but the ensemble still outperformed either model individually on micro-F1. Sometimes 'good enough fast' beats 'perfect never'.
Cloudflare Workers + Neon proved an ideal serverless stack for time-constrained challenges: near-zero configuration, instant deployment, and no cold starts.
Drizzle ORM's compile-time SQL validation saved us from at least 3 runtime bugs that would have cost 30+ minutes each to debug in a 4-hour sprint.