A styling app that turns natural-language outfit intent into shoppable looks, with persistent style context across sessions and a TikTok-style feed for browsing outfits.
The core design challenge was hiding a roughly 10-second, multi-service pipeline behind a smooth feed: preserve the user's taste, search broadly, rank visual candidates, render the outfit, and prefetch the next result, all without blocking the interface.
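The stages above can be sketched as one async pipeline. This is a minimal illustration, not the app's actual code: the stage names, signatures, and stand-in delays are all assumptions.

```python
import asyncio

# Hypothetical stage functions; names and return shapes are illustrative.
async def load_taste_profile(user_id: str) -> str:
    return "minimalist, earth tones, relaxed fit"  # stand-in profile summary

async def search_products(profile: str, intent: str) -> list[str]:
    await asyncio.sleep(0.01)  # stands in for an external product-search call
    return [f"product-{i}" for i in range(8)]

async def rank_candidates(candidates: list[str]) -> list[str]:
    await asyncio.sleep(0.01)  # stands in for a vision-model ranking call
    return candidates[:4]

async def render_outfit(top_picks: list[str]) -> str:
    await asyncio.sleep(0.01)  # stands in for image generation
    return f"outfit({', '.join(top_picks)})"

async def generate_outfit(user_id: str, intent: str) -> str:
    """One input-to-output pass: profile -> search -> rank -> render."""
    profile = await load_taste_profile(user_id)
    candidates = await search_products(profile, intent)
    ranked = await rank_candidates(candidates)
    return await render_outfit(ranked)
```

Each awaited stage is a seam where the real system swaps in an external service, which is what makes the end-to-end latency add up to seconds.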
Onboarding answers and in-app behavior are compressed into a short taste profile that rides along in every generation prompt.
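One way that compression might look: a small profile record flattened into a single clause and prepended to each prompt. The field names and wording here are assumptions for illustration, not the app's real schema.

```python
# Hypothetical taste-profile fields; the real schema is not shown in the source.
profile = {
    "palette": "earth tones",
    "fit": "relaxed",
    "avoid": ["logos", "neon"],
}

def profile_blurb(p: dict) -> str:
    """Compress the profile into one short clause cheap to inject everywhere."""
    avoid = ", ".join(p["avoid"])
    return f"Prefers {p['palette']}, {p['fit']} fit; avoid {avoid}."

def build_prompt(intent: str, p: dict) -> str:
    """Every generation prompt carries the taste clause plus the request."""
    return f"{profile_blurb(p)} Style request: {intent}"
```

Keeping the profile to one clause keeps per-request token cost near constant regardless of how much behavior history it summarizes.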
Product search, filtering, ranking, and image generation run concurrently wherever possible so one slow external call does not freeze the feed.
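A sketch of that isolation pattern with `asyncio`: fan out to sources concurrently and bound each call with a timeout, so one slow provider degrades to an empty result instead of stalling everything. Source names and delays are invented for the example.

```python
import asyncio

async def call_with_timeout(coro, timeout: float, fallback):
    """Bound each external call so one slow service can't freeze the feed."""
    try:
        return await asyncio.wait_for(coro, timeout)
    except asyncio.TimeoutError:
        return fallback

async def fetch_source(name: str, delay: float) -> list[str]:
    await asyncio.sleep(delay)  # stands in for an external product API
    return [f"{name}-item"]

async def search_all() -> list[str]:
    # Both sources run concurrently; the slow one is dropped, not awaited out.
    results = await asyncio.gather(
        call_with_timeout(fetch_source("fast", 0.01), 0.5, []),
        call_with_timeout(fetch_source("slow", 2.0), 0.05, []),  # times out
    )
    return [item for sub in results for item in sub]
```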
Candidate products are rendered into one numbered grid so the vision model can compare options against each other in a single call, rather than making isolated yes/no judgments per item.
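The text side of that single-call ranking might look like the sketch below: one prompt referencing the numbered cells, plus a parser that validates the model's reply. The prompt wording and reply format are assumptions, not the app's actual protocol.

```python
def ranking_prompt(n_candidates: int, intent: str) -> str:
    """One prompt covering all candidates, instead of n yes/no prompts."""
    return (
        f"The image is a grid of {n_candidates} numbered products. "
        f"For the request '{intent}', return the cell numbers "
        "best-first, comma-separated."
    )

def parse_ranking(reply: str, n_candidates: int) -> list[int]:
    """Parse a reply like '3, 1, 7' into validated cell indices,
    dropping anything out of range or non-numeric."""
    picks = []
    for token in reply.split(","):
        token = token.strip()
        if token.isdigit() and 1 <= int(token) <= n_candidates:
            picks.append(int(token))
    return picks
```

Relative ranking in one call also halves round trips: the cost is one vision request per grid instead of one per candidate.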
The full input-to-output path takes about 10 seconds per outfit, so the next result is generated in the background while the current one is on screen, and cached results are released at a steady pace rather than in bursts.
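A minimal sketch of that prefetch-and-pace loop, assuming an async producer function; the class name, interval, and structure are invented for illustration.

```python
import asyncio
import time

class PacedPrefetcher:
    """Kick off the next generation in the background and release cached
    results no faster than a minimum interval, so the feed feels steady."""

    def __init__(self, produce, min_interval: float = 0.05):
        self.produce = produce            # async fn that builds one outfit
        self.min_interval = min_interval
        self._last_release = 0.0
        self._next: asyncio.Task | None = None

    async def next(self):
        if self._next is None:            # first call: nothing prefetched yet
            self._next = asyncio.ensure_future(self.produce())
        result = await self._next
        self._next = asyncio.ensure_future(self.produce())  # prefetch ahead
        # Pace: never release two results closer together than min_interval.
        wait = self.min_interval - (time.monotonic() - self._last_release)
        if wait > 0:
            await asyncio.sleep(wait)
        self._last_release = time.monotonic()
        return result
```

After the first result, each `next()` call mostly returns work that was already finished in the background, which is what hides the 10-second pipeline from the user.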