Building an AI Meal Planning App

About

Overview

Meal planning is inefficient—in terms of your time, your money, and your consumption. It's a problem that's been on my mind for a while, and as I've been learning more about AI, I saw an opportunity to explore it alongside emerging generative AI technologies.

My co-founder and I have been rethinking how people cook for the week—from discovering recipes to planning, shopping, and making meals. The project has gone through several versions, dozens of user conversations, and countless rounds of prototyping and iteration. I’ve been building the LLM integration in Cursor while designing the surrounding product experience.

This project has become a learning lab not just for approaching complex, everyday problems through new generative AI technology, but also for integrating AI tools into my own design and product workflows. Look out for our MVP launch soon, and please reach out if you'd be interested in trying it out or have any feedback on what you read here.

Responsibilities

Well, everything, along with my awesome co-founder. I take the lead on product development, product design, product strategy, and branding.

Selected Work

LLM Integration

I’ve been experimenting in Cursor with different ways to integrate the LLM. The key challenge for me has been striking the right balance between flexibility and reliability. It’s definitely a learn-as-I-go process, so if you’re deep in this space, I’d really love to chat.

What We've Built So Far

  • A functional iOS app with LLM integration
  • An editing flow where the LLM acts as your assistant—drafting a plan, then following along as you make edits through prompts and UI interactions
  • Parsing structured responses into UI components (see the sketch after this list)
  • Experimenting with a double-LLM “judge” call to improve optimization (because why does it always suggest stir-fry with bell peppers!!)
  • LLM tool calls to bring reliability to core functions like saving meals
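
To make a couple of those bullets more concrete, here's a minimal sketch of the kind of structure involved: a typed model decoded from the LLM's JSON output, and a tool call dispatched to a deterministic save function. Every name here (PlanResponse, Meal, ToolCall, MealStore, the "save_meal" identifier) is a hypothetical stand-in rather than our actual code, and the real version also deals with retries, streaming, and malformed output.

    import Foundation

    // Hypothetical shape of a structured plan response from the LLM. We prompt the
    // model to reply in JSON matching this schema, then decode it into typed models
    // that the chat UI can render as recipe cards.
    struct PlanResponse: Codable {
        let message: String      // conversational text shown in the chat bubble
        let meals: [Meal]        // structured recipes rendered as cards
        let toolCall: ToolCall?  // optional action the app should perform
    }

    struct Meal: Codable, Identifiable {
        let id: UUID
        let title: String
        let ingredients: [String]
    }

    // Core actions like saving a meal arrive as explicit, machine-readable tool
    // calls instead of being guessed from free text.
    struct ToolCall: Codable {
        let name: String         // e.g. "save_meal"
        let mealID: UUID?
    }

    enum PlanParsingError: Error {
        case invalidJSON
    }

    // Decodes raw model output into typed values the UI can work with.
    func parsePlanResponse(_ data: Data) throws -> PlanResponse {
        do {
            return try JSONDecoder().decode(PlanResponse.self, from: data)
        } catch {
            // In practice you would retry or ask the model to repair its output.
            throw PlanParsingError.invalidJSON
        }
    }

    // Dispatches a tool call to a deterministic app function.
    func handle(_ toolCall: ToolCall, store: MealStore) {
        switch toolCall.name {
        case "save_meal":
            if let id = toolCall.mealID {
                store.saveMeal(id: id)
            }
        default:
            break  // unknown tools are ignored rather than guessed at
        }
    }

    // Stand-in for whatever persistence layer the real app uses.
    final class MealStore {
        func saveMeal(id: UUID) {
            print("Saved meal \(id)")
        }
    }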

What We're Excited to Improve

  • Response speed
  • Improving recipe optimization (again, stop with the stir-fry with bell peppers)
  • Personalization through user history and preferences. Ideally, you won’t even need to edit; it’ll just know what you want

What We're Excited to Explore

  • Voice mode: for hands-free cooking and shopping (i.e., no more greasy screens or collisions in the canned-goods aisle)
  • Recipe quality: improving the caliber and personality of LLM-generated recipes, ideally tuned to user preferences
  • Creator ecosystem: exploring a double-sided model with chef content (or chef-luencers, if you will); somehow creating value on both sides without turning into the Instagram of chefs

Product Design

The app has gone through several design iterations as we refine how planning, shopping, and cooking flow together.

Goals

  • Simplicity: plan, shop, cook, and save recipes
  • Anticipate & Deliver Quickly: get you off your phone and into the grocery store faster
  • Optimize AI patterns: explore what interactions are possible (!)

Moments I'm Liking

Some potential kernels...

Expandable cards within the chat for diving into recipes, accommodating full details without breaking the conversational flow.

Integrating UI elements (buttons) with natural-language chat. Users can tap or type to accept or replace recipes, and the action shows up attached to the message context.
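
To make these moments a bit more concrete, here's a rough SwiftUI sketch of an expandable recipe card with accept/replace actions attached to it. The names (RecipeSuggestion, RecipeCardView, the callbacks) are hypothetical stand-ins rather than our actual implementation, which also handles echoing the chosen action back into the message context.

    import SwiftUI

    // Hypothetical model for a recipe suggested in the chat.
    struct RecipeSuggestion: Identifiable {
        let id = UUID()
        let title: String
        let summary: String
        let ingredients: [String]
    }

    // An expandable recipe card that lives inside a chat message. Collapsed, it
    // reads like part of the conversation; expanded, it shows the full recipe
    // without navigating away. Accept and Replace buttons sit on the card, and
    // the same choices can also be typed as a prompt.
    struct RecipeCardView: View {
        let recipe: RecipeSuggestion
        let onAccept: () -> Void
        let onReplace: () -> Void

        @State private var isExpanded = false

        var body: some View {
            VStack(alignment: .leading, spacing: 8) {
                Button {
                    withAnimation { isExpanded.toggle() }
                } label: {
                    HStack {
                        Text(recipe.title).font(.headline)
                        Spacer()
                        Image(systemName: isExpanded ? "chevron.up" : "chevron.down")
                    }
                }
                .buttonStyle(.plain)

                Text(recipe.summary)
                    .font(.subheadline)
                    .foregroundStyle(.secondary)

                if isExpanded {
                    ForEach(recipe.ingredients, id: \.self) { ingredient in
                        Text("• \(ingredient)").font(.footnote)
                    }
                }

                HStack {
                    Button("Accept", action: onAccept)
                        .buttonStyle(.borderedProminent)
                    Button("Replace", action: onReplace)
                        .buttonStyle(.bordered)
                }
            }
            .padding()
            .background(.thinMaterial, in: RoundedRectangle(cornerRadius: 16))
        }
    }

Keeping the expand/collapse state local to the card is what lets the full recipe open in place without breaking the scroll of the conversation.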

Tap to see which recipe each ingredient belongs to (because, do you really need parsley?)

I love the simplicity of the navigation bar, but having it at the top is definitely risky. The keyboard on the Plan tab complicates placing it at the bottom, but maybe I'm missing an obvious solution. And yet, there's something nice about the top because each tab represents a full workflow that isn't switched between often (you shop on Sunday, you cook on Monday). Phoning any friend for help!

Impact

Getting into the Code

Coming from a design background, diving into the codebase has been a huge learning curve—and one of the most rewarding parts of this project. Getting into the backend has been essential for understanding what’s actually possible with AI. When I started, I couldn’t imagine rendering a card inside the chat interface. Hundreds of “wait, WHAT HAPPENED?” Cursor prompts later, I feel like I’m only at the tip of the iceberg.

Designing Novel AI Patterns

I’m fascinated by how we can evolve patterns for interacting with LLMs. While chat can sometimes feel limited, it’s also what enables the open-ended, non-deterministic behavior that makes AI powerful. Lately, I’ve been exploring how UI elements can support that conversation—essentially, what a “GUI for prompting” could look like. Looking into the future, however, I’m really excited about voice interaction and ultimately invisible interfaces (my fingers are tired of typing).

Improving Product Strategy

As we approach our MVP launch, I’ve been building more structure around QA, testing, and prioritization in Linear. From a product strategy perspective, I'm constantly challenged by what’s enough for launch versus what belongs in version two. We're also exploring how to leverage AI tools, like the Cursor-Linear integration, to speed up development without sacrificing visibility or trust.