Podcaster Copilot

A mockup of what Podcaster Copilot could look like: a minimal sidebar app that listens to your conversation and helps you ask more interesting (or spicier) questions.


I’d like to thank Dan Shipper and Tyler Cowen for being unwitting guests in my sketch of an idea. I love them both, and thought their conversation about AI stuff and things would be the perfect fit for this sketch.

If you haven’t listened to it yet, go check out and subscribe to the How Do You Use ChatGPT podcast.

Mockup was done by the illustrious Jessica Gillis.


Podcaster Copilot gives podcasters context-aware feedback in real time during live interviews.

  • Have you ever thought “wow, I wish I had asked about X!” immediately after a conversation ended? Podcaster Copilot has your back: it listens to your conversation and provides in-the-moment feedback.
  • Guest research made easy: just paste in a few links about your guest and some text notes, and we’ll help you weave an exciting tapestry of questions.
  • Are your questions too plain? 🌶️ Spice things up with a spicy question, or a 🤔 curiosity-building question.
  • Have you ever been in a meeting and wanted to “put a pin in it,” to come back to a point before the conversation ends? Podcaster Copilot lets you pin a moment in time so you don’t forget to loop back.
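The “pin a moment” idea above is simple to sketch in code. This is a minimal, hypothetical data model (the `Transcript` and `Pin` names are my invention, not from any real app): a rolling transcript that can bookmark the current moment so the host can loop back later.

```python
from dataclasses import dataclass, field
import time

@dataclass
class Pin:
    """A bookmarked moment in the conversation."""
    timestamp: float
    note: str

@dataclass
class Transcript:
    """Rolling transcript of (time, speaker, text) lines with pinnable moments."""
    lines: list = field(default_factory=list)
    pins: list = field(default_factory=list)

    def add_line(self, speaker: str, text: str) -> None:
        self.lines.append((time.monotonic(), speaker, text))

    def pin(self, note: str = "") -> Pin:
        # Bookmark "now" so the host can loop back before the episode ends.
        p = Pin(timestamp=time.monotonic(), note=note)
        self.pins.append(p)
        return p

    def open_pins(self) -> list:
        return list(self.pins)
```

A real app would resolve a pin back to the nearest transcript line and surface it in the sidebar near the end of the recording.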

How it works

There’s a desktop app; when you run it, it can listen to your system audio and microphone input to build a complete understanding of the conversation in real time.

So as a podcaster, when you start your interview, the app has the full historical context of your entire conversation and can suggest context-aware questions throughout the interview.
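“Full historical context” still has to fit into an LLM’s context window. One common approach, sketched here under my own assumptions (nothing in the post specifies the implementation), is to pack the most recent transcript lines into a size budget, keeping chronological order:

```python
def build_context(lines, max_chars=4000):
    """Select the most recent (speaker, text) lines that fit a character
    budget, then return them in chronological order as a prompt-ready string."""
    selected = []
    total = 0
    for speaker, text in reversed(lines):  # walk newest-first
        entry = f"{speaker}: {text}"
        if total + len(entry) > max_chars:
            break  # budget exhausted; older lines are dropped
        selected.append(entry)
        total += len(entry)
    return "\n".join(reversed(selected))  # restore chronological order
```

A production version would count tokens rather than characters and might summarize older portions instead of dropping them.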

The app lives in a sort of “sidebar” window: you get a stream of suggested questions, or you can ask Podcaster Copilot to generate a question of a specific flavor like “spicy,” “curious,” “clarity,” or “optimistic.”
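One way the flavor picker could work is by mapping each flavor to a styling instruction and assembling a prompt around the live transcript. The flavor wordings and function names here are illustrative assumptions, not the app’s actual prompts:

```python
# Hypothetical flavor-to-instruction mapping; the real app's prompts are unknown.
FLAVORS = {
    "spicy": "Ask a provocative question that challenges the guest's position.",
    "curious": "Ask an open-ended question that deepens the current topic.",
    "clarity": "Ask the guest to clarify or make concrete their last point.",
    "optimistic": "Ask about the best-case future implied by the discussion.",
}

def flavored_prompt(transcript: str, flavor: str) -> str:
    """Build an LLM prompt requesting one question of the given flavor."""
    style = FLAVORS.get(flavor, FLAVORS["curious"])  # default to curious
    return (
        "You are a podcast co-pilot. Here is the conversation so far:\n"
        f"{transcript}\n\n"
        f"{style}\nReturn exactly one question."
    )
```

The returned string would then be sent to whatever LLM backs the sidebar; streaming suggestions could simply re-run this on a timer with the latest transcript.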


A few weeks ago, @pvh tweeted: “On AI, I want an exoskeleton, not a butler.”

This got me thinking about the current state of LLMs and AI agents. Everything is tied up in chat, or exhibits some sort of butler-like command-and-control relationship.

But what does it mean to have an AI exoskeleton?

When you start to think about what the next 10 years of LLM and AI integration in our lives looks like, one pattern emerges quite clearly: the ability to instantly query your entire corpus of data (notes, docs, meeting transcripts, health data, etc.) with a voice query.

That got me thinking: what does the user experience for this actually look like? Can we leverage real-time AI transcription, retrieval-augmented generation, and state-of-the-art LLMs to access our notes, ask exciting questions, and help steer the conversation or journey?

It seems like this could lead to more insightful conversations, helping us all make sense of things as we learn and grow.

This is a lightweight, cyborg-style human-AI interface pattern that I believe will become more popular over time.

Alternate use-cases

This idea could be applied to several other verticals. For example, let’s say you have a large sales team that holds a lot of voice conversations. What if this could transcribe those conversations in real time and provide helpful, context-aware questions and cues to help them close their deals?

Conceivably, a tool like this could also report back to a sales manager, or even coach a sales team in a helpful, constructive way. What if you could lower the cost of training your salesforce by augmenting their raw capability with a tool like this?

Weekly Newsletter

Sign up for my weekly-ish newsletter, exploring the future through artificial intelligence.

I'll send you a short email every week with links to things I've written, and other interesting things I've found throughout the week.