Active · Aug 2025 – Present

MeetNote

Real-time meeting transcription for Chrome & macOS

Founder & Engineer

Platforms: 2
ASR Engine: Whisper
Summary AI: GPT-4
Latency: <3s

Platforms

Chrome Extension

Injects a floating panel into any browser-based meeting (Google Meet, Teams, Zoom web). Captures audio via the Web Audio API, streams to the FastAPI backend, and returns live transcript segments.
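On the backend side, the live transcript has to be assembled from chunks that may arrive (or finish transcribing) out of order. A minimal sketch of that bookkeeping, assuming the client tags each chunk with a sequence index; the class and method names here are illustrative, not the actual implementation:

```python
from dataclasses import dataclass, field


@dataclass
class TranscriptSession:
    """Collects transcribed chunks for one meeting; chunks may arrive out of order."""
    segments: dict[int, str] = field(default_factory=dict)  # chunk index -> text

    def add_segment(self, index: int, text: str) -> None:
        self.segments[index] = text

    def live_transcript(self) -> str:
        # Join whatever has arrived so far, in chunk order.
        return " ".join(text for _, text in sorted(self.segments.items()))
```

With this, the panel can re-render the full transcript on every incoming segment instead of tracking insertion positions client-side.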

macOS App (Swift)

Native SwiftUI app that captures system audio directly. Works with any meeting app — not just browser-based ones. Displays live transcript in a minimal floating window that stays on top.

AI Summary Engine

After each meeting, GPT-4 processes the full transcript to generate: key decisions, action items with owners, a TL;DR summary, and unresolved questions.
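The summarization step comes down to one structured prompt over the full transcript. A sketch of how that prompt might be assembled; the exact wording and section labels are assumptions, not the shipped prompt:

```python
# Sections the summary engine asks GPT-4 to produce (assumed labels).
SUMMARY_SECTIONS = [
    "Key decisions",
    "Action items (with owners)",
    "TL;DR",
    "Unresolved questions",
]


def build_summary_prompt(transcript: str) -> str:
    """Assemble the post-meeting summarization prompt from the full transcript."""
    sections = "\n".join(f"- {name}" for name in SUMMARY_SECTIONS)
    return (
        "Summarize the following meeting transcript.\n"
        f"Return exactly these sections:\n{sections}\n\n"
        f"Transcript:\n{transcript}"
    )
```

The returned string would then be sent to the chat completions API as the user message.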

The Problem

I was constantly missing details in meetings — jotting half-formed notes while trying to pay attention. Existing tools were either too heavy, required Zoom bots that felt invasive, or only worked with one platform. I wanted something invisible that just worked everywhere.

How It Works

Audio is captured client-side (browser mic or macOS system audio), chunked into 10-second segments, and sent to a FastAPI backend. OpenAI Whisper transcribes each chunk. The transcript streams back to the UI in real time. At meeting end, the full transcript is passed through a GPT-4 summarization chain to produce structured output.
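The 10-second chunking can be sketched as a pure function over raw PCM bytes. The 16 kHz mono 16-bit format is an assumption (it is the sample rate Whisper ultimately consumes); the real capture format and any resampling are omitted here:

```python
SAMPLE_RATE = 16_000      # Hz, assumed capture rate
BYTES_PER_SAMPLE = 2      # 16-bit mono PCM, assumed
CHUNK_SECONDS = 10
CHUNK_BYTES = SAMPLE_RATE * BYTES_PER_SAMPLE * CHUNK_SECONDS  # 320,000 bytes


def chunk_audio(pcm: bytes) -> list[bytes]:
    """Split a raw PCM stream into 10-second chunks; the final chunk may be short."""
    return [pcm[i:i + CHUNK_BYTES] for i in range(0, len(pcm), CHUNK_BYTES)]
```

Each element of the returned list is what gets POSTed to the backend and handed to Whisper as one transcription unit.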

Why Two Platforms

The Chrome extension covers browser meetings with zero install friction. The macOS app covers native apps like Slack Huddles, FaceTime, and Discord, plus local recordings. Together they cover every meeting scenario I have; I built both to scratch my own itch.

Tech Stack

Swift · SwiftUI · Chrome Extension · JavaScript · OpenAI Whisper · FastAPI · Python · React