A single brainstorming session that redefined how we think about building with AI
We ask AI to write code, fix bugs, and execute plans. But what about the messy, creative work that happens before the first line of code?
Product vision starts as a solo exercise -- sketching on whiteboards, drafting docs alone, then scheduling reviews days later to get feedback.
Current AI workflows excel at implementation -- "build this component" -- but rarely participate in the upstream thinking: "what should we build and why?"
Most AI interactions are transactional. A prompt, a response, done. Deep collaborative exploration -- the kind that produces breakthroughs -- is rare.
A developer sat down to design the next generation of Amplifier's user interface -- a web-based replacement for the CLI. Instead of sketching alone or scheduling a design review, they opened an Amplifier session and started a conversation.
What followed was not a prompt-response exchange. It was a sustained exploration.
The first breakthrough idea: instead of building a fixed UI that displays AI responses, let the AI dynamically decide how to render each response.
The old model: the developer designs the UI and the AI fills in the content. Every response looks the same -- a text block in a chat bubble, regardless of what's being communicated.
The new model: the AI decides the layout per response. A code review gets a diff view. A comparison gets a table. An architecture discussion gets a diagram. The container adapts to the content.
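The idea can be sketched as a small dispatch over response kinds. This is a conceptual sketch only -- the type and renderer names below are illustrative assumptions, not Amplifier's actual protocol:

```typescript
// Hypothetical sketch: the AI tags each response with a layout hint,
// and the client dispatches to a matching renderer.
type ResponseBlock =
  | { kind: "text"; markdown: string }
  | { kind: "diff"; before: string; after: string }
  | { kind: "table"; headers: string[]; rows: string[][] }
  | { kind: "diagram"; mermaid: string };

function renderBlock(block: ResponseBlock): string {
  switch (block.kind) {
    case "text":
      return `<div class="prose">${block.markdown}</div>`;
    case "diff":
      return `<div class="diff"><del>${block.before}</del><ins>${block.after}</ins></div>`;
    case "table": {
      const head = block.headers.map(h => `<th>${h}</th>`).join("");
      const body = block.rows
        .map(r => `<tr>${r.map(c => `<td>${c}</td>`).join("")}</tr>`)
        .join("");
      return `<table><tr>${head}</tr>${body}</table>`;
    }
    case "diagram":
      return `<pre class="mermaid">${block.mermaid}</pre>`;
  }
}
```

The key design move is that the union is open-ended: adding a new response kind means adding a renderer, not redesigning the UI.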
The session explored a radical simplification: what if the conversation IS the application? Not a sidebar to some other tool, but the center of the experience.
Tool outputs collapse by default. Thinking blocks show a preview. Complexity reveals itself only when the user asks for it.
Older conversation turns fade visually, creating a natural "focus on the present" while keeping scroll-back available. Context without clutter.
Real-time token streaming isn't just a technical feature -- it creates a sense of the AI thinking alongside you. The stream IS the collaboration.
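The streaming experience reduces to a simple mechanism: append each token as it arrives and re-render immediately. A minimal, transport-agnostic sketch -- in amplifier-chat the tokens would arrive over SSE, but the class below is an illustrative assumption, not the actual implementation:

```typescript
// Sketch: an incremental transcript that appends streamed tokens and
// notifies listeners after every chunk, so the UI re-renders as the
// AI "thinks". The SSE wiring would call onToken per event.
class StreamingMessage {
  private parts: string[] = [];
  private listeners: Array<(text: string) => void> = [];

  onUpdate(fn: (text: string) => void): void {
    this.listeners.push(fn);
  }

  onToken(token: string): void {
    this.parts.push(token);
    const text = this.parts.join("");
    for (const fn of this.listeners) fn(text); // re-render per chunk
  }

  text(): string {
    return this.parts.join("");
  }
}
```

Decoupling accumulation from transport is what made the later move from WebSocket to SSE a protocol swap rather than a UI rewrite.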
While the creative conversation flowed, Amplifier dispatched multiple research agents simultaneously -- pulling in context that no single person could gather that fast.
Surveyed existing AI chat interfaces, identifying patterns and gaps in the market
Retrieved conversation UI patterns, progressive disclosure approaches, and information density studies
Analyzed how developers think about chat vs. IDE vs. terminal interfaces
Investigated protocol designs for real-time streaming, WebSocket vs. SSE tradeoffs
Research happened during the brainstorm -- not before, not after. Context arrived as the thinking evolved.
This wasn't pre-planned. It crystallized from sustained, uninterrupted exploration -- the kind of insight that only emerges from deep collaborative thinking.
The brainstorming session didn't just produce ideas. It produced a vision clear enough to become a 5,184-line specification -- and then a shipped product.
5 documents covering product context, protocol, backend, frontend, and implementation details. Created Feb 13-17, 2026 in the amplifier-web-spec repo.
First implementation as amplifier-web -- React/TypeScript frontend with WebSocket protocol. Validated the core concepts.
Evolved into amplifier-chat -- a Preact SPA using SSE streaming. 163 commits, actively developed.
Primary contributor: Samuel Lee (145 of 163 commits in amplifier-chat) · Paul Payne (9 commits)
Spec: 6 commits across 5 days (Feb 13-17, 2026) · Implementation: March 3-20, 2026
The most valuable moment in any project is the beginning -- when assumptions are formed and direction is set. If AI can participate here, it changes everything downstream. Not just faster execution, but better decisions about what to execute.
This wasn't about efficiency. It was about deliberation. The value wasn't in getting a quick answer -- it was in the late-session insight that the opening question alone couldn't have reached. AI as thinking partner requires patience and trust in the process.
The multi-agent research pattern -- dispatching agents for context while the creative conversation continues -- is a new workflow. It's like having a research assistant who fetches references while you sketch on the whiteboard.
"Experience over tokens" emerged from the conversation, not from a planning document. The best product principles are discovered through exploration, not decreed from above. AI can be part of that discovery process.
Data as of: March 22, 2026
Feature status: Active (amplifier-chat is actively developed and deployed)
Research performed:
amplifier-web-spec: 6 commits, Feb 13-17, 2026. Line count: wc -l *.md = 5,184 lines total
amplifier-chat: 163 commits total. git shortlog -sn --all shows Samuel Lee (145), Paul Payne (9), Amplifier Dev (7)
wc -l src/chat_plugin/static/index.html = 8,942 lines
amplifier-chat first commit March 3, 2026; latest commit March 20, 2026
amplifier-web (prototype repo) confirming React/TypeScript/WebSocket architecture
Primary source for brainstorming session details: the developer's firsthand account of the 27-turn session. Exact conversation logs were not available for independent verification. Specific quotes and the turn-by-turn arc are from the developer's recollection.
Repositories: All repos under microsoft/ org -- amplifier-web-spec, amplifier-web, amplifier-chat
Gaps: Exact conversation transcript not independently verified. Turn numbers for specific insights are approximate from the developer's account. Multi-agent research dispatch patterns are described conceptually -- specific agents and queries from that session were not logged.
The next time you're designing something new, open an Amplifier session and explore. Don't ask it to build. Ask it to think with you.
Try the web UI that emerged from this session
Read the 5,184-line spec that captured the vision
Use AI for the thinking phase, not just the building phase