VP of Product at a Series B fintech here. My team has tried every async collaboration tool on the market.
Result? Tool sprawl, not async culture.
We have: Loom (recordings), Notion (docs), Slack threads (discussions), Linear (async standups), Miro (brainstorming), Figma (design), GitHub (code).
That’s seven tools. And engineers still default to “quick Slack DM” instead of using any of them properly.
The Tool Inventory Reality Check
Let me walk through what we adopted and what actually gets used:
Loom (Async Video Updates)
Theory: Engineers record standups, design walkthroughs, feature demos. Async video replaces meetings.
Reality:
- Engineers spend 15 min recording what could be written in 3 min
- 7% watch rate on standup videos (we tracked it)
- Average watch time: 47 seconds (people scrub for relevant parts)
- Videos become graveyard content nobody reviews
Verdict: Video creates the illusion of async without the substance. Writing forces clarity; recording captures rambling.
Notion (Documentation)
Theory: Single source of truth for decisions, specs, processes.
Reality:
- Works when we enforce “decisions only count if documented”
- Fails when we allow Slack decisions + later documentation
- Search is decent but not great
- Notion becomes useful as soon as we make it non-optional
Verdict: Actually works, but only with cultural discipline. Tool enables async, doesn’t create it.
Slack Threads (Async Discussions)
Theory: Thread discussions allow async follow-up without cluttering main channel.
Reality:
- Threads get buried (notification UX is bad)
- Important decisions made in threads, then lost
- People forget to check thread notifications
- Async in theory, synchronous in practice (people expect fast responses)
Verdict: Slack optimizes for real-time, not async. Using it for async is fighting the tool’s design.
Linear (Async Standups)
Theory: Engineers post daily updates in Linear instead of standup meetings.
Reality:
- Works reasonably well
- Engineers update task status, blockers visible
- Managers can scan updates async
- But some engineers still prefer talking through blockers
Verdict: Functional for status updates. Doesn’t replace discussion though.
Miro (Async Brainstorming)
Theory: Collaborate on whiteboards asynchronously.
Reality:
- Nobody uses Miro async. We tried. People add sticky notes, then schedule meeting to discuss them.
- Creative collaboration seems to need real-time energy
- Async Miro boards feel sterile, disconnected
Verdict: Some things might not work async, and that’s okay.
What Actually Works: Fewer Tools + Clear Conventions
After 18 months of tool experimentation, we consolidated to boring tools with strict conventions:
1. Notion for Decisions (Single Source of Truth)
All significant decisions documented in Notion:
- Product briefs
- Engineering RFCs
- Architecture decision records (ADRs)
- Incident postmortems
Rule: If it’s not in Notion, it’s not a real decision.
2. GitHub PRs for Code Decisions
All code-adjacent decisions happen in PR comments, not Slack.
Rule: If you discuss code in Slack, summarize the decision in a PR comment for permanence.
3. Slack for FYI Only, Not Decisions
Slack is for quick questions, FYIs, social chat. Not for decisions or important discussions.
Rule: If a decision needs to be made, move it to Notion. If a discussion is getting long, turn it into a doc.
4. Structured Async Communication
When you need input from multiple people:
- RFC (request for comments) doc in Notion
- 48-hour comment window
- Explicit approver makes final call
Not: Open-ended Slack thread where discussion meanders forever.
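The RFC convention above can be sketched as a small state machine: a fixed comment window, then responsibility shifts to a named approver. This is a hypothetical illustration, not our actual tooling; the `RFC` class and field names are made up for the example.

```python
from dataclasses import dataclass, field
from datetime import datetime, timedelta

# The 48-hour comment window described above.
COMMENT_WINDOW = timedelta(hours=48)

@dataclass
class RFC:
    title: str
    opened_at: datetime
    approver: str
    comments: list = field(default_factory=list)

    def window_open(self, now: datetime) -> bool:
        """True while the comment window is still running."""
        return now - self.opened_at < COMMENT_WINDOW

    def next_action(self, now: datetime) -> str:
        """Who is on the hook right now: commenters or the approver."""
        if self.window_open(now):
            return "collect comments"
        return f"decision due from {self.approver}"

rfc = RFC("Adopt ADRs for service boundaries",
          opened_at=datetime(2024, 3, 1, 9, 0),
          approver="eng-lead")
print(rfc.next_action(datetime(2024, 3, 1, 15, 0)))  # → collect comments
print(rfc.next_action(datetime(2024, 3, 4, 9, 0)))   # → decision due from eng-lead
```

The point of encoding it at all is that the handoff is explicit: once the window closes, silence is no longer blocking, and exactly one person owes a decision.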
The Real Problem: Tools Don’t Create Culture
Here’s the uncomfortable truth: We were trying to solve a process problem with tools.
The issue wasn’t that we lacked the right async tool. The issue was:
- Managers rewarded presence and fast responses
- Engineers got praised for shipping, not for documenting
- Meetings felt faster than writing docs
- No consequences for making decisions in Slack vs proper docs
Adding Loom didn’t make us async-first. Changing incentives did.
Once we:
- Made documentation part of “definition of done”
- Added “enabled others through docs” to promotion criteria
- Tracked “time to unblocked decision” as team metric
- Had leadership model async-first behavior (CEO writing strategy docs instead of holding all-hands)
THEN the tools started being used properly. But tools alone changed nothing.
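"Time to unblocked decision" is simple to compute if you timestamp when a question is raised and when the documented decision lands. A minimal sketch, assuming a made-up record shape (not our actual schema):

```python
from datetime import datetime
from statistics import median

# Hypothetical decision records: when the question was raised,
# when the documented decision landed.
decisions = [
    {"raised": datetime(2024, 3, 1, 9, 0), "decided": datetime(2024, 3, 1, 17, 0)},
    {"raised": datetime(2024, 3, 2, 10, 0), "decided": datetime(2024, 3, 4, 10, 0)},
]

def hours_to_decision(record) -> float:
    """Elapsed wall-clock hours between raising and deciding."""
    return (record["decided"] - record["raised"]).total_seconds() / 3600

latencies = [hours_to_decision(r) for r in decisions]
print(f"median decision latency: {median(latencies):.1f}h")  # → median decision latency: 28.0h
```

Median rather than mean, so one decision that stalls for weeks doesn't mask the typical experience.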
When Sync Still Wins
Not everything should be async. After all this experimentation, we kept these synchronous:
- 1-on-1s: Relationship building needs real-time connection
- Product brainstorming: Early ideation benefits from live riffing
- Incident response: Real-time coordination when systems are down
- Onboarding: New hires need synchronous mentorship
- Quarterly planning: Strategic alignment works better live
The mistake: Trying to make EVERYTHING async. Some coordination is best done real-time.
Tool Proliferation vs Tool Discipline
Maya mentioned her design team has 7+ tools. I think tool sprawl is inevitable for complex work, but the question is:
Are we using tools strategically, or just collecting them?
Bad tool strategy:
- Adopt new tool because it’s trendy
- Let teams choose their own tools (inconsistency)
- No clear convention for what goes where
- Information scattered across 10 places
Good tool strategy:
- Each tool has clear purpose (what goes here vs there)
- Decision trail is searchable and consistent
- Async tools make information retrieval faster, not just capture easier
- Regular tool audits: “Do we actually need this?”
The Questions I’m Wrestling With
After 18 months of async tool experimentation:
- Do we need new tools, or better discipline with existing ones? Most tools we adopted already did what we needed. We just weren’t using them right.
- Is tool sprawl inevitable, or a sign of poor information architecture? Maya has 7 tools. We have 7 tools. Is this normal, or have we failed to consolidate properly?
- What’s the minimum viable async toolkit? If I were starting from scratch, what would I actually need? Docs + chat + code + task tracking? Is that enough?
- How do you measure async tool effectiveness? What metrics actually matter? Time-to-information? Decision latency? Meeting reduction?
What’s Worked for You?
For those running distributed async-first teams:
- What’s your async toolkit? Which tools actually get used vs graveyard software?
- How do you prevent tool sprawl? What’s your governance model?
- Do you use Loom/async video? Or is writing always better?
- What stays synchronous? Where have you decided async doesn’t work?
I’m convinced the async-first tool market is mostly noise. The winning strategy is probably boring tools + excellent process.
But I’d love to be proven wrong.