Just got out of the "Law Meets Code" panel at SF Tech Week. The AI regulation landscape is more complex than I thought.
What I'm Seeing at SF Tech Week
Day 2 at Moscone - the vibe:
- Hundreds of founders asking the same questions about compliance
- VCs quietly warning portfolio companies about regulatory risks
- Enterprise buyers demanding compliance proof before signing
The panel I attended:
- Fenwick lawyers, a16z partner, Anthropic policy lead
- Topic: How to navigate EU AI Act + California regulations
- Room was PACKED (200+ people, standing room only)
Key Takeaway from the Panel
Quote from Fenwick partner:
"If you're building AI in 2025 and not thinking about compliance, you're building on borrowed time. The EU AI Act is already in force as of February 2025, and it applies to you even if you're in California."
This hit hard.
The Numbers They Shared
EU AI Act compliance costs for startups:
- Low-risk systems: $50K-$80K (one-time)
- High-risk systems: $160K-$330K (one-time + ongoing)
- 33% of AI startups classified as "high-risk"
Source: EU AI Act Compliance Analysis
Penalties for non-compliance:
- Up to €35M or 7% of global annual revenue (whichever is higher)
- Applies even to startups with zero revenue in the EU (quick sketch of the math below)
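To make the "whichever is higher" rule concrete, here's a quick back-of-envelope sketch in Python. The €35M floor and 7% rate are the Act's headline maximums; the revenue figures below are made up purely for illustration.

```python
# Back-of-envelope check of the "whichever is higher" rule (illustrative figures only).
def max_penalty_eur(global_annual_revenue_eur: float) -> float:
    """Worst-case fine: the greater of a flat 35M EUR or 7% of global annual revenue."""
    return max(35_000_000, 0.07 * global_annual_revenue_eur)

for revenue in (0, 10_000_000, 1_000_000_000):
    print(f"revenue {revenue:>13,} EUR -> max fine {max_penalty_eur(revenue):>11,.0f} EUR")
# Zero EU revenue doesn't help: the percentage is taken on *global* revenue,
# and the flat floor applies regardless.
```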
California SB 53 (signed Sept 2025):
- Transparency requirements for frontier AI models
- Mandatory safety incident reporting
- Less onerous than the vetoed SB 1047, but still adds a compliance burden
Source: Governor Newsom signs SB 53
What This Means for Startups
The Anthropic policy lead said:
"The bar for launching AI products is rising. In 2023, you could ship fast and figure it out later. In 2025, you need compliance from day one or you risk everything."
Three compliance tiers they described (rough budgeting sketch after the list):
Tier 1: Minimal-risk AI (general tools)
- Example: Grammar checker, basic chatbot
- Compliance: Light documentation, transparency
- Cost: $10K-$30K
- Timeline: 2-4 weeks
Tier 2: Limited-risk AI (customer-facing)
- Example: Recommendation systems, content moderation
- Compliance: Bias testing, explainability, user consent
- Cost: $50K-$150K
- Timeline: 2-3 months
Tier 3: High-risk AI (critical decisions)
- Example: Hiring tools, credit scoring, medical diagnosis
- Compliance: Third-party audits, ongoing monitoring, incident reporting
- Cost: $160K-$330K initial + $50K-$100K annual
- Timeline: 4-6 months
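To turn those tiers into something you can budget against, here's a minimal sketch built on the panel's ranges. The tier names, costs, and timelines are theirs as quoted above; the data structure and the first_year_budget helper are my own illustration, and deciding which tier a real product falls into is a legal call, not a lookup.

```python
from dataclasses import dataclass

@dataclass
class ComplianceTier:
    name: str
    one_time_usd: tuple[int, int]         # one-time compliance cost range
    timeline: str                         # rough calendar time to compliance
    annual_usd: tuple[int, int] = (0, 0)  # ongoing annual cost, if any

# Ranges as quoted by the panel (USD).
TIERS = {
    "minimal": ComplianceTier("Minimal-risk", (10_000, 30_000), "2-4 weeks"),
    "limited": ComplianceTier("Limited-risk", (50_000, 150_000), "2-3 months"),
    "high": ComplianceTier("High-risk", (160_000, 330_000), "4-6 months", (50_000, 100_000)),
}

def first_year_budget(tier_key: str) -> tuple[int, int]:
    """Rough first-year budget: one-time cost plus one year of ongoing cost."""
    t = TIERS[tier_key]
    return (t.one_time_usd[0] + t.annual_usd[0], t.one_time_usd[1] + t.annual_usd[1])

print(first_year_budget("high"))  # (210000, 430000)
```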
The Startup Dilemma
From a16z partner on the panel:
"Big tech can afford compliance. They have legal teams and budgets. Startups are caught in a bind: comply and burn through runway, or skip compliance and risk catastrophic fines later."
He shared data:
- Compliance now represents 3-8% of seed funding (quick math after this list)
- Pre-seed companies delaying launches by 2-3 months
- Some pivoting away from "high-risk" use cases entirely
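Quick math on what 3-8% means in dollars (the round sizes here are hypothetical, not from the panel):

```python
# Illustrative only: hypothetical seed round sizes, panel's 3-8% share.
def compliance_budget(seed_round_usd: int, share=(0.03, 0.08)) -> tuple[float, float]:
    """Compliance spend implied by the panel's 3-8%-of-seed figure."""
    return seed_round_usd * share[0], seed_round_usd * share[1]

for seed in (1_500_000, 3_000_000, 5_000_000):
    low, high = compliance_budget(seed)
    print(f"${seed:,} seed -> ${low:,.0f}-${high:,.0f} earmarked for compliance")
# Only the $5M round's upper end reaches the $160K-$330K quoted for high-risk systems,
# which is why founders describe compliance as eating a meaningful chunk of runway.
```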
Real Examples from the Panel
Case 1: HR Tech Startup (hiring AI)
- Built AI screening tool for resumes
- Discovered they're "high-risk" under the EU AI Act
- Compliance cost: $280K (wiped out 35% of seed round)
- 4-month delay to launch
- CEO quote: "We almost ran out of money doing compliance"
Case 2: Healthcare AI
- Medical diagnosis assistant
- Required: CE marking, clinical validation, ongoing monitoring
- Compliance cost: $450K
- Decided to launch consumer wellness version instead (lower risk category)
Case 3: LLM Wrapper Startup
- Built tool on top of OpenAI API
- Thought compliance was OpenAI's problem
- Lawyer: "No, you're the deployer. You're liable."
- Cost: $80K for documentation + legal review
- Nearly killed the company (only raised $500K)
Questions for This Community
1. How many of us are actually thinking about AI compliance?
2. For those building AI products: What risk tier are you in?
3. Is $160K-$330K compliance cost feasible for seed-stage startups?
4. Should we just avoid "high-risk" use cases entirely?
5. How do we compete with big tech that can absorb these costs?
My Take After This Panel
Honest assessment:
Before SF Tech Week: "Regulation is a future problem; focus on product-market fit now."
After this panel: "Regulation is a NOW problem; factor it into MVP planning."
The shift in founder mindset is real.
I saw founders frantically taking notes. This isn't abstract policy anymore - it's affecting what we build and how we fundraise.
More to share from other SF Tech Week sessions. Anyone else here this week?
David