We Used GenAI to Migrate From Angular to React. Here's Everything That Went Wrong

I need to write the honest version of this story because I keep seeing posts about “AI-powered migrations saved us months!” and I want to tell you what actually happened when we tried it.

Spoiler: It didn’t save us time. It cost us 7 extra months and taught us painful lessons about what AI is actually good at versus what the marketing says.

The Context

Side project that grew into a real product: a design system management tool built in Angular 3 years ago, about 40,000 lines of code. The Angular ecosystem was moving away from us, React had better component library options we wanted to use, so it seemed like the right time to migrate.

The Plan That Sounded Great

Read case studies about GenAI handling 70% of framework migrations. Thought: “Perfect! We’ll use AI to convert components, spend 2 months cleaning up edge cases, ship it.”

The plan:

  • Month 1: AI converts 60% of components, we review and test
  • Month 2: Handle remaining 40% manually, integrate, fix bugs
  • Month 3: Polish and ship

Timeline: 2-3 months. Reality: 9 months. Let me tell you what went wrong.

Month 1: The Honeymoon Phase

AI (Claude and GPT-4 combined) converted components at incredible speed. Fed it an Angular component, got back a React component. Syntax looked clean, imports were right, it even had prop types.

Converted 60% of our codebase in 3 weeks. Felt like magic. Showed demos to stakeholders. Everyone was impressed.

Then we tried to actually run it.

Months 2-4: Everything Broke

Problem 1: State Management Translation
Angular uses RxJS observables everywhere. AI converted them to React useState, which is syntactically correct but semantically wrong: an observable models a stream of values over time, while useState holds a single snapshot. We lost all our reactive data flow and had to manually rewrite state management for 80% of components.
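
The gap can be shown without React or RxJS at all. Here's a minimal sketch (TinySubject is a hand-rolled stand-in for RxJS's BehaviorSubject, not the real API): a subscriber keeps receiving every emission, while the naive "read it into state once" translation captures a snapshot and silently drops the stream.

```typescript
// Minimal stand-in for an RxJS BehaviorSubject (illustrative, not the real API).
class TinySubject<T> {
  private listeners: Array<(v: T) => void> = [];
  constructor(private value: T) {}
  subscribe(fn: (v: T) => void): void {
    fn(this.value);          // replay the current value, BehaviorSubject-style...
    this.listeners.push(fn); // ...then deliver every future emission
  }
  next(v: T): void {
    this.value = v;
    this.listeners.forEach((fn) => fn(v));
  }
}

// Angular-style: the component subscribes once and reacts to every emission.
const theme$ = new TinySubject<string>("light");
const seen: string[] = [];
theme$.subscribe((t) => seen.push(t));
theme$.next("dark"); // subscriber sees it

// Naive "translation": copy the value once into local state. Syntactically
// fine, but later next() calls never reach this copy -- the stream is gone.
let themeState = "light"; // what a generated useState("light") amounts to
theme$.next("high-contrast");
// themeState is still "light"; the reactive data flow was lost.
```

The fix in real React would be a subscription wired through useEffect (or a store library), which is exactly the kind of restructuring the AI never proposed.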

Problem 2: Event Handling Inconsistency
AI converted Angular event bindings to React onClick handlers, but didn't preserve the event handling logic correctly. Some converted components used React's synthetic events, some attached native DOM listeners, and some mixed both. Integration broke everywhere.

Problem 3: Accessibility Features Lost
Our Angular components had ARIA labels, keyboard navigation, focus management. AI converted the visual structure but stripped accessibility. We didn't realize until we ran an accessibility audit. Had to manually re-add a11y to 90% of components.
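
To make "stripped accessibility" concrete, here's a sketch of the kind of keyboard-navigation logic that survives a visual conversion only if someone carries it over deliberately. The function name and listbox framing are our illustration, not code from the original project:

```typescript
type NavKey = "ArrowDown" | "ArrowUp" | "Home" | "End";

// Compute the next focused index for a vertical listbox with wrap-around
// (the logic behind a roving-tabindex pattern).
function nextFocusIndex(current: number, count: number, key: NavKey): number {
  switch (key) {
    case "ArrowDown":
      return (current + 1) % count; // past the end wraps to the top
    case "ArrowUp":
      return (current - 1 + count) % count; // before the start wraps to the bottom
    case "Home":
      return 0;
    case "End":
      return count - 1;
  }
}
```

In a React component this would drive `tabIndex` and `aria-activedescendant`; the AI output kept the list markup but dropped the handlers that call logic like this.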

Problem 4: Responsive Design Broken
AI converted CSS but didn’t understand our responsive breakpoint system. Components looked fine in demo (desktop browser) but completely broke on mobile. Our grid system logic was lost in translation.
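
Part of why the CSS conversion couldn't save this: our breakpoint behavior lived in component logic, not stylesheets. A hedged sketch of that kind of logic (the names and pixel values here are illustrative assumptions, not our actual system):

```typescript
// Minimum viewport width (px) at which each breakpoint applies.
const BREAKPOINTS = { sm: 0, md: 768, lg: 1024, xl: 1440 } as const;
type Breakpoint = keyof typeof BREAKPOINTS;

// Pick the largest breakpoint whose minimum width fits the viewport.
function resolveBreakpoint(widthPx: number): Breakpoint {
  let match: Breakpoint = "sm";
  for (const [name, min] of Object.entries(BREAKPOINTS) as [Breakpoint, number][]) {
    if (widthPx >= min) match = name;
  }
  return match;
}

// Grid columns per breakpoint -- the kind of mapping a plain CSS-in-JS
// conversion cannot reconstruct, because it never appears in the stylesheets.
const GRID_COLUMNS: Record<Breakpoint, number> = { sm: 1, md: 2, lg: 3, xl: 4 };
```

Converting the stylesheets verbatim while losing this mapping is exactly how a component looks fine on a desktop demo and collapses on a 375px phone.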

Months 5-7: Complete Rewrite of AI-Generated Code

At month 4, we had to have an uncomfortable conversation: “The AI-generated code is actually slower to fix than rewriting from scratch.”

We made the painful decision: keep the AI-generated structure as reference, but rewrite the logic manually. It would have been faster if we'd just done a manual migration from the start.

The problem: AI optimized for syntactic conversion, not semantic correctness. It could translate <button> to <button>, but it couldn’t translate “how this button behaves in our system.”

Months 8-9: Finally Shipping, Lessons Learned

We finally shipped in month 9, about 7 months later than planned. The AI-generated code was mostly gone, replaced by manual React implementations that actually understood our requirements.

What Went Wrong: AI Can’t Translate Mental Models

The fundamental issue: framework migration isn’t syntax translation, it’s translating between different mental models of how UIs work.

Angular Mental Model: Dependency injection, observables, two-way binding, decorators
React Mental Model: Components as functions, unidirectional data flow, hooks, props down/events up

AI can convert the syntax but it doesn’t understand the paradigm shift. It generates React code that looks like Angular translated to JavaScript, not actual React patterns.

What Would Have Worked

Looking back, here’s what we should have done:

1. Use AI for Greenfield React Components
Instead of converting Angular to React, use AI to generate new React components from designs/specs. AI is great at boilerplate generation, terrible at logic translation.

2. Manual Migration with AI-Assisted Documentation
Manually rewrite components in React, use AI to document what the old Angular code did. This helps with understanding but doesn’t create false confidence.

3. Extract Shared Logic First
Pull business logic out of Angular components into framework-agnostic modules. Then build new React UI around that logic. AI can help with extraction but not translation.
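
A sketch of what "framework-agnostic" means in practice, using our design-system domain. Everything here (the token shape, the alias-resolution rule) is an illustrative assumption, not our actual code, but the point stands: no Angular or React import appears, so both the old service and the new hook can call it unchanged.

```typescript
interface DesignToken {
  name: string;
  value: string;
  alias?: string; // a token may reference another token by name
}

// Resolve alias chains (e.g. button.bg -> color.primary -> "#3366ff"),
// guarding against cycles. Pure logic: no DI, no hooks, no framework.
function resolveToken(name: string, tokens: Map<string, DesignToken>): string {
  const seen = new Set<string>();
  let current = tokens.get(name);
  while (current?.alias) {
    if (seen.has(current.name)) throw new Error(`Alias cycle at ${current.name}`);
    seen.add(current.name);
    current = tokens.get(current.alias);
  }
  if (!current) throw new Error(`Unknown token: ${name}`);
  return current.value;
}

// Example (hypothetical tokens): button.bg aliases color.primary.
const tokens = new Map<string, DesignToken>([
  ["color.primary", { name: "color.primary", value: "#3366ff" }],
  ["button.bg", { name: "button.bg", value: "", alias: "color.primary" }],
]);
const resolved = resolveToken("button.bg", tokens);
```

Once logic like this lives in its own module with its own tests, the migration shrinks to rebuilding thin UI shells around it, which is the part AI scaffolding is actually decent at.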

4. Smaller Scope
We tried to convert 40K LOC at once. Should have started with one feature area, validated the approach, then scaled. Would have discovered problems in week 1, not month 4.

What AI Was Actually Good At

Not everything failed. AI genuinely helped with:

Test Generation: AI analyzed Angular component behavior and generated React test cases. Caught many issues during migration. This was valuable.

Documentation: AI created docs explaining what each Angular component did, which helped when rewriting in React. Not perfect but useful.

Boilerplate: AI generated the folder structure, import statements, and prop types - mechanical stuff. Saved time on setup.

CSS Conversion: Basic CSS-in-JS conversion worked well. Had to fix responsive logic manually but initial conversion was helpful.

The pattern: AI works for mechanical transformation, fails at conceptual translation.

The Honest Assessment

If I could go back and tell myself 9 months ago:

“Don’t use AI for the migration itself. Use AI to understand the existing code and generate scaffolding for the new code. But write the actual React components yourself, with understanding of React patterns.”

We didn’t save time. We wasted 7 months trying to fix AI-generated code that looked right but worked wrong.

The Uncomfortable Question

How many companies are making the same mistake - trying to use AI for framework migrations without realizing it’s translating syntax, not semantics?

The marketing says “AI handles 70% of migration.” Maybe that’s true for version upgrades (Next.js 13 to 14) where the paradigm doesn’t change. But for framework changes (Angular to React), the paradigm shift is exactly what AI can’t handle.

Questions for Community

  1. Has anyone successfully used AI for major framework migrations? What made it work? What was different from our experience?

  2. Are version upgrades (same framework, new version) fundamentally different from framework changes in how AI can help?

  3. What’s the right scope for AI-assisted migration? Smaller components? Specific types of logic?

This was a humbling experience. I wanted to believe the AI hype. I learned the hard way that AI is a tool, not a replacement for understanding the actual problem you’re solving.

Appreciate the honesty. This is why we're requiring a "proof of concept" phase for any AI-assisted major refactoring - 2 weeks, one module, full integration tests. If AI can't handle a representative sample, don't scale the approach. The cultural challenge: junior engineers believe the AI marketing, seniors are skeptical from experience. Your story is perfect for our engineering all-hands about realistic AI expectations. How did you manage team morale when the 2-month promise became 9 months?

From a product perspective, this is the conversation that never happens early enough: 7 extra months equals massive opportunity cost.

Let me calculate what 7 months of engineering capacity costs for a typical product team:

  • 7 months × 3-person team = 21 engineer-months
  • Could have shipped 2-3 major features
  • Could have addressed customer feature requests
  • Could have improved key metrics (activation, retention, monetization)
  • Lost competitive positioning (competitors don’t wait)

The AI Migration Promise Created Planning Problems:

When engineering estimated “2 months with AI assistance,” product planned around that:

  • Committed to Q2 feature launches
  • Told customers features were coming
  • Built roadmap assuming migration done by April

Then migration took until November. Those commitments failed. Customer trust damaged. Feature roadmap disrupted.

The Hidden Product Lesson: AI migration promises create unrealistic planning assumptions. I’m now pushing back on any engineering estimates that assume “AI will accelerate this.” Want two timelines: “AI-assisted estimate” and “manual estimate.” Plan for the manual timeline, hope for the AI acceleration.

Question: How did this impact your product roadmap and customer commitments? Did you have features blocked by the migration? How did you communicate delays to users/customers?

The engineering lesson is “AI can’t translate paradigms.” The product lesson is “don’t let AI promises drive planning timelines.” Both valuable.