Tech layoffs hit 60K in Q1—companies say AI, but is this automation or just better optics?

I’ve been tracking the layoff announcements, and the numbers are staggering: 59,121 tech workers laid off across 171 separate events in Q1 2026 alone. That’s 704 people losing their jobs every single day.

But here’s what caught my attention—the narrative has completely shifted.

The Old Script vs. The New Script

Remember 2022-2023? Companies blamed pandemic overhiring. “We grew too fast,” they said. “We misjudged demand.”

Now in 2026? It’s all about AI. According to the data I’ve seen, over 20% of layoffs (about 9,238 jobs) are explicitly linked to AI and automation in company announcements. Resume.org surveyed 1,000 hiring managers—55% expect layoffs this year, and 44% point to AI as the primary driver.

“Competitive Necessity” Became the Default Line

The pattern is consistent: Company invests in AI tools. Company audits which roles can be automated. Company announces layoffs framed as “competitive necessity” or “strategic AI transformation.”

And Wall Street loves it. When Atlassian announced its 1,600 layoffs with AI positioning, its stock jumped 2% in after-hours trading. The market rewards AI-justified headcount cuts.

But Here’s the Thing—Is It Real?

Harvard Business Review published something that stopped me cold: “Companies are laying off workers because of AI’s potential—not its performance.”

That’s the distinction I keep coming back to. Some companies like Amazon are legitimately deploying automation at scale (they hit their one-millionth warehouse robot in early 2026). But many others? They’re making cuts based on what AI might do in the future, not what it’s actually doing today.

The Engineering Leadership Dilemma

As a VP of Engineering, this puts me in an impossible position. I’m being asked:

  • Can AI reduce our engineering headcount?
  • Which roles are “at risk” of automation?
  • How do we stay competitive if others are cutting faster?

But I’m also watching young engineers in AI-exposed roles face a 14% drop in job-finding rates and a 3% unemployment increase. I’m seeing my team members get anxious every time there’s an all-hands meeting.

The Real Question

How do we separate genuine technological transformation from what some are calling “AI-washing”—using AI as a convenient narrative for cuts that are fundamentally about cost pressure, not capability replacement?

If we’re on pace for 265,000 tech layoffs by year-end (extrapolating current rates), and AI automation were truly ready to deliver those productivity gains… wouldn’t we be seeing massive velocity improvements across the industry instead of teams stretched thinner than ever?

What are you all seeing in your organizations? Is AI genuinely changing headcount needs, or is this the new politically acceptable way to execute financial strategies?

Because I’m increasingly convinced it’s more narrative than reality—and that distinction matters for how we lead our teams through this.

Keisha, you’ve nailed the distinction that keeps me up at night: AI potential versus AI performance.

I’ve been in tech long enough to watch narrative cycles. We went from “Internet will revolutionize everything” (true, but took 20 years) to “Cloud will eliminate IT jobs” (transformed them, didn’t eliminate them) to now “AI will replace knowledge workers.”

The Capability Gap Nobody Talks About

Here’s what I’m seeing at the CTO level: Yes, some automation is real and substantial. Amazon’s warehouse robotics you mentioned—that’s legitimate operational transformation with measurable ROI.

But when I talk to other CTOs off the record? Most are under intense pressure from boards and investors to show an “AI strategy.” And the easiest way to demonstrate you’re serious about AI efficiency gains is… headcount reduction.

The problem: In many cases, the AI tools aren’t actually ready to replace the roles being cut. We’re laying off people based on a roadmap, not a reality.

The Investor Incentive Problem

You mentioned Atlassian’s 2% stock bump. That tells you everything about what’s driving this. Wall Street has decided that:

  • AI investment = Forward-thinking leadership
  • Headcount reduction = Improved margins
  • Both together = Buy signal

I’ve sat in board meetings where the conversation wasn’t “Can AI actually do this work?” but rather “What headcount reduction can we credibly tie to our AI investments?”

That’s not transformation. That’s financial engineering with an AI veneer.

The Paradox That Proves Your Point

If AI automation were genuinely delivering the promised productivity gains, we should be seeing:

  • Delivery velocity increasing across the industry
  • Engineering teams shipping faster with fewer people
  • Measurable productivity improvements in public earnings calls

Instead, what are we hearing? Teams are stretched. Burnout is at record highs. Technical debt is accumulating because there aren’t enough people to maintain systems.

Meanwhile, tech companies cut 60K workers in Q1 while posting record revenues and profit margins. That’s not automation replacing work—that’s cost optimization masquerading as innovation.

Where I Think This Goes

My prediction: In 12-18 months, we’ll see a wave of “rebuilding” as companies realize they cut too deep and the AI tools aren’t covering the gap. But by then, they’ll have shown several quarters of improved margins to Wall Street, the narrative will shift again, and nobody will connect the dots back to these 2026 layoffs.

The people who pay the price? The 704 per day losing their jobs, and the teams left behind scrambling to cover the work with tools that weren’t ready for prime time.

Are there companies genuinely transforming with AI? Absolutely. But I’d estimate it’s maybe 20-30% of the ones claiming it. The rest are using AI as the most palatable justification for cuts driven by other factors: overcorrection from pandemic hiring, margin pressure, or positioning for acquisition.

The narrative is easier to sell than the truth.

Both of you are hitting on something I’m experiencing from the trenches—the gap between the AI narrative at the executive level and what’s actually happening on engineering teams.

What I’m Seeing in Financial Services

In my world (fintech and financial systems), there’s enormous regulatory scrutiny around automation. You can’t just replace a compliance analyst with ChatGPT and call it a day. Audit trails matter. Explainability matters. Human judgment in edge cases matters.

Yet I’m still getting questions from leadership: “Can AI reduce our compliance team?” “How many QA engineers can we automate away?” “What’s our AI-driven efficiency roadmap?”

The honest answer is: Some tasks? Yes. Entire roles? Not remotely ready. But that nuanced answer doesn’t fit the narrative investors want to hear.

The Human Cost Nobody’s Measuring

Keisha mentioned the stats on young engineers—14% drop in job-finding rates, 3% unemployment increase for AI-exposed roles. But there’s another cost that’s harder to quantify: trust erosion.

When you tell your team “we’re investing in AI to augment your work, not replace you,” and then three months later there are layoffs justified by AI efficiency… how do you rebuild that trust?

I’m watching my best engineers update their LinkedIn profiles and start taking recruiter calls. Not because they’re underperforming, but because they see the writing on the wall. The narrative itself is driving attrition, regardless of whether the automation threat is real.

The Operational Reality Check

Here’s what actually happens when companies cut too deep based on AI promises:

  1. Context loss: Senior engineers who understand the system architecture are gone. AI tools can generate code, but they can’t explain why we built it this way in the first place.

  2. Maintenance debt: We have fewer people to maintain systems, but AI doesn’t eliminate maintenance—it often increases it because generated code needs more review and testing.

  3. Velocity illusion: Yes, we’re generating code faster. But code generation was never the bottleneck—it was requirements gathering, architectural decisions, and cross-team coordination. AI doesn’t solve those.

Michelle’s prediction about the 12-18 month “rebuilding” cycle resonates. I’ve already seen it in my organization on a smaller scale. We cut three backend engineers last year with the justification that AI tooling would cover the gap. Six months later, we’re desperately trying to hire them back because technical debt has exploded and our release velocity is down 30%.

The Paradox of AI Hiring While Cutting

Here’s the thing that keeps me up at night: While we’re cutting 60K tech workers and blaming AI, AI engineering roles are growing at 92% with a 56% wage premium.

So we’re not actually reducing headcount needs—we’re shifting where the talent goes. We’re cutting generalist engineers, QA, support roles, and hiring ML engineers, prompt engineers, and AI infrastructure specialists.

That’s not automation. That’s workforce restructuring with AI as the theme, not the mechanism.

What This Means for Leadership

As engineering leaders, we’re stuck in an impossible position:

  • Push back too hard on AI efficiency narratives, and we look out of touch or resistant to innovation
  • Go along with it, and we damage team morale and potentially cut roles we’ll desperately need in 12 months
  • Stay silent, and we’re complicit in what Michelle correctly called “financial engineering with an AI veneer”

The hardest part? Even when we know the cuts are premature, we’re often overruled by business pressures that have nothing to do with technical capability.

I think the industry is about to learn an expensive lesson about the difference between generating code and building sustainable systems. The 704 people per day losing their jobs are the tuition payment for that lesson.

Coming at this from the product and business side, I think we’re all circling around the same uncomfortable truth, but maybe I can add some context on the market forces driving this.

The VC/Investor Pressure Is Real

Michelle mentioned board pressure, and I want to emphasize just how intense it is right now. The funding environment has fundamentally shifted. In 2020-2021, VCs funded growth at any cost. “Get to scale first, profitability later.”

2026? Completely different conversation:

  • “Show us your unit economics”
  • “What’s your path to profitability?”
  • “How are you managing burn rate?”

VCs are demanding capital efficiency, and AI provides the perfect narrative vehicle for demonstrating you’re serious about it. It’s not just about cutting costs—it’s about signaling to your current and future investors that you’re being “smart” about resource allocation.

The Paradox Luis Mentioned Is the Key

Luis pointed out the 92% growth in AI roles with 56% wage premium while 60K tech workers get cut. That’s not a bug, it’s the feature.

Companies aren’t actually trying to reduce total headcount costs—they’re trying to reposition their talent spend to match what the market (investors) values right now.

Think about it from a pitch deck perspective:

  • “We have 200 engineers” sounds expensive
  • “We have 150 engineers augmented by AI, with 20 ML specialists” sounds innovative

Same budget, different story, completely different investor reaction.
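For a rough sense of that arithmetic, here’s a back-of-envelope sketch of the repositioning. The 56% specialist wage premium is the figure quoted earlier in the thread; the $200K fully loaded cost per engineer and the headcounts are assumptions for illustration only.

```python
# Back-of-envelope sketch of the pitch-deck repositioning above.
# BASE_COST is an assumed fully loaded cost per engineer (illustrative only);
# the 56% specialist premium is the figure quoted earlier in the thread.
BASE_COST = 200_000
AI_PREMIUM = 0.56

before = 200 * BASE_COST  # "we have 200 engineers"
after = 150 * BASE_COST + round(20 * BASE_COST * (1 + AI_PREMIUM))  # 150 + 20 ML specialists

print(f"200 engineers:                     ${before / 1e6:.2f}M/yr")
print(f"150 engineers + 20 ML specialists: ${after / 1e6:.2f}M/yr")
```

Under these assumptions the two totals land within about 10% of each other — essentially the same spend, repositioned to tell a different story.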

The Business Model Question Nobody’s Asking

Here’s what I think is really happening: These companies are using AI-justified layoffs to cover for a more fundamental shift—moving from growth-focused to profitability-focused business models.

That requires different team composition:

  • Fewer engineers building new features
  • More focus on reliability and efficiency
  • Smaller customer success teams with better tooling
  • Consolidated roles and responsibilities

AI didn’t create this need—changing market conditions did. AI just provides a forward-looking narrative that’s easier to sell than “we overbuilt for a growth rate that didn’t materialize.”

The Honest Calculation

I’ve been in strategy sessions where we literally model this out:

Option A: “We’re cutting 15% of headcount due to market conditions and overcapacity”

  • Analyst reception: Weak demand signal, competitive vulnerability
  • Stock impact: -5% to -8%

Option B: “We’re cutting 15% of headcount while investing in AI transformation to drive efficiency”

  • Analyst reception: Forward-thinking, margin improvement story
  • Stock impact: +2% to +5%

Same cut. Same people losing jobs. Wildly different market perception.
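To make the asymmetry concrete, here’s a toy version of that model. The market cap is an assumed figure for a hypothetical mid-cap company; the reaction ranges are the ones cited in the two scenarios above.

```python
# Toy model of the Option A vs. Option B framing above.
# MARKET_CAP is an assumed figure for a hypothetical company;
# the analyst-reaction ranges are the ones cited in the scenarios.
MARKET_CAP = 20e9  # assumed $20B mid-cap

reactions = {
    "A: market conditions / overcapacity": (-0.08, -0.05),
    "B: AI transformation / efficiency":   (+0.02, +0.05),
}

for framing, (low, high) in reactions.items():
    lo_usd, hi_usd = MARKET_CAP * low, MARKET_CAP * high
    print(f"{framing}: {lo_usd / 1e9:+.1f}B to {hi_usd / 1e9:+.1f}B")

# Worst-to-best spread between the two framings of the identical cut
swing = MARKET_CAP * (0.05 - (-0.08))
print(f"Same layoff, different story: up to ${swing / 1e9:.1f}B of market cap")
```

The cut is identical in both rows; only the framing changes, and under these assumptions the framing alone is worth billions in perceived value. That is the incentive problem in miniature.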

Where I Disagree (Slightly) with Michelle

Michelle predicted a 12-18 month “rebuilding” cycle when companies realize AI didn’t cover the gap. I think that’s true for some companies—probably the ones doing genuine “AI-washing.”

But I suspect many companies know AI isn’t fully replacing the work. They’re cutting because they need to cut, and AI is the narrative wrapper. When they rebuild, it’ll be quietly and they’ll never connect it back to these layoffs publicly.

The Part That Keeps Me Up

The thing that bothers me most: We’re creating this massive talent churn—704 people per day losing jobs, trust erosion in the remaining teams of the kind Luis described, institutional knowledge walking out the door—all in service of a narrative that makes quarterly earnings calls sound better.

And the costs of that churn? The context loss, the rebuilding expenses, the damaged employer brands, the innovation slowdown from anxious teams? Those won’t show up in financial reporting for 12-24 months.

By then, the executives who made these calls will have hit their performance metrics, potentially moved to other roles, and the cleanup will be someone else’s problem.

The Uncomfortable Answer

Keisha asked: “Is AI genuinely changing headcount needs, or is this the new politically acceptable way to execute financial strategies?”

From where I sit, it’s overwhelmingly the latter. AI is changing some headcount needs in some functions at some companies. But the scale and speed of these cuts? That’s financial strategy dressed in innovation language.

The 60K people laid off aren’t losing jobs because AI can do their work. They’re losing jobs because companies need to show margin improvement, and AI provides a palatable story to tell investors, employees, and the market about why those cuts are strategic, not reactive.

The question we should be asking: When does this narrative collapse under the weight of its own fiction?

Reading through this thread is equal parts validating and depressing. You’re all describing the disconnect I lived through when my startup failed.

The Automation Readiness Gap

We built a design-to-code tool—literally betting the company on the premise that AI could replace significant chunks of design and frontend work. Raised on that story. Built for 18 months. Launched.

And you know what we discovered? The AI could generate code. It could create components. But it couldn’t:

  • Understand the why behind design decisions
  • Navigate the political reality of getting changes approved
  • Handle the 50 edge cases that pop up in real products
  • Know when to break the pattern vs. when to follow it

We overestimated AI capability by about 3 years. By the time we figured that out, we’d burned through most of our runway.

The Human Cost at Scale

The thing that’s hitting me hardest in this discussion: 704 people per day.

That’s not a statistic—that’s 704 people updating LinkedIn, explaining to their kids why they’re home during the day, canceling subscriptions, stretching severance packages, wondering if their whole career trajectory just got invalidated.

And for what? David’s Option A vs. Option B calculation is brutal in its honesty. The same people lose their jobs either way—the only difference is which story plays better in analyst calls.

The Part Nobody’s Saying Out Loud

Companies are posting record revenues while cutting workers and blaming AI. Let me translate what that actually means:

“We figured out how to extract more value with fewer people, and AI gives us a narrative that sounds like innovation instead of exploitation.”

That’s not automation. That’s not transformation. That’s just… efficiency gains rebranded.

Where I See the Same Thing in Design

Luis mentioned that AI tools can generate code but can’t explain why we built things a certain way. That resonates so hard.

In design, I see the same pattern. AI can create mockups. It can suggest layouts. But it can’t tell you:

  • Why this user flow exists (because we learned users were getting confused at the checkout step)
  • Why this component is structured weirdly (because of a compromise with engineering constraints)
  • Why we decided NOT to build something (because research showed users didn’t actually need it)

That institutional knowledge—the context, the war stories, the “here’s why we don’t do it that way”—that’s what walks out the door with those 704 people per day.

The 12-Month Cycle Is Already Happening

Michelle predicted 12-18 months until rebuilding. I’m watching it happen now in my network:

Company cuts 20% of engineering staff in Q4 2025, citing AI efficiency. By March 2026, they’re:

  • Missing deadlines because context is gone
  • Burning out remaining engineers with increased scope
  • Quietly contracting former employees as consultants at 2x their old rate
  • Posting job openings that look suspiciously like the roles they cut

But they’ll never admit the connection. The earnings calls will talk about “strategic hiring” and “specialized roles” without mentioning that they’re rebuilding what they destroyed.

What This Means for People Like Me

I’m in a design systems lead role now—exactly the kind of position Keisha described as potentially “at risk.” And I’m watching my manager get asked: “Can AI build our design system? How many designers do we really need?”

My honest answer? AI can help. But it can’t replace the judgment, the stakeholder management, the cross-team alignment, the “why” conversations that actually make design systems work.

But that nuanced answer doesn’t fit on a slide deck. And when the pressure comes to show “AI efficiency,” nuance is the first casualty.

The Question That Haunts Me

Keisha asked if this is automation or just better optics for financial strategies. David answered it’s overwhelmingly the latter.

But here’s what keeps me up: Even if we all know it’s financial strategy dressed as innovation… what changes?

The investors still reward the AI narrative. Wall Street still bumps stocks after layoff announcements. Executives still hit their performance metrics. The 704 people per day still lose their jobs.

Knowing it’s mostly narrative doesn’t stop it from happening. And that might be the most depressing part of all.

The Uncomfortable Truth

When my startup failed, I learned that the market doesn’t care about your intentions or your innovation if the numbers don’t work.

What I’m seeing now: The market also doesn’t care if your AI efficiency story is real, as long as it improves margins and sounds plausible.

And the people paying the price for that plausibility? They’re not in the room when these decisions get made.