245,000 Tech Jobs Cut in 2025 — And 44% of HR Leaders Expect AI to Affect at Least Half of All Jobs in 2026

I’ve been in engineering leadership for 14 years. I’ve built teams, grown orgs from 12 engineers to 200+, and celebrated countless promotions and launches. But this past year, I’ve also had to sit across from talented engineers and tell them their roles were being eliminated. That part never gets easier. And what’s coming next might be worse.

Let me lay out what we know.

The Numbers Are Staggering

In 2025, approximately 245,000 tech workers lost their jobs globally, according to tracking data from Layoffs.fyi and TrueUp. About 70% of those cuts came from U.S.-headquartered companies. This wasn’t a blip — it was the third consecutive year of mass layoffs following the 2022-2023 correction. Intel alone cut roughly 33,900 positions as it restructured from 109,000 employees down to 75,000. Microsoft eliminated around 15,000 roles. Meta cut 3,600 in 2025 and then followed up with another 1,500 from Reality Labs in January 2026.

But here’s what keeps me up at night: according to the World Economic Forum’s 2025 Future of Jobs Report, 41% of employers globally plan to downsize their workforce due to AI by 2030. In the United States, that number is 48%. A CNBC survey of senior HR leaders found that 89% expect AI to impact jobs in 2026, with roughly 44% saying it will affect at least half of all positions.

These aren’t hypothetical projections from academics. These are decisions being made in boardrooms right now.

The Rise of Invisible Unemployment

Jason Lemkin at SaaStr coined a term that perfectly captures what I’m seeing: invisible unemployment. It doesn’t show up in Bureau of Labor Statistics reports. There are no headlines. Instead, the jobs just… don’t materialize. Hiring processes drag on for months and then quietly fizzle. Roles get posted, interviews happen, and then the req gets pulled because someone realized an AI workflow could handle 60% of what that person would have done.

IBM’s voluntary attrition dropped to under 2% in 2025 — the lowest in 30 years. People aren’t leaving because there’s nowhere to go. And when people don’t leave, companies don’t backfill. When companies don’t backfill, there are no job openings. It’s a vicious cycle that creates a labor market that looks stable on paper but feels suffocating for anyone trying to find work.

SaaStr predicts 2026 will see a significant acceleration of this invisible unemployment, particularly in entry-level roles. The end of junior sales, junior marketing, and yes — junior engineering positions is already underway.

How AI Changes Headcount Planning

I sit in headcount planning meetings every quarter. The conversation has fundamentally shifted. Two years ago, we’d say: “We need 6 more engineers to hit our roadmap.” Now we say: “We need 4 engineers plus better AI tooling, and we can still hit the roadmap — maybe even exceed it.”

A Harvard study of 62 million workers found that when companies adopt generative AI, junior developer employment drops by 9-10% within six quarters, while senior employment barely changes. Companies that needed 10 developers are finding that 4 developers with AI tools can deliver equivalent output. One experienced engineer working alongside AI can do what used to require a three-person team.

But here’s the nuance that gets lost: 95% of generative AI pilots in the enterprise have failed to deliver measurable ROI, and two-thirds of tech leaders who integrated AI into their backend haven’t actually saved a single headcount. The promise of AI efficiency is driving layoffs, even when the reality hasn’t caught up. An HBR analysis in January 2026 put it bluntly: companies are laying off workers because of AI’s potential, not its performance.

Our Ethical Obligation as Leaders

Here’s where I want to get uncomfortable. Engineering leaders have an ethical obligation to be honest about what’s coming. Not fear-mongering. Not pretending everything is fine. Honest.

If you’re a Director or VP and you know your company is planning to reduce headcount through AI-driven attrition, you owe it to your team to:

  1. Be transparent about the timeline. Don’t let people find out through a Slack message at 8am on a Tuesday.
  2. Invest in reskilling now. If AI ops, prompt engineering, and AI safety are the growth areas — train your people for those roles before the reorg happens.
  3. Redefine career ladders. The traditional IC track of Junior → Mid → Senior → Staff needs to account for the fact that AI is compressing the bottom of the funnel. Help people skill up faster.
  4. Advocate for your team in planning meetings. When the CFO says ‘do more with less,’ push back with data about what’s actually achievable vs. what’s a fantasy spreadsheet.

I’m not anti-AI. I use Copilot and Claude daily. My teams are more productive than they’ve ever been. But productivity gains and workforce displacement are two sides of the same coin, and pretending otherwise helps no one.

The 245,000 people who lost their jobs in 2025 weren’t casualties of inefficiency. Many of them were excellent engineers, PMs, and designers who happened to be in roles that got caught in a structural shift. The least we can do is be honest about the shift that’s still coming.

What are you seeing at your companies? Are your leadership teams having these conversations openly, or is it all happening behind closed doors?

Luis, I respect the hell out of your honesty here, and I agree that leaders owe their teams transparency. But I want to push back on the narrative framing, because I think it’s feeding a panic that doesn’t match what I’m seeing on the ground.

My Team Grew Because of AI — Not Despite It

I run a 140-person engineering org. In 2025, we added 22 net-new headcount. Not because we were ignoring AI — because we were leaning into it. Here’s what actually happened:

We stood up a dedicated AI Operations team (8 people) to manage our LLM infrastructure, fine-tune models, build evaluation pipelines, and handle prompt versioning. These roles didn’t exist 18 months ago. We hired 5 AI Safety and Trust engineers because when you deploy AI-generated outputs to customers, you need people who understand hallucination detection, bias auditing, and compliance. We created a Developer Experience team (4 people) focused entirely on integrating AI tools into our engineering workflows — custom Copilot configurations, internal knowledge retrieval systems, automated code review pipelines.

None of these people would have been hired in 2023. AI created their roles.

The Layoffs Are Structural, Not AI-Driven

Let’s be real about what actually happened in 2021-2022. Tech companies went on an unprecedented hiring binge fueled by zero-interest-rate money and pandemic-era demand assumptions. Headcount at major companies grew 20-40% in two years. The 2023-2025 corrections are the hangover from that binge, not evidence that AI is replacing workers at scale.

When Intel cuts 33,900 jobs, that’s a manufacturing and strategic restructuring story. When Meta cuts Reality Labs, that’s a pivot from VR to AI. These aren’t cases of AI automating people out — they’re strategic bets that didn’t pay off.

You cited the stat that 95% of enterprise AI pilots haven’t delivered measurable ROI. I’d actually use that as evidence for my point: if AI isn’t delivering ROI yet, how can it be the primary driver of layoffs? The layoffs are about correcting over-hiring and improving margins for Wall Street. AI is the convenient narrative.

The Real Opportunity Is Being Missed

What frustrates me is that the “AI is coming for your job” narrative makes engineers freeze up instead of leaning in. I’ve seen talented people spend more time doom-scrolling layoff trackers than building AI skills. Meanwhile, the engineers on my team who embraced AI tools early are getting promoted faster, shipping more impactful work, and becoming indispensable.

The WEF report you cited also found that 77% of employers are investing in reskilling their workforce to work alongside AI, and 47% are planning to transition employees from declining roles into growing ones. That’s the story I wish got more airtime.

I’m not saying there’s zero displacement. I’m saying the displacement is concentrated in roles that were already overstaffed, and the growth is real for anyone willing to adapt. The question isn’t whether AI will change engineering — it already has. The question is whether we’ll lead with fear or with strategy.

I want to add the CFO’s perspective here, because the headcount decisions Luis is describing don’t originate in engineering leadership meetings — they originate in finance and board conversations. Understanding that pipeline changes the whole picture.

The Headcount-to-Revenue Ratio Is the Real Story

Every board deck I’ve seen in the past 18 months includes a slide on headcount efficiency — revenue per employee, or more specifically, engineering cost as a percentage of revenue. The benchmark used to be that engineering should run at 15-20% of revenue for a mature SaaS company. Boards are now pushing for 10-15%, and they’re pointing at AI as the justification.

Here’s the math they’re running: if AI tools can increase developer productivity by even 20-30%, then a team of 100 engineers should deliver the output of 120-130. That means you either (a) deliver 20-30% more product, or (b) reduce headcount by 15-20% while maintaining current output. Guess which option boards prefer when growth is slowing?
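The arithmetic behind that framing fits in a few lines. Here is a minimal sketch of the model; the 25% productivity gain is an illustrative assumption, not a figure from any actual board deck:

```python
def headcount_options(engineers: int, productivity_gain: float):
    """Given a team size and an assumed AI productivity gain, return:
    (a) effective output in engineer-equivalents if headcount stays flat,
    (b) the headcount needed to hold output flat at today's level."""
    effective_output = engineers * (1 + productivity_gain)
    headcount_for_same_output = engineers / (1 + productivity_gain)
    return effective_output, headcount_for_same_output

# 100 engineers with an assumed 25% productivity gain:
output, reduced = headcount_options(100, 0.25)
print(round(output))   # 125 engineer-equivalents if you keep everyone
print(round(reduced))  # 80 engineers to hold output flat (a ~20% cut)
```

Notice that the model is symmetric on paper but not in the boardroom: option (a) shows up as a bigger roadmap, option (b) shows up directly on the margin line.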

This is what “do more with less” actually means in practice. It’s not a motivational slogan — it’s a financial model.

Why Boards Are Demanding Efficiency Gains Now

Three forces converging:

  1. Interest rates stayed higher for longer. The cost of capital is meaningfully higher than the ZIRP era. Every dollar of operating expense gets more scrutiny. Headcount is typically 70-80% of a tech company’s opex, which makes it the biggest lever.
  2. Public market multiples reward efficiency. Companies that demonstrate expanding margins are getting rewarded with higher revenue multiples. Investors aren’t paying for headcount growth anymore — they’re paying for efficient growth. Look at Meta’s stock performance after their “year of efficiency.”
  3. AI gives boards a credible narrative. Before AI, cutting 20% of engineering was seen as a red flag — you must be in trouble. Now, cutting 20% while saying “we’re investing in AI productivity” is seen as forward-thinking. AI provides political cover for financial decisions.

Frameworks for Justifying Headcount in an AI World

For engineering leaders trying to protect their teams, here’s what actually works in finance conversations:

Tie headcount to revenue outcomes, not activity. Don’t say “we need 6 engineers to build feature X.” Say “these 6 engineers will enable $XM in new ARR based on customer commitments.” Revenue math beats efficiency math.

Quantify AI productivity gains honestly. If your team is already using AI tools, show the actual productivity data — cycle time improvements, deployment frequency, defect reduction. If the gains are real, use them to argue for more ambitious roadmaps rather than fewer people.
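As a sketch of what “show the actual productivity data” can look like in a finance deck, here is a minimal before/after summary; the metric names and sample values are hypothetical, not benchmarks from any real team:

```python
def pct_change(before: float, after: float) -> float:
    """Percent change from `before` to `after` (negative = decrease)."""
    return (after - before) / before * 100

# Hypothetical pre-AI vs. post-AI engineering metrics.
metrics = {
    # metric name: (baseline before AI tooling, value after)
    "median_cycle_time_days": (5.0, 3.5),
    "deploys_per_week": (12, 18),
    "escaped_defects_per_release": (4.0, 3.0),
}

for name, (before, after) in metrics.items():
    print(f"{name}: {before} -> {after} ({pct_change(before, after):+.0f}%)")
```

Deltas like these are more credible to a CFO than a vendor’s productivity claim, and they support the “more ambitious roadmap” argument rather than the “fewer people” one.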

Model the risk of under-investment. Boards respond to risk. Show them what happens to product velocity, technical debt, and customer satisfaction if you cut too deep. The companies that over-corrected in 2023 are now scrambling to re-hire, and that re-hiring is expensive.

The uncomfortable truth is that the 245K layoffs weren’t all rational, optimized decisions. Many were reactive, driven by peer pressure (“everyone else is cutting”) and board pressure (“your margins should look like theirs”). The companies that navigated this thoughtfully — investing in AI while retaining experienced talent — are the ones pulling ahead now.

I appreciate the leadership perspectives here, but I want to share what this looks like from the IC side — because the lived experience is different from the boardroom view.

The Anxiety Is Real and Rational

I’m a mid-career backend engineer, 8 years in. I’m good at what I do. I’ve architected distributed systems, debugged gnarly production incidents at 3am, mentored junior devs, and shipped features that moved revenue numbers. By every traditional measure, I should feel secure.

I don’t.

Every week I watch AI capabilities grow. Claude and GPT can now write code that would have taken me hours to produce. Cursor and Copilot autocomplete not just lines but entire functions. I read about companies replacing 10-person teams with 4 engineers plus AI tooling, and I do the math: am I one of the 4 who stays, or one of the 6 who goes?

The Harvard study Luis cited — junior employment dropping 9-10% within six quarters of AI adoption — hits close to home. I’m not junior anymore, but “mid-level” is starting to feel like it’s in the compression zone too. The data showing that only 7% of new hires at major tech companies are recent graduates (down from 9.3% in 2023) tells me the bottom of the ladder is being pulled up. How long before the middle rungs follow?

How I’m Future-Proofing (Not by Competing with AI)

I stopped trying to be faster than AI at writing code about a year ago. That’s a losing game. Instead, I’ve been deliberately building skills that AI genuinely can’t replicate — at least not yet:

System design under real constraints. AI can generate a system design diagram, but it can’t sit in a room with a product manager, a security lead, and a database team and negotiate the tradeoffs between consistency, availability, cost, and time-to-market. That’s a human skill that requires organizational context, political awareness, and judgment.

Stakeholder management. When a P0 incident happens and the CEO is asking for an update every 30 minutes, someone has to triage, communicate, and make hard calls about what to fix now vs. later. AI can help analyze logs, but it can’t own the incident response.

Debugging production systems at 3am. This is the one that gets underestimated. When something breaks in a way that doesn’t match any documented pattern — a race condition triggered by a specific sequence of events under load — you need someone who understands the full stack, the deployment history, and the organizational context of why that code exists. AI is a great assistant here, but it can’t own the problem.

Mentoring and team dynamics. Helping a struggling engineer find their confidence, navigating a difficult code review conversation, building consensus on architectural decisions — these are deeply human activities.

What I’d Ask Leadership

Keisha, I hear you that AI created new roles on your team, and that’s genuinely encouraging. But for every org standing up AI Ops teams, there are five quietly freezing hiring and letting attrition do the cutting. Carlos, your frameworks for justifying headcount are helpful — but they only work if engineering leaders actually use them in those finance meetings instead of just accepting the cuts.

What I’d ask of leaders: don’t just tell us to “lean in and adapt.” Give us the time, the training budget, and the psychological safety to do it. The engineers who are doom-scrolling layoff trackers instead of building AI skills aren’t doing it because they’re lazy — they’re doing it because they’re scared, and nobody in leadership is giving them a credible path forward.