Skills-Based Hiring Is Replacing Degree Requirements in 2026: Are We Fixing the Talent Pipeline or Quietly Lowering the Bar?

Something has been bugging me for the last two hiring cycles and I want to get this community’s honest take.

We just wrapped Q1 hiring at my EdTech company. For the first time, zero of our engineering job postings require a four-year degree. Not “preferred,” not “or equivalent experience”—just gone. And honestly? The results are forcing me to rethink assumptions I’ve held for 16 years.

The Numbers That Made Us Change

The macro trend is undeniable:

  • 53% of employers removed degree requirements in 2025, a 30% increase from 2024
  • 70% of employers now use skills-based hiring for entry-level roles (up from 65% last year per NACE)
  • 87% of hiring managers are shifting toward skills-based evaluation over rigid degree requirements
  • Over 65% of US mid-level job postings no longer strictly require a Bachelor’s degree

Google dropped degree requirements for most technical roles. IBM eliminated them for 50% of US positions. Apple, Netflix—the list keeps growing.

What We’re Actually Seeing

Here’s where it gets interesting. After 8 months of skills-first hiring:

The good:

  • Our candidate pipeline expanded roughly 6x for general roles and over 8x for AI-specific positions
  • Time-to-hire dropped significantly with structured skills assessments replacing credential screening
  • Two of our strongest hires this year came through coding bootcamps, not CS programs
  • Our retention improved—skills-based hires seem to stay longer (industry data suggests 25-34% higher retention)

The uncomfortable:

  • Our senior engineers initially pushed back hard. “We’re lowering the bar” was the literal phrase used in a hiring retro
  • Assessment design is way harder than just filtering for a degree. We spent 3 months building our technical evaluation framework
  • Some bootcamp grads needed more ramp-up time on systems design and CS fundamentals
  • We still don’t have great data on long-term performance (only 8 months in)

The Real Question

McKinsey’s research says skills-based hiring is 5x more predictive of job success than education alone. But I’ve also seen the Burning Glass Institute research showing that the gap between stated intentions and actual practice is massive—85% of companies claim to do skills-based hiring, but only 1 in 700 hires is actually affected by degree requirement removal.

So which is it? Are we genuinely expanding access to engineering careers for people who were arbitrarily gatekept by credential requirements? Or are some companies using “skills-based hiring” as cover for cost optimization—hiring cheaper talent and calling it inclusive?

I’m specifically curious about:

  1. If you’ve removed degree requirements, what did your assessment process actually look like? How do you evaluate systems thinking and architectural judgment without using a degree as a proxy?

  2. For those who think this is lowering standards—what’s the evidence? Is it the bootcamp grads specifically, or is it a broader concern about rigor?

  3. How do you handle the internal resistance from tenured engineers who feel like their CS degrees are being devalued?

This isn’t an academic question for me. I’m building the hiring playbook for 2026 H2 right now and I need to decide whether to double down on this approach or course-correct. The data looks promising but 8 months isn’t enough to know for sure.

Would love to hear from folks who are further along on this journey—or who tried it and reversed course.

Keisha, this one hits close to home and I appreciate the honest framing.

I’m a first-generation college graduate. My CS degree from UTEP was the thing that got my foot in the door at Intel back in 2008. So when skills-based hiring first came up in our leadership discussions, my initial reaction was complicated—it felt like the inverse of pulling up the ladder behind you: I climbed that ladder, and now we were saying nobody needed it at all.

But then I looked at our actual hiring data at [financial services company] and the picture got real clear, real fast.

What We Found in Our Own Pipeline

We did an internal study last year: took 200 engineers across our org, correlated their performance reviews, promotion velocity, and incident response quality against their educational background. The correlation between degree prestige and engineering performance was statistically indistinguishable from zero. Not just weak—indistinguishable from noise.

What DID correlate strongly:

  • Years of hands-on experience with relevant technology stacks
  • Quality of work samples submitted during the interview process
  • Structured behavioral interview scores on collaboration and problem-solving
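If you want to run a similar sanity check on your own data, the core calculation is just a point-biserial correlation (Pearson's r where one variable is 0/1). Here's a minimal pure-Python sketch; the numbers below are fabricated for illustration, and our real study obviously used the full ~200-engineer dataset with proper significance testing:

```python
from statistics import mean, stdev

def pearson(xs, ys):
    # Pearson's r; with a 0/1 variable this is the point-biserial correlation.
    mx, my = mean(xs), mean(ys)
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / (len(xs) - 1)
    return cov / (stdev(xs) * stdev(ys))

# Fabricated example: 1 = degree from a "prestige" program, 0 = not.
prestige = [1, 0, 1, 0, 0, 1, 0, 1, 0, 0]
# Fabricated performance-review scores, deliberately showing no relationship.
perf = [3.5, 3.5, 3.4, 3.6, 3.4, 3.6, 3.5, 3.5, 3.3, 3.7]

print(f"point-biserial r = {pearson(prestige, perf):+.2f}")
```

The point isn't the toy snippet, it's that the check is cheap: if r comes out near zero on your real data, you have your answer.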

We now run a 3-stage assessment process:

  1. Take-home technical challenge (real-world problem, not leetcode—we give them a degraded microservice and ask them to debug and improve it)
  2. Live system design session (whiteboard an architecture for a real business scenario)
  3. Behavioral + values panel (cross-functional, includes a non-engineer from product)

The take-home alone filters more effectively than any degree requirement ever did.

The “Lowering Standards” Pushback

To your question about senior engineer resistance—I dealt with this directly. One of my staff engineers said, verbatim: “I spent 4 years and $80K on a CS degree. Are we saying that doesn’t matter?”

My response: “Your degree absolutely helped YOU. The question is whether requiring it for everyone screens out people who can do the job equally well.” That reframe helped. We also published our internal performance data (anonymized) so the team could see that degree vs no-degree wasn’t predictive.

The resistance faded when the data was transparent. Not with arguments—with numbers.

One Concern I Do Have

Where I think this conversation gets messy: we can’t pretend that assessment quality doesn’t vary wildly between companies. If your “skills-based hiring” is just removing the degree requirement and doing the same mediocre interview process, you WILL get worse outcomes. The degree was functioning as a lazy proxy for learning ability. If you remove the proxy, you need to replace it with something better—and most companies haven’t invested in that.

Building good assessments is expensive and hard. That’s the part that doesn’t show up in the 53%-removed-requirements headline.

This is one of those topics where the loudest voices tend to be at the extremes, so let me try to bring some nuance from 25 years of hiring across Microsoft, Twilio, startups, and now as CTO.

The Strategic View: This Is a Talent Market Problem, Not an Ideology Problem

Let’s be clear about WHY this is happening at scale. It’s not primarily because companies had an equity awakening (though some did). It’s because:

  1. The talent market is brutal. Time-to-fill for developer roles is projected to double in 2026. You literally cannot afford to filter out qualified candidates over a credential.
  2. AI is reshaping what “qualified” means. The skills needed are shifting so fast that a 2022 CS curriculum is already partially obsolete. Demonstrated ability to learn and adapt matters more than what you learned 4 years ago.
  3. The data supports it. IBM’s own analysis after removing degree requirements from 50% of roles showed better skills fit, improved diversity, AND longer retention.

What “Lowering Standards” Actually Looks Like

Here’s my unpopular opinion: the companies that ARE lowering standards are the ones that removed degree requirements without investing in better assessment. And that group is larger than we’d like to admit.

The Burning Glass Institute finding that only 1 in 700 hires is actually affected by degree removal tells me that most companies are performatively removing the requirement while their hiring managers still filter for it informally. That’s the worst of both worlds: you get the PR of “skills-based hiring” without the pipeline benefits, and you create cynicism.

At my company, we invested heavily in structured assessment:

  • Every interview question maps to a specific competency rubric
  • Interviewers are calibrated quarterly (yes, we train interviewers)
  • We track first-year performance against interview scores and iterate the process

That’s expensive. Small companies may not be able to afford this level of rigor. And I think that’s a legitimate concern.

The CS Degree Question

I have a CS degree. It taught me things I still use: algorithmic thinking, complexity analysis, operating system internals. But it also taught me things that were irrelevant within 5 years of graduation.

The question isn’t “do CS degrees have value?” Of course they do. The question is: “is a CS degree the ONLY reliable signal of engineering capability?” And the answer to that, in 2026, is clearly no.

Bootcamp grads, self-taught engineers, career-switchers from physics or math—I’ve seen exceptional performers from all these backgrounds. I’ve also seen CS degree holders from top programs who couldn’t ship production code.

My Recommendation

@vp_eng_keisha—for your H2 playbook, I’d suggest:

  1. Keep skills-first but invest in assessment quality. Don’t just remove the degree filter; replace it with something more predictive.
  2. Track cohort performance rigorously. Compare 6-month and 12-month outcomes across hiring pathways. Share the data with your team.
  3. Build structured onboarding paths. If bootcamp grads need more systems design ramp-up, build that ramp. It’s cheaper than losing them or never hiring them.

The companies that win in 2026-2027 won’t be the ones with the strictest or most relaxed hiring criteria. They’ll be the ones with the most accurate assessment of actual capability.

OK so I have a perspective on this that’s maybe different from the leadership angles here.

I’m someone who pivoted careers. Started as a graphic designer, taught myself UX, eventually founded a startup (that failed, but that’s a different thread), and now I lead a design systems team. My educational background is in fine arts. Zero CS coursework. Zero.

And you know what? I’ve been on both sides of this equation.

When I Was the “Non-Traditional” Hire

Early in my engineering-adjacent career, I got rejected from SO many roles because of the degree filter. Not because I couldn’t do the work—I had a portfolio of shipped products, open source contributions, and side projects. But the ATS filtered me out before a human ever saw my application.

When I finally got hired at a company that evaluated my actual work instead of my credentials, I outperformed engineers with CS degrees on the same team. Not because I’m some genius—because I had spent years in the trenches actually building things while they’d spent years in classrooms theorizing about building things.

That said…

The Part Nobody Wants to Admit

There ARE gaps. I didn’t understand Big-O notation for years. I had to self-teach data structures when I hit scaling problems. I once wrote a quadratic algorithm for something that should have been linear and it took down a staging environment. A CS grad probably wouldn’t have made that mistake.
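For anyone who hasn't hit this themselves, here's a contrived Python sketch of exactly the kind of accidental quadratic I mean (not my actual code): deduplicating with a list membership test rescans the list on every iteration, while a set does the same job in linear time.

```python
# Accidental O(n^2): `item not in seen` scans the whole list on every
# iteration, so the loop is quadratic overall.
def dedupe_quadratic(items):
    seen, out = [], []
    for item in items:
        if item not in seen:   # O(n) scan inside an O(n) loop
            seen.append(item)
            out.append(item)
    return out

# Same result in O(n): set membership checks are amortized O(1).
def dedupe_linear(items):
    seen, out = set(), []
    for item in items:
        if item not in seen:
            seen.add(item)
            out.append(item)
    return out

print(dedupe_linear([3, 1, 3, 2, 1]))  # → [3, 1, 2]
```

Both versions pass the same tests on small inputs, which is exactly why the mistake survives until production-scale data hits it.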

But here’s the thing—I learned from that mistake and never made it again. The question isn’t whether non-traditional hires have gaps. EVERYONE has gaps. The question is whether those gaps are learnable or fundamental.

Systems design? Learnable. Algorithmic thinking? Learnable. Work ethic, creativity, user empathy, ability to ship? Those are harder to teach, and they don’t correlate with where you went to school.

What Worries Me About the Current Trend

My real concern isn’t that companies are removing degree requirements. It’s that some companies are using “skills-based hiring” to:

  1. Pay less. “We hire based on skills, not credentials” can also mean “we don’t have to compete on salary because our hires don’t have the credential leverage to negotiate.”
  2. Avoid investing in talent development. Why fund a mentorship program or learning budget when you can just hire people who already have the exact skills you need right now?
  3. Optimize for replaceable parts. Skills-based hiring can become “hire for today’s tech stack, discard when the stack changes.”

If your skills-based hiring is genuinely about expanding access and evaluating people fairly—that’s amazing. If it’s about reducing headcount cost with a diversity veneer, people will figure it out fast.

My Advice

Build your assessment around work product and problem-solving process, not trivia. The best interview I ever did was with a company that gave me a real (simplified) problem from their codebase and asked me to talk through how I’d approach it. No right answer—they wanted to see how I think.

That told them more in 45 minutes than my degree (or lack thereof) ever could.

Jumping in from the product/business side because I think there’s an angle missing from this thread.

The Talent Pool Math Is the Real Story

Everyone’s debating quality vs access, but let me frame this as a market problem:

The engineering talent shortage is real and getting worse. Time to fill developer roles is projected to double in 2026. If you’re filtering out qualified candidates based on a credential that McKinsey says is 5x less predictive than skills assessment—you’re not maintaining standards. You’re voluntarily shrinking your addressable talent market during a supply crisis.

From a product leader’s perspective, that’s like saying “we only want customers who find us through organic search” during a growth phase. It’s not principled—it’s self-defeating.

The Business Case Numbers

Let me put some frameworks around what Keisha shared:

Cost of vacancy: An unfilled senior engineering role costs a typical company $1,000-$2,500/day in delayed features, overloaded teams, and missed deadlines. If skills-based hiring cuts your time-to-hire by even 30%, the ROI is massive.

Pipeline expansion: Going from 1x to 6x candidate pipeline doesn’t mean you hire 6x more people. It means you have 6x more candidates to choose from, which means you can be MORE selective on the dimensions that actually matter. Skills-based hiring, done right, should RAISE your effective bar, not lower it.

Retention economics: If skills-based hires stay 25-34% longer (as the data suggests), the compound savings on recruiting, onboarding, and knowledge transfer are substantial. A 30% reduction in annual turnover for a 50-person engineering team is worth $500K-$1M annually depending on your cost structure.
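To make those numbers tangible, here's the back-of-envelope version. Every input is an assumption: midpoints of the ranges above, plus a guessed 60-day baseline time-to-fill and a guessed $200K cost per departure.

```python
# All inputs are assumptions: midpoints of the ranges above, plus
# guessed baseline time-to-fill and replacement cost.
cost_of_vacancy_per_day = 1_750   # midpoint of $1,000-$2,500/day
baseline_time_to_fill = 60        # days (assumed)
time_to_hire_cut = 0.30           # 30% faster via skills assessments

vacancy_savings = cost_of_vacancy_per_day * baseline_time_to_fill * time_to_hire_cut
print(f"vacancy savings per senior hire: ${vacancy_savings:,.0f}")

# Retention side: 50-person team, assumed 20% baseline annual turnover,
# a 30% reduction in that turnover, $200K assumed cost per departure.
team_size = 50
baseline_turnover = 0.20
turnover_cut = 0.30
cost_per_departure = 200_000

retention_savings = team_size * baseline_turnover * turnover_cut * cost_per_departure
print(f"annual retention savings: ${retention_savings:,.0f}")
```

The retention figure lands around $600K, inside the $500K-$1M range above, which is the point: the claim survives a back-of-envelope check with conservative inputs.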

Where I Push Back

That said, I want to challenge something from @maya_builds’ post. The concern about companies using skills-based hiring to “pay less” is real, but I’d argue it’s a separate problem. Bad-faith actors will exploit any hiring model. The solution isn’t to keep degree requirements as a negotiation floor—it’s to build transparent compensation bands tied to role levels and performance, not credentials.

At my company, we publish salary bands internally. A Level 3 engineer makes the same range whether they have a PhD or a bootcamp certificate. The degree doesn’t factor into comp. The work does.

The Assessment Investment Question

@cto_michelle raised the right point about assessment quality being expensive. But here’s how I’d think about that investment:

  • Cost of building strong skills assessment: Maybe $50-100K in engineering and recruiting time upfront, plus ongoing calibration
  • Cost of one bad hire: $150-300K (6 months salary + recruiting + ramp + team impact)
  • Cost of one missed great hire (filtered by degree requirement): Impossible to measure, but in a talent-scarce market, potentially far higher

The math works. The investment in assessment quality pays for itself with the first avoided bad hire or first great non-traditional hire you would have otherwise missed.
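The break-even arithmetic, using midpoints of those ranges (assumptions, not benchmarks):

```python
# Inputs are midpoints of the (assumed) ranges above.
assessment_cost = 75_000   # one-time build cost, midpoint of $50-100K
bad_hire_cost = 225_000    # per bad hire, midpoint of $150-300K

# Fraction of one bad hire you need to avoid to recoup the investment.
break_even = assessment_cost / bad_hire_cost
print(f"break-even: avoid {break_even:.2f} bad hires")  # → avoid 0.33 bad hires
```

In other words, avoiding a single bad hire pays for the assessment build roughly three times over, before you even count the upside of great non-traditional hires.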

My take: double down on skills-based hiring for H2, Keisha. But pair it with the assessment rigor Michelle described. The companies that figure this out first have a structural talent advantage that compounds over time.