Two years ago, our edtech startup made a deliberate shift: we eliminated degree requirements and moved to skills-based hiring. It was transformative. We started finding incredible engineering talent from bootcamps, self-taught developers, and career-switchers who would have been filtered out by traditional resume screens.
The quality of our technical hires improved dramatically. Fewer false negatives. More diverse backgrounds. People who could actually ship code rather than just talk about algorithms they memorized for interviews.
But six months in, I started noticing a pattern that troubled me.
The Pattern Nobody Warned Me About
Some of our most technically brilliant engineers—people who aced our coding challenges, who demonstrated deep technical knowledge, who had impressive portfolios—were struggling. Not with the code. With remote work itself.
I’m not talking about Zoom fatigue or missing office snacks. I’m talking about fundamental work patterns:
- Waiting for permission instead of moving forward: Engineers who could solve complex algorithmic problems but got blocked waiting for clarification on simple product questions that could be resolved with reasonable assumptions.
- Synchronous dependency in an async world: Brilliant engineers who needed immediate feedback on every decision, turning our async-first culture into an all-day Slack conversation.
- Inability to work through ambiguity: Strong technical contributors who excelled when given crisp requirements but floundered during discovery phases when we were figuring out what to build.
One example crystallized this for me. We hired an engineer—let’s call him Alex—who crushed our technical assessment: top 5% of all candidates we’d seen, with strong algorithmic thinking, clean code, and great architectural instincts.
But Alex struggled in our remote environment. He needed near-constant check-ins. He’d send a Slack message, then wait hours for a response instead of making a judgment call. He’d get stuck on ambiguous product requirements rather than documenting assumptions and moving forward.
In an office, Alex would have been fine. He could have walked over to my desk or grabbed a product manager for a quick conversation. But in our distributed, async-heavy team, these collaboration patterns created bottlenecks.
We Were Evaluating Half the Picture
Here’s what I realized: We had optimized our hiring process to identify technical skills, but we hadn’t validated remote work competencies at all.
The skills that make someone successful remotely—self-direction, comfort with ambiguity, proactive communication, documentation discipline, autonomous decision-making—are completely orthogonal to technical ability.
They’re not “soft skills.” They’re critical work skills. And we weren’t assessing them.
According to recent industry data, roughly 36% of job openings now include remote or hybrid options, and remote hiring moves about 29% faster for technical roles. The market has adapted to remote work. But I’m not convinced we’ve adapted our hiring criteria to match.
What I’m Trying Now
I’ve started building a dual evaluation framework:
- Technical proficiency (what we were already doing well)
- Remote work readiness (what we were missing)
For remote readiness, I’m looking at:
- Past autonomous work: Have they built side projects? Contributed to open source? Worked independently before?
- Communication patterns in interviews: Do they ask clarifying questions asynchronously during take-home projects? Or do they immediately jump on a call?
- Comfort with ambiguity: During case studies, do they document assumptions and move forward? Or wait for perfect information?
- Evidence of self-unblocking: Can they describe times they were stuck and figured it out themselves?
This isn’t about filtering out people who need support—good remote organizations should provide structure. But there’s a baseline level of self-direction that remote work requires, and I don’t think we can hire our way around it with better onboarding alone.
The Question That Keeps Me Up
Are we selecting for technical skills while accidentally filtering against the collaboration patterns that make remote teams successful?
I’d love to hear from other engineering leaders: How do you evaluate remote work competencies during hiring? Have you seen similar patterns? And critically—how do you do this without it becoming a subjective “culture fit” filter that replicates bias?
Because the shift to skills-based hiring was a huge step forward for equity and access. I don’t want to undo that progress. But I also can’t ignore that some technically strong engineers struggle in remote environments, and I owe it to both them and the team to get this right.
What are you seeing in your organizations?