Three months ago, I watched an engineer report "100% completion" on his weekly check-in while his actual work sat broken in production. That disconnect taught me everything about why traditional onboarding fails—and what actually works.
The Expensive Mistake We Used to Make
We used to approach onboarding the standard way: laptop setup, codebase walkthrough, starter tasks. Encourage people to contribute quickly, then move on to other priorities.
Months later, we'd discover someone couldn't scale with complexity, created friction with the team, or had fundamentally different standards around quality. The mismatch was expensive—not just their salary, but the team velocity lost while trying to make it work.
That's when I realized those first 90 days aren't just about productivity. They're about prediction.
What 90 Days of Real Work Actually Shows
When someone joins your team, they're on their best behavior. But authentic patterns emerge over three months of actual deadlines, honest feedback, and increasing complexity.
The engineer who becomes faster at similar tasks will continue to accelerate. The one who is still at the same level after 12 weeks will likely stay stuck indefinitely. The person who claims everything is fine while missing deadlines will never develop accurate self-assessment.
These patterns don't change. We've never seen someone who plateaued during onboarding suddenly become a growth-oriented performer six months later.
The Wake-Up Call
This year, an engineer passed our interview process with flying colors: strong technical background, excellent communication skills, solid references. Six weeks later, I was getting messages from his project manager about broken features and missed deadlines.
What confused me was that every week, this person filled out our health check saying everything was fine: "Tasks completed," "no blockers," "on track." Meanwhile, his actual output told a completely different story.
That disconnect was my lightbulb moment. We had all these feedback mechanisms (buddy system, manager check-ins, self-assessments) but we weren't connecting the dots. The signals were there from week one; it just took us two months to put them together.
The real question wasn't whether this person could do the work. It was whether our onboarding process could predict who would succeed before we invested months of team time finding out.
The System That Emerged
We started treating those 90 days as a deliberate evaluation, not just information transfer. Instead of hoping problems wouldn't surface, we created multiple ways to spot patterns early.
The buddy system became more intentional. Instead of informal check-ins, we assigned experienced engineers to hold structured daily meetings for the first two weeks, then weekly ones after that. The buddy is not the person's manager (that would create weird power dynamics); they serve as a cultural guide and a safe sounding board.
The questions became predictable: What did you work on yesterday? What's the plan for today? Any blockers or confusion? How does this compare to your previous experience?
Over time, these conversations reveal authentic patterns of integration. Are questions getting more sophisticated? Is curiosity about the business growing? Do they seek help appropriately, or do they struggle in isolation?
Project manager feedback became systematic. Instead of waiting for monthly reviews, PMs started giving structured feedback every couple of days during onboarding. Simple stuff: How's the collaboration? Are deadlines realistic and met? How's the quality of deliverables?
This catches execution problems before they become embedded habits. Some people consistently underestimate tasks but improve their calibration over weeks. Others stay wildly optimistic about their own capacity.
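One lightweight way to watch that calibration is to track the ratio of actual to estimated hours per task. Here's a minimal sketch; the numbers and the ratio heuristic are illustrative assumptions, not our actual tooling.

```python
# Sketch: estimation calibration over onboarding weeks. The ratio of
# actual to estimated hours should drift toward 1.0 as someone learns
# the codebase. All numbers here are illustrative.
def calibration_trend(estimates: list[float], actuals: list[float]) -> list[float]:
    """Actual/estimate ratio per task, in chronological order."""
    return [round(actual / estimate, 2) for estimate, actual in zip(estimates, actuals)]

# Improving calibration: the ratio converges toward 1.0 over four tasks.
print(calibration_trend([4, 4, 6, 8], [10, 8, 9, 9]))    # [2.5, 2.0, 1.5, 1.12]
# Stuck calibration: still roughly 2.5x optimistic on every task.
print(calibration_trend([4, 4, 6, 8], [10, 11, 15, 20])) # [2.5, 2.75, 2.5, 2.5]
```

The point isn't precision; it's whether the trend moves at all.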
The weekly health checks got more useful. We were already asking people to self-assess their progress, but we started comparing those responses with what everyone else was observing. The gaps became predictive.
An engineer reports high confidence while the project manager sees struggles? That's a self-calibration problem, and it rarely fixes itself.
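To make the comparison concrete, here's a minimal sketch of the gap check. The 1-to-5 scale, the field names, and the 2-point threshold are hypothetical; the real signal comes out in conversation, not arithmetic.

```python
# Sketch: flag self-calibration gaps by comparing weekly self-assessments
# against what the PM and buddy observed. The 1-5 scale, the field names,
# and the 2-point threshold are hypothetical.
from dataclasses import dataclass

@dataclass
class WeeklySignal:
    week: int
    self_rating: int      # engineer's own "how on track am I?" (1-5)
    observer_rating: int  # PM/buddy view of the same week (1-5)

def calibration_gaps(history: list[WeeklySignal], threshold: int = 2) -> list[int]:
    """Weeks where self-assessment runs ahead of observation by the threshold."""
    return [s.week for s in history if s.self_rating - s.observer_rating >= threshold]

history = [
    WeeklySignal(week=1, self_rating=5, observer_rating=4),
    WeeklySignal(week=2, self_rating=5, observer_rating=3),
    WeeklySignal(week=3, self_rating=5, observer_rating=2),  # "no blockers" vs. missed deadlines
]
print(calibration_gaps(history))  # [2, 3] -- raise it in the next 1:1
```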
My own weekly 1:1s became feedback laboratories. Instead of casual check-ins, I began providing specific, actionable feedback each week and tracking whether it led to behavior change.
The people who implement feedback quickly and ask follow-up questions are usually keepers. Those who nod politely but repeat the same patterns won't work out.
The Patterns That Never Lie
After running dozens of people through this system, clear patterns emerged.
People who thrive in the long term tend to show acceleration during onboarding. They get noticeably faster at similar tasks. Their questions evolve from "How do I do this?" to "Why do we do it this way?" to "Should we consider doing it differently?"
They make mistakes but don't repeat them. They ask for help but become less dependent over time. Their self-assessment aligns with team observations—they know when they're struggling and communicate about it proactively.
People who struggle tend to exhibit plateau patterns early. Task completion times don't improve. Question quality stays static. Feedback doesn't translate into behavior change.
Most telling is the persistent disconnect between how they perceive their performance and how others perceive it. This self-calibration gap rarely closes.
Cultural fit reveals itself through work, not social interaction. Someone can be friendly and likable, but fundamentally misaligned with how you approach problems, quality, and accountability.
Our five core values—stay a student, prioritize helping others, accountability, raise the bar, keep trying—become visible through daily work patterns, not team lunch conversations.
Month by Month Reality
Month one is about learning velocity. How quickly do they absorb your development practices, understand project context, and start contributing meaningfully? We expect the first pull request within a week and multiple meaningful contributions by week four.
The speed isn't what matters—it's the trajectory. Are they accelerating or staying stuck at the same level?
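A rough way to put the trajectory question in code: compare completion times on similar-sized tasks between the early and later weeks. The 20% cutoff and the hours below are illustrative assumptions, not a formula we enforce.

```python
# Sketch: trajectory over raw speed. Compare completion times on
# similar-sized tasks between the early and later halves of onboarding.
# The 20% cutoff and the hours below are illustrative assumptions.
def is_accelerating(task_hours: list[float], improvement: float = 0.2) -> bool:
    """True if the later half averages at least `improvement` faster than the early half."""
    mid = len(task_hours) // 2
    early = sum(task_hours[:mid]) / mid
    late = sum(task_hours[mid:]) / (len(task_hours) - mid)
    return late <= early * (1 - improvement)

print(is_accelerating([16, 14, 13, 9, 8, 7]))    # True: clear acceleration
print(is_accelerating([16, 15, 16, 15, 16, 15])) # False: the plateau pattern
```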
Month two tests execution under decreasing guidance. We gradually reduce scaffolding while increasing complexity. Can they estimate tasks accurately? Do they communicate blockers proactively? How do they handle feedback?
This is where fundamental work patterns become clear. Some people need more guidance indefinitely. Others naturally transition to independence.
Month three reveals strategic potential. Can they contribute to architectural discussions? Do they identify potential issues before they become problems? Can they operate with minimal oversight while maintaining quality?
By day 90, we know whether someone can scale with increasing responsibility or will always need close management.
When All Signals Align
The failed hire I mentioned earlier showed concerning patterns across every channel. His buddy observed poor question quality and a tendency toward isolation. The project manager documented execution problems and missed deadlines. Health checks revealed a self-assessment disconnect. Weekly feedback sessions showed resistance to behavior change.
When multiple observation methods point in the same direction, the signal becomes unmistakable.
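In code terms, it's a unanimity check rather than a single alarm. The channel names and the all-channels rule below are assumptions for illustration.

```python
# Sketch: treat a concern as a real pattern only when independent channels
# agree. The channel names and the all-channels rule are assumptions.
CHANNELS = ("buddy", "pm", "health_check", "manager_1on1")

def signals_align(concerns: dict[str, bool]) -> bool:
    """True only when every observation channel independently flags the issue."""
    return all(concerns.get(channel, False) for channel in CHANNELS)

week_six = {"buddy": True, "pm": True, "health_check": True, "manager_1on1": True}
print(signals_align(week_six))  # True: the signal is unmistakable
```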
Compare that to successful hires, where everything aligns positively. The buddy reports growing engagement and increasingly sophisticated questions. The PM sees improving execution and proactive communication. Self-assessment matches team observations. Feedback translates into immediate behavior adjustment.
These people almost always succeed long-term.
The Investment That Pays for Itself
This structured approach takes more upfront effort than traditional onboarding. Daily buddy meetings, systematic PM feedback, weekly manager sessions, careful documentation—it adds up.
But the return on investment is massive. We essentially never have hiring regrets anymore. People who pass our 90-day evaluation rarely need performance management later—no expensive course corrections or awkward team dynamics.
The engineer who gives accurate self-assessments during onboarding continues to self-manage effectively for years. The one who demonstrates growth acceleration in those first 90 days keeps accelerating as responsibilities increase.
Most importantly, teams know that new additions will strengthen rather than burden them. That confidence changes everything about how complex projects get staffed and how quickly new people get integrated into critical work.
What Actually Matters
Traditional onboarding optimizes for speed to productivity. Our approach optimizes for accurately predicting long-term fit.
The difference is profound. Instead of hoping someone will work out, we know within 90 days. Instead of managing performance problems later, we prevent them upfront.
The engineers who thrive in our systematic evaluation process continue to excel for years afterward. The patterns established in those first three months—growth mindset, accurate self-assessment, feedback integration—become the foundation for everything that follows.
Your onboarding process is already making predictions about who will succeed. The question is whether those predictions are accurate.
What patterns do you notice in the first 90 days with new engineers? What early signals predict long-term success in your experience?
If this approach resonates, consider sharing it with another engineering leader who is facing hiring accuracy challenges. Better teams start with better prediction systems.
Subscribe for weekly insights from the trenches of engineering leadership: fundamental problems, real solutions, no frameworks.