Interviewing is a Debacle

Evaluation of Candidates

In a couple of different software roles, I’ve been able to give input on candidates and whether they would be successful in the role, and I have to say, hiring frequently feels like a guessing game driven by luck.

Does scoring candidates on their knowledge of C++, Python, or Kubernetes actually determine whether they’ll be successful in the role, especially when constrained to a one-hour interview slot? I’d like to think that it does, because it protects my own ego from realizing I have no evaluation capabilities, but I just don’t know. There’s a lot more than technical skill that goes into being a successful employee, and far more than knowing language-agnostic data structures and algorithms. Just because somebody forgot how pointers work in C++ or can’t remember Python syntax in an interview setting doesn’t mean they wouldn’t be capable of looking information up on the job.

Does system design knowledge give an indicator of success for more senior members? System design, at least in more complicated software/hardware projects, reflects past experience but doesn’t necessarily show how somebody will systematically break down a problem in their everyday work. Just because you can map out features at a high level doesn’t mean you know how they’d actually be integrated and maintained at lower levels. Anybody who has done an ounce of preparation can most likely regurgitate a framework that applies to the problem, but that says nothing about creativity, innovation, practicality, resource allocation, and so on.

Are STAR behaviorals the answer, then? The common complaints are that they lead to “gut feelings,” that they select only people who fit a certain mold or resemble those you’ve worked with before, and that candidates can exaggerate their experience with no hard skills to back up their claims. Anybody can claim they’ve done X, Y, and Z, but you’ll never truly know their capabilities until you work with them.

Having been on both the giving and receiving end of interviews, I have to say the most important factor, after doing due-diligence preparation, feels like luck. Did the interviewer have a bad day and not like your particular answer? Do they dislike your background? Does the interviewer decide you’re incompetent because you didn’t code their DS&A problem the way they pictured it (a problem they may or may not be able to code themselves)? Systematic hiring, outside of behavioral answers, was started by large corporations doing massive-scale hiring, like Google and Microsoft, to reduce bias and make interviewing equitable, but that doesn’t mean it’s the best fit for smaller, more niche companies, or even for the tech giants anymore.

Should I Draw Out of a Hat?

In the attempt to roll out universal best hiring practices, the industry left behind the most important piece: the uniqueness of each candidate and their past experiences. Behaviorals are scripted. Candidates cram data structures and algorithms leading up to an interview. System design questions are generic; to get sufficient marks, you simply study the standard frameworks and know them well enough to break the problem down. While job postings advertise that they’re looking for technical rockstars, out-of-the-box thinkers, “gifted” people, and quirky perceptiveness, the interview process screens for the exact opposite. Instead of looking for what makes each candidate unique, the interview process looks for gotchas to ding points off candidates and make decisions easier. It can really feel like taking a standardized test instead of having a mutual discussion about how you can add value based on your experiences.

Obviously, there’s no 100% effective way to ascertain what each candidate brings to the table from short interviews, but we’re only fooling ourselves if we think there’s a formulaic way to fine-tune hiring practices. Is it better to double down on a formulaic approach, or can common sense win out?

While the perfect hiring process is still a work in progress, here are some tips I would like to propose:

  1. Only let principal/staff engineers conduct technical interviews. While they can still let personal bias or ego show, they’ve hopefully built up enough experience to keep their emotions out of the way and to recognize the signals that have predicted success in candidates they’ve seen before. This would reduce the number of senior/junior engineers asking unreasonably difficult problems to boost their own egos, and many principal/staff engineers were around before leetcode-style interviews existed.
  2. Limit data structures and algorithms interviews to 1-2 rounds max. At a certain point, you’re no longer testing whether the candidate has basic programming experience; you’re testing how much they studied.
  3. Use language-agnostic pair programming sessions with the interviewer. Working through a problem together can give a good sense of what it might be like to work with each other in an everyday setting.

These are only a couple of tips that I think could immensely help the process. The interview process will always have bias built into it, but many of its problems can be attributed to interviewers who aren’t properly trained to look for the right signals. I know I’ve sometimes had no idea what to make of candidates beyond how they answered my direct questions, and I think more senior members can often better sniff out what unique experiences look like and how they could contribute to the current state of the codebase and system design.

I’ve heard Google used to test candidates with questions like how many marbles fit in a 737 to see how people think on their feet and break down a generic problem. I see this as a weak proxy at best for gauging neurodiverse thinking. Since the shift to DS&A questions, preparing for interviews has become an arms race. Smart candidates with great fundamentals can get by with little practice, while other candidates who could be equally successful in the role might grind problems for months to feel confident in their problem-solving abilities. While I see DS&A dummy problems as a step up, I think this process will be revised in the coming years.

The Future of Interviews

It’s hard to foresee a completely better method of interviewing when you understand why interviews are the way they are and the pros and cons of each practice. I would love to say I have the answer to all of the drawbacks, but I think any system creates as many problems as it solves. The tips I provided should hopefully make the process a little smoother, but truly evolving the hiring process will require a radically new way of conducting interviews, something today’s candidates aren’t seeing yet.
