A rather tired conversation in my opinion. Everybody likes to complain about interviews, but every single alternative I've heard is even worse:
Take-homes are biased toward those willing to work hours for free; a "track record" is biased toward those who inflate their resume; a casual conversation is biased toward the likable and native English speakers; high-level whiteboarding is biased toward people who are good at whiteboarding.
Recently I started trying something new for our interviews. I start with a phone interview that is more about discussing their background, their expertise, and what they need to feel productive, and about explaining the role. If I think they’d be a good culture and expertise fit, I send them something like this (I’ve made a few): https://gist.github.com/tywalch/cf60e4c942ab6a80e481f4ba6754....
It’s a small “api service” that represents a POC with some bugs and bad practices. I tell them to look it over, write down notes on how they’d make it production-ready, and set up a time to discuss their thoughts in a second call. No code writing necessary.
It ends up being an excellent Rorschach test where folks can really show off their unique skill sets and experience. I ultimately let them drive the conversation, and it gives me great insight into what experiences have shaped their career and what it might be like to collaborate with them. Best of all, it resembles what some of their day-to-day might look like far better than hand-crafting some tree algorithm does.
I had an interview a few years ago where they literally hauled out some of their legacy code (unstructured PHP written in the PHP 4 era), and asked me to make a list of what I'd do to clean it up and make it more manageable. No coding involved.
It was a very pleasant experience from this side too and allowed me to express all of my domain knowledge without making a significant investment. This is such a great approach IMO.
The only way it could possibly go away for good is database libraries that don’t support dynamic SQL. But you almost always need the flexibility at some point, and SQL is so powerful that taking it off the table is hard to justify.
Yes, there are lots of ways to fix SQL injection. I'm more surprised that after knowing about this vulnerability for 20+ years, it's STILL lost on many developers. You and I pick up on it immediately, but there are (apparently) a lot of developers who think the code is "good enough, works for me" and ship it to production.
So it needs to be asked for in interviews, and if you find it and discuss it you're apparently one of the "better" programmers. The bar for being a passable programmer is literally on the floor.
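For anyone who hasn't seen the concrete fix being alluded to: the standard answer is parameterized queries, where the driver keeps user input strictly as data. A minimal sketch in Python with the stdlib sqlite3 module, contrasting the vulnerable string-splicing version with the parameterized one:

```python
import sqlite3

# In-memory database with a sample users table.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (name TEXT, role TEXT)")
conn.execute("INSERT INTO users VALUES ('alice', 'admin')")

def find_user_unsafe(name):
    # Vulnerable: user input is spliced directly into the SQL string.
    # A "name" like ' OR '1'='1 turns the WHERE clause into a tautology.
    query = f"SELECT name, role FROM users WHERE name = '{name}'"
    return conn.execute(query).fetchall()

def find_user_safe(name):
    # Parameterized: the driver binds the value, so it can never
    # change the structure of the query.
    return conn.execute(
        "SELECT name, role FROM users WHERE name = ?", (name,)
    ).fetchall()

print(find_user_unsafe("' OR '1'='1"))  # leaks every row
print(find_user_safe("' OR '1'='1"))    # matches nothing
```

Same idea in every driver and ORM; the only hard part is noticing the string concatenation in the first place.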
That’s cool, I’d actually enjoy getting this instead of being asked to build yet another production ready app in “two hours but you can spend more if you like hint hint and then we might even get back to you”
I disagree. This taps into a lot of technical know-how and communication skills.
It’s a well-rounded examination of how someone can contribute to the development of a product. That such a well-rounded assessment happens to include examining your likability or “other emotional response” is just a byproduct of testing how well someone interfaces with other humans - part of the job almost always.
The grandposter actually doesn't say "social skills" are worthless. You're kinda putting words into their mouth. They only claim that that type of interview is biased for one factor.
"Culture fit" is one of those things that seems to be used as a proxy for traits that are illegal to take into consideration during the hiring process.
At its worst, sure. But it also means: can I work with this person? Can we be productive together? Are they picking up what I and the rest of the team will be throwing down?
Everyone has had to do a terrible group project in school where the group dynamic just wasn't there and the whole thing suffered, despite all the smarts being at the table. This is an attempt to avoid exactly that, because often you just don't have the time to flub around telling someone how to document something properly for the 10th time.
We recently spent a long time filling a position. We're a relatively small team, where it is paramount that one works well with the rest of the team.
That doesn't mean you have to fit some tight social profile, far from it.
But since we're a small team each person has a lot of responsibility, and so we need to trust the right decisions are made for the right reasons, that communication won't be an issue, that you can handle dealing with customers for projects and troubleshooting etc.
We don't need the most skilled coders. We need developers who are good at finding solutions to our customers' problems, within the constraints of us being a small team with limited resources (time most of all). We need someone who's capable of learning new technologies as needs arise, and we need someone who can communicate well within the team and with our customers.
When I got hired, my actual coding skill wasn't really a topic. I didn't get a single programming quiz or similar question. They were far more interested in my background, what sort of projects I had been working on, what motivated me etc.
I can understand this concern though I can assure you when I say "culture fit" I'm considering it a two way street.
A good example: I had an interview earlier this week for a full-stack role, and the interviewee had recently come off a two-year project that was heavy on React. The project he'd be on has a jQuery frontend. I told him candidly that the project would likely never make a "modern" refactor a priority, and asked him if he could still be happy in a role that used jQuery.
Our full-time team is small and ensuring we can all collaborate and work together is very important for us to be effective. On the flip side we try to be transparent about what our team looks like too and give you a chance to decide if you could be locked in a room, hashing out which circle talks to which on a whiteboard, with us.
I've found asking people to write a small program works pretty well. Like asking someone to build a command-line app that arranges your mp3 files into folders named after their artist. You can even ask for it to be whiteboarded, or coded in a plain notepad: no code execution or IDE.
I've found that quickly tells you whether someone can write programs. Surprisingly (and maybe not so surprisingly), a lot of people can do difficult leetcode questions but can't write a small app, or can't do it well.
It'll quickly show you how the candidate comes up with some level of UX, handles OS-ish level stuff, how they structure and think about their code and its readability, how they model data, and, depending on the app you ask them to write, it can include a little algorithm as well.
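A sketch of roughly what that mp3 exercise might look like, assuming for simplicity that the artist can be read from an "Artist - Title.mp3" filename convention (a real tool would read ID3 tags with a library such as mutagen):

```python
import shutil
import sys
from pathlib import Path

def organize_mp3s(source_dir: str) -> None:
    """Move each mp3 into a folder named after its artist.

    Assumes an "Artist - Title.mp3" filename convention; files that
    don't match fall into an "Unknown Artist" bucket.
    """
    src = Path(source_dir)
    for mp3 in src.glob("*.mp3"):
        # Split on the first " - "; sep is empty if it's not present.
        artist, sep, _title = mp3.stem.partition(" - ")
        folder = src / (artist.strip() if sep else "Unknown Artist")
        folder.mkdir(exist_ok=True)
        shutil.move(str(mp3), folder / mp3.name)

if __name__ == "__main__":
    organize_mp3s(sys.argv[1] if len(sys.argv) > 1 else ".")
```

Even at this size there's plenty to talk about: the fallback bucket, what happens on name collisions, whether to copy or move, how you'd test it.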
That kind of UX sense, OS-ish knowledge, or small-app design can be learned very quickly on the job. Leetcode skill, however, cannot: you have to devote 2~3 hours per day to grind and practice.
Yes, it's extremely tired. FAANG companies are not dumb, they hire at scale and know how to find talented engineers using a rigorous/repeatable interview process. I think most of the complaints come from people who fail these interviews (I used to be one), and don't want to admit that maybe they're not good enough.
There is an abundance of people more than “good enough” for roles at FAANGs. They may simply not be stellar at the examination process used by FAANGs to assess candidates.
As much as many folks at those companies might want to partake in some fart-huffing and call that a predictor of engineering excellence, it’s not. It’s an optimization to shovel candidates at scale, and it intentionally throws out a ton of exceptional and even over-qualified individuals in order to hire a predictable style of engineer with as little investment as possible.
I think OP meant "not good enough" to pass the interview, not to perform on the job.
Saying that a certain $PREDICTOR is bad in itself doesn't say much; in particular, it doesn't say whether a better one is available under the same constraints. The prevalence of leetcode interviews is just an indication that large companies have not found a better predictor yet (again, under the same constraints with respect to time, budget, etc.).
> FAANG companies are not dumb, they hire at scale and know how to find talented engineers using a rigorous/repeatable interview process.
Eh... maybe some do. My experience is that they hire people they already know. In fact that's how I got hired at a FAANG company. I had applied there many times and been rejected. Then I saw that a team where I was a great fit had hired someone I knew and had an opening. So rather than submit my resume through their job board, I sent it to him and he passed it on to the hiring manager. Most of the rest of the team got hired because they were all a team at another company that got cut. One of them got an interview and said, "Hey, I have about 20 coworkers who just lost their jobs and already know all the stuff you need for that new product you want to write. Why not hire them all?" And they did.
Sure, there are a few who went through the regular recruiting path. But many did not. I'd say on my team, most did not. (And they're all good engineers, too.)
It takes a talented bullshitter to convince themselves that a 30 minute whiteboarding session can determine anything of worth about a person’s first month at a company, let alone year or two. “We only hire the best” is just marketing.
Well, that's why the standard interview loop consists of at least five or six 45-minute sessions, done by separate people, that intentionally do not overlap in subject matter. If the signal is not strong enough, more interviews are done. Works pretty well, I think.
Do you have any empirical data demonstrating that this “works well” or are you just assuming? What is your definition of “works well?” Low false positive? Low false negative?
Empirical data that it works pretty well? Given that this has been the de facto way of interviewing at the top tech companies during a period in which tech has absolutely exploded in ways never seen before, I'd say yeah, it's working pretty well. I think you're seeking an explanation where one is not owed. If it wasn't working well, it would be changed, just like the brain-teaser questions of the early 2000s pioneered by Google.
And look, I get it, it's frustrating. I share in that frustration as someone that wants to switch companies but knows that I will have to put in the effort outside of work to prepare for that. But I still think the process is much more meritocratic and predictable than virtually every other industry.
> I think you're seeking an explanation where one is not owed.
Forgive my curiosity and desire to make things better.
> If it wasn't working well, it would be changed, just like the brain teaser questions of the early 2000s pioneered by Google.
This is a gigantic assumption, though. Tech has been exploding for a long time and basically all companies are ravenous for engineering talent. It’s not at all clear to me that this interviewing process is the cause of this growth or in spite of it. If I had to guess I would bet they’re wholly unrelated.
> And look, I get it, it's frustrating. I share in that frustration as someone that wants to switch companies but knows that I will have to put in the effort outside of work to prepare for that.
I have no problem passing these interviews, I just think they’re silly. Their prevalence in the industry frustrates me because I think they give an unrealistic portrayal of what’s important in software engineering. But then again my greatest area of expertise is legacy software maintenance, which is the work that nobody else wants to do.
> But I still think the process is much more meritocratic and predictable than virtually every other industry.
That’s an extremely low bar.
My favorite approach that I’ve seen is Latacora’s hiring process, in which they give you a work sample problem that can be graded quantitatively in stages as well as any prep materials that you need for it.
The FAANG interview process is designed to allow people who are bad at interviewing to assess candidates in a way that's replicable. It's designed to be a flowchart that you can't screw up.
The point is to remove wiggle room. It's a domain-specific solution. Note e.g. DeepMind don't hire the same way.
Their number one problem is automating the bureaucracy at scale with people who are fundamentally not that good at evaluating candidates, or more likely don't care if they make a mistake. Everything else is downstream from that. Copying them without understanding their objectives is a mistake.
I completely agree with you. I wonder whether the people who hate Leetcode are those who refuse, or are too lazy, to devote hours per day to practice.
Because of covid-19 the job market is overwhelmed with programmers, newbie and veteran alike, competing for fewer programming openings. Companies have reasons to be extremely picky, and job seekers can either play by the rules of the game (Leetcode) or consider changing careers.
And those who have jobs also need to consider practicing Leetcode regularly (one or two questions per day, or every other day) to stay employable.
There was a time where you would just hire someone and fire them if it wasn't working out. Why make the barrier to entry so high and prone to bias when you can always just let people go who aren't doing the work?
Yes, I've worked at places where that was the approach. The problem is, if you end up doing it very often, it costs you a lot of time, and you do have to eat some initial payroll and administrative costs. Plus, out of respect for the candidate: they've quit their prior job, maybe moved and signed a new lease, and now they're unemployed 30 or 60 days later.
You really need to have some basis for confidence that the candidate is a good fit before you put yourself and them through that. Even for entry level jobs where what they know coming in isn't too important, you want smart people who can get along with the team and learn.
I've paid candidates, and been paid by potential employers, to complete take-homes. I've also given candidates the opportunity to substitute a piece of open-source code in its place.
This is admittedly still biased towards people who have available time, paid or unpaid, but I think it's a step in the right direction.
What if there were a certification authority that could attest that professionals are, indeed, competent in X, Y, Z within software? And, of course, that authority would need to be recognized by companies A, B, C. That'd certainly skip the endless, often mind-numbing repetitiveness of technical interviews (especially for those with many years of experience) and let everyone jump straight to culture/team/behavior fit.
This is basically what all engineering disciplines do and it's why I take "software engineer" with a massive grain of salt.
I have several certifications from Microsoft and Google, and for the most part they mean jack to an interviewer. I am still forced to leetcode and answer mind-numbing questions.
On a side note, of all the choices I prefer a take-home, as long as it is timeboxed to no more than 1-2 hours.
It never is one or two hours, though. It might be if you designed the thing and know exactly how to get there and how to handle all the corner cases. Oh, and also: we always underestimate.
I have yet to see a 1–2h take-home test actually be doable in that amount of time. It’s more like 8–10h to get going, have something meaningful, and produce code that isn’t inscrutable.
It can be if they give you something to start with.
One company gave out sample code and asked me to optimize it so it ran under 5 seconds. The exercise was in parallelizing or caching/reusing what you could per the requirements.
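I don't know what their sample code looked like, but a hypothetical reconstruction of the caching half of that kind of exercise: an expensive pure function called repeatedly with the same arguments, where memoizing (here with functools.lru_cache) collapses the repeated work.

```python
import time
from functools import lru_cache

def expensive_lookup(key: int) -> int:
    time.sleep(0.01)  # stand-in for slow I/O or computation
    return key * key

@lru_cache(maxsize=None)
def cached_lookup(key: int) -> int:
    # Same function, but each distinct key is computed only once.
    return expensive_lookup(key)

start = time.perf_counter()
slow = [expensive_lookup(n % 10) for n in range(100)]  # 100 slow calls
uncached_time = time.perf_counter() - start

start = time.perf_counter()
fast = [cached_lookup(n % 10) for n in range(100)]     # only 10 slow calls
cached_time = time.perf_counter() - start

assert slow == fast
print(f"uncached: {uncached_time:.2f}s, cached: {cached_time:.2f}s")
```

As an interview exercise this is nice because the candidate has to spot that the work is redundant before any tooling helps, which is most of the real skill.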
The timebox has to be enforced on the company side, meaning that you have e.g. 2h to submit your answer once you've opened their link (which you should be free to open at any time, so that you're sure you have the time allotted).
I think the <textarea> problem can be solved if the interviewer gives enough information so that the interviewee can have their IDE working correctly by the time they get started.
The pressure problem can be solved by giving more time than what would be expected in a work setting. If you expect a task to be completed in 1h, just give 2h. Of course that implies that you don't give a task which would take an expected 6h to complete.
Of course these are things to consider from the company side, if they think they want to hire people who can program under high pressure, they should probably keep that as part of the interview, for both the company's and interviewee's sake.
I would agree that a universally accepted accreditation organization, like the American Medical Association, could make interviewing for technical competency easy.
But like the AMA, it would gatekeep (and the AMA does). What kind of policy could such an organization be founded with so that gatekeeping is impossible?
Doctors _should_ be a dime a dozen. They should not cost so much - because their services are valuable and therefore, more of them should be produced to make medical costs lower.
Software engineers, if they followed the AMA model, would cost $500k in student loans to train, and there'd be barely any graduates to fill the demand.
> Doctors _should_ be a dime a dozen. They should not cost so much - because their services are valuable and therefore, more of them should be produced to make medical costs lower.
Physician salaries make up about 8% of medical costs. And even if you doubled medical school class size, you wouldn’t get 2x the doctors because there aren’t enough resident slots available.
And even if you had enough slots available, you’d still need to guarantee high salaries afterwards. Most people who are smart/driven enough to be doctors aren’t going to work 80-100 hours in a high stress position for 3 years if they are only going to come out making $80k.
So the only way to drastically increase (3x, 4x) the supply of doctors is to drastically decrease the standard of training.
So you reduce residency down to 1 year and you triple the number of doctors. Now you’ve reduced the quality of doctors, and you save people less than 5% in healthcare costs, but no one even notices, because that number gets drowned out by the continued rise in the other 92% of healthcare costs.
You're assuming the bugs are required. Residency as it is now is a terrible system that had no need to exist in its life destroying form. The industry had already figured this out and is moving work from doctors to nurses and PAs and offshore radiologists who don't need a residency.
Moving work to lower level practitioners is functionally equivalent to reducing physician requirements.
Of course companies are going to do anything they can to reduce prices. And of course the industry is going to put out flattering white papers in support of its cost-cutting measures. Yet those reduced costs haven't been passed on to consumers or measurably improved medical care, so the mere fact that the industry is doing something is a poor argument in its favor from a consumer/public health/societal standpoint.
Some companies love to outsource labor to countries with relaxed labor laws. That isn't a convincing argument against the need for labor regulations.
Maybe because building a bridge is a very standardized process that lives through regulation. Maybe software will end up there some day, hopefully a day by which I am retired.
It may make sense to do this for certain disciplines, like software tailored for other engineering disciplines (think medical grade software, airplane/car control software, etc.).
However for the broad mass of software projects, certification makes no sense and is in fact much worse than what we have now. Do you think certification can't be gamed? It will usually be much easier to game that one since certification by definition is meant to be achievable by a broad mass of people. You don't build a certification program for elites.
Professional leetcode programmer? Sure. So let's count the facts we have at the end of the interview:
* You can code yourself through a number of problems, answer follow-up questions that dig into random specifics of your problem, slightly modify the problem, etc. You can demonstrate structured problem solving, ad-hoc clean code and you make an overall good impression in that performance.
* You are able to demonstrate the necessary experience through targeted behavioural questions that probe deeply into specific aspects of your job history and alignment with the company values
* You are able to design a complex software system and are able to answer unknown follow ups and able to follow drill-downs into random aspects of your design.
Congrats, you are a professional software developer. Can you game the leetcode part? Potentially. However, I've met several "leetcoders", some of whom were even kind enough to put that on their CV... They usually fail quite spectacularly at random follow-ups and modifications to the problem statement. But even if you are able to somehow "wing it" convincingly, you need to succeed several times, and you still need to demonstrate adequate proficiency in the other two pillars (behaviour and software design).
What we might need is an "interviewer" certification.
I don't have statistics in front of me, but I'm fairly certain that the vast majority of people with "real" engineering degrees aren't Professional Engineers (which is what I assume you're referring to). On top of that, there was a SWE PE offered in the US for several years. NSPE stopped offering it for lack of interest.
I can leetcode hours for free or I can do small projects for free and usually learn a new thing while doing it instead of trying to memorize something about a binary tree.
What about a pair-programming session? Even having failed some pair-programming interviews, I've felt at least like it was a fair assessment. It's an equal time investment between myself and the company, working on something pertinent to what I would be doing, in a realistic environment.
Contrast that with whiteboard interviews, where I've been assessed, as a front-end developer, on how to calculate the number of moves a knight needs to get from position X to position Y on a chessboard, on a whiteboard, in front of three people. Contrast it with take-home interviews, which basically function as a way for the interviewing company to save time and money by putting the whole time investment on the interviewee.
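Not that the knight question deserves defending, but for the curious: the expected answer is a breadth-first search over board squares, which has essentially nothing to do with front-end work. A quick sketch:

```python
from collections import deque

# The eight (dx, dy) offsets a knight can move.
MOVES = [(1, 2), (2, 1), (2, -1), (1, -2),
         (-1, -2), (-2, -1), (-2, 1), (-1, 2)]

def knight_moves(start, goal):
    """Minimum knight moves between two squares on an 8x8 board (BFS)."""
    queue = deque([(start, 0)])
    seen = {start}
    while queue:
        (x, y), dist = queue.popleft()
        if (x, y) == goal:
            return dist
        for dx, dy in MOVES:
            nxt = (x + dx, y + dy)
            if 0 <= nxt[0] < 8 and 0 <= nxt[1] < 8 and nxt not in seen:
                seen.add(nxt)
                queue.append((nxt, dist + 1))
    return -1  # unreachable; never happens on a full board

print(knight_moves((0, 0), (1, 2)))  # one move
```

Knowing this cold says you've drilled graph searches, not that you can ship a frontend.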
> a "track record" is biased toward those who inflate their resume
Could you explain this a bit? I'd have thought this is the only real way to judge someone. People will lie but figuring that out is part of the process.
Other orgs don't have recruiting problems that FAANGs do. They can do better than imitating FAANG processes. They can afford to go wrong and self correct more. They can find a strategy that works for them which can combine whiteboarding, pairing, casual conversations, references, real life problems, letting the candidate showcase their work.
The problem isn't that these alternatives don't work and companies gave up; the problem is they aren't trying.
I like take-homes as a filter for candidates from institutions with a low signal-to-noise ratio (getting hundreds of applicants from "some technical institute" and then having a sub-5% hire rate).
But I know from college applicants that take-homes end up at the bottom of their priority pile versus an in-person interview with a real engineer. Unless it's FAANG, of course.
This shouldn't be downvoted, it's exactly right. For every "do this instead" I've ever seen in articles bemoaning the state of tech interviewing, I've seen a dozen people attacking that other thing just as vehemently. There's nothing even remotely approaching a consensus viewpoint on what we should do instead.
And with all things in life, the answer lies in the middle. You should ask programming questions, but they should be applicable, universal and not “trick” questions. You can do a take home, but if you do, make it 60 minutes and fun. You will end up whiteboarding in a room, but you should train your team to be good at this, shadow them and help them get better at it. Nothing is wrong with these techniques, per se, but often they end up as a terrible candidate or hiring experience.