It's interesting to see how consistently "a standardized cross-company interview for software engineers" has been a cursed problem in the industry.
Unlike airline pilots (or I'm certain many other professions), every company in the valley insists on re-interviewing a candidate in their own custom and unique way, instead of trusting some sort of an industry-wide certificate only achievable through a standardized test. Wonder if this will ever be solved.
I remember way back in the day when it felt like Triplebyte might finally figure that one out, but it unfortunately never happened.
If you create a standardized test it will be gamed. Even with the small modicum of standardization around interview questions that we currently see, people have published books like Cracking the Coding Interview, making it easier for people who don’t have the skills for a particular job to pass interviews at any place that uses standard-ish questions.
Furthermore, as an avowed enemy of “Clean Code”, I don’t want to see standardization because I fear that well-promoted ideas that I think are terrible would become required dogma. I prefer chaos over order that I don’t like.
The current system is already gamed and virtually standardized. The only difference that official standardization would present is that applicants would no longer have to go through the Leetcode gauntlet each time they want to switch jobs, which would save a breathtaking amount of time and effort currently occupied by wasteful redundancy in the interview process.
Corporations can use that standard exam/license as a baseline and then focus their interviews on domain-specific questions and the like. The existence of standardization does not negate custom processes.
> The current system is already gamed and virtually standardized.
This is only remotely true if you're looking at a very narrow slice of software development jobs. Those companies and jobs are overrepresented here, but remember that even in the US the majority of developers do not work at recognizable "tech companies." Much less the rest of the world.
I've been a professional software developer for over a decade, changed jobs many times, and never done an intense algorithm interview and I haven't been going out of my way to avoid them. I've even worked at some startups, though never one based in NY or SF. A handful of massive tech companies and their practices are disproportionately influential but they are not actually the norm speaking broadly.
This might be a regional thing, but I have done probably around 100 technical interviews in my career (both enterprise and startups), mostly in the Bay Area, and the vast majority of these involved algorithm questions that had no relation to the job function. Most were around the difficulty of "find the largest palindrome in a string" or "reverse a singly linked list". On the harder end were things like "serialize and deserialize a tree".
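(For calibration, the "reverse a singly linked list" one is only a few lines in Python; a minimal sketch, assuming a bare-bones Node class:)

    class Node:
        def __init__(self, val, next=None):
            self.val = val
            self.next = next

    def reverse(head):
        # Walk the list once, pointing each node back at its predecessor.
        prev = None
        while head is not None:
            head.next, prev, head = prev, head, head.next
        return prev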
I'll defend this a little bit in the sense that "had no relation to the job function" is just kind of unavoidable in interviews, or at least hard to avoid without paying major costs. The only way to have an interview that even comes close to reflecting real work is a pretty long take-home, and there are good arguments for not doing those (not least that most candidates really don't want to).
But yeah, the entangling of algorithms questions and coding questions is unfortunate. They're just separate skills. Some people are excellent coders who think "big-O" means something obscene, and some people are a walking discrete math textbook who can't problem-solve to save their lives. Triplebyte split (and Otherbranch splits) the two into separate sections, with coding problems explicitly designed NOT to require any of the common textbook algorithms. It's sometimes a little darkly funny how quickly a particular sort of candidate folds when asked to do something novel that steps outside what they've been able to memorize.
> and the vast majority of these involved algorithm questions that had no relation to the job function.
Consider the problem that you're hiring a software engineer and the company has openings on several different teams that have only the job title in common.
Do you have four different sets of problems that are related to job functions? Does the interview take four times longer? Or do you extend offers to a dozen software developers this week and then have the teams with the most need, or where the applicant appears best suited, add the headcount there?
If you are giving the same interview to all the candidates (so that you're trying to eliminate the bias of asking different questions of different people)... would that not tend to be solved by asking more abstract questions that are then compared against an agreed-upon rubric as to which candidates "meet expectations"?
... And what if it is for one team with one set of problems? Do you have the candidates sign NDAs so that you can show them the actual problems, and then pursue legal action if something leaks? And if today's actual problem is solved tomorrow (unrelated to the applicant's solution... though I've experienced some "here are some real problems we are having" interviews, with startups trying to get an hour or two of consulting time for free with no intent to hire), do you give a different interview to someone next week?
The standardized and unrelated work means that you're not accidentally getting in trouble with HR over some bias in the questions, or running afoul of the FLSA by having someone do uncompensated work that might be related to real things being done.
I once got dinged at Facebook for using a tree serialization scheme that differed from the expected one in a way that saved linear space but made deserialization slightly harder to explain :)
You're not wrong, but Triplebyte exists to service that very narrow (but very well-funded, very lucrative, and very influential) segment, and so does this site, the fund behind this site, and most of the commenters in this thread.
True. Except that should be past-tense. Triplebyte existed to serve a narrow market, and failed largely because of that narrow view.
I think people really do underestimate how much FAANG and Silicon Valley practices have skewed the viewpoint of engineers and technology jobs in the United States. Not just in terms of comp, but in terms of architectural and technology approaches as well. Most of what the big guys do works for them at their enormous scales, but is plain dumb for the vast majority of companies and use cases. Yet we are all infected by the FAANG approach.
People underestimate how much cultural baggage influences things.
I'll give a very simple example. I did a few SWE interviews in 2020, and several companies did the initial screen over the phone, and the on-site over Zoom.
In both cases it was a remote interview. There was no reason not to do both over Zoom. The only reason was that the previous process was a phone interview followed by an in-person onsite; they realized they had to replace the in-person onsite with Zoom, but they didn't think to replace the phone screen. If you were starting from scratch, it would make no sense.
In this case, the whole origin of the Leetcode interview is "we're going to hire the smartest people in the world." You can dispute whether that was true back in 2009, but it was certainly part of Google / Facebook's messaging. Now, in 2024, I think it has morphed into something much closer to a standardized test, and even if people might begrudgingly admit that, the cultural baggage remains. If a company used a third-party service, they'd be admitting they're hiring standardized candidates rather than the smartest people in the world. Which might be an "unknown known" - things that everybody knows but nobody is allowed to admit.
I definitely agree that this industry, for all of its self-proclaimed freethinking and innovation, is rife with cultural baggage. Allowing an independent standardized interview step would defy the not-invented-here syndrome that many leading corporations subscribe to: the belief that their process is best. Not to mention that reducing friction for applicants (by not repeating your Leetcode stage) is inimical to employee retention incentives; that is, the friction keeps employees from shopping around for new employers. So me saying that we oughta have a standardized test to save everybody's time is more wishful thinking than anything.
This is definitely a factor. "You don't understand, we have a really high bar and we only hire the best people" is a bit of a meme in recruiting circles because you will never ever ever ever not hear it on a call.
I don't think we found it a barrier to getting adoption from companies though - perhaps because "we're a really advanced company using this state of the art YC-backed assessment" satisfies that psychological need? Unclear.
> but it was certainly part of Google / Facebook's messaging.
It entered the online cultural zeitgeist before that, with Microsoft talking about their interview processes, and indeed early interview books were written targeting the MSFT hiring process that many other companies copied afterwards.
I graduated college in 2006 and some companies still did traditional interviews for software engineers (all soft skills, and personality tests, no real focus on technology, except maybe some buzzword questions), and then you had the insane all day interview loops from MSFT and Amazon.
Back then, Google famously only hired PhDs and people from Ivy Leagues, so us plebs didn't even bother to apply. In comparison, Microsoft would give almost everyone who applied from a CS program at least a call back and a phone screen.
How do we let someone fly hundreds of people through the upper atmosphere with a certificate, but you can't make a login page with javascript without a unique multi-day interview for each distinct company?
Obviously the current situation is crazy, but part of the issue is that the specific asks for a particular developer job are dependent on 1) the stack in use and 2) the org chart at that company.
1 is obvious: if you need JS devs, most hiring managers won't want to hire Pascal devs and hope they figure it out. We can question the wisdom of this, but it is the reality.
2 is less obvious but not super obscure. Depending on how you structure your teams, similar positions at different companies might require more full-stack knowledge, or better people skills, or something else. IME there is little to no standardization here for developer roles, especially compared to something like HR or Accounts Payable, or even very similar IT-adjacent industries like game development.
Fix both of these issues and we would be able to have something more like a formal apprentice/journeyman/master system for various classes of software developer. As it is, each role actually is pretty much totally unique, at least compared to similar roles in other companies (there tends to be more standardization within the same company).
To (1): this is true, and more true than it should be, but I think this falls into the category of "trying to optimize expected value" more so than "a hard requirement" at most employers. There are usually only, like... maybe 1-2 hard tech requirements even for pretty picky roles if they're confident someone is good. It's just that they don't know ahead of time who is, so they may as well bet on the better-matched candidates.
For air transport in the U.S., it's not just one certificate, it's many. You get your private license, instrument rating, multi-engine rating, commercial certificate, instructor certificate, and finally the airline transport pilot certificate. And you're not allowed to even think about that last step until you've accumulated 1500 flight hours on the previous steps. Being allowed to write a Javascript login page is easy pickings compared to that.
My dad was an airline pilot. To hear him talk he was surrounded by morons.
That said, at least with airplanes you have to put in the flight hours and they have to be documented. Most commercial pilots start off at very low pay doing puddle jumpers and regional jets. Military experience might give you a leg up because it shows you've spent a lot of hours going through a very regimented program.
My dad was fortunate to get into the industry right before flying became an option for the masses rather than a luxury and business affair. It was a high-paying, glamorous job.
Not exactly, but that's why we have different certificates, endorsements, and type ratings that show demonstrated competence with each type of airplane.
A popular aircraft type is likely to be built for 20+ years and be flying for 40 or more. The 737 was rolled out in 1967, and the fourth generation is still being built. This is rather like a major chunk of the world's computing infrastructure running on Fortran (F77, F90... 2008, 2023).
Maybe it’s time to start thinking about doing to software what has been done with other professional fields: licensing and checking out of various levels. If I have to spend 30 hours or so learning, practicing and demonstrating my knowledge about aircraft instrument procedures before I can attempt that as a pilot in real airspace, maybe it’s not that big of a jump that we’d license different software features, and going outside of those bounds would be subject to loss of license.
Then we’d know this set of language features you’re familiar enough with to hold that cert. It might cut down on the waste and proliferation of useless tech that seems to be strangling the industry because people just want it on their resume.
It would do enough to dissuade companies from hiring non-licensed engineers (hey, you could actually call yourself an engineer and not feel like an imposter), and would put hard liability on the things that definitely need it: financial and health data, which seem to be ripe for exploits and disclosures.
One way or another the insanity of the current system needs to stop.
I think the hard thing is that there’s just a lot of mediocre programmers out there writing mediocre software. Should they be accredited or not?
I think a lot of average programmers will end up accredited if they see it as a path to a job, just like we see with Microsoft certification programs. And if that happens, I wouldn’t want the accreditation test to be my company’s hiring bar. I’ll still run my own interview to make sure candidates are actually good. And so will most companies. So the time-consuming interviews won’t actually go away.
The one big problem licensing would solve is that we could insist some baseline amount of security knowledge is in the tests. Right now, plenty of people get jobs handling sensitive personal data without knowing the first thing about how to keep data secure. That just can’t continue. It’s insane.
I have interviewed people holding a Java certification from Oracle who, while able to pass that test, were unable to use their knowledge to develop solutions or solve problems.
I could ask how the class loader worked or about the syntax associated with a particular construct (at that language level - not anything later) and get the correct answer.
They could pass tests and follow instructions.
Licensure for problem solving is difficult. Extend that to different domains and it is an even harder problem to solve.
> The Software Engineering PE exam, which has struggled to reach an audience, will be discontinued by the National Council of Examiners for Engineering and Surveying after the April 2019 administration. The exam has been administered five times, with a total of 81 candidates.
> NCEES’s Committee on Examination Policy and Procedures reviews the history of any exam with fewer than 50 total first-time examinees in two consecutive administrations and makes recommendations to the NCEES Board of Directors about the feasibility of continuing the exam.
> In 2013, the software exam became the latest addition to the family of PE exams. The exam was developed by NSPE, IEEE-USA, the IEEE Computer Society, and the Texas Board of Professional Engineers—a group known as the Software Engineering Consortium. Partnering with NCEES, the consortium began working in 2007 to spread the word about the importance of software engineering licensure for the public health, safety, and welfare.
> This collaboration was preceded by Texas becoming the first state to license software engineers in 1998. The Texas Board of Professional Engineers ended the experience-only path to software engineering licensure in 2006; before the 2013 introduction of the software engineering PE exam, licensure candidates had to take an exam in another discipline.
Not a pilot, but it sounds like 250 hours minimum to become one commercially [1]. My guess is that unless you can buy your own airplane, getting to your 250 hours will take even longer than that, since somebody who owns a plane has to trust you to be in charge of it.
It varies by state, but becoming a Master X often requires >5 years (e.g. 5 for a CO Master Plumber [2], 4 for a TX Master Plumber [3], 7, or 5 + a Bachelors, in NY [4]). Imagine needing to pair program with somebody for 5 years before you could be considered Senior. That'd probably produce a lot more professional development than the current copy-from-Stack-Overflow meta, but it would also really irritate a lot of young professionals. You couldn't even quit and start your own business, because you can't write software as a journeyman without senior supervision!
It would destroy coding boot camps and outsourcing but many people already pursue B.S. degrees in Computer Science or Computer Engineering. If the laws changed to require a 5 year paid apprenticeship that allowed you to skip the college degree I don't think too many people entering the field would be upset so long as we planned and accounted for the transition (like a path towards accreditation for currently employed software developers since no accredited "masters" exist to pair under right now).
I think another issue is that there's no feasible inspection process or liability management. We have crap like SOC 2 and PCI compliance, but they're so dependent on self-reporting that they mean little in practice. Mountains of spaghetti code accumulated over decades are not inspectable the way a building is. Software salary costs are already very high, and this would push them even higher. It would eliminate offshoring/outsourcing as an option for businesses, so they would lobby hard against it. Uncertified software products from other countries would need import controls, and all other sorts of issues would arise that don't exist in our unregulated environment right now.
It's also hard to imagine what sort of massive software failure would be required to spur regulatory change that we haven't already experienced, collectively shrugged at, and done nothing about.
I'm sorry, are you serious or being deliberately obtuse?
Even the most minimal pilot's license, a.k.a. the PPL, still requires approximately 40 hours, give or take, of instruction, taking the ground school exam, and finally passing a check ride with a designated examiner.
Flying a passenger airliner as you seem to be alluding to basically means that you have a commercial license and likely an ATP - we're talking hundreds if not thousands of hours of flight experience.
Pilots have it easy because they actually have a series of tests they can take to prove they can do something. Once they have it then that's done, proof acquired. I have around thirty thousand hours working as a software engineer and have been part of many successful product deliveries, have progressively greater accomplishments, and hold a degree in computer science and all of that is worth almost zilch to the next interviewer.
> If you create a standardized test it will be gamed.
Well, the medical profession has a standardized licensing process. It's not perfect, but it certainly keeps the interview process to (mostly) mutual interest.
I think we can learn from the medical profession here. Otherwise, "I prefer chaos" implies that the incompetents are the ones who will lose.
Just out of curiosity, what are some of the problems with "Clean Code"? I thought most of it made sense as basic guidelines. It's been a while since I read it though
I think https://qntm.org/clean makes a good case that the advice it gives can be taken to very bad extremes -- and that the author of the book does so in some cases when providing "good" examples. That's not to say that the advice is all bad, but that the book as a whole is not a good presentation and inexperienced programmers can enthusiastically learn the wrong lessons from it.
Edit: grabbed the wrong link from my history. Updated to the correct link.
I think the root problem is that a lot of people want books to tell them how to think. I think that's why I hate things like Oprah book clubs, complete with quizzes to make sure you think the right things now.
My best reading experiences involve arguing with the book. And talking about those books and my disagreements with them has been useful, too.
Orthogonally, all humans tend to overuse new knowledge/skills. That's part of how humans learn. We try to find out how far the use stretches and in what ways we can apply our new toys! I would expect any successful book on practices to be seen as overused.
In my opinion, the only significant contribution Clean Code made was the concept of clean code. The problem is that my definition of clean code is almost completely contradictory to what the author of the book thinks constitutes clean code.
Honestly it’s DRY that I oppose more than anything else, I’ve watched too many codebases turn into unreadable spaghetti because engineers thought everything needed to be abstracted. With regard to Clean Code, I think Uncle Bob’s takes on function length are ridiculous (something like “functions should almost never be over 4 lines”). In general, I just feel like he thinks very little of programmers and comes up with rules with an eye towards constraining bad programmers, not empowering good programmers.
Standardization reminds me of old stories about 1970s-80s blue chip companies trying to hire programmers like they hired secretaries. They'd test applicants for things like words-per-minute typing speed and simple programming tests, hire in bulk, and then dole batches of them out to various departments. Which sounds like Triplebyte's model, the motivation behind things like clean code, and the webshitification of everything.
Opposite of that is the idea that work and interpersonal habits, communication skills, and domain knowledge are more important than raw programming skill for most jobs.
Standardized process doesn't have to mean a purely checklist-based rubric. Triplebyte wasn't - and Otherbranch especially isn't - devoted to the idea that a good engineer can be reduced to checkboxes. And speaking for myself as a founder, I in particular believe very strongly in the idea of intangibles as important criteria. Having a standard process makes intangibles easier to detect, not harder, because you can look for those ethereal little bits of signal against a familiar backdrop.
The last question on the grading form for our interviewers is, to quote it exactly in its current form:
-----
Full interview overall score
(Would you vouch for this person's skills?)
* No no no no no
* Some redeeming qualities but not someone you'd recommend even as a junior hire
* Good for a junior role, still a bit short of a senior one
* Would recommend for a senior role
* Incredible, get this person a job right now
-----
That, to me, is the opposite of what you're talking about. Expert opinion is a central part of what we do, and a big part of what I think gives us an advantage over something like an automated coding test. We just treat expert opinion as a type of data in its own right so that we can, for example, adjust for whether one interviewer grades more harshly on average than another and make sure that it is actually producing valid results down the line.
> Unlike airline pilots (or I'm certain many other professions), every company in the valley insists on re-interviewing a candidate in their own custom and unique way, instead of trusting some sort of an industry-wide certificate only achievable through a standardized test. Wonder if this will ever be solved.
The airplane pilot interview process on top of the standardized government certifications includes:
- On-line application (resume and cover letter)
- On-line Psychometric/Aptitude testing (sometimes this is hands-on, on site for some airlines)
- Video Interview (Skype) or Telephone Interview
- Assessment Day (includes: Technical Questions / Panel Behavioral Interview / Scenario Based Questions / Flight Planning Exercise and sometimes a Clinical Assessment)
- Sim Test
- Referee Check
- Medical Check
The exact details differ by airline and I'm assuming the risk profile of the cargo (ie: passengers or not).
Gosh, not so different from software engineers, is it? Except you also need to do a bunch of bureaucratic certifications on top of that.
Not to mention all of the licensing, regulations, and formalized training hours that you have to put in just to reach that point. It’s all substantially harder than studying LeetCode for a short slice of your life.
It’s amazing how often I hear about how easy interviews are in other professions, according to engineers who dislike coding interviews.
Then you look into those other professions and it turns out changing jobs is actually a lot harder than the internet comments would lead you to think.
It's completely different from software engineering interviews. The process you described for airline pilots gets at the actual qualifications for the job. Whereas for software engineers, literally no one needs to reverse a binary tree, yet hiring decisions are based in large part on this sort of question. Ideally, the BS would be encapsulated in a certification so that interviews can focus on the real, useful stuff.
So needing to get yearly re-certifications in Leetcode questions is viewed as preferable by you to having to study them only for interviews? And spending 10+ hours on coding exercises, unpaid, per job is also seen as preferable to the current status quo, on top of the yearly Leetcode certification? On top of a full panel of behavioral interviews? Plus needing to take an IQ-style assessment online before each job (see the second item on my list)?
I'll take having to study Leetcode every few years over all of that any day of the week.
For many roles the interview is as much a cognitive function and socialization test as it is a skills test. You can have exquisitely detailed knowledge of systems internals (skill) but if you have limited working memory (cognitive) then you will struggle to design non-trivial systems software. These are orthogonal dimensions. You might prefer someone with high cognitive aptitude and low skill, since the latter is readily trainable.
Cataloging a list of skills is insufficient information to determine if a person is suited for a role. I don't find it likely that software engineers will be subjecting themselves to a battery of standardized cognitive function tests any time soon.
Is law a good example? My understanding is if you didn’t go to a top 14 school (whoever came up with that arbitrary number) it basically forecloses on the best opportunities.
A similar pattern exists in tech startup hiring practices and in which startups attract VC funding. It's not unusual to see funded startups with founders who have no work experience but Stanford degrees. Before my time at The Atlantic, I had a couple recruiters for no-name startups tell me I didn't have a prestigious enough background to be hired. There is a highly visible class hierarchy in tech that many people in the industry seem unaware of. Perhaps this is because base salaries are high enough that the middle class is just happy to be included at all.
100%. I'm not sure if this was always the case or if it was a slow result of tech becoming overrun by finance, but it's a very motivating thing for me. I started a company for a lot of reasons, and this isn't the top one, but I sure would love to show success without playing the class-signalling games that the valley seems overrun with.
On the other hand, I'm posting here in part because I know the HN candidate pool is about a trillion times better than I'll get anywhere else. So perhaps I've already lost that battle.
After 30+ years in the field... it definitely wasn't always the case. All the leet code and take home stuff is a pretty new thing. Can't say I've seen it result in higher quality teams. Seems mostly a way to rank recent grads that are working from memory and not experience?
Well, companies aren’t looking for someone who grinds leetcode problems. They’re looking for the people who can pass their hiring bar without needing to do that sort of practice in the first place.
In my more cynical moments I think a lot of the tech hiring process is just a complex IQ test dressed up as a skill test to work around the fact that IQ tests are illegal.
And there’s only so many great engineers around. More companies fighting over the same candidates doesn’t result in a lot of high quality teams.
>>Well, companies aren’t looking for someone who grinds leetcode problems. They’re looking for the people who can pass their hiring bar without needing to do that sort of practice in the first place.
That's basically what I was saying. Recent grads don't (typically) have a lot of work experience to look at. So they're looking for the ones that learned/remember the most from their education.
>>In my more cynical moments I think a lot of the tech hiring process is just a complex IQ test dressed up as a skill test to work around the fact that IQ tests are illegal.
Heh - I never thought of that...
>> And there’s only so many great engineers around. More companies fighting over the same candidates doesn’t result in a lot of high quality teams.
True - where I am now we have an awesome team... some really great talent, but oddly, this is one of the places where we don't do the leet code/take home crap. It's mostly talking about how you solved real problems... IBM, on the other hand, was a total cluster f*ck. Anyway, just what I've encountered running around Wall St. and Silicon Valley - your mileage may vary :-)
That sucks. It can definitely be an uphill battle if you didn’t go to an “elite” school or have a non traditional career path. That said, I think tech gives many more chances than law. Bigger companies discriminate far less (also not perfect) and once you have that on your resume it’s a strong social signal moving forward.
They might ask for a writing sample, but because filings are public anyway, it's easier to share work you've done before. You could always change or redact the names on the ones that aren't public and still convey the substance of the candidate's work.
It's because companies don't want capable, experienced or well-equipped. They want genius and it is really hard to test for genius. Granted, almost nobody that gets through any process is an actual genius....
I'd say it's the exact opposite. There are hordes of unqualified people applying to every software dev role imaginable regardless of what you put in the job description or requirements. The tests are there because people are good at lying but bad at faking skills.
Seriously, if you've never hired before, you have no idea how bad this can get.
Here's [1] our practice coding problem. It's quite similar to the one we use on our interview, and not too far from the one Triplebyte used in the past (ours is tuned to be slightly harder at the beginning and slightly easier at the end). The vast majority of candidates, even with some reasonable pre-filtering, do not get past the first step. A very non-trivial number would not even get that far.
That's actually pretty great, but it brings up another issue I have with the state of tech interviewing: it focuses on being able to write code fast. The more senior you get, the more you tend to focus on depth, taking your time to think over the problem and write a good, robust solution rather than banging out code fast. So coding up a solution in 25 minutes versus an hour is not really a good test of what the company presumably wants to hire you for.
This is true, and it's part of why this is one section of three.
If you were slow but high-quality on the coding section but crushed the knowledge and system design, we'd probably recommend you - or at least, recommend you to clients that aren't specifically looking for fast coders. Someone we recommended to a client recently had the equivalent of like 1.75 steps on the task linked here, but got consistently high scores everywhere else.
I do wish we could do a more complex, longer coding problem, and one of the things I've been considering is cutting some other stuff to get it up to 45 minutes or something. The current length isn't a principled decision, it's a resource constraint - keeping interviewing costs manageable is essential when you're trying to bootstrap a company in a rough market. Speed matters, but speed over such a short timescale is absolutely artificial (I'd much rather measure speed over a day instead, it's just not practical to conduct a top-of-funnel interview for so long.)
I was one of triplebyte’s interviewers years ago and I can speak to this. In short, you’re right. But two notes:
First, you massively underestimate the range of coding speed you see in an interview. The slowest programmers weren't senior people who were out of practice (I interviewed plenty of those). It was people who just seemed bad at programming. Like, so bad it took them 25 minutes to make a hello world program run. (In their favorite language, on their computer, and with full access to the internet during the test.)
A 2x programming speed difference would have rarely changed the outcome of our overall assessment.
Second, there was an aspect of triplebyte’s interviewing process that I’d love to see replicated elsewhere in the industry that resolves this. And that is, we should be assessing debugging ability. At triplebyte we gave candidates a smallish program (few hundred lines) with 4 bugs and a failing test case for each one. The candidates had half an hour to fix as many of the bugs as they could.
Watching people debug was fascinating.
One clear pattern that emerges is exactly what you are predicting. Smart kids right out of school were great at the programming section. But it was always the more senior engineers who smashed the debugging section. Junior engineers would get lost in the weeds and struggle to get very far in the time we gave them. Some of the senior people I interviewed dived straight in, and even found some bugs we didn’t even know about in our own test.
It seems to me that being able to read unfamiliar code and fix bugs in it is a hard-to-learn skill that matters on the ground. And frankly, I suspect it's a more useful skill than a lot of leetcode problems test for. I'd rather hire someone who's amazing at debugging than someone who's amazing at data structures. Well, I suppose I want one of each on my team.
If I was ever making a programming test, this is something I’d include for sure.
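A toy version of that format, just to illustrate the idea (a hypothetical example, nothing from Triplebyte's actual material): hand the candidate a small program plus a failing test, and watch how they localize the fault.

    def moving_average(xs, window):
        """Return the average of each full window of consecutive values."""
        out = []
        for i in range(len(xs) - window):  # BUG: off-by-one, drops the last window
            out.append(sum(xs[i:i + window]) / window)
        return out

    # Failing test: there are three full windows of size 2 here, not two.
    assert moving_average([1, 2, 3, 4], 2) == [1.5, 2.5, 3.5]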
Fair. A good debugging challenge is pretty hard to write well. I remember one of the triplebyte ones I worked on passed through several hands trying to get the calibration right.
That actually looks pretty good aside from the time limit...
It takes me a while to 'get in the zone' - and especially with your base data structures, you wanna think about it a bit, as the choice has real ramifications on how hard/easy everything else can be.
Still, seems better than most of the 'leet code' type stuff I see :-)
Oh, that brings back memories. When I interviewed with Triplebyte in 2018 and was rejected, my first item in the 'positive feedback' part of the email was "We saw some real strength on the tic tac toe section.", and my first item in the immediately following "the areas we think you can improve" paragraph was "We didn't see the coding proficiency we were looking for in the tic tac toe section--you didn't make very much progress."
...it might have been me, I was writing those emails in the year 2018. That was my first job there.
Jokes aside, we generated those emails largely from a template. If I remember right, "saw some real strength" was "got at least an OK score for code quality", while the negative feedback below was about number of steps. The number of mishaps with that system is part of why we do the feedback a bit differently [1] at Otherbranch, at least for now. I think we did revamp it very late in that team's lifetime, just before Triplebyte pivoted and laid off most of the team that did those (two of us, me + one of my colleagues, moved over to a different team, which is why I'm in a position to tell the whole story).
That is the sort of coding problem I would love to work on as an exercise however the 25 minute time limit would put me off and I wouldn't even start. I enjoy programming and I don't like to feel rushed, and if you're placing that sort of time constraint then you probably aren't a great company to work for.
So, having completed the practice problem with about 45 seconds to spare, what sort of openings are there for the aging-but-not-aged Canadian who can probably only manage part-time remote work?
I don't anticipate us getting a lot of clients with budget for a part-time employee anytime soon, unfortunately. I imagine a lot will be remote, but probably not part time. Still happy to have you in our pool in case we do (I think I see you in the signups list - something about oddball proprietary languages, yeah?) but it's not a wildly high-probability bet in the short term.
I really, really, really wish I had a better solution for this sort of thing.
I do wonder how my "part time" output would compare to expectations for full-time output. During the periods I've worked more conventional jobs, the vast majority of the value I brought came during the oddball times/contributions. Surely there must be some way I can be exploited (in a good way) without implicitly bundling ancillary seat-warming into the contract. :D
It got submitted properly and looks like the correct email. Airflow - at least the plan we're on - caps the number of emails to different addresses that it'll send a day, so we hadn't set up confirmations, but I guess normal volume probably isn't a problem for that. (Woulda been fun today, though.)
I have, and you are right, it's terrible. However, I'm coming from the Ops side of the house, so my knowledge of Dev hiring comes from talking to them and making sure they are not going to launch LedgerStore without talking to us first. Ops I'm more experienced with.
However, looking at the Otherbranch example test, it's another data structure question which I would totally bomb. Is this really the end-all be-all of dev hiring? If they can't do Data Structures, are they worthless to Silicon Valley companies? I'm honestly stumped, because we get dev candidates who crash and burn on FizzBuzz or our REST API test.
> If they can't do Data Structures, are they worthless to Silicon Valley companies?
I think you might be overthinking this - understanding how to model a problem with a data structure is a core competency of any developer. This isn't a "gotcha" question where you need to know union find sets or how to invert a linked list in place.
If the question was rephrased, "design a JSON schema for the state of the board" would you know how to approach it? Because that's essentially what step 1 is asking.
You consider that a "data structure question"? Honest question - I would (do) characterize it as a sort of "fizzbuzz+" that is deliberately NOT data-structure-y, and I'm surprised by this response. Can you give an example of short coding tasks you would consider not data-structure-y?
(For the record, we do ask about DBs and system design in other sections of the interview. The coding is one portion of three for the interview as a whole.)
Minesweeper is all about storing data about the board, and displaying the numbers is all about running through data. Seems very data-structurey to me, but maybe that's my non-heavy-dev side coming through.
Maybe I just overthought the problem, overloaded my small-attention brain, and misread it. I wouldn't call FizzBuzz data-structure-y, however.
I'm also mostly an outsider looking in. I've never worked at FAANG, and when working with YC companies, I come on much later; my work tends to be more keeping the lights on and stopping the duct tape rocket from exploding.
I disagree. Minesweeper doesn’t require anything more exotic than a 2d array of enums. This is bread and butter stuff that you’ll run into in 99% of programs. If someone doesn't know how to use their language’s lists, enums and structs, I don’t think they know the language yet.
A “data structure problem” would involve more exotic data structures, usually of the kind the candidate has to implement themselves. For example, b-trees, heaps, skip lists, and so on.
The reason a lot of people don’t like custom data structure questions is that they come up rarely in most people’s jobs. Lists, structs and enums on the other hand are used everywhere. Your programming job will almost certainly require you to understand them.
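To make that concrete, the whole board state is roughly this much machinery (a sketch in Python; the names are mine, not from the actual problem):

    from enum import Enum

    class Cell(Enum):
        HIDDEN = 0
        REVEALED = 1
        FLAGGED = 2

    # Bread-and-butter state: one 2D list for mine placement,
    # one for what the player can currently see.
    mines = [[False, True, False],
             [False, False, False],
             [True, False, False]]
    state = [[Cell.HIDDEN] * 3 for _ in range(3)]

    def adjacent_mine_count(r, c):
        # Count mines in the up-to-8 neighboring cells of (r, c).
        return sum(
            mines[rr][cc]
            for rr in range(max(r - 1, 0), min(r + 2, len(mines)))
            for cc in range(max(c - 1, 0), min(c + 2, len(mines[0])))
            if (rr, cc) != (r, c)
        )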
I'm a competition programmer so my perspective is generally miscalibrated, but part 5 is usually solved with a BFS. You can say BFS is basic, and it's maybe the part of competition programming that comes up more often than any other in real life (tied with sentinels, which you usually use in your BFS?), but I think it's a "data structures" problem.
If this test is calibrated like the triplebyte programming challenges, less than 1% of people will finish part 5 in the time given. I think when I was interviewing, only about 3% of people reached step 5 at all. Finishing was super rare, and it really only existed to separate the top couple percent. (And yes, we told the candidates that we didn’t expect them to finish in the time allocated). It doesn’t matter if step 5 is harder. Most people will never attempt it anyway.
But even then, while BFS would definitely work here, so would DFS, and that's simpler. A simple, unoptimised recursive DFS flood fill would be like 8 lines or something, given that by that stage you already have a function to reveal a cell. You just have to call it on all the adjacent cells. I don't see that as a data structure problem. You could argue it's an algorithm problem, but it's about as easy as those come.
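For instance (a self-contained sketch; the board representation and helper names are mine, not from the actual test):

    ROWS, COLS = 4, 4
    mines = {(1, 1)}     # coordinates of mines
    revealed = set()     # cells the player has opened

    def adjacent_mines(r, c):
        return sum((r + dr, c + dc) in mines
                   for dr in (-1, 0, 1) for dc in (-1, 0, 1)
                   if (dr, dc) != (0, 0))

    def reveal(r, c):
        # Stop at the board edge and at already-opened cells.
        if not (0 <= r < ROWS and 0 <= c < COLS) or (r, c) in revealed:
            return
        revealed.add((r, c))
        # Blank cell (no adjacent mines): flood into all neighbors.
        if adjacent_mines(r, c) == 0:
            for dr in (-1, 0, 1):
                for dc in (-1, 0, 1):
                    if (dr, dc) != (0, 0):
                        reveal(r + dr, c + dc)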
I think it's because the person is thinking you are looking for something especially smart, when in fact something like a list of lists would be fine, if you can explain the tradeoffs of using it.
I think you are selling yourself short - the problem just takes some minor education to realize you probably know most of what you need to know. And I still can't figure out how to get a kube cluster working.
You don’t need to embed game rules into the matrix to solve this problem. (What does that mean anyway?) Just store the board in a 2d array of some sort and write some functions to interact with it.
That’s just not what a “data structure problem” commonly means. If it was, all programs would be data structure problems because all programs interact with data in some form.
A data structure problem is a problem where designing and implementing an appropriate data structure is in some way hard. A 2d array doesn’t fit the bill. It’s not exotic enough.
This process is also pretty much guaranteed never to yield mid-career geniuses at the height of their powers. Those candidates don't go looking for work at all. Work comes looking for them. Why would they go on _any_ jobs platform, ever? Effective filtering of the candidates who actually engage with the platform can, at best, accurately identify the next tier down: effective engineers in mid- and late-career, and inexperienced whiz kids. Not that this is a bad thing; that first category makes the world go 'round.
> Those candidates don't go looking for work at all. Work comes looking for them. Why would they go on _any_ jobs platform, ever?
Because I don't know what’s out there, or who will give me the best offer. If you’re skilled and in the middle of your career, it’s easy to find a job, but if your options are wide open, a matchmaking service like this with a wide pool of companies is very valuable.
The problem is that software development is less like hiring an airline pilot or a structural engineer, and more like hiring an artist. Try making up a "standard exam" that will tell you whether an artist will produce several great unique works for you in the future, so you know which one to hire...
That's an interesting point, but then one wonders that if software eng are ultimately artists, why are we not having them work on their portfolios like the other art disciplines? Is that the fundamental problem?
I think it's frowned upon in many circles because most of the work you do for a living cannot be released to the public, so there's an expectation that you will be cranking out work for free on the side just to build a portfolio. That is not inclusive for, say, a minority mother of three who doesn't have the time, and it only favors young, affluent white and Asian males with no dependents.
A lot of painting still life comes down to being comfortable with brushes and knowing how paint works. A lot of music comes down to knowing how scales and chords work. Most art still requires fundamental mechanics.
Yet frankly, what most of us do is more like plumbing than art. In that we're just fitting systems together and in that it's actual skilled labor and in that we're seen by everyone else as the ones willing to do the shitty work.
Management puts up with us and they pay us because even though they think they can do our work, they wouldn't want to.
Plumbers are licensed and unionized, two possible solutions to the problems posed in this thread.
Part of me wonders if the recruiting itself is over-engineered anyway. I mean, imagine if you just asked:
Implement Bubble Sort, in 2 Languages of your choice, with multithreading or other language-provided parallelism, with the correct number of parallel threads to be most algorithmically efficient
Would that really not weed out a lot of people? I think it would. I know the above algorithm is hardly production-ready, but the requirements are easy to understand. (It's also a bit of a trick question - there is no optimal algorithmically efficient number of threads in a bubble sort, only the number of CPU cores in the system.)
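(For reference, the serial version is tiny; a minimal sketch in Python:)

    def bubble_sort(xs):
        # Sweep the list repeatedly, swapping adjacent out-of-order pairs,
        # until a full pass makes no swaps.
        swapped = True
        while swapped:
            swapped = False
            for i in range(len(xs) - 1):
                if xs[i] > xs[i + 1]:
                    xs[i], xs[i + 1] = xs[i + 1], xs[i]
                    swapped = True
        return xs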
Our coding problem is easier than that (by a fair margin, it's all completely synchronous single-threaded procedural code unless you're doing something extremely weird) and it weeds out the vast majority of applicants.
The same was true of all three of the standard coding problems Triplebyte used. They're not quite literal fizzbuzz, but they require, at most, some basic critical thinking and basic language features, and that is a filter that eliminates 90+% of applicants to dev jobs. Now, granted, this is under time pressure. I imagine, given several hours, most could finish (although maybe even that is overestimating things). But still.
There's an old Randall Munroe article quoting a physicist:
> The physicist who mentioned this problem to me told me his rule of thumb for estimating supernova-related numbers: However big you think supernovae are, they're bigger than that.
and I feel like this applies to recruiting: however bad you think the average applicant is, they're worse than that.
I ask candidates to implement a simple dynamic data structure, and I even describe it first, so there is nothing to memorize. It turns out many people don't know how to set up classes or data structures, even when you describe it to them first. Forget about computational complexity.
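"Simple dynamic data structure" here means something at this level (a hypothetical stand-in, not my actual question):

    class Stack:
        # A stack backed by a singly linked list: push/pop at the head.
        class _Node:
            def __init__(self, value, below):
                self.value = value
                self.below = below

        def __init__(self):
            self._top = None

        def push(self, value):
            self._top = self._Node(value, self._top)

        def pop(self):
            if self._top is None:
                raise IndexError("pop from empty stack")
            value = self._top.value
            self._top = self._top.below
            return value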
Triplebyte used to send candidates a link to a page which basically listed everything that was in the interview. Most candidates didn’t read it. They could probably have done away with the interview entirely and just put a link at the bottom of the prep material saying “click here to pass the interview”.
If anything I think it would have lowered the triplebyte passing rate.
Well, triplebyte is dead. That’s one reason not to do the experiment.
Also arguably the “customer” for triplebyte was the company that eventually hires the candidate. They’re paying in part for the screening process triplebyte did. We wouldn’t have been doing our job if we skipped the interview part of the process.
This sounds like what a lot of companies do - except the problem with this approach at scale (and with the certification approach of the grandparent) is that most companies want to avoid candidates who've memorized a specific solution, since then they don't get any data about whether the person can code anything aside from what was memorized.
The other problem is that implementing bubble sort will tell you about their skills in a particular dimension, but being a software engineer these days may look very different depending on the job.
I do a tree-search-ish thing when interviewing people. I’ll start with a super basic question about something beginner-ish on their resume. If they can’t answer that, the interview is politely wrapped up. I’ve eliminated a surprising number of people who had jQuery on their resume by asking them to write code that will make a div with the ID “mydiv” disappear if the user clicks a button with the id “mybutton”.
After that I ask a super difficult or niche trivia question like “in CSS, what two overflow properties cannot be used together?” If I’m hiring a mid-level frontend developer and they nail that one, I go “fantastic, great answer, do you have any questions for us?” And the interview can end early.
But if they miss that, no sweat, I’ll start asking more mid-level technical questions to figure out where they’re at.
It's also a mostly useless problem for determining engineer quality, in many cases.
It tests for pure coding ability, when most organizations should instead be optimizing for engineers who are going to be productive, if not spectacular, and who can design and build maintainable systems.
Could I have written the above problem back in my engineering days? Probably not, since I went years not working with threads. But I also wasn't working on problems that would ever have benefited from knowing that. Most software engineering roles are essentially building CRUD or ETL systems, maybe with a user interface. Any coding problems given should be optimized for weeding out bozos (which are still plentiful), not for weeding out the most people.
I find picking good questions is hard, and many fall into similar patterns, making them something candidates can practice for.
Even your question isn't something I'd necessarily ask on the spot. Many engineers don't use parallelism in their day-to-day work (webdevs). The part about making it efficient is interesting, but feels borderline like a trick question that a good engineer could fumble.
> Even your question isn't something I'd necessarily ask on the spot. Many engineers don't use parallelism in their day-to-day work (webdevs). The part about making it efficient is interesting, but feels borderline like a trick question that a good engineer could fumble.
True, it's more of a backend-role question. The reason I threw it in there is my assumption that a leetcode grinder would be very likely to immediately go, "well, the most efficient number of threads is log(n)" or "the most efficient number of threads is the square root of n" or some other plausible-sounding BS answer. The reason I chose bubble sort is that it's so simple to understand that you can fairly easily (I would hope) figure out there's no benefit to more threads than CPU cores at all, as long as you stop and actually think about what it is doing.
Trick questions are a waste of everyone's time IMHO, and it was something I made sure not to do when I took over the hiring process at my job.
It’s a form of hazing and rarely does it have any connection to the day-to-day work you do in the job. All of our tests or questions relate directly to the work we will ask you to do in the job, anything else is just trying to be clever and I dislike cleverness in both job interviews and code in general. Cleverness almost always means inscrutable and unmaintainable.
1. It's the dumbest algorithm for sorting possible (bubble sort). Compare two objects, swap them if one is bigger than the other, go down the list, repeat. If a developer doesn't know that algorithm, what algorithm could they possibly know? It's the lowest bar.
2. Two languages of your choice is enough to show you aren't a frameworker and can think in more than one box. Doesn't matter if you do it in JavaScript and C#, Go and Rust, or Python and Haskell. It's a chance to show off, while being quite achievable for a competent developer.
3. The parallelism trick question merely shows that you actually understand what you are talking about. A leetcoder might fall into the trap of assuming it's log(n), or the square root of n elements, or some other overly-thought-through math that is bogus. If you think it through, though, bubble sort is simple enough that it's very easy to realize (in my opinion) why more threads than CPU cores doesn't really help.
If I want to separate the real algorithm masters from the rest I’d avoid questions about sorts with total ordering but instead cover algorithms with partial ordering such as topological sort, BSP trees and such which are super-beautiful and I think quite useful but obscure.
Bubble sort is quite interesting from an algorithm analysis perspective (prove it completes, prove it gets the right answer, prove how it scales with N) but I’d almost rather a programmer I work with not know about it from a coding perspective because it increases the chance they code up a bubble sort than use the sort function that comes with the standard library.
> or some other overly-thought-through math that is bogus.
So the mathematically-correct answer would be a minus while "in my opinion" would be a plus?
I think your interview question points to the answer: the reason every company makes its own process is that what's good for you would probably get you an instant fail at NASA and would get you labeled as "experience not relevant" in some research settings.
> So the mathematically-correct answer would be a minus while "in my opinion" would be a plus?
My opinion was only that it would be relatively easy to figure out. As far as I know, it is a certainty that splitting a bubble sort across more threads than CPU cores is less efficient. 16 threads on an 8-core CPU just means balancing 2 threads on every core, each doing its own sorting and needing eventual merging back together. There's no way that can be more efficient.
I don't think we are over-engineering it. You want to "weed out" everyone but the best candidate for the role, or the best candidates for your open roles. It's a very hard problem to identify the "best" person from a group of people. It would be different if all programmers that are good enough are the same, but we all know the skill ceiling on programming is very high. Selecting the best devs is a critical function for any software project.
Compare [0, 1] [2, 3] [4, 5] ... in parallel and swap if necessary, then compare/swap [1, 2] [3, 4] [5, 6] ... in parallel, and go back and forth until no more swaps are made - at that point every pair's second element is greater/less than its first.
That does suggest that the theoretical ideal number of threads is n / 2 ignoring cores, though you'll also want to consider things like cache line size, cache coherency, and actual problem size so it's probably less.
At the end of the day, the important thing would be to actually benchmark your attempts, see how they scale with processor count and problem size, and check the isoefficiency.
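That scheme is odd-even transposition sort. A sketch of the phase structure (serial simulation here; in a real implementation, each phase's independent compare/swaps would be divided among roughly one worker per core):

    def odd_even_sort(xs):
        swapped = True
        while swapped:
            swapped = False
            for start in (0, 1):  # even phase: (0,1)(2,3)...; odd phase: (1,2)(3,4)...
                # Every pair within a phase is disjoint, so a parallel version
                # can hand chunks of these pairs to separate workers safely.
                for i in range(start, len(xs) - 1, 2):
                    if xs[i] > xs[i + 1]:
                        xs[i], xs[i + 1] = xs[i + 1], xs[i]
                        swapped = True
        return xs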
Airline pilot is not a great choice for comparison.
Airline pilots are selected to be entrusted with many lives, with the utmost professionalism, and to perform with ability under stress.
And it shows, in the amazing track record of aviation safety.
Most software developers, on the other hand, are mainly paid to mechanically type out code, cribbing from StackOverflow or ChatGPT whenever they don't know something, to "get it done".
And it shows, with the atrocious track record of the entire industry at information security, for one obvious metric.