I actually like asking math questions in interviews. It shows how people approach a problem. Asking code questions in an arbitrary interview setting shows just about nothing - no access to reference docs, somebody peering over your shoulder. Heck, I couldn't code my way out of a wet paper bag in that setting.
Certainly, asking only math questions is stupid as well; people should know at least a little about the stuff they're supposed to work with. But teaching an actual language to a smart person eager to learn is a breeze compared to teaching problem solving to someone who memorized the reference manual.
Asking math questions might exclude some good candidates. I know more than a couple of very productive programmers who did not go through a formal CS education. Asking math questions could even scare those candidates off for no reason. I hired people who had no idea about Lagrange multipliers but were able to ship code in various languages and even learn new paradigms when necessary.
The world isn't split into smart people on one side and people who recite reference manuals on the other. Being a programmer often means fixing bugs in messed-up codebases, building web apps with the technology du jour, or moving data from one place to another, and asking math questions doesn't help much in finding people who can do this. This blog post resonates with some people I've met.
I have been programming for a while, and went through a CS education, but my experience with hiring made me realize that being good with maths and being a productive programmer are not necessarily two things that always come together.
> Asking math questions might exclude some good candidates.
Every question can exclude a good candidate, especially if you only ask the question and tick the "correct/incorrect answer" box. However, often you can learn the most interesting trait from asking a question which the candidate can't answer right off the bat: How does the candidate deal with failure or lack of knowledge. Does she/he start guessing? Does she/he ask the right questions moving in the general direction of solving the problem?
I'm not checking for academic knowledge in interviews.
> and even learn new paradigms when necessary.
This often requires knowledge about the stuff you don't know. That is a value of formal education: not the stuff that you memorize - the bigger value I derive from my formal education is all that "I know there's a solution to this problem but can't remember it exactly" kind of knowledge. I can't remember all the sorting algorithms I had to code, but I remember there's more than one and that there are tradeoffs between them. So if I'm constrained on memory and have a pre-sorted list, I can go look up how bubblesort is implemented exactly. That's knowledge that self-taught programmers often lack [1].
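As an illustration of the tradeoff hinted at above (Python chosen arbitrarily): bubblesort is usually a poor choice, but with an early-exit check it runs in linear time on an already-sorted list and needs no extra memory.

```python
def bubble_sort(items):
    """Bubble sort with an early exit: O(n) on already-sorted input."""
    data = list(items)
    n = len(data)
    for i in range(n - 1):
        swapped = False
        for j in range(n - 1 - i):
            if data[j] > data[j + 1]:
                # Swap adjacent out-of-order elements.
                data[j], data[j + 1] = data[j + 1], data[j]
                swapped = True
        if not swapped:
            # A full pass with no swaps means the list is sorted.
            break
    return data

print(bubble_sort([3, 1, 2]))     # unsorted input
print(bubble_sort([1, 2, 3, 4]))  # pre-sorted: a single pass, then done
```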
> being good with maths and being a productive programmer are not necessarily two things that always come together.
No, nobody proclaimed so. But having a trait for problem solving and logical puzzles certainly helps :)
[1] N.B.: often. Some of them have read and digested tons of theory books, which could count as formal education.
> However, often you can learn the most interesting trait from asking a question which the candidate can't answer right off the bat: How does the candidate deal with failure or lack of knowledge.
Some research suggests that tests are much better at predicting performance than informal grading.
I agree - whenever interview processes come up, commenters mostly criticise the interview processes for excluding good candidates.
But that's only one part of the equation - the number of unsuitable candidates that slip through is normally more important.
Suppose that somehow we magically know that 20% of candidates would be good hires - and the other 80% are unsuitable. But we don't know which are which!
As an interviewer, I'd be very happy with an interview process that discards 50% of the good candidates and 99% of the unsuitable candidates, because that leaves me with 10 good hires and 1 unsuitable hire for every 100 candidates.
On the other hand, if a different interview process discarded 20% of the good candidates and 80% of the unsuitable candidates, that would result in 16 good hires and 16 unsuitable hires - which would be disastrous!
Even from the point of view of the interviewee, one probably wouldn't want to work somewhere where 50% of one's colleagues are not suited to their jobs!
Summary - it's a shame to discard good candidates, but it's worse to let too many unsuitable candidates slip through ...
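The arithmetic behind the two scenarios above is easy to check; a quick sketch in Python (the 20/80 split is the made-up number from the example):

```python
# 100 candidates: 20 good, 80 unsuitable (the hypothetical split above).
good, bad = 20, 80

# Process A keeps 50% of good and 1% of unsuitable candidates.
good_hired_a = good * 0.50  # 10 good hires
bad_hired_a = bad * 0.01    # ~1 unsuitable hire

# Process B keeps 80% of good and 20% of unsuitable candidates.
good_hired_b = good * 0.80  # 16 good hires
bad_hired_b = bad * 0.20    # 16 unsuitable hires

print(good_hired_a, bad_hired_a)  # roughly 10 : 1
print(good_hired_b, bad_hired_b)  # 16 : 16 - half the hires unsuitable
```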
Hell, even those can exclude a good candidate, especially at larger companies. Candidate not a good fit for the team? Probably won't do as well as one who is.
I think you don't get the point. Learning a new language after you've programmed for 5 years across a variety of paradigms - assembler, object-oriented, purely functional - will take you between a couple of hours and a week. But if you have NOT had those 5 years of varied programming experience, then the new language is not your problem. Learning the concepts is, and that will take time.
I think you don't get my point: I expect a domain expert to be a domain expert, but I don't expect a trainee to know anything, and I'll structure my interview accordingly. I'll ask the domain expert questions about his domain. But I don't expect a domain expert to know the programming language that we work with just because he's a domain expert. And seriously, I'd prefer the eager-to-learn math graduate with little coding experience over the bored CS graduate who knows the Java reference doc by heart and now thinks he knows how to build stuff. In any case, coding examples in your interview will show you little beyond the fact that somebody memorized the docs.
The actual pain point I read from the article is a different one: There's a mismatch between hiring process and expectations. If I need a lead programmer for the iOS project that I'm about to start next week then I can't hire anyone that doesn't know Objective C and then I absolutely need to structure my interview accordingly. But if I have a couple of weeks more I'll happily teach him. And if I'm hiring somewhat smart I try to avoid those "we need someone urgently" situations.
I think you vastly underestimate the time it takes to learn these sorts of things. Sure, you can teach someone to write a simple, passable iOS app that does a few basic things in a couple weeks. But they're still going to be a raw-novice iOS developer. Maybe the app you need them to build is super simple, but if not, you're doing yourself and your company a disservice by not hiring someone who's done iOS before.
I'm speaking from experience here: I learned iOS (even after having previous MacOS X desktop devel experience) on the job, when a friend asked me to write an iOS app for her startup. I learned quickly, but made a lot of mistakes in how I structured the app that came back to bite me months later. If I'd had the time to start over from scratch, I would have done things quite differently and the whole thing would have been a lot easier.
And I was slow. Every new framework I had to learn slowed me down and added days to implementing the part of the app that needed it. A seasoned iOS developer wouldn't have run into problems like that.
I think you vastly underestimate the value of "other knowledge" for any kind of complex undertaking. Unless you get a domain expert who happens to be a rock star programmer and a ninja architect and top-notch at devops, you'll have to compromise somewhere. Take an iOS game: depending on the game, it can be vastly more valuable to have knowledge of game programming and game AI than knowledge of iOS frameworks. If there's a team on the project, I'd even go as far as to say I would pick the domain expert over the framework expert as team lead, given that all other capabilities were on par.
Sure, there are always compromises. I wasn't claiming otherwise. I was merely pointing out that ramping up on a particular platform is nowhere near as easy as the parent suggests.
Games are a bit of a special breed, though, and there are several game engines out there that hide the platform pretty well. I draw a bit of a distinction between "building an iOS app" and "building an OpenGL app that runs on iOS". The latter is much easier if you lack iOS experience but have the required OpenGL skills.
I also think that there might be a mismatch between hiring process and expectations. When you say "But if I have a couple of weeks more" then you obviously underestimate the time it takes to learn if we are actually talking about somebody without much programming experience. Try "if I have a couple of years more".
See, there's so much more that you need to learn when you start on a project in a company: the framework they use, their libraries, the requirements for the project, the problem domain, the people you work with, the process they use... just to list the few that come to the top of my head. Especially learning the problem domain can be very hard and challenging for programmers with little or no knowledge of the field. It can easily take a couple of months, if not years, to become reasonably proficient.
So yes, "a couple of weeks" is certainly not enough to transform a graduate into a project lead for a bespoke iOS project, but it could easily be enough for a decent Java programmer with solid project lead experience. Transforming a crack Objective C programmer with no lead experience can easily take as much time, if not more.
My gist is that programming knowledge will only get you so far and depending on what position you're hiring for other experience combined with the ability and willingness to learn may be much more interesting. And whatever I do, I tend to hire for that trait since that allows people to pick up other abilities when required.
Learning a new language is fairly easy. Learning the language's standard library to the point that you don't have to look at the reference docs every 30 seconds takes a bit more time.
The author clearly showcases how this is not true in his (and probably others') case. Anonymous functions, function pointers, etc. are not language-specific features, but rather something you learn when learning to program.
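For example (Python used purely for illustration): an anonymous function and a named function passed as a value are the same idea, which a C programmer would express with a function pointer - the concept transfers, only the syntax changes.

```python
# An anonymous function bound to a name...
double = lambda x: 2 * x

# ...and a named function are interchangeable as values.
# Passing either one to map() is the same idea a C programmer
# expresses with a function pointer.
def double_named(x):
    return 2 * x

print(list(map(double, [1, 2, 3])))        # [2, 4, 6]
print(list(map(double_named, [1, 2, 3])))  # [2, 4, 6]
```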
I think the point of the article is that being smart at puzzles is not enough. It may not matter whether you have 3 or 10 years of programming experience, but if you have 20 years of puzzle-solving experience and only 6 months of (real-life) programming experience, you have a lot of non-trivial learning left to do.
Being in a startup setting where everyone (usually) is expected to be a bit of a jack-of-all-trades, I would argue that having programming experience trumps being smart, at least for the first period. You may be a master at finding smart algorithms to design your application, but if you can't build the first CRUD website, it does you little good.
So true. I've been a professional developer for 15 years and I've worked with several languages (C++/Java/PHP/Python/Perl/C#/Ruby/JS/Groovy), but man, this whole "functional" stuff is hard to get a hold of.
Scala especially - I find the syntax very hard to understand, even after a year of working with Play. The most frustrating part is following the flow of the data through chained calls, like in map-reduce.
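One thing that helps with following the data flow in such chains is to unchain them: give each intermediate result a name and inspect it. A Python sketch of the idea (the same trick works with vals in Scala):

```python
from functools import reduce

words = ["foo", "bar", "foobar", "quux"]

# Unchained: each stage's output is named and can be inspected.
lengths = list(map(len, words))                  # [3, 3, 6, 4]
total = reduce(lambda acc, n: acc + n, lengths)  # 16

# Chained: the same pipeline written as one expression.
chained = reduce(lambda acc, n: acc + n, map(len, words))

print(total, chained)  # 16 16
```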
Did you read the OP? The point of the article was that one can be great at that sort of thing and yet quite terrible as a production programmer. You are not measuring the skill you want with math problems, you are measuring a proxy.
Furthermore, I've known plenty of smart math people that just never seemed to be able to program (well). I think they are different skill sets, certainly with plenty of overlap, but plenty of differences as well, and those differences matter. I laughed out loud at the "variables, variables everywhere!" answer in the OP - I've had to deal with so much code written that way. Some people, very smart people, just don't 'get' design in that way. I worked with a guy that used to run around the office, asking brain teasers, sharing tidbits of knowledge, but he couldn't execute a basic project - couldn't plan what to do, couldn't do things in a rational order, couldn't experiment and gather data, couldn't incrementally design, develop, refactor, nor do a big-bang waterfall kind of design, and so on. He was not sorely missed when let go. Smart as a whip, and useless (for programming).
I've worked at several companies with staff mathematicians. Sooner or later they got their hands on a compiler. Oh my. No, let me do that. You tell me what is wrong with my Kalman filter, but I'll take care of the implementation, thank you.
It's easy to kvetch at somebody else's answer without offering an alternative (I do agree whiteboard programming is disastrous). So, instead of asking math, why not ask them to write a simple routine, but then start asking real-world questions about the code they would face: how would you make this API interface robust? What kind of documentation would you write? How would you handle errors? Is this code exception safe? Thread safe? How would you make it either/both of those? Suppose your problem size was n=100MM - how might you need to change this (say they have a data structure that loaded everything into memory)? Ask them some problems they will see in production - what is the network delay, or whatever your problem case is. You still get to see how they approach problems, but in the context of the actual decisions they will be making while programming for you.
Anyway, that is what I try to do. I am revising my thoughts even on that, because I find people flopping on the "code the simple problem" part (and it is simple) yet doing great on all the engineering questions, and then doing fine once we hire them.
> Certainly, asking only math questions is stupid as well
So what I actually advocate is asking questions about the field that the person is going to work in. I still like to ask some math questions - a lot of the stuff that programs deal with basically falls back to math. Heck, all relational database stuff falls back to set theory, so knowledge in that area certainly helps a lot. I'm not advocating employing pure theoretical mathematicians to develop contact forms in PHP. However, from my experience the following two statements hold true:
* Studying CS does just about nothing for your actual programming skills or your knowledge of real-world problems. So don't expect recent graduates to be able to develop a program in a rational order, use a reasonable process, refactor, or show any of the skills you're asking for just by virtue of having "programmed". If you're looking for an experienced programmer, hire for an experienced programmer - but the OP's tale clearly shows that she was no experienced programmer, so why would I quiz her the way I'd quiz a ruby programmer with 10 years of experience?
* If you have time to teach and educate a programmer, prefer a smart and eager type over someone who just reels off memorized knowledge. While memorized knowledge can sometimes help, the ability to learn and educate yourself is much more helpful once your memorized stuff stops saving you.
Yes, I did read your post. The point is whether asking math questions at all is valuable compared to the other questions you asked. I do not believe you particularly addressed that in your post or in this one. It is a proxy for what you are actually trying to measure - why not measure what you want to measure?
Your first asterisked point does not at all square with my experience. It depends on the program of study; I've certainly come across kids with no real pragmatic ability, and other programs that turned out, as much as you can in that environment, very pragmatic and skilled programmers. Certainly this is a skill set that improves over time. But let's not quibble over that; you raise a larger, valid point about interviewing recent grads vs more experienced people.
As far as that goes, I try to question them about school projects. "So, if I was to try to take that and do Y with it, what would be the consequences" type questions. Like you w/ the math questions, I do not expect any particular expertise and experience in actually solving the problem. But, I can start to see how they think about things. If they aren't thinking that clearly, throw them a bone and see what they do with it. Does it give them an 'aha' moment that then leads them to a better answer (I infer, perhaps incorrectly, that they can learn and be mentored), or do they just stonewall, not make the connection, or what have you. My suggestion is pretty simple. Measure what you want measured, not some proxy. I will point out that recent data suggests I am right. Google has admitted that all of their algorithm type questions are not good predictors of on the job performance, but questions about experience and "how would you X" are. I don't consider their data the last word on the subject (their hiring is quite narrow after all), but certainly suggestive.
I absolutely agree with asterisk 2, so I didn't address it.
> The point is whether asking math questions at all is valuable compared to the other questions you asked.
The author uses a graph search problem as an example - which is a very typical problem in IT. How do you approach such a problem? This is a valid math question that certainly adds value compared to other questions I might ask. It actually checks for multiple things: Does the candidate have a grasp of the underlying mathematical concepts? How does the candidate approach a problem decoupled from the real-world constraints of a programming language? Can the candidate describe a problem in clear, concise terms?

The same goes for set theory: intersection, union, functions mapping input to output. Complexity analysis of algorithms - all of that is valuable knowledge, and it's absolutely valid to ask people about it. How do binary AND/OR/XOR work? What does an exponential decay curve look like compared to a Gaussian or a linear curve? This GH issue https://github.com/elasticsearch/elasticsearch/issues/3423 was posted on HN as an example of a great feature description, and it's full of formulas describing how it works. Vectors and matrix multiplication are fundamental when you're doing natural language processing. Statistical problems are not exactly uncommon in programming either. Map/reduce are mathematical concepts. What's wrong with asking about that?

Failing the answer does not mean that the interview is over, but to assess a candidate I need to know what she knows. As annoying as it sometimes is, math is _the_ _fundamental_ underpinning of what we do every day.
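To make the graph search example concrete, here is a minimal breadth-first search sketch in Python; the adjacency-list graph is made up for illustration:

```python
from collections import deque

def bfs_path(graph, start, goal):
    """Breadth-first search: returns a shortest path (by edge count) or None."""
    queue = deque([[start]])
    visited = {start}
    while queue:
        path = queue.popleft()
        node = path[-1]
        if node == goal:
            return path
        for neighbour in graph.get(node, []):
            if neighbour not in visited:
                visited.add(neighbour)
                queue.append(path + [neighbour])
    return None

# A small made-up graph as an adjacency list.
graph = {"a": ["b", "c"], "b": ["d"], "c": ["d"], "d": ["e"]}
print(bfs_path(graph, "a", "e"))  # ['a', 'b', 'd', 'e']
```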
> As far as that goes, I try to question them about school projects.
Totally fine and I agree. Still - what's wrong with asking math questions?
> Google has admitted that all of their algorithm type questions are not good predictors of on the job performance
You're running into a problem if you hire only for math knowledge. But -repeat- that's not what I've been advocating.
What kind of math questions do you ask? Are they relevant to the job? Is it knowledge that the employee will need for their day-to-day tasks?
Asking math questions that are unrelated to the employee's tasks has a strong bias toward recent graduates. Ask a 40-year-old a question about an equation they haven't used since they were 19 and they won't remember it.
I was once asked to do binary math on a phone interview. I hadn't had to do binary math since I was in school 10 years prior. I was pretty much guaranteed to fail. This was not a good test. If I needed it I could easily relearn binary math in a few hours at most.
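For context, the mechanics such a question probes really are quick to relearn - interview-level binary math boils down to a few lines like these (Python used purely as a calculator):

```python
# A binary number is a positional sum of powers of two:
# 1101 (base 2) = 1*8 + 1*4 + 0*2 + 1*1 = 13.
print(bin(13))          # '0b1101'
print(int("1101", 2))   # 13

# Bitwise operations then work digit by digit:
print(0b1100 & 0b1010)  # 8  (binary 1000)
print(0b1100 | 0b1010)  # 14 (binary 1110)
print(0b1100 ^ 0b1010)  # 6  (binary 0110)
```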
Memorising the manual is the strategy of a surface learner. Surface learners can make pretty good PHP programmers, but they'll struggle with, say, passing Javascript closures.
A deep learner will look for the underlying principles and abstractions. They can quickly get an overview that, even when it's fuzzy, is still accurate enough for them to know where the gaps in their knowledge are, so they can fill those gaps quickly when they need to. They're like Mendeleev with his first periodic table: he didn't need somebody to show him a sample of gallium to know that it existed; he could infer its presence and properties from the overall structure of the system.
Of course, a really advanced programmer will have done both. They will know and understand everything. But those guys are few and far between.
What I mean is that people who enjoy programming probably tend to be smart. I haven't done any statistical surveys, but the motivational feedback loop for learning programming rewards intelligence.
I doubt that. There's a lot of people going into programming because they think
a) IT is akin to playing computer games all day.
b) It's easy work in a climate-controlled office with solid pay.
c) The jobs are relatively secure and abundant.
I've seen many people go into IT who would have been much better employed elsewhere, so I think that we, as programmers, are not more intelligent on average than the rest of the population. I might agree that people who enjoy programming have a knack for a certain type of intelligence and problem solving ability, but I don't think programmers are limited to the group of people who enjoy programming.
Looks like the one part you said you might agree with in your very last sentence is 100% of the group they were talking about: people who enjoy programming.
Actually, since I'm the author of the post they're referring to I can say that the group we're referring to is "programmers" or rather "people you'd hire as a programmer" which is a superset of "people who enjoy programming." Even if you only consider people who enjoy programming I'd be very careful about calling them "more intelligent". They're probably good solvers of a certain kind of logical puzzle, but intelligence encompasses much much more than that.
*presuming that you went to an undergraduate institution where the variance in student intelligence is large enough to detect an appreciable difference between the average intelligence of these two majors.
The phenomenon of the Computer Science graduate who can't write a FizzBuzz program, or even the post-graduate who can't write a simple recursive function, is well attested. But a Math student who can't handle a recursive definition is unlikely to make it through the first term.
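For reference, the two exercises mentioned are roughly at this level (a sketch; the exact wording varies by interviewer):

```python
def fizzbuzz(n):
    """The classic screening exercise: 1..n with Fizz/Buzz substitutions."""
    out = []
    for i in range(1, n + 1):
        if i % 15 == 0:
            out.append("FizzBuzz")
        elif i % 3 == 0:
            out.append("Fizz")
        elif i % 5 == 0:
            out.append("Buzz")
        else:
            out.append(str(i))
    return out

def factorial(n):
    """A simple recursive definition of the kind a math student handles early on."""
    return 1 if n <= 1 else n * factorial(n - 1)

print(fizzbuzz(5))   # ['1', '2', 'Fizz', '4', 'Buzz']
print(factorial(5))  # 120
```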
> I would guess that there is actually significant overlap between these two groups.
I wouldn't. Smart people get bored sitting around memorizing things. They'd rather be thinking. Purely anecdotally, the smartest people I know rarely have encyclopedic knowledge of anything.
But thinking has the side effect of storing lots of stuff in your memory. So you can end up with encyclopedic knowledge of subjects that you've spent a lot of time thinking about. It just isn't knowledge that got stored by explicitly trying to "memorize" things (which means it's more reliable anyway, since it's knowledge that's connected to other things you know).
I store a lot of "I've read the solution for that problem somewhere while researching something else." I still can't recite the stdlib doc for any of the programming languages I work with. There's only so much stuff I can cram in my head and reference docs are fairly easy to look up.
Not necessarily; an encyclopedia isn't just an unconnected catalogue of facts--at least, it's not supposed to be. An encyclopedia article about a given subject is supposed to show the subject as a connected whole; it will contain facts, but will also contain important relationships between the facts, general principles, theories that explain the facts, etc. If you have that kind of knowledge of a subject, you don't have to memorize all its facts, because you can easily get to them from the facts you do have memorized via one of many interconnections.