CORRECTION: The post referenced in this article is an old post by Scott. Somehow, I managed to overlook the date of Scott's post and assumed it was his most recent one. So while my reaction is very late, and I believe Scott has moved on (as he mentions in his most recent post), I believe the points made still stand. While Scott won't use a list of trivia questions anymore, too many companies still do. My apologies to Scott for this.
This is a reaction to a post by Scott Hanselman on what great .NET developers ought to know. You might want to read it first. Skimming is permitted, as it's basically a long list of interview questions.
Now, first of all, I love Scott Hanselman, his work and his writings, and I believe he is a very smart guy. I've seen him talk at conferences and if you ever get the chance, go see him. Regardless of the subject. The guy is hilarious.
But in the post in question, I believe he misses the point of how to interview developers. In the article, Scott compiles a list of questions meant to test a developer's skill level. Examples include:
- Is this valid? `Assembly.Load("foo.dll");`
- What does this do? `gacutil /l | find /i "Corillian"`
- How does the `XmlSerializer` work? What ACL permissions does a process using it require?
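For what it's worth, the first question hinges on a detail of the reflection API: `Assembly.Load` takes an assembly display name, not a file path. A minimal sketch of the distinction, assuming a modern .NET runtime:

```csharp
using System;
using System.IO;
using System.Reflection;

// Assembly.Load expects a display name such as "foo", not a file name.
// The call below compiles, but the loader treats "foo.dll" as a simple
// name it cannot resolve, so it throws FileNotFoundException.
try
{
    Assembly.Load("foo.dll");
}
catch (FileNotFoundException)
{
    Console.WriteLine("Load(\"foo.dll\") failed: it wants a name, not a path");
}

// Loading from an actual file path is what Assembly.LoadFrom is for
// (the path here is hypothetical):
// Assembly fromPath = Assembly.LoadFrom(@"C:\libs\foo.dll");
```

So the line is syntactically valid, which is presumably the trick in the question.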
As a disclaimer, he does mention that knowing or not knowing the answers to the questions may not make you a good or bad developer, but it WILL save you time when problems arise.
However, I believe providing this list does more harm than good.
These questions give poor insight into a candidate's skills. They show whether the developer has encountered that specific problem before and, more importantly, whether he/she has encountered it recently.
As an example: I've worked with anonymous types in VB.NET before, but I wouldn't be able to write one on paper right now, because it's been so long. However, if I were to jump head-first into VB.NET code again, I would soon be up to speed.
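For context, this is the feature in question; the anecdote is about the VB.NET syntax, which I've put in a comment next to an illustrative C# equivalent:

```csharp
using System;

// An anonymous type: the compiler generates an immutable class with
// Name and Age properties behind the scenes; you never name the type.
var person = new { Name = "Ada", Age = 36 };
Console.WriteLine($"{person.Name} is {person.Age}"); // prints "Ada is 36"

// The VB.NET equivalent the anecdote refers to:
// Dim person = New With {.Name = "Ada", .Age = 36}
```

It's exactly the sort of syntax you recognize instantly in an editor but fumble on paper after a year away from it.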
What these types of questions don't test is how good a developer really is. But investigating that requires more effort from the interviewer. Firstly, because it requires more in-depth interviewing instead of just handing out sheets of paper with questions. But also because it requires the interviewer to be stronger on a technical level.
Let me explain. What I'm missing in Scott's questions are questions about TDD, Dependency Injection, the SOLID principles, architecture, and other things I find important. Focusing on these aspects of software development in an interview generates an interesting discussion and dialogue with the interviewee, providing more insight into his/her experience, workflow and problem-solving abilities.
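To make concrete the kind of discussion a Dependency Injection question opens up, here is a minimal constructor-injection sketch (all type and member names are hypothetical):

```csharp
using System;

// Composition root: the concrete logger is chosen in one place, so a
// unit test could hand OrderService a fake ILogger instead.
var service = new OrderService(new ConsoleLogger());
service.PlaceOrder("book"); // prints "Order placed: book"

// The service depends on an abstraction, not on Console directly.
public interface ILogger { void Log(string message); }

public class ConsoleLogger : ILogger
{
    public void Log(string message) => Console.WriteLine(message);
}

public class OrderService
{
    private readonly ILogger _logger;
    public OrderService(ILogger logger) => _logger = logger;
    public void PlaceOrder(string item) => _logger.Log($"Order placed: {item}");
}
```

Asking a candidate why the constructor takes an interface, and what that buys you when testing, tells you far more than whether they remember a gacutil switch.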
But it also requires the interviewer to accept that the interviewee might know things the interviewer doesn't. Which is fine, but in my experience, quite a few interviewers (usually with titles like 'lead developer' or 'architect') have a hard time with this feeling.
So the interviewer draws up a list of questions, which is naturally biased towards what he/she knows.
I'm not saying Scott won't also focus on these more general concepts of programming, but in providing this list, I believe he strengthens the idea that we can gauge developer skill just by handing out a list. As long as we focus on these kinds of questions, we won't be hiring the best developers.
As an extra, besides talking about the stuff that matters, talk about the things the interviewee has accomplished. As Joel Spolsky mentions in his Guerrilla Guide to Interviewing, you want to hire people who are
- smart
- and get things done
What projects have they finished? How did it go? How did they tackle certain problems?
If you scroll down to the comments of the post, you will see a comment by Phil Haack saying more or less the same thing.