Do You Speak Binary?

by Ville Laurikari on Tuesday, October 20, 2009

How well do you know the fundamental unit of information, communication, and computation?

When interviewing coders, I often like to ask them to explain how they would implement a function which counts the number of bits set in an integer. This is often called the “population count” problem. In Beautiful Code: Leading Programmers Explain How They Think, Henry S. Warren, Jr. gives a wonderful treatment of the population count problem. This C code implements a clever algorithm to compute the number of set bits in a 32-bit word:

/* Counts 1-bits in a word. */
int pop(unsigned x) {
  x = x - ((x >> 1) & 0x55555555);
  x = (x & 0x33333333) + ((x >> 2) & 0x33333333);
  x = (x + (x >> 4)) & 0x0F0F0F0F;
  x = x + (x >> 8);
  x = x + (x >> 16);
  return x & 0x0000003F;
}

Can you figure out how it works?

For a really hard-core puzzle, try to figure out this piece of code from the Quake III Arena source. It computes an approximation of the inverse square root of a floating point number:

float InvSqrt(float x) {
  float xhalf = 0.5f * x;
  int i = *(int*)&x;
  i = 0x5f3759df - (i >> 1);
  x = *(float*)&i;
  x = x * (1.5f - xhalf * x * x);
  return x;
}

Wow. If you need to cheat (I did), the web is full of explanations. The one that really helped me understand was by Kalid Azad at BetterExplained.

I firmly believe that every programmer should try to build a good working understanding of the bit world. It’s not that you have to know that stuff to be able to write (most) programs. But look at it this way. That computer on your desk has a CPU in it. That CPU is executing billions of instructions per second to run your code. Are you really not interested at all in what’s going on in there? Don’t you want to know why computers suck at math?

The next time you are interviewing a coder, here are some questions you could ask to probe their understanding of the binary system:

  • What are the first ten powers of two?
  • What is 0x7F in binary? In decimal?
  • What does two’s complement mean?
  • How would you extract the 8 low-order bits from a word?
  • How would you count the number of bits in a word?
  • What arithmetic operation does shifting right by one bit correspond to?

To me, total lack of understanding in this area is a big red flag and should cause a no hire. It means that you either haven’t been programming long enough to encounter these bit-level problems, or worse, you did encounter them but gave up for one reason or another. It’s a real confidence booster to know that you can, if need be, go all the way down to the bit level to solve a problem or understand a bug.

It’s all ones and zeroes, right? The bits are your friends. How well do you know them?




Crossbrowser October 20, 2009 at 14:34

As you said yourself, most people will never ever have to work at the bit-level, then why is it important to know this stuff? I can answer most of these questions, at least partially, but I wouldn’t be able to code anything that needs binary operations (from the top of my head, if I have access to the Internet that’s another story).

I think what’s most important is how the person answers, not what the answer is because I wouldn’t expect a lot of coders to know that. Furthermore, if I was interviewing for a job and was asked that, it would raise a red flag on my side, I wouldn’t want to work somewhere where they’re still doing bit operations.

Ville Laurikari October 20, 2009 at 16:05

Thanks for your comment – yes, learning about the bit-level is not necessary, but I still encourage you to learn it. It’s not that hard, and it can prove to be useful one day.

Part of my feeling towards this (and other similar skills) is 50% cognitive dissonance. I’ve programmed in assembler, twiddled the bits, learned C, learned Lisp and Haskell, and I feel all this made me a better programmer. It would be hard to admit that it was all a waste of time.

The other 50% is that there seems to be a strong correlation between the capability to get things done and the versatility of skills you possess as a programmer. It’s not really about any particular skill; it’s about the general interest towards the craft of programming, constantly learning more and more and more, and becoming a better and better and better programmer in the process.

As Andrew Hunt and David Thomas say in The Pragmatic Programmer:

Tip 8: “Invest Regularly in Your Knowledge Portfolio”

Some basic bit twiddling skills are a great candidate for the next thing to add to your knowledge portfolio.

I cannot resist closing with this famous quote from Robert A. Heinlein:

A human being should be able to change a diaper, plan an invasion, butcher a hog, conn a ship, design a building, write a sonnet, balance accounts, build a wall, set a bone, comfort the dying, take orders, give orders, cooperate, act alone, solve equations, analyze a new problem, pitch manure, program a computer, cook a tasty meal, fight efficiently, die gallantly. Specialization is for insects.

Crossbrowser October 20, 2009 at 17:34

I understand that knowing binary is a good thing, what I disagree with is when you say “If the next coder you interview has trouble answering more than a couple of the following, that should be a no hire”. Many people have a no-hire checklist but if all those no-hire points were respected all the time, not many programmers would have a job.

Ville Laurikari October 20, 2009 at 18:44

Yeah, I agree, that sounds a bit steep. It’s the “total lack of understanding” which should cause the no hire flag to rise. I toned it down, but I maintain that all programmers should understand the binary system. It’s not hard, and it really helps you understand the world of computing a lot better.

Toni Kaikkonen January 15, 2010 at 20:58

Everyone who has been doing 8-bit coding as a kid should be able to count the powers of two up to 2^16 as easily as the alphabet. Your post made me realize that for non-programmers those numbers are probably meaningless :)

Ville Laurikari January 15, 2010 at 21:42

Meaningless indeed. When I turned 32, it was tough trying to explain to my mother why the cake had “0100000” on it.

Susanna Kaukinen March 22, 2010 at 00:38

I’ve seen my part of job interviews and I also have to disagree w/you here. I don’t see why this is relevant unless you’re hiring a programmer for embedded systems. I have always been able to deal w/bits when I’ve had to, but it’s rare.

If I was hiring a programmer, I’d like to have someone who can write code that does just what it’s meant to do and nothing else in a style that is easily readable by other programmers. I’d like to have someone who can demonstrate that they can understand algorithmic concepts and that they understand and care about good (architectural) design and writing code that is easy to understand and to modify. If this person can do all these things and doesn’t know much about bits, I’m sure they can learn as they go – if they ever need to.

I’ve seen too many clever hacks to be too amused about them. In most cases they offer a local optimisation that has no effect on the overall performance of the program and they’re all too often the source of hard to fix bugs.

Yes, it’s nice to know this stuff you talk about, but is it something you really should use in job interviews? I’d say no. How about looking for programmers who are committed to quality and who can work well with others and who can get things done alone, as well?

I think there are more important things. :-)

Ville Laurikari March 22, 2010 at 14:45

Of course mad binary skillz isn’t exactly at the top of the list when hiring programmers (for most roles). I am certainly not trying to promote promiscuous use of clever hacks.

Asking about bit twiddling in job interviews – depends really on who you’re hiring. If they claim to be “expert C programmer” I expect them to know this stuff. If they claim to be “expert rails guru” I probably would not.

But fair enough, based on the feedback received on this post so far, I may have to adjust my position on this. Am I growing old?

Susanna Kaukinen March 23, 2010 at 01:42

I’ve been working w/C/C++ since 1998 and only in my current work I actually need to know the bitwise operations a bit more. I can remember the basic ones off the top of my head when I think a bit, but I’ve never actually needed to use the more complicated ops anywhere. I’ve looked at them a couple of times just out of curiosity, but what you don’t use, you tend not to remember.

One time we created a message passing system that needed some performance, but we used a kernel contained linked list to do it and got to millions of messages per second on a Pentium machine, IIRC. Surely there were some clever optimizations in the code that we interfaced to, but we never needed to write it ourselves.

I think you need to be working in a specific area to need bitwise ops often. Perhaps games/graphics, device drivers or low performance embedded devices. Very often it is the case that the sane thing to do is to use some sufficiently good library instead of going about hacking yet another clunky wheel by yourself.

Just my 2 cents on this thing. I have little doubt that I could master the 1337est bitwise ops soon enough, if the occasion should ever arise. I’m just not so sure that it will. Quite often I’ve just quickly written macros or functions for the ops I’ve needed to get rid of the amount of detail that just distracts me from what I’m really building.

I haven’t needed much assembler, either. I think we did create some in the university compiler course, though. But professionally, I think I’ve used basically just one assembler call to get values from the tick register and I’ve needed to read some to figure out if the compiler is optimizing as we’d like it to do. Of course, as I suggested to my colleague before the fact, it did.

To be honest, going bitwise or doing assembler before profiling seems to me like premature optimisation. Even in the event that you find a bottle-neck in your code, it might very well be that you’ve just botched up the design or used the wrong algorithms to do the job. Going bitwise or asm would be in my opinion the last resort.

The only “clever” optimization that comes to mind at once that I think might be worth doing is making your string class allocate some memory initially from the stack. That’s faster than calling reserve() for STL strings, which uses dynamic allocation. However, even this depends on what you’re doing.

It depends very much on the context. In many situations you are dealing w/so small amounts of data that even using the perfect algorithm or container for the job might not be worth it, if it makes the code harder to read. In most situation it doesn’t really matter if seeking is slow, if you have just dozens of items. YMMV, though.

To me the essence of writing programs is about attempting to grasp the problem at hand perfectly. This means that you make the program do just what it’s supposed to do and nothing else. Keeping it small, tight, to the point often removes you of the need to start optimising the program. It’s good enough to begin w/.

I think the Einstein quote says it all: “Make everything as simple as possible, but not simpler.”

010001110101010101011001 June 16, 2013 at 04:22

010011000100111101001100

