
Beyond OS kernels, drivers, and embedded systems, is there any real use for this?

I only ask because we were never taught about bit shifting at university, and I can't think of a time when it would have been useful in my work. Despite that, it seems to be a very common interview question, so I have educated myself about it for that reason alone.



I first learned about bit-shifting when I took DIP / Computer Vision as an undergrad. All the assignments were done as plugins for ImageJ, which is apparently widely used in the scientific community (or so the course claimed). ImageJ stores the pixel values for images as bytes, ints, or longs (depending on the color-depth), so to get the individual component values from a 32-bit RGBA image (8 bits per channel), you would do something like this:

int pixel = image.get(x, y);

int alphaVal = (pixel & 0xFF000000) >>> 24;

int redVal = (pixel & 0x00FF0000) >> 16;

int greenVal = (pixel & 0x0000FF00) >> 8;

int blueVal = (pixel & 0x000000FF);
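Going the other direction, shifting left and OR-ing packs the channels back into a single int. A sketch of my own; `packPixel` is not an ImageJ method:

```java
class PixelPack {
    // Pack four 8-bit channel values into one 32-bit ARGB int.
    // Each channel is shifted to its byte position and OR-ed together.
    static int packPixel(int alpha, int red, int green, int blue) {
        return (alpha << 24) | (red << 16) | (green << 8) | blue;
    }
}
```

Note the `>>>` (unsigned shift) for alpha in the extraction above: with `>>`, an alpha of 0x80 or more would sign-extend and give you a negative value.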

That's just one example with one piece of software, but I know similar approaches are common in the world of imaging/graphics. Maybe networking? Seems like it would map well onto IP address operations.
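It does: an IPv4 address is just a 32-bit int, so the same mask-and-shift pattern extracts octets or computes a network address. A rough sketch under that assumption; the method names are mine:

```java
class Ipv4 {
    // Shift each octet down to the low byte and mask it off.
    static int[] octets(int addr) {
        return new int[] {
            (addr >>> 24) & 0xFF,
            (addr >>> 16) & 0xFF,
            (addr >>> 8) & 0xFF,
            addr & 0xFF
        };
    }

    // AND-ing with a netmask built from the prefix length
    // (e.g. /24 -> 0xFFFFFF00) yields the network address.
    static int network(int addr, int prefixLen) {
        int mask = prefixLen == 0 ? 0 : -1 << (32 - prefixLen);
        return addr & mask;
    }
}
```

For example, 192.168.1.77 is 0xC0A8014D, and its /24 network is 0xC0A80100 (192.168.1.0).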


I bookmarked this comment since it more or less answers that question: http://news.ycombinator.com/item?id=3452869

In short, bit operations can help make performance-sensitive code blazing fast, and micro-optimizations like these start to matter in any piece of code that's executed many times. For the same reason most of us don't write in assembly, most of us probably don't need them in our applications; we're free to waste. But besides being fun and interesting, the practical applications of bit hacks do go beyond your short list. Game engines (physics, graphics, AI, networking) and databases are two more general areas off the top of my head; compression is another, though it could be considered a special case of databases.
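As a taste of the kind of micro-optimization meant here, two classic bit hacks (my own sketch, not taken from the linked comment):

```java
class BitHacks {
    // A positive power of two has exactly one bit set,
    // so clearing the lowest set bit (x & (x - 1)) leaves zero.
    static boolean isPowerOfTwo(int x) {
        return x > 0 && (x & (x - 1)) == 0;
    }

    // A right shift divides by a power of two.
    static int halve(int x) {
        return x >> 1;  // equals x / 2 for non-negative x
    }
}
```

(Modern compilers do the shift-for-divide trick themselves, but the power-of-two test still shows up everywhere, e.g. in hash table sizing.)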


I've used bit shifting a lot when dealing with any sort of audio/video application. Specifically, when muxing audio and video into a container (e.g. MPEG transport streams) you need to set up a bunch of bit flags that are packed very tightly, and you also frequently need to write data at non-byte boundaries. The result is a couple hundred lines of pointer arithmetic and bit shifts to convert between verbose data structures and the format in question.
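To give a flavor of what that code looks like, here is a stripped-down bit writer that packs an n-bit value at an arbitrary bit offset, MSB-first (the layout MPEG-TS headers use). `writeBits` is my own illustrative helper, not from any real muxer:

```java
class BitWriter {
    // OR each bit of `value` (numBits wide) into `buf`,
    // starting at absolute bit position `bitPos`, MSB-first.
    static void writeBits(byte[] buf, int bitPos, int value, int numBits) {
        for (int i = numBits - 1; i >= 0; i--) {
            int bit = (value >>> i) & 1;
            int byteIdx = bitPos >>> 3;    // bitPos / 8
            int shift = 7 - (bitPos & 7);  // position within the byte
            buf[byteIdx] |= bit << shift;
            bitPos++;
        }
    }
}
```

For example, writing the 5-bit value 0b10110 at bit offset 3 of a zeroed buffer leaves the first byte as 0x16.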


> I only ask because we were never taught about bit shifting at university

At CMU, three of the first four intro programming classes discuss bit shifting in depth.


Are OS kernel, driver, and embedded systems development off topic on HN? There were several embedded developer posts in the latest Who's Hiring thread. Web development is by far the largest software segment that HN focuses on, but is by no means the only one.


I never suggested it was off topic, I usually enjoy reading the low level programming threads on HN.

I just wondered why it seems to be considered something every CS grad/programmer should understand.


As an embedded developer/EE, I feel the same way, but from the other side. There is too much breadth for any single person to be proficient at all 7 layers, but from time to time, you still see job descriptions that ask for assembly, C, C++, Java, PHP, HTML, Python, CSS, Flash, Haskell, PCB Layout, VHDL, 7 yrs iOS...


> Beyond OS kernels , drivers and embedded systems is there any real use for this?

If you're parsing a binary format you'll probably use bit shifting.
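For instance, reassembling a big-endian 32-bit integer from raw bytes is bread-and-butter binary parsing. A generic sketch:

```java
class BinParse {
    // Mask each byte to 0..255 (Java bytes are signed),
    // shift it to its place, and OR the pieces together.
    static int readU32BE(byte[] b, int off) {
        return ((b[off]     & 0xFF) << 24)
             | ((b[off + 1] & 0xFF) << 16)
             | ((b[off + 2] & 0xFF) << 8)
             |  (b[off + 3] & 0xFF);
    }
}
```

The `& 0xFF` before shifting matters: without it, a byte like 0x80 sign-extends to 0xFFFFFF80 and clobbers the other fields.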

Bit shifting is just bitwise arithmetic. Assuming you did CS or a related degree, it's more likely you forgot being taught it. In fact, it'd be pretty hard to come up with a CS syllabus that does not mention bitwise arithmetic.

These are some subjects where bitwise arithmetic is bound to appear: Intro to Programming, Data Structures, Operating Systems, Computer Architecture/Organization, Graphics, Cryptography, Implementation of Programming Languages.


> we were never taught about bit shifting at university

This is surprising to me. I'm taking a computer architecture course now, and it's the third class in which bitwise operations have come up, this time in the greatest detail (exactly how they're implemented in hardware). In fact, I think they were first mentioned in my intro to CS course, which used C++ (more like C with streams, but whatever).

Is it a CS program or something like CIS?


CS courses tend to explain what bit shifting is but not when you would use it.

For example, you'll often see a lecturer drawing lots of 1s and 0s on a whiteboard to illustrate it, but you're less likely to see a simple example Java program that uses shifting to deconstruct RGB values.



