
LtRammstein

macrumors 6502a
Jun 20, 2006
570
0
Denver, CO
What a silly question.

Bitwise operations are pretty cool beyond just the multiply/divide-by-two trick. In certain algorithms they can speed up execution, because bitwise operators are hardware-level operations that the CPU's ALU performs directly. Depending on how you use them, you can get the same results noticeably faster. On a normal Mac/PC you usually won't see a difference, but if you write parallel code for supercomputers you can.
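A quick sketch in C of the multiply/divide-by-two equivalence (for unsigned values; signed negative numbers round differently):

Code:
/* Shifting left/right by n multiplies/divides by 2^n for unsigned values. */
#include <stdio.h>

int main(void)
{
    unsigned int x = 13;

    printf("%u\n", x << 1);   /* 26  == x * 2 */
    printf("%u\n", x << 3);   /* 104 == x * 8 */
    printf("%u\n", x >> 2);   /* 3   == x / 4 (integer division) */

    return 0;
}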

Also, certain frameworks in OS X and in .NET on Windows have objects that expect bitwise operations. For example, in .NET, if you want a single button to handle multiple options, you combine the flag values with a bitwise OR; that way you don't have to write a new handler for every possible combination.
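The general idea is something like this; just a generic C sketch, not the actual .NET types, and the flag names are made up for illustration:

Code:
/* Hypothetical option flags: combine them with bitwise OR, test with AND. */
#include <stdio.h>

enum {
    OPT_SAVE  = 1 << 0,
    OPT_PRINT = 1 << 1,
    OPT_CLOSE = 1 << 2
};

static void handle(unsigned int opts)
{
    if (opts & OPT_SAVE)  printf("saving\n");
    if (opts & OPT_PRINT) printf("printing\n");
    if (opts & OPT_CLOSE) printf("closing\n");
}

int main(void)
{
    handle(OPT_SAVE | OPT_CLOSE);   /* one handler covers every combination */
    return 0;
}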

In microcontroller/microprocessor programming you end up doing bitwise operations all the time, comparing registers against values to get the behavior you want.
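Typical register fiddling looks roughly like this (PORTB here is just a stand-in variable, not any particular chip's register):

Code:
/* Set, clear, toggle, and test individual bits in a (pretend) register. */
#include <stdint.h>
#include <stdio.h>

static volatile uint8_t PORTB;   /* stand-in for a real hardware register */

int main(void)
{
    PORTB |=  (1 << 3);          /* set bit 3    */
    PORTB &= ~(1 << 5);          /* clear bit 5  */
    PORTB ^=  (1 << 0);          /* toggle bit 0 */

    if (PORTB & (1 << 7)) {      /* test bit 7   */
        printf("bit 7 is high\n");
    }

    printf("PORTB = 0x%02X\n", (unsigned)PORTB);   /* 0x09 here */
    return 0;
}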

Hopefully this helps. If I said something wrong or left out any detail, please reply with a correction.
 

lee1210

macrumors 68040
Jan 10, 2005
3,182
3
Dallas, TX
What if your datatype isn't an int?

While a compiler may do this optimization for you, a bit shift on a CPU is much cheaper than an actual multiply or divide.

If the right-hand operand is a variable, then you would have to do a lookup, calculation, etc. to see what to multiply or divide by. If you know how many bits you want to shift, why use a different operation? Multiplying or dividing by powers of 2 is not more fun than a bit shift.

Sometimes your intentions in code will be clearer and easier to read if you use a bit shift. For example, if you want to test whether bit 14 is set, (x & (1 << 14)) != 0 is easier for me to decipher than (x & 16384) != 0.

Nowadays it's not that common to store a set of statuses in an int or short; you just use a BOOL or some other typedef, accept the wasted bits, and be done with it. That hasn't always been the case, though, so if you are storing 16 discrete flags in one short, it is pretty critical that you know how to shift things around, test bits, etc.

Again, not common, but if you are storing 4 8-bit values in a 32-bit int, shifting/masking to pull out the values would be pretty commonplace.
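Something along these lines (just a sketch; the byte order chosen is arbitrary):

Code:
/* Pack four 8-bit values into one 32-bit word and pull one back out. */
#include <stdint.h>
#include <stdio.h>

int main(void)
{
    uint8_t a = 0x12, b = 0x34, c = 0x56, d = 0x78;

    uint32_t packed = ((uint32_t)a << 24) |
                      ((uint32_t)b << 16) |
                      ((uint32_t)c <<  8) |
                       (uint32_t)d;

    printf("packed = 0x%08X\n", (unsigned)packed);                  /* 0x12345678 */
    printf("third  = 0x%02X\n", (unsigned)((packed >> 8) & 0xFF));  /* 0x56 */

    return 0;
}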

Rarely is something there that is wholly useless. Uses might not readily present themselves to you, and they may be outdated, but they certainly exist.

-Lee
 

Sander

macrumors 6502a
Apr 24, 2008
521
67
It may also be worth pointing out that on older CPUs, multiply/divide instructions weren't available (and on many microcontrollers, they still aren't). Compilers nowadays are smart enough to turn a division into a bit shift when they can "see" that the divisor is a power of two, but the earliest compilers didn't do this.

So the remaining reason for using them in modern code is, as lee1210 said, to make it clear to a reader of the code what you're trying to do. If I'm extracting the "green" value out of an RGBA pixel, code like (rgba >> 16) & 0xff is an idiom anyone used to dealing with bitmaps will recognize, while rgba/65536 - 256*(rgba/16777216) looks very strange.
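To make the comparison concrete, here's a quick sketch assuming the 0xRRGGBBAA layout implied above; both expressions produce the same value, but only one of them says what it means:

Code:
/* Two ways to pull the green channel out of a pixel laid out as 0xRRGGBBAA. */
#include <stdint.h>
#include <stdio.h>

int main(void)
{
    uint32_t rgba = 0x11223344;                      /* G should come out as 0x22 */

    uint32_t g1 = (rgba >> 16) & 0xff;               /* the shift/mask idiom      */
    uint32_t g2 = rgba/65536 - 256*(rgba/16777216);  /* the arithmetic equivalent */

    printf("shift/mask: 0x%02X  arithmetic: 0x%02X\n", (unsigned)g1, (unsigned)g2);
    return 0;
}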
 

chown33

Moderator
Staff member
Aug 9, 2009
10,751
8,424
A sea of green
Thanks for the explanation guys, just came across bit shifts and really wasn't sure what they were for.

In general, multiplication and division are implemented with an algorithm in which shifting is one of the basic operations. This is true whether it's done in software or in hardware.
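A toy software version of the idea, just to illustrate shift-and-add multiplication (real hardware and library routines are considerably more sophisticated):

Code:
/* Shift-and-add multiplication: multiply built from shifts and adds. */
#include <stdint.h>
#include <stdio.h>

static uint32_t mul_shift_add(uint32_t a, uint32_t b)
{
    uint32_t result = 0;

    while (b != 0) {
        if (b & 1)          /* if the low bit of b is set...            */
            result += a;    /* ...add the current shifted copy of a     */
        a <<= 1;            /* the next bit of b is worth twice as much */
        b >>= 1;
    }
    return result;
}

int main(void)
{
    printf("%u\n", (unsigned)mul_shift_add(13, 11));   /* prints 143 */
    return 0;
}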
 