Hey guys. A while back I got the O'Reilly book and worked through the first few chapters, but I got busy with school and a little discouraged (especially because the book was already outdated by Xcode's updates by then), so I stopped. Then the Trent and McCormack book came out and it renewed my interest in learning Cocoa, so I hit the threads here and saw that I really ought to check out the books by Kochan and by Hillegass. So, I decided to get all three and really do it this time.

I've been reading Kochan's book first (which is so much better than the O'Reilly one) and got halfway through chapter 4, where they started getting into bit operations. I must admit I got scurrrred. See, I'm dyslexic, and my brain has trouble with computational tasks but is great at grasping conceptual ones. So when Kochan started talking about right-shifting hexadecimals, I got nervous. I get the concept behind non-base-ten numbers, but I have no experience converting between them. Furthermore, I know that everything on a computer is ultimately binary, but I don't understand the process. I always assumed the machine-level code/kernel dealt with all of this, but it appears the programmer handles it as well.

My questions are: To what extent is Cocoa programming involved with bit-level operations? Is it something that is used frequently, so I should stop the Kochan book until I fully understand it? Or is it something that Cocoa programmers have access to but rarely utilize? Or something that is used frequently in developing certain types of applications (if so, which types), but not in most others?

I'm gonna keep going in the book and see how I do, but I appreciate any clarification you can bring to my back-***wards brain. Thanks.