Calling toupper in C

Discussion in 'Mac Programming' started by cybrscot, Mar 21, 2011.

  1. cybrscot macrumors 6502

    cybrscot

    Joined:
    Dec 7, 2010
    Location:
    Somewhere in Southeast Asia
    #1
    Book says that programs that call toupper need to have the following directive at the top.

    #include <ctype.h>

    What I want to know is, if I'm writing a program that calls toupper, do I now use two directives?
    #include <stdio.h>
    #include <ctype.h>


    Or just the <ctype.h>??
     
  2. lee1210 macrumors 68040

    lee1210

    Joined:
    Jan 10, 2005
    Location:
    Dallas, TX
    #2
    Use both unless it is guaranteed that one will include the other. Even then it won't hurt to use both because some trickery will prevent multiple includes of the same file.

    -Lee
     
  3. cybrscot, Mar 21, 2011
    Last edited: Mar 21, 2011

    cybrscot thread starter macrumors 6502

    cybrscot

    Joined:
    Dec 7, 2010
    Location:
    Somewhere in Southeast Asia
    #3
    Thanks Lee, this type of question is exactly what the book leaves out. I think it's a perfectly reasonable question to ask: how would anyone know whether to use only one or both? The book doesn't tell me that at all. I guess this is yet another area where the author assumes an instructor will intervene. If you have one!!

    Thanks man!

    EDIT: One more thing the book leaves out. When or why would I ever want/need to convert/change the case of the letter from upper to lower, etc etc? I'm thinking it's only if the user enters a case, such as upper or lower, and I want my program to display their input, but I want their input displayed specifically as either upper or lower, thereby converting it to the case I want. Is this right?
     
  4. lee1210 macrumors 68040

    lee1210

    Joined:
    Jan 10, 2005
    Location:
    Dallas, TX
    #4
    There are many reasons you might want to change the case of a character or string. Say you want the user to enter yes or no, then test which, if either, they entered. You could force them to type it exactly in lowercase, or check for:
    YES
    YEs
    YeS
    Yes
    yES
    yEs
    yeS
    yes
    NO
    No
    nO
    no
    or use toupper and compare to:
    YES
    NO

    This is probably the most common... Performing a case-insensitive comparison. If both things you're comparing are the same case, it's much easier.

    Say case is irrelevant or you know some piece of data like a serial number should only have uppercase characters. Don't trust the user, just force it to uppercase.

    Basically most functions in libraries are there because people needed them. You may not need one, I'm sure not many people use all of them, but if there was no use no one would have written it in the first place.

    -Lee
     
  5. cybrscot thread starter macrumors 6502

    cybrscot

    Joined:
    Dec 7, 2010
    Location:
    Somewhere in Southeast Asia
    #5
    That makes sense Lee, your example of a serial number is quite practical. I understand that. Now I also see about the yes/no answer. The program won't know what answer the user entered if they screwed up the case or mistyped. So you're saying that toupper will convert it to exactly what we want, so the program can run even if the user makes a typo. Rather than forcing the user to use a particular case.
     
  6. lee1210 macrumors 68040

    lee1210

    Joined:
    Jan 10, 2005
    Location:
    Dallas, TX
    #6
    It's about flexibility (the yes/no). Maybe they love entering YES or Yes and we don't want to deprive them. If we only matched yes or no, we'd tell the user they entered something incorrectly. If we didn't modify the case we'd have to check all 12 permutations in an ugly chunk of code. If we use toupper before comparing we can just check YES and NO in a clean piece of code and pick up all 12 permutations. The user is happy because they're not forced to enter a specific case, and we're happy because our code isn't a rat's nest of permutations. This is sometimes referred to as case folding. We had 12 permutations, plus a thirteenth if they didn't enter any of those. toupper allowed us to fold that down to three cases: YES, NO, or neither/error.

    -Lee
     
  7. Sander macrumors 6502

    Joined:
    Apr 24, 2008
    #7
    Your recent posts make me think you should perhaps try a different book... :)
     
  8. balamw Moderator

    balamw

    Staff Member

    Joined:
    Aug 16, 2005
    Location:
    New England
    #8
    We've been telling cybrscot that for a while now. :p

    FWIW I just finished your book and I think it's an excellent practical guide to C for the scientifically inclined. I like how you build up what is needed in a logical and direct sequence (unlike K N King's book, which I find delays the introduction of a lot of useful tools and focuses on minutiae too early) and touch on many topics of interest to the target audience. The Synopsis and Other Languages sections of each chapter are quite valuable. Especially at the price of the eBook, it's very hard to beat its value.

    As an experimentalist/engineer the only topic missing was some basic discussion of data acquisition via RS-232, GPIB or Ethernet which is a common task in the lab. (Though this may be complicated by the fact that it varies by hardware platform).

    B
     
  9. Bill McEnaney macrumors 6502

    Joined:
    Apr 29, 2010
    #9
    Too concise?
    Code:
    #include <ctype.h>
    #include <stdbool.h>
    #include <stdio.h>

    bool should_continue(void)
    {
      puts("Do you want to continue?");
      return toupper(getchar()) == 'Y';
    }
     
  10. notjustjay macrumors 603

    notjustjay

    Joined:
    Sep 19, 2003
    Location:
    Canada, eh?
    #10
    I think what's important to understand is the concept behind why you are adding these directives.

    As you have probably figured out, the reason you need to "#include <ctype.h>" in the program when you want to use the toupper() function is that ctype.h is where the toupper() function is declared, and your program will have no idea what the heck "toupper()" is until you include the header file. It is part of the standard C library, but you need to ask for it, so to speak, in order to get access to it.

    As your programs increase in complexity you will begin drawing on functions from all over the C library, and you will begin including more and more header files. You will also eventually begin writing your own code libraries and begin including header files from your own code, either because you're writing something that's simply too big to store in just one file, or because you are accumulating useful utilities or, eventually, classes, that you want to reuse in other programs.

    Think of it like writing a paper and inserting bibliographical references. A small paper might only require one or two references, but as you refer to more and more, you need to keep inserting references to all of them.
     
  11. lee1210 macrumors 68040

    lee1210

    Joined:
    Jan 10, 2005
    Location:
    Dallas, TX
    #11
    To expand on this analogy, for a short paper you may only have one source, but you could have 5 or 6 references to this source. The source would be the header file, and the references would be functions defined in the header. You then find that you need to reference a new source as your paper (program) grows, so you have to add it to your bibliography (#include the header file) so you can reference information (functions) in the new source.

    -Lee
     
  12. cybrscot thread starter macrumors 6502

    cybrscot

    Joined:
    Dec 7, 2010
    Location:
    Somewhere in Southeast Asia
    #12
    Okay, I'm trying to get Learn C on the Mac, and I will try to switch my learning to that book and see how it compares. Hopefully it makes it all seem easier and more intuitive. As for the programming book for the scientifically inclined, it's the "scientifically inclined" part that scares me. I think if I were so inclined, my current book would be just fine for me; maybe I'm not so scientifically inclined and that book would be really difficult to understand?

    I'm currently still in chapter 7. I've gone much slower lately and taken many days off because it was becoming too intense and I was staying up way too late banging my head against the wall. Unfortunately taking time away has made me forget some things that I knew pretty well before.

    Also, it seems that almost nothing from ch 1-6 is helping me to understand Ch 7. All the new escape characters are confusing with regard to character constants. The sizeof operator makes no sense to me at all, nor do the related explicit conversion and implicit conversion operations. (can I call them operations or functions?) And, using the cast operator. Usual arithmetic conversions and conversion during assignment, and type definitions.

    The problem for me is that while reading this I don't see the connection or usefulness based on what I learned already. These are such new concepts that I can't think of a situation when I'd want to use them.

    In other words, as I read ( I think I'm like most people) I'm trying to think about in what situation would I say to myself, "okay, I'll need to use this new trick". I can't currently imagine a problem or situation that when I encounter it, I'll say to myself, "okay, I need to use a cast ". The book says we can use a cast to compute the fractional part of a float value. But I ask myself, "when will I know upon creating a program that I will want to compute the fractional part of a float value?"

    Book: "Cast expressions enable us to document type conversions that would take place anyway." I know if they take place anyway, they are implicit, but what does it mean to "document type conversions that would take place anyway", and why do I need to do that?

    Balamw: You may have recommended a book in the past, but I don't remember if you did or not. Will the learn C on a Mac be good, or do you steadfastly recommend a specific book? I value your opinion on this.
     
  13. balamw Moderator

    balamw

    Staff Member

    Joined:
    Aug 16, 2005
    Location:
    New England
    #13
    If you add a float and an int to get a float, the int is treated as a float. Explicitly casting it removes any doubt as to how you want it to be treated.

    So even though:
    Code:
    float sum = myfloat + myint;
    
    works
    Code:
    float sum = myfloat + (float)myint;
    
    is better because the casting is explicit and thus documented.
    Everyone is different, and I don't have a current favorite C book.

    Learn C on the Mac seems to be a decent choice and certainly worked well enough for larswik. The little I've seen of it (Thanks Amazon!) makes it look a lot simpler than KN King's book. The one thing it may be lacking is an overarching project like several books I have enjoyed recently.

    I actually used a much older edition of this book: C Primer Plus when I set out to formally learn C many moons ago. So I don't know if I would still recommend it.

    Sander's book is very good if you already think scientifically and are inclined to write code to solve math/science problems, but I wouldn't recommend it as a general purpose C book as that is not its purpose.

    B
     
  14. notjustjay, Mar 24, 2011
    Last edited: Mar 24, 2011

    notjustjay macrumors 603

    notjustjay

    Joined:
    Sep 19, 2003
    Location:
    Canada, eh?
    #14
    I think the bottom line is that any book is an extension of the author's own knowledge and learning style. If I were to write a book about programming I would base the flow of the book upon my own experiences and knowledge. I would try to guess how quickly I think you could learn a concept and provide just enough explanations to move things along at that pace. It might be that my pace is too slow for someone ("yeah, yeah, I get it, let's move on already") or too fast for someone else ("huh? how'd you get from point A to point B?")

    That's why I advocate reading multiple books (or following up on book knowledge with a Google search, for example). Different people explain things in different ways, and reading multiple explanations can help you piece together the subtle things that one author might leave out but another author explains better.

    Maybe instead of following through one book, chapter by chapter, exercise by exercise, you should start coming up with practical projects that would help you exercise the skills you have already learned. Come up with a challenge for yourself, such as: "I want to write a simple text-based game", or "I want to write a program that will help me solve (a problem that I'm working on in my own life)", and work your way toward your own success. You may be more motivated by actually getting your own creation to work than by implementing the author's idea of an exercise and saying "I don't get why he would ever want me to do that".

    Start simple, and work your way bigger. Don't get too ambitious ("I want to write my own version of Angry Birds" is probably not a good beginner project).

    I think part of the problem is that you're missing some background information that apparently the book isn't explaining. Knowing about character constants doesn't mean much if you aren't aware of, say, ASCII character representations (and why you'd want to use them), or the various unprintable but useful characters such as newline, carriage return, linefeed, escape, etc. -- admittedly somewhat antiquated now that we no longer tend to think of computers as typewriter-like devices.

    Someone like myself might understand carriage returns because I remember using typewriters that literally had a carriage that returned to the beginning of the next line of paper, and my first printer was a dot-matrix printer that did much the same thing. I know about ASCII characters because I remember when computers booted up into DOS or some similar OS where the entire screen was a matrix of 80x24 "cells" that were only capable of displaying one of X number of different characters -- represented by the ASCII set. So if I were to write a book, I might assume that everyone knows all about that, and forget to explain it...


    A cast tells the compiler "treat this variable as if it were a ___". Otherwise it takes its best guess, but sometimes this guess can be wrong.

    For example if you have an integer number of pizzas, and an integer number of people, you might want to calculate how to divide up the pizzas as follows:

    Code:
    int numberOfPeople = 6;
    int numberOfPizzas = 3;
    float pizzaFraction = numberOfPizzas / numberOfPeople;
    printf("Everyone gets %f pizzas", pizzaFraction);

    You'd expect that if you have 3 pizzas and 6 people, then each person gets 3/6 or 1/2 or 0.5 pizzas, right? You might therefore be surprised to find that the program says that nobody gets any pizza at all! What happened? You gave it two integers and asked it to divide, so the compiler assumes you want integer division. That means the fractional part is discarded: 3 divided by 6 is 0.5, which truncates to 0. If you had used different numbers (say, 15 pizzas divided by 6 people) you would not get 2.5 pizzas per person, like you'd expect, but the integer value of 15 / 6, which is just 2. Either way the results are wrong.

    If you wanted it to work right, you'd have to tell the compiler "I know these are integers, but I want you to treat them like floating point values when you do the math, because I do want fractional parts":

    Code:
    float pizzaFraction = (float) numberOfPizzas / numberOfPeople;

    The cast helps to fix the bug in this program, but it also serves to remind you that it's there for a reason, that you're asking the compiler to pay special attention here, which is what the book meant by "documenting" the type conversion.

    There are many other uses for casting, too, especially when you get to pointers and structures, but this is the most basic use.
     
  15. balamw Moderator

    balamw

    Staff Member

    Joined:
    Aug 16, 2005
    Location:
    New England
    #15
    Exactly, and cybrscot needs to find the book where the author's teaching style matches his learning style.

    That's a bit of a chicken-and-egg problem, as you point out with the Angry Birds example. Newbies are generally poor judges of how big a problem is or isn't. The favorite programming books I have read recently use a larger project as the backbone of the book as a way of driving home how these things fit together. In iPhone Programming, Homepwner takes up much of the book and is built up bit by bit. The fraction calculator in Kochan, etc.... This is one of the main things I see as missing from King.

    I'm a bit of a top down to bottom up kind of guy. I like to get a quick lay of the land and then spend more time filling in the detail. For me, KN King's book seems appropriate for that second pass. Once you have lost the initial fear of tackling a C program and are ready to see what finer points you may be missing.

    In another thread, there was a suggestion to learn programming structure without learning a language first using for example the guide here: http://lepslair.com/tutorials/fundamentals/ and maybe that's appropriate here for cybrscot to step back and get the view from 30,000 feet before diving back in to the detail.

    B
     
  16. notjustjay macrumors 603

    notjustjay

    Joined:
    Sep 19, 2003
    Location:
    Canada, eh?
    #16
    This is true.

    I like to use very simple, text-based games. For example, a program that thinks of a number between 1-100 and asks you to take a guess, and says "smaller" or "larger" depending on your guess. You can start adding more features to it after you get it working, like keeping track of the number of guesses it took you, and tracking the best score.

    A fun twist on the problem might be a program that asks YOU to think of a number between 1-100, and IT takes guesses, and you tell it if it's smaller or larger.

    One of the memorable programming projects I had to do as a high school co-op student was to write a currency conversion program. The first version asked you for a conversion factor and then asked you for the input currency, and output the destination currency. The second version had to have a variety of popular currencies built-in, and the conversion factors were hard-coded. You'd select a source and destination country and a value and it would calculate the rest. The final version had to read the countries and conversion factors from a file. Each version had to be written as professionally as possible, with well-formatted and well-documented code. The code had to be as bulletproof as possible, gracefully handling error situations.

    I would recommend either of those as a great starting point. (I guess this is a bottom-up approach, where you start small, and work your way bigger. Though it can also be seen as a top-down approach if you know that the eventual version will, say, read values from a file, but you start by hard-coding.)
     
  17. Sander macrumors 6502

    Joined:
    Apr 24, 2008
    #17
    (about my book)

    Agreed. I actually had a chapter about this half-done, but I couldn't get it to blend in with the rest of the book. My first approach tried to explain data acquisition from the basics, but that needed far too much detail. There is also the difference in how the OS handles things: On Unix, you would expect any acquisition device to present itself as a file, and you could open it and read from it. On Windows, that is not the modus operandi.

    I did another approach where I took a few example devices, but this version fell out of tone with the rest of the book because it was too much "don't try to understand this, just click here". Which is exactly the approach I was trying to avoid.

    Maybe I'll revisit this topic for a second edition (but it's a lot of work to write a book; I severely underestimated it). What I will then also add is a section about OpenGL in the "graphics" chapter.

    Exactly. The primary reason for me to start writing my book was that I found too many beginner books which assumed that a beginner isn't interested in how things really work. I wasn't like that when I was a beginner myself, so I wanted to offer an alternative.

    Don't be scared :) I just think there are various ways of learning, including (but not limited to) the "Just tell me what to do, I'll figure out why it works that way if I ever feel the need" approach, and the "Just tell me how it works, I'll figure out what I need to do from there" approach. My book aims at the latter.
     
  18. Flynnstone macrumors 65816

    Flynnstone

    Joined:
    Feb 25, 2003
    Location:
    Cold beer land
    #18
    The user never screws up.
    Never, if at all possible, tell the user they made a mistake.
    People do not like being informed they made a mistake, especially by a computer.
    Why is Apple software generally good ...
    Read "The Humane Interface" by the late Jef Raskin.
     
  19. balamw Moderator

    balamw

    Staff Member

    Joined:
    Aug 16, 2005
    Location:
    New England
    #19
    Yet, users will always do the unexpected, so validate all input!

    B
     
  20. jiminaus macrumors 65816

    jiminaus

    Joined:
    Dec 16, 2010
    Location:
    Sydney
    #20
    I think Flynnstone was referring to the attitude of a program, not the reality of the situation.
     
  21. Flynnstone macrumors 65816

    Flynnstone

    Joined:
    Feb 25, 2003
    Location:
    Cold beer land
    #21
    Absolutely.
    Try to design the program to reduce the need to do extensive validation.

    For example, (contrived):
    With a GUI, instead of having the user enter "Yes" in a textbox, use radio buttons, "Yes" and "No". The user only has 2 choices.
     
  22. balamw Moderator

    balamw

    Staff Member

    Joined:
    Aug 16, 2005
    Location:
    New England
    #22
    Agreed on both counts. So, let me rephrase what I wrote above as:

    Users will always do the unexpected if given the chance, so either validate all input, or restrict their choices so you don't have to!

    I had originally written another paragraph to that effect and deleted it 'cause it wasn't reading right. I had said something about using "gentle nudges" rather than a sledgehammer.

    This is one thing I really like about most good Mac/iOS apps. The dialog boxes are far friendlier than the typical Windows OK/Cancel or Abort/Retry/Fail. :p

    NOTE: I think that input is one of the reasons why learning C before Objective-C and Cocoa can be dangerous unless these points are stressed early on. If your only tool for input is scanf, then everything looks like free-form input.

    This is why I liked to see an explicit requirement for data validation in larswik's Pascal homework assignment in this recent thread: http://forums.macrumors.com/showthread.php?t=1125903.

    Kochan also deals with it a bit in his Objective-C book by pointing out early on:
    Note too that even if your UI restricts input values, there may still be other reasons to validate input, as users can sometimes maliciously modify saved files in order to force bad behavior and "game" your program. It could be as simple as getting a score that is impossible to achieve by normal means in a game or trying to jailbreak your OS.

    B
     
