switch to catch non int

Discussion in 'Mac Programming' started by KhuntienNang, Jun 26, 2010.

  1. KhuntienNang macrumors member

    Joined:
    Feb 20, 2008
    #1
    I'm new to Objective-C (and to programming in general).

    So, I was trying out the switch statement, say:

    Code:
    		switch (option){
    			case 1:
    				statement
    				break;
    			case 2:
    				statement
    				break;
    			case 3:
    				statement
    				break;
    			default:
    				statement
    				break;
    		}
    
    option is an int.

    The default catches any other int besides 1, 2, and 3, but it doesn't catch other characters (non-int input).

    Is there a way to add to the default so that it also catches non-integer input?
     
  2. lloyddean macrumors 6502a

    Joined:
    May 10, 2009
    Location:
    Des Moines, WA
    #2
    Since you did say 'character', I'll throw this out:

    Code:
    int option = '?';
    
    switch ( option )
    {
        case 'a':
            ...
            break;
        case 'A':
            ...
            break;
        case '?':
            ...
            break;
    }
    
     
  3. Bill McEnaney macrumors 6502

    Joined:
    Apr 29, 2010
    #3
    Could your option variable be a character variable instead of an integer variable? That way, if each option number were a digit between, say, '1' and '6', the default clause would catch any character outside that range.
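    A minimal sketch of that idea, assuming the option is read one character at a time with getchar() (the '1' to '6' range is just the example above):

    Code:
    #include <stdio.h>
    
    int main(void)
    {
        int option = getchar();   /* getchar() returns an int holding one character */
    
        switch (option)
        {
            case '1':
                /* handle option 1 */
                break;
            case '2':
                /* handle option 2 */
                break;
            /* ... cases '3' through '6' ... */
            default:
                /* any other character (letters, punctuation, EOF) lands here */
                break;
        }
        return 0;
    }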
     
  4. mrbash macrumors 6502

    Joined:
    Aug 10, 2008
    #4
    There are two things you could do. One is to use atoi (stdlib); the other is to cast to int and print it (iostream).

    If you are fairly sure that you are only getting numbers, then atoi is your function.
    char letter = '1';
    int number = atoi(&letter);   // number -> 1
    If you have a letter, say 'a', you would get 0.

    If you want the ASCII value of the letter 'a', you would use:

    char letter = 'a';
    cout << int(letter) << endl;

    which would print 97.
     
  5. chown33 macrumors 604

    Joined:
    Aug 9, 2009
    #5
    You're taking some potentially serious risks with your "not quite a nul-terminated string" there.
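    For anyone following along, here's the earlier snippet with the concern spelled out in comments (nothing new, just annotation):

    Code:
    char letter = '1';
    /* atoi() expects a pointer to a NUL-terminated string, but &letter
       points at a single byte with no guaranteed 0 byte after it, so
       atoi() may read whatever happens to sit next to letter in memory:
       undefined behaviour. */
    int number = atoi(&letter);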
     
  6. mrbash macrumors 6502

    Joined:
    Aug 10, 2008
    #6
    This should be better:

    Code:
    char letter = '1';
    char string[2];
    string[0] = letter;
    string[1] = 0;
    int number = atoi(string);
    
     
  7. Bill McEnaney macrumors 6502

    Joined:
    Apr 29, 2010
    #7
    Did you mean to put the integer zero into the string? Why not put the '0' character there instead?
     
  8. notjustjay macrumors 603

    notjustjay

    Joined:
    Sep 19, 2003
    Location:
    Canada, eh?
    #8
    The integer 0 is the value of the NUL terminator. Every C-style string must consist of a number of characters (your string) followed by that terminator. If your string does not include the terminator, very bad things can happen. (At best, you get garbage results -- at worst, you get crashes or one of those security vulnerabilities that lets a hacker inject his/her own code.)
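    A small illustration of that (the buffers here are made up for the example):

    Code:
    #include <stdio.h>
    
    int main(void)
    {
        char ok[3]  = { 'h', 'i', 0 };   /* terminated: behaves like "hi"       */
        char bad[2] = { 'h', 'i' };      /* same characters, but no terminator  */
    
        printf("%s\n", ok);    /* prints "hi"                                          */
        printf("%s\n", bad);   /* keeps reading past the array: garbage, or a crash    */
        return 0;
    }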
     
  9. chown33 macrumors 604

    Joined:
    Aug 9, 2009
    #9
    C's quoted-string notation is legal as initializers for char arrays. So here's the same thing, more conciserishly:
    Code:
    char string[] = "1";
    int number = atoi(string);
    
     
  10. chown33 macrumors 604

    Joined:
    Aug 9, 2009
    #10
    Because '0' doesn't have an integer zero value.
    Code:
    #include <stdio.h>
    int main( int argc, char** argv )
    {  printf( "%d %d\n", '0', 0 );  /* prints "48 0" on an ASCII system */  }
    
    http://en.wikipedia.org/wiki/C_string
     
  11. Bill McEnaney macrumors 6502

    Joined:
    Apr 29, 2010
    #11
    I know that, but I'm used to writing in programming languages where that assignment would have made the computer complain about a type mismatch.
     
  12. Sander macrumors 6502

    Joined:
    Apr 24, 2008
    #12
    In that case, you'd probably have used '\0'.
    Arguably, the languages you speak of make this more error-prone by that insistence, exactly as shown in your example. You want to null-terminate a string, so you assign the value 0 to its last character; the compiler doesn't like that, so you change it to '0', after which the compiler is happy (but you aren't!).
    If the language has qualms about integer-to-character conversions (which is a good thing in itself), then the 0 should probably be an exception (just like it is for pointers).

    This manual string composition looks rather brittle to me, anyway.
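    To make the three values concrete (just a fragment, reusing the array from the earlier posts):

    Code:
    char string[2];
    string[0] = '1';
    
    string[1] = 0;      /* terminator: plain integer zero                      */
    string[1] = '\0';   /* the same value, written as a character literal      */
    string[1] = '0';    /* NOT a terminator: the digit character, 48 in ASCII  */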
     
  13. Bill McEnaney macrumors 6502

    Joined:
    Apr 29, 2010
    #13
    I can ignore these details when I write in a strongly typed language. I don't need to choose between zero and '\0' because that language won't let me put an integer zero into a string. I don't know about you, but I'd rather use a language that forbids the strange, potentially dangerous "tricks" that C allows. Mrbash was right: unfortunately, I wrote a dangerous C function where I assumed that an array had already been allocated. I just hadn't considered his point because it hadn't occurred to me when I wrote that function. But many C programmers knowingly write dangerous code, and I wish someone would explain why they do that.
     
