
KhuntienNang

macrumors member
Original poster
Feb 20, 2008
I'm new to Objective-C (and to programming in general).

So, I was trying out a switch statement, say:

Code:
		switch (option) {
			case 1:
				statement;
				break;
			case 2:
				statement;
				break;
			case 3:
				statement;
				break;
			default:
				statement;
				break;
		}
option is an int.

The default catches any int other than 1, 2, or 3, but it doesn't catch other characters (non-int input).

Is there a way to extend the default so that it also catches input that isn't an integer?
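A minimal sketch of what may be happening, assuming option is read with scanf (an assumption; the read isn't shown above): on non-numeric input the conversion fails and option is never assigned, so the switch can't see the bad input at all. Checking scanf's return value catches it before the switch runs:

Code:
#include <stdio.h>

int main(void)
{
    int option;

    /* scanf returns the number of items it converted; on
       non-numeric input it returns 0 and leaves option untouched */
    if (scanf("%d", &option) != 1) {
        printf("not an integer\n");
        return 1;
    }

    switch (option) {
        case 1:
            printf("one\n");
            break;
        case 2:
            printf("two\n");
            break;
        case 3:
            printf("three\n");
            break;
        default:
            printf("some other integer\n");
            break;
    }
    return 0;
}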
 

lloyddean

macrumors 65816
May 10, 2009
but it doesn't catch other characters (non-int input)

is there a way to extend the default so that it also catches input that isn't an integer?

Since you did say 'character', I'll throw this out:

Code:
int option = '?';

switch ( option )
{
    case 'a':
        ...
        break;
    case 'A':
        ...
        break;
    case '?':
        ...
        break;
}
 

Bill McEnaney

macrumors 6502
Apr 29, 2010
Could your option variable be a character instead of an integer? That way, if each option number were a digit between, say, '1' and '6', the default clause would catch any character outside that range.
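A minimal sketch of that idea, assuming the choice is read one character at a time with getchar (an assumption; the post doesn't specify how input arrives):

Code:
#include <stdio.h>

int main(void)
{
    int option = getchar();    /* getchar returns the next character as an int */

    switch (option) {
        case '1':
        case '2':
        case '3':
            printf("valid menu choice: %c\n", option);
            break;
        default:
            /* catches letters, punctuation, and anything else out of range */
            printf("not a valid choice\n");
            break;
    }
    return 0;
}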
 

mrbash

macrumors 6502
Aug 10, 2008
There are two things you could do. One is to use atoi (from stdlib); the other is to cast the char to an int (the example below uses C++ iostream).

If you are fairly sure that you are only getting numbers, then atoi is your function.

Code:
char letter = '1';
int number = atoi(&letter);
number is now 1. If you instead have a letter, say 'a', you would get 0.

If you want the ASCII value of the letter 'a', you would use:

Code:
char letter = 'a';
cout << int(letter) << endl;
which would print 97.
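Since the thread is about Objective-C/C rather than C++, the same cast works with plain printf; a sketch of the equivalent:

Code:
#include <stdio.h>

int main(void)
{
    char letter = 'a';
    printf("%d\n", (int)letter);    /* prints 97, the ASCII code for 'a' */
    return 0;
}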
 

mrbash

macrumors 6502
Aug 10, 2008
You're taking some potentially serious risks with your "not quite a nul-terminated string" there.

This should be better:

Code:
char letter = '1';
char string[2];
string[0] = letter;
string[1] = 0;
int number = atoi(string);
 

Bill McEnaney

macrumors 6502
Apr 29, 2010
Did you mean to put the integer zero into the string? Why not put the '0' character there instead?
This should be better:

Code:
char letter = '1';
char string[2];
string[0] = letter;
string[1] = 0;
int number = atoi(string);
 

notjustjay

macrumors 603
Sep 19, 2003
Did you mean to put the integer zero into the string? Why not put the '0' character there instead?

The integer 0 is the equivalent of the NULL character. Every C-style string must consist of a number of characters (your string) followed by the NULL terminator. If your string does not include the terminator, very bad things can happen. (At best, you get garbage results -- at worst, you get either crashes or one of those security vulnerabilities that allows a hacker to inject his/her own code.)
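To make that concrete, a small sketch (not from the thread) showing why the terminator matters:

Code:
#include <stdio.h>
#include <string.h>

int main(void)
{
    char s[2];
    s[0] = '1';
    s[1] = 0;    /* the terminator: without it, strlen, atoi, and printf
                    would keep reading past the array into whatever
                    memory happens to follow (undefined behavior) */

    printf("length: %zu, value: %s\n", strlen(s), s);    /* length: 1, value: 1 */
    return 0;
}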
 

chown33

Moderator
Staff member
Aug 9, 2009
This should be better:

Code:
char letter = '1';
char string[2];
string[0] = letter;
string[1] = 0;
int number = atoi(string);
C's quoted-string notation is legal as an initializer for char arrays. So here's the same thing, more conciserishly:
Code:
char string[] = "1";
int number = atoi(string);
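For reference, the literal "1" initializes a two-element array: the '\0' terminator is included automatically, which is why atoi is still safe here.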
 

Sander

macrumors 6502a
Apr 24, 2008
In that case, you'd probably have used '\0'.
Arguably, the languages you speak of make this more error-prone by this insistence, exactly as shown in your example. You want to null-terminate a string, so you assign the value 0 to its last character; the compiler doesn't like that, so you change it to '0', after which the compiler is happy (but you aren't!).
If the language has qualms about integer-to-character conversions (which is a good thing in itself), then the 0 should probably be an exception (just like it is for pointers).

This manual string composition looks rather brittle to me, anyway.
 

Bill McEnaney

macrumors 6502
Apr 29, 2010
I can ignore these details when I write in a strongly-typed language. I don't need to choose between zero and '\0' because that language won't let me put an integer zero into a string. I don't know about you, but I'd rather use a language that forbids the strange, potentially dangerous "tricks" that C allows. Mrbash was right: unfortunately, I wrote a dangerous C function where I assumed that an array had already been allocated. I just hadn't considered his point because it hadn't occurred to me when I wrote that function. But many C programmers knowingly write dangerous code, and I wish someone would explain why.
In that case, you'd probably have used '\0'.
Arguably, the languages you speak of make this more error-prone by this insistence, exactly as shown in your example. You want to null-terminate a string, so you assign the value 0 to its last character; the compiler doesn't like that, so you change it to '0', after which the compiler is happy (but you aren't!).
If the language has qualms about integer-to-character conversions (which is a good thing in itself), then the 0 should probably be an exception (just like it is for pointers).

This manual string composition looks rather brittle to me, anyway.
 