Hello everyone,
I've got a bit of a methodology question for anyone who can help. I am writing a very simple game where someone is shown a word in, say, French, and then they have to choose the correct word in English from a list of possible options. All of these options are represented by UILabels.
Currently, when the user taps the screen, I compare the touchLocation of that touch against every one of those labels' frames, and if it falls inside one of them I take the appropriate action.
Is this the right way to go about it? I was wondering if, instead, the labels should somehow be touch-aware, so that when touched they could send the message themselves...
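For what it's worth, here is a minimal sketch of the "touch-aware label" idea using UIKit's gesture recognizers, which is the usual alternative to manual hit-testing. The class and method names (`QuizViewController`, `optionLabels`, `checkAnswer(_:)`) are hypothetical placeholders, not anything from my actual project:

```swift
import UIKit

final class QuizViewController: UIViewController {
    // Labels showing the candidate English translations (hypothetical).
    var optionLabels: [UILabel] = []

    override func viewDidLoad() {
        super.viewDidLoad()
        for label in optionLabels {
            // UILabel ignores touches by default; opt each label in.
            label.isUserInteractionEnabled = true
            // Each label needs its own recognizer: a gesture
            // recognizer can only be attached to one view at a time.
            let tap = UITapGestureRecognizer(target: self,
                                             action: #selector(labelTapped(_:)))
            label.addGestureRecognizer(tap)
        }
    }

    @objc private func labelTapped(_ recognizer: UITapGestureRecognizer) {
        // The recognizer's view is the label that was tapped.
        guard let label = recognizer.view as? UILabel else { return }
        checkAnswer(label.text)
    }

    private func checkAnswer(_ word: String?) {
        // Compare the tapped word against the correct translation here.
    }
}
```

This way the label itself reports the tap, rather than the view controller walking a list of frames. (Using UIButton instead of UILabel would give touch handling for free, at the cost of restyling.)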
I hope this is clear. I have never really programmed a UI before, so I'm not sure of the best way to go about it.
Cheers,
Tom