detecting touches when 3D is being used

Discussion in 'iOS Programming' started by Chirone, Oct 20, 2009.

  1. macrumors 6502

    Joined:
    Mar 2, 2009
    Location:
    NZ
    #1
    using OpenGL ES, detecting touches in 2D space is pretty simple because the coordinates map easily onto the Quartz coordinates

    but when you want to know whether someone has touched an object in 3D space, how would you do it?
    the coordinates aren't the same as they would be in 2D space
     
  2. macrumors member

    Joined:
    Jun 24, 2009
    #2
    well, when the screen is touched your current viewpoint is technically showing a 2D picture, so from your viewpoint and where the user pressed on the screen you can work out what they may have touched (or at least where in the 3D world)

    the usual way to do the math is to turn the touch into normalized device coordinates and push it back through the inverse of your modelview*projection matrix
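    just a sketch, assuming you keep that matrix (and its inverse) around yourself, since GL ES doesn't hand it to you; none of these names come from an actual iOS API:

    Code:
    /* Map a touch to normalized device coords, then unproject
       with the inverse modelview*projection matrix
       (column-major, the way OpenGL stores it). */
    typedef struct { float x, y, z, w; } Vec4;

    /* column-major 4x4 matrix times a vector */
    static Vec4 mat4MulVec4(const float m[16], Vec4 v)
    {
        Vec4 r;
        r.x = m[0]*v.x + m[4]*v.y + m[8]*v.z  + m[12]*v.w;
        r.y = m[1]*v.x + m[5]*v.y + m[9]*v.z  + m[13]*v.w;
        r.z = m[2]*v.x + m[6]*v.y + m[10]*v.z + m[14]*v.w;
        r.w = m[3]*v.x + m[7]*v.y + m[11]*v.z + m[15]*v.w;
        return r;
    }

    /* touchX/touchY are in view coordinates (origin top-left,
       as UIKit reports them); pass ndcZ = -1 for a point on the
       near plane, +1 for one on the far plane. */
    Vec4 unprojectTouch(float touchX, float touchY,
                        float viewW, float viewH,
                        float ndcZ, const float invMVP[16])
    {
        Vec4 ndc = { 2.0f*touchX/viewW - 1.0f,
                     1.0f - 2.0f*touchY/viewH,  /* GL's y axis points up */
                     ndcZ, 1.0f };
        Vec4 world = mat4MulVec4(invMVP, ndc);
        world.x /= world.w;  /* perspective divide */
        world.y /= world.w;
        world.z /= world.w;
        world.w = 1.0f;
        return world;
    }

    unproject once at ndcZ = -1 and once at +1 and you get two points on the line under the finger; anything sitting on that line is a candidate for what was touched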
     
  3. thread starter macrumors 6502

    Joined:
    Mar 2, 2009
    Location:
    NZ
    #3
    due to a nice and welcome plot twist, someone changed my code to make this easier by moving the camera to a position where one unit in 3D equals one pixel on the phone
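    so the conversion ends up being basically just a y flip, since UIKit's origin is top-left and OpenGL's is bottom-left. a quick sketch (viewH and any origin offset are assumptions about my particular setup):

    Code:
    /* With the camera placed so one 3D unit equals one screen
       pixel, a touch maps to world space with just a y-axis
       flip (plus whatever offset your world origin has). */
    void touchToWorld(float touchX, float touchY, float viewH,
                      float *worldX, float *worldY)
    {
        *worldX = touchX;
        *worldY = viewH - touchY;
    }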
     
  4. macrumors member

    Joined:
    Mar 23, 2008
    #4
    You can treat this as a ray-casting problem: imagine shooting a ray from the 2D point in window space into the scene; you want the coordinates of the intersection with the first object it hits. Take a look here for some additional references:

    http://en.wikipedia.org/wiki/Ray_casting
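
    Here's a rough sketch of the intersection half, assuming you've already built the ray (e.g. by unprojecting the touch at the near and far planes and normalizing the difference) and that you test each object against a bounding sphere. The names are illustrative, not from any library:

    Code:
    #include <math.h>

    typedef struct { float x, y, z; } Vec3;

    static Vec3  vsub(Vec3 a, Vec3 b) { Vec3 r = { a.x-b.x, a.y-b.y, a.z-b.z }; return r; }
    static float vdot(Vec3 a, Vec3 b) { return a.x*b.x + a.y*b.y + a.z*b.z; }

    /* Ray: origin + t*dir, with dir normalized.  Returns 1 and
       writes the distance t if the ray hits the sphere. */
    int rayHitsSphere(Vec3 origin, Vec3 dir,
                      Vec3 center, float radius, float *tHit)
    {
        Vec3  oc   = vsub(origin, center);
        float b    = vdot(oc, dir);
        float c    = vdot(oc, oc) - radius*radius;
        float disc = b*b - c;                /* discriminant of |oc + t*dir|^2 = r^2 */
        if (disc < 0.0f) return 0;           /* ray misses the sphere */
        float t = -b - sqrtf(disc);          /* nearest root */
        if (t < 0.0f) t = -b + sqrtf(disc);  /* origin is inside the sphere */
        if (t < 0.0f) return 0;              /* sphere is behind the ray */
        *tHit = t;
        return 1;
    }

    Run that against every object's bounding sphere and keep the smallest tHit; that's the first object the ray hits.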

    Chris
     
  5. thread starter macrumors 6502

    Joined:
    Mar 2, 2009
    Location:
    NZ
    #5
    yeah, I thought of a concept similar to ray tracing, I just didn't know how to do that

    well... ray casting sounds slightly different so I'll give that a search too
     
