OpenGL using Vertex Buffer Array

Discussion in 'iOS Programming' started by harrys1590, Jul 14, 2013.

  1. harrys1590, Jul 14, 2013
    Last edited: Jul 14, 2013

    harrys1590 macrumors newbie

    Joined:
    Jul 14, 2013
    #1
    Hello,

    In my app I am using OpenGL ES to render a file that is downloaded from the internet and then parsed into a vertex array, so I have to load the vertex and normal data after launch. I am new to OpenGL ES, but I have been reading and learning. I have set up vertex and normal buffers that seem to be working fine, but I think I am filling them incorrectly, because when I load the view there is an object that vaguely resembles the shape I want, with triangles veering off in various directions and parts of the shape missing. Here is my code for filling the buffers:
    Code:
            for (int i = 0; i < triangle_cnt; i++) {
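                // Source layout: 18 floats per triangle, i.e. 6 floats per vertex;
                // only the first 3 floats of each vertex (the position) are read.
                // The face normal is computed below as the cross product of two
                // edge vectors and stored for all three vertices of the triangle.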
                int base = i * 18;
                GLfloat x1 = vertices[base];
                GLfloat y1 = vertices[base + 1];
                GLfloat z1 = vertices[base + 2];
                GLfloat x2 = vertices[base + 6];
                GLfloat y2 = vertices[base + 7];
                GLfloat z2 = vertices[base + 8];
                GLfloat x3 = vertices[base + 12];
                GLfloat y3 = vertices[base + 13];
                GLfloat z3 = vertices[base + 14];
               
                vector_t normal;
                vector_t U;
                vector_t V;
                GLfloat length;
               
                U.x = x2 - x1;
                U.y = y2 - y1;
                U.z = z2 - z1;
               
                V.x = x3 - x1;
                V.y = y3 - y1;
                V.z = z3 - z1;
               
                normal.x = U.y * V.z - U.z * V.y;
                normal.y = U.z * V.x - U.x * V.z;
                normal.z = U.x * V.y - U.y * V.x;
               
                length = normal.x * normal.x + normal.y * normal.y + normal.z * normal.z;
                length = sqrt(length);
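                // Note: 'length' is computed here but the normal written below is
                // never divided by it, so the stored normals are not unit length.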
               
                base = i * 9;
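                // Output layout: 9 floats per triangle in each buffer: three
                // positions in verticesBuff, and the same face normal repeated
                // for all three vertices in normalsBuff.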
                verticesBuff[base] = x1;
                verticesBuff[base + 1] = y1;
                verticesBuff[base + 2] = z1;
               
                normalsBuff[base] = normal.x;
                normalsBuff[base + 1] = normal.y;
                normalsBuff[base + 2] = normal.z;
               
                verticesBuff[base + 3] = x2;
                verticesBuff[base + 4] = y2;
                verticesBuff[base + 5] = z2;
               
                normalsBuff[base + 3] = normal.x;
                normalsBuff[base + 4] = normal.y;
                normalsBuff[base + 5] = normal.z;
               
                verticesBuff[base + 6] = x3;
                verticesBuff[base + 7] = y3;
                verticesBuff[base + 8] = z3;
               
                normalsBuff[base + 6] = normal.x;
                normalsBuff[base + 7] = normal.y;
                normalsBuff[base + 8] = normal.z;
    
                fprintf(stderr, "%ff, %ff, %ff,          %ff, %ff, %ff, \n", x1, y1, z1, normal.x, normal.y, normal.z);
                fprintf(stderr, "%ff, %ff, %ff,          %ff, %ff, %ff, \n", x2, y2, z2, normal.x, normal.y, normal.z);
                fprintf(stderr, "%ff, %ff, %ff,          %ff, %ff, %ff, \n", x3, y3, z3, normal.x, normal.y, normal.z);
            }
    
    If I take those logged values and paste them into the vertex array of a sample app built from the OpenGL ES template code that Apple supplies, the object renders beautifully, so I have deduced that I must just be filling the buffers incorrectly.
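
    The %ff format prints values like 0.500000f, i.e. valid C float literals, which is why the logged lines can be pasted straight into the interleaved position/normal array that the OpenGL ES template uses. Roughly like this (made-up numbers, assuming the template's gCubeVertexData layout):
    Code:
            // Template layout per vertex: positionX, positionY, positionZ, normalX, normalY, normalZ
            GLfloat gCubeVertexData[] = {
                // x,          y,          z,            nx,        ny,         nz    (illustrative values)
                0.500000f,  0.500000f, -0.500000f,       0.000000f, 0.000000f, -1.000000f,
                0.500000f, -0.500000f, -0.500000f,       0.000000f, 0.000000f, -1.000000f,
               -0.500000f,  0.500000f, -0.500000f,       0.000000f, 0.000000f, -1.000000f,
                // ... one line per vertex, three vertices per triangle ...
            };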

    So can someone help me understand what I am doing wrong when entering the vertices and normals? Any help is appreciated.

    Also, this is what my render looks like:
    [image: broken render]

    And this is, in shape at least, what it should look like:
    [image: intended shape]
     
  2. teagls macrumors regular

    Joined:
    May 16, 2013
    #2
    Glancing at your code, nothing seems to stand out.

    A few questions:

    - Post the code where you calculate the buffer sizes and allocate them. Did you calculate them correctly? (A sketch of the kind of thing I mean is after this list.)

    - Are you using ES 1.0 or ES 2.0 with shaders? If you are using shaders, show your rendering code.

    - What is the structure of the vertex data that you download? Maybe they are quads and not triangles.
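
    For example, for your two buffers I would expect something roughly like this (just a sketch, not your code; I'm assuming ES 2.0 with the GLKit attribute constants from Apple's template, so swap in your own attribute locations if you bind them yourself):
    Code:
            #import <GLKit/GLKit.h>   // provides the GLKVertexAttrib* constants
            #include <stdlib.h>

            // Each triangle contributes 3 vertices * 3 floats = 9 floats per buffer.
            size_t floatCount = (size_t)triangle_cnt * 9;

            GLfloat *verticesBuff = malloc(floatCount * sizeof(GLfloat));
            GLfloat *normalsBuff  = malloc(floatCount * sizeof(GLfloat));

            // ... fill the buffers with the loop from post #1 ...

            // Upload into two VBOs and point the attributes at them.
            GLuint vbo[2];
            glGenBuffers(2, vbo);

            glBindBuffer(GL_ARRAY_BUFFER, vbo[0]);
            glBufferData(GL_ARRAY_BUFFER, floatCount * sizeof(GLfloat), verticesBuff, GL_STATIC_DRAW);
            glEnableVertexAttribArray(GLKVertexAttribPosition);
            glVertexAttribPointer(GLKVertexAttribPosition, 3, GL_FLOAT, GL_FALSE, 0, 0);

            glBindBuffer(GL_ARRAY_BUFFER, vbo[1]);
            glBufferData(GL_ARRAY_BUFFER, floatCount * sizeof(GLfloat), normalsBuff, GL_STATIC_DRAW);
            glEnableVertexAttribArray(GLKVertexAttribNormal);
            glVertexAttribPointer(GLKVertexAttribNormal, 3, GL_FLOAT, GL_FALSE, 0, 0);

            glDrawArrays(GL_TRIANGLES, 0, triangle_cnt * 3);
    If any of those counts are off, the draw call reads past the end of a buffer, which can look like the stray triangles in your screenshot.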
     
  3. harrys1590 thread starter macrumors newbie

    Joined:
    Jul 14, 2013
    #3
    I figured out what was wrong: I didn't need to be doing any of that. I simply set my verticesBuff to the array I got from the parser, stopped filling in the normals data at all, and the object rendered just fine.
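
    Something along these lines, in other words (a sketch; parsedVertices and parsedFloatCount are placeholder names for whatever the parser actually hands back):
    Code:
            // The parser already delivers the vertex data in the layout the renderer
            // expects, so the repacking loop from post #1 is unnecessary; just keep
            // the parsed array and hand it to GL directly.
            GLfloat *verticesBuff = parsedVertices;       // placeholder for the parser's output array
            size_t   vertexFloats = parsedFloatCount;     // placeholder for its length in floats

            glBufferData(GL_ARRAY_BUFFER, vertexFloats * sizeof(GLfloat), verticesBuff, GL_STATIC_DRAW);
            // (normalsBuff is no longer filled in or uploaded at all)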
     
