Game Development Stack Exchange is a question and answer site for professional and independent game developers.
I'm having trouble mapping UV coordinates to vertices in OpenGL. I'm binding my buffers and enabling all of my attrib arrays perfectly well and things are rendering - but what I'm stuck on is how to map UV and Normal data.

You can't just store UV and Normal data with the Vertex data, because obviously you'll be re-using vertices, but not necessarily with the same Normal or UV.

For clarity this is the general flow of what I'm doing at the moment:

  1. I bind my shader program.
  2. I bind my vertex array buffer with GL.BindBuffer
  3. I Bind my element buffer with GL.BindBuffer
  4. I use GL.Uniform (1, 2, 3 or 4) to bind values to my uniform variables
  5. I enable all of my vertex buffer attributes with GL.EnableVertexAttribArray and then describe their layout using GL.VertexAttribPointer
  6. I draw using GL.DrawElements
  7. I disable all of my vertex attrib arrays again

I'm totally stuck on how to encode UV and Normal data in my vertex attributes. Do I need to use a second buffer? If so, how does OpenGL know to use a different set of indices to find them? I'm totally lost and any help would be greatly appreciated.


marked as duplicate by Josh Petrie Apr 6 at 20:35


OpenGL does not allow you to index positions, UVs, and normals separately: there can be only one index buffer, and each index selects a complete vertex (position, UV, and normal together). Your best bet is to accept redundant position data, duplicating a position once per distinct UV/normal combination it appears with. Some reuse through indexing is still possible (and can yield better quality) where faces share an averaged normal and the same UV.
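As an illustration of this answer (a sketch, not code from either poster), the usual fix is an OBJ-loader-style pass: walk the face corners, treat each distinct (position index, UV index, normal index) triple as one vertex, and emit a single interleaved vertex buffer plus a single index buffer. The function and data layout below are assumptions for the example:

```python
def build_single_index(corners, positions, uvs, normals):
    """Collapse separately indexed attributes into one index buffer.

    corners: list of (pos_idx, uv_idx, normal_idx) tuples, one per face corner.
    Returns (vertices, indices) where vertices is a flat interleaved list
    [px, py, pz, u, v, nx, ny, nz, ...] suitable for one GL array buffer,
    and indices is the matching element buffer.
    """
    vertices = []
    indices = []
    seen = {}  # (pos_idx, uv_idx, normal_idx) -> final vertex index
    for corner in corners:
        if corner not in seen:
            pi, ui, ni = corner
            seen[corner] = len(seen)          # next unused vertex slot
            vertices.extend(positions[pi])    # 3 floats
            vertices.extend(uvs[ui])          # 2 floats
            vertices.extend(normals[ni])      # 3 floats
        indices.append(seen[corner])
    return vertices, indices


# Example: two triangles forming a quad; the shared edge reuses vertices
# because the corners have identical UV and normal indices there.
positions = [(0, 0, 0), (1, 0, 0), (0, 1, 0), (1, 1, 0)]
uvs = [(0, 0), (1, 0), (0, 1), (1, 1)]
normals = [(0, 0, 1)]
corners = [(0, 0, 0), (1, 1, 0), (2, 2, 0),
           (1, 1, 0), (3, 3, 0), (2, 2, 0)]
verts, idx = build_single_index(corners, positions, uvs, normals)
# 4 unique vertices x 8 floats each; the two shared corners are reused.
```

With this layout, one GL.VertexAttribPointer call per attribute (stride of 8 floats; offsets 0, 3, and 5 floats) describes the interleaved buffer, and GL.DrawElements uses the single index list.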

