I'm currently trying to change my project to use GL_TRIANGLES_ADJACENCY instead of GL_TRIANGLES.
Following this question, I have managed to construct my index buffer correctly, but when it comes to the drawing stage, I'm getting unexpected results.
Here is my geometry shader code. Bear in mind that I store my indices like so:
(vertex1/adjacent1/vertex2/adjacent2/vertex3/adjacent3)
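For reference, a minimal CPU-side sketch of how such a buffer can be built from a plain GL_TRIANGLES index list might look like this (buildAdjacencyIndices is a hypothetical name, not from the linked question; it assumes consistently wound triangles, and falls back to the edge's own start vertex when an edge has no neighbour, matching the v1/v1/v2/v2/v3/v3 fallback mentioned in the edit below):

```cpp
#include <cstddef>
#include <cstdint>
#include <map>
#include <utility>
#include <vector>

// Build a GL_TRIANGLES_ADJACENCY index buffer (v0/a0/v1/a1/v2/a2 per
// triangle) from a plain GL_TRIANGLES index list.
std::vector<uint32_t> buildAdjacencyIndices(const std::vector<uint32_t>& tris)
{
    // Map each directed edge (a, b) to the vertex opposite it in its own
    // triangle. With consistent winding, the neighbour that shares the
    // edge stores it reversed, as (b, a).
    std::map<std::pair<uint32_t, uint32_t>, uint32_t> opposite;
    for (std::size_t t = 0; t + 2 < tris.size(); t += 3)
        for (int i = 0; i < 3; ++i)
            opposite[{tris[t + i], tris[t + (i + 1) % 3]}] = tris[t + (i + 2) % 3];

    std::vector<uint32_t> adj;
    for (std::size_t t = 0; t + 2 < tris.size(); t += 3)
    {
        for (int i = 0; i < 3; ++i)
        {
            uint32_t a = tris[t + i];
            uint32_t b = tris[t + (i + 1) % 3];
            // Look up the neighbour's opposite vertex for this edge;
            // fall back to the edge's own start vertex on a boundary.
            auto it = opposite.find({b, a});
            adj.push_back(a);
            adj.push_back(it != opposite.end() ? it->second : a);
        }
    }
    return adj;
}
```

The resulting buffer is drawn with glDrawElements(GL_TRIANGLES_ADJACENCY, ...) and is exactly twice the size of the GL_TRIANGLES one.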
Geometry Shader
#version 330
precision highp float;

layout (triangles_adjacency) in;
layout (triangle_strip, max_vertices = 3) out;

smooth in vec2 vVaryingTexCoords[];
smooth in vec3 vVaryingNormals[];

smooth out vec2 gsUV;
smooth out vec3 gsNormals;

void main(void)
{
    for (int i = 0; i < gl_in.length(); i++)
    {
        switch (i)
        {
            // With triangles_adjacency input, even indices 0, 2 and 4 are
            // the triangle's own vertices; odd indices are the adjacent ones.
            case 0:
            case 2:
            case 4:
                gl_Position = gl_in[i].gl_Position;
                gsUV = vVaryingTexCoords[i];
                gsNormals = vVaryingNormals[i];
                EmitVertex();
                break;
            default:
                break;
        }
    }
    EndPrimitive();
}
Any ideas?
EDIT
I'd just like to point out that setting each adjacent index to the same value as its vertex index, i.e. v1/v1/v2/v2/v3/v3, still produces the same results.