Game Development Stack Exchange is a question and answer site for professional and independent game developers.

My camera looks like this:

[Image: camera settings]

I am trying to add 2D polygons to the screen based on the percentage of a 1920x1080 pixel canvas that they occupy. I calculate this percentage and then use Camera.ViewportToWorldPoint() to determine the world points for the mesh vertices. Below you can see how I declare the vertices and UVs for the polygon (I am confident my triangles are correct).

foreach (var point in poly.path)
{
    uv.Add(new Vector2(
        (float)(point.X - envelope.left) / (envelope.right - envelope.left),
        (float)(point.Y - envelope.bottom) / (envelope.top - envelope.bottom)));
}

I assume this is the correct way to calculate the UVs, since they are essentially the percentage by which each vertex lies within the polygon's bounding envelope.

Here is the part I am not so sure about. When I set the vertices like so (assume temp is the list of vertices used):

var temp = new List<Vector3>();
var envelope = GetEnvelope();
foreach (var point in path)
{
    var vect = new Vector3(
        (float)(point.X - envelope.left) / (envelope.right - envelope.left),
        (float)(point.Y - envelope.bottom) / (envelope.top - envelope.bottom),
        1.0f);

    temp.Add(camera.ViewportToWorldPoint(vect));
}
return temp.ToArray();

everything works and the polygon is displayed on my screen, except that it scales as though the viewport were the entire envelope. I expected this, but it is not the behavior I want. So instead I declare the vertices like so, in order to place the polygon at its correct relative position on the screen:

var temp = new List<Vector3>();
foreach (var point in path)
{
    var vect = new Vector3(
        (float)point.X / 1920,
        (float)point.Y / 1080,
        1.0f);

    temp.Add(camera.ViewportToWorldPoint(vect));
}
return temp.ToArray();

However, when I set the vertices this way, I no longer see the meshes being rendered. My goal with this change is to have the polygons appear at their true size on the screen, as they are on the 1920x1080 canvas, rather than stretched as they are in the code I said was working. Am I misunderstanding how vertices and UVs relate?
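For reference, the conversion I am attempting can be sketched as follows (the class and parameter names here are hypothetical, not from my project). The key point is that the z component passed to Camera.ViewportToWorldPoint is the distance from the camera along its forward axis, in world units:

```csharp
// Hypothetical sketch: convert 1920x1080 pixel coordinates to world space.
// Assumes the canvas size; the z passed to ViewportToWorldPoint is the
// world-unit distance in front of the camera, not a depth flag.
using System.Collections.Generic;
using UnityEngine;

public static class PixelToWorld
{
    const float CanvasWidth = 1920f;
    const float CanvasHeight = 1080f;

    public static Vector3[] ToWorldPoints(
        Camera camera, IEnumerable<Vector2> pixelPoints, float distanceFromCamera)
    {
        var result = new List<Vector3>();
        foreach (var p in pixelPoints)
        {
            // Normalize pixel coordinates into viewport space ([0,1] x [0,1]),
            // then project onto a plane distanceFromCamera units from the camera.
            var viewport = new Vector3(
                p.x / CanvasWidth, p.y / CanvasHeight, distanceFromCamera);
            result.Add(camera.ViewportToWorldPoint(viewport));
        }
        return result.ToArray();
    }
}
```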

Thanks for any help!


closed as off-topic by Josh Petrie Jan 7 at 17:39

This question appears to be off-topic. The users who voted to close gave this specific reason:

  • "Questions about debugging a problem in your project must present a concise selection of code and context so as to allow a reader to diagnose the issue without needing to read all of your code or to engage in extensive back-and-forth dialog. For more information, see this meta thread." – Josh Petrie

1 Answer

Accepted answer

My meshes were being generated correctly, but they were not rendering to the camera. I had to set the z component of the mesh's scale to a negative value (less than 0f) for it to render. I am closing this question and opening another to figure out exactly why this is.

This is because your indices are backward, and your geometry is facing away from the camera. Backface culling is preventing it from being drawn. Make sure your vertices are wound in a counter-clockwise order. – sharvey Jan 7 at 22:00
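To illustrate sharvey's point, here is a hypothetical sketch (not from the original post) that fixes the winding directly rather than mirroring the mesh with a negative z scale. Reversing the index order of each triangle flips which side the renderer treats as the front face:

```csharp
// Hypothetical sketch: flip triangle winding so the mesh faces the camera,
// instead of relying on a negative z scale to mirror it.
using UnityEngine;

public static class MeshWinding
{
    // Reverses the winding order of every triangle in the mesh in place.
    public static void FlipWinding(Mesh mesh)
    {
        int[] tris = mesh.triangles;
        for (int i = 0; i < tris.Length; i += 3)
        {
            // Swapping two indices of a triangle reverses its winding.
            int tmp = tris[i + 1];
            tris[i + 1] = tris[i + 2];
            tris[i + 2] = tmp;
        }
        mesh.triangles = tris;
        mesh.RecalculateNormals(); // normals follow the new winding
    }
}
```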
