Game Development Stack Exchange is a question and answer site for professional and independent game developers. It's 100% free, no registration required.

I'm working on a shader in Unity that uses a binary tree to store some precomputed values. This binary tree should be available to the shaders, and ideally it would also be constructed on the graphics card (each node in the tree corresponds to one pixel). Is there a good way of doing this?

The best way of passing such data to the GPU via shaders would be with a texture. Knowing that, your own question answers it: a texture with one pixel for each node of the tree. –  glampert Mar 10 at 18:10
Good idea, I would accept this if it was an answer. –  Thijser Mar 10 at 20:17
Well, I don't really have enough knowledge about Unity shaders to develop this into an answer. Perhaps now that you have a pointer to start from, you can edit this into a more specific question and someone with good knowledge of Unity can help you further. –  glampert Mar 10 at 20:44
@glampert I think he was suggesting that you post your comment as an answer so he could mark it as the answer he was looking for :) –  Alexandre Vaillancourt Mar 10 at 21:00
@AlexandreVaillancourt Okay. –  glampert Mar 10 at 21:09

1 Answer


The best way of passing such data to the GPU through the shader pipeline would probably be with a texture. Knowing that, your own question answers it: a texture with one pixel for each node of the tree.
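To illustrate the texture idea, one common layout (an assumption here, not something specified in the answer) is the implicit heap ordering used by binary heaps: node i's children live at indices 2i+1 and 2i+2, so the tree needs no stored pointers and each node maps directly to a texel in a row-major grid. A minimal sketch of that packing, in Python for clarity:

```python
# Sketch: packing a binary tree into a texture-like flat pixel grid.
# Assumptions (hypothetical, not from the original post): nodes are kept
# in implicit heap order, so node i's children are at 2*i + 1 and
# 2*i + 2, and the "texture" is a width x height grid addressed row-major.

def node_to_texel(i, width):
    """Map a heap index to (x, y) texel coordinates, row-major."""
    return (i % width, i // width)

def children(i):
    """Heap indices of the left and right children of node i."""
    return 2 * i + 1, 2 * i + 2

def pack_tree(values, width, height, fill=0.0):
    """Lay out heap-ordered node values into a row-major pixel grid."""
    if len(values) > width * height:
        raise ValueError("tree does not fit in the texture")
    texels = [fill] * (width * height)
    for i, value in enumerate(values):
        x, y = node_to_texel(i, width)
        texels[y * width + x] = value
    return texels

# Example: a complete 7-node tree (3 levels) packed into a 4x2 texture.
tree = [10.0, 5.0, 15.0, 2.0, 7.0, 12.0, 20.0]
pixels = pack_tree(tree, width=4, height=2)
```

On the shader side the same arithmetic would run in reverse: compute the child index, convert it to texel coordinates, and sample the texture at the texel center (e.g. at ((x + 0.5) / width, (y + 0.5) / height)) with filtering disabled, so each fetch returns exactly one node's value.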


