I am trying to use the tablet's touch screen to control my camera movements and generate the view matrix accordingly.
I read the x and y coordinates of the touch event and, after running some sanity checks on them, use them directly. The camera does move with the touch, but the object always sits at an offset from the touch/pointer position, and I cannot work out why. (I am kind of new at this.)
Here is the camera logic, where pX and pY are the touch coordinates on the device screen:
pitch = pY * 0.05f;
yaw   = pX * 0.05f;
LOGD(">>pitch %f", pitch);

// Clamp pitch so the camera cannot flip over the poles.
pitch = min(pitch, 90.0f);
pitch = max(pitch, -90.0f);
LOGD(">>pitch(clamped) %f", pitch);

// Wrap yaw into the [0, 360) range.
if (yaw < 0.0f) {
    yaw += 360.0f;
}
if (yaw > 360.0f) {
    yaw -= 360.0f;
}
I am not sure whether I am going about this the right way. Any suggestions would be appreciated.