I have an application that is currently a PC/Windows standalone build, but I need it to work with the touchscreen of a Surface Pro tablet when the same build is run there. The user interaction consists of clicking buttons to open and close windows, etc. (all buttons use the OnClick() functionality), so any input required would just be simple button taps.

I simply need this to work on a Surface Pro touchscreen when the build runs there, with mouse clicks replaced by taps on the buttons.
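
For context, the buttons are wired up along these lines (a minimal sketch; the class, field, and method names here are illustrative, not the actual project code):

    using UnityEngine;
    using UnityEngine.UI;

    // Minimal sketch of the existing setup: a uGUI Button that opens/closes a window panel.
    // The field and method names are illustrative only.
    public class WindowToggle : MonoBehaviour
    {
        [SerializeField] private Button toggleButton;    // assigned in the Inspector
        [SerializeField] private GameObject windowPanel;

        private void Awake()
        {
            // Equivalent to adding the method to the Button's OnClick() list in the Inspector.
            toggleButton.onClick.AddListener(ToggleWindow);
        }

        private void ToggleWindow()
        {
            windowPanel.SetActive(!windowPanel.activeSelf);
        }
    }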

I've read many posts on the subject, but most date from Unity 4.6 and I've found nothing conclusive for Unity 5. Many posts state that this feature was introduced in Unity 5, but others suggest it is not supported and that third-party plugins are needed.

I don't currently have access to a Windows touchscreen device, so I can't do a quick check. I therefore want to make sure it is possible before I purchase a device and commit to the update.

From what I gather, the 'Standalone Input Module' has now replaced the 'Touch Input Module', which apparently had a checkbox you could tick if you wanted it to work in standalone builds. As no such checkbox exists in the former, I am unsure whether this feature is still supported.
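
For what it's worth, my understanding (an assumption I'd want confirmed) is that the single Standalone Input Module on the default EventSystem now feeds both mouse clicks and touch taps to uGUI, with no checkbox required. A minimal runtime check along those lines:

    using UnityEngine;
    using UnityEngine.EventSystems;

    // Sketch: make sure an EventSystem with a StandaloneInputModule exists.
    // The assumption here is that this one module handles both mouse and touch
    // input for uGUI in Unity 5; there is no separate "work in standalone" toggle.
    public class EnsureEventSystem : MonoBehaviour
    {
        private void Awake()
        {
            if (EventSystem.current == null)
            {
                var go = new GameObject("EventSystem");
                go.AddComponent<EventSystem>();
                go.AddComponent<StandaloneInputModule>();
            }
        }
    }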

So I would like to know -

1 - Is touchscreen input supported in Unity 5 standalone Windows builds?

If so -

2 - Does it 'just work' automatically, or does extra work need to be done to tell it to use the touchscreen when it is running on a Windows touchscreen device?

Many thanks in advance

The last time I tried this (August), I found multitouch data was correctly detected and available through Input.Touches/GetTouch etc., but I couldn't use multitouch on the built-in UI controls/events (they seemed to be locked into mouse-emulation mode, so trying to initiate two drag actions would lead to a dragged object following the midpoint of the two touches). I was able to hack around this by writing my own drag-handling scripts, but there may be a more elegant solution in configuring the UI input/raycaster/event systems. – DMGregory Sep 29 '16 at 18:16
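
(To illustrate the workaround mentioned in that comment: it amounts to reading the raw multitouch data yourself instead of relying on the UI's mouse emulation. A rough sketch, not the commenter's actual script:)

    using UnityEngine;

    // Rough illustration of the workaround described above: read the raw multitouch
    // data via Input.GetTouch instead of relying on the UI event system's mouse
    // emulation. This is not the commenter's actual script, just a sketch.
    public class RawTouchLogger : MonoBehaviour
    {
        private void Update()
        {
            for (int i = 0; i < Input.touchCount; i++)
            {
                Touch touch = Input.GetTouch(i);
                if (touch.phase == TouchPhase.Moved)
                {
                    // Each finger keeps its own fingerId, so two simultaneous drags
                    // stay independent instead of collapsing to a midpoint.
                    Debug.Log("Finger " + touch.fingerId + " moved by " + touch.deltaPosition);
                }
            }
        }
    }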
    
@DMGregory Hi, thanks for your prompt response. That's good to know. Currently I don't need multi-touch or any advanced touch control; it is just tapping buttons to open and close images etc. As the Surface Pro is essentially a desktop, I assume it would detect the buttons in a Unity application as clickable automatically? I have a Wacom Bamboo tablet with 'Touch', and running the application on my PC with the Wacom as the mouse works fine, although that is essentially the same as using a laptop trackpad. I assume touch on the Surface Pro detects anything clickable? – NIMBLE JIM Sep 29 '16 at 19:25
    
Yes, single-touch works out of the box — the touch works like a mouse press. – DMGregory Sep 29 '16 at 19:31
    
Excellent, thanks for the clarification. – NIMBLE JIM Sep 29 '16 at 19:51

Yes, it works as you'd expect. Just install Unity on your tablet and run a demo project.
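
If you want something slightly more targeted than a stock demo, a quick sanity-check script like this (a sketch; wire it to any button's OnClick() in a test scene) confirms that a tap both fires the click handler and shows up as raw touch data:

    using UnityEngine;
    using UnityEngine.UI;

    // Quick sanity check for a test scene: hook OnButtonTapped up to a uGUI Button's
    // OnClick() list. On a touch device, a tap should fire the handler, and
    // Input.touchCount shows whether the tap is also visible as raw touch input.
    public class TapCheck : MonoBehaviour
    {
        public Text label;  // optional on-screen readout, assigned in the Inspector

        public void OnButtonTapped()
        {
            Debug.Log("Button activated. Active touches this frame: " + Input.touchCount);
            if (label != null)
                label.text = "Tapped! touches = " + Input.touchCount;
        }
    }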

