I have acquired a Wacom Cintiq 22HD Touch which I will be using creatively with my Mac, but also as a multi-touch display for Ventuz development. Having connected it up to our VBOX, it is recognised by the system as a touch device without the need to install any Windows drivers (so I've opted not to complicate things by installing anything, at least for the time being). Ventuz feedback shows me raw touch data, so I figure all seems good... so far!
First question: when I create an Eyefinity Display Group including the Cintiq and two other 1920x1080 monitors, I get the 2+1 arrangement I want for the current Ventuz project (two displays for the main output, a single touch display for interaction). However, touches on the Cintiq are mapped across the entire canvas area of the display group... presumably this is the expected (and correct) behaviour? Is there a way to map touches on the Cintiq display to its physical position in the display group, so that the co-ordinates Ventuz receives correspond to that physical display? (I'm presuming this is perhaps an Eyefinity feature that hasn't been written yet!)
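To illustrate what I mean, the remap I'd hope for is really just a scale and offset per display. Here's a rough C# sketch of the maths as I understand it (the resolutions and the Cintiq's position in the group are made-up values for an example 3x1920 layout, not anything read from the Ventuz or Eyefinity APIs):

```csharp
using System;

class TouchRemapExample
{
    // Example layout (assumed): three 1920x1080 displays side by side,
    // with the Cintiq as the right-most display in the Eyefinity group.
    const float CanvasWidth   = 5760f;
    const float CanvasHeight  = 1080f;
    const float CintiqOffsetX = 3840f;  // left edge of the Cintiq within the canvas
    const float CintiqOffsetY = 0f;
    const float CintiqWidth   = 1920f;
    const float CintiqHeight  = 1080f;

    // Touches on the Cintiq currently arrive normalised 0..1 as if they covered
    // the whole canvas; physically that 0..1 is just the Cintiq's own surface,
    // so the remap is a scale and offset into its slice of the canvas.
    static void PanelToCanvas(float rawU, float rawV,
                              out float canvasU, out float canvasV)
    {
        canvasU = (CintiqOffsetX + rawU * CintiqWidth) / CanvasWidth;
        canvasV = (CintiqOffsetY + rawV * CintiqHeight) / CanvasHeight;
    }

    static void Main()
    {
        // A touch in the middle of the Cintiq panel should land ~0.833 across the canvas
        PanelToCanvas(0.5f, 0.5f, out float u, out float v);
        Console.WriteLine($"{u:F3}, {v:F3}");  // 0.833, 0.500
    }
}
```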
Second question: I am getting around this problem by creating a sort of 'touch overlay', whereby hidden geometry for the required buttons etc. is placed across the entire canvas. If I render this into a render target, scale it using mapping and use it to texture a rectangle positioned where the touch display is, it all seems to work OK... however, I'm wondering whether I'm going about this the right way?
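In case it clarifies the workaround, the mapping values I give that rectangle are just derived from the display's rectangle within the canvas. A quick sketch of the arithmetic (again using my example 3x1920 layout rather than values taken from Ventuz):

```csharp
using System;

class MappingValuesExample
{
    static void Main()
    {
        // Assumed layout: three 1920x1080 displays side by side, Cintiq right-most.
        float canvasW = 5760f, canvasH = 1080f;
        float cintiqX = 3840f, cintiqY = 0f, cintiqW = 1920f, cintiqH = 1080f;

        // Scale/offset applied to the full-canvas render target so the rectangle
        // placed over the Cintiq only shows that display's slice of the texture.
        float scaleU  = cintiqW / canvasW;   // ~0.333
        float scaleV  = cintiqH / canvasH;   // 1.0
        float offsetU = cintiqX / canvasW;   // ~0.667
        float offsetV = cintiqY / canvasH;   // 0.0

        Console.WriteLine($"scale {scaleU:F3} x {scaleV:F3}, offset {offsetU:F3} x {offsetV:F3}");
    }
}
```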
Third (and final) question: what happens if I want to add a second touch display to the group? Would this perhaps require a workaround using TUIO (rather than Windows Touch), with each display sending on a different port?
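What I imagine doing there is running one TUIO stream per touch display, each on its own UDP port, and offsetting the incoming normalised coordinates into the canvas myself. Something like this rough sketch (ports, sizes and positions are purely illustrative, and it doesn't use any actual Ventuz or TUIO library API):

```csharp
using System;
using System.Collections.Generic;

class MultiTouchDisplayExample
{
    // Which rectangle of the overall canvas each touch display covers,
    // keyed by the UDP port its TUIO stream would arrive on.
    // Ports, sizes and positions are made-up example values.
    struct Rect { public float X, Y, W, H; }

    static readonly Dictionary<int, Rect> DisplayByPort = new Dictionary<int, Rect>
    {
        { 3333, new Rect { X = 3840f, Y = 0f, W = 1920f, H = 1080f } }, // Cintiq
        { 3334, new Rect { X = 1920f, Y = 0f, W = 1920f, H = 1080f } }, // second touch display
    };

    const float CanvasW = 5760f, CanvasH = 1080f;

    // TUIO cursors arrive normalised 0..1 to their own surface; this maps a
    // cursor from a given port into normalised coordinates on the whole canvas.
    static void ToCanvas(int port, float x, float y, out float canvasU, out float canvasV)
    {
        Rect r = DisplayByPort[port];
        canvasU = (r.X + x * r.W) / CanvasW;
        canvasV = (r.Y + y * r.H) / CanvasH;
    }

    static void Main()
    {
        // Centre of the second touch display ends up at the centre of the canvas
        ToCanvas(3334, 0.5f, 0.5f, out float u, out float v);
        Console.WriteLine($"{u:F3}, {v:F3}");  // 0.500, 0.500
    }
}
```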
As is probably clear, this is my first chance to experiment with touch interactivity, and I know that moving forward we will want to do more exciting things with it, so I'm hoping someone can give me some pointers to ensure I'm following 'best practice' rather than just bodging a solution.
