Change VRTK_UIPointer click handlers to Down instead of Release
I’ve run into an interesting issue and a possible inconsistency: InteractableObject events fire on down/click, while GUI events fire on up/release.
Background: In my app, an Always_On pointer can be aimed at a GameObject. On trigger press, that GameObject displays a popup GUI over it, and the pointer can then be used to interact with the GUI buttons. What actually happens is that on trigger press the GameObject (correctly) shows the GUI, but then on trigger release, whichever GUI button happens to be under the popup menu is inadvertently fired. Within a sub-second press/release of the trigger, two actions occur when ideally only one should.
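(Editor's note: until the click timing itself changes, one possible interim mitigation is to have the popup ignore UI clicks for a short window after it opens, so the release of the same trigger press that opened it cannot activate whichever button sits under the pointer. This is only a sketch under assumptions; the class and field names are hypothetical and not VRTK API.)

```csharp
using UnityEngine;
using UnityEngine.Events;

// Hypothetical guard: attach to the popup and route button actions through
// GuardedClick so the release of the trigger press that opened the popup
// cannot immediately activate a button underneath the pointer.
public class PopupClickGuard : MonoBehaviour
{
    // Clicks arriving within this many seconds of the popup opening are ignored.
    public float ignoreWindow = 0.3f;

    private float openedAt;

    private void OnEnable()
    {
        // The popup is shown on trigger press, so remember when that happened.
        openedAt = Time.unscaledTime;
    }

    // Wrap each button's action, e.g.:
    //   button.onClick.AddListener(() => guard.GuardedClick(ConfirmSelection));
    public void GuardedClick(UnityAction action)
    {
        if (Time.unscaledTime - openedAt > ignoreWindow)
        {
            action();
        }
    }
}
```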
More generally, how should users interact with elements? In the desktop/mouse environment, mouseUp is typically when actions execute, and users are familiar with that convention. But in VR… does it seem more appropriate to perform actions on trigger click, grip, etc? That would more closely mirror real life. It would also make GUI interaction consistent with InteractableObject interaction. (Consider this: If I had modeled buttons as meshes / InteractableObjects, they would have fired on press. But because I’m using GUI buttons, they fire on release. Devs know the difference… but users don’t, and when they’re standing in a VR environment with solid-looking objects at waist height, the distinction feels arbitrary.)
Anyway, just wondered if this issue is worth considering?
As an aside, I’m making the transition over to VRTK from NewtonVR, where I never had this issue (but ran into several others, heh). In NVRCanvasInput, I see ExecuteEvents.ExecuteHierarchy (which I don’t pretend to understand), with the following explanation:
// we want to do click on button down at same time, unlike regular mouse processing
// which does click when mouse goes up over same object it went down on
// reason to do this is head tracking might be jittery and this makes it easier to click buttons
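(Editor's note: for context, here is a minimal sketch of the idea those comments describe, i.e. a Unity input module dispatching the click handler on button down instead of waiting for button up. This is not NewtonVR's or VRTK's actual code; apart from the UnityEngine.EventSystems calls, the class and field names are assumptions.)

```csharp
using UnityEngine;
using UnityEngine.EventSystems;

// Hypothetical input module fragment: fires the click handler as soon as the
// controller button goes down, instead of waiting for the up event over the
// same object (the regular mouse behaviour).
public class ClickOnDownInputModule : BaseInputModule
{
    // Assumed to hold the current raycast result for the UI pointer.
    public PointerEventData pointerData;

    // Assumed to be set by the surrounding toolkit when the trigger is pressed.
    public bool triggerPressedThisFrame;

    public override void Process()
    {
        if (pointerData == null || !triggerPressedThisFrame)
        {
            return;
        }

        GameObject target = pointerData.pointerCurrentRaycast.gameObject;

        // Walk up the hierarchy until something handles pointer down...
        GameObject pressed = ExecuteEvents.ExecuteHierarchy(target, pointerData, ExecuteEvents.pointerDownHandler);

        // ...and execute the click immediately on the same frame, so jittery
        // head/controller tracking between press and release cannot move the
        // pointer onto a different element before the click lands.
        ExecuteEvents.ExecuteHierarchy(pressed != null ? pressed : target, pointerData, ExecuteEvents.pointerClickHandler);
    }
}
```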
As always, thanks for the great toolkit!
Top GitHub Comments
Thanks for responding, ridoutb. No apology necessary. 😃 I really had to think this through myself – having been habituated to the desktop world for so long. Kipman (HoloLens lead) said this recently (paraphrasing):
Kinda crazy, huh? I suspect we’ll find that a lot of the old 2D desktop conventions don’t translate well… even more so as finger tracking becomes standard. In VR/AR, we’ll expect to push a button and have it function on press. Like your keyboard. Like an elevator button. Like squishing a bug. 😃 To have actions happen on release would be maddening! It doesn’t seem to matter whether the element’s appearance is flat or dimensional… (?) In the real world, actions occur on press. (I say we blame physics. 😄)
@getwilde My apologies. I have to admit that I’ve changed my mind and I agree with you. I may not have given your post enough consideration. After going back to my own game, I think you are right. It even surfaced some memories where I thought “damn - that’s annoying” when using the UI pointer on my own menus. Firing on press does make sense. Good analogy. I wonder if “pulling the trigger” vs pressing a button brings different expectations in our mind.
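(Editor's note: if this behaviour were made configurable rather than simply switched, a rough sketch of the shape such an option could take is below. The enum, field, and method names are hypothetical illustrations, not confirmed VRTK API.)

```csharp
using UnityEngine;
using UnityEngine.EventSystems;

// Hypothetical pointer fragment illustrating a per-pointer click mode.
public class UIPointerClickMode : MonoBehaviour
{
    public enum ClickMethod
    {
        ClickOnButtonUp,   // desktop-style: act when the button is released
        ClickOnButtonDown  // VR-style: act the moment the button is pressed
    }

    public ClickMethod clickMethod = ClickMethod.ClickOnButtonDown;

    // Assumed to be called by the input module when the selection button goes down.
    public void OnSelectionButtonDown(PointerEventData pointerData, GameObject target)
    {
        if (clickMethod == ClickMethod.ClickOnButtonDown)
        {
            ExecuteEvents.ExecuteHierarchy(target, pointerData, ExecuteEvents.pointerClickHandler);
        }
    }

    // Assumed to be called by the input module when the selection button is released.
    public void OnSelectionButtonUp(PointerEventData pointerData, GameObject target)
    {
        if (clickMethod == ClickMethod.ClickOnButtonUp)
        {
            ExecuteEvents.ExecuteHierarchy(target, pointerData, ExecuteEvents.pointerClickHandler);
        }
    }
}
```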