Access locations of gesture touches
Thanks for the great library!
I’m looking to create a system in which one gesture handler (rotation/pinch) might apply its effects to one of a few different animated views nested within it. In order to figure out which layer to affect, I’ll need to access the locations of the individual touches that comprise the gesture. Unfortunately, the focalX/Y and anchorX/Y don’t provide enough info for my purposes.
PanResponder from React Native exposes this feature natively, but I’d much prefer to use this declarative library.
It might look something like this:
{
  nativeEvent: {
    // ...
    touches: [{
      x: 123,
      y: 456
    }]
  }
}
But ultimately, the format isn’t important — as long as it’s possible to access the location (and maybe other properties) of each touch on both the state change and handler events.
Issue Analytics
- State:
- Created 5 years ago
- Reactions: 4
- Comments: 5 (2 by maintainers)
+1 for this feature, but for a different use case. I'm having trouble combining Pinch and two-finger Pan handlers (I've read the docs and tried the demo, but I'm still having issues). With access to this raw touch data, it should be possible to use a single PanGestureHandler to account for both pan and pinch.
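To sketch what the comment above is after: with two raw touch points per frame, both pan and pinch can be derived in one place. This is a minimal illustration assuming a hypothetical `{x, y}` touch shape; the function name and structure are made up, not part of the library:

```javascript
// Hedged sketch: derive pan and pinch from two raw touch points.
// `prevTouches` and `currTouches` are arrays of two {x, y} points
// (an assumed shape, mirroring the format proposed in the issue).
function panAndPinch(prevTouches, currTouches) {
  const centroid = (ts) => ({
    x: (ts[0].x + ts[1].x) / 2,
    y: (ts[0].y + ts[1].y) / 2,
  });
  const distance = (ts) => Math.hypot(ts[0].x - ts[1].x, ts[0].y - ts[1].y);

  const prevC = centroid(prevTouches);
  const currC = centroid(currTouches);
  return {
    // Pan: movement of the midpoint between the two fingers.
    translateX: currC.x - prevC.x,
    translateY: currC.y - prevC.y,
    // Pinch: ratio of current to previous finger spacing.
    scale: distance(currTouches) / distance(prevTouches),
  };
}

// Example: fingers drift right while spreading from 100px to 150px apart.
panAndPinch(
  [{ x: 0, y: 0 }, { x: 100, y: 0 }],
  [{ x: 10, y: 0 }, { x: 160, y: 0 }]
); // → { translateX: 35, translateY: 0, scale: 1.5 }
```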
Hi @cwhenderson20, thank you for this suggestion; it could be a nice feature to implement. I think we'll take a look at it, but I can't predict when that might happen, as we have other important targets right now.