Stuck on an issue?

Lightrun Answers was designed to reduce the constant googling that comes with debugging 3rd party libraries. It collects links to all the places you might be looking at while hunting down a tough bug.

And, if you’re still stuck at the end, we’re happy to hop on a call to see how we can help out.

Access locations of gesture touches

See original GitHub issue

Thanks for the great library!

I’m looking to create a system in which one gesture handler (rotation/pinch) might apply its effects to one of a few different animated views nested within it. In order to figure out which layer to affect, I’ll need to access the locations of the individual touches that comprise the gesture. Unfortunately, the focalX/Y and anchorX/Y don’t provide enough info for my purposes.

PanResponder from React Native exposes this feature natively, but I’d much prefer to use this declarative library.

It might look something like this:

{
  nativeEvent: {
    // ...
    touches: [{
      x: 123,
      y: 456
    }]
  }
}

But ultimately, the format isn’t important — as long as it’s possible to access the location (and maybe other properties) of each touch on both the state change and handler events.
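Given an event shaped like the sketch above, the "which layer should this gesture affect" decision becomes ordinary hit-testing. A minimal sketch, assuming each nested layer's frame is known in the same coordinate space as the touch x/y values (the helper names and layer shape are hypothetical, not part of any library API):

```javascript
// Hypothetical helper: average position of all active touches.
function centroidOf(touches) {
  const sum = touches.reduce(
    (acc, t) => ({ x: acc.x + t.x, y: acc.y + t.y }),
    { x: 0, y: 0 }
  );
  return { x: sum.x / touches.length, y: sum.y / touches.length };
}

// Return the layer whose frame contains the touches' centroid, or null.
// The last match wins, mirroring "rendered last is on top" ordering.
function hitTestLayer(touches, layers) {
  const c = centroidOf(touches);
  const hits = layers.filter(
    (l) =>
      c.x >= l.frame.x &&
      c.x <= l.frame.x + l.frame.width &&
      c.y >= l.frame.y &&
      c.y <= l.frame.y + l.frame.height
  );
  return hits.length > 0 ? hits[hits.length - 1] : null;
}

// Example: two touches whose centroid (150, 150) falls inside layerB.
const layers = [
  { name: 'layerA', frame: { x: 0, y: 0, width: 100, height: 100 } },
  { name: 'layerB', frame: { x: 100, y: 100, width: 100, height: 100 } },
];
const touches = [{ x: 140, y: 160 }, { x: 160, y: 140 }];
console.log(hitTestLayer(touches, layers).name); // 'layerB'
```

A handler could run this once when the gesture begins and then apply the rotation/pinch transform to the returned layer for the rest of the gesture.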

Issue Analytics

  • State:closed
  • Created 5 years ago
  • Reactions:4
  • Comments:5 (2 by maintainers)

Top GitHub Comments

2 reactions
chirhotec commented, Sep 26, 2018

+1 for this feature, but for a different use case. I’m having trouble combining Pinch and two-finger Pan handlers (I’ve read the docs and tried the demo, but I’m still having issues). With access to this raw data, it should be possible for me to use the PanGestureHandler to account for both pan and pinch.
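For the pan-plus-pinch case described above, raw per-touch positions really are sufficient: pan is the movement of the two touches' centroid, and pinch is the ratio of their separation. A minimal sketch of that geometry (plain math, not a react-native-gesture-handler API; the function name and event shape are hypothetical), assuming two touch points can be sampled at gesture start and on each update:

```javascript
// Derive pan (centroid translation) and pinch (distance ratio) from two
// raw touch points sampled at gesture start and at the current frame.
function twoFingerTransform(start, current) {
  // Midpoint of the two touches.
  const centroid = (pts) => ({
    x: (pts[0].x + pts[1].x) / 2,
    y: (pts[0].y + pts[1].y) / 2,
  });
  // Straight-line distance between the two touches.
  const distance = (pts) =>
    Math.hypot(pts[0].x - pts[1].x, pts[0].y - pts[1].y);

  const c0 = centroid(start);
  const c1 = centroid(current);
  return {
    translateX: c1.x - c0.x, // pan component
    translateY: c1.y - c0.y,
    scale: distance(current) / distance(start), // pinch component
  };
}

// Example: both fingers drift right by 10 while spreading to twice
// their original separation.
const start = [{ x: 0, y: 0 }, { x: 100, y: 0 }];
const current = [{ x: -40, y: 0 }, { x: 160, y: 0 }];
console.log(twoFingerTransform(start, current));
// { translateX: 10, translateY: 0, scale: 2 }
```

Feeding each frame's touch pair through a helper like this inside a single PanGestureHandler would let one handler drive both the translation and the scale of the target view.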

2 reactions
osdnk commented, Jun 22, 2018

Hi @cwhenderson20, thank you for this suggestion; it would be a nice feature to implement. I think we’ll take a look at it, but I can’t predict when that might happen, as we have other important targets right now.

Read more comments on GitHub >

Top Results From Across the Web

Touch gestures for Windows - Microsoft Support
Touch gestures ; Open notification center. Swipe with one finger in from the right edge of the screen ; See widgets. Swipe with...
Use AssistiveTouch on iPhone - Apple Support
Touch-and-hold gesture: Touch and hold your finger in one spot until the recording progress bar reaches halfway, then lift your finger. Be...
Using Touch Events - Web APIs | MDN
The touch events interfaces support application specific single and multi-touch interactions such as a two-finger gesture.
Tracking Position in a custom UIGestureRecognizer
I'm building my own UIGestureRecognizer subclass in order to combine the tap and swipe gestures and track the direction of ...
Touch shortcuts and gestures in Photoshop on the iPad
Using secondary touch shortcut: Tap and slide to the outer edge of the touch shortcut with your other hand to activate the secondary...
