
Touch input API proposal

See original GitHub issue

Inspired by:

  • https://developer.mozilla.org/en-US/docs/Web/API/Touch_events
  • https://developer.mozilla.org/en-US/docs/Web/API/Touch
  • https://developer.mozilla.org/en-US/docs/Web/API/Touch_events/Using_Touch_Events
  • http://who-t.blogspot.com/2012/01/multitouch-in-x-touch-grab-handling.html
  • https://docs.microsoft.com/en-us/xamarin/xamarin-forms/app-fundamentals/gestures/

Touch events

struct Touch
{
    long Id;
    Point Position;
}


class TouchEventArgs : RoutedEventArgs
{
   int SequenceId;
   TouchEventType Type;
   ulong Timestamp;
   List<Touch> Touches;
   List<Touch> AddedTouches;
   List<Touch> ChangedTouches;
   List<Touch> RemovedTouches;

   IAcceptedTouchSequence AcceptTouches(IControl control);
}

class TouchCancelledEventArgs : TouchEventArgs
{
   IControl AcceptorControl;
}

TouchesBegan(TouchEventArgs args) event

Triggered when the first touch point is activated.

TouchesChanged(TouchEventArgs args) event

Triggered when a touch point is moved, when a new touch point is added, or when a touch point is removed while at least one touch point remains.

TouchesEnded(TouchEventArgs args) event

Triggered when the last touch point is removed. Both Touches and ChangedTouches are empty.

TouchesCancelled(TouchCancelledEventArgs args) event

Triggered when the touch sequence is cancelled for platform-specific reasons, or when the sequence is accepted by another control.
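The four events above can be consumed directly by a control. A minimal sketch against the proposed API (the virtual `OnTouchesChanged` handler, the `PaintSurface` control, and the stroke-tracking logic are illustrative assumptions, not part of the proposal):

```csharp
// Sketch: a finger-painting control consuming raw touch events.
// Each touch point's Id keys the stroke it is drawing.
class PaintSurface : Control
{
    readonly Dictionary<long, List<Point>> _strokes = new();

    protected override void OnTouchesChanged(TouchEventArgs args)
    {
        // New fingers start new strokes.
        foreach (var touch in args.AddedTouches)
            _strokes[touch.Id] = new List<Point> { touch.Position };

        // Moved fingers extend their strokes.
        foreach (var touch in args.ChangedTouches)
            _strokes[touch.Id].Add(touch.Position);

        InvalidateVisual();
    }
}
```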

Event flow

Touch events are triggered in sequences. A sequence begins when the first touch point is activated and binds itself to the control determined by hit-testing the first touch point's position. TouchesChanged and TouchesEnded events are sent with that control as the event target. It's guaranteed that there is only one active touch sequence at any given time.

Events follow the normal tunnel/bubble flow until some control accepts the sequence. After a control has accepted the sequence, touch events from that sequence will only be sent to that control. The target control will be sent a TouchesCancelled event through the tunnel/bubble flow, skipping the acceptor control.

This way we can have touch gesture recognition hierarchies: at the start every control in the tree can react to touch events, but once a gesture is recognized, events will only flow to the recognizer.
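To illustrate the accept mechanism, a hypothetical control might claim the sequence once it decides the gesture belongs to it. This is a sketch against the proposed API; the `PinchZoomControl` name and the `OnTouchesChanged`/`OnTouchesEnded` virtual handlers are assumptions:

```csharp
// Sketch: a control that grabs the touch sequence as soon as two
// fingers are down, so it alone receives the rest of that sequence.
class PinchZoomControl : Control
{
    IAcceptedTouchSequence _accepted;

    protected override void OnTouchesChanged(TouchEventArgs args)
    {
        // Once two touch points are active, accept the sequence.
        // Every other control in the route gets TouchesCancelled.
        if (_accepted == null && args.Touches.Count == 2)
            _accepted = args.AcceptTouches(this);
    }

    protected override void OnTouchesEnded(TouchEventArgs args)
    {
        _accepted = null;
    }
}
```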

Gesture recognition

Each control will have an AvaloniaList<IGestureRecognizer> GestureRecognizers property; the default implementations of the touch event handlers will delegate their work to the gesture recognizers associated with the control. The end-user API is similar to Xamarin.Forms.
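Under that API, attaching a recognizer to a control could look like this (a sketch; the `OpenImage` handler is a placeholder, and the Tapped event wiring assumes the recognizer raises it as shown later in this proposal):

```csharp
// Sketch: Xamarin.Forms-style end-user API for gesture recognizers.
var image = new Image();
image.GestureRecognizers.Add(new TapGestureRecognizer());
image.AddHandler(Gestures.TappedEvent, (sender, e) => OpenImage());
```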

A gesture recognizer can accept or reject the current touch sequence. When a gesture recognizer rejects the sequence, it won't be sent any further events from that sequence. If a recognizer accepts the sequence, the control will accept the sequence and send events to the accepting recognizer. Recognizers can register themselves to run on the tunnel stage, the bubble stage, or both.

Example gesture recognizer:


class TapGestureRecognizer : IGestureRecognizer
{
    ulong _started;
    Point _startPoint;
    const double Distance = 20;
    const ulong MaxTapDuration = 500;
    GestureRecognizerResult Handle(TouchEventArgs args)
    {
       // Multi-touch sequence
       if(args.Touches.Count > 1)
           return GestureRecognizerResult.Reject;
       // Sequence started, save the start time
       if(args.Type == TouchEventType.TouchesBegan)
       {
           _started = args.Timestamp;
           _startPoint = args.Touches[0].Position;
           return GestureRecognizerResult.Continue;
       }
       if(args.Type == TouchEventType.TouchesEnded)
       {
           var endPoint = args.RemovedTouches[0].Position;
           if(Math.Abs(endPoint.X - _startPoint.X) < Distance
               && Math.Abs(endPoint.Y - _startPoint.Y) < Distance
               && (args.Timestamp - _started) < MaxTapDuration)
           {
               args.Source.RaiseEvent(new RoutedEventArgs(TappedEvent));
               return GestureRecognizerResult.Accept;
           }
           else
               return GestureRecognizerResult.Reject;
       }
       return GestureRecognizerResult.Continue;
    }
}

Touch event emulation

InputManager.EnableTouchEmulationViaMouse will enable touch emulation, so at least one-finger touch support can be developed and debugged on touch-incapable machines. The mouse pointer will behave like a single touch point while pressed.

Cancelling an accepted touch session

(this might not be required, we need to try other things before implementing)

A gesture recognizer (or control) can reject an already accepted grab. In this case, touch events that happened after the grab would be replayed.

interface IAcceptedTouchSequence 
{
     void Cancel();
}
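The exact handshake is left open by the proposal, but undoing a grab might be sketched like this (assuming the recognizer stored the IAcceptedTouchSequence it obtained when the sequence was accepted):

```csharp
// Sketch: undoing an accepted grab. How the recognizer obtains the
// IAcceptedTouchSequence is not specified here; we assume it was
// stored when the sequence was accepted.
IAcceptedTouchSequence _grab;

void AbortGesture()
{
    // Touch events that occurred after the grab are replayed
    // through the normal tunnel/bubble flow, so other recognizers
    // (e.g. a scroll recognizer) can pick the sequence up.
    _grab.Cancel();
}
```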

“Common” events

We are currently converting unhandled pointer events to Tapped and DoubleTapped here: https://github.com/AvaloniaUI/Avalonia/blob/master/src/Avalonia.Input/Gestures.cs#L29

The same would happen in a touch gesture recognizer: we would place a TapGestureRecognizer in TopLevel's GestureRecognizers collection by default.
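That default wiring might look like this (a sketch; placing the recognizer in the TopLevel constructor is an assumption about where the registration would live):

```csharp
// Sketch: TopLevel installs a TapGestureRecognizer by default, so
// unhandled touch sequences still produce Tapped/DoubleTapped
// events, mirroring the current mouse-based conversion in Gestures.cs.
public TopLevel()
{
    GestureRecognizers.Add(new TapGestureRecognizer());
}
```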

Issue Analytics

  • State: closed
  • Created 5 years ago
  • Reactions: 10
  • Comments: 26 (21 by maintainers)

Top GitHub Comments

2 reactions
mstr2 commented, Jan 16, 2019

If I understand you correctly, there can only be exactly one active touch sequence. This may be very limiting, since it will not allow for more advanced multi-touch interactions. For example, on the home screen in iOS, you can grab an app icon with one finger (assuming that you’re in wiggle mode), and then use a second finger to scroll to another home screen page while you are holding onto the app icon.

Another example is the UIDatePicker control in iOS, which allows you to scroll the individual spinners simultaneously. There are several more examples of such interactions, so it’s not a fringe feature (at least in iOS).

0 reactions
kekekeks commented, Jun 17, 2021
