
More shared touch/mouse handling over all platforms #2395

Open
pauldendulk opened this issue Jan 7, 2024 · 7 comments

@pauldendulk (Member)

Use more shared code in handling touch and mouse

Currently all platforms have different implementations, which are also very different in nature. What we are looking for is a way to map the platform-specific events to a platform-neutral event handler. Which events should those be? Do we deal with touch and mouse events separately, or can we use pointer events (which apply to both touch and mouse)? How about pinch zoom? Are all of them available on all platforms? To make a start I went through all MapControls to list the events that are currently used.
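To make the discussion more concrete, here is a rough sketch of what a platform-neutral layer could look like. All names (`SharedPointerEventArgs`, `IMapControlInput`, and so on) are made up for illustration; this is not existing Mapsui API.

```csharp
// Hypothetical platform-neutral contract; names are illustrative, not existing Mapsui API.
public enum SharedPointerDeviceType { Mouse, Touch }

public record SharedPointerEventArgs(
    double X,                        // position in device-independent pixels
    double Y,
    SharedPointerDeviceType Device,  // mouse or touch
    int PointerId);                  // distinguishes individual fingers

public interface IMapControlInput
{
    // Each platform-specific MapControl maps its native events onto these.
    void OnPointerPressed(SharedPointerEventArgs e);
    void OnPointerMoved(SharedPointerEventArgs e);
    void OnPointerReleased(SharedPointerEventArgs e);
    void OnDoubleTapped(SharedPointerEventArgs e);
    void OnPointerWheelChanged(SharedPointerEventArgs e, double delta);
}
```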

What is used right now?

We have nine different platforms. WinUI and Uno have shared code, so eight MapControls to go through.

Android

  • 👇Touch
  • 👇GestureDetector.SingleTapConfirmed
  • 👇GestureDetector.DoubleTap
  • 👇MapControlGestureListener.Fling

Avalonia

In Avalonia, pointer events represent individual fingers or the mouse, so we have to track the number of fingers ourselves (see the sketch after this list).

  • 🖱️👇PointerPressed
  • 🖱️👇PointerReleased
  • 🖱️👇PointerMoved
  • 🖱️👇PointerExited
  • 🖱️👇PointerCaptureLost
  • 🖱️👇PointerWheelChanged
  • 🖱️👇DoubleTapped
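A minimal sketch of how that finger tracking could be done with the Avalonia pointer events listed above. It assumes the standard Avalonia `PointerPressed`/`PointerReleased`/`PointerCaptureLost` events and is not the actual Mapsui.UI.Avalonia implementation.

```csharp
// Sketch of tracking active pointers in an Avalonia control, assuming standard
// Avalonia pointer events; not the actual Mapsui.UI.Avalonia implementation.
using System.Collections.Generic;
using Avalonia.Controls;

public class PointerCountingControl : Control
{
    private readonly HashSet<int> _activePointers = new();

    public PointerCountingControl()
    {
        PointerPressed += (_, e) => _activePointers.Add(e.Pointer.Id);
        PointerReleased += (_, e) => _activePointers.Remove(e.Pointer.Id);
        PointerCaptureLost += (_, e) => _activePointers.Remove(e.Pointer.Id);
    }

    // 0 = hover/mouse only, 1 = pan/drag, 2+ = pinch candidate.
    public int ActivePointerCount => _activePointers.Count;
}
```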

Blazor

  • 🖱️OnMouseWheel
  • ? OnDblClick
  • 🖱️👇OnPointerDown
  • 🖱️👇OnPointerUp
  • 🖱️OnMouseMove
  • 👇OnTouchStart
  • 👇OnTouchMove
  • 👇OnTouchEnd

Eto

  • 🖱️OnMouseDown
  • 🖱️OnMouseUp
  • 🖱️OnMouseMove

iOS

  • 👇UITapGestureRecognizer.OnDoubleTapped
  • 👇UITapGestureRecognizer.OnSingleTapped
  • 👇TouchesBegan
  • 👇TouchesMoved
  • 👇TouchesEnded

MAUI

  • 🖱️👇SKGLView.Touch or SKCanvasView.Touch

The Touch event carries various touch action types:

  • Entered
  • Pressed
  • Moved
  • Released
  • Cancelled
  • Exited
  • WheelChanged

From these we determine ourselves which events to trigger, including LongPress and DoubleTapped; the first step of that mapping is sketched below.
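A sketch of how that fan-out could look, assuming the SkiaSharp.Views `SKTouchEventArgs`/`SKTouchAction` API; the `Shared*` methods are hypothetical placeholders for the shared handling.

```csharp
// Sketch of fanning out SKCanvasView.Touch to shared handlers, assuming the
// SkiaSharp.Views SKTouchEventArgs/SKTouchAction API; the Shared* methods are hypothetical.
using SkiaSharp.Views.Maui;

public partial class MapControl
{
    private void OnTouch(object? sender, SKTouchEventArgs e)
    {
        switch (e.ActionType)
        {
            case SKTouchAction.Pressed:
                SharedPointerPressed(e.Id, e.Location.X, e.Location.Y);
                break;
            case SKTouchAction.Moved:
                SharedPointerMoved(e.Id, e.Location.X, e.Location.Y);
                break;
            case SKTouchAction.Released:
            case SKTouchAction.Cancelled:
                SharedPointerReleased(e.Id, e.Location.X, e.Location.Y);
                break;
            case SKTouchAction.WheelChanged:
                SharedMouseWheel(e.Location.X, e.Location.Y, e.WheelDelta);
                break;
        }
        e.Handled = true; // keep receiving Moved/Released for this pointer
    }

    // Hypothetical shared entry points, implemented in platform-independent code.
    partial void SharedPointerPressed(long id, float x, float y);
    partial void SharedPointerMoved(long id, float x, float y);
    partial void SharedPointerReleased(long id, float x, float y);
    partial void SharedMouseWheel(float x, float y, int delta);
}
```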

To calculate the fling, Mapsui.FlingTracker is used.
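The FlingTracker itself is not shown here, but the general idea, estimating a velocity from the most recent pointer samples, could be sketched like this. Names and thresholds are made up; this is not the Mapsui.FlingTracker code.

```csharp
// Minimal, hypothetical velocity tracker illustrating the fling idea;
// not the actual Mapsui.FlingTracker implementation.
using System;
using System.Collections.Generic;
using System.Linq;

public class SimpleFlingTracker
{
    private readonly Queue<(long Ticks, double X, double Y)> _samples = new();
    private const long WindowTicks = TimeSpan.TicksPerMillisecond * 100; // keep the last 100 ms

    public void AddSample(double x, double y)
    {
        var now = DateTime.UtcNow.Ticks;
        _samples.Enqueue((now, x, y));
        while (_samples.Count > 0 && now - _samples.Peek().Ticks > WindowTicks)
            _samples.Dequeue();
    }

    // Velocity in units per second based on the oldest and newest sample in the window.
    public (double Vx, double Vy) GetVelocity()
    {
        if (_samples.Count < 2) return (0, 0);
        var first = _samples.Peek();
        var last = _samples.Last();
        var seconds = (last.Ticks - first.Ticks) / (double)TimeSpan.TicksPerSecond;
        if (seconds <= 0) return (0, 0);
        return ((last.X - first.X) / seconds, (last.Y - first.Y) / seconds);
    }
}
```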

WinUI / Uno

  • 👇ManipulationStarted
  • 👇ManipulationDelta
  • 👇ManipulationCompleted
  • 👇ManipulationInertiaStarting
  • 🖱️👇Tapped
  • 🖱️👇PointerPressed
  • 🖱️👇DoubleTapped
  • 🖱️👇PointerMoved
  • 🖱️👇PointerWheelChanged

WPF

  • 🖱️MouseLeftButtonDown
  • 🖱️MouseLeftButtonUp
  • 🖱️MouseMove
  • 🖱️MouseLeave
  • 🖱️MouseWheel
  • 👇TouchUp
  • 👇ManipulationStarted
  • 👇ManipulationDelta
  • 👇ManipulationCompleted
  • 👇ManipulationInertiaStarting

To calculate the fling, Mapsui.FlingTracker is used.
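For WPF, the manipulation events listed above could be routed to shared pan/pinch handling roughly like this. The sketch assumes the standard WPF manipulation API; the `Shared*` methods are hypothetical.

```csharp
// Sketch of mapping WPF manipulation events onto shared pan/pinch handling,
// assuming standard WPF APIs; the Shared* methods are hypothetical.
using System.Windows;
using System.Windows.Input;

public partial class MapControlWpfSketch : FrameworkElement
{
    public MapControlWpfSketch()
    {
        IsManipulationEnabled = true;
        ManipulationDelta += OnManipulationDelta;
    }

    private void OnManipulationDelta(object? sender, ManipulationDeltaEventArgs e)
    {
        // Translation maps to a shared pan, Scale to a shared pinch/zoom.
        SharedPan(e.DeltaManipulation.Translation.X, e.DeltaManipulation.Translation.Y);
        SharedPinch(e.DeltaManipulation.Scale.X);
        e.Handled = true;
    }

    partial void SharedPan(double dx, double dy);
    partial void SharedPinch(double scale);
}
```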

@charlenni (Member)

That is really difficult. Finding a least common denominator is hard. And sometimes a framework offers even more possibilities.

Perhaps we should first discuss what the map needs as basic events, then check what each platform provides and implement the connection between the two.

@charlenni (Member)

When I look through the events we have, only Down/Move/Up exist on every platform. And even for Move it isn't clear whether it is only a down-move (drag) or also a hover (which doesn't exist on touch platforms).

If you want to handle the events in shared code, then you must synthesize all the events (click/tap, double-click/tap and so on) yourself in that shared code. It could be that they do not react like the native events. The other way around, you would have to catch the events of the native platform in the native part of the MapControl and connect them with the Mapsui functions. I assume that all possible reactions already exist in Mapsui.
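For example, a double tap synthesized in shared code from nothing but pointer-up events could look like the sketch below; all names and thresholds are made up and this is not existing Mapsui code.

```csharp
// Hypothetical shared double-tap detection built only from pointer-up, illustrating
// that click/tap events would have to be synthesized in shared code.
using System;

public class TapSynthesizer
{
    private static readonly TimeSpan DoubleTapInterval = TimeSpan.FromMilliseconds(300);
    private const double MaxTapMovement = 20; // max distance between the two taps

    private DateTime _lastTapTime = DateTime.MinValue;
    private (double X, double Y) _lastTapPosition;

    public event Action<double, double>? SingleTapped;
    public event Action<double, double>? DoubleTapped;

    public void OnPointerUp(double x, double y)
    {
        var now = DateTime.UtcNow;
        var isDoubleTap =
            now - _lastTapTime < DoubleTapInterval &&
            Math.Abs(x - _lastTapPosition.X) < MaxTapMovement &&
            Math.Abs(y - _lastTapPosition.Y) < MaxTapMovement;

        if (isDoubleTap)
        {
            DoubleTapped?.Invoke(x, y);
            _lastTapTime = DateTime.MinValue; // reset so a triple tap is not two doubles
        }
        else
        {
            SingleTapped?.Invoke(x, y);
            _lastTapTime = now;
            _lastTapPosition = (x, y);
        }
    }
}
```

Note that this version fires SingleTapped immediately, whereas native implementations typically delay it until the double-tap window has passed, which is exactly the "does not react like the native events" risk mentioned above.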

And some of the events are provided by SkiaSharp.

@pauldendulk (Member, Author)

@charlenni I am now in the phase where I am investigating the problem. Many options are open. By posting this issue I am also trying to explain the way I am thinking so that I can take others along in my thoughts.

Perhaps the first thing I should have mentioned is the underlying goal, which is to have a code base which is easy to maintain. I want to spend less time fixing bugs and it should be easy to add new functionality. To make it easier to maintain I want to have less code, more shared code, consistent implementations and clear responsibilities.

> That is really difficult.

The mapping of platform-specific events to platform-independent events is something we are already doing for the widgets. We even use double click and the shift key modifier. It is, however, not implemented on all platforms and not working correctly on some platforms. For instance, in current main the 'Editing' 'Editing Add Line' sample works differently in Avalonia and WPF. This is because some of the events listed above overlap: a DoubleTapped is triggered while there is also a PointerUp.

So, since we are already doing this kind of mapping we need to clean it up and make sure that what we have is correct.

Btw, in analyzing this we should also be clear about what the use cases are. One thing to note is that support for widgets and support for app builders are different categories.

> Finding a least common denominator is hard.

This is the kind of thing I am investigating in this issue. I am looking in both directions: what do we need or want (the listing above relates to that), and what do the platforms provide (something I do not have fully clear yet).

> And sometimes a framework offers even more possibilities.

If a platform has some advanced functionality not supported by the others then I would like to stay away from it. App builders on that platform would still be able to listen to those events, and I want to make it possible to get MapInfo through extension methods on that platform. So, where we now pass MapInfo in our events, users could instead request MapInfo from the platform-specific event themselves.
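As an illustration of that extension-method route for WPF: everything here is hypothetical, including the `IMapInfoProvider` interface and the placeholder `MapInfoSketch` type; the real Mapsui API may differ.

```csharp
// Hypothetical sketch of the extension-method route: the user keeps the native
// WPF event args and asks for MapInfo explicitly. Not existing Mapsui API.
using System.Windows;
using System.Windows.Input;

public record MapInfoSketch(object? Feature); // placeholder for Mapsui's MapInfo

public interface IMapInfoProvider
{
    // Assumed shared hit-test entry point; the real Mapsui API may differ.
    MapInfoSketch GetMapInfo(double screenX, double screenY);
}

public static class MapInfoExtensions
{
    public static MapInfoSketch GetMapInfo(this MouseButtonEventArgs e, UIElement mapControl)
    {
        var position = e.GetPosition(mapControl);
        return ((IMapInfoProvider)mapControl).GetMapInfo(position.X, position.Y);
    }
}
```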

> When I look through the events we have, only Down/Move/Up exist on every platform. And even for Move it isn't clear whether it is only a down-move (drag) or also a hover (which doesn't exist on touch platforms).

The listing above is what we are currently using on those platforms. I did not fully investigate what is supported on all platforms, but this is the kind of research that should be part of this issue. So far I just looked at what we are using. From what I saw yesterday it looked like all platforms had some support for DoubleTapped. This is one of the decisions to take. Perhaps our widgets should not have a double tap.

> And some of the events are provided by SkiaSharp.

Yes, that is something that I wanted to make clear with the listing. I knew it was used in MAUI, but it seems it is only used in MAUI. Also, MAUI uses nothing but SkiaSharp events. When aligning things with the other MapControls it would make sense to use MAUI events in MAUI. Then again, it is also an option to use SkiaSharp events everywhere; there would be more shared code. Since I have also seen some bugs in the SkiaSharp touch handling I doubt this is the best solution. Perhaps we could even provide both, so SkiaSharp touch handling would be an alternative touch handler for all platforms.

@pauldendulk changed the title from "Align mouse/touch events over all platforms and use shared code where possible" to "More shared touch/mouse handling over all platforms" on Jan 8, 2024
@charlenni (Member)

Perhaps it would also be helpful to look at what other map libraries provide. I always look at what Google Maps does, because that is what users expect.

I assume that Mapsui could handle all the things that are possible and expected on all platforms. But each platform expects it in a slightly different way. So I'm not sure whether you can reduce the amount of platform-specific code and increase the shared part.

The next thing is that you would need an expert on each platform. When I looked into the code of SkiaSharp I found that with each new platform the handling is reduced, because the new platform lacks some of the events. It seems that platform independence has its own limitations. Much work for a suboptimal result.

@pauldendulk (Member, Author)

> Perhaps it would also be helpful to look at what other map libraries provide. I always look at what Google Maps does, because that is what users expect.

Yes, that is something I can not find enough time for. I worked with several in the past, but that was long ago. I've looked into Leaflet somewhat recently and that was very useful. We should do that more.

> The next thing is that you would need an expert on each platform. When I looked into the code of SkiaSharp I found that with each new platform the handling is reduced, because the new platform lacks some of the events. It seems that platform independence has its own limitations. Much work for a suboptimal result.

'Much work'? Much work is spending three long days stepping through unintelligible code to figure out why widget handling is not working, while I wrote the Blazor touch handling in three hours. This issue is all about less work.

@charlenni (Member)

I tried today to get a first overview of this problem. I found that it will not be possible to do this easily. Perhaps it is easier to provide functions for the things that you would like to do with a map:

  • Pan
  • Pinch
  • Fling
  • Zoom in
  • Zoom out
  • Select something (point, widget and so on)
  • Drag
  • Drop

These actions are connected to different events on the different platforms:

  • On a keyboard/mouse device a pan is started by a pointer down, on a mobile device by a touch
  • On a keyboard/mouse device a zoom in is done by a mouse-wheel operation or a double click, on a mobile device by a pinch gesture
  • On a keyboard/mouse device a drag is started by holding the mouse button down, on a mobile device perhaps by a long press

And then there are MapControls that run on either a keyboard/mouse platform or a touch platform.

Would it be a good idea to implement functions in the shared MapControl for the different things you can do with a map, and then trigger those functions from platform-specific events (see the sketch after this list)?

  • Single tap
  • Double tap
  • Long tap
  • and so on
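A sketch of that idea: the map actions are implemented once in shared code, and the platform layer is reduced to deciding which native gesture triggers which action. All names are illustrative, not existing Mapsui API.

```csharp
// Sketch of shared map actions plus a thin per-platform gesture router; names hypothetical.
public class SharedMapActions
{
    public void Pan(double dx, double dy) { /* move the viewport */ }
    public void ZoomIn(double centerX, double centerY) { /* zoom in around a point */ }
    public void ZoomOut(double centerX, double centerY) { /* zoom out around a point */ }
    public void Fling(double vx, double vy) { /* start an inertia animation */ }
    public void Select(double x, double y) { /* hit-test features and widgets */ }
}

// Example platform wiring: the platform layer only routes native gestures to actions.
public class PlatformGestureRouter
{
    private readonly SharedMapActions _actions;
    public PlatformGestureRouter(SharedMapActions actions) => _actions = actions;

    public void OnNativeDoubleTap(double x, double y) => _actions.ZoomIn(x, y);
    public void OnNativeMouseWheelUp(double x, double y) => _actions.ZoomIn(x, y);
    public void OnNativeDragDelta(double dx, double dy) => _actions.Pan(dx, dy);
    public void OnNativeFling(double vx, double vy) => _actions.Fling(vx, vy);
}
```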

@pauldendulk (Member, Author)

There are two kinds of events we need to think about.

  1. Map-specific events on the MapControl. The current Info event is such an event (I want to rename it to MapInfo because that is more distinctive). It is an event similar to what users know from their UI framework but with MapInfo added. There are two routes to go: 1) provide a number of events similar to the existing events but with MapInfo, or 2) provide extension methods on the existing events to fetch MapInfo. I currently think the last option is the best, but I have to see how it works out when working on the code (a rough shape of route 1 is sketched after this list).

  2. The manipulation of the widgets. We have an implementation for this but it is not always correct, and for editing we do not have mobile support yet. So, some work to do. This is what I will investigate first. I may want to use platform-specific SingleTap/DoubleTap and call our shared events from that. I think this is what you were suggesting in your last remark.
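For route 1 from the first point, the shape could be something like the sketch below. The names (`MapInfoTapped`, the placeholder `MapInfoPlaceholder` type) are made up; the extension-method alternative (route 2) was sketched earlier in this thread.

```csharp
// Hypothetical shape of route 1: the MapControl raises its own platform-neutral
// event that already carries MapInfo. Names are illustrative, not existing Mapsui API.
using System;

public record MapInfoPlaceholder(object? Feature); // stand-in for Mapsui's MapInfo

public class MapInfoTappedEventArgs : EventArgs
{
    public MapInfoTappedEventArgs(MapInfoPlaceholder mapInfo) => MapInfo = mapInfo;
    public MapInfoPlaceholder MapInfo { get; }
}

public class MapControlSketch
{
    // Route 1: a platform-neutral event that already contains MapInfo.
    public event EventHandler<MapInfoTappedEventArgs>? MapInfoTapped;

    // Called from the shared tap handling; raises the MapInfo-carrying event.
    protected void RaiseMapInfoTapped(MapInfoPlaceholder mapInfo) =>
        MapInfoTapped?.Invoke(this, new MapInfoTappedEventArgs(mapInfo));
}
```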
