.NET MAUI Android: Custom Gesture Recognition and TalkBack Compatibility Issue

fatih uyanık 245 Reputation points
2026-02-01T19:32:46.06+00:00

Hi everyone,

I am working on a .NET MAUI project for Android and I want to implement gesture-based shortcuts to improve the user experience. For example, I want to trigger specific actions when a user draws an "L" shape on the screen.

However, I've hit a roadblock in two main areas:

  1. Complex Gesture Recognition: The built-in TapGestureRecognizer and SwipeGestureRecognizer in MAUI only support basic interactions. I am unable to capture complex shapes such as an "L" while the user is drawing.
  2. TalkBack Compatibility: This feature needs to work both when TalkBack (Screen Reader) is active and when it is disabled. As you know, when TalkBack is on, the system intercepts and interprets touch inputs differently (e.g., single tap for focus, double tap for click).

What I’ve Tried:

• Used the standard MAUI gesture recognizers (insufficient for complex shapes).

• Attempted to capture OnTouch events on the Android side, but when TalkBack is enabled, these events are either intercepted by the system or handled differently.
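For context, here is roughly what I attempted on the Android side (simplified sketch; `TouchTracking`/`AttachTouchTracking` are illustrative names, not MAUI APIs). With TalkBack off this fires as expected; with TalkBack on, the `Touch` event stops arriving:

```csharp
#if ANDROID
using System.Collections.Generic;
using Android.Views;

static class TouchTracking
{
    // Illustrative helper: subscribe to the native Android view's Touch
    // event through the MAUI handler and record the drawn path.
    public static void AttachTouchTracking(Microsoft.Maui.Controls.View mauiView)
    {
        if (mauiView.Handler?.PlatformView is not Android.Views.View platformView)
            return;

        var points = new List<(float X, float Y)>();
        platformView.Touch += (s, e) =>
        {
            MotionEvent? ev = e.Event;
            if (ev is null) return;

            if (ev.Action == MotionEventActions.Down)
                points.Clear();

            if (ev.Action == MotionEventActions.Down || ev.Action == MotionEventActions.Move)
                points.Add((ev.GetX(), ev.GetY()));

            if (ev.Action == MotionEventActions.Up)
            {
                // points now holds the full path; this is where I try
                // to match the "L" shape.
            }

            e.Handled = false; // let other handlers still see the input
        };
    }
}
#endif
```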

My Question:

How can I resolve this issue on the Android side? How can I capture these custom gestures within my app without them being intercepted by the system when TalkBack is active?

Has anyone encountered a similar accessibility scenario before or can suggest an approach to overcome this issue?

Thanks in advance!

Developer technologies | .NET | .NET MAUI

1 answer

  1. Q&A Assist
    2026-02-01T19:32:58.7666667+00:00
    AI generated content. This answer was created with AI from Q&A Assist.

    To implement complex gesture recognition in a .NET MAUI application on Android, especially for gestures like drawing an "L" shape, you may need to go beyond the built-in gesture recognizers such as TapGestureRecognizer and SwipeGestureRecognizer, which are limited to basic gestures. Here are some suggestions to help you achieve your goal:

    1. Custom Gesture Recognition: Since the built-in recognizers do not support complex gestures, implement your own gesture tracking. Note that MAUI does not dispatch raw touch input to custom subclasses of GestureRecognizer, so instead collect the touch points yourself, either with a PanGestureRecognizer (its PanUpdated event reports incremental movement) or by subscribing to the native Android view's touch events through the platform handler. Once you have the sequence of points, you can match it against the pattern you care about, such as an "L" shape (a roughly vertical stroke followed by a roughly horizontal one).
    2. Handling Touch Events: You mentioned trying to capture OnTouch events. The reason these stop working under TalkBack is touch exploration mode: the system consumes raw touch input and delivers finger movement to views as hover events (OnHoverEvent / dispatchHoverEvent) instead of touch events, and only a double tap is translated back into a click on the focused element. To support both scenarios, you can check AccessibilityManager.IsTouchExplorationEnabled and track hover events when it is on, falling back to ordinary touch tracking when it is off. Either way, test extensively, because TalkBack reserves many gestures (especially multi-finger ones) for its own navigation.
    3. Accessibility Considerations: When implementing custom gestures, ensure that your app remains accessible. You can provide alternative ways to trigger actions for users who rely on TalkBack. For instance, you could use buttons or other UI elements that can be easily accessed via screen readers, while still allowing gesture recognition for those who prefer it.
    4. Testing with TalkBack: Make sure to test your application with TalkBack enabled to see how it interacts with your custom gestures. You may need to adjust your gesture recognition logic based on how TalkBack interprets touch events.
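    As a concrete illustration of step 1, here is a minimal, framework-free sketch of matching a recorded point sequence against an "L" shape (a mostly-vertical stroke down, then a mostly-horizontal stroke right). The class name and thresholds are invented for the example and should be tuned for real devices:

    ```csharp
    using System;
    using System.Collections.Generic;

    // Hypothetical helper, for illustration only. Screen coordinates:
    // Y grows downward, so an "L" is "Y increases, then X increases".
    public static class LShapeDetector
    {
        const float MinStroke = 100f; // minimum stroke length in pixels (tune per DPI)

        public static bool IsLShape(IReadOnlyList<(float X, float Y)> pts)
        {
            if (pts.Count < 3)
                return false;

            // The corner is approximately the first point with the largest Y
            // value (the lowest point on screen).
            int corner = 0;
            for (int i = 1; i < pts.Count; i++)
                if (pts[i].Y > pts[corner].Y)
                    corner = i;

            float downLen    = pts[corner].Y - pts[0].Y;
            float downDrift  = Math.Abs(pts[corner].X - pts[0].X);
            float rightLen   = pts[pts.Count - 1].X - pts[corner].X;
            float rightDrift = Math.Abs(pts[pts.Count - 1].Y - pts[corner].Y);

            // Each stroke must be long enough and must not drift sideways
            // by more than half its own length.
            return downLen  > MinStroke && downDrift  < downLen  * 0.5f
                && rightLen > MinStroke && rightDrift < rightLen * 0.5f;
        }
    }
    ```

    You would feed this the points collected from PanGestureRecognizer.PanUpdated or from the native touch/hover events, calling it when the finger lifts.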

    By combining custom gesture recognition with careful handling of touch events and maintaining accessibility standards, you should be able to create a user-friendly experience that accommodates both gesture-based interactions and screen reader functionality.
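    For the TalkBack side specifically, a sketch of branching on the touch-exploration state (AccessibilityManager and its properties are real Android APIs; the helper class and the page wiring are illustrative):

    ```csharp
    #if ANDROID
    using Android.Content;
    using Android.Views.Accessibility;

    // Illustrative helper: detect whether TalkBack's touch exploration is
    // active so the app can route input accordingly (raw Touch events when
    // off; Hover events or an alternative UI affordance when on).
    static class TalkBackState
    {
        public static bool IsTouchExplorationEnabled(Context context)
        {
            var am = (AccessibilityManager?)context.GetSystemService(Context.AccessibilityService);
            return am?.IsEnabled == true && am.IsTouchExplorationEnabled;
        }
    }
    #endif
    ```

    When touch exploration is enabled, the native view's Hover event (the binding of Android's OnHoverListener) receives the finger movement that Touch would otherwise report, so the same point-collection logic can be attached there.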

