[MAUI programming] Using Handlers to implement custom cross-platform controls


Today, let's talk about a core concept of MAUI cross-platform development: cross-platform controls.

Whether it is MAUI, Xamarin.Forms, or another cross-platform technology, each is an abstraction layer over multiple platforms, exposing a common API to achieve the goal of "write once, run everywhere".

A cross-platform framework has to keep its common API compatible with every platform, but because the native platforms differ in capability, that common API cannot cover every feature of a specific platform.

For example, MAUI's gesture recognizers do not provide long-press (LongPress) recognition: TapGestureRecognizer only reacts to a press followed by a release, and offers no way to detect a long press.

In such cases, developers have to implement the platform-specific functionality themselves, which is what custom controls are for.

Whether you want to rewrite a control or enhance the behavior or appearance of a default control, the most basic requirement is to get hold of both the cross-platform control and its native control.

Properties defined on the cross-platform control are passed to the native control, and changes to custom properties are handled in the native control; that is how a custom control is realized.

Next, I will introduce a MAUI feature that is easy to use but not widely known: the Handler.

Handler

Because the implementation of a cross-platform control is ultimately provided by a native view on each platform, MAUI defines an interface for every control to abstract it. Cross-platform controls that implement these interfaces are called virtual views. Handlers map these virtual views to the controls on each platform, which are called native views.

The Handler property on VisualElement exposes an object that implements the IElementHandler interface, through which you can access both the virtual view and the native view.

public interface IViewHandler : IElementHandler
{
    bool HasContainer { get; set; }
    object? ContainerView { get; }
    IView? VirtualView { get; }

    Size GetDesiredSize(double widthConstraint, double heightConstraint);
    void PlatformArrange(Rect frame);
}

Each control has its own handler and interface; see the official documentation for the full list.
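Once a handler has been created for a control, the control's Handler property gives you access to both sides. Below is a minimal sketch of reaching the native view behind an Entry; the casts target the platform base types MAUI documents for Entry, and the snippet is purely illustrative:

var entry = new Entry();
entry.HandlerChanged += (sender, args) =>
{
#if ANDROID
    // on Android, Entry is backed by an AppCompatEditText
    var editText = entry.Handler?.PlatformView as AndroidX.AppCompat.Widget.AppCompatEditText;
#elif IOS || MACCATALYST
    // on iOS and Mac Catalyst, Entry is backed by a UITextField
    var textField = entry.Handler?.PlatformView as UIKit.UITextField;
#elif WINDOWS
    // on Windows, Entry is backed by a WinUI TextBox
    var textBox = entry.Handler?.PlatformView as Microsoft.UI.Xaml.Controls.TextBox;
#endif
    // from here you can call native APIs on the platform view
};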

By registering customizations on a handler's global mapper, you get an entry point for implementing control behavior on a specific native platform.
Combined with C# conditional compilation, it becomes convenient to write the per-platform customization in a single file.

Entry is a single-line text input control that implements the IEntry interface, and its corresponding Handler is EntryHandler.


Suppose we want all text in an Entry to be selected automatically when the control gets focus; we can append a customization to EntryHandler's mapper:

Microsoft.Maui.Handlers.EntryHandler.Mapper.AppendToMapping("MyCustomization", (handler, view) =>
{
#if ANDROID
    handler.PlatformView.SetSelectAllOnFocus(true);
#elif IOS || MACCATALYST
    handler.PlatformView.EditingDidBegin += (s, e) =>
    {
        handler.PlatformView.PerformSelector(new ObjCRuntime.Selector("selectAll"), null, 0.0f);
    };
#elif WINDOWS
    handler.PlatformView.GotFocus += (s, e) =>
    {
        handler.PlatformView.SelectAll();
    };
#endif
});

Alternatively, you can use partial classes to organize your code into platform-specific folders and files. For more information on conditional compilation, please refer to the official documentation.
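As a hedged sketch of that layout (the file, class, and method names EntryCustomization and ApplyPlatform are made up for illustration): a shared file appends the mapping and declares a partial method, and a file under Platforms/Android supplies the Android body.

// EntryCustomization.cs (shared project code)
public static partial class EntryCustomization
{
    public static void Apply()
    {
        Microsoft.Maui.Handlers.EntryHandler.Mapper.AppendToMapping("SelectAllOnFocus", (handler, view) =>
        {
            ApplyPlatform(handler);   // calls to an unimplemented partial method compile away on other platforms
        });
    }

    // each platform that cares supplies a body for this partial method
    static partial void ApplyPlatform(Microsoft.Maui.Handlers.IEntryHandler handler);
}

// EntryCustomization.Android.cs (under Platforms/Android, compiled only for Android)
public static partial class EntryCustomization
{
    static partial void ApplyPlatform(Microsoft.Maui.Handlers.IEntryHandler handler)
    {
        // PlatformView is an AppCompatEditText on Android
        handler.PlatformView.SetSelectAllOnFocus(true);
    }
}

Calling EntryCustomization.Apply() once, for example from CreateMauiApp, registers the customization; on platforms that do not implement ApplyPlatform the call simply compiles away.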

Differences from the Xamarin.Forms implementation

In the Xamarin.Forms era, the mechanism provided for custom controls was the Renderer.

A Xamarin.Forms control such as Entry is rendered by a renderer class encapsulated in each platform project, in this case EntryRenderer.


By overriding a control's default renderer, you can completely change the way the control looks and behaves. Inside a renderer, two members give you access to both worlds:

  • Element, the Xamarin.Forms element
  • Control, the native view, widget, or control object
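For comparison, this is roughly what an Android custom renderer looked like in Xamarin.Forms. It is an abridged sketch; MyApp.Droid and MyEntryRenderer are made-up names, and the property-changed branch is left empty on purpose:

// Xamarin.Forms, Android project: an abridged custom renderer for Entry
using Android.Content;
using Xamarin.Forms;
using Xamarin.Forms.Platform.Android;

[assembly: ExportRenderer(typeof(Entry), typeof(MyApp.Droid.MyEntryRenderer))]
namespace MyApp.Droid
{
    public class MyEntryRenderer : EntryRenderer
    {
        public MyEntryRenderer(Context context) : base(context) { }

        protected override void OnElementChanged(ElementChangedEventArgs<Entry> e)
        {
            base.OnElementChanged(e);

            if (Control != null)
            {
                // Control is the native EditText, Element is the cross-platform Entry
                Control.SetSelectAllOnFocus(true);
            }
        }

        protected override void OnElementPropertyChanged(object sender, System.ComponentModel.PropertyChangedEventArgs e)
        {
            base.OnElementPropertyChanged(sender, e);

            // every cross-platform property change has to be forwarded to the native control by hand
            if (e.PropertyName == Entry.TextColorProperty.PropertyName)
            {
                // ... update the native control ...
            }
        }
    }
}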

Why use Handler instead of Renderer

Although renderers are powerful, in most scenarios you do not need to rewrite the whole control; you only want to add some platform-specific enhancements. Having to subclass the renderer and override OnElementPropertyChanged just to transfer property values from the cross-platform control to the native control makes this approach too heavy.

In my understanding, the Handler is an evolution of the Renderer that addresses three of its problems: the coupling between renderers and cross-platform controls, life-cycle management of customizations, and the lack of finer-grained control.

Decoupling

In a Xamarin.Forms renderer, reading a property of the cross-platform control requires a direct reference to the cross-platform type, which couples the renderer to that control.

In MAUI, handlers decouple the platform controls from the cross-platform framework: a handler talks to its virtual view only through an interface and a property mapper. One benefit is that the same handlers can be reused by other frameworks such as Comet and Fabulous.
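To make that concrete, here is a hedged sketch of the shape of a handler and its property mapper, reduced to the Android branch only. IMyLabel, MyLabelHandler, and the Text mapping are hypothetical names for illustration:

#if ANDROID
using Microsoft.Maui;
using Microsoft.Maui.Handlers;

// a hypothetical cross-platform control is described only by an interface
public interface IMyLabel : IView
{
    string Text { get; }
}

public class MyLabelHandler : ViewHandler<IMyLabel, Android.Widget.TextView>
{
    // the mapper is a dictionary from property names to static mapping methods;
    // the handler never references a concrete cross-platform class, only IMyLabel
    public static readonly IPropertyMapper<IMyLabel, MyLabelHandler> Mapper =
        new PropertyMapper<IMyLabel, MyLabelHandler>(ViewMapper)
        {
            [nameof(IMyLabel.Text)] = MapText,
        };

    public MyLabelHandler() : base(Mapper) { }

    protected override Android.Widget.TextView CreatePlatformView() =>
        new Android.Widget.TextView(Context);

    static void MapText(MyLabelHandler handler, IMyLabel label) =>
        handler.PlatformView.Text = label.Text;
}
#endif

The handler would still need to be registered with ConfigureMauiHandlers, and equivalent branches written for the other platforms; the point here is only that the coupling runs through an interface and a mapper, not through a concrete control class.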

Life-cycle management

Handler customization can be done anywhere in the application via the handler's Mapper. Once a handler is customized, it will affect all controls of that type anywhere in the application.

You can manage a handler's life cycle through the control's HandlerChanged and HandlerChanging events: their arguments tell you when a native view is being attached to or detached from the control, and that is where initialization and cleanup work belongs.
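A minimal sketch of those two hooks on an ordinary control:

// minimal sketch: watching a handler being attached to / detached from a Label
var label = new Label();

label.HandlerChanged += (sender, e) =>
{
    // label.Handler and label.Handler.PlatformView are available here: do initialization
};

label.HandlerChanging += (sender, e) =>
{
    if (e.OldHandler != null)
    {
        // the old native view is about to be detached: unhook events, release resources
    }
};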

Finer-grained control

Because customization is registered on a global mapper, there is no need to create control subclasses. Inside the mapping callback you can decide which instances to process, for example by inspecting a property of the cross-platform control or by checking a marker attached to it, which gives you free, aspect-oriented filtering.
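For example, a global customization can be made to apply only to instances that opt in. In this sketch the opt-in marker is the control's ClassId; the key name and the marker value are arbitrary choices for illustration:

// applies to every Entry in the app, but only acts on the ones that opt in via ClassId
Microsoft.Maui.Handlers.EntryHandler.Mapper.AppendToMapping("SelectAllOnFocusOptIn", (handler, view) =>
{
    if (view is Entry entry && entry.ClassId == "SelectAllOnFocus")
    {
#if ANDROID
        handler.PlatformView.SetSelectAllOnFocus(true);
#endif
    }
});

Any Entry marked with ClassId="SelectAllOnFocus" in XAML then gets the behavior, while other Entry instances are left alone.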

What about using Effect?

If we only want to change the appearance of a control, an Effect can do the job. But both Effects and Renderers are registered globally, and business logic that needs per-control state is awkward to express with them. A long press, for instance, is a press followed by a release, and a control that was never pressed should not react to a release; to track which controls are currently pressed you would end up maintaining a dictionary of all the customized controls yourself.

MAUI's mapper, by contrast, is essentially a dictionary of named customizations, which keeps that kind of bookkeeping out of your code.

In MAUI, the official recommendation is to migrate to handlers. Renderers can still be used, but they belong to the compatibility layer (the Compatibility namespace), the ExportRenderer attribute is no longer used, and each renderer has to be registered manually in CreateMauiApp:

.ConfigureMauiHandlers(handlers =>
{
#if ANDROID
    handlers.AddHandler(typeof(PressableView), typeof(XamarinCustomRenderer.Droid.Renderers.PressableViewRenderer));
#elif IOS
    handlers.AddHandler(typeof(PressableView), typeof(XamarinCustomRenderer.iOS.Renderers.PressableViewRenderer));
#endif
});

For the detailed steps of migrating from a Renderer to a Handler, please refer to the official documentation.

As mentioned earlier, MAUI lacks a built-in long-press gesture recognizer.

A long press (LongPress) can be decomposed into what happens between touching the screen and leaving it: when the finger touches the screen a Pressed event is raised, and when the finger leaves the screen a Released event is raised. If the interval between the press and the release exceeds a threshold, it counts as a long press (a sketch of this timing check appears near the end of this article).

The MAUI team does not seem to intend to add such a simple feature to its gesture recognizers, and may delegate it to the community: there is an issue for it in the CommunityToolkit (https://github.com/CommunityToolkit/Maui/issues/86). So far the only official touch-tracking sample is the Xamarin.Forms one implemented with an Effect (https://docs.microsoft.com/xamarin/xamarin-forms/app-fundamentals/effects/touch-tracking).

Let's use that official sample as a reference and implement a long-press-capable touch control in MAUI.

Custom gesture monitoring controls

First, define the touch actions that can be monitored: entered, pressed, moved, released, exited, and cancelled.


public enum TouchActionType
{
    Entered,
    Pressed,
    Moved,
    Released,
    Exited,
    Cancelled
}

Next, add a touch listener, TouchRecognizer, which exposes an OnTouchActionInvoked event that is raised for each touch action.

public partial class TouchRecognizer: IDisposable
{
    public event EventHandler<TouchActionEventArgs> OnTouchActionInvoked;
    public partial void Dispose();
}

The event argument class TouchActionEventArgs carries the touch parameters:

public class TouchActionEventArgs : EventArgs
{
    public TouchActionEventArgs(long id, TouchActionType type, Point location, bool isInContact)
        => (Id, Type, Location, IsInContact) = (id, type, location, isInContact);

    public long Id { get; }
    public TouchActionType Type { get; }
    public Point Location { get; }
    public bool IsInContact { get; }
}

Implement TouchRecognizer on each platform

Using partial classes, create TouchRecognizer.iOS.cs, TouchRecognizer.Android.cs and TouchRecognizer.Windows.cs, and implement TouchRecognizer separately on each platform. The platform implementations are not mixed together, which makes them easier to maintain.

Implementation in iOS

public partial class TouchRecognizer : UIGestureRecognizer, IDisposable
{
    UIView iosView;

    public TouchRecognizer(UIView view)
    {
        this.iosView = view;
    }

    public override void TouchesBegan(NSSet touches, UIEvent evt)
    {
        base.TouchesBegan(touches, evt);

        foreach (UITouch touch in touches.Cast<UITouch>())
        {
            // the native touch handle serves as a stable id for this finger
            long id = touch.Handle.Handle.ToInt64();
            InvokeTouchActionEvent(this, id, TouchActionType.Pressed, touch, true);
        }
    }

    public override void TouchesMoved(NSSet touches, UIEvent evt)
    {
        base.TouchesMoved(touches, evt);

        foreach (UITouch touch in touches.Cast<UITouch>())
        {
            long id = touch.Handle.Handle.ToInt64();

            InvokeTouchActionEvent(this, id, TouchActionType.Moved, touch, true);

        }
    }

    public override void TouchesEnded(NSSet touches, UIEvent evt)
    {
        base.TouchesEnded(touches, evt);

        foreach (UITouch touch in touches.Cast<UITouch>())
        {
            long id = touch.Handle.Handle.ToInt64();

            InvokeTouchActionEvent(this, id, TouchActionType.Released, touch, false);

        }
    }

    public override void TouchesCancelled(NSSet touches, UIEvent evt)
    {
        base.TouchesCancelled(touches, evt);

        foreach (UITouch touch in touches.Cast<UITouch>())
        {
            long id = touch.Handle.Handle.ToInt64();

            InvokeTouchActionEvent(this, id, TouchActionType.Cancelled, touch, false);

        }
    }


    void InvokeTouchActionEvent(TouchRecognizer recognizer, long id, TouchActionType actionType, UITouch touch, bool isInContact)
    {
        // convert the touch location to coordinates relative to the attached view,
        // then raise the cross-platform event
        var cgPoint = touch.LocationInView(recognizer.View);
        var xfPoint = new Point(cgPoint.X, cgPoint.Y);
        OnTouchActionInvoked?.Invoke(this, new TouchActionEventArgs(id, actionType, xfPoint, isInContact));
    }
}

Implementation in Android

public partial class TouchRecognizer : IDisposable
{
    Android.Views.View androidView;
    Func<double, double> fromPixels;
    int[] twoIntArray = new int[2];
    private Point _oldscreenPointerCoords;

    public TouchRecognizer(Android.Views.View view)
    {
        this.androidView = view;
        if (view != null)
        {
            fromPixels = view.Context.FromPixels;
            view.Touch += OnTouch;
        }
    }

    public partial void Dispose()
    {
        androidView.Touch -= OnTouch;
    }

    void OnTouch(object sender, Android.Views.View.TouchEventArgs args)
    {
        var senderView = sender as Android.Views.View;
        var motionEvent = args.Event;
        var pointerIndex = motionEvent.ActionIndex;
        var id = motionEvent.GetPointerId(pointerIndex);
        senderView.GetLocationOnScreen(twoIntArray);
        var screenPointerCoords = new Point(twoIntArray[0] + motionEvent.GetX(pointerIndex),
                                                twoIntArray[1] + motionEvent.GetY(pointerIndex));


        switch (args.Event.ActionMasked)
        {
            case MotionEventActions.Down:
            case MotionEventActions.PointerDown:
                InvokeTouchActionEvent(this, id, TouchActionType.Pressed, screenPointerCoords, true);
                break;

            case MotionEventActions.Move:
                // a Move event can report several pointers; handle each one
                for (pointerIndex = 0; pointerIndex < motionEvent.PointerCount; pointerIndex++)
                {
                    id = motionEvent.GetPointerId(pointerIndex);

                    senderView.GetLocationOnScreen(twoIntArray);
                    screenPointerCoords = new Point(twoIntArray[0] + motionEvent.GetX(pointerIndex),
                                                    twoIntArray[1] + motionEvent.GetY(pointerIndex));

                    if (IsOutPit(senderView, screenPointerCoords))
                    {
                        // the pointer has left the view bounds; report Exited once
                        if (_oldscreenPointerCoords != default)
                        {
                            InvokeTouchActionEvent(this, id, TouchActionType.Exited, screenPointerCoords, true);
                            _oldscreenPointerCoords = default;
                        }
                    }
                    else
                    {
                        // report Moved only when the position actually changed
                        if (_oldscreenPointerCoords == default
                            || screenPointerCoords != _oldscreenPointerCoords)
                        {
                            _oldscreenPointerCoords = screenPointerCoords;
                            InvokeTouchActionEvent(this, id, TouchActionType.Moved, screenPointerCoords, true);
                        }
                    }
                }
                break;

            case MotionEventActions.Up:
            case MotionEventActions.PointerUp:
                InvokeTouchActionEvent(this, id, TouchActionType.Released, screenPointerCoords, false);
                break;

            case MotionEventActions.Cancel:

                InvokeTouchActionEvent(this, id, TouchActionType.Cancelled, screenPointerCoords, false);
                break;
        }
    }

    // returns true when the screen-space pointer position is outside the sender view's bounds
    private bool IsOutPit(Android.Views.View senderView, Point screenPointerCoords)
    {
        return screenPointerCoords.X < twoIntArray[0] || screenPointerCoords.Y < twoIntArray[1]
            || screenPointerCoords.X > twoIntArray[0] + senderView.Width
            || screenPointerCoords.Y > twoIntArray[1] + senderView.Height;
    }

    void InvokeTouchActionEvent(TouchRecognizer touchEffect, int id, TouchActionType actionType, Point pointerLocation, bool isInContact)
    {
        // convert the screen position back to view-relative, device-independent coordinates
        touchEffect.androidView.GetLocationOnScreen(twoIntArray);
        double x = pointerLocation.X - twoIntArray[0];
        double y = pointerLocation.Y - twoIntArray[1];
        var point = new Point(fromPixels(x), fromPixels(y));
        OnTouchActionInvoked?.Invoke(this, new TouchActionEventArgs(id, actionType, point, isInContact));
    }

}

Implementation in Windows

public partial class TouchRecognizer : IDisposable
{
    FrameworkElement windowsView;

    public TouchRecognizer(FrameworkElement view)
    {
        this.windowsView = view;
        if (this.windowsView != null)
        {
            this.windowsView.PointerEntered += View_PointerEntered;
            this.windowsView.PointerPressed += View_PointerPressed;
            this.windowsView.Tapped += View_Tapped;
            this.windowsView.PointerMoved += View_PointerMoved;
            this.windowsView.PointerReleased += View_PointerReleased;
            this.windowsView.PointerExited += View_PointerExited;
            this.windowsView.PointerCanceled += View_PointerCancelled;
        }
    }

    public partial void Dispose()
    {
        windowsView.PointerEntered -= View_PointerEntered;
        windowsView.PointerPressed -= View_PointerPressed;
        windowsView.Tapped -= View_Tapped;
        windowsView.PointerMoved -= View_PointerMoved;
        windowsView.PointerReleased -= View_PointerReleased;
        windowsView.PointerExited -= View_PointerExited;
        windowsView.PointerCanceled -= View_PointerCancelled;
    }
    private void View_Tapped(object sender, TappedRoutedEventArgs args)
    {
        //var windowsPoint = args.GetPosition(sender as UIElement);
        //Point point = new Point(windowsPoint.X, windowsPoint.Y);
        //InvokeTouchActionEvent(TouchActionType.Pressed, point, 0, true);

    }
    private void View_PointerEntered(object sender, PointerRoutedEventArgs args)
    {
        Point point = GetPoint(sender, args);
        var id = args.Pointer.PointerId;
        var isInContact = args.Pointer.IsInContact;
        InvokeTouchActionEvent(TouchActionType.Entered, point, id, isInContact);
    }

    private void View_PointerPressed(object sender, PointerRoutedEventArgs args)
    {
        Point point = GetPoint(sender, args);
        var id = args.Pointer.PointerId;
        var isInContact = args.Pointer.IsInContact;
        InvokeTouchActionEvent(TouchActionType.Pressed, point, id, isInContact);
        // capture the pointer so Moved/Released keep arriving even if it leaves the element
        (sender as FrameworkElement).CapturePointer(args.Pointer);
    }

    private void View_PointerMoved(object sender, PointerRoutedEventArgs args)
    {
        Point point = GetPoint(sender, args);
        var id = args.Pointer.PointerId;
        var isInContact = args.Pointer.IsInContact;
        InvokeTouchActionEvent(TouchActionType.Moved, point, id, isInContact);
    }

    private void View_PointerReleased(object sender, PointerRoutedEventArgs args)
    {
        Point point = GetPoint(sender, args);
        var id = args.Pointer.PointerId;
        var isInContact = args.Pointer.IsInContact;
        InvokeTouchActionEvent(TouchActionType.Released, point, id, isInContact);
    }

    private void View_PointerExited(object sender, PointerRoutedEventArgs args)
    {
        Point point = GetPoint(sender, args);
        var id = args.Pointer.PointerId;
        var isInContact = args.Pointer.IsInContact;
        InvokeTouchActionEvent(TouchActionType.Exited, point, id, isInContact);
    }

    private void View_PointerCancelled(object sender, PointerRoutedEventArgs args)
    {
        Point point = GetPoint(sender, args);
        var id = args.Pointer.PointerId;
        var isInContact = args.Pointer.IsInContact;
        InvokeTouchActionEvent(TouchActionType.Cancelled, point, id, isInContact);
    }

    private void InvokeTouchActionEvent(TouchActionType touchActionType, Point point, uint id, bool isInContact)
    {
        OnTouchActionInvoked?.Invoke(this, new TouchActionEventArgs(id, touchActionType, point, isInContact));

    }

    private static Point GetPoint(object sender, PointerRoutedEventArgs args)
    {
        var pointerPoint = args.GetCurrentPoint(sender as UIElement);
        Windows.Foundation.Point windowsPoint = pointerPoint.Position;
        Point point = new Point(windowsPoint.X, windowsPoint.Y);
        return point;
    }
}

Create the control

Create a gesture monitoring control TouchContentView, which inherits from ContentView.

Note: try to avoid calling ViewHandler.ViewMapper.AppendToMapping here (for example in the constructor); ViewMapper is the mapper shared by every view, so the customization would be applied to every IView virtual view in the page, starting from the XAML root element and recursing through all of its children.

Instead, we use HandlerChanged and HandlerChanging to track handler changes. When HandlerChanging fires with a non-null OldHandler, the existing native view is about to be detached from the cross-platform control, and that is when we must unhook the TouchRecognizer to avoid memory leaks.

public class TouchContentView : ContentView
{
    private TouchRecognizer touchRecognizer;

    public event EventHandler<TouchActionEventArgs> OnTouchActionInvoked;



    public TouchContentView()
    {
        this.HandlerChanged += TouchContentView_HandlerChanged;
        this.HandlerChanging += TouchContentView_HandlerChanging;
    }


    private void TouchContentView_HandlerChanged(object sender, EventArgs e)
    {

        var handler = this.Handler;
        if (handler != null)
        {
#if WINDOWS
            touchRecognizer = new TouchRecognizer(handler.PlatformView as Microsoft.UI.Xaml.FrameworkElement);
            touchRecognizer.OnTouchActionInvoked += TouchRecognizer_OnTouchActionInvoked;
#endif
#if ANDROID
            touchRecognizer = new TouchRecognizer(handler.PlatformView as Android.Views.View);
            touchRecognizer.OnTouchActionInvoked += TouchRecognizer_OnTouchActionInvoked;

#endif

#if IOS || MACCATALYST
            touchRecognizer = new TouchRecognizer(handler.PlatformView as UIKit.UIView);
            touchRecognizer.OnTouchActionInvoked += TouchRecognizer_OnTouchActionInvoked;

            // on iOS the recognizer is a UIGestureRecognizer and must be added to the native view
            (handler.PlatformView as UIKit.UIView).UserInteractionEnabled = true;
            (handler.PlatformView as UIKit.UIView).AddGestureRecognizer(touchRecognizer);
#endif
        }

    }

    private void TouchContentView_HandlerChanging(object sender, HandlerChangingEventArgs e)
    {


        if (e.OldHandler != null)
        {
            var handler = e.OldHandler;

#if WINDOWS
            touchRecognizer.OnTouchActionInvoked -= TouchRecognizer_OnTouchActionInvoked;
            touchRecognizer?.Dispose();    // detach the Pointer* handlers from the old native view
#endif
#if ANDROID
            touchRecognizer.OnTouchActionInvoked -= TouchRecognizer_OnTouchActionInvoked;
            touchRecognizer?.Dispose();    // detach the Touch handler from the old native view
#endif

#if IOS || MACCATALYST
            touchRecognizer.OnTouchActionInvoked -= TouchRecognizer_OnTouchActionInvoked;

            (handler.PlatformView as UIKit.UIView).UserInteractionEnabled = false;
            (handler.PlatformView as UIKit.UIView).RemoveGestureRecognizer(touchRecognizer);
#endif


        }
    }

    private void TouchRecognizer_OnTouchActionInvoked(object sender, TouchActionEventArgs e)
    {
        OnTouchActionInvoked?.Invoke(this, e);
        Debug.WriteLine(e.Type + " is Invoked, position:" + e.Location);
    }
}

Use the control

Reference the namespace that contains TouchContentView in XAML:

xmlns:controls="clr-namespace:Lession2.TouchRecognizer;assembly=Lession2"

Wrap your content in a TouchContentView, then listen to its OnTouchActionInvoked event.
Note: clickable controls such as Button swallow the touch input themselves, so if a Button is wrapped, the OnTouchActionInvoked event will not be raised.

<controls:TouchContentView Style="{StaticResource HoldDownButtonStyle}"
                            
                            Grid.Column="0"
                            OnTouchActionInvoked="TouchContentView_OnTouchActionInvoked">
    <BoxView CornerRadius="10" Color="Red"></BoxView>

</controls:TouchContentView>


<controls:TouchContentView Style="{StaticResource HoldDownButtonStyle}"

                            Grid.Column="1"
                            OnTouchActionInvoked="TouchContentView_OnTouchActionInvoked">
    <Image Source="./dotnet_bot.svg"></Image>

</controls:TouchContentView>


<controls:TouchContentView Style="{StaticResource HoldDownButtonStyle}"

                            Grid.Column="2"
                            OnTouchActionInvoked="TouchContentView_OnTouchActionInvoked">
    <Label Text="Pretend I am a button"></Label>

</controls:TouchContentView>
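With the touch events flowing, the long press described earlier is just a timing check in the page's code-behind. Here is a hedged sketch of such a handler; the 500 ms threshold and the per-control dictionary are choices made for this sketch, not part of the control:

// members of the page's code-behind class
const int LongPressThresholdMs = 500;
readonly Dictionary<object, DateTime> _pressedAt = new();

void TouchContentView_OnTouchActionInvoked(object sender, TouchActionEventArgs e)
{
    switch (e.Type)
    {
        case TouchActionType.Pressed:
            _pressedAt[sender] = DateTime.UtcNow;      // remember when this control was pressed
            break;

        case TouchActionType.Released:
            if (_pressedAt.TryGetValue(sender, out var start))
            {
                _pressedAt.Remove(sender);
                if ((DateTime.UtcNow - start).TotalMilliseconds >= LongPressThresholdMs)
                    Debug.WriteLine($"Long press detected on {sender}");
            }
            break;

        case TouchActionType.Cancelled:
        case TouchActionType.Exited:
            _pressedAt.Remove(sender);                  // a cancelled or exited touch is not a long press
            break;
    }
}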

Final effect

The gesture listener is now attached to each wrapped control, and every touch action is reported through OnTouchActionInvoked as it happens.


Project address

GitHub: maui-learning
