Analysis of the Linux Input Subsystem, Part One: Software Levels

Reposted from: https://blog.csdn.net/yueqian_scut/article/details/47903853



Input and output are the means by which users interact with a product, so input driver development is very common in Linux driver work. At the same time, the layered architecture of the input subsystem is representative of good Linux driver design, which makes an in-depth analysis of the Linux input subsystem well worthwhile.

1. Knowledge points of input subsystem

A complete analysis of the input subsystem covers the following aspects:

1) Software level

2) Input subsystem layering (input_handler, input_core, input_device)

3) Input device (TS) driver development

4) evdev handler analysis

5) Input device model view (sysfs) and running image (procfs)

6) tslib analysis

7) Event processing analysis of application framework

The author has always advocated that embedded developers cultivate, as much as possible, an overall view of the Linux software architecture. This article analyzes, from the perspective of requirements, the roles and functions of the application-level and kernel-level modules involved in Linux input, so that developers gain a clear picture of the entire software stack a Linux input driver participates in. The other knowledge points will be explained in later articles, so stay tuned.

2. Software level analysis

The software hierarchy involved in Linux input is as follows:


Application frameworks built on the Linux kernel commonly include Android and QT. Since Android 4.2 made major changes to touch screen support (pushing the work formerly done by tslib down into the driver layer), we use the relatively simple QT framework to illustrate the Linux input call flow. Assume the following QT-based address book application scenario; we focus on analyzing the input response path of its "query" button control.


1. APP is the address book application: the user enters the first letters of a name's pinyin, then clicks "query" to output the result (name and phone number). The APP is built by dragging a Button control onto the form as the "query" button in the QT Creator WYSIWYG visual development environment. The APP does not need to handle how the button is drawn, nor how the user presses it; all it has to do is respond to the button's click event, that is, query the name database by the name's initials and output the result to the result box.

2. The QT application framework takes care of everything the APP does not need to do itself: it encapsulates input events, dispatches them to the target control, updates the control's state and on-screen image, and of course also performs window management. The application screen above is an LCD display page whose coordinate origin is at the upper-left corner, as shown in the figure below:


For the QT application framework, when the user taps the screen, QT's event processing receives a coordinate event: the contact point (x, y) relative to the LCD origin. QT analyzes the coordinates and determines which control's display area they fall into within the currently focused window. The "query" button occupies a rectangular area in the window, defined by its upper-left and lower-right corner points. If the touch coordinates fall within this range, QT dispatches a click event to the "query" button control, then invokes, via a callback, the query logic written during APP development and outputs the result to the result box.

3. tslib, as the name suggests, is an intermediate library for touch screen (TS) scenarios. It mainly performs debouncing, filtering and calibration of raw touch coordinates; its core function is the linear conversion from the touch panel coordinate system to the LCD display coordinate system. The touch screen of an electronic product generally consists of two parts, a touch panel and a display panel, with the touch panel laid over the display; touch panels come in resistive and capacitive varieties, both based on voltage division. In the TS driver implementation here, the coordinate origin of the touch panel is the lower-left corner of the screen; that is, the driver hands tslib coordinates with a lower-left origin, while tslib must provide the QT application framework with coordinates with an upper-left origin, so tslib has to convert between the two coordinate systems. As shown below:


In addition, the resolution of the touch panel and the resolution of the LCD display may differ, so the coordinate conversion must also take resolution into account.

4. C library. tslib must access the kernel driver through the standard C interfaces open and read, and these interfaces ultimately use the syscall instruction to trap into kernel mode.

5. VFS. Through the syscall layer, open, read and the other interfaces eventually reach vfs_open, vfs_read and so on. The parameter of open is the input device file name, such as /dev/input/event1. vfs_open looks up the inode corresponding to the device file through the dentry cache, determines that the file is a character device file, and hands it over to the chardev_open path of the character device driver framework. That path finally obtains the file_operations that the input-core layer defines for the input subsystem (major device number 13), wraps it into the process's file structure, and returns the corresponding file handle fd to the application layer. read then simply forwards the access through this file_operations.

6. Character device driver framework layer. chardev_open reads the major device number 13 from the inode of the device file, and looks up in the global character device table cdev_map the file_operations registered for major 13, namely the input_fops that the input subsystem registered with the system at initialization. As shown below:


The open method of input_fops is responsible for routing access to the different kinds of input devices, such as touch screens, buttons and so on. input_fops is an integral part of input-core.

7. Input subsystem. The input subsystem abstracts Linux input device drivers into three layers: the input core layer, the input event handling layer, and the input device driver layer. As shown below:


1) All input devices share major device number 13 and are classified by minor device number, as shown in the figure below:


2) The application layer does not care about the input device driver layer; it only cares about the input event handling layer, that is, about which type of input device it is dealing with.

3) At system initialization, the event handling layer (input-handler) registers itself with input-core, telling input-core which types of underlying device drivers it is able to handle. For example, the evdev handler (evdev.c) can handle touch screen and button drivers.

4) When an input device driver (input-device) registers with input-core, input-core matches it with a suitable event handling layer (input-handler), associates the two through an input_handle, and, under the control of the input-handler, creates the character device file accessed by users. For example, the minor device number corresponding to /dev/input/event0 is 64.

8. Input core layer. Building on the analysis above, we continue with input-core's input_open_file. If the device file name is /dev/input/event0, its minor device number is 64; input_open_file uses the minor number to find the corresponding event handler in the event handling layer and hands the open over to that handler for management.

9. Input event handling layer. From the minor device number, the event handler computes the offset (here 0) of the input driver in its device array evdev_table, obtains the corresponding input_handle, and finally finds the associated input device driver (input-device).

10. Input device driver layer. The open of the input device driver layer performs hardware initialization and related work.

11. Starting from the VFS layer, we have focused on the driver's open path. Next, let's analyze how touch screen messages are generated and read.

1) As one might expect, the QT application framework has a high-priority thread that keeps waiting to read touch and key messages; when no message is available, the thread sleeps.

2) When the user touches the screen, the touch panel raises an external hardware interrupt. In the input device driver's interrupt handling, the touch coordinates are read and reported to the corresponding input-handler. After the input-handler filters the message (discarding repeats and so on), it signals a semaphore to wake up the thread that opened the handler, and places the touch message into that thread's message queue.

3) The QT application framework thread reads the message through the tslib interface, finally obtains the coordinates in the LCD coordinate system, and performs the subsequent processing.

 

Remarks: the topics of input subsystem layering (input_handler, input_core, input_device), input device (TS) driver development, evdev handler analysis, the input device model view (sysfs) and runtime image (procfs), tslib mechanism analysis, and event processing in the application framework will be covered in later articles. Thank you!

 

At the same time, note that a touch screen may use an I2C bus interface, a UART interface, a USB interface and so on. Therefore, the touch screen device driver layer not only exists as an input_device, but may also exist as, say, an I2C device, and analyzing an I2C device driver requires studying the I2C subsystem. Keep working hard~~
