Android 12 Has Improved Gestures Using Machine Learning

Feb 22, 2021 News


The Android 12 developer preview was officially released recently, and developers can now try the new system's features ahead of everyone else. Android 12 adds support for the AVIF image format, introduces compatible transcoding to the AVC video codec, and refines the notification interface to make it look cleaner.
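From an app's point of view, AVIF support plugs into the existing image decoding path. The minimal sketch below assumes an AVIF file is already on disk and simply hands it to ImageDecoder, which on Android 12 can decode AVIF the same way it decodes JPEG or PNG; the function name and file argument are illustrative.

```kotlin
import android.graphics.Bitmap
import android.graphics.ImageDecoder
import java.io.File

// Decode an AVIF file into a Bitmap using the standard ImageDecoder API.
// On Android 12 the platform decoder recognizes AVIF sources automatically,
// so no format-specific code is needed.
fun decodeAvif(file: File): Bitmap {
    val source = ImageDecoder.createSource(file)
    return ImageDecoder.decodeBitmap(source)
}
```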

According to Tom's Guide, Android 10 was the first release to support the back gesture, but the experience still has problems in practice. Because most current phones use full screens and have dropped physical or on-screen navigation buttons, going back means swiping inward from the screen's left edge. That swipe, however, can accidentally trigger controls and features inside an app.
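Today, apps that keep interactive controls near the edge typically work around this conflict themselves with the system gesture exclusion API that has existed since Android 10. The sketch below shows the general idea, assuming a single view whose area should receive the swipe instead of the back gesture; the helper name is illustrative.

```kotlin
import android.graphics.Rect
import android.view.View

// Ask the system not to interpret swipes over this view as the back gesture.
// Exclusion rects are given in the view's own coordinate space, so they are
// refreshed whenever the view's size changes.
fun excludeFromBackGesture(view: View) {
    view.addOnLayoutChangeListener { v, left, top, right, bottom, _, _, _, _ ->
        v.systemGestureExclusionRects =
            listOf(Rect(0, 0, right - left, bottom - top))
    }
}
```

The system caps how much of the edge an app may exclude, which is part of why a smarter, system-side solution is attractive.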

Google has previously acknowledged the problem. According to a developer on the XDA forums, the Android 12 preview appears to use machine learning to predict the user's intent and avoid accidental triggers. The developer, Quinny899, found a file in a TensorFlow Lite module listing roughly 43,000 applications, and suspects it is the list of apps Google used to train the machine learning model. The system detects where a swipe starts and ends, looks for a pattern, and decides whether the user meant to go back or to interact with the app. The Android 12 system UI confirms that it includes this "smart gesture prediction" feature; once it is turned on, the system judges gesture operations based on multiple signals.
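To make the reported mechanism concrete, the sketch below shows the kind of inference a TensorFlow Lite classifier of this sort could perform: a handful of per-swipe features (start and end position, velocity, and so on) go in, and a probability that the swipe was meant as "back" comes out. The model file, feature layout, and threshold are assumptions for illustration, not Google's actual implementation.

```kotlin
import org.tensorflow.lite.Interpreter
import java.io.File

// Hypothetical swipe classifier built on the public TensorFlow Lite
// Interpreter API. The model is assumed to take a single feature vector
// and return one probability.
class BackGestureClassifier(modelFile: File) {
    private val interpreter = Interpreter(modelFile)

    fun isLikelyBackGesture(features: FloatArray): Boolean {
        val input = arrayOf(features)        // shape [1, numFeatures]
        val output = arrayOf(FloatArray(1))  // shape [1, 1] -> probability of "back"
        interpreter.run(input, output)
        return output[0][0] > 0.5f
    }
}
```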

IT Home understands that this feature is currently disabled by default in Android 12. The developer said it is turned off only because the system is still in its initial beta stage; once Android 12 is stable enough for public testing, smart gesture prediction is expected to be enabled by default.



