flutter interactional widget


GitHub restricts the size of images. If the images fail to load, please view the original article on Juejin (Nuggets): "Take it! Flutter imitates the naked-eye 3D effect of the Ziru App".

Introduction

I recently saw that the Ziru team published an article on how they achieved the naked-eye 3D effect in the Ziru app. The effect is really well done and a lot of fun; the more you play with it, the more addictive it gets. Thanks to the Ziru team for sharing. I then used Flutter to implement it again according to my own ideas to produce the final effect.

The apk can be downloaded and run directly: https://github.com/fluttercandies/flutter_interactional_widget/blob/main/app-release.apk

This article focuses on my thoughts and design during the implementation, so whether you are a front-end, iOS, Android, or Flutter developer, you can follow the same approach to implement it. If you have any questions, feel free to discuss them.


1. The overall idea

From the effect you can see that as the device rotates, some parts of the picture slide in the direction of the tilt, some move in the opposite direction, and some do not move at all. Therefore, the elements of the picture must be split into different layers, and the effect is achieved by moving those layers as the device rotates.

The picture is divided into three layers: foreground, middle, and background. As the phone rotates, the middle layer remains motionless, the foreground layer moves in the direction of rotation, and the background layer moves opposite to the foreground.

 (Picture from the Ziru team's article)

So once the image is split into layers, achieving the effect comes down to two steps:

1. Get the rotation information of the phone

2. Move different layers according to the rotation information


2. Get the rotation information of the phone

Flutter has a plugin called sensors_plus that gives us access to two sensors: the accelerometer and the gyroscope.

(Image: Sensor.gif)

Each sensor provides a Stream whose events contain X, Y, and Z values indicating the rate of change along the phone's different axes. By listening to the Stream, we can obtain the sensor data in real time.
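For example, a minimal listening sketch for the accelerometer stream (based on the plugin API used in this article; the gyroscope stream used later follows the same pattern):

import 'package:sensors_plus/sensors_plus.dart';

// Print each accelerometer reading as it arrives.
final subscription = accelerometerEvents.listen((AccelerometerEvent event) {
  print('x: ${event.x}, y: ${event.y}, z: ${event.z}');
});

// Cancel the subscription when the data is no longer needed.
// subscription.cancel();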

The repository also includes a Snake game demo: tilt the device and the little snake moves in the direction of the tilt.

(Image: Greedy snake.gif)

For more about this plugin, see the video: Flutter Widgets Introduction Collection: 103. Sensors_plus

Our effect needs to move the layers according to the rotation of the phone, so naturally we use the gyroscope sensor:

gyroscopeEvents.listen(
  (GyroscopeEvent event) {
    // event.x, event.y, event.z
    // ...
  },
);

The GyroscopeEvent in the callback contains three attributes, x, y, and z, which correspond to the rotation speeds detected around the three axes shown in the figure below (unit: radians/second).

(Image: xyz.png, the device's rotation axes)

According to the requirements, we only need the Y-axis data (which corresponds to horizontal movement of the image) and the X-axis data (which corresponds to vertical movement of the image).


3. Move the layers according to the rotation information

I found a PSD file on the Internet. After exporting the image, the whole picture looks like this:

(Image: Cover.png, the exported full picture)

I exported 3 layers from the PSD file. Note that the images should be in .png format so that the transparent areas of an upper layer are not filled with white and do not block the layer below them. Then the images can be displayed directly with the Image widget:

Foreground: fore.png
Middle layer (white text, so invisible against a white background): mid.png
Background: back.png
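As a minimal sketch (assuming the three exported files are registered as Flutter assets under an images/ folder of my own naming):

// Each layer is just an Image widget; the .png transparency lets the lower
// layers show through the upper ones.
final Widget fore = Image.asset('images/fore.png');
final Widget mid = Image.asset('images/mid.png');
final Widget back = Image.asset('images/back.png');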

1. Let the layers move

Since the image is divided into three pictures, we naturally think of using a Stack as the container holding the three layers (Widgets):


Widget? backgroundWidget;
Widget? middleWidget;
Widget? foregroundWidget;

Moving a layer is actually very simple: just modify its offset. But if you observe the effect of this naive implementation, you will find that as we rotate the device, the content of the layer appears to slide out of the frame.

So when the widget is first shown, we should only see part of the picture. My idea is to set a scale for each layer to enlarge the picture; the display window is fixed, so at first you can only see the center of the picture. (The middle layer can skip this: since it does not move, there is no need to enlarge it.)


Rotating the phone modifies the offset, and the foreground and background layers are given opposite offsets to achieve the effect of the two layers moving in opposite directions.
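Putting the scaling and the offsets together, the layered container could look roughly like the sketch below. This is my own illustration rather than the package's actual source; width, height, the three layer widgets, and the two offsets are assumed to be fields of the surrounding State:

// A fixed-size window that clips three scaled, individually offset layers.
Widget buildLayers() {
  return SizedBox(
    width: width,
    height: height,
    child: ClipRect(
      child: Stack(
        fit: StackFit.expand,
        children: [
          if (backgroundWidget != null)
            Transform.translate(
              offset: backgroundOffset,
              child: Transform.scale(
                scale: widget.backgroundScale ?? 1,
                child: backgroundWidget,
              ),
            ),
          if (middleWidget != null)
            Transform.scale(
              scale: widget.middleScale ?? 1,
              child: middleWidget,
            ),
          if (foregroundWidget != null)
            Transform.translate(
              offset: foregroundOffset,
              child: Transform.scale(
                scale: widget.foregroundScale ?? 1,
                child: foregroundWidget,
              ),
            ),
        ],
      ),
    ),
  );
}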

Two factors need to be considered when calculating the offset:

1. The maximum offset of the layer

The layer has been enlarged by a certain proportion, so there is a maximum offset range, and the offset cannot exceed this range.


It is not difficult to see that the maximum offset in the horizontal direction is (scale - 1) * width / 2; the vertical direction works the same way, using the height.
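Expressed in Dart, this could be a small helper like the following (my own sketch; the function and parameter names are illustrative):

import 'dart:ui';

// Maximum distance a layer scaled by [scale] can be shifted inside a window
// of size [width] x [height] before its edge enters the window.
Offset maxOffset(double scale, double width, double height) {
  return Offset((scale - 1) * width / 2, (scale - 1) * height / 2);
}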

2. The relative offset speed of the foreground and background layers

Because the zoom ratios of the foreground and background may differ, offsetting the two at a 1:1 ratio can lead to the following situation.

Suppose the foreground zoom is 1.4 and the background is 1.8. When the display area of the background moves 2 pixels to the left, the foreground moves 2 pixels in the opposite direction. At some point the foreground reaches its maximum offset and can move no further, while parts of the background still cannot be shown. So the offset ratio should be calculated from the two zoom ratios to ensure that both pictures can be displayed completely.


Offset getForegroundOffset(Offset backgroundOffset) {
  // Ratio between the foreground's and the background's extra zoom,
  // so that both layers reach their boundaries at the same time.
  double offsetRate = ((widget.foregroundScale ?? 1) - 1) /
      ((widget.backgroundScale ?? 1) - 1);

  // The foreground moves in the opposite direction of the background.
  return -Offset(
      backgroundOffset.dx * offsetRate, backgroundOffset.dy * offsetRate);
}

Here I use the background offset as the reference to calculate the foreground offset, and I apply the maximum offset range to the background offset before doing so, which ensures that neither the foreground nor the background ever goes out of bounds. First, change the offset by dragging and call setState to update the interface, to check the effect of the layer part:

The background shifts as the finger slides, while the foreground moves in the opposite direction. When a layer reaches its boundary it stops, and the middle layer remains motionless during the entire process.
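The boundary check mentioned above (called considerBoundary in the listener later on) is not spelled out in the article, so here is a minimal sketch of what it could look like, assuming maxBackgroundOffset has been precomputed from the background scale as in the previous step:

// Clamp an offset so the enlarged background layer never shifts far enough
// to expose its edge. maxBackgroundOffset is an assumed precomputed field.
Offset considerBoundary(Offset offset) {
  return Offset(
    offset.dx.clamp(-maxBackgroundOffset.dx, maxBackgroundOffset.dx).toDouble(),
    offset.dy.clamp(-maxBackgroundOffset.dy, maxBackgroundOffset.dy).toDouble(),
  );
}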

2. Sensor control offset

With the layer displacement in place, we only need to change the trigger for the offset from finger sliding to the sensor.

Now consider a question: when the device lies flat, the display area is centered, and when the device tilts, the display area moves.


So by how many degrees should the device be rotated to reach the maximum offset?

So here I have defined two variables:

double maxAngleX;
double maxAngleY;

They indicate the maximum rotation angles in the horizontal and vertical directions, respectively. If maxAngleX is 10, it means that rotating the device 10° in the horizontal direction moves the image all the way to its boundary.

With this definition, we can derive that a 1° rotation shifts the background layer by:

1/maxAngleX * maxBackgroundOffset.dx in the horizontal direction, and likewise in the vertical direction.

That is the idea, but I ran into a thorny problem while implementing it:

Since the sensors_plus plugin only provides the rotation speed (rad/s) around each axis, how do we calculate the actual rotation angle?

In fact, it is not difficult: rotation (in radians) = rotation speed (rad/s) * time. So what is the time here?

Looking at the Android implementation of the sensors_plus plugin: it registers the gyroscope callback through the SensorManager and passes the collected data directly to the Flutter side through a channel.

sensorManager.registerListener(sensorEventListener, sensor, SensorManager.SENSOR_DELAY_NORMAL);

The SensorManager on the Android side supports several sampling rates:

  • SensorManager.SENSOR_DELAY_FASTEST (0 microseconds): fastest. Lowest latency; not recommended unless you need extremely sensitive processing, since raw data is delivered as fast as possible, which can drain the battery and, if not handled well, hurt game logic and UI performance.
  • SensorManager.SENSOR_DELAY_GAME (20,000 microseconds): game. Game-level latency; most real-time games use this level.
  • SensorManager.SENSOR_DELAY_NORMAL (200,000 microseconds): normal. Standard latency, fine for casual puzzle or easy games, but a sampling rate this low may cause frame skipping in some racing games.
  • SensorManager.SENSOR_DELAY_UI (60,000 microseconds): user interface. Generally used for automatic screen rotation; relatively power-saving, and not used in typical game development.

Different sampling rates mean different acquisition intervals. sensors_plus uses SENSOR_DELAY_NORMAL by default, i.e. 0.2 s, which did not feel responsive enough in actual use. So I forked the project and changed SENSOR_DELAY_NORMAL to SENSOR_DELAY_GAME, i.e. an acquisition interval of 20,000 microseconds (0.02 s).

Converted to an angle, each reading contributes x * 0.02 * 180 / π degrees. The angle is then converted into a background offset; after applying the maximum offset range to the background offset, the foreground offset is calculated from it and setState updates the interface. The key steps are as follows:

gyroscopeEvents.listen((event) {
  setState(() {
    // Convert the measured rotation speed into a background offset delta.
    Offset deltaOffset = gyroscopeToOffset(-event.y, -event.x);
    // Add the delta to the current offset and clamp it to the boundary.
    backgroundOffset = considerBoundary(deltaOffset + backgroundOffset);
    // Derive the foreground offset from the background offset via the zoom ratio.
    foregroundOffset = getForegroundOffset(backgroundOffset);
  });
});
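gyroscopeToOffset is not shown in full either. Combining the angle formula above with the maxAngleX/maxAngleY definition, it could look roughly like this sketch, where maxAngleX, maxAngleY, and maxBackgroundOffset are assumed fields of the State and 0.02 s is the SENSOR_DELAY_GAME sampling interval:

import 'dart:math' as math;

// Convert gyroscope readings (rad/s) into a background offset delta.
// The first argument is the rotation around the Y axis (horizontal movement),
// the second the rotation around the X axis (vertical movement), matching the
// listener above.
Offset gyroscopeToOffset(double horizontalSpeed, double verticalSpeed) {
  const double interval = 0.02; // SENSOR_DELAY_GAME sampling interval in seconds
  final double angleX = horizontalSpeed * interval * 180 / math.pi;
  final double angleY = verticalSpeed * interval * 180 / math.pi;
  return Offset(
    angleX / maxAngleX * maxBackgroundOffset.dx,
    angleY / maxAngleY * maxBackgroundOffset.dy,
  );
}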

4. Constructor description

InteractionalWidget

Attribute                   Description                         Required
double width                Window width                        Yes
double height               Window height                       Yes
double maxAngleX            Maximum horizontal rotation angle   Yes
double maxAngleY            Maximum vertical rotation angle     Yes
double? backgroundScale     Background layer zoom ratio         No
double? middleScale         Middle layer zoom ratio             No
double? foregroundScale     Foreground layer zoom ratio         No
Widget? backgroundWidget    Background layer widget             No
Widget? middleWidget        Middle layer widget                 No
Widget? foregroundWidget    Foreground layer widget             No

None of the three layers is mandatory, so you can also apply the displacement to just a single foreground or background layer.
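Based on the table above, a typical usage might look like the following sketch (the asset paths and numeric values are illustrative, not taken from the demo):

InteractionalWidget(
  width: 300,
  height: 400,
  maxAngleX: 10,
  maxAngleY: 10,
  backgroundScale: 1.8,
  foregroundScale: 1.4,
  backgroundWidget: Image.asset('images/back.png', fit: BoxFit.cover),
  middleWidget: Image.asset('images/mid.png', fit: BoxFit.cover),
  foregroundWidget: Image.asset('images/fore.png', fit: BoxFit.cover),
)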

The package has been uploaded to pub and can be used by adding it as a dependency.

All the code has been uploaded to GitHub, and the demo apk can be downloaded and run directly. The repository will also be updated with more interactive widgets. Please give it a like, a follow, and a star.


5. Finally

I originally planned to write about network programming next, but when I saw the article on the Ziru app's naked-eye 3D effect halfway through, I took advantage of the weekend to implement it. Thanks again to the Ziru team for providing such a wonderful idea. In the next issue, as originally planned, I will implement a basic LAN multi-client group chat service through broadcast/multicast.

If you have any questions, you can contact me through my official account. If this article inspired you, I hope to get your likes, follows, and bookmarks; that is the biggest motivation for my continued writing. Thanks~

Official account: the most detailed Flutter advancement and optimization guides are organized and collected in "The Attacking Flutter" or "runflutter"; you are welcome to follow.

Download flutter interactional widget source code on GitHub


https://github.com/fluttercandies/flutter_interactional_widget