Android: Imitating the Firefly Video Desktop with a Cool LiveWallpaper

This article was originally published on my WeChat public account hongyangAndroid.
Please credit the source when reprinting:
http://blog.csdn.net/lmj623565791/article/details/72170299
This article comes from Zhang Hongyang's blog.

1. Overview

Last week, my WeChat public account pushed an article about achieving a "transparent screen" on Android. I found it very interesting and immediately contacted the author to ask for permission to share it~~


The reason for my interest is that I am a heavy user of Neihan Duanzi, and some time ago my feed was flooded with an app called Firefly Video Desktop (it uses a video as the desktop wallpaper). So when I read the author's code and saw SurfaceHolder, I immediately thought it could be used to play a video and reproduce that video-desktop effect. I tried it over the weekend, and it turned out to be really simple.

So many thanks to the author of the "transparent screen" on Android article; part of the code here is adapted from the transparent-camera demo they provided:

https://github.com/songixan/Wallpaper

The effect looks like this:

Note: the test device for this article is a Xiaomi 5s. There may be compatibility issues on other phones; try to work around them, and feel free to leave a comment.

2. Implementation

(1) Configuration

First, write an XML file that describes the wallpaper: its thumbnail, description, settingsActivity, and so on. For simplicity, only the thumbnail is set here. Save it as res/xml/livewallpaper.xml; it will be referenced from the manifest below.

<?xml version="1.0" encoding="utf-8"?>
<wallpaper xmlns:android="http://schemas.android.com/apk/res/android"
    android:thumbnail="@mipmap/ic_launcher" />
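
If you also want a description and a settings page, they are declared in the same file. The string resource and settings Activity names below are placeholders for illustration, not part of this demo:

<?xml version="1.0" encoding="utf-8"?>
<wallpaper xmlns:android="http://schemas.android.com/apk/res/android"
    android:thumbnail="@mipmap/ic_launcher"
    android:description="@string/wallpaper_desc"
    android:settingsActivity="com.zhy.livewallpaper.WallpaperSettingsActivity" />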

(2) Writing the code

A wallpaper has to stay on screen all the time, and behind the scenes it is actually a Service. So to implement a live wallpaper we inherit from WallpaperService and implement its abstract method onCreateEngine, as follows:

public class VideoLiveWallpaper extends WallpaperService {
    @Override
    public Engine onCreateEngine() {
        return new VideoEngine();
    }
    //...
}

As you can see, the return value is an Engine. Engine is an inner class of WallpaperService, and it contains methods such as onSurfaceCreated, onSurfaceChanged, onSurfaceDestroyed, onTouchEvent and so on. These methods should immediately remind you of SurfaceView; for background on SurfaceView, see my earlier articles on the topic.

Also, do you remember how to play video on Android?

The conventional approach is VideoView; the other is MediaPlayer plus SurfaceView. Today's example works like the latter.
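
For comparison, the VideoView route looks roughly like the sketch below (placed in an Activity's onCreate; the id and the raw resource name are assumptions for illustration, not part of this demo):

// Minimal VideoView playback sketch (id_video_view and R.raw.test1 are placeholders)
VideoView videoView = (VideoView) findViewById(R.id.id_video_view);
videoView.setVideoURI(Uri.parse("android.resource://" + getPackageName() + "/" + R.raw.test1));
videoView.setOnPreparedListener(new MediaPlayer.OnPreparedListener() {
    @Override
    public void onPrepared(MediaPlayer mp) {
        mp.setLooping(true); // loop, like the wallpaper will
    }
});
videoView.start();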

All we need to do is let MediaPlayer decode the video and keep feeding the frames into the Surface we hand it.

class VideoEngine extends Engine {

    private MediaPlayer mMediaPlayer;

    @Override
    public void onSurfaceCreated(SurfaceHolder holder) {
        L.d("VideoEngine#onSurfaceCreated ");
        super.onSurfaceCreated(holder);
        mMediaPlayer = new MediaPlayer();
        mMediaPlayer.setSurface(holder.getSurface());
        try {
            AssetManager assetMg = getApplicationContext().getAssets();
            AssetFileDescriptor fileDescriptor = assetMg.openFd("test1.mp4");
            mMediaPlayer.setDataSource(fileDescriptor.getFileDescriptor(),
                    fileDescriptor.getStartOffset(), fileDescriptor.getLength());
            mMediaPlayer.setLooping(true);
            mMediaPlayer.setVolume(0, 0);
            mMediaPlayer.prepare();
            mMediaPlayer.start();

        } catch (IOException e) {
            e.printStackTrace();
        }

    }

    @Override
    public void onVisibilityChanged(boolean visible) {
        L.d("VideoEngine#onVisibilityChanged visible = " + visible);
        if (visible) {
            mMediaPlayer.start();
        } else {
            mMediaPlayer.pause();
        }
    }

    @Override
    public void onSurfaceDestroyed(SurfaceHolder holder) {
        L.d("VideoEngine#onSurfaceDestroyed ");
        super.onSurfaceDestroyed(holder);
        mMediaPlayer.release();
        mMediaPlayer = null;

    }
}

The code is very simple: we initialize mMediaPlayer in onSurfaceCreated, and the key call is setSurface(holder.getSurface()). The video is muted by default with setVolume(0, 0).

In onVisibilityChanged we pause playback when the wallpaper is not visible, and resume it when the user returns to the desktop.

Resources are released in onSurfaceDestroyed~~
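
One caveat: prepare() blocks the calling thread until the video is ready, which can be noticeable for larger files. Below is a sketch of the same onSurfaceCreated body using the asynchronous variant instead (same test1.mp4 asset assumed):

// Asynchronous alternative (sketch): prepareAsync() + OnPreparedListener
mMediaPlayer = new MediaPlayer();
mMediaPlayer.setSurface(holder.getSurface());
try {
    AssetFileDescriptor afd = getApplicationContext().getAssets().openFd("test1.mp4");
    mMediaPlayer.setDataSource(afd.getFileDescriptor(),
            afd.getStartOffset(), afd.getLength());
    mMediaPlayer.setLooping(true);
    mMediaPlayer.setVolume(0, 0);
    mMediaPlayer.setOnPreparedListener(new MediaPlayer.OnPreparedListener() {
        @Override
        public void onPrepared(MediaPlayer mp) {
            mp.start(); // start only once the player is actually ready
        }
    });
    mMediaPlayer.prepareAsync(); // does not block the wallpaper's thread
} catch (IOException e) {
    e.printStackTrace();
}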

With that, our VideoLiveWallpaper is written. Don't forget that it is a Service and needs to be registered in the manifest:

<service
    android:name=".VideoLiveWallpaper"
    android:label="@string/app_name"
    android:permission="android.permission.BIND_WALLPAPER"
    android:process=":wallpaper">
    <!-- intent-filter for the live wallpaper service -->
    <intent-filter>
        <action android:name="android.service.wallpaper.WallpaperService" />
    </intent-filter>
    <!-- meta-data pointing to the wallpaper description xml -->
    <meta-data
        android:name="android.service.wallpaper"
        android:resource="@xml/livewallpaper" />
</service>

(3) Setting it as the wallpaper

Once it is registered, we add a button in MainActivity that sets the wallpaper as the desktop background. The calling code is as follows:

public static void setToWallPaper(Context context) {
    final Intent intent = new Intent(WallpaperManager.ACTION_CHANGE_LIVE_WALLPAPER);
    intent.putExtra(WallpaperManager.EXTRA_LIVE_WALLPAPER_COMPONENT,
            new ComponentName(context, VideoLiveWallpaper.class));
    context.startActivity(intent);
}
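
Note that ACTION_CHANGE_LIVE_WALLPAPER requires API 16 and is not handled by every launcher. A slightly more defensive variant (my own sketch, not from the original demo) falls back to the system live wallpaper chooser when the direct intent cannot be resolved:

public static void setToWallPaper(Context context) {
    Intent intent = new Intent(WallpaperManager.ACTION_CHANGE_LIVE_WALLPAPER);
    intent.putExtra(WallpaperManager.EXTRA_LIVE_WALLPAPER_COMPONENT,
            new ComponentName(context, VideoLiveWallpaper.class));
    if (intent.resolveActivity(context.getPackageManager()) == null) {
        // No handler for the direct intent: open the generic live wallpaper picker
        intent = new Intent(WallpaperManager.ACTION_LIVE_WALLPAPER_CHOOSER);
    }
    context.startActivity(intent);
}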

This completes the basic version of the code~~

(4) Adding support for parameters

We just muted the video by default, but sometimes we may want to control the video desktop's parameters dynamically. Strictly speaking a settingsActivity is the intended mechanism, but I think a broadcast also fits well here: after all, this is just communication between the Service (which runs in a separate :wallpaper process here) and the Activity~~

Here we add a CheckBox that turns the sound on or off.

public static final String VIDEO_PARAMS_CONTROL_ACTION = "com.zhy.livewallpaper";
public static final String KEY_ACTION = "action";
public static final int ACTION_VOICE_SILENCE = 110;
public static final int ACTION_VOICE_NORMAL = 111;

class VideoEngine extends Engine {
    // other code omitted
    private BroadcastReceiver mVideoParamsControlReceiver;

    @Override
    public void onCreate(SurfaceHolder surfaceHolder) {
        super.onCreate(surfaceHolder);
        IntentFilter intentFilter = new IntentFilter(VIDEO_PARAMS_CONTROL_ACTION);
        registerReceiver(mVideoParamsControlReceiver = new BroadcastReceiver() {
            @Override
            public void onReceive(Context context, Intent intent) {
                L.d("onReceive");
                int action = intent.getIntExtra(KEY_ACTION, -1);

                switch (action) {
                    case ACTION_VOICE_NORMAL:
                        mMediaPlayer.setVolume(1.0f, 1.0f);
                        break;
                    case ACTION_VOICE_SILENCE:
                        mMediaPlayer.setVolume(0, 0);
                        break;
                }
            }
        }, intentFilter);
    }
    @Override
    public void onDestroy() {
        unregisterReceiver(mVideoParamsControlReceiver);
        super.onDestroy();

    }
}

Engine also has onCreate and onDestroy lifecycle methods, so we register the dynamic broadcast receiver in onCreate and unregister it in onDestroy. When the receiver gets ACTION_VOICE_NORMAL it turns the sound on; on ACTION_VOICE_SILENCE it mutes the video.

Finally, add two static methods to VideoLiveWallpaper for sending the broadcasts:

public static void voiceSilence(Context context) {
    Intent intent = new Intent(VideoLiveWallpaper.VIDEO_PARAMS_CONTROL_ACTION);
    intent.putExtra(VideoLiveWallpaper.KEY_ACTION, VideoLiveWallpaper.ACTION_VOICE_SILENCE);
    context.sendBroadcast(intent);
}

public static void voiceNormal(Context context) {
    Intent intent = new Intent(VideoLiveWallpaper.VIDEO_PARAMS_CONTROL_ACTION);
    intent.putExtra(VideoLiveWallpaper.KEY_ACTION, VideoLiveWallpaper.ACTION_VOICE_NORMAL);
    context.sendBroadcast(intent);
}

In the Activity:

public class MainActivity extends AppCompatActivity {
    private CheckBox mCbVoice;

    @Override
    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        setContentView(R.layout.activity_main);

        mCbVoice = (CheckBox) findViewById(R.id.id_cb_voice);

        mCbVoice.setOnCheckedChangeListener(
                new CompoundButton.OnCheckedChangeListener() {
                    @Override
                    public void onCheckedChanged(
                            CompoundButton buttonView, boolean isChecked) {
                        if (isChecked) {
                            // mute
                            VideoLiveWallpaper.voiceSilence(getApplicationContext());
                        } else {
                            VideoLiveWallpaper.voiceNormal(getApplicationContext());
                        }
                    }
                });
    }
}

Listen to the CheckBox state and send the corresponding broadcast.
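
The "set as wallpaper" button mentioned in section (3) is wired up in the same onCreate; a sketch, assuming the layout contains a Button with the made-up id id_btn_set:

// id_btn_set is a placeholder id; it wires the click to the helper from section (3)
findViewById(R.id.id_btn_set).setOnClickListener(new View.OnClickListener() {
    @Override
    public void onClick(View v) {
        VideoLiveWallpaper.setToWallPaper(MainActivity.this);
    }
});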

OK, and with that a simple video desktop wallpaper is done~~

Source code:

Import this directory directly as a project.


If you find this useful, you can follow my WeChat public account hongyangAndroid, where I push new content every day. Feel free to leave a message about topics you would like to see covered; submissions are welcome.
