Making software that mirrors the Android screen to a PC 1

Introduction

This article explains how to mirror an Android screen to a PC. There is a sequel: Creating software that mirrors the Android screen to a PC 2: Real-time touch.

Here is what we will make

capture3.gif

We will create software that mirrors the Android screen to a PC. It may not be obvious from the GIF, but it runs smoothly at 50-60 FPS. It also works on Windows, Mac, and Linux.

pic.jpg With H.264 at a bit rate of 5000 Kbps and the resolution halved, the delay is around 300 ms, so it is fairly real-time. It can be even shorter depending on the settings.

Motivation

There are already several pieces of software that mirror the Android screen.

Among them, Vysor offers excellent image quality and frame rate. I was impressed the first time I used it because it easily runs at 60 FPS. The image quality is limited in the free version, but the limit is removed if you pay. (Both subscription and one-time-purchase plans are available.)

However, as a student it is hard to spend money on software, so I decided to make my own.

That said, this time I will implement only the mirroring function.

Specification

fig1.png

Build a server on the Android side and connect from the PC. It is possible via Wi-Fi, but we will communicate via USB for stability.

On the Android side

Capture the screen using MediaProjection (so supported devices are Android 5.0 or later). Encode the capture with MediaCodec and send it to the PC.

On the PC side

Connect to the server, then decode and display the stream. However, this time I will hand all of that off to ffplay and write no PC-side program at all (laugh)

ffplay is a video player bundled with the well-known conversion tool FFmpeg. By passing parameters you can play all sorts of things; this time we use it to decode the stream and display it in real time. I mentioned OS support at the beginning precisely because FFmpeg runs on many operating systems.

About the codec to use

A list of codecs that Android can encode is given in Supported Media Formats, but in practice availability varies by device. I tried several real devices and emulators, and only H.264 worked on all of them.

As for VP8, the encoder itself can be created, but something seems wrong with the buffers it produces and it fails with an error. VP9 gave [Invalid data found when processing input] and ffplay did not recognize the stream. H.265 worked on every device that supports it.

The sample lets you choose the codec, so please try which ones work on your device. If VP8 or VP9 worked everywhere we could avoid licensing concerns, so it is a shame.

I will add details about the errors as soon as the cause is known. (I would be grateful for any information you can share.)

For details on codec types, see the comparison article Differences between video codec types (H.264, VP9, MPEG, Xvid, DivX, WMV, etc.).

Capture the screen of Android

From Android 5.0 onward, apps can capture the screen. Specifically, you use MediaProjection. The article Take a screenshot from an Android 5.0 app explains it in detail.

Flow of using MediaProjection

Briefly, here is how MediaProjection is used. It may help to follow along with the code in the article above.

Classes used

・ **MediaProjectionManager** Shows a dialog asking the user for permission to capture the screen, and returns a MediaProjection if the user allows it.

・ **MediaProjection** Provides the screen-capture function. To be precise, it creates a buffer called a virtual display and mirrors the screen into it. There are several modes besides mirroring.

・ **VirtualDisplay** The buffer that MediaProjection creates and writes to. Internally it holds a Surface, which is the actual buffer; you specify that Surface when creating the VirtualDisplay. If you specify an ImageReader's Surface, you can retrieve the images through the ImageReader; if you specify a SurfaceView's Surface, the screen is shown in the View in real time.

・ **Surface** A buffer specialized for handling images, unlike ordinary buffers. Besides VirtualDisplay, it is also used inside SurfaceView and in the video players used when making games, among others.

Image when using ImageReader

ImageReader actually has a mechanism for holding multiple frames, but conceptually it looks like this. fig2.png

Procedure (code is in the second half)

  1. Get **MediaProjectionManager** with **getSystemService**
  2. Create an intent asking for permission to capture the screen with the manager's **createScreenCaptureIntent**
  3. Launch the intent created in 2 and catch the result in the Activity's **onActivityResult**
  4. If the user granted permission, get **MediaProjection** with the manager's **getMediaProjection**
  5. Create a virtual display in mirroring mode with the acquired MediaProjection's **createVirtualDisplay**, specifying the **Surface** you want it to write to
  6. The screen contents are written in real time to the **Surface** specified in 5, so use them from there

Encode video on Android

Use MediaCodec.

The following were helpful: the official document MediaCodec class overview (and its Japanese translation), the article How to compress video without FFmpeg using MediaCodec on Android (with library), and EncodeAndMuxTest introduced there (the API usage is dated, but the procedure was helpful).

Flow of using MediaCodec

Here are some general steps to take when using MediaCodec.

Class to use

・ **MediaCodec** The video encoder/decoder itself.

・ **MediaFormat** Holds video information such as codec, bit rate, and frame rate; used to configure the MediaCodec.

Image of use

Either a Buffer or a Surface can be used for frame input and output, and they can be mixed, for example a Surface for input and a Buffer for output.

fig3.png

Procedure (code is in the second half)

  1. Create an encoder or decoder with **createEncoderByType / createDecoderByType**
  2. Create a **MediaFormat** describing the video to be encoded or decoded
  3. Call **configure** on the **MediaCodec**, specifying the **MediaFormat** created in 2
  4. Set a callback if processing asynchronously
  5. Start conversion with **start**
  6. Feed the raw (unencoded/undecoded) data to the input
  7. Extract the processed data from the output

Precautions when inputting / outputting data

As mentioned above, Surface and Buffer can be used for input / output of MediaCodec data. However, there are differences in the delivery method depending on what you use.

For Input

With a Buffer, you must pass the data to the MediaCodec manually. With a Surface, the data is handed over automatically whenever its contents are updated.

For Output

With a Buffer, you must fetch the data manually. With a Surface, its contents are updated automatically.

This software uses Surface for input and Buffer for output.

Now, the implementation

Layout

fig5.png **The layout XML is here**

Process flow

The process starts when the start button is clicked. fig4-2.png

code

Everything is packed into MainActivity.java, so the implementation is not large-scale. Also note that some errors are left unchecked. **The full code is here**

The following is an excerpt of the code, so please refer to the entire code for viewing.

1. OnClick Displays a dialog asking for permission to capture

Code on lines 130-155

MainActivity.java


    button_start.setOnClickListener(new View.OnClickListener() {
            @Override
            public void onClick(View v) {

                switch (states) {
                    case Stop:
                        //Display a dialog for confirming capture
                        manager = (MediaProjectionManager) getSystemService(Context.MEDIA_PROJECTION_SERVICE);
                        startActivityForResult(manager.createScreenCaptureIntent(), REQUEST_CODE);
                        break;
                    case Waiting:
                        //Cancel standby
                        Disconnect();
                        break;
                    case Running:
                        //Disconnect
                        Disconnect();
                        break;
                }

            }
        });

Since this button is also used to stop, the behavior branches on the current state. Processing starts from the Stop state: we acquire the **MediaProjectionManager** and show the user a dialog asking to confirm the capture.

2. Process the result of the onActivityResult dialog

This is the code on lines 162-206.

MainActivity.java


    @Override
    public void onActivityResult(int requestCode, int resultCode, Intent intent) {
        if (resultCode != RESULT_OK) {
            Toast.makeText(this, "permission denied", Toast.LENGTH_LONG).show();
            return;
        }

        //If the user approves the screen capture
        //Get Media Projection
        mediaProjection = manager.getMediaProjection(resultCode, intent);


        //Determine the size of the virtual display
        double SCALE = seekBar_scale.getProgress() * 0.01;

        DisplayMetrics metrics = getResources().getDisplayMetrics();
        final int WIDTH = (int) (metrics.widthPixels * SCALE);
        final int HEIGHT = (int) (metrics.heightPixels * SCALE);
        final int DENSITY = metrics.densityDpi;


        try {

            PrepareEncoder(
                    WIDTH,
                    HEIGHT,
                    codecs[spinner_codec.getSelectedItemPosition()],
                    seekBar_bitrate.getProgress(),
                    seekBar_fps.getProgress(),
                    10//The I frame is fixed
            );

            SetupVirtualDisplay(WIDTH, HEIGHT, DENSITY);

            StartServer();



        } catch (Exception ex) {//Errors when creating encoders
            ex.printStackTrace();
            Toast.makeText(this, ex.getMessage(), Toast.LENGTH_LONG).show();
        }


    }

When the user responds to the dialog shown in 1, **onActivityResult** is called. If capture was allowed, we obtain a **MediaProjection** with **getMediaProjection**, read the screen size, and prepare the encoder and virtual display.

3.PrepareEncoder Preparation of encoder

This is the code on lines 218-274.

MainActivity.java


//Encoder preparation
    private void PrepareEncoder(int WIDTH, int HEIGHT, String MIME_TYPE, int BIT_RATE, int FPS, int IFRAME_INTERVAL) throws Exception {

        MediaFormat format = MediaFormat.createVideoFormat(MIME_TYPE, WIDTH, HEIGHT);
        //Set format properties
        //If you do not set the minimum properties, you will get an error in configure
        format.setInteger(MediaFormat.KEY_COLOR_FORMAT,
                MediaCodecInfo.CodecCapabilities.COLOR_FormatSurface);
        format.setInteger(MediaFormat.KEY_BIT_RATE, BIT_RATE);
        format.setInteger(MediaFormat.KEY_FRAME_RATE, FPS);
        format.setInteger(MediaFormat.KEY_CAPTURE_RATE, FPS);
        format.setInteger(MediaFormat.KEY_I_FRAME_INTERVAL, IFRAME_INTERVAL);


        //Get encoder
        codec = MediaCodec.createEncoderByType(MIME_TYPE);

        codec.setCallback(new MediaCodec.Callback() {
            @Override
            public void onInputBufferAvailable(@NonNull MediaCodec codec, int index) {
                Log.d("MediaCodec", "onInputBufferAvailable : " + codec.getCodecInfo());

            }

            @Override
            public void onOutputBufferAvailable(@NonNull final MediaCodec codec, final int index, @NonNull MediaCodec.BufferInfo info) {
                Log.d("MediaCodec", "onOutputBufferAvailable : " + info.toString());
                ByteBuffer buffer = codec.getOutputBuffer(index);
                byte[] array = new byte[buffer.limit()];
                buffer.get(array);

                //Send encoded data
                Send(array);

                //Free buffer
                codec.releaseOutputBuffer(index, false);
            }

            @Override
            public void onError(@NonNull MediaCodec codec, @NonNull MediaCodec.CodecException e) {
                Log.d("MediaCodec", "onError : " + e.getMessage());
            }

            @Override
            public void onOutputFormatChanged(@NonNull MediaCodec codec, @NonNull MediaFormat format) {
                Log.d("MediaCodec", "onOutputFormatChanged : " + format.getString(MediaFormat.KEY_MIME));
            }
        });

        //Set encoder
        codec.configure(format, null, null, MediaCodec.CONFIGURE_FLAG_ENCODE);

        //Get the Surface used to pass the frame to the encoder
        //Must be called between configure and start
        inputSurface = codec.createInputSurface();

    }

First, create a **MediaFormat** and set the parameters required for encoding. Then create a **MediaCodec** with **createEncoderByType** and call **configure**, passing the **MediaFormat**.

Finally, call **createInputSurface** to get the input Surface. Anything drawn on this Surface is encoded automatically.

I also set a callback here, but the only one used is **onOutputBufferAvailable**, which is called when encoded data becomes available. It retrieves the encoded data as a byte array and sends it to the PC.
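The buffer-copy step in that callback can be tried outside Android with plain java.nio. This is only a sketch: the `drain` helper is a made-up name, and a hand-filled buffer stands in for the codec's output buffer.

```java
import java.nio.ByteBuffer;
import java.util.Arrays;

public class BufferCopy {
    // Copy the readable region of a ByteBuffer into a byte array,
    // as done with the encoder's output buffer before sending it.
    static byte[] drain(ByteBuffer buffer) {
        byte[] array = new byte[buffer.remaining()];
        buffer.get(array); // advances position up to limit
        return array;
    }

    public static void main(String[] args) {
        // Stand-in for MediaCodec.getOutputBuffer(index)
        ByteBuffer fake = ByteBuffer.wrap(new byte[]{0, 0, 0, 1, 103, 66});
        System.out.println(Arrays.toString(drain(fake)));
    }
}
```

Using remaining() instead of limit() also stays correct when the buffer's position is non-zero, which some codec implementations produce.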

4.SetupVirtualDisplay Creating a virtual display

This is the code on lines 208-216.

MainActivity.java


//Virtual display setup
    private void SetupVirtualDisplay(int WIDTH, int HEIGHT, int DENSITY) {

        virtualDisplay = mediaProjection
                .createVirtualDisplay("Capturing Display",
                        WIDTH, HEIGHT, DENSITY,
                        DisplayManager.VIRTUAL_DISPLAY_FLAG_AUTO_MIRROR,
                        inputSurface, null, null);//Use the one obtained from the encoder for the writing Surface
    }

Here we create the virtual display. **The most important point is to pass the inputSurface obtained from the encoder as the Surface to write to.** This way the mirrored screen is written directly into the encoder's input Surface and gets encoded without any extra work. The flow is shown below.

fig6.png

5. StartServer Start server thread

This is the code on lines 312 to 322.

MainActivity.java


//Start standby and send threads
    private void StartServer() {
        senderThread = new HandlerThread("senderThread");
        senderThread.start();
        senderHandler = new Handler(senderThread.getLooper());

        serverThread = new Thread(this);
        serverThread.start();

        setState(States.Waiting);
    }

This starts a thread for sending and a thread for listening. The listening thread's processing lives in run(), since the Activity implements Runnable. The sending thread uses a HandlerThread so that writes can be queued.

6. Server processing Waiting for a connection from a PC

This is the code on lines 324 to 346.

MainActivity.java


    //Server thread
    //Accepts connection only once
    public void run() {
        try {
            listener = new ServerSocket();
            listener.setReuseAddress(true);
            listener.bind(new InetSocketAddress(8080));
            System.out.println("Server listening on port 8080...");

            clientSocket = listener.accept();//Wait until connection

            inputStream = clientSocket.getInputStream();
            outputStream = clientSocket.getOutputStream();

            //Encoding needs to start when the client is connected
            codec.start();

            setState(States.Running);

        } catch (Exception e) {
            e.printStackTrace();
        }
    }

Start the server socket and wait. Since we do not need to stream to multiple PCs, the connection is accepted only once.
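The accept-once pattern can be exercised off-device with plain java.net. A minimal sketch, under the assumption that a throwaway client thread stands in for the PC and an ephemeral port stands in for 8080:

```java
import java.io.*;
import java.net.*;

public class AcceptOnce {
    public static void main(String[] args) throws Exception {
        ServerSocket listener = new ServerSocket();
        listener.setReuseAddress(true);          // allow quick restarts on the same port
        listener.bind(new InetSocketAddress(0)); // port 0 = any free port (8080 in the article)
        int port = listener.getLocalPort();

        // Stand-in for the PC connecting through "adb forward"
        Thread client = new Thread(() -> {
            try (Socket s = new Socket("127.0.0.1", port)) {
                System.out.println(new BufferedReader(
                        new InputStreamReader(s.getInputStream())).readLine());
            } catch (IOException e) { e.printStackTrace(); }
        });
        client.start();

        Socket clientSocket = listener.accept(); // blocks until the client connects
        OutputStream out = clientSocket.getOutputStream();
        out.write("stream-starts-here\n".getBytes()); // where codec.start() would go
        out.flush();

        client.join();
        clientSocket.close();
        listener.close();
    }
}
```

In the app itself, the point where the server writes is exactly where codec.start() runs, so the client never misses the leading I-frame.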

The encoder is started only after the connection is established; otherwise the stream could not be played on the PC. The first frame produced after encoding starts is an I-frame, which is essential for decoding everything that follows. If the PC does not receive that I-frame first, it cannot play the stream; I believe that is the reason. (Please point it out if I am wrong.)

Not sure what an I-frame is? See What is a keyframe? Difference between I-frames, P-frames and B-frames [GOP].

7. Data transmission

Code on lines 348-366

MainActivity.java


    //Send data
    //Don't change the order
    //Add to queue
    private void Send(final byte[] array) {
        senderHandler.post(new Runnable() {
            @Override
            public void run() {

                try {
                    outputStream.write(array);
                } catch (IOException ex) {
                    //If it cannot be sent, it is considered disconnected.
                    ex.printStackTrace();
                    Disconnect();
                }

            }
        });
    }

This is called from the callback set on the encoder in 3. That callback runs on the main thread, and so does this method. Because network operations must not be performed on the main thread, the actual transmission is done on the dedicated sender thread.

As an aside, if you write it as follows instead, the screen shown on the PC may become corrupted.

MainActivity.java


    private void Send(final byte[] array) {

        new Thread(new Runnable() {
            @Override
            public void run() {

                try {
                    outputStream.write(array);
                } catch (IOException ex) {
                    //If it cannot be sent, it is considered disconnected.
                    ex.printStackTrace();
                    Disconnect();
                }

            }
        }).start();
    }

Spawning a thread per send is poor code to begin with, but more importantly it does not guarantee the order in which frames are sent. As the I-frame article above explains, compressed frames encode only the differences from neighboring frames, so if their order is scrambled they cannot be decoded correctly.
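The ordering guarantee can be demonstrated with plain java.util.concurrent: a single-threaded executor plays the role the HandlerThread plays here, processing queued sends strictly in submission order. A minimal sketch, with frame numbers standing in for encoded packets:

```java
import java.util.*;
import java.util.concurrent.*;

public class SendOrder {
    public static void main(String[] args) throws Exception {
        // One worker thread, like HandlerThread: tasks run in FIFO order
        ExecutorService sender = Executors.newSingleThreadExecutor();
        List<Integer> received = Collections.synchronizedList(new ArrayList<>());

        for (int i = 0; i < 100; i++) {
            final int frame = i;
            sender.execute(() -> received.add(frame)); // "send" frame i
        }
        sender.shutdown();
        sender.awaitTermination(5, TimeUnit.SECONDS);

        // Frames arrive exactly in submission order
        for (int i = 0; i < 100; i++) {
            if (received.get(i) != i) throw new AssertionError("out of order");
        }
        System.out.println("all 100 frames in order");
    }
}
```

With one new Thread per frame there is no such guarantee: the scheduler may run the threads in any order, which is exactly the corruption described above.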

8. Disconnection and cleanup

This is the code on lines 368-387.

MainActivity.java


//Disconnection process
    private void Disconnect() {

        try {
            codec.stop();
            codec.release();
            virtualDisplay.release();
            mediaProjection.stop();


            listener.close();
            if (clientSocket != null)
                clientSocket.close();

        } catch (IOException ex) {
            ex.printStackTrace();
        }

        setState(States.Stop);
    }

The objects used so far are stopped and released. This returns us to the Stop state, so pressing the button again starts the process from the beginning and a new connection becomes possible.

Communicate between PC and Android device via USB

Specifically, this can be achieved by using the adb server like a proxy. I referred to the article Play with adb.

Easy with a single command

adb forward tcp:xxxx tcp:yyyy

Specify the port used on the PC side for xxxx and the port used on the device for yyyy. This time,

adb forward tcp:8080 tcp:8080

will do. Now, when you connect to port 8080 of localhost (127.0.0.1) on the PC, it connects **via USB** to port 8080 on the device.
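Conceptually, adb forward behaves like a local TCP proxy. The toy sketch below imitates that with loopback sockets only; nothing here talks to real adb, and all names are made up for illustration:

```java
import java.io.*;
import java.net.*;

public class MiniForward {
    // Copy every byte arriving on 'from' into 'to'; one direction is
    // enough here, since the mirror stream only flows device -> PC.
    static void pump(InputStream from, OutputStream to) throws IOException {
        byte[] buf = new byte[4096];
        int n;
        while ((n = from.read(buf)) != -1) {
            to.write(buf, 0, n);
            to.flush();
        }
    }

    public static void main(String[] args) throws Exception {
        // "Device side": a server that sends some data once
        ServerSocket device = new ServerSocket(0);
        new Thread(() -> {
            try (Socket s = device.accept()) {
                s.getOutputStream().write("h264-bytes\n".getBytes());
            } catch (IOException e) { e.printStackTrace(); }
        }).start();

        // "adb forward": listen locally, relay onward to the device port
        ServerSocket proxy = new ServerSocket(0);
        new Thread(() -> {
            try (Socket pc = proxy.accept();
                 Socket dev = new Socket("127.0.0.1", device.getLocalPort())) {
                pump(dev.getInputStream(), pc.getOutputStream());
            } catch (IOException e) { e.printStackTrace(); }
        }).start();

        // "PC side": connect to the forwarded local port, like ffplay does
        try (Socket s = new Socket("127.0.0.1", proxy.getLocalPort());
             BufferedReader r = new BufferedReader(
                     new InputStreamReader(s.getInputStream()))) {
            System.out.println(r.readLine());
        }
    }
}
```

The real adb server relays over USB, but from the PC program's point of view it is just a local TCP port, which is why ffplay can read the stream from 127.0.0.1.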

As an aside, I wondered why 127.0.0.1 was assigned to localhost. Looking it up, it turns out there is some IPv4 history behind it: Why is 127.0.0.1 the localhost?

Display the screen on the PC

Thank you for reading this far. Finally, let's display the screen on the PC. We use ffplay, introduced at the beginning of the article, so download it if you do not have it: Download FFmpeg. After downloading, unzip the archive and you will find a bin folder containing the executables. Like ffmpeg, ffplay is launched from the command line with parameters.

Procedure

  1. Connect your Android device to the PC so that adb recognizes it
  2. Run adb forward tcp:8080 tcp:8080
  3. Launch the app on the device and press Start
  4. Open a command prompt or PowerShell, move to the directory containing ffplay, and run the following

ffplay -framerate 60 -analyzeduration 100 -i tcp://127.0.0.1:8080

This will display the Android screen on your PC. (If it does not appear, please lower the status bar or return to home to refresh the screen.)

You can exit with Esc.

Meaning of parameters

-framerate 60 simply specifies the frame rate. It should match the setting in the casting app.

-analyzeduration 100 limits how much of the received stream ffplay analyzes before playing (the value is in microseconds). ffplay normally waits for a certain amount of frames to accumulate before analyzing and displaying, so without this option playback starts with a noticeable delay.

-i tcp://127.0.0.1:8080 is the address to receive the stream from. To try it over Wi-Fi, specify the device's IP instead. You can also pass a file path here to play an ordinary video.

A problem I am stuck on

Here is a problem I have not solved; if you have any information, please let me know. **On Android 8.0, even though the server socket waits for a connection on a separate thread,**

MainActivity.java


clientSocket = listener.accept();

the UI is blocked. The physical buttons stop responding entirely, and if you do not connect and unblock it, the System UI restarts after a while. It reproduces on the emulator, so please try it.

Did some behavior change in 8.0...? It works fine on 7.1 and earlier.

Impressions

It is still far from replacing Vysor, but I was surprised at how easy mirroring was to implement. Features such as real-time touch are still missing, and I would like to add them in the future.

I would also like to build a feature for driving the device automatically with scripts. On that topic, I have also published the article Embed scripting & editors in C# apps running on Windows: Try adding a scripting function to a C# app, so please have a look if you are interested.

Thank you for reading to the end.

Next: Creating software that mirrors the Android screen to a PC 2: Real-time touch
