Implement screen share from Android application to web users

Hi Team
I want to implement a screen share option from an Android application to web users, where the Android user can share their screen and show the required information to browser users.

Can you guide me on how to achieve this functionality?

Team, any update on this?

Chrome on Android doesn’t seem to support getDisplayMedia, which is needed for screen share.

Sorry, you mentioned an Android application…

OpenVidu doesn’t support screen share in the Android sample app, but you may find inspiration in projects like GitHub - Jeffiano/ScreenShareRTC: WebRTC ScreenShare Android to implement it.

You can later contribute to the open source Android app.

Regards

I’ve finally achieved this using native WebRTC code on Android.
It works like this.

Inside your LocalParticipant class add this function:

public void toggleScreenSharing(Boolean isScreenSharing, int mMediaProjectionPermissionResultCode, Intent mMediaProjectionPermissionResultData) {
    // Tear down the current video track and capturer before switching sources
    session.getLocalParticipant().getMediaStream().removeTrack(videoTrack);
    try {
        videoCapturer.stopCapture();
    } catch (InterruptedException e) {
        e.printStackTrace();
    }
    videoCapturer.dispose();
    videoTrack.dispose();
    surfaceTextureHelper.stopListening();
    surfaceTextureHelper.dispose();

    if (isScreenSharing) {
        videoCapturer = createScreenCapturer(mMediaProjectionPermissionResultData, mMediaProjectionPermissionResultCode);
        if (videoCapturer != null) {
            // Recreate the capture pipeline for the screen source
            surfaceTextureHelper = SurfaceTextureHelper.create("CaptureThread", EglUtils.getRootEglBaseContext());
            videoSource = peerConnectionFactory.createVideoSource(videoCapturer.isScreencast());
            videoTrack = peerConnectionFactory.createVideoTrack("100", videoSource);
            videoCapturer.initialize(surfaceTextureHelper, context, videoSource.getCapturerObserver());
            videoCapturer.startCapture(resolutionWidth, resolutionHeight, 30);
            videoTrack.setEnabled(true);
            session.getLocalParticipant().getMediaStream().addTrack(videoTrack);
        }
    } else {
        // Switch back to the camera; CustomCameraCapturer is CameraX-based, so
        // capture is driven by the camera use cases bound in the activity
        surfaceTextureHelper = SurfaceTextureHelper.create("CaptureThread", EglUtils.getRootEglBaseContext());
        videoCapturer = new CustomCameraCapturer(previewView, 6);
        videoSource = peerConnectionFactory.createVideoSource(videoCapturer.isScreencast());
        this.videoTrack = peerConnectionFactory.createVideoTrack("100", videoSource);
        videoCapturer.initialize(surfaceTextureHelper, context, videoSource.getCapturerObserver());
        session.getLocalParticipant().getMediaStream().addTrack(videoTrack);
    }
}
public VideoCapturer createScreenCapturer(Intent mMediaProjectionPermissionResultData, int mMediaProjectionPermissionResultCode) {
    if (mMediaProjectionPermissionResultCode != Activity.RESULT_OK) {
        // User didn't give permission to capture the screen
        return null;
    }
    return new ScreenCapturerAndroid(
            mMediaProjectionPermissionResultData, new MediaProjection.Callback() {
                @Override
                public void onStop() {
                    // User revoked permission to capture the screen
                }
            });
}

Add this to your CustomWebSocket:

private void handleServerResponse(JSONObject json) throws JSONException {
    ...
    this.session.getLocalParticipant().mediaStream = stream;
}

And when you click the “Toggle screen share/Camera” button on screen, call the function.
Also keep a flag indicating whether you’re streaming the camera or sharing the screen:

if (isScreenSharing) {
    stopScreenSharingAndStartCameraStreaming()
} else {
    stopCameraAndStartScreenSharing()
}
private fun stopCameraAndStartScreenSharing() {
    val mediaProjectionManager = getSystemService(Context.MEDIA_PROJECTION_SERVICE) as MediaProjectionManager
    startActivityForResult(mediaProjectionManager.createScreenCaptureIntent(), 29)
}

override fun onActivityResult(requestCode: Int, resultCode: Int, data: Intent?) {
    super.onActivityResult(requestCode, resultCode, data)
    if (requestCode == 29) {
        if (resultCode == RESULT_OK) {
            startScreenSharing(resultCode, data)
        } else {
            showToast("Screen sharing permission denied!")
        }
    }
}

private fun startScreenSharing(resultCode: Int, data: Intent?) {
    mMediaProjectionPermissionResultCode = resultCode
    mMediaProjectionPermissionResultData = data
    cameraProvider?.unbindAll() // I'm using the CameraX API for camera streaming to support flash and zoom features
    localParticipant?.toggleScreenSharing(true, resultCode, data)
    isScreenSharing = true
}

private fun stopScreenSharingAndStartCameraStreaming() {
    bindCameraUseCases()
    llScreenShared?.gone()
    localParticipant?.toggleScreenSharing(false, 0, null)
    isScreenSharing = false
}
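Since the `isScreenSharing` flag is flipped in several places, it can drift out of sync with what is actually being published. One option is to keep it in a small plain-Kotlin state holder that can be unit-tested on the JVM — a sketch only; the class and action names below are illustrative, not part of OpenVidu:

```kotlin
// Minimal state holder for the camera/screen toggle. Pure Kotlin, no Android
// dependencies, so the switching logic can be unit-tested without a device.
class ShareState {
    var isScreenSharing = false
        private set

    enum class Action { STOP_CAMERA_AND_START_SCREEN, STOP_SCREEN_AND_START_CAMERA }

    // Which action the "Toggle screen share/Camera" button should trigger now
    fun nextAction(): Action =
        if (isScreenSharing) Action.STOP_SCREEN_AND_START_CAMERA
        else Action.STOP_CAMERA_AND_START_SCREEN

    // Call once the switch has actually happened (e.g. after the MediaProjection
    // permission was granted), mirroring how isScreenSharing is set above
    fun onSwitched(screenSharing: Boolean) {
        isScreenSharing = screenSharing
    }
}
```

The activity then only asks `nextAction()` on a button click and reports back via `onSwitched()`, so the flag is updated in exactly one place.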

That’s all! This should do the trick.

Also make sure you have a foreground service running with

android:foregroundServiceType="mediaProjection"
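For reference, the manifest entries look roughly like this (the service name is a placeholder; the `FOREGROUND_SERVICE` permission is required since Android 9, and `FOREGROUND_SERVICE_MEDIA_PROJECTION` on Android 14+). The service must call `startForeground()` with a notification before the MediaProjection is obtained:

```xml
<uses-permission android:name="android.permission.FOREGROUND_SERVICE" />
<uses-permission android:name="android.permission.FOREGROUND_SERVICE_MEDIA_PROJECTION" />

<application>
    <!-- ScreenShareService is an illustrative name for your own foreground service -->
    <service
        android:name=".ScreenShareService"
        android:foregroundServiceType="mediaProjection" />
</application>
```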

Hi!

First of all, thanks for providing a solution to the lack of a screen share feature on Android.
I read your code, and it seems that at any point the application can send either the camera feed or the screen feed, but not both at the same time.
Consider this scenario: a user joins a conference room with audio and video from their front-facing camera. Now this user wants to share their screen without turning off their camera in the same conference session. How would they do that?
What if a transceiver could be added at runtime whenever the user starts the screen share, with a renegotiation of the media to allow audio, video (camera), and video (screen) at the same time?

To send more than one media stream, simply create a new Connection to the OpenVidu deployment from the client, as if it were a new user connecting to the Session.
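In practice that means the Android app requests a second token for the same session and opens a second connection that publishes only the screen track. A rough sketch of the idea — `fetchToken` and `openConnection` are hypothetical hooks for your own backend call and for the sample app’s Session/WebSocket setup, not a real OpenVidu client API:

```kotlin
// Illustrative only: publish camera and screen as two separate connections to
// the same session. The two hooks are injected so the wiring can be tested.
fun startDualPublish(
    sessionId: String,
    fetchToken: (sessionId: String) -> String,
    openConnection: (token: String, publishScreen: Boolean) -> Unit
) {
    // Connection #1: the normal participant publishing camera video + microphone
    openConnection(fetchToken(sessionId), false)
    // Connection #2: a second "participant" publishing only the screen track
    openConnection(fetchToken(sessionId), true)
}
```

To web subscribers the screen share then simply appears as another participant’s stream, with no renegotiation needed on the first connection.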

How is this media stream being set?

private void handleServerResponse(JSONObject json) throws JSONException {
    ...
    this.session.getLocalParticipant().mediaStream = stream;
}