Implement screen share from an Android application to web users

Hi Team,
I want to implement a screen share option from an Android application to web users, where the Android user can share their screen and show the required information to browser users.

Can you guide me on how to achieve this functionality?

Team, any update on this?

Chrome on Android doesn’t seem to support getDisplayMedia, which is needed for screen share.

Sorry, you mentioned Android application…

OpenVidu doesn’t support screen share in the Android sample app, but maybe you can get inspired by projects like GitHub - Jeffiano/ScreenShareRTC: WebRTC ScreenShare Android to implement it.

You can later contribute to the open source Android app.

Regards

I’ve finally achieved this using native WebRTC code on Android.
It works like this.

Inside your LocalParticipant class add this function:

public void toggleScreenSharing(Boolean isScreenSharing, int mMediaProjectionPermissionResultCode, Intent mMediaProjectionPermissionResultData) {
        // Remove and dispose the current video track and capturer before switching sources
        session.getLocalParticipant().getMediaStream().removeTrack(videoTrack);
        try {
            videoCapturer.stopCapture();
        } catch (InterruptedException e) {
            e.printStackTrace();
        }

        videoCapturer.dispose();
        videoTrack.dispose();
        surfaceTextureHelper.stopListening();
        surfaceTextureHelper.dispose();

        if (isScreenSharing) {
            videoCapturer = createScreenCapturer(mMediaProjectionPermissionResultData, mMediaProjectionPermissionResultCode);
            if (videoCapturer != null) {
                surfaceTextureHelper = SurfaceTextureHelper.create("CaptureThread", EglUtils.getRootEglBaseContext());

                // Create a new video source/track backed by the screen capturer
                videoSource = peerConnectionFactory.createVideoSource(videoCapturer.isScreencast());
                videoTrack = peerConnectionFactory.createVideoTrack("100", videoSource);
                videoCapturer.initialize(surfaceTextureHelper, context, videoSource.getCapturerObserver());
                videoCapturer.startCapture(resolutionWidth, resolutionHeight, 30);

                videoTrack.setEnabled(true);
                session.getLocalParticipant().getMediaStream().addTrack(videoTrack);
            }
        } else {
            // Switch back to the camera (CustomCameraCapturer is my CameraX-based capturer,
            // which is started again by bindCameraUseCases() on the activity side)
            surfaceTextureHelper = SurfaceTextureHelper.create("CaptureThread", EglUtils.getRootEglBaseContext());

            videoCapturer = new CustomCameraCapturer(previewView, 6);
            videoSource = peerConnectionFactory.createVideoSource(videoCapturer.isScreencast());
            this.videoTrack = peerConnectionFactory.createVideoTrack("100", videoSource);
            videoCapturer.initialize(surfaceTextureHelper, context, videoSource.getCapturerObserver());
            session.getLocalParticipant().getMediaStream().addTrack(videoTrack);
        }
    }
public VideoCapturer createScreenCapturer(Intent mMediaProjectionPermissionResultData, int mMediaProjectionPermissionResultCode) {
        if (mMediaProjectionPermissionResultCode != Activity.RESULT_OK) {
            // User didn't give permission to capture the screen
            return null;
        }
        return new ScreenCapturerAndroid(
                mMediaProjectionPermissionResultData, new MediaProjection.Callback() {
            @Override
            public void onStop() {
                // User revoked permission to capture the screen
            }
        });
    }

Add this to your CustomWebSocket:

private void handleServerResponse(JSONObject json) throws JSONException {
...
...
...
this.session.getLocalParticipant().mediaStream = stream;
}

And when you click the “Toggle screen share/Camera” button on screen, call the function.
Also keep a flag indicating whether you’re streaming the camera or sharing the screen:

if (isScreenSharing) {
    stopScreenSharingAndStartCameraStreaming()
} else {
    stopCameraAndStartScreenSharing()
}
private fun stopCameraAndStartScreenSharing() {
        val mediaProjectionManager = getSystemService(Context.MEDIA_PROJECTION_SERVICE) as MediaProjectionManager
        startActivityForResult(mediaProjectionManager.createScreenCaptureIntent(), 29)
    }
override fun onActivityResult(requestCode: Int, resultCode: Int, data: Intent?) {
        super.onActivityResult(requestCode, resultCode, data)
        if(requestCode == 29) {
            if(resultCode == RESULT_OK) {
                startScreenSharing(resultCode, data)
            } else {
                showToast("Screen sharing permission denied!")
            }
        }
    }
 private fun startScreenSharing(resultCode: Int, data: Intent?) {
        mMediaProjectionPermissionResultCode = resultCode
        mMediaProjectionPermissionResultData = data
        cameraProvider?.unbindAll() // I'm using the CameraX API for camera streaming to support flash and zoom features
        localParticipant?.toggleScreenSharing(true, resultCode, data)
        isScreenSharing = true
    }
private fun stopScreenSharingAndStartCameraStreaming() {
        bindCameraUseCases()
        llScreenShared?.gone()
        localParticipant?.toggleScreenSharing(false, 0, null)
        isScreenSharing = false
    }

That’s all! This should do the trick.

Also make sure you have a foreground service running with

android:foregroundServiceType="mediaProjection"
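
On Android 10+ the MediaProjection has to be obtained while such a foreground service is running, so start it before calling startCapture(). A minimal sketch of what that service could look like (the class name, notification channel and notification content are just placeholders, not part of the OpenVidu sample):

import android.app.Notification;
import android.app.Service;
import android.content.Intent;
import android.content.pm.ServiceInfo;
import android.os.Build;
import android.os.IBinder;
import androidx.core.app.NotificationCompat;

// Foreground service backing the screen capture. Declare it in the manifest
// with android:foregroundServiceType="mediaProjection" and make sure the
// "screen_share" notification channel is created elsewhere.
public class ScreenShareService extends Service {

    @Override
    public int onStartCommand(Intent intent, int flags, int startId) {
        Notification notification = new NotificationCompat.Builder(this, "screen_share")
                .setContentTitle("Sharing screen")
                .setSmallIcon(android.R.drawable.ic_menu_share)
                .build();
        if (Build.VERSION.SDK_INT >= Build.VERSION_CODES.Q) {
            // On API 29+ the foreground service type must be mediaProjection
            startForeground(1, notification, ServiceInfo.FOREGROUND_SERVICE_TYPE_MEDIA_PROJECTION);
        } else {
            startForeground(1, notification);
        }
        return START_NOT_STICKY;
    }

    @Override
    public IBinder onBind(Intent intent) {
        return null;
    }
}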

Hi!

First of all, thanks for providing a solution to the lack of a screen share feature on Android.
I read your code, and it seems that at any point the application can send either the camera feed or the screen feed, but not both at the same time.
Let’s consider this scenario: a user joins a conference room with audio and video from their front-facing camera. Now this user wants to share their screen without turning off their camera in the same conference session. How would they do that?
What if a transceiver were added at runtime whenever the user starts the screen share, followed by a renegotiation of the media, to allow audio, video (camera), and video (screen) at the same time?
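
Roughly, I’m imagining something like this with the native org.webrtc API (just a sketch; I haven’t verified whether OpenVidu’s signaling accepts a second video track on the same connection):

// Sketch only: add a second, send-only video transceiver for the screen track
// on the existing PeerConnection, then renegotiate with a new offer/answer.
VideoTrack screenTrack = peerConnectionFactory.createVideoTrack("screen", screenSource);
RtpTransceiver screenTransceiver = peerConnection.addTransceiver(
        screenTrack,
        new RtpTransceiver.RtpTransceiverInit(
                RtpTransceiver.RtpTransceiverDirection.SEND_ONLY));
// ...then create a new offer and send it through the signaling channel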

To send more than one media stream, simply create a new Connection to the OpenVidu deployment from the client, as if it were a new user connecting to the Session.
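
For instance, on your application server you could generate a second token for the same session. A rough sketch, assuming openvidu-java-client 2.x (the class name, URL, secret and clientData are placeholders):

import io.openvidu.java.client.Connection;
import io.openvidu.java.client.ConnectionProperties;
import io.openvidu.java.client.OpenVidu;
import io.openvidu.java.client.Session;

public class ScreenTokenExample {

    // Create a second Connection in an existing Session so the same Android
    // device can publish its screen as an additional participant.
    public static String createScreenToken(String sessionId) throws Exception {
        OpenVidu openVidu = new OpenVidu("https://my-openvidu-host:443", "MY_SECRET");
        openVidu.fetch(); // refresh the list of active sessions from the deployment
        Session session = openVidu.getActiveSessions().stream()
                .filter(s -> s.getSessionId().equals(sessionId))
                .findFirst()
                .orElseThrow(() -> new IllegalStateException("Session not found"));
        Connection screenConnection = session.createConnection(
                new ConnectionProperties.Builder()
                        .data("{\"clientData\":\"android-screen\"}")
                        .build());
        return screenConnection.getToken(); // send this token back to the Android client
    }
}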

How is this media stream being set?

private void handleServerResponse(JSONObject json) throws JSONException {
…
…
…
this.session.getLocalParticipant().mediaStream = stream;
}

Thanks a ton for sharing your experience and detailed code on using native WebRTC on Android. It’s super helpful!

I’ve found a way to switch between camera and screen sharing smoothly in the OpenVidu session. Here’s how I did it:
Updating the Session Class

  1. Replace the Video Track:

public void replaceVideoTrack(VideoTrack newVideoTrack) {
    PeerConnection peerConnection = localParticipant.getPeerConnection();

    if (peerConnection != null) {
        // Find the video sender and swap its track for the new one
        for (RtpSender sender : peerConnection.getSenders()) {
            if (sender.track() != null && sender.track().kind().equals("video")) {
                sender.setTrack(newVideoTrack, false);
                break;
            }
        }

        renegotiate();
    }
}

private void renegotiate() {
    MediaConstraints constraints = new MediaConstraints();
    createOfferForPublishing(constraints);
}

Integrating the Screen Sharing Toggle
I updated handleToggleScreen to use the new method:

    private void handleToggleScreen(Boolean isScreenSharing, int resultCode, Intent data) {
        if (isScreenSharing) {
            PeerConnectionFactory peerConnectionFactory = this.session.getPeerConnectionFactory();

            videoCapturer = createScreenCapture(data, resultCode);

            if (videoCapturer != null) {

                DisplayMetrics displayMetrics = new DisplayMetrics();
                WindowManager windowsManager = (WindowManager) context.getSystemService(Context.WINDOW_SERVICE);
                windowsManager.getDefaultDisplay().getMetrics(displayMetrics);
                int screenWidthPixels = displayMetrics.widthPixels;
                int screenHeightPixels = displayMetrics.heightPixels;

                eglBaseContext = EglBase.create().getEglBaseContext();
                surfaceTextureHelper = SurfaceTextureHelper.create("CaptureThread", eglBaseContext);

                VideoSource screenSource = peerConnectionFactory.createVideoSource(videoCapturer.isScreencast());
                videoCapturer.initialize(surfaceTextureHelper, context, screenSource.getCapturerObserver());
                videoCapturer.startCapture(screenWidthPixels, screenHeightPixels, 15);

                // Create new VideoTrack
                VideoTrack newVideoTrack = peerConnectionFactory.createVideoTrack("100", screenSource);
                newVideoTrack.addSink(localVideoView);
                newVideoTrack.setEnabled(true);

                // Update the local participant's video track reference
                setVideoTrack(newVideoTrack);

                // Replace the current video track in the peer connection
                session.replaceVideoTrack(newVideoTrack);

            }
        } else {
            startCamera();
            session.replaceVideoTrack(getVideoTrack());

        }
    }

Summary

  1. Add replaceVideoTrack Method: This method allows me to switch the video track in the peer connection.
  2. Update handleToggleScreen: This sets up the new video track and uses replaceVideoTrack.
  3. Renegotiate if Needed: Ensures the connection updates correctly.

With this setup, I can easily switch between camera and screen sharing in my OpenVidu session.

Thanks again for your help! This has made my project a lot better. If you have any more tips or things I should look out for, please let me know!

What is createOfferForPublishing()? Is there any documentation for it?