I am trying to build a WebRTC application in Android Studio, but I cannot figure out why the SurfaceViewRenderer is not displaying the captured frames, even though the Android logs show that frames are being captured by the VideoCapturer.
I have tried many ways to get the SurfaceViewRenderer to display the frames. Here is the latest configuration.
Android Manifest:
<uses-feature
android:name="android.hardware.camera"
android:required="true" />
<uses-permission android:name="android.permission.INTERNET" />
<uses-permission android:name="android.permission.ACCESS_NETWORK_STATE" />
<uses-permission android:name="android.permission.CAMERA" />
<uses-permission android:name="android.permission.FOREGROUND_SERVICE_MICROPHONE" />
<uses-permission android:name="android.permission.QUERY_ALL_PACKAGES"/>
<uses-feature android:name="android.hardware.camera.autofocus"/>
<uses-feature android:name="android.hardware.camera.front"
android:required="true"/>
Initialization of the SurfaceViewRenderer:
val surfaceViewRenderer = findViewById<SurfaceViewRenderer>(R.id.surfaceViewRenderer)
surfaceViewRenderer.init(rootEglBase.eglBaseContext, null)
surfaceViewRenderer.setScalingType(RendererCommon.ScalingType.SCALE_ASPECT_FIT)
surfaceViewRenderer.setZOrderMediaOverlay(true)
surfaceViewRenderer.setEnableHardwareScaler(true)
surfaceViewRenderer.setMirror(true)
surfaceViewRenderer.setBackgroundColor(Color.Transparent.toArgb())
surfaceViewRenderer.visibility = View.VISIBLE
The addFramesToLocalVideoView method (localVideoView is the SurfaceViewRenderer; this method is called from another class):
private void addFramesToLocalVideoView() {
    if (localVideoView != null && localVideoView.isAttachedToWindow()) {
        var widthPixels = (int) (120 * context.getResources().getDisplayMetrics().density);
        var heightPixels = (int) (160 * context.getResources().getDisplayMetrics().density);
        localVideoView.post(() -> videoCapturer.startCapture(widthPixels, heightPixels, 24));
        videoSource = peerConnectionFactory.createVideoSource(videoCapturer.isScreencast());
        localVideoTrack = peerConnectionFactory.createVideoTrack("localVideoTrack", videoSource);
        localVideoTrack.addSink(localVideoView);
        peerConnection.addTrack(localVideoTrack);
        Log.d(TAG, "initializePeerConnections: " + localVideoView.isEnabled());
    }
}
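As a sanity check, I verified that the dp-to-px conversion passed to startCapture() yields a reasonable resolution. Standalone version of just that math (the density value here is assumed for illustration; on a device it comes from DisplayMetrics):

```java
public class CaptureSizeCheck {
    // Same conversion as in addFramesToLocalVideoView: dp * density, truncated to int.
    static int dpToPx(float dp, float density) {
        return (int) (dp * density);
    }

    public static void main(String[] args) {
        float density = 2.0f; // assumed xhdpi density for illustration
        System.out.println(dpToPx(120, density) + "x" + dpToPx(160, density)); // 240x320
    }
}
```

So the capture request itself (e.g. 240x320 at 24 fps) looks sane; the problem does not seem to be a degenerate resolution.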
The VideoCapturer methods:
public VideoCapturer createVideoCapturer() {
    // Get the list of available cameras using Camera2Enumerator
    Camera2Enumerator enumerator = new Camera2Enumerator(context);
    String[] deviceNames = enumerator.getDeviceNames();
    // Select the front or back camera (or any specific camera)
    for (String deviceName : deviceNames) {
        if (enumerator.isFrontFacing(deviceName)) {
            // Use the front-facing camera
            return createCameraCapturer(enumerator, deviceName);
        }
    }
    return null; // Handle case where no suitable camera is found
}

private VideoCapturer createCameraCapturer(CameraEnumerator enumerator, String deviceName) {
    SurfaceTextureHelper surfaceTextureHelper = SurfaceTextureHelper.create("CameraTexture", null);
    VideoCapturer videoCapturer = enumerator.createCapturer(deviceName, new CustomVideoCapturer("VideoCapturer"));
    videoCapturer.initialize(surfaceTextureHelper, context, new CustomCapturerObserver("CapturerObserver"));
    return videoCapturer;
}
Some logs:
Camera2Session: Opening camera 1
onCameraOpening: 1
startInputReason = 1
Camera2Session: Camera opened.
Camera2Session: Camera capture session configured.
Camera2Session: Using video stabilization.
Camera2Session: Using continuous video auto-focus.
Camera2Session: Camera device successfully started.
CameraCapturer: Create session done. Switch state: IDLE
onCapturerStarted: true
onFirstFrameAvailable
onFrameCaptured: org.webrtc.VideoFrame@23da5de
onFrameCaptured: org.webrtc.VideoFrame@47d73bf