I am using QuickBlox for video calls in an app that needs to run on the Moverio BT-300 smart glasses. I need to be able to get a frame of the call as a Bitmap and save it on the device. After some research I found multiple ways to do that, but ran into a different problem with each one:
1. Stopping the video stream to release the camera, so that the camera app can be called by intent. This seems to work on most devices, but the Moverio has problems here: the camera opens and seems to work fine, but when the image is returned, onActivityResult is never called, and any subsequent attempt to open the camera app during the call results in a camera error. Outside a call, this approach works fine on the Moverio. A simplified version of what I do follows this list.
2. QuickBlox allows the use of frame listeners, but even after adding one properly to the SurfaceViewRenderer, its onFrame(bitmap: Bitmap) method was never called during the video call (see the second sketch below).
3. QuickBlox uses WebRTC's frame definition (I420Frame), which carries the information needed to build a YuvImage. The frame contains a flag (yuvFrame) that tells whether YUV data is available. On most devices it is, but on the Moverio's camera the flag comes back false and the YUV planes come back null. The third sketch below shows the conversion that works on other devices.
4. The frame definition also contains the OpenGL ES texture name, which the video call uses when YUV data is unavailable, so it should be possible to render this texture into a buffer and read the Bitmap from there. This was the most promising route, since the video call works fine on the Moverio, which means the texture information is available with the frame. The problem I faced here is using OpenGL simultaneously with the video call's own rendering: I am able to render the texture and get the Bitmap, but as soon as the call tries to render the next frame on screen, the app crashes.
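This is approach 1, simplified (the request code is arbitrary and the save step is omitted):

import android.app.Activity
import android.content.Intent
import android.graphics.Bitmap
import android.provider.MediaStore

class CallActivity : Activity() {
    companion object {
        private const val REQUEST_IMAGE_CAPTURE = 1
    }

    private fun dispatchTakePictureIntent() {
        // The video stream is stopped before this, releasing the camera.
        startActivityForResult(Intent(MediaStore.ACTION_IMAGE_CAPTURE), REQUEST_IMAGE_CAPTURE)
    }

    override fun onActivityResult(requestCode: Int, resultCode: Int, data: Intent?) {
        super.onActivityResult(requestCode, resultCode, data)
        // On the BT-300 this callback is never reached during a call.
        if (requestCode == REQUEST_IMAGE_CAPTURE && resultCode == RESULT_OK) {
            val thumbnail = data?.extras?.get("data") as? Bitmap
            // save the thumbnail...
        }
    }
}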
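This is how I registered the listener for approach 2, assuming the EglRenderer.FrameListener API from the WebRTC build bundled with QuickBlox (the scale argument of 1.0 requests full-resolution frames):

import android.graphics.Bitmap
import android.util.Log
import org.webrtc.EglRenderer

// remoteRenderer is the SurfaceViewRenderer showing the call.
val frameListener = EglRenderer.FrameListener { bitmap: Bitmap ->
    // Never reached during the video call in my tests.
    Log.d("FrameListener", "got frame ${bitmap.width}x${bitmap.height}")
}
remoteRenderer.addFrameListener(frameListener, 1.0f)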
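And this is the conversion for approach 3 that works on devices where yuvFrame is true: build an NV21 byte array from the I420 planes and wrap it in a YuvImage. The field names are those of org.webrtc.I420Frame:

import android.graphics.ImageFormat
import android.graphics.YuvImage
import org.webrtc.I420Frame

fun i420ToYuvImage(frame: I420Frame): YuvImage {
    val width = frame.width
    val height = frame.height
    val chromaWidth = (width + 1) / 2
    val chromaHeight = (height + 1) / 2
    val nv21 = ByteArray(width * height + 2 * chromaWidth * chromaHeight)

    // Copy the Y plane row by row, respecting its stride.
    val yPlane = frame.yuvPlanes[0]
    for (row in 0 until height) {
        yPlane.position(row * frame.yuvStrides[0])
        yPlane.get(nv21, row * width, width)
    }

    // NV21 stores the chroma as interleaved V/U pairs after the Y plane.
    val uPlane = frame.yuvPlanes[1]
    val vPlane = frame.yuvPlanes[2]
    var offset = width * height
    for (row in 0 until chromaHeight) {
        for (col in 0 until chromaWidth) {
            nv21[offset++] = vPlane.get(row * frame.yuvStrides[2] + col)
            nv21[offset++] = uPlane.get(row * frame.yuvStrides[1] + col)
        }
    }
    return YuvImage(nv21, ImageFormat.NV21, width, height, null)
}

From the YuvImage, compressToJpeg plus BitmapFactory.decodeByteArray gives the Bitmap. On the Moverio this never gets that far, because yuvFrame is false and yuvPlanes is null.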
The approach I am currently trying to implement is number 4. Here is my code for creating the bitmap:
val drawer = GlRectDrawer()

// Apply the frame's sampling matrix and rotation, then flip vertically
// because glReadPixels returns the image bottom-up.
val texMatrix = RendererCommon.rotateTextureMatrix(frame.samplingMatrix, frame.rotationDegree.toFloat())
val bitmapMatrix = RendererCommon.multiplyMatrices(texMatrix, RendererCommon.verticalFlipMatrix())

// Create and bind a framebuffer to render into, with a GL_TEXTURE_2D
// as its color attachment.
val buffers = IntArray(1)
GLES20.glGenFramebuffers(1, buffers, 0)
GLES20.glBindFramebuffer(GLES20.GL_FRAMEBUFFER, buffers[0])
GLES20.glFramebufferTexture2D(
    GLES20.GL_FRAMEBUFFER,
    GLES20.GL_COLOR_ATTACHMENT0,
    GLES20.GL_TEXTURE_2D,
    buffers[0], 0)

// Draw the frame's external OES texture into the framebuffer.
drawer.drawOes(textureId, bitmapMatrix, rotatedWidth(), rotatedHeight(), 0, 0, rotatedWidth(), rotatedHeight())

// Read the pixels back and release the GL objects.
val bitmapBuffer = ByteBuffer.allocateDirect(rotatedWidth() * rotatedHeight() * 4)
GLES20.glReadPixels(0, 0, rotatedWidth(), rotatedHeight(), GLES20.GL_RGBA, GLES20.GL_UNSIGNED_BYTE, bitmapBuffer)
GLES20.glBindFramebuffer(GLES20.GL_FRAMEBUFFER, 0)
GLES20.glDeleteFramebuffers(1, buffers, 0)

val bitmap = Bitmap.createBitmap(rotatedWidth(), rotatedHeight(), Bitmap.Config.ARGB_8888)
bitmap.copyPixelsFromBuffer(bitmapBuffer)
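Saving the resulting Bitmap itself is not the problem; my helper is along these lines (path handling omitted):

import android.graphics.Bitmap
import java.io.File
import java.io.FileOutputStream

fun saveBitmap(bitmap: Bitmap, file: File) {
    // Compress to JPEG and write it out; quality 90 is arbitrary.
    FileOutputStream(file).use { out ->
        bitmap.compress(Bitmap.CompressFormat.JPEG, 90, out)
    }
}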
This is the exception log when the app crashes:
08-17 17:25:26.740 5288-5486/? W/System.err: java.lang.RuntimeException: java.lang.RuntimeException: YuvConverter.convert: GLES20 error: 1282
at org.webrtc.GlUtil.checkNoGLES2Error(GlUtil.java:29)
at org.webrtc.YuvConverter.convert(YuvConverter.java:239)
at org.webrtc.SurfaceTextureHelper$7.run(SurfaceTextureHelper.java:234)
at org.webrtc.ThreadUtils$5.call(ThreadUtils.java:208)
at org.webrtc.ThreadUtils$5.call(ThreadUtils.java:205)
at org.webrtc.ThreadUtils$4.run(ThreadUtils.java:182)
at android.os.Handler.handleCallback(Handler.java:739)
at android.os.Handler.dispatchMessage(Handler.java:95)
at android.os.Looper.loop(Looper.java:135)
This is the code where the exception happens (from the WebRTC package):
// Raw GLES constants: 36160 = GL_FRAMEBUFFER, 33984 = GL_TEXTURE0,
// 3553 = GL_TEXTURE_2D, 36197 = GL_TEXTURE_EXTERNAL_OES, 6408 = GL_RGBA,
// 5121 = GL_UNSIGNED_BYTE, 36053 = GL_FRAMEBUFFER_COMPLETE.
GLES20.glBindFramebuffer(36160, this.frameBufferId);
GlUtil.checkNoGLES2Error("glBindFramebuffer");
if (this.frameBufferWidth != stride / 4 || this.frameBufferHeight != total_height) {
    this.frameBufferWidth = stride / 4;
    this.frameBufferHeight = total_height;
    GLES20.glActiveTexture(33984);
    GLES20.glBindTexture(3553, this.frameTextureId);
    GLES20.glTexImage2D(3553, 0, 6408, this.frameBufferWidth, this.frameBufferHeight, 0, 6408, 5121, (Buffer)null);
    int status = GLES20.glCheckFramebufferStatus(36160);
    if (status != 36053) {
        throw new IllegalStateException("Framebuffer not complete, status: " + status);
    }
}
GLES20.glActiveTexture(33984);
GLES20.glBindTexture(36197, srcTextureId);
...
GLES20.glReadPixels(0, 0, this.frameBufferWidth, this.frameBufferHeight, 6408, 5121, buf);
GlUtil.checkNoGLES2Error("YuvConverter.convert");
GLES20.glBindFramebuffer(36160, 0);
GLES20.glBindTexture(3553, 0);
GLES20.glBindTexture(36197, 0);
Depending on the exact moment I render the frame, the error sometimes happens on glBindFramebuffer(36160, this.frameBufferId) and sometimes on GLES20.glReadPixels. A couple of times the timing allowed the image to be shown on screen (in a dialog meant to confirm saving it) for a split second before the next frame arrived and the app crashed.
As the OpenGL reference for glBindFramebuffer says, error code 1282 (GL_INVALID_OPERATION) means the value provided is not the name of a generated framebuffer; but the framebuffer is generated correctly, and this WebRTC code works fine at any other time.
Am I forgetting something in my code that is causing this?