I'm working on a Camera2 app where I have a custom OpenGL pipeline set up for processing video frames.
In my pipeline, I use a couple of different third-party frame-processing APIs that expect an `android.media.Image` as a parameter (e.g. for detecting faces, or other models).
I have an OpenGL texture which I can use to create & fill an `AHardwareBuffer`. How do I now synchronously create `android.media.Image` instances which are backed by my OpenGL texture/`AHardwareBuffer`?
- I understand that the `android.media.Image` class' implementation is pretty much entirely in NDK/C++, but `AImage` from `mediandk` also doesn't provide a constructor.
- I say synchronously because I am aware of creating an `ImageReader` and then streaming into its Surface, but this is not what I want, since those Images are not created synchronously; they are only delivered in the `setOnImageAvailableListener` callback.
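For context, this is the asynchronous pattern I'm referring to (a minimal sketch; `faceDetector` stands in for one of the third-party APIs, and the size, image count, and handler are placeholders):

```kotlin
import android.graphics.PixelFormat
import android.media.Image
import android.media.ImageReader
import android.os.Handler

fun setUpReader(backgroundHandler: Handler, process: (Image) -> Unit): ImageReader {
    // Frames rendered into reader.surface only arrive later, in the callback.
    val reader = ImageReader.newInstance(1920, 1080, PixelFormat.RGBA_8888, /* maxImages = */ 3)
    reader.setOnImageAvailableListener({ r ->
        val image = r.acquireLatestImage() ?: return@setOnImageAvailableListener
        process(image)  // e.g. hand the Image to a face-detection API
        image.close()   // must be closed so the buffer returns to the reader
    }, backgroundHandler)
    return reader
}
```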
While an Android `media.Image` is in fact mostly a wrapper around a graphics buffer, and at the low level can therefore be treated as an `AHardwareBuffer` (plus metadata), there is unfortunately no public API to create a `media.Image` from an `AHardwareBuffer` (though the other way around is possible, via `Image.getHardwareBuffer()`), since a `media.Image` also has ownership semantics tied to ImageReaders and ImageWriters.

I know you'd like a synchronous solution, but I don't believe there is one in the public API. You basically have to use an `ImageReader` as the destination of your EGL rendering.
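A rough sketch of that setup, assuming API 26+ and an already-initialized EGL display/config/context (those names are placeholders for your existing pipeline objects):

```kotlin
import android.graphics.PixelFormat
import android.hardware.HardwareBuffer
import android.media.ImageReader
import android.opengl.EGL14
import android.opengl.EGLConfig
import android.opengl.EGLContext
import android.opengl.EGLDisplay

fun renderIntoReader(eglDisplay: EGLDisplay, eglConfig: EGLConfig, eglContext: EGLContext): ImageReader {
    // Create an ImageReader whose buffers are usable as GPU render targets.
    val reader = ImageReader.newInstance(
        1920, 1080, PixelFormat.RGBA_8888, /* maxImages = */ 3,
        HardwareBuffer.USAGE_GPU_COLOR_OUTPUT or HardwareBuffer.USAGE_GPU_SAMPLED_IMAGE
    )
    // Wrap the reader's Surface as an EGL window surface and render into it.
    val eglSurface = EGL14.eglCreateWindowSurface(
        eglDisplay, eglConfig, reader.surface, intArrayOf(EGL14.EGL_NONE), 0
    )
    EGL14.eglMakeCurrent(eglDisplay, eglSurface, eglSurface, eglContext)
    // ... issue your GL draw calls for the processed frame here ...
    EGL14.eglSwapBuffers(eglDisplay, eglSurface)  // queues the buffer to the ImageReader
    // The resulting Image is then delivered in the OnImageAvailableListener callback.
    return reader
}
```

Note that the latency here is small: the Image becomes available as soon as the queued buffer is consumed, so in practice the callback fires shortly after `eglSwapBuffers`.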