I'm sending a live MJPEG stream over TCP. My server sends base64-encoded frames, and my Flutter client receives and decodes them correctly. The stream works when I run the app on Linux (desktop) but not on the Android emulator. Here's the code that displays the stream:
// Requires dart:convert (utf8, Base64Decoder) and dart:typed_data (Uint8List).
// snapshot.data is the chunk of bytes received from the socket for one frame.
String data = utf8.decode(snapshot.data as List<int>);
return Expanded(
  child: Image.memory(
    // Base64Decoder().convert already returns a Uint8List, so the
    // Uint8List.fromList copy is redundant but harmless.
    Uint8List.fromList(Base64Decoder().convert(data)),
    gaplessPlayback: true,
    excludeFromSemantics: true,
    errorBuilder: (context, object, stacktrace) {
      return Text("error");
    },
  ),
);
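For context, the receiving side is roughly along these lines (a simplified sketch; the newline framing here is an assumption, the point being that each event handed to the builder above should contain exactly one complete base64 frame):

import 'dart:convert';
import 'dart:io';

/// Connects to the MJPEG server and yields one complete base64 frame per
/// event. This sketch assumes the server terminates each frame with '\n'.
Stream<String> frameStream(String host, int port) async* {
  final socket = await Socket.connect(host, port);
  yield* socket
      .cast<List<int>>()
      .transform(utf8.decoder)          // bytes -> text
      .transform(const LineSplitter()); // buffer until a whole frame arrives
}

The LineSplitter is only there so a TCP chunk boundary never hands the image decoder a partial frame.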
On the Android emulator, this is the error in the logs:

E/FlutterJNI( 1662): Failed to decode image
E/FlutterJNI( 1662): android.graphics.ImageDecoder$DecodeException: Failed to create image decoder with message 'unimplemented'Input contained an error.
E/FlutterJNI( 1662): at android.graphics.ImageDecoder.nCreate(Native Method)
E/FlutterJNI( 1662): at android.graphics.ImageDecoder.access$200(ImageDecoder.java:173)
E/FlutterJNI( 1662): at android.graphics.ImageDecoder$ByteBufferSource.createImageDecoder(ImageDecoder.java:250)
E/FlutterJNI( 1662): at android.graphics.ImageDecoder.decodeBitmapImpl(ImageDecoder.java:1862)
E/FlutterJNI( 1662): at android.graphics.ImageDecoder.decodeBitmap(ImageDecoder.java:1855)
E/FlutterJNI( 1662): at io.flutter.embedding.engine.FlutterJNI.decodeImage(FlutterJNI.java:524)
I've tried several different ways of decoding the frames, but none of them have worked. I think the issue could be that android.graphics is trying to read the image as a BMP?
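One way to check what android.graphics is actually being handed is to look at the bytes of the decoded frame before it goes into Image.memory: a JPEG starts with 0xFF 0xD8 and ends with 0xFF 0xD9, while a BMP starts with 'BM' (0x42 0x4D). A quick sketch (the helper names are mine):

import 'dart:typed_data';

// Rough sanity checks on a decoded frame before handing it to Image.memory.
bool looksLikeJpeg(Uint8List bytes) {
  if (bytes.length < 4) return false;
  final hasSoi = bytes[0] == 0xFF && bytes[1] == 0xD8; // JPEG start-of-image
  final hasEoi = bytes[bytes.length - 2] == 0xFF &&
      bytes[bytes.length - 1] == 0xD9;                 // JPEG end-of-image
  return hasSoi && hasEoi;
}

bool looksLikeBmp(Uint8List bytes) =>
    bytes.length >= 2 && bytes[0] == 0x42 && bytes[1] == 0x4D; // 'B', 'M'

Logging the result for each frame should show whether every frame is a well-formed JPEG or whether some of them arrive truncated or concatenated.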
One possibility is that whatever package you used did not set the right file encoding/format when downloading, leaving you with incorrectly encoded files (a .jpg saved with ASCII/text encoding, for example).
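To rule that out, you could write a single decoded frame to disk and check what it actually is, with an image viewer or the `file` command. A minimal sketch, with a hypothetical file name:

import 'dart:io';
import 'dart:typed_data';

// Writes one decoded frame to disk so it can be inspected with an image
// viewer or `file frame_dump.jpg`. The path here is hypothetical; adjust it.
Future<void> dumpFrame(Uint8List frameBytes) async {
  await File('frame_dump.jpg').writeAsBytes(frameBytes);
}

If `file` reports ASCII text or anything other than JPEG image data, that would point at the server/encoding side rather than at the Android decoder.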