I am converting an Android `Image` captured in my application to a `Bitmap`. I do this by getting the image buffer from the first pixel plane of the `Image` and then using `BitmapFactory` to decode it into a `Bitmap`. However, doing so seems to change the resolution of the image from 1920 x 1440 to 1800 x 1600, cropping out the top and bottom of the image in the process. The code for the method is shown here:
```java
protected void getImageFromBuffer(ImageReader reader) {
    Image image = reader.acquireLatestImage();
    if (image == null) {
        return; // no new frame was available
    }
    // Copy the JPEG bytes out of the first (and only) plane of a JPEG-format Image
    ByteBuffer buffer = image.getPlanes()[0].getBuffer();
    System.out.println("Getting Image Ready");
    synchronized (this) {
        image_to_upload = new byte[buffer.capacity()];
        buffer.get(image_to_upload);
        Bitmap storedBitmap = BitmapFactory.decodeByteArray(image_to_upload, 0, image_to_upload.length, null);
        // Rotate the decoded bitmap by the desired angle
        Matrix mat = new Matrix();
        mat.postRotate(jpegOrientation);
        storedBitmap = Bitmap.createBitmap(storedBitmap, 0, 0, storedBitmap.getWidth(), storedBitmap.getHeight(), mat, true);
        // Re-encode the rotated bitmap as a JPEG for upload
        ByteArrayOutputStream byteArrayOutputStream = new ByteArrayOutputStream();
        storedBitmap.compress(Bitmap.CompressFormat.JPEG, 70, byteArrayOutputStream);
        image_to_upload = byteArrayOutputStream.toByteArray();
        image_ready = true;
        System.out.println("Image Ready");
    }
    image.close(); // release the Image so the ImageReader can reuse it
}
```
Debugging shows that the height and width of the `Image` are correct (1920 x 1440) before the buffer is converted to a bitmap, but the bitmap dimensions are wrong immediately after `decodeByteArray`, before the matrix transformation is applied. Can anyone suggest why this may be?
EDIT: To add further detail, I have tried using `BitmapFactory.Options` to disable scaling and to set the target density, but neither has any impact on the resulting `Bitmap`; it is always 1800 x 1600. For reference, the variants I tried look roughly like this (a sketch; the exact density values are illustrative):
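```java
// Attempt 1: disable density-based scaling during decode
BitmapFactory.Options noScale = new BitmapFactory.Options();
noScale.inScaled = false;
Bitmap b1 = BitmapFactory.decodeByteArray(image_to_upload, 0, image_to_upload.length, noScale);

// Attempt 2: pin source and target density so no rescaling should occur
BitmapFactory.Options fixedDensity = new BitmapFactory.Options();
fixedDensity.inDensity = DisplayMetrics.DENSITY_DEFAULT;
fixedDensity.inTargetDensity = DisplayMetrics.DENSITY_DEFAULT;
Bitmap b2 = BitmapFactory.decodeByteArray(image_to_upload, 0, image_to_upload.length, fixedDensity);
```
Both `b1` and `b2` still come back 1800 x 1600.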
You can change some options that affect the resulting resolution of your bitmap by passing an `Options` parameter to `decodeByteArray`:
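For example, a minimal sketch (here `jpegBytes` stands in for your `image_to_upload` array):

```java
BitmapFactory.Options options = new BitmapFactory.Options();
options.inScaled = false;            // don't apply density-based scaling
options.inJustDecodeBounds = true;   // first pass: read dimensions only, allocate no pixels
BitmapFactory.decodeByteArray(jpegBytes, 0, jpegBytes.length, options);
int encodedWidth = options.outWidth;   // dimensions recorded in the JPEG header
int encodedHeight = options.outHeight;

options.inJustDecodeBounds = false;  // second pass: actually decode the pixels
Bitmap bitmap = BitmapFactory.decodeByteArray(jpegBytes, 0, jpegBytes.length, options);
```

The `inJustDecodeBounds` pass reports the size stored in the JPEG header itself, without any decode-time scaling. If `outWidth`/`outHeight` already come back as 1800 x 1600 there, the bytes in the buffer are genuinely that size and the problem is upstream of the decode (for example, the size the `ImageReader` was configured with), not in `decodeByteArray` itself.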