I want to get the distance from the camera to the depth pixel at position (x, y). The documentation has a getMillimetersDepth() function that returns depth in millimeters, which I believe is what I need. However, when I use that function in the "helloar" sample code, I get errors. I have tried different x and y values.
In HelloArRenderer.kt, I called the getMillimetersDepth() function here:
if (camera.trackingState == TrackingState.TRACKING && shouldGetDepthImage) {
    try {
        val depthImage = frame.acquireDepthImage16Bits()
        // Get center x and y position of surface
        val point = render.point
        val depthValue = getMillimetersDepth(depthImage, 100, 100)
        // Log.d("DepthValue", depthValue.toString())
        backgroundRenderer.updateCameraDepthTexture(depthImage)
        depthImage.close()
    } catch (e: NotYetAvailableException) {
        // This normally means that depth data is not available yet. This is normal,
        // so we will not spam the logcat with this.
    }
}
The getMillimetersDepth() function is:
/** Obtain the depth in millimeters for [depthImage] at coordinates ([x], [y]). */
fun getMillimetersDepth(depthImage: Image, x: Int, y: Int): Int {
    // The depth image has a single plane, which stores depth for each
    // pixel as 16-bit unsigned integers.
    val plane = depthImage.planes[0]
    val byteIndex = x * plane.pixelStride + y * plane.rowStride
    val buffer = plane.buffer.order(ByteOrder.nativeOrder())
    val depthSample = buffer.getShort(byteIndex)
    return depthSample.toInt()
}
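For reference, the byte index computed above assumes that x and y are pixel coordinates inside the depth image itself, which is far smaller than the screen. Below is a bounds-checked sketch of the same lookup (the function name and the unsigned masking are my own additions, not from the ARCore sample); it returns null instead of throwing when the coordinates fall outside the depth image:

import android.media.Image
import java.nio.ByteOrder

/**
 * Sketch only: a bounds-checked variant of getMillimetersDepth().
 * Returns null when (x, y) lies outside the depth image instead of crashing.
 */
fun getMillimetersDepthOrNull(depthImage: Image, x: Int, y: Int): Int? {
    // x and y must index into the depth image, whose resolution is much lower
    // than the camera preview or the screen.
    if (x < 0 || y < 0 || x >= depthImage.width || y >= depthImage.height) return null

    val plane = depthImage.planes[0]
    val byteIndex = x * plane.pixelStride + y * plane.rowStride
    val buffer = plane.buffer.order(ByteOrder.nativeOrder())
    // Depth is stored as a 16-bit unsigned value; mask so readings beyond
    // 32767 mm do not come back negative.
    return buffer.getShort(byteIndex).toInt() and 0xFFFF
}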
I am getting the following errors.
Error 1 occurs when the values of x and y are increased. I am using a Google Pixel 7a with a 2219×1080 resolution.
E/Surface: freeAllBuffers: 1 buffers were freed while being dequeued!
E/AndroidRuntime: FATAL EXCEPTION: GLThread 142
Process: com.example.arcoretest, PID: 26925
java.lang.IndexOutOfBoundsException: index=366200 out of bounds (limit=28800, nb=2)
    at java.nio.Buffer.checkIndex(Buffer.java:717)
    at java.nio.DirectByteBuffer.getShort(DirectByteBuffer.java:484)
    at com.google.ar.core.examples.kotlin.helloar.HelloArRenderer.getMillimetersDepth(HelloArRenderer.kt:432)
    at com.google.ar.core.examples.kotlin.helloar.HelloArRenderer.onDrawFrame(HelloArRenderer.kt:316)
    at com.example.arcoretest.java.common.samplerender.SampleRender$1.onDrawFrame(SampleRender.java:72)
    at android.opengl.GLSurfaceView$GLThread.guardedRun(GLSurfaceView.java:1573)
    at android.opengl.GLSurfaceView$GLThread.run(GLSurfaceView.java:1272)
Error 2
Failed: Image has too few landmarks. [Required: 9, Actual: 0].; Initializer's SSBA failed to produce a valid output.
=== Source Location Trace: ===
third_party/redwood/perception/odometry/visual_inertial_initialization/bundle_adjustment_initializer.cc:330
Why am I getting these errors? What is the best way to get the distance for given camera coordinates x and y? Thanks for your help.
I found the reason. The x and y coordinates must be valid pixel coordinates within the depth image itself; if they fall outside it, you get the IndexOutOfBoundsException. The depth image is much smaller than the camera image (the limit=28800 bytes in the crash is 14400 pixels at 2 bytes each, e.g. a 160×90 depth image), and because the depth data is produced for the camera image at that lower resolution, you have to convert your CPU image coordinates into depth image coordinates first. When you use depth image coordinates, you will not get the error.
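To make the conversion concrete, here is a rough sketch of what I mean, using ARCore's Frame.transformCoordinates2d() and the Coordinates2d enum to map CPU camera-image pixel coordinates to depth image pixel coordinates before sampling (the helper name is mine, and the exact coordinate-system constants are worth double-checking against the depth developer guide for your ARCore version):

import android.media.Image
import com.google.ar.core.Coordinates2d
import com.google.ar.core.Frame

/**
 * Sketch only: map a CPU camera-image pixel (cpuX, cpuY) into depth image
 * pixels via the frame's 2D coordinate transform, then sample the depth there.
 */
fun depthAtCpuPixel(frame: Frame, depthImage: Image, cpuX: Float, cpuY: Float): Int? {
    val cpuCoords = floatArrayOf(cpuX, cpuY)
    val texCoords = FloatArray(2)
    // CPU image pixels -> normalized texture coordinates, which the depth
    // image shares since it covers the same field of view.
    frame.transformCoordinates2d(
        Coordinates2d.IMAGE_PIXELS, cpuCoords,
        Coordinates2d.TEXTURE_NORMALIZED, texCoords
    )
    // Scale the normalized coordinates up to depth image pixels.
    val depthX = (texCoords[0] * depthImage.width).toInt()
    val depthY = (texCoords[1] * depthImage.height).toInt()
    if (depthX !in 0 until depthImage.width || depthY !in 0 until depthImage.height) return null
    return getMillimetersDepth(depthImage, depthX, depthY)
}

If you start from a screen tap rather than a CPU image pixel, Coordinates2d.VIEW should be the input coordinate system instead of IMAGE_PIXELS.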