RGBA16UI support across browsers for WebGL2


I was trying to get WebGL2 RGBA16UI textures working, and they seemed to work as shown in the answer to my previous question: How to create a 16 bit unsigned integer/unsigned short 2D texture in WebGL

However, while the code from that answer works on its own:

const gl = canvas.getContext('webgl2');
const tex = gl.createTexture();

gl.bindTexture(gl.TEXTURE_2D, tex);
gl.texImage2D(
      gl.TEXTURE_2D,
      0,
      gl.RGBA16UI,
      8,
      8,
      0,
      gl.RGBA_INTEGER,
      gl.UNSIGNED_SHORT,
      null
);
console.log(gl.getError()); // logs 0 (gl.NO_ERROR)

it fails as soon as I add the actual pixel data:

const gl = canvas.getContext('webgl2');
const tex = gl.createTexture();

gl.bindTexture(gl.TEXTURE_2D, tex);
let data = new Uint16Array(8 * 8 * 4);
gl.texImage2D(
      gl.TEXTURE_2D,
      0,
      gl.RGBA16UI,
      8,
      8,
      0,
      gl.RGBA_INTEGER,
      gl.UNSIGNED_SHORT,
      data
);
console.log(gl.getError()); // non-zero on Chrome and Safari; 0 on Firefox (see below)

This code fails even if I overallocate the array, for instance to 1000 elements for this 8x8 texture (which only needs 8 * 8 * 4 = 256 values).

I've also tried different types of typed arrays, but since the call uses gl.UNSIGNED_SHORT, the console explicitly tells me to use a Uint16Array.
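
For reference, here is the sanity check I put together (the helper name checkRGBA16UIUpload is my own sketch, not a WebGL API; it just encodes the spec's rule that an RGBA16UI upload takes gl.RGBA_INTEGER, gl.UNSIGNED_SHORT, and a Uint16Array of at least width * height * 4 elements):

function checkRGBA16UIUpload(width, height, data) {
  // RGBA16UI with gl.UNSIGNED_SHORT requires a Uint16Array.
  if (!(data instanceof Uint16Array)) {
    throw new Error('RGBA16UI with UNSIGNED_SHORT needs a Uint16Array');
  }
  // 4 channels (R, G, B, A) per texel, one Uint16 each.
  const needed = width * height * 4;
  if (data.length < needed) {
    throw new Error(`array too small: ${data.length} < ${needed}`);
  }
}

checkRGBA16UIUpload(8, 8, new Uint16Array(8 * 8 * 4)); // passes, so the data itself looks fine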

The upload above doesn't work on Safari or Chrome, but works fine on Firefox, even though RGBA16UI appears to be a reasonably well-supported part of WebGL2, which is what I'm using: https://webgl2fundamentals.org/webgl/lessons/webgl-readpixels.html

I suspect this is just a discrepancy between the WebGL implementations in these browsers. Still, I'm wondering: why didn't this issue show up when creating the texture without data? Why is the error message so vague (it was only a console warning, it appeared only once I added the array data, and it only said that the format and type were invalid rather than unsupported)? Which extensions are recommended for working around this, and how well are they supported? And if I'm wrong about the cause, how do I fix this?
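
In case it helps anyone reproduce this, here is the global pixel-unpack state I know of that can affect a client-side texImage2D upload (the defaults in the comments are from the WebGL2 spec; whether any of these is actually the culprit here is a guess):

// Any of these, if changed from its default, can make texImage2D
// reject or misread a client-side array:
console.log(gl.getParameter(gl.PIXEL_UNPACK_BUFFER_BINDING)); // default: null
console.log(gl.getParameter(gl.UNPACK_ALIGNMENT));            // default: 4
console.log(gl.getParameter(gl.UNPACK_ROW_LENGTH));           // default: 0
console.log(gl.getParameter(gl.UNPACK_SKIP_PIXELS));          // default: 0
console.log(gl.getParameter(gl.UNPACK_SKIP_ROWS));            // default: 0

// In WebGL2, if a buffer is bound to PIXEL_UNPACK_BUFFER, the
// ArrayBufferView overload of texImage2D generates INVALID_OPERATION.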

It'd also be great if you could point me to some good docs on supported formats, since the ones I've looked at seem to show that this is supported in WebGL2.

1 Answer

Olivia Smith:

Just did a check, and it turned out to be a problem with some initialization code in a library I was using before creating the texture. Sorry about that. RGBA16UI textures now appear to work fine in WebGL2, even when loading data.
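
In case anyone else hits the same symptom: I didn't track down exactly which call in the library was responsible, but leftover pixel-unpack state is one plausible way to break only the with-data upload (an assumption on my part, not a confirmed diagnosis). A defensive reset to the spec defaults before the texImage2D call is cheap insurance:

// Make sure no pixel-unpack buffer or non-default unpack parameter
// is left over from earlier code:
gl.bindBuffer(gl.PIXEL_UNPACK_BUFFER, null);
gl.pixelStorei(gl.UNPACK_ALIGNMENT, 4);
gl.pixelStorei(gl.UNPACK_ROW_LENGTH, 0);
gl.pixelStorei(gl.UNPACK_SKIP_PIXELS, 0);
gl.pixelStorei(gl.UNPACK_SKIP_ROWS, 0);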