I've got a camera that produces a video stream in several formats, one of them being an H.264-encoded stream. I use DirectShow together with GStreamer to acquire this stream, process it in C++, and also send it on via shared memory. My question is specifically about the stream-format field in the GStreamer caps. How do I know whether it is byte-stream or avc? Is it tightly coupled to the stream produced by the camera, or is it a transport detail that I choose myself when setting up GStreamer? I work with rather complicated, commercially specific pipelines that I cannot share in full here. When I set gst_caps_set_simple (gstCaps, "stream-format", G_TYPE_STRING, "byte-stream", nullptr); in my code it works, but I am not 100% sure why, so I want to clarify that.
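To make the question concrete, here is a simplified, made-up sketch of that caps call in an appsrc-style setup (not my real pipeline, which I cannot share; the alignment value and element names are just illustrative):

#include <gst/gst.h>

int main(int argc, char *argv[]) {
    gst_init(&argc, &argv);

    // appsrc that receives the camera's H.264 buffers from application code.
    GstElement *appsrc = gst_element_factory_make("appsrc", "src");

    // Caps describing what is pushed: encoded H.264, here declared as
    // Annex B byte-stream with access-unit alignment (illustrative values).
    GstCaps *caps = gst_caps_new_simple("video/x-h264",
                                        "stream-format", G_TYPE_STRING, "byte-stream",
                                        "alignment",     G_TYPE_STRING, "au",
                                        nullptr);
    g_object_set(appsrc, "caps", caps, "format", GST_FORMAT_TIME, nullptr);
    gst_caps_unref(caps);

    // ... link appsrc -> h264parse -> decoder/sink and push buffers ...
    return 0;
}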
I've tried checking information about this specific camera with ffmpeg, but I cannot get any info about the stream-format. The best I got is:
ffmpeg -f dshow -list_options true -i video="nameOfCamera"
(...)
vcodec=h264 min s=128x96 fps=0.015625 max s=768x480 fps=29.97
vcodec=h264 min s=128x96 fps=0.015625 max s=768x576 fps=25
vcodec=h264 min s=128x96 fps=0.015625 max s=768x480 fps=29.97
vcodec=h264 min s=128x96 fps=0.015625 max s=768x576 fps=25
vcodec=h264 min s=128x96 fps=0.015625 max s=768x480 fps=29.97
vcodec=h264 min s=128x96 fps=0.015625 max s=768x576 fps=25
vcodec=h264 min s=128x96 fps=0.015625 max s=768x576 fps=25
(...)
I would think the interface you get the data from should specify the data format.
byte-stream in GStreamer terms is the regular H.264 byte-stream as per the H.264 specification, with start codes, start code emulation prevention bytes, etc.
Alternatively, avc samples are the other format (MP4 files store H.264 data as AVC samples, not as byte-stream). AVC samples are basically [nal length][nal data of length size]... chunks, with no start codes or start code emulation prevention bytes. AVC samples give you quick access to the NAL chunks; for the regular byte-stream you would have to do a bit more parsing.
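As an illustration, splitting an AVC sample is just a matter of walking the length prefixes; a rough sketch, assuming 4-byte NAL length fields (the real length size is signalled in the avcC / codec_data):

#include <cstdint>
#include <cstddef>
#include <utility>
#include <vector>

// Sketch: split one AVC sample into NAL units, assuming 4-byte big-endian
// length prefixes (the actual prefix size comes from the avcC / codec_data).
std::vector<std::pair<const uint8_t*, size_t>> split_avc_sample(const uint8_t *data, size_t size) {
    std::vector<std::pair<const uint8_t*, size_t>> nals;
    size_t pos = 0;
    while (pos + 4 <= size) {
        // Read the 4-byte NAL length field.
        uint32_t len = (uint32_t(data[pos]) << 24) | (uint32_t(data[pos + 1]) << 16) |
                       (uint32_t(data[pos + 2]) << 8) | uint32_t(data[pos + 3]);
        pos += 4;
        if (pos + len > size)
            break;                          // truncated sample, stop
        nals.emplace_back(data + pos, len); // one complete NAL unit
        pos += len;
    }
    return nals;
}

// With byte-stream data there is no length field: you scan for the
// 00 00 01 / 00 00 00 01 start codes and also have to undo the emulation
// prevention bytes before interpreting the NAL payload.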
So you will have to set the type to match whatever you actually feed into GStreamer, so that the downstream element knows what kind of data to expect.
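If you want to check what your camera actually hands you before setting the caps, a simple heuristic (my own sketch, not something GStreamer requires) is to look at the first bytes of a buffer: byte-stream data begins with a 00 00 01 / 00 00 00 01 start code, while an AVC sample begins with a length field.

#include <cstdint>
#include <cstddef>

// Heuristic sketch: true if the buffer looks like Annex B byte-stream,
// i.e. it begins with a 3- or 4-byte start code.
bool looks_like_byte_stream(const uint8_t *data, size_t size) {
    if (size >= 4 && data[0] == 0 && data[1] == 0 && data[2] == 0 && data[3] == 1)
        return true;                     // 00 00 00 01 start code
    if (size >= 3 && data[0] == 0 && data[1] == 0 && data[2] == 1)
        return true;                     // 00 00 01 start code
    return false;
}

// The caps then follow what the data actually is, e.g.
//   video/x-h264, stream-format=byte-stream, alignment=au
// or
//   video/x-h264, stream-format=avc, alignment=au   (avc also needs codec_data)

If a downstream element needs the other representation, h264parse can convert between byte-stream and avc, so putting h264parse right after your source is a common way to stay flexible about which form the camera delivers.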