On my Jetson Nano, I'm running this pipeline:

gst-launch-1.0 -vvvvv v4l2src device=/dev/video2 ! image/jpeg,width=1920,height=1080,framerate=30/1 ! rtpjpegpay ! udpsink host=224.1.2.3 port=8556

That results in the following debug output

/GstPipeline:pipeline0/GstV4l2Src:v4l2src0.GstPad:src: caps = image/jpeg, width=(int)1920, height=(int)1080, framerate=(fraction)30/1, pixel-aspect-ratio=(fraction)1/1, colorimetry=(string)NULL, interlace-mode=(string)progressive
/GstPipeline:pipeline0/GstCapsFilter:capsfilter0.GstPad:src: caps = image/jpeg, width=(int)1920, height=(int)1080, framerate=(fraction)30/1, pixel-aspect-ratio=(fraction)1/1, colorimetry=(string)NULL, interlace-mode=(string)progressive
/GstPipeline:pipeline0/GstRtpJPEGPay:rtpjpegpay0.GstPad:src: caps = application/x-rtp, media=(string)video, clock-rate=(int)90000, encoding-name=(string)JPEG, a-framerate=(string)30.000000, payload=(int)26, ssrc=(uint)2213689011, timestamp-offset=(uint)2954707277, seqnum-offset=(uint)5542
/GstPipeline:pipeline0/GstUDPSink:udpsink0.GstPad:sink: caps = application/x-rtp, media=(string)video, clock-rate=(int)90000, encoding-name=(string)JPEG, a-framerate=(string)30.000000, payload=(int)26, ssrc=(uint)2213689011, timestamp-offset=(uint)2954707277, seqnum-offset=(uint)5542
/GstPipeline:pipeline0/GstRtpJPEGPay:rtpjpegpay0.GstPad:sink: caps = image/jpeg, width=(int)1920, height=(int)1080, framerate=(fraction)30/1, pixel-aspect-ratio=(fraction)1/1, colorimetry=(string)NULL, interlace-mode=(string)progressive
/GstPipeline:pipeline0/GstCapsFilter:capsfilter0.GstPad:sink: caps = image/jpeg, width=(int)1920, height=(int)1080, framerate=(fraction)30/1, pixel-aspect-ratio=(fraction)1/1, colorimetry=(string)NULL, interlace-mode=(string)progressive
/GstPipeline:pipeline0/GstRtpJPEGPay:rtpjpegpay0: timestamp = 2954758611
/GstPipeline:pipeline0/GstRtpJPEGPay:rtpjpegpay0: seqnum = 5677

I can see the traffic on my client machine (packets sent from the server to the multicast address), amounting to around 50 Mbps.

However, when trying to subscribe to that stream and display the output using this pipeline:

gst-launch-1.0 -vvvvvv udpsrc address=224.1.2.3 port=8556 ! application/x-rtp, encoding-name=JPEG, payload=26 ! rtpjpegdepay ! jpegdec ! videoconvert ! videoscale ! autovideosink

I only get this debug output and no image pops up.

/GstPipeline:pipeline0/GstCapsFilter:capsfilter0.GstPad:src: caps = application/x-rtp, encoding-name=(string)JPEG, payload=(int)26, media=(string)video, clock-rate=(int)90000
/GstPipeline:pipeline0/GstRtpJPEGDepay:rtpjpegdepay0.GstPad:sink: caps = application/x-rtp, encoding-name=(string)JPEG, payload=(int)26, media=(string)video, clock-rate=(int)90000
/GstPipeline:pipeline0/GstRtpJPEGDepay:rtpjpegdepay0.GstPad:src: caps = image/jpeg, parsed=(boolean)true, framerate=(fraction)0/1, width=(int)1920, height=(int)1080
/GstPipeline:pipeline0/GstJpegDec:jpegdec0.GstPad:sink: caps = image/jpeg, parsed=(boolean)true, framerate=(fraction)0/1, width=(int)1920, height=(int)1080

Everything seems to be negotiated properly except for the framerate showing "0/1". There are also no caps messages from the jpegdec src pad or from autovideosink.

I'm also providing a UDP packet screenshot, which seems suspicious to me: it has the Don't Fragment flag set while its length is only 1442 bytes (which doesn't accommodate a whole Full HD image).

I've tried lowering the image resolution to the minimum the camera supports, and I occasionally get one frame displayed on the screen, but no continuous video.

I'm using this USB camera btw: UC-684


1 Answer

Answer by Ivan:

My guess is that you're facing jitter/packet loss. The chances that some packets arrive out of order or go missing are substantial when sending 50 Mbps. With the current pipeline, if any packets belonging to a frame are reordered or lost, the whole frame is probably going to be dropped. If the network between client and server is not perfect, this can lead to no frames being decoded at all.

This also explains why you occasionally see a frame at lower resolution: lower bitrate means a higher chance of receiving all packets for a frame in the expected order.

You can confirm this by adding rtpjitterbuffer after udpsrc. The jitter buffer will take care of packet reordering. You can then look at its stats property, which reports (among other things) the number of lost packets.
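A receive pipeline with the jitter buffer inserted might look like this (a sketch based on your original pipeline; the latency value of 200 ms is the element's default and just an assumption here, tune it for your network):

```shell
# Receiver with a jitter buffer to reorder/compensate for late RTP packets.
# latency=200 (milliseconds) is an assumed starting point, not a tuned value.
gst-launch-1.0 -v udpsrc address=224.1.2.3 port=8556 \
  ! "application/x-rtp, media=video, clock-rate=90000, encoding-name=JPEG, payload=26" \
  ! rtpjitterbuffer latency=200 \
  ! rtpjpegdepay ! jpegdec ! videoconvert ! videoscale ! autovideosink
```

Note that gst-launch-1.0 won't print the stats property for you; to read it you'd need to query the element from application code, or watch the element's debug output.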

Another option is to connect your Jetson and PC with an Ethernet cable to hopefully reduce packet loss, and see if that helps.

You might also replace v4l2src device=/dev/video2 with videotestsrc ! jpegenc to rule out a problem with your USB camera stream. You could additionally test streaming at a lower resolution with videotestsrc and see if that helps.
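A synthetic sender along those lines could look like this (a sketch mirroring the caps of your original pipeline; drop the width/height to test lower resolutions):

```shell
# Synthetic test sender: replaces the USB camera with a generated pattern,
# encodes it to JPEG, and streams it over RTP exactly like the original pipeline.
gst-launch-1.0 -v videotestsrc is-live=true \
  ! video/x-raw,width=1920,height=1080,framerate=30/1 \
  ! jpegenc ! rtpjpegpay ! udpsink host=224.1.2.3 port=8556
```

If this stream plays back reliably while the camera stream does not, the problem is on the capture side rather than in the network.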