The reason I ask is that I receive container data (format name "dhav") as byte[] chunks, one by one, and I need to push that data continuously to RTMP for playback.

What progress have I made so far?

Right now I can push data to RTMP and play the stream in VLC, but only for a few seconds; then the RTMP stream ends.

This is because the grabber is created from an InputStream that only contains one chunk of the data taken from the ByteBuffer; when that InputStream is exhausted, the RTMP stream is closed.

synchronized (buffer) {
    buffer.flip();
    byte[] bytes = new byte[buffer.remaining()];
    buffer.get(bytes);
    buffer.clear();
    isByteBufferFull[0] = false;
    try {
        grabAndPush(bytes, SRS_PUSH_ADDRESS);
    } catch (Exception e) {
        //throw new RuntimeException(e);
    }
}
private static synchronized void grabAndPush(byte[] bytes, String pushAddress) throws Exception {
    avutil.av_log_set_level(avutil.AV_LOG_INFO);
    FFmpegLogCallback.set();

    FFmpegFrameGrabber grabber = new FFmpegFrameGrabber(new ByteArrayInputStream(bytes));
    ...
}

So can anyone tell me how to keep the RTMP stream alive with FFmpegFrameGrabber and FFmpegFrameRecorder when the source data arrives chunk by chunk? Much appreciated.

This is my code:

import lombok.extern.slf4j.Slf4j;
import org.bytedeco.ffmpeg.avcodec.AVCodecParameters;
import org.bytedeco.ffmpeg.avformat.AVFormatContext;
import org.bytedeco.ffmpeg.avformat.AVStream;
import org.bytedeco.ffmpeg.global.avcodec;
import org.bytedeco.ffmpeg.global.avutil;
import org.bytedeco.javacv.FFmpegFrameGrabber;
import org.bytedeco.javacv.FFmpegFrameRecorder;
import org.bytedeco.javacv.FFmpegLogCallback;
import org.bytedeco.javacv.Frame;
import org.jfjy.ch2ji.ecctv.dh.api.ApiService;
import org.jfjy.ch2ji.ecctv.dh.callback.RealPlayCallback;

import java.io.ByteArrayInputStream;
import java.nio.ByteBuffer;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;

@Slf4j
public class GetBytes2PushRTMPNew2 {

    private static final String SRS_PUSH_ADDRESS = "rtmp://127.0.0.1:1935/live/livestream";

    static int BUFFER_CAPACITY = 1 * 1024 * 1024;

    public static void main(String[] args) throws Exception {
        FFmpegLogCallback.set();
        ApiService apiService = new ApiService();
        Long login = apiService.login("10.3.0.54", 8801, "admin", "xxxx");
        ByteBuffer buffer = ByteBuffer.allocate(BUFFER_CAPACITY);
        final boolean[] isByteBufferFull = {false};
        apiService.startRealPlay(new RealPlayCallback<Long, Integer, byte[]>() {
            @Override
            public void apply(Long aLong, Integer integer, byte[] bytes) {
                try {
                    //push data to bytebuffer
                    synchronized (buffer) {
                        if (buffer.remaining() > bytes.length) {
                            buffer.put(bytes);
                        } else {
                            isByteBufferFull[0] = true;
                        }
                    }
                } catch (Exception e) {
                    throw new RuntimeException(e);
                }
            }
        }, 0, 0);

        ExecutorService executorService = Executors.newFixedThreadPool(1);
        executorService.execute(new Runnable() {
            @Override
            public void run() {
                while (true) {
                    //get data from bytebuffer when buffer is full
                    synchronized (isByteBufferFull) {
                        if (isByteBufferFull[0]) {
                            synchronized (buffer) {
                                buffer.flip();
                                byte[] bytes = new byte[buffer.remaining()];
                                buffer.get(bytes);
                                buffer.clear();
                                isByteBufferFull[0] = false;
                                try {
                                    //using grabber and recorder to push RTMP
                                    grabAndPush(bytes, SRS_PUSH_ADDRESS);
                                } catch (Exception e) {
                                    //throw new RuntimeException(e);
                                }

                            }
                        }
                    }
                    try {
                        Thread.sleep(500);
                    } catch (InterruptedException e) {
                        throw new RuntimeException(e);
                    }
                }

            }
        });
        // keep the main thread alive without burning CPU
        while (true) {
            Thread.sleep(1000);
        }
    }

    private static synchronized void grabAndPush(byte[] bytes, String pushAddress) throws Exception {
        avutil.av_log_set_level(avutil.AV_LOG_INFO);
        FFmpegLogCallback.set();

        FFmpegFrameGrabber grabber = new FFmpegFrameGrabber(new ByteArrayInputStream(bytes));


        grabber.setFormat("dhav");
        grabber.start();

        AVFormatContext avFormatContext = grabber.getFormatContext();

        int streamNum = avFormatContext.nb_streams();

        if (streamNum < 1) {
            log.error("no media!");
            return;
        }

        int frameRate = (int) grabber.getVideoFrameRate();
        if (0 == frameRate) {
            frameRate = 15;
        }
        log.info("frameRate[{}],duration[{}]s,nb_streams[{}]",
                frameRate,
                avFormatContext.duration() / 1000000,
                avFormatContext.nb_streams());

        for (int i = 0; i < streamNum; i++) {
            AVStream avStream = avFormatContext.streams(i);
            AVCodecParameters avCodecParameters = avStream.codecpar();
            log.info("stream index[{}],codec type[{}],codec ID[{}]", i, avCodecParameters.codec_type(), avCodecParameters.codec_id());
        }

        int frameWidth = grabber.getImageWidth();
        int frameHeight = grabber.getImageHeight();
        int audioChannels = grabber.getAudioChannels();

        log.info("frameWidth[{}],frameHeight[{}],audioChannels[{}]",
                frameWidth,
                frameHeight,
                audioChannels);

        FFmpegFrameRecorder recorder = new FFmpegFrameRecorder(pushAddress,
                frameWidth,
                frameHeight,
                audioChannels);

        recorder.setVideoCodec(avcodec.AV_CODEC_ID_H264);
        recorder.setInterleaved(true);

        recorder.setFormat("flv");

        recorder.setFrameRate(frameRate);

        recorder.setGopSize(frameRate);

        recorder.setAudioChannels(grabber.getAudioChannels());


        recorder.start();


        Frame frame;


        log.info("start push");

        int videoFrameNum = 0;
        int audioFrameNum = 0;
        int dataFrameNum = 0;

        int interVal = 1000 / frameRate;
        interVal /= 8;

        while (null != (frame = grabber.grab())) {

            if (null != frame.image) {
                videoFrameNum++;
            }

            if (null != frame.samples) {
                audioFrameNum++;
            }

            if (null != frame.data) {
                dataFrameNum++;
            }

            recorder.record(frame);

            Thread.sleep(interVal);
        }

        log.info("push complete,videoFrameNum[{}],audioFrameNum[{}],dataFrameNum[{}]",
                videoFrameNum,
                audioFrameNum,
                dataFrameNum);

        recorder.close();
        grabber.close();
    }


}
Answer (by zhoutian):

Finally, I found a solution: use a PipedOutputStream/PipedInputStream pair to provide a single, long-lived InputStream for FFmpegFrameGrabber, instead of creating a new FFmpegFrameGrabber for every chunk. Because the piped stream never reaches end-of-stream while the producer keeps writing, the video stream stays alive.
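A minimal sketch of this idea, with a plain reader thread standing in for the FFmpegFrameGrabber so the example stays self-contained (class and variable names here are illustrative, not from the original code):

```java
import java.io.IOException;
import java.io.PipedInputStream;
import java.io.PipedOutputStream;

public class PipedFeedDemo {

    public static void main(String[] args) throws Exception {
        PipedOutputStream sink = new PipedOutputStream();
        // Generous pipe buffer so the producer side rarely blocks.
        PipedInputStream source = new PipedInputStream(sink, 1024 * 1024);

        // Consumer: in the real code, this is where you would create ONE
        // FFmpegFrameGrabber over `source`, call setFormat("dhav"),
        // start() it once, and run the grab()/record() loop from the
        // question. Here we just drain the stream to show it stays open
        // across many writes.
        Thread consumer = new Thread(() -> {
            byte[] buf = new byte[4096];
            long total = 0;
            try {
                int n;
                while ((n = source.read(buf)) != -1) {
                    total += n; // grabber.grab() would consume bytes here
                }
            } catch (IOException ignored) {
            }
            System.out.println("consumed " + total + " bytes");
        });
        consumer.start();

        // Producer: each RealPlayCallback invocation would simply write
        // its byte[] chunk to the pipe instead of filling a ByteBuffer.
        for (int i = 0; i < 10; i++) {
            byte[] chunk = new byte[1000]; // stands in for one dhav chunk
            sink.write(chunk);
        }
        sink.close(); // in the live case the pipe stays open indefinitely
        consumer.join();
    }
}
```

The key difference from the question's code: the grabber is constructed once over a stream that never ends between chunks, so `grabber.grab()` blocks waiting for more data instead of returning null and tearing down the RTMP connection.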