RTMP streaming using OpenCV and FFMPEG libraries

Do you have a project that uses OpenCV to process video?
At some point, after your proof of concept works, you will want to send the processed result in real time to some other device, for example a VM in the cloud running a web server that can push the stream to a browser. You need streaming!

When you google for streaming examples, the option of using the FFmpeg libraries quickly becomes appealing, but…

Most FFmpeg examples show how to use the FFmpeg command-line tool to produce a stream from a video file, whereas you want to stream, on the fly, the cv::Mat frames produced by your code.
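To make that gap concrete, here is a minimal sketch of the bridge between the two worlds: converting a BGR cv::Mat into a YUV420P AVFrame with libswscale, ready to be handed to an H.264 encoder. The function name mat_to_avframe and the absence of error checking are my own simplifications, not anything from the FFmpeg examples or from the repo linked below.

```cpp
// Sketch: convert a BGR cv::Mat into a YUV420P AVFrame using libswscale.
#include <opencv2/core.hpp>
extern "C" {
#include <libavutil/frame.h>
#include <libswscale/swscale.h>
}

AVFrame* mat_to_avframe(const cv::Mat& bgr, SwsContext*& sws, int64_t pts)
{
    const int w = bgr.cols, h = bgr.rows;

    // Lazily create (or reuse) the BGR24 -> YUV420P conversion context.
    sws = sws_getCachedContext(sws, w, h, AV_PIX_FMT_BGR24,
                               w, h, AV_PIX_FMT_YUV420P,
                               SWS_BICUBIC, nullptr, nullptr, nullptr);

    AVFrame* frame = av_frame_alloc();
    frame->format = AV_PIX_FMT_YUV420P;
    frame->width  = w;
    frame->height = h;
    av_frame_get_buffer(frame, 0);          // allocates the YUV planes

    const uint8_t* src_data[1] = { bgr.data };
    const int src_linesize[1]  = { static_cast<int>(bgr.step[0]) };
    sws_scale(sws, src_data, src_linesize, 0, h,
              frame->data, frame->linesize);

    frame->pts = pts;                        // caller decides the timestamp policy
    return frame;                            // release with av_frame_free(&frame)
}
```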

Using libavcodec, libavformat, libavutil, libswscale, libavfilter and libavdevice directly is not straightforward: it is fairly easy to end up with crashes and memory leaks that are extremely time-consuming to track down and fix.
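As an illustration, here is a hedged sketch (assuming a single H.264 video stream at 30 fps and a fixed frame size) of opening an RTMP output with libavformat and tearing it down in the right order. The cleanup calls at the end are exactly the ones that are easy to miss and the usual source of leaks; error checks are omitted for brevity.

```cpp
// Sketch: open an RTMP output, write header/trailer, and free everything.
extern "C" {
#include <libavcodec/avcodec.h>
#include <libavformat/avformat.h>
}

int open_rtmp_output(const char* url, int width, int height)
{
    AVFormatContext* fmt = nullptr;
    // RTMP carries FLV, so the muxer name is "flv" even though the URL is rtmp://
    avformat_alloc_output_context2(&fmt, nullptr, "flv", url);

    const AVCodec* codec = avcodec_find_encoder(AV_CODEC_ID_H264);
    AVStream* stream = avformat_new_stream(fmt, nullptr);
    AVCodecContext* enc = avcodec_alloc_context3(codec);
    enc->width     = width;
    enc->height    = height;
    enc->pix_fmt   = AV_PIX_FMT_YUV420P;
    enc->time_base = AVRational{1, 30};
    enc->bit_rate  = 2000000;
    if (fmt->oformat->flags & AVFMT_GLOBALHEADER)
        enc->flags |= AV_CODEC_FLAG_GLOBAL_HEADER;   // FLV wants global headers
    avcodec_open2(enc, codec, nullptr);
    avcodec_parameters_from_context(stream->codecpar, enc);
    stream->time_base = enc->time_base;

    avio_open(&fmt->pb, url, AVIO_FLAG_WRITE);       // connect to the RTMP server
    avformat_write_header(fmt, nullptr);

    // ... convert frames, encode and av_interleaved_write_frame() here ...

    av_write_trailer(fmt);          // flush the muxer
    avcodec_free_context(&enc);     // free the encoder context
    avio_closep(&fmt->pb);          // close the network connection
    avformat_free_context(fmt);     // free the streams and the format context
    return 0;
}
```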

After spending some time walking through the FFmpeg code, docs and examples, I decided to implement a small example with basic RTMP streaming functionality and share it, in the hope that it can save some pain for people who want to add streaming to OpenCV-based applications.
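To give a flavor of the part that is usually hardest to get right, here is the generic encode-and-mux step from the libavcodec/libavformat API that any example like this is built around. This is a sketch of the standard send/receive pattern, not a verbatim excerpt from the repo; `enc`, `stream` and `fmt` are assumed to be set up as in the sketch above.

```cpp
// Sketch: push one converted frame into the encoder and write out the packets.
extern "C" {
#include <libavcodec/avcodec.h>
#include <libavformat/avformat.h>
}

int encode_and_send(AVCodecContext* enc, AVStream* stream,
                    AVFormatContext* fmt, AVFrame* frame)   // frame == nullptr flushes
{
    int ret = avcodec_send_frame(enc, frame);
    if (ret < 0) return ret;

    AVPacket* pkt = av_packet_alloc();
    while ((ret = avcodec_receive_packet(enc, pkt)) >= 0) {
        // Encoder timestamps are in enc->time_base; the muxer expects
        // stream->time_base, so rescale before writing.
        av_packet_rescale_ts(pkt, enc->time_base, stream->time_base);
        pkt->stream_index = stream->index;
        av_interleaved_write_frame(fmt, pkt);    // takes ownership of the packet data
        av_packet_unref(pkt);
    }
    av_packet_free(&pkt);
    // AVERROR(EAGAIN): encoder wants more input; AVERROR_EOF: fully flushed.
    return (ret == AVERROR(EAGAIN) || ret == AVERROR_EOF) ? 0 : ret;
}
```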

Here is the GitHub repo with the example.

Feel free to contact me with questions, requests for help, or suggestions on how to improve it!
