Live Streaming on Raspberry Pi Nerves Device

Hello! I'm new to Membrane, but a seasoned Elixir dev. I'm currently working on a project (similar to a 3D printer) where an RPi Nerves device controls an ongoing process. As part of this, there are two USB cameras that plug into the Pi, which I would like to livestream to the user. One good resource I found was this excellent tutorial by pressy4pie, one of the Nerves maintainers (huge thank you).

One major difference from what is documented there, though, is that I don't have a main server running on the internet somewhere. My Nerves project also bundles and starts a Phoenix LiveView UI on device boot, which means the user connects directly to https://nerves.local:4000 running on the device itself.

The way it currently works, the Phoenix project starts a GenServer which continuously reads the cameras using the OpenCV bindings (the Evision project). Upon receiving a frame, it does some basic image processing, encodes the image as a base64 JPEG, and then pushes it over the websocket to an img tag in the LiveView client. It works, but under poor network conditions we are starting to see the LiveView process mailbox getting clogged, resulting in significant lag in the live stream as well as in user input events.
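For context, here is a simplified sketch of what that GenServer does today (module names, topic names, and frame rate are placeholders, error handling is omitted, and the exact Evision return shapes may differ by version):

```elixir
defmodule MyFirmware.CameraStreamer do
  use GenServer

  @frame_interval_ms 66  # roughly 15 fps

  def start_link(opts), do: GenServer.start_link(__MODULE__, opts, name: __MODULE__)

  @impl true
  def init(_opts) do
    # Open the first USB camera via Evision's OpenCV bindings
    cap = Evision.VideoCapture.videoCapture(0)
    send(self(), :grab_frame)
    {:ok, %{cap: cap}}
  end

  @impl true
  def handle_info(:grab_frame, %{cap: cap} = state) do
    with %Evision.Mat{} = frame <- Evision.VideoCapture.read(cap) do
      # ... basic image processing on the Mat happens here ...

      # Encode to JPEG bytes (check your Evision version's return shape),
      # then base64 it so the LiveView can stuff it into an <img> src
      jpeg = Evision.imencode(".jpg", frame)
      payload = Base.encode64(jpeg)

      # The LiveView subscribes to this topic and pushes the frame to the client
      Phoenix.PubSub.broadcast(MyFirmware.PubSub, "camera:0", {:frame, payload})
    end

    Process.send_after(self(), :grab_frame, @frame_interval_ms)
    {:noreply, state}
  end
end
```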

I'd like to see if Membrane is suitable for tackling this problem. Preferably, we could use the Pi's hardware H.264 encoder for better performance. My requirements are to live stream to the browser (via HLS? WebRTC? idk), and to allow the software running on the device to grab the most recent frame from the camera stream for async image processing.
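For that second requirement, independent of whichever transport ends up in the browser path, the shape I have in mind is a tiny "latest frame" holder that the processing code pulls from at its own pace, so a slow consumer never builds up a mailbox backlog. A minimal sketch (names are placeholders):

```elixir
defmodule MyFirmware.LatestFrame do
  use Agent

  def start_link(_opts), do: Agent.start_link(fn -> nil end, name: __MODULE__)

  # Called by whatever produces frames (camera reader or a Membrane element);
  # always overwrites the previous frame instead of queueing frames up.
  def put(frame), do: Agent.update(__MODULE__, fn _old -> frame end)

  # Called by the async image-processing code whenever it is ready for work;
  # returns nil until the first frame has arrived.
  def get, do: Agent.get(__MODULE__, & &1)
end
```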

Some recommendations to get started in the right direction would be appreciated 🙂

Thanks!
