Sorry, more random questions!
What's the best place to start when trying to get a custom endpoint to stream MPEG-TS video in real time to the browser?
I have a working pipeline that takes MPEG-TS over UDP and renders it in the browser via HLS - basically UDP -> MPEG-TS Demuxer -> H264 Parser -> HttpAdaptiveStream. My application is real-time, though, and the HLS lag is too much, so WebRTC seems like the way to go.
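For context, the working HLS pipeline looks roughly like this (child names are my own, element options are trimmed down, and I'm glossing over the demuxer's dynamic per-PID pads, which only get linked once the PMT arrives - so treat this as a sketch, not my exact code):

```elixir
child(:source, %Membrane.UDP.Source{local_port_no: 5000})
|> child(:demuxer, Membrane.MPEG.TS.Demuxer)
# ...PMT notification handling and dynamic pad linking elided...
|> child(:parser, Membrane.H264.Parser)
|> child(:sink, %Membrane.HTTPAdaptiveStream.SinkBin{
  manifest_module: Membrane.HTTPAdaptiveStream.HLS,
  storage: %Membrane.HTTPAdaptiveStream.Storages.FileStorage{directory: "hls"}
})
```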
I've had a look at the RTSP Endpoint and am trying to create an equivalent for MPEG-TS received via UDP, but I've hit the limit of my understanding. I add my endpoint to the RTC Engine and it initialises, connects to the UDP stream, and gets the MPEG-TS PMT, and from there I can happily demux and parse the H264 video and wrap it up with RTP. I'm guessing that I need to wait for handle_pad_added before finalising the pipeline and feeding that out via a TrackSender. My question is: does that handle_pad_added callback only get called once I connect the browser to the associated peer?
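In case it helps, this is roughly what I imagine the callback looking like, modelled on what I could follow of the RTSP endpoint. The child names (:rtp_payloader, :track_sender) are mine, the TrackSender options are almost certainly incomplete, and I'm not sure this is even the right place to finish the link - which is really what I'm asking:

```elixir
@impl true
def handle_pad_added(Pad.ref(:output, {_track_id, _variant}) = pad, _ctx, state) do
  # Guess: when the engine requests the track's output pad, attach my
  # RTP-payloaded H264 stream to a TrackSender and expose it on that pad.
  spec =
    get_child(:rtp_payloader)
    |> child(:track_sender, %Membrane.RTC.Engine.Endpoint.WebRTC.TrackSender{
      track: state.track
    })
    |> bin_output(pad)

  {[spec: spec], state}
end
```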