Hi, I'm trying to write a simple pipeline which takes in an MP4 source and streams it out via the RTMP sink. I have some general confusion about how to properly wire up the `MP4.Demuxer.ISOM`.
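For reference, here's a minimal sketch of the static part of what I'm attempting (the element options are my best guess from the plugin docs, so treat them as assumptions):

```elixir
defmodule MyApp.StreamPipeline do
  use Membrane.Pipeline

  @impl true
  def handle_init(_ctx, opts) do
    # Static part of the pipeline: read the MP4 file and feed it to the demuxer.
    # The demuxer's output pads are dynamic, so I can't link them here yet.
    # (Assumes opts is a keyword list like [input_path: ..., rtmp_url: ...].)
    spec = [
      child(:source, %Membrane.File.Source{location: opts[:input_path]})
      |> child(:demuxer, Membrane.MP4.Demuxer.ISOM),
      # The RTMP sink's :audio and :video pads are in :always availability mode.
      child(:rtmp_sink, %Membrane.RTMP.Sink{rtmp_url: opts[:rtmp_url]})
    ]

    {[spec: spec], %{}}
  end

  # ... (see the callback sketch further down)
end
```

My specific questions: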
- How do I decode/parse the output pads' stream formats downstream of the demuxer? I haven't seen any examples or demos of transforming the `MP4.Payload.{AAC, AVC1}` stream formats (my guess at this is in the callback sketch below this list).
- How do I properly handle dynamically attaching the output pad for each track to downstream elements? If I want to handle the `:new_track` message and attach to a sink whose pads are in the `:always` availability mode (such as the RTMP sink), I can't temporarily attach that track to some grouping of elements that ends at the sink. For example, if I get the `:new_track` notification for an AAC track, I can't attach just the `:audio` pad of the RTMP sink, because when handling that callback there is nothing yet to link to the `:video` pad (again, see the callback sketch below).
- Is it better to handle the track determination statically? I.e., should I `ffprobe` the MP4 file and parse the available tracks beforehand (sketched at the end of this post)?
- Does the demuxer only handle MP4 with embedded H264/AAC? Would a container with H264/MP3 not be demuxable?
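To make the parsing and dynamic-pad questions concrete, this is the kind of callback I imagine adding to the module above. The notification name/shape, the stream-format structs I'm matching on, and the parser modules (`Membrane.AAC.Parser`, `Membrane.H264.Parser`) and their options are all assumptions on my part, which is exactly the part I'm unsure about:

```elixir
# Continues MyApp.StreamPipeline from the sketch above.
@impl true
def handle_child_notification({:new_tracks, tracks}, :demuxer, _ctx, state) do
  # Assumes the demuxer reports all tracks in a single notification; if it
  # reports them one at a time, I don't see how to link the RTMP sink's
  # :always pads, since both :audio and :video have to be linked.
  spec =
    Enum.map(tracks, fn
      {track_id, %Membrane.MP4.Payload.AAC{}} ->
        get_child(:demuxer)
        |> via_out(Pad.ref(:output, track_id))
        # Parser options probably need tuning for what the RTMP sink expects.
        |> child({:aac_parser, track_id}, Membrane.AAC.Parser)
        |> via_in(:audio)
        |> get_child(:rtmp_sink)

      {track_id, %Membrane.MP4.Payload.AVC1{}} ->
        get_child(:demuxer)
        |> via_out(Pad.ref(:output, track_id))
        |> child({:h264_parser, track_id}, Membrane.H264.Parser)
        |> via_in(:video)
        |> get_child(:rtmp_sink)
    end)

  {[spec: spec], state}
end
```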
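And by the static alternative I mean something like shelling out to `ffprobe` before starting the pipeline (this sketch assumes `ffprobe` is on `$PATH` and that the Jason library is available for JSON parsing):

```elixir
# Rough sketch: list codec name/type for each stream in the container.
def probe_tracks(path) do
  {json, 0} =
    System.cmd("ffprobe", [
      "-v", "error",
      "-show_streams",
      "-print_format", "json",
      path
    ])

  json
  |> Jason.decode!()
  |> Map.fetch!("streams")
  |> Enum.map(fn stream ->
    %{codec: stream["codec_name"], type: stream["codec_type"]}
  end)
end

# For an H264/AAC file this returns something like:
# [%{codec: "h264", type: "video"}, %{codec: "aac", type: "audio"}]
```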
Thanks!