[erlang-questions] Making erlang more realtime?
Sat Mar 4 00:28:23 CET 2017
> Probably it does not help an awful lot, but I recall an interesting project from a couple of years ago called erlyvideo (later flussonic)
Pretty interesting project; I will have to dig into the erlyvideo source sometime.
> main question here is: what requires sending frames exactly once per 17 ms?
17 ms was just a round number I picked, since ~16.667 ms per frame corresponds to 60 fps. I wanted to see how much extra delay could potentially happen per frame.
Realistically we want to forward frames as soon as we receive them; Erlang is just routing, not actually encoding the frames. I wanted to see if I could
measure slowdowns between frames: for example, if a frame takes ~6 ms to be forwarded by Erlang, it could arrive too late at the client end and the client
might need to skip a frame.
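To make the measurement concrete, here is a minimal sketch of the kind of per-frame timing I have in mind, assuming a plain gen_tcp socket per client. The module name, the function name and the 6 ms budget are all illustrative, not my actual application code:

```erlang
%% Illustrative sketch only: log frames whose forwarding through Erlang
%% took longer than a chosen budget.
-module(frame_timer).
-export([forward/2]).

forward(Socket, Frame) ->
    T0 = erlang:monotonic_time(microsecond),
    ok = gen_tcp:send(Socket, Frame),
    Elapsed = erlang:monotonic_time(microsecond) - T0,
    case Elapsed > 6000 of                      % 6 ms budget (arbitrary)
        true  -> io:format("slow frame: ~p us~n", [Elapsed]);
        false -> ok
    end.
```

erlang:monotonic_time/1 is the right clock for this, since it is unaffected by wall-clock adjustments.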
> Browsers do not require it and Erlang precision is enough.
I'm thinking so too now. I have also gotten better results (lower latency) by setting the process priority to max.
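For anyone curious, the change was just a process_flag call in the routing process. A sketch (loop/0 stands in for the real receive loop):

```erlang
%% Sketch: bump the scheduling priority of the frame-routing process.
%% Note: the OTP docs reserve `max' for system processes, so `high' is
%% generally the safer choice in application code.
start_router() ->
    spawn(fun() ->
              process_flag(priority, max),
              loop()
          end).
```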
> - network spikes won't be resolved due to differing I/P/B frame sizes; (need to have a smart algorithm for the sender queue)
> You can check an implementation of a sender queue in the WebRTC project: modules/pacing/paced_sender.cc, for instance.
I need to consider this. I'm not sure it is possible to drop x264 frames, though, if the end-to-end transmission is bottlenecked.
I think the solution would be to drop to a lower bitrate.
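If dropping some frame types does turn out to be viable, a cheap check short of a full pacer might look like this sketch: look at how many bytes are still pending in the TCP send queue and skip non-keyframes when it backs up. The frame tagging (i | p | b) and the 256 kB threshold are assumptions on my side, not WebRTC's algorithm:

```erlang
%% Sketch: skip droppable frames while the socket send queue is backlogged.
%% Type is assumed to arrive tagged upstream as i | p | b.
maybe_send(Socket, Type, Bin) ->
    {ok, [{send_pend, Pending}]} = inet:getstat(Socket, [send_pend]),
    if
        Pending > 262144, Type =/= i ->
            dropped;                          % backlogged: drop a P/B frame
        true ->
            ok = gen_tcp:send(Socket, Bin),
            sent
    end.
```

inet:getstat/2 with send_pend reports bytes waiting in the driver's send queue, which is a rough but free congestion signal.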
> - playback sync won't be resolved due to network/receiver fluctuations; (need to have a JB on the receiver)
Not sure what JB is.
> Probably i missed something and you can add more information about an initial task.
I am getting clumps of frames arriving in the client using websockets, which is a NaCl web browser application for Google Chrome (localnet). I am trying
to narrow down the cause. My current suspicion is that it is because I am sending large ws frames (130 kB+) and the websocket in Chrome chokes,
resulting in 5-8 frames being clumped up and presented to the decoder at the same time.
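One experiment I may try first: splitting each encoded frame into smaller websocket messages, in case Chrome schedules small messages more smoothly than 130 kB ones. A sketch, where ws_send/2 is a stand-in for whatever the websocket library provides and the 16 kB chunk size is arbitrary (this changes the message framing, so the NaCl side would need to reassemble):

```erlang
%% Sketch: send a large frame as a series of ~16 kB websocket messages.
send_chunked(Conn, Bin) when byte_size(Bin) =< 16384 ->
    ws_send(Conn, Bin);
send_chunked(Conn, <<Chunk:16384/binary, Rest/binary>>) ->
    ws_send(Conn, Chunk),
    send_chunked(Conn, Rest).
```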
I am going to try using a basic TCP connection to the client to see if that fixes the issue. This will unfortunately limit my application to
running as an extension only, though.
On Friday, March 3, 2017 12:44 PM, Ilya Shcherbak <tthread@REDACTED> wrote:
Sorry, but the issue is unclear to me. Can you describe the high-level issue a bit more specifically? Why do you need to send frames at a 17 ms interval? It shouldn't be a requirement for a sender by design, in my understanding.
The constant delay doesn't look like the right solution in any case:
- network spikes won't be resolved due to differing I/P/B frame sizes; (need to have a smart algorithm for the sender queue)
- playback sync won't be resolved due to network/receiver fluctuations; (need to have a JB on the receiver)
You can check an implementation of a sender queue in the WebRTC project: modules/pacing/paced_sender.cc, for instance.
If you are transmitting high-res video on a local network and facing an issue with transmission, you can try tuning the socket send/recv buffer sizes.
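For example, the buffers can be enlarged via listen options, which accepted sockets then inherit (sizes here are illustrative, not recommendations):

```erlang
%% Sketch: larger kernel buffers plus TCP_NODELAY on the listening socket.
listen(Port) ->
    gen_tcp:listen(Port, [binary,
                          {sndbuf, 1024 * 1024},   % kernel send buffer
                          {recbuf, 1024 * 1024},   % kernel receive buffer
                          {nodelay, true}]).       % no Nagle batching
```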
Probably I missed something; you can add more information about the initial task.
2017-03-03 22:23 GMT+07:00 Max Lapshin <max.lapshin@REDACTED>:
The main question here is: what requires sending frames exactly once per 17 ms?
>Browsers do not require it and Erlang precision is enough.
>C may become mandatory if you want to send about 500-800 Mbit/s of video, mux several streams into a single multi-stream, and maintain a strict constant bitrate via UDP to send all this to a dumb IP -> DVB-C transmitter.
>But usually Erlang is enough: we are feeding a satellite encoder from flussonic now and it is OK.
erlang-questions mailing list
More information about the erlang-questions mailing list