Deliver video frames over WebSocket when JS is allowed #141
Labels
stage: dev - On/for a development version
type: fix - Iterations on existing features or infrastructure
work: complicated - The situation is complicated (known unknowns), good practices used
We can send msgpack blobs of frames (including annotation data/metadata) over a dedicated Action Cable connection. Refer to:
We should use a separate WebSocket connection as the underlying transport for the video Action Cable connection, because WebSocket is susceptible to head-of-line blocking (refer to https://hpbn.co/websocket/). So for example we'd have /cable for normal Action Cable channels, and then /video-cable for the video delivery Action Cable channels.

Probably the simplest approach is to deliver JPEG frames (along with all metadata) over an Action Cable connection, allowing flow control/feedback to be associated with the Action Cable channels used for video frame delivery. Then we'd put frames in a video player component or custom element (e.g. a canvas).
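As a rough sketch of that approach (the channel name VideoChannel, the stream_id param, and the jpeg_base64/captured_at payload fields are all hypothetical, not anything decided in this issue): a dedicated Action Cable consumer is created against /video-cable and each received frame is painted onto a canvas. Action Cable's default protocol encodes messages as JSON, so this sketch base64-encodes the JPEG bytes; the msgpack blobs mentioned above would need a msgpack-capable protocol encoder on both ends.

```typescript
// Sketch only: the channel name, stream_id param, and payload fields are
// assumptions for illustration, not part of this issue.
import { createConsumer } from "@rails/actioncable";

// Dedicated consumer so video frames do not share a WebSocket with /cable.
const videoConsumer = createConsumer("/video-cable");

const canvas = document.querySelector<HTMLCanvasElement>("#video-canvas")!;
const ctx = canvas.getContext("2d")!;

interface FramePayload {
  jpeg_base64: string; // JPEG bytes, base64-encoded (default JSON protocol)
  captured_at: string; // timestamp from the source, passed through as metadata
}

videoConsumer.subscriptions.create(
  { channel: "VideoChannel", stream_id: "pump-1" },
  {
    async received(data: FramePayload) {
      // Decode the JPEG into a bitmap and paint it onto the canvas.
      const bytes = Uint8Array.from(atob(data.jpeg_base64), (c) => c.charCodeAt(0));
      const bitmap = await createImageBitmap(new Blob([bytes], { type: "image/jpeg" }));
      canvas.width = bitmap.width;
      canvas.height = bitmap.height;
      ctx.drawImage(bitmap, 0, 0);
      bitmap.close();
    },
  }
);
```

Keeping this consumer separate from the default /cable consumer means a backed-up frame stream can only stall /video-cable, not the normal Turbo Stream/Action Cable traffic.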
If we don't send the JPEG frames over Turbo Streams, maybe the stream player can be a custom HTML element (sketched after the next paragraph) which also displays accompanying timestamps and (optionally) metrics about frames dropped at ingress from the source and at egress to the current browser. Refer to:
Then we could also build up a buffer for smoother playback for users who are not performing real-time control (e.g. not designated as pump operators).
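A minimal sketch of such a player element, combining the timestamp/dropped-frame overlay with a small playback buffer (the element name, buffer size, and overlay format are made up for illustration; drop counts at ingress from the source would have to arrive as frame metadata, so only browser-side drops are counted here):

```typescript
// Sketch only: names and numbers below are placeholders, not decisions.
interface BufferedFrame {
  bitmap: ImageBitmap;
  capturedAt: string;
}

class VideoStreamPlayer extends HTMLElement {
  private canvas = document.createElement("canvas");
  private overlay = document.createElement("div");
  private buffer: BufferedFrame[] = [];
  private droppedInBrowser = 0; // frames discarded because the buffer was full
  private maxBuffer = 30;       // ~1 s at 30 fps; tune per role (lower for operators)
  private timer?: number;

  connectedCallback() {
    this.append(this.canvas, this.overlay);
    // Drain the buffer at roughly the expected frame rate.
    this.timer = window.setInterval(() => this.drawNext(), 1000 / 30);
  }

  disconnectedCallback() {
    if (this.timer) window.clearInterval(this.timer);
  }

  // Called from the Action Cable subscription's received() callback.
  pushFrame(frame: BufferedFrame) {
    if (this.buffer.length >= this.maxBuffer) {
      this.buffer.shift()?.bitmap.close(); // drop the oldest frame
      this.droppedInBrowser++;
    }
    this.buffer.push(frame);
  }

  private drawNext() {
    const frame = this.buffer.shift();
    if (!frame) return;
    const ctx = this.canvas.getContext("2d")!;
    this.canvas.width = frame.bitmap.width;
    this.canvas.height = frame.bitmap.height;
    ctx.drawImage(frame.bitmap, 0, 0);
    frame.bitmap.close();
    this.overlay.textContent =
      `captured ${frame.capturedAt} (dropped in browser: ${this.droppedInBrowser})`;
  }
}

customElements.define("video-stream-player", VideoStreamPlayer);
```

For users designated as pump operators the buffer could be bypassed entirely so control stays as close to real time as possible.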
Tasks:
If we must multiplex using Action Cable, improve the fairness of scheduling of video-cable utilization, to prevent any one stream from being starved - maybe with round-robin reading from each send channel for a given Action Cable connection?

Not needed - having one WebSocket connection per video stream works and is simpler, and it's more like the MJPEG-over-HTTP approach for no-JS browsers.