Let's say I have a client that introduces an amount of latency that's
variable at runtime and potentially unbounded. From JACK's docs it
seems that you need to recompute the min/max latencies in the latency
callback, which is called "by the server" whenever it feels like it,
but you can force a call by invoking jack_recompute_total_latencies()
(right?). The problem is, you are advised to call
jack_recompute_total_latencies() only after calling
jack_port_set_latency_range(), which you should only call from inside
the latency callback, which may not fire until next month... am I dumb
(probably) or is there a deadly loop here?
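For concreteness, here is roughly the pattern I have in mind, as a sketch (names like "dyn_latency" and latency_changed() are made up, and whether jack_recompute_total_latencies() actually re-triggers the callback is exactly my question): cache the current extra latency in an atomic, report it from the latency callback following the documented convention (set output-port ranges in JackCaptureLatency mode, input-port ranges in JackPlaybackLatency mode), and poke the server whenever the value changes.

```c
/* Sketch only -- assumes a running JACK server; client/port names and
 * latency_changed() are hypothetical. */
#include <jack/jack.h>
#include <stdatomic.h>

static jack_client_t *client;
static jack_port_t   *in_port, *out_port;
static atomic_uint    extra_latency = 100; /* frames; changes at runtime */

static void latency_cb(jack_latency_callback_mode_t mode, void *arg)
{
    (void)arg;
    jack_latency_range_t range;
    unsigned extra = atomic_load(&extra_latency);

    if (mode == JackCaptureLatency) {
        /* capture latency flows input -> output through the client */
        jack_port_get_latency_range(in_port, JackCaptureLatency, &range);
        range.min += extra;
        range.max += extra; /* "unbounded": best current estimate only */
        jack_port_set_latency_range(out_port, JackCaptureLatency, &range);
    } else {
        /* playback latency flows output -> input */
        jack_port_get_latency_range(out_port, JackPlaybackLatency, &range);
        range.min += extra;
        range.max += extra;
        jack_port_set_latency_range(in_port, JackPlaybackLatency, &range);
    }
}

/* Hypothetical: called from wherever the latency actually changes.
 * The open question: does this reliably make the server call
 * latency_cb() again so the new range gets published? */
static void latency_changed(unsigned new_frames)
{
    atomic_store(&extra_latency, new_frames);
    jack_recompute_total_latencies(client);
}

int main(void)
{
    client = jack_client_open("dyn_latency", JackNullOption, NULL);
    if (!client)
        return 1;
    in_port  = jack_port_register(client, "in", JACK_DEFAULT_AUDIO_TYPE,
                                  JackPortIsInput, 0);
    out_port = jack_port_register(client, "out", JACK_DEFAULT_AUDIO_TYPE,
                                  JackPortIsOutput, 0);
    jack_set_latency_callback(client, latency_cb, NULL);
    jack_activate(client);
    /* ... run; call latency_changed() when the delay changes ... */
    jack_deactivate(client);
    jack_client_close(client);
    return 0;
}
```

If jack_recompute_total_latencies() does not re-invoke latency_cb(), then latency_changed() publishes nothing, which is the loop I'm worried about.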
I couldn't find any existing code that handles dynamic latencies, and
I'm way too lazy to dig into JACK internals or to run silly tests etc.
(also because they would prove nothing, since we have multiple
implementations of the same JACK API).
Linux-audio-dev mailing list