> I assume that computing parameter trajectories basically means
That's a key point. Interpolating any sampled signal introduces latency.
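To make it concrete, here's a minimal sketch in C (the names are mine, not from any particular plugin API): to linearly interpolate a control-rate parameter up to audio rate you must already hold the *next* control value, so the interpolated signal always runs one control period behind the last value you actually received.

    /* Upsample one control period of a parameter to audio rate by
     * linear interpolation.  We can only interpolate towards 'next'
     * because we already have it in hand -- i.e. we are running one
     * control period behind real time.  That delay is the latency. */
    void interp_block(float prev, float next, float *out, int nframes)
    {
        for (int i = 0; i < nframes; i++) {
            float t = (float)(i + 1) / (float)nframes;
            out[i] = prev + t * (next - prev);
        }
    }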
> let the host pass a limited number of future parameter samples at each
I *really* hate this idea.
I play my MIDI keyboard into my DAW, perhaps while using my Mod-Wheel, or
artistically sweeping the filter-cutoff parameter... I hit record... stop...
Then I push 'offline render'.
You would say - shift all my parameter events earlier in time and render
the result to disk? It's going to sound different. The timing will be
wrong. A DAW is like a tape recorder. Playback or offline rendering should
result in an identical performance, surely?
Why are you selectively shifting some musical events in time but not
others? Why not note-ons too?
You can't provide live MIDI playing 'in advance', you can't provide
parameter updates in advance, just like you can't provide live audio in
advance. If
the plugin wants 'future' data to interpolate stuff, it needs to introduce
latency. A good host will compensate for that latency if the plugin API
supports latency reporting.
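For illustration, a sketch of what I mean (report_latency is a stand-in for
whatever the real API provides, e.g. VST's setInitialDelay() or LV2's
designated latency output port; CONTROL_PERIOD is a made-up constant):

    #include <stdio.h>

    #define CONTROL_PERIOD 64  /* frames between parameter samples */

    /* Stand-in for the API-specific latency report. */
    static void report_latency(int frames)
    {
        printf("plugin reports %d frames of latency\n", frames);
    }

    int main(void)
    {
        /* A plugin that interpolates across one control period delays
         * everything by that much, so it reports it; the host then
         * shifts audio, MIDI and automation to line back up. */
        report_latency(CONTROL_PERIOD);
        return 0;
    }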
Parameters aren't special; they don't require any different handling than
MIDI. What's the difference between a MIDI controller tweaking the filter
cutoff and directly tweaking the parameter? Nothing. They both need
smoothing, they both need interpolating, they both will have latency. Don't
treat them differently.
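Here's what "no different handling" looks like as a sketch (a one-pole
smoother; all the names are mine): a MIDI CC and a direct host parameter
change write to the same target and go through the same smoothing, with the
same small lag.

    typedef struct {
        float target;   /* most recent value, however it arrived */
        float current;  /* smoothed value actually used by the DSP */
        float coeff;    /* one-pole coefficient, 0 < coeff < 1 */
    } smoothed_param;

    /* MIDI CC 74 and a direct host parameter change both land here. */
    static void param_set(smoothed_param *p, float value)
    {
        p->target = value;
    }

    /* Called once per audio frame: same smoothing, same lag,
     * regardless of where the target came from. */
    static float param_tick(smoothed_param *p)
    {
        p->current += p->coeff * (p->target - p->current);
        return p->current;
    }

Same code path, same latency, whichever way the value arrives.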