On Thu, Feb 18, 2010 at 12:32 PM, alex stone wrote:
there seems to be some confusion.
there are 2 kinds of data flows:
1) streaming - there is 1 value per unit of time
2) event-based - there are N values per unit of time, where N is a
non-negative integer (i.e. it may be zero)
what are the units of time? for what we currently call "audio" data
flows, a unit of time is the sample interval. audio data is defined
as 1 floating point value per sample interval.
the options to extend the types of data are, therefore:
a) defining a different time interval, and the data type that will
be provided for each such interval. this is effectively "reduced
sample rate streaming data"
b) defining another event based data flow, like MIDI, in which
events are discrete and (most often) sparse
i personally do not believe that there is much need for (a), but i'm
willing to be shown to be in error.
Linux-audio-dev mailing list