On Sun, 2012-03-04 at 08:38 -0500, Paul Davis wrote:
> To me, one of the primary limitations about MIDI is the lack of ability
> to control individual notes. For example, a good protocol could let you
> start at C, slide up to A, do some vibrato, etc., e.g. you could draw
> lines in your sequencer.
> Identifying the note by frequency would preclude this. Note numbers are
> > The way OSC is used, and in libmapper in particular, is to say things
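The per-note control described above could be sketched as a stream of messages where each note gets its own id, and later pitch updates address that id rather than a fixed note number. The OSC-style addresses below (`/note/on`, `/note/pitch`, `/note/off`) are invented for illustration, not any real protocol:

```python
import math

def note_gesture():
    """Yield (address, args) tuples for one note that starts at C4,
    slides up to A4, then adds vibrato. Pitches are fractional
    semitones, so a single note can glide freely."""
    note_id = 1
    c4, a4 = 60.0, 69.0
    yield ("/note/on", (note_id, c4))
    # slide from C4 to A4 over 10 steps
    for i in range(1, 11):
        yield ("/note/pitch", (note_id, c4 + (a4 - c4) * i / 10))
    # vibrato: sinusoidal wobble of +/- 0.5 semitone around A4
    for i in range(20):
        yield ("/note/pitch", (note_id, a4 + 0.5 * math.sin(i * math.pi / 4)))
    yield ("/note/off", (note_id,))

messages = list(note_gesture())
```

With plain MIDI note numbers this gesture would have to be faked with channel-wide pitch bend, which breaks down as soon as two notes on the same channel need different bends.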
The thing is, in practice reality has turned MIDI into everything being
learn-based (except notes) anyway. Most of the crap in MIDI can just go
away, since the use case is "send something with a number or two in it
somewhere the receiver can pick up on".
There are a few things that need to be standardized, like notes and
transport control, but I think everything having a universal meaning is
at best dubious, and likely a mistake. It is inevitable that you need
learn and/or controller mappings anyway, so the protocol can be a simple
description of what's happening on the *sender*. What the receiver does
with it is its own business.
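A minimal sketch of that learn-based approach, with all names invented for illustration: the sender only reports what happened on *its* end ("knob 3 is now 0.7"), and a receiver in learn mode binds whatever address next arrives to a parameter of its own choosing.

```python
class LearnReceiver:
    def __init__(self):
        self.bindings = {}   # sender address -> parameter name
        self.params = {}     # parameter name -> current value
        self.learning = None # parameter waiting for a binding, if any

    def learn(self, param):
        """Arm learn mode: the next unknown address binds to `param`."""
        self.learning = param

    def handle(self, address, value):
        """Handle an incoming (address, value) message from a sender."""
        if address not in self.bindings and self.learning is not None:
            self.bindings[address] = self.learning
            self.learning = None
        param = self.bindings.get(address)
        if param is not None:
            self.params[param] = value

rx = LearnReceiver()
rx.learn("cutoff")
rx.handle("/controller/knob/3", 0.7)  # first message binds knob 3 to cutoff
rx.handle("/controller/knob/3", 0.9)  # later messages just update cutoff
```

Note that nothing in the messages themselves carries a universal meaning; the semantics live entirely in the receiver's binding table.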
This is a pretty controller-centric opinion, but I don't think OSC is
really good for much beyond that anyway, and in the only cases where it
might be (controlling Pd and such), the (power) user is designing their
own messages anyway, so it's a non-issue.
Linux-audio-dev mailing list