Re: [LAD] "bleeding edge html5" has interesting Audio APIs

To: <fons@...>, <linux-audio-dev@...>
Date: Tuesday, November 22, 2011 - 9:02 pm

This is sent with apologies to Fons, who now has about 50,000 copies of the
same submission, courtesy of hotmail (great web interface).

> From: fons@linuxaudio.org
>
> On Tue, Nov 22, 2011 at 06:05:51PM +0100, Nick Copeland wrote:
>
> > Which toolkit is this? Having the graphical position of the slider/pot define its
> > value sounds a little broken.
>
> ?? I must be missing something essential...
>
> For example, you have a slider that is 100 pixels long, it has
> 101 possible positions, each one of those corresponds to some
> parameter value which is recomputed whenever you move the slider.
>
> Other approaches are possible, but this is AFAIK what most
> toolkit sliders do. So the graphical state determines the
> parameter value...
>
> Igor sanin wrote:
> I may be missing something here. The purpose of a slider is to set a
> value. The position of the slider is a graphical representation of
> that value.

So let's talk about the tail wagging the dog.

An application needs input and output; toolkits are used for all of this for lots
of reasons, such as portability, segregation of duties, whatever. The toolkit needs to
provide input to the app, and the app needs to generate the output.

Input to an app is cursor motion and keystrokes; on this list there is also MIDI
and similar control channels.

Is audio the only output? I kind of disagree that the output is only audio, since
we are looking at whatever we are working with as much as we are listening to it.
The graphical output is as important as the audio output; you might disagree, but
at a minimum you can say that these are both outputs rather than inputs - they
are what you see and what you hear from the application.

Input to the application should be values extracted from the pointer
positions in relation to what they are controlling. They are inputs; they have nothing
to do with slider positions on the screen, just as they have nothing to do with the
actual generation of the sound. They are floating point numbers from the
user interface into the program.

The program/application uses these numbers to change the audible output when some
control value has changed; they should also define the graphical output since, let's
be honest, the graphical display is output, not input.

If you are using a toolkit that has the following data flow:

pointer motion->graphical display->values->application->output

then basically that is broken, as you have a flow that is

input->output->input->application->output

and invariably that is going to lead to issues. The tail (the toolkit) is wagging the dog
(the application) as it imposes restrictions on the values the application is allowed
to see.

In my opinion (ok, it is only opinion) the correct flow is

input->application->output

It is irrelevant whether the output we are discussing is graphical or audible. The
API/library/toolkit should give values to the application; the application should
decide how these values need to be processed in the audio DSP and also how the
toolkit should represent them in the visible user interface. Remember, the toolkit
has no notion of what it is being used for, so why should it define what values it
passes to its owner?
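
As a rough sketch of that flow (all the names here are made up for illustration,
this is not brighton's actual API): the toolkit hands the application a normalised
value and the application drives both output channels itself.

/* Hypothetical input->application->output flow in C. The toolkit calls
 * control_changed() with a value in 0.0..1.0; the application decides
 * what the DSP and the display should do with it. */
typedef struct {
    void (*send_to_dsp)(int param, float value);      /* audio output channel */
    void (*send_to_display)(int param, float value);  /* graphical output channel */
} app_outputs;

static void control_changed(app_outputs *out, int param, float value)
{
    out->send_to_dsp(param, value);      /* the app owns the interpretation */
    out->send_to_display(param, value);  /* and it directs what gets drawn */
}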

> Then when you numerically enter a value, the slider will
> move to the nearest position. But in most cases that won't
> correspond to the value you entered, but to something near.
> So as soon as you move the slider, you're back to one of
> its 101 possible values, and you can't return to the typed-in
> one except by typing it again.

With this model, when you enter a value, that value will be given to your application
and it can use it as it sees fit to generate sound: it uses the value to tell the DSP
code what it has to generate. It should also use similar processes to tell the other
output channel, the graphical one, what it should display to represent that value
in the graphical interface. The graphical output should not define the input values
to the application.

Just as the audio is output, so are the graphics. Your slider might go from 0 to 101,
or if the GUI is resized then from 0 to 201. The input channel has the responsibility
to normalise that back to 0.0 to 1.0 and give it to the application. There is no reason
that the toolkit be based on pixels; if it is then it is broken, as it does not have any
abstraction that makes it generally useful.
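
The normalisation itself is trivial; as a sketch, assuming a hypothetical widget
that reports an integer pixel position and its current length in pixels:

/* Map a pixel position on a fader of 'length' pixels to the 0.0..1.0
 * range the application actually wants to see. Purely illustrative. */
static float normalise_position(int pixel, int length)
{
    if (length <= 1)
        return 0.0f;
    return (float) pixel / (float) (length - 1);
}

Resizing the GUI changes 'length' but not the range the application sees.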

> > A better solution would be for the application callback to be given the actual
> > value and it decide what that means for whatever it is controlling and how the
> > graphical output should be depicted. The tail should not be wagging the dog
> > but again that might depend on the toolkit(?)
>
> Are you referring to the case where the application needs less resolution
> than the widget provides ? I that case it can compute its nearest preferred
> value and make the widget display that (as soon as the user releases the
> widget).

I was referring to the application always being given a float of the value =
of the=20
control - 0.0 to 1.0 representing zero to full throw. The application decid=
es how=20
this should be used=2C if it is continuous so be it. If it is a stepped con=
trol (semitone
tuning) then the application should send the stepped value to the audio eng=
ine
and request the graphical engine depict a 'step' in its display - the graph=
ical=20
output now has steps just as the audio output does. If there is any quantis=
ation
that has to be done then let the application control that=2C not the toolki=
t. The app
is not responsible for what gets drawn but it does give directives about wh=
at it
wants to have displayed that reflect the parameter being represented.
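
A sketch of that stepped case (the one-octave range and the function names are
assumptions made up for the example, nothing more):

#include <math.h>

/* Hypothetical output channels, declarations only for illustration. */
void dsp_set_tuning(float value);
void gui_show_tuning(float value);

/* The application quantises the normalised input itself, then drives
 * both output channels with the same stepped value. */
static void tuning_changed(float value)          /* value is 0.0..1.0 */
{
    const int steps = 12;                        /* assumed semitone steps */
    float stepped = roundf(value * steps) / (float) steps;

    dsp_set_tuning(stepped);                     /* audio output gets the step */
    gui_show_tuning(stepped);                    /* graphics depict the step too */
}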

Back in the '90s I was writing a mixing desk emulator called S-Lab which had direct
to disk sampling and some basic (cheesy) editing. It used a graphics toolkit.
Somewhere around 2001/2002 I started working on what became bristol, a set of
synths for this Lab, but the limitations of the toolkit became an issue: the code I
wanted had to bend over backwards to get the desired output - the graphics were
grisly.

So I dumped the toolkit and wrote my own that was based on an input-app-output
paradigm that left the application in control of what it wants to do. This is how
brighton works; it's a holistic approach, the user interface becomes as much what
they see as what they hear - the different output channels.

I really do not see why a toolkit should define how it is used, and hearing people talk
about pixel boundary limitations makes me cringe.

> Igor sanin wrote
> On a related note, the probability that a potentiometer can be set to exactly the same
> resistance twice is close to zero. This does not make potentiometers less prevalent in
> musical applications.

Yeah, I like that. All the small discrepancies are a large part of why analogue was/is so nice.

> > On a related note, Fons, weren't you actually working recently on some subpixel
> > controllers? How were they implemented?
>
> Yes, I did a mixer gain fader that has subpixel resolution, not only the value
> but also the graphical state itself. Nothing special. If you create the 'knob'
> pictures using e.g. cairo (pycairo in this case), it's easy to make it move by
> a fraction of a pixel.
>
> But I also found that for a 250 pixel long gain fader it's just overkill.
> The one I use has exact 0.25 dB steps over the upper part of its range. Below
> that the steps size increases, and you get values that are not k * 0.25 dB,
> but below -30 dB or so that doesn't matter much.

What about 25 pixel long faders? Don't fanning controls give you back the resolution
you need here? I think I must be missing something in the pixel level controls with
respect to how the toolkits work; the issue is that I have not used anybody else's
graphical toolkit for over 10 years now.
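
For what it's worth, one way to implement a fanning control - brighton may well do
it differently - is to scale the drag by how far the pointer has fanned away from
the fader. A sketch, with made-up numbers:

/* Recover resolution on a short fader by scaling the vertical drag by
 * how far the pointer has moved sideways away from the fader. */
static float fan_drag(float value, float dy_pixels, float dx_from_fader,
                      float fader_length)
{
    float gain = 1.0f / (1.0f + dx_from_fader / 25.0f); /* further out = finer */
    float v = value - (dy_pixels / fader_length) * gain;

    if (v < 0.0f) v = 0.0f;
    if (v > 1.0f) v = 1.0f;
    return v;
}

Even a 25 pixel fader can then be nudged in arbitrarily small increments.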

"we have to make sure the old choice [Windows] doesn't disappear=94.
Jim Wong=2C president of IT products=2C Acer

From: nickycopeland@hotmail.com
To: fons@linuxaudio.org; linux-audio-dev@lists.linuxaudio.org
Date: Tue, 22 Nov 2011 18:05:51 +0100
Subject: Re: [LAD] "bleeding edge html5" has interesting Audio APIs

> From: fons@linuxaudio.org
> To: linux-audio-dev@lists.linuxaudio.org
> Subject: Re: [LAD] "bleeding edge html5" has interesting Audio APIs
>
> On Tue, Nov 22, 2011 at 05:59:40PM +0400, Alexandre Prokoudine wrote:
>
> > On Tue, Nov 22, 2011 at 5:54 PM, Fons Adriaensen wrote:
> >
> > >> For darktable we examined the slider from phat and created a similar
> > >> new, more usable widget which combines a label and a slider. You can
> > >> enter precise value after a right click inside the slider area, and
> > >> you can use pretty much anything as displayed unit: %, ", dB, px...
> > >> whatever. Here is an example:
> > >
> > > Not a real solution. You now have a value that is not represented
> > > by a slider position.
> >
> > Er... WHAT???
> >
> > It absolutely IS represented.
> >
> > 1. Right click
> > 2. Enter new value
> > 3. Press Enter to confirm
> > 4. Slider updates
>
> That is assuming that the slider has infinite resolution.

Which toolkit is this? Having the graphical position of the slider/pot define its
value sounds a little broken.

> In 4. the slider will move to the nearest (pixel) position.
> You could paint it with higher resolution, but mouse gestures
> are still quantised per pixel. If you move the slider 1 pixel

Touchpad interfaces support subpixel (floating point) coordinates based on an
interpolation of where somebody's fat greasy digit smudges the screen; it is
actually quite useful. HTML5 also transports pointer motion as floats for this
reason. I am NOT advocating its use, just stating that subpixel is there.
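
As a sketch of why that is useful (the event structure here is invented for the
example, it is not any real toolkit's type): a floating point pointer position maps
straight onto a control value without ever snapping to a pixel grid.

/* A motion event carrying subpixel coordinates, as touch interpolation
 * or HTML5 pointer events can provide. */
typedef struct {
    double x, y;                       /* subpixel pointer position */
} motion_event;

static float value_from_motion(const motion_event *ev, double fader_top,
                               double fader_length)
{
    double v = (ev->y - fader_top) / fader_length;  /* still a continuum */

    if (v < 0.0) v = 0.0;
    if (v > 1.0) v = 1.0;
    return (float) (1.0 - v);          /* top of the fader = full throw */
}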

> up and down again you don't get the value that was typed in,
> unless it happens to be one corresponding to a pixel.

Again, why does the graphical output have to define the value of the input?
Surely that is a limitation of the toolkit?

> The only solution is to ensure that slider positions correspond
> to values that make sense for the parameter being controlled,
> e.g. exact semitones (or fractions thereof) for a VCO frequency,
> 1/5 dB steps for the top half of a fader, etc. And if they do
> then you don't need the text input.

A better solution would be for the application callback to be given the actual
value and let it decide what that means for whatever it is controlling and how the
graphical output should be depicted. The tail should not be wagging the dog,
but again that might depend on the toolkit(?)

On a related note, Fons, weren't you actually working recently on some subpixel
controllers? How were they implemented?

Kind regards, nick.

_______________________________________________
Linux-audio-dev mailing list
Linux-audio-dev@lists.linuxaudio.org
http://lists.linuxaudio.org/listinfo/linux-audio-dev


Messages in current thread:
Re: [LAD] "bleeding edge html5" has interesting Audio APIs, David Robillard, (Mon Nov 21, 6:40 pm)
Re: [LAD] "bleeding edge html5" has interesting Audio APIs, David Robillard, (Mon Nov 21, 7:19 pm)
Re: [LAD] "bleeding edge html5" has interesting Audio APIs, David Robillard, (Mon Nov 21, 7:49 pm)
Re: [LAD] "bleeding edge html5" has interesting Audio APIs, Rui Nuno Capela, (Mon Nov 21, 9:55 pm)
Re: [LAD] "bleeding edge html5" has interesting Audio APIs, David Robillard, (Mon Nov 21, 10:03 pm)
Re: [LAD] "bleeding edge html5" has interesting Audio APIs, Patrick Shirkey, (Mon Nov 21, 11:46 pm)
Re: [LAD] "bleeding edge html5" has interesting Audio APIs, Nick Copeland, (Mon Nov 21, 10:42 pm)
Re: [LAD] "bleeding edge html5" has interesting Audio APIs, David Robillard, (Mon Nov 21, 11:33 pm)
Re: [LAD] "bleeding edge html5" has interesting Audio APIs, Gordon JC Pearce, (Mon Nov 21, 11:37 pm)
Re: [LAD] "bleeding edge html5" has interesting Audio APIs, Fons Adriaensen, (Mon Nov 21, 10:58 pm)
Re: [LAD] "bleeding edge html5" has interesting Audio APIs, David Robillard, (Mon Nov 21, 11:33 pm)
Re: [LAD] "bleeding edge html5" has interesting Audio APIs, Alexandre Prokoudine, (Tue Nov 22, 11:33 am)
Re: [LAD] "bleeding edge html5" has interesting Audio APIs, Fons Adriaensen, (Tue Nov 22, 1:54 pm)
Re: [LAD] "bleeding edge html5" has interesting Audio APIs, Fons Adriaensen, (Tue Nov 22, 3:03 pm)
Re: [LAD] "bleeding edge html5" has interesting Audio APIs, Fons Adriaensen, (Tue Nov 22, 3:32 pm)
Re: [LAD] "bleeding edge html5" has interesting Audio APIs, Alexandre Prokoudine, (Tue Nov 22, 1:59 pm)
Re: [LAD] "bleeding edge html5" has interesting Audio APIs, Fons Adriaensen, (Tue Nov 22, 2:54 pm)
Re: [LAD] "bleeding edge html5" has interesting Audio APIs, Nick Copeland, (Tue Nov 22, 9:02 pm)
Re: [LAD] sliders/fans, James Morris, (Tue Nov 22, 12:05 pm)
Re: [LAD] sliders/fans, Alexandre Prokoudine, (Tue Nov 22, 2:58 pm)
Re: [LAD] sliders/fans, Paul Davis, (Tue Nov 22, 2:26 pm)
Re: [LAD] sliders/fans, hermann, (Tue Nov 22, 2:41 pm)
Re: [LAD] sliders/fans, David Robillard, (Thu Nov 24, 6:23 pm)
Re: [LAD] sliders/fans, Nick Copeland, (Thu Nov 24, 9:30 pm)
Re: [LAD] sliders/fans, David Robillard, (Thu Nov 24, 11:59 pm)
Re: [LAD] sliders/fans, Nick Copeland, (Thu Nov 24, 6:37 pm)
Re: [LAD] sliders/fans, David Robillard, (Thu Nov 24, 6:55 pm)
Re: [LAD] sliders/fans, Nick Copeland, (Thu Nov 24, 7:02 pm)
Re: [LAD] sliders/fans, David Robillard, (Thu Nov 24, 7:21 pm)
Re: [LAD] sliders/fans, Fons Adriaensen, (Thu Nov 24, 8:45 pm)
Re: [LAD] sliders/fans, David Robillard, (Thu Nov 24, 11:52 pm)
Re: [LAD] sliders/fans, Fons Adriaensen, (Fri Nov 25, 10:53 am)
Re: [LAD] sliders/fans, Nick Copeland, (Fri Nov 25, 8:52 am)
Re: [LAD] sliders/fans, Fons Adriaensen, (Fri Nov 25, 10:29 am)
Re: [LAD] sliders/fans, Jeff Koftinoff, (Thu Nov 24, 11:12 pm)
Re: [LAD] sliders/fans (again), Fons Adriaensen, (Thu Nov 24, 11:42 pm)
Re: [LAD] sliders/fans, Fons Adriaensen, (Thu Nov 24, 11:20 pm)
Re: [LAD] sliders/fans, Nick Copeland, (Thu Nov 24, 9:03 pm)
Re: [LAD] sliders/fans, Fons Adriaensen, (Thu Nov 24, 9:24 pm)
Re: [LAD] sliders/fans, hermann, (Thu Nov 24, 8:31 pm)
Re: [LAD] sliders/fans, David Robillard, (Thu Nov 24, 11:57 pm)
Re: [LAD] sliders/fans, hermann, (Fri Nov 25, 4:06 am)
Re: [LAD] sliders/fans, David Robillard, (Wed Dec 7, 7:28 pm)
Re: [LAD] sliders/fans, Paul Davis, (Wed Dec 7, 7:32 pm)
Re: [LAD] sliders/fans, Neil C Smith, (Wed Dec 7, 7:39 pm)
Re: [LAD] sliders/fans, Paul Davis, (Wed Dec 7, 7:43 pm)
Re: [LAD] sliders/fans, David Robillard, (Wed Dec 7, 8:03 pm)
Re: [LAD] sliders/fans, Thorsten Wilms, (Thu Dec 8, 9:44 am)
Re: [LAD] sliders/fans, Paul Davis, (Wed Dec 7, 10:33 pm)
Re: [LAD] sliders/fans, David Robillard, (Thu Dec 8, 12:40 am)
Re: [LAD] sliders/fans, Nick Copeland, (Thu Dec 8, 5:23 am)
Re: [LAD] sliders/fans, David Robillard, (Fri Dec 9, 7:10 pm)
Re: [LAD] sliders/fans, Nick Copeland, (Fri Nov 25, 8:40 am)
Re: [LAD] sliders/fans, Philipp Überbacher, (Thu Nov 24, 10:17 pm)
Re: [LAD] sliders/fans, Fons Adriaensen, (Thu Nov 24, 8:57 pm)
Re: [LAD] sliders/fans, Nick Copeland, (Thu Nov 24, 8:50 pm)
Re: [LAD] sliders/fans, Nick Copeland, (Thu Nov 24, 8:31 pm)
Re: [LAD] "bleeding edge html5" has interesting Audio APIs, David Robillard, (Tue Nov 22, 1:04 am)
Re: [LAD] "bleeding edge html5" has interesting Audio APIs, Fons Adriaensen, (Mon Nov 21, 9:34 pm)