On 19/10/13 07:23, Len Ovens wrote:
> Yes of course. The brain is used to dealing with audio lag at varying distances.
a neat thing (mentioned before on this list I think) ...
try shifting your gaze to an analogue clock with a second hand that jumps once a
second, and as you do so compare the perceived time before the first tick with
the time between subsequent ticks. Generally the time to the first move seems
longer than the others. One explanation is that our brains maintain a certain
latency between receiving sensations and building a model of the world from the
combination of them (presumably for the same reason a computer does: to give
time for all the sensations to be received and processed), and that when we
shift gaze quickly the model we build is extrapolated back through this latency,
including the assumption that the second hand was stationary through the whole
buffering time.
I guess it is related to our reaction time, and also to those parts of the
nervous system that process and respond directly, without involving the brain
at all.
On a more speculative level ... I've always noticed that if I burn my hand on a
soldering iron I smell the burn before I feel the pain, and I think I actually
pull my hand away before being aware of either. Maybe smell is not synced in the
same way as vision, touch and sound are??
As far as video and audio sync go ... with a live actor on stage, a camera on
their face, and a big screen behind them, in a space where the actor was
speaking maybe 3 metres from the front row, I found that a video system without
genlock, using several PCI analogue camera inputs mixed together with OpenGL,
gave me between 1/2 and 1 1/2 frames of delay, the video lagging behind the
live voice. That was just acceptable in context, but quite noticeable, while
most other video input systems were impossible (especially FireWire!).
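For a sense of scale, the numbers work out roughly like this (a back-of-envelope sketch in Python; the 25 fps PAL frame rate and ~343 m/s speed of sound are my assumptions, not stated in the thread):

```python
# Rough comparison of video pipeline delay vs. acoustic delay.
# Assumptions (not from the original post): 25 fps PAL video,
# speed of sound ~343 m/s at room temperature.
SPEED_OF_SOUND = 343.0  # metres per second
FPS = 25.0              # frames per second

def acoustic_delay_ms(distance_m: float) -> float:
    """Time for the live voice to reach a listener distance_m metres away."""
    return distance_m / SPEED_OF_SOUND * 1000.0

def frame_delay_ms(frames: float) -> float:
    """Video pipeline delay of the given number of frames, in milliseconds."""
    return frames / FPS * 1000.0

print(f"voice over 3 m:      {acoustic_delay_ms(3):.1f} ms")  # ~8.7 ms
print(f"0.5 frame of video:  {frame_delay_ms(0.5):.1f} ms")   # 20 ms
print(f"1.5 frames of video: {frame_delay_ms(1.5):.1f} ms")   # 60 ms
```

So the video lag I saw would have been somewhere around 20-60 ms behind the sound reaching the front row, which fits with it being noticeable but just tolerable.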
Linux-audio-user mailing list