Hi all,
I understand that the latency from the 16 contacts on the WAV Trigger board is around 6 milliseconds, and this will be okay for my project.
I MAY want to look at serial control further down the line, but I really need to know how much this affects the latency - if the increase is significant and unavoidable then I can forget about it and concentrate on other things.
Any ideas?
Regards,
Jim
This would likely need to be determined experimentally with a scope; the docs:
https://cdn.sparkfun.com/assets/1/c/9/a … 230114.pdf
https://learn.sparkfun.com/tutorials/wa … -guide-v11
https://cdn.sparkfun.com/datasheets/Wid … 05RGT6.pdf
may have some figures that could be plugged in to get an estimate, but I would lean toward just getting one and testing it to be certain.
Once a complete serial message is received by the WAV Trigger, there is no difference in latency via serial than via trigger inputs. So you would simply need to account for the time it takes to transmit a serial command.
Is there any precedent that might help me to estimate what that might be?
If I were to add an Arduino (for instance) it would be so that I could have separate pitch and volume controls for each sound, possibly using rotary encoders.
You can do the math. A byte is 10 bits (8 bits, plus start and stop bits) and the WAV Trigger’s serial baud rate is 57600 bits per second, so each byte takes around 175 microseconds to transmit. You can look up the number of bytes for each serial command in the user manual.
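To make that concrete, here is a quick back-of-the-envelope calculation in C++. The 8-byte command length is only an assumed example; substitute the actual length of the command you intend to send from the user manual.

#include <cstdio>

int main() {
    const double baudRate     = 57600.0; // WAV Trigger serial rate, bits per second
    const double bitsPerByte  = 10.0;    // 8 data bits + start bit + stop bit
    const int    commandBytes = 8;       // example length only; check the manual for your command

    // Time to shift one byte out of the UART, in microseconds
    const double usPerByte = bitsPerByte / baudRate * 1e6;   // roughly 174 us

    // Total transmission time for the whole command
    const double usPerCommand = usPerByte * commandBytes;    // roughly 1.4 ms for 8 bytes

    std::printf("Per byte:    %.1f us\n", usPerByte);
    std::printf("Per command: %.1f us (%.2f ms)\n", usPerCommand, usPerCommand / 1000.0);
    return 0;
}

So, assuming the bytes are sent back-to-back, even a several-byte command should only add on the order of a millisecond or two on top of the latency you already have from the trigger inputs.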