Arduino Due delay(1) vs delayMicroseconds(1000)

I am seeing really strange behaviour when exchanging delay(1) for delayMicroseconds(1000) in my code. It's hard to see exactly where the time difference comes from, since I don't have a scope, but I am driving an LED matrix and the average loop() time roughly doubles (883 ms to 1633 ms) when I swap delay(1) for delayMicroseconds(1000), and this really affects the stutter of the display. There is only one delay in the entire code; it's as if it affects the entire timing of the Arduino. All the rest of the code does is bit-bang pins to update shift registers out to the display, along with a bit-banged clock and a display enable.

Delays are a crude way of implementing timing.
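If you just need to pace the refresh without blocking, the usual pattern is to compare millis() against a stored timestamp and only run the update when the interval has elapsed. A minimal sketch of that idea (the 1 ms interval and the updateDisplay() stub are placeholders standing in for your own shift-register refresh code):

void updateDisplay() {
  // placeholder: bit-bang the shift registers, clock, and display enable here
}

unsigned long lastUpdate = 0;
const unsigned long updateInterval = 1; // ms between refreshes (assumed to match your delay(1))

void setup() {
}

void loop() {
  // Refresh only when the interval has elapsed; loop() itself never blocks
  if (millis() - lastUpdate >= updateInterval) {
    lastUpdate = millis();
    updateDisplay();
  }
  // other non-blocking work can run here
}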

I think it would be a decent idea to implement some sort of asynchronous timing mechanism in the Apollo3 core, similar to the IntervalTimer available on Teensy boards.

https://www.pjrc.com/teensy/td_timing_I … Timer.html
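For reference, usage on a Teensy looks roughly like this (Teensy-only code, shown just to illustrate the kind of API being proposed; the 1000 µs period and the refreshISR name are placeholders):

// Teensy-only sketch showing the IntervalTimer pattern linked above
IntervalTimer refreshTimer;

volatile bool refreshDue = false;

void refreshISR() {
  refreshDue = true; // keep the ISR short; do the real work in loop()
}

void setup() {
  refreshTimer.begin(refreshISR, 1000); // call refreshISR every 1000 microseconds
}

void loop() {
  if (refreshDue) {
    refreshDue = false;
    // update the display here
  }
}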

I’m creating an issue on the core to address this.

https://github.com/sparkfun/Arduino_Apollo3/issues/233

In the meantime you could try setting up a CTimer module to trigger an interrupt at a particular rate and use that to time your code. Check the datasheet for more info:

https://cdn.sparkfun.com/assets/learn_t … v0_9_1.pdf
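For a rough idea of what that could look like, here is a sketch using the AmbiqSuite HAL calls exposed by the v1.x core. Treat it as a starting point under assumptions, not a recipe: the timer number (A2), the 32.768 kHz XT clock source, the ~1 ms period, and the assumption that nothing else in your sketch or the core already claims timer A2 or the am_ctimer_isr handler all need checking against the datasheet and your core version.

// Periodic CTimer interrupt via the AmbiqSuite HAL (Apollo3 core v1.x assumed).
// Timer number, clock source, and the free am_ctimer_isr handler are assumptions.
#define MY_TIMER 2

volatile bool tick = false;

extern "C" void am_ctimer_isr(void) {
  // Clear the timer A2 compare interrupt and flag loop()
  am_hal_ctimer_int_clear(AM_HAL_CTIMER_INT_TIMERA2C0);
  tick = true;
}

void setup() {
  // Timer A2: repeat mode, interrupt on compare, clocked from the 32.768 kHz crystal
  am_hal_ctimer_config_single(MY_TIMER, AM_HAL_CTIMER_TIMERA,
                              AM_HAL_CTIMER_FN_REPEAT |
                              AM_HAL_CTIMER_INT_ENABLE |
                              AM_HAL_CTIMER_XT_32_768KHZ);

  // 33 ticks at 32.768 kHz is roughly 1 ms; adjust to taste
  am_hal_ctimer_period_set(MY_TIMER, AM_HAL_CTIMER_TIMERA, 33, 0);

  am_hal_ctimer_int_clear(AM_HAL_CTIMER_INT_TIMERA2C0);
  am_hal_ctimer_int_enable(AM_HAL_CTIMER_INT_TIMERA2C0);
  NVIC_EnableIRQ(CTIMER_IRQn);

  am_hal_ctimer_start(MY_TIMER, AM_HAL_CTIMER_TIMERA);
}

void loop() {
  if (tick) {
    tick = false;
    // update the display here instead of using delay()
  }
}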

One last idea… the RTC library allows you to set rolling alarms with hundredth-of-a-second precision. Not as fast as a CTimer could be, but it might be an easy way forward for now. Some big improvements to the RTC library are about to be released in version 1.1.2, btw, so keep your eyes peeled.

BTW, v1.1.2 of the core is now released and includes these RTC improvements.