How to add delay without using delay()

I would like to introduce a delay of a few tens of µs into a test build that I'm working on.  The standard routine, delayMicroseconds(), doesn't seem to behave as expected when the hardware timer Timer1 is in use, so I'm hoping to just create some delay within a subroutine.

Each loop of my wasteSomeTime() function appears to take around 40µs, provided that the loop it contains is executed at least 4 times.  With anything less than this, the measured delay falls right away.  I suspect that the compiler is optimising away statements whose results are never used.  The sketch plus some display printout is attached.
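For what it's worth, here is a hypothetical reconstruction of such a wasteSomeTime() helper (the original sketch isn't shown, and the loop counts are placeholders to be calibrated on the real hardware).  Declaring the inner counter volatile forces the compiler to perform every increment instead of discarding the "useless" loop, which is the likely cause of the vanishing delay at small iteration counts:

```cpp
#include <stdint.h>

// Hedged sketch, not the poster's actual code: a busy-wait delay helper.
// The counter is volatile, so the optimiser must carry out every write
// rather than eliminating the loop whose result is otherwise unused.
uint32_t wasteSomeTime(uint16_t outerLoops)
{
  volatile uint32_t dummy = 0;      // volatile: writes cannot be elided
  for (uint16_t i = 0; i < outerLoops; i++)
  {
    for (uint8_t j = 0; j < 4; j++)
    {
      dummy = dummy + 1;            // busy work that must really execute
    }
  }
  return dummy;                     // 4 increments per outer loop
}
```

The returned count is only there to make the routine easy to sanity-check; on the board you would just call it and time the result with micros() or a scope.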

Maybe there is a standard way to do this, and I'm just trying to re-invent the wheel.  All ideas welcome,

TIA, Robin

PaulOckenden replied:

Off the top of my head, try some serial output?  At least the timing for that is fairly repeatable.

It's normally what slows our sketches down, so it seems logical to use it as a delay!


calypso_rae replied:

My intention was to slow down each loop of the code by just a small amount, to see when we run out of time.  Printing something to the screen several thousand times per second would probably kill the process stone dead!

I think the odd effects that I've been seeing with delayMicroseconds() may well have been caused by some regular Serial output that was active.  By keeping everything as a Serial-free zone, it now seems to be working much better.  Thanks anyway.
