delay in microseconds

8 replies
wama
Offline
Joined: 2011-12-12
Posts: 20

Hello,

does anybody have an idea how to program a delay of 10 microseconds?

I already use the driver functions, which include a delay in milliseconds.
Modifying that function only gave me inaccurate results.

It would be great if somebody could help me.
Thanks

micrio
Offline
Joined: 2010-02-27
Posts: 273

I use the following code on an LPC1114 running at 48 MHz.

You should understand that the delay will be affected by
its alignment in memory. That is, if it runs out of cache
it will be faster than if it causes a cache miss. There was
a detailed discussion of that a while ago.

You should probably check the timing of this code on your hardware.

Here you go:

 
/* Roughly the cycle count for 10 uS; each loop pass costs several
   cycles rather than one, so this constant needs calibrating. */
#define US_TIME    (SystemFrequency / 100000)

void sleep_us(int us)
{
    volatile int i;    /* volatile so the optimizer keeps the loop */

    while (us--) {
        for (i = 0; i < US_TIME; i++) {
            ;    /* Burn cycles. */
        }
    }
}
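
A quick way to do that check (a sketch only; it assumes an LPC11xx-style GPIO block and a spare pin, here PIO0_7, which comes out of reset configured as a GPIO): bracket the delay with a pin write and measure the pulse width on a scope.

LPC_GPIO0->DIR  |= (1 << 7);     /* PIO0_7 as output */

LPC_GPIO0->DATA |= (1 << 7);     /* pin high */
sleep_us(10);                    /* nominal delay under test */
LPC_GPIO0->DATA &= ~(1 << 7);    /* pin low; the pulse width is the real delay */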

Polux rsv
Offline
Joined: 2010-07-19
Posts: 249

Timings this small are also easily disturbed by interrupt handling.

Angelo

Rob65
Offline
Joined: 2010-01-26
Posts: 681

Better to program a timer to tick in microseconds (or even faster if you want) and then wait for the timer to expire.

The software loop is highly affected by memory timing and optimization.

Set the PCLK of the timer to CCLK (CCLK/1) and set the prescale register of the timer to SystemCoreClock/1000000; this will make your timer tick at a 1 us interval.
Also program a match register to 10 (for a 10 us delay) and set the match control register to stop the timer when the match is reached. Now you can loop in your code and check whether the timer has reached the match value, but you could also enable the interrupt and use that to continue.

If there are no other interrupts active while you are waiting for the delay, you can use asm volatile ("wfi"); in your program to wait for the interrupt.
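
In code this could look roughly like the following. A sketch only: it assumes the CMSIS LPC11xx names (LPC_SYSCON, LPC_TMR32B0) and the 32-bit timer CT32B0, and the bit positions should be checked against the user manual; on the LPC11xx the timers run from the system clock, so there is no separate PCLK divider to set.

#include "LPC11xx.h"

/* Run once at startup. */
void delay_timer_init(void)
{
    LPC_SYSCON->SYSAHBCLKCTRL |= (1 << 9);   /* enable the clock to CT32B0 */
    /* The timer ticks every PR+1 PCLK cycles, hence the -1 for a 1 us tick. */
    LPC_TMR32B0->PR = (SystemCoreClock / 1000000) - 1;
}

void delay_timer_us(uint32_t us)
{
    LPC_TMR32B0->TCR = 0x02;                 /* reset the timer and prescale counters */
    LPC_TMR32B0->MR0 = us;                   /* match after 'us' microseconds */
    LPC_TMR32B0->MCR = (1 << 2);             /* MR0S: stop the timer on match */
    LPC_TMR32B0->TCR = 0x01;                 /* start counting */

    while (LPC_TMR32B0->TCR & 0x01)
        ;                                    /* the match clears the enable bit */
}

To wait on the interrupt instead, also set MCR bit 0 (MR0I), enable TIMER_32_0_IRQn in the NVIC, and replace the polling loop with the wfi instruction as above.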

Regards,
Rob

Luis Digital
Offline
Joined: 2010-03-16
Posts: 293

I'm also using a delay of 10 microseconds, like this:

#include "LPC11xx.h"    /* CMSIS device header; use the one for your part */

volatile int usTicks;   /* volatile: written by the interrupt, polled below */

void SysTick_Handler(void) {
  usTicks++;
}

void init_delay(void) {
    SysTick_Config(SystemCoreClock / 1000000);    /* one interrupt per microsecond */
}

void delay_us(int us) {
    usTicks = 0;
    while (usTicks < us)
        ;    /* wait for the tick count to reach the requested delay */
}

wama
Offline
Joined: 2011-12-12
Posts: 20

Thanks for your replies.
I already tried the example from micrio. To check the delay I set and cleared an output pin before and after the delay, but the delay is too long.
I guess that toggling a pin takes a little time as well, but that can't be the reason.

I will try the other examples to find a solution.

micrio
Offline
Joined: 2010-02-27
Posts: 273

What optimization are you using? What chip?
My example is likely to be highly specific to my environment!

I did time my code using an output pin. That is very accurate.

Pete.

wama
Offline
Joined: 2011-12-12
Posts: 20

I'm using an LPC1112 running at 12 MHz at the moment, and I work with Keil uVision4.

For my application I have to set two output pins with a delay of 3 microseconds between them.

I use one of the two 32-bit timers to realize the delay. Therefore I set the prescale register to 12000000/1000000 = 12 and the first match register to 3.
I don't know how to set the PCLK of the timer to CCLK.
The delay I measure is always longer, around 5 microseconds.

I don't understand where the extra delay comes from. Maybe setting the GPIO output pin takes too much time, or something is wrong with my basic timer setup.
Is there a way to speed up setting and clearing a GPIO pin?

micrio
Offline
Joined: 2010-02-27
Posts: 273

The delay comes from decrementing the counter. It spins on just a few instructions.

There are complications, however. This thread discusses the execution timing of loops with regard to their alignment in flash memory:

http://knowledgebase.nxp.com/showthread.php?t=460

You could set up a timer and an interrupt handler. That will work OK, but getting a delay of a few microseconds when your clock speed is only 12 MHz will be tricky. The overhead of setting up the timer and handling the interrupt may account for most of the delay. Also, you are using a hardware resource that may be needed elsewhere.

It is hard to get delays that are only a small multiple of the system clock period. That is why I used the spin loop. In my application it was important that the delay be no shorter than the specified value; if the loop was interrupted by some other process and the delay stretched out, that was no problem for what I needed to do. Your requirements may be different.
