Let's Make Robots!

EDIT: Interrupts (in C) ?

Hi,

My question: is it possible to estimate or calculate the time a µcontroller needs to execute a few commands? I'm trying to make a whole series of commands take exactly 1 second, something like this (programmed in C):

while(j<time){

//several commands..

}

The whole while-loop should take exactly 1 second (to make a reasonably accurate clock), but I don't know if there is a way to calculate the time. I'm just doing trial and error right now, but I guess there are better ways?

Thanks in advance!

_ZlaTanskY_

 

EDIT: I'm testing interrupts now (in C), from some tutorials I found on the internet. I got the timer interrupt working, but the external interrupt doesn't work for me. The datasheet says the RA2 pin (I'm using a PICkit 2 with a PIC16F690) is the INT pin.

Here's the code I'm using to light an LED when the external interrupt on the RA2 pin fires:

#define _LEGACY_HEADERS
#include <htc.h>



__CONFIG(INTIO & WDTDIS & PWRTEN & MCLRDIS & UNPROTECT \
  & BORDIS & IESODIS & FCMDIS);



void interrupt LED(void){
    if (INTF){
        INTF = 0;
        RC0 = 1;
    }
}

void main(void)
{

    PORTA = 0;              
    PORTC = 0;
    TRISA = 1;
    TRISC = 0;               
    INTCON = 0b10010000;
   
    while(1)               //  Loop Forever
    {
    } 
}

I use a male-to-male jumper cable to connect Vdd to RA2: nothing happens. I also tried setting another pin high and connecting that output to RA2: still nothing... What am I doing wrong?

Thanks in advance

_ZlaTanskY_


Finally found out what the problem was!

I forgot the ANSEL = 0; command, which makes all the inputs digital... =)
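For anyone else who lands here: on the PIC16F690 the pins shared with the ADC and comparators come out of reset configured as analog, and a digital read on an analog pin always returns 0. A sketch of what the initialization might look like with that fixed (the ANSELH line and the TRISA bit-2 setting are my additions; double-check them against the datasheet):

```c
PORTA  = 0;
PORTC  = 0;
ANSEL  = 0;            /* AN0-AN7 pins digital */
ANSELH = 0;            /* AN8-AN11 pins digital too */
TRISA  = 0b00000100;   /* bit 2 set so RA2/INT is an input */
TRISC  = 0;            /* PORTC drives the LED */
INTCON = 0b10010000;   /* GIE + INTE, as in the original code */
```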

Thanks for the replies everyone!

It has been a while, but I believe there is another flag that has to be set to enable interrupts.

Bump! I yelled too soon :)

EDIT: I meant 'EDIT', not 'bump'

Okay, so I've found a very nice tutorial about the timer AND using the debugger. If anyone else has any problems, you can check this out:

http://ww1.microchip.com/downloads/en/DeviceDoc/51682A.pdf

Let's get this clock working! ^^

Hm, I guess timer interrupts are the most precise way to go. I've been looking at the debugger for a small amount of time, but can't seem to get it working. I must be doing something wrong; I'll look at it when I've got some more spare time :)

@Oddbot: I'm using a PIC16F690 from Microchip (20MHz max), programming language is C. I have yet to work out how to do things like sleep() or millis() with it, same as the interrupts. So I'll have to take a look at those manuals, I guess.
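There's no millis() in HI-TECH C, but the compiler does ship busy-wait delay macros once you tell it the clock speed. A sketch, assuming the 4MHz internal oscillator (the __delay_ms macro needs _XTAL_FREQ defined before use):

```c
#define _XTAL_FREQ 4000000      /* clock speed the delay macros assume */
#define _LEGACY_HEADERS
#include <htc.h>

void blink_once(void)
{
    RC0 = 1;
    __delay_ms(500);            /* busy-wait: nothing else runs meanwhile */
    RC0 = 0;
    __delay_ms(500);
}
```

Busy-wait delays don't account for the time the surrounding code takes, so for a real clock a timer interrupt is still the better tool.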

Thanks!

_ZlaTanskY_

What processor / language are you using?
For now I will assume Arduino.

Arduino has millis() and micros() commands that give you the exact number of milliseconds or microseconds since the processor began working.

In my program I have a global variable: unsigned long time;

Then in my main loop:

if(millis()-time>999)
{
  time=millis();
  FunctionX();
}

Although this setup will not call FunctionX precisely every second down to the microsecond, it will be accurate as long as your main loop repeats in less than 1mS, and it could be modified to stay accurate even if your main loop took several milliseconds per pass.

If you want to call FunctionX precisely every second, then you need to use a timer interrupt routine. RobotFreak has a good walkthrough here: http://letsmakerobots.com/node/28278
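For the PIC16F690 specifically, a rough sketch of that timer-interrupt idea: with the 4MHz internal oscillator the instruction clock is 1MHz, and Timer1 with a 1:8 prescaler ticks at 125kHz, so preloading 65536 − 62500 = 3036 makes it overflow every 0.5s exactly (ignoring a few cycles of reload latency). Register and bit names are from the legacy HI-TECH headers; verify against your compiler:

```c
#define _LEGACY_HEADERS
#include <htc.h>

volatile unsigned char halves = 0;

void interrupt tick(void)
{
    if (TMR1IF) {
        TMR1IF = 0;
        TMR1H = 3036 >> 8;          /* reload: 65536 - 62500 counts */
        TMR1L = 3036 & 0xFF;
        if (++halves == 2) {        /* 2 x 0.5 s = 1 s */
            halves = 0;
            RC0 ^= 1;               /* toggle LED once per second */
        }
    }
}

void main(void)
{
    ANSEL = 0; ANSELH = 0;          /* all pins digital */
    TRISC = 0;                      /* PORTC output for the LED */
    T1CON = 0b00110001;             /* 1:8 prescale, Timer1 on, Fosc/4 */
    TMR1IE = 1; PEIE = 1; GIE = 1;  /* Timer1 + peripheral + global ints */
    while (1) ;                     /* everything happens in the ISR */
}
```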

 

Use the stopwatch in MPLAB. It's in the simulator, under the Debugger menu. Make sure you set the frequency your processor is running at under Debugger > Settings, then set breakpoints at the start and end of the section of code you want to time.

I know because I just caught myself looking for a way to "up vote" Merser's comment...