Manel28
Joined: 06 May 2010 Posts: 33
Interruption priority problem
Posted: Fri Mar 25, 2011 2:39 am
Hi,
I'm having problems dealing with the priority of my two interrupts. The thing is that I am developing an Ethernet-serial converter, using INT_RDA for the incoming data and timer1 to generate packets every 50ms. Both of them are critical: the packets generated every 50ms keep synchronization with the terminal connected to the serial interface, and the terminal's response comes back through the RX pin.
I am using a PIC18F25J10 and PCH v4.093, and doing:
Code:
#int_timer1
void timer1_isr() {
   timer++;
   if (timer==2) {
      temp=1;
      timer=0;
   }
   set_timer1(0x5D3B);
}

#int_rda
void serial_isr() {
   int t2;
   rcvchar=0x00;
   if (kbhit()) {
      rcvchar=fgetc();
      cbuff2[xbuff]=rcvchar;
      t2=xbuff;
      xbuff=(xbuff+1) % lenbuff2;
   }
   if (t2==(lenbuff2-1))
      flagACKMesg=1;
}
I would like not to lose bytes received through RDA while the timer1 handler is executing.
Ttelmah
Joined: 11 Mar 2010 Posts: 19541
Posted: Fri Mar 25, 2011 3:46 am
You won't.
There is effectively a two-character buffer in the hardware: the single character that has already arrived (which triggers the interrupt), and a second character being assembled. You have a whole character time to respond to the interrupt. In computer terms (assuming a rate like 9600bps, or 19200bps), an 'age'....
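For example (assuming 19200bps and a 10-bit frame of one start bit, eight data bits and one stop bit), a character takes 10/19200 ≈ 520us to arrive - hundreds or thousands of instruction cycles, depending on your clock.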
There are though some questionable bits in the code:
1) Only use the '%' operation for the buffer wrap if your buffer is a 'binary' size (say 16, or 32 characters). With such sizes the compiler just performs a shift/mask to do the division, and the code is quick. However, if you use it with a buffer size like '40' characters, the compiler has to perform a real division, which takes enormously longer, and also means interrupts will be disabled around divisions in the external code (to protect the non-re-entrant maths routine). A much bigger problem....
If you must have a 'non-binary' buffer size, simply test whether the incremented counter equals the buffer size, and set it to 0 if it does (see the sketch after point 3).
2) You don't need 'kbhit'. The interrupt means a character _is_ waiting.
3) fgetc _requires_ a stream name. If working without streams, use getc.
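Pulling points 1-3 together, a minimal sketch of the ISR (buffer size and names are assumptions, untested):
Code:
#define BUFFSIZE 32                 // 'binary' size, so the wrap stays cheap
char cbuff2[BUFFSIZE];
int8 xbuff = 0;

#int_rda
void serial_isr(void) {
   cbuff2[xbuff] = getc();          // the interrupt means a character IS waiting
   xbuff++;                         // wrap by test-and-reset: no division,
   if (xbuff == BUFFSIZE)           // so nothing to disable interrupts for
      xbuff = 0;
}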
Best Wishes
Manel28
Joined: 06 May 2010 Posts: 33
Posted: Fri Mar 25, 2011 4:32 am
I have made some corrections, but the code still seems to do something wrong. I think the problem is the two handlers executing at the same time. Sometimes the TX of packets gets stopped (the timer interrupts every 50ms to trigger the transmission through the TX pin), and sometimes it is the reception of the data sent by the terminal (19 bytes every 60ms more or less, which is why cbuff2 is 19 bytes long) that gets corrupted. How can I give one interrupt more priority than the other? I suppose the bytes at the receiver get corrupted because the first bytes are lost while the timer handler is executing every 50ms, and then the 4th, 5th or 6th byte gets buffered at the wrong position.
Code:
#int_rda
void serial_isr() {
   int t2;
   rcvchar=0x00;
   rcvchar=getc();
   cbuff2[xbuff]=rcvchar;
   t2=xbuff;
   xbuff++;
   if (xbuff==lenbuff2)
      xbuff=0;
   if (t2==(lenbuff2-1))
      flagACKMesg=1;
}
Ttelmah
Joined: 11 Mar 2010 Posts: 19541
Posted: Fri Mar 25, 2011 5:01 am
I really doubt your problem is priorities.
More likely it is a fault in the buffer handling. You seem to be using a linear, rather than a circular, buffer. As such your buffer size checking approach is 'wrong'.
Think about it. How long does it take your _external_ code, to handle the 19 byte packet?. Can this be done in one character time?. If not, then the interrupt is going to start _overwriting_ the buffer, before you have finished dealing with it. Disaster....
Also, what happens if a byte gets corrupted, or missed? You seem to rely on the 19 byte count, but unless there is some form of 'start of packet', or 'end of packet' marker, this is a formula for problems.
Look at ex_sisr, for an example of how to handle a circular, rather than linear buffer. Then make it larger than the packet size - and keep the size binary with the example code, so (say) 32 characters. Work out some way of being _sure_ you have found the start of packet. Then either handle the characters as soon as data becomes available, or just check the difference between the 'input' location, and 'output' location, and when this gets to 19, handle the entire packet.
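For instance, something along these lines (only a sketch - the names and the 0x02 start marker are assumptions, use whatever your terminal actually sends):
Code:
#define BUFFSIZE 32
#define PACKET_LEN 19
#define START_MARKER 0x02            // assumed marker - substitute your own

char cbuff2[BUFFSIZE];               // filled by the INT_RDA handler
int8 in_locn = 0, out_locn = 0;

int8 bytes_waiting(void) {           // difference between input and output locations
   if (in_locn >= out_locn)
      return (in_locn - out_locn);
   return ((BUFFSIZE + in_locn) - out_locn);   // input has wrapped round
}

// called from the main loop
void check_for_packet(void) {
   // resynchronise: discard bytes until a start marker is at the head
   while (bytes_waiting() && cbuff2[out_locn] != START_MARKER)
      out_locn = (out_locn + 1) % BUFFSIZE;    // binary size, so this is cheap
   if (bytes_waiting() >= PACKET_LEN) {
      // 19 bytes beginning with the marker are waiting - handle the packet here
   }
}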
Best Wishes
Manel28
Joined: 06 May 2010 Posts: 33
Posted: Fri Mar 25, 2011 6:09 am
Yes, I know what you mean. A circular buffer would be a better option.
I was talking about priorities because I just made a test commenting out the timer1 handler code, and it seems to receive all bytes without a single corrupted bit, even with the linear buffer. I can assume then that the timer1 interrupt affects the reception of the 19-byte packet.
Should I set priorities then?
The amazing world of the microcontrollers :D. Anyway, thanks for your help
Ttelmah
Joined: 11 Mar 2010 Posts: 19541
Posted: Fri Mar 25, 2011 10:54 am
How fast is your baud rate?.
What is your clock rate?.
Did your 'test' perform the actual buffer handling operation from the external code?. As I say, I suspect this is the real heart of the problem....
The simplest 'priority' method, is just to declare the timer interrupt after the INT_RDA. This makes the compiler test the RDA bit _first_ in the global handler code.
Second method is to use the hardware high priority ability. Whether you can use this will depend on whether you are using any other interrupts, in particular INT_EXT.
Provided you are _not_ using INT_EXT, you just need to add "#device HIGH_INTS=TRUE" to the top of your source code, and declare INT_RDA as "#INT_RDA HIGH".
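In outline (handler bodies omitted, and assuming no other high-priority interrupt is declared):
Code:
#device HIGH_INTS=TRUE      // near the top of the source, after the device include

#INT_RDA HIGH               // INT_RDA can now pre-empt the timer handler
void serial_isr(void) {
   // read and buffer the character, exactly as before
}

#int_timer1                 // stays at normal priority
void timer1_isr(void) {
   // 50ms tick, as before
}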
Best Wishes
Manel28
Joined: 06 May 2010 Posts: 33
Posted: Mon Mar 28, 2011 2:04 am
The baud rate is 19200 bps and the clock is 40MHz (#use delay(clock=40M, oscillator=10M) #fuses H4_SW,NOWDT,NOPROTECT,NODEBUG).
The test I've made consists of sending 19-byte packets every 70ms at 19200bps. It seems to receive all the data perfectly when the timer code is commented out, and if I add it back the data gets corrupted. I was thinking about disabling the timer interrupt inside the RDA interrupt handler.. maybe that could be a solution
Ttelmah
Joined: 11 Mar 2010 Posts: 19541
Posted: Mon Mar 28, 2011 2:51 am
No.
All interrupts are already disabled while any other interrupt handler is running (unless you are using high ints).
Enable warnings when you compile the code. Now, do you have any 'interrupt disabled' warnings?. Do any appear/disappear if you enable the timer code?.
What you show for the timer, should have no effect whatsoever.
However, some different RDA code:
Code:
#define BUFFSIZE (32)
char cbuff2[BUFFSIZE];
int in_locn=0;
int out_locn=0;

#int_rda
void serial_isr(void) {
   int t2,t1;
   do {
      t1=in_locn;
      cbuff2[in_locn++]=getc();
      if (in_locn<out_locn) {
         t2=(BUFFSIZE+in_locn)-out_locn;
      }
      else t2=in_locn-out_locn;
      if (t2>=19) flagACKMesg=1;
      if (in_locn>=BUFFSIZE) in_locn=0;
      if (in_locn==out_locn) in_locn=t1;   //Throw data if buffer overflow
   } while (kbhit());
}
Obviously you need to use code like bgetc to read the characters from your buffer (a sketch follows the list below).
This does multiple things:
1) Uses a binary sized buffer larger than your message.
2) Will read multiple characters if they are waiting.
3) Sets flagACKMesg if 19 characters are in the buffer.
4) Throws away characters if the buffer overflows.
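A bgetc along the lines of the one in ex_sisr, adapted to the names above (just a sketch):
Code:
#define bkbhit (in_locn!=out_locn)      // true while the buffer holds data

char bgetc(void) {
   char c;
   while (!bkbhit) ;                    // wait for a character
   c=cbuff2[out_locn];
   out_locn=(out_locn+1) % BUFFSIZE;    // BUFFSIZE is binary, so this stays cheap
   return (c);
}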
Best Wishes
Manel28
Joined: 06 May 2010 Posts: 33
Posted: Mon Mar 28, 2011 3:45 am
No, I don't get any 'interrupt disabled' warning when I compile the code. Now a new problem arises. As I said, the idea is to make a serial-Ethernet converter, and I was following the code of the TCP stack example ex12.c.
The code where I send the serially received packet was something like:
Code:
case UDP_TX_ISREADY:
   if (UDPIsPutReady(tx_socket)) {
      for (bits=0; bits<=lenbuff2-1; bits++)   //lenbuff2=19
         UDPPut(cbuff2[bits]);
      UDPFlush();
      UDPClose(tx_socket);
      state=UDP_TX_WAIT;
   }
   break;
Here it was pretty easy to send the 19 bytes, but now I'm not sure how I have to read the bytes from cbuff2. Sorry for so many questions, but I am a beginner with this stuff.
Manel28
Joined: 06 May 2010 Posts: 33
Posted: Mon Mar 28, 2011 4:43 am
I also add the setup from the main function. Maybe I'm doing something wrong:
Code:
setup_spi(SPI_MASTER | SPI_L_TO_H | SPI_XMIT_L_TO_H | SPI_CLK_DIV_4);
enable_interrupts(INT_RDA);
setup_timer_1(T1_INTERNAL|T1_DIV_BY_8);
set_timer1(0x5D3B);
enable_interrupts(INT_TIMER1);
enable_interrupts(GLOBAL);
Wayne_
Joined: 10 Oct 2007 Posts: 681
Posted: Mon Mar 28, 2011 8:53 am
Is lenbuff2 a variable, a #define, or a const?
In this case it would usually be a #define, which reduces the code generated when compiled. If it is a variable, the compiler has to treat it as something whose value may change, which results in more code.
I doubt this is your problem, just an observation.
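For example, the difference is between:
Code:
#define lenbuff2 19   // resolved at compile time - no RAM, less generated code
and:
Code:
int lenbuff2 = 19;    // a real variable the compiler must fetch at run time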
Manel28
Joined: 06 May 2010 Posts: 33
Posted: Mon Mar 28, 2011 9:25 am
It is defined as static int const. I have just tested it again, but now with a circular buffer using the rda_int code Ttelmah gave me, and I get the same behaviour.. quite strange. I am 90% sure there is something the timer1 does that corrupts the received bytes.. like interrupting during the receiving process, and that is why the data is bad. Again I've simulated the reception commenting out these lines:
Code:
setup_timer_1(T1_INTERNAL|T1_DIV_BY_8);
set_timer1(0x5D3B);
enable_interrupts(INT_TIMER1);
and it works perfectly. The thing is that I am not able to give rda_int a higher priority. Maybe there is something I'm missing related to timer1 on the PIC18F25J10. Thanks for the suggestions, Wayne.. this is my first PIC and it's driving me crazy
Ttelmah
Joined: 11 Mar 2010 Posts: 19541
Posted: Mon Mar 28, 2011 1:53 pm
I see the comment _simulated_. What simulator?. Have you tried an actual chip?.
You may be chasing a simulator problem...
Best Wishes
Ttelmah
Joined: 11 Mar 2010 Posts: 19541
Posted: Mon Mar 28, 2011 2:54 pm
The other thing is that presumably the timer code triggers _something_ externally when you set 'temp'. What does this do?. The problem could be that something in this code walks over part of the RAM, and corrupts values....
Best Wishes
Manel28
Joined: 06 May 2010 Posts: 33
Posted: Wed Mar 30, 2011 2:20 am
Hi again, I think I'm getting closer to the problem. The "temp" variable you are asking about is a flag which activates a piece of code inside the main loop. You are right: the problem is not the interrupt itself but the code executed when temp=1. Something like:
while (TRUE) {
   StackTask();
   UDPTxTask();
   UDPRxTask();
   if (temp==1) {
      putc('A');
      putc('B');
      putc('C');
      putc('D');
      putc('E');
      putc('F');
      putc('G');
      putc('H');
      putc('I');
   }
}
At the same time I'm receiving the 19 bytes through the RX line. The problem in fact is the RS232 behaviour: if I change it to, for example, if(temp==2), so that the putc() calls do not execute, I receive the 19 bytes perfectly.
I have defined:
#use rs232(baud=19200, parity=O, bits=8, xmit=PIN_C6, rcv=PIN_C7)