

SPI PIC33 master to PIC24 slave

 
CCS Forum Index -> General CCS C Discussion
tinley



Joined: 09 May 2006
Posts: 67


SPI PIC33 master to PIC24 slave
Posted: Mon Jun 22, 2015 4:05 am

My project uses a PIC33FJ as the main processor and a 24FJ256DA206 as a graphic display driver. I have got some comms working over SPI, but only intermittently; I presume this is a framing error. I have tried slowing down the master by setting a very low baud rate. The following code sometimes works after a reset and, once working, stays in sync; other times garbage is received. So I tried using the enable/SS pin and can't get it to work. The documentation is not great on this subject, but I know there are some clever chaps with good knowledge on this forum! Thank you for any help.

I have tried various combinations of enable and SS1IN.

Code from Master:
Code:

#include <33EP256MU806.h>

#FUSES NOWDT                    //No Watch Dog Timer
#FUSES CKSFSM                   //Clock Switching is enabled, fail Safe clock monitor is enabled
#FUSES NOBROWNOUT               //No brownout reset
#FUSES NOJTAG                   //JTAG disabled

#device ICSP=1
#use delay(clock=32MHz,crystal=12MHz)

#pin_select SCK1OUT=PIN_G6
#pin_select SDI1=PIN_G7
#pin_select SDO1=PIN_G8
#use spi(MASTER, SPI1, MODE=0, BITS=8, BAUD=28000,ENABLE=PIN_E6, stream=SPI_LCD)
//#use spi(MASTER, SPI1, MODE=0, BAUD=28000, BITS=8, stream=SPI_LCD)



void main()
{
   output_bit(PIN_E7,0);      //reset LCD processor

   input(PIN_E7);
   set_pullup(TRUE,PIN_E7);  //use pullup on MR_LCD to allow ICSP of LCD processor


   while(TRUE)
   {

      spi_xfer(SPI_LCD, 0, 8);
      delay_ms(500);
     
      spi_xfer(SPI_LCD, 120, 8);
      delay_ms(500);

      spi_xfer(SPI_LCD, 128, 8);
      delay_ms(500);
   }

}

Code from Slave:
Code:

#include <24FJ256DA206.h>


#FUSES NOWDT                    //No Watch Dog Timer
#FUSES NOJTAG                   //JTAG disabled
#FUSES CKSFSM                   //Clock Switching is enabled, fail Safe clock monitor is enabled
#FUSES OSCIO                    //OSC2 pin functions as general purpose I/O

#device ICSP=1
#use delay(internal=32MHz)

#define RX_BT     PIN_D2
#define TX_BT     PIN_D1
// debug UART stream 'BLUETOOTH' (#use rs232) is set up elsewhere in the project

#define CK_LCD    PIN_B1
#define DI_LCD    PIN_B0
#define DO_LCD    PIN_B6
#define SS_LCD    PIN_B7


////////////////////////// SPI slave input from main processor
#pin_select SCK1IN=CK_LCD
#pin_select SDI1=DO_LCD // DI to DO
#pin_select SDO1=DI_LCD // DO to DI

#pin_select SS1IN=PIN_B7
//#use spi(SLAVE, SPI1, MODE=0, BITS=8, ENABLE=PIN_B7, stream=SPI_PORT1)
#use spi(SLAVE, SPI1, MODE=0, BITS=8, stream=SPI_PORT1)


void main()
{
   unsigned int8 i;


   while(TRUE){

      i=spi_xfer(8);

      fprintf(BLUETOOTH,"input %u\r\n",i);

   }
 
}


P.S. I am using the ICSP1 pins for SPI, but the programmer is disconnected.
tinley



Joined: 09 May 2006
Posts: 67


Posted: Mon Jun 22, 2015 9:16 am

I got this working by moving the i=spi_xfer(8); into an interrupt:
Code:

#INT_SPI1
void  spi1_isr(void){
  unsigned int8 i;
  i=spi_xfer_in();
  fprintf(BLUETOOTH,"input %u\r\n",i);
}


And setting up:
Code:

#pin_select SCK1IN=CK_LCD
#pin_select SDI1=DO_LCD // DI to DO
#pin_select SDO1=DI_LCD // DO to DI
#pin_select SS1IN=PIN_B7
#use spi(SLAVE, SPI1, MODE=0, BITS=8, ENABLE=PIN_B7, stream=SPI_PORT1)


Not at all clear in any CCS documentation.

My problem now is that, with the Master set to a more realistic processor speed of 120MHz, the SPI baud cannot be set low enough for the Slave running at its maximum speed of 32MHz.

After several days of effort on this, maybe I should use I2C instead?
Ttelmah



Joined: 11 Mar 2010
Posts: 19552


Posted: Mon Jun 22, 2015 11:06 am

Er, I'm using 2MHz on a 32MHz chip. You can easily go a factor of 100 slower than the slave can handle.

The problem is delays _between_ the bytes, not the actual SPI speed. On the slave it typically takes about 30 instructions to get into the interrupt handler. As written, your handler then bottlenecks massively: the fprintf takes an age (you don't tell us the baud rate this is set for, but at 38400bps the print will take nearly 3ms...). Get rid of it, and instead buffer the data into a circular buffer (like ex_sisr.c). It then takes a handful of instructions to actually save the data, and another 30 to get out of the handler; say 80 instructions total. So at your 16MIPS on the slave, you can potentially handle no more than perhaps 200,000 bytes/second, a 1600kHz SPI rate. Your master chip can clock the SPI down to about 29kHz! Clock the SPI at 500kHz, and buffer the data.
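For reference, the ex_sisr.c idea is just a circular buffer: the ISR only stores the byte and bumps an index, and the main loop drains it at its leisure. Here is a minimal sketch of that logic in plain C (the names spi_isr, data_waiting and bgetc are illustrative, not CCS functions; on the real slave the ISR body would sit in #INT_SPI1 and read spi_xfer_in() instead of taking a parameter):

```c
#include <stdint.h>

/* Power-of-two size makes the index wrap a cheap AND instead of a modulo. */
#define BUF_SIZE 16

static uint8_t buf[BUF_SIZE];
static volatile uint8_t next_in  = 0;   /* written only by the ISR       */
static volatile uint8_t next_out = 0;   /* written only by the main loop */

/* ISR side: just store and bump -- a handful of instructions, so the
 * slave is ready for the next byte almost immediately. */
void spi_isr(uint8_t byte_from_master)
{
    buf[next_in] = byte_from_master;
    next_in = (next_in + 1) & (BUF_SIZE - 1);
}

/* Main-loop side: poll for data, then pull one byte out. */
int data_waiting(void)
{
    return next_in != next_out;
}

uint8_t bgetc(void)
{
    uint8_t c = buf[next_out];
    next_out = (next_out + 1) & (BUF_SIZE - 1);
    return c;
}
```

The slow fprintf then moves into the main loop (print a byte only when data_waiting says one has arrived), so it no longer blocks reception. The arithmetic above follows directly: 16MIPS divided by roughly 80 instructions per byte gives about 200,000 bytes/second, i.e. roughly a 1.6MHz SPI clock as the absolute ceiling.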
tinley



Joined: 09 May 2006
Posts: 67


Posted: Mon Jun 22, 2015 2:56 pm

Thanks for the detailed reply. I was only using the interrupt because I could find no other way of getting the frame to sync correctly. I was only using the UART for debug purposes, and find it hard to believe that a single byte once every 500ms will hold up the SPI receive? Surely it can buffer the byte?

I basically can't find any reference on how to set up a Slave SPI that isn't contradicted somewhere else! The CCS manual is so full of typos and mistakes. I have used SPI as a Master before, but not as a Slave.

Any help would be very useful please?

How would you set up the slave and what function would you use to read the data please?
Ttelmah



Joined: 11 Mar 2010
Posts: 19552


Posted: Mon Jun 22, 2015 11:39 pm

OK.

The first 'key' thing is to understand that with SPI, the master clocks everything. This is why you have to wait for data to arrive from the master _before_ the slave can read it. There are two ways of doing this: using the function spi_data_is_in, or using the interrupt.

Then, while I like #use spi for the master device, I've had 'doubts' about it for the slave.
In CCS there are two separate ways of configuring/using SPI: the 'old' method, which can only handle the hardware (setup_spi, spi_read & spi_write), and the 'new' method (#use spi & spi_xfer). The latter is better in terms of supporting software SPI, allowing different data lengths, and controlling the enable line (all good), but unlike setup_spi there appears to be no documented way of setting up the slave to use the select line. So for a slave, since the hardware has to be used anyway, I'd use the hardware controls and stick with setup_spi.
For this there is an example (the CCS examples should be treated as part of the manual): ex_spi_slave.c. The slave defaults to using the slave select unless SPI_SS_DISABLED is selected.
tinley



Joined: 09 May 2006
Posts: 67


Posted: Tue Jun 23, 2015 1:43 am

OK, thank you, that helps explain a lot. Having used the newer #use spi for the master, I automatically used it for the slave and had not even tried the older method used in the example. I had actually assumed, wrongly, that the older method would implement a software SPI.

Although my second example above does work the enable/SS lines via the interrupt, it does not work with spi_data_is_in. There is also an issue with the slave not handling the speed of data coming from the master. I can see on a scope that the master behaves when I adjust its baud rate or use the SPI1CON1 register to set the maximum clock divide.

So I will now try your advice and use setup_spi instead, which will hopefully fix both issues.

Thank you, I will post results later...
tinley



Joined: 09 May 2006
Posts: 67


Posted: Tue Jun 23, 2015 5:59 am

Yes, that works! Thank you! It took a while because I was also trying to send 16-bit words, which are also badly documented and unreliable. With the master running at 120MHz and the slave at 32MHz, I am using the maximum speed of the master and getting reliable data with 8 bits. Working code below.

The following code works with compiler version 5.021.
#use spi does not work reliably for a Slave, with this compiler version at least.

Master:
Code:

#include <33EP256MU806.h>

#FUSES NOWDT                    //No Watch Dog Timer
#FUSES CKSFSM                   //Clock Switching is enabled, fail Safe clock monitor is enabled
#FUSES NOBROWNOUT               //No brownout reset
#FUSES NOJTAG                   //JTAG disabled

#device ICSP=1
#use delay(clock=120MHz,crystal=12MHz)

#pin_select SCK1OUT=PIN_G6
#pin_select SDI1=PIN_G7
#pin_select SDO1=PIN_G8
#use spi(MASTER, SPI1, MODE=0, BITS=8,ENABLE=PIN_E6, stream=SPI_LCD)

void main()
{
   output_bit(PIN_E7,0);      //reset LCD processor

   input(PIN_E7);
   set_pullup(TRUE,PIN_E7);  //use pullup on MR_LCD to allow ICSP of LCD processor


   while(TRUE)
   {

      spi_xfer(SPI_LCD, 0, 8);
      delay_ms(500);
     
      spi_xfer(SPI_LCD, 120, 8);
      delay_ms(500);

      spi_xfer(SPI_LCD, 128, 8);
      delay_ms(500);
   }

}


Slave:
Code:

#include <24FJ256DA206.h>


#FUSES NOWDT                    //No Watch Dog Timer
#FUSES NOJTAG                   //JTAG disabled
#FUSES CKSFSM                   //Clock Switching is enabled, fail Safe clock monitor is enabled
#FUSES OSCIO                    //OSC2 pin functions as general purpose I/O

#device ICSP=1
#use delay(internal=32MHz)

#define RX_BT     PIN_D2
#define TX_BT     PIN_D1
// debug UART stream 'BLUETOOTH' (#use rs232) is set up elsewhere in the project

#define CK_LCD    PIN_B1
#define DI_LCD    PIN_B0
#define DO_LCD    PIN_B6
#define SS_LCD    PIN_B7


////////////////////////// SPI slave input from main processor
#pin_select SCK1IN=CK_LCD
#pin_select SDI1=DO_LCD // DI to DO
#pin_select SDO1=DI_LCD // DO to DI
#pin_select SS1IN=PIN_B7



void main()

{
   unsigned int8 i;
   setup_spi(SPI_SLAVE  | SPI_SS_ENABLED | SPI_H_TO_L );

   while(TRUE){
      if(spi_data_is_in()){
          i=spi_read();
          fprintf(BLUETOOTH,"input %u\r\n",i);
      }
   }
 
}
Ttelmah



Joined: 11 Mar 2010
Posts: 19552


Posted: Tue Jun 23, 2015 6:58 am

Good.

The 'older' setup _only supports the hardware_. It's one of those 'not obvious' things....
tinley



Joined: 09 May 2006
Posts: 67


Posted: Thu Jun 25, 2015 2:46 am

UPDATE: this is not fully working. Occasionally there is an error on bit 7 when TX'ing several bytes in a row.

Now, I know it has been spelled out on this forum not to mix #use spi with setup_spi, but in this case it seems to get around the bugs and lack of documentation in compiler V5.021.

I have now ended up with this abomination of code for the Master... but it works!

Code:

#include <33EP256MU806.h>

#FUSES NOWDT                    //No Watch Dog Timer
#FUSES CKSFSM                   //Clock Switching is enabled, fail Safe clock monitor is enabled
#FUSES NOBROWNOUT               //No brownout reset
#FUSES NOJTAG                   //JTAG disabled

#device ICSP=1
#use delay(clock=120MHz,crystal=12MHz)

#pin_select SCK1OUT=PIN_G6
#pin_select SDI1=PIN_G7
#pin_select SDO1=PIN_G8
#use spi(MASTER, SPI1, MODE=0, BITS=8,ENABLE=PIN_E6, stream=SPI_LCD)

void main()
{
   int8 spi_wait = 20;
   output_bit(PIN_E7,0);      //reset LCD processor
   delay_ms(10);
   input(PIN_E7);
   set_pullup(TRUE,PIN_E7);  //use pullup on MR_LCD to allow ICSP of LCD processor   
   delay_ms(100);
   
   setup_spi(SPI_MASTER  | SPI_CLK_DIV_1 | SPI_SS_ENABLED | SPI_H_TO_L);   //*********** THIS IS WRONG! BUT SPI NOT WORKING CORRECTLY WITHOUT IT ****** COMPILER BUGS IN SPI *******


   while(TRUE)
   {
      spi_xfer(SPI_LCD, 0xff, 8);
      delay_ms(spi_wait);
     
      spi_xfer(SPI_LCD, 0, 8);
      delay_ms(spi_wait);
     
      spi_xfer(SPI_LCD, 'B', 8);
      delay_ms(spi_wait);
     
      spi_xfer(SPI_LCD, CHR_PRINTCHAR & 0xFF, 8);   // CHR_PRINTCHAR: 16-bit command code defined elsewhere in the project
      delay_ms(spi_wait);
     
      spi_xfer(SPI_LCD, CHR_PRINTCHAR >> 8, 8);
      delay_ms(spi_wait);
     
      spi_xfer(SPI_LCD, 0xfe, 8);
      delay_ms(spi_wait);
     
      delay_ms(500);
   }

}

Note that even SPI_H_TO_L is the wrong way round for the application, but it doesn't work the correct way round!? Although I have included the setup_spi line to frig the compiler into working, the code is written as if it weren't there, i.e. for #use spi. Evidently setup_spi is setting a register correctly somewhere where the compiler has failed to do so.
tinley



Joined: 09 May 2006
Posts: 67


Posted: Thu Jun 25, 2015 3:03 am

OK, fixed it; my last post was a bit hasty. The Master is now mode 2. So it's all down to a lack of, or worse, erroneous documentation!

This works in .h file as Master:

Code:

#use spi(MASTER, SPI1, MODE=2, BITS=8,ENABLE=SS_LCD, stream=SPI_LCD)


With this in main() as Slave:

Code:

setup_spi(SPI_SLAVE  | SPI_SS_ENABLED | SPI_H_TO_L);
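For anyone else hitting this: the standard SPI mode numbering packs clock polarity and phase as mode = (CPOL << 1) | CPHA, so MODE=2 means CPOL=1, CPHA=0: the clock idles high and data is sampled on the first (high-to-low) edge. That is presumably why MODE=2 on the master lines up with the slave's SPI_H_TO_L setting here. A tiny sketch of the mapping (the helper names are mine, not CCS functions):

```c
/* Standard SPI mode numbering: mode = (CPOL << 1) | CPHA.
 * Mode 0: clock idles low,  data sampled on rising  (L-to-H) edge
 * Mode 1: clock idles low,  data sampled on falling edge
 * Mode 2: clock idles high, data sampled on falling (H-to-L) edge
 * Mode 3: clock idles high, data sampled on rising  edge        */
int spi_cpol(int mode) { return (mode >> 1) & 1; }  /* clock idle level      */
int spi_cpha(int mode) { return mode & 1; }         /* sample on 2nd edge?   */
```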


To think I almost changed to the Microchip compiler before undertaking this project, but decided not to, to reduce the learning time. Rolling Eyes