guest (Guest)
Problem using I2C and controlling I2C lines manually
Posted: Sat Sep 03, 2005 8:39 am

Hello,
I am using the hardware I2C on an 18F452 to interface to a serial EEPROM and other I2C chips, and this works perfectly. I also have another chip connected to the same I2C bus, but it uses its own protocol (basically a shift register with a separate strobe line to latch the data shifted in).
The problem is that when I try to control the I2C data and clock lines manually for this chip using standard I/O and the output_high()/output_low() functions, the PIC does not actually do anything to the SDA and SCL outputs - they just stay high because of the pull-up resistors.
The odd thing is that the code worked perfectly on a 16F877, but I had to change to the 18F452 because I ran out of ROM. I have also tried the CCS built-in software I2C routines, but they don't seem to work at all.
I have tried all sorts of code and still cannot get it working - is there something different between the 16F and 18F chips that stops the I2C lines from being controlled once the hardware I2C has been used?
Thanks for any help!

newguy
Joined: 24 Jun 2004; Posts: 1909
Posted: Sat Sep 03, 2005 10:16 am

The default for I2C lines is to be configured as inputs. Try setting them to be outputs, then you should be able to control them manually. Just be sure to set them back to inputs when you're done so that you don't affect the I2C routines.
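For example, something along these lines (just a sketch - PIN_C3/PIN_C4 are the 18F452's hardware SCL/SDA pins, so adjust for your wiring, and if your compiler version doesn't have output_drive() then set_tris_c() does the same job):

// Force SCL (C3) and SDA (C4) to outputs so the bit-banged chip can be driven,
// then float them again so the pull-ups and the normal I2C routines take over.
output_drive(PIN_C3);
output_drive(PIN_C4);

output_low(PIN_C4);        // shift data out to the extra chip manually...
output_high(PIN_C3);
output_low(PIN_C3);

output_float(PIN_C3);      // back to inputs when finished
output_float(PIN_C4);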

guest (Guest)
Posted: Sat Sep 03, 2005 1:10 pm

How do I do that? I am using the #use standard_io() directive - I thought that with standard I/O you don't need to set the port direction yourself: just use input() (or output_float()) and output_high()/output_low(), and the compiler sets the port direction automatically?
Any idea why the software I2C routines don't work at all while the hardware ones work perfectly?
Thanks.

newguy
Joined: 24 Jun 2004; Posts: 1909
Posted: Sat Sep 03, 2005 1:39 pm

I'm not totally sure, but it seems that there's some sort of hierarchy with the CCS built-in routines. You're right - if you're using standard_io, the compiler should automatically reconfigure the TRIS register to make a pin an input or an output depending on what you're doing. But it seems like, if you're using the software I2C, the compiler is restricting access to the TRIS register for those two pins?
Try actually setting the TRIS instead of leaving it up to the compiler: switch to fast_io for that port, take care of the TRIS yourself, and see what happens.
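Something like this, for example (only a sketch; I'm assuming the bus is on the usual PORTC pins C3/C4 and that the rest of the TRIS mask suits whatever else is on that port):

#use fast_io(C)              // compiler no longer touches TRISC for you

set_tris_c(0xE7);            // 0b11100111: C3 (SCL) and C4 (SDA) as outputs, rest inputs
output_low(PIN_C3);          // drive the lines manually here
output_high(PIN_C4);
// ...
set_tris_c(0xFF);            // everything back to inputs before using the I2C routines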

guest (Guest)
Part way there
Posted: Sat Sep 03, 2005 5:38 pm

I've had a look at the traces on a scope and found the reason the software I2C routines weren't working: when addressing one particular I/O expander chip, the PIC does not wait long enough for the expander to assert the ACK on the data line - the delay between the 8th and 9th clock pulses is much shorter than it should be (compared with the hardware-generated I2C). I found a work-around: setting my clock frequency in #use delay() to double what it actually is introduces enough delay for the ACK to be properly asserted in software mode.
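In other words, something like this (20MHz is just an example - the value to use is double whatever your crystal actually is):

// The crystal is really 20MHz; declaring 40MHz makes every compiler-generated
// software delay take twice as long, which stretches the gap before the 9th
// (ACK) clock pulse. Note it also doubles delay_us()/delay_ms() everywhere else.
#use delay(clock=40000000)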
A bit of useful info: I also noticed that the FAST and SLOW I2C options had no effect in software I2C mode, ONLY in hardware I2C mode.
The problem of having no control over the clock and data lines when using hardware I2C still exists (there was no such problem at all when using software I2C), so I will try setting the tristate registers for the port manually once I dig out the 18F452 datasheet.
All I can say is that CCS's software I2C routines are c**p (well, they are in compiler version 3.227). I don't think it's possible to modify/fix them either, is it?

Ttelmah (Guest)
Posted: Sun Sep 04, 2005 3:16 am

The 'speed' parameters do work with software I2C, but you _must_ define the rate. This was not made clear when this 'extension' was added to the handling.
So:
#USE I2C(MASTER, SCL=PIN_B0, SDA=PIN_B1, SLOW=50000)
works with the software I2C.
In fact, it really seems that the 'slow' and 'fast' keywords are now redundant.
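For example, a complete write using that directive is just (the slave address and data bytes here are only examples):

#USE I2C(MASTER, SCL=PIN_B0, SDA=PIN_B1, SLOW=50000)

i2c_start();
i2c_write(0x40);      // slave address + write bit (example value)
i2c_write(0x06);      // register/command byte (example value)
i2c_write(0x55);      // data byte (example value)
i2c_stop();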
Best Wishes

guest (Guest)
Posted: Sun Sep 04, 2005 6:18 am

Thanks Ttelmah. No wonder it doesn't work. I'm quite disappointed that the need to specify a speed isn't mentioned at all in the CCS manual.

Ttelmah (Guest)
Posted: Sun Sep 04, 2005 7:26 am

Key thing with CCS: the manual is _always_ several versions out of date. You must read the 'readme' file that comes with each compiler release - there are several dozen fairly important changes in there...
The fact that the manual lags so much has been the source of many complaints.
I always treat the paperwork needed as:
Manual
Readme
The processor '.h' file
Chip data sheet
You need all four to really form a full 'manual' for what is going on!...
Best Wishes

guest (Guest)
Wahey!
Posted: Sun Sep 04, 2005 11:46 am

Thanks everyone for the help!
I tried using the slow=50000 parameter for software I2C and it now works perfectly!
I have also solved the problem of getting manual control of the SDA and SCL lines while also having the hardware I2C configured!
After a LOT of reading datasheets and this forum I came to the conclusion that the hardware I2C was overriding the TRIS from the output_high()/output_low() commands, although the PIC datasheet doesn't specifically say the TRIS is overridden when using I2C (as far as I could see).
I came up with a work-around by disabling the hardware SSP module when I wanted manual control of the pins, then re-enabling it once I had finished with them. The code I used was:
(at the top)
#byte SSPCON1 = 0xFC6 //SSPCON1 reg in 18F452
(in main program)
bit_clear(SSPCON1, 5); //Disable hardware I2C module
..
..
bit_set(SSPCON1, 5); //Re-Enable hardware I2C module
Bit 5 of the SSPCON1 register is SSPEN (Synchronous Serial Port Enable Bit)
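For reference, the same thing wrapped in a couple of tiny helper functions (the function names are just mine) would be:

#byte SSPCON1 = 0xFC6                 // SSPCON1 register in the 18F452

// Bit 5 of SSPCON1 is SSPEN (Synchronous Serial Port Enable)
void ssp_disable(void) { bit_clear(SSPCON1, 5); }   // take manual control of SDA/SCL
void ssp_enable(void)  { bit_set(SSPCON1, 5); }     // hand the pins back to the hardware I2C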
My syntax works but please correct me if I am doing it in the wrong way (or if there is an easier way).
The odd thing is that I didn't have to do any of this when using the 16F877. I suppose the hardware SSP module must be slightly different.
Thanks again!

Ttelmah (Guest)
Posted: Sun Sep 04, 2005 2:46 pm

The data sheet does include the information about TRIS with the hardware I2C, but it is not exactly 'obvious'. It is in two places: the pin diagram for each pin, where the logic of the peripheral output circuit is shown, and the start of the I2C description, which says 'The user must configure these pins as inputs or outputs through the TRISC<4:3> bits'.
If you look at the pin diagram, you will see that if the peripheral is enabled, then for the output pin the peripheral override comes into force (Note 3, Fig 9.7). Turning the peripheral off is obviously a solution to this, and it saves power as well! :-)
Best Wishes