leeccs
Joined: 09 May 2007 Posts: 1
int declaration is 8 bits, any way to make it 16 bits??? |
Posted: Wed May 09, 2007 6:18 am |
This might be a newbie question.
When I define an int variable, the compiler always makes it 8 bits instead of the normal 16 bits. Is there any way to make the compiler define int as a standard 16-bit integer, or do I always have to declare variables as int16? This is a pain when converting existing programs.
ckielstra
Joined: 18 Mar 2004 Posts: 3680 Location: The Netherlands
Posted: Wed May 09, 2007 6:28 am |
Yes, changing it is possible. Check the #type directive in the CCS manual.
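For illustration, a minimal sketch of how the #type directive can be used; the device header and clock setting are assumptions and should match your own hardware:
Code:
#include <18F452.h>          // assumed target device; substitute your own header
#use delay(clock=20000000)   // assumed 20 MHz clock

#type int=16                 // from here on, a plain "int" is 16 bits wide

void main(void)
{
   int counter = 1000;       // holds values above 255 without an explicit int16

   while(TRUE)
   {
      counter++;
   }
}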
jds-pic
Joined: 17 Sep 2003 Posts: 205
Re: int declaration is 8 bits, any way to make it 16 bits??? |
Posted: Wed May 09, 2007 1:32 pm |
leeccs wrote:
When I define an int variable, the compiler always makes it 8 bits instead of the normal 16 bits.
on a PIC, "normal" is 8 bits -- it is, after all, an 8-bit microcontroller. so it is incorrect to assert that the compiler is doing something to make an int 8 bits wide instead of 16. 8 bits wide is the natural state; 16 bits and wider requires work -- not the other way around.
leeccs wrote:
Is there any way to make the compiler define int as a standard 16-bit integer, or do I always have to declare variables as int16? This is a pain when converting existing programs.
personally, i would never change the global int definition from 8 to 16 bits.
number 1, you are asking for trouble, and will eventually run into some hard-to-find problem in your own application code, your imported application code, the compiler libraries, or a device driver.
number 2, you are unnecessarily doubling the RAM usage.
number 3, almost everything you do on a PIC in terms of I/O is done with 8-bit values -- serial rs232, i2c, dallas onewire, SPI, and so on. you are going to cause yourself much grief by needlessly using 16-bit values.
my suggestion -- open your imported code in a suitable text editor, and use stepwise search and replace, selectively changing the ints into int16s (see the sketch below). then pull that into your application codebase.
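A sketch of the kind of selective change meant above; the variable names are made up for illustration:
Code:
// before: imported code that assumed a 16-bit int
// int adc_sum;
// int sample_count;

// after selective search and replace: widen only what really needs 16 bits
int16 adc_sum;        // accumulates several 10-bit ADC readings, needs 16 bits
int8  sample_count;   // never exceeds 255, so the native 8-bit width is fine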
jds-pic
dyeatman
Joined: 06 Sep 2003 Posts: 1937 Location: Norman, OK
Posted: Wed May 09, 2007 2:49 pm |
I fully agree with JDS-PIC in his reply.
However, to be fair and answer your question:
Yes, you can change the default. In the latest manual, see page 23 (Basic and Special Types) and the #Type pre-processor command on page 103.
Neutone
Joined: 08 Sep 2003 Posts: 839 Location: Houston
Re: int declaration is 8 bits, any way to make it 16 bits??? |
Posted: Wed May 09, 2007 4:00 pm |
leeccs wrote:
This might be a newbie question. When I define an int variable, the compiler always makes it 8 bits instead of the normal 16 bits. Is there any way to make the compiler define int as a standard 16-bit integer, or do I always have to declare variables as int16? This is a pain when converting existing programs.
Remember that integer literally means it is not a floating-point number. I would say it falls under best practices to declare every variable used in embedded code with an explicit width. On a PC it's a lot looser.
int1
int8
int16
int32
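A minimal sketch of such explicit-width declarations in CCS C; the variable names are hypothetical:
Code:
int1  flag_ready;      // single bit, good for status flags
int8  rx_byte;         // one byte, matches rs232/i2c/SPI transfers
int16 adc_reading;     // a 10- or 12-bit ADC result fits here
int32 tick_counter;    // long-running millisecond counter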