suslik wrote:I'm writing my OS for Intel x86-32 processors (actually for the 80386). I don't want to bother with other processors, so is it reasonable to make these assumptions: char - 8 bits, short - 16 bits, int - 32 bits, long - 32 bits? In 32-bit Windows/Linux all C compilers (as far as I know) use these sizes. DOS C compilers also use these sizes when compiling 32-bit code. On 64-bit Windows/Linux I suppose (I can't check, since I don't have a 64-bit computer) that I just need to specify the target arch as i386 and the compiler will use these sizes as well. Am I right?
Depends on what your definition of "reasonable" is. Given that it's very easy to avoid making these assumptions, why wouldn't you?
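For example, here is a minimal sketch (the names pde_t and make_pde are mine, purely for illustration) of the kind of kernel code where the width actually matters; written against <stdint.h> it simply doesn't care what int happens to be on the current ABI:

    #include <stdint.h>

    /* An 80386 page-directory entry is exactly 32 bits, so say so explicitly. */
    typedef uint32_t pde_t;

    /* Build a PDE from a 4 KiB-aligned physical address and its flag bits. */
    static inline pde_t make_pde(uint32_t phys, uint32_t flags)
    {
        return (phys & 0xFFFFF000u) | (flags & 0x00000FFFu);
    }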
suslik wrote:P.S: I don't like to use int32_t, uint32_t and the others, since I think they are only needed when your C program is meant to run on a lot of different archs.
lol. What is the logic behind that statement?
XenOS wrote:The only integer types which have guaranteed fixed sizes are intX_t / uintX_t.
The only scalar types, yes.
suslik wrote:To XenOS: do you know any compiler for Intel x86-32 that uses different sizes? OK, I can imagine one that uses a 64-bit long, but "int - 32 bits, short - 16 bits, char - 8 bits" is the de-facto standard for the Intel x86-32 arch!
Why is this relevant? No one will do an exhaustive search of all C x86 ABIs, and no one can predict with certainty whether someone will implement a new, incompatible ABI in the future. One of the points of HLLs is independence from the underlying platform, for portability; you are basically trying to write your code so that it depends on your platform for no real reason, thus negating that portability for no real reason. So the way I see it, nothing good can come of it, but something bad does: you either restrict yourself to one particular ABI, or you decrease maintainability and risk introducing bugs if you later change your mind about sticking to that one ABI (which includes later deciding to also port to a new architecture).
That said, in the x86-64 world there are competing C ABIs that define types differently (see x32 vs. SysV AMD64), even if it's not the types you happened to mention.
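To make that concrete: compiling something like the snippet below with gcc -m64 and again with gcc -mx32 (assuming a toolchain built with x32 support) prints different sizes for long and for pointers, while int32_t stays fixed:

    #include <stdio.h>
    #include <stdint.h>

    int main(void)
    {
        /* SysV AMD64: long and void* are 8 bytes; x32: both are 4 bytes. */
        printf("long: %zu, void*: %zu, int32_t: %zu\n",
               sizeof(long), sizeof(void *), sizeof(int32_t));
        return 0;
    }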
bluemoon wrote:No. If that happens with your toolchain, it's just luck.
You can't write it off as luck; if the code makes unnecessary assumptions, it's just bad code.
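If the OP insists on relying on the usual ILP32 sizes anyway, the assumption can at least be turned into a build failure instead of silent breakage; a sketch using C11's _Static_assert:

    #include <limits.h>

    /* Fail at compile time, not at run time, if the ABI is not what we assumed. */
    _Static_assert(CHAR_BIT == 8,        "char is not 8 bits");
    _Static_assert(sizeof(short) == 2,   "short is not 16 bits");
    _Static_assert(sizeof(int)   == 4,   "int is not 32 bits");
    _Static_assert(sizeof(long)  == 4,   "long is not 32 bits");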
Owen wrote:Just use stdint.h, and note the int_fastX_t types also (since they are on some architectures faster)
<pedantic>It's <stdint.h>, not stdint.h, because it needn't be implemented as a file; indeed, C is often implemented on platforms where file systems don't exist.</pedantic> (u)int_leastN_t and (u)int_fastN_t define types that are at least N bits wide; the variants for N = 8, 16, 32 and 64 are required, while other widths (and the exact-width (u)intN_t types themselves) are optional.
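A small illustration (declarations invented for the example) of how the families differ in intent: exact-width for layouts you don't control, least/fast when you only need a minimum width:

    #include <stdint.h>

    uint32_t       mmio_reg;      /* exact width: must match a hardware register layout    */
    uint_least8_t  tiny_counter;  /* at least 8 bits: smallest type that can hold it       */
    int_fast16_t   loop_index;    /* at least 16 bits: whatever the target handles fastest */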
suslik wrote:Owen wrote:Just use stdint.h, and note the int_fastX_t types also (since they are on some architectures faster)
- Surely it is good to write code that conforms to ANSI C, but sometimes it is safe to make assumptions like the ones I did. I'm just asking the osdev community whether my assumption is reasonable.
Those types do not exist in ANSI C. (People generally mean C89 when they say "ANSI C", because that was the only version of the standard defined by an ANSI committee; ANSI later adopted ISO C99 and will likely adopt C11 as well.)
That said, I postulate that it is never entirely safe to make assumptions. If there's something you cannot afford to cover due to limited resources, then state that when you define your requirements. Assumptions are something sloppy programmers fall back on.
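One way to "state it when you define your requirements" is to make the requirement machine-checked, for instance with a header that every translation unit includes (a sketch; the header name and wording are made up), which also works with pre-C11 compilers:

    /* requirements.h -- this kernel only targets ILP32 i386-style ABIs. */
    #include <limits.h>

    #if CHAR_BIT != 8 || USHRT_MAX != 0xFFFFu || UINT_MAX != 0xFFFFFFFFu || ULONG_MAX != 0xFFFFFFFFu
    #  error "This code assumes 8-bit char, 16-bit short and 32-bit int/long (i386 ILP32 ABI)."
    #endif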
suslik wrote:XenOS wrote:I can write one that does deviate from these values and it will still be compliant with the C language specification.
- This compiler would be original but impractical.
It is only made impractical by your own assumptions.
Brendan wrote:It might be reasonable to define a new language that is almost exactly the same as C, except for differences in the restrictions the formal specification places on integer sizes.
While this is theoretically correct, it's anything but reasonable to define a new language that is just C with a particular ABI baked in. Not to mention that it would only cause confusion.