
Re: which *int* ?

Posted: Tue Sep 15, 2009 7:26 pm
by earlz
NickJohnson wrote:
earlz wrote:
fronty wrote:When programming with C# you don't have to use System.UInt32 or UInt32 if you prefer uint. Int32 = int, UInt32 = uint, Int64 = long, and so on. And don't say it's bad that those shorter names don't tell you the actual width of the variable, because when you program C#, you always know that your int is 32 bits wide. The C# standard says that int represents a 32-bit integral value. There are no other possibilities.
I've always hated Microsoft's decision to make 64-bit programming *look* like 32-bit programming.. it just means that in 3 or 4 years, when 64-bit arithmetic is slightly faster than 32-bit arithmetic, Windows will be *even* slower than other OSs
Except that then they could just have the types actually be 64-bit, and things would act exactly the same, except for the memory usage. Of course, the entire system would have to be recompiled to update the library interfaces...
You can't just change a specification.

If they suddenly make .NET ints 64-bit, I bet money it'll break at least 10 "common" applications
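
(A minimal C sketch of the hazard earlz describes; the struct and its fields are hypothetical, and C stands in for .NET here, since the failure mode is the same in any language that lets the width of int leak into a data layout:)

Code:
#include <stdio.h>

/* Hypothetical on-disk record that silently assumes int is 32 bits wide.
 * If a compiler or runtime redefined int as 64 bits, old data files and
 * every reader built against the old layout would break. */
struct record {
    int id;        /* assumed: exactly 4 bytes */
    int payload;   /* assumed: exactly 4 bytes */
};

int main(void)
{
    /* Prints 8 where int is 32 bits; it would print 16 if int became
     * 64 bits, so records written before the change could no longer
     * be read back. */
    printf("sizeof(struct record) = %u\n", (unsigned)sizeof(struct record));
    return 0;
}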

Re: which *int* ?

Posted: Wed Sep 16, 2009 4:24 pm
by spere
earlz wrote:If they suddenly make .NET ints 64-bit, I bet money it'll break at least 10 "common" applications
This was the bit I wasn't clear about too, but apparently when they update the C# compilers, all ints will use 64 bits of space on a 64-bit machine instead of the usual 32. Now, is that going to break existing code... or maybe not? :|
Looks like I've got some reading up to do

Re: which *int* ?

Posted: Wed Sep 16, 2009 4:30 pm
by ru2aqare
spere wrote:
earlz wrote:If they suddenly make .NET ints 64-bit, I bet money it'll break at least 10 "common" applications
They won't. It is defined in the CLR standard (or some other related standard, for that matter) that Int32 is 32 bits wide and Int64 is 64 bits wide. No one can change that.

Re: which *int* ?

Posted: Wed Sep 16, 2009 6:08 pm
by AndrewAPrice
I'm surprised .NET didn't opt for a built-in arbitrary-precision integer type, since it is used for desktop applications and what-not.

There is a System.Decimal, which is "a 1-bit sign, a 96-bit integer number, and a scaling factor used to divide the 96-bit integer and specify what portion of it is a decimal fraction. The scaling factor is implicitly the number 10, raised to an exponent ranging from 0 to 28", giving the range ((-2^96 to 2^96) / 10^(0 to 28)).

Re: which *int* ?

Posted: Thu Sep 17, 2009 12:51 pm
by Owen
It's a different matter for languages like Java and C# to fix "int" to a size than it is for C++. For a start, you're never (in Java) or only rarely (in C#) dealing with pointers.

Also, with 64-bit longs, they happen to look like LP64 programming...

Re: which *int* ?

Posted: Mon Sep 21, 2009 3:05 pm
by Masterkiller
unsigned __int32 :wink: (I'm still a Windows user...)

Re: which *int* ?

Posted: Mon Oct 26, 2009 4:19 pm
by smeezekitty
unsigned long, because I am in real mode

Re: which *int* ?

Posted: Mon Oct 26, 2009 4:31 pm
by Solar
What has one to do with the other? I doubt you understood the question...

Re: which *int* ?

Posted: Mon Oct 26, 2009 4:40 pm
by neon
smeezekitty wrote:unsigned long, because I am in real mode
C data types and the current processor mode have nothing to do with each other.

Re: which *int* ?

Posted: Mon Oct 26, 2009 4:42 pm
by smeezekitty
16-bit means int = 16 bits
32-bit means int = 32 bits
so in 16-bit code you use long to get 32 bits

Re: which *int* ?

Posted: Mon Oct 26, 2009 4:51 pm
by neon
Data type size is irrelevant to the processor mode. The size of the data types depends on the code the compiler generates, not on what mode the processor is in.
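
(A minimal C sketch of this point; the same source compiles unchanged on a 16-bit DOS compiler such as Turbo C++ and on a modern 64-bit one, and the numbers it prints are fixed by the compiler, not by the processor mode:)

Code:
#include <stdio.h>

int main(void)
{
    /* Decided at compile time: Turbo C++ (16-bit target) prints 2 and 4,
     * a typical 32-bit compiler prints 4 and 4, a typical LP64 compiler
     * prints 4 and 8. Switching CPU modes at run time changes nothing. */
    printf("sizeof(int)  = %u\n", (unsigned)sizeof(int));
    printf("sizeof(long) = %u\n", (unsigned)sizeof(long));
    return 0;
}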

Re: which *int* ?

Posted: Mon Oct 26, 2009 5:09 pm
by smeezekitty
I know.
I am using Turbo C++, which generates real-mode code and uses 16-bit data types.

Re: which *int* ?

Posted: Mon Oct 26, 2009 7:25 pm
by Brynet-Inc
smeezekitty wrote:16-bit means int = 16 bits
32-bit means int = 32 bits
so in 16-bit code you use long to get 32 bits
Seriously? This is specific to the compiler... for example, very few 64-bit C compilers make 'int' a 64-bit type.

As has been mentioned by others, please read topics before posting.

Re: which *int* ?

Posted: Tue Oct 27, 2009 2:08 am
by Solar
smeezekitty wrote:I am using Turbo C++, which generates real-mode code and uses 16-bit data types.
And that might be handled differently by other compilers, and you never know when you might have to change (or update...) compilers.

Thus, people got the idea to use a typedef to define a type that is known to be of a certain width for the current compiler, and use that type consistently wherever width matters.

The C99 standard provides int32_t for this purpose. 55 out of 62 voters use this or some other define. 7 voted that they don't use a define, which is OK in a way. But your comment showed that you are unaware of the whole issue, which does not reflect well on your overall experience with the language.
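
(A minimal sketch of both approaches Solar mentions; the typedef names and the Turbo C++ widths in the comments are illustrative assumptions:)

Code:
/* C99: <stdint.h> supplies exact-width types directly. */
#include <stdint.h>
#include <stdio.h>

/* Pre-C99: one block of typedefs per compiler, chosen to match that
 * compiler's widths and then used consistently wherever width matters.
 * (Widths shown are for a 16-bit DOS compiler such as Turbo C++.) */
typedef unsigned int  u16;   /* int is 16 bits on that target  */
typedef unsigned long u32;   /* long is 32 bits on that target */

int main(void)
{
    int32_t fixed = 42;      /* exactly 32 bits on any C99 compiler  */
    u32 also_fixed = 42;     /* 32 bits via the hand-rolled typedef  */
    printf("%ld %lu\n", (long)fixed, (unsigned long)also_fixed);
    return 0;
}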

Re: which *int* ?

Posted: Tue Oct 27, 2009 4:27 am
by qw
Solar wrote:
smeezekitty wrote:I am using Turbo C++, which generates real-mode code and uses 16-bit data types.
And that might be handled differently by other compilers, and you never know when you might have to change (or update...) compilers.
Come on guys, is there a single C compiler in the world that produces 16-bit real-mode code and does not define long as 32 bits? Smeezekitty apparently doesn't care about scalability or portability, but IMHO that's exactly the point of this topic.