Like everything in programming, typedefs are best handled in limited dosage. There is no sense in typedef'ing an integer simply to use a different name. Actually, that's the worst thing you can do with a typedef, because you are deliberately hiding useful information (what an object actually is). As nice as
typedef std::vector< std::string > StringVector;
might be at first glance, it's confusing because the other guy has to look up StringVector first if he wants to be sure what it is. And it gets positively evil when the typedef gets typedef'ed yet again, say turning that StringVector into a TableRow. (Actual in-the-wild code I've encountered.) Now the other guy has to do two lookups to determine what he can and can't do with a TableRow object...
(All this isn't quite as bad with mere integers, but you get the idea.)
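To make the double lookup concrete, here is a sketch (the header names are made up, not the actual code I ran into):

#include <string>
#include <vector>

// stringvector.h (hypothetical)
typedef std::vector< std::string > StringVector;

// tablerow.h (hypothetical)
typedef StringVector TableRow;

// A reader who comes across a TableRow first has to chase it to StringVector,
// and from there to the std::vector< std::string > it really is.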
Where they do come in handy is when you want to add information. Integer sizes differ between platforms, but sometimes you need an integer that is exactly 32 bits wide. Code along the lines of

typedef int int32_t;

or, respectively,

typedef long int32_t;

(or whatever works for your platform) allows you to use int32_t throughout the rest of the code, with the typedef'ed name adding information that a mere 'int' or 'short' cannot convey. You'd still be using 'int' for loop counters or 'size_t' for array indices, because that's what they're for, but you'd use int32_t where the exact width is important.
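For instance (the struct and its fields are made up, purely to illustrate the split):

#include <stdint.h>
#include <stddef.h>

/* A hypothetical on-disk record: the file format demands exactly 32 bits per field. */
struct Record
{
    int32_t id;
    int32_t timestamp;
};

/* 'size_t' for the array index, because that's what it's for -- no fixed-width type needed here. */
void clear_records( struct Record * records, size_t count )
{
    for ( size_t i = 0; i < count; ++i )
    {
        records[ i ].id = 0;
        records[ i ].timestamp = 0;
    }
}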
Later, if you switch platforms, all you have to do is change the typedef in one location. (Or, of course, use <stdint.h>, which already does this for you.)
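A minimal sketch of what that one central location might look like (the platform macros are made up; a real <stdint.h> is far more thorough):

/* myints.h (hypothetical) -- the only place that knows which native type is 32 bits wide. */
#if defined( PLATFORM_WITH_32BIT_INT )
typedef int  int32_t;
#elif defined( PLATFORM_WITH_32BIT_LONG )
typedef long int32_t;
#else
#error "Don't know which type is 32 bits wide on this platform."
#endif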
In the end, integer typedefs are usually unnecessary, and the few cases where they are really useful are mostly covered by the language standard already.
Oh, they're useful in one other regard: function pointers. Their syntax still dazzles me, and I much prefer looking up the correct syntax once and using a typedef for the rest of the code.
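Something along these lines (a generic example, not taken from any particular codebase):

#include <stdio.h>

/* Without the typedef: the parameter declaration has to spell out the whole pointer syntax. */
void apply_raw( int (*fn)( int ), int value )
{
    printf( "%d\n", fn( value ) );
}

/* With the typedef, the same signature reads much more naturally. */
typedef int (*IntTransform)( int );

void apply( IntTransform fn, int value )
{
    printf( "%d\n", fn( value ) );
}

static int twice( int x ) { return 2 * x; }

int main( void )
{
    apply_raw( twice, 21 );   /* prints 42 */
    apply( twice, 21 );       /* prints 42 */
    return 0;
}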
As for your conversion warning regarding puts( uint8_t * s ), that's one of the places where you're using a typedef for the wrong thing. puts() takes a char pointer, not a pointer to an unsigned integer exactly 8 bits wide. It's absolutely OK that the compiler buggers you about that...
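For illustration only (I don't know what your actual code looks like), the mismatch and the obvious fix boil down to this:

#include <stdio.h>
#include <stdint.h>

int main( void )
{
    uint8_t raw[6] = { 'h', 'e', 'l', 'l', 'o', 0 };   /* hypothetical buffer typed as uint8_t */

    /* puts() expects 'const char *', so this line draws a justified diagnostic: */
    /* puts( raw ); */

    /* If the data really is text, declare it as what it is -- characters: */
    char text[] = "hello";
    puts( text );

    return 0;
}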