So according to this, everything should be fine if the machine is Little Endian?
GNU Online Docs wrote:
— Macro: BITS_BIG_ENDIAN
Define this macro to have the value 1 if the most significant bit in a byte has the lowest number; otherwise define it to have the value zero. This means that bit-field instructions count from the most significant bit. If the machine has no bit-field instructions, then this must still be defined, but it doesn't matter which value it is defined to. This macro need not be a constant.
This macro does not affect the way structure fields are packed into bytes or words; that is controlled by BYTES_BIG_ENDIAN.
— Macro: BYTES_BIG_ENDIAN
Define this macro to have the value 1 if the most significant byte in a word has the lowest number. This macro need not be a constant.
I would not recommend using a compiler-specific macro to work around issues arising from undefined behavior in the specification...
There is nothing wrong with method one (a byte array), or with a file plus a serializer.
When you pick an ABI for your OS (probably derived from one of the above), you are deciding what the bit-field packing is. For CPU structures this makes it very much portable behavior; for device structs, somewhat less so.
We weren't ignoring it; implementation-defined means that the implementation must consistently do something and that something is specified by its ABI. I believe the point was to minimize ABI dependence.
"Computers in the future may weigh no more than 1.5 tons.", Popular Mechanics (1949)
[ Project UDI ]